Creating a Scenario-Based Learning Tool with OpenAI’s ChatGPT API

Introduction

Artificial Intelligence (AI) has revolutionized the way we think about technology and its applications. One of the most promising developments in this field is the emergence of chatbots, which use natural language processing (NLP) to communicate with users. Chatbots have found their way into many different areas, including customer service, healthcare, and education. OpenAI’s ChatGPT API, based on the GPT-3.5 architecture, is a powerful tool for creating chatbots, and we wanted to try it out for education and training.

Our intention was to create a scenario-based learning tool whereby the ChatGPT API would play the role of a manager and ask the learner questions to test their understanding of a subject (in this case, finance).

Getting Started

Getting started with the ChatGPT API is relatively easy. The first step is to sign up for an API key, which can be done on the OpenAI website (https://platform.openai.com/). Once you have your API key, you can start using the API with any programming language that supports HTTP requests. Given that WordPress was the intended platform for this tool, we wanted to develop it in PHP. We started with an extremely simple PHP script to create a basic scenario (https://gist.github.com/mahadirz/dfb75777acad45c254e204c96a609e3a).
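
For anyone following the same route, a basic request to the Chat Completions endpoint from PHP looks roughly like the sketch below. The API key, system prompt and user message are placeholders rather than the values from our tool:

<?php
// Minimal sketch: one request to OpenAI's Chat Completions endpoint via cURL.
$apiKey = 'YOUR_API_KEY'; // placeholder

$payload = [
    'model'    => 'gpt-3.5-turbo',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a manager testing a learner\'s understanding of finance.'],
        ['role' => 'user',   'content' => 'I am ready for the first question.'],
    ],
];

$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . $apiKey,
    ],
    CURLOPT_POSTFIELDS     => json_encode($payload),
]);

$response = json_decode(curl_exec($ch), true);
curl_close($ch);

echo $response['choices'][0]['message']['content'];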

Creating an Interface

Creating a user-friendly interface is crucial for any tool of this kind, as it can greatly affect the user experience. We used HTML/CSS for the front-end styling and PHP for the back-end functionality. We also used jQuery to create an interactive interface and to handle AJAX communication with the API. By using AJAX, we were able to send and receive data from the API without reloading the entire page. This allowed for a smoother conversation flow and improved the overall user experience.
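
On the server side, the jQuery AJAX call simply posts to a small PHP endpoint, which forwards the message to the API and returns the reply as JSON. A simplified sketch follows; the file name and the callOpenAi() helper are illustrative rather than our actual code:

<?php
// chat.php: illustrative endpoint that the front-end AJAX call posts to.
header('Content-Type: application/json');

// The learner's latest message, sent from the jQuery front end.
$message = $_POST['message'] ?? '';

// callOpenAi() is a hypothetical helper wrapping the cURL request shown earlier.
$reply = callOpenAi($message);

// Return the assistant's reply to the page without any reload.
echo json_encode(['reply' => $reply]);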

Keeping the Conversation Going

Our intention was to create a dialogue, so it was critical that the tool could understand and reference the learner’s previous responses. Without this it would keep asking the same questions.

One thing to keep in mind when using the ChatGPT API is that it does not technically “store” conversations. This means that if you want it to understand your previous responses, you need to pass the entire conversation history back and forth with every API request. Initially we were concerned that, as the conversation continued, the number of tokens (and therefore the cost) would climb sharply with more and more information being passed back and forth. However, after significant development and testing to this point we had only used $0.06 worth of tokens, so hardly a huge investment!
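
In practice this just means keeping every message in the “messages” array and resending the whole thing with each request. Here is a sketch of one way to do it, assuming (purely for illustration) that the history is kept in the PHP session:

<?php
// Sketch: maintain the full conversation and send it back with every request.
session_start();

if (!isset($_SESSION['history'])) {
    $_SESSION['history'] = [
        ['role' => 'system', 'content' => 'You are a manager testing a learner\'s understanding of finance.'],
    ];
}

// Append the learner's latest answer...
$_SESSION['history'][] = ['role' => 'user', 'content' => $_POST['message'] ?? ''];

// ...and send the *entire* history, not just the new message.
$payload = [
    'model'    => 'gpt-3.5-turbo',
    'messages' => $_SESSION['history'],
];

// (Send $payload as in the earlier cURL sketch, then append the assistant's
// reply to the history as well, so the next request includes it:)
// $_SESSION['history'][] = ['role' => 'assistant', 'content' => $assistantReply];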

ChatGPT’s Speed

One issue we encountered when using the ChatGPT API was its speed. At times, the API would take several seconds to respond, which could lead to a frustrating user experience. To address this, we first limited the “max_tokens” value, which controls how much text the API generates per request; the idea being that if it had less to generate, it would return its response more quickly. This worked to an extent, but responses still took a noticeable amount of time. We realised that this was something we would need to mitigate via the interface, so we added an animated “…” icon between the user input and the response to let the user know that the application was doing something.
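
Capping the response length is just a matter of adding “max_tokens” to the request payload; the number below is illustrative rather than the value we eventually settled on:

$payload = [
    'model'      => 'gpt-3.5-turbo',
    'messages'   => $_SESSION['history'],
    'max_tokens' => 150, // illustrative cap: less text to generate, quicker response
];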

Mastering “Prompts”

Prompts are essential for guiding the conversation and ensuring that the chatbot generates appropriate responses. However, crafting effective prompts can be challenging, as the wording and context can greatly affect the response.

Our intention was for the learning tool to create an ongoing dialogue with the user and ask one question at a time. However, sometimes (but not always), ChatGPT would output an entire self-generated dialogue as one piece of text.

We experimented with a value called “Temperature”. This is a numerical value that you send with each API request, which controls the creativity of the response. We aimed to find the right balance between generating interesting responses and staying on topic. This helped slightly, but occasionally the tool would still misbehave. Ultimately, the solution was to be very careful with our prompts and to reduce any ambiguity to ensure that the chatbot understood the intent.
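
Temperature is passed in the same request payload and ranges from 0 (very focused and repeatable) up to 2 (very creative); the value shown here is illustrative rather than the one we finally used:

$payload = [
    'model'       => 'gpt-3.5-turbo',
    'messages'    => $_SESSION['history'],
    'max_tokens'  => 150,
    'temperature' => 0.7, // illustrative: lower = more predictable, higher = more creative
];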

Our prompt had the following line, which could be misconstrued. After removing it, the tool behaved as we wanted it to:

“You should ask approximately ten questions then end the conversation by giving some overall feedback on my responses.”

Embedding this into learning content

Once we were happy with the functionality, we made it into a WordPress shortcode that we could insert into learning content. This was then added to a LearnDash topic page as an interactive activity within the content.
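
In WordPress terms the wiring is straightforward. Below is a cut-down sketch, where the shortcode name, render function and included template are illustrative rather than our actual code:

<?php
// Illustrative sketch of exposing the tool as a WordPress shortcode.
function render_scenario_chat() {
    ob_start();
    // Output the HTML/CSS/jQuery front end; its AJAX calls hit the PHP endpoint.
    include plugin_dir_path(__FILE__) . 'scenario-chat-ui.php';
    return ob_get_clean();
}
add_shortcode('scenario_chat', 'render_scenario_chat');

// The shortcode can then be dropped into a LearnDash topic page: [scenario_chat]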

Conclusion

In conclusion, using the ChatGPT API to create a scenario-based learning tool was a challenging but rewarding experience. By refining our approach and exploring other applications for the API, we were able to create a chatbot that could effectively guide users through different scenarios. The potential of AI in education and training is vast, and we believe that chatbots can play an important role in improving learning outcomes. As AI technology continues to advance, we are excited to see what new possibilities emerge.