OpenAI Assistants API for chatbot

Has anyone built a chatbot with Appfarm? Is it possible to integrate the OpenAI Assistants API?

It would be awesome to see an example of this!

Hello, and welcome to our community!
Although we don't have an example of this in our Showroom, we have successfully created a chatbot in Appfarm using the OpenAI Assistants API. You can follow the instructions in the OpenAI docs, and I have added a few comments below to explain how this can be done in Create:

In our case, we created four data sources for the app: Question, Answer, Thread (conversation), and Run (execution).

  1. Create an Assistant defining its custom instructions and picking a model. We did this in the OpenAI admin page, but it can be done using the API as well.
  2. Create a Thread when a user starts a conversation. Link this thread to the question in context.
  3. Add Messages to the Thread as the user asks questions, and send each question to the thread (https://api.openai.com/v1/threads/${threadId}/messages).
  4. Run the Assistant on the Thread to trigger responses. https://api.openai.com/v1/threads/${threadId}/runs/${runId}/steps
    Keep fetching the steps until the status is 'completed'; this can be done with a while loop.
  5. When the run status is completed, you can get the answer: map and display the last message in the thread to the user (https://api.openai.com/v1/threads/${threadId}/messages). A rough code sketch of steps 2-5 is included below.
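
For reference, here is a minimal sketch of steps 2-5 as plain HTTPS calls, assuming a Node-style fetch client; in Appfarm the equivalent calls would typically be Web Request actions. The API key, assistant ID, and beta header are placeholders, and it polls the run itself rather than the steps endpoint, so treat it as a starting point rather than a finished implementation.

```typescript
// Minimal sketch of steps 2-5. OPENAI_API_KEY and assistantId are placeholders;
// exact headers and response shapes depend on the Assistants API version you target.
const BASE = "https://api.openai.com/v1";
const headers = {
  Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  "Content-Type": "application/json",
  "OpenAI-Beta": "assistants=v2", // beta header; value depends on the API version
};

async function askAssistant(assistantId: string, question: string): Promise<string> {
  // 2. Create a Thread when the user starts a conversation
  const thread = await (await fetch(`${BASE}/threads`, { method: "POST", headers })).json();

  // 3. Add the user's question as a Message on the Thread
  await fetch(`${BASE}/threads/${thread.id}/messages`, {
    method: "POST",
    headers,
    body: JSON.stringify({ role: "user", content: question }),
  });

  // 4. Run the Assistant on the Thread, then poll until the run has finished
  let run = await (await fetch(`${BASE}/threads/${thread.id}/runs`, {
    method: "POST",
    headers,
    body: JSON.stringify({ assistant_id: assistantId }),
  })).json();
  while (run.status === "queued" || run.status === "in_progress") {
    await new Promise((resolve) => setTimeout(resolve, 1000)); // simple wait between polls
    run = await (await fetch(`${BASE}/threads/${thread.id}/runs/${run.id}`, { headers })).json();
  }

  // 5. Fetch the messages (newest first) and map out the latest assistant reply
  const messages = await (await fetch(`${BASE}/threads/${thread.id}/messages`, { headers })).json();
  return messages.data?.[0]?.content?.[0]?.text?.value ?? "(no answer)";
}
```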

It is possible to refine it and make it more advanced, but this is a good starting point to continue exploring the API - I hope this is helpful. :slightly_smiling_face:


Thank you so much for this! It took a bit of time, but we got it working and have indeed made it more advanced: it now runs queries for us and, with some heavy prompt engineering, behaves like our own assistant. However, we are at a point where we think the assistant needs a few more upgrades, since we can't get consistent responses; it mostly works, but every now and then it doesn't. We will put this project on 25% speed and revisit it later, once a few more updates have arrived.


Hello again!
As we need to figure out whether this can work or not, we've upped the speed on this project again, since there might be alternative solutions to the problem we are facing.
Sending questions and having a conversation with the bot is all well and good, but we want to use our own data from our application in the conversation, so the user can ask questions about it, like with a customer service bot.
We have no reliable solution for this right now. Our current approach is to have the assistant parse the user's question into a query, run that query against our own data, and then send the result back to the bot as a second input so it can phrase the answer in human-like words. And well, it works, sometimes.
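
To illustrate, the flow is roughly this (askAssistant stands for a wrapper around the Assistants API calls described earlier in the thread, and runQuery for our own data lookup; all the names are just placeholders):

```typescript
// Rough sketch of the two-pass flow; the declarations below are placeholders.
declare const ASSISTANT_ID: string;
declare function askAssistant(assistantId: string, question: string): Promise<string>;
declare function runQuery(query: unknown): Promise<unknown>;

async function answerWithOwnData(question: string): Promise<string> {
  // Pass 1: ask the assistant to translate the question into a structured query
  const queryJson = await askAssistant(
    ASSISTANT_ID,
    `Translate this question into a JSON query for our data model: ${question}`
  );

  // Run the generated query against our own data (outside OpenAI)
  const result = await runQuery(JSON.parse(queryJson));

  // Pass 2: feed the raw result back so the assistant can phrase a human-like answer
  return askAssistant(
    ASSISTANT_ID,
    `Answer the question "${question}" in plain language using this data: ${JSON.stringify(result)}`
  );
}
```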
Would you have any examples or possible solutions to make this more fluid, without errors occurring every now and then? I know that prompt engineering is a big part of it as well, but running the thread twice to get one answer isn't great, and we also feel that it could possibly compromise some of the data.
Then there is the security question: how safe are Appfarm, OpenAI, and the connection between them? I might also add that we are all juniors and interns working on this :slight_smile:

My hope is that the answer will be "We are actually working on a component that will arrive next week, and it will solve all your problems" :wink:
How would you solve it?
There are a lot of sites with service bots today, and we'd assume that we aren't too far off and that Appfarm also has a way to solve this.

Hi!

There are many companies exploring AI the way you are, and all or most of these applications have to tackle instability and errors. What you are asking for might be a single component for your use case and technology, but there are many different use cases, types of APIs, and vendors in this space, and their offerings are changing rapidly. Appfarm is exploring AI as part of the platform (for boosting development speed), and we are working on reusable integrations. The latter might be relevant for your use case, but it will not be ready on this side of the summer.

Your use case might be solved with the OpenAI Assistants API, as you are currently doing. This API allows you to combine "ChatGPT" with your company data without any explicit data-engineering or data-science knowledge. But yes, it is a bit slow and unstable. Microsoft Azure has a similar offering that may be more stable, since you can run it in your own instance, but I believe it is more expensive.

Security: The connection between Appfarm and OpenAI is "server to server" over HTTPS, so it is as secure as you can get over the internet. OpenAI also states that API data is not used for training. The same applies to Microsoft and Google (those two might appeal more to enterprises with regard to security and privacy).

To summarize the current state (from my personal point of view): Using the out-of-the-box APIs is fast, but currently a bit unstable. It might be improved with prompt optimization, and maybe also by moving up a tier or two in the OpenAI subscription. If you really want to operationalize things, you might want to consider investing some time in setting up a "RAG" (Retrieval Augmented Generation) system in Microsoft (Azure OpenAI) or Google (Vertex AI). RAG means putting your company docs or resources in a database (a vector db) and using it to search for and retrieve information to feed into the LLM (such as GPT 4.5). RAG is essentially what the OpenAI Assistants API is; they have just simplified the setup (you upload documents to the assistant). RAG, however, is a domain "outside" the Appfarm platform.
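
To make the RAG idea a bit more concrete, here is a minimal sketch of the retrieval step. embed() and complete() are placeholders for an embedding model and a chat completion call, and the "vector db" is just an in-memory array; a real setup would use Azure AI Search, Vertex AI, or a dedicated vector database.

```typescript
// Minimal RAG sketch: embed the question, retrieve the most similar company docs,
// and feed them into the LLM prompt. embed() and complete() are placeholders.
declare function embed(text: string): Promise<number[]>;
declare function complete(prompt: string): Promise<string>;

type Doc = { text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

async function ragAnswer(question: string, docs: Doc[]): Promise<string> {
  const queryVector = await embed(question);

  // Retrieve the three most relevant documents from the "vector db"
  const context = docs
    .map((doc) => ({ doc, score: cosine(queryVector, doc.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 3)
    .map((x) => x.doc.text)
    .join("\n---\n");

  // Augment the prompt with the retrieved context before calling the LLM
  return complete(`Answer using only this context:\n${context}\n\nQuestion: ${question}`);
}
```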


Hi!

Very relevant to this discussion: you may now find a plug-and-play app for this purpose in Farmers Market (you just need your own OpenAI account / API key).

This may also be tested / seen here:
https://showroom.appfarm.app/ai-assistant


Thanks! I tried this out and it seems to work well for retrieving info from static files. Will the Assistant plug-and-play app be able to query GraphQL in Appfarm, or will it only have access to uploaded files?

The answers given by the Assistant I'm building depend on rapidly changing data in our databases, so uploading files that change very often is not a solution.

For example, in the Appfarm Showroom, the personal HR Assistant could answer questions about remaining vacation days by querying a database.

Do you have any plans on implementing something like this?

Hi!

This app in Farmers Market is a general-purpose app using the full spectrum of the OpenAI Assistants API, but with manual upload of static files. It is not a service or SaaS we plan to expand with many more use cases, but maybe a few.

You may add this app to your solution and change and expand it the way you want (no copyright or anything from our side on anything). For example, you could let assistants stay up to date by generating JSON files from GraphQL queries (or other external sources) regularly with a service schedule and uploading the new file to the assistant in OpenAI. There are a lot of great use cases and possibilities for such automation!
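
As an illustration, such a scheduled export could look roughly like this. The GraphQL endpoint, token, and query are placeholders, and how the uploaded file is then attached to the assistant depends on the Assistants API version you use (assistant files vs. vector stores), so treat it as a sketch.

```typescript
// Sketch of a scheduled export: run a GraphQL query, package the result as JSON,
// and upload it to OpenAI with purpose "assistants". GRAPHQL_URL, APPFARM_TOKEN
// and the example query are placeholders.
async function exportAndUpload(): Promise<void> {
  // 1. Fetch fresh data from the GraphQL endpoint
  const gqlResponse = await fetch(process.env.GRAPHQL_URL ?? "", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.APPFARM_TOKEN}`,
    },
    body: JSON.stringify({ query: "{ employees { name remainingVacationDays } }" }),
  });
  const data = await gqlResponse.json();

  // 2. Upload the result as a JSON file to OpenAI
  const form = new FormData();
  form.append("purpose", "assistants");
  form.append("file", new Blob([JSON.stringify(data)], { type: "application/json" }), "export.json");

  const upload = await fetch("https://api.openai.com/v1/files", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    body: form,
  });
  const file = await upload.json();

  // 3. Attach file.id to the assistant (the exact call depends on the API version)
  console.log("Uploaded file", file.id);
}
```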