# 🌎general


07/05/2023, 6:25 PM
I have a local knowledge base and a fallback to an AI task that calls GPT. This is mostly working now, but I just realized something that will be tricky to replicate: the way ChatGPT takes a series of questions in context. Yes, I can send the chat transcript with each request, but in my overall flow the bot always tries to answer from the local KB first.

So, for example, say the AI task gives me a list of three restaurants and I say "tell me more about the second one." If the knowledge agent, using just "tell me more about the second one" as the prompt, finds something in the local KB that it thinks is a good match, the answer will be wrong, because the KB never saw the restaurant list. And it would be very tedious to try to control which source the bot goes to. I can't just force the user into a loop on the fallback GPT node, because they may have a question that legitimately should be answered from the local KB. Any thoughts on how to manage/handle this?
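The flow described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the actual bot: `kb_lookup` and `gpt_fallback` are stand-in names for the knowledge agent and the GPT task, and the toy keyword-match KB is only there to make the routing (and the point where context gets lost) concrete.

```python
from typing import List, Optional

def kb_lookup(question: str) -> Optional[str]:
    """Toy stand-in for the local knowledge base: naive keyword match."""
    kb = {
        "hours": "Open 9am-5pm, Monday through Friday.",
        "restaurants": "Nearby options: three local restaurants.",
    }
    for keyword, answer in kb.items():
        if keyword in question.lower():
            return answer
    return None

def gpt_fallback(question: str, transcript: List[str]) -> str:
    """Stand-in for the GPT task; a real bot would call the API here,
    sending the transcript along as context."""
    return f"[GPT answer to {question!r} using {len(transcript)} prior turns]"

def route(question: str, transcript: List[str]) -> str:
    # The knowledge agent sees only the raw follow-up ("tell me more
    # about the second one") with none of the transcript -- this is the
    # point where the conversational context is lost. If the KB happens
    # to match something here, the bad answer wins and GPT is never asked.
    local = kb_lookup(question)
    if local is not None:
        return local
    return gpt_fallback(question, transcript)
```

For example, `route("what are your hours?", [])` is answered from the toy KB, while `route("tell me more about the second one", history)` falls through to the GPT stand-in only because no KB keyword happens to match; the failure mode in the message is precisely when the KB *does* match such a follow-up.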