Direct Assistants Integration
# šŸ‘€feature-requests
Hello @aloof-traffic-22451, I hope this message finds you well. Last week, during your call, I raised the idea of creating a direct integration between Botpress and OpenAI Assistants. I’ve developed this idea further and would love to hear your thoughts on it. I am a big fan of Botpress and its capabilities. For me, your platform acts as the powerful engine behind large language models, enabling them to operate intelligently and at scale for businesses.

After testing Botpress with real ā€œnormieā€ users, I have gathered some insights. I have built Botpress-based bots with no direct OpenAI integration, and I have also built bots where I simply connected OpenAI Assistants to WhatsApp. Surprisingly, I received amazing feedback from users about those assistants, saying they are really smart and give them a real ā€œChatGPTā€ experience. However, these assistants lacked data handling and didn’t use any Botpress cards, just basic templates.

The bots built purely in Botpress were more intelligent from a company perspective, but users found the experience fragmented and reminiscent of outdated chatbots. The challenge lies in Botpress users having to manage conversation history and bot responses manually (in every module, every card), which leads to less authentic interactions, as the bot often regurgitates information from transcripts and knowledge bases. Given this, I propose creating a direct OpenAI integration and letting OpenAI handle the conversation experience and the major prompts, while Botpress mirrors the conversation and focuses on building the functions called by the assistant. This approach could significantly lighten the load for Botpress users and enhance the end-user experience.
Here’s how the integration could work:
1. User message -> Botpress processes the intent and enriches the conversation with data -> sends the data to OpenAI -> OpenAI generates a response and handles the history and RAG -> Botpress retrieves the history and manages the underlying variables and business intelligence.
2. Alternatively: user message -> OpenAI Assistants API -> the assistant calls a function hosted in Botpress -> Botpress sends data back to OpenAI -> OpenAI replies to the user -> Botpress processes the conversation history, transforms it into structured data, and handles business intelligence (see the sketch below).
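To make option 2 a bit more concrete, here is a rough sketch of the glue code in TypeScript, using the OpenAI Node SDK’s Assistants (beta) endpoints. The Botpress side is reduced to a placeholder webhook URL standing in for a function hosted in Botpress; the URL, payload shape, and environment variable names are my own assumptions for illustration, not an existing Botpress API.

```ts
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Placeholders for illustration only: the assistant ID and the webhook URL
// that reaches a function hosted in Botpress are assumptions, not real endpoints.
const ASSISTANT_ID = process.env.OPENAI_ASSISTANT_ID!;
const BOTPRESS_FUNCTION_URL = "https://example.com/botpress-function-webhook";

// Option 2 flow: user message -> OpenAI Assistant -> function call fulfilled
// by Botpress -> tool output back to OpenAI -> final reply to the user.
export async function handleUserMessage(userMessage: string): Promise<string> {
  // The thread keeps the conversation history on OpenAI's side.
  const thread = await openai.beta.threads.create();
  await openai.beta.threads.messages.create(thread.id, {
    role: "user",
    content: userMessage,
  });

  let run = await openai.beta.threads.runs.create(thread.id, {
    assistant_id: ASSISTANT_ID,
  });

  // Poll the run; when the assistant requests a function call, forward it to
  // Botpress and submit the result back as a tool output.
  while (run.status !== "completed") {
    if (
      run.status === "requires_action" &&
      run.required_action?.type === "submit_tool_outputs"
    ) {
      const toolOutputs = await Promise.all(
        run.required_action.submit_tool_outputs.tool_calls.map(async (call) => {
          const result = await fetch(BOTPRESS_FUNCTION_URL, {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({
              name: call.function.name,
              arguments: JSON.parse(call.function.arguments),
            }),
          }).then((res) => res.json());
          return { tool_call_id: call.id, output: JSON.stringify(result) };
        })
      );
      run = await openai.beta.threads.runs.submitToolOutputs(thread.id, run.id, {
        tool_outputs: toolOutputs,
      });
    } else if (run.status === "queued" || run.status === "in_progress") {
      await new Promise((resolve) => setTimeout(resolve, 500));
      run = await openai.beta.threads.runs.retrieve(thread.id, run.id);
    } else {
      throw new Error(`Run ended with status ${run.status}`);
    }
  }

  // The latest assistant message is the reply to send back to the channel;
  // Botpress could also pull the full thread here for BI / structured data.
  const messages = await openai.beta.threads.messages.list(thread.id, {
    order: "desc",
    limit: 1,
  });
  const textPart = messages.data[0]?.content.find((c) => c.type === "text");
  return textPart && textPart.type === "text" ? textPart.text.value : "";
}
```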

Combining OpenAI’s function calling with Botpress intents could be extremely powerful. While this might slightly shift how Botpress is used for some users, it wouldn’t stop anyone from continuing to use the platform as usual, and it wouldn’t prevent Botpress from processing AI requests within the platform (AI Task, AI intents and other cards). Instead, OpenAI would handle crafting the final responses and maintaining conversational continuity, while Botpress excels at integrating LLMs with business workflows.
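As a purely illustrative example of that combination, an intent such as booking an appointment could be registered as a function tool on the assistant, so OpenAI decides when to call it and Botpress only has to fulfil it. The `book_appointment` name, its parameters, and the model choice below are made-up assumptions, not anything that exists in Botpress today:

```ts
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function createAssistant() {
  // Hypothetical example: expose a Botpress "book appointment" intent/workflow
  // to the assistant as a function tool that Botpress would fulfil.
  const assistant = await openai.beta.assistants.create({
    model: "gpt-4o",
    name: "Support Assistant",
    instructions:
      "You are a support assistant. Use the provided functions to act on the user's behalf.",
    tools: [
      {
        type: "function",
        function: {
          name: "book_appointment", // fulfilled by a Botpress workflow
          description: "Book an appointment for the user",
          parameters: {
            type: "object",
            properties: {
              date: { type: "string", description: "ISO 8601 date, e.g. 2024-07-01" },
              service: { type: "string", description: "Requested service" },
            },
            required: ["date", "service"],
          },
        },
      },
    ],
  });

  console.log("Assistant created:", assistant.id);
}

createAssistant();
```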
 
I understand this might not be an easy implementation, and I haven’t provided a complete solution. However, I believe this approach could be a game changer for conversation design and alleviate much of the current workload. It could significantly enhance the intelligence of bot responses and the user experience. Although the impact might not seem like much, tweaking the AI conversation cards to give users better and more intelligent responses is where I currently spend most of my time inside Botpress, and I’m still not completely satisfied with the results. By contrast, setting up an assistant was quick and easy. What are your thoughts on this?