Support for multiple LLM models
# 👀feature-requests
Hi, I think that supporting multiple models (not only GPT) would be a really good thing (for example, when specific models are needed). Right now this can be done with custom code, but it would be cool to add other models like any other type of integration.
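For reference, here's roughly what that custom-code workaround looks like today. This is only a sketch, not an official Botpress pattern: it assumes the snippet runs somewhere with Node 18+ `fetch` (e.g. an Execute Code card), and `ANTHROPIC_API_KEY`, `userMessage`, and the model name are placeholders you'd swap for your own setup.

```ts
// Sketch: calling a non-OpenAI model (Anthropic's Claude) from custom code.
// Endpoint, headers, and body shape follow Anthropic's public Messages API.
const ANTHROPIC_API_KEY = 'sk-ant-...' // placeholder: store this securely, not inline
const userMessage = 'Summarize our refund policy in two sentences.' // placeholder input

const response = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'x-api-key': ANTHROPIC_API_KEY,
    'anthropic-version': '2023-06-01',
    'content-type': 'application/json',
  },
  body: JSON.stringify({
    model: 'claude-3-sonnet-20240229', // placeholder: any model your key has access to
    max_tokens: 512,
    messages: [{ role: 'user', content: userMessage }],
  }),
})

const data = await response.json()
// Claude returns an array of content blocks; take the first text block.
const claudeAnswer = data.content?.[0]?.text ?? ''
// In a bot you'd assign this to a workflow variable instead of logging it.
console.log(claudeAnswer)
```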
Hey @powerful-parrot-56182 thanks for your suggestion!
This is coming soon!
Hey @early-train-33247 any update on this?? I'm very curious to know how you're planning on implementing this as well
Hello! I don't think they have a timeline yet, but when other models are added, they'll be available in the AI cards settings.
Thanks 😉 @early-train-33247
@bumpy-butcher-41910 @crooked-van-25152 are you able to share any insights here, even really high level, as to whether it's on the roadmap for 2024? I totally understand this is not an easy feature request 🙂 but given the arms race going on between all these different models, it feels like this should be a pretty high priority for Botpress, even if it's a more advanced feature to begin with. For example, Anthropic just released 3 new Claude models that they claim outperform GPT-4 Turbo on nearly every level, so it would be great to be able to test those alongside OpenAI models inside Botpress. Even more importantly, I believe more and more people are going to train their own open-source models this year and would prefer to use those inside Botpress.
Would be great to know what your team is thinking here as we continue to plan out our roadmap.