# 🤝help
Is it possible, or is there a way, to have a client talk directly to ChatGPT without having to provide input and output examples in the prompt? The problem with examples is that the answers aren't unique or authentic for every loop = every user message, the bot keeps printing the same structure and predefined message

Hey @agreeable-kilobyte-11082 there's a chatGPT template in the studio you can check out. If you go to "explore bot templates" there's a template for "ChatGPT + Plugins."

This template has a "ChatGPT Clone" that you can draw inspiration from

Does this work correctly? I actually have a huge prompt; the problem is the "input and output example" fields. If I leave them blank, the task doesn't respond (or responds weirdly), and if I give an example, the task always prints the same type of structure/answer based on the example output
Will check it out later tho
there's a 4000 character limit for prompts right now, and that includes any examples and inputs you give it. So if the prompt with everything is more than 4000 characters the AI task will be skipped (and there should be something about it in the logs)
Yeah, all good with that. I was just saying that if you give an example output, the bot will only print in that kind of structure, and that specific output is correlated with the example input. Whatever the user's new question is, the bot reuses the same structure as the example output and isn't dynamic enough, even though that structure only fit one particular example input. For example, for this loop I'd like to do post-operation emotional support for patients who've just been operated on, and each input should be treated differently, not with the same structure and the same number of messages every time, if you see what I mean. We can't have the same kind of discussion as talking directly to ChatGPT if we have to prompt an example output, because the bot won't use that example to learn a logic or way of answering, it will model every answer on that structure
Are you sure that a single AI task is the best way to do this? You might have better success breaking it up into multiple steps, where one AI task classifies the user's message as needing emotional support or not, and then those options each go to a separate capture card/AI task combo.
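That classify-then-route step could be sketched roughly like this. This is just an illustrative sketch, not Botpress's actual API: the prompt text, the `parseLabel` helper, and the node names are all hypothetical, and the classifier call itself would be whatever AI task or API call you already use.

```javascript
// Hypothetical classifier prompt: ask the model for exactly one word,
// then route the flow based on the label it returns.
const CLASSIFY_PROMPT =
  "Reply with exactly one word, SUPPORT or OTHER: " +
  "does this message need emotional support?";

// Normalize the model's raw reply into one of the two route labels.
// Models often add whitespace or punctuation, so parse defensively.
function parseLabel(raw) {
  const word = raw.trim().toUpperCase();
  return word.startsWith("SUPPORT") ? "SUPPORT" : "OTHER";
}

// Each label jumps to its own capture card / AI task combo in the flow.
// Node names here are made up for the example.
function nextNode(label) {
  return label === "SUPPORT" ? "support-task" : "general-task";
}
```

The point is that the fragile part (the model's free-form reply) is reduced to a single word before anything depends on it, so each branch can have its own prompt and its own output style.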
Already doing this in the process, but at some point I'd need an infinite conversation providing emotional support
That breaks with a "STOP" intent
I have no idea how best to do it, except using an external API: prompt ChatGPT outside of Botpress and then feed that answer back
But I'm not doing that
For the moment I just prompt as much as I can and my result is OK
But you should add an option to leave the input and output example empty, so we can just synchronize with the OpenAI API
Like using the ChatGPT UI with the same memory, to have a real conversation
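For reference, the "same memory as the ChatGPT UI" behavior is just resending the whole message history on every call. A minimal sketch of that, assuming the public OpenAI chat completions endpoint (the system prompt text and function names here are made up for the example):

```javascript
// Hypothetical system prompt for this use case.
const SYSTEM_PROMPT =
  "You are a warm post-operative emotional support assistant.";

// Rebuild the full message list each turn: memory is just the history
// array, resent in full, exactly like the ChatGPT UI does internally.
function buildMessages(history, userMsg) {
  return [
    { role: "system", content: SYSTEM_PROMPT },
    ...history,
    { role: "user", content: userMsg },
  ];
}

// Sketch of the API call itself (requires Node 18+ for global fetch).
async function chat(history, userMsg, apiKey) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: buildMessages(history, userMsg),
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because there are no input/output examples in the prompt, each reply is shaped only by the system prompt and the conversation so far, which is the dynamic behavior being asked for here.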
That ChatGPT Clone template will also go infinite, and it listens for specific intents, so I don't think there's a need to use the API directly. But is the issue that you want the output in a specific format? You could also try using two AI Tasks: one to write the support message, and another to format it correctly
Could we request that you guys add a character counter to this card?
sure, you can make a post on #1111026806254993459
Can you help with how to print correctly in the AI task @acceptable-kangaroo-64719

It's always random: sometimes it prints well, sometimes not
I'm using "[...]" for bold and \n\n for line breaks
** for bold
you can press enter a few times for newlines if you're typing the text directly. \n only works if your bot is speaking a variable (I think)