Not really. If it's a web search source, we search Bing and return the top N results. Those results are then fed, along with the original question, the chat history, and a standard prompt, to gpt-3.5-turbo, and an answer (if any) pops out.
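Just to make that flow concrete, here's a rough sketch in TypeScript using the OpenAI SDK and the Bing Web Search v7 API. This is my own approximation of the pipeline described above, not Botpress's actual code; the env var names and the prompt wording are made up:

```ts
// Sketch of the web-search flow (illustrative, not Botpress internals).
// Assumes the OpenAI Node SDK and Bing Web Search v7; BING_KEY and
// OPENAI_API_KEY are hypothetical env var names.
import OpenAI from "openai";

const openai = new OpenAI();

// Fetch the top N results from Bing for the user's question.
async function searchBing(query: string, topN = 5): Promise<string[]> {
  const res = await fetch(
    `https://api.bing.microsoft.com/v7.0/search?q=${encodeURIComponent(query)}&count=${topN}`,
    { headers: { "Ocp-Apim-Subscription-Key": process.env.BING_KEY! } }
  );
  const data = await res.json();
  // Keep just the title + snippet of each result as context.
  return (data.webPages?.value ?? []).map(
    (r: { name: string; snippet: string }) => `${r.name}: ${r.snippet}`
  );
}

// Feed the results, the question, and the chat history to gpt-3.5-turbo
// behind a standard instruction prompt.
async function answerFromWeb(
  question: string,
  history: { role: "user" | "assistant"; content: string }[]
): Promise<string> {
  const results = await searchBing(question);
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content:
          "Answer the user's question using only the search results below. " +
          "If they don't contain the answer, say you don't know.\n\n" +
          results.join("\n"),
      },
      ...history,
      { role: "user", content: question },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```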
If the source is text or documents, they are chunked, embedded, and stored in a pgvector database. Then the user question is also embedded, we find the 5-10 most similar chunks, and those chunks are fed, along with the original question, the chat history, and a standard prompt, to gpt-3.5-turbo, and an answer (if any) pops out.
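The document flow is the same idea with a retrieval step in front. Here's an approximate sketch using the OpenAI embeddings API and a pgvector table; the `kb_chunks` table, its columns, and the prompt are my own assumptions, not the real Botpress schema:

```ts
// Sketch of the document flow: chunks get embedded and stored in pgvector,
// the question gets embedded, and the nearest chunks become the context.
import OpenAI from "openai";
import { Pool } from "pg";

const openai = new OpenAI();
const pool = new Pool(); // connection settings come from the usual PG* env vars

async function embed(text: string): Promise<number[]> {
  const res = await openai.embeddings.create({
    model: "text-embedding-ada-002",
    input: text,
  });
  return res.data[0].embedding;
}

// Store one chunk of a document together with its embedding.
async function storeChunk(content: string): Promise<void> {
  const embedding = await embed(content);
  await pool.query(
    "INSERT INTO kb_chunks (content, embedding) VALUES ($1, $2)",
    [content, `[${embedding.join(",")}]`] // pgvector accepts the '[...]' literal
  );
}

// Embed the question, pull the most similar chunks, and ask gpt-3.5-turbo.
async function answerFromDocs(
  question: string,
  history: { role: "user" | "assistant"; content: string }[],
  topK = 5
): Promise<string> {
  const qEmbedding = await embed(question);
  const { rows } = await pool.query(
    "SELECT content FROM kb_chunks ORDER BY embedding <=> $1 LIMIT $2",
    [`[${qEmbedding.join(",")}]`, topK]
  );
  const context = rows.map((r) => r.content).join("\n---\n");

  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content:
          "Answer using only the document excerpts below. " +
          "If they don't contain the answer, say so.\n\n" + context,
      },
      ...history,
      { role: "user", content: question },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```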
AI Tasks are the closest thing in Botpress to prompting GPT directly. Whatever you write in the AI Task is packaged up with its inputs, outputs, and their respective schemas, wrapped in a standard prompt, and sent to GPT.
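In other words, something along these lines. This is only an illustration of the packaging, not the actual prompt Botpress builds; the `AiTask` shape here is made up:

```ts
// Rough sketch of how an AI Task's instructions plus its input/output schemas
// could be assembled into a single prompt and sent to gpt-3.5-turbo.
import OpenAI from "openai";

const openai = new OpenAI();

interface AiTask {
  instructions: string;                 // what you type into the AI Task card
  inputs: Record<string, unknown>;      // variables passed in from the workflow
  outputSchema: Record<string, string>; // e.g. { sentiment: "string" }
}

async function runAiTask(task: AiTask): Promise<Record<string, unknown>> {
  const prompt = [
    "You are completing a task inside a chatbot workflow.",
    `Task instructions: ${task.instructions}`,
    `Inputs: ${JSON.stringify(task.inputs)}`,
    `Respond with JSON matching this schema: ${JSON.stringify(task.outputSchema)}`,
  ].join("\n\n");

  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
  });

  // The reply is expected to be JSON that fits the declared output schema.
  return JSON.parse(completion.choices[0].message.content ?? "{}");
}
```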