Bot pulling information that does not even exist i...
# šŸ¤help
e
So my bot is answering questions from nowhere; the information it provides is not in my KB. I only have one plain-text KB, and it's shown in full in the picture below. Check the highlighted message in the emulator. Why does it say that? It's not even in the KB.

https://cdn.discordapp.com/attachments/1135264361196302437/1135264361372471478/IMG_4135.jpg

@famous-jewelry-85388
h
Maybe you need to tell it not to make things up. I've also reported that my bot answers the question "why is grass green", which is obviously not in the KB.
e
Did exactly that. I told it to maintain honesty, but it still gives the same answers.
c
Check the debug log and it will show where it found the info in the KB, along with the offset (which I think might be words/tokens from the start of the KB file). You can also add a negative phrase to your KB, e.g. "No, you can't talk to a human, but you can leave a message", which can then trigger some other automation (an email or similar).
e
This is what it shows. The original payload doesn't even exist in my KB, nor does anything related to it. I have no idea where it's getting that info. I've shown my entire KB in the original photo posted in this forum.

https://cdn.discordapp.com/attachments/1135264361196302437/1135279523710128198/IMG_4136.jpg

c
I mean this part of the logs.

https://cdn.discordapp.com/attachments/1135264361196302437/1135280572885901463/image.png

e
I received `offset: 74`. What does that indicate?
c
I think it means roughly 74 words/tokens into the KB, but the Botpress team would need to clarify that.
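If the offset really is a word count from the start of the KB file, a minimal sketch of mapping it back to a passage might look like this. This is a hypothetical interpretation for illustration, not confirmed Botpress behavior, and `passage_at_word_offset` is a made-up helper name:

```python
# Hypothetical: if the debug log's "offset" counts words from the start
# of the KB file, this locates the passage the offset points at.

def passage_at_word_offset(kb_text: str, offset: int, window: int = 10) -> str:
    """Return roughly `window` words starting at word index `offset`."""
    words = kb_text.split()
    if offset >= len(words):
        return ""  # offset beyond the end of the KB: nothing to show
    return " ".join(words[offset:offset + window])

kb = " ".join(f"word{i}" for i in range(100))
print(passage_at_word_offset(kb, 74, 3))  # -> "word74 word75 word76"
```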
e
@early-train-33247 do you know why this happens?
c
My only other thought is that the Knowledge Agent has access to the transcript, and talking to support was mentioned earlier in the same conversation. You can remove the transcript from the Agent and give it just the question.
e
It wouldn't do that if I started as a new user though, would it? But it still does.
c
Right, if you started as a new user, then the transcript can't be the problem.
e
Any thoughts on what could be causing the problem? @early-train-33247
e
Hey folks @here, there is currently a problem with GPT hallucinating some responses. These answers are coming directly from the GPT servers; it's not a problem with your KB!
We are working on a fix, and it's high priority! I'll let you know once it's done!
e
Ok, but the information it randomly pulled came from the KB, not an AI Task. Does the KB use an AI Task to transmit the info? Is that what you're saying, and is that where the problem is occurring?
e
No, that extra unwanted information is introduced when we send the question + KB content to GPT. Something in the prompt allows the server to come back with an answer that is not in the KB content.
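For anyone building something similar, the flow described above (question + KB content sent to GPT) can be sketched roughly like this. The prompt wording and the `build_grounded_prompt` function are illustrative assumptions, not Botpress's actual code; the grounding instruction is the part that, if too weak, lets the model answer from its own training data instead of the KB:

```python
# Illustrative sketch of a "question + KB content -> GPT" prompt.
# A strict grounding instruction reduces (but does not eliminate)
# answers that come from the model rather than the KB.

def build_grounded_prompt(question: str, kb_content: str) -> str:
    return (
        "Answer the question using ONLY the knowledge base below.\n"
        "If the answer is not in the knowledge base, reply exactly: "
        '"I don\'t know."\n\n'
        f"--- KNOWLEDGE BASE ---\n{kb_content}\n--- END ---\n\n"
        f"Question: {question}\nAnswer:"
    )

prompt = build_grounded_prompt("Why is grass green?", "Our store opens at 9am.")
print(prompt)
```

The resulting string would then be sent to the chat-completion endpoint; the hallucination problem in this thread is exactly the case where the model ignores such an instruction.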