Knowledge Bases with Documents Is Bugged and Provi...
# 🤝help
s
https://www.loom.com/share/bc785e8ba169480392fb56bc7e2ca66c?sid=7a744eab-48d0-4ccd-bf3b-a79150029bcb See video for demonstration.

Problem: The knowledge base answers differently for documents compared to static text. When responding from docs, it is much more likely to be wrong. I'm positive that the way documents are being pulled in must be messing with the text.

Example: The video shows a KB set up with both a text doc and a static text entry containing the exact same text. I disable the KBs to test them individually.

Real answer from the text: "The Status Effect Cleanse is a Unique Type Effect with a Stack Limit of 3 that does the following: Spend to remove one status effect token: A player with this token may spend it at any time to remove a single status effect token from themselves. Cleanse belongs to the following characters: Monk, Seraph"

When using the text doc
User question: What does the status effect cleanse do?
Bot answer: The status effect "Cleanse" allows a player to remove a specific number of other status effects from themselves or another player. When using cards or abilities that remove a specific number of status effects, the player can remove that many tokens of status effects.

When using static text
User question: What does the status effect cleanse do?
Bot answer: The status effect Cleanse is a Unique Type Effect with a Stack Limit of 3 that allows a player to spend the token at any time to remove a single status effect token from themselves.

As you can see, even with the same text, the KB hallucinates a wrong answer only when using docs. This makes it very hard to trust the Knowledge Agent.
I've determined this has nothing to do with the type of document; it has to do with how many characters are being used from the KB. You never get this issue if you ask a question about something from early in the document/text. There is a point in the text, whether it's static text or from a doc, where the knowledge agent is no longer getting fed the info.
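A rough way to picture what I suspect is happening, with a made-up character cap (I obviously don't know the actual implementation, this is just the shape of the problem):

```python
# Purely hypothetical illustration -- the real limit (if any) and how the
# content is assembled are unknown to me.
MAX_KB_CHARS = 4000  # made-up cap

def build_context(kb_text: str, limit: int = MAX_KB_CHARS) -> str:
    """Naive truncation: only the first `limit` characters reach the agent."""
    return kb_text[:limit]

kb_text = (
    "... early rules text ... " * 300
    + "\n\nCleanse: Unique Type Effect, Stack Limit 3, spend to remove one status effect token."
)
context = build_context(kb_text)

# Anything early in the document survives; the Cleanse section near the end
# falls past the cutoff, so the agent has to guess.
print("Cleanse" in context)  # False -> the agent never sees this section
```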
@crooked-van-25152 Is there somewhere we can post bugs? This isn't really so much of a help question. It seems much more like a bug.
c
Hi @straight-wolf-37371 Thank you so much for all the details you provided. This is super helpful! We are beyond lucky to have you as part of the Community.
I have flagged the issue to the team, but adding two of my colleagues here for visibility @early-train-33247 @witty-football-93730 !
We currently don't have a channel specifically dedicated to bugs, as managing it could be a bit hectic. Sometimes people report issues that may not necessarily be bugs. So, if you happen to notice a bug, please tag me, and I'll be more than happy to flag the issue and direct you to the appropriate individuals.
s
Thank you! I fully understand that. I can only imagine what gets reported lol. I'll only tag you if I'm pretty confident it's a bug and I can provide a way to replicate it. If y'all want any specific information, just let me know. Thanks for the support 🙂
c
You're the bestttt! 🥹
s
https://www.loom.com/share/483591b2152e45059a28de4dd644ce1a?sid=9ea4f4e0-4666-47a9-9ad0-eb3e02c99121 @early-train-33247 @witty-football-93730 Watch this video for a clearer explanation showing exactly what I think is happening.
w
Hey 👋 Thank you for this detailed report. We use a different "indexing" technique for static text and documents. For static text, we do truncate the content in some cases. @User Can you try something for me: add some double line breaks where it makes sense (my guess is the more the better) in your large texts and try again. Let me know if that partially fixes your issue.
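For intuition, here's a minimal sketch of the kind of paragraph-level splitting I'm describing (an assumption for illustration, not necessarily our exact pipeline):

```python
def chunk_on_blank_lines(text: str, max_chars: int = 1000) -> list[str]:
    """Split text at double line breaks, packing paragraphs into chunks
    that stay under a rough character budget. Illustrative only."""
    chunks, current = [], ""
    for paragraph in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would exceed the budget.
        if current and len(current) + len(paragraph) > max_chars:
            chunks.append(current.strip())
            current = ""
        current += paragraph + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks
```

The idea is that with more double line breaks, each chunk ends up as a small, self-contained piece, so a relevant section is less likely to be cut mid-sentence or dropped entirely.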
s
Thanks for the reply! I tried adding double line breaks throughout, but after testing it, that did not fix the issue. I'm able to split it into 2 separate KBs and that works, but a single KB with multiple double line breaks does not.
w
Thanks for the feedback. FYI, I added an issue for the team to complete in upcoming sprints. Meanwhile, I'm glad you found a workaround.
s
Great! Thanks!!
c
Good job @straight-wolf-37371
w
Hey @straight-wolf-37371 We're getting close to shipping a fix for your issue. Just wanted you to know we didn't forget you 🙂
s
Thank ya sir!!!