Gotcha. There aren't any controls for the knowledge agent's sensitivity or temperature, so it isn't easy to do things like raise the minimum confidence a fact needs before it's selected. The two workarounds I can think of are:
1. Feed the KB answer through an AI task first and ask the AI to filter out anything that isn't strictly relevant to the user's question
2. Add explicit out-of-scope entries to your knowledge base (e.g. "I don't know the history of karate") to try to get ahead of off-topic questions
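To make workaround 1 concrete, here's a minimal sketch of the filtering idea. In practice you'd prompt the AI task with something like "return only the parts of this KB answer that are strictly relevant to the user's question"; the toy function below stands in for that prompt with a crude keyword-overlap check (the function name, stopword list, and threshold are all illustrative, not part of any platform API):

```python
def relevant_sentences(question, kb_answer, min_overlap=1):
    """Crude stand-in for the AI-task relevance filter: keep only KB
    sentences that share at least `min_overlap` content words with the
    question. A real implementation would prompt the model instead."""
    stop = {"the", "a", "an", "is", "are", "of", "to", "in", "what", "how", "do", "i"}
    q_words = {w.strip(".,?!$").lower() for w in question.split()} - stop
    kept = []
    for sentence in kb_answer.split(". "):
        s_words = {w.strip(".,?!$").lower() for w in sentence.split()} - stop
        if len(q_words & s_words) >= min_overlap:
            kept.append(sentence)
    return kept

question = "How much do karate classes cost?"
kb_answer = ("Karate classes cost $50 per month. "
             "Our studio opened in 1998. "
             "Judo lessons are on Tuesdays.")
print(relevant_sentences(question, kb_answer))
# Only the pricing sentence survives; the unrelated history and judo
# sentences are dropped before the reply reaches the user.
```

The same shape works with an AI task in the middle: pass the question and the raw KB answer in, and only send the filtered result back to the user.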
However, I think a lot of this can be avoided with good old-fashioned preparation. A lot of people want a chatbot, but they don't know exactly what the chatbot should do. If you have a good understanding of your users (and data to back that up!), you can design a well-tailored chatbot that fits their niche and rarely runs into problems like this.