fresh-fireman-491
03/22/2024, 8:01 AM
```javascript
// Botpress Execute Code card: `axios` and `workflow` are provided as globals.
const GROQ_API_KEY = 'API Key'; // replace with your Groq API key

const data = {
  messages: [
    {
      role: 'user',
      content: 'Explain the importance of low latency LLMs',
    },
  ],
  model: 'mixtral-8x7b-32768',
};

async function getGroqMessage() {
  try {
    const response = await axios.post('https://api.groq.com/openai/v1/chat/completions', data, {
      headers: {
        Authorization: `Bearer ${GROQ_API_KEY}`,
        'Content-Type': 'application/json',
      },
    });
    // Store the reply on the workflow so later nodes can use it.
    workflow.groqMessage = response.data.choices[0].message.content;
    return workflow.groqMessage;
  } catch (error) {
    console.error(error);
    throw error;
  }
}

const message = await getGroqMessage();
console.log('Groq message:', message);
```
[Docs](https://console.groq.com/docs/quickstart)
[Models you can use](https://console.groq.com/docs/models)
[API Key](https://console.groq.com/keys)
This is not optimized at all, just what I could quickly put together while in class.
Planning on updating this, and the [Claude-3 tutorial](https://discord.com/channels/1108396290624213082/1214664239890047106), this weekend.
gentle-engine-13076
03/22/2024, 9:16 AM
wide-oyster-38514
03/22/2024, 9:17 AM
wide-oyster-38514
03/22/2024, 9:17 AM
quick-musician-29561
03/22/2024, 9:20 AM
quick-musician-29561
03/22/2024, 9:22 AM
gentle-engine-13076
03/22/2024, 9:26 AM
```javascript
// Same Groq call as above, but the key comes from a Botpress environment
// variable and the prompt comes from the incoming event.
const GROQ_API_KEY = env.GROQ_API_KEY;

const data = {
  messages: [
    {
      role: 'user',
      content: `${event.preview}`,
    },
  ],
  model: 'mixtral-8x7b-32768',
};

async function getGroqMessage() {
  try {
    const response = await axios.post('https://api.groq.com/openai/v1/chat/completions', data, {
      headers: {
        Authorization: `Bearer ${GROQ_API_KEY}`,
        'Content-Type': 'application/json',
      },
    });
    workflow.groqMessage = response.data.choices[0].message.content;
    return workflow.groqMessage;
  } catch (error) {
    console.error(error);
    throw error;
  }
}

const message = await getGroqMessage();
console.log('Groq message:', message);
```
gentle-engine-13076
03/22/2024, 9:26 AM
gentle-engine-13076
03/22/2024, 9:28 AM
llama2-70b-4096

Model Name: Mixtral-8x7b-Instruct-v0.1
Context Window: 32,768 tokens
API String: mixtral-8x7b-32768

Model Name: Gemma-7b-it
Context Window: 8,192 tokens
API String: gemma-7b-it
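Note: the API strings above are what you pass as the `model` field in the request body. As a rough sketch (the `pickModel` helper and the 4-characters-per-token estimate below are illustrative, not from the chat), you could pick the smallest model whose context window fits your prompt:

```javascript
// Model catalog from the list above: API string -> context window in tokens.
const GROQ_MODELS = {
  'llama2-70b-4096': 4096,
  'mixtral-8x7b-32768': 32768,
  'gemma-7b-it': 8192,
};

// Pick the smallest model whose context window fits prompt + expected reply.
// Assumes roughly 4 characters per token, which is only an approximation.
function pickModel(promptChars, replyTokens = 512) {
  const neededTokens = Math.ceil(promptChars / 4) + replyTokens;
  const candidates = Object.entries(GROQ_MODELS)
    .filter(([, contextWindow]) => contextWindow >= neededTokens)
    .sort((a, b) => a[1] - b[1]); // smallest window first
  return candidates.length > 0 ? candidates[0][0] : null;
}

console.log(pickModel(2000));   // short prompt fits the smallest window
console.log(pickModel(100000)); // long prompt needs the 32k window
```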
fresh-fireman-491
03/22/2024, 9:31 AM
fresh-fireman-491
03/22/2024, 9:35 AM
gentle-engine-13076
03/22/2024, 9:35 AM
fresh-fireman-491
03/22/2024, 9:35 AM
fresh-fireman-491
03/22/2024, 9:35 AM
gentle-engine-13076
03/22/2024, 9:35 AM
fresh-fireman-491
03/22/2024, 9:36 AM
fresh-fireman-491
03/22/2024, 9:36 AM
fresh-fireman-491
03/22/2024, 9:37 AM
quick-musician-29561
03/22/2024, 9:59 AM
quick-musician-29561
03/22/2024, 10:01 AM
glamorous-guitar-39983
03/22/2024, 3:48 PM
crooked-van-25152
03/22/2024, 3:49 PM
crooked-van-25152
03/22/2024, 3:50 PM
fresh-fireman-491
03/22/2024, 3:58 PM
fresh-fireman-491
03/22/2024, 3:58 PM
fresh-fireman-491
03/22/2024, 5:49 PM
fresh-fireman-491
03/22/2024, 5:49 PM
hundreds-battery-97158
03/22/2024, 8:17 PM
fresh-fireman-491
03/22/2024, 8:18 PM
agreeable-accountant-7248
03/22/2024, 9:34 PM
fresh-fireman-491
03/22/2024, 9:38 PM
agreeable-accountant-7248
03/22/2024, 9:43 PM
fresh-fireman-491
03/22/2024, 9:50 PM
agreeable-accountant-7248
03/22/2024, 10:00 PM
gentle-engine-13076
03/22/2024, 10:35 PM
quick-musician-29561
03/23/2024, 5:24 AM
fresh-fireman-491
03/23/2024, 6:09 AM
quick-musician-29561
03/23/2024, 6:48 AM
```json
{
  "name": "duckduckgo_search",
  "description": "Fetch news results and organic results",
  "parameters": {
    "type": "object",
    "properties": {
      "query": {
        "type": "string",
        "description": "Search query"
      }
    },
    "required": ["query"]
  }
}
```
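This schema tells the model which parameters the tool accepts. A minimal sketch of checking a tool call's arguments against the schema before running the search (the `validateArgs` helper is illustrative, not a Botpress or Groq API):

```javascript
// Validate tool-call arguments against a function schema of the shape above.
// Returns a list of error messages; an empty list means the call is valid.
function validateArgs(schema, args) {
  const errors = [];
  for (const name of schema.parameters.required || []) {
    if (!(name in args)) errors.push(`missing required parameter: ${name}`);
  }
  for (const [name, value] of Object.entries(args)) {
    const spec = schema.parameters.properties[name];
    if (!spec) errors.push(`unknown parameter: ${name}`);
    else if (spec.type === 'string' && typeof value !== 'string')
      errors.push(`${name} must be a string`);
  }
  return errors;
}

const searchSchema = {
  name: 'duckduckgo_search',
  parameters: {
    type: 'object',
    properties: { query: { type: 'string', description: 'Search query' } },
    required: ['query'],
  },
};

console.log(validateArgs(searchSchema, { query: 'latest AI news' })); // valid: []
console.log(validateArgs(searchSchema, {})); // reports the missing query
```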
https://discord.com/channels/1108396290624213082/1179076516169138196
Another way to get live data is to connect the chatbot to external APIs; we have examples of those in #1120796649686573086. I fetched data with another chatbot we have shared there by asking, 'What's the weather like in Stockholm today/next week?', and it always gives the correct answer using real-time data from the weather API.
https://discord.com/channels/1108396290624213082/1178590925421821982
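As a sketch of the weather lookup described above (assuming the free Open-Meteo API; the chatbot in the chat may use a different provider):

```javascript
// Build a request URL for Open-Meteo's forecast endpoint, which returns
// current weather for a latitude/longitude pair without an API key.
function buildWeatherUrl(latitude, longitude) {
  const params = new URLSearchParams({
    latitude: String(latitude),
    longitude: String(longitude),
    current_weather: 'true',
  });
  return `https://api.open-meteo.com/v1/forecast?${params}`;
}

// Stockholm is roughly 59.33 N, 18.07 E.
const weatherUrl = buildWeatherUrl(59.33, 18.07);
console.log(weatherUrl);

// In a Botpress Execute Code card you could then fetch it with the
// provided axios global, e.g.:
//   const { data } = await axios.get(weatherUrl);
//   workflow.weather = data.current_weather;
```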
A third way we've previously used to connect Botpress chatbots to real-time data is through Make.com. This works well if you have the data on a platform that Make.com can easily connect to, such as Google Sheets.
https://discord.com/channels/1108396290624213082/1217840226169262170
Using Botpress, the SERP API (a really good and customizable web search for 'news only', 'images', 'latest videos', etc.), other external APIs for live information, other LLMs, and Make.com together, we can automate and connect to almost any real-time data with Botpress chatbots. With these new tools @gentle-engine-13076 🦸♀️ @fresh-fireman-491 🦸♂️ @agreeable-accountant-7248 🦸🏻♂️ have shown us, our chatbots are once again going to be on another level 🚀
agreeable-accountant-7248
03/23/2024, 8:23 AM
agreeable-accountant-7248
03/23/2024, 8:25 AM
agreeable-accountant-7248
03/23/2024, 8:26 AM
agreeable-accountant-7248
03/23/2024, 8:33 AM
quick-musician-29561
03/23/2024, 8:44 AM
quick-musician-29561
03/23/2024, 8:48 AM
fresh-fireman-491
03/23/2024, 9:50 AM
agreeable-accountant-7248
03/23/2024, 11:01 AM
agreeable-accountant-7248
03/23/2024, 11:01 AM
fresh-fireman-491
03/23/2024, 11:20 AM
gentle-engine-13076
03/23/2024, 8:13 PM
agreeable-accountant-7248
03/24/2024, 8:55 PM
gentle-engine-13076
03/24/2024, 9:09 PM
gentle-engine-13076
03/24/2024, 9:10 PM
cold-jewelry-54343
04/21/2024, 2:52 PM
helpful-lamp-68778
04/24/2024, 7:23 PM
helpful-lamp-68778
04/24/2024, 7:47 PM
fresh-fireman-491
04/24/2024, 8:06 PM
helpful-lamp-68778
04/24/2024, 8:31 PM
helpful-lamp-68778
04/24/2024, 9:25 PM
helpful-lamp-68778
04/24/2024, 10:31 PM
helpful-lamp-68778
04/24/2024, 11:52 PM
fresh-fireman-491
04/25/2024, 4:11 AM
gentle-engine-13076
04/26/2024, 3:32 AM
gentle-engine-13076
04/27/2024, 2:04 AM
fresh-fireman-491
04/27/2024, 8:27 AM
agreeable-accountant-7248
04/27/2024, 9:36 AM
fresh-fireman-491
04/27/2024, 9:51 AM
Idea which I think would be absolutely amazing! 💎
Add another model strategy: using our own model. This would just return the chunks, and then we could handle them however we wish.
This would be great, as we could use any model, running anywhere, with our own prompt!
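A sketch of what that could look like: given the raw chunks back from retrieval, you build your own prompt and send it to whatever model you like (the `buildPrompt` helper below is illustrative, not an existing Botpress feature):

```javascript
// Assemble a grounded prompt from retrieved knowledge-base chunks.
// The numbering lets the model cite which chunk it used.
function buildPrompt(question, chunks) {
  const context = chunks.map((c, i) => `[${i + 1}] ${c}`).join('\n');
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}

const prompt = buildPrompt('What is Groq?', [
  'Groq builds hardware for fast LLM inference.',
  'Groq exposes an OpenAI-compatible API.',
]);
console.log(prompt);
// This prompt could then go to any chat-completions endpoint,
// e.g. the Groq call shown earlier in this thread.
```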
Let me know what you think.
agreeable-accountant-7248
04/27/2024, 9:56 AM
fresh-fireman-491
04/27/2024, 10:05 AM
agreeable-accountant-7248
04/27/2024, 10:07 AM
fresh-fireman-491
04/27/2024, 10:12 AM
agreeable-accountant-7248
04/27/2024, 10:16 AM
fresh-fireman-491
04/27/2024, 10:16 AM
agreeable-accountant-7248
04/27/2024, 10:17 AM
fresh-fireman-491
04/27/2024, 11:34 AM
fresh-fireman-491
04/27/2024, 12:20 PM
helpful-lamp-68778
07/25/2024, 12:39 PM
helpful-lamp-68778
07/25/2024, 12:40 PM