Sync and Back Up Your KB Table Data to Google Sheets
# 📖tutorials
q
Building on @modern-honey-58818's 👏 brilliant idea and use case, our goal is to automatically back up the entire KB table data once a week (or daily or hourly, depending on your use case). Previously, we successfully transferred data between Botpress and Make.com one new data point at a time (one conversation, one row of data, etc.). Now, we are expanding this to include the entire KB table. Since it seems to work now, I'll try to rebuild this and write a tutorial at the same time ☕ 🧑‍💻
Previously, we managed to transfer all the Google Sheets data to the Botpress KB table at once, but going the other way, from the Botpress KB table to Google Sheets, required a bit more work 🛠️
(In this tutorials section you can find more step-by-step tutorials for Make.com; this one assumes you already have some experience using it.) Create a new scenario and name it "Botpress KB Table Data". Add a Custom webhook as the first module, name it "backup_kb_table_data", and click Save, then OK.
In your Google Sheets account, create a new blank spreadsheet and name it "Botpress KB Table Data". Add three columns, "Name", "Age", and "City" (match these to your KB table column names).
Create the second Make.com module: Google Sheets, Clear Values from a Range. We need this to empty the sheet before new data comes in; otherwise it appends new rows after the old ones, and without this module you end up with the same data multiple times. Connect the module with your Google account by clicking the connection, then configure it:
Choose a Spreadsheet -> Select by Path
Choose a Drive -> My Drive
Spreadsheet Id -> select the "Botpress KB Table Data" spreadsheet we just created
Sheet Name: Sheet1
Range: A2:C1000 (use this to clear all your columns and rows; in this example we only have columns A-C, and the first row holds the column names, so start from row 2)
Click OK
Add another module: Aggregate to JSON. As the Source Module, select Webhooks (your data comes from there, not from Google Sheets). Add a Data structure and give it the name "Botpress KB Table Data", click Generate, the content is JSON, and use this as the Sample data:
```json
{
  "users": [
    {
      "Name": "Name",
      "Age": "Age",
      "City": "City"
    }
  ]
}
```
Leave the other fields empty for now and click OK.
Create the next module: Parse JSON. Use JSON String as the data and click OK.
(click Save once in a while to save the scenario)
Click Tools, select Iterator as the next module and leave it empty since we don't have any data yet.
As the last module, add Google Sheets, Add a Row. The Iterator performs the same task multiple times for the modules after it: if you have 100 rows, it runs the Google Sheets Add a Row module 100 times, adding one row each time. (If your next module were Gmail, it could send a different email to each of multiple addresses, and so on.) Connect this Google Sheets module like the previous one, using the correct spreadsheet info. Leave all the values empty for now (they are waiting for data). The sketch below shows what this module chain does conceptually.
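Purely as a mental model (this is plain JavaScript, not Make.com code), the Aggregate to JSON -> Parse JSON -> Iterator chain behaves roughly like this:

```js
// Aggregate to JSON: turn the webhook bundle into one JSON string
const jsonString = JSON.stringify({
  users: [
    { Name: "Alice", Age: "30", City: "Berlin" },
    { Name: "Bob", Age: "25", City: "Oslo" },
  ],
});

// Parse JSON: turn that string back into a structured object
const parsed = JSON.parse(jsonString);

// Iterator: run the following module(s) once per array item
for (const user of parsed.users) {
  // this is where Google Sheets "Add a Row" would run, once per user
  console.log("would add row:", user.Name, user.Age, user.City);
}
```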
Time to test ⚡ Go to your chatbot and create a KB table called Data1 (the default name). Add some data there, with the same three columns: "Name", "Age", "City".
Create a new Node with an Execute code card:
```js
// Fetch every record from the Data1 KB table
const users = await Data1Table.findRecords({
  filter: AI`everything`
})
console.log(users)

// Wrap the records so the payload matches the "users" data structure in Make.com
let object = {
  "users": users
};

try {
  // REPLACE xxxxx WITH YOUR WEBHOOK ADDRESS
  const response = await axios.post('xxxxxxxxxxxxxxxxxxxxxx', object)
  console.log("RESPONSE: " + response.data)
} catch (error) {
  console.error(error)
}
```
Copy your webhook address and add that to the code
Click "Redetermine data structure" so Make.com can find out what kind of data you are sending
Use your chatbot once in the Emulator to try it.
In your Webhook module, select "Run this module only".
Use your chatbot again, saying "Hi!" to it.
You can then view your KB table data in the Make.com webhook in the correct format (finally, after we've been trying for many days 🔥).
Click Run once to start the scenario again, then go back to your chatbot and use it again.
Go to the Aggregate to JSON module and add "users" (from the webhook) as the data.
Go to the Iterator module and add "users" (from Parse JSON) as the data.
Run your scenario once again, and your chatbot too.
In the Aggregate to JSON module you should now see the correct KB table data as a JSON string,
and in Parse JSON you should see the correct data in both INPUT and OUTPUT.
And in the Iterator module's OUTPUT you can see it's ready to perform the task (add new rows) as many times as you have rows in the KB table.
m
I'm back in the office, will test this after meetings, you are a legend!
q
Now, in the last Google Sheets module, you can map the correct data to match your spreadsheet columns.
Run your scenario and chatbot again, and you should see all your KB table data in Google Sheets.
Add a Fixed Schedule trigger and connect it.
For testing, use the cron syntax shown below to schedule the task to run every 5 minutes. Publish your chatbot (the Fixed Schedule trigger only works in a published bot).
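This is the same "every five minutes" expression as in the examples list further down:

```
*/5 * * * *
```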
Update your KB table by adding or deleting some data.
Now, instead of running the scenario once, turn your Make.com scenario on and select "Immediately as data arrives."
You should then see your Botpress KB table data automatically updated in Google Sheets (as a backup, or if you prefer to modify the KB table data there) every five minutes.
IMPORTANT: When using cron schedules to test, and after publishing a chatbot, remember that even if you shut down your computer (or even disconnect the Fixed Schedule trigger), the cron job will continue running in the background (on the server). Therefore, when testing is finished, it's crucial to delete the Fixed Schedule trigger and publish the bot again, or export your chatbot file and then delete the test chatbot. If the cron schedule is connected to complex tasks, failing to disable cron tasks in your test bot could consume your entire AI credit balance.
⬆️ read that again. Now that your chatbot is published, you need to go to Logs in your Botpress Dashboard (not the Emulator). There you can see from the logs that the bot performs the cron task correctly.
Example cron schedules:
```
every five minutes
*/5 * * * *

every hour
0 * * * *

every 3 hours
0 */3 * * *

once a day at 00:00 (midnight)
0 0 * * *

once a week at Friday midnight (technically Saturday 00:00)
0 0 * * 6
```
Try using different cron schedules when testing, and let us know if everything works! 👍
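If you want to sanity-check a cron expression before wiring it to the trigger, here is a small optional Node.js sketch using the cron-parser npm package (an assumption on my side; it's not part of Botpress or Make.com) that prints the next few run times:

```js
// npm install cron-parser
const parser = require('cron-parser');

// the "every five minutes" expression from the examples above
const interval = parser.parseExpression('*/5 * * * *');

// print the next three scheduled run times
for (let i = 0; i < 3; i++) {
  console.log(interval.next().toString());
}
```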
m
Just finished testing, this was awesome, I was failing with the double JSON modules, but your way worked perfectly! Great job @quick-musician-29561 🐐
q
@modern-honey-58818 Glad I could help! 🧑‍💻 The idea is excellent, I think I'll use this in most of my chatbots going into production 🤝
m
I'm having lots of ideas these days, so I'll keep being active on the server!
c
You are on fire 🔥 ! @quick-musician-29561
b
@quick-musician-29561 thanks a lot! I have just one question, please. I have a chatbot that asks users their name, phone, etc. and automatically saves this information in a Botpress DB. How can I add your workflow to send this database to Google Sheets? I mean, how can I connect these two workflows? Thanks
q
If you add this Execute code card into your current workflow as the final Card or Node, and your Make.com scenario is the same as in this tutorial, it will automatically send the KB table with the latest info to Make.com and Google Sheets. So in your case that's data such as userName, phoneNumber, and so on. You don't need a separate Fixed Schedule trigger for automation if you choose to execute it after every chat; using a cron job for automation is useful if you prefer not to run it after every user interaction, but rather at regular intervals, like once daily. Execute code card:
```js
// Fetch every record from the Data1 KB table
const users = await Data1Table.findRecords({
  filter: AI`everything`
})
console.log(users)

// Wrap the records so the payload matches the "users" data structure in Make.com
let object = {
  "users": users
};

try {
  // REPLACE xxxxx WITH YOUR WEBHOOK ADDRESS
  const response = await axios.post('xxxxxxxxxxxxxxxxxxxxxx', object)
  console.log("RESPONSE: " + response.data)
} catch (error) {
  console.error(error)
}
```
Test it out and let us know if it works 👍
b
it works perfectly thanks a lot !!!
l
Great solution, thanks a lot! @quick-musician-29561 I have many, many rows in my table, but using the Execute code card I only get 20 rows. How could I get my whole table? From my logs: Execute operation findRecords({"filter":{"type":"AIQuery","template":"everything","params":[],"types":[]}}) Attached is my Make screenshot, thanks in advance! https://cdn.discordapp.com/attachments/1194161232052244521/1228304648578928771/image.png?ex=662b8ed9&is=661919d9&hm=0a350a78196a503a8d75a3dbd294d4d31b71eb7df9580dd328debe3df1103366&
q
Try different 'limit' and 'filter' settings, for example these together, or test them separately:
```js
const limit = 100;

// Ask for up to 100 records instead of the default page size
workflow.records = await Data1Table.findRecords({
  limit,
  filter: AI`everything`
})
```
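If that returns all your rows, you can fold the same limit into the tutorial's webhook code. A sketch that just combines the two snippets from this thread (the xxxxx placeholder still needs your webhook address):

```js
const limit = 100;

// Fetch up to `limit` records from the KB table
const users = await Data1Table.findRecords({
  limit,
  filter: AI`everything`
})

try {
  // REPLACE xxxxx WITH YOUR WEBHOOK ADDRESS
  const response = await axios.post('xxxxxxxxxxxxxxxxxxxxxx', { users })
  console.log("RESPONSE: " + response.data)
} catch (error) {
  console.error(error)
}
```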
We have done something similar in earlier projects; you can find this and others if you search 'Google' or 'Airtable' in #1132038253109837994 and #1120796649686573086 https://discord.com/channels/1108396290624213082/1186610522498539520 I also recommend checking out Decay's tutorials; they are much newer.
l
Thanks @quick-musician-29561 it is working perfectly!