Hi, I have another question.
I use BrewPi to collect data from mashing; for fermenting I have another solution.
However, I have no idea about Linux/PHP/Python… but I would like to bring the collected data to the cloud. Here is my idea: I'm a Microsoft fan and I have an MS Azure tenant/subscription, so I would like to bring the data there to create some dashboards and also to have the data backed up somewhere else; otherwise, if my BrewPi gets damaged, I lose all my data.
My question now is: where can I find all the data on the BrewPi, and in what format is it? Is there a little DB on the Pi, or is it just a file? Would it maybe be possible to create a cron job to sync the data when I click the Finish Brewing button on the web page on the Pi?
Any help and/or ideas are more than welcome. Maybe someone has already created a solution?
OK, so let me start by pointing you to the files. For each logged session you should find a named directory under the brewpi user's home dir, /home/brewpi/data/MY_LOGGED_SESSION, and in that directory a collection of JSON-formatted files. These files are dated, one per day, and each represents a dump of that day's data. You will have to pull these apart and get them into whatever format you want, but that's the data set.
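To give you an idea of what "pulling these apart" might look like, here is a minimal Python sketch. The session path and the row layout (a top-level `"rows"` list with `"time"` and `"BeerTemp"` fields) are assumptions for illustration only; open one of your own dump files first and adjust the field names to match what you actually see.

```python
import json
import tempfile
from pathlib import Path

# On the Pi the sessions live under /home/brewpi/data/<SESSION_NAME>;
# here we build a throwaway stand-in directory so the sketch runs anywhere.
# NOTE: the row layout below is an assumption -- inspect one of your own
# .json dumps and adjust the field names accordingly.
data_root = Path(tempfile.mkdtemp())
session = data_root / "MY_LOGGED_SESSION"
session.mkdir()
sample = {"rows": [{"time": "2024-06-01T12:00:00", "BeerTemp": 65.2},
                   {"time": "2024-06-01T12:01:00", "BeerTemp": 65.4}]}
(session / "2024-06-01.json").write_text(json.dumps(sample))

def collect_rows(session_dir: Path):
    """Merge every daily JSON dump in a session directory into one list."""
    rows = []
    for dump in sorted(session_dir.glob("*.json")):
        with dump.open() as fh:
            rows.extend(json.load(fh).get("rows", []))
    return rows

rows = collect_rows(session)
print(len(rows))
print(rows[0]["BeerTemp"])
```

From a list like `rows` it is then straightforward to write CSV, or whatever shape your Azure dashboards want.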
Now, on to the cloud sync. You could use cron to do dumps, but cron is a scheduler and doesn't react to something being done: a crontab entry does something at specific times, e.g. run file foo at 0200 every day. If you want to perform an action on a button click, that is going to require code modification, and then you have the problem of maintaining it: every time you update and the code base changes, you would have to back up your changes and reapply them manually. That route really doesn't fit well into the system. My suggestion is to just do daily dumps via cron: write a script to sync the data to your cloud system every day at some specific time.
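As a sketch of that approach, assuming you install and log in with the Azure CLI (`az`) on the Pi: the storage account name `mystorageacct` and the container `brewpi-backups` below are placeholders you would replace with your own.

```shell
#!/bin/sh
# /home/brewpi/azure-sync.sh -- daily push of the BrewPi data directory
# to Azure Blob Storage. Assumes the Azure CLI is installed and already
# authenticated; account and container names are placeholders.
az storage blob upload-batch \
    --account-name mystorageacct \
    --destination brewpi-backups \
    --source /home/brewpi/data
```

Then a crontab entry such as `0 2 * * * /home/brewpi/azure-sync.sh >> /home/brewpi/azure-sync.log 2>&1` (added with `crontab -e` as the brewpi user) would run it at 0200 every day.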
I hope this helps point you in the right direction.