I have a new Raspberry Pi 3 up and running and a BrewPi Spark 3 on order to pair with it. Once I get BrewBlox working as expected, I want to run some code on the RPi to continuously stream the fermentation sensor readings, temp settings, program / profile info, actuator state, timestamp, etc. to Google Cloud. My aim is to get my fermentation data into long-term persistent storage and to be able to do some decent analytics over it, even compare multiple fermentations.
I already have working code, thanks to a tutorial from Google, which detects when a new file appears in a Google Cloud Storage bucket and processes it. Upon successful parsing, a new row gets added to a BigQuery dataset; upon failure, the error info goes elsewhere.
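For anyone curious what that trigger side looks like, here is a rough sketch of the pattern (not my actual tutorial code): a GCS-triggered Cloud Function that downloads the new file, parses it, and streams one row into BigQuery. The field names and the `mydataset.fermentation` table are placeholders I made up.

```python
# Sketch of a GCS-triggered Cloud Function: parse an uploaded JSON
# reading and insert it as a row into BigQuery. Names are placeholders.
import json


def parse_reading(raw: str) -> dict:
    """Turn one uploaded JSON document into a BigQuery row dict.

    Assumed fields: timestamp, beer_temp, fridge_temp, setpoint.
    """
    doc = json.loads(raw)
    return {
        "timestamp": doc["timestamp"],
        "beer_temp": float(doc["beer_temp"]),
        "fridge_temp": float(doc["fridge_temp"]),
        "setpoint": float(doc["setpoint"]),
    }


def gcs_to_bigquery(event, context):
    """Entry point for a storage 'finalize' event (new object in bucket)."""
    # Imported here so the pure parsing logic above works without GCP deps.
    from google.cloud import bigquery, storage

    blob = storage.Client().bucket(event["bucket"]).blob(event["name"])
    row = parse_reading(blob.download_as_text())

    # Streaming insert; errors is an empty list on success.
    errors = bigquery.Client().insert_rows_json("mydataset.fermentation", [row])
    if errors:
        # Raising makes the failure show up in Cloud Functions error logs.
        raise RuntimeError(errors)
```

The parse step is split out on purpose so the failure path (bad JSON, missing fields) can be handled separately from the insert.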
What I don’t have, since I am a Python / Linux hack at best, is code that will trigger upload of the data file from the RPi to a Google Cloud Storage location (or write/create it directly on GCS).
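In case it helps frame the question, the upload side can apparently be quite short with the `google-cloud-storage` client library (`pip install google-cloud-storage`, with `GOOGLE_APPLICATION_CREDENTIALS` pointing at a service-account key on the Pi). Bucket name and file path below are made-up placeholders; this is a sketch of the idea, not tested against my setup.

```python
# Sketch: upload a local data file from the RPi to a GCS bucket.
# Assumes google-cloud-storage is installed and credentials are configured.
import datetime


def make_blob_name(prefix: str, local_path: str) -> str:
    """Build a timestamped object name so repeated uploads don't collide."""
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
    base = local_path.rsplit("/", 1)[-1]
    return f"{prefix}/{stamp}_{base}"


def upload_file(bucket_name: str, local_path: str,
                prefix: str = "fermentation") -> str:
    """Upload local_path to the bucket; returns the object name used."""
    from google.cloud import storage  # pip install google-cloud-storage

    client = storage.Client()
    blob_name = make_blob_name(prefix, local_path)
    client.bucket(bucket_name).blob(blob_name).upload_from_filename(local_path)
    return blob_name


if __name__ == "__main__":
    # Placeholder bucket and path - substitute your own.
    print(upload_file("my-fermentation-bucket", "/home/pi/readings.json"))
```

Each upload then lands as a new object in the bucket, which would fire the trigger code I already have. Is something like this the right approach, or is there a better pattern for a continuous stream of small readings?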
I’m wondering if anybody has similar code in a GitHub repo, or even just a good tutorial to point me at.