Code to stream BrewBlox events to cloud storage?

I have a new Raspberry Pi 3 up and running, and a BrewPi Spark 3 on order to pair with it. Once I get BrewBlox working as expected, I want to run some code on the RPi to continuously stream the fermentation data (sensor readings, temp settings, program/profile info, actuator state, timestamps, etc.) to Google Cloud. My aim is to get my fermentation data into long-term persistent storage and to be able to do some decent analytics over it, and even to compare multiple fermentations.

I already have working code, thanks to a tutorial from Google, which detects when a new file appears in a Google Cloud Storage bucket and processes it. Upon successful parsing, a new row gets added to a BigQuery dataset; upon failure, the error info goes elsewhere.

What I don’t have, since I am a Python / Linux hack at best, is code that will trigger the upload of the data file from the RPi to a Google Cloud Storage location (or write/create it directly on GCS).

I’m wondering if anybody has similar code in a GitHub repo, or even just a good tutorial to point me at.

I haven’t recently done anything with Python + Google Cloud, but some googling yields https://stackoverflow.com/a/52181711
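
For the upload itself, something along these lines should do it — a minimal sketch, assuming you have the `google-cloud-storage` package installed on the Pi and `GOOGLE_APPLICATION_CREDENTIALS` pointing at a service account key with write access. The bucket and object names are placeholders:

```python
# pip install google-cloud-storage
from google.cloud import storage

def upload_json(bucket_name: str, blob_name: str, payload: str):
    """Write a string straight to a GCS object - no local file needed."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.upload_from_string(payload, content_type='application/json')

# Example (placeholder names):
# upload_json('my-fermentation-bucket', 'batch-42/2019-01-01T12-00.json', '{"temp": 20.1}')
```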

As to listening to data updates: we’re publishing all data to a RabbitMQ eventbus. The history service is just another listener.

https://github.com/BrewBlox/brewblox-history/blob/develop/brewblox_history/relays.py is where it happens. Replace the write call at L136 and you’re set.
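
If you’d rather not fork the history service, you can also run your own listener next to it. Rough sketch using `pika`; the exchange name and message layout here are assumptions on my part, so check them against what the history service actually subscribes to:

```python
# pip install pika
import json
import pika

EXCHANGE = 'brewcast.history'  # assumed history exchange name - verify in your setup

def on_message(channel, method, properties, body):
    data = json.loads(body.decode())
    # routing key identifies the publishing service; body holds the field values
    print(method.routing_key, data)
    # ...replace this print with your cloud upload (e.g. the upload_json() above)

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
# Assumes the BrewBlox services already declared the exchange; we only bind to it.
queue = channel.queue_declare(queue='', exclusive=True).method.queue
channel.queue_bind(exchange=EXCHANGE, queue=queue, routing_key='#')
channel.basic_consume(queue=queue, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()
```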

To get started running an additional service on the Pi: https://github.com/BrewBlox/brewblox-boilerplate
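
Once you’ve built an image from the boilerplate, adding it to your compose file is a few lines. Hypothetical entry — the service and image names are placeholders for whatever you publish:

```yaml
  gcs-relay:
    image: your-dockerhub-user/gcs-relay:latest
    restart: unless-stopped
    depends_on:
      - eventbus
```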

That said: tee-ing off the data may not be the optimal solution for you.
If your main goal is advanced data analysis, you can directly query the InfluxDB database.
If you want cloud-based persistent storage, you could also run an InfluxDB VM/container in the cloud, and configure the local history service to connect to that.
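
For the direct-query route, the `influxdb` Python client works fine. Sketch below, assuming the default BrewBlox setup where history data lives in a database called `brewblox`, with one measurement per service (measurement name and exposed port are assumptions — adjust to your config):

```python
# pip install influxdb
from influxdb import InfluxDBClient

client = InfluxDBClient(host='localhost', port=8086, database='brewblox')

# Pull the last 24 hours of data for the 'spark-one' service (assumed name)
result = client.query('SELECT * FROM "spark-one" WHERE time > now() - 24h')
for point in result.get_points():
    print(point)
```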