Update Error Spark 3 / Pi 4

Thanks in advance for your help!
After 3 years I wanted to update our system, but unfortunately I get the above error messages.
I have also tried the firmware update, everything went normally in the terminal, but when I start the service and open the visualisation I also get error messages:

I took a snapshot beforehand.
What can I try, could it be the SD card?
Thanks for your support, Markus

An SD card error seems unlikely. It looks like the Python environment either needs an update first, or hit a temporary failure while building packages in your virtualenv.

Practically speaking, if you want to do a big update now and then leave it alone, I recommend using the snapshot to also update your OS. Debian Buster is past its official end-of-life date.

Export the snapshot, follow the install guide to flash your SD card with the latest OS (bookworm), and then load the snapshot. This is very likely to be less effort now and in the long run.

In addition, you’ll need to flash the firmware over USB with `brewblox-ctl flash`, which uses a different protocol (bootloader) than updating the firmware through the web interface.

Hello Elco and Bob,
thank you for your support.

I have now installed Bookworm on a completely new SD card and then the snapshot. I then updated everything and the firmware is now up to date again. Looks very good already!

I then only had to recreate all the dependencies / connections between the blocks and reassign the 1-Wire sensors. Not a big issue; is that expected given the big version change?

Now I’m still concerned with the controller parameters and the profiles.
I will start the Pi with the old card and export the profiles etc. for this purpose.

Then there shouldn’t be much standing in the way of the first brewing sessions in the spring.


Happy to hear it works again!

Yes. We made quite a few changes to Brewblox over the past three years.

Sorry, one more question: Can at least the PID parameters of the different controller circuits be read out from an old snapshot? I can no longer start with the old SD card, as the FW version logically no longer matches… Thank you!

It depends. PID settings are stored in backup files, but we only started creating those automatically sometime last year. Before that, it depends on whether you manually created a backup.

OK, it looks like I made a backup after commissioning:


How do I get at the parameters now? Can I at least display the JSON files in a clearer way than with the editor?

For PID settings, you’re looking at the spark-one.spark.json file. To format the data, you can either use an IDE like VSCode, or use Python on the command line:

python3 -m json.tool {backup_path}/spark-one.spark.json > outfile.json

Ctrl-F for the PID block name, and the values should be fairly intuitive.
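As an alternative to searching the pretty-printed file by hand, a few lines of Python can filter out just the PID entries. This is only a sketch: it assumes the backup is a JSON document with a top-level `blocks` list, where each block carries `id`, `type`, and `data` fields — check your actual spark-one.spark.json and adjust the keys if the layout differs.

```python
import json

def find_pid_blocks(backup):
    """Return all blocks whose type mentions 'Pid'.

    Assumes a dict with a top-level 'blocks' list, where each
    block has 'id', 'type', and 'data' fields. Adjust the key
    names to match your actual backup file layout.
    """
    return [
        block for block in backup.get("blocks", [])
        if "pid" in str(block.get("type", "")).lower()
    ]

# Made-up example structure, standing in for the loaded backup:
# backup = json.load(open("spark-one.spark.json"))
sample = {
    "blocks": [
        {"id": "HLT PID", "type": "Pid",
         "data": {"kp": 50, "ti": 600, "td": 30}},
        {"id": "HLT Sensor", "type": "TempSensorOneWire",
         "data": {"address": "28FF0000"}},
    ]
}

for block in find_pid_blocks(sample):
    print(block["id"], json.dumps(block["data"], indent=2))
```

Swap the `sample` dict for `json.load()` on the real backup path, and only the PID blocks with their tuning values are printed.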