Just wondering whether anyone has any experience with the format of the serial communications from the BrewPi.
I am attempting to communicate between a BrewPi Spark 3 and a PC over a USB connection. Using the syntax described here (https://brewblox.netlify.app/dev/reference/spark_commands.html#commands), I have been able to send commands through Termite on the PC to reboot the Spark as well as change the state of the actuators.
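For reference, the same commands can also be sent from a script with pyserial; a rough sketch (the COM port name and the command string below are placeholders, not an actual Spark command):

```python
# Rough sketch using pyserial; assumes the Spark enumerates as a serial
# device (e.g. COM3 on Windows) and that commands are sent as text lines,
# the same way they are typed into Termite.
import serial

with serial.Serial("COM3", 57600, timeout=2) as port:  # baud rate is usually ignored for USB CDC devices
    port.write(b"<command from the reference docs>\n")  # placeholder command
    response = port.readline()
    print(response.decode(errors="replace"))
```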
However, I have not been able to make any sense of the response when trying to read the value of the connected temperature sensor. I have included the command entered into Termite and subsequent response, separated into relevant sections.
I am assuming (hoping) that the unknown data contains both the offset and value for the TempSensor; however, I have had no luck decoding the hexadecimal string.
Just wondering if anyone knows how the data is encoded for USB communication, particularly how the numbers are encoded for temperature sensors.
Some post-processing is required. We indicate this using protobuf options for fields.
(brewblox).scale = 4096 means values are multiplied by 4096 before serialization, to render them as integers. Divide the decoded integer by 4096 to recover the original value.
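For example (a minimal sketch; the wire value below is made up):

```python
SCALE = 4096  # matches the (brewblox).scale option mentioned above

def decode_scaled(raw: int) -> float:
    """Convert a scaled integer field back to its real value."""
    return raw / SCALE

def encode_scaled(value: float) -> int:
    """Inverse: what happens before serialization."""
    return round(value * SCALE)

print(decode_scaled(86016))  # 21.0, e.g. a temperature of 21 degrees
```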
strippedFields helps distinguish between 0 and null values. If a field is included in this list, its value is intentionally not set. This happens when the sensor is not connected.
The 0x00 byte before the CRC is a null terminator for the protobuf payload. It should be ignored when feeding data through a decoder.
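A rough sketch of that step, assuming the frame arrives as a hex string with the protobuf payload first, then the 0x00 terminator, then the CRC byte:

```python
def protobuf_bytes(frame_hex: str) -> bytes:
    """Strip the trailing CRC byte and the 0x00 terminator before decoding."""
    raw = bytes.fromhex(frame_hex)
    if raw[-2] != 0x00:
        raise ValueError("expected a 0x00 terminator before the CRC byte")
    return raw[:-2]  # feed this into the protobuf decoder
```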
What is the reason you are not using our docker container to communicate with the Spark?
It will do the protobuf decoding for you and has all the commands accessible through a REST interface.
I attempted to get Docker working on the PC, but was unsuccessful: the Linux kernel that Docker uses on Windows 10 is unable to communicate with USB devices.
If you have any advice on getting this working on Windows, that would be appreciated; I was unable to find any instructions for this issue online.
I am using Windows as I am trying to completely eliminate the Raspberry Pi from the system.
For you, the flash and flash-bootloader scripts would be most relevant. You can also use the Particle CLI to trigger DFU mode (readying the controller for a firmware flash).
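If you want to script the DFU step, something like this should work, assuming the Particle CLI is installed and the Spark is the only connected Particle device:

```python
# Puts the connected device into DFU mode via the Particle CLI
# ("particle usb dfu"); the flash scripts can then take over.
import subprocess

subprocess.run(["particle", "usb", "dfu"], check=True)
```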
I got around to trying it out, and (at least on my machine) it works just fine to download and flash the binaries manually.
The instructions below assume you have a Spark v3. For a v2, replace p1 with photon in the file names.
We mention it in release notes if there is an updated firmware version. After the initial flash, most firmware updates can also be applied over wifi using the UI.
Rule of thumb is that you only need to update the bootloader if you’re getting a blank white or blank black screen. If your Spark is connected to wifi with internet access, it will automatically download and apply bootloader updates.