Spark simulation service

Dear Team,

I am trying to run a Spark simulation on a second Raspberry Pi. I am hoping to use this to build and experiment with a HERMS setup, which I plan to move over to in the future.

I have followed the instructions to install Brewblox on the new RPi, but when I run the command to install the Spark simulation service, I receive the following output/error.

"pi@raspberrypi:~/brewblox $ brewblox-ctl add-spark --name spark-one --force --simulation
brewblox-ctl requires extensions that match your Brewblox release. Do you want to download them now? [Press ENTER for default value 'Yes']

SHELL docker rm ctl-lib
SHELL docker pull brewblox/brewblox-ctl-lib:edge
edge: Pulling from brewblox/brewblox-ctl-lib
5eb93336973f: Pull complete
Digest: sha256:25c0236eb0a7c4fecf37bc4bc1219d2e5e3023adbd4e361decb37b9dfffdc0c8
Status: Downloaded newer image for brewblox/brewblox-ctl-lib:edge
docker.io/brewblox/brewblox-ctl-lib:edge
SHELL docker create --name ctl-lib brewblox/brewblox-ctl-lib:edge
dc0c7106fd922274d0dce6bfe066a1092a75e2baaf415fa7965089a0cdf23f14
SHELL rm -rf ./brewblox_ctl_lib
SHELL docker cp ctl-lib:/brewblox_ctl_lib ./
SHELL docker rm ctl-lib
ctl-lib
[Errno 2] No such file or directory: 'docker-compose.yml'"

Am I missing something, or does this have to be installed alongside an existing Spark controller?

Grateful for any advice or pointers to the correct process.

Cheers

Stuart

You'll want to run brewblox-ctl setup first, and then the add-spark command. Brewblox as a system is exactly the same, regardless of whether you are using simulation Sparks or real ones.
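For reference, on a fresh install the full sequence would look something like this (a sketch, assuming the default ~/brewblox install directory; the error above happens because add-spark edits the docker-compose.yml file that setup creates):

```shell
cd ~/brewblox

# Creates docker-compose.yml and the base services.
# Skipping this step causes the "No such file or directory" error seen above.
brewblox-ctl setup

# Add a simulated Spark service named "spark-one"
brewblox-ctl add-spark --name spark-one --simulation

# Start (or restart) the services
brewblox-ctl up
```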

Thanks. Sorry for the trouble. Makes sense now.