That error is raised when the persistent memory on your Spark is full.
You can check the Spark service page to see whether this is plausible: the expected maximum is 60-80 blocks, depending on block type.
Do note that if you run a wizard and later want to remove its output, you'll need to delete the created blocks along with the dashboard.
You can manage multiple fermentation setups with the same Spark, so wizards always add new blocks/widgets.
The exceptions are sensors and actuators: the wizard will rename sensor blocks, and it will replace the pin assignments for SparkPins/DS2413/DS2408 if you choose the same input or output in subsequent wizards.
That's what I did: I removed everything and then re-created the fermentation dashboard.
Multiple-fermentation control is coming soon, as I'll be converting a chest freezer that will sit next to the first fridge.