Can I set the PWM of a Spark service from an external app? I’ve accessed the API via the Swagger docs on my rig, and I can see how to do many dangerous things, but setting the PWM value in the Spark service doesn’t appear to be one of them.
I have used brewblox to help with bread rising, controlling the fermentation temperature for making a sugar wash and also for controlling my pot still for distilling. All have worked very well indeed. My latest challenge is to see if I can use it to feed excess solar electricity into my hot water cylinder gently, rather than selling it back to the grid for a nominal amount.
I have a proof of concept working whereby I can grab how much excess power, if any, I have at any one time from the inverter. At the moment I display this figure graphically and manually set the PWM to a calculated percentage. The result just heats the cylinder element appropriately, using 0 to 100% of the power. (The element is a low-ish power 2 kW one.) As the amount of spare power varies a lot, I know I need to do this automatically.
Is setting the PWM programmatically possible? If so, I can improve my solar energy home usage quite a bit using the rig I already have when it’s not being used for bread, fermenting, and distilling.
Yes. The same restrictions apply as when doing it in the UI: if a PID is driving the PWM, it will override your settings, and constraints on the PWM or the Digital Actuator may block or restrict your desired setting.
You can set the desired setting for your PWM with an HTTP request:
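For example, a curl call along these lines (a sketch only: the host, service name, block ID, and value are placeholders, and the exact endpoint path may differ per install):

```shell
# Patch the PWM block's desired setting to 42%.
# "PWM block ID" is a placeholder for your block's actual ID.
curl -X POST http://localhost/spark-one/blocks/patch \
  -H 'Content-Type: application/json' \
  -d '{
        "id": "PWM block ID",
        "data": { "desiredSetting": 42 }
      }'
```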
Replace localhost, spark-one, PWM block ID, and 42 with relevant values.
PWM values range from 0 to 100.
In a blocks/patch request you only have to include the fields you want to change.
Set the JSON data field to { "enabled": false } if you want to disable heating completely.
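For instance, a disable request could look like this (a sketch; host, service name, and block ID are placeholders):

```shell
# In a patch request, only the fields being changed need to be included.
curl -X POST http://localhost/spark-one/blocks/patch \
  -H 'Content-Type: application/json' \
  -d '{
        "id": "PWM block ID",
        "data": { "enabled": false }
      }'
```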
All working: I needed to use -X POST rather than -x POST. My server uses HTTPS on port 4430, so I needed to use --insecure with curl, in case anyone else has a similar setup.
It works very well and is fast too. I’ll be hooking this up to my Home Assistant rig in due course.
Hi Bob - after upgrading to the current firmware and brewblox, I’m getting an error from this request, which I think means the API interface has changed and I need to update it, but I’m not sure what to modify, so any advice appreciated.
and what I get back is:
[{"loc": ["type"], "msg": "field required", "type": "value_error.missing", "in": "body"}]
From the Block Data types docs, I’m guessing something now needs to be passed across which previously wasn’t, but I’m not sure what, so if you do have any thoughts, that would be great!
As elco said: type is now a required field for patch requests.
A second item is that we’ve cleaned up field names, and desiredSetting is now a read-only value that reflects either the user-defined setting, or the output from a claiming block (such as a PID). To set the user-defined setting, you’ll want to use storedSetting.
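Combining both changes, an updated request could look like this (a sketch; the ActuatorPwm type name and the other placeholders are assumptions, so check your block’s actual type in the UI or via blocks/read):

```shell
# 'type' is now a required field, and the writable field is 'storedSetting'.
curl -X POST http://localhost/spark-one/blocks/patch \
  -H 'Content-Type: application/json' \
  -d '{
        "id": "PWM block ID",
        "type": "ActuatorPwm",
        "data": { "storedSetting": 42 }
      }'
```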
A handy link, that. But (sorry) while I can see the block type field is of type string, how do I determine what its value should be? Blocks/read and blocks/discover both seem to need this parameter too.
blocks/read accepts a full block as request body, but everything except id / nid is ignored.
blocks/patch requires a block as request body. type and data are required fields. id and nid are both optional, but one of them must be present.
The api/doc page shows an example object by default, but you can click on Schema to see which fields are optional.
@david.medland-slater endpoints also show the schema for the response. I suspect that’s why it looks like blocks/discover has a request body schema.
We could add a strict=true query arg for blocks/read, where the type in response is checked against type in request, but making it default behavior would only increase the confusion.