I’m having a difficult time figuring out how to get the webhook in the automation service to post to Brewfather’s custom stream integration. Here is the resource for custom stream integration (https://docs.brewfather.app/integrations/custom-stream). When I put a similar JSON-formatted message in the body of the HTTP request field, I get a response from Brewfather saying that validation failed. However, it works when I try it from a different HTTP request testing website. It appears to be something with the JSON format in the message body. Is there a way to force the Content-Type to be JSON?
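For anyone hitting the same wall, here’s a minimal sketch of what a request with the Content-Type forced to JSON looks like, using only Python’s standard library. The endpoint URL and payload fields follow the custom-stream docs linked above; the stream ID is a placeholder:

```python
import json
import urllib.request

# Brewfather custom-stream endpoint (the ?id= token comes from the
# Custom Stream settings in Brewfather; the value here is a placeholder).
url = "http://log.brewfather.net/stream?id=YOUR_STREAM_ID"

# Example payload per the custom-stream docs: "name" identifies the
# device, the other fields are optional sensor readings.
payload = {
    "name": "Fermenter 1",
    "temp": 20.5,
    "temp_unit": "C",
}

# Encoding the body ourselves and setting Content-Type explicitly is
# what makes a JSON-aware endpoint accept the body as JSON rather
# than a plain form post.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib normalizes header names to "Content-type" internally.
print(req.get_header("Content-type"))  # application/json
# urllib.request.urlopen(req)  # uncomment to actually send
```

So the missing piece in the automation service is really just that one header; the body itself can stay as written.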
If memory serves, editing request headers is scheduled for the next release. The data field already exists in config, but we haven’t connected it to a UI element yet.
Thanks Bob. I figured it was a feature in development. Given that I can’t include variables in the message body, and therefore can’t push a data stream, I assumed this feature is in its infancy, and I’m excited to use it as you roll out more features. I’m learning everything about HTTP requests for the first time here. So much potential! With the IFTTT integration I figured out yesterday, I can currently set up an automation template to notify me via push notification when a step is completed. Fun stuff.
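In case it helps anyone else, the IFTTT side of that is just a webhook too. A rough sketch of the request the automation webhook fires at the IFTTT Webhooks (“Maker”) trigger URL; the event name and key below are placeholders for whatever your IFTTT webhooks page shows:

```python
import json
import urllib.request

# IFTTT Webhooks trigger URL: both parts are placeholders for the
# event name and key from your own IFTTT webhooks settings page.
event = "brew_step_done"
key = "YOUR_IFTTT_KEY"
url = f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"

# IFTTT passes value1..value3 through to the push-notification template.
payload = {"value1": "Mash step complete"}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to fire the notification
```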
Automation right now is a preview we released to get feedback on what works, what doesn’t, and what the most important missing features are.
Webhooks in particular are proving much more popular than we anticipated. We’ll likely introduce separate actions for conveniently setting up Slack notifications and pushing data to Brewfather.
Could you in the future also integrate a way to pull data from Brewfather? Say, brews with a status of “Brewing”. A dream would be to automatically grab a recipe with all parameters like water amounts, pre-boil gravity, mash pH, mash step temperatures and durations, etc. I want to automate, and I want to do it with the Spark 3.
We’re currently looking at integration with Node-RED. Depending on the Brewfather API, this could be used to request data, and transform it into block settings.
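For the curious, pulling batch data would look something like the sketch below, whether it runs in Node-RED or anywhere else. Note the endpoint path, query parameter, and Basic-auth scheme are assumptions based on the Brewfather API docs (user ID plus an API key generated in the app), so check them against the current docs before relying on this:

```python
import base64
import json
import urllib.request

# Placeholder credentials: the Brewfather API is documented as using
# HTTP Basic auth with your user ID and a generated API key.
user_id = "YOUR_USER_ID"
api_key = "YOUR_API_KEY"

# Batches endpoint with a status filter -- the path, version, and
# parameter name are assumptions; verify against the API docs.
url = "https://api.brewfather.app/v2/batches?status=Brewing"

token = base64.b64encode(f"{user_id}:{api_key}".encode()).decode()
req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

# resp = urllib.request.urlopen(req)        # uncomment to actually query
# batches = json.loads(resp.read())         # list of batches in "Brewing"
```

From there, transforming a batch’s recipe fields into block settings is the part we’d handle in the Node-RED flow.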
Importing recipes is a big feature for automation, but there are a lot of prerequisite steps to import a recipe for the kind of arbitrary configuration supported by the Spark. We’re steadily working on getting everything in place to make it happen.