I wrote FermPi about nine months ago but hadn’t gotten around to open-sourcing it until now. It has been running in “production” the whole time, though. Some features, like profiles, are missing, but it has been working well. The backend is Node.js with a REST API and an SQLite database. The frontend is a single-page web app built with Backbone.js and Bootstrap.
I’m glad to see a new version of BrewPi’s Arduino part has also been released. I started rewriting the FermPi frontend with React just to learn it, but maybe I’ll wait and rewrite the whole thing for the BrewPi Spark. Communicating with the Arduino was a bit painful, but maybe the Spark will be nicer with a network API?
Nice work!
This overlaps with our plans quite a bit, but also differs on a few points. Perhaps this is a good starting point to clarify some of our plans and get some feedback.
It’s great that you are already using upstart. The way we currently keep the script alive, with a cron job, is hacky and should be replaced.
What we have been thinking so far (up for discussion):
Frontend frameworks: Bootstrap + AngularJS
Graphs: Grafana dashboards. These will allow users to create their own charts in a very flexible way. Grafana was written as a front end for Graphite, a server monitoring tool, so we need to look into how to apply it to brewing.
Database: InfluxDB, a database designed for time series data. This will allow us to query for one datapoint per hour for an overview, and request more data when the user zooms in.
Backend: We are still planning to use Python. While Node.js is very popular, having to model everything as a callback can be a confusing programming model. Python with Flask and gevent will give similar performance and async behavior without pushing the “everything is a callback” model on you.
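To illustrate the difference (a toy sketch with made-up functions, not BrewPi code): the same “read a sensor, then log it” flow in callback style versus the straight-line style that gevent’s green threads allow.

```python
# Toy illustration (hypothetical functions) of the "everything is a
# callback" model versus straight-line code. Under gevent, a blocking
# read would yield to other greenlets, so the sequential version stays
# concurrent without inverting control flow.

logged = []

def log_value(value):
    logged.append(value)

# Callback style: the result is delivered to a callback, so control
# flow is inverted; sequencing and error handling get harder as the
# chain of callbacks grows.
def read_sensor_cb(callback):
    callback(20.5)

read_sensor_cb(log_value)

# Straight-line style: ordinary sequential code, same result.
def read_sensor():
    return 20.5

log_value(read_sensor())
```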
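As a sketch of what the InfluxDB choice buys us: the query below uses InfluxQL’s `GROUP BY time()` to return one averaged datapoint per interval, so the overview chart stays small and a zoom just re-queries with a finer interval. The series name `beer_temp` is hypothetical, not an agreed schema.

```python
# Sketch of an InfluxQL downsampling query (hypothetical "beer_temp"
# series). GROUP BY time() makes the server aggregate, so the client
# only ever receives as many points as it can display.

def downsample_query(series, interval="1h", window="7d"):
    """Build an InfluxQL query returning one mean datapoint per
    `interval` over the last `window` of data."""
    return (
        f"SELECT mean(value) FROM {series} "
        f"WHERE time > now() - {window} "
        f"GROUP BY time({interval})"
    )

# Coarse data for the overview chart:
print(downsample_query("beer_temp"))
# → SELECT mean(value) FROM beer_temp WHERE time > now() - 7d GROUP BY time(1h)

# Finer resolution when the user zooms in:
print(downsample_query("beer_temp", interval="1m", window="6h"))
```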
We are going to split the software stack into multiple layers:
Embedded layer: control algorithms run independently on the Spark Core or Arduino.
Connector layer: for each device (Spark/Arduino), a wrapper Python script will run on the server. The connector connects to the embedded device via USB or WiFi and exposes it as a TCP socket. It can also translate the embedded device’s space-saving binary API into a more readable format.
Service layer: database, monitoring, REST API. Python with Flask.
Front end web server: connects to the service layer and hosts the web interface. (Also Flask? Django?) ComponentJS is also worth a look.
Client browser: connects to the web server over pub/sub channels for continuous updates.
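The translation step in the connector layer could look something like this minimal sketch. The 5-byte frame format here is invented for illustration only; the real BrewPi wire protocol differs.

```python
# Sketch of the connector's job: decode a compact binary message from
# the embedded device and re-emit it as a readable JSON line for the
# service layer. Frame layout (hypothetical): 1 byte message id, two
# little-endian signed 16-bit temperatures in centi-degrees C.
import json
import struct

def decode_frame(frame: bytes) -> str:
    """Unpack one binary frame and return it as a JSON text line."""
    msg_id, beer, fridge = struct.unpack("<Bhh", frame)
    return json.dumps({
        "id": msg_id,
        "beer_temp": beer / 100.0,
        "fridge_temp": fridge / 100.0,
    })

# Example frame: id=1, beer 20.25 °C, fridge 18.50 °C
frame = struct.pack("<Bhh", 1, 2025, 1850)
print(decode_frame(frame))
# → {"id": 1, "beer_temp": 20.25, "fridge_temp": 18.5}
```

In the real connector this function would sit in a loop reading frames from the serial or WiFi link and writing the JSON lines out over the TCP socket.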
Disclaimer: this is intended as a starting point for discussion, not a fixed plan. So please share your thoughts!
@bschwinn has been working on a replacement for the BrewPi data logging and web interface as well.
@mdma has already started work on a connector to parse existing BrewPi data files and put them in InfluxDB.
Great to hear about your plans. InfluxDB specifically sounds great for minimizing the data when zooming.
Can the Spark’s WiFi/network interface just be exposed so that we can connect to it from anywhere? For example, forward some port on the router to the Spark and connect to it from the cloud?
It’s an alternative UI/server/data logger that talks to the BrewPi on the Arduino. It’s the only thing that needs to be installed on the Raspberry Pi. It’s very beta though; I’ve only tested it with my Leonardo board.
I tried to simplify the interface a lot, but it has everything I have needed myself. The UI was designed “mobile first” but of course works on all screen sizes.
Elco, I was looking at http://emoncms.org/site/docs/dashboards the other day, thinking a custom dashboard might be an interesting way to go, with people adding the bits they use as widgets. When I have some time I’ll look into it more.
Looks like it is worth investigating! @mdma
It is built with some of the same tools that we opted to use (Python, Flask, Socket.IO), so at the very least we can learn from it.
How about de-coupling the data logging completely, allowing individuals to use the particle.io cloud (free individual accounts) to have the Spark log its data there? Any app we write (FermPi, BrewPi, pintsize, whatever…) could have access via Particle’s published interfaces and use their particle.js libraries.
I understand wanting to keep everything under local control and avoid vendor lock-in; that’s good. Look at the crap Google/Nest is pulling with their home automation services. Not having to hard-wire the Pi to the BrewPi hardware to get data feeds, or to use a second server (ahem, a mobile app or local PC/Mac HTML with node.js/particle.js instead), would be awesome.
If it became desirable to provide Particle’s “cloud” offering as an option…
My first Spark / BrewPi is shipping as I type. Maybe when not brewing, I’ll experiment with the firmware required to try this if nobody else does.
We are working on a new web server implemented in Django, which will provide a REST API.
The new frontend will be built with React, but we still have to get started. If you want to get involved (writing Python with Django, or JS with React), get in touch. @akileh
We are not using the Particle cloud because what you can actually publish is very limited in data size and variable types.
Publishing just the temperatures should be fine, just not all of the data.
On the Core, the cloud code does not fit; on the Photon it could be used.