Using Swim, NodeJS and Python for Learning on a Distributed Cluster of Devices

by Brad Johnson, on Apr 25, 2019 1:46:02 PM

Here's a cool demo from Swim developer Scott Clarke. The Swim Greenhouse Demo (found here on GitHub) uses Swim, NodeJS, and Python to make an application which learns in real-time from a distributed cluster of sensors and devices. The specific use case is monitoring environmental conditions for plants in a greenhouse, but the pattern can be applied to any distributed cluster of devices.

[Diagram: swim-demo-app-flow]

This diagram provides a simple overview of the complete application and configurations for the Swim Greenhouse demo.

According to Scott, here are the key capabilities of Swim that are demonstrated in the greenhouse application:

  • The ability to deploy and run on edge devices, such as Raspberry Pis
  • The ability to distribute an application that shares consistency between nodes
  • The ability to aggregate data on a fabric at many levels
  • The ability to visualize data through a real-time UI

We're actually using this application in the Swim office today for monitoring our plants, but it also demonstrates best practices for monitoring a distributed cluster of devices using Swim. Swim Greenhouse uses bridges to NodeJS and Python, which you can reuse for your apps or reference as an example for building custom bridges to Swim from other frameworks and libraries. Scott even makes use of some machine learning to generate forward-looking predictions about plant health! Lots of fun places to look for ideas about using Swim. 

Swim Greenhouse Setup

Regarding the setup of the application, Scott describes the following:

Inside the greenhouse, there are a number of Raspberry Pi devices with various roles. These roles are: Plant Monitor, Automated Robot, and Data Aggregator. It’s important to note that all Raspberry Pi devices run the same software with minor configuration changes that define how they act and participate within the greenhouse network. 

Each device is also running a minimal NodeJS server alongside the Swim Web Agents. Node is used to serve the various status pages used by the web app and as a data bridge from the sensors into Swim. Unlike a traditional web application where all the pages are hosted in a central place, Swim web applications can be hosted from each device. This ensures the UI shows the freshest data possible, without the latency of being routed through various databases and/or cloud services.
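As a rough illustration of that layout, here is a minimal sketch of the kind of Node server each device might run. The use of Express, the folder name, and the port are assumptions made for this sketch and are not taken from Scott's code.

```javascript
// Minimal sketch of a per-device Node server (assumed layout, not the demo's actual code).
// It serves the device's status page locally, so the UI is delivered straight from
// the Raspberry Pi rather than from a central web host.
const express = require("express");
const path = require("path");

const app = express();

// Hypothetical folder holding this device's status page and assets.
app.use(express.static(path.join(__dirname, "public")));

// The Swim Web Agents run in their own Swim runtime process; Node only serves
// the UI and, on PlantMon/Bot devices, bridges sensor data into Swim.
app.listen(8080, () => {
  console.log("Device status page available at http://localhost:8080");
});
```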

A Plant Monitor device, or PlantMon for short, is a single Raspberry Pi running both Swim and Node. The device's role as a PlantMon is to report the current state of the single plant to which it's attached. That Pi is connected to an Arduino via a serial connection, which is read by a service in Node. The Arduino has one or more sensors attached to it and simply sends the values of each sensor over the serial connection as a JSON message. Those messages are picked up by the bridge, parsed, and sent into Swim. In this case the bridge is in Node, but it can also be in Java or Python. The Node bridge can be found here, inside the OnSerialData method. Each sensor automatically gets its own Sensor Web Agent inside Swim. That Sensor Web Agent has Lanes which hold the latest sensor value and a history of that value, plus Alert Lanes and Alert Threshold Lanes which manage triggering alerts based on each sensor's value.
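The overall shape of such a bridge can be sketched roughly as follows. This is not the demo's OnSerialData code: the serialport usage, the swim-client-js package, the client.command(host, node, lane, body) call, and all URIs are assumptions about how a Node bridge might forward readings into a Sensor Web Agent's lane.

```javascript
// Rough sketch of a Node serial-to-Swim bridge (assumed APIs, not the demo's OnSerialData).
const SerialPort = require("serialport");
const Readline = require("@serialport/parser-readline");
const swim = require("swim-client-js"); // assumed WARP client package

const client = new swim.Client();
const port = new SerialPort("/dev/ttyACM0", { baudRate: 9600 }); // hypothetical Arduino port
const parser = port.pipe(new Readline({ delimiter: "\n" }));

// Each line from the Arduino is expected to be a JSON message such as
// {"sensorId": "soil-moisture", "value": 412}
parser.on("data", (line) => {
  let reading;
  try {
    reading = JSON.parse(line);
  } catch (err) {
    return; // ignore partial or malformed lines
  }
  // Send the value into the Sensor Web Agent's lane over WARP.
  // Host, node, and lane URIs here are placeholders, not the demo's actual URIs.
  client.command(
    "warp://localhost:9001",
    "/sensor/" + reading.sensorId,
    "setLatest",
    reading.value
  );
});
```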

An Automated Robot, or just Bot for short, is functionally just a Raspberry Pi with a SenseHat attachment. Bots sit idle within the greenhouse until an alert is triggered on one or more of the PlantMon sensors. Alerts are automatically assigned by an Aggregator to the next idle Bot, and that Bot's task is cleared when the alert is cancelled. Like PlantMon, each Bot device runs both Swim and Node. In this case the data is bridged into Swim via Node, which uses a third-party JavaScript library to read the sensor values from the SenseHat and pipe those values into Swim. This happens inside SenseHatService.js, which can be found here.
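A bridge of this kind might look roughly like the sketch below. The readSenseHat() helper is a hypothetical stand-in for whichever third-party SenseHat library the demo actually uses, and the Swim client call and URIs are assumptions rather than the contents of SenseHatService.js.

```javascript
// Rough sketch of a SenseHat-to-Swim bridge; readSenseHat() is a hypothetical
// placeholder for the third-party SenseHat library used in SenseHatService.js.
const swim = require("swim-client-js"); // assumed WARP client package
const client = new swim.Client();

// Hypothetical helper: returns the current SenseHat readings as plain numbers.
function readSenseHat() {
  return {
    temperature: 21.5, // in a real bridge these values come from the SenseHat library
    humidity: 40.2,
    pressure: 1012.8,
  };
}

// Poll the SenseHat and pipe each value into the Bot's Web Agent lanes.
setInterval(() => {
  const readings = readSenseHat();
  for (const [name, value] of Object.entries(readings)) {
    // Placeholder host/node/lane URIs, not the demo's actual ones.
    client.command("warp://localhost:9001", "/bot/sensehat", "sensor/" + name, value);
  }
}, 1000);
```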

The Aggregator device sits on the network and receives data from every plant and bot on the network. Like PlantMon and Bot, the Aggregator runs both Swim and Node; unlike the other two devices, however, it does not run a data bridge. The role of the Aggregator is to monitor everything configured to report to it and to provide a UI which displays the status of all the connected devices. The Aggregator is also responsible for assigning 'tasks' to Bots when one or more PlantMon devices report an alert. When one or more alerts are sent from a PlantMon on the network, the Aggregator will select an idle Bot on that same network and assign the alert to that Bot. At that point, hypothetically, the Bot is acting on the alert. In practice the Bot can only acknowledge the alert and wait for the sensor value to move back outside of the alert threshold range. Acknowledgement is shown by the animated Swim logo on the Bot's SenseHat screen turning from blue to red. Once an alert is cleared, either by resolving the issue (such as watering the plant) or by adjusting the alert threshold, the 'task' will be cleared from the Aggregator and the Bot will return to an available status, turning the Swim logo back to blue.
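The assignment idea itself is simple enough to sketch in plain JavaScript. The sketch below only illustrates pairing incoming alerts with idle Bots and releasing them when alerts clear; in the demo this state lives in Swim Web Agents and lanes rather than in-memory maps like these.

```javascript
// Plain-JavaScript illustration of the Aggregator's task-assignment idea.
// In the real demo this state lives in Swim Web Agents and lanes, not local maps.
const idleBots = new Set(); // bot URIs with no current task
const tasks = new Map();    // alertId -> bot URI

function registerBot(botUri) {
  idleBots.add(botUri);
}

function onAlertRaised(alertId) {
  const bot = idleBots.values().next().value; // pick any idle bot
  if (!bot) return;                           // no bot available; alert stays unassigned
  idleBots.delete(bot);
  tasks.set(alertId, bot);
  // Here the Aggregator would notify the Bot's Web Agent (its logo turns red).
}

function onAlertCleared(alertId) {
  const bot = tasks.get(alertId);
  if (!bot) return;
  tasks.delete(alertId);
  idleBots.add(bot); // Bot returns to available (logo back to blue)
}
```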

Every device on the network has a corresponding webpage which shows the current status of that particular device. As a user you are able to navigate between plants, Bots, and the Aggregator as if it were a single web app, despite each page being served from a different device. The details of how each device acts and connects on the network are defined within the config files found in /config. This allows every device to run identical software, with only minor configuration changes defining how the device behaves.
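As a purely hypothetical example of what such a per-device configuration might express (the actual files in /config will differ in both format and field names), shown here as a Node module for illustration:

```javascript
// Hypothetical per-device configuration, shown as a Node module for illustration only.
// Field names are invented; see the real files in /config for the demo's actual format.
module.exports = {
  role: "plantMon",          // "plantMon", "bot", or "aggregator"
  deviceName: "plantmon-1",  // how this device identifies itself on the network
  swimPort: 9001,            // port the Swim runtime listens on
  httpPort: 8080,            // port Node serves the status page on
  aggregatorUri: "warp://aggregator.local:9001", // where this device reports to
};
```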

Was this helpful? Let us know if you used Scott's code to build your own Greenhouse application. We'd love to see what you create!

Learn More

To learn more about the Swim Greenhouse application, you can check it out on GitHub here. Swim is an Open Source platform for building distributed real-time applications. For technical resources, you can find out more about Swim here.

Topics: Machine Learning, Stateful Applications, SWIM Software, Edge Analytics, SWIM AI, Edge Computing, distributed computing, HTTP, web applications, WARP, nodejs, python
