Last active February 21, 2020 08:51
  • Save rumblefrog/0b8a4de73c435b9abf11a9b017cade00 to your computer and use it in GitHub Desktop.


A network of Raspberry Pi Zero Ws that interconnect via GCP, potentially deployed using Kubernetes, and that could eventually incorporate some ML processing on the stored traffic data.

This traffic data could be used to display a live heat map for specific locations and nodes, and to predict future trends.


Pi -> Redis? -> InfluxDB -> HTTP API -> Frontend
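Each stage of the pipeline would hand off something like the following record. A minimal sketch; all field names here are assumptions for illustration, not a settled schema:

```rust
/// One raw probe observation, as a Pi node might emit it into the
/// pipeline. Field names are assumptions for this sketch.
#[derive(Debug, Clone, PartialEq)]
pub struct Reading {
    pub node_id: String,    // which Pi captured the frame
    pub device_mac: String, // source MAC of the observed device
    pub rssi_dbm: i8,       // scalar amplitude: signal strength only
    pub timestamp_ns: u64,  // capture time, nanoseconds since epoch
}
```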

Non-Volatile Storage

As of now, an excellent non-volatile storage option would be InfluxDB, which stores time-series data points that can easily be mapped and transformed. Should it fall short, a raw data store could be implemented as a fallback.
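If InfluxDB is used, each reading could be serialized with Influx's line protocol (`measurement,tag_set field_set timestamp`). A minimal sketch; the measurement and tag names are assumptions for this project:

```rust
/// Format one RSSI sample as an InfluxDB line-protocol entry.
/// The measurement ("wifi_probe") and tag names are assumptions.
pub fn to_line_protocol(node: &str, mac: &str, rssi_dbm: i8, ts_ns: u64) -> String {
    // Line protocol: measurement,tag_set field_set timestamp.
    // The trailing `i` marks rssi as an integer field value.
    format!("wifi_probe,node={},mac={} rssi={}i {}", node, mac, rssi_dbm, ts_ns)
}
```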

Volatile Storage

A single Pi Zero W only provides a scalar reading: an amplitude (signal strength) with no direction. If we want to count entries and exits, we need to compute a vectorized amplitude, which requires data from two or more Pis. One fast in-memory store for this is Redis: the Pis could insert their raw data into it, to be processed by a server hosted on GCP.
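One way the two-Pi vectorization could work: place a sensor on each side of an entrance and compare when each one saw the device's peak amplitude. A simplified sketch that ignores noise and smoothing, with the entry/exit convention as an assumption:

```rust
#[derive(Debug, PartialEq)]
pub enum Movement { Entry, Exit, Unknown }

/// Infer direction from the times at which each of a sensor pair saw
/// the device's peak signal strength: if the outer sensor peaked
/// first, the device was moving inward (an entry), and vice versa.
/// Simplified sketch; real readings would need filtering first.
pub fn infer_movement(outer_peak_ns: u64, inner_peak_ns: u64) -> Movement {
    if outer_peak_ns < inner_peak_ns {
        Movement::Entry
    } else if inner_peak_ns < outer_peak_ns {
        Movement::Exit
    } else {
        Movement::Unknown
    }
}
```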


To avoid repetitive configuration, a preconfigured Pi image could be flashed to the SD card of every Pi; upon network connection, each node would configure itself and connect to the master server. Kubernetes could be well suited to this.

It's still uncertain how to distribute these devices; we may need some benchmarks to determine their scan range and ensure effective use of the limited hardware.

On top of that, we need to obtain the school's permission, and potentially funding, to place these devices at campus locations, which may require supervision of the data we acquire.


Material List (so far)

  • Raspberry Pi Zero Ws (2 per entrance if we want vectorized data)
  • WiFi dongle per Pi Zero W (additional interface for monitor mode)
  • Micro SD card per Pi Zero W


Because the onboard network chip will be switched from managed to monitor mode, the additional WiFi dongle provides the persistent connection needed to send data back to the master server.

The onboard chipsets (bcm43430a1 on the Zero W, bcm43455c0 on the 3B+) don't support monitor mode out of the box, so we'll patch the chipset firmware (e.g. via the Nexmon project) to bypass the manufacturer restrictions.


For backend languages, there's practically free rein over which language to use once the data is inserted, via any of the numerous SDKs and frameworks from the Influx and Redis communities.

As for my choice, I will be using Rust, which suits the low-spec, low-power hardware.

There are multiple potential implementations for data harvesting.

It is possible to create peer nodes with elected masters for each Pi group: the master aggregates the group's data and computes the vector before submitting directly to Influx, avoiding Redis as a middleman.
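A sketch of the two pieces that scheme needs. The election here is a deliberately simple deterministic rule (lowest node id wins), not a full consensus protocol, and the aggregation is a plain mean; both are assumptions for illustration:

```rust
/// Elect a group master deterministically: the node with the
/// lexicographically smallest id wins. A stand-in for a real
/// election/consensus mechanism.
pub fn elect_master<'a>(group: &[&'a str]) -> Option<&'a str> {
    group.iter().copied().min()
}

/// The elected master condenses its group's scalar readings into one
/// value before submitting upstream to Influx (skipping Redis).
/// A plain mean is used here as a placeholder aggregate.
pub fn aggregate_rssi(samples: &[i8]) -> Option<f32> {
    if samples.is_empty() {
        return None;
    }
    let sum: i32 = samples.iter().map(|&s| s as i32).sum();
    Some(sum as f32 / samples.len() as f32)
}
```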

For determining signal amplitude, it's possible to use pre-existing tools such as the aircrack-ng suite (airmon-ng/airodump-ng) to dump data into parsing streams, or to parse data manually from lower-level streams.
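If we parse the lower-level streams ourselves, monitor-mode captures prepend a radiotap header, whose antenna-signal field carries the dBm amplitude. A sketch under the simplifying assumption that antenna signal is the only present field (present bitmap == 0x20, no extension words); a real parser must walk the full bitmap and honor per-field alignment:

```rust
/// Extract the antenna-signal (dBm) field from a radiotap header,
/// assuming it is the ONLY present field. Header layout: u8 version,
/// u8 pad, u16 length (LE), u32 present bitmap (LE), then fields.
pub fn rssi_from_radiotap(buf: &[u8]) -> Option<i8> {
    if buf.len() < 9 || buf[0] != 0 {
        return None; // radiotap version must be 0
    }
    let present = u32::from_le_bytes([buf[4], buf[5], buf[6], buf[7]]);
    if present != 0x20 {
        return None; // bail unless only the antenna-signal bit (5) is set
    }
    Some(buf[8] as i8) // signed dBm value right after the 8-byte header
}
```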


Given the limited time we have to work on this, I'm thinking of creating a PWA site to render the data for users. This has a few benefits and drawbacks.

Benefits being:

  • PWA is actually on the list, along with GCP and K8s, of technologies considered part of the Google platform.
  • No need for separate native apps (or React Native) for each platform (Android and iOS); a PWA is accessible from any device, including computers.


Drawbacks being:

  • Not native; misses out on some of the native APIs provided by each platform.

Potential frontend stacks would be:

  • Webpack + Vue + TS
  • Webpack + React (could also go with React Native) + TS

Other stacks could be used, but those seem to be the fastest to develop and maintain.
