@sommersoft
Last active January 6, 2020 03:53
sommersoft's CircuitPython 2019 Reflections

2019: What A Year For Our Patron Snake


My 2019 In Review

My contributions in 2019 fell largely into the “automation/tooling” category.

  • Adabot saw her capabilities and responsibilities grow quite a bit. She is now integral to the circuitpython.org website, providing daily library information updates that include open pull requests, open issues, and infrastructure issues. This gives greater visibility into ways individuals can contribute by providing a single source of information. She also went through two host migrations: from travis-ci.org to travis-ci.com, and more recently from travis-ci.com to GitHub Actions. Outside of that, her updates were largely maintenance related, a lot of which were caused by my own missteps in maintenance ;-).

  • CircuitPython’s core documentation saw a small improvement, thanks largely to @danh’s work on changing the way in which board dependencies are structured and discovered. The change allowed a much more standard way to determine each board’s included modules. Using that standard structure, the generated documentation can now depict the board<-->module relationship without requiring manual input.
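The board-to-module discovery described above can be sketched roughly as follows. This is an illustrative parser, not the actual docs build script: the file name (mpconfigboard.mk) and the CIRCUITPY_* flag convention match CircuitPython's board configuration style, but the function and sample text here are my own assumptions.

```python
# Hedged sketch: how a docs build step might discover each board's
# included modules from its make-style configuration. Flags set to 1
# mean the module is compiled into that board's firmware.

def parse_board_modules(mk_text):
    """Return the set of module names enabled via CIRCUITPY_* = 1 flags."""
    enabled = set()
    for line in mk_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line.startswith("CIRCUITPY_") or "=" not in line:
            continue
        name, _, value = line.partition("=")
        if value.strip() == "1":
            enabled.add(name.strip()[len("CIRCUITPY_"):].lower())
    return enabled

sample = """
CIRCUITPY_AUDIOIO = 1
CIRCUITPY_FREQUENCYIO = 1
CIRCUITPY_GAMEPAD = 0  # not on this board
"""
print(sorted(parse_board_modules(sample)))  # → ['audioio', 'frequencyio']
```

With a standard structure like this in every board directory, the documentation generator can build the board/module support matrix mechanically instead of by hand.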

  • Libraries! Wow… There were only 125 libraries on January 6th, 2019, according to Adabot’s automated updates to the Library Bundle’s GitHub repository (did I mention she’s awesome?). On December 18th, 2019? 205 libraries! Now, 80 additional libraries may not seem like that big a deal. But from an ecosystem maintenance perspective, I find it to be a big deal. Thanks to the team’s forethought early on, the structure of the ecosystem goes a long way toward minimizing the individual effort needed for maintenance. There is still effort involved, though, so a big shout-out to @kattni and all those that helped with that! While I personally contributed little in the way of the libraries themselves, I kept with my 2019 theme and was involved in the automation/toolchain. The bundling mechanism, circuitpython-build-tools, saw some improvements and some requirements changes. Adabot continued to do her part, and applied each library’s released updates to the Library Bundle. And like Adabot, each library is transitioning from TravisCI to GitHub Actions. Shout-out to @siddacious, @dherrada, @MakerMelissa, and whomever I’m forgetting for the collaborative effort on that transition.

  • The CircuitPython core saw a lot of action this past year, as others will no doubt document. My only core contributions came in the form of finishing my addition of the frequencyio.FrequencyIn module to the SAMD51-based boards, and finishing @DeanM’s initial work on the audiomixer.MixerVoice module.

  • Automated Physical Testing: I actually spent a good chunk of time on this in 2019, and have not done a great job at explaining what I’ve focused on week-to-week. So this bullet point is going to be a long one.

    • In the spring of 2019, I started to think about how we could test each pull request on a physical board. We have the MicroPython tests which run in the Unix port, but those really don’t touch the CircuitPython-specifics. @tannewt used to have a physical testing interface, named Rosie, that did some automated testing with physical hardware. If I remember correctly, it was run through a NUC, but did not go beyond loading firmware, and running some import statements.

    • By May, I had sketched out a rough theory to approach the solution. When a new pull request is made, a GitHub App could trigger a webhook to initiate a test event. That webhook would be handled by a Raspberry Pi (RPi). The RPi would then grab the pull request code, build the firmware, interact with any attached CircuitPython capable board(s) via USB, and provide the results back to GitHub through the Checks API. As a bonus, the RPi and the board(s) could interact via GPIO to verify a set of test conditions. Simple, right?
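The receiving end of that flow can be sketched with nothing but the standard library. This is a minimal illustration of the idea, not RosiePi's actual code: the port, the event filtering, and the "queue a test run" step are assumptions.

```python
# Hedged sketch of the RPi-side webhook flow: receive a GitHub
# "pull_request" event and decide whether it should trigger a
# firmware build and an on-board test run.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def should_run_tests(event, payload):
    """Only build/test when a PR is opened or gets new commits."""
    return event == "pull_request" and payload.get("action") in ("opened", "synchronize")

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        event = self.headers.get("X-GitHub-Event", "")
        if should_run_tests(event, payload):
            sha = payload["pull_request"]["head"]["sha"]
            # A real node would now: fetch the PR, build the firmware,
            # flash the attached board over USB, run the tests, and
            # report results back through the Checks API.
            print(f"queueing test run for {sha}")
        self.send_response(202)
        self.end_headers()

# On the Pi: HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

GitHub delivers the event name in the X-GitHub-Event header and the PR details in the JSON body, so the handler only needs those two pieces to decide whether a build is warranted.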

    • By mid-June, I had a working GitHub App integration as well as a [separately] working prototype on a Raspberry Pi connected to a Metro M4 Express. And thus RosiePi was born as a proof-of-concept. The GitHub interactions were done using GitHub’s tutorials, and relied on Ruby and the ngrok-like service Smee. The testing machinery on the RPi was all written in Python.

    • I then shifted my focus to the planned architecture of the RPi itself, and how to provide a reliable, repeatable environment using a configuration-as-code approach. I spent a couple months learning and battling with Ansible. I chose Ansible because it is Python-based and can run on an RPi. It also supports a pull mode (ansible-pull), even though it is primarily designed for push-style deployment.

    • Around October, after some time fleshing out the finer points, I had a fully integrated system working. And that is when it hit me. The system design as a whole was not at all modular. The GitHub App could only have one webhook, which means only one Raspberry Pi (or one instance of the Smee client, rather) could react to events. I wanted to avoid a single point of failure, especially one that relied only on my personal attention/infrastructure. Additionally, since the GitHub Checks API is limited, the results page within GitHub was the only information available (in Markdown, no less). This would force either an overly long and confusing results page, or inadequate results data. So, to quote Raymond Hettinger: There’s got to be a better way!

    • At that point, I basically entered into the realm of “design your own CI platform from scratch”, territory in which I have zero experience. I spent a few weeks running through various “cloud” options, and pricing them out. I ultimately decided on using Microsoft Azure, in a serverless app configuration. This should keep operating costs lower than running a full VPS (like DigitalOcean, Azure VM, etc).

    • By the beginning of December I had the Azure platform, codenamed physaCI, handling the PR webhooks from the GitHub App. Throughout the rest of the month, I hammered out a registrar system on physaCI to provide push notifications to RosiePi “nodes”. The registrar will minimize compute costs by avoiding constant polling (or HTTP long-polling sessions). Over the Christmas holiday break, I started writing the new client-side RosiePi software that will interact with physaCI. And that’s where the project is at.
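The registrar idea above can be sketched in a few lines: nodes register a push endpoint and periodically renew, and physaCI only pushes jobs to registrations that haven't expired. The class, names, and lease/TTL mechanics here are illustrative assumptions, not physaCI's actual design.

```python
# Hedged sketch: RosiePi nodes register a push endpoint with the
# registrar instead of polling for jobs. Registrations expire after
# a TTL, so stale nodes drop out automatically.
import time

class Registrar:
    def __init__(self, ttl=300):
        self.ttl = ttl      # seconds a registration stays valid
        self.nodes = {}     # node_id -> (push_url, expires_at)

    def register(self, node_id, push_url, now=None):
        """A node (re-)registers itself; the lease expires after ttl."""
        now = time.time() if now is None else now
        self.nodes[node_id] = (push_url, now + self.ttl)

    def active_nodes(self, now=None):
        """Push targets whose leases are still current."""
        now = time.time() if now is None else now
        return [url for url, expires in self.nodes.values() if expires > now]

reg = Registrar(ttl=300)
reg.register("rosiepi-node-1", "https://node1.example/jobs", now=1000)
reg.register("rosiepi-node-2", "https://node2.example/jobs", now=1000)
print(reg.active_nodes(now=1100))  # both nodes still registered
print(reg.active_nodes(now=1400))  # leases expired; nodes must re-register
```

Compared with each node polling on an interval, this keeps serverless compute costs down: the platform does work only when a job actually arrives, and the per-node cost is one cheap renewal request per TTL window.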

Summary

As a self-taught hobbyist, with no professional experience “in the industry”, I can hardly believe that I’ve been a contributing community member for 2 years now. The amount I’ve learned this year alone is equally surprising to me. It may seem like lip service or puffery, but please take the following with heart-felt gravity: this community is special. Having this space that allows me to experiment, learn, expand, and stumble while offering nothing short of positive inspiration has been amazing. I thank each and every one of you!
