Project | Long Running Elixir Benchmarks
Organization | Beam Community
Project site | https://elixirbench.org
Repositories | https://github.com/elixir-bench
GSoC link | https://summerofcode.withgoogle.com/organizations/6486585449644032
Mentors | Tobias Pfeiffer, Michal Muskala
Student | Tallys Martins
ElixirBench aims to provide a performance-monitoring service, similar to a continuous integration system, that displays results in meaningful graphs for analysis and comparison between code changes. The project goal was to get the service up and running, with the basic features needed to monitor at least two Elixir projects. For this purpose we developed new features, fixed bugs, set up servers, and wrote benchmarks for the Ecto and Plug projects.
We did a lot of work during the project and reached most of our main goals. Many contributions were made, from design and documentation to new features and server setup, putting many different skills into practice. Working on ElixirBench was a very pleasant experience, with lots of learning and fun.
Different kinds of contributions were made to the different repositories of the project. The pull requests are listed below, grouped by repository, each group preceded by a short description for reference:
This is the frontend application, built with React. Here we made several improvements, such as new pages, menu links, bug fixes, and features like displaying the status of a job on the job list page.
- (merged) Adds job status icon to repos page
- (merged) Add link to wiki in homepage
- (merged) Change back_to link in job details page
- (merged) Add deploy instructions to README.md
- (merged) Homepage
- (merged) Fix Typography component props
- (merged) Adds static page links to Navigation
- (merged) Remove runTimes from queries
- (merged) Adds LICENSE.md
This repository, the API server, received most of the contributions. We made several enhancements to the code base, improving the test suite and refactoring where needed. In the process we detected and fixed bugs, which in turn raised our confidence for making further changes. The most important feature added was the integration with GitHub webhooks to automate the execution of jobs.
- (open) Change percentiles type in Measurements to :map
- (merged) Adds exit status code to Job
- (merged) Add test cases to Config
- (merged) Adds tallysmartins to cors origins
- (merged) Adds gigalixir deploy setup and documentation
- (merged) Changes in Measurements required fields
- (merged) Improvements in Job config, adds support to Wait options
- (merged) Avoid job duplication by Github webhooks
- (merged) Change mode attribute to array in Measurements
- (merged) Fix branch name from Push event
- (merged) Adds Github webhook endpoint
- (merged) Adds Repo.slug/1 helper function
- (merged) Apply format to remaining files
- (merged) Removes OTP 19.3 from CI
- (merged) Apply format to Github/
- (merged) Apply format to elixir_bench/benchmarks/
- (merged) Benchmarks tests
- (merged) Mix format 2 of 9
- (merged) Adds coveralls folder to gitignore
- (merged) Apply mix format to tests (1 of 9)
- (merged) Prevents build failure due to coveralls outage
- (merged) Refactor Github.Client
- (merged) Adds tests to ElixirBench.Schema endpoinds
- (merged) Adds coveralls setup
- (merged) Ci setup
- (merged) Adds LICENSE.md
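The webhook flow behind the GitHub integration, where GitHub posts a payload signed with a shared secret and the server verifies it before scheduling a job, can be sketched in plain Elixir. This is an illustrative module only, not ElixirBench's actual implementation; `GithubSignature` and its function name are made up for the example. GitHub sends the HMAC of the raw request body in the `X-Hub-Signature-256` header:

```elixir
defmodule GithubSignature do
  # Illustrative sketch, not ElixirBench's actual code.
  # Recomputes the HMAC-SHA256 of the raw request body with the
  # shared webhook secret and compares it against the header value.
  def valid?(raw_body, signature_header, secret) do
    digest =
      :crypto.mac(:hmac, :sha256, secret, raw_body)
      |> Base.encode16(case: :lower)

    # Production code should use a constant-time comparison
    # (e.g. Plug.Crypto.secure_compare/2) to resist timing attacks.
    "sha256=" <> digest == signature_header
  end
end
```

Only requests that pass this check would go on to create a job, which is also why deduplication (see the "Avoid job duplication by Github webhooks" PR above) matters: GitHub may deliver the same event more than once.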
The runner application is responsible for running the benchmarks, collecting the results, and sending them to the API server. Many changes were made to this repository to follow the modifications in the other applications, along with small fixes for issues detected along the way. The biggest difficulty here was integrating changes that impacted the other services and ensuring everything kept working. Here are the contributions made so far:
- (merged) Add exit status to job results data
- (merged) Adds deploy docs
- (merged) Get rid of warnings and check them in CI
- (merged) Some good changes in Job module
- (merged) Change env variables in config
- (merged) Setup Travis CI
- (merged) Apply mix format
- (merged) Adds LICENSE.md
- (merged) Adds env variable to set server-runner url
The runner container holds our Docker files, which ship with the Elixir dependencies for each version of the language that we support. There was only one contribution to this repository: adding support for synchronization between the containers that run the benchmarks and external dependencies, such as databases. This feature had a big impact in terms of affected code, also requiring changes in the Api and Runner repositories.
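To illustrate the kind of setup this synchronization enables, a benchmarked project's job configuration might look roughly like the following. The keys and values here are illustrative of the deps/wait idea only, not the exact ElixirBench config schema:

```yaml
# Hypothetical benchmark job configuration; key names are illustrative.
elixir: "1.6.4"
erlang: "20.3"
deps:
  docker:
    # External dependency started alongside the benchmark container.
    - container_name: postgres
      image: postgres:10.3
      # Synchronization: the runner waits until the database is
      # reachable on this port before starting the benchmarks.
      wait:
        port: 5432
```

Without such a wait step, benchmarks that hit the database could start before it accepts connections and fail spuriously.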
One of our goals was to monitor the performance of at least two Elixir projects using ElixirBench. For this, we wrote benchmarks for Ecto and Plug, and we also set up a demo application for documentation purposes.
- (merged) Ecto Benchmarks
- (closed) Plug Benchmarks
- (merged) Demo Application
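The benchmark suites themselves are built on the Benchee library. The core idea, running a scenario many times and reporting timing statistics, can be sketched with the standard library alone; `MiniBench` below is a toy illustration for this report, while Benchee adds warmup, multiple scenarios, percentiles, memory measurements, and formatters on top:

```elixir
defmodule MiniBench do
  # Toy benchmark runner for illustration only. Times `fun` `n` times
  # with :timer.tc/1, prints and returns the average runtime in µs.
  def run(name, fun, n \\ 100) do
    times =
      for _ <- 1..n do
        {micros, _result} = :timer.tc(fun)
        micros
      end

    avg = Enum.sum(times) / n
    IO.puts("#{name}: #{avg} µs on average over #{n} runs")
    avg
  end
end

list = Enum.to_list(1..1_000)
MiniBench.run("Enum.map/2", fn -> Enum.map(list, &(&1 * 2)) end)
```

The real suites benchmark Ecto queries and Plug request handling in this style, and the runner collects the resulting statistics and reports them to the API server.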
There is still a lot of work to be done on ElixirBench, and I am looking forward to continuing to contribute to it. The following list presents some issues we would like to address in the near future:
- Integrate with the GitHub App in the Marketplace
- Enhancements on the graphs page (datetime, title, and information about hardware)
- Add max retries to job claiming, so the runner does not keep claiming a job that always fails
- Display project benchmarks per branch
- Wrap ElixirBench standard configurations in a formatter
- Be able to search repositories
- Link retried jobs to their parent (today a retry is a completely new job that is not linked to the original one)
- Be able to save/download the docker-compose file from a job
- Automatically detect and notify about performance regressions in benchmark runs
- Automate deploy and define the release workflow
- Build container releases for more Elixir and Erlang versions and push them to Docker Hub
- Add about page
- Add a contact page
- Add help/docs page
- Write about config.yml settings
- Write about runners and infrastructure
- Write about benchmarks and examples
- Write about how to contribute
- Define the issues/development workflow
I would like to thank my mentors, Tobias Pfeiffer and Michal Muskala, for all the support and patience. They supervised my work with great wisdom and taught me many things.