GSoC 2018 Final Report
During GSoC, I worked under the Probot organization to create a GitHub App that performs a "background check" to identify users who have been toxic in the past, and shares their toxic activity in the maintainer’s repo.

The main challenges were:
- Collect a user's public comments.
- Find a way to analyse those comments for toxicity.
- Avoid using a database.
- Write modular code that is easy to test.
Solution to the Challenges
Collect a user's public comments.
Used the GitHub Search API to find issues on which a user has commented, then, for each issue, fetched its comments and filtered them to keep only that user's comments.
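The flow above can be sketched roughly as follows. The `octokit` parameter stands in for an authenticated `@octokit/rest` client (`search.issuesAndPullRequests` and `issues.listComments` are its method names); the helper names themselves are illustrative, not the app's actual code.

```javascript
// Keep only the comments authored by `login` (pure helper, easy to unit-test).
function filterByAuthor(comments, login) {
  return comments.filter((c) => c.user && c.user.login === login);
}

// Sketch of the collection flow, assuming an authenticated Octokit client.
async function collectUserComments(octokit, login) {
  // 1. Search for issues the user has commented on.
  const { data } = await octokit.search.issuesAndPullRequests({
    q: `commenter:${login}`,
  });

  const all = [];
  for (const issue of data.items) {
    // 2. Fetch every comment on each issue...
    const [owner, repo] = issue.repository_url.split("/").slice(-2);
    const res = await octokit.issues.listComments({
      owner,
      repo,
      issue_number: issue.number,
    });
    // 3. ...and keep only the ones the user wrote.
    all.push(...filterByAuthor(res.data, login));
  }
  return all;
}
```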
Find a way to analyse comments for toxicity.
Used the Google Perspective API to run toxicity analysis on the user's public comments.
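A Perspective request looks roughly like this. The endpoint and request/response shape follow Perspective's `v1alpha1` `comments:analyze` API; the helper names are made up for illustration, and Node 18+ is assumed for the global `fetch`.

```javascript
// Sketch of scoring one comment with the Google Perspective API.
const PERSPECTIVE_URL =
  "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze";

// Pull the summary toxicity score (a value in 0..1) out of a response body.
function toxicityScore(response) {
  return response.attributeScores.TOXICITY.summaryScore.value;
}

// `apiKey` is a Perspective API key obtained from Google Cloud.
async function analyseComment(apiKey, text) {
  const res = await fetch(`${PERSPECTIVE_URL}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      comment: { text },
      requestedAttributes: { TOXICITY: {} },
    }),
  });
  return toxicityScore(await res.json());
}
```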
Avoid using a database.
Used GitHub as a database™. To record which GitHub users had already been analysed, the app created an issue for each analysed user in a dedicated GitHub repo. The GitHub Search API was then used to check whether such an issue exists for a user; if it does, that user's analysis has already been done.
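The trick can be sketched like this. Again `octokit` stands in for an authenticated `@octokit/rest` client, and the tracker repo name is hypothetical.

```javascript
// Hypothetical tracker repo that serves as the "database".
const TRACKER_REPO = { owner: "some-org", repo: "analysis-tracker" };

// A search result with at least one hit means the user was already analysed.
function alreadyAnalysed(searchResult) {
  return searchResult.total_count > 0;
}

// Record that a user has been analysed by opening a marker issue.
async function markAnalysed(octokit, login) {
  await octokit.issues.create({ ...TRACKER_REPO, title: login });
}

// Check for an existing marker issue via the Search API.
async function wasAnalysed(octokit, login) {
  const { data } = await octokit.search.issuesAndPullRequests({
    q: `"${login}" in:title repo:${TRACKER_REPO.owner}/${TRACKER_REPO.repo}`,
  });
  return alreadyAnalysed(data);
}
```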
Write modular code that is easy to test.
Followed the Single Responsibility Principle and grouped code into appropriate folders. Used dependency injection, so modules receive their dependencies as arguments instead of requiring them directly, which makes those dependencies easy to mock in tests.
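As a minimal sketch of that dependency-injection style (the function names here are hypothetical, not the app's actual modules): the factory takes its collaborators as parameters, so tests can pass in stubs instead of real API clients.

```javascript
// A module built via dependency injection: it never require()s its
// collaborators; they are passed in, so tests can substitute stubs.
function makeBackgroundCheck({ fetchComments, scoreToxicity }) {
  // Returns true when any of the user's comments scores above the threshold.
  return async function isToxic(login, threshold = 0.8) {
    const comments = await fetchComments(login);
    const scores = await Promise.all(comments.map(scoreToxicity));
    return scores.some((s) => s > threshold);
  };
}

// In a test, the real GitHub/Perspective clients are swapped for fakes:
const isToxic = makeBackgroundCheck({
  fetchComments: async () => ["you are awful", "thanks!"],
  scoreToxicity: async (text) => (text.includes("awful") ? 0.95 : 0.05),
});
```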
Things I Learned
- Writing unit tests with Jest.
- Mocking dependencies in Jest.
- Making extensive use of the GitHub APIs.
- Writing inline code documentation.
- Designing an event-driven application run via webhooks.
What is done
- The GitHub App successfully identifies users who have been hostile in the past.
- The GitHub App successfully sets up a discussion board for maintainers.
What more can be done
- Make the GitHub App configurable via a config.yml file.
- Improve performance by increasing concurrency; the async library can help with this.
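The async library's `mapLimit` provides bounded concurrency out of the box; a dependency-free sketch of the same idea, shown here only to illustrate what "increasing concurrency" would mean for the comment-fetching requests, looks like this:

```javascript
// Run `worker` over `items` with at most `limit` promises in flight at once.
async function mapLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;

  // Each runner repeatedly pulls the next unclaimed index from a shared queue.
  async function run() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  }

  // Start `limit` runners (or fewer, if there are fewer items) and wait for all.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, run));
  return results;
}
```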