
Bringing Object-Oriented Metrics to PMD

GSoC 2017 final report

Organisation: PMD

Mentor: Andreas Dangel andreas.dangel@adangel.org

Student: Clément Fournier clement.fournier76@gmail.com

Motivation

PMD has many rules that report violations based on the value of some metric, such as StdCyclomaticComplexityRule or NPathComplexityRule. Another rule, GodClassRule, combines conditions on the values of several metrics to detect the God Class code smell, an approach covered extensively in [Lanza05].

There are a couple of problems with the implementation of those rules:

  • These rules use their AST visitor to compute both the metric and the condition violations, which makes the code hard to read and breaks the separation of concerns. In the case of GodClassRule, the computation logic of the three metrics is largely intertwined, which makes the rule even harder to read and maintain.
  • When different rules use the same metric, each rule maintains its own implementation of it, so changes are applied inconsistently across rules. Moreover, the metric is computed several times instead of the result being shared between rules. For instance, GodClassRule maintains its own version of the cyclomatic complexity metric, separate from that of CyclomaticComplexityRule.

My project aimed to implement a unified framework dedicated to computing metrics, which solves the above problems by memoizing results, thus sharing them between rules, and by keeping the implementation of metrics separate from that of the rules that use them. The main focus was to provide a very straightforward API to rule developers, in order to foster adoption and declutter rules.
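To illustrate the memoization idea, here is a minimal sketch of how such result sharing can work. All names (Metric, MetricsFacade) are invented for illustration and are not the framework's actual types.

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the memoization idea: a metric is computed at most
// once per AST node, and the cached result is shared by every rule that
// queries it. Names are placeholders, not PMD's actual API.
interface Metric<N> {
    String name();
    double computeFor(N node);
}

final class MetricsFacade<N> {

    // Cache keyed by metric name and node identity; a real implementation
    // would reset this cache between analysed files.
    private final Map<String, Double> memo = new HashMap<>();

    double get(Metric<N> metric, N node) {
        String key = metric.name() + "@" + System.identityHashCode(node);
        return memo.computeIfAbsent(key, k -> metric.computeFor(node));
    }
}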

This newfound ease in writing and using metrics was also expected to give rise to a new family of rules that detect code smells using metrics, as described in [Lanza05].

[Lanza05]: Lanza and Marinescu, Object-Oriented Metrics in Practice, 2005.

Project outcome

Metrics Framework

The framework, known within PMD as the Metrics Framework, reached its goal of easing the implementation and use of metrics, and has been enhanced with many other features. Here are some highlights:

  • The framework provides a library of already implemented metrics, which are documented on PMD's documentation site.
  • The API to use them is easy to learn and fully documented here. It was enhanced over the summer with the ability to pass options to metrics and to compute aggregate results, such as the sum of a metric over the operations of a class (a usage sketch follows this list).
  • Users can also define their own metrics effortlessly; that API is documented here as well. In fact, the framework makes no distinction between custom and standard metrics, the latter only being more readily accessible.
  • Most of the framework's functionality is abstracted into reusable classes that sit in PMD's language-agnostic module. As a proof of concept, a metrics framework has also been implemented for the Apex language module.
  • Metrics reuse the rules' testing framework, and can thus be unit-tested smoothly.
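To make these capabilities concrete, here is a hedged sketch of what querying metrics from a rule can look like. The façade interface, metric keys, option name, and result option shown here are illustrative assumptions, not the framework's real identifiers.

// Illustrative sketch only: placeholder names and signatures, not PMD's
// real API. It shows the capabilities listed above: querying a metric on
// a node, passing an option, and requesting an aggregate over the
// operations of a class.
class MetricUsageSketch {

    // Placeholder façade standing in for the framework's entry point.
    interface Metrics {
        double get(String metricKey, Object node, String... options);
        double getAggregate(String metricKey, Object classNode, String resultOption);
    }

    void inspectClass(Object classNode, Metrics metrics) {
        // Plain metric query on a class node.
        double wmc = metrics.get("WMC", classNode);

        // Metric computed with an option (the option name is hypothetical).
        double cyclo = metrics.get("CYCLO", classNode, "ignoreBooleanPaths");

        // Aggregate result: the sum of an operation metric over all
        // operations of the class.
        double totalCyclo = metrics.getAggregate("CYCLO", classNode, "SUM");

        System.out.printf("WMC=%.0f, CYCLO=%.0f, total CYCLO=%.0f%n",
                          wmc, cyclo, totalCyclo);
    }
}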

Impact on rules

  • Most of the existing rules using metrics have been refactored to use the new framework, dramatically reducing the size of the code (compare the old GodClassRule with the new one) while making it much clearer (a sketch of such a refactored rule follows this list).
  • PMD now has a rule that detects Data Classes, leveraging metrics to assess the complexity of a class and the quality of its interface.
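Below is a minimal sketch of the shape a refactored, metric-based rule can take once metric computation is delegated to the framework. The thresholds follow the spirit of the God Class detection strategy in [Lanza05], but the names, signatures and exact values are placeholders, not the actual GodClassRule or Data Class rule code.

// Minimal sketch: a rule that merely combines memoized metric results
// instead of computing metrics itself. Names and thresholds are
// placeholders, not the actual PMD rule implementation.
class GodClassLikeRuleSketch {

    // Thresholds in the spirit of [Lanza05]; the real rule's values may differ.
    private static final double WMC_VERY_HIGH = 47;
    private static final double TCC_ONE_THIRD = 1.0 / 3;
    private static final double ATFD_FEW = 5;

    // Placeholder for the framework façade, as in the sketches above.
    interface Metrics {
        double get(String metricKey, Object classNode);
    }

    boolean isGodClass(Object classNode, Metrics metrics) {
        // The rule no longer contains any metric computation logic of its own.
        double wmc  = metrics.get("WMC", classNode);   // weighted method count
        double tcc  = metrics.get("TCC", classNode);   // tight class cohesion
        double atfd = metrics.get("ATFD", classNode);  // access to foreign data
        return wmc >= WMC_VERY_HIGH
            && tcc < TCC_ONE_THIRD
            && atfd > ATFD_FEW;
    }
}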

Usage

~ $ cd ~/bin/pmd-bin-6.0.0-SNAPSHOT/bin
~/.../bin $ ./run.sh pmd -d $PROJECT_DIR/src/main/java/ -f textcolor -R rulesets/java/metrics.xml
 TODO output

Future work

  • Although the project has been a success on many fronts, it was originally intended that the framework be able to provide metrics with information about other processed files. That would have made possible the implementation of more sophisticated metrics and antipattern detection strategies. This feature has been postponed due to insufficient foresight with respect to incremental analysis. It will be implemented by a dedicated framework, thus segregating metrics and rules that need multi-file analysis from single-file metrics; that framework is still in the design phase.

  • Provide custom XPath functions to access metrics in XPath rules

Chronological PR summary

  • #409: [java] Groundwork for the upcoming metrics framework
  • #436: [java] Metrics framework tests and various improvements
  • #440: [core] Created ruleset schema 3.0.0 (to use metrics)
  • #451: [java] Metrics framework: first metrics + first rule
  • #482: [java] Metrics testing framework + improved capabilities for metrics
  • #495: [core] Custom rule reinitialization code
  • #499: [java] Metrics memoization tests
  • #505: [java] Followup on metrics
  • #511: [core] Prepare abstraction of the metrics framework
  • #517: [doc] Metrics documentation
  • #523: [java] Npath complexity metric and rule
  • #529: [java] Abstracted the Java metrics framework
  • #542: [java] Metrics abstraction
  • #545: [apex] Apex metrics framework
  • #548: [java] Metrics documentation
  • #555: [java] Changed metrics/CyclomaticComplexityRule to use WMC when reporting classes
  • #557: [java] Fix NPath metric not counting ternaries correctly
  • #567: [java] Last API change for metrics (metric options)
  • #573: [java] Data class rule
  • #578: [java] Refactored god class rule
  • #580: [core] Add AbstractMetric to topple the class hierarchy of metrics
  • #583: [java] Documentation about writing metrics