Based on a weighted arithmetic mean (or a similar aggregate) of the following factors - see the sketch after this list:
- static analysis tool results (number of bugs, warnings, number of tools used)
- test coverage
- number of audits performed, audit score & auditor ratings (even normal users can audit, with a minimum rating per auditor)
- time on testnets without bugs
- number of users on testnets
- time on mainnet without bugs
- number of users on mainnet
- documentation/specs rating (how good is their documentation/spec, how easy it is to understand; protocol transparency)
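
A minimal sketch of how this aggregate could work, assuming each factor has already been normalized to a score in [0, 1]; the factor names and weights below are illustrative placeholders, not decided values:

```typescript
// Hypothetical factor scores, each normalized to [0, 1] beforehand.
interface FactorScores {
  staticAnalysis: number; // static analysis tool results
  testCoverage: number;   // test coverage
  audits: number;         // audits performed, audit score, auditor ratings
  testnetTime: number;    // time on testnets without bugs
  testnetUsers: number;   // number of users on testnets
  mainnetTime: number;    // time on mainnet without bugs
  mainnetUsers: number;   // number of users on mainnet
  documentation: number;  // documentation/spec rating
}

// Illustrative weights; the real weights are still to be decided.
const WEIGHTS: Record<keyof FactorScores, number> = {
  staticAnalysis: 0.15,
  testCoverage: 0.15,
  audits: 0.2,
  testnetTime: 0.1,
  testnetUsers: 0.05,
  mainnetTime: 0.15,
  mainnetUsers: 0.1,
  documentation: 0.1,
};

// Weighted arithmetic mean: sum(w_i * s_i) / sum(w_i).
function securityScore(scores: FactorScores): number {
  let weighted = 0;
  let total = 0;
  for (const key of Object.keys(WEIGHTS) as (keyof FactorScores)[]) {
    weighted += WEIGHTS[key] * scores[key];
    total += WEIGHTS[key];
  }
  return weighted / total;
}
```

Any "or similar" alternative (e.g. a weighted geometric mean that punishes a single very low factor harder) would only change the final formula, not the inputs.
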
- minimum test coverage needed before an audit (see the config sketch after this group of prerequisites)
- static analysis tools to run & links to documentation on how to use them
- list of bug patterns to avoid
- clients should provide a list of known vulnerabilities that the system has
- minimum period of time on a testnet without bugs before an audit
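
One way these prerequisites could be made machine-checkable is a small config the client submits before an audit; this is only a sketch, and every field name, tool name, and threshold value in it is a hypothetical placeholder:

```typescript
// Hypothetical pre-audit checklist a client would submit; every
// field name and value below is a placeholder, not a decided rule.
interface PreAuditChecklist {
  minTestCoverage: number;            // e.g. 0.9 = 90% coverage required
  staticAnalysisTools: {
    name: string;                     // tool to run
    docsUrl: string;                  // link to documentation on how to use it
  }[];
  bugPatternsToAvoid: string[];       // list of bug patterns to avoid
  knownVulnerabilities: string[];     // vulnerabilities the client discloses
  minTestnetDaysWithoutBugs: number;  // minimum bug-free time on a testnet
}

// Example instance (illustrative values only).
const checklist: PreAuditChecklist = {
  minTestCoverage: 0.9,
  staticAnalysisTools: [
    { name: "example-analyzer", docsUrl: "https://example.com/docs" },
  ],
  bugPatternsToAvoid: ["reentrancy", "unchecked external call"],
  knownVulnerabilities: ["admin key can pause transfers"],
  minTestnetDaysWithoutBugs: 30,
};
```
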
- number of bugs found after a contract has been audited (see the scoring sketch after this group)
- if bugs are found by users - auditors lose more points
- if bugs are found by other auditors, the initial auditors lose points (fewer than if the bug was found by a normal user)
- if auditors find bugs missed by other auditors, their rating increases more
- more points if the bugs found follow previously unknown patterns
- more points if bugs found are critical (rating system for types of bugs):
  - critical = users lose a lot of money (we still need to define a minimum amount in USD/EUR) or the system becomes unusable
  - severe = some users lose some money (we still need to define the amounts), or some users cannot use the system (we need to define how many)
  - important = smaller amounts of money can be lost (we still need to define the minimum); the system can be affected or manipulated, but it can still function
  - medium = money can be lost, but only through unintended usage by the user (not from an attack)
  - low = bugs that do not result in loss of money or an unusable system, but have unintended consequences
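
A sketch of how the point adjustments above could combine with the severity classes: who found the bug sets the base penalty for the original auditors, and severity acts as a multiplier. All numeric values are placeholders, since the amounts still need to be defined:

```typescript
// Severity classes from the list above; multipliers are placeholders.
type Severity = "critical" | "severe" | "important" | "medium" | "low";

const SEVERITY_MULTIPLIER: Record<Severity, number> = {
  critical: 5,
  severe: 4,
  important: 3,
  medium: 2,
  low: 1,
};

// Who found a bug after the audit; a normal user finding it is worse
// for the original auditors than another auditor finding it.
type Finder = "user" | "auditor";

const BASE_PENALTY: Record<Finder, number> = {
  user: 10,   // auditors lose more points if a normal user finds the bug
  auditor: 5, // fewer points lost if another auditor finds it
};

// Placeholder bonus for bugs following previously unknown patterns.
const UNKNOWN_PATTERN_BONUS = 1.5;

// Points subtracted from each original auditor's rating.
function auditorPenalty(finder: Finder, severity: Severity): number {
  return BASE_PENALTY[finder] * SEVERITY_MULTIPLIER[severity];
}

// Points added to the finding auditor's rating.
function finderReward(severity: Severity, unknownPattern: boolean): number {
  const base = SEVERITY_MULTIPLIER[severity];
  return unknownPattern ? base * UNKNOWN_PATTERN_BONUS : base;
}
```
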
- are the ratings (security etc.) present and visible to users (see the checklist sketch at the end of this list)
- UI/UX rating - is the product interface secure enough; does it limit unintended use?
- transparency - is the user aware of what actions are performed in the background (high level overview)
- documentation on actions - are the actions (calls to contracts etc.) explained clearly
- video demos - does the project provide clear demos of how to use the software properly
- is the API clearly explained and transparent
- presence of system/protocol specs
- presence of visual representation of the system (diagrams, images, videos etc.)
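
These usability and transparency criteria could likewise feed a sub-score via a simple checklist; the field names below are assumptions made for illustration:

```typescript
// Hypothetical UI/UX & transparency checklist; each boolean answers one
// of the questions above, and the sub-score is the fraction satisfied.
interface UxTransparencyChecklist {
  ratingsVisibleToUsers: boolean;       // security ratings visible in the UI
  limitsUnintendedUse: boolean;         // interface limits unintended use
  backgroundActionsExplained: boolean;  // high-level overview of background actions
  contractCallsDocumented: boolean;     // calls to contracts etc. clearly explained
  videoDemosProvided: boolean;          // clear demos of how to use the software
  apiDocumented: boolean;               // API clearly explained and transparent
  specsPresent: boolean;                // system/protocol specs available
  diagramsPresent: boolean;             // diagrams, images, videos etc.
}

// Fraction of checklist items satisfied, as a score in [0, 1].
function uxScore(c: UxTransparencyChecklist): number {
  const values = Object.values(c);
  return values.filter(Boolean).length / values.length;
}
```
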