To create this new build system:

- Go to `Tools > Build System > New Build System...`
- Copy-paste the build definition below
- Either install the ANSIescape package to see ANSI colors, or remove the `target` and `syntax` keys
- Enjoy!
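The build definition referenced above is not included in this extract; as a rough sketch only, a Sublime Text build system wired to the ANSIescape package usually sets the `target` and `syntax` keys like this (the `cmd` here is a hypothetical placeholder, not the original definition):

```json
{
    "target": "ansi_color_build",
    "syntax": "Packages/ANSIescape/ANSI.sublime-syntax",
    "cmd": ["python", "-u", "$file"]
}
```

Removing the `target` and `syntax` keys falls back to Sublime's default build output, without color rendering.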
```javascript
// ==UserScript==
// @name         MKM Trends
// @namespace    https://github.com/bluesheeptoken/mkm-price-insights
// @version      0.1.0
// @description  Injects price histograms and trend insights on MKM
// @match        https://www.cardmarket.com/*
// @grant        none
// @run-at       document-idle
// @updateURL    https://bluesheeptoken.github.io/mkm-price-insights/mkm-trends.user.js
// @downloadURL  https://bluesheeptoken.github.io/mkm-price-insights/mkm-trends.user.js
// ==/UserScript==
```
```python
import random

# Estimate the probability of at least one success in nb_roll trials
p = 1 / 6
nb_roll = 4
nb_simulation = 10_000
nb_success = 0
for _ in range(nb_simulation):
    if any(random.random() < p for _ in range(nb_roll)):
        nb_success += 1
print(nb_success / nb_simulation)
```
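The simulation above estimates the chance of at least one success in `nb_roll` independent trials, which also has a simple closed form worth checking against:

```python
p = 1 / 6
nb_roll = 4
# P(at least one success) = 1 - P(all nb_roll trials fail)
exact = 1 - (1 - p) ** nb_roll
print(round(exact, 4))  # 0.5177
```

With 10,000 simulations the Monte Carlo estimate should land within about a percentage point of this value.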
| """ | |
| Small script to change Apache Spark version in all the modules. | |
| It has been tested against Spark 3.0.1 | |
| We used it to rebuild Spark against different versions of the dependencies | |
| """ | |
| import os | |
| def main(): | |
| old_version = "3.0.1" |
```scala
import org.apache.spark.sql.Encoder
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
import org.apache.spark.sql.expressions.{Aggregator, UserDefinedFunction}
import org.apache.spark.sql.functions._

case class AggregatorState(sum: Long, count: Long)

// Aggregator[IN, BUF, OUT]
val meanAggregator = new Aggregator[Long, AggregatorState, Double]() {
  // Initialize your buffer
  def zero: AggregatorState = AggregatorState(0L, 0L)
  def reduce(b: AggregatorState, a: Long): AggregatorState = AggregatorState(b.sum + a, b.count + 1)
  def merge(b1: AggregatorState, b2: AggregatorState): AggregatorState = AggregatorState(b1.sum + b2.sum, b1.count + b2.count)
  def finish(b: AggregatorState): Double = b.sum.toDouble / b.count
  def bufferEncoder: Encoder[AggregatorState] = ExpressionEncoder[AggregatorState]
  def outputEncoder: Encoder[Double] = ExpressionEncoder[Double]
}
```

With Spark 3.0+, the aggregator can then be registered as a `UserDefinedFunction` with `udaf(meanAggregator)` and used in `select` expressions.