sudo apt install zsh-autosuggestions zsh-syntax-highlighting zsh
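Installing the packages only drops the plugin scripts on disk; they still need to be sourced from `~/.zshrc`. A minimal sketch, assuming the Debian/Ubuntu package paths under `/usr/share` (syntax highlighting is conventionally sourced last):

```zsh
# ~/.zshrc — enable the plugins installed above (paths assume the Debian/Ubuntu package layout)
source /usr/share/zsh-autosuggestions/zsh-autosuggestions.zsh
source /usr/share/zsh-syntax-highlighting/zsh-syntax-highlighting.zsh
```

If zsh isn't already the login shell, `chsh -s "$(which zsh)"` makes it the default.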
| Topic | Link |
| --- | --- |
| Install Linux Mint | |
| Install Docker on Linux Mint | https://linuxiac.com/how-to-install-docker-on-linux-mint-21/ |
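The linked article covers Docker setup on Mint 21. As a hedged shortcut (the article may walk through the apt-repository route instead), Docker's official convenience script plus the usual group change gets a working engine:

```bash
# Sketch only — verify against the linked article, which may prefer the apt repo method.
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# Allow running docker without sudo (takes effect after logging back in)
sudo usermod -aG docker "$USER"
```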
```velocity
## IDE file template (Velocity, e.g. PyCharm's "File and Code Templates"):
## turn the snake_case ${NAME} into a CamelCase Python class name.
#set( $CamelCaseName = "" )
#foreach( $str in $NAME.split("_") )
#set( $str = $str.substring(0,1).toUpperCase() + $str.substring(1) )
#set( $CamelCaseName = $CamelCaseName + $str )
#end
class $CamelCaseName:
    pass
```
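For instance, creating a file with `$NAME` set to `http_client` should expand the template to `class HttpClient:` with a `pass` body, since each underscore-separated part is capitalized and concatenated.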
```bash
#!/bin/bash
# Provision a CentOS/RHEL host as a Bamboo remote agent.
# NOTE: as written, the next line only sets a shell variable; actually disabling
# SELinux means editing /etc/selinux/config (SELINUX=disabled) or running `setenforce 0`.
SELINUX=disabled
yum -y update
yum install -y wget
# The EPEL repository package is named epel-release
yum -y install epel-release
#yum -y install openvpn
#sudo openvpn --config client.ovpn
# Expects the Oracle JDK 8 RPM to be present in the working directory
yum -y localinstall jdk-8u111-linux-x64.rpm
#wget http://bamboo.gdshive.com/agentServer/agentInstaller/atlassian-bamboo-agent-installer-6.4.0.jar
#java -jar atlassian-bamboo-agent-installer-6.4.0.jar http://bamboo.gdshive.com/agentServer/ install
```
```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import cats.data.WriterT   // assuming cats; each step pairs a Future result with a List[String] log
import cats.implicits._

def getCurrentTemperatureW(): WriterT[Future, List[String], Double] = {
  WriterT.putT(Future.successful(10.0))(List("Thermometer isn't broken yet"))
}
def getTomorrowsTempFromPredictionAPIW(curr: Double): WriterT[Future, List[String], Double] = {
  WriterT.putT(Future.successful(20.0))(List("Yay, the Prediction API works too"))
}
def publishItInOurWebsiteW(pred: Double): WriterT[Future, List[String], Double] = {
  WriterT.putT(Future.successful(20.0))(List("Published to our website"))
}
```
Upon completion you will have a sane, productive Haskell environment adhering to best practices.
sudo apt-get install libtinfo-dev libghc-zlib-dev libghc-zlib-bindings-dev
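These packages cover common native dependencies for Haskell builds on Debian-based systems (terminfo headers and zlib bindings). A quick sanity check of the toolchain, assuming GHC and cabal (or stack) were installed separately by whatever method you prefer (e.g. ghcup or distro packages):

```bash
# Assumes GHC plus cabal or stack are already on the PATH
ghc --version
cabal --version || stack --version
# Compile-and-run smoke test
echo 'main = putStrLn "Haskell environment OK"' > /tmp/Hello.hs
runghc /tmp/Hello.hs
```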
```bash
#!/bin/bash
# Minimum TODOs on a per-job basis:
#   1. define name, application jar path, main class, queue and log4j-yarn.properties path
#   2. remove properties not applicable to your Spark version (Spark 1.x vs. Spark 2.x)
#   3. tweak num_executors, executor_memory (+ overhead), and backpressure settings

# The two most important settings:
num_executors=6
executor_memory=3g
```
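The rest of such a script would feed these variables into `spark-submit`. A hedged sketch of that invocation — the application name, main class, queue and jar path below are placeholders, not values from the original job:

```bash
# Illustrative only: name, class, queue and jar path are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name "my-streaming-job" \
  --class com.example.MainClass \
  --queue default \
  --num-executors "$num_executors" \
  --executor-memory "$executor_memory" \
  --files log4j-yarn.properties \
  --conf spark.streaming.backpressure.enabled=true \
  /path/to/app.jar
```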