Inspired by dannyfritz/commit-message-emoji
See also gitmoji.
Commit type | Emoji |
---|---|
Initial commit | 🎉 :tada: |
Version tag | 🔖 :bookmark: |
New feature | ✨ :sparkles: |
Bugfix | 🐛 :bug: |
```
SPARK_WORKER_INSTANCES=3 SPARK_WORKER_CORES=2 ./sbin/start-slaves.sh
```
This will launch three worker instances on each node. Each worker instance will use two cores.
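The same settings can also be made persistent by putting them in `conf/spark-env.sh`, which the standalone start scripts source on startup (a sketch; the values mirror the command above):

```
# conf/spark-env.sh -- sourced by the standalone start scripts
SPARK_WORKER_INSTANCES=3   # worker instances per node
SPARK_WORKER_CORES=2       # cores allotted to each worker
```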
It is also possible to start a worker manually and connect it to Spark's master node with:
```
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
```
Recall that cluster write throughput is directly proportional to the number of nodes N
and inversely proportional to the replication factor RF. If a single node writes 15,000 rows per second,
then a 5-node cluster writing 3 replicas should sustain roughly 15,000 × N / RF = 25,000 rows/s.
[Source](https://www.datastax.com/blog/2011/05/understanding-hinted-handoff-cassandra-08)
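As a quick sanity check on the arithmetic (a sketch; the 15,000 rows/s single-node figure is the one quoted above):

```
package main

import "fmt"

func main() {
	singleNodeRate := 15000.0 // rows/s written by one node (figure from the text)
	n := 5.0                  // nodes in the cluster
	rf := 3.0                 // replication factor
	fmt.Printf("%.0f rows/s\n", singleNodeRate*n/rf) // prints "25000 rows/s"
}
```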
```
java -jar {JAR} --spring.config.location={CONFIG_FILE_LOCATION}
```
```
package com.aegis.controller;

import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
```
### CURL and binary data
1. Fill the HTTP body with the content of a file:
```
curl --request POST --data-binary "@/tmp/your_file_path/" {URL}
```
2. Send a file as multipart/form-data:
```
curl -F 'img_avatar=@/tmp/your_file_path/' {URL}
```
In one of my projects, I wanted to create a fat jar. I used `maven-assembly-plugin`, but when I ran the jar I got a
runtime error. That was because Akka needs `reference.conf`, and in the process of building the jar this file was not added to the
output assembly. The Shade plugin solved my problem:
```
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- append every reference.conf on the classpath into a single file -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```
```
import slick.jdbc.H2Profile.api._

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Await
import scala.concurrent.duration.Duration

object Main {
  case class User(username: String, password: String, id: Long = 0)

  class Users(tag: Tag) extends Table[User](tag, "users") {
    // column mapping inferred from the User fields
    def username = column[String]("username")
    def password = column[String]("password")
    def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
    def * = (username, password, id) <> ((User.apply _).tupled, User.unapply)
  }
}
```
```
class Dfa {
  var states = Seq.empty[State]
  var finalStates = Seq.empty[State]
  var currentState: State = null
  var input: String = ""
  val transition = new Transition
  val transitionMap = transition.transitionMap

  def states(block: => Seq[State]): Dfa = {
```
```
package main

type FuncIntInt func(int) int

// memorized wraps fn with a cache of previously computed results
// (this technique is usually called memoization).
func memorized(fn FuncIntInt) FuncIntInt {
	cache := make(map[int]int)
	return func(input int) int {
		if val, found := cache[input]; found {
			println("Read from cache")
			return val
		}
		cache[input] = fn(input)
		return cache[input]
	}
}
```
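A self-contained usage sketch of the wrapper (it repeats the definition so it runs on its own; `square` is an example function, not from the original):

```
package main

import "fmt"

type FuncIntInt func(int) int

// memorized caches results so each input is computed only once.
func memorized(fn FuncIntInt) FuncIntInt {
	cache := make(map[int]int)
	return func(input int) int {
		if val, found := cache[input]; found {
			return val
		}
		cache[input] = fn(input)
		return cache[input]
	}
}

func main() {
	square := memorized(func(n int) int { return n * n })
	fmt.Println(square(4)) // computed: prints 16
	fmt.Println(square(4)) // served from cache: prints 16
}
```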