(by @andrestaltz)
If you prefer to watch video tutorials with live-coding, then check out this series I recorded with the same contents as in this article: Egghead.io - Introduction to Reactive Programming.
Start your RethinkDB instance with this flag:

--bind all

(or set bind=all in the configuration file for your instance)
Block external access to the web UI with these two commands:
sudo iptables -A INPUT -i eth0 -p tcp --dport 8080 -j DROP
sudo iptables -I INPUT -i eth0 -s 127.0.0.1 -p tcp --dport 8080 -j ACCEPT
Install nginx:
sudo apt-get install nginx
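With the web UI blocked from the outside, nginx can act as the public entry point and proxy traffic to the UI over localhost. Here is a minimal sketch of a reverse-proxy server block (the server_name is a placeholder; TLS and authentication are left to your setup):

server {
    listen 80;
    server_name rethinkdb.example.com;

    location / {
        # forward requests to the RethinkDB web UI,
        # which is reachable only from localhost
        proxy_pass http://127.0.0.1:8080;
    }
}

Typically this goes in /etc/nginx/sites-available/ with a symlink from sites-enabled/, followed by a reload of nginx.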
#include"SingleCoreSolution.h" | |
namespace SingleCoreSolution { | |
std::string Solution::TrickOrTreat(const std::vector<int>& pieces, | |
const int homes, const int max) { | |
//calculate prefix sum; | |
std::vector<int> prefixSum(homes + 1, 0); | |
for (int i = 0; i < homes; ++i) { | |
prefixSum[i + 1] = prefixSum[i] + pieces[i]; | |
} |
As all objects must be Serializable to be used as part of RDD operations in Spark, it can be difficult to work with libraries that do not implement this feature. For simple classes, it is easiest to make a wrapper interface that extends Serializable. This means that even though UnserializableObject cannot be serialized, we can pass in the following object without any issue:

public interface UnserializableWrapper extends Serializable {
    public UnserializableObject create(String parm1, String parm2);
}
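As a sketch of how such a wrapper might be used (UnserializableObject, its constructor arguments, and the process method below are hypothetical placeholders): the anonymous wrapper instance is Serializable, so Spark can ship it to the executors, and the unserializable object is only ever constructed there, inside the closure.

val wrapper: UnserializableWrapper = new UnserializableWrapper {
  override def create(parm1: String, parm2: String): UnserializableObject =
    new UnserializableObject(parm1, parm2) // hypothetical constructor
}

rdd.mapPartitions { records =>
  // created on the executor, once per partition, so it is never serialized
  val obj = wrapper.create("host", "port")
  records.map(record => obj.process(record)) // hypothetical method
}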
import java.time.{LocalDate, LocalDateTime, LocalTime}

/*case */class FullName(val first: String, val last: String)

object FullName {
  def apply(first: String, last: String): FullName =
    new FullName(first, last)

  def unapply(full: FullName): Some[(String, String)] =
    Some((full.first, full.last))
}
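With apply and unapply written out by hand, FullName can be constructed and pattern-matched exactly as if it were a case class:

val name = FullName("Grace", "Hopper") // sugar for FullName.apply("Grace", "Hopper")
val FullName(first, last) = name       // pattern match via FullName.unapply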
Recursion is beautiful. As an example, let's consider this perfectly acceptable definition of the functions even and odd in Scala, whose semantics you can guess:

def even(i: Int): Boolean = i match {
  case 0 => true
  case _ => odd(i - 1)
}

def odd(i: Int): Boolean = i match {
  case 0 => false
  case _ => even(i - 1)
}
rednaxelafx@fx-laptop:~$ jps
7013 Jps
6946 GroovyStarter
rednaxelafx@fx-laptop:~$ cd /proc/6946
rednaxelafx@fx-laptop:/proc/6946$ ls
attr        cpuset   latency   mountstats  root       statm
auxv        cwd      limits    net         sched      status
cgroup      environ  loginuid  numa_maps   schedstat  syscall
clear_refs  exe      maps      oom_adj     sessionid  task
cmdline     fd       mem       oom_score   smaps      wchan
package bbejeck.grouping

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

import scala.collection.mutable

/**
 * Created by bbejeck on 7/31/15.
 */
sealed trait Option[+A]
case object None extends Option[Nothing]
case class Some[+A](a: A) extends Option[A]

object ProductType {
  type TwoBooleansOrNone = Option[(Boolean, Boolean)]

  def mustBeBothTrue(in: TwoBooleansOrNone): Boolean = in match {
    case None => false
    case Some((true, true)) => true
    case Some(_) => false
  }
}
class UserManagerLifecycleTest
    extends ScalaTestWithActorTestKit
    with AnyWordSpecLike {

  // stub client: the AWS SDK v2 async interfaces provide default
  // implementations for every operation, so only these two members
  // need to be overridden
  implicit val dynamoClient: DynamoDbAsyncClient =
    new DynamoDbAsyncClient {
      override def serviceName(): String = "test"
      override def close(): Unit = ()
    }

  implicit val ec: ExecutionContext =
    system.executionContext // the testkit's actor system dispatcher