!SLIDE
Test
(require '[test])
(defn f [a]
a)
The Play (2.3) JSON combinator library is arguably the best in the Scala world. However, it doesn't work with case classes that have more than 22 fields.
The following gist leverages the shapeless 'Automatic Typeclass Derivation' facility to work around this limitation. Simply put it in a common location in your code base and use it like so:
Note: ** Requires Play 2.3 and shapeless 2.0.0
<RaceCondition> can I use Scalaz to get exhaustion checks when matching on numeric values? Scala obviously doesn't do that
<RaceCondition> ! 1.1 match { case x if 0.0 <= x && x < 0.5 => "bad"; case x if 0.5 <= x && x <= 1.0 => "good" }
<dibblego> doubt it
<multibot_> scala.MatchError: 1.1 (of class java.lang.Double)
<multibot_> ... 38 elided
<dibblego> use types though?
<RaceCondition> wdym?
<dibblego> use a type to note each range
<dibblego> you want a floating-point between 0.0 and 1.0?
<RaceCondition> wouldn't that just move the problem to a different stage?
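One way to read dibblego's suggestion (a sketch of the idea, not code from the log — the `Quality` ADT and names below are hypothetical): represent each range as its own case of a sealed type behind a smart constructor. Out-of-range values are rejected once at the boundary, and any later match on the ADT is exhaustiveness-checked by the compiler.

```scala
// Hypothetical sketch: encode the two ranges as a sealed ADT so that
// pattern matches on it are exhaustiveness-checked by the compiler.
object Quality {
  sealed trait Quality
  case object Bad  extends Quality // 0.0 <= x < 0.5
  case object Good extends Quality // 0.5 <= x <= 1.0

  // The only way to obtain a Quality is through this smart constructor,
  // which rejects values outside [0.0, 1.0] up front.
  def fromDouble(x: Double): Option[Quality] =
    if (x >= 0.0 && x < 0.5) Some(Bad)
    else if (x >= 0.5 && x <= 1.0) Some(Good)
    else None
}

object RangeDemo {
  def main(args: Array[String]): Unit = {
    // 1.1 is rejected at construction time instead of blowing up
    // with a MatchError inside a pattern match.
    println(Quality.fromDouble(0.3)) // Some(Bad)
    println(Quality.fromDouble(1.1)) // None
  }
}
```

This does "move the problem to a different stage", but deliberately: the partiality is handled once at the boundary, and everything downstream matches on `Quality` totally.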
Every application ever written can be viewed as some sort of transformation on data. Data can come from different sources, such as a network or a file or user input or the Large Hadron Collider. It can come from many sources all at once to be merged and aggregated in interesting ways, and it can be produced into many different output sinks, such as a network or files or graphical user interfaces. You might produce your output all at once, as a big data dump at the end of the world (right before your program shuts down), or you might produce it more incrementally. Every application fits into this model.
The scalaz-stream project is an attempt to make it easy to construct, test and scale programs that fit within this model (which is to say, everything). It does this by providing an abstraction around a "stream" of data, which is really just the notion of some number of data elements being sequentially pulled out of some unspecified data source. On top of this abstraction, scalaz-stream provides combinators for transforming, merging and running these streams.
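The core idea — elements pulled one at a time from an unspecified source and transformed along the way — can be sketched in a few lines of plain Scala. This is an illustrative toy, not scalaz-stream's actual `Process` type:

```scala
// Toy pull-based stream: a Pull[A] has either halted, or can emit one
// element plus a thunk producing the rest. Not scalaz-stream's API.
sealed trait Pull[+A] {
  def map[B](f: A => B): Pull[B] = this match {
    case Halt          => Halt
    case Emit(a, tail) => Emit(f(a), () => tail().map(f))
  }

  // Run the stream to completion, collecting every emitted element.
  def toList: List[A] = this match {
    case Halt          => Nil
    case Emit(a, tail) => a :: tail().toList
  }
}
case object Halt extends Pull[Nothing]
final case class Emit[A](head: A, tail: () => Pull[A]) extends Pull[A]

object Pull {
  // Build a stream from any source we can read sequentially;
  // a List stands in for the network/file/LHC here.
  def fromList[A](as: List[A]): Pull[A] = as match {
    case Nil    => Halt
    case h :: t => Emit(h, () => fromList(t))
  }
}

object PullDemo {
  def main(args: Array[String]): Unit = {
    val doubled = Pull.fromList(List(1, 2, 3)).map(_ * 2)
    println(doubled.toList) // List(2, 4, 6)
  }
}
```

The real library layers effects (`Task`), resource safety and concurrency on top of this pull structure, which is what the specs below exercise.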
object TransducerSpecs extends Specification with ScalaCheck {
  import Process._
  import StreamUtils._

  "effectful stream transducers" should {
    def id[I]: Transducer[Nothing, I, I] =
      transducer.receive1[Nothing, I, I](emit).repeat

    "perform a simple identity transformation" in prop { xs: List[List[Int]] =>
      val p = emitAll(xs map emitAll).toSource.join
trait Applicative[F[_]] {
  def pure[A](a: A): F[A]
  def apply[A, B](f: F[A => B], a: F[A]): F[B]
}

trait Traversable[T[_], F[_]] {
  def traverse[A, B](f: A => F[B], a: T[A]): F[T[B]]
}

trait Monoid[A] {
  def zero: A
  def append(a1: A, a2: A): A
}
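To make the signatures above concrete, here is a self-contained sketch (the instance names are mine, not from the original) that gives `Option` an `Applicative` instance, gives `List` a `Traversable` instance for any `Applicative`, and then traverses a list with a validating function:

```scala
object TraverseDemo {
  trait Applicative[F[_]] {
    def pure[A](a: A): F[A]
    def apply[A, B](f: F[A => B], a: F[A]): F[B]
  }

  trait Traversable[T[_], F[_]] {
    def traverse[A, B](f: A => F[B], a: T[A]): F[T[B]]
  }

  // Applicative for Option: apply succeeds only if both sides do.
  val optionApplicative: Applicative[Option] = new Applicative[Option] {
    def pure[A](a: A): Option[A] = Some(a)
    def apply[A, B](f: Option[A => B], a: Option[A]): Option[B] =
      for (g <- f; x <- a) yield g(x)
  }

  // Traverse a List with any Applicative F, combining effects left to right
  // by lifting cons (::) into F.
  def listTraversable[F[_]](F: Applicative[F]): Traversable[List, F] =
    new Traversable[List, F] {
      def traverse[A, B](f: A => F[B], a: List[A]): F[List[B]] =
        a.foldRight(F.pure(List.empty[B])) { (x, acc) =>
          F.apply(F.apply(F.pure((b: B) => (bs: List[B]) => b :: bs), f(x)), acc)
        }
    }

  def positive(n: Int): Option[Int] = if (n > 0) Some(n) else None

  def main(args: Array[String]): Unit = {
    val T = listTraversable(optionApplicative)
    println(T.traverse(positive, List(1, 2, 3)))  // Some(List(1, 2, 3))
    println(T.traverse(positive, List(1, -2, 3))) // None
  }
}
```

A single `None` anywhere in the list collapses the whole result to `None`, which is exactly the "effects combined by the Applicative" behaviour that `traverse` promises.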
package OptionValueAccessor

import scala.language.experimental.macros
import scala.reflect.macros.whitebox

object OptionValueAccessor {
  implicit def unwrapOptValue[A](a: OptValue[A]): Option[A] = a.unwrap

  implicit class OptValue[A](val unwrap: Option[A]) extends Dynamic {
import java.util.concurrent.atomic.AtomicInteger
import java.util.concurrent.{ ThreadFactory, Executors, ExecutorService }

import scalaz.concurrent.Task

object Test {
  class MyThreadFactory(prefix: String) extends ThreadFactory {
    val n = new AtomicInteger(0)
    override def newThread(r: Runnable) = {
      val t = Executors.defaultThreadFactory().newThread(r)
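The fragment cuts off here, but the `prefix` and `AtomicInteger` strongly suggest the factory names its threads `prefix-1`, `prefix-2`, … so they are identifiable in thread dumps, before being handed to an executor (and, given the `Task` import, presumably a scalaz `Task` pool). A self-contained usage sketch along those lines, using only `java.util.concurrent` and without the scalaz wiring:

```scala
import java.util.concurrent.atomic.AtomicInteger
import java.util.concurrent.{ Executors, ThreadFactory, TimeUnit }

object FactoryDemo {
  class NamedThreadFactory(prefix: String) extends ThreadFactory {
    val n = new AtomicInteger(0)
    override def newThread(r: Runnable) = {
      val t = Executors.defaultThreadFactory().newThread(r)
      // Name threads "<prefix>-1", "<prefix>-2", ... for readable dumps.
      t.setName(prefix + "-" + n.incrementAndGet())
      t
    }
  }

  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(2, new NamedThreadFactory("worker"))
    pool.submit(new Runnable {
      def run(): Unit = println(Thread.currentThread().getName) // e.g. worker-1
    })
    pool.shutdown()
    pool.awaitTermination(5, TimeUnit.SECONDS)
  }
}
```

In the scalaz context, such a pool is typically passed to the `Task` combinators that accept an implicit or explicit `ExecutorService`, so that `Task`s run on recognisably named threads.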