
YoEight / Alacarte.scala (created June 28, 2012)

Scala data types à la carte
object Alacarte {

  // Minimal functor typeclass: map a function over the structure F.
  trait Functor[F[_]] {
    def map[A, B](fa: F[A])(f: A => B): F[B]
  }

  // An evaluation algebra over F, packaged with its Functor instance.
  trait Eval[F[_]] {
    def F: Functor[F]
    def evalAlgebra(fa: F[Int]): Int
  }
}
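A usage sketch, not part of the original gist: one concrete functor with its evaluation algebra, folded over a fixed point, in the style of Data types à la carte. The names Fix, Expr, Val, Add, lit, add and eval are introduced here purely for illustration.

```scala
object AlacarteExample {
  import Alacarte._

  // Fixed point of a functor, so expressions can nest recursively.
  final case class Fix[F[_]](unfix: F[Fix[F]])

  // A tiny expression language: integer literals and addition.
  sealed trait Expr[A]
  final case class Val[A](i: Int)     extends Expr[A]
  final case class Add[A](l: A, r: A) extends Expr[A]

  implicit val exprEval: Eval[Expr] = new Eval[Expr] {
    val F: Functor[Expr] = new Functor[Expr] {
      def map[A, B](fa: Expr[A])(f: A => B): Expr[B] = fa match {
        case Val(i)    => Val(i)
        case Add(l, r) => Add(f(l), f(r))
      }
    }
    def evalAlgebra(fa: Expr[Int]): Int = fa match {
      case Val(i)    => i
      case Add(l, r) => l + r
    }
  }

  // Fold the fixed point with the algebra.
  def eval[F[_]](e: Fix[F])(implicit E: Eval[F]): Int =
    E.evalAlgebra(E.F.map(e.unfix)(fx => eval(fx)))

  def lit(i: Int): Fix[Expr]                     = Fix[Expr](Val(i))
  def add(l: Fix[Expr], r: Fix[Expr]): Fix[Expr] = Fix[Expr](Add(l, r))

  val result: Int = eval(add(lit(1), lit(2))) // 3
}
```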

doobie experiment - postgres NOTIFY as scalaz-stream

One of the cool things PostgreSQL gives you is a simple notification system that you can use to let clients know that something interesting has happened. For instance you can set up rules that broadcast notifications when a table is updated, and client applications can update displays in response. The JDBC driver provides access to this API so I thought I would see what it would look like in doobie.

The program below constructs a Process[ConnectionIO, PGNotification] that registers for events, polls for them periodically (this is the best we can do with current driver support), and unregisters when the stream terminates for any reason. We use a Transactor to replace ConnectionIO with Task, which gives us something we can actually run.

package doobie.example

import doobie.imports._
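The listing is truncated here, so what follows is a hedged sketch, not the original program, of how such a stream might be assembled. It assumes doobie's PostgreSQL contrib module for pgListen, pgUnlisten and pgGetNotifications (imported below under the alias PHC; the exact module path varies by doobie version) plus HC.commit from doobie.imports; the name notificationStream is illustrative, and the pause between polls is omitted.

```scala
import doobie.imports._
import doobie.contrib.postgresql.hi.{ connection => PHC } // module path is an assumption
import org.postgresql.PGNotification
import scalaz.stream.Process

object PostgresNotifySketch {

  def notificationStream(channel: String): Process[ConnectionIO, PGNotification] = {

    // Register interest in the channel and commit so the LISTEN takes effect.
    val register: ConnectionIO[Unit] =
      PHC.pgListen(channel).flatMap(_ => HC.commit)

    // Unregister when the stream terminates for any reason.
    val unregister: ConnectionIO[Unit] =
      PHC.pgUnlisten(channel).flatMap(_ => HC.commit)

    // Poll the driver for any notifications that have arrived.
    val poll: ConnectionIO[List[PGNotification]] =
      PHC.pgGetNotifications.flatMap(ns => HC.commit.map(_ => ns))

    // Emit batches of notifications forever; a real program would pause
    // between polls rather than spinning. The onComplete step guarantees
    // we UNLISTEN however the stream ends.
    (Process.eval_(register) ++
      Process.repeatEval(poll).flatMap(ns => Process.emitAll(ns)))
      .onComplete(Process.eval_(unregister))
  }

  // A Transactor can then interpret this into a Process[Task, PGNotification]
  // that we can actually run (the exact method name varies by doobie version).
}
```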
djspiewak / streams-tutorial.md (created March 22, 2015)

Introduction to scalaz-stream

Every application ever written can be viewed as some sort of transformation on data. Data can come from different sources, such as a network or a file or user input or the Large Hadron Collider. It can come from many sources all at once to be merged and aggregated in interesting ways, and it can be produced into many different output sinks, such as a network or files or graphical user interfaces. You might produce your output all at once, as a big data dump at the end of the world (right before your program shuts down), or you might produce it more incrementally. Every application fits into this model.

The scalaz-stream project is an attempt to make it easy to construct, test and scale programs that fit within this model (which is to say, everything). It does this by providing an abstraction around a "stream" of data, which is really just this notion of some number of data being sequentially pulled out of some unspecified data source. On top of this abstraction, scalaz-stream provides a set of combinators for transforming, combining and running streams.
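To make that concrete, here is a small sketch (not part of the tutorial text) of a pure stream and an effectful one. It assumes scalaz-stream's Process, io.linesR and runLog; the file name build.sbt is just a placeholder.

```scala
import scalaz.concurrent.Task
import scalaz.stream._

object StreamBasics {
  // A pure stream: a fixed sequence of values, transformed and collected.
  val doubled = Process(1, 2, 3, 4).map(_ * 2)
  // doubled.toList == List(2, 4, 6, 8)

  // An effectful stream: lines pulled lazily from a file. Nothing is read
  // until the stream is actually run.
  val firstLines: Process[Task, String] = io.linesR("build.sbt").take(3)
  // firstLines.runLog.run returns a Vector of up to three lines
}
```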