
Finch: A Life Without Exceptions

Historically, Finch's error-handling machinery was built on a simple yet silly idea: an Endpoint may return a failed Future (i.e., Future.exception). While that doesn't really promote a purely functional approach to error handling (i.e., treating errors as values), it enables quite a few handy setups, including:

  • embedding 3rd-party Finagle clients (that may fail) within endpoints w/o extra boilerplate
  • simple/minimal required endpoints (e.g., body, param, etc.) that return A, not Try[A] or Either[Error, A]
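To make the distinction concrete, here is a stdlib-only sketch of the two styles (no Finch involved; parseImplicit and parseExplicit are hypothetical names):

```scala
import scala.util.Try

// Implicit errors: the failure travels out-of-band (captured by Try, much
// like a failed Future); the happy-path type only mentions Int.
def parseImplicit(s: String): Try[Int] = Try(s.toInt)

// Explicit errors: the failure is an ordinary value in the return type,
// so callers are forced to acknowledge it.
def parseExplicit(s: String): Either[String, Int] =
  try Right(s.toInt)
  catch { case _: NumberFormatException => Left(s"not a number: '$s'") }
```

The second version is what "treat errors as values" means in practice: the error channel is visible in the signature rather than hidden inside the effect.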

However, given that this part of Finch's design was heavily influenced by Finagle itself (w.r.t. embedding all of its failures in Future.exception), nothing stops us from revisiting this trade-off and discussing paths toward more idiomatic error handling.

Implicit Errors vs. Explicit Errors

AccessLog.scala:

import java.io.{File, FileOutputStream, PrintWriter}
import java.util.concurrent.Executors
import com.twitter.finagle.{Service, SimpleFilter}
import com.twitter.finagle.http.{Request, Response}
import com.twitter.util.Future

final class AccessLog(file: File) extends SimpleFilter[Request, Response] {
  private[this] final val scheduler = Executors.newSingleThreadExecutor()
  private[this] final val writer = new PrintWriter(new FileOutputStream(file, true))

  // The gist is truncated here; a plausible completion: log off the serving thread.
  def apply(req: Request, service: Service[Request, Response]): Future[Response] =
    service(req).onSuccess { rep =>
      scheduler.execute(() => { writer.println(s"${req.method} ${req.uri} ${rep.statusCode}"); writer.flush() })
    }
}
adts.scala:

// In Scala
sealed abstract class Toggle
object Toggle {
  private case object ON extends Toggle
  private case object OFF extends Toggle

  def On: Toggle = ON
  def Off: Toggle = OFF
}
currentTime.scala:

import java.time.format.DateTimeFormatter
import java.time.{Instant, ZoneOffset}

object currentTime {
  // The gist truncates the formatter; RFC 1123 (the HTTP Date format) is an assumption.
  private[this] val formatter: DateTimeFormatter =
    DateTimeFormatter.RFC_1123_DATE_TIME.withZone(ZoneOffset.UTC)

  // (epoch second, rendered string): re-format at most once per second.
  @volatile private[this] var last: (Long, String) = (0, "")

  def apply(): String = {
    val second = System.currentTimeMillis() / 1000
    if (last._1 != second) last = (second, formatter.format(Instant.ofEpochSecond(second)))
    last._2
  }
}

[info] Benchmark                                                       Mode  Cnt     Score      Error   Units
[info] BodyBenchmark.byteArray                                         avgt    6   876.637 ±   84.132   ns/op
[info] BodyBenchmark.byteArray:·gc.alloc.rate.norm                     avgt    6  1968.001 ±   49.149    B/op
[info] BodyBenchmark.byteArray                                         avgt    6   788.304 ±   60.485   ns/op
[info] BodyBenchmark.byteArray:·gc.alloc.rate.norm                     avgt    6  1912.001 ±    0.001    B/op
finch-dispatch.scala:

import io.finch._

sealed trait Req
object Req {
  case class Foo(i: Int) extends Req
  case class Bar(s: String) extends Req
}

def dispatch[A](pf: PartialFunction[Req, Output[A]]): Endpoint[A] = new Endpoint[A] {
  // The decoding endpoint is garbled in the gist; body.as[Req] is a guess.
  override def apply(i: Input): Endpoint.Result[A] =
    body.as[Req].apply(i).flatMap {
      ??? // truncated in the original gist
    }
}
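The finch-dispatch gist is truncated, but the core idea — route a decoded Req through a PartialFunction, falling back to a "not found" result where it's undefined — can be sketched with the stdlib alone (the Either fallback here is a stand-in for Finch's 404 Output):

```scala
sealed trait Req
object Req {
  case class Foo(i: Int) extends Req
  case class Bar(s: String) extends Req
}

// Route a request through a PartialFunction; when the function is undefined
// for the input, fall back to an error value (a 404 Output in Finch terms).
def dispatch[A](pf: PartialFunction[Req, A]): Req => Either[String, A] =
  req => pf.lift(req).toRight(s"no handler for $req")

val handle = dispatch[String] { case Req.Foo(i) => s"foo($i)" }
```

PartialFunction.lift does the heavy lifting: it turns the partial routing table into a total function returning Option, which maps cleanly onto "matched" vs. "404".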

Pairs: Before and After

[info] Benchmark                                                    Mode  Cnt     Score     Error   Units
[info] BodyBenchmark.byteArray                                      avgt   12   622.368 ±  48.389   ns/op
[info] BodyBenchmark.byteArray:·gc.alloc.rate.norm                  avgt   12  1856.000 ±  42.810    B/op

[info] BodyBenchmark.byteArray                                      avgt   12   746.891 ±  67.571   ns/op
[info] BodyBenchmark.byteArray:·gc.alloc.rate.norm                  avgt   12  1916.000 ±   5.351    B/op
results.txt:

[info] Benchmark                                            Mode  Cnt         Score        Error  Units
[info] RerunnableBenchmark.sumIntsEF                       thrpt   40        45.180 ±      2.361  ops/s
[info] RerunnableBenchmark.sumIntsEF:·gc.alloc.rate.norm   thrpt   40  45596119.411 ±     32.597   B/op
[info] RerunnableBenchmark.sumIntsF                        thrpt   40        63.038 ±      3.683  ops/s
[info] RerunnableBenchmark.sumIntsF:·gc.alloc.rate.norm    thrpt   40  28796020.355 ±     23.283   B/op
[info] RerunnableBenchmark.sumIntsPF                       thrpt   40        22.371 ±      2.129  ops/s
[info] RerunnableBenchmark.sumIntsPF:·gc.alloc.rate.norm   thrpt   40  52033222.296 ± 2995947.605   B/op
[info] RerunnableBenchmark.sumIntsPR                       thrpt   40        16.022 ±      1.675  ops/s
[info] RerunnableBenchmark.sumIntsPR:·gc.alloc.rate.norm   t
te.scala:

import io.finch._
import com.twitter.finagle.stats._

// See
def time[A](stat: Stat, e: Endpoint[A]): Endpoint[A] = new Endpoint[A] {
  def apply(i: Input): Endpoint.Result[A] = {
    e(i).map {
      case (remainder, output) => remainder -> { f =>
        ??? // the gist is truncated here; presumably f is run under Stat.time
      }
    }
  }
}
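The te.scala gist cuts off mid-body, but its shape is clear: wrap an endpoint so each invocation reports its latency to a Stat. A stdlib-only sketch of the same wrap-and-measure pattern (the record callback stands in for finagle's Stat):

```scala
// Wrap a function so every call reports its elapsed nanoseconds to a
// callback before returning the result (roughly what Stat.time does).
def timed[A, B](record: Long => Unit)(f: A => B): A => B = { a =>
  val start = System.nanoTime()
  try f(a)
  finally record(System.nanoTime() - start)
}
```

In the Finch version the same wrapping happens inside apply, so the measured span covers the endpoint's own evaluation rather than the whole request.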