(require '[taoensso.timbre :as timbre]) ; needed for timbre/info below

(defn logging-seq
  "Seq that logs its counter every N items."
  ([title n s]
   (logging-seq title n s 0))
  ([title n s ix]
   (when-let [head (first s)]
     (when (zero? (mod ix n))
       (timbre/info title ix))
     (cons head (lazy-seq (logging-seq title n (rest s) (inc ix)))))))
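A usage sketch (the label and interval are arbitrary): realizing the seq logs at items 0, 1000, 2000, and so on.

(doall (logging-seq "rows processed" 1000 (range 5000)))
;; => (0 1 2 ... 4999), logging "rows processed 0", "rows processed 1000", ...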
#datomic on freenode
11:45 < splunk> can a datomic conn's state go bad/stale? how does one check that state to know whether to reconnect?
11:46 < stuartsierra> I think you never need to reconnect.
11:47 < splunk> stuartsierra: thanks. (worth documenting?)
11:47 < stuartsierra> Connections are long-lived and cached. I think that's in the docs somewhere.
11:49 < stuartsierra> You don't even need to keep a connection around. `connect` will return a cached connection, at the cost of one map lookup.
11:49 < splunk> That is awesome
11:50 < splunk> That cleans up a bunch of my "remember to pass the conn all the way down!" code
11:50 < splunk> Although I guess it's just, giving you global access to some state somewhere
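A sketch of the pattern described above (the URI and function names are made up, not from the transcript): instead of threading conn down through every function, call d/connect at the point of use and let Datomic hand back the cached connection.

(require '[datomic.api :as d])

(def db-uri "datomic:dev://localhost:4334/my-db") ; hypothetical URI

(defn current-db []
  ;; d/connect is cheap after the first call for a given URI --
  ;; it returns the cached, long-lived connection.
  (d/db (d/connect db-uri)))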
Thank you! These are great.
Here's my gut investigation so far:
The multi-item vs. single-item distinction is a really good point. In
things like Gmail, it's a multi-item system; there's a breakdown between:
- the items you can select
- a small set of operations you can apply to the items in bulk
The core of what the system is about is:
- Having rough/flexible data structures
- Performing transformations on those data structures
- Finding the core "must be correct" structures we want to reason about
- Defining and naming those structures with relative precision,
hardening them into definite declarations
- Validating and verifying that input and output conform to those
now-hardened structures (see the sketch just after this list).
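One way to make that last step concrete is clojure.spec; the map shape and keys below are hypothetical, purely to illustrate "define the structure precisely, then check that data conforms to it":

(require '[clojure.spec.alpha :as s])

;; Name the "must be correct" structure with relative precision...
(s/def ::id uuid?)
(s/def ::amount number?)
(s/def ::line-item (s/keys :req-un [::id ::amount]))

;; ...then validate/verify that data conforms to it.
(s/valid? ::line-item {:id (java.util.UUID/randomUUID) :amount 9.95})
;; => true
(s/explain-data ::line-item {:id "not-a-uuid"})
;; => problem data: :id fails uuid?, :amount is missing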
sudo mount -t vfat /dev/YOUR_THING_HERE /media/external -o uid=1000,gid=1000,utf8,dmask=027,fmask=177
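(uid=1000,gid=1000 assigns ownership of the mounted files to what is typically the first regular user; dmask=027 and fmask=177 yield directory mode 750 and file mode 600 respectively; utf8 enables UTF-8 filenames.)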

From Scott Berkun, ["Lessons from working on the 3rd draft (The Ghost of My Father)"] [source]

The funny thing is every book I’ve ever written has a moment where I realize how wrong I was. That’s part of the journey of writing. You have to possess a certain madness to believe you can take on something as big and unknown as 300 blank pages and shape it into something other people will want to read. For all of my books there has been a point like this, somewhere late in the middle of the work, where it hasn’t all fallen together yet in the way I want and I naturally wonder if it ever will.

The thing I’ve learned is when any creative work isn’t falling together yet it means something bold has to be done. Usually it’s concision: removing something big to give everything that remains the space it needs to blossom. Sometimes it’s shuffling: changing the order in which things are told. Other times it’s far more subtle, and I need to write a new beginning for the book that has better aim for carving through the

coffee ratio (coffee/water):
  24g/300g = .08  - stumptown
  30g/450g = .067 - handsome
coldbrew:
  50g/300g = .167
### Keybase proof
I hereby claim:
* I am elliot42 on github.
* I am elliot42 (https://keybase.io/elliot42) on keybase.
* I have a public key whose fingerprint is 4DB5 F897 2CFB 0A15 E2FF B6C7 7CC4 E3F4 D52C C954
To claim this, I am signing this object:

This is pretty much testing "does events-between still do the right thing?". But we already have a whole bunch of tests for events-between.

I would be comfortable not testing this stuff, because the delegation from one level to the other is so direct.

Alternatively, I've been slowly thinking about the following strategy for the past couple of years:

  1. When you have a complex underlying layer that's already tested, layers on top of it (delegating to it) are naturally sort of a superset of the underlying functionality. For example, a valid HTTP request must both be valid at the TCP level AND enforce application-layer semantics at the HTTP layer.
  2. The goal is to not have to re-test the underlying layer at every successive layer above it, especially when a ton of hard work and complexity went into the underlying layer with the goal of making the upper layers simpler and independent of it.

How do you then test "did my Slice call the underlying events-between properly?"
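One option I'd sketch (the names and signatures here are hypothetical stand-ins, not the real codebase): stub the already-tested events-between with with-redefs and assert only that the upper layer passes the right arguments through.

(require '[clojure.test :refer [deftest is]])

;; Hypothetical stand-ins for the real functions:
(defn events-between [db start end] ::real-impl)
(defn slice [db start end] (events-between db start end))

(deftest slice-delegates-to-events-between
  (let [calls (atom [])]
    (with-redefs [events-between (fn [& args]
                                   (swap! calls conj (vec args))
                                   ::stubbed)]
      ;; The assertion is about delegation, not about what
      ;; events-between itself computes -- that's already covered
      ;; by its own tests.
      (is (= ::stubbed (slice :db 1 2)))
      (is (= [[:db 1 2]] @calls)))))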

user=> (:require [datomic.api :as d]
#_=> [clojure.data.codec.base64 :as b64]
#_=> (intake-mixpanel
#_=> [config :as config]
#_=> [util :as util])
#_=> [intake-mixpanel.datomic.schema :as schema])
CompilerException java.lang.ClassNotFoundException: datomic.api, compiling:(/tmp/form-init18989935185954061.clj:1:1)
user=> (require '[datomic.api :as d])
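Why the first form fails: (:require ...) is only a clause inside the ns macro. Evaluated at the REPL it is an ordinary expression, so the unquoted symbol datomic.api gets resolved as a class name, hence the ClassNotFoundException. At the REPL, quote the libspecs instead; in a source file, put the same requires in the ns form (the namespace name below is just illustrative):

(require '[datomic.api :as d]
         '[clojure.data.codec.base64 :as b64]
         '[intake-mixpanel.datomic.schema :as schema])

(ns intake-mixpanel.repl-scratch
  (:require [datomic.api :as d]
            [clojure.data.codec.base64 :as b64]
            [intake-mixpanel.datomic.schema :as schema]))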