
zthomas / .block
Created Aug 14, 2020 — forked from nbremer/.block
Radar Chart Redesign
height: 600
license: mit
zthomas / tsx
Created Jul 26, 2020
React Google Charts - Material Bar Chart
const data = [
  ['Year', 'Sales'],
  ['2014', 1000],
  ['2015', 1200],
];
zthomas / intercom-delete-old-users.js
Last active Mar 3, 2022
Script to delete and clear old users from intercom. Useful for lowering the monthly bill
// License: MIT, feel free to use it!
const Intercom = require('intercom-client');
const async = require('async-q');

const appId = 'APP_ID';
const apiKey = 'APP_KEY';
const client = new Intercom.Client(appId, apiKey);

// WARNING: Intercom allows only one active scroll at a time; you need to
// wait for that scroll to clear before trying again.
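The deletion decision itself reduces to a predicate over each user's `last_request_at` timestamp. A minimal sketch of that check (the `isStaleUser` name and the 90-day cutoff are illustrative assumptions, not part of the original script):

```javascript
// Decide whether an Intercom user is "old" enough to delete.
// lastRequestAt: unix timestamp in SECONDS (as Intercom reports it),
// now: current time in milliseconds, cutoffDays: retention window.
function isStaleUser(lastRequestAt, now, cutoffDays) {
  if (!lastRequestAt) return true;            // never seen -> treat as stale
  const ageMs = now - lastRequestAt * 1000;   // convert seconds to ms
  return ageMs > cutoffDays * 24 * 60 * 60 * 1000;
}
```

Each user pulled from the scroll can be run through this predicate before issuing a delete, so the expensive API calls only happen for users past the cutoff.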

The introduction to Reactive Programming you've been missing

(by @andrestaltz)

So you're curious about learning this new thing called (Functional) Reactive Programming (FRP).

Learning it is hard, and the lack of good material makes it even harder. When I started, I looked for tutorials. I found only a handful of practical guides, and they just scratched the surface, never tackling the challenge of building a whole architecture around it. Library documentation often doesn't help when you're trying to understand a particular function. I mean, honestly, look at this:

Rx.Observable.prototype.flatMapLatest(selector, [thisArg])

> Projects each element of an observable sequence into a new sequence of observable sequences by incorporating the element's index and then transforms an observable sequence of observable sequences into an observable sequence producing values only from the most recent observable sequence.
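That sentence is easier to digest with a toy model in front of you. Below is a minimal sketch (plain JavaScript, not RxJS; `subject` and the stream shape here are illustrative assumptions) of the one behavior that matters: each outer value is projected into an inner stream, and only the most recent inner stream's values reach the output.

```javascript
// A tiny push-based stream: subscribers get every value passed to next().
function subject() {
  const listeners = new Set();
  return {
    subscribe(next) {
      listeners.add(next);
      return () => listeners.delete(next); // unsubscribe handle
    },
    next(value) { listeners.forEach(fn => fn(value)); },
  };
}

// The essence of flatMapLatest: project each outer value to an inner
// stream, and unsubscribe from the previous inner stream whenever a new
// outer value arrives, so only the latest inner stream is heard.
function flatMapLatest(source, selector) {
  return {
    subscribe(next) {
      let unsubscribeInner = null;
      return source.subscribe(value => {
        if (unsubscribeInner) unsubscribeInner(); // drop the stale inner stream
        unsubscribeInner = selector(value).subscribe(next);
      });
    },
  };
}
```

With two inner streams, emitting on the first one after the outer stream has switched to the second produces nothing: that subscription was already dropped. That is the entire "most recent observable sequence" clause of the quote above.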

cost_function.m
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for regularized logistic regression
%   J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

m = length(y);                    % number of training examples
h = 1 ./ (1 + exp(-X * theta));   % sigmoid hypothesis
% Cross-entropy cost plus L2 penalty; theta(1), the bias, is not regularized
J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
    + (lambda / (2*m)) * sum(theta(2:end).^2);
grad = (1/m) * (X' * (h - y));
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end