Reason:
Some services currently do more than CRUD: they also compute statistics via complex, on-demand queries. This creates opportunities to:

  • Improve query performance.
  • Separate concerns by introducing a dedicated data-processing pipeline that ingests and preprocesses data into query-friendly layouts.

ETL pipeline based on dbt:

raw_invoice_data (JSONB)
         │
         ▼
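The preprocessing step of such a pipeline can be sketched as a simple flattening transform. This is a minimal illustration, not the actual dbt model: all field names (`id`, `customer`, `lines`, `amount`) are hypothetical assumptions about the raw invoice schema.

```javascript
// Hypothetical sketch: flatten a nested raw invoice document (as it
// might sit in a JSONB column) into flat, query-friendly rows.
// Field names are illustrative assumptions, not the real schema.
function flattenInvoice(raw) {
  // Emit one output row per invoice line item.
  return (raw.lines || []).map((line) => ({
    invoice_id: raw.id,
    customer: raw.customer,
    item: line.item,
    amount: line.amount,
  }));
}

// Example: one nested document becomes two flat rows.
const rows = flattenInvoice({
  id: 1,
  customer: 'acme',
  lines: [
    { item: 'a', amount: 10 },
    { item: 'b', amount: 20 },
  ],
});
```

Aggregations that were previously computed on demand can then run over the flat rows instead of the raw JSONB.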

Keybase proof

I hereby claim:

  • I am barbatus on github.
  • I am barbatus (https://keybase.io/barbatus) on keybase.
  • I have a public key ASCRAwwKD6FxAL-1JVEJFSTZQ3RMUvrcB-fpuq-u4cTAOAo

To claim this, I am signing this object:

const puppeteer = require('puppeteer');
const fs = require('fs');
const fetch = require('node-fetch');

// Resolve after the given number of milliseconds.
function timeout(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

const LOGIN_URL = 'https://beta.footballindex.co.uk/top-200';
const PORTFOLIO_URL = 'https://beta.footballindex.co.uk/portfolio';
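The snippet above only sets up dependencies, a delay helper, and two URLs; the rest of the script is not shown. A minimal sketch of how a capture flow over such a page might continue, assuming plain navigation suffices (the 2-second settle delay and launch options are illustrative, and the `timeout` helper is repeated so the sketch is self-contained):

```javascript
// Resolve after the given number of milliseconds (same helper as above).
function timeout(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Hypothetical continuation: open a page and save its rendered HTML.
async function savePageHtml(url, dest) {
  const puppeteer = require('puppeteer'); // required lazily
  const fs = require('fs');
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle2' });
    await timeout(2000); // let client-side rendering settle
    fs.writeFileSync(dest, await page.content());
  } finally {
    await browser.close();
  }
}
```

Usage would be along the lines of `savePageHtml(PORTFOLIO_URL, 'portfolio.html')`, after whatever login steps the real script performs.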
barbatus / capture.js
Created October 24, 2018 11:14
capture webpage
const puppeteer = require('puppeteer');
const parseUrl = require('url').parse;

// Source URL and destination file come from the command line.
const src = process.argv[2];
const dest = process.argv[3];

// Resolve after the given number of milliseconds.
function timeout(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}
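The gist ends here, before the actual capture logic. Given the description ("capture webpage") and the `src`/`dest` arguments, a plausible completion would screenshot the page; this is a hedged sketch, not the original code, and the viewport size, settle delay, and launch options are assumptions (the `timeout` helper is repeated so the block is self-contained):

```javascript
// Resolve after the given number of milliseconds (same helper as above).
function timeout(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Hypothetical completion: screenshot the page at `src` into `dest`.
async function capture(src, dest) {
  const puppeteer = require('puppeteer'); // required lazily
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.setViewport({ width: 1280, height: 800 });
    await page.goto(src, { waitUntil: 'networkidle2' });
    await timeout(1000); // give late-loading assets a moment
    await page.screenshot({ path: dest, fullPage: true });
  } finally {
    await browser.close();
  }
}
```

Invoked as `node capture.js <url> <file.png>`, this would map directly onto the `src` and `dest` variables defined above.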