
Darren Haken (darrenhaken)

schema_evolution_of_events.json

```json
{
  "name": "Full Name",
  // Gets stored as advert_context_1
  "advert_context": {
    "id": "123",
    "name": "advert_name"
  }
  // As we evolve the schema and add another field, we need to create a new column
  // for it, because BigQuery does not allow modifications to Structs/Records.
}
```
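A sketch of how the resulting rows could look in BigQuery after one round of evolution, under the versioned-column workaround described in the comment: the original struct keeps its column, and the evolved struct is written to a new one. The `placement` field and the `advert_context_2` column name are hypothetical illustrations, not taken from the original gist.

```json
{
  "name": "Full Name",
  // Original struct stays untouched in its existing column
  "advert_context_1": {
    "id": "123",
    "name": "advert_name"
  },
  // Evolved struct lands in a new versioned column
  "advert_context_2": {
    "id": "123",
    "name": "advert_name",
    "placement": "homepage"
  }
}
```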
Last active Dec 20, 2018

Dataflow to ingest data from BigQuery and sink into GCS as Parquet

```java
Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
Pipeline pipeline = Pipeline.create(options);
BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();

// Could this schema be parsed into an Avro Schema without manually constructing it as code?
Schema schema = bigQuery.getTable("", "").getDefinition().getSchema();
```
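The comment above asks whether the BigQuery table schema could be turned into an Avro schema without hand-writing it. A minimal sketch of that idea, using only the standard library: the `BqToAvro` class and its `toAvroSchemaJson` helper are hypothetical names, and the helper takes plain (field name, BigQuery type) pairs so the sketch stays self-contained — a real pipeline would walk `schema.getFields()` from the BigQuery client instead, and would also have to handle modes and nested RECORD fields.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BqToAvro {
    // Minimal mapping from BigQuery legacy type names to Avro primitive types.
    static final Map<String, String> TYPE_MAP = Map.of(
        "STRING", "string",
        "INTEGER", "long",
        "FLOAT", "double",
        "BOOLEAN", "boolean",
        "BYTES", "bytes");

    // Build an Avro record schema (as JSON text) from (name, BigQuery type) pairs.
    static String toAvroSchemaJson(String recordName, LinkedHashMap<String, String> fields) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"type\":\"record\",\"name\":\"").append(recordName).append("\",\"fields\":[");
        boolean first = true;
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (!first) sb.append(',');
            first = false;
            sb.append("{\"name\":\"").append(e.getKey())
              .append("\",\"type\":\"").append(TYPE_MAP.get(e.getValue())).append("\"}");
        }
        return sb.append("]}").toString();
    }

    public static void main(String[] args) {
        LinkedHashMap<String, String> fields = new LinkedHashMap<>();
        fields.put("id", "STRING");
        fields.put("name", "STRING");
        // Prints: {"type":"record","name":"advert_context","fields":[{"name":"id","type":"string"},{"name":"name","type":"string"}]}
        System.out.println(toAvroSchemaJson("advert_context", fields));
    }
}
```

Avro schemas are plain JSON, so for flat tables the resulting string can be fed straight to `new Schema.Parser().parse(...)`; nested RECORD fields would need a recursive version of the same walk.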
gist:57496ed5961a3f93f2c54a87c60abec9

```json
{
  "ip_prefix": "",
  "region": "eu-west-1",
  "service": "AMAZON"
},
{
  "ip_prefix": "",
  "region": "eu-west-1",
  "service": "AMAZON"
}
```

Keybase proof

I hereby claim:

  • I am darrenhaken on github.
  • I am darrenhakenat on keybase.
  • I have a public key ASC85Ge6y1_aGrjhu8i446rjoKPGgyrb-VraS0CiqOFgzgo

To claim this, I am signing this object:
