Robb Shecter (dogweather)

dogweather / Dockerfile (last active Sep 13, 2020)
Elixir/Phoenix Docker-based Development Environment
# Elixir + Phoenix
FROM elixir:1.6.1

# Install Debian packages. Update and install in a single RUN so the
# cached package index can't go stale between layers.
RUN apt-get update && \
    apt-get install --yes build-essential inotify-tools postgresql-client

# Install Phoenix packages
RUN mix local.hex --force
dogweather / breadcrumbs.rb (created Sep 4, 2020)
Breadcrumbs, OO version
# typed: false
# frozen_string_literal: true

require 'json'

module PublicLaw
  # Represents a breadcrumb trail on a page.
  class Breadcrumbs
    include ActionView::Helpers
    extend T::Sig
dogweather / dig_bang.rb (last active Feb 18, 2020)
DigBang: Safely unsafe hash traversal
# `Hash#dig!`
#
# Like Ruby 2.3's `Hash#dig`, but raises an exception instead of returning `nil`
# when a key isn't found.
#
# Ruby 2.3 introduces the new Hash#dig method for safe extraction of
# a nested value. See http://ruby-doc.org/core-2.3.0/Hash.html#method-i-dig.
# It is the equivalent of a safely repeated Hash#[].
#
# `#dig!`, on the other hand, is the equivalent of a repeated `Hash#fetch`: a
# missing key raises a `KeyError` instead of quietly returning `nil`.
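The Ruby implementation is cut off in this preview. As an illustrative sketch of the same idea in Python, where subscripting a dict already raises `KeyError`, a hypothetical `deep_fetch` helper (name invented here) can fold the keys over repeated lookups:

```python
from functools import reduce

def deep_fetch(mapping, *keys):
    """Traverse nested mappings; a missing key raises KeyError
    instead of quietly producing None, as a dig-style lookup would."""
    return reduce(lambda acc, key: acc[key], keys, mapping)

data = {"a": {"b": {"c": 1}}}
deep_fetch(data, "a", "b", "c")       # 1

try:
    deep_fetch(data, "a", "x", "c")
except KeyError as err:
    print("missing key:", err)        # missing key: 'x'
```

The failure mode is the point: the exception names the first key that was absent, rather than surfacing a `nil`/`None` far from where the data went missing.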
dogweather / de_dup_adjacents.rb
# Remove adjacent duplicates in an array.
#
# @param [Array] a_list the array to de-dup
# @return [Array] the processed list of items
# @note Created to help display a user's log entries.
def de_dup(a_list)
  a_list.reduce([]) do |new_list, elem|
    new_list + (new_list.last.eql?(elem) ? [] : [elem])
  end
end
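For comparison, a sketch of the same adjacent de-duplication in Python, using the standard library's `itertools.groupby`, which groups consecutive runs of equal elements:

```python
from itertools import groupby

def de_dup(a_list):
    """Collapse each run of adjacent duplicates down to one element."""
    return [key for key, _run in groupby(a_list)]

de_dup([1, 1, 2, 2, 2, 1])  # [1, 2, 1]
```

Note that, like the Ruby version, this only removes *adjacent* duplicates: the trailing `1` survives because it isn't next to the leading run of `1`s.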
dogweather / keybase.md

Keybase proof

I hereby claim:

  • I am dogweather on github.
  • I am robb_shecter (https://keybase.io/robb_shecter) on keybase.
  • I have a public key ASDrH8ifPTGHxZey0nn4PV67Y4l6LfWqXGBUXqgeUEhwgAo

To claim this, I am signing this object:

dogweather / intro-to-lists.md

Introduction to Lists

We just talked about Python's basic data types: string, integer, and floating point objects. Python variables, then, are like labels for objects; they're a way to refer to them.

But when writing real programs, one soon wishes for more. Pretty much every program works with lots of objects, and giving each object its own variable gets tedious fast:
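The example is cut off in this preview; a minimal sketch of the contrast it is setting up (the variable names here are invented):

```python
# Tedious: a separate variable for every object.
score_1 = 89
score_2 = 94
score_3 = 77
total = score_1 + score_2 + score_3   # 260

# A list gives them all one name, and keeps them in order.
scores = [89, 94, 77]
total = sum(scores)     # 260 again
first = scores[0]       # 89
count = len(scores)     # 3
```

With the list, adding a fourth score means appending one element, not inventing a fourth variable and editing every expression that touches the scores.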

dogweather / scrapy_example_return_tree_object.py
@classmethod
def from_crawler(cls, crawler, *args, **kwargs):
    """Register to receive the idle event."""
    spider = super(SecureSosStateOrUsSpider, cls).from_crawler(
        crawler, *args, **kwargs
    )
    crawler.signals.connect(spider.spider_idle, signal=signals.spider_idle)
    return spider

def spider_idle(self, spider):
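The gist is cut off at `spider_idle`. A self-contained sketch of the pattern it implements, with a small stand-in for `crawler.signals` so nothing here requires Scrapy; the `data_submitted` flag and `sports` tree come from the later snippets, while the class and variable names are otherwise assumptions:

```python
class FakeSignals:
    """Stand-in for crawler.signals: connect handlers, fire a named signal."""
    def __init__(self):
        self._handlers = {}

    def connect(self, handler, signal):
        self._handlers.setdefault(signal, []).append(handler)

    def send(self, signal):
        for handler in self._handlers.get(signal, []):
            handler()

SPIDER_IDLE = "spider_idle"

class TreeSpider:
    """Collects parse results into one tree, emits it once when idle fires."""
    def __init__(self, bus):
        self.data_submitted = False       # guard against submitting twice
        self.sports = {"categories": []}  # the tree; field name from the later snippet
        self.emitted = []                 # what would be yielded at the end
        bus.connect(self.spider_idle, signal=SPIDER_IDLE)

    def spider_idle(self):
        if not self.data_submitted:
            self.data_submitted = True
            self.emitted.append(self.sports)

bus = FakeSignals()
spider = TreeSpider(bus)
spider.sports["categories"].append({"name": "Soccer", "brands": []})
bus.send(SPIDER_IDLE)
bus.send(SPIDER_IDLE)   # a second idle event: the guard prevents a duplicate
```

The flag matters because the idle signal can fire repeatedly; without it, the spider would re-submit the tree (and, in real Scrapy, schedule work that keeps it from ever finishing).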
dogweather / spider-setup.py
def __init__(self):
    super().__init__()

    # A flag, set after post-processing is finished, to
    # avoid an infinite loop.
    self.data_submitted = False

    # The object to return for conversion to a JSON tree.
    # All the parse methods add their results to this
    # structure.
dogweather / scrapy-next-level-down.py
# Pull the category back out of the meta dict.
parent_category = response.meta["category"]
# Create a new items.Brand with the scraped data.
# ...
# Add the new brand to its parent in the tree.
parent_category["brands"].append(brand)
dogweather / scrapy-saving-data.py
# Create a new Category to hold the scraped info. Also,
# prepare it for holding its brands.
category = items.Category(number="...", name="...", url="...", brands=[])
# Save the category into the tree structure.
self.sports["categories"].append(category)
# Create a request for the Category's page, which
# will list all its Brands.
# Pass the Category Item in the meta dict.
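Put together, the hand-off works because the Category carried through `meta` is the same object already sitting in the tree, so mutating it inside the next parse method updates `self.sports` too. A sketch of that flow without Scrapy, using plain dicts and invented values:

```python
# The tree being built (see spider-setup.py above).
sports = {"categories": []}

# scrapy-saving-data.py: create a Category and store it in the tree.
category = {"number": "1", "name": "Soccer",
            "url": "https://example.com/soccer", "brands": []}
sports["categories"].append(category)

# Scrapy would carry the Category on Request(..., meta={"category": category});
# a plain dict stands in for response.meta here.
meta = {"category": category}

# scrapy-next-level-down.py: in the next parse method, pull it back out
# and attach the newly scraped Brand to it.
parent_category = meta["category"]
brand = {"name": "Acme"}
parent_category["brands"].append(brand)

sports["categories"][0]["brands"]   # [{'name': 'Acme'}]; same object, so the tree sees it
```

No return values are threaded between callbacks; shared mutable state plus the idle-signal flush from the first snippet is what assembles the final JSON tree.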