
# Remove adjacent duplicates in an array.
#
# @param [Array] a_list the array to de-dup
# @return [Array] the processed list of items
# @note Created to help display a user's log entries.
def de_dup(a_list)
  a_list.reduce([]) do |new_list, elem|
    new_list + (new_list.last.eql?(elem) ? [] : [elem])
  end
end
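For comparison, the same adjacent-dedup idea is a short sketch in Python, using itertools.groupby, which clusters runs of consecutive equal elements:

```python
from itertools import groupby

def de_dup(a_list):
    # groupby yields one (key, run) pair per run of adjacent equal
    # elements; keeping only the keys removes adjacent duplicates
    # while preserving non-adjacent repeats.
    return [key for key, _run in groupby(a_list)]

print(de_dup([1, 1, 2, 2, 2, 3, 1]))  # → [1, 2, 3, 1]
```

Note that, like the Ruby version, this collapses only *adjacent* duplicates: the trailing 1 survives because it is not next to the leading run of 1s.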

Keybase proof

I hereby claim:

  • I am dogweather on github.
  • I am robb_shecter (https://keybase.io/robb_shecter) on keybase.
  • I have a public key ASDrH8ifPTGHxZey0nn4PV67Y4l6LfWqXGBUXqgeUEhwgAo

To claim this, I am signing this object:

Introduction to Lists

We just talked about Python's basic data types: string, integer, and floating-point objects. Python variables, then, are like labels for objects; they're a way to refer to them.
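A quick sketch of the label idea: two names can be stuck on the same object, so a change made through one name is visible through the other.

```python
a = [1, 2, 3]
b = a              # "b" is a second label on the same object, not a copy
b.append(4)
print(a)           # [1, 2, 3, 4] — one object, two names
print(a is b)      # True: both labels refer to the same object
```

This is why assignment in Python is cheap: it copies a reference, never the object itself.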

But when writing real programs, one soon wishes for more. For example, pretty much every program deals with lots of objects, and giving each one its own variable gets tedious fast:

# Pull the category back out of the meta dict.
parent_category = response.meta["category"]
# Create a new items.Brand with the scraped data.
# ...
# Add the new brand to its parent in the tree.
parent_category["brands"].append(brand)
# Create a new Category to hold the scraped info. Also,
# prepare it for holding its brands.
category = items.Category(number="...", name="...", url="...", brands=[])
# Save the category into the tree structure.
self.sports["categories"].append(category)
# Create a request for the Category's page, which
# will list all its Brands.
# Pass the Category Item in the meta dict.
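The steps in the comments above can be sketched without Scrapy, using plain dicts as stand-ins for the items.Category and items.Brand classes (the field values here are illustrative, not the gist's data):

```python
# The tree structure the parse methods build up.
sports = {"categories": []}

# Create a new Category to hold the scraped info; the empty
# "brands" list prepares it for holding its brands.
category = {
    "kind": "Category",
    "number": "101",
    "name": "Skateboards",
    "url": "https://sports.com/cat-101",
    "brands": [],
}

# Save the category into the tree structure.
sports["categories"].append(category)

# Later, in the next callback, the category is pulled back out
# (in the real spider, via response.meta["category"]) and the
# scraped brand is added to its parent in the tree.
parent_category = sports["categories"][0]
brand = {"kind": "Brand", "name": "..."}
parent_category["brands"].append(brand)

print(sports)
```

The point of passing the Category through the request's meta dict is exactly this: the brand-parsing callback needs a reference to its parent node so it can append into the shared tree.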
def __init__(self):
    super().__init__()
    # A flag, set after post-processing is finished, to
    # avoid an infinite loop.
    self.data_submitted = False
    # The object to return for conversion to a JSON tree.
    # All the parse methods add their results to this
    # structure.
    self.sports = {"categories": []}
# Example of the resulting JSON tree (truncated):
{
    "categories": [
        {
            "kind": "Category",
            "number": "101",
            "name": "Skateboards",
            "url": "https://sports.com/cat-101",
            "brands": [
                {
                    "kind": "Brand",
@classmethod
def from_crawler(cls, crawler, *args, **kwargs):
    """Register to receive the idle event."""
    spider = super(SecureSosStateOrUsSpider, cls).from_crawler(
        crawler, *args, **kwargs
    )
    crawler.signals.connect(spider.spider_idle, signal=signals.spider_idle)
    return spider
def spider_idle(self, spider):
    """Called when the spider has run out of requests to crawl."""
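The body of spider_idle is cut off above, but the data_submitted flag set in __init__ hints at the usual shape: act on the first idle event only. A Scrapy-free sketch of that guard (the class and return values here are stand-ins, not the gist's code):

```python
class IdleGuard:
    """Minimal stand-in showing how a data_submitted flag keeps an
    idle handler from re-submitting its results forever."""

    def __init__(self):
        self.data_submitted = False
        self.sports = {"categories": []}

    def spider_idle(self):
        # Only the first idle event triggers submission; later idle
        # events see the flag and do nothing, avoiding the infinite
        # loop the comment in __init__ warns about.
        if self.data_submitted:
            return None
        self.data_submitted = True
        return self.sports

guard = IdleGuard()
print(guard.spider_idle())  # {'categories': []}
print(guard.spider_idle())  # None — the flag blocks a second submission
```

In a real Scrapy spider the handler would schedule one final request carrying the finished tree rather than returning it, but the guard logic is the same.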
$ rails new --derp myapp
create
create README.md
create Rakefile
create .ruby-version
create config.ru
create .gitignore
create Gemfile
(etc.)