@trevormunoz
Last active August 29, 2015 14:07
Documenting a non-optimal behavior of the NYPL Menus API with regard to links
{
  "lowest_price": null,
  "links": [
    {
      "href": "http://menus.nypl.org/api/dishes",
      "rel": "index"
    },
    {
      "href": "http://menus.nypl.org/api/dishes/103372/menus",
      "rel": "menus"
    }
  ],
  "description": null,
  "id": 103372,
  "menus_appeared": 4,
  "last_appeared": 1892,
  "highest_price": null,
  "first_appeared": 1887,
  "times_appeared": 4,
  "name": "Blauwe Landtongsche Oesters"
}

Current behavior:

  "links": [
    {
      "href": "http://menus.nypl.org/api/dishes",
      "rel": "index"
    },
    {
      "href": "http://menus.nypl.org/api/dishes/103372/menus",
      "rel": "menus"
    }
  ]

Desired behavior:

  "links": [
    {
      "href": "http://api.menus.nypl.org/dishes",
      "rel": "index"
    },
    {
      "href": "http://api.menus.nypl.org/dishes/103372/menus",
      "rel": "menus"
    }
  ]

The http://menus.nypl.org/api/ form gets me in trouble with the site's bot/crawler protection. The second form is also the one given in the API docs.
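
Until the API returns the documented form, one possible client-side workaround (just a sketch; the rewrite_link helper below is my own, not part of the Menus API) is to rewrite the returned hrefs into the http://api.menus.nypl.org/ form before following them:

import re

def rewrite_link(href):
    # Turn http://menus.nypl.org/api/<path> into http://api.menus.nypl.org/<path>
    return re.sub(r'^http://menus\.nypl\.org/api/',
                  'http://api.menus.nypl.org/', href)

rewrite_link('http://menus.nypl.org/api/dishes/103372/menus')
# -> 'http://api.menus.nypl.org/dishes/103372/menus'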

# Fetch a single dish record from the NYPL Menus API and print it
import os
import re
import json
import time

import requests

payload = {"token": os.environ['MENUS_API_KEY']}

# Picking a dish at random
dish = 'http://api.menus.nypl.org/dishes/103372'
req = requests.get(dish, params=payload)
if req.status_code == 200:
    resp = json.loads(req.content.decode())
    print(json.dumps(resp, indent=2))

# coding: utf-8
# Fetch the same dish, then follow the first link in its "links" array
import os
import re
import json
import time

import requests

payload = {"token": os.environ['MENUS_API_KEY']}

# Picking a dish at random
dish = 'http://api.menus.nypl.org/dishes/103372'
req = requests.get(dish, params=payload)
if req.status_code == 200:
    resp = json.loads(req.content.decode())
    print(json.dumps(resp, indent=2))

# Follow the first linked resource returned with the dish
follow_link = resp['links'][0]['href']
req2 = requests.get(follow_link, params=payload)
if req2.status_code == 200:
    resp2 = json.loads(req2.content.decode())
    print(json.dumps(resp2, indent=2))
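
Building on the snippet above, a small helper (hypothetical; follow_links is my own name, not something the API provides) could follow every entry in a response's "links" array and collect the related resources keyed by their rel value:

def follow_links(resp, payload):
    # Fetch each linked resource and key it by its "rel" value
    related = {}
    for link in resp.get('links', []):
        r = requests.get(link['href'], params=payload)
        if r.status_code == 200:
            related[link['rel']] = json.loads(r.content.decode())
    return related

# For the dish above this should yield the 'index' and 'menus' resources:
# related = follow_links(resp, payload)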
@trevormunoz (Author)

Cool. Now it's much easier to follow the links returned by the API to related resources.
