Kenneth Wong (@KHwong12)
KHwong12 / nakau.py (last active March 7, 2019): Nakau stores data scraping - 1
import requests
import csv
from bs4 import BeautifulSoup
from requests.exceptions import HTTPError


def get_data_nakau(storeid):
    """Get the data of the store with given storeid."""
    # (Function body truncated in the source.)


def main():
    """Get the details of the shop by brute-force searching the id of the shop."""
    # Approximate minimum and maximum storeid are searched manually
    storeid_min = 2050
    storeid_max = 2600
    outFile = 'nakau_rawdata.csv'
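The gist breaks off before the fetch-and-write loop. As a rough sketch of the brute-force pattern the docstring describes, here is the loop with the network call stubbed out: the real store URL and page structure are not shown in the gist, so `fetch_store` below is a hypothetical placeholder that returns `None` for missing ids.

```python
import csv
import io

def fetch_store(storeid):
    # Hypothetical stand-in for the real request/parse step;
    # returns None when no store exists for this id.
    demo = {2050: ("Nakau Shibuya", "Tokyo"), 2052: ("Nakau Umeda", "Osaka")}
    return demo.get(storeid)

def scrape_range(storeid_min, storeid_max, out):
    writer = csv.writer(out)
    writer.writerow(["storeid", "name", "prefecture"])
    for storeid in range(storeid_min, storeid_max + 1):
        record = fetch_store(storeid)
        if record is None:
            continue  # skip ids with no store behind them
        writer.writerow([storeid, *record])

buf = io.StringIO()
scrape_range(2050, 2055, buf)
print(buf.getvalue())
```

Skipping missing ids rather than aborting is what makes the brute-force range search workable: most ids between the manually found minimum and maximum do not exist.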
import arcpy

# The hexagon bin feature class and the related fields
table = 'hex_bins_MYS'
fields = ["Matsuya", "Yoshinoya", "Sukiya", "Max_No", "MainChain"]

with arcpy.da.UpdateCursor(table, fields) as cursor:
    for row in cursor:
        Mainchain = ""
        for column_no in range(3):
            # (Loop body truncated in the source.)
            pass
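The cursor loop is cut off in the scrape, but the field list (three chain counts plus `Max_No` and `MainChain`) suggests it records, for each hexagon, the highest store count and which chain holds it. A standalone Python sketch of that comparison logic, independent of arcpy (field names are from the gist; the row layout and the tie-handling format are assumptions):

```python
CHAINS = ["Matsuya", "Yoshinoya", "Sukiya"]

def main_chain(row):
    # row holds the three chain counts in the same order as CHAINS,
    # mirroring fields[0:3] in the UpdateCursor above.
    max_no = max(row[:3])
    # Ties are reported as a combined label, e.g. "Matsuya/Sukiya" (assumption).
    winners = [name for name, count in zip(CHAINS, row[:3]) if count == max_no]
    return max_no, "/".join(winners)

print(main_chain([5, 2, 5]))  # → (5, 'Matsuya/Sukiya')
```

Inside the real cursor loop, the two results would be written back into `row[3]` and `row[4]` followed by `cursor.updateRow(row)`.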
[out:json][timeout:25];
// fetch area “HK” to search in
{{geocodeArea:HK}}->.searchArea;
// gather results
(
  // query part for: “route=ferry”
  way["route"="ferry"](area.searchArea);
  relation["route"="ferry"](area.searchArea);
);
// print results
out body;
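With `[out:json]` set, the Overpass API returns the matched objects in an `elements` array. A small Python sketch of pulling the ferry-route ways and relations out of such a response (the sample payload below is illustrative, not real data):

```python
import json

# Illustrative Overpass-style response: two ferry routes and one unrelated node.
sample = json.dumps({
    "version": 0.6,
    "elements": [
        {"type": "way", "id": 111, "tags": {"route": "ferry"}},
        {"type": "relation", "id": 222, "tags": {"route": "ferry", "name": "Star Ferry"}},
        {"type": "node", "id": 333},
    ],
})

data = json.loads(sample)
# Keep only ways and relations tagged route=ferry, mirroring the query above.
ferries = [e for e in data["elements"]
           if e["type"] in ("way", "relation")
           and e.get("tags", {}).get("route") == "ferry"]
print([e["id"] for e in ferries])  # → [111, 222]
```

Guarding with `e.get("tags", {})` matters because skeleton elements (plain nodes) often arrive without a `tags` key at all.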
---
title: "Day length plot"
author: "Kenneth Wong"
date: "6/21/2021"
output:
  md_document:
    variant: markdown_github
---
```{r setup, include=FALSE}
library(dplyr)
library(tidyr)
library(lubridate)
library(sf)
library(fst)
```

```{r}
hk_collisions <- hk_collisions %>%
  # Replace "SSPO" district abbr in 2020 and 2021 data to "SSP"
  mutate(district = ifelse(district == "SSPO", "SSP", district)) %>%
```
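The `ifelse` recode above is a plain value substitution; the same step can be expressed outside the tidyverse. A minimal Python equivalent (the district values are assumed to be plain strings, and the sample list is illustrative):

```python
def fix_district(district):
    # Replace the "SSPO" abbreviation used in the 2020/2021 data with "SSP"
    return "SSP" if district == "SSPO" else district

districts = ["SSP", "SSPO", "YTM", "SSPO"]
print([fix_district(d) for d in districts])  # → ['SSP', 'SSP', 'YTM', 'SSP']
```

Normalising the inconsistent abbreviation before any grouping or joining keeps the 2020–2021 records from splitting Sham Shui Po into two apparent districts.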