Sagun Shrestha sagunsh

sagunsh / README-Template.md
Created July 29, 2018 14:02 — forked from paneru-rajan/README-Template.md
[Readme Template] A template for writing a good README #readme #template #git

# Project Title

One paragraph of project description goes here.

## Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.

### Prerequisites
sagunsh / free_email_domain.txt
Created March 8, 2019 16:30
List of free email providers.
0-mail.com
007addict.com
020.co.uk
027168.com
0815.ru
0815.ru0clickemail.com
0815.ry
0815.su
0845.ru
0clickemail.com
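A typical use of a list like this is screening signups for free or disposable addresses. A minimal sketch, assuming the full file has been loaded into a set (only a few domains from above are inlined here):

```python
# Tiny subset of the free-provider list above, inlined for illustration;
# in practice, load the full free_email_domain.txt into a set.
FREE_DOMAINS = {'0-mail.com', '007addict.com', '0815.ru', '0clickemail.com'}


def is_free_email(address: str) -> bool:
    """Return True if the address uses a known free/disposable provider."""
    domain = address.rsplit('@', 1)[-1].lower()
    return domain in FREE_DOMAINS


print(is_free_email('alice@0815.ru'))    # True
print(is_free_email('bob@example.org'))  # False
```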
sagunsh / sofifa.py
Last active September 24, 2019 18:47
Sofifa FIFA 20 scraper
# -*- coding: utf-8 -*-
import scrapy
from scrapy import Request


class SofifaSpider(scrapy.Spider):
    name = 'sofifa'
    allowed_domains = ['sofifa.com']
    start_urls = ['https://sofifa.com/?col=oa&sort=desc&showCol%5B%5D=ae&showCol%5B%5D=hi&showCol%5B%5D=wi&showCol%5B%5D=pf&showCol%5B%5D=oa&showCol%5B%5D=pt&showCol%5B%5D=bo&showCol%5B%5D=bp&showCol%5B%5D=jt&showCol%5B%5D=vl&showCol%5B%5D=wg&showCol%5B%5D=rc&showCol%5B%5D=wk&showCol%5B%5D=sk&showCol%5B%5D=aw&showCol%5B%5D=dw&showCol%5B%5D=ir']
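The long start URL is just the sofifa.com listing sorted by overall rating, with one repeated `showCol[]` query parameter per extra column. As a sketch, the same URL can be assembled with the standard library (the column codes are copied verbatim from the URL above; their meanings are not documented here):

```python
from urllib.parse import urlencode

# Column codes copied from the start URL above.
columns = ['ae', 'hi', 'wi', 'pf', 'oa', 'pt', 'bo', 'bp', 'jt',
           'vl', 'wg', 'rc', 'wk', 'sk', 'aw', 'dw', 'ir']

# urlencode percent-encodes the brackets in 'showCol[]' to %5B%5D,
# matching the hand-written URL.
params = [('col', 'oa'), ('sort', 'desc')] + [('showCol[]', c) for c in columns]
url = 'https://sofifa.com/?' + urlencode(params)
print(url)
```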
sagunsh / weather_app.py
Last active January 19, 2020 09:52
Weather app gist
import requests
from flask import Flask, request

app = Flask(__name__)


@app.route('/city')
def search_city():
    API_KEY = 'your api key'  # initialize your key here
    city = request.args.get('q')  # city name passed as argument
sagunsh / weather_app_basic.py
Created January 19, 2020 11:11
Basic Flask app
from flask import Flask

app = Flask(__name__)


@app.route('/')
def index():
    return '<h1>Welcome to weather app</h1>'
sagunsh / search.py
Created January 19, 2020 11:12
Search route
@app.route('/city')
def search_city():
    API_KEY = 'your api key'  # initialize your key here
    city = request.args.get('q')  # city name passed as argument
    # call the API and convert the response into a Python dictionary
    url = f'http://api.openweathermap.org/data/2.5/weather?q={city}&APPID={API_KEY}'
    response = requests.get(url).json()
    # handle errors such as an unknown city name or an invalid API key
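To handle the error cases flagged in the last comment: OpenWeatherMap signals failures in the `cod` field of the JSON body (e.g. `'404'` for an unknown city) and reports temperatures in Kelvin by default. A minimal sketch of the response handling, with canned dicts standing in for `requests.get(url).json()` (field names mirror the public API; the sample values are made up):

```python
def describe_weather(data: dict) -> str:
    """Turn an OpenWeatherMap-style response dict into a short message."""
    # OWM uses `cod` for the status; error bodies carry a `message` field.
    if str(data.get('cod')) != '200':
        return f"Error: {data.get('message', 'unknown error')}"
    celsius = data['main']['temp'] - 273.15  # API default unit is Kelvin
    return f"{data['name']}: {celsius:.1f}°C, {data['weather'][0]['description']}"


# Canned responses standing in for requests.get(url).json()
ok = {'cod': 200, 'name': 'Kathmandu',
      'main': {'temp': 293.15},
      'weather': [{'description': 'clear sky'}]}
bad = {'cod': '404', 'message': 'city not found'}

print(describe_weather(ok))   # Kathmandu: 20.0°C, clear sky
print(describe_weather(bad))  # Error: city not found
```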
# -*- coding: utf-8 -*-
import scrapy


class RssScraperSpider(scrapy.Spider):
    name = 'rss_scraper'
    allowed_domains = ['examplenews.com']
    start_urls = ['http://examplenews.com/']

    def parse(self, response):
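The `parse` body is cut off in the preview above. Independent of Scrapy, the core of an RSS scraper — pulling the title and link out of each `<item>` — can be sketched with the standard library alone; the feed below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Invented two-item feed standing in for a real response body.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <item><title>First story</title><link>http://examplenews.com/1</link></item>
  <item><title>Second story</title><link>http://examplenews.com/2</link></item>
</channel></rss>"""


def extract_items(rss_text):
    """Return one {'title': ..., 'link': ...} dict per <item> element."""
    root = ET.fromstring(rss_text)
    return [{'title': item.findtext('title'), 'link': item.findtext('link')}
            for item in root.iter('item')]


for entry in extract_items(SAMPLE_RSS):
    print(entry['title'], '->', entry['link'])
```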
sagunsh / Clean websites
Last active February 26, 2020 15:38
List of clean websites that I personally like
https://vuejs.org/
https://formspree.io/
https://newsapi.org/
https://betalist.com/
https://startup.jobs/
https://www.scrapingbee.com/
https://explodingtopics.com/
https://botlist.co/
https://blog.scrapinghub.com/
https://www.ordnancesurvey.co.uk/blog/
# config.py — credentials imported by the script below
CONSUMER_KEY = '<API key>'
CONSUMER_SECRET = '<API secret key>'
ACCESS_TOKEN = '<Access token>'
ACCESS_TOKEN_SECRET = '<Access token secret>'
import sys
import tweepy
from config import CONSUMER_KEY, CONSUMER_SECRET, ACCESS_TOKEN, ACCESS_TOKEN_SECRET
if __name__ == '__main__':
    auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
    auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)

    # Create the API object
    api = tweepy.API(auth)