Oleg Kulyk (kami4ka)

@kami4ka
kami4ka / qa_test_task.md
Last active May 8, 2023 22:12
Systima QA test task
@kami4ka
kami4ka / tech-task.md
Last active September 19, 2023 08:58
Backend Position Technical Task

Implement a REST API that allows users to:

  • Look up info for a particular IP address via https://ipwhois.io/ and store it in the DB
  • Respond with the stored lookup info from the DB if the specific IP was already searched (DB caching)
  • Remove a cached result by IP
  • Cached entries should expire after a TTL of 60 seconds, so the cached result for a particular IP address is refreshed at most once every 60 seconds

Required parts

  • SQL or NoSQL database, or a file
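The TTL behaviour the task asks for can be sketched roughly as below, using an in-memory Map in place of a real database. All names here are illustrative, not part of the task, and the clock is injectable so the expiry logic can be tested without waiting 60 seconds:

```javascript
// Minimal sketch of the DB-caching behaviour described in the task.
// An in-memory Map stands in for the database; names are hypothetical.
const CACHE_TTL_MS = 60 * 1000; // task requirement: 60-second TTL

class IpLookupCache {
  constructor(ttlMs = CACHE_TTL_MS, clock = () => Date.now()) {
    this.ttlMs = ttlMs;
    this.clock = clock; // injectable for testing
    this.entries = new Map(); // ip -> { storedAt, info }
  }

  // Return cached lookup info, or null if absent or older than the TTL.
  get(ip) {
    const entry = this.entries.get(ip);
    if (!entry) return null;
    if (this.clock() - entry.storedAt >= this.ttlMs) {
      this.entries.delete(ip); // auto-expire after TTL
      return null;
    }
    return entry.info;
  }

  // Store a fresh lookup result, resetting its TTL.
  put(ip, info) {
    this.entries.set(ip, { storedAt: this.clock(), info });
  }

  // Explicit removal, per the "remove cached result by IP" requirement.
  remove(ip) {
    this.entries.delete(ip);
  }
}
```

A real implementation would back this with the chosen database and call https://ipwhois.io/ on a cache miss; the expiry check on read is one simple way to satisfy the 60-second refresh rule without a background sweeper.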
@kami4ka
kami4ka / parser.py
Last active October 7, 2022 13:29
Medindia ScrapingAnt parser
import requests
from bs4 import BeautifulSoup

YOUR_API_KEY = '<YOUR_SCRAPINGANT_API_KEY>'

def get_page(page_url):
    # Fetch the page through the ScrapingAnt proxy API (Indian proxy, no browser rendering)
    response = requests.get(
        url='https://api.scrapingant.com/v2/general',
        params={'browser': False, 'url': page_url, 'x-api-key': YOUR_API_KEY, 'proxy_country': 'IN'})
    content = response.content.decode('windows-1252')
    return BeautifulSoup(content, 'html.parser')
const cheerio = require('cheerio');
const ScrapingAnt = require('@scrapingant/scrapingant-client');
const API_KEY = '<YOUR_SCRAPINGANT_API_KEY>';
const URL_TO_SCRAPE = 'https://ra.co/events/1479360';
const BASE_URL = 'https://ra.co';
const client = new ScrapingAnt({ apiKey: API_KEY });
main()
@kami4ka
kami4ka / amazon_batch_error_handling.js
Last active December 21, 2021 18:08
Amazon batch scraper with error handling. ScrapingAnt API used to get data.
/**
 * Amazon Batch Scraper - create a file with a list of keywords and all matching
 * products will be scraped into one CSV file
 *
 * Installation instructions:
 * npm install "@scrapingant/amazon-proxy-scraper"
 * npm install json2csv
 *
 */
const ProductsScraper = require("@scrapingant/amazon-proxy-scraper");
@kami4ka
kami4ka / amazon_batch_scraper.js
Created December 15, 2021 21:53
Scrape Amazon products by a keyword from a file using ScrapingAnt web scraping API
/**
 * Amazon Batch Scraper - create a file with a list of keywords and all matching
 * products will be scraped into one CSV file
 *
 * Installation instructions:
 * npm install "@scrapingant/amazon-proxy-scraper"
 * npm install json2csv
 *
 */
const ProductsScraper = require("@scrapingant/amazon-proxy-scraper");
@kami4ka
kami4ka / dextools-token-price-scraper.js
Created November 24, 2021 20:44
Scrape Dextools token price using ScrapingAnt API
/**
* Get data from MCC DexTools token listing
*
* ScrapingAnt allows you to scrape for free using proxy servers
*
* npm install @scrapingant/scrapingant-client
* npm install cheerio
**/
const cheerio = require('cheerio');
@kami4ka
kami4ka / dextools-scraper.js
Created November 23, 2021 21:28
Scrape new tokens from Dextools using ScrapingAnt API
/**
* Get data from new DexTools token listings
*
* ScrapingAnt allows you to scrape for free using proxy servers
*
* npm install @scrapingant/scrapingant-client
* npm install cheerio
**/
const cheerio = require('cheerio');
@kami4ka
kami4ka / reddit-scraper.js
Created November 23, 2021 17:31
Reddit scraping with ScrapingAnt
/**
* Get data from Reddit
*
* ScrapingAnt allows you to scrape for free using proxy servers
*
* npm install @scrapingant/scrapingant-client
* npm install cheerio
**/
const cheerio = require('cheerio');
@kami4ka
kami4ka / coinmarketcap-scraper.js
Created November 23, 2021 08:41
Scrape new tokens from CoinMarketCap using ScrapingAnt API
/**
* Get data from new CoinMarketCap token listings
*
* ScrapingAnt allows you to scrape for free using proxy servers
*
* npm install @scrapingant/scrapingant-client
* npm install cheerio
**/
const cheerio = require('cheerio');