Shafayeat Kabir Sumit (shafayeatsumit) · Dhaka, Bangladesh
👓 Working on my algo skills
@shafayeatsumit
shafayeatsumit / wsh1.replica
Created June 25, 2016 17:10
Log file from the wsh1 replica set going down
2016-06-25T16:50:15.563+0000 I - [conn6803] Assertion: 13538:couldn't open [/proc/10360/stat] errno:24 Too many open files
2016-06-25T16:50:15.750+0000 I - [conn6850] Assertion: 13538:couldn't open [/proc/10360/stat] errno:24 Too many open files
2016-06-25T16:50:16.564+0000 I - [conn6803] Assertion: 13538:couldn't open [/proc/10360/stat] errno:24 Too many open files
2016-06-25T16:50:16.752+0000 I - [conn6850] Assertion: 13538:couldn't open [/proc/10360/stat] errno:24 Too many open files
2016-06-25T16:50:17.022+0000 E STORAGE WiredTiger (24) [1466873417:22552][10360:0x7f5b235a4700], file:WiredTiger.wt, session.checkpoint: WiredTiger.turtle: fopen: Too many open files
2016-06-25T16:50:17.077+0000 E STORAGE WiredTiger (24) [1466873417:77752][10360:0x7f5b235a4700], checkpoint-server: checkpoint server error: Too many open files
2016-06-25T16:50:17.077+0000 E STORAGE WiredTiger (-31804) [1466873417:77854][10360:0x7f5b235a4700], checkpoint-server: the process must exit and restart: WT_
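Every assertion above is errno 24, "Too many open files": mongod exhausted its per-process file-descriptor limit and WiredTiger eventually panics and forces a restart. As a hedged aside (not part of the log), the limit can be inspected and raised from Python with the resource module; the 64000 target below is only illustrative:

import resource

# Check the current soft/hard open-file limits for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("open files: soft=%d hard=%d" % (soft, hard))

# Raise the soft limit toward the hard limit (target value is illustrative, not from the gist).
resource.setrlimit(resource.RLIMIT_NOFILE, (min(64000, hard), hard))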
            s2  1
        { "_id" : { "$minKey" : 1 } } -->> { "_id" : { "$maxKey" : 1 } } on : s2 Timestamp(1, 0)
    finderdb2.assets
        shard key: { "_id" : 1 }
        chunks:
            s0  1
            s1  1
            s2  2
        { "_id" : { "$minKey" : 1 } } -->> { "_id" : ObjectId("53f8734ffd0d85079fa13207") } on : s0 Timestamp(2, 0)
        { "_id" : ObjectId("53f8734ffd0d85079fa13207") } -->> { "_id" : ObjectId("54aac54afd0d852a313002a6") } on : s1 Timestamp(3, 0)
@shafayeatsumit
shafayeatsumit / scrape.py
Last active October 24, 2016 17:17
Scrape the first few pages of Google search results
from bs4 import BeautifulSoup
import urllib, urllib2
import re

def google_scrape(query):
    # Build a Google search URL asking for 100 results starting at offset 0.
    address = "http://www.google.com/search?q=%s&num=100&hl=en&start=0" % (urllib.quote_plus(query))
    # Spoof a browser User-Agent so Google serves the normal HTML results page.
    request = urllib2.Request(address, None, {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_4) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11'})
    urlfile = urllib2.urlopen(request)
    page = urlfile.read()
    soup = BeautifulSoup(page, "html.parser")
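The preview cuts off once the page is parsed; a hedged sketch of how the result links might be collected from the soup (the /url?q= unwrapping is an assumption about Google's markup, not the gist's own code):

    # Hypothetical continuation, not in the original gist: collect result links.
    links = []
    for anchor in soup.find_all('a', href=True):
        href = anchor['href']
        # Google wraps result URLs as /url?q=<target>&...; keep only those.
        if href.startswith('/url?q='):
            links.append(href[len('/url?q='):].split('&')[0])
    return links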
@shafayeatsumit
shafayeatsumit / emailscrapper.py
Created October 26, 2016 10:45
How to: use Python and Google to find hundreds of e-mail addresses
#!/usr/bin/python
import sys
import re
import string
import httplib
import urllib2

def StripTags(text):
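The preview stops at the function header; a hedged guess at how it continues (the regular expressions and the ExtractEmails helper are assumptions, not the gist's own code):

    # Hypothetical body, not from the gist: drop anything that looks like an HTML tag.
    return re.sub(r'<[^>]*>', '', text)

def ExtractEmails(text):
    # Hypothetical helper: rough e-mail pattern, good enough for harvesting rather than validation.
    return re.findall(r'[\w.+-]+@[\w-]+\.[\w.-]+', text)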
@shafayeatsumit
shafayeatsumit / config.yml
Created January 16, 2017 10:13
MongoDB config file in YAML format
storage:
  dbPath: "/data/db"
  engine: "wiredTiger"
systemLog:
  destination: file
  path: "/var/log/mongodb.log"
  logAppend: true
  timeStampFormat: iso8601-utc
replication:
  oplogSizeMB: 10240
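As a hedged aside (not part of the gist), the file is plain YAML, so it can be sanity-checked before being handed to mongod via --config; the /etc/mongod.conf path and the use of PyYAML are assumptions:

import yaml

# Parse the config and spot-check a couple of values before restarting mongod.
with open("/etc/mongod.conf") as fh:
    cfg = yaml.safe_load(fh)

print(cfg["storage"]["dbPath"])           # expect "/data/db"
print(cfg["replication"]["oplogSizeMB"])  # expect 10240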
@shafayeatsumit
shafayeatsumit / ffmpeg.py
Last active February 15, 2017 17:58
ffmpeg
http://video.stackexchange.com/questions/19867/how-to-fade-in-out-a-video-audio-clip-with-unknown-duration
http://www.bogotobogo.com/FFMpeg/ffmpeg_fade_in_fade_out_transitions_effects_filters_two_slides.php
http://www.bogotobogo.com/FFMpeg/ffmpeg_fade_in_fade_out_transitions_effects_filters.php
http://superuser.com/questions/833232/create-video-with-5-images-with-fadein-out-effect-in-ffmpeg
http://hamelot.io/visualization/using-ffmpeg-to-convert-a-set-of-images-into-a-video/
https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/image_sequence
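The links above all deal with fade-in/fade-out filters and turning image sequences into video; a hedged Python sketch of the kind of ffmpeg command they describe (file names and durations are assumptions):

import subprocess

# Fade the first and last second of a clip in and out, assuming a 10 s input;
# file names and timings are illustrative, not from the gist.
subprocess.check_call([
    "ffmpeg", "-y", "-i", "input.mp4",
    "-vf", "fade=t=in:st=0:d=1,fade=t=out:st=9:d=1",
    "-af", "afade=t=in:st=0:d=1,afade=t=out:st=9:d=1",
    "output.mp4",
])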
from selenium.webdriver.common.proxy import *
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
proxy = {'address': '191.101.24.45:8085'}
capabilities = dict(DesiredCapabilities.CHROME)

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
import time
import re
#driver = webdriver.Remote(command_executor='http://127.0.0.1:4444/wd/hub', desired_capabilities=DesiredCapabilities.CHROME)
driver = webdriver.Firefox()

from selenium import webdriver
import time
import re
import math
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
driver = webdriver.Remote(command_executor='http://127.0.0.1:4444/wd/hub', desired_capabilities=DesiredCapabilities.CHROME)
#driver = webdriver.Firefox()

import requests
import socket
import socks
import csv
import sqlite3
import datetime
from scrapy.http import HtmlResponse
socks.set_default_proxy(socks.SOCKS5, "proxy.torguard.io", 1090, username="", password="")
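The preview ends right after the SOCKS5 proxy is registered with PySocks; a hedged sketch of the usual next step, routing subsequent sockets (and therefore requests) through that proxy; the test URL is only illustrative:

# Hypothetical continuation, not from the gist: monkey-patch socket so every
# new connection, including ones opened by requests, goes through the proxy.
socket.socket = socks.socksocket

resp = requests.get("https://api.ipify.org")  # illustrative IP-echo endpoint
print(resp.status_code, resp.text)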