Atindra Bandi (bandiatindra)

Last few layers of YOLO architecture
# Keras layers used below (Keras 2.x API)
from keras.layers import Conv2D, BatchNormalization, LeakyReLU, Lambda

# Layer 20
x = Conv2D(1024, (3,3), strides=(1,1), padding='same', name='conv_20', use_bias=False)(x)
x = BatchNormalization(name='norm_20')(x)
x = LeakyReLU(alpha=0.1)(x)

# Layer 21: a 1x1 convolution on the skip connection, followed by a
# space-to-depth reshuffle so it matches the 19x19 detection head
skip_connection = Conv2D(64, (1,1), strides=(1,1), padding='same', name='conv_21', use_bias=False)(skip_connection)
skip_connection = BatchNormalization(name='norm_21')(skip_connection)
skip_connection = LeakyReLU(alpha=0.1)(skip_connection)
skip_connection = Lambda(space_to_depth_x2)(skip_connection)
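The `space_to_depth_x2` helper passed to `Lambda` is not defined in this snippet; in the common YOLOv2 Keras ports it wraps TensorFlow's space-to-depth op with block size 2. A minimal pure-NumPy sketch of the same rearrangement, assuming NHWC layout:

```python
import numpy as np

def space_to_depth_x2(x):
    # Move each 2x2 spatial block into the channel dimension:
    # (N, H, W, C) -> (N, H/2, W/2, 4*C), NHWC layout assumed.
    n, h, w, c = x.shape
    x = x.reshape(n, h // 2, 2, w // 2, 2, c)
    x = x.transpose(0, 1, 3, 2, 4, 5)
    return x.reshape(n, h // 2, w // 2, 4 * c)

# A 38x38x64 skip connection becomes 19x19x256, so it can be
# concatenated with the 19x19 feature map of the main branch.
out = space_to_depth_x2(np.zeros((1, 38, 38, 64)))
print(out.shape)  # (1, 19, 19, 256)
```

This is a sketch for intuition only; the actual gist presumably delegates to the TensorFlow op inside the Keras graph.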
Training of Yolo
# Optimization functions: Adam with a small learning rate for fine-tuning
from keras.optimizers import Adam, SGD, RMSprop

optimizer = Adam(lr=0.5e-4, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=0.0)
# optimizer = SGD(lr=1e-4, decay=0.0005, momentum=0.9)
# optimizer = RMSprop(lr=1e-4, rho=0.9, epsilon=1e-08, decay=0.0)

model.compile(loss=custom_loss, optimizer=optimizer)
model.fit_generator(generator=train_batch,
                    steps_per_epoch=int(len(train_batch)/16))
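One thing worth double-checking in the call above: if `train_batch` already yields whole batches, then `len(train_batch)` is the number of batches, and dividing by 16 again means each "epoch" only sees a sixteenth of the data. A quick sketch of the arithmetic with hypothetical figures (the real dataset size is not shown in the gist):

```python
# Hypothetical figures; the gist does not show the dataset size.
num_images = 4800
batch_size = 16

batches_per_epoch = num_images // batch_size  # 300 batches cover the data once
steps_used = int(batches_per_epoch / 16)      # only 18 steps per "epoch"

print(batches_per_epoch, steps_used)  # 300 18
```

Whether this is intentional subsampling or a leftover from an earlier batch-size convention depends on how `train_batch` is defined, which is cut off here.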
Inputs to Yolo
LABELS = ['Shirt', 'Trousers', 'Swimwear', 'Tie', 'Bus', 'Truck', 'Train', 'Motorcycle', 'Helmet', 'Shorts', 'Airplane',
'Sunglasses', 'Jacket', 'Dress', 'Human eye', 'Suit', 'Footwear', 'Woman', 'Human face', 'Man', 'Human arm',
'Human head','Human hand', 'Human leg', 'Human nose', 'Human mouth', 'Human ear', 'Human beard', 'Human foot', 'Car',
'Wheel', 'Boat', 'House', 'Bird', 'Guitar', 'Fast food', 'Hat', 'Dog', 'Laptop', 'Beer', 'Cat', 'Lantern', 'Fountain']
# Setting the input image size to 608 x 608
IMAGE_H, IMAGE_W = 608, 608
# We will use a 19x19 grid for each image, so every grid cell
# covers 608/19 = 32 pixels on a side
GRID_H, GRID_W = 19, 19
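With these settings, each of the 19x19 grid cells is responsible for a 32x32-pixel region of the input. A small sketch of mapping a pixel coordinate to its grid cell (`pixel_to_cell` is a hypothetical helper, not part of the gist):

```python
IMAGE_H, IMAGE_W = 608, 608
GRID_H, GRID_W = 19, 19

CELL_H = IMAGE_H // GRID_H  # 32 pixels per cell vertically
CELL_W = IMAGE_W // GRID_W  # 32 pixels per cell horizontally

def pixel_to_cell(x, y):
    """Return the (row, col) of the grid cell containing pixel (x, y)."""
    return y // CELL_H, x // CELL_W

print(pixel_to_cell(0, 0))      # (0, 0)   top-left cell
print(pixel_to_cell(607, 607))  # (18, 18) bottom-right cell
```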
Re-initialize the last convolutional layer of Yolo
import numpy as np

# Grab the last convolutional layer (the detection head)
layer = model.layers[-4]
weights = layer.get_weights()

# Randomly re-initialize its kernel and bias, scaled down by the
# number of grid cells so the new head starts close to zero
new_kernel = np.random.normal(size=weights[0].shape)/(GRID_H*GRID_W)
new_bias = np.random.normal(size=weights[1].shape)/(GRID_H*GRID_W)

# Write the new weights back into the layer
layer.set_weights([new_kernel, new_bias])
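Dividing by `GRID_H*GRID_W` shrinks the standard deviation of the new weights from 1 to 1/361, roughly 0.0028, so the re-initialized head starts with near-zero outputs instead of large random ones. A standalone check of that scaling (the kernel shape here is arbitrary, chosen just for illustration):

```python
import numpy as np

GRID_H, GRID_W = 19, 19
rng = np.random.default_rng(0)

# Arbitrary kernel shape for illustration; the real head's shape
# depends on the number of anchors and classes.
new_kernel = rng.normal(size=(3, 3, 1024, 64)) / (GRID_H * GRID_W)

# Standard normal scaled by 1/361: std drops to about 0.0028.
print(new_kernel.std())
```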
comments.csv
Date, user_id, comments
September 7 2018 4:01PM, dino001, If they keep it around in next four-five years, I think I will make myself to put on a hazmat suit and visit our friendly Kia dealer (oh, boy are they horrible here, or what) to check it out. It's got a lot of good stuff, but seems like they are still behind on a few things, such as relationship between power and gas mileage (e.g. BMW 340/440 has similar performance, but much better gas mileage as a daily driver), but with appropriate price difference, those objections and shortcomings are not insurmountable. Biggest thing will be the "first contact" with a sales person. Hope it won't start from "are you buying it today?", "what can I do to make you take it home?", or "let me wash your BMW - oh, I can't find the keys", "How about I show you the deal - square one, two, three, four".
September 7 2018 9:43PM, circlew, The lease rate is the factor that stops me cold from taking the leap. Since I don't track my cars, it would do fine for my needs but I
bandiatindra / Web Scraping Selenium
Last active Oct 3, 2018
Code to scrape 5000 comments from
Web Scraping Selenium
from selenium import webdriver
import pandas as pd

# Path to the local ChromeDriver binary
driver = webdriver.Chrome('C:/Users/bandi/Desktop/Text Analytics/TA Session/chromedriver_win32/chromedriver')
comments = pd.DataFrame(columns=['Date', 'user_id', 'comments'])

# Every comment element on the page has an id containing 'Comment_'
ids = driver.find_elements_by_xpath("//*[contains(@id,'Comment_')]")
comment_ids = [i.get_attribute('id') for i in ids]
for x in comment_ids:
    pass  # gist is cut off here; each id locates one comment's fields
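The XPath above selects every element whose `id` contains `Comment_`; the loop bodies are truncated in the gist. The same id extraction can be sketched without a browser on raw HTML using only the standard library (the markup below is hypothetical, not the real forum page):

```python
import re

# Hypothetical forum HTML; the real page structure is not shown above.
html = '''
<div id="Comment_101"><span class="date">Sep 7 2018</span></div>
<div id="Comment_102"><span class="date">Sep 8 2018</span></div>
'''

# Collect every id of the form Comment_<number>, in document order.
comment_ids = re.findall(r'id="(Comment_\d+)"', html)
print(comment_ids)  # ['Comment_101', 'Comment_102']
```

In the Selenium version, each of these ids would then be used to locate the comment's date, user, and text elements on the live page.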