
Lawrence A. Krukrubo Lawrence-Krukrubo

Lawrence-Krukrubo / PY-Drawing3D.ipynb
Created Dec 11, 2020 — forked from WetHat/PY-Drawing3D.ipynb
Matplotlib: 3D Arrows and 3D Annotations
import pandas as pd
import pickle

# Open the pickle file and load the data:
with open('ratings.pickle', 'rb') as f:
    ratings_data = pickle.load(f)

# Print ratings_data
print(ratings_data)
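The snippet above assumes a ratings.pickle file already exists. As a minimal, self-contained sketch of the full round trip, the ratings dict below is a hypothetical stand-in for the real file's contents:

```python
import pickle

# Hypothetical ratings data standing in for the real ratings.pickle contents
ratings = {'The Matrix': 4.5, 'Titanic': 3.0}

# Serialize the dict to disk...
with open('ratings.pickle', 'wb') as f:
    pickle.dump(ratings, f)

# ...then open the pickle file and load the data back, as in the snippet above
with open('ratings.pickle', 'rb') as f:
    ratings_data = pickle.load(f)

print(ratings_data)
```

Because pickle stores arbitrary Python objects, the loaded value comes back as the same dict that was saved, with no parsing step needed.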
Lawrence-Krukrubo /
Created May 31, 2020
Adding code snippets to the Loading Data Files in Python article.
import numpy as np
import pandas as pd

# First, let's save the link to the Excel file
battle_link = ''

# Next, let's load the file into a pandas ExcelFile object
xls = pd.ExcelFile(battle_link)

# Let's see its type
print(type(xls))
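Since battle_link is elided above, here is a hedged, self-contained sketch using a made-up workbook (battles.xlsx and its contents are hypothetical; writing .xlsx assumes an engine such as openpyxl is installed). It shows what pd.ExcelFile gives you: the sheet names, plus a parse method to turn any sheet into a DataFrame.

```python
import pandas as pd

# Hypothetical workbook standing in for the elided battle_link
pd.DataFrame({'name': ['Hastings', 'Agincourt'],
              'year': [1066, 1415]}).to_excel('battles.xlsx',
                                              sheet_name='battles',
                                              index=False)

# Load the workbook into a pandas ExcelFile object
xls = pd.ExcelFile('battles.xlsx')
print(type(xls))        # a pandas ExcelFile object
print(xls.sheet_names)  # ['battles']

# Each sheet can then be parsed into a regular DataFrame
battles = xls.parse('battles')
print(battles.shape)    # (2, 2)
```

ExcelFile is handy when a workbook has several sheets: the file is opened once and individual sheets are parsed on demand.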
Lawrence-Krukrubo /
Created May 31, 2020
Adding code snippets to Loading Different Data Types on Medium
import numpy as np

# Let's wrap the import in a try-except block
try:
    data = np.genfromtxt('titanic.csv', delimiter=',', names=True, dtype=None, encoding='utf8')
except Exception as e:
    print(e)

# Let's see the first five rows of the data array
print(data[:5])
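To make the genfromtxt call runnable without the real titanic.csv, this sketch first writes a tiny stand-in file (three made-up rows). With names=True, genfromtxt returns a structured array whose columns can be addressed by name:

```python
import numpy as np

# Tiny hypothetical stand-in for the real titanic.csv
with open('titanic.csv', 'w') as f:
    f.write('survived,pclass,age\n'
            '1,1,29.0\n'
            '0,3,2.0\n'
            '1,3,30.0\n')

try:
    data = np.genfromtxt('titanic.csv', delimiter=',',
                         names=True, dtype=None, encoding='utf8')
except Exception as e:
    print(e)

# names=True yields a structured array: columns are addressable by name
print(data.dtype.names)  # ('survived', 'pclass', 'age')
print(data['age'])
```

With dtype=None, genfromtxt also infers a type per column (integers for survived and pclass, floats for age), which plain loadtxt cannot do.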
Lawrence-Krukrubo /
Last active May 30, 2020
Adding code snippets to my article (Loading Different Data Sets) on Medium
# First, let's import numpy.
import numpy as np

# Next, let's save the titanic_df we loaded before as a titanic.csv file.
titanic_df.to_csv('titanic.csv', index=False)

# Next, let's load titanic.csv as a numpy array.
# Parameters passed: delimiter=',' for a comma-separated file, skiprows=1 to
# skip the header row, and usecols=[0, 1, 4] to load only selected columns.
titanic_arr = np.loadtxt('titanic.csv', delimiter=',', skiprows=1, usecols=[0, 1, 4])

import seaborn as sns
titanic_df = sns.load_dataset('titanic')
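The snippet above depends on a titanic_df built earlier. As a self-contained sketch, the DataFrame below is a hypothetical stand-in with made-up rows; it also shows why usecols matters here: np.loadtxt cannot parse string columns such as 'sex'.

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the titanic_df loaded earlier
titanic_df = pd.DataFrame({
    'survived': [1, 0, 1],
    'pclass':   [1, 3, 3],
    'sex':      ['female', 'male', 'female'],
    'age':      [29.0, 2.0, 30.0],
    'fare':     [211.34, 21.08, 8.05],
})
titanic_df.to_csv('titanic.csv', index=False)

# skiprows=1 drops the header row; usecols=[0, 1, 4] keeps only the
# numeric columns, since loadtxt would choke on the 'sex' strings
titanic_arr = np.loadtxt('titanic.csv', delimiter=',',
                         skiprows=1, usecols=[0, 1, 4])
print(titanic_arr.shape)  # (3, 3)
```

The result is a plain float array of the three selected columns, one row per passenger.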
Lawrence-Krukrubo / latency.txt
Created Apr 30, 2020 — forked from jboner/latency.txt
Latency Numbers Every Programmer Should Know
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                     14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                     20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us         ~1GB/sec SSD
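A quick way to internalize the table: the multipliers in the right-hand column are just ratios of the nanosecond figures, and the microsecond column is the nanosecond figure divided by 1,000. A minimal sketch using the numbers above:

```python
# Figures from the latency table above, in nanoseconds
l1_ns, l2_ns, mem_ns = 0.5, 7, 100
ssd_read_ns = 150_000

print(l2_ns / l1_ns)        # 14.0  -> L2 is "14x L1 cache"
print(mem_ns / l1_ns)       # 200.0 -> main memory is "200x L1 cache"
print(ssd_read_ns / 1_000)  # 150.0 -> 150 us per random 4K SSD read
```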