Elvis Saravia omarsar

@omarsar
omarsar / eng1-milan.md
Created September 30, 2019 09:41
milan-eng1-module1.md
GET _search
{
  "query": {
    "match_all": {}
  }
}


GET /
####################
# Helsinki Meetup
# Machine learning in the Elastic Stack
####################
####### Transforms ############
## 1. Check and explore the index you are working with:
GET kibana_sample_data_ecommerce/_search
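After exploring the index, a transform can be previewed before it is created. A minimal sketch, assuming a pivot that groups orders by `customer_id` and sums `taxful_total_price` (both fields exist in the sample e-commerce data); note that older 7.x releases exposed this as `_data_frame/transforms/_preview` rather than `_transform/_preview`:

```
POST _transform/_preview
{
  "source": { "index": "kibana_sample_data_ecommerce" },
  "pivot": {
    "group_by": {
      "customer_id": { "terms": { "field": "customer_id" } }
    },
    "aggregations": {
      "total_spent": { "sum": { "field": "taxful_total_price" } }
    }
  }
}
```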
@omarsar
omarsar / odsc_nlp.md
Last active September 17, 2020 15:55

Title

Applied Deep Learning for NLP Applications

Abstract

Natural language processing (NLP) has become an important field, drawing interest from many sectors that leverage modern deep learning methods to approach NLP problems and tasks such as text summarization, question answering, and sentiment classification, to name a few. In this tutorial, we will introduce several fundamental NLP techniques and more modern approaches (BERT, GPT-2, etc.) and show how they can be applied via transfer learning to many real-world NLP problems. We will focus on how to build an NLP pipeline using open-source tools such as Transformers, Tokenizers, spaCy, TensorFlow, and PyTorch, among others. We will then learn how to use the NLP model to search over documents based on semantic relationships, building a proof of concept with open-source technologies such as BERT and Elasticsearch. In essence, the learner will take away the important theoretical pieces ne…
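The semantic-search step described above boils down to ranking documents by the similarity of their embedding vectors to a query vector. A minimal sketch, assuming the vectors come from some embedding model (BERT in the abstract's setup); the 4-d vectors and the `rank_documents` helper here are made-up illustrations, not part of any library:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_documents(query_vec, doc_vecs):
    # Score every document against the query, best match first.
    scores = [(i, cosine_similarity(query_vec, v)) for i, v in enumerate(doc_vecs)]
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Hypothetical embeddings: doc 0 points nearly the same way as the query.
query = np.array([1.0, 0.0, 0.0, 0.0])
docs = [np.array([0.9, 0.1, 0.0, 0.0]), np.array([0.0, 1.0, 0.0, 0.0])]
ranked = rank_documents(query, docs)  # doc 0 ranks first
```

In a production system the same ranking is typically delegated to Elasticsearch (e.g. a `dense_vector` field scored with cosine similarity) rather than computed client-side.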

for epoch in tqdm(range(1, num_epochs + 1)):
    start_time = time.time()
    scheduler.step()
    lr = scheduler.get_lr()[0]
    model.train()
    train_loss_total = 0.0
    num_steps = 0
N_INPUT = 3    # number of features in input
N_NEURONS = 5  # number of units in layer

X0_batch = torch.tensor([[0, 1, 2], [3, 4, 5],
                         [6, 7, 8], [9, 0, 1]],
                        dtype=torch.float)  # t=0 => 4 X 3
X1_batch = torch.tensor([[9, 8, 7], [0, 0, 0],
                         [6, 5, 4], [3, 2, 1]],
                        dtype=torch.float)  # t=1 => 4 X 3
class BasicRNN(nn.Module):
    def __init__(self, n_inputs, n_neurons):
        super(BasicRNN, self).__init__()
        self.Wx = torch.randn(n_inputs, n_neurons)   # n_inputs X n_neurons
        self.Wy = torch.randn(n_neurons, n_neurons)  # n_neurons X n_neurons
        self.b = torch.zeros(1, n_neurons)           # 1 X n_neurons

    def forward(self, X0, X1):
        # Unrolled for two time steps: tanh(X @ Wx + b), then mix in the
        # previous hidden state through Wy.
        self.Y0 = torch.tanh(torch.mm(X0, self.Wx) + self.b)
        self.Y1 = torch.tanh(torch.mm(self.Y0, self.Wy) +
                             torch.mm(X1, self.Wx) + self.b)
        return self.Y0, self.Y1
class CleanBasicRNN(nn.Module):
    def __init__(self, batch_size, n_inputs, n_neurons):
        super(CleanBasicRNN, self).__init__()
        self.rnn = nn.RNNCell(n_inputs, n_neurons)
        self.hx = torch.randn(batch_size, n_neurons)  # initialize hidden state

    def forward(self, X):
        # X is a sequence of per-time-step input batches (e.g. [X0_batch, X1_batch]).
        output = []
        for x_t in X:
            self.hx = self.rnn(x_t, self.hx)
            output.append(self.hx)
        return output, self.hx
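The recurrence these RNN classes compute can be sketched in plain NumPy, independent of PyTorch. The random weights below are hypothetical stand-ins with the shapes implied by N_INPUT = 3 and N_NEURONS = 5 above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 3, 5
Wx = rng.standard_normal((n_inputs, n_neurons))   # input -> hidden
Wy = rng.standard_normal((n_neurons, n_neurons))  # hidden -> hidden
b = np.zeros((1, n_neurons))

X0 = np.arange(12, dtype=float).reshape(4, 3)  # batch of 4 at t=0
X1 = np.arange(12, dtype=float).reshape(4, 3)  # batch of 4 at t=1

Y0 = np.tanh(X0 @ Wx + b)            # hidden state at t=0, shape 4 X 5
Y1 = np.tanh(Y0 @ Wy + X1 @ Wx + b)  # hidden state at t=1, shape 4 X 5
```

Each step squashes its activations through tanh, so every hidden value stays in (-1, 1); the only thing carried between steps is the hidden state.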
@omarsar
omarsar / py_nn.py
Last active February 27, 2020 13:09
class Neural_Network(nn.Module):
    def __init__(self, ):
        super(Neural_Network, self).__init__()
        # parameters
        # TODO: parameters can be parameterized instead of declaring them here
        self.inputSize = 2
        self.outputSize = 1
        self.hiddenSize = 3

        # weights
        self.W1 = torch.randn(self.inputSize, self.hiddenSize)   # 2 X 3
        self.W2 = torch.randn(self.hiddenSize, self.outputSize)  # 3 X 1
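A minimal NumPy sketch of the forward pass this 2-3-1 network implies: input times the first weight matrix, a sigmoid, then the second weight matrix and a final sigmoid. The weights and the sample batch here are made-up stand-ins:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, maps any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(42)
W1 = rng.standard_normal((2, 3))  # inputSize X hiddenSize
W2 = rng.standard_normal((3, 1))  # hiddenSize X outputSize

X = np.array([[2.0, 9.0],
              [1.0, 5.0],
              [3.0, 6.0]])        # batch of 3 samples, 2 features each

hidden = sigmoid(X @ W1)          # 3 X 3
output = sigmoid(hidden @ W2)     # 3 X 1, each value in (0, 1)
```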
@omarsar
omarsar / submitting_newsletter_pr.md
Last active February 19, 2020 17:42
Guide for submitting a PR on NLP Newsletter translations.

These are the instructions for submitting an NLP Newsletter PR

First, I need to send you an invite to push to the repository. Just send me an email with your GitHub account and I will add you. If I have already added you as a contributor, ignore this step.

If you would like to be added as an official writer to the publication, I will need the following information from you (replace the items in CAPS):

GITHUB_USERNAME:
  name: NAME
  web: PERSONAL_WEBSITE (OPTIONAL)

.es(index=apa*,q=geoip.country_code2:FR,metric=sum:bytes).bars().label("France"),
.es(index=apa*,q=geoip.country_code2:FR,metric=sum:bytes,offset=-10d).bars(stack=false).label("France")