ducnh1022 / gist:1247a3ebc8669ded6294
Created March 25, 2016 14:39
The Impact of Model Driven Development
What is MDD
make any kind of change to a model as well as to the code generated from that model
= round-trip engineering = forward engineering + reverse engineering
1. Increased likelihood of scope creep due to ease of change
easier to extend functionality
an existing meta-model may contain more functionality than specified in the requirements
=> impacts both supplier and client => hard to tell original requirements from change requests
1. What is an antenna
a way of converting guided waves in a waveguide, feeder cable or transmission line into radiating waves
the art of antenna design is to make this conversion as efficient as possible
2. Conditions for radiation
charges in non-uniform motion radiate: reversing direction, changing direction, or oscillating in periodic motion
Architecture
fundamentals
defines guidelines
communicates with stakeholders
cross-cutting concerns
manages uncertainty
conceptual integrity
what kind of storage, how modules interact, where the recovery system sits
ducnh1022 / gist:2bc3393535bf1b71557f2a2ba271f05e
Last active March 31, 2016 14:32
Classification Week 1 - Linear classifier and logistic regression
Purpose
Data -> classifier -> intelligence
Input x -> sentence sentiment classifier -> y (positive or negative)
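A minimal sketch of that sentence classifier, assuming a bag-of-words linear model; the words and weights below are illustrative assumptions, not learned values:

```python
# Hypothetical per-word coefficients for a bag-of-words linear classifier;
# in practice these weights would be learned from labeled data.
weights = {"good": 1.0, "great": 1.5, "awful": -2.0, "bad": -1.0}

def predict_sentiment(sentence):
    """Score = sum of word weights; predict +1 (positive) if score > 0."""
    score = sum(weights.get(word, 0.0) for word in sentence.lower().split())
    return +1 if score > 0 else -1

print(predict_sentiment("the sushi was great"))         # +1
print(predict_sentiment("awful service and bad food"))  # -1
```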
Applications:
Spam filtering
image classification
Impact of classification
ducnh1022 / gist:7a37283fb1be3eae8aec7f26338cf40b
Last active April 2, 2016 19:37
Classification week 2 - Learning linear classifier
find the best classifier => maximize the likelihood over all possible w0, w1, w2
Data likelihood
quality metric -> probability of the data
learn the logistic regression model with maximum likelihood estimation (MLE)
find the best linear classifier with gradient ascent
convergence criterion: the optimum is where dl/dw = 0, but in practice we stop when |dl/dw| < epsilon
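A sketch of this learning loop in NumPy on a made-up feature matrix; the step size, epsilon, and iteration cap are arbitrary choices, not values from the course:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_ascent(H, y, step_size=0.1, epsilon=1e-6, max_iter=1000):
    """Maximize the data log-likelihood of logistic regression.
    H: (N, D) features including a constant column for w0; y: labels in {+1, -1}.
    Stops when every |dl/dw_j| < epsilon (the practical convergence test)."""
    w = np.zeros(H.shape[1])
    indicator = (y == +1).astype(float)         # 1[y_i = +1]
    for _ in range(max_iter):
        p = sigmoid(H @ w)                      # P(y = +1 | x_i, w)
        gradient = H.T @ (indicator - p)        # dl/dw, summed over points
        w += step_size * gradient
        if np.all(np.abs(gradient) < epsilon):  # dl/dw ~ 0 => done
            break
    return w

# Toy data: constant column (for w0) plus one feature. On perfectly
# separable data the MLE weights keep growing, so the iteration cap matters.
H = np.array([[1., 0.], [1., 1.], [1., 2.], [1., 3.]])
y = np.array([-1, -1, +1, +1])
print(gradient_ascent(H, y))
```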
ducnh1022 / gist:8eb402e28e63eb9c7c8c0818ae08eefb
Created April 3, 2016 06:29
Classification week3 - decision trees
step 1: start with an empty tree
step 2: select a feature to split the data on
For each split of the tree:
step 3: if there is nothing more to split, make predictions
step 4: otherwise, go to step 2 and recurse on this split
Feature split learning = decision stump learning
which split is better, credit or term? pick the one with the lower classification error, as in the sketch below
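A sketch of that decision, assuming a made-up loan-style table with credit and term features; each stump branch predicts its majority class:

```python
from collections import Counter

# Made-up loan data; +1 = safe, -1 = risky.
data = [
    {"credit": "excellent", "term": "3yr", "safe": +1},
    {"credit": "fair",      "term": "5yr", "safe": -1},
    {"credit": "excellent", "term": "5yr", "safe": +1},
    {"credit": "poor",      "term": "3yr", "safe": -1},
    {"credit": "fair",      "term": "3yr", "safe": +1},
]

def stump_error(rows, feature, target="safe"):
    """Classification error when every branch predicts its majority class."""
    mistakes = 0
    for value in {r[feature] for r in rows}:
        branch = [r[target] for r in rows if r[feature] == value]
        mistakes += len(branch) - Counter(branch).most_common(1)[0][1]
    return mistakes / len(rows)

for feature in ("credit", "term"):
    print(feature, stump_error(data, feature))
# credit 0.2, term 0.4 -> split on credit (lower error).
```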
ducnh1022 / gist:17083e486df74c1509133498e11ad4f1
Last active April 4, 2016 17:10
Classification week 4 overfitting in decision trees
Learning simpler decision trees
early stopping -> limit tree depth; stop when a split does not reduce classification error; stop when very few data points remain
Pruning: simplify tree after learning algorithm terminates
cost function: total cost C(T) = Error(T) + lambda * L(T), where L(T) = number of leaves
step 1: consider a split
step 2: compute the total cost C(T) of the split
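A small sketch of the pruning decision under that cost; the error rates, leaf counts, and lambda are invented for illustration:

```python
def total_cost(error, num_leaves, lam=0.3):
    """C(T) = Error(T) + lambda * L(T); lambda trades accuracy for simplicity."""
    return error + lam * num_leaves

# Does replacing a leaf with a two-leaf split pay off? (numbers are invented)
cost_leaf  = total_cost(error=0.25, num_leaves=1)  # before the split: 0.55
cost_split = total_cost(error=0.20, num_leaves=2)  # after the split:  0.80
if cost_split >= cost_leaf:
    print("prune: the error drop does not justify the extra leaf")
```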
ducnh1022 / gist:41f2389f7f19ff3542dd33431ce5365b
Last active April 6, 2016 15:36
Classification week 5 boosting
combine multiple simple classifiers -> an ensemble classifier
y hat = sign(f(x))
AdaBoost
start with the same weight on all points: alpha_i = 1/N
For t = 1..T
learn f_t(x) with data weights alpha_i; compute its coefficient w_t = (1/2) ln((1 - weighted_error(f_t)) / weighted_error(f_t)); recompute and normalize the weights alpha_i
final model: f(x) = sum_t w_t f_t(x)
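A compact AdaBoost sketch on a 1-D toy dataset, assuming threshold decision stumps as the weak learner; the data and the stump learner are illustrative, not from the course:

```python
import numpy as np

def learn_stump(x, y, alpha):
    """Weighted decision stump: threshold t and sign s with lowest weighted error."""
    best = None
    for t in np.unique(x):
        for s in (+1, -1):
            pred = np.where(x <= t, s, -s)
            err = alpha[pred != y].sum() / alpha.sum()
            if best is None or err < best[0]:
                best = (err, t, s)
    return best

def adaboost(x, y, T=5):
    alpha = np.full(len(x), 1.0 / len(x))      # same weight for all points
    stumps = []
    for _ in range(T):
        err, t, s = learn_stump(x, y, alpha)   # learn f_t with weights alpha
        err = np.clip(err, 1e-10, 1 - 1e-10)
        w = 0.5 * np.log((1 - err) / err)      # coefficient w_t
        pred = np.where(x <= t, s, -s)
        alpha *= np.exp(-w * y * pred)         # up-weight the mistakes
        alpha /= alpha.sum()                   # normalize
        stumps.append((w, t, s))
    return stumps

def predict(stumps, x):
    f = sum(w * np.where(x <= t, s, -s) for w, t, s in stumps)
    return np.sign(f)                          # y hat = sign(f(x))

x = np.array([1., 2., 3., 4., 5., 6.])
y = np.array([+1, +1, -1, -1, +1, +1])
print(predict(adaboost(x, y), x))
```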
Logistic classification -> stochastic optimization
-> data and parameter tuning -> deep networks
-> regularization -> convolutional networks
-> embeddings -> recurrent models
deep learning applies in every field: researcher, engineer, data scientist
ducnh1022 / gist:a7e8599c422aa9c9474bf42b85b1cdd8
Created April 10, 2016 08:21
Classification week 6 precision recall
Precision: fraction of positive predictions that are actually positive
Recall: fraction of positive data points predicted to be positive
optimistic classifier = low precision, high recall
pessimistic classifier = high precision, low recall
trade-off: tune the decision threshold to move between the two
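A sketch of both metrics, with invented labels showing the optimistic case:

```python
def precision_recall(y_true, y_pred):
    tp = sum(t == +1 and p == +1 for t, p in zip(y_true, y_pred))
    fp = sum(t == -1 and p == +1 for t, p in zip(y_true, y_pred))
    fn = sum(t == +1 and p == -1 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)  # positive predictions that are truly positive
    recall = tp / (tp + fn)     # true positives that were found
    return precision, recall

# Optimistic classifier: predicts +1 almost everywhere (labels invented).
y_true = [+1, -1, -1, +1, -1]
y_pred = [+1, +1, +1, +1, -1]
print(precision_recall(y_true, y_pred))  # (0.5, 1.0): low precision, high recall
```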