@vigsterkr
Last active April 24, 2017 09:42
Heiko is worried that we will lose it :)
// Immutable features
// linear model
- dot product
- pairs
- covariance matrix: there's a Feature operator => CovarView => matrix
class Features {
public:
    Features() {}
    ...
    Features transformed_by(Transformer t) const;
    // this evaluates the stacked operators over the features
    // and returns a copy of the features
    Features cached() const;
protected:
    void add_flag(int flag) { flags |= flag; }
    int get_flags() const { return flags; }
private:
    int flags = 0;
};
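The "immutable features" idea above can be sketched as a copy-on-write bitmask: setting a flag returns a new object instead of mutating the receiver. The flag names and the public `with_flag` helper below are illustrative stand-ins, not part of the original notes:

```cpp
#include <cassert>

// Hypothetical flag values -- the notes only ever name CENTERED.
enum FeatureFlag { CENTERED = 1 << 0, NORMALIZED = 1 << 1 };

class FlaggedFeatures {
public:
    FlaggedFeatures() : flags(0) {}

    // Returns a copy with the extra flag set; the original is untouched,
    // which is the immutability property the notes ask for.
    FlaggedFeatures with_flag(int flag) const {
        FlaggedFeatures copy(*this);
        copy.flags |= flag;
        return copy;
    }

    int get_flags() const { return flags; }

private:
    int flags;
};
```

Each `transformed_by` in the real design would analogously return a copy that records the transformer (and its flag), so a model can later assert on the flags without the original features ever changing.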
// lala land :)
class Trait {
public:
    Trait(Features);
    Trait(View);
    virtual iterator begin() = 0;
    virtual iterator end() = 0;
};

class DotTrait : public Trait {
public:
    DotTrait(Features);
    iterator begin() override;
    iterator end() override;
};
// YOLO++ :)
class CovarTrait {
public:
    CovarTrait(DotTrait) {}
    Matrix get();
};
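A minimal sketch of the "CovarView => matrix" idea: the (uncentered) covariance X^T X can be computed purely from dot products between feature columns, which is exactly what a `CovarTrait` built on a `DotTrait` would do. The function name and the plain `std::vector` matrix are stand-ins for the opaque Matrix type:

```cpp
#include <vector>
#include <cstddef>

using Matrix = std::vector<std::vector<double>>;

// Computes C = X^T X, where X is n_samples x n_features,
// entry by entry as dot products of feature columns.
Matrix covar_from_dots(const Matrix& X) {
    std::size_t n = X.size();
    std::size_t d = n ? X[0].size() : 0;
    Matrix C(d, std::vector<double>(d, 0.0));
    for (std::size_t i = 0; i < d; ++i)
        for (std::size_t j = 0; j < d; ++j)
            for (std::size_t k = 0; k < n; ++k)
                C[i][j] += X[k][i] * X[k][j]; // dot product of columns i, j
    return C;
}
```

For centered features (which `train()` below asserts via the CENTERED flag), this Gram matrix is the covariance up to a 1/n factor.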
df = DataFrame();
f = Features(df).transformed_by(Mean).transformed_by(Normalize).cached();
class LinearModel {
public:
    LinearModel();
    void train(Features features) {
        // assert that the features were centered
        assert(features.get_flags() & CENTERED);
        // we do not expose the type of matrix/vector at the model level;
        // linalg should figure it out, since we have an opaque Matrix/Vector
        // class which is type agnostic.
        auto covar = CovarTrait(DotTrait(features)).get();
        // in-place, ... or not :)
        linalg::add_diag(covar, get("lambda"));
        set("w", linalg::cholesky_solve(covar, y));
    }
};
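The body of `train()` amounts to solving the ridge system (X^T X + lambda I) w = X^T y. Below is a self-contained sketch of the two linalg calls it relies on; `add_diag` and `cholesky_solve` here are stand-in implementations on plain `std::vector` matrices, not the real opaque linalg backend:

```cpp
#include <vector>
#include <cmath>
#include <cstddef>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// Adds lambda to the diagonal in place: A += lambda * I.
void add_diag(Mat& A, double lambda) {
    for (std::size_t i = 0; i < A.size(); ++i)
        A[i][i] += lambda;
}

// Solves A x = b for symmetric positive-definite A via Cholesky (A = L L^T).
// A and b are taken by value; the lower triangle of A is overwritten with L.
Vec cholesky_solve(Mat A, Vec b) {
    std::size_t n = A.size();
    // in-place Cholesky factorization
    for (std::size_t j = 0; j < n; ++j) {
        for (std::size_t k = 0; k < j; ++k)
            A[j][j] -= A[j][k] * A[j][k];
        A[j][j] = std::sqrt(A[j][j]);
        for (std::size_t i = j + 1; i < n; ++i) {
            for (std::size_t k = 0; k < j; ++k)
                A[i][j] -= A[i][k] * A[j][k];
            A[i][j] /= A[j][j];
        }
    }
    // forward substitution: L y = b
    for (std::size_t i = 0; i < n; ++i) {
        for (std::size_t k = 0; k < i; ++k)
            b[i] -= A[i][k] * b[k];
        b[i] /= A[i][i];
    }
    // backward substitution: L^T x = y
    for (std::size_t i = n; i-- > 0;) {
        for (std::size_t k = i + 1; k < n; ++k)
            b[i] -= A[k][i] * b[k];
        b[i] /= A[i][i];
    }
    return b;
}
```

The regularizer `lambda` is what makes `covar` positive definite even when the features are rank-deficient, which is why the Cholesky solve is safe here.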
batch optimisation is taken care of automagically:
model.train(SubsetView(features));