- CCleaner
- Check Disk
- System Image Backup
- CCleaner
- Verify Disk
# Run MATLAB file and output to file
matlab -nodisplay -r <function> [<params>...] > <file>
# Email output file once done
mutt -s "Results" <email> < <file>
I collect the first volumes (and first volumes only) of various manga series (in English) in new condition.
I have an Amazon wishlist, but volumes from other series that I do not have are also welcome. New books only please.
I supervise undergraduate/postgraduate/UROP projects as part of BICV. The PI, Dr Anil Bharath, has a nice set of FAQs for prospective research students, which should give you an idea of what our group specialises in. My own research topic is deep reinforcement learning, which is less focused on computer vision and more on general machine learning, or even artificial intelligence. Note that I only supervise students at Imperial College London, so please do not contact me about supervision unless you are a student there.
I expect students to be a) highly motivated and b) technically proficient.
a) Projects that I supervise revolve around cutting-edge research, specifically deep learning. Projects can rely, and in the past have relied, on research released during the course of the project. Some parts of machine learning appear in optional modules in bioengineering courses, but (modern) deep learning is currently not taught at Imperial (as far as I am aware). I usually give crash
require 'cunn'
local cudnn = require 'cudnn'

-- Random input batch (32 RGB images of size 24x24) and dummy targets, on the GPU
local X = torch.rand(32, 3, 24, 24):cuda()
local Y = torch.ones(32):cuda()

-- Simple convolutional network: 5x5 convolution (24x24 -> 20x20), flatten, linear classifier
local net = nn.Sequential()
net:add(cudnn.SpatialConvolution(3, 8, 5, 5))
net:add(nn.View(8*20*20))
net:add(nn.Linear(8*20*20, 10))
net:cuda() -- Move the network's parameters to the GPU to match X and Y
--[[
-- Element-Research Torch RNN Tutorial for recurrent neural nets: let's predict time series with a laptop GPU
-- https://christopher5106.github.io/deep/learning/2016/07/14/element-research-torch-rnn-tutorial.html
--]]
--[[
-- Part 1
--]]
require 'rnn'
--[[
-- Gaussian Processes for Dummies
-- https://katbailey.github.io/post/gaussian-processes-for-dummies/
-- Note 1: The Cholesky decomposition requires positive-definite matrices, hence the addition of a small value to the diagonal (prevents zeros along the diagonal)
-- Note 2: This can also be thought of as adding a little noise to the observations
--]]
local gnuplot = require 'gnuplot'
-- Test data
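Notes 1 and 2 above describe the standard "jitter" trick. A minimal sketch in NumPy (Python used here purely for illustration; the squared-exponential kernel, the sizes, and the `1e-6` jitter value are all my assumptions, not the tutorial's exact code):

```python
import numpy as np

# Hypothetical test inputs and a squared-exponential kernel matrix
n = 50
x = np.linspace(-5, 5, n).reshape(-1, 1)
K = np.exp(-0.5 * (x - x.T) ** 2)  # Positive semi-definite; may be numerically singular

# Adding a small value (jitter) to the diagonal makes K strictly positive-definite,
# so the Cholesky decomposition succeeds; equivalently, this models a little
# observation noise on the data
L = np.linalg.cholesky(K + 1e-6 * np.eye(n))
```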
Note: Great refresher/glossary on probability/statistics and related topics here
Notation | Definition
---|---
X | Random variable
P(X) | Probability distribution over random variable X
X ~ P(X) | Random variable X follows (~) the probability distribution P(X) *
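As a concrete reading of the last row, X ~ P(X) means drawing samples of X from its distribution; the sketch below (arbitrarily) uses a standard normal for P(X), a choice that is mine, not the source's:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ P(X): draw samples of the random variable X from its distribution P(X),
# here a standard normal N(0, 1)
samples = rng.normal(loc=0.0, scale=1.0, size=10000)

# The empirical mean and standard deviation approximate the distribution's parameters
print(samples.mean(), samples.std())
```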
--[[
-- Random walks down Wall Street, Stochastic Processes in Python
-- http://www.turingfinance.com/random-walks-down-wall-street-stochastic-processes-in-python/
--]]
local gnuplot = require 'gnuplot'
local model_parameters = {
  all_s0 = 1000, -- Starting asset value
  all_time = 800, -- Amount of time to simulate for
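Parameters like these feed a simulation along the following lines, sketched in NumPy rather than Torch; the Brownian-motion step, the `delta` volatility, and the `dt` time step are my assumptions about what comes next, with `s0`/`time` standing in for `all_s0`/`all_time` above:

```python
import numpy as np

rng = np.random.default_rng(42)

s0 = 1000     # Starting asset value (all_s0)
time = 800    # Amount of time to simulate for (all_time)
delta = 0.25  # Volatility of the process (hypothetical)
dt = 1.0      # Time step (hypothetical)

# Brownian-motion price path: starting value plus a cumulative sum of
# independent Gaussian increments with standard deviation delta * sqrt(dt)
increments = rng.normal(loc=0.0, scale=delta * np.sqrt(dt), size=time)
prices = s0 + np.cumsum(increments)
```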
--[[
-- Using Perlin Noise to Generate 2D Terrain and Water
-- http://gpfault.net/posts/perlin-noise.txt.html
--]]
local image = require 'image'
-- Fade function
local fade = function(t)
  -- Provides continuous higher order derivatives for smoothness (this specifically is in the class of sigmoid functions)
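The body of `fade` is cut off above. Perlin's standard choice, which I am assuming here (translated to Python for illustration), is the quintic 6t^5 - 15t^4 + 10t^3, whose first and second derivatives are zero at t = 0 and t = 1:

```python
def fade(t):
    # Perlin's quintic smoothstep: 6t^5 - 15t^4 + 10t^3
    # Its first and second derivatives vanish at t = 0 and t = 1, giving the
    # continuous higher-order derivatives mentioned in the comment above
    return t * t * t * (t * (t * 6 - 15) + 10)
```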