I watched a 9-minute video on Long Short-Term Memory (LSTM) in recurrent neural networks. The concept, as I understand it, is that the input to the hidden layer at each timestep includes information carried over from the previous timestep: the prior hidden state, plus a separate cell state that the network learns to keep or forget... something like that.
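To make the recurrence concrete for myself, here's a minimal numpy sketch of a single LSTM step, using the standard textbook formulation rather than anything specific from the video (the weight names W, U, b are placeholders of mine):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # Each gate sees the current input x_t AND the previous hidden state h_prev.
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])   # forget gate
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])   # input gate
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])   # output gate
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])   # candidate values
    c_t = f * c_prev + i * g        # cell state: the long-term memory
    h_t = o * np.tanh(c_t)          # hidden state: what the next step sees
    return h_t, c_t
```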
I walked through the Jupyter notebook for this example. It introduces Keras as an API, as well as some concepts around Dropout which I don't yet understand.
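I haven't reproduced the notebook's exact architecture, but a generic Keras stack of the kind these stock-prediction tutorials use at least shows where Dropout sits: it randomly zeroes a fraction of activations during training to fight overfitting. Layer sizes and rates below are placeholders, not the notebook's values.

```python
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

# Placeholder shapes: 60-day lookback window, 1 feature per day.
model = Sequential([
    LSTM(50, return_sequences=True, input_shape=(60, 1)),
    Dropout(0.2),   # drop 20% of activations, during training only
    LSTM(50),
    Dropout(0.2),
    Dense(1),       # regress the next day's price
])
model.compile(optimizer='adam', loss='mean_squared_error')
```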
The very interesting thing about this example was the charting at the end, which plotted a set of predictions against the true values. The author also proposed a code challenge: best predict the future of GOOGL (Alphabet) using an LSTM with price history and two other metrics.
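For the challenge, the three inputs have to be folded into windowed samples the LSTM can consume. A rough sketch, assuming a (days, 3) array whose columns are the price plus my two chosen metrics (the function and its defaults are mine, not the author's):

```python
import numpy as np

def make_windows(features, lookback=60):
    """Turn a (days, n_features) array into LSTM-ready samples:
    each X is the previous `lookback` days of all features,
    each y is the next day's price (assumed to be column 0)."""
    X, y = [], []
    for t in range(lookback, len(features)):
        X.append(features[t - lookback:t])
        y.append(features[t, 0])
    return np.array(X), np.array(y)
```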
I downloaded the VXGOG dataset (CBOE's volatility index for GOOGL) and the GOOGL price history from Quandl, curled them to CSV, and uploaded to BigQuery.
curl "https://www.quandl.com/api/v3/datasets/CBOE/VXGOG.csv?api_key=A7vbfQ4yMyk3uzUyhC8s" -o CBOE_VIXGOG.csv
curl "https://www.quandl.com/api/v3/datasets/WIKI/GOOGL.csv?api_key=A7vbfQ4yMyk3uzUyhC8s" -o WIKI_GOOGL.csv
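For the record, a programmatic route for the upload would look roughly like this; the 'Date' header and the column normalization are assumptions about Quandl's usual CSV layout, and the table names just mirror the query below:

```python
import pandas as pd
from google.cloud import bigquery

# Assumes the Quandl CSVs ship a 'Date' header (their usual layout).
vix = pd.read_csv('CBOE_VIXGOG.csv', parse_dates=['Date'])
googl = pd.read_csv('WIKI_GOOGL.csv', parse_dates=['Date'])

# BigQuery rejects headers like 'Adj. Open', so normalize column names.
for df in (vix, googl):
    df.columns = [c.lower().replace('.', '').replace('-', ' ').replace(' ', '_')
                  for c in df.columns]

client = bigquery.Client()
client.load_table_from_dataframe(vix, 'quandl_googl_vix.vixgog').result()
client.load_table_from_dataframe(googl, 'quandl_googl_vix.googl').result()
```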
And a quick test to validate the data:
SELECT
TIMESTAMP(date) as date,
high
FROM
[quandl_googl_vix.vixgog]
ORDER BY
date ASC
LIMIT
1000;
| Row | date | high |
|---|---|---|
| 1 | 2010-06-01 00:00:00 UTC | 41.0 |
| 2 | 2010-06-02 00:00:00 UTC | 40.85 |
| 3 | 2010-06-03 00:00:00 UTC | 37.78 |
| 4 | 2010-06-04 00:00:00 UTC | 39.76 |
| 5 | 2010-06-07 00:00:00 UTC | 41.56 |
- Also upload the Quandl data for the % change calculation, or calculate it locally (see the pandas sketch after this list)
- Upload the source GOOGL OHLC data
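If I end up calculating the % change locally, pandas covers it in a line (assuming the WIKI CSV's usual 'Close' column):

```python
import pandas as pd

googl = pd.read_csv('WIKI_GOOGL.csv', parse_dates=['Date']).sort_values('Date')
# Day-over-day percent change of the close, computed locally
# instead of uploading Quandl's precomputed column.
googl['pct_change'] = googl['Close'].pct_change() * 100
```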