Machine learning algorithms are effective tools for algo trading systems. They are used for market prediction with Zorro's advise functions. Due to the low signal-to-noise ratio of price series and to ever-changing market conditions, analyzing prices is an ambitious task for machine learning. But since price curves are not completely random, even relatively simple machine learning methods, such as in the DeepLearn script, can predict the next price movement with a better than 50% success rate. If the success rate is high enough to overcome transaction costs - for this we'll need to be in the 53%..56% area - we can expect a steadily rising profit curve.
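As a rough, purely illustrative calculation - the payout and cost figures below are assumptions, not measured values - the expectancy per trade with symmetric wins and losses shows why a success rate barely above 50% is not enough:

# Illustrative only: expectancy per trade with symmetric wins and losses of
# 'payout' units and a round-trip transaction cost of 'cost' units
# (5% of the payout is an arbitrary assumption).
expectancy <- function(hit_rate, payout = 1.0, cost = 0.05)
  hit_rate*payout - (1-hit_rate)*payout - cost

expectancy(0.50)  # -0.05: any edge is eaten by the costs
expectancy(0.53)  #  0.01: costs are just covered
expectancy(0.56)  #  0.07: a steadily rising profit curve becomes plausible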
Compared with other machine learning algorithms, such as Random Forests or Support Vector Machines, deep learning systems combine a high success rate with little programming effort. A linear neural network with 8 inputs driven by indicators and 4 outputs for buying and selling has a structure like this:
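For illustration only - this is not the DeepLearn script, and the hidden layer size and activations are arbitrary choices - such a structure could be sketched with the keras package described further below:

library('keras')
# 8 indicator inputs -> one hidden layer -> 4 outputs (e.g. enter/exit long and short)
Net <- keras_model_sequential() %>%
  layer_dense(units = 16, activation = 'relu', input_shape = c(8)) %>%
  layer_dense(units = 4, activation = 'sigmoid')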
Deep learning uses linear or special neural network structures (convolution layers, LSTM) with a large number of neurons and hidden layers. Parameters common to most neural networks are the number and size of the hidden layers, the activation functions, the learning rate, the momentum, the number of training epochs, and the batch size; they all appear in the examples below.
Here's a short description of the installation and usage of four popular R-based deep learning packages, each with an example of a (not really deep) linear neural net with one hidden layer.
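The first package, Deepnet, is a lightweight neural network library available on CRAN and installed the usual way:

install.packages('deepnet')

The Deepnet versions of the neural functions: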
library('deepnet')

# called by Zorro for training; the last column of XY is the target
neural.train = function(model,XY)
{
  XY <- as.matrix(XY)
  X <- XY[,-ncol(XY)]
  Y <- XY[,ncol(XY)]
  Y <- ifelse(Y > 0,1,0)
  Models[[model]] <<- sae.dnn.train(X,Y,
    hidden = c(30),
    learningrate = 0.5,
    momentum = 0.5,
    learningrate_scale = 1.0,
    output = "sigm",
    sae_output = "linear",
    numepochs = 100,
    batchsize = 100)
}

# called by Zorro for prediction; X is a single sample or a matrix of samples
neural.predict = function(model,X)
{
  if(is.vector(X)) X <- t(X)
  return(nn.predict(Models[[model]],X))
}

# store all trained models in a file
neural.save = function(name)
{
  save(Models,file=name)
}

# fixed seed for reproducible results; the global Models list holds all trained nets
neural.init = function()
{
  set.seed(365)
  Models <<- vector("list")
}
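The second package, H2O, is an open-source machine learning platform that runs a local Java server in the background, so a 64-bit Java runtime must be installed. The R package itself comes from CRAN:

install.packages('h2o')

The H2O versions of the neural functions: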
library('h2o')

# called by Zorro for training; the last column of XY is the target
neural.train = function(model,XY)
{
  XY <- as.h2o(XY)
  Models[[model]] <<- h2o.deeplearning(
    -ncol(XY), ncol(XY), XY,
    hidden = c(30),
    seed = 365)
}

# convert a single sample to an H2O frame before prediction
neural.predict = function(model,X)
{
  if(is.vector(X)) X <- as.h2o(as.data.frame(t(X)))
  else X <- as.h2o(X)
  Y <- h2o.predict(Models[[model]],X)
  return(as.vector(Y))
}

neural.save = function(name)
{
  save(Models,file=name)
}

# start the local H2O server and reset the model list
neural.init = function()
{
  h2o.init()
  Models <<- vector("list")
}
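The third package is Keras, a high-level front end for TensorFlow and other deep learning back ends. The R package is available on CRAN; after installing it, install_keras() downloads and sets up the CPU version of the TensorFlow back end:

install.packages('keras')
library('keras')
install_keras()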
Installing the GPU version can be a bit more complex (believe it or not); a description can be found on the Robot Wealth blog. Since the installation procedure changes from time to time with new Keras versions, check the Keras website for the current instructions.
Now you're all set to use Keras in your strategy:
library('keras')

neural.train = function(model,XY)
{
  X <- data.matrix(XY[,-ncol(XY)])
  Y <- XY[,ncol(XY)]
  Y <- ifelse(Y > 0,1,0)
  Model <- keras_model_sequential()
  # one hidden layer with 30 neurons, dropout, and a sigmoid output
  Model %>%
    layer_dense(units = 30, activation = 'relu', input_shape = c(ncol(X))) %>%
    layer_dropout(rate = 0.2) %>%
    layer_dense(units = 1, activation = 'sigmoid')

  Model %>% compile(
    loss = 'binary_crossentropy',
    optimizer = optimizer_rmsprop(),
    metrics = c('accuracy'))

  Model %>% fit(X, Y,
    epochs = 20, batch_size = 20,
    validation_split = 0, shuffle = FALSE)

  Models[[model]] <<- Model
}

neural.predict = function(model,X)
{
  if(is.vector(X)) X <- t(X)
  X <- as.matrix(X)
  Y <- Models[[model]] %>% predict_proba(X)
  return(ifelse(Y > 0.5,1,0))
}

# Keras model objects cannot be stored with save() directly,
# so they are serialized to raw vectors first
neural.save = function(name)
{
  for(i in c(1:length(Models)))
    Models[[i]] <<- serialize_model(Models[[i]])
  save(Models,file=name)
}

neural.load <- function(name)
{
  load(name,.GlobalEnv)
  for(i in c(1:length(Models)))
    Models[[i]] <<- unserialize_model(Models[[i]])
}

neural.init = function()
{
  set.seed(365)
  Models <<- vector("list")
}
The last package is MxNet, which comes with its own R interface and is installed from its own package repository:

cran <- getOption("repos")
cran["dmlc"] <- "https://s3-us-west-2.amazonaws.com/apache-mxnet/R/CRAN/"
options(repos = cran)
install.packages('mxnet')

The MxNet R script:
library('mxnet')

neural.train = function(model,XY)
{
  X <- data.matrix(XY[,-ncol(XY)])
  Y <- XY[,ncol(XY)]
  Y <- ifelse(Y > 0,1,0)
  Models[[model]] <<- mx.mlp(X,Y,
    hidden_node = c(30),
    out_node = 2,
    activation = "sigmoid",
    out_activation = "softmax",
    num.round = 20,
    array.batch.size = 20,
    learning.rate = 0.05,
    momentum = 0.9,
    eval.metric = mx.metric.accuracy)
}

# the softmax output returns one probability row per class;
# return 1 when the second class ("up") has the higher probability
neural.predict = function(model,X)
{
  if(is.vector(X)) X <- t(X)
  X <- data.matrix(X)
  Y <- predict(Models[[model]],X)
  return(ifelse(Y[1,] > Y[2,],0,1))
}

neural.save = function(name)
{
  save(Models,file=name)
}

neural.init = function()
{
  mx.set.seed(365)
  Models <<- vector("list")
}
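The neural functions above are normally called by Zorro, which passes the signal matrix for training and single samples for prediction. For a quick plausibility check they can also be run directly in an R session. The snippet below is a purely illustrative smoke test with random numbers instead of real signals; source one of the versions above first, for instance the Deepnet one:

# purely illustrative smoke test with random data instead of real signals
neural.init()
XY <- matrix(rnorm(900), ncol = 9)   # 100 samples: 8 signal columns + 1 target column
neural.train(1, XY)                  # train model number 1
neural.predict(1, XY[1,-9])          # prediction for the first signal row
neural.save('Models.rdata')          # store the model list to a file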