@seabbs
Created January 25, 2019 11:55
Testing GPU setup in R for machine learning
library(xgboost)

# load the bundled agaricus (mushroom) example data
data(agaricus.train, package = 'xgboost')
data(agaricus.test, package = 'xgboost')
train <- agaricus.train
test <- agaricus.test

# fit a boosted tree model on the GPU using the gpu_hist tree method
bst <- xgboost(data = train$data, label = train$label, max_depth = 5, eta = 0.001,
               nrounds = 1000, nthread = 2, objective = "binary:logistic",
               tree_method = "gpu_hist")

# predict on the test set
pred <- predict(bst, test$data)
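# Not part of the original gist: a quick sanity check that the GPU-trained model
# predicts sensibly, plus a rough timing comparison against the CPU "hist" tree
# method (a sketch; assumes this xgboost build supports both tree methods).
err <- mean(as.numeric(pred > 0.5) != test$label)
print(paste("test-error =", err))

system.time(
  xgboost(data = train$data, label = train$label, max_depth = 5, eta = 0.001,
          nrounds = 1000, nthread = 2, objective = "binary:logistic",
          tree_method = "hist")
)
system.time(
  xgboost(data = train$data, label = train$label, max_depth = 5, eta = 0.001,
          nrounds = 1000, nthread = 2, objective = "binary:logistic",
          tree_method = "gpu_hist")
)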
library(h2o)

# start a local h2o cluster
h2o.init()

# run a regression GBM on the bundled australia example data
australia_path <- system.file("extdata", "australia.csv", package = "h2o")
australia <- h2o.uploadFile(path = australia_path)
independent <- c("premax", "salmax", "minairtemp", "maxairtemp", "maxsst",
                 "maxsoilmoist", "Max_czcs")
dependent <- "runoffnew"

# GBM baseline (runs on the CPU)
h2o.gbm(y = dependent, x = independent, training_frame = australia,
        ntrees = 1000, max_depth = 3, min_rows = 2)

# XGBoost via h2o, CPU backend
xgb_cpu <- h2o.xgboost(y = dependent, x = independent, training_frame = australia,
                       ntrees = 1000, backend = "cpu")

# XGBoost via h2o, GPU backend (requires a CUDA-capable GPU)
xgb_gpu <- h2o.xgboost(y = dependent, x = independent, training_frame = australia,
                       ntrees = 1000, backend = "gpu")
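# Not part of the original gist: a minimal follow-up sketch that inspects the fit
# statistics of the two backends (xgb_cpu and xgb_gpu are the objects assigned
# above) and then shuts the local cluster down.
h2o.performance(xgb_cpu)
h2o.performance(xgb_gpu)

# shut down the local cluster when finished
h2o.shutdown(prompt = FALSE)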