Data Analysis: Autoencoder in R

An autoencoder is a class of deep learning algorithm consisting of an encoder, a latent space, and a decoder. For more notes and tutorials, visit https://zouhua.top/

packages

# Helper packages
library(dplyr)    # for data manipulation
library(ggplot2)  # for data visualization

# Modeling packages
library(h2o)  # for fitting autoencoders

train data

mnist <- dslabs::read_mnist()
names(mnist)
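
read_mnist() returns a list with train and test components, each containing a pixel matrix (one 784-column row per 28 x 28 digit) and the matching labels. A quick structural check (the dimensions below reflect the standard MNIST split):

dim(mnist$train$images)   # 60000 x 784 matrix of pixel intensities
dim(mnist$test$images)    # 10000 x 784
head(mnist$train$labels)  # digit labels 0-9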

initialize H2O

h2o.no_progress()  # turn off progress bars
h2o.init(max_mem_size = "5g")  # initialize H2O instance
H2O is not running yet, starting it now...

Note:  In case of errors look at the following log files:
    C:\Users\zouhu\AppData\Local\Temp\Rtmp4APVUM\file521043104b5e/h2o_zouhua_started_from_r.out
    C:\Users\zouhu\AppData\Local\Temp\Rtmp4APVUM\file521021935c59/h2o_zouhua_started_from_r.err

java version "1.8.0_231"
Java(TM) SE Runtime Environment (build 1.8.0_231-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.231-b11, mixed mode)

Starting H2O JVM and connecting: . Connection successful!

R is connected to the H2O cluster: 
    H2O cluster uptime:         5 seconds 583 milliseconds 
    H2O cluster timezone:       Asia/Shanghai 
    H2O data parsing timezone:  UTC 
    H2O cluster version:        3.32.0.1 
    H2O cluster version age:    2 months and 30 days  
    H2O cluster name:           H2O_started_from_R_zouhua_vrf494 
    H2O cluster total nodes:    1 
    H2O cluster total memory:   4.44 GB 
    H2O cluster total cores:    6 
    H2O cluster allowed cores:  6 
    H2O cluster healthy:        TRUE 
    H2O Connection ip:          localhost 
    H2O Connection port:        54321 
    H2O Connection proxy:       NA 
    H2O Internal Security:      FALSE 
    H2O API Extensions:         Amazon S3, Algos, AutoML, Core V3, TargetEncoder, Core V4 
    R Version:                  R version 4.0.2 (2020-06-22) 

Comparing PCA to an autoencoder

# Convert mnist features to an h2o input data set
features <- as.h2o(mnist$train$images)

# Train an autoencoder
ae1 <- h2o.deeplearning(
  x = seq_along(features),
  training_frame = features,
  autoencoder = TRUE,
  hidden = 2,
  activation = 'Tanh',
  sparse = TRUE
)
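
# (Optional, not in the original) If the goal is anomaly detection rather than
# dimension reduction, h2o.anomaly() returns the per-row reconstruction MSE of
# a fitted autoencoder model
recon_error <- h2o.anomaly(ae1, features)
head(recon_error)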

# Extract the deep features
ae1_codings <- h2o.deepfeatures(ae1, features, layer = 1)
ae1_codings
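
The section heading promises a comparison with PCA, but the code above stops after extracting the codings. Below is a hedged sketch of that comparison; it assumes the two coding columns returned by h2o.deepfeatures() follow H2O's usual DF.L1.C1 / DF.L1.C2 naming convention and uses base R's prcomp() for the PCA side.

# Plot the 2-D autoencoder codings, coloured by digit label
codings <- as.data.frame(ae1_codings)
codings$label <- factor(mnist$train$labels)

ggplot(codings, aes(DF.L1.C1, DF.L1.C2, color = label)) +
  geom_point(alpha = 0.3, size = 0.5) +
  ggtitle("Autoencoder: 2-D latent codings")

# Project the same pixels onto the first two principal components
# (this can take a while on the full 60,000-row training set)
pca <- prcomp(mnist$train$images, rank. = 2)
pca_scores <- data.frame(pca$x, label = factor(mnist$train$labels))

ggplot(pca_scores, aes(PC1, PC2, color = label)) +
  geom_point(alpha = 0.3, size = 0.5) +
  ggtitle("PCA: first two principal components")

The nonlinear codings usually separate the digit classes more cleanly than the linear PCA projection, which is the point of the comparison.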

Stacked autoencoders

# Hyperparameter search grid
hyper_grid <- list(hidden = list(
  c(50),
  c(100), 
  c(300, 100, 300),
  c(100, 50, 100),
  c(250, 100, 50, 100, 250)
))

# Execute grid search
ae_grid <- h2o.grid(
  algorithm = 'deeplearning',
  x = seq_along(features),
  training_frame = features,
  grid_id = 'autoencoder_grid',
  autoencoder = TRUE,
  activation = 'Tanh',
  hyper_params = hyper_grid,
  sparse = TRUE,
  ignore_const_cols = FALSE,
  seed = 123
)

# Print grid details
h2o.getGrid('autoencoder_grid', sort_by = 'mse', decreasing = FALSE)

visualization

# Get sampled test images
index <- sample(1:nrow(mnist$test$images), 4)
sampled_digits <- mnist$test$images[index, ]
colnames(sampled_digits) <- paste0("V", seq_len(ncol(sampled_digits)))

# Predict reconstructed pixel values
best_model_id <- ae_grid@model_ids[[1]]
best_model <- h2o.getModel(best_model_id)
reconstructed_digits <- predict(best_model, as.h2o(sampled_digits))
names(reconstructed_digits) <- paste0("V", seq_len(ncol(reconstructed_digits)))

combine <- rbind(sampled_digits, as.matrix(reconstructed_digits))

# Plot original versus reconstructed
par(mfrow = c(1, 3), mar=c(1, 1, 1, 1))
layout(matrix(seq_len(nrow(combine)), 4, 2, byrow = FALSE))
for(i in seq_len(nrow(combine))) {
  image(matrix(combine[i, ], 28, 28)[, 28:1], xaxt="n", yaxt="n")
}
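
Once the H2O examples are finished, the local cluster started above can be released. A small optional cleanup step:

h2o.shutdown(prompt = FALSE)  # stop the local H2O cluster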

ANN2 implementation

Advantages of the Artificial Neural Networks (ANN2) package for R

  • Easy to use interface - defining and training neural nets with a single function call!

  • Activation functions: tanh, sigmoid, relu, linear, ramp, step

  • Loss functions: log, squared, absolute, huber, pseudo-huber

  • Regularization: L1, L2

  • Optimizers: sgd, sgd w/ momentum, RMSprop, ADAM

  • Plotting functions for visualizing encodings, reconstructions and loss (training and validation)

  • Helper functions for predicting, reconstructing, encoding and decoding

  • Reading and writing the trained model from / to disk

  • Access to model parameters and low-level Rcpp module methods

  • neuralnetwork() — neural network

library(ANN2)

# Prepare test and train sets
random_idx <- sample(1:nrow(iris), size = 145)
X_train    <- iris[random_idx, 1:4]
y_train    <- iris[random_idx, 5]
X_test     <- iris[setdiff(1:nrow(iris), random_idx), 1:4]
y_test     <- iris[setdiff(1:nrow(iris), random_idx), 5]

# Train neural network on classification task
NN <- neuralnetwork(X = X_train,
                    y = y_train,
                    hidden.layers = c(5, 5),
                    optim.type = 'adam',
                    n.epochs = 5000)

# Predict the class for new data points
predict(NN, X_test)

# $predictions
# [1] "setosa"     "setosa"     "setosa"     "versicolor"     "versicolor"
#
# $probabilities
#      class_setosa class_versicolor class_virginica
# [1,] 0.9998184126     0.0001814204    1.670401e-07
# [2,] 0.9998311154     0.0001687264    1.582390e-07
# [3,] 0.9998280223     0.0001718171    1.605735e-07
# [4,] 0.0001074780     0.9997372337    1.552883e-04
# [5,] 0.0001077757     0.9996626441    2.295802e-04

# Plot the training and validation loss
plot(NN)
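
As a quick sanity check (not in the original), the predicted classes returned above can be compared against the held-out labels to get test accuracy:

# Accuracy on the 5 held-out test rows
pred <- predict(NN, X_test)
mean(pred$predictions == as.character(y_test))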

  • autoencoder() — autoencoder

# Prepare test and train sets
random_idx <- sample(1:nrow(USArrests), size = 45)
X_train    <- USArrests[random_idx,]
X_test     <- USArrests[setdiff(1:nrow(USArrests), random_idx),]

# Define and train autoencoder
AE <- autoencoder(X = X_train,
                  hidden.layers = c(10,3,10),
                  loss.type = 'pseudo-huber',
                  optim.type = 'adam',
                  n.epochs = 5000)

# Reconstruct test data
reconstruct(AE, X_test)

# $reconstructed
#         Murder   Assault UrbanPop      Rape
# [1,]  8.547431 243.85898 75.60763 37.791746
# [2,] 12.972505 268.68226 65.40411 29.475545
# [3,]  2.107441  78.55883 67.75074 15.040075
# [4,]  2.085750  56.76030 55.32376  9.346483
# [5,] 12.936357 252.09209 56.24075 24.964715
#
# $anomaly_scores
# [1]  398.926431  247.238111   11.613522    0.134633 1029.806121

# Plot original points (grey) and reconstructions (colored) for training data
reconstruction_plot(AE, X_train)
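
The helper and disk I/O functions from the ANN2 feature list above can also be exercised on the trained AE. This is a hedged sketch, assuming encode(), decode(), write_ANN() and read_ANN() behave as their names suggest; the exact argument names may differ slightly from what is shown.

# Compress the test rows to the 3-node bottleneck, then map them back
codes <- encode(AE, X_test)   # low-dimensional codings
decode(AE, codes)             # approximate reconstructions

# Persist the trained model to disk and reload it later
write_ANN(AE, "usarrests_ae.ann")
AE_restored <- read_ANN("usarrests_ae.ann")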

Reference

  1. Autoencoder

  2. ANN2 tutorial

  3. ANN2 github

If any of the referenced articles raises a copyright concern, please contact me. Thank you.
