03. Logistic Regression

Note

View the README.md to learn about installing, setting up dependencies and importing notebooks in Zeppelin.

Background


With deep learning, we can compose a deep neural network to suit the input data and its features. The goal is to train the network on the data to make predictions, and those predictions are tied to the outcomes that you care about; e.g., is this transaction fraudulent or not, or which object is contained in the photo? There are different techniques for configuring a neural network, and all of them build a relational hierarchy between the inputs and outputs.

In this tutorial, we are going to configure the simplest neural network: a logistic regression model.

Regression is a process that helps show the relations between the independent variables (inputs) and the dependent variables (outputs). Logistic regression is one in which the dependent variable is categorical rather than continuous - meaning that it can predict only a limited number of classes or categories, like a switch you flip on or off. For example, it can predict that an image contains a cat or a dog, or it can classify input into ten buckets with the integers 0 through 9.

A simple logistic regression calculates 'x*w + b = y', where 'x' is an instance of input data, 'w' is the weight or coefficient that transforms that input, 'b' is the bias, and 'y' is the output, or prediction about the data. This structure loosely maps an artificial neuron to a neuron in the human brain. The most important point is how data flows through and is transformed by this structure.
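To make the 'x*w + b = y' calculation concrete, here is a minimal sketch in plain Scala (no DL4J; all names and numbers are illustrative). It computes one score per class as a weighted sum plus bias, then applies softmax to turn the scores into a probability distribution, just as the output layer below will:

```scala
object LogisticForward {
  // Softmax turns raw class scores into a probability distribution
  def softmax(scores: Array[Double]): Array[Double] = {
    val max = scores.max // subtract the max for numerical stability
    val exps = scores.map(s => math.exp(s - max))
    val sum = exps.sum
    exps.map(_ / sum)
  }

  // One score per class k: dot(x, w(k)) + b(k), then softmax over all scores
  def forward(x: Array[Double], w: Array[Array[Double]], b: Array[Double]): Array[Double] = {
    val scores = w.indices.map { k =>
      x.zip(w(k)).map { case (xi, wi) => xi * wi }.sum + b(k)
    }.toArray
    softmax(scores)
  }

  def main(args: Array[String]): Unit = {
    val x = Array(1.0, 2.0)                               // one input instance
    val w = Array(Array(0.5, -0.5), Array(-0.5, 0.5))     // one weight row per class
    val b = Array(0.0, 0.0)                               // one bias per class
    val probs = forward(x, w, b)
    println(probs.mkString(", "))                         // probabilities sum to 1
  }
}
```

The real network does exactly this, except that 'w' and 'b' start at random values (Xavier initialization below) and are adjusted during training.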

[Figure: how a logistic regression is calculated]

What will we learn in this tutorial?

We're going to configure the simplest network, with just one input layer and one output layer, to show how logistic regression works.

Imports

import org.deeplearning4j.nn.api.OptimizationAlgorithm
import org.deeplearning4j.nn.conf.layers.OutputLayer
import org.deeplearning4j.nn.conf.{MultiLayerConfiguration, NeuralNetConfiguration}
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
import org.deeplearning4j.nn.weights.WeightInit
import org.nd4j.linalg.activations.Activation
import org.nd4j.linalg.learning.config.Nesterovs

Configuring logistic regression layers

We are going to first build the layers and then feed these layers into the network configuration.

//Building the output layer
val outputLayer : OutputLayer = new OutputLayer.Builder()
    .nIn(784) //The number of inputs fed from the input layer
    .nOut(10) //The number of output values the output layer is supposed to produce
    .weightInit(WeightInit.XAVIER) //The algorithm to use for weights initialization
    .activation(Activation.SOFTMAX) //Softmax activation converts the output layer into a probability distribution
    .build() //Building our output layer
//Since this is a simple network with a stack of layers we're going to configure a MultiLayerNetwork
val logisticRegressionConf : MultiLayerConfiguration = new NeuralNetConfiguration.Builder()
    .seed(123).learningRate(0.1).iterations(1).optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT).updater(new Nesterovs(0.9)) //High Level Configuration
    .list() //For configuring MultiLayerNetwork we call the list method
    .layer(0, outputLayer) //    <----- output layer fed here
    .pretrain(false).backprop(true) //Pretraining and Backprop Configuration
    .build() //Building Configuration
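From here, the network itself is created from the configuration. A brief sketch of that next step, using the same DL4J API version as the code above (the `trainingData` iterator is a placeholder for a `DataSetIterator`, covered in a later tutorial):

```scala
//Create and initialize the network from the configuration above
val model = new MultiLayerNetwork(logisticRegressionConf)
model.init()
//model.fit(trainingData) would then train the network on a DataSetIterator
```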

This is what our configuration looks like:

[Figure: what a logistic regression looks like as a neural network]

The layer with x1, x2, x3, ..., xn is our input layer, while the one with z1, z2, z3, ..., zk is our output layer. See how the weights and biases are connected, and how softmax is applied to give the probability distribution.

Why we didn't build an input layer


You may be wondering why we didn't write any code for building our input layer. The input layer is only a set of input values fed into the network; it doesn't perform a calculation. It's just an input sequence (raw or pre-processed data) coming into the network, data to be trained on or to be evaluated. Later, we are going to work with data iterators, which feed input to a network in a specific pattern, and which can be thought of as the input layer of the network.
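To make the iterator idea concrete, here is a toy sketch in plain Scala (no DL4J; the class name and data are illustrative) of the role a data iterator plays: handing the network fixed-size batches of input rows, one batch at a time:

```scala
//A toy batch iterator: yields fixed-size groups of input rows,
//the role a DataSetIterator plays for a DL4J network.
class ToyBatchIterator(data: Seq[Array[Double]], batchSize: Int)
    extends Iterator[Seq[Array[Double]]] {
  private var cursor = 0
  def hasNext: Boolean = cursor < data.length
  def next(): Seq[Array[Double]] = {
    val batch = data.slice(cursor, cursor + batchSize) //take up to batchSize rows
    cursor += batch.length
    batch
  }
}
```

During training, the network simply pulls batches from such an iterator until it is exhausted; no "layer" code is needed for the input side.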

What's next?

  • See this tutorial to learn about configuring a more complex network: a 'feedforward neural network'. We will also introduce the concept of hidden layers.