Deep Neural Networks: Algorithms
Contents:

Maths
    BLAS
    Probability and Information Theory
    Numerical Computation
Neurons
    Weights
        Zeros
        Ones
        Constants
        Random Uniform
        Random Normal
        Truncated Normal
        Uniform Unit Scaling
        Random Walk
Layers
    Convolutions
        Convolutions
            1D Convolution
            2D Convolution
            3D Convolution
            Group Convolution
            Dilated Convolution
    Pooling
        Max Pooling
        Avg Pooling
        RoI Pooling
    Regions
        Selective Search
        Region Proposal Network
    Fully Connected
        Inner Product
    Activations
        Identity
        Step
        Piecewise Linear
        Sigmoid
        Complementary Log Log
        Bipolar
        Bipolar Sigmoid
        TanH
        LeCun’s TanH
        Hard TanH
        Absolute
        Rectifier
        Modifications of ReLU
        Smooth Rectifier
        Logit
        Probit
        Cosine
        Softmax
        Maxout
        (RBF) Gaussian
        (RBF) Multiquadratic
        (RBF) Inverse Multiquadratic
        References
    Normalizations
    Regularizations
        L1 Regularization
        L2 Regularization
        Dropout
    Losses
        Contrastive Loss
        Hinge Loss
        Euclidean Loss
        Infogain Loss
        Sigmoid Cross Entropy Loss
        Softmax Loss
        Multinomial Logistic Loss
        Smooth L1 Loss
    Gradients
        Stochastic Gradient Descent
        AdaDelta
        Adaptive Gradient
        Adam
        Nesterov’s Accelerated Gradient
        RMSProp
Networks
    Faster R-CNN
    YOLO
Solvers
    Train
    Validate
    Test
Models
    Convolutional Neural Networks
        LeNet
            Model
        AlexNet
            Model
        OverFeat
        VGG
            Model
        GoogLeNet
        ResNet
Applications
    MNIST
    CIFAR
    PASCAL
    COCO
    ImageNet
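Many of the layer types listed above reduce to a few simple array operations. As an illustrative sketch (not taken from these docs — the function name and example data are invented for illustration), here is a minimal valid-mode 2D convolution of the kind named under Layers › Convolutions, in plain Python with no padding and stride 1:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution: no padding, stride 1.

    image:  H x W list of lists of floats
    kernel: kH x kW list of lists of floats
    returns an (H - kH + 1) x (W - kW + 1) list of lists
    """
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Dot product of the kernel with the image patch at (i, j).
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# Example: a 3x3 averaging kernel slid over a 4x4 ramp image
# yields a 2x2 output of local means.
img = [[float(r * 4 + c) for c in range(4)] for r in range(4)]
k = [[1.0 / 9.0] * 3 for _ in range(3)]
result = conv2d(img, k)
```

The 1D and 3D variants follow the same pattern with fewer or more spatial loops; dilated and group convolutions change only how patch indices and channels are selected.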