Keras Documentation
https://keras.io/
Table of Contents
Home
The Python Deep Learning library
You have just found Keras
Guiding principles
30 seconds to Keras
Installation
Switching from TensorFlow to CNTK or Theano
Support
Why this name, Keras?
Why use Keras
Getting started
Guide to the Sequential model
Guide to the Functional API
FAQ
Models
About Keras models
Sequential
Model (functional API)
Layers
About Keras layers
Core Layers
Convolutional Layers
Pooling Layers
Locally-connected Layers
Recurrent Layers
Embedding Layers
Merge Layers
Advanced Activations Layers
Normalization Layers
Noise layers
Layer wrappers
Writing your own Keras layers
Preprocessing
Sequence Preprocessing
Text Preprocessing
Image Preprocessing
Others
Losses
Metrics
Usage of metrics
Arguments
Returns
Available metrics
binary_accuracy
categorical_accuracy
sparse_categorical_accuracy
top_k_categorical_accuracy
sparse_top_k_categorical_accuracy
Custom metrics
Optimizers
Usage of optimizers
Parameters common to all Keras optimizers
SGD
RMSprop
Adagrad
Adadelta
Adam
Adamax
Nadam
TFOptimizer
Activations
Usage of activations
Available activations
softmax
elu
selu
softplus
softsign
relu
tanh
sigmoid
hard_sigmoid
linear
On "Advanced Activations"
Callbacks
Usage of callbacks
Callback
BaseLogger
TerminateOnNaN
ProgbarLogger
Datasets
Applications
Backend
Initializers
Regularizers
Constraints
Visualization
Scikit-learn API
Utils
Contributing