## Usage of initializers
Initializers define the way to set the initial random weights of Keras layers.
The keyword arguments used for passing initializers to layers will depend on the layer. Usually it is simply `kernel_initializer` and `bias_initializer`:
```python
model.add(Dense(64,
                kernel_initializer='random_uniform',
                bias_initializer='zeros'))
```
## Available initializers
The following built-in initializers are available as part of the `keras.initializers` module:
{{autogenerated}}
An initializer may be passed as a string (must match one of the available initializers above), or as a callable:
```python
from keras import initializers

model.add(Dense(64, kernel_initializer=initializers.random_normal(stddev=0.01)))

# also works; will use the default parameters.
model.add(Dense(64, kernel_initializer='random_normal'))
```
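Built-in initializers can also be instantiated with explicit parameters when the defaults are not appropriate. The sketch below assumes the Keras 2 `initializers.RandomNormal` class and its `mean`, `stddev`, and `seed` arguments; names may differ in other versions:

```python
from keras import initializers

# Explicitly configured initializer; a fixed seed makes the draw reproducible.
init = initializers.RandomNormal(mean=0.0, stddev=0.05, seed=42)
model.add(Dense(64, kernel_initializer=init))
```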
## Using custom initializers
If passing a custom callable, it must take the arguments `shape` (shape of the variable to initialize) and `dtype` (dtype of the generated values):
```python
from keras import backend as K

def my_init(shape, dtype=None):
    return K.random_normal(shape, dtype=dtype)

model.add(Dense(64, kernel_initializer=my_init))
```
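If a custom initializer needs configuration of its own, one option is to wrap it in a closure that returns a callable with the required `(shape, dtype)` signature. This is a minimal sketch, not part of the Keras API; `scaled_init` and its `stddev` argument are illustrative names:

```python
from keras import backend as K

def scaled_init(stddev=0.05):
    # Hypothetical factory: returns an initializer with the required
    # (shape, dtype) signature, closing over the chosen stddev.
    def init(shape, dtype=None):
        return K.random_normal(shape, stddev=stddev, dtype=dtype)
    return init

model.add(Dense(64, kernel_initializer=scaled_init(stddev=0.01)))
```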