## Usage of optimizers
An optimizer is one of the two arguments required for compiling a Keras model:
```python
from keras import optimizers
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(64, kernel_initializer='uniform', input_shape=(10,)))
model.add(Activation('tanh'))
model.add(Activation('softmax'))

sgd = optimizers.SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='mean_squared_error', optimizer=sgd)
```
You can either instantiate an optimizer before passing it to `model.compile()`, as in the above example, or you can call it by its name. In the latter case, the default parameters for the optimizer will be used.
```python
# Pass the optimizer by name: default parameters will be used
model.compile(loss='mean_squared_error', optimizer='sgd')
```
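
If you want to see which instance a string identifier resolves to, the `optimizers.get` helper performs that lookup. A minimal sketch (the exact resolution behavior may vary across Keras versions):

```python
from keras import optimizers

# 'sgd' resolves to an SGD instance built with default parameters,
# which is how passing optimizer='sgd' to compile() is typically handled.
opt = optimizers.get('sgd')
print(type(opt).__name__)  # SGD
```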
---
## Parameters common to all Keras optimizers
The parameters `clipnorm` and `clipvalue` can be used with all optimizers to control gradient clipping:
```python
from keras import optimizers

# All parameter gradients will be clipped to
# a maximum norm of 1.
sgd = optimizers.SGD(lr=0.01, clipnorm=1.)
```
```python
from keras import optimizers

# All parameter gradients will be clipped to
# a maximum value of 0.5 and
# a minimum value of -0.5.
sgd = optimizers.SGD(lr=0.01, clipvalue=0.5)
```
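
Because these arguments are accepted by every optimizer, the same pattern applies beyond `SGD`. A brief sketch combining both forms of clipping with `Adam` (whether the norm is computed globally or per tensor depends on the Keras version):

```python
from keras import optimizers

# Adam with both clipping arguments: gradients are clipped by
# norm (threshold 1.0) and by value (to the range [-0.5, 0.5]).
adam = optimizers.Adam(lr=0.001, clipnorm=1., clipvalue=0.5)
```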
---
{{autogenerated}}