MLP
Multi-Layer Perceptron
Code: slm_lab/agent/net/mlp.py
These networks take a single state as input and are composed of a sequence of dense (fully connected) layers. MLPs are simple, general-purpose networks, well suited to environments with a low-dimensional state space or a state space with no spatial structure.
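Conceptually, the forward pass is just a stack of linear layers with nonlinear activations between them. Below is a minimal PyTorch sketch of the kind of network this class builds; it is not SLM Lab's actual implementation, and the state, action, and layer sizes are illustrative.

```python
import torch
import torch.nn as nn

# Illustrative only: a 4-dim state (e.g. CartPole) mapped to 2 action values
# through two hidden dense layers with ReLU activations.
state_dim, action_dim = 4, 2
mlp = nn.Sequential(
    nn.Linear(state_dim, 64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, action_dim),
)

state = torch.rand(1, state_dim)  # a single state as input
output = mlp(state)
print(output.shape)  # torch.Size([1, 2])
```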
Source Documentation
Refer to the class documentation and example net spec from the source: slm_lab/agent/net/mlp.py#L12-L58
Example Net Spec
This specification instantiates an MLP with three hidden layers of 256, 128, and 64 units respectively, rectified linear (ReLU) activations, and the Adam optimizer with a learning rate of 0.02. Each field is annotated in the sketch below.
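The following is a sketch of such a net spec, written as a Python dict so each field can be annotated inline. The key names (`type`, `hid_layers`, `hid_layers_activation`, `optim_spec`) follow the example net spec in mlp.py's class documentation; treat this as illustrative rather than a verbatim spec file.

```python
# Sketch of the net spec described above; key names follow the mlp.py
# class documentation, values match the description in the prose.
net_spec = {
    "type": "MLPNet",                  # class of network to build
    "hid_layers": [256, 128, 64],      # three hidden layers of 256, 128, 64 units
    "hid_layers_activation": "relu",   # rectified linear (ReLU) activations
    "optim_spec": {
        "name": "Adam",                # Adam optimizer
        "lr": 0.02,                    # learning rate
    },
}
```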
For more concrete examples of net specs specific to particular algorithms, refer to the existing spec files.