
Fig. 1

From: Towards domain-specific surrogate models for smart grid co-simulation


Fully connected ANN with three input neurons, two output neurons, and one hidden layer (left). Activation functions (right), from top to bottom: identity, sigmoid, and ReLU. In each neuron, a weighted sum is calculated and sent through an activation function as output to the next layer of neurons. The activation function defines how the output is forwarded (Schmidhuber 2015). Backpropagation is used to adjust the weights of the network to the current output error (Rumelhart et al. 1988)
