Table 1 Fixed parameters held constant during evaluation, and the parameters optimized via grid search

From: A comprehensive study on battery electric modeling approaches based on machine learning

| Method | Fixed parameters | Evaluated feature space | Optimal constellation |
|---|---|---|---|
| MLR | - | - | - |
| SVR | RBF kernel w. γ="scale" | C=[1,...,80], ε=[0.008,...,0.032] | C=30, ε=0.016 |
| K-NN | weights="uniform", K-D tree | p=[1,2,3], k=[1,...,80] | p=1, k=40 |
| Decision Tree | min_samples_leaf=5 | Nleave=[100,...,30000] | Nleave=10000 |
| Random Forest | Nleave=10000 | Ntrees=[5,...,140] | Ntrees=20 |
| AdaBoost | Nleave=10000, loss="linear" | Ntrees=[5,...,140], η=[0.5,...,2] | Ntrees=40, η=2 |
| GBR | Nleave=31, η=0.1, loss="least squares" | Ntrees=[60,...,900] | Ntrees=480 |
| FFNN | activation="ReLU" | LFC=[1,2,3,4], NFC=[8,...,100] | LFC=3, NFC=48 |
| MergeFFNN | activation="ReLU", NFC,a.m.=Ntop+Nbot | Ntop=[5,...,50], Nbot=[10,...,40] | Ntop=35, Nbot=30 |
| MergeLSTM | NFC,a.m.=NLSTM+NFC | NLSTM=[5,...,30], NFC=[15,...,40] | NLSTM=5, NFC=25 |
| LSTM | - | NLSTM=[5,...,35], NFC=[0,...,40] | NLSTM=15, NFC=40 |
| MergeCNN | NFC,a.m.=Nfilter+NFC, filter_size, conv_Stride | Nfilter=[5,...,25], NFC=[10,...,40] | Nfilter=15, NFC=40 |
| CNN | filter_size, conv_Stride | Nfilter=[5,...,40], NFC=[0,...,50] | Nfilter=25, NFC=35 |

1. All neural networks use ADAM (Kingma and Ba 2017), a batch size of 256 and a learning rate η=0.0005. The parameter NFC,a.m. denotes the number of neurons in the dense layer after the merge layer
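As an illustration of the grid-search procedure behind the table, the sketch below sets up the SVR row with scikit-learn's `GridSearchCV`: the fixed parameters (RBF kernel, γ="scale") are passed to the estimator, while C and ε span the evaluated feature space. The synthetic data and the coarsened grid are assumptions for the example; the study's actual dataset and full parameter sweep are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic regression data standing in for the battery measurements
# (hypothetical; the actual dataset is not part of this table).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(0.0, 0.01, size=200)

# Fixed parameters from the table: RBF kernel with gamma="scale".
# The grid mirrors the evaluated feature space, coarsened for brevity
# (the study sweeps C=[1,...,80] and epsilon=[0.008,...,0.032]).
param_grid = {
    "C": [1, 10, 30, 80],
    "epsilon": [0.008, 0.016, 0.032],
}
search = GridSearchCV(SVR(kernel="rbf", gamma="scale"), param_grid, cv=3)
search.fit(X, y)

# best_params_ holds the "optimal constellation" for this toy data.
print(search.best_params_)
```

The same pattern applies to the other rows: fixed parameters go into the estimator constructor, and each `param_grid` entry corresponds to one column of the evaluated feature space.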