Quality Report¶
Clicking on the Quality Report icon shows performance indicators for each of the selected models.
Structure¶
Each model is evaluated for all available subsets (typically runs or experiments) within the training/validation and test sets. Each subset (run/experiment) is displayed in a separate tab.
Note
The tab name is identical to the subset name being summarized.
Each row represents a different model.
Error and estimator types¶
The following errors and estimators are provided for each available model-subset pair, for both the training/validation and test sets (an illustrative computation sketch follows the table).
| Tag | Name |
|---|---|
| BIC | Bayesian information criterion |
| AICC | AIC with a correction for small sample sizes |
| AIC | Akaike information criterion |
| NRMSE | Normalized root mean square error |
| R2 | Coefficient of determination (\(R^{2}\)) |
| MallowsCp | Mallows’s \(C_{p}\) |
| FPE | Akaike’s Final Prediction Error for the estimated model |
| MeanError | Mean error |
| StdError | Standard error |
| MaxAbsError | Maximum absolute error |
| MinAbsError | Minimum absolute error |
| SSE | Sum of squared estimate of errors |
| RMSE | Root mean square error |
| MSE | Mean squared error |
| NDIE | — |
| SAE | Sum of absolute errors |
| MAE | Mean absolute error |
| NMAE | Normalized mean absolute error |
| OUTPUTVARIANCE | Output variance |
| MODELOUTPUTVARIANCE | Model output variance |
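The exact formulas used by the tool are not given on this page. The sketch below shows common textbook definitions for several of the listed quantities; the normalization chosen for NRMSE/NMAE and the constants in the Gaussian-residual AIC/AICC/BIC are assumptions, not the tool's documented implementation.

```python
import numpy as np


def quality_metrics(y_true, y_pred, n_params):
    """Illustrative (textbook) versions of several metrics from the table above."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    n = y_true.size
    err = y_true - y_pred

    sse = float(np.sum(err ** 2))                        # SSE
    mse = sse / n                                        # MSE
    rmse = mse ** 0.5                                    # RMSE
    nrmse = rmse / (y_true.max() - y_true.min())         # NRMSE (range-normalized; assumed)
    sae = float(np.sum(np.abs(err)))                     # SAE
    mae = sae / n                                        # MAE
    nmae = mae / float(np.mean(np.abs(y_true)))          # NMAE (normalization assumed)
    r2 = 1.0 - sse / float(np.sum((y_true - y_true.mean()) ** 2))  # R2
    mean_error = float(np.mean(err))                     # MeanError
    std_error = float(np.std(err))                       # StdError
    max_abs_error = float(np.max(np.abs(err)))           # MaxAbsError
    min_abs_error = float(np.min(np.abs(err)))           # MinAbsError

    # AIC/AICC/BIC in their common Gaussian-residual form; the tool's exact
    # definitions (constants, noise model, parameter count) may differ.
    k = n_params
    aic = n * np.log(mse) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = n * np.log(mse) + k * np.log(n)

    return {
        "SSE": sse, "MSE": mse, "RMSE": rmse, "NRMSE": nrmse,
        "SAE": sae, "MAE": mae, "NMAE": nmae, "R2": r2,
        "MeanError": mean_error, "StdError": std_error,
        "MaxAbsError": max_abs_error, "MinAbsError": min_abs_error,
        "AIC": aic, "AICC": aicc, "BIC": bic,
    }
```

For example, `quality_metrics([1.0, 2.0, 3.0], [1.1, 1.9, 3.2], n_params=2)["RMSE"]` returns the root mean square error of the three residuals.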
Note
If an error occurs during calculation (for example, a division by zero), NaN (not a number) is displayed in the performance table.
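For instance, a range-normalized NRMSE cannot be computed when the reference output is constant. The hypothetical helper below sketches that failure mode; it is not the tool's own code.

```python
import numpy as np


def nrmse_or_nan(y_true, y_pred):
    """Range-normalized RMSE; returns NaN when the normalizing range is zero."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    span = float(y_true.max() - y_true.min())
    # A constant reference output gives span == 0, so NaN is reported instead.
    return rmse / span if span != 0.0 else float("nan")
```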