
Time Series Metrics

Learn to quantify your forecasting errors. Choose between MAE, RMSE, and MAPE to align your model with business reality.

Tutor: Building a forecasting model is only half the battle. We need math to tell us exactly how wrong our model is. Enter time series metrics.



Concept: MAE

The Mean Absolute Error measures linear distance between prediction and actual. It treats a $10 error exactly the same regardless of context, whether the true value is $20 or $20,000.

Metric Check

Why do we wrap the difference in an absolute function `np.abs()`?
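A minimal sketch (with made-up numbers) of why the absolute value matters: without it, an under-forecast and an over-forecast cancel out, making a flawed model look perfect.

```python
import numpy as np

actual = np.array([100.0, 100.0])
predicted = np.array([90.0, 110.0])   # one miss under by 10, one over by 10

# Without np.abs(), the signed errors (+10 and -10) cancel to zero,
# wrongly reporting a perfect model.
raw_mean = np.mean(actual - predicted)       # 0.0
mae = np.mean(np.abs(actual - predicted))    # 10.0
print(raw_mean, mae)
```

The absolute value forces every deviation to count toward the total, whatever its direction.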



Time Series Metrics: Evaluating Truth

Author

Pascual Vila

Data Science Instructor // Code Syllabus

A model is only as good as the metric used to optimize it. If you choose the wrong error metric, your algorithm will systematically learn the wrong behavior for your business problem.

MAE: Mean Absolute Error

MAE measures the average magnitude of the errors in a set of predictions, without considering their direction. It’s the average over the test sample of the absolute differences between prediction and actual observation where all individual differences have equal weight.

When to use: Use MAE when you want a robust metric that is not overly influenced by huge outliers. If an error of 100 is exactly twice as bad as an error of 50, MAE is your tool.
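A short sketch of MAE in numpy, using hypothetical demand figures; the variable names are illustrative, not from the lesson's `metrics.py`:

```python
import numpy as np

# Hypothetical demand data: actual observations vs. model predictions.
actual = np.array([120.0, 130.0, 150.0, 110.0])
predicted = np.array([115.0, 138.0, 145.0, 120.0])

# MAE: average of absolute differences, reported in the target's own units.
mae = np.mean(np.abs(actual - predicted))
print(mae)  # 7.0
```

Because nothing is squared, each unit of error contributes equally: an error of 100 costs exactly twice as much as an error of 50.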

RMSE: Root Mean Squared Error

RMSE is a quadratic scoring rule that also measures the average magnitude of the error. Because the errors are squared before they are averaged, the RMSE gives a relatively high weight to large errors.

When to use: Use RMSE when large errors are particularly undesirable. For example, in predicting grid power demand, being off by 10MW might be an inconvenience, but being off by 100MW causes a blackout. RMSE tells the model to prioritize avoiding huge misses.
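A sketch of RMSE on the same hypothetical data, showing how squaring makes the largest miss dominate the score:

```python
import numpy as np

actual = np.array([120.0, 130.0, 150.0, 110.0])
predicted = np.array([115.0, 138.0, 145.0, 120.0])

# RMSE: square the errors before averaging, then take the root to
# return to the target's original units. The 10-unit miss contributes
# 100 to the mean, while each 5-unit miss contributes only 25.
rmse = np.sqrt(np.mean((actual - predicted) ** 2))
print(round(rmse, 2))  # 7.31
```

RMSE is always greater than or equal to MAE on the same data; the gap between the two widens as the error distribution grows more outlier-heavy.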

MAPE: Mean Absolute Percentage Error

MAPE expresses accuracy as a percentage of the error. Because it is a percentage, it is conceptually easy for non-technical stakeholders to understand.

Caveat: MAPE has a fatal flaw: if your true value is zero, the calculation divides by zero and is undefined. It is also asymmetric when forecasts are bounded at zero: an under-forecast can never exceed a 100% error, while an over-forecast's percentage error is unbounded, so the metric implicitly penalizes over-forecasting more heavily.

❓ FAQ

What is the difference between MAE and RMSE in Time Series Forecasting?

MAE (Mean Absolute Error) calculates the average absolute distance between predictions and actuals. It treats all errors equally.

RMSE (Root Mean Squared Error) squares the differences before averaging, which penalizes large errors quadratically. If your dataset has significant outliers and you want the model to avoid them at all costs, use RMSE. If you want a median-oriented, outlier-resistant model, use MAE.

Why does MAPE fail with Zero Values?

The mathematical formula for MAPE divides the error by the actual observation (`|Actual - Forecast| / Actual`). If the actual observation is 0 (e.g., zero sales on a Sunday), the denominator is zero, resulting in an undefined value (infinity). To fix this, analysts often use sMAPE (Symmetric MAPE) or WAPE (Weighted Absolute Percentage Error).
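A sketch of the two workarounds named above, using made-up sales figures that include a zero actual:

```python
import numpy as np

actual = np.array([0.0, 120.0, 80.0])     # zero sales on a Sunday
predicted = np.array([10.0, 110.0, 90.0])

# WAPE: total absolute error over total actuals. Safe unless the
# whole series sums to zero.
wape = np.sum(np.abs(actual - predicted)) / np.sum(actual) * 100
print(wape)  # 15.0

# sMAPE: symmetric denominator (|actual| + |forecast|). Defined as long
# as actual and forecast are not both zero at the same point.
smape = np.mean(2 * np.abs(predicted - actual)
                / (np.abs(actual) + np.abs(predicted))) * 100
```

Note that a zero actual no longer crashes either metric, though sMAPE still assigns that point its maximum possible penalty.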

Is a lower MAE always better for Forecasting?

While lower metrics generally indicate a better fit, a model with an artificially low MAE on training data might be overfitted. Always calculate MAE, RMSE, and MAPE on an out-of-sample testing set or via Time Series Cross-Validation to ensure the model generalizes to unseen future data.
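A minimal sketch of the out-of-sample evaluation the answer recommends, using a hand-rolled expanding window and a naive last-value forecast as the stand-in model (both are illustrative assumptions, not part of the lesson):

```python
import numpy as np

series = np.array([10.0, 12.0, 13.0, 15.0, 14.0, 16.0, 18.0, 17.0])

# Expanding-window evaluation: train on everything up to time t,
# forecast t, then slide forward. The "model" here is a naive
# last-value forecast; swap in any fitted model at this step.
errors = []
for t in range(4, len(series)):      # first 4 points reserved for training
    forecast = series[:t][-1]        # naive: predict the last observed value
    errors.append(abs(series[t] - forecast))

mae = np.mean(errors)
print(mae)  # 1.5
```

Because every forecast is scored only on data the model has never seen, the resulting MAE reflects generalization rather than memorization.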

Metrics Glossary

MAE
Mean Absolute Error. The average of the absolute differences between target and prediction.

MSE
Mean Squared Error. The average of the squared differences. Penalizes variance.

RMSE
Root Mean Squared Error. The square root of MSE, which brings the metric back to the original units of the target variable.

MAPE
Mean Absolute Percentage Error. Expresses the error as a percentage of the actual value.