# Energy Forecasting with Deep Learning

- Project Overview
- Repository Structure
- Quick Start
- Data
- Notebooks & Models
- Results & Visualizations
- Reproducibility (How to run)
- Contributing
- License & Contact
## Project Overview

Energy Forecasting with Deep Learning is a compact, reproducible project that demonstrates multiple approaches to short-term (hourly) load forecasting with time-series modeling. The repository includes baseline models (naive and dense feed-forward), feature engineering, exploratory data analysis, and an LSTM memory-based forecasting model that predicts the next 24 hours from the previous 168 hours.
## Repository Structure

```
000_naive_baseline_models/
001_baseline_dense_model/
002_LSTM_model/
EDA/
Evaluation/
raw/        # raw PJME_hourly.csv dataset
visuals/    # evaluation & result plots (included)
README.md
```
Key notebooks:

- `EDA/EDA.ipynb` — exploratory data analysis and transformation steps
- `000_naive_baseline_models/000_naive_baseline_model.ipynb` — naive baseline approach
- `001_baseline_dense_model/baseline_dense_model_predicition.ipynb` — dense network baseline (note: the filename contains a typo, "predicition")
- `002_LSTM_model/LSTM_memorybased_forecasting_model.ipynb` — LSTM forecasting model
## Quick Start

- Clone the repo:

  ```bash
  git clone <your-repo-url>
  cd <repo-dir>
  ```

- Create a Python virtual environment and install dependencies:

  ```bash
  python -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```

  Tip: if you don't have `requirements.txt`, install the common packages used in the notebooks: `pandas`, `numpy`, `scikit-learn`, `matplotlib`, `tensorflow` (tested with 2.15.x), and `jupyter`.

- Start Jupyter and run the notebooks interactively:
  ```bash
  jupyter lab
  ```

## Data

- Original raw data: `raw/PJME_hourly.csv` (hourly load data).
- Preprocessed / feature-engineered file used by the notebooks: `001_baseline_dense_model/feature_engineered_load.csv`.
The notebooks include the full pipeline for loading, cleaning, feature engineering (hour, dayofweek, month, year), scaling, windowing (168-hour input → 24-hour horizon), training, and evaluation.
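The loading and calendar-feature steps can be sketched as below. This is an illustrative helper, not code from the notebooks; the `Datetime` timestamp column name is an assumption based on the PJME dataset layout.

```python
import pandas as pd

def load_and_engineer(csv_path: str) -> pd.DataFrame:
    """Load the hourly load CSV and add calendar features.

    Assumes a 'Datetime' timestamp column, as in the PJME dataset.
    """
    df = pd.read_csv(csv_path, parse_dates=["Datetime"], index_col="Datetime")
    df = df.sort_index()
    # Calendar features used by the notebooks: hour, dayofweek, month, year
    df["hour"] = df.index.hour
    df["dayofweek"] = df.index.dayofweek
    df["month"] = df.index.month
    df["year"] = df.index.year
    return df
```

The resulting frame can be written back out with `df.to_csv(...)` to produce a file like `feature_engineered_load.csv`.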
## Notebooks & Models

- Baselines:
  - Naive: `000_naive_baseline_models/` (simple naive forecasts)
  - Dense model: `001_baseline_dense_model/baseline_dense_model_predicition.ipynb`
- LSTM memory-based model: `002_LSTM_model/LSTM_memorybased_forecasting_model.ipynb`
  - Input: last 168 hours (7 days)
  - Output (horizon): next 24 hours
  - Uses `MinMaxScaler` for scaling and Mean Absolute Error (MAE) as the primary evaluation metric
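The 168-hour-input → 24-hour-horizon windowing can be sketched in NumPy; `make_windows` is an illustrative helper, not the notebook's own code:

```python
import numpy as np

def make_windows(series: np.ndarray, input_len: int = 168, horizon: int = 24):
    """Slice a 1-D load series into (input, target) window pairs.

    Each sample pairs `input_len` past hours with the next `horizon` hours.
    """
    X, y = [], []
    for start in range(len(series) - input_len - horizon + 1):
        X.append(series[start : start + input_len])
        y.append(series[start + input_len : start + input_len + horizon])
    return np.asarray(X), np.asarray(y)
```

Scaling (e.g. with `MinMaxScaler`) should be fitted on the training split only, then applied before windowing, so no test-set statistics leak into training.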
## Results & Visualizations

The repository already contains evaluation visuals in `visuals/`. Examples:

- LSTM vs Dense model (24-hour horizon)
- Dense model evaluation
- Dense model forecast vs true values
- Training loss curve (Dense)
- Predicted dense model results (sample)

These images illustrate model predictions and training metrics.
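A forecast-vs-true plot of the kind stored in `visuals/` can be produced with a few lines of matplotlib; this is a generic sketch with placeholder arrays, not the notebooks' plotting code:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script also runs without a display
import matplotlib.pyplot as plt
import numpy as np

def plot_forecast(y_true, y_pred, out_path="forecast_vs_true.png"):
    """Plot a 24-hour forecast against the true load and save it to disk."""
    hours = np.arange(len(y_true))
    fig, ax = plt.subplots(figsize=(8, 4))
    ax.plot(hours, y_true, label="True load")
    ax.plot(hours, y_pred, linestyle="--", label="Forecast")
    ax.set_xlabel("Hour of forecast horizon")
    ax.set_ylabel("Load (MW)")
    ax.legend()
    fig.savefig(out_path, bbox_inches="tight")
    plt.close(fig)
    return out_path
```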
## Reproducibility (How to run)

- Ensure your environment has the required packages (see Quick Start).
- Open `EDA/EDA.ipynb` and run the cells to reproduce the feature engineering and save `feature_engineered_load.csv`.
- Run `000_naive_baseline_models/000_naive_baseline_model.ipynb` to generate naive baseline forecasts and evaluation metrics.
- Run `001_baseline_dense_model/baseline_dense_model_predicition.ipynb` to train and evaluate the dense baseline. Save `dense_model_predictions.csv` to compare with the LSTM.
- Run `002_LSTM_model/LSTM_memorybased_forecasting_model.ipynb` to train the LSTM and produce comparison plots. Use the included early-stopping callback for robust training.

Tips:

- Use a GPU to speed up LSTM training when available.
- For deterministic results, set seeds for `numpy`, `tensorflow`, and Python's `random` module at the start of each notebook.
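The seeding tip can be sketched as a small notebook preamble; `set_seeds` is an illustrative helper, and the TensorFlow call is guarded because the package may be absent from a minimal environment:

```python
import random

import numpy as np

def set_seeds(seed: int = 42):
    """Seed Python, NumPy, and (if available) TensorFlow for repeatable runs."""
    random.seed(seed)
    np.random.seed(seed)
    try:
        import tensorflow as tf
        tf.random.set_seed(seed)
    except ImportError:
        pass  # TensorFlow not installed; NumPy/Python seeding still applies
```

Note that GPU training can remain slightly non-deterministic even with seeds set, since some CUDA kernels are non-deterministic by default.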
Evaluation metrics:

- Primary metric: Mean Absolute Error (MAE)
- Also consider: Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE)

The notebooks print MAE for sample forecasts and include visualizations comparing true vs. predicted loads.
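The three metrics can be written in a few lines of NumPy (scikit-learn provides equivalents such as `mean_absolute_error`); these helper names are illustrative:

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean Absolute Error: average magnitude of the forecast errors."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def rmse(y_true, y_pred):
    """Root Mean Squared Error: penalizes large errors more than MAE."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent (undefined when y_true has zeros)."""
    y_true = np.asarray(y_true, dtype=float)
    return float(np.mean(np.abs((y_true - np.asarray(y_pred)) / y_true)) * 100)
```

MAPE is scale-free, which makes it handy for comparing across load levels, but it blows up near zero load values; MAE in MW stays interpretable for grid operators.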
## Contributing

Contributions are welcome. Suggested directions:
- Add hyperparameter search (Optuna or Keras Tuner)
- Add probabilistic forecasting (quantile loss or Bayesian methods)
- Introduce exogenous variables (weather, holidays)
- Wrap models with a minimal inference API (FastAPI/Flask)
Available on request: a trimmed requirements.txt (core packages), a Makefile for reproducible commands, or a GitHub Actions CI workflow to run notebooks/tests.
## License & Contact

This repository is provided under the GNU General Public License v3.0 (GPL-3.0). See the LICENSE file for the full text.