Reinforcement learning models for EV charging assignment and scheduling.
Assuming you've installed uv:

- Install python dependencies:

```shell
uv sync
```
- Explore the environment:

```shell
uv run scripts/explore_env.py
```

(Optional) If you want to use CometML logging, set the env variables:

```shell
export COMET_API_KEY=<YOUR_VALUE>
export COMET_PROJECT_NAME=<YOUR_VALUE>
export COMET_WORKSPACE=<YOUR_VALUE>
```

Then add `logging.use_cometml=true` to your train/test command. Otherwise, pass `logging.use_cometml=false` when launching the training (e.g. `uv run scripts/train_agent.py algo=ddqn logging.use_cometml=false`).
- Train DDQN:

```shell
uv run scripts/train_agent.py algo=ddqn
```

You can change the training seed in the config, or pass it on the command line like:

```shell
uv run scripts/train_agent.py algo=ddqn training.seed=0
```

We use multiple training seeds [0, 1, 2, 3, 4] to report our results.
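The seed sweep above can be scripted as a simple loop; this sketch only prints the five commands (drop the `echo`, or pipe the output to `sh`, to actually launch them):

```shell
# Print one DDQN training command per reported seed (0-4).
for seed in 0 1 2 3 4; do
  echo uv run scripts/train_agent.py algo=ddqn training.seed="$seed"
done
```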
- Train A2C:

```shell
uv run scripts/train_agent.py algo=a2c
```

We use the same seeds as for DDQN.
- Train the GA:

```shell
uv run scripts/train_ga.py algo.agent.n_generations=5 algo.agent.n_eval_episodes=10 training.n_logging_episodes=30
```

- Test DDQN:

```shell
uv run scripts/test_agent.py \
algo=ddqn \
'eval.weights=["trained_agent_weights_1.pt", "trained_agent_weights_2.pt", "trained_agent_weights_3.pt"]'
```

You need to replace "trained_agent_weights_X.pt" with the paths to the trained weights (here X represents different training seeds).
- Test A2C:

```shell
uv run scripts/test_agent.py \
algo=a2c \
'eval.weights=["trained_agent_weights_1.pt", "trained_agent_weights_2.pt", "trained_agent_weights_3.pt"]'
```

You need to replace "trained_agent_weights_X.pt" with the paths to the trained weights (here X represents different training seeds).
- Test the GA:

```shell
uv run scripts/test_agent.py \
algo=ga \
'eval.weights=["best_genome_1.pt", "best_genome_2.pt", "best_genome_3.pt"]'
```

You need to replace "best_genome_X.pt" with the paths to the best genomes obtained during training (here X represents different training seeds).
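If your checkpoints follow a per-seed directory layout, the `eval.weights` list can be assembled in the shell instead of typed by hand. A minimal sketch, assuming hypothetical `runs/seedX/best_genome.pt` paths (adjust to wherever training saved your genomes); it prints the final command rather than running it:

```shell
# Build the Hydra list override from three per-seed checkpoint paths.
weights=""
for seed in 1 2 3; do
  weights="$weights\"runs/seed$seed/best_genome.pt\", "
done
weights="[${weights%, }]"   # strip trailing ", " and wrap in brackets
echo uv run scripts/test_agent.py algo=ga "eval.weights=$weights"
```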
- Test the random baseline:

```shell
uv run scripts/test_agent.py algo=random
```