
Installation

Install envpool with:

pip install envpool

Note 1: envpool only supports the Linux operating system.

Usage

You can use OpenRL to train CartPole (via envpool) with:

PYTHON_PATH train_ppo.py

You can also add custom wrappers in envpool_wrapper.py; VecAdapter and VecMonitor wrappers are currently provided.
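To illustrate what a monitor-style wrapper does, here is a minimal, hypothetical sketch of a VecMonitor-like wrapper that tracks per-environment episode returns for a batched environment. The `venv` interface (`num_envs`, batched `reset`/`step`) and the `DummyVecEnv` class are assumptions for illustration, not envpool's or OpenRL's exact API.

```python
import numpy as np


class VecMonitorSketch:
    """Hypothetical VecMonitor-style wrapper: accumulates per-env episode
    returns and writes them into `infos` when an episode ends.
    The wrapped `venv` interface here is an assumption for illustration."""

    def __init__(self, venv):
        self.venv = venv
        self.returns = np.zeros(venv.num_envs)

    def reset(self):
        # Reset the return accumulators along with the environments.
        self.returns[:] = 0.0
        return self.venv.reset()

    def step(self, actions):
        obs, rewards, dones, infos = self.venv.step(actions)
        self.returns += rewards
        for i, done in enumerate(dones):
            if done:
                # Report the finished episode's return, then reset it.
                infos[i]["episode_return"] = self.returns[i]
                self.returns[i] = 0.0
        return obs, rewards, dones, infos


class DummyVecEnv:
    """Toy batched environment: every episode lasts 3 steps, reward 1/step."""

    def __init__(self, num_envs):
        self.num_envs = num_envs
        self.t = 0

    def reset(self):
        self.t = 0
        return np.zeros((self.num_envs, 4))

    def step(self, actions):
        self.t += 1
        done = self.t % 3 == 0
        obs = np.zeros((self.num_envs, 4))
        rewards = np.ones(self.num_envs)
        dones = np.full(self.num_envs, done)
        infos = [{} for _ in range(self.num_envs)]
        return obs, rewards, dones, infos
```

For example, stepping a wrapped `DummyVecEnv` three times ends every episode and surfaces an `episode_return` of 3.0 in each env's info dict.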