Install envpool with:

```bash
pip install envpool
```

Note: envpool only supports Linux.
You can use OpenRL to train CartPole (envpool) via:

```bash
PYTHON_PATH train_ppo.py
```

You can also add custom wrappers in `envpool_wrapper.py`. Currently, the `VecAdapter` and `VecMonitor` wrappers are provided.