pip3 install gym
git clone https://github.com/openai/gym.git
cd gym/examples/agents/
python3 random_agent.py
root@chrx:/opt/gym/examples/agents# python3 random_agent.py
INFO: Making new env: CartPole-v0
INFO: Creating monitor directory /tmp/random-agent-results
INFO: Starting new video recorder writing to /tmp/random-agent-results/openaigym.video.0.4726.video000000.mp4
INFO: Starting new video recorder writing to /tmp/random-agent-results/openaigym.video.0.4726.video000001.mp4
INFO: Starting new video recorder writing to /tmp/random-agent-results/openaigym.video.0.4726.video000008.mp4
INFO: Starting new video recorder writing to /tmp/random-agent-results/openaigym.video.0.4726.video000027.mp4
INFO: Starting new video recorder writing to /tmp/random-agent-results/openaigym.video.0.4726.video000064.mp4
INFO: Finished writing results. You can upload them to the scoreboard via gym.upload('/tmp/random-agent-results')
root@chrx:/opt/gym/examples/agents#
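
Under the hood, random_agent.py is just the standard Gym loop: make an environment, wrap it in a Monitor (which produces the .mp4 recordings and result files shown in the log above), and repeatedly sample random actions. A simplified sketch of that loop, not the exact script, with an arbitrary episode count:

import gym
from gym import wrappers

env = gym.make('CartPole-v0')
# The Monitor wrapper writes episode stats and the video files seen in the log above
env = wrappers.Monitor(env, '/tmp/random-agent-results', force=True)

for episode in range(100):
    observation = env.reset()
    done = False
    while not done:
        action = env.action_space.sample()  # pick a random action; no learning involved
        observation, reward, done, info = env.step(action)

env.close()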
https://github.com/openai/gym/blob/master/docs/environments.md
https://gym.openai.com/envs/#mujoco (of course, we're using Bullet instead of MuJoCo as the physics engine, since Bullet is free).
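
To actually use Bullet, the usual route is the pybullet package (pip3 install pybullet); importing pybullet_envs registers Bullet-backed versions of the familiar locomotion tasks with Gym. A minimal sketch, assuming the environment name below is still registered (check the pybullet docs for the current list):

import gym
import pybullet_envs  # importing this module registers the *BulletEnv-v0 environments with Gym

env = gym.make('HalfCheetahBulletEnv-v0')  # Bullet stand-in for MuJoCo's HalfCheetah (name assumed)
observation = env.reset()
for _ in range(1000):
    observation, reward, done, info = env.step(env.action_space.sample())
    if done:
        observation = env.reset()
env.close()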