How to use SatCat/ppo-Huggy with ml-agents:
mlagents-load-from-hf --repo-id="SatCat/ppo-Huggy" --local-dir="./downloads"