Merge pull request #13 from ARISE-Initiative/robosuite-visii-branch-doc
Robosuite visii branch doc
Showing 1 changed file with 10 additions and 0 deletions.
@@ -36,3 +36,13 @@ Installing iGibson can be done using the command `pip install igibson`. Please r
## PyGame

[PyGame](https://www.pygame.org/news) is a simple renderer that also serves as an alternative to MjViewer for on-screen rendering. A limitation of PyGame is that it can only render on-screen, which limits its applicability for training on headless computing clusters. This is because PyGame still relies on MjRenderContextOffscreen to render frames offscreen, which PyGame then maps to its native on-screen renderer. However, it is useful for visualizing robot behaviors at runtime on systems where MjViewer is not supported. Check out this [demo script](../demos.html#pygame-renderer) for an example of using the PyGame renderer.
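
The linked demo script is the canonical reference; as a rough sketch of the idea only, the snippet below renders a frame offscreen through the simulator and blits it into a PyGame window. The environment, robot, and camera name are illustrative assumptions, not the demo's exact code.

```python
import numpy as np
import pygame
import robosuite as suite

# Assumed setup: any robosuite environment with an offscreen renderer works here.
env = suite.make(
    "Door",
    robots="IIWA",
    has_renderer=False,
    has_offscreen_renderer=True,
    use_camera_obs=False,
)
env.reset()

width, height = 512, 512
pygame.init()
screen = pygame.display.set_mode((width, height))

for _ in range(100):
    env.step(np.random.randn(env.action_dim) * 0.1)

    # Render a frame offscreen (via MjRenderContextOffscreen), then hand it to PyGame.
    frame = env.sim.render(width=width, height=height, camera_name="frontview")
    frame = np.flipud(frame)  # MuJoCo returns offscreen images vertically flipped

    # pygame.surfarray expects (width, height, channels), so transpose the axes.
    surface = pygame.surfarray.make_surface(np.transpose(frame, (1, 0, 2)))
    screen.blit(surface, (0, 0))
    pygame.display.flip()

pygame.quit()
env.close()
```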
## Renderer Profiling

The table below reports the estimated frame rate of each renderer in frames per second (FPS). Profiling was conducted on a machine running Ubuntu 18.04 with an Intel Core i9-9900K CPU and an Nvidia RTX GPU. The numbers were estimated on the Door environment with the IIWA robot, the Joint Velocity controller, and a 256x256 image size.
|                        | mujoco-py | iGibson<br>(render2tensor optimized) | iGibson<br>(render2tensor) | iGibson<br>(render2numpy) | NVISII |
|------------------------|:---------:|:------------------------------------:|:--------------------------:|:-------------------------:|:------:|
| Simulation + rendering |    62     |                  64                  |             58             |            45             |  0.5   |
| Rendering only         |    508    |                 1392                 |            285             |            271            |  0.5   |
In practice, both the mujoco-py and iGibson renderers are well suited for vision-based policy learning. Of the two, iGibson offers faster rendering and additional functionality for perception research, and you may find that it has better cross-platform compatibility than the generic mujoco-py renderer, though it requires iGibson as a dependency. NVISII is best suited for photorealistic rendering; however, its ray-tracing computation makes it substantially slower than the other two renderers. It is mainly intended for perception tasks and qualitative visualization rather than online policy training.
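
For reference, a minimal timing loop along the following lines can produce this style of FPS estimate with the default mujoco-py renderer. The environment configuration mirrors the setup described above, but the loop structure, step count, and camera name are illustrative assumptions rather than the exact profiling code used for the table.

```python
import time
import numpy as np
import robosuite as suite
from robosuite.controllers import load_controller_config

# Door environment with the IIWA robot, Joint Velocity controller, and 256x256
# offscreen camera observations, roughly matching the profiling setup above.
controller_config = load_controller_config(default_controller="JOINT_VELOCITY")
env = suite.make(
    "Door",
    robots="IIWA",
    controller_configs=controller_config,
    has_renderer=False,
    has_offscreen_renderer=True,
    use_camera_obs=True,
    camera_heights=256,
    camera_widths=256,
)
env.reset()

n_steps = 200

# Simulation + rendering: each env.step() advances physics and renders camera observations.
start = time.time()
for _ in range(n_steps):
    env.step(np.zeros(env.action_dim))
print("Simulation + rendering FPS:", n_steps / (time.time() - start))

# Rendering only: repeatedly render frames without stepping the simulation.
start = time.time()
for _ in range(n_steps):
    env.sim.render(width=256, height=256, camera_name="agentview")
print("Rendering only FPS:", n_steps / (time.time() - start))

env.close()
```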