robosuite v1.3 release #260

Merged 624 commits on Oct 19, 2021
Changes from 250 commits

Commits
b69993d
Base visii wrapper
abhihjoshi Jan 5, 2021
a072d63
Updated robosuite-dev master to public HEAD
yukezhu Jan 24, 2021
aeb87d9
Merge pull request #3 from ARISE-Initiative/public_head
yukezhu Jan 25, 2021
d55736a
Adding 4K textures
abhihjoshi Jan 25, 2021
3f27b96
Update visii_wrapper.py
abhihjoshi Jan 25, 2021
0a34c2e
splitting visii wrapper into multiple files
abhihjoshi Feb 11, 2021
e982fb6
Delete visii_wrapper.py
abhihjoshi Feb 11, 2021
bed5a99
Create visii_rander_wrapper.py
abhihjoshi Feb 15, 2021
35f714b
Rename visii_rander_wrapper.py to visii_render_wrapper.py
abhihjoshi Feb 15, 2021
78321ce
adding visii rendering files
abhihjoshi Feb 15, 2021
8375572
adding .obj and .mtl files for some objects
abhihjoshi Feb 15, 2021
1a6a952
Delete visii_render_wrapper.py
abhihjoshi Mar 19, 2021
c5bed6f
Delete visii_utils.py
abhihjoshi Mar 19, 2021
2e8017b
Delete rendering_objects.py
abhihjoshi Mar 19, 2021
283dcd3
Delete dynamic_object_initialization.py
abhihjoshi Mar 19, 2021
c1c1150
Delete static_object_initialization.py
abhihjoshi Mar 19, 2021
190f93e
Merge remote-tracking branch 'upstream/master'
yukezhu May 9, 2021
f0d36d5
Update requirements-extra.txt
yukezhu May 9, 2021
9883341
update for nvisii rendering
abhihjoshi May 17, 2021
758f487
Merge branch 'master' of github.com:ARISE-Initiative/robosuite-dev in…
divyanshj16 May 21, 2021
bc5e4c5
add assets
divyanshj16 May 21, 2021
5eaa148
added UV maps
jclin22009 May 12, 2021
a38f6c2
added uv maps for baxter
jclin22009 May 16, 2021
49ae646
add igibson renderer
divyanshj16 May 21, 2021
acebd09
iGWrapper bugs resolved
divyanshj16 May 23, 2021
a67182a
fixes integration bugs in iG renderer and robosuite
divyanshj16 May 23, 2021
4d11f00
undo the change where camera_obs=True could be ON with has_renderer=True
divyanshj16 May 23, 2021
e647869
fix material loading of objects like can, milk etc
divyanshj16 May 23, 2021
dc52a6f
adding demo script and whl file
abhihjoshi May 23, 2021
b3fefe4
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi May 23, 2021
3153410
removing visii dir
abhihjoshi May 23, 2021
4b7d482
fix scaling of kinova robot
divyanshj16 May 23, 2021
fb6dbe5
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 May 23, 2021
d2b913c
fix panda robot obj and mtls
divyanshj16 May 23, 2021
cb39506
fix import error
divyanshj16 May 24, 2021
1fdeac2
fix sawyer robot meshes
divyanshj16 May 24, 2021
0c3f439
add two copies of pedestal obj, for iG and NviSII
divyanshj16 May 25, 2021
3256895
use ig pedestal in iG renderer
divyanshj16 May 25, 2021
3a7f2d1
make parser a class attribute
divyanshj16 May 25, 2021
f9f4a02
iG will now honor the render_camera passed while initializing the env…
divyanshj16 May 25, 2021
a520a15
put camera parsing again in load method
divyanshj16 May 25, 2021
39aa9bf
adding nvisii playback script and video mode
abhihjoshi May 26, 2021
a5d0161
fixes jaco robot meshes
divyanshj16 May 26, 2021
a4c66c5
fixes jaco robot meshes, fix parsing of material colors, if color in …
divyanshj16 May 26, 2021
5003b36
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 May 26, 2021
e239653
removing comments
abhihjoshi May 27, 2021
445c4dc
adds support for different modalities and camera switching in headles…
divyanshj16 May 30, 2021
1a73483
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 May 30, 2021
9409717
add support for fov
divyanshj16 May 30, 2021
1c8ef1a
add cylinder.obj in robosuite assets, remove meshes_old directory
divyanshj16 May 30, 2021
059dd71
fixes issues reported by Jason
divyanshj16 May 31, 2021
42c3db1
makes shadow sharper and lines and walls disappear
divyanshj16 Jun 2, 2021
1dec3aa
add --ig flag to random action demo
divyanshj16 Jun 4, 2021
dee32f1
reuse overwrite material parameter in load obj as requested by fei
divyanshj16 Jun 6, 2021
46b3430
fix convention issues, add flags to other demos
divyanshj16 Jun 6, 2021
07f2af0
add docs, make some functions internal and fix to use temporary dire…
divyanshj16 Jun 8, 2021
538523b
add robotiq_s_gripper obj files.
divyanshj16 Jun 8, 2021
b0037ba
add robotiq_s_gripper obj files.
divyanshj16 Jun 8, 2021
957a5c1
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Jun 8, 2021
344fd05
change models
jclin22009 Jun 17, 2021
28ef207
added mtl files for robot_iq_140 gripper and objects
jclin22009 Jun 23, 2021
a9bcb68
fix pedestal shadows
divyanshj16 Jun 27, 2021
8ad4620
added color material to IIWA mtl+objs
jclin22009 Jun 28, 2021
b7356e6
added fixed mtl+obj for baxter,kinova3,ur5e
jclin22009 Jun 29, 2021
e4dd4fb
fix modality bug in igibson wrapper
divyanshj16 Jun 30, 2021
d223d04
added materials to jaco
jclin22009 Jun 30, 2021
3ea3899
fix obj files
jclin22009 Jun 30, 2021
0cf2bc8
fix obj files
jclin22009 Jun 30, 2021
a952187
lazy load pytorch in robosuite
divyanshj16 Jul 4, 2021
6b3374d
Merge branch 'robosuite-visii-branch' into robosuite-visii-write_mate…
divyanshj16 Jul 5, 2021
04eacf7
modifiy the assets file to have metallic roughness etc. (later check …
divyanshj16 Jul 6, 2021
4fc5e13
fix load texture logic
divyanshj16 Jul 6, 2021
298d325
fix kinova3
jclin22009 Jul 7, 2021
1dde3ed
bug fixes for nvisii
abhihjoshi Jul 7, 2021
059c3d8
Merge branch 'robosuite-visii-write_materials=true' of github.com:ARI…
divyanshj16 Jul 7, 2021
680a2a9
switching branch
abhihjoshi Jul 7, 2021
5bf20ad
Merge branch 'robosuite-visii-branch' into robosuite-visii-write_mate…
jclin22009 Jul 7, 2021
0bcd5fa
adding new nvisii renderer
abhihjoshi Jul 8, 2021
a4303ce
Merge branch 'robosuite-visii-write_materials=true' of https://github…
abhihjoshi Jul 8, 2021
896885d
gripper quat bug fix
abhihjoshi Jul 8, 2021
17fd340
adding Baxter upper elbow obj and Jaco shoulder quat fix
abhihjoshi Jul 9, 2021
ab7ed12
added obj/mtl coloring to grippers
jclin22009 Jul 9, 2021
1eb4edc
Revert "added obj/mtl coloring to grippers"
jclin22009 Jul 9, 2021
0af3a9d
adding video mode nvisii
abhihjoshi Jul 12, 2021
10c88a4
Merge branch 'robosuite-visii-write_materials=true' of github.com:ARI…
divyanshj16 Jul 12, 2021
eaaffac
adding docstrings
abhihjoshi Jul 14, 2021
5c60365
Merge branch 'robosuite-visii-write_materials=true' of github.com:ARI…
divyanshj16 Jul 14, 2021
8e44135
fix baxter color
jclin22009 Jul 14, 2021
ba4b878
Merge branch 'robosuite-visii-write_materials=true' of github.com:ARI…
divyanshj16 Jul 14, 2021
5ea8646
add color to upper forearm and upper shoulder
jclin22009 Jul 16, 2021
a6add87
Merge branch 'robosuite-visii-write_materials=true' of github.com:ARI…
divyanshj16 Jul 19, 2021
411cdbb
fix IIWA robot color
divyanshj16 Jul 19, 2021
c6cc769
fix robotiq140 gripper for iG
divyanshj16 Jul 19, 2021
edbd1cf
committing small changes
abhihjoshi Jul 19, 2021
2c1c207
fixing jaco orientation
abhihjoshi Jul 19, 2021
f25541a
fix upper elbow
divyanshj16 Jul 19, 2021
a9fb410
adding rich baxter files
jclin22009 Jul 19, 2021
a9d85ad
fixing Baxter elbow color
abhihjoshi Jul 19, 2021
61480ab
added rich colors to baxter through obj splitting
jclin22009 Jul 19, 2021
580df0b
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
jclin22009 Jul 19, 2021
9366ec5
adding ground truths to NViSII and moving demo script
abhihjoshi Jul 22, 2021
a0ba2a3
removing demonstration_data
abhihjoshi Jul 22, 2021
975a7c5
fixing reviewed items
abhihjoshi Jul 23, 2021
6527c5f
fixing nvisii demo
abhihjoshi Jul 27, 2021
fc3fdcc
fixing nvisii demo
abhihjoshi Jul 27, 2021
8d0f852
adding assets
abhihjoshi Jul 27, 2021
3512099
Merge branch 'v1.3-assets' into robosuite-visii-branch
yukezhu Jul 28, 2021
5253520
removing 4k textures
abhihjoshi Jul 28, 2021
5034b73
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Jul 28, 2021
145e0c7
adding 4k textures
abhihjoshi Jul 28, 2021
4cf5993
remove render_with_igibson flag from all env and add iG renderer demo
divyanshj16 Jul 30, 2021
ef9608f
bring observable inside the wrapper class (with several bugs)
divyanshj16 Jul 30, 2021
ee4ef9e
:bug: fixes for Handover env renderer
abhihjoshi Aug 2, 2021
58e920a
add setup observable in wrapper, make it work with new iG
divyanshj16 Aug 11, 2021
5970409
remove hanging comments
divyanshj16 Aug 11, 2021
a1488f2
fix typo
divyanshj16 Aug 11, 2021
b8cd480
fix small bug in render2tensor
divyanshj16 Aug 12, 2021
815e8ae
removes torch deps from robosuite internals
divyanshj16 Aug 12, 2021
f9ab06a
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Aug 12, 2021
9662a59
Merge branch 'ig_wrapper_refactor' into robosuite-visii-branch
divyanshj16 Aug 12, 2021
87a9af4
merging
abhihjoshi Aug 12, 2021
8f9b460
merging
abhihjoshi Aug 12, 2021
775490c
adding renderer class
abhihjoshi Aug 12, 2021
3515a96
adding Renderer class
abhihjoshi Aug 12, 2021
f92159c
remove torch code from base.py
divyanshj16 Aug 14, 2021
387d6b0
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Aug 14, 2021
de26120
small changes to viewer initialization
abhihjoshi Aug 17, 2021
cd34c7a
small changes to viewer initialization
abhihjoshi Aug 17, 2021
07d176f
fix for default mujoco renderer
abhihjoshi Aug 18, 2021
34c95f7
wrap ig renderer in wrapper class
divyanshj16 Aug 18, 2021
193daa2
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Aug 18, 2021
d705b27
fix small bugs not setting viewer, wrong if else.
divyanshj16 Aug 18, 2021
fe2c0d1
add fixes for camera obs mode
divyanshj16 Aug 22, 2021
8f73f61
remove monkey patching of get observations function
divyanshj16 Aug 22, 2021
1d899b2
reverting to original mujoco renderer
abhihjoshi Aug 23, 2021
d5ff216
adding renderer config file
abhihjoshi Aug 23, 2021
467927b
add video generations scripts
divyanshj16 Aug 23, 2021
7b8b211
add demo script
divyanshj16 Aug 24, 2021
f8359d8
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Aug 24, 2021
25d6fb8
rename iG renderer, add config support
divyanshj16 Aug 24, 2021
3cd26a8
adding nvisii config
abhihjoshi Aug 26, 2021
35648e2
adding mujoco renderer
abhihjoshi Aug 26, 2021
4af9273
fixes for mujoco renderer
abhihjoshi Aug 26, 2021
1cb0989
merging
abhihjoshi Aug 26, 2021
ed1ff19
mujoco renderer fixes
abhihjoshi Aug 26, 2021
ca1d927
mujoco renderer fixes
abhihjoshi Aug 26, 2021
3dc3861
merging
abhihjoshi Aug 26, 2021
2e97709
adding nvisii docs
abhihjoshi Aug 27, 2021
c3c5c2d
adding nvisii docs
abhihjoshi Aug 27, 2021
14f73cc
adding depth normalization scripts and fixing configs
abhihjoshi Aug 28, 2021
df9702e
Update demo_video_recording.py
yukezhu Aug 30, 2021
bc20483
Update demo_random_action.py
yukezhu Aug 30, 2021
2935e07
Update demo_control.py
yukezhu Aug 30, 2021
e0ab193
Update demo_gripper_selection.py
yukezhu Aug 30, 2021
a6e7777
PR commented fixes
abhihjoshi Sep 4, 2021
82cb18c
fix pbr issue igibson
divyanshj16 Sep 5, 2021
02b413f
fixes requested by josiah
divyanshj16 Sep 5, 2021
ac818ab
fix requiredment of extra pedestal obj file
divyanshj16 Sep 5, 2021
2b2225a
remove transforms3d dependency as quat2mat and mat2quat are already i…
divyanshj16 Sep 5, 2021
f5f1c8a
update renderers.md
divyanshj16 Sep 5, 2021
b2b1690
removing ig specific pedestals
abhihjoshi Sep 5, 2021
4123577
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Sep 5, 2021
cc87f00
removing playback demos for new renderers
abhihjoshi Sep 6, 2021
5666d7f
changing nvisii default config
abhihjoshi Sep 7, 2021
dd21a26
changing nvisii default config
abhihjoshi Sep 7, 2021
bed7a21
Merge pull request #7 from ARISE-Initiative/v1.3-assets
yukezhu Sep 8, 2021
4cfa49c
delete demonstration file iG
divyanshj16 Sep 9, 2021
448957c
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Sep 9, 2021
be7fa2b
add base parser class
divyanshj16 Sep 12, 2021
16f7435
Merge branch 'v1.3' of github.com:ARISE-Initiative/robosuite-dev into…
divyanshj16 Sep 13, 2021
d377295
merging nvisii and igibson demo scripts
abhihjoshi Sep 14, 2021
0b02157
merging nvisii and igibson demo scripts
abhihjoshi Sep 14, 2021
c77a640
add code for instance and semantic segmentation
divyanshj16 Sep 19, 2021
0fcbf4a
add class instance segmentation support
divyanshj16 Sep 19, 2021
566fddc
Update README.md
yukezhu Sep 20, 2021
09936da
update documentations
Sep 21, 2021
bb460e3
Merge branch 'robosuite-visii-branch' into cs391r
yukezhu Sep 21, 2021
69079b6
update metadata
yukezhu Sep 21, 2021
ed5488a
Merge branch 'v1.3' into robosuite-visii-branch
yukezhu Sep 21, 2021
b8ca00a
fix typos
yukezhu Sep 21, 2021
76b1cc7
Merge branch 'cs391r' into robosuite-visii-branch
yukezhu Sep 21, 2021
d36e8bd
fix typos
yukezhu Sep 21, 2021
b713286
use MAX_CLASS_COUNT constant for various classes
divyanshj16 Sep 21, 2021
33df47d
remove ig specific renderer demo
divyanshj16 Sep 21, 2021
6519b62
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
divyanshj16 Sep 21, 2021
002988c
renderer demo change
abhihjoshi Sep 21, 2021
f17e98a
renderer demo change
abhihjoshi Sep 21, 2021
9991f3b
add jaco to the list of mtl defined materials as its material is defi…
divyanshj16 Sep 21, 2021
5c85165
add screenshot files
divyanshj16 Sep 21, 2021
8a6999a
updatscreenshot
divyanshj16 Sep 22, 2021
6ccdead
fix segmentation maps issue from iG
divyanshj16 Sep 23, 2021
e18dc0a
fix renderer problem
divyanshj16 Sep 23, 2021
7e1cf28
add vision modalities demo for iG
divyanshj16 Sep 23, 2021
989a821
fix pick place translucent object colors
divyanshj16 Sep 23, 2021
ce70d4f
fix pick place translucent object colors
divyanshj16 Sep 23, 2021
7cd676d
Revert "fix pick place translucent object colors"
divyanshj16 Sep 23, 2021
d3bd949
fix parser for ig-develop branch
divyanshj16 Sep 23, 2021
bb42e6b
add docs
divyanshj16 Sep 23, 2021
d512d01
adding element segmentation and vision modality demos for nvisii
abhihjoshi Sep 24, 2021
d420381
update renderer documentations
yukezhu Sep 25, 2021
61ccbeb
standardize NVISII naming
yukezhu Sep 25, 2021
16f0fdc
class and instance segmentation nvisii and mujoco fixes
abhihjoshi Sep 25, 2021
ba84927
class and instance segmentation nvisii and mujoco fixes
abhihjoshi Sep 25, 2021
7912dcb
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Sep 25, 2021
1921b74
adding mujoco renderer fixes
abhihjoshi Sep 25, 2021
a16b79c
adding class and instance segmentations
abhihjoshi Sep 25, 2021
43d5e0f
moving MujocoPyRenderer location
abhihjoshi Sep 26, 2021
10a1ccd
update rendering documentations
yukezhu Sep 26, 2021
eaff393
Merge branch 'robosuite-visii-branch' of github.com:ARISE-Initiative/…
yukezhu Sep 26, 2021
0022ff5
minor typo fixing in renderer docs
yukezhu Sep 26, 2021
206c71b
fix minor typos
yukezhu Sep 26, 2021
6c66fbb
fixing NONAME
abhihjoshi Sep 27, 2021
b57bd95
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Sep 27, 2021
73b11cc
changes for iGibson. requires the dataset
roberto-martinmartin Sep 28, 2021
51f901b
test of fix for the light map
roberto-martinmartin Sep 28, 2021
20f1f72
update pydoc with API changes
yukezhu Sep 28, 2021
dcda7a7
file
Sep 28, 2021
db456c2
Update renderers.md
roberto-martinmartin Sep 28, 2021
a53d791
Merge pull request #11 from ARISE-Initiative/roberto-martinmartin-pat…
roberto-martinmartin Sep 28, 2021
4311449
mujoco keyboard bug fix
abhihjoshi Sep 28, 2021
95fd7bb
adding nvisii surface normals
abhihjoshi Sep 30, 2021
6aea8c0
update roboturk docs with links to robomimic
amandlek Oct 4, 2021
1c905de
some minor changes
amandlek Oct 4, 2021
dcfae7a
Merge pull request #12 from ARISE-Initiative/update-roboturk-docs
yukezhu Oct 4, 2021
e58e33a
fixing class/instance segmentations
abhihjoshi Oct 5, 2021
1dc92cc
add render FPS in documents
Oct 10, 2021
04ad008
Update renderers.md
yukezhu Oct 11, 2021
6fb46f1
Merge pull request #13 from ARISE-Initiative/robosuite-visii-branch-doc
yukezhu Oct 11, 2021
d6a9550
update nvisii rendering image
Oct 11, 2021
32bf698
reverting to old mujoco env
abhihjoshi Oct 12, 2021
03091c6
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Oct 12, 2021
a0510b1
adding mujoco on screen rendering fix
abhihjoshi Oct 12, 2021
a6c3488
adding mujoco on screen rendering fix
abhihjoshi Oct 12, 2021
aa86a01
adding observables fix
abhihjoshi Oct 12, 2021
afbd495
adding observables fix
abhihjoshi Oct 12, 2021
6e0f68a
adding observables fix
abhihjoshi Oct 12, 2021
aeb4f55
add one more profiling table
divyanshj16 Oct 12, 2021
d5b8a54
Update renderers.md
yukezhu Oct 12, 2021
50fddba
fixing demo renderers script
abhihjoshi Oct 13, 2021
25e251d
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Oct 13, 2021
ff7f354
fixing env base
abhihjoshi Oct 15, 2021
4e4b217
Merge branch 'robosuite-visii-branch' of https://github.com/ARISE-Ini…
abhihjoshi Oct 15, 2021
157a8b4
Merge pull request #4 from ARISE-Initiative/robosuite-visii-branch
yukezhu Oct 16, 2021
651b8dc
Merge branch 'master' into v1.3
yukezhu Oct 19, 2021
1f9a6bf
added cube and sphere to the robosuite assets and modified ig bridge …
Oct 19, 2021
32c7957
added reqs for iG renderer
Oct 19, 2021
389b54a
Merge pull request #14 from ARISE-Initiative/removing_dependencies
yukezhu Oct 19, 2021
65e175d
Merge pull request #15 from ARISE-Initiative/docs_ig_reqs
yukezhu Oct 19, 2021
15209ed
Merge branch 'v1.3' of github.com:ARISE-Initiative/robosuite into v1.3
Oct 19, 2021
37ea481
incorporate comments from Ajay
Oct 19, 2021
4 changes: 3 additions & 1 deletion AUTHORS
@@ -24,4 +24,6 @@ Rachel Gardner <[email protected]>
Jonathan Booher <[email protected]>
Danfei Xu <[email protected]>
Rachel Gardner <[email protected]>
Albert Tung <[email protected]>
Albert Tung <[email protected]>
Abhishek Joshi <[email protected]>
Divyansh Jha <[email protected]>
4 changes: 2 additions & 2 deletions LICENSE
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2020 Stanford Vision and Learning Lab and UT-Austin Robot Perception and Learning Lab
Copyright (c) 2021 Stanford Vision and Learning Lab and UT Robot Perception and Learning Lab

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -18,4 +18,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.
18 changes: 10 additions & 8 deletions README.md
@@ -6,30 +6,32 @@

-------
## Latest Updates
[02/17/2021] **v1.2.0**: Added observable sensor models :eyes: and dynamics randomization :game_die:
[09/20/2021] **v1.3**: Ray tracing and physically based rendering tools :sparkles: and access to vision modalities 🎥

[12/17/2020] **v1.1.0**: Refactored infrastructure and standardized model classes for much easier environment prototyping :wrench:
[02/17/2021] **v1.2**: Added observable sensor models :eyes: and dynamics randomization :game_die:

[12/17/2020] **v1.1**: Refactored infrastructure and standardized model classes for much easier environment prototyping :wrench:

-------

**robosuite** is a simulation framework powered by the [MuJoCo](http://mujoco.org/) physics engine for robot learning. It also offers a suite of benchmark environments for reproducible research. The current release (v1.2) features manipulation tasks with feature supports of procedural generation, advanced controllers, teleoperation, etc. This project is part of the broader [Advancing Robot Intelligence through Simulated Environments (ARISE) Initiative](https://github.com/ARISE-Initiative), with the aim of lowering the barriers of entry for cutting-edge research at the intersection of AI and Robotics.
**robosuite** is a simulation framework powered by the [MuJoCo](http://mujoco.org/) physics engine for robot learning. It also offers a suite of benchmark environments for reproducible research. The current release (v1.3) features rendering tools, ground-truth of vision modalities, and camera utilities. This project is part of the broader [Advancing Robot Intelligence through Simulated Environments (ARISE) Initiative](https://github.com/ARISE-Initiative), with the aim of lowering the barriers of entry for cutting-edge research at the intersection of AI and Robotics.

Data-driven algorithms, such as reinforcement learning and imitation learning, provide a powerful and generic tool in robotics. These learning paradigms, fueled by new advances in deep learning, have achieved some exciting successes in a variety of robot control problems. However, the challenges of reproducibility and the limited accessibility of robot hardware (especially during a pandemic) have impaired research progress. The overarching goal of **robosuite** is to provide researchers with:

* a standardized set of benchmarking tasks for rigorus evaluation and algorithm development;
* a standardized set of benchmarking tasks for rigorous evaluation and algorithm development;
* a modular design that offers great flexibility to design new robot simulation environments;
* a high-quality implementation of robot controllers and off-the-shelf learning algorithms to lower the barriers to entry.

This framework was originally developed since late 2017 by researchers in [Stanford Vision and Learning Lab](http://svl.stanford.edu) (SVL) as an internal tool for robot learning research. Now it is actively maintained and used for robotics research projects in SVL and the [UT-Austin Robot Perception and Learning Lab](http://rpl.cs.utexas.edu) (RPL). We welcome community contributions to this project. For details please check out our [contributing guidelines](CONTRIBUTING.md).
This framework was originally developed since late 2017 by researchers in [Stanford Vision and Learning Lab](http://svl.stanford.edu) (SVL) as an internal tool for robot learning research. Now it is actively maintained and used for robotics research projects in SVL and the [UT Robot Perception and Learning Lab](http://rpl.cs.utexas.edu) (RPL). We welcome community contributions to this project. For details please check out our [contributing guidelines](CONTRIBUTING.md).

This release of **robosuite** contains seven robot models, eight gripper models, six controller modes, and nine standardized tasks. It also offers a modular design of APIs for building new environments with procedural generation. We highlight these primary features below:

* **standardized tasks**: a set of standardized manipulation tasks of large diversity and varying complexity and RL benchmarking results for reproducible research;
* **procedural generation**: modular APIs for programmatically creating new environments and new tasks as a combinations of robot models, arenas, and parameterized 3D objects;
* **procedural generation**: modular APIs for programmatically creating new environments and new tasks as combinations of robot models, arenas, and parameterized 3D objects;
* **controller supports**: a selection of controller types to command the robots, such as joint-space velocity control, inverse kinematics control, operational space control, and 3D motion devices for teleoperation;
* **multi-modal sensors**: heterogeneous types of sensory signals, including low-level physical states, RGB cameras, depth maps, and proprioception;
* **human demonstrations**: utilities for collecting human demonstrations, replaying demonstration datasets, and leveraging demonstration data for learning.

* **human demonstrations**: utilities for collecting human demonstrations, replaying demonstration datasets, and leveraging demonstration data for learning. Check out our sister project [robomimic](https://arise-initiative.github.io/robomimic-web/);
* **photorealistic rendering**: integration with advanced graphics tools that provide real-time photorealistic renderings of simulated scenes.

## Citations
Please cite [**robosuite**](https://robosuite.ai) if you use this framework in your publications:
3 changes: 2 additions & 1 deletion docs/acknowledgement.md
@@ -1,12 +1,13 @@
# Acknowledgements

**robosuite** is built on the [MuJoCo engine](http://www.mujoco.org/) with the Python interfaces provided by [mujoco-py](https://github.com/openai/mujoco-py). We would like to thank members of the [Stanford People, AI, & Robots (PAIR) Group](http://pair.stanford.edu/) for their support and feedback to this project. In particular, the following people have made their contributions in different stages of this project:
**robosuite** is built on the [MuJoCo engine](http://www.mujoco.org/) with the Python interfaces provided by [mujoco-py](https://github.com/openai/mujoco-py). We would like to thank members of the [Stanford People, AI, & Robots (PAIR) Group](http://pair.stanford.edu/) and [UT Robot Perception and Learning Lab](http://rpl.cs.utexas.edu/) for their support and feedback to this project. In particular, the following people have made their contributions in different stages of this project:

- [Jiren Zhu](https://github.com/jirenz), [Joan Creus-Costa](https://github.com/jcreus) (robosuite v0.3)
- [Jim (Linxi) Fan](http://jimfan.me/), [Zihua Liu](https://www.linkedin.com/in/zihua-liu/), [Orien Zeng](https://www.linkedin.com/in/orien-zeng-054589b6/), [Anchit Gupta](https://www.linkedin.com/in/anchitgupta/) ([Surreal](http://surreal.stanford.edu/) experiments)
- [Michelle Lee](http://stanford.edu/~mishlee/), [Rachel Gardner](https://www.linkedin.com/in/rachel0/) (controller implementations)
- [Danfei Xu](https://cs.stanford.edu/~danfei/) (placement initializer)
- [Andrew Kondrich](http://www.andrewkondrich.com/), [Jonathan Booher](https://web.stanford.edu/~jaustinb/) (domain randomization)
- [Albert Tung](https://www.linkedin.com/in/albert-tung3/) (demonstration collection)
- [Abhishek Joshi](https://www.linkedin.com/in/abhishek-joshi-4ab469180), [Divyansh Jha](https://github.com/divyanshj16) (robosuite v1.3 renderers)

We wholeheartedly welcome the community to contribute to our project through issues and pull requests. New contributors will be added to the list above.
6 changes: 4 additions & 2 deletions docs/algorithms/demonstrations.md
@@ -45,7 +45,7 @@ We have included an example script that illustrates how demonstrations can be lo

We have included some sample demonstrations for each task at `models/assets/demonstrations`.

Our twin project [RoboTurk](http://roboturk.stanford.edu) has also collected pilot datasets of more than a thousand demonstrations for two tasks in our suite via crowdsourcing. You can find detailed information about the RoboTurk datasets [here](roboturk).
Our sister project [RoboTurk](http://roboturk.stanford.edu) has also collected several human demonstration datasets across different tasks and humans, including pilot datasets of more than a thousand demonstrations for two tasks in our suite via crowdsourcing. You can find detailed information about the RoboTurk datasets [here](roboturk).


## Structure of collected demonstrations
@@ -81,7 +81,9 @@ The reason for storing mujoco states instead of raw observations is to make it e

## Using Demonstrations for Learning

[Several](https://arxiv.org/abs/1802.09564) [prior](https://arxiv.org/abs/1807.06919) [works](https://arxiv.org/abs/1804.02717) have demonstrated the effectiveness of altering the start state distribution of training episodes for learning RL policies. We provide a generic utility for setting various types of learning curriculums which dictate how to sample from demonstration episodes when doing an environment reset. For more information see the `DemoSamplerWrapper` class.
We have recently released the [robomimic](https://arise-initiative.github.io/robomimic-web/) framework, which makes it easy to train policies using your own [datasets collected with robosuite](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#robosuite-hdf5-datasets) and other publicly released datasets (such as those collected with RoboTurk). The framework also contains many useful examples of how to integrate hdf5 datasets into your own learning pipeline.

The robosuite repository also has some utilities for using the demonstrations to alter the start state distribution of training episodes for learning RL policies; this has proved effective in [several](https://arxiv.org/abs/1802.09564) [prior](https://arxiv.org/abs/1807.06919) [works](https://arxiv.org/abs/1804.02717). For example, we provide a generic utility for setting various types of learning curriculums which dictate how to sample from demonstration episodes when doing an environment reset. For more information see the `DemoSamplerWrapper` class.
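As a rough sketch of the curriculum idea described above (the function name and episode format here are illustrative, not robosuite's actual `DemoSamplerWrapper` API), an environment reset could sample a start state from the tail of a recorded demonstration, widening the sampling window as training progresses:

```python
import random

def sample_start_state(demo_episodes, curriculum_fraction):
    """Pick a start state from the tail of a random demonstration episode.

    demo_episodes: list of episodes, each a list of simulator states
        (e.g. flattened MuJoCo qpos/qvel snapshots) -- an assumed format.
    curriculum_fraction: float in (0, 1]; how far back from the end of the
        episode resets are allowed. Small values start near the goal state,
        larger values widen the window as the policy improves.
    """
    episode = random.choice(demo_episodes)
    # Clamp the window to at least one state so the final state is always available.
    window = max(1, int(len(episode) * curriculum_fraction))
    # Sample uniformly from the last `window` states of the chosen episode.
    return random.choice(episode[-window:])
```

A training loop would call this inside its reset hook and restore the returned simulator state before rolling out the policy.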

## Warnings
We have verified that deterministic action playback works specifically when playing back demonstrations on the *same machine* that the demonstrations were originally collected upon. However, this means that deterministic action playback is NOT guaranteed (in fact, very unlikely) to work across platforms or even across different machines using the same OS.
8 changes: 5 additions & 3 deletions docs/algorithms/roboturk.md
@@ -2,18 +2,20 @@

[RoboTurk](https://roboturk.stanford.edu/) is a crowdsourcing platform developed to enable the collection of large-scale manipulation datasets. Below, we describe RoboTurk datasets that are compatible with robosuite.

## Updated Datasets compatible with v1.0+
## Datasets compatible with v1.2+

We are currently in the process of organizing a standardized dataset for our benchmarking tasks, which will be made available soon and compatible with v1.2.0+. In the meantime, we have provided a [small-scale dataset](https://drive.google.com/drive/folders/1LLkuFnRdqQ6xn1cYzkbJUs_DreaAvN7i?usp=sharing) of expert demonstrations on two of our tasks.
We have collected several human demonstration datasets across several tasks implemented in robosuite as part of the [robomimic](https://arise-initiative.github.io/robomimic-web/) framework. For more information on these datasets, including how to download them and start training policies with them, please see [this link](https://arise-initiative.github.io/robomimic-web/docs/introduction/results.html#downloading-released-datasets).

## Original Datasets compatible with v0.3
## Datasets compatible with v0.3

We collected a large-scale dataset on the `SawyerPickPlace` and `SawyerNutAssembly` tasks using the [RoboTurk](https://crowdncloud.ai/) platform. Crowdsourced workers collected these task demonstrations remotely. It consists of **1070** successful `SawyerPickPlace` demonstrations and **1147** successful `SawyerNutAssembly` demonstrations.

We are providing the dataset in the hopes that it will be beneficial to researchers working on imitation learning. Large-scale imitation learning has not been explored much in the community; it will be exciting to see how this data is used.

You can download the dataset [here](http://cvgl.stanford.edu/projects/roboturk/RoboTurkPilot.zip).

**Note:** to get started with this data, we highly recommend using the [robomimic](https://arise-initiative.github.io/robomimic-web/) framework - see [this link](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#roboturk-pilot-datasets) for more information. To use this data, you should be on the [roboturk_v1](https://github.com/ARISE-Initiative/robosuite/tree/roboturk_v1) branch of robosuite, which is `v0.3` with a few minor changes. You can do this by using `git checkout roboturk_v1` after cloning the repository, or just download the source code from [this link](https://github.com/ARISE-Initiative/robosuite/tree/roboturk_v1).

After unzipping the dataset, the following subdirectories can be found within the `RoboTurkPilot` directory.

- **bins-full**
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -65,7 +65,7 @@

# General information about the project.
project = u'robosuite'
copyright = u'Stanford University and The University of Texas at Austin 2020'
copyright = u'Stanford University and The University of Texas at Austin 2021'
author = u'Yuke Zhu, Josiah Wong, Jiren Zhu, Ajay Mandlekar, Roberto Martín-Martín'

# The version info for the project you're documenting, acts as replacement for
17 changes: 17 additions & 0 deletions docs/demos.md
@@ -182,3 +182,20 @@ The `demo_video_recording.py` script shows how to record a video of robot roll-o
```sh
$ python demo_video_recording.py --environment Lift --robots Panda
```

### Rendering Options
The `demo_renderers.py` script shows how to use different renderers with the simulation environments. Our current version supports three rendering options: MuJoCo (default), NVISII, and iGibson renderers. More information about these renderers can be found in the [Renderer](modules/renderers) module. Example:
```sh
$ python demo_renderers.py --renderer igibson
```
The `--renderer` flag can be set to `mujoco` (default), `nvisii`, or `igibson`.
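Internally, a flag like this usually maps a string onto a renderer class. The sketch below illustrates such a dispatch pattern; the class names and constructor signatures are hypothetical, not robosuite's actual renderer API:

```python
# Hypothetical renderer registry illustrating string-flag dispatch.
# The real robosuite renderer classes and their constructors may differ.
class MujocoRenderer:
    name = "mujoco"

class NVISIIRenderer:
    name = "nvisii"

class IGibsonRenderer:
    name = "igibson"

# Registry keyed by the same strings the --renderer flag accepts.
RENDERERS = {cls.name: cls for cls in (MujocoRenderer, NVISIIRenderer, IGibsonRenderer)}

def make_renderer(flag="mujoco"):
    """Instantiate the renderer selected by a command-line flag."""
    try:
        return RENDERERS[flag]()
    except KeyError:
        raise ValueError(f"Unknown renderer '{flag}'; choose from {sorted(RENDERERS)}")
```

Keeping the registry in one place means a new renderer only needs to register its name, and demo scripts can share the same flag parsing.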

### Vision Modalities
The `demo_igibson_modalities.py` and `demo_nvisii_modalities.py` scripts illustrate how to obtain vision modalities from the iGibson and NVISII renderers, respectively. Each script renders the vision modality specified by its flags. Example:
```sh
$ python demo_igibson_modalities.py --vision-modality segmentation --segmentation-level instance

$ python demo_nvisii_modalities.py --vision-modality depth
```
The `--vision-modality` flag can be set to `depth`, `normal`, `segmentation`, or `rgb` (default).
The `--segmentation-level` flag can be set only when `--vision-modality` is set to `segmentation`. It can be set to `instance`, `class`, or `element`.
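For the `depth` modality, a common post-processing step is converting the renderer's nonlinear depth-buffer values into metric depth and then normalizing for visualization. The sketch below uses the standard perspective-projection inverse; the exact near/far-plane handling in robosuite's depth utilities may differ:

```python
def depth_buffer_to_meters(z_buffer, near, far):
    """Convert a nonlinear depth-buffer reading in [0, 1] to metric depth.

    Uses the standard perspective-projection inverse:
        depth = near / (1 - z * (1 - near / far))
    where near/far are the clipping-plane distances in meters.
    """
    return near / (1.0 - z_buffer * (1.0 - near / far))

def normalize_depth(depth_m, near, far):
    # Map metric depth linearly back to [0, 1] for visualization.
    return (depth_m - near) / (far - near)
```

With this conversion, a buffer value of 0 maps to the near plane and a value of 1 maps to the far plane, which is why raw depth images often look washed out before normalization.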
Binary file added docs/images/renderers/renderers.png