Robosuite

Latest version: v1.4.1


1.4.0

- Highlights
- New Features
- Improvements
- Critical Bug Fixes
- Other Bug Fixes

Highlights
This release of robosuite refactors our backend to leverage DeepMind's new [mujoco](https://github.com/deepmind/mujoco) bindings. Below, we discuss the key details of this refactoring:

Installation
Installation has become much simpler: mujoco is now installed directly on Linux or macOS via `pip install mujoco`, and it is imported via `import mujoco` instead of `import mujoco_py`.

Rendering
The new DeepMind mujoco bindings do not ship with an onscreen renderer. As a result, we've implemented an [OpenCV renderer](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/opencv_renderer.py), which provides most of the core functionality of the original mujoco renderer but has a few limitations (most significantly, no glfw keyboard callbacks and no ability to move the free camera).

Improvements
The following briefly describes other changes that improve on the pre-existing structure. This is not an exhaustive list, but a highlighted list of changes.

- Standardize end-effector frame inference (25). Now, all end-effector frames are correctly inferred from raw robot XMLs and take into account arbitrary relative orientations between robot arm link frames and gripper link frames.

- Improved robot textures (27). With added support from DeepMind's mujoco bindings for obj texture files, all robots are now natively rendered with more accurate texture maps.

- Revamped macros (30). Macros now reference a single macro file that can be arbitrarily specified by the user.

- Improved method for specifying GPU ID (29). The new logic is as follows:
1. If `render_device_gpu_id=-1` and neither `MUJOCO_EGL_DEVICE_ID` nor `CUDA_VISIBLE_DEVICES` is set, we either choose the first available device (usually `0`) if `macros.MUJOCO_GPU_RENDERING` is `True`, or otherwise use the CPU;
2. If `CUDA_VISIBLE_DEVICES` or `MUJOCO_EGL_DEVICE_ID` is set, it takes precedence over the programmatically defined GPU device ID;
3. If both `CUDA_VISIBLE_DEVICES` and `MUJOCO_EGL_DEVICE_ID` are set, we use `MUJOCO_EGL_DEVICE_ID` and verify that it is listed in `CUDA_VISIBLE_DEVICES`.
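The precedence rules above can be sketched as a small helper. This is a hedged illustration: the function name `choose_egl_device` and its signature are hypothetical, not robosuite's actual internals.

```python
import os

def choose_egl_device(render_device_gpu_id, gpu_rendering, env=None):
    """Hypothetical sketch of the GPU-selection precedence described above.
    Returns a device id to render with, or None for CPU rendering."""
    env = os.environ if env is None else env
    egl = env.get("MUJOCO_EGL_DEVICE_ID")
    cuda = env.get("CUDA_VISIBLE_DEVICES")

    # Rule 3: both set -> MUJOCO_EGL_DEVICE_ID wins, but must be CUDA-visible.
    if egl is not None and cuda is not None:
        assert egl in cuda.split(","), \
            "MUJOCO_EGL_DEVICE_ID must appear in CUDA_VISIBLE_DEVICES"
        return int(egl)
    # Rule 2: either env var dominates the programmatic id.
    if egl is not None:
        return int(egl)
    if cuda is not None:
        return int(cuda.split(",")[0])
    # Rule 1: nothing set -> first device if GPU rendering is enabled, else CPU.
    if render_device_gpu_id == -1:
        return 0 if gpu_rendering else None
    return render_device_gpu_id
```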

- robosuite docs updated

- Add new papers


Critical Bug Fixes
- Fix Sawyer IK instability bug (25)


Other Bug Fixes
- Fix iGibson renderer bug (21)


-------

Contributor Spotlight
We would like to introduce the newest members of our robosuite core team, all of whom have contributed significantly to this release!
awesome-aj0123
snasiriany
zhuyifengzju

1.3

1.3.0

- Highlights
- New Features
- Improvements
- Critical Bug Fixes
- Other Bug Fixes

Highlights
This release of robosuite brings powerful rendering functionalities including new renderers and multiple vision modalities, in addition to some general-purpose camera utilities. Below, we discuss the key details of these new features:

Renderers
In addition to the native Mujoco renderer, we present two new renderers, [NVISII](https://github.com/owl-project/NVISII) and [iGibson](http://svl.stanford.edu/igibson/), and introduce a standardized rendering interface API to enable easy swapping of renderers.

NVISII is a high-fidelity ray-tracing renderer originally developed by NVIDIA, and adapted for plug-and-play usage in **robosuite**. It is primarily used for training perception models and visualizing results in high quality. It can run at up to ~0.5 fps using a GTX 1080Ti GPU. Note that NVISII must be installed (`pip install nvisii`) in order to use this renderer.

iGibson is a much faster renderer that additionally supports physics-based rendering (PBR) and direct rendering to pytorch tensors. While not as high-fidelity as NVISII, it is incredibly fast and can run at up to ~1500 fps using a GTX 1080Ti GPU. Note that iGibson must be installed (`pip install igibson`) in order to use this renderer.

With the addition of these new renderers, we also introduce a standardized [renderer](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/renderers/base.py) for easy usage and customization of the various renderers. During each environment step, the renderer updates its internal state by calling `update()` and renders by calling `render(...)`. The resulting visual observations can be polled by calling `get_pixel_obs()` or by calling other methods specific to individual renderers. We provide a [demo script](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_segmentation.py) for testing each new renderer, and our docs also provide [additional information](http://robosuite.ai/docs/modules/renderers.md) on specific renderer details and installation procedures.
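The per-step call pattern above can be sketched with a toy interface. The method names come from the text; the real base class in `robosuite/renderers/base.py` may differ in detail, and `CountingRenderer` is purely illustrative.

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    """Sketch of the standardized renderer interface described above."""

    @abstractmethod
    def update(self):
        """Sync the renderer's internal state with the simulation."""

    @abstractmethod
    def render(self, **kwargs):
        """Render the current frame."""

    @abstractmethod
    def get_pixel_obs(self):
        """Return the latest visual observations."""

class CountingRenderer(Renderer):
    """Toy renderer used only to illustrate the per-step call pattern."""
    def __init__(self):
        self.frames = 0
    def update(self):
        pass                      # would sync internal state with the sim
    def render(self, **kwargs):
        self.frames += 1          # would draw the current frame
    def get_pixel_obs(self):
        return self.frames        # stand-in for actual pixel data
```

A loop over environment steps would then call `update()` followed by `render(...)` each step, polling `get_pixel_obs()` whenever visual observations are needed.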

Vision Modalities
In addition to new renderers, we also provide broad support for multiple vision modalities across all (Mujoco, NVISII, iGibson) renderers:

- **RGB**: Standard 3-channel color frames with values in range `[0, 255]`. This is set during environment construction with the `use_camera_obs` argument.
- **Depth**: 1-channel frame with normalized values in range `[0, 1]`. This is set during environment construction with the `camera_depths` argument.
- **Segmentation**: 1-channel frames with pixel values corresponding to integer IDs for various objects. Segmentation can occur by class, instance, or geom, and is set during environment construction with the `camera_segmentations` argument.

In addition to the above modalities, the following modalities are supported by a subset of renderers:

- **Surface Normals**: [NVISII, iGibson] 3-channel (x,y,z) normalized direction vectors.
- **Texture Coordinates**: [NVISII] 3-channel (x,y,z) coordinate texture mappings for each element
- **Texture Positioning**: [NVISII, iGibson] 3-channel (x,y,z) global coordinates of each pixel.

Specific modalities can be set during environment and renderer construction. We provide a [demo script](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_nvisii_modalities.py) for testing the different modalities supported by NVISII and a [demo script](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_igibson_modalities.py) for testing the different modalities supported by iGibson.
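Putting the construction arguments above together, a hedged configuration sketch for requesting RGB, depth, and segmentation frames (the task, robot, and camera names are illustrative, and observation key naming may vary by version):

```python
import robosuite as suite

# Hedged sketch: "Lift", "Panda", and "agentview" are illustrative choices;
# the keyword arguments follow the modality descriptions above.
env = suite.make(
    "Lift",
    robots="Panda",
    use_camera_obs=True,              # RGB frames
    camera_names="agentview",
    camera_depths=True,               # normalized depth in [0, 1]
    camera_segmentations="class",     # integer class IDs per pixel
)
obs = env.reset()
# Expect keys along the lines of "agentview_image" and "agentview_depth" in
# obs (exact key names, especially for segmentation, may differ by version).
```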

Camera Utilities
We provide a set of general-purpose [camera utilities](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py) intended to enable easy manipulation of environment cameras. Of note, we include transform utilities for mapping between pixel, camera, and world frames, as well as a [CameraMover](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py#L244) class for dynamically moving a camera during simulation. This can be used for many purposes, such as the [DemoPlaybackCameraMover](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/camera_utils.py#L419) subclass, which enables smooth visualization during demonstration playback.
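For intuition, the kind of pixel/camera/world mapping these utilities provide can be illustrated with a self-contained pinhole-camera sketch (the function names here are illustrative, not robosuite's actual API):

```python
import numpy as np

def camera_intrinsics(fov_deg, height, width):
    """Build a simple pinhole intrinsics matrix from a vertical field of view."""
    f = height / (2.0 * np.tan(np.radians(fov_deg) / 2.0))
    return np.array([[f, 0.0, width / 2.0],
                     [0.0, f, height / 2.0],
                     [0.0, 0.0, 1.0]])

def world_to_pixel(point_world, K, world_to_cam):
    """Project a 3D world point into pixel coordinates."""
    p_cam = world_to_cam @ np.append(point_world, 1.0)  # world -> camera frame
    uvw = K @ p_cam[:3]                                 # camera -> image plane
    return uvw[:2] / uvw[2]                             # perspective divide
```

A point two meters straight ahead of an identity-posed camera lands at the image center, as expected.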

Improvements
The following briefly describes other changes that improve on the pre-existing structure. This is not an exhaustive list, but a highlighted list of changes.

- Standardize EEF frames (204). Now, all grippers have identical conventions for plug-and-play usage across types.

- Add OSC_POSITION control option for spacemouse (209).

- Improve model class hierarchy for robots. Now, robots own a subset of models (gripper(s), mount(s), etc.), allowing easy external access to the robot's internal model hierarchy.

- robosuite docs updated

- Add new papers


Critical Bug Fixes
- Fix OSC global orientation limits (228)


Other Bug Fixes
- Fix default OSC orientation control (valid default rotation matrix) (232)

- Fix Jaco self-collisions (235)

- Fix joint velocity controller clipping and tune default kp (236)

-------

Contributor Spotlight
A big thank you to the following community members for spearheading the renderer PRs for this release!
awesome-aj0123
divyanshj16
fxia22

1.2

1.2.0

- Highlights
- New Features
- Improvements
- Critical Bug Fixes
- Other Bug Fixes

Highlights
This release of robosuite tackles a major challenge of using simulators: real-world transfer! (Sim2Real)

We present two features to significantly improve sim2real transferability -- realistic sensor modeling (*observables*) and control over physics modeling parameters (*dynamics randomization*).

Observables
This standardizes and modularizes how observations are computed and gathered within a given env. Now, all observations received from the `env.step()` call can be modified programmatically in either a deterministic or stochastic way. Sensor realism has been increased with added support for individual sensor sampling rates, corruption, delay, and filtering. The out-of-the-box behavior (obs dict structure; no corruption, delay, or filtering by default) has been preserved for backwards compatibility.

Each `Observable` owns its own `_sensor`, `_corrupter`, `_delayer`, and `_filter` functions, which are used to process new data computed during its `update()` call; this call occurs after every _simulation_ timestep, NOT every policy step. (Note, however, that a new value is only computed once per sampling period, not at every `update()` call.) Their functionality is briefly described below:

- `_sensor`: Arbitrary function that takes in an observation cache and computes the "raw" (potentially ground truth) value for the given observable. It can potentially leverage pre-computed values from the observation cache to compute its output. The `sensor` decorator is provided to denote this type of function, and guarantees a modality for this sensor as well.
- `_corrupter`: Arbitrary function that takes in the output from `_sensor` and outputs the corrupted data.
- `_delayer`: Arbitrary function that takes no arguments and outputs a float time value (in seconds), denoting how much delay should be applied to the next sampling cycle.
- `_filter`: Arbitrary function that takes in the output of `_corrupter` and outputs the filtered data.

All of the above can either be (re-)defined at initialization or during runtime. Utility functions have been provided in the `base.py` environment module to easily interact with all observables owned by the environment.
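The pipeline above can be sketched with a simplified toy class. This is a hedged illustration only: the real `Observable` in `observables.py` also handles sampling rates, delay, and caching, which are omitted here.

```python
import numpy as np

class Observable:
    """Toy sketch of the sensor -> corrupter -> filter pipeline described
    above (simplified; delay handling is omitted)."""

    def __init__(self, name, sensor, corrupter=None, filt=None):
        self.name = name
        self._sensor = sensor
        self._corrupter = corrupter or (lambda x: x)   # no-op by default
        self._filter = filt or (lambda x: x)           # no-op by default
        self.obs = None

    def update(self, obs_cache):
        raw = self._sensor(obs_cache)                  # "raw" (ground-truth) value
        self.obs = self._filter(self._corrupter(raw))  # corrupt, then filter

# Example: a position sensor with additive Gaussian noise and a clamp filter.
rng = np.random.default_rng(0)
pos = Observable(
    "eef_pos",
    sensor=lambda cache: cache["eef_pos"],
    corrupter=lambda x: x + rng.normal(0.0, 0.01, size=x.shape),
    filt=lambda x: np.clip(x, -1.0, 1.0),
)
pos.update({"eef_pos": np.zeros(3)})
```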

Some standard corrupter and delayer function generators are provided ([deterministic / uniform / gaussian] [corruption / delay]), including dummy no-ops for standard functions. All of this can be found in [observables.py](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/observables.py#L150), and has been [heavily documented](http://robosuite.ai/docs/modules/sensors.html#observables).

An example script demo'ing the new functionality can be found in [demo_sensor_corruption.py](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/demos/demo_sensor_corruption.py).

Dynamics Randomization
Physical parameters governing the underlying physics model can now be modified in real-time via the [DynamicsModder](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/utils/mjmod.py#L1401) class in `mjmod.py`. This modder allows mid-sim randomization of the following supported properties, sorted by element group (for more information, see the [Mujoco XML Reference](http://www.mujoco.org/book/XMLreference.html)):

Opt (Global) Parameters
- `density`: Density of the medium (i.e.: air)
- `viscosity`: Viscosity of the medium (i.e.: air)

Body Parameters
- `position`: (x, y, z) Position of the body relative to its parent body
- `quaternion`: (qw, qx, qy, qz) Quaternion of the body relative to its parent body
- `inertia`: (ixx, iyy, izz) diagonal components of the inertia matrix associated with this body
- `mass`: mass of the body

Geom Parameters
- `friction`: (sliding, torsional, rolling) friction values for this geom
- `solref`: (timeconst, dampratio) contact solver values for this geom
- `solimp`: (dmin, dmax, width, midpoint, power) contact solver impedance values for this geom

Joint parameters
- `stiffness`: Stiffness for this joint
- `frictionloss`: Friction loss associated with this joint
- `damping`: Damping value for this joint
- `armature`: Gear inertia for this joint

The new `DynamicsModder` follows the same basic API as the other `Modder` classes, and allows randomization to be enabled per parameter and per group. Apart from randomization, this modder can also be used to selectively modify values at runtime. Detailed information can be found on our [docs page, along with an informative example script](http://robosuite.ai/docs/algorithms/sim2real.html#dynamics).
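A hedged sketch of selective runtime modification (the environment, body, and geom names below are illustrative; check your environment's model for the actual names):

```python
import robosuite as suite
from robosuite.utils.mjmod import DynamicsModder

# Illustrative setup: "Lift"/"Panda" and the element names are assumptions.
env = suite.make("Lift", robots="Panda", has_renderer=False, use_camera_obs=False)
env.reset()

modder = DynamicsModder(sim=env.sim)
modder.mod("cube_main", "mass", 5.0)                         # body parameter
modder.mod("table_collision", "friction", [2.0, 0.2, 0.04])  # geom parameter
modder.update()                                              # propagate changes to the sim
```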

Improvements
The following briefly describes other changes that improve on the pre-existing structure. This is not an exhaustive list, but a highlighted list of changes.

- robosuite docs have been completely overhauled! Hopefully no more broken links or outdated APIs (159)

- Tuned parallel jaw grippers (`PandaGripper` and `RethinkGripper`) for improved grasp stability and significantly reduced sliding

- Tuned default gains for teleoperation when using keyboard device

- Added small frictionloss, damping, and armature values to all robot joints by default to improve stability and reduce no-op drift over time

- Simulation now uses realistic values for air density and viscosity

- `tune_camera.py` is more flexible: can now take string names as inputs and also modify any camera active in the simulation (e.g.: `eye_in_hand`!)

- Add macros for automatically concatenating image frames from different cameras

- Add new papers

- Improve documentation semantics


Critical Bug Fixes
- Fix interpolator dimension bug (181)

- Fixed xml object naming bug (150)

- Fixed deterministic action playback (note [a caveat](http://robosuite.ai/docs/algorithms/demonstrations.html#warnings) and a [test script](https://github.com/ARISE-Initiative/robosuite/blob/master/tests/test_environments/test_action_playback.py) for verifying this) (#178)

Other Bug Fixes
- Fix contact geom bug with Jaco, and a [test script](https://github.com/ARISE-Initiative/robosuite/blob/master/tests/test_robots/test_all_robots.py) to test contact geoms for all robots (#180)

- Fix OSC position default orientation and when to update orientation goals

- Fix OSC orientation to be consistent with global coordinate axes frame of reference

- Fix spacemouse import bug (186)

-------

Contributor Spotlight
A big thank you to the following community members for contributing PRs for this release!
hermanjakobsen
zhuyifengzju

1.1
