40 commits
6808a09
visual-based tactile sensor impl. and shape sensing example
JuanaDd Sep 10, 2025
5f83dd2
Merge branch 'main' into tacsl_demo
Mayankm96 Sep 11, 2025
8eecdc0
Merge branch 'main' into tacsl_demo
kellyguo11 Sep 23, 2025
7c90c47
load assets from nucleus
JuanaDd Sep 24, 2025
ec93980
cherrypick document update
JuanaDd Sep 24, 2025
9115636
update documentation and setup.py, formatting
JuanaDd Sep 24, 2025
e86b370
Optimized tensor operations in `VisuoTactileSensor` to improve effici…
JuanaDd Sep 24, 2025
a8e1c98
Delete unused functions
JuanaDd Sep 25, 2025
6137678
Merge branch 'main' into tacsl_demo
JuanaDd Oct 27, 2025
d86cb35
Merge branch 'main' into tacsl_demo
JuanaDd Nov 3, 2025
da589c8
improve config, naming, sensor-specific settings
JuanaDd Nov 3, 2025
95aa18e
improve doc
JuanaDd Nov 3, 2025
58fd12d
Update render and visualization impl as suggested
JuanaDd Nov 3, 2025
b958f49
Address greptile's comments
JuanaDd Nov 4, 2025
d7ffe14
Add a customized spawn function to modify material
JuanaDd Nov 4, 2025
75e8e62
Merge branch 'main' into tacsl_demo
JuanaDd Nov 5, 2025
41ca4fb
Remove hardcoding of tactile sensor data path
JuanaDd Nov 5, 2025
9655791
Remove todo
JuanaDd Nov 5, 2025
d79f34c
Simplify configs
JuanaDd Nov 5, 2025
e91c92f
Merge branch 'main' into tacsl_demo
JuanaDd Nov 9, 2025
dda9189
Simplify sensor configs and rename indenter terminology
JuanaDd Nov 9, 2025
ad90a2f
Add test for customized spawn function
JuanaDd Nov 9, 2025
cddc2c5
Add proper exceptions and tests for the new sensor
JuanaDd Nov 9, 2025
1dc1c03
Update the doc to clarify camera and material spawning, and elastomer…
JuanaDd Nov 9, 2025
c833acc
Remove TODO and redundant checks
JuanaDd Nov 11, 2025
7943ac2
Merge branch 'main' into tacsl_demo
JuanaDd Nov 11, 2025
931d857
Merge branch 'main' into tacsl_demo
JuanaDd Nov 12, 2025
bd8a93b
Simplify conditions; Remove small functions; Fix code-style issues;
JuanaDd Nov 13, 2025
f99338a
Fix teardown oder issue of tests
JuanaDd Nov 13, 2025
5bbcf0f
Relocate script and update docs
JuanaDd Nov 13, 2025
c9c8aca
Remove tacsl assets from tracking and add to .gitignore
JuanaDd Nov 13, 2025
7d789d5
Align naming for clarity and consistency; refactor data retrieval logic
JuanaDd Nov 27, 2025
1aea218
Improved params in cfg
JuanaDd Nov 28, 2025
2486255
Make camera update_period consistent with the sensor
JuanaDd Nov 28, 2025
fde2803
Merge remote-tracking branch 'upstream/main' into tacsl_demo
JuanaDd Nov 28, 2025
3752152
Fix merge conflict
JuanaDd Nov 28, 2025
55ac135
Rename visualize to compute
JuanaDd Nov 28, 2025
a597bb0
Update docstring of mm_per_pixel
JuanaDd Nov 28, 2025
3bef862
Update source/isaaclab/isaaclab/sensors/tacsl_sensor/visuotactile_sen…
JuanaDd Nov 28, 2025
1d5adef
Update test_visuotactile_sensor.py with tactile_margin param added
JuanaDd Nov 28, 2025
4 changes: 4 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -69,3 +69,7 @@ tests/

# Docker history
.isaac-lab-docker-history

# TacSL sensor
**/tactile_record/*
**/gelsight_r15_data/*
1 change: 1 addition & 0 deletions CONTRIBUTORS.md
@@ -86,6 +86,7 @@ Guidelines for modifications:
* Jingzhou Liu
* Jinqi Wei
* Johnson Sun
* Juana Du
* Kaixi Bao
* Kris Wilson
* Kourosh Darvish
2 changes: 2 additions & 0 deletions docs/conf.py
@@ -190,6 +190,8 @@
"nvidia.srl",
"flatdict",
"IPython",
"cv2",
"imageio",
"ipywidgets",
"mpl_toolkits",
]
(Four binary image files in this diff could not be displayed.)

Contributor: Only this diagram seems to be getting used. Please remove any unused images.
22 changes: 22 additions & 0 deletions docs/source/api/lab/isaaclab.sensors.rst
@@ -33,6 +33,9 @@
RayCasterCameraCfg
Imu
ImuCfg
VisuoTactileSensor
VisuoTactileSensorCfg
VisuoTactileSensorData

Sensor Base
-----------
@@ -166,3 +169,22 @@ Inertia Measurement Unit
:inherited-members:
:show-inheritance:
:exclude-members: __init__, class_type

Visuo-Tactile Sensor
--------------------

.. autoclass:: VisuoTactileSensor
:members:
:inherited-members:
:show-inheritance:

.. autoclass:: VisuoTactileSensorData
:members:
:inherited-members:
:exclude-members: __init__

.. autoclass:: VisuoTactileSensorCfg
:members:
:inherited-members:
:show-inheritance:
:exclude-members: __init__, class_type
1 change: 1 addition & 0 deletions docs/source/overview/core-concepts/sensors/index.rst
@@ -19,3 +19,4 @@ The following pages describe the available sensors in more detail:
frame_transformer
imu
ray_caster
visuo_tactile_sensor
204 changes: 204 additions & 0 deletions docs/source/overview/core-concepts/sensors/visuo_tactile_sensor.rst
@@ -0,0 +1,204 @@
.. _overview_sensors_tactile:

.. currentmodule:: isaaclab

Visuo-Tactile Sensor
====================


The visuo-tactile sensor in Isaac Lab provides realistic tactile feedback through integration with TacSL (Tactile Sensor Learning) [Akinola2025]_. It is designed to simulate high-fidelity tactile interactions, generating both visual and force-based data that mirror real-world tactile sensors like GelSight devices. The sensor can provide tactile RGB images, force field distributions, and other intermediate tactile measurements essential for robotic manipulation tasks requiring fine tactile feedback.


.. figure:: ../../../_static/overview/sensors/tacsl_diagram.jpg
   :align: center
   :figwidth: 100%
   :alt: Tactile sensor with RGB visualization and force fields


Configuration
~~~~~~~~~~~~~

Tactile sensors require specific configuration parameters to define their behavior and data collection properties. The sensor can be configured with various parameters including sensor resolution, force sensitivity, and output data types.

.. code-block:: python

    from isaaclab.sensors.tacsl_sensor import VisuoTactileSensorCfg
    from isaaclab.sensors import TiledCameraCfg
    from isaaclab_assets.sensors import GELSIGHT_R15_CFG

    # Tactile sensor configuration
    tactile_sensor = VisuoTactileSensorCfg(
        prim_path="{ENV_REGEX_NS}/Robot/elastomer/tactile_sensor",
        # Sensor configuration
        render_cfg=GELSIGHT_R15_CFG,
        enable_camera_tactile=True,
        enable_force_field=True,
        # Elastomer configuration
        tactile_array_size=(20, 25),
        tactile_margin=0.003,
        # Contact object configuration
        contact_object_prim_path_expr="{ENV_REGEX_NS}/contact_object",
        # Force field physics parameters
        normal_contact_stiffness=1.0,
        friction_coefficient=2.0,
        tangential_stiffness=0.1,
        # Camera configuration
        camera_cfg=TiledCameraCfg(
            prim_path="{ENV_REGEX_NS}/Robot/elastomer_tip/cam",
            update_period=1 / 60,  # 60 Hz
            height=320,
            width=240,
            data_types=["distance_to_image_plane"],
            spawn=None,  # camera is already spawned in the USD file
        ),
    )

The configuration supports customization of:

* **Render Configuration**: Specify the GelSight sensor rendering parameters using predefined configs
  (e.g., ``GELSIGHT_R15_CFG``, ``GELSIGHT_MINI_CFG`` from ``isaaclab_assets.sensors``)
* **Tactile Modalities**:

  * ``enable_camera_tactile`` - Enable tactile RGB imaging through camera sensors
  * ``enable_force_field`` - Enable force field computation and visualization

* **Force Field Grid**: Set the tactile grid dimensions (``tactile_array_size``) and margin, which directly determine the spatial resolution of the computed force field
* **Contact Object Configuration**: Define the interacting objects using prim path expressions that locate objects with SDF collision meshes
* **Physics Parameters**: Control the sensor's force field computation:

  * ``normal_contact_stiffness``, ``tangential_stiffness``, ``friction_coefficient`` - Penalty stiffness for normal forces, penalty stiffness for tangential (shear) forces, and the friction coefficient that bounds shear forces

* **Camera Settings**: Configure resolution, update rates, and data types. Currently only ``distance_to_image_plane`` (an alias for ``depth``) is supported.
  ``spawn`` defaults to ``None``, meaning the camera is expected to be already defined in the USD file.
  To spawn the camera yourself (for example, to set the focal length), provide a valid spawn configuration instead.
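
To make the force-field grid concrete, here is a small, self-contained sketch (plain NumPy, not the Isaac Lab API) of how ``tactile_array_size`` and ``tactile_margin`` could lay out query points over a planar elastomer patch; the patch dimensions below are hypothetical values chosen for illustration:

```python
import numpy as np

def tactile_query_grid(patch_size, array_size, margin):
    """Lay out tactile force-field query points on a planar patch.

    patch_size: (width, height) of the elastomer surface in meters (hypothetical).
    array_size: (rows, cols) of the tactile grid, as in ``tactile_array_size``.
    margin: edge margin in meters, as in ``tactile_margin``.
    """
    width, height = patch_size
    rows, cols = array_size
    xs = np.linspace(margin, width - margin, cols)   # columns span the width
    ys = np.linspace(margin, height - margin, rows)  # rows span the height
    xx, yy = np.meshgrid(xs, ys)
    return np.stack([xx, yy], axis=-1)  # (rows, cols, 2) grid of points

# Hypothetical 24 mm x 32 mm patch, 20x25 grid, 3 mm margin
points = tactile_query_grid((0.024, 0.032), (20, 25), 0.003)
print(points.shape)  # (20, 25, 2)
```

A larger ``tactile_array_size`` therefore gives a finer force-field resolution at proportionally higher compute cost, while the margin keeps query points away from the patch edges.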

Configuration Requirements
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. important::
   The following requirements must be satisfied for proper sensor operation:

   **Camera Tactile Imaging**
   If ``enable_camera_tactile=True``, a valid ``camera_cfg`` (``TiledCameraCfg``) must be provided with appropriate camera parameters.

   **Force Field Computation**
   If ``enable_force_field=True``, the following parameter is required:

   * ``contact_object_prim_path_expr`` - Prim path expression to locate contact objects with SDF collision meshes

   **SDF Computation**
   When force field computation is enabled, penalty-based normal and shear forces are computed using Signed Distance Field (SDF) queries. To achieve GPU acceleration:

   * Interacting objects should have SDF collision meshes
   * An SDFView must be defined during initialization; therefore, interacting objects must be specified before the simulation starts

   **Elastomer Configuration**
   The sensor's ``prim_path`` must be configured as a child of the elastomer prim in the USD hierarchy.
   The query points for the force field computation are sampled from the surface of the elastomer mesh, which is located under the elastomer's prim path.

   **Physics Materials**
   The sensor uses physics materials to configure the compliant contact properties of the elastomer.
   By default, physics material properties are pre-configured in the USD asset. However, you can override
   these properties by specifying the following parameters in ``UsdFileWithPhysicsMaterialOnPrimsCfg`` when
   spawning the robot:

   * ``compliant_contact_stiffness`` - Contact stiffness for the elastomer surface
   * ``compliant_contact_damping`` - Contact damping for the elastomer surface
   * ``apply_physics_material_prim_path`` - Prim path where the physics material is applied (typically ``"elastomer"``)

   If any parameter is set to ``None``, the corresponding property from the USD asset is retained.
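
As a sketch of such an override (the import location of ``UsdFileWithPhysicsMaterialOnPrimsCfg`` and the USD path below are assumptions for illustration; check them against your Isaac Lab installation):

```python
# NOTE: the import path is an assumption for illustration; locate
# UsdFileWithPhysicsMaterialOnPrimsCfg in your Isaac Lab installation.
from isaaclab.sensors.tacsl_sensor import UsdFileWithPhysicsMaterialOnPrimsCfg

robot_spawn_cfg = UsdFileWithPhysicsMaterialOnPrimsCfg(
    usd_path="<path/to/sensor_asset.usd>",  # placeholder asset path
    # Override the elastomer's compliant-contact material; set a field to None
    # to retain the value already stored in the USD asset.
    compliant_contact_stiffness=100.0,
    compliant_contact_damping=1.0,
    apply_physics_material_prim_path="elastomer",
)
```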


Usage Example
~~~~~~~~~~~~~

To use the tactile sensor in a simulation environment, run the demo:

.. code-block:: bash

   cd scripts/demos/sensors
   python tacsl_sensor.py --use_tactile_rgb --use_tactile_ff --tactile_compliance_stiffness 100.0 --tactile_compliant_damping 1.0 --contact_object_type nut --num_envs 16 --save_viz --enable_cameras

Available command-line options include:

* ``--use_tactile_rgb``: Enable camera-based tactile sensing
* ``--use_tactile_ff``: Enable force field tactile sensing
* ``--contact_object_type``: Specify the type of contact object (nut, cube, etc.)
* ``--num_envs``: Number of parallel environments
* ``--save_viz``: Save visualization outputs for analysis
* ``--tactile_compliance_stiffness``: Override compliant contact stiffness (default: use USD asset values)
* ``--tactile_compliant_damping``: Override compliant contact damping (default: use USD asset values)
* ``--normal_contact_stiffness``: Normal contact stiffness for force field computation
* ``--tangential_stiffness``: Tangential stiffness for shear forces
* ``--friction_coefficient``: Friction coefficient for shear forces
* ``--debug_sdf_closest_pts``: Visualize closest SDF points for debugging
* ``--debug_tactile_sensor_pts``: Visualize tactile sensor points for debugging
* ``--trimesh_vis_tactile_points``: Enable trimesh-based visualization of tactile points

For a complete list of available options:

.. code-block:: bash

   python tacsl_sensor.py -h

.. note::
   The demo examples are based on the GelSight R1.5, a prototype sensor that has since been discontinued. The same procedure can be adapted to other visuo-tactile sensors.

Contributor Author: actually all added images are used in the doc. tacsl_demo.jpg is used here, and the other two jpg files are used at lines 152 and 157.

.. figure:: ../../../_static/overview/sensors/tacsl_demo.jpg
   :align: center
   :figwidth: 100%
   :alt: TacSL tactile sensor demo showing RGB tactile images and force field visualizations

The tactile sensor supports multiple data modalities that provide comprehensive information about contact interactions.

Output Tactile Data
~~~~~~~~~~~~~~~~~~~

**RGB Tactile Images**
Real-time generation of tactile RGB images as objects make contact with the sensor surface. These images show deformation patterns and contact geometry similar to gel-based tactile sensors [Si2022]_.


**Force Fields**
Detailed contact force field and pressure distributions across the sensor surface, including normal and shear components.
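
The penalty-based model described under Configuration Requirements can be sketched in a few lines of NumPy. This is an illustrative approximation, not the TacSL implementation: a sphere stands in for the contact object's SDF, penetration depth sets the normal force, a tangential spring sets the raw shear force, and the Coulomb friction cone clamps it:

```python
import numpy as np

def sphere_sdf(points, center, radius):
    """Signed distance to a sphere: negative inside the surface, positive outside."""
    return np.linalg.norm(points - center, axis=-1) - radius

def penalty_contact_forces(points, tangential_disp, center, radius,
                           normal_stiffness, tangential_stiffness, friction_coefficient):
    """Illustrative penalty-based normal/shear forces at tactile query points."""
    depth = np.clip(-sphere_sdf(points, center, radius), 0.0, None)  # penetration depth
    f_normal = normal_stiffness * depth                # normal force magnitude per point
    f_shear = tangential_stiffness * tangential_disp   # raw tangential-spring shear, (N, 2)
    shear_mag = np.linalg.norm(f_shear, axis=-1)
    limit = friction_coefficient * f_normal            # Coulomb friction cone radius
    scale = np.ones_like(shear_mag)
    over = shear_mag > limit
    scale[over] = limit[over] / shear_mag[over]        # project shear back onto the cone
    return f_normal, f_shear * scale[:, None]

# One point penetrating a 1 cm sphere by 1 mm, one point out of contact
pts = np.array([[0.0, 0.0, 0.009], [0.0, 0.0, 0.02]])
slip = np.array([[0.002, 0.0], [0.002, 0.0]])  # tangential displacement of the gel
f_n, f_s = penalty_contact_forces(pts, slip, center=np.zeros(3), radius=0.01,
                                  normal_stiffness=1.0, tangential_stiffness=0.1,
                                  friction_coefficient=2.0)
# f_n[0] is ~0.001; f_n[1] is 0.0, so f_s[1] is clamped to zero (empty friction cone)
```

Note how the second point, which is out of contact, produces no normal force and therefore no shear either: the friction cone has zero radius there.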

.. list-table::
   :widths: 50 50
   :class: borderless

   * - .. figure:: ../../../_static/overview/sensors/tacsl_taxim_example.jpg
          :align: center
          :figwidth: 80%
          :alt: Tactile output with RGB visualization

     - .. figure:: ../../../_static/overview/sensors/tacsl_force_field_example.jpg
          :align: center
          :figwidth: 80%
          :alt: Tactile output with force field visualization
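
For logging or reward shaping it is often handy to reduce the per-point force field to a few scalars. A minimal NumPy sketch, assuming a ``(rows, cols)`` normal-force array and a ``(rows, cols, 2)`` shear-force array (the shapes and values are hypothetical):

```python
import numpy as np

def summarize_force_field(normal_force, shear_force):
    """Reduce per-point tactile forces to scalar summaries (illustrative)."""
    net_normal = normal_force.sum()                     # total normal force
    net_shear = shear_force.reshape(-1, 2).sum(axis=0)  # net shear vector
    contact_ratio = (normal_force > 0.0).mean()         # fraction of grid in contact
    return net_normal, net_shear, contact_ratio

# Hypothetical 20x25 force-field grid with a 4x5 contact patch
normal = np.zeros((20, 25))
normal[8:12, 10:15] = 0.5
shear = np.zeros((20, 25, 2))
shear[8:12, 10:15, 0] = 0.05
net_n, net_s, ratio = summarize_force_field(normal, shear)
# net_n -> 10.0 (20 points x 0.5), net_s ~ [1.0, 0.0], ratio -> 0.04
```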

Integration with Learning Frameworks
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The tactile sensor is designed to integrate seamlessly with reinforcement learning and imitation learning frameworks. The structured tensor outputs can be directly used as observations in learning algorithms:

.. code-block:: python

    def get_tactile_observations(self):
        """Extract tactile observations for learning."""
        tactile_data = self.scene["tactile_sensor"].data

        # Tactile RGB image
        tactile_rgb = tactile_data.tactile_rgb_image

        # Tactile depth image
        tactile_depth = tactile_data.tactile_depth_image

        # Force field components
        tactile_normal_force = tactile_data.tactile_normal_force
        tactile_shear_force = tactile_data.tactile_shear_force

        return [tactile_rgb, tactile_depth, tactile_normal_force, tactile_shear_force]
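
For use as a policy observation, these tensors typically need to be flattened and concatenated per environment. A minimal sketch with NumPy arrays standing in for the sensor outputs (all shapes here are hypothetical):

```python
import numpy as np

def flatten_tactile_obs(tactile_rgb, tactile_depth, normal_force, shear_force):
    """Concatenate per-env tactile tensors into one flat observation vector each."""
    num_envs = tactile_rgb.shape[0]
    parts = [tactile_rgb, tactile_depth, normal_force, shear_force]
    return np.concatenate([p.reshape(num_envs, -1) for p in parts], axis=1)

# Hypothetical shapes: 4 envs, 320x240 images, 20x25 force-field grid
rgb = np.zeros((4, 320, 240, 3))
depth = np.zeros((4, 320, 240, 1))
fn = np.zeros((4, 20, 25))
fs = np.zeros((4, 20, 25, 2))
obs = flatten_tactile_obs(rgb, depth, fn, fs)
# obs.shape == (4, 308700)
```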



References
~~~~~~~~~~

Contributor Author: @iakinola23 Could you help view this documentation here? Thanks!

Contributor Author: Hi @iakinola23 updated the documentation with your edits.

.. [Akinola2025] Akinola, I., Xu, J., Carius, J., Fox, D., & Narang, Y. (2025). TacSL: A library for visuotactile sensor simulation and learning. *IEEE Transactions on Robotics*.
.. [Si2022] Si, Z., & Yuan, W. (2022). Taxim: An example-based simulation model for GelSight tactile sensors. *IEEE Robotics and Automation Letters*, 7(2), 2361-2368.