explainable_ros

This repository provides a ROS 2 package for generating explanations in autonomous robots based on log analysis using LLMs.

explainable_ros uses Retrieval-Augmented Generation (RAG) to filter the most relevant logs from those generated by the robot during the execution of its behavior.

To enhance the robot's internal data, a Vision-Language Model (VLM) processes the images captured by the onboard camera, describes them, and logs the descriptions through /rosout. This makes it possible to combine the logs generated by the robot's subsystems with information about the environment. The workflow of the system is illustrated in the following figure:

System Workflow Image

The high-level representation of the components that make up the system is shown in the following figure.

System High Level Components
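
As a rough illustration of the RAG step described above, the following minimal Python sketch retrieves the log entries most similar to a question and builds an LLM prompt from them. It is a toy, library-free sketch: the bag-of-words embed() and cosine similarity() helpers stand in for the llama_ros embedding and reranker models the real system uses.

# Toy illustration of the RAG filtering step: embed the question and every log
# line, keep the top-k most similar logs, and build the prompt from them.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # stand-in for a real embedding model: simple bag-of-words counts
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> float:
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(question: str, logs: list[str], k: int = 3) -> list[str]:
    q = embed(question)
    return sorted(logs, key=lambda log: similarity(q, embed(log)), reverse=True)[:k]

logs = [
    "[navigation] Goal reached at waypoint kitchen",
    "[battery] Battery level at 78 percent",
    "[vision] Detected a person near the doorway",
]
question = "Why did the robot stop near the door?"
context = "\n".join(retrieve(question, logs))
prompt = f"Answer using only these logs:\n{context}\nQuestion: {question}"
print(prompt)  # the real system sends this prompt to the base LLM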

Table of Contents

  1. How To
  2. Related Works
  3. Cite
  4. Acknowledgments

How To

Installation

Note that the examples shown in the Usage section were created using this rosbag.

Prerequisites

You must have llama_ros and the CUDA Toolkit (a llama_ros dependency) installed.

explainable_ros Installation Steps

cd ros2_ws/src
git clone https://github.com/Dsobh/explainable_ros.git
pip install -r explainable_ros/requirements.txt

cd ../
colcon build
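
After building, source the workspace overlay so the new package is visible to ros2:

source install/setup.bash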

You can also use Docker. To do this, build the Dockerfile found in the root of this repository or pull the corresponding image from Docker Hub.
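
For example, to build the image locally (explainable_ros:latest is just a local tag of your choice):

docker build -t explainable_ros:latest .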

Usage

For the examples shown in this section we use the following models (available in llama_ros repository):

  • Embedding model: bge-base-en-v1.5.yaml
  • Reranker model: jina-reranker
  • Base model: Qwen2

Local

  • Run the embedding model:
ros2 llama launch ~/ros2_ws/src/llama_ros/llama_bringup/models/bge-base-en-v1.5.yaml
  • Run reranker model:
ros2 llama launch ~/ros2_ws/src/llama_ros/llama_bringup/models/jina-reranker.yaml
  • Run the base model:
ros2 llama launch ~/ros2_ws/src/llama_ros/llama_bringup/models/Qwen2.yaml
  • Now you can run the main node of the system:
ros2 run explainable_ros explainability_node

This node subscribes to the /rosout topic and processes the logs to add them to the context of the LLM. You can play a rosbag file (for example, the one linked in the Installation section) to generate logs and test the operation of the system.
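
As a rough sketch of this pattern (not the package's actual node), a minimal /rosout subscriber looks like the following; the real explainability_node additionally embeds and stores each log for later retrieval.

# Minimal /rosout listener sketch: collect log messages so they can later be
# used as LLM context. The real explainability_node does the RAG indexing here.
import rclpy
from rclpy.node import Node
from rcl_interfaces.msg import Log

class LogCollector(Node):
    def __init__(self):
        super().__init__("log_collector")
        self.logs = []
        self.create_subscription(Log, "/rosout", self.on_log, 100)

    def on_log(self, msg: Log):
        self.logs.append(f"[{msg.name}] {msg.msg}")  # the real node embeds and stores this entry

def main():
    rclpy.init()
    rclpy.spin(LogCollector())

if __name__ == "__main__":
    main()

To generate logs, play a rosbag in another terminal, e.g. ros2 bag play <path_to_rosbag>.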

  • To request an explanation, use the /question service:
ros2 service call /question explainable_ros_msgs/srv/Question "{'question': 'What is happening?'}"
Example

[ADD IMG]
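
You can also call the service from your own node. The sketch below is a generic ROS 2 service client: the request field question matches the CLI call above, while the response message is printed as a whole because its field names are not shown here.

# Hedged sketch of a programmatic client for the /question service.
import rclpy
from rclpy.node import Node
from explainable_ros_msgs.srv import Question

def main():
    rclpy.init()
    node = Node("explanation_client")
    client = node.create_client(Question, "/question")
    client.wait_for_service()
    request = Question.Request()
    request.question = "What is happening?"
    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    print(future.result())  # response containing the generated explanation
    rclpy.shutdown()

if __name__ == "__main__":
    main()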

Docker [WIP]

Run the container:

sudo docker run --rm -it --entrypoint bash <docker_name:tag>

Open a second shell in the running container:

docker exec -it <container_id> bash

Using VLM component

  • Run a VLM model
ros2 launch llama_bringup minicpm-2.6.launch.py
  • Run the visual describer node
ros2 run explainable_ros visual_descriptor_node

This node subscribes to the /camera/rgb/image_raw topic and, every 5 seconds, describes the image captured by the camera and logs the description to /rosout.
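
A rough sketch of this pattern (image subscriber plus a 5-second timer that logs a description) is shown below. describe_image() is a hypothetical placeholder for the VLM call the real node makes through llama_ros.

# Sketch of the visual-descriptor pattern: keep the latest camera image and,
# every 5 seconds, log a textual description of it so it ends up in /rosout.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

def describe_image(img: Image) -> str:
    # placeholder for the real VLM call
    return f"camera frame of size {img.width}x{img.height}"

class VisualDescriptor(Node):
    def __init__(self):
        super().__init__("visual_descriptor")
        self.last_image = None
        self.create_subscription(Image, "/camera/rgb/image_raw", self.on_image, 10)
        self.create_timer(5.0, self.on_timer)

    def on_image(self, msg: Image):
        self.last_image = msg

    def on_timer(self):
        if self.last_image is not None:
            # get_logger().info() publishes to /rosout, where the explainability node reads it
            self.get_logger().info(describe_image(self.last_image))

def main():
    rclpy.init()
    rclpy.spin(VisualDescriptor())

if __name__ == "__main__":
    main()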

Related Works

Other Software Projects

  • llama_ros → A set of ROS 2 packages that integrate llama.cpp into ROS 2.

Related Datasets

A series of rosbags (ROS 2 Humble) published on Zenodo is listed below. These rosbags can be used to test the explainability capabilities of the project.

Papers

Cite

If your work uses this repository, please cite the repository or the following paper:

@article{sobrin2024explaining,
  title={Explaining Autonomy: Enhancing Human-Robot Interaction through Explanation Generation with Large Language Models},
  author={Sobr{\'\i}n-Hidalgo, David and Gonz{\'a}lez-Santamarta, Miguel A and Guerrero-Higueras, {\'A}ngel M and Rodr{\'\i}guez-Lera, Francisco J and Matell{\'a}n-Olivera, Vicente},
  journal={arXiv preprint arXiv:2402.04206},
  year={2024}
}

Acknowledgments

This project has been partially funded by the Recovery, Transformation, and Resilience Plan, financed by the European Union (Next Generation), through the TESCAC project (Traceability and Explainability in Autonomous Systems for improved Cybersecurity) granted by INCIBE to the University of León, and by the EDMAR (Explainable Decision Making in Autonomous Robots) project, grant PID2021-126592OB-C21 funded by MCIN/AEI/10.13039/501100011033 and by ERDF "A way of making Europe".
