[QUESTION] Image rendered by Unity stream to ROS topic #31

Open
peneroka opened this issue May 25, 2021 · 3 comments

@peneroka

Hello! Is there an example I can launch that demonstrates using Unity to render an image stream and subscribing to it from ROS (or something similar)? Where is it located, and how do I launch it? Thanks in advance.

@harishanand95 (Member) commented May 27, 2021

Hi @peneroka,

Since Unity requires users to agree to its license, we cannot provide the Unity application inside our containers. However, we have documented the steps to install Unity inside the containers (https://openuav.us/#unity); you will need to agree to the Unity license yourself.

For the projects you are working on, you can create a Unity world similar to the one in Gazebo and use ROS-Sharp (https://github.com/siemens/ros-sharp) to communicate the positions of Gazebo objects to Unity. In our experiments, we disable Unity physics and use the physics from Gazebo through ROS-Sharp. This lets us rely only on Gazebo for physics while the photorealistic outdoor rendering is done by Unity.

More details here: https://download.openuas.us/Thesis.pdf
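
To make the "Gazebo physics, Unity rendering" idea concrete, here is a minimal sketch (not code from this repo) of a bridge node that republishes one Gazebo model's pose so a ROS-Sharp pose subscriber in Unity can move the corresponding rendered object. The model name and output topic are assumptions; adjust them to your world and ROS-Sharp configuration.

```python
#!/usr/bin/env python
# Sketch of a Gazebo-to-Unity pose bridge (hypothetical names, not the repo's code):
# republish one model's pose from /gazebo/model_states as a PoseStamped topic
# that a ROS-Sharp subscriber in Unity can consume.
import rospy
from gazebo_msgs.msg import ModelStates
from geometry_msgs.msg import PoseStamped

MODEL_NAME = "iris"  # assumed model name; change to match your Gazebo world

def on_model_states(msg, pub):
    # ModelStates carries parallel lists of model names and poses.
    if MODEL_NAME not in msg.name:
        return
    idx = msg.name.index(MODEL_NAME)
    out = PoseStamped()
    out.header.stamp = rospy.Time.now()
    out.header.frame_id = "world"
    out.pose = msg.pose[idx]
    pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("gazebo_pose_bridge")
    pub = rospy.Publisher("/unity/" + MODEL_NAME + "/pose", PoseStamped, queue_size=1)
    rospy.Subscriber("/gazebo/model_states", ModelStates, on_model_states, callback_args=pub)
    rospy.spin()
```

With a node like this running, Unity only has to listen to pose topics and render; all motion is decided by Gazebo.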

@peneroka (Author)

Thanks for the kind answer @harishanand95

Yes, I know that the Unity software itself cannot be shared. But it would not violate the Unity license to share the project files, would it? I mean, maybe you could kindly share an example Unity project (plus its corresponding Gazebo world/SDF and ROS launch files), along with instructions on how to set it up, so we can stream the Unity rendering as an image topic in ROS.

Thanks in advance 👍

@harishanand95 (Member)

Hi @peneroka,

Unfortunately, I have other commitments, but I will try to get a sample UAV Unity project added to the repo.

The approach you can follow is well documented by the ROS-Sharp community. You can get TurtleBot models to communicate between Unity and Gazebo by following the instructions on these pages:

  1. https://github.com/siemens/ros-sharp/wiki
  2. Martin Bischoff's videos (https://www.youtube.com/channel/UCJT1xXEGfyPq3hahuuF9rQw/videos)

A few other considerations to look out for:

  1. The camera in Unity needs to be calibrated if you are doing any perception work. You can do this by adding a calibration object in Gazebo and using the ROS-Sharp URDF importer to load it in Unity. Run the ROS camera_calibration package and move the calibration object in Unity to obtain the calibration values (see the sketch after this list for publishing them).
  2. Also make sure the scale of objects in Unity is correct; relying on the URDF importer is the best option.
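
Once the camera_calibration package has produced intrinsics for the Unity camera, one way to make them available to perception nodes is to publish a CameraInfo topic alongside the Unity image stream. The sketch below is only an illustration: topic names, resolution, and the intrinsic values are placeholders, not values from this project.

```python
#!/usr/bin/env python
# Sketch: publish CameraInfo for the Unity-rendered camera using calibration
# results. All numbers and topic names below are placeholders.
import rospy
from sensor_msgs.msg import CameraInfo

def make_camera_info():
    info = CameraInfo()
    info.header.frame_id = "unity_camera"
    info.width, info.height = 640, 480            # Unity render resolution (assumed)
    fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0   # intrinsics from calibration (placeholders)
    info.distortion_model = "plumb_bob"
    info.D = [0.0, 0.0, 0.0, 0.0, 0.0]
    info.K = [fx, 0.0, cx, 0.0, fy, cy, 0.0, 0.0, 1.0]
    info.R = [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
    info.P = [fx, 0.0, cx, 0.0, 0.0, fy, cy, 0.0, 0.0, 0.0, 1.0, 0.0]
    return info

if __name__ == "__main__":
    rospy.init_node("unity_camera_info_publisher")
    pub = rospy.Publisher("/unity/camera/camera_info", CameraInfo, queue_size=1)
    info = make_camera_info()
    rate = rospy.Rate(30)  # match the Unity image publishing rate (assumed)
    while not rospy.is_shutdown():
        info.header.stamp = rospy.Time.now()
        pub.publish(info)
        rate.sleep()
```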

Here are a few other related works:

  1. Gazebo ground truth publisher, which publishes the actual positions of the Gazebo ModelStates: https://github.com/DREAMS-lab/gazebo_ground_truth
  2. Pose publisher for the models: https://github.com/DREAMS-lab/ros_sharp_pose_publisher
  3. ROS-Sharp image capture and camera info topic publisher with time synchronization (a minimal synchronizer sketch follows this list): https://github.com/DREAMS-lab/ros_sharp_time_synchronizer/blob/master/src/convert.cpp
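
For the third item, the core idea is to pair each Unity image with its CameraInfo by timestamp. The linked repo does this in C++; the snippet below is a rough rospy analogue (not that repo's code), and the topic names are assumptions that depend on your ROS-Sharp publisher setup.

```python
#!/usr/bin/env python
# Sketch: time-synchronize the Unity image stream with its CameraInfo so
# downstream perception nodes receive matched, calibrated frames.
import rospy
import message_filters
from sensor_msgs.msg import Image, CameraInfo

def on_synced(image_msg, info_msg):
    # Both messages arrive with (approximately) matching timestamps here.
    rospy.loginfo("Synced frame at %.3f (%dx%d)",
                  image_msg.header.stamp.to_sec(), image_msg.width, image_msg.height)

if __name__ == "__main__":
    rospy.init_node("unity_image_info_sync")
    image_sub = message_filters.Subscriber("/unity/camera/image_raw", Image)
    info_sub = message_filters.Subscriber("/unity/camera/camera_info", CameraInfo)
    sync = message_filters.ApproximateTimeSynchronizer([image_sub, info_sub],
                                                       queue_size=10, slop=0.05)
    sync.registerCallback(on_synced)
    rospy.spin()
```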
