OpenUAV Swarm Simulation Testbed

2020 CPS Challenge: "SoilScope - Mars edition"

The Mars 2020-inspired mission scenario for the 2020 NSF CPS Challenge will be a two-week virtual event (May 15-May 30), emulating an autonomous probe deployment science mission by a rover and drone duo at the Jezero crater landing site.

The OpenUAV Swarm Simulation Testbed: a Collaborative Design Studio for Field Robotics

OpenUAV is a multi-robot design studio that enables simulations to run as browser-accessible Lubuntu desktop containers. Our simulation testbed, based on ROS, Gazebo, and the PX4 flight stack, has been developed to facilitate collaborative mission planning and to serve as a sandbox for vision-based problems, collision avoidance, and multi-robot coordination for Unpiloted Aircraft Systems (UAS).

Jezero Crater world for NSF CPS Challenge Mars 2020 Edition in OpenUAV

OpenUAV Swarm testbed, with Unity Engine integration

Rover SLAM in OpenUAV environment with Bishop, CA terrain model

DREAMS lab underwater drone in OpenUAV Unity Environment

Two PX4 UAVs flying around a volcanic plume in OpenUAV

A PX4 UAV flying through a volcanic plume in OpenUAV

Acknowledgments

  • NSF grant CNS-1521617
  • USDA grant 2015-67021-23857
  • GRASP Lab, University of Pennsylvania
  • Penn Aerial Robotics
  • School of Earth and Space Exploration, ASU
  • Arizona State University
  • Thanks: Will Kessler

Goals

As a remotely accessible open source simulation platform, OpenUAV’s main purpose has been to lower the barrier to entry in research and development of autonomous vehicles. To this end, we aim to satisfy the following requirements:

  • Enable remote desktop access through browsers without compromising performance.
  • Provide an easy-to-use software development environment with support for remote code execution.
  • Replicate the actual resource constraints of vehicles by giving the simulation containers similar memory limits and computational capacity.
  • Minimize the risk of data breach through built-in encryption between client and server, without requiring additional protection layers such as Virtual Private Networks (VPNs).
  • Provide mechanisms for maintenance through daily build images, regular update releases, and data recovery mechanisms.
  • Provide visually realistic environments for simulations on on-premise systems.

Components

  • Gazebo: Gazebo is an open-source robotics simulator used to design robots and environments and to perform realistic rigid-body dynamics simulation.
  • ROS: The Robot Operating System is a robotics message-passing framework designed to simplify programming for various robots.
  • PX4: PX4 is an open-source flight control software for UAVs and other unpiloted vehicles.
  • QGroundControl: QGroundControl is a software package used for monitoring and mission planning for any MAVLink-enabled drone.
  • Unity: Unity is a cross-platform game engine developed primarily for 2D, 3D, and VR games. OpenUAV uses ROS-Sharp to add URDF robot models in Unity and to communicate between the container and the Unity game engine. A ROS connector plugin inside the container publishes the Pose and Twist of the Gazebo models, which are then rendered in Unity. The robot models in Unity have kinematics, but lack any Unity-based physics or collision effects. The OpenUAV testbed does not currently have Docker support for Unity, so Unity runs on the host machine and communicates with the simulation via port 9090 (see the sketch after this list).
  • Docker: Docker packages software and its dependencies into a light-weight container. Out of the box, Docker provides isolation from the host as well as from other containers. It also improves the security of the application by restricting the possible host system calls, providing network, process, and file namespace isolation, and running applications in least-privileged mode.

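The container-to-Unity link on port 9090 described above is the websocket interface that ROS-Sharp connects to. A minimal sketch of starting it inside the container, assuming the standard rosbridge_suite package:

roslaunch rosbridge_server rosbridge_websocket.launch port:=9090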

Setup

The GitHub repository for the project can be found here.

Building a GPU-enhanced Lubuntu Desktop with nvidia-docker2

To build on a plain vanilla Google Compute GPU host:

  • Spin up a GC GPU host on the Google Cloud console. Make sure it has at least one Tesla K80 GPU, a decent number of vCPUs (e.g. 4), and enough disk space (at least 50 GB). Zone us-east1-c seems to be the best choice as of April 1, 2018; see the sketch after this list.
  • Upload this repo and unpack it in /root/build or wherever you like as a temporary location.
  • Run preinstall.sh. This just runs apt-get update and puts in screen and emacs for getting started.
  • Run build.sh. This will build everything needed to start up a nvidia-docker2 container with Ubuntu 16.04 and Lubuntu desktop.
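A minimal sketch of provisioning such a host with the gcloud CLI is shown below; the instance name, machine type, and Ubuntu image family are assumptions, not requirements:

gcloud compute instances create openuav-gpu-host \
    --zone=us-east1-c \
    --machine-type=n1-standard-4 \
    --accelerator=type=nvidia-tesla-k80,count=1 \
    --maintenance-policy=TERMINATE \
    --boot-disk-size=50GB \
    --image-family=ubuntu-1604-lts \
    --image-project=ubuntu-os-cloud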

To build on local machine with GPU:

  • Replace autonomous_sys_build/xorg.conf with the xorg.conf file for your GPU (see the sketch after this list).
  • Run preinstall.sh. This just runs apt-get update and puts in screen and emacs for getting started.
  • Run build.sh. This will build everything needed to start up a nvidia-docker2 container with Ubuntu 16.04 and Lubuntu desktop.
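The per-GPU xorg.conf mentioned in the first step can be generated with nvidia-xconfig; a minimal sketch, where the BusId value is only an example:

# query the PCI BusId of the installed GPU
nvidia-xconfig --query-gpu-info
# generate a starting xorg.conf for that GPU (headless, as in the containers)
sudo nvidia-xconfig --output-xconfig=xorg.conf --busid=PCI:1:0:0 --use-display-device=None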

Running the container

To run the container on this host, use run.sh. Note that noVNC will expect connections on port 40001; then browse to your host on that port.
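For example (the /vnc.html path is the standard noVNC landing page and is an assumption about this image):

./run.sh
# then browse to the host on the noVNC port
# http://<host-ip>:40001/vnc.html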

Nginx configuration

To set up a local machine with Nginx proxying each port as a unique subdomain, please use the following Nginx template.

                
server {
        server_name ~^term-(?<subnum>\d+)\.openuav\.us$;
        listen 443 ssl; # managed by Certbot
        ssl_certificate /etc/letsencrypt/live/openuav.us/fullchain.pem; # managed by Certbot
        ssl_certificate_key /etc/letsencrypt/live/openuav.us/privkey.pem; # managed by Certbot
        include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
        ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
        auth_basic "Private Property";
        auth_basic_user_file /etc/nginx/.htpasswd;
        location / {
                proxy_set_header Upgrade $http_upgrade;
                proxy_set_header Connection $connection_upgrade;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header Host $host;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                if ($subnum ~ [0-9])
                {
                        proxy_pass http://127.0.0.1:400$subnum;
                }
                if ($subnum ~ [1-9][0-9])
                {
                        proxy_pass http://127.0.0.1:40$subnum;
                }
                proxy_ssl_certificate /etc/letsencrypt/live/openuav.us/fullchain.pem; # managed by Certbot
                proxy_ssl_certificate_key /etc/letsencrypt/live/openuav.us/privkey.pem; # managed by Certbot
                proxy_hide_header 'x-frame-options';
                proxy_read_timeout 61s;

                # Disable cache
                proxy_buffering off;
        }
}
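The $connection_upgrade variable referenced in the location block above is not defined in this server block; it is conventionally defined once in the http context with a map, for example:

map $http_upgrade $connection_upgrade {
        default upgrade;
        ''      close;
}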
                
              

  • In the local setup, if you switch users, the containers have to be restarted to pick up the Xorg changes. If your machine has more than one GPU, you can avoid this by running a separate X server on the second GPU and running the containers against it.

2020 CPS Challenge "SoilScope - Mars edition"

The Mars 2020-inspired mission scenario for the 2020 NSF CPS Challenge will be a two-week virtual event, emulating an autonomous probe deployment science mission by a rover and drone duo at the Jezero crater landing site. Teams will use the CPS-VO to access the OpenUAV simulation environment.
Event Details
More details about the event can be found on the NSF CPS Challenge event page.
Event Rules
  • Solutions have to be autonomous and written in ROS C++ or Python code.
  • A one-minute video of each finished task needs to be submitted, showing at least three trials.
  • Code should be maintained in a GitHub repo for persistence and verification.
  • For the qualifying round, use of deep neural networks is not allowed; instead, teams should leverage fiducials such as AprilTag and ArUco markers.
Why participate
  • Develop autonomy for a UAV team in a fun setting.
  • Engage in agile design iterations, both for software and hardware.
  • Experiment with complex mission scenarios using powerful cloud-based simulation tools.
  • Repurpose solutions to other problems, such as searching for a strategic location to deploy and recover a sensor probe.
Join the virtual competition
Join the virtual competition through the CPS-VO (see the setup steps below).

CPS-VO Setup

The Cyber-Physical Systems Virtual Organization (CPS-VO) is a collaborative environment set up to foster communication among CPS professionals in academia, government, and industry.
Creating an account
  • Create an account at cps-vo.org.
  • Select the 2020 CPS Challenge group in the group selection tab.
  • Complete the account registration.
  • Send an email to jdas5(at)asu(dot)edu stating your purpose for using OpenUAV simulations. Please note that you will not be able to use this environment without completing this step.
Launching the environment
  • Log in to the account and go to the Design tab. An icon with the title OpenUAS - ASU should be visible. Click this icon to create a simulation instance.
  • Launch a new instance using the icon. A new instance will be initialized and ready to go.
Relaunching the environment
  • You should also be able to view your instance status within the group.
  • Always suspend your instance once you're done with your session. When you resume the session, you can pick up where you left off.
  • Stopping your instance will DELETE all the simulation files.

On-Premise Setup

The following steps were followed to setup OpenUAV on a machine (tesseract) with NVIDIA TITAN RTX. The operating system on the machine is Ubuntu 18.04.
Setting up the Desktop Manager
We use lightdm as our desktop manager instead of GDM. This makes the configuration easier to set up and manage.
  • Install lightdm using the following command
    sudo apt-get install lightdm
  • You should be prompted with the following messages after installation.
  • Choose lightdm from the options to complete the installation.
  • If you were not prompted, run the following command.
    sudo dpkg-reconfigure lightdm
    This should give you the prompt to choose the desktop manager.
  • The desktop manager choice will take effect after the next restart (see the check below).
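After the restart, you can confirm which display manager is active with a quick check; the expected output assumes the lightdm choice above:

cat /etc/X11/default-display-manager
# expected output for this setup: /usr/sbin/lightdm
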
Configuration
  • Navigate to /etc/X11. Create a file xorg.conf.openuav and add the following content.
                        
    Section "ServerLayout"
        Identifier     "Layout0"
        Screen      0  "Screen0"
        InputDevice    "Keyboard0" "CoreKeyboard"
        InputDevice    "Mouse0" "CorePointer"
    EndSection
    
    Section "Files"
    EndSection
    
    Section "InputDevice"
        # generated from default
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/psaux"
        Option         "Emulate3Buttons" "no"
        Option         "ZAxisMapping" "4 5"
    EndSection
    
    Section "InputDevice"
        # generated from default
        Identifier     "Keyboard0"
        Driver         "kbd"
    EndSection
    
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Unknown"
        ModelName      "Unknown"
        HorizSync       28.0 - 33.0
        VertRefresh     43.0 - 72.0
        Option         "DPMS"
    EndSection
    
    Section "Device"
        Identifier     "Device0"
        Driver         "nvidia"
        VendorName     "NVIDIA Corporation"
        BoardName      "TITAN RTX"
        BusId          "PCI:134:0:0"
    EndSection
    
    Section "ServerFlags"
        Option "BlankTime" "0"
        Option "StandbyTime" "0"
        Option "SuspendTime" "0"
        Option "OffTime" "0"
    EndSection
    
    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth    24
        Option         "UseDisplayDevice" "None"
        Option         "AllowIndirectGLXProtocol" "true"
        SubSection     "Display"
            Virtual     1920 1200
            Depth       24
        EndSubSection
    EndSection
    
    Section "Files"
        ModulePath "/usr/lib/x86_64-linux-gnu/nvidia/xorg"
        ModulePath "/usr/lib/xorg/modules/extensions/"
        ModulePath "/usr/lib/xorg/modules/"
    EndSection
                        
    		  
    Update BusId "PCI:134:0:0" with the BusId of your NVIDIA graphics card. You can use the following command to find the BusId.
    	
    nvidia-xconfig --query-gpu-info
    	
                
    If the above command fails, check whether the NVIDIA drivers are installed. On Ubuntu 18.04, you can list the recommended drivers with ubuntu-drivers devices and install them with sudo ubuntu-drivers autoinstall.
  • Navigate to /etc/systemd/system and create a file openuav.service. Add the following content to the file.
                        
    [Unit]
    Description=OpenUAV Xorg setup
    
    [Service]
    ExecStart=/usr/lib/xorg/Xorg -core :1 -seat seat1 -auth /var/run/lightdm/root/:1 -nolisten tcp vt8 -novtswitch -config /etc/X11/xorg.conf.openuav
    
    [Install]
    WantedBy=multi-user.target
                        
                      
Running the service
  • Execute the following command to enable the service.
    sudo systemctl enable openuav
  • Start the service using the following command.
    sudo service openuav start
  • You should see an Xorg process using the NVIDIA card in the output of nvidia-smi.
  • Replace the xorg.conf in openuav-turbovnc/autonomous_sys_build repository with the contents of xorg.conf.openuav.
  • Do a docker build and run an OpenUAV container, as shown in the sketch below.
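A minimal sketch, assuming the openuav-turbovnc repository is checked out locally and using the build.sh/run.sh scripts from the Setup section above:

cd openuav-turbovnc
./build.sh   # builds the nvidia-docker2 image with the updated xorg.conf
./run.sh     # starts a container; noVNC listens on port 40001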
CPS-VO Integration
  • Please contact Stephen A. Rees (details on the contact page) to set up user authentication and access control over OpenUAV for your users. CPS-VO can provide user authentication, access control mechanisms, and per-user logging and statistics. This is very useful if you plan to use OpenUAV for education or research.

Unity integration in OpenUAV

Unity can be installed inside the container using the following steps. Run these steps inside the container session to get Unity setup working.


1. Select a version of Unity to download. We will use UnitySetup-2020.1.0b4.
export DOWNLOAD_URL=https://beta.unity3d.com/download/7e2ed8c1221a/UnitySetup-2020.1.0b4

2. Download the Unity installer.
wget -nv ${DOWNLOAD_URL} -O UnitySetup

3. Make the installer executable.
chmod +x UnitySetup

4. Run the installer. This should open the installer window on screen; follow the instructions.
./UnitySetup

Note: Use the following command to install Unity without GUI
./UnitySetup --unattended --install-location=/opt/Unity --verbose --download-location=/tmp/unity --components=Unity

5. After installation, you can start Unity Editor and activate Unity license as per your need.
/opt/Unity/Editor/Unity -batchmode -quit -nographics -createManualActivationFile -username "USERNAME" -password "PASSWORD"

6. This will generate a file Unity_v2020.1.0b4.alf that has to be uploaded to https://license.unity3d.com/manual for manual activation.

7. Download Unity_v2020.x.ulf (Unity_v2019.x.ulf for 2019 versions)

8. Copy the contents of the Unity_v2020.x.ulf to a file at /root/.local/share/unity3d/Unity/Unity_lic.ulf.
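For example, assuming the downloaded license file is in the current directory:

mkdir -p /root/.local/share/unity3d/Unity/
cp Unity_v2020.x.ulf /root/.local/share/unity3d/Unity/Unity_lic.ulf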

9. Create the following directory and add CACerts.pem file.
mkdir -p /root/.local/share/unity3d/Certificates/
cd /root/.local/share/unity3d/Certificates/
wget https://gitlab.com/gableroux/unity3d/-/raw/f9bef9e4/docker/conf/CACerts.pem

10. Run /opt/Unity/Editor/Unity to start the editor inside OpenUAV containers.
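To open a specific project directly, the editor's -projectPath flag can be used; the path here is hypothetical:

/opt/Unity/Editor/Unity -projectPath /root/openuav-unity-project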


Note: Zooming with the mouse scroll wheel doesn't work inside the container. Instead, hold down the right mouse button and use W/A/S/D/Q/E and the mouse to navigate the camera.

Team

Harish Anand

Master's Student, Arizona State University.

Stephen A. Rees

Institute for Software Integrated Systems
Vanderbilt University

Ashwin Jose Poruthukaran

Master's Student, Arizona State University.