
Virtual Earth Week

Goal 4: Quality Education

Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all.

 

Autonomous Robotic Systems


Our vision is to comprehensively characterize our environment and its impact on human health and performance.

To achieve this goal we are working on several scales, from global satellite observations, to smart city sensors distributed across DFW, to fully autonomous robotic systems that can rapidly learn their environment, even if they have never seen it before.

The system has four key components:

  • Sensors, of two key types:
    • Remote sensing sensors, measuring from a distance, such as the hyperspectral and thermal cameras on the aerial vehicle, the sonar on the robotic boat, and the multi- and hyperspectral sensors on satellites.
    • In-situ sensors, measuring locally where they are, such as the air quality sensors being distributed across DFW neighborhoods, the sensors onboard our electric street-level survey car, and the fluorometers and mass spectrometer on the robotic boat.
  • Vehicles, providing a ride for the sensors, whether satellites, an electric survey car, or the three types of robots used in this study:
    • An aerial robot, carrying the remote sensing sensors
    • A robotic boat, carrying in-situ and remote sensing sensors
    • A walking robot, carrying in-situ sensors
  • Machine learning, to learn how the remote signatures seen from above (e.g. from satellites or robotic aerial vehicles) map onto the local environmental state; a minimal sketch of this mapping follows the list below.
  • A remote command center used to control operations and receive the actionable intelligence.
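To make the machine learning component concrete, here is a minimal sketch, assuming a scikit-learn style workflow and entirely hypothetical data, of how remote spectral signatures could be regressed onto co-located in-situ measurements. It illustrates the idea only; the array names, band counts, and the choice of a random forest regressor are assumptions, not the project's actual pipeline.

```python
# Minimal sketch (hypothetical data) of the machine-learning component:
# learning how remote spectral signatures map onto the local environmental
# state measured by in-situ sensors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data:
#   X: one row per co-located observation, columns = hyperspectral band values
#      recorded from above over a point on the ground or water.
#   y: the in-situ quantity measured at the same point by the boat or walking
#      robot (e.g. a fluorometer reading).
n_samples, n_bands = 500, 100
X = rng.random((n_samples, n_bands))
y = 2.0 * X[:, 10] + 0.5 * X[:, 55] + rng.normal(0.0, 0.05, n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a simple regressor as the "remote signature -> local state" mapping.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("held-out R^2:", model.score(X_test, y_test))
```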

Once the machine learning algorithms have “learnt” the signatures of their new environment, the aerial robots with their remote sensing sensors can rapidly survey a large area. These large-area surveys can be turned into maps of the environmental components of interest in real time, using onboard machine learning, and streamed to the remote operators. The maps can then be verified by despatching the autonomous robots to take in-situ measurements.
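As an illustration of this survey-to-map step, the following sketch (again with hypothetical names and data, not the project's actual software) applies a trained signature-to-state model pixel by pixel to an aerial hyperspectral image cube; the resulting map is what would be streamed to the remote operators.

```python
# Minimal sketch of turning an aerial hyperspectral survey into a map of the
# environmental component of interest. The image cube and model are
# hypothetical placeholders.
import numpy as np

def survey_to_map(image_cube: np.ndarray, model) -> np.ndarray:
    """Apply a trained signature->state model to every pixel of a survey.

    image_cube: array of shape (rows, cols, bands) from the aerial robot.
    model:      any fitted regressor with a scikit-learn style .predict().
    Returns an array of shape (rows, cols) of predicted local state values.
    """
    rows, cols, bands = image_cube.shape
    pixels = image_cube.reshape(-1, bands)     # one spectrum per row
    predictions = model.predict(pixels)        # predicted in-situ value
    return predictions.reshape(rows, cols)     # back to map geometry

# Usage (with `model` fitted as in the previous sketch):
# cube = np.random.default_rng(1).random((256, 256, 100))
# contaminant_map = survey_to_map(cube, model)
# The map can then be streamed to the remote command center, and a few pixels
# spot-checked by despatching the boat or walking robot for verification.
```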

This actionable intelligence, provided by the fully autonomous system, can then be used to make data-driven decisions, whether that means locating key contaminants, identifying optimal clean-up operations, or determining the required protective clothing.
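As a toy illustration of this decision step (the threshold, map, and function name are hypothetical, not the team's operational tools), a predicted contaminant map could be thresholded and ranked to prioritise where to send clean-up crews first.

```python
# Minimal, hypothetical sketch of turning a predicted contaminant map into
# actionable intelligence: flag pixels above a safety threshold, worst first.
import numpy as np

def prioritise_cleanup(contaminant_map: np.ndarray, threshold: float):
    """Return (row, col, level) for pixels exceeding `threshold`, worst first."""
    hotspot_rows, hotspot_cols = np.where(contaminant_map > threshold)
    levels = contaminant_map[hotspot_rows, hotspot_cols]
    order = np.argsort(levels)[::-1]           # highest predicted levels first
    return list(zip(hotspot_rows[order], hotspot_cols[order], levels[order]))

# Example with a small made-up map:
demo_map = np.array([[0.1, 0.9, 0.2],
                     [0.4, 1.3, 0.8],
                     [0.0, 0.2, 1.1]])
for row, col, level in prioritise_cleanup(demo_map, threshold=0.8):
    print(f"pixel ({row}, {col}): predicted level {level:.2f}")
```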

The entire system uses off-the-shelf components, so that production can readily be scaled. The components can easily be transported anywhere on the planet in standard shipping containers.