Robotics: Science and Systems 2020 Workshop
Hannah Kerner, Amy Tabb, Jnaneshwar Das, Pratap Tokekar, and Masahiro (Hiro) Ono
Robots in the Wild: Challenges in Deploying Robust Autonomy for Robotic Exploration


[Image: DREAMS Lab projects]

Advances in robust autonomy have increased our ability to adopt robotic systems for exploration of unstructured and uncertain environments. In particular, successful field tests have demonstrated the tremendous potential of deploying robots for exploration and data collection tasks in extreme environments such as planetary surfaces and ocean trenches. However, significant challenges remain, stemming from algorithmic limitations as well as constraints on environmental modeling, sensing, mobility, and communication. Overcoming these challenges requires matching robotic systems, methods, and sensing devices to the environment and the task. The goal of this workshop is to bring together leading researchers from diverse domains to discuss the following questions.

  • What new insights or limitations arise when applying algorithms to real-world data as opposed to benchmark datasets or simulations?
  • How can we address the limitations of real-world environments—e.g., noisy or sparse data, non-i.i.d. sampling, etc.?
  • What challenges exist at the frontiers of robotic exploration of unstructured and extreme environments?
  • How can we tie together the categories of systems, methods, and sensing devices to address relevant scientific questions in such environments?
  • How can we address algorithmic challenges in planning, learning, and decision-making for long-term autonomy of robots in the field?

Webinar information
Will be listed on July 12.
Schedule (Download PDF)
Sunday, July 12, 2020, 7:00 a.m.–1:30 p.m. US Pacific Time (PDT)
  • 7:00 Opening remarks
  • 7:10 Invited: Sierra Young (NC State University)
    “Towards Enabling Remote Telemanipulation by Uncrewed Aerial Systems (UAS) in Unknown Environments”
  • 7:40 Contributed: J. A. Anderson & G. A. Hollinger (OSU)
    “Communication Planning for Cooperative Terrain-Based Underwater Localization”
  • 8:00 Break
  • 8:15 Invited: Kiri Wagstaff (OSU/Jet Propulsion Laboratory)
    “Machine Learning Adaptation to New Environments without Retraining”
  • 8:45 Contributed: V. da Poian, E. Lyness, M. Trainer, R. Danell, W. Brinckerhoff, & X. Li (NASA Goddard Space Flight Center)
    “Science Autonomy and the ExoMars Mission: Machine Learning to Help Find Life on Mars”
  • 9:05 Break + gather.town networking
  • 9:15 Invited: Masahiro (Hiro) Ono (Jet Propulsion Laboratory)
    “Robots on the Red Planet: the Past, Present, and Future of Mars Rover Autonomy”
  • 9:45 Contributed: E. Terry, B. Morrell, X. Lei, S. Daftry, & A. Agha (JPL)
    “Object and Gas Source Detection with Robotic Platforms in Perceptually Degraded Environments”
  • 10:05 Break + gather.town networking
  • 10:15 Invited: Renaud Detry (Jet Propulsion Laboratory)
    “Planetary and Space Robotics: Application-specific Datasets vs. End-to-end Validation”
  • 10:45 Contributed: Z. Chen (Arizona State University)
    “Localization and Mapping of Sparse Geologic Features with Unpiloted Aircraft Systems”
  • 11:05 Break + gather.town networking
  • 11:15 Invited: Yoonchang Sung (MIT)
    “Multi-robot coordination for hazardous environmental monitoring”
  • 11:45 Contributed: I. C. Rankin, S. McCammon, & G. Hollinger (OSU)
    “Optimized Robotic Information Gathering using Semantic Language Instructions”
  • 12:05 Break + gather.town networking
  • 12:15 Invited: Girish Chowdhary (University of Illinois at Urbana-Champaign)
    “Robots are coming to your fields”
  • 12:45 Contributed: H. Anand & J. Das (Arizona State University)
    “Stories and lessons from a field robotics course and competition during a pandemic”
  • 13:00 Discussion and closing remarks
Invited Speakers
  1. Sierra Young, North Carolina State University
    Title: Towards Enabling Remote Telemanipulation by Uncrewed Aerial Systems (UAS) in Unknown Environments
    Abstract: Emerging applications indicate that physical interaction and manipulation with remote environments will be increasingly important tasks for small, uncrewed aerial systems (UAS), particularly in applications such as environmental sampling and infrastructure testing. Broadly speaking, however, most UAS manipulation tasks are not yet fully autonomous due to challenges in perception and control; thus, the availability of a human operator to monitor and intervene during telemanipulation remains essential. How to most effectively enable remote telemanipulation by a semiautonomous human-UAS team, however, remains an open question. This work addresses that question by taking a hybrid autonomy approach that utilizes the system’s autonomous capabilities while taking advantage of the domain expertise of the operator when the remote environment is unknown. Specifically, this work focuses on both control development and human-robot interface design for a UAS manipulation system and their effects on telemanipulation task performance. Results from this work indicate that successful remote manipulation by non-expert users is possible, although experimental validation in multiple domains is needed. The practical contributions of this work aim to expedite the use of aerial manipulation technologies by scientists, researchers, and stakeholders, particularly in the civil, environmental, and agricultural domains, who will directly benefit from improved UAS manipulation performance.
    Bio: Dr. Sierra Young is an Assistant Professor and Extension Specialist in the Department of Biological and Agricultural Engineering at North Carolina State University. Dr. Young received her Ph.D. in Civil Engineering from the University of Illinois at Urbana-Champaign in 2018 as a Department of Defense NDSEG Fellow with a focus on human-robot interaction for physical object manipulation by small uncrewed aerial systems (UAS). Her current research focuses on the use of robotics and automation, including aerial, surface, and ground vehicles, for sensing and sense-making in agricultural and biological systems, and continued human-robot interaction studies for small UAS. Before arriving at North Carolina State, she worked as a Visiting Scholar in the Agricultural and Biosystems Engineering Department at Iowa State University.
  2. Kiri Wagstaff, Oregon State University
    Title: Machine Learning Adaptation to New Environments without Retraining
    Abstract: Machine learning holds great promise for increasing the autonomy of remote spacecraft. Robotic systems that can analyze and classify data as it is collected can make informed decisions about where to go, what data to collect, and how to prioritize what they find for transmission to Earth. However, a robotic explorer is likely to encounter new environments for which the distribution of classes of interest has changed. For example, some regions on Mars are densely cratered, while others are covered with dunes or ice and have few craters. The change in class probabilities (label shift) can cause classifier performance to degrade. It may not be feasible to collect and label more data and retrain the model each time this happens, so we investigate methods for automatic adaptation without retraining. We have found that a combination of classifier calibration and the use of Black-Box Shift Correction can compensate for label shift and maintain high performance. I will share examples of how this adaptation works for data collected by Mars orbiters. (A minimal code sketch of this label-shift correction idea appears after the speaker list below.)
    Bio: Dr. Kiri L. Wagstaff is a principal researcher in machine learning at NASA's Jet Propulsion Laboratory and an associate research professor at Oregon State University. Her research focuses on developing new machine learning and data analysis methods for use onboard spacecraft and in data archives for planetary science, astronomy, cosmology, and more. She holds a Ph.D. in Computer Science from Cornell University followed by an M.S. in Geological Sciences from the University of Southern California and a Master of Library and Information Science (MLIS) from San Jose State University. She received a 2008 Lew Allen Award for Excellence in Research for work on the sensitivity of machine learning methods to high-radiation space environments and a 2012 NASA Exceptional Technology Achievement award for work on transient signal detection methods in radio astronomy data. She also served as a Tactical Uplink Lead (operational planning) for the Mars Opportunity rover. She is passionate about keeping machine learning relevant to real-world problems.
  3. Masahiro (Hiro) Ono, Jet Propulsion Laboratory
    Title: Robots on the Red Planet: the Past, Present, and Future of Mars Rover Autonomy
    Abstract: Exploring other worlds is an instinctive urge of the human race. Until we become capable of putting boots on extraterrestrial ground, robotic exploration is the only way for us to acquire in-situ scientific information from the neighboring worlds. In particular, NASA JPL has been roving on Mars since 1997, and we are now about to launch the next rover, Perseverance, which will collect samples that could be returned to Earth by follow-on missions. Although it resembles its predecessor, Curiosity, in appearance, it will carry a substantially upgraded autonomous driving capability to robustly negotiate complex terrain. Meanwhile, JPL is conducting advanced research on a number of autonomy capabilities to enable even more challenging missions in the future, building upon the latest innovations in machine learning and artificial intelligence. This talk will provide a brief overview of Mars rover autonomy: its past, present, and future.
  4. Renaud Detry, Jet Propulsion Laboratory
    Title: Planetary and Space Robotics: Application-specific Datasets vs. End-to-end Validation
    Abstract: In this talk, I will discuss the experimental validation of autonomous robot behaviors that support the exploration of Mars' surface, lava tubes on Mars and the Moon, icy bodies and ocean worlds, and operations on orbit around the Earth. I will frame the presentation around the questions posed by the workshop organizers: What new insights or limitations arise when applying algorithms to real-world data as opposed to benchmark datasets or simulations? How can we address the limitations of real-world environments—e.g., noisy or sparse data, non-i.i.d. sampling, etc.? What challenges exist at the frontiers of robotic exploration of unstructured and extreme environments? I will discuss our approach to validating autonomous machine-vision capabilities for the notional Mars Sample Return campaign, for autonomously navigating lava tubes, and for autonomously assembling modular structures on orbit. The talk will highlight the thought process that drove the decomposition of a validation need into a collection of tests conducted on off-the-shelf datasets, custom/application-specific datasets, and simulated or physical robot hardware, where each test addressed a different range of experimental parameters for sensing/actuation fidelity, breadth of environmental conditions, and breadth of jointly-tested robot functions.
    Bio: Renaud Detry is the group leader for the Perception Systems group at NASA's Jet Propulsion Laboratory (JPL). Detry earned his Master's and Ph.D. degrees in computer engineering and robot learning from ULiege in 2006 and 2010. Shortly thereafter he earned two starting grants from the Swedish and Belgian national research institutes. He served as a postdoc at KTH and ULiege between 2011 and 2015, before joining the Robotics and Mobility Section at JPL in 2016. His research interests are perception and learning for manipulation, robot grasping, and mobility. At JPL, Detry leads the machine-vision team of the Mars Sample Return surface mission, and he conducts research in autonomous robot manipulation and mobility for Mars, Europa, Enceladus, and terrestrial applications.
  5. Yoonchang Sung, Massachusetts Institute of Technology
    Title: Multi-robot coordination for hazardous environmental monitoring
    Abstract: Quick response to hazards is crucial, as hazards may put humans at risk and their thorough removal may take a substantial amount of time. Our vision is that robotic solutions would be beneficial for hazardous environmental monitoring: not only can humans be released from dangerous or tedious tasks, but we can also take advantage of a robot’s agile maneuverability and precise sensing. However, neither the hardware nor the software is yet mature enough to deploy autonomous robots in real-world scenarios, and partial, uncertain information about hazards imposes further challenges. In this talk, I will show how a team of aerial robots can cooperatively monitor hazards that are not discrete objects but a continuous plume. In particular, I will present two planning algorithms that address several of these research challenges and show how the proposed frameworks can be applied in real-world experiments.
    Bio: Yoonchang Sung is a Postdoctoral Associate in the Computer Science and Artificial Intelligence Laboratory at MIT. He received the Ph.D. degree from the Department of Electrical and Computer Engineering at Virginia Tech in 2019, and the M.S. and B.S. degrees from the Department of Mechanical Engineering at Korea University in 2013 and 2011, respectively. His research interests include algorithmic robotics, task and motion planning, computational geometry, and multi-robot systems. He was selected as one of RSS Pioneers in 2019.
  6. Girish Chowdhary, University of Illinois at Urbana-Champaign
    Title: Robots are coming to your fields
    Bio: Girish Chowdhary is an Assistant Professor and Donald Biggar Willett Faculty Fellow at the University of Illinois at Urbana-Champaign. He is affiliated with the Agricultural and Biological Engineering, Computer Science, and Electrical Engineering departments. He obtained his Ph.D. from Georgia Tech and completed postdoctoral work at MIT. He works on AI and autonomy for field robots.
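
To make the label-shift adaptation described in Dr. Wagstaff's abstract concrete, here is a minimal sketch of black-box shift estimation (Lipton et al., 2018), the estimator behind Black-Box Shift Correction. It is illustrative only, not code from the talk: the function names, the least-squares solve, and the NumPy usage are our assumptions. A fixed classifier's confusion matrix on labeled source validation data, combined with its predicted label distribution on unlabeled target data, yields an estimate of the shifted class priors, which then reweight the classifier's calibrated output probabilities.

```python
# Minimal sketch of label-shift correction via black-box shift estimation
# (Lipton et al., 2018). Illustrative assumptions throughout; not code from
# the talk.
import numpy as np

def estimate_prior_ratios(val_true, val_pred, target_pred, n_classes):
    """Estimate w[y] = q(y) / p(y) from hard predictions of a fixed classifier."""
    # C[i, j] = P(predicted class = i, true class = j) on source validation data.
    C = np.zeros((n_classes, n_classes))
    np.add.at(C, (val_pred, val_true), 1.0)
    C /= len(val_true)

    # mu[i] = fraction of unlabeled target-domain samples predicted as class i.
    mu = np.bincount(target_pred, minlength=n_classes) / len(target_pred)

    # Solve C @ w = mu for the class-prior ratios q(y) / p(y); least squares
    # adds stability when C is near-singular, and negative values are clipped.
    w, *_ = np.linalg.lstsq(C, mu, rcond=None)
    return np.clip(w, 0.0, None)

def correct_probabilities(probs, w):
    """Reweight calibrated class probabilities by the estimated prior ratios."""
    adjusted = probs * w  # scale each class column by its ratio q(y) / p(y)
    return adjusted / adjusted.sum(axis=1, keepdims=True)
```

Note that, as in the abstract, the classifier itself is never retrained: only its calibrated output probabilities are reweighted by the estimated class-prior ratios, which is what makes the approach attractive when new labels cannot be collected and the model cannot be retrained onboard.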
Organizers
  • Hannah Kerner, Assistant Research Professor, Department of Geographical Sciences, University of Maryland
  • Amy Tabb, U.S. Department of Agriculture, Agricultural Research Service, Appalachian Fruit Research Station (USDA-ARS-AFRS)
  • Jnaneshwar Das, Alberto Behar Research Professor, School of Earth and Space Exploration, Arizona State University
  • Pratap Tokekar, Assistant Professor, Department of Computer Science, University of Maryland
  • Masahiro Ono, Research Technologist, Jet Propulsion Laboratory, California Institute of Technology
Topics
  • System considerations for exploration of extreme environments such as underwater and benthic habitats, hot springs, volcanoes, asteroids, and planetary surfaces.
  • Planetary subsurface exploration (subsurface oceans of Europa/Enceladus, vertical pits on the Moon/Mars, etc.).
  • Challenges in environmental monitoring, precision agriculture and farming, and disaster response.
  • Multi-robot learning and coordination for environmental modeling.
  • Underground and underwater mapping, space missions, and planetary robots.
  • Novelty, anomaly, and change detection.
  • Decision-theoretic approaches for active sensing and physical sample (specimen) retrieval.
  • Sampling algorithms and strategies, e.g., opportunistic sampling and non-myopic sampling.
  • Online exploration algorithms: theory, experiments, and field studies.
  • On-board scientific interpretation and data prioritization.
Themes
  • Robotics – has science autonomy under-delivered or surpassed expectations?
  • AI – what is its role in exploration research and big-data-driven discovery?
  • Earth observation – is it lagging behind consumer technology? How can roboticists help bridge the gap?
Event Description

Robots in the Wild is a one-day workshop held on July 12 at RSS 2020. The schedule will include invited speakers as well as spotlight presentations and poster sessions for accepted papers. Up to 10 contributions will be selected for short talks.

Duration

1 day

Paper Submission

Submitted papers should be no more than 6 pages, not including references. Please submit your papers in this format.
