Collaborative Scene Perception with Multiple Sensing Modalities

Author: Vikram Shree
Release: 2022



With the increasing reliance on autonomous systems, there is a critical need for robots to perceive the world at least as well as a human does. This requires taking advantage of all the sensing modalities available to the robot and fusing them to produce the best estimate of the state of the observable surroundings. Yet even with the tremendous research in robot perception, robots are still far from serving as reliable teammates for humans in the wild. This dissertation addresses gaps in four key areas related to collaborative perception: choosing an apt feature representation, active perception, shared autonomy, and perception-enabled planning. First, a human-subject study is presented that reveals the challenges current fusion models face when a human is in the loop. The study shows that certain feature representations become unreliable due to human errors, which must be accounted for in subsequent decision-making steps. To facilitate active perception, a multi-stage question-answering scheme is proposed that lets the robot seek specific human input with the goal of maximizing situational awareness. The algorithm is implemented on a ground robot and tested in a crowded environment, demonstrating its robustness. To develop a shared understanding of the surroundings in a search and rescue (SaR) mission, a deep learning-based approach is presented that fuses information from the visual and language domains. The fused knowledge is used to intelligently plan paths for a team of heterogeneous agents, yielding safer paths while maintaining performance in terms of time to locate the victim. The approach is tested on the Gazebo simulation platform. Finally, to bridge the gap between simulation and reality, specifically in the context of SaR missions, a dataset of photo-realistic online images is developed. A Bayesian fusion framework is presented for assessing danger from photo-realistic images and human language input. An extensive simulation campaign reveals that a danger-aware planner achieves a higher mission success rate than a naive shortest-path planner.
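
The final contribution combines two ideas that can be illustrated compactly: Bayesian fusion of per-location danger estimates coming from images and from human language, and a planner that trades path length against the fused danger. The sketch below is a minimal illustration of that general pattern, not the dissertation's implementation; the log-odds fusion rule, the grid world, the weights, and the names `fuse_danger` and `plan` are assumptions introduced here for clarity.

```python
# Minimal sketch (hypothetical): log-odds Bayesian fusion of two danger estimates
# feeding a danger-aware grid planner. Not the dissertation's actual code.
import heapq
import math


def fuse_danger(p_image, p_language, prior=0.5):
    """Fuse two independent danger probabilities for one cell in log-odds space."""
    def logit(p):
        p = min(max(p, 1e-6), 1 - 1e-6)
        return math.log(p / (1 - p))
    # Assumes the two observations are conditionally independent; subtract the
    # prior once so it is not double-counted.
    l = logit(p_image) + logit(p_language) - logit(prior)
    return 1.0 / (1.0 + math.exp(-l))


def plan(grid_danger, start, goal, danger_weight=0.0):
    """Dijkstra on a 4-connected grid; step cost = 1 + danger_weight * fused danger.
    danger_weight = 0 recovers the naive shortest-path baseline."""
    rows, cols = len(grid_danger), len(grid_danger[0])
    dist, came = {start: 0.0}, {}
    frontier = [(0.0, start)]
    while frontier:
        d, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        if d > dist.get(cell, math.inf):
            continue
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 + danger_weight * grid_danger[nr][nc]
                if nd < dist.get((nr, nc), math.inf):
                    dist[(nr, nc)] = nd
                    came[(nr, nc)] = cell
                    heapq.heappush(frontier, (nd, (nr, nc)))
    # Walk back from the goal to reconstruct the chosen path.
    path, cell = [goal], goal
    while cell != start:
        cell = came[cell]
        path.append(cell)
    return list(reversed(path))


if __name__ == "__main__":
    # Hypothetical 5x5 scene: per-cell danger estimates from the two modalities.
    p_img = [[0.1] * 5 for _ in range(5)]
    p_lang = [[0.1] * 5 for _ in range(5)]
    p_img[2][2], p_lang[2][2] = 0.7, 0.9   # e.g. "the middle of the corridor is on fire"
    fused = [[fuse_danger(p_img[r][c], p_lang[r][c]) for c in range(5)] for r in range(5)]

    print("naive path  :", plan(fused, (0, 0), (4, 4), danger_weight=0.0))
    print("danger-aware:", plan(fused, (0, 0), (4, 4), danger_weight=10.0))
```

With `danger_weight` set to zero the planner reduces to the naive shortest-path baseline the abstract compares against; a positive weight makes the planner detour around cells the fused estimate marks as dangerous.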


Multi-sensor Fusion for Autonomous Driving
Language: en
Pages: 237
Authors: Xinyu Zhang
Type: BOOK - Published: - Publisher: Springer Nature


Elements of Scene Perception
Language: en
Pages: 156
Authors: Monica S. Castelhano
Categories: Psychology
Type: BOOK - Published: 2021-11-11 - Publisher: Cambridge University Press


Visual cognitive processes have traditionally been examined with simplified stimuli, but generalization of these processes to the real world is not always straightforward …
Artificial Intelligence for Edge Computing
Language: en
Pages: 373
Authors: Mudhakar Srivatsa
Categories: Technology & Engineering
Type: BOOK - Published: 2024-01-10 - Publisher: Springer Nature


It is undeniable that the recent revival of artificial intelligence (AI) has significantly changed the landscape of science in many application domains, ranging …
Social Robotics
Language: en
Pages: 609
Authors: Guido Herrmann
Categories: Computers
Type: BOOK - Published: 2013-10-23 - Publisher: Springer


This book constitutes the refereed proceedings of the 5th International Conference on Social Robotics, ICSR 2013, held in Bristol, UK, in October 2013. The 55 r…