Abstract

Perception, the organization, identification, and interpretation of sensory streams, has been a long-standing problem in robotics and has advanced rapidly with modern deep learning techniques. Traditional research in this field has generally focused on single-robot scenarios, such as object detection, tracking, and semantic/panoptic segmentation. However, single-robot perception suffers from long-range and occlusion issues due to limited sensing capability and dense traffic situations, and the resulting imperfect perception can severely degrade downstream planning and control modules.

Collaborative perception has been proposed to fundamentally address this problem, yet it still faces challenges including the lack of real-world datasets, extra computational burden, high communication bandwidth requirements, and subpar performance in adversarial scenarios. To tackle these issues and to promote more research in collaborative perception and learning, this workshop aims to stimulate discussion on techniques that will enable better multi-agent autonomous systems, with an emphasis on robust collaborative perception and learning methods, perception-based multi-robot planning and control, cooperative and competitive multi-agent systems, and safety-critical connected autonomous driving.

In line with the ICRA 2023 "Making Robots for Humans" theme, this workshop will provide a venue for academics and industry practitioners to create a vision for connected robots that promote safety and intelligence for humans. The half-day workshop will feature presentations by distinguished speakers as well as interactive activities in the form of poster sessions and panel discussions.

Invited Speakers (confirmed and potential)

  • Mac Schwager (Stanford): Multi-robot systems, distributed estimation
  • Giuseppe Loianno (NYU): Multi-robot perception, swarm robotics
  • Yu Wang (Tsinghua University): Multi-agent exploration, efficient deep learning
  • Peter Stone (UT Austin): Machine learning, multi-agent systems, and robotics
  • Fei Miao (UConn): Connected and autonomous vehicles (CAVs)
  • Bolei Zhou (UCLA): Interpretable human-AI interaction

Extended Abstract Submission

We invite researchers working on related topics to submit abstracts or extended abstracts (no longer than 4 pages in the ICRA paper format, including references; an appendix may be added after the references) that can contribute to this workshop. The accepted extended abstracts will be publicly available on this workshop website through the end of ICRA'23.

Note: we DO allow previously published papers to be presented at this workshop, because the accepted extended abstracts will NOT be published in the ICRA'23 proceedings.

Desired works could:

  • identify novel collaborative perception algorithms for outdoor or indoor robotics,
  • discuss multi-agent systems in the context of applications in autonomous driving, human-robot interaction, or unmanned aerial vehicles,
  • demonstrate multi-robot communication efficiency,
  • describe novel perception-based multi-robot planning methods such as collaborative visual navigation or exploration,
  • review and benchmark various methods proposed by different communities (e.g., robotics, computer vision, transportation), with the ultimate goal of enhancing mutual understanding of the challenges and opportunities related to this workshop.

Important Dates


  • Extended Abstract Submission (send to coperception.icra2023@gmail.com): May 7, 2023, 11:59 PM PDT
  • Extended Abstract Acceptance: May 14, 2023, 11:59 PM PDT
  • Final Version Submission: May 21, 2023, 11:59 PM PDT



Best Paper Awards (Sponsored by IEEE RAS TC for Computer & Robot Vision)


  • First Prize: $150
  • Second Prize: $100
  • Third Prize: $50

Topics of Interest

  • Collaborative perception (detection, segmentation, tracking, motion forecasting, etc.)
  • Communication-efficient collaborative perception
  • Robust collaborative perception (latency / pose errors)
  • Collaborative embodied AI
  • Representation learning in multi-agent systems
  • Adversarial learning in multi-agent perception
  • Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I)
  • Connected and autonomous vehicles (CAVs)
  • Intelligent transportation systems
  • Smart cities
  • Multi-robot systems and swarm systems
  • Multi-robot exploration and mapping
  • Distributed optimization
  • Efficient large-scale collaborative learning
  • Edge AI and federated learning
  • Cooperative and competitive multi-agent systems
  • Simulation for multi-agent learning
  • Dataset and benchmarking for collaborative perception and learning

Program

Time (London)    Talk (details will come soon)
08:45 – 08:50    Welcome / Introductions   
08:50 – 09:15    Speaker #1   
09:15 – 09:40    Speaker #2   
09:40 – 10:05    Speaker #3   
10:05 – 10:35    Six 5-minute short presentations
10:45 – 11:10    Speaker #4   
11:10 – 11:35    Speaker #5   
11:35 – 12:00    Speaker #6   
12:00 – 12:30    Panel Discussion
  • If you have a topic related to collaborative perception that you would like the panel to discuss, please send your proposed question directly to coperception.icra2023@gmail.com.

Organizers

Student Organizers

Acknowledgements

  • IEEE RAS TC for Computer & Robot Vision
  • IEEE RAS TC on Multi-Robot Systems
  • IEEE RAS Autonomous Ground Vehicles and Intelligent Transportation Systems TC