SPREAD: Sound Propagation and Perception for Autonomous Agents in Dynamic Environments


Subject

autonomous agent
sound representation
sound propagation
sound perception
Computer Sciences
Engineering
Graphics and Human Computer Interfaces

Abstract

The perception of sensory information and its impact on behavior is a fundamental component of being human. While visual perception is commonly used for navigation, collision avoidance, and behavior selection, the acoustic domain is relatively unexplored. Recent work in acoustics focuses on synthesizing sound in 3D environments; however, the perception of acoustic signals by a virtual agent is a useful and realistic adjunct to any behavior selection mechanism. In this paper, we present SPREAD, a novel agent-based sound perception model using a discretized sound packet representation with acoustic features including amplitude, frequency range, and duration. SPREAD simulates how sound packets are propagated, attenuated, and degraded as they traverse the virtual environment. Agents perceive and classify the sounds based on the locally-received packet set using a hierarchical clustering scheme, and have individualized hearing and understanding of their surroundings. Using this model, we demonstrate several simulations in which sound perception greatly enriches agent controls and outcomes.
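The packet representation, attenuation, and clustering steps summarized in the abstract can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the `SoundPacket` fields, the inverse-distance-plus-absorption `attenuate` model, and the single-linkage `cluster` function are hypothetical stand-ins, not the paper's actual implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class SoundPacket:
    """A discretized sound packet with the acoustic features named in the abstract."""
    amplitude: float   # loudness, arbitrary units
    freq_low: float    # lower bound of frequency range (Hz)
    freq_high: float   # upper bound of frequency range (Hz)
    duration: float    # seconds


def attenuate(p: SoundPacket, distance: float, absorption: float = 0.01) -> SoundPacket:
    """Attenuate a packet over distance: inverse-distance spreading
    combined with exponential medium absorption (illustrative model)."""
    factor = (1.0 / max(distance, 1.0)) * math.exp(-absorption * distance)
    return SoundPacket(p.amplitude * factor, p.freq_low, p.freq_high, p.duration)


def _feature(p: SoundPacket) -> tuple:
    # Embed a packet as (amplitude, center frequency) for clustering.
    return (p.amplitude, (p.freq_low + p.freq_high) / 2.0)


def cluster(packets: list, threshold: float) -> list:
    """Single-linkage agglomerative clustering of locally-received packets:
    repeatedly merge the closest pair of clusters until no pair is
    within `threshold` in feature space."""
    clusters = [[p] for p in packets]

    def link(a, b):
        return min(math.dist(_feature(x), _feature(y)) for x in a for y in b)

    while len(clusters) > 1:
        best = min(
            ((link(clusters[i], clusters[j]), i, j)
             for i in range(len(clusters))
             for j in range(i + 1, len(clusters))),
            key=lambda t: t[0],
        )
        if best[0] >= threshold:
            break
        _, i, j = best
        clusters[i].extend(clusters.pop(j))
    return clusters
```

For example, two packets with similar amplitude and frequency range would fall into one perceived cluster, while a quiet high-frequency packet would remain a distinct percept for the agent.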

Date of presentation

2013-01-01

Conference name

Center for Human Modeling and Simulation
