A Best Student Paper Award for Taous Iatariene.

Congratulations to Taous Iatariene, a PhD student in the Multispeech team supervised by Alexandre Guérin and Romain Serizel, who received a Best Student Paper Award at Forum Acusticum 2025.

The prize was awarded by the European Acoustics Association (EAA) for her paper Tracking of Intermittent and Moving Speakers: Dataset and Metrics, at the forum held from 23 to 26 June in Málaga, Spain.

In her thesis, entitled Tracking sound sources using microphone arrays and deep learning, the doctoral student works on spatial acoustic scene analysis. From audio recordings, she develops methods to describe what is happening in an acoustic scene, in particular by estimating the position of people in space relative to the microphone and by tracking speakers.

“In this article, I focus on a challenge that has been little explored by the scientific community: tracking intermittent and moving speakers, i.e. people who may move unpredictably while they are silent. In this type of scenario, we face spatial discontinuities, and it becomes difficult to consistently assign the estimated positions to the right person over time. I also noted that no existing dataset contains acoustic scenes with discontinuous trajectories. The evaluation metrics for tracking systems have their limitations as well, and ultimately give little weight to assessing the quality of identity assignment over time,” explains Taous Iatariene.

The doctoral student therefore created LibriJump, a synthetic evaluation dataset of acoustic scenes containing speakers with discontinuous trajectories. In LibriJump’s simulated acoustic scenes, the speakers remain static while they speak but change position while they are silent (hence the notion of “jumps”). She also introduced new metrics that enable a more detailed analysis of how well tracking systems assign identities over time.
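To give an idea of what such a discontinuous trajectory looks like, here is a minimal Python sketch. It is a hypothetical illustration, not the LibriJump generation code: the room size, frame grid, speech segments and random jump positions are assumptions made purely for the example.

# Hypothetical illustration of a "jumping" speaker trajectory:
# the position is constant while the speaker talks and jumps to a new
# (unobserved) location when the speaker falls silent and starts again.
import numpy as np

rng = np.random.default_rng(0)

def simulate_jumping_trajectory(n_frames, speech_mask, room_size=(6.0, 4.0)):
    """Return one (x, y) position per frame.

    speech_mask: boolean array of length n_frames, True while the speaker talks.
    """
    positions = np.zeros((n_frames, 2))
    current = rng.uniform((0.0, 0.0), room_size)  # initial position
    was_silent = True
    for t in range(n_frames):
        if speech_mask[t] and was_silent:
            # A new speech segment begins: the speaker reappears elsewhere.
            current = rng.uniform((0.0, 0.0), room_size)
        positions[t] = current
        was_silent = not speech_mask[t]
    return positions

# Example: 100 frames, speech in frames 10-39 and 60-89, silence elsewhere,
# so the speaker occupies one position in the first segment and another in the second.
mask = np.zeros(100, dtype=bool)
mask[10:40] = True
mask[60:90] = True
trajectory = simulate_jumping_trajectory(100, mask)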

The dataset will soon be made openly available on Zenodo, and the code for the metrics will be shared on GitHub.

More information: