Annotated dataset of simulated voiding sound for urine flow estimation

Published in: Scientific Data, Volume 12, Issue 1, p. 993; 7 pp.
Main authors: Alvarez, Marcos Lazaro; Arjona, Laura; Bahillo, Alfonso; Bernardo-Seisdedos, Ganeko
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 13 June 2025
ISSN: 2052-4463
Description
Summary: Sound-based uroflowmetry is a non-invasive test emerging as an alternative to standard uroflowmetry, estimating voiding characteristics from the sound generated by urine striking water in a toilet bowl. The lack of labeled flow sound datasets limits research on developing supervised AI algorithms. This work presents a dataset of simulated urinary flow sound recordings at flow rates from 1 to 50 ml/s, in increments of 1 ml/s, against water in a real toilet bowl. Flow generation employed an L600-1F precision peristaltic pump, with simultaneous recordings from three devices: a high-quality Ultramic384k microphone, a Mi A1 smartphone, and an Oppo smartwatch. Water was expelled through a 6 mm diameter nozzle (simulating the urethra) from a variable height of 73 to 86 cm, mimicking adult urination. The dataset provides 60-second, labeled, constant-flow audio recordings (WAV format). This resource is intended to support research on sound-based urinary flow estimation by developing and validating supervised artificial intelligence algorithms.
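As an illustration of how a supervised pipeline might consume such labels, the minimal Python sketch below parses a device tag and constant flow rate (1 to 50 ml/s) from a recording's file name. The naming scheme and device tags here are assumptions made for the example only, not the dataset's documented convention; consult the dataset's own documentation for the actual layout.

```python
import re

# Each 60-second WAV is labeled with a constant flow rate (1-50 ml/s,
# in 1 ml/s steps) and one of three recording devices. The file-naming
# scheme and device tags below are ASSUMPTIONS for illustration.
FLOW_RATES_ML_S = list(range(1, 51))                # 1..50 ml/s
DEVICES = {"ultramic384k", "mi_a1", "oppo_watch"}   # hypothetical tags

def parse_label(filename: str) -> tuple[str, int]:
    """Extract (device, flow_rate_ml_s) from a hypothetical file name
    such as 'mi_a1_flow_23.wav'."""
    m = re.fullmatch(r"(?P<dev>[a-z0-9_]+)_flow_(?P<q>\d{1,2})\.wav", filename)
    if m is None:
        raise ValueError(f"unrecognised filename: {filename}")
    dev, q = m.group("dev"), int(m.group("q"))
    if dev not in DEVICES or q not in FLOW_RATES_ML_S:
        raise ValueError(f"label out of range: {filename}")
    return dev, q
```

Validating labels against the fixed grid of 50 flow rates at load time catches mis-named files before they silently corrupt a regression target.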
DOI: 10.1038/s41597-025-05358-1