B-SOiD, an open-source unsupervised algorithm for identification and fast prediction of behaviors

Published in: Nature Communications, Vol. 12, Issue 1, Article 5188 (13 pages)
Main authors: Hsu, Alexander I.; Yttri, Eric A.
Medium: Journal Article
Language: English
Publication details: London: Nature Publishing Group UK, 31 Aug 2021
ISSN: 2041-1723
Description
Summary: Studying naturalistic animal behavior remains a difficult objective. Recent machine learning advances have enabled limb localization; however, extracting behaviors requires ascertaining the spatiotemporal patterns of these positions. To provide a link from poses to actions and their kinematics, we developed B-SOiD, an open-source, unsupervised algorithm that identifies behavior without user bias. By training a machine classifier on pose pattern statistics clustered using new methods, our approach achieves greatly improved processing speed and the ability to generalize across subjects or labs. Using a frameshift alignment paradigm, B-SOiD overcomes previous temporal resolution barriers. Using only a single, off-the-shelf camera, B-SOiD provides categories of sub-action for trained behaviors and kinematic measures of individual limb trajectories in any animal model. These behavioral and kinematic measures are difficult but critical to obtain, particularly in the study of rodent and other models of pain, OCD, and movement disorders. The study of naturalistic behaviour using video tracking is challenging. Here the authors develop a system, B-SOiD, which allows automated behavioural tracking and segmentation of video of movements tested in mice, flies and humans.
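The summary's "frameshift alignment paradigm" refers to recovering full frame-rate behavior labels even though features are computed over multi-frame windows: prediction is run repeatedly with the window start shifted by one frame, and the per-pass labels are interleaved. A minimal sketch of that idea in Python (the `predict` callable and the function name are illustrative assumptions, not B-SOiD's actual API):

```python
def frameshift_predict(frames, window, predict):
    """Assign one behavior label per frame by interleaving window-level
    predictions from every possible start offset.

    `predict` maps a window of frames to a label; here it is a
    hypothetical stand-in for a trained classifier.
    """
    n = len(frames)
    labels = [None] * n
    for offset in range(window):
        # One pass per offset: non-overlapping windows starting at `offset`.
        # Across all offsets, every frame index receives exactly one label.
        for start in range(offset, n, window):
            labels[start] = predict(frames[start:start + window])
    return labels


# Toy demonstration: a "classifier" that reports the first frame of its
# window. Interleaving the shifted passes yields one label per frame,
# rather than one label per downsampled window.
demo = frameshift_predict(list(range(10)), 3, lambda w: w[0])
```

Without the shifted passes, a window of size 3 would yield labels at only a third of the camera's frame rate; the interleaving restores per-frame resolution at the cost of `window` prediction passes.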
DOI: 10.1038/s41467-021-25420-x