Automatic reconstruction of physiological gestures used in a model of birdsong production
- Authors
- Boari, Santiago; Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel
- Publication year
- 2015
- Language
- English
- Resource type
- article
- Status
- published version
- Description
- Highly coordinated learned behaviors are key to understanding neural processes integrating the body and the environment. Birdsong production is a widely studied example of such behavior in which numerous thoracic muscles control respiratory inspiration and expiration: the muscles of the syrinx control syringeal membrane tension, while upper vocal tract morphology controls resonances that modulate the vocal system output. All these muscles have to be coordinated in precise sequences to generate the elaborate vocalizations that characterize an individual's song. Previously we used a low-dimensional description of the biomechanics of birdsong production to investigate the associated neural codes, an approach that complements traditional spectrographic analysis. The prior study used algorithmic yet manual procedures to model singing behavior. In the present work, we present an automatic procedure to extract low-dimensional motor gestures that could predict vocal behavior. We recorded zebra finch songs and generated synthetic copies automatically, using a biomechanical model for the vocal apparatus and vocal tract. This dynamical model described song as a sequence of physiological parameters the birds control during singing. To validate this procedure, we recorded electrophysiological activity of the telencephalic nucleus HVC. HVC neurons were highly selective to the auditory presentation of the bird's own song (BOS) and gave similar selective responses to the automatically generated synthetic model of song (AUTO). Our results demonstrate meaningful dimensionality reduction in terms of physiological parameters that individual birds could actually control. Furthermore, this methodology can be extended to other vocal systems to study fine motor control.
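The dynamical model referred to in the abstract reduces song to a few slowly varying physiological control parameters (air-sac pressure and syringeal tension) driving a nonlinear sound source. As a rough illustration of that idea only, the sketch below integrates a generic van der Pol-style oscillator whose restoring force tracks a target pitch trace and whose energy injection is gated by a pressure-like envelope; the function name, the specific equations, and all numerical constants are assumptions chosen for this example and are not the model published in the article.

```python
import numpy as np

def synthesize_gesture(f0, envelope, fs=44100, oversample=20):
    """Toy sound source: a van der Pol-style oscillator (illustration only).

    f0       : target fundamental frequency in Hz, one value per audio sample
    envelope : pressure-like gating signal in [0, 1], same length as f0
    Returns the oscillator displacement x(t), a stand-in for the source signal.
    """
    dt = 1.0 / (fs * oversample)
    x, y = 1e-3, 0.0                       # small seed displacement starts phonation
    out = np.empty(len(f0))
    for i in range(len(f0)):
        k = (2.0 * np.pi * f0[i]) ** 2     # tension-like restoring term sets the pitch
        p = 500.0 * envelope[i]            # pressure-like term injects energy
        for _ in range(oversample):        # sub-stepping keeps forward Euler stable
            dx = y
            dy = -k * x + (p - 1.0e5 * x * x) * y
            x += dx * dt
            y += dy * dt
        out[i] = x
    return out

# Hand-specified gestures for a single 100 ms syllable: rising pitch, smooth pressure pulse.
fs = 44100
t = np.arange(int(0.1 * fs)) / fs
f0 = 600.0 + 300.0 * t / t[-1]             # pitch rises from 600 Hz to 900 Hz
envelope = np.sin(np.pi * t / t[-1]) ** 2  # silence -> phonation -> silence
syllable = synthesize_gesture(f0, envelope, fs)
```

In the article the parameter traces are extracted automatically from the recorded song rather than hand-specified as they are here, and the resulting synthetic song (AUTO) is then compared against the bird's own song (BOS) via the selectivity of HVC responses to auditory playback.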
Fil: Boari, Santiago. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Yonatan Sanz Perl. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Amador, Ana. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Margoliash, Daniel. University of Chicago; Estados Unidos
Fil: Mindlin, Bernardo Gabriel. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
- Subjects
-
Dynamical Systems
Vocal Learning
Bird's Own Song
Modeling Software
- Access level
- open access
- Terms of use
- https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
- Repository
- CONICET Digital (CONICET)
- Institution
- Consejo Nacional de Investigaciones Científicas y Técnicas
- OAI identifier
- oai:ri.conicet.gov.ar:11336/47891
View the full record metadata
| id |
CONICETDig_9b779a511e2ce5c9719a2268198e0886 |
| oai_identifier_str |
oai:ri.conicet.gov.ar:11336/47891 |
| network_acronym_str |
CONICETDig |
| repository_id_str |
3498 |
| network_name_str |
CONICET Digital (CONICET) |
| dc.title.none.fl_str_mv |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| title |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| spellingShingle |
Automatic reconstruction of physiological gestures used in a model of birdsong production; Boari, Santiago; Dynamical Systems; Vocal Learning; Bird's Own Song; Modeling Software |
| title_short |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| title_full |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| title_fullStr |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| title_full_unstemmed |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| title_sort |
Automatic reconstruction of physiological gestures used in a model of birdsong production |
| dc.creator.none.fl_str_mv |
Boari, Santiago; Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel |
| author |
Boari, Santiago |
| author_facet |
Boari, Santiago; Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel |
| author_role |
author |
| author2 |
Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel |
| author2_role |
author; author; author; author |
| dc.subject.none.fl_str_mv |
Dynamical Systems; Vocal Learning; Bird's Own Song; Modeling Software |
| topic |
Dynamical Systems; Vocal Learning; Bird's Own Song; Modeling Software |
| purl_subject.fl_str_mv |
https://purl.org/becyt/ford/1.3 https://purl.org/becyt/ford/1 |
| dc.description.none.fl_txt_mv |
Highly coordinated learned behaviors are key to understanding neural processes integrating the body and the environment. Birdsong production is a widely studied example of such behavior in which numerous thoracic muscles control respiratory inspiration and expiration: the muscles of the syrinx control syringeal membrane tension, while upper vocal tract morphology controls resonances that modulate the vocal system output. All these muscles have to be coordinated in precise sequences to generate the elaborate vocalizations that characterize an individual's song. Previously we used a low-dimensional description of the biomechanics of birdsong production to investigate the associated neural codes, an approach that complements traditional spectrographic analysis. The prior study used algorithmic yet manual procedures to model singing behavior. In the present work, we present an automatic procedure to extract low-dimensional motor gestures that could predict vocal behavior. We recorded zebra finch songs and generated synthetic copies automatically, using a biomechanical model for the vocal apparatus and vocal tract. This dynamical model described song as a sequence of physiological parameters the birds control during singing. To validate this procedure, we recorded electrophysiological activity of the telencephalic nucleus HVC. HVC neurons were highly selective to the auditory presentation of the bird's own song (BOS) and gave similar selective responses to the automatically generated synthetic model of song (AUTO). Our results demonstrate meaningful dimensionality reduction in terms of physiological parameters that individual birds could actually control. Furthermore, this methodology can be extended to other vocal systems to study fine motor control.
Fil: Boari, Santiago. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Yonatan Sanz Perl. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Amador, Ana. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina
Fil: Margoliash, Daniel. University of Chicago; Estados Unidos
Fil: Mindlin, Bernardo Gabriel. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina |
| description |
Highly coordinated learned behaviors are key to understanding neural processes integrating the body and the environment. Birdsong production is a widely studied example of such behavior in which numerous thoracic muscles control respiratory inspiration and expiration: the muscles of the syrinx control syringeal membrane tension, while upper vocal tract morphology controls resonances that modulate the vocal system output. All these muscles have to be coordinated in precise sequences to generate the elaborate vocalizations that characterize an individual's song. Previously we used a low-dimensional description of the biomechanics of birdsong production to investigate the associated neural codes, an approach that complements traditional spectrographic analysis. The prior study used algorithmic yet manual procedures to model singing behavior. In the present work, we present an automatic procedure to extract low-dimensional motor gestures that could predict vocal behavior. We recorded zebra finch songs and generated synthetic copies automatically, using a biomechanical model for the vocal apparatus and vocal tract. This dynamical model described song as a sequence of physiological parameters the birds control during singing. To validate this procedure, we recorded electrophysiological activity of the telencephalic nucleus HVC. HVC neurons were highly selective to the auditory presentation of the bird's own song (BOS) and gave similar selective responses to the automatically generated synthetic model of song (AUTO). Our results demonstrate meaningful dimensionality reduction in terms of physiological parameters that individual birds could actually control. Furthermore, this methodology can be extended to other vocal systems to study fine motor control. |
| publishDate |
2015 |
| dc.date.none.fl_str_mv |
2015-09 |
| dc.type.none.fl_str_mv |
info:eu-repo/semantics/article info:eu-repo/semantics/publishedVersion http://purl.org/coar/resource_type/c_6501 info:ar-repo/semantics/articulo |
| format |
article |
| status_str |
publishedVersion |
| dc.identifier.none.fl_str_mv |
http://hdl.handle.net/11336/47891 Boari, Santiago; Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel; Automatic reconstruction of physiological gestures used in a model of birdsong production; American Physiological Society; Journal of Neurophysiology; 114; 5; 9-2015; 2912-2922 0022-3077 CONICET Digital CONICET |
| url |
http://hdl.handle.net/11336/47891 |
| identifier_str_mv |
Boari, Santiago; Yonatan Sanz Perl; Amador, Ana; Margoliash, Daniel; Mindlin, Bernardo Gabriel; Automatic reconstruction of physiological gestures used in a model of birdsong production; American Physiological Society; Journal of Neurophysiology; 114; 5; 9-2015; 2912-2922 0022-3077 CONICET Digital CONICET |
| dc.language.none.fl_str_mv |
eng |
| language |
eng |
| dc.relation.none.fl_str_mv |
info:eu-repo/semantics/altIdentifier/url/http://jn.physiology.org/content/114/5/2912 info:eu-repo/semantics/altIdentifier/doi/10.1152/jn.00385.2015 |
| dc.rights.none.fl_str_mv |
info:eu-repo/semantics/openAccess https://creativecommons.org/licenses/by-nc-sa/2.5/ar/ |
| eu_rights_str_mv |
openAccess |
| rights_invalid_str_mv |
https://creativecommons.org/licenses/by-nc-sa/2.5/ar/ |
| dc.format.none.fl_str_mv |
application/pdf application/pdf application/pdf application/pdf application/pdf |
| dc.publisher.none.fl_str_mv |
American Physiological Society |
| publisher.none.fl_str_mv |
American Physiological Society |
| dc.source.none.fl_str_mv |
reponame:CONICET Digital (CONICET) instname:Consejo Nacional de Investigaciones Científicas y Técnicas |
| reponame_str |
CONICET Digital (CONICET) |
| collection |
CONICET Digital (CONICET) |
| instname_str |
Consejo Nacional de Investigaciones Científicas y Técnicas |
| repository.name.fl_str_mv |
CONICET Digital (CONICET) - Consejo Nacional de Investigaciones Científicas y Técnicas |
| repository.mail.fl_str_mv |
dasensio@conicet.gov.ar; lcarlino@conicet.gov.ar |
| _version_ |
1854320920838012928 |
| score |
13.113929 |