Behavioral, modeling, and electrophysiological evidence for supramodality in human metacognition

Authors
Faivre, Nathan; Filevich, Elisa; Solovey, Guillermo; Kühn, Simone; Blanke, Olaf
Year of publication
2018
Language
English
Resource type
article
Status
published version
Description
Human metacognition, or the capacity to introspect on one’s own mental states, has been mostly characterized through confidence reports in visual tasks. A pressing question is to what extent results from visual studies generalize to other domains. Answering this question allows determining whether metacognition operates through shared, supramodal mechanisms or through idiosyncratic, modality-specific mechanisms. Here, we report three new lines of evidence for decisional and postdecisional mechanisms arguing for the supramodality of metacognition. First, metacognitive efficiency correlated among auditory, tactile, visual, and audiovisual tasks. Second, confidence in an audiovisual task was best modeled using supramodal formats based on integrated representations of auditory and visual signals. Third, confidence in correct responses involved similar electrophysiological markers for visual and audiovisual tasks that are associated with motor preparation preceding the perceptual judgment. We conclude that the supramodality of metacognition relies on supramodal confidence estimates and decisional signals that are shared across sensory modalities.
Affiliation: Faivre, Nathan. Swiss Federal Institute of Technology Zurich; Switzerland. Centre National de la Recherche Scientifique; France
Affiliation: Filevich, Elisa. Max Planck Institute for Human Development; Germany. Humboldt Universität zu Berlin; Germany. Bernstein Center for Computational Neuroscience Berlin; Germany
Affiliation: Solovey, Guillermo. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Instituto de Cálculo; Argentina
Affiliation: Kühn, Simone. Max Planck Institute for Human Development; Germany. University Medical Center Hamburg-Eppendorf; Germany
Affiliation: Blanke, Olaf. Swiss Federal Institute of Technology Zurich; Switzerland. Universidad de Ginebra; Switzerland
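
Illustration. The first line of evidence in the abstract rests on signal-detection-theoretic measures of metacognition computed per task and correlated across participants. Below is a minimal, hypothetical Python sketch of that kind of analysis, not the authors' pipeline: the paper quantifies metacognitive efficiency as meta-d'/d', whereas this sketch uses the simpler type-2 AUROC (how well confidence separates correct from incorrect trials) as a stand-in, alongside type-1 d'. The simulated data, variable names, and thresholds are all illustrative assumptions.

# Minimal sketch (not the authors' analysis code): per-task type-1 sensitivity (d')
# and a simple type-2 measure of metacognitive sensitivity (AUROC2), then a
# correlation of metacognitive scores across two tasks over participants.
import numpy as np
from scipy.stats import norm, mannwhitneyu, spearmanr

def d_prime(stimulus, response):
    """Type-1 sensitivity from hit and false-alarm rates (with a correction for extreme rates)."""
    stimulus, response = np.asarray(stimulus), np.asarray(response)
    hits = np.sum((stimulus == 1) & (response == 1))
    fas = np.sum((stimulus == 0) & (response == 1))
    hr = (hits + 0.5) / (np.sum(stimulus == 1) + 1)
    far = (fas + 0.5) / (np.sum(stimulus == 0) + 1)
    return norm.ppf(hr) - norm.ppf(far)

def auroc2(correct, confidence):
    """Type-2 sensitivity: how well confidence discriminates correct from incorrect trials."""
    correct, confidence = np.asarray(correct, bool), np.asarray(confidence, float)
    conf_correct, conf_error = confidence[correct], confidence[~correct]
    if len(conf_correct) == 0 or len(conf_error) == 0:
        return np.nan
    u, _ = mannwhitneyu(conf_correct, conf_error, alternative="two-sided")
    return u / (len(conf_correct) * len(conf_error))  # Mann-Whitney U / (n1*n2) = AUC

# Toy usage: simulate two tasks for 20 participants who share a "metacognitive ability"
# factor, then test whether type-2 scores correlate across tasks, as in the behavioral analysis.
rng = np.random.default_rng(0)
scores = {"visual": [], "auditory": []}
dprimes = {"visual": [], "auditory": []}
for _ in range(20):
    meta_noise = rng.uniform(0.5, 1.5)               # shared confidence-noise factor
    for task in scores:
        stim = rng.integers(0, 2, 400)
        evidence = stim + rng.normal(0, 1, 400)      # noisy internal evidence
        resp = (evidence > 0.5).astype(int)
        correct = resp == stim
        conf = np.abs(evidence - 0.5) + rng.normal(0, meta_noise, 400)
        scores[task].append(auroc2(correct, conf))
        dprimes[task].append(d_prime(stim, resp))
for task in scores:
    print(f"{task}: mean d' = {np.mean(dprimes[task]):.2f}")
rho, p = spearmanr(scores["visual"], scores["auditory"])
print(f"Across-task Spearman rho = {rho:.2f}, p = {p:.3f}")

A positive across-task correlation of the type-2 scores is the pattern the shared metacognitive-noise factor produces here; estimating meta-d'/d' instead of AUROC2 would additionally control for differences in type-1 performance between tasks, which is why the paper reports metacognitive efficiency.
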
Subject
AUDIOVISUAL
CONFIDENCE
EEG
METACOGNITION
SIGNAL DETECTION THEORY
SUPRAMODALITY
Accessibility level
open access
Terms of use
https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
Repository
CONICET Digital (CONICET)
Institution
Consejo Nacional de Investigaciones Científicas y Técnicas
OAI identifier
oai:ri.conicet.gov.ar:11336/60265

Publisher
Society for Neuroscience
Publication date
2018-01
Citation
Faivre, Nathan; Filevich, Elisa; Solovey, Guillermo; Kühn, Simone; Blanke, Olaf; Behavioral, modeling, and electrophysiological evidence for supramodality in human metacognition; Society for Neuroscience; Journal of Neuroscience; 38; 2; 1-2018; 263-277
ISSN
0270-6474
DOI
10.1523/JNEUROSCI.0322-17.2017
Journal URL
http://www.jneurosci.org/content/38/2/263
Handle
http://hdl.handle.net/11336/60265
Format
application/pdf
Subject classification (FORD)
https://purl.org/becyt/ford/1.6
https://purl.org/becyt/ford/1