Efficient descriptor tree growing for fast action recognition

Authors
Ubalde, Sebastián; Goussies, Norberto Adrián; Mejail, Marta Estela
Publication year
2013
Language
English
Resource type
article
Status
published version
Description
Video and image classification based on Instance-to-Class (I2C) distance has attracted many recent studies, due to the good generalization capabilities it provides for non-parametric classifiers. In this work we propose a method for action recognition. Our approach needs no intensive learning stage, and its classification performance is comparable to the state of the art. A smart organization of the training data allows the classifier to achieve reasonable computation times when working with large training databases, and we propose an efficient method for organizing the training data in such a way. We perform thorough experiments on two popular action recognition datasets, the KTH dataset and the IXMAS dataset, and study the influence of one of the key parameters of the method on classification performance.
Affiliation: Ubalde, Sebastián. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Computación; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
Affiliation: Goussies, Norberto Adrián. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Computación; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
Affiliation: Mejail, Marta Estela. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Computación; Argentina
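
As an illustration of the I2C-distance idea summarized in the abstract, the following is a minimal NBNN-style sketch under assumptions of our own, not the authors' descriptor-tree method: the training descriptors of each class are indexed in a KD-tree (scipy's cKDTree stands in for the tree structure), and a test video is assigned to the class that minimizes the sum of squared distances from its descriptors to their nearest neighbors in that class. All function names and the random stand-in data are hypothetical.

# Minimal NBNN-style I2C classification sketch (illustrative only; not the
# paper's exact descriptor-tree construction). Assumes per-class descriptor
# matrices are available; scipy's cKDTree stands in for the tree structure.
import numpy as np
from scipy.spatial import cKDTree

def build_class_trees(train_descriptors_by_class):
    """Build one KD-tree per class from its stacked training descriptors."""
    return {label: cKDTree(np.vstack(descs))
            for label, descs in train_descriptors_by_class.items()}

def i2c_classify(query_descriptors, class_trees):
    """Return the class whose descriptor set minimizes the I2C distance,
    i.e. the sum of squared distances from each query descriptor to its
    nearest neighbor in that class."""
    best_label, best_dist = None, np.inf
    for label, tree in class_trees.items():
        nn_dists, _ = tree.query(query_descriptors, k=1)
        i2c = float(np.sum(nn_dists ** 2))
        if i2c < best_dist:
            best_label, best_dist = label, i2c
    return best_label

# Usage with random data (stand-in for local spatio-temporal descriptors):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = {"walk": [rng.normal(0.0, 1.0, (200, 64))],
             "run":  [rng.normal(0.5, 1.0, (200, 64))]}
    trees = build_class_trees(train)
    query = rng.normal(0.5, 1.0, (50, 64))  # descriptors from one test video
    print(i2c_classify(query, trees))

Organizing the descriptors of each class in a tree is what keeps query time manageable as the training set grows, which is the efficiency concern the abstract addresses.
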
Subject
Action Recognition
Nearest Neighbor
Instance-To-Class Distance
Access level
open access
Terms of use
https://creativecommons.org/licenses/by-nc-nd/2.5/ar/
Repository
CONICET Digital (CONICET)
Institution
Consejo Nacional de Investigaciones Científicas y Técnicas
OAI Identifier
oai:ri.conicet.gov.ar:11336/30404

Publisher
Elsevier Science
Publication date
2013-05
Citation
Ubalde, Sebastián; Goussies, Norberto Adrián; Mejail, Marta Estela; Efficient descriptor tree growing for fast action recognition; Elsevier Science; Pattern Recognition Letters; 36; 5-2013; 213-220
ISSN
0167-8655
DOI
10.1016/j.patrec.2013.05.007
Handle
http://hdl.handle.net/11336/30404
Publisher URL
http://www.sciencedirect.com/science/article/pii/S0167865513001979
FORD subject classification
https://purl.org/becyt/ford/1.2
https://purl.org/becyt/ford/1
Format
application/pdf