Fast non-parametric action recognition

Authors
Ubalde, S.; Goussies, N.A.
Year of publication
2012
Language
English
Resource type
article
Status
published version
Description
In this work we propose a method for action recognition that requires no intensive learning stage and achieves state-of-the-art classification performance. Our work is based on a method originally presented in the context of image classification. Unlike that method, our approach is well suited to large real-world problems, thanks to an efficient organization of the training data. We report results on the KTH and IXMAS datasets; on the challenging IXMAS dataset, our method reduces the average running time by 50%. © 2012 Springer-Verlag.
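A minimal illustrative sketch of the image-to-class nearest-neighbor idea that the abstract and the keywords point to, assuming local descriptors pooled per class and one kd-tree per class as a plausible way to "organize the training data" efficiently; the class labels, descriptor dimensionality, and all identifiers below are hypothetical and are not taken from the paper.

# Hedged sketch: image-to-class (video-to-class) nearest-neighbor classification.
# Assumptions: local descriptors are plain NumPy vectors and a kd-tree per class
# indexes the training pool; this is not the paper's actual implementation.
import numpy as np
from scipy.spatial import cKDTree

class ImageToClassNN:
    def fit(self, descriptors_per_class):
        # descriptors_per_class: dict mapping class label -> (n_c, d) array of
        # local descriptors pooled from all training videos of that class
        self.trees = {label: cKDTree(desc)
                      for label, desc in descriptors_per_class.items()}
        return self

    def predict(self, query_descriptors):
        # assign the query video to the class minimizing the summed squared
        # descriptor-to-class nearest-neighbor distances (image-to-class distance)
        best_label, best_cost = None, float("inf")
        for label, tree in self.trees.items():
            dists, _ = tree.query(query_descriptors, k=1)
            cost = float(np.sum(dists ** 2))
            if cost < best_cost:
                best_label, best_cost = label, cost
        return best_label

# Toy usage with random descriptors; labels and dimensions are arbitrary.
rng = np.random.default_rng(0)
train = {
    "walk": rng.normal(0.0, 1.0, size=(200, 32)),
    "wave": rng.normal(3.0, 1.0, size=(150, 32)),
}
clf = ImageToClassNN().fit(train)
print(clf.predict(rng.normal(3.0, 1.0, size=(40, 32))))  # expected to print "wave"

In practice the descriptors would be spatio-temporal features extracted from the videos, and the per-class index is what keeps query time manageable as the training set grows.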
Source
Lect. Notes Comput. Sci. 2012;7441 LNCS:268-275
Subjects
action recognition
image-to-class distance
nearest neighbor
Average running time
Classification performance
Data sets
Nearest neighbors
Non-parametric
Real-world problem
Training data
Image analysis
Computer vision
Access level
open access
Terms of use
http://creativecommons.org/licenses/by/2.5/ar
Repository
Biblioteca Digital (UBA-FCEN)
Institution
Universidad Nacional de Buenos Aires. Facultad de Ciencias Exactas y Naturales
OAI identifier
paperaa:paper_03029743_v7441LNCS_n_p268_Ubalde
URL
http://hdl.handle.net/20.500.12110/paper_03029743_v7441LNCS_n_p268_Ubalde
