A pairwise subspace projection method for multi-class linear dimension reduction

Authors
Tomassi, Diego
Year of publication
2012
Language
English
Resource type
conference paper
Status
published version
Description
Linear feature extraction is commonly applied in an all-at-once way, meaning that a single transformation is used for all the data regardless of the classes. Very good results can be achieved with this approach when the classification problem involves just a few classes. Nevertheless, when the number of classes grows, it is often difficult to find a low-dimensional subspace while preserving the error rates, due to overlap between the different populations. In this paper we propose an alternative method based on a collection of transformations, each involving two of the classes in the problem. Each transformation in the collection is estimated using an approximation to information discriminant analysis, which is found to be equivalent to sufficient dimension reduction for heteroscedastic Gaussian data. A regularized version of the objective function is also introduced, allowing for simultaneous variable selection. In this way, each reduction involves only a subset of the original variables. A probabilistic model is built by means of a simple latent variable, so that classification is carried out using the standard Bayes decision rule. Several real data sets are used to compare the performance of the proposed method against similar approaches based on ensembles of binary classifiers.
Sociedad Argentina de Informática e Investigación Operativa
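To make the pipeline described in the abstract concrete, the sketch below illustrates the pairwise idea at a toy level: one linear reduction is fit for every pair of classes, and a new point is classified by combining the pairwise Gaussian scores with a Bayes-style rule. This is a minimal sketch under stated assumptions, not the authors' implementation: the class name PairwiseSubspaceClassifier is hypothetical, a plain Fisher direction stands in for the paper's regularized approximation to information discriminant analysis, and equal class priors are assumed when summing log-likelihoods.

import numpy as np
from itertools import combinations

class PairwiseSubspaceClassifier:
    """Toy pairwise-reduction classifier (illustrative only)."""

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        self.models_ = {}
        for a, b in combinations(self.classes_, 2):
            Xa, Xb = X[y == a], X[y == b]
            # Placeholder pairwise reduction: Fisher direction w = Sw^-1 (mu_a - mu_b),
            # used here instead of the paper's regularized IDA-based estimator.
            Sw = np.cov(Xa, rowvar=False) + np.cov(Xb, rowvar=False)
            w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]),
                                Xa.mean(axis=0) - Xb.mean(axis=0))
            za, zb = Xa @ w, Xb @ w
            # One-dimensional Gaussian class-conditional models in the reduced space.
            self.models_[(a, b)] = (w, (za.mean(), za.std() + 1e-9),
                                       (zb.mean(), zb.std() + 1e-9))
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        idx = {c: i for i, c in enumerate(self.classes_)}
        scores = np.zeros((X.shape[0], len(self.classes_)))
        for (a, b), (w, (ma, sa), (mb, sb)) in self.models_.items():
            z = X @ w
            # Accumulate pairwise Gaussian log-likelihoods (equal priors assumed).
            scores[:, idx[a]] += -0.5 * ((z - ma) / sa) ** 2 - np.log(sa)
            scores[:, idx[b]] += -0.5 * ((z - mb) / sb) ** 2 - np.log(sb)
        return self.classes_[np.argmax(scores, axis=1)]

In the method proposed in the paper, each pairwise reduction would instead be estimated from the regularized objective, so that only a subset of the original variables enters each projection; the sketch omits that variable-selection step.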
Subject
Computer Science
Pairwise subspace projection method
Multi-class linear dimension reduction
Access level
open access
Terms of use
http://creativecommons.org/licenses/by-nc-sa/4.0/
Repository
SEDICI (UNLP)
Institution
Universidad Nacional de La Plata
OAI Identifier
oai:sedici.unlp.edu.ar:10915/123721

Date
2012-08
Identifier
http://sedici.unlp.edu.ar/handle/10915/123721
Alternative identifier
https://41jaiio.sadio.org.ar/sites/default/files/5_ASAI_2012.pdf
ISSN
1850-2784
Format
application/pdf, pp. 48-58
License
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)