Alternating Local Enumeration (TnALE): solving tensor network structure search with fewer evaluations

Authors
Li, Chao; Zeng, Junhua; Li, Chunmei; Caiafa, César Federico; Zhao, Qibin
Publication year
2023
Language
English
Resource type
Conference paper
Status
Published version
Description
Tensor network (TN) is a powerful framework in machine learning, but selecting a good TN model, known as TN structure search (TN-SS), is a challenging and computationally intensive task. The recent approach TNLS (Li et al., 2022) showed promising results for this task. However, its computational efficiency is still unaffordable, requiring too many evaluations of the objective function. We propose TnALE, a surprisingly simple algorithm that updates each structure-related variable alternately by local enumeration, greatly reducing the number of evaluations compared to TNLS. We theoretically investigate the descent steps for TNLS and TnALE, proving that both the algorithms can achieve linear convergence up to a constant if a sufficient reduction of the objective is reached in each neighborhood. We further compare the evaluation efficiency of TNLS and TnALE, revealing that Ω(2^K) evaluations are typically required in TNLS for reaching the objective reduction, while ideally O(KR) evaluations are sufficient in TnALE, where K denotes the dimension of search space and R reflects the "low-rankness" of the neighborhood. Experimental results verify that TnALE can find practically good TN structures with vastly fewer evaluations than the state-of-the-art algorithms.
Affiliation: Li, Chao. RIKEN AIP; Japan
Affiliation: Zeng, Junhua. RIKEN AIP; Japan. Guangdong University of Technology; China
Affiliation: Li, Chunmei. RIKEN AIP; Japan. Harbin Engineering University; China
Affiliation: Caiafa, César Federico. Provincia de Buenos Aires. Gobernación. Comisión de Investigaciones Científicas. Instituto Argentino de Radioastronomía. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico Conicet - La Plata. Instituto Argentino de Radioastronomía; Argentina
Affiliation: Zhao, Qibin. RIKEN AIP; Japan
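The abstract describes TnALE as updating each structure-related variable alternately by local enumeration. The following is a minimal illustrative sketch of that coordinate-wise idea, not the authors' implementation: `objective`, `candidates_fn`, and all other names are hypothetical stand-ins for the actual TN-SS objective and structure neighborhoods used in the paper.

```python
# Hedged sketch: minimize a discrete objective by sweeping over coordinates
# (the "alternating" part) and enumerating a small local candidate set for
# each coordinate (the "local enumeration" part).

def alternating_local_enumeration(objective, x0, candidates_fn, n_sweeps=10):
    """Coordinate-wise discrete local search.

    objective: maps a tuple of ints to a float (lower is better).
    x0: initial point, a list of ints.
    candidates_fn: given (x, k), returns candidate values for coordinate k.
    Returns (best point, best value, number of objective evaluations).
    """
    x = list(x0)
    best = objective(tuple(x))
    evaluations = 1
    for _ in range(n_sweeps):
        improved = False
        for k in range(len(x)):            # alternate over coordinates
            for v in candidates_fn(x, k):  # enumerate local candidates
                if v == x[k]:
                    continue
                trial = list(x)
                trial[k] = v
                val = objective(tuple(trial))
                evaluations += 1
                if val < best:             # greedily keep the best move
                    best, x = val, trial
                    improved = True
        if not improved:                   # no coordinate improved: stop
            break
    return x, best, evaluations
```

With a neighborhood of size R per coordinate, one sweep costs about K·R evaluations, which is the per-neighborhood cost scaling the abstract attributes to TnALE.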
40th International Conference on Machine Learning
Honolulu
United States
Subjects
Tensor Network
Signal Processing
Machine Learning
Access level
Open access
Terms of use
https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
Repository
CONICET Digital (CONICET)
Institution
Consejo Nacional de Investigaciones Científicas y Técnicas
OAI identifier
oai:ri.conicet.gov.ar:11336/221893

Editors
Lawrence, Neil; Krause, Andreas
Publisher
MLR Press
Pages
20384-20411
ISSN
2640-3498
Handle
http://hdl.handle.net/11336/221893
Related links
http://proceedings.mlr.press/v202/li23ar/li23ar.pdf
https://proceedings.mlr.press/v202/