From Imitation to Prediction, Data Compression vs Recurrent Neural Networks for Natural Language Processing
- Authors
- Laura, Juan Andrés; Masi, Gabriel Omar; Argerich, Luis
- Year of publication
- 2017
- Language
- English
- Resource type
- conference paper
- Status
- published version
- Description
- In recent studies [1][2][3], recurrent neural networks were used for generative processes, and their surprising performance can be explained by their ability to make good predictions. Data compression is likewise based on prediction. The question, then, is whether a data compressor can perform as well as recurrent neural networks on the natural language processing tasks of sentiment analysis and automatic text generation. If it can, the question becomes whether a compression algorithm is even more intelligent than a neural network at such tasks. In our journey we discovered what we believe is the fundamental difference between a data compression algorithm and a recurrent neural network.
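The abstract's central idea — that a compressor can classify text because compression is prediction — can be sketched with a toy zlib-based sentiment classifier. This is an illustrative assumption, not the authors' actual method (the paper compares against stronger compressors), and the names `classify`, `pos_corpus`, and `neg_corpus` are hypothetical:

```python
import zlib

def compressed_size(text: str) -> int:
    """Bytes needed by zlib (an LZ77-family compressor) to encode the text."""
    return len(zlib.compress(text.encode("utf-8")))

def classify(sample: str, pos_corpus: str, neg_corpus: str) -> str:
    """Label the sample by which corpus 'predicts' it better.

    The extra bytes needed to compress the sample appended to a corpus
    are small when the corpus is a good statistical model of the sample,
    i.e., when the compressor can predict the sample from the corpus.
    """
    d_pos = compressed_size(pos_corpus + " " + sample) - compressed_size(pos_corpus)
    d_neg = compressed_size(neg_corpus + " " + sample) - compressed_size(neg_corpus)
    return "positive" if d_pos < d_neg else "negative"
```

A sample sharing vocabulary with the positive corpus costs fewer extra bytes to encode after it, so it is labeled positive; the same mechanism, with a stronger predictive model inside the compressor, underlies the comparison the paper makes.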
- Publisher
- Sociedad Argentina de Informática e Investigación Operativa
- Subjects
- Ciencias Informáticas
- Data Compression Algorithm
- Neural nets
- Access level
- open access
- Terms of use
- http://creativecommons.org/licenses/by-sa/4.0/
- Repository
- SEDICI (UNLP)
- Institution
- Universidad Nacional de La Plata
- OAI identifier
- oai:sedici.unlp.edu.ar:10915/65946
- Record URL
- http://sedici.unlp.edu.ar/handle/10915/65946
- Alternate identifier
- http://www.clei2017-46jaiio.sadio.org.ar/sites/default/files/Mem/ASAI/asai-10.pdf
- ISSN
- 2451-7585
- Publication date
- 2017-09
- Format
- application/pdf