Mixture of experts with entropic regularization for data classification
dc.contributor.author | Peralta, Billy | |
dc.contributor.author | Saavedra, Ariel | |
dc.contributor.author | Caro, Luis | |
dc.contributor.author | Soto, Alvaro | |
dc.date.accessioned | 2022-10-24T13:47:25Z | |
dc.date.available | 2022-10-24T13:47:25Z | |
dc.date.issued | 2019-02 | |
dc.description | Indexed in: Scopus | es
dc.description.abstract | Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendations, intrusion detection, and people recognition. "Mixture-of-experts" is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gate network, typically based on softmax functions, that allows the model to learn complex patterns in the data. In this scheme, one data point is influenced by only one expert; as a result, training can be misguided on real datasets in which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network, in order to avoid a "winner-takes-all" output of the gate. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3-6% on some datasets. In future work, we plan to embed feature selection into this model. © 2019 by the authors. | es
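The abstract describes the method compactly: the usual mixture-of-experts classification cost is penalized by the Shannon entropy of the gate, so that gate weights stay spread across experts rather than collapsing onto one. Below is a minimal sketch of that idea, assuming a PyTorch implementation with linear experts and a linear softmax gate; the class name EntropicMoE, the hyperparameter lam, and the architecture are illustrative assumptions, not the authors' code (see the paper at the DOI below for the actual formulation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntropicMoE(nn.Module):
    """Mixture-of-experts whose loss includes a gate-entropy penalty term."""

    def __init__(self, in_dim, n_classes, n_experts, lam=0.1):
        super().__init__()
        # Linear experts and a linear softmax gate keep the sketch minimal.
        self.experts = nn.ModuleList(
            nn.Linear(in_dim, n_classes) for _ in range(n_experts)
        )
        self.gate = nn.Linear(in_dim, n_experts)
        self.lam = lam  # weight of the entropy term (illustrative value)

    def forward(self, x):
        g = F.softmax(self.gate(x), dim=-1)                        # (B, E) gate weights
        logits = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, C)
        probs = F.softmax(logits, dim=-1)                          # per-expert class probs
        mix = torch.einsum("be,bec->bc", g, probs)                 # mixture prediction
        return mix, g

    def loss(self, x, y):
        mix, g = self(x)
        nll = F.nll_loss(torch.log(mix + 1e-12), y)  # classification cost
        # Shannon entropy of the gate distribution; subtracting it rewards
        # spread-out gate weights, discouraging winner-takes-all gating.
        ent = -(g * torch.log(g + 1e-12)).sum(dim=-1).mean()
        return nll - self.lam * ent

# Example usage with random data (shapes only; not the paper's datasets):
# model = EntropicMoE(in_dim=20, n_classes=3, n_experts=4)
# loss = model.loss(torch.randn(32, 20), torch.randint(0, 3, (32,)))
```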
dc.description.uri | https://www.mdpi.com/1099-4300/21/2/190 | |
dc.identifier.doi | 10.3390/e21020190 | |
dc.identifier.issn | 1099-4300 | |
dc.identifier.uri | https://repositorio.unab.cl/xmlui/handle/ria/24413 | |
dc.language.iso | en | es |
dc.publisher | MDPI AG | es |
dc.rights.license | Attribution 4.0 International (CC BY 4.0) | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/deed.es | |
dc.subject | Classification | es |
dc.subject | Entropy | es |
dc.subject | Mixture-of-experts | es |
dc.subject | Regularization | es |
dc.title | Mixture of experts with entropic regularization for data classification | es |
dc.type | Article | es
Files
Original bundle
1 - 1 of 1
- Name: entropy-21-00190.pdf
- Size: 3.52 MB
- Format: Adobe Portable Document Format
- Description: Entropy, Volume 21, Issue 2, February 2019, Article number 190
License bundle
1 - 1 of 1
- Name: license.txt
- Size: 1.71 KB
- Format: Item-specific license agreed upon to submission