Direct Incorporation of $L_1$-Regularization into Generalized Matrix Learning Vector Quantization

Open Access

  • false

Peer Reviewed

  • false

Abstract

  • Frequently, high-dimensional features are used to represent data to be classified. This paper proposes a new approach to learning interpretable classification models from such high-dimensional data representations. To this end, we extend a popular prototype-based classification algorithm, matrix learning vector quantization, to incorporate an enhanced feature selection objective via $L_1$-regularization. In contrast to previous work, we propose a framework that directly optimizes this objective using the alternating direction method of multipliers (ADMM) and manifold optimization. We evaluate our method on synthetic data and on real data for speech-based emotion recognition. In particular, we show that our method achieves state-of-the-art results on the Berlin Database of Emotional Speech and demonstrate its ability to select relevant dimensions from the eGeMAPS set of audio features.
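The optimization strategy the abstract names, ADMM applied to an $L_1$-regularized matrix objective, can be illustrated with a short generic sketch. The code below implements a standard inexact ADMM loop for $\min_\Omega f(\Omega) + \lambda \lVert \Omega \rVert_1$ with the split $\Omega = Z$; a simple quadratic loss stands in for the GMLVQ cost, and all function names and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(X, tau):
    """Proximal operator of tau * ||X||_1 (elementwise soft-thresholding)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def admm_l1(grad_f, omega0, lam=0.1, rho=1.0, step=0.01, n_iter=200):
    """Generic ADMM for min_Omega f(Omega) + lam * ||Omega||_1.

    grad_f : gradient of the smooth loss f (a stand-in for the GMLVQ cost).
    The Omega-update is approximated by a single gradient step on the
    augmented Lagrangian, a common inexact-ADMM simplification.
    """
    omega = omega0.copy()
    z = omega0.copy()
    u = np.zeros_like(omega0)  # scaled dual variable
    for _ in range(n_iter):
        # Omega-update: gradient step on f(Omega) + (rho/2)||Omega - Z + U||^2
        omega = omega - step * (grad_f(omega) + rho * (omega - z + u))
        # Z-update: proximal step, i.e. elementwise soft-thresholding
        z = soft_threshold(omega + u, lam / rho)
        # dual update
        u = u + omega - z
    return z  # the sparse iterate

# Toy usage: f(Omega) = 0.5 * ||Omega - A||_F^2 stands in for the GMLVQ loss.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
omega_sparse = admm_l1(lambda W: W - A, np.zeros((5, 5)), lam=0.5)
print(np.round(omega_sparse, 2))
```

The soft-thresholding Z-update is what drives entries of the relevance matrix exactly to zero, which is the mechanism behind the feature selection behavior the abstract describes.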

Publication Date

  • November 5, 2018