Adaptive relevance matrices in learning vector quantization

Neural Comput. 2009 Dec;21(12):3532-61. doi: 10.1162/neco.2009.11-08-908.

Abstract

We propose a new matrix learning scheme that extends relevance learning vector quantization (RLVQ), an efficient prototype-based classification algorithm, toward a general adaptive metric. By introducing a full matrix of relevance factors into the distance measure, correlations between different features and their importance for the classification task can be taken into account; automated, general metric adaptation takes place during training. In comparison to the weighted Euclidean metric used in RLVQ and its variants, a full matrix is more powerful in representing the internal structure of the data appropriately. Large-margin generalization bounds can be transferred to this case, leading to bounds that are independent of the input dimensionality. This also holds for local metrics attached to individual prototypes, which correspond to piecewise quadratic decision boundaries. The algorithm is tested against alternative learning vector quantization schemes on an artificial data set, a benchmark multiclass problem from the UCI repository, and a problem from bioinformatics, the recognition of splice sites in C. elegans.
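The core idea can be sketched as follows: the squared Euclidean distance is replaced by a generalized quadratic form d(x, w) = (x − w)ᵀ Λ (x − w), where the relevance matrix Λ is parameterized as Λ = Ωᵀ Ω to guarantee positive semi-definiteness. The snippet below is a minimal illustrative sketch of this distance and a nearest-prototype classification step; the function and variable names are ours, and the actual gradient-based update rules for Ω and the prototypes are given in the paper.

```python
import numpy as np

def matrix_distance(x, w, omega):
    """Generalized distance d(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega, which is positive semi-definite
    by construction (so d >= 0 for all inputs)."""
    diff = np.asarray(x, dtype=float) - np.asarray(w, dtype=float)
    lam = omega.T @ omega
    return float(diff @ lam @ diff)

def classify(x, prototypes, labels, omega):
    """Assign x the label of the nearest prototype under the
    adaptive metric (illustrative nearest-prototype rule)."""
    dists = [matrix_distance(x, w, omega) for w in prototypes]
    return labels[int(np.argmin(dists))]

# With Omega = identity, the measure reduces to the squared
# Euclidean distance: here (1-0)^2 + (2-0)^2 = 5.
x = np.array([1.0, 2.0])
w = np.array([0.0, 0.0])
print(matrix_distance(x, w, np.eye(2)))  # 5.0
```

Using one Ω shared by all prototypes gives a single global metric; attaching a separate Ω to each prototype yields the local metrics mentioned above, whose class borders are piecewise quadratic rather than piecewise linear.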

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Brain / physiology
  • Electronic Data Processing
  • Humans
  • Information Theory*
  • Learning / physiology*
  • Probability Learning*