Shared Gaussian Process Latent Variable Models

This is a dissertation from Oxford University Press

Abstract: A fundamental task in machine learning is modeling the relationship between different observation spaces. Dimensionality reduction is the task of reducing the number of dimensions in a parameterization of a data set. In this thesis we are interested in the crossroads between these two tasks: shared dimensionality reduction. Shared dimensionality reduction aims to represent multiple observation spaces within the same model. Previously suggested models have been limited to scenarios where the observations have been generated from the same manifold. In this thesis we present a Gaussian Process Latent Variable Model (GP-LVM) [33] for shared dimensionality reduction that makes no assumptions about the relationship between the observations. Further, we suggest an extension to Canonical Correlation Analysis (CCA) called Non Consolidating Component Analysis (NCCA). The proposed algorithm extends classical CCA to represent the full variance of the data, as opposed to only the correlated variance. We compare the suggested GP-LVM model to existing models and show results on real-world problems exemplifying the advantages of our approach.
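To make the CCA connection concrete, the sketch below implements classical CCA, the baseline that NCCA extends: it recovers only the directions along which the two observation spaces are maximally correlated, discarding the remaining private variance that, per the abstract, NCCA is designed to retain. This is a minimal NumPy sketch of the textbook algorithm, not code from the thesis; the function name cca and the ridge term reg are illustrative choices.

    import numpy as np

    def cca(Y1, Y2, d, reg=1e-8):
        # Classical CCA via an SVD of the whitened cross-covariance.
        # Y1: (n, p) samples from the first observation space.
        # Y2: (n, q) samples from the second observation space.
        # d:  number of canonical directions to return.
        Y1 = Y1 - Y1.mean(axis=0)
        Y2 = Y2 - Y2.mean(axis=0)
        n = Y1.shape[0]
        C11 = Y1.T @ Y1 / n + reg * np.eye(Y1.shape[1])
        C22 = Y2.T @ Y2 / n + reg * np.eye(Y2.shape[1])
        C12 = Y1.T @ Y2 / n

        def inv_sqrt(C):
            # Inverse matrix square root of a symmetric positive definite matrix.
            w, V = np.linalg.eigh(C)
            return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

        # Whitened cross-covariance; its singular values are the canonical correlations.
        K = inv_sqrt(C11) @ C12 @ inv_sqrt(C22)
        U, s, Vt = np.linalg.svd(K)
        W1 = inv_sqrt(C11) @ U[:, :d]   # projection for space one
        W2 = inv_sqrt(C22) @ Vt[:d].T   # projection for space two
        return W1, W2, s[:d]

The projections Y1 @ W1 and Y2 @ W2 are then maximally correlated column by column; the variance orthogonal to these shared directions is exactly what classical CCA drops and what the abstract says NCCA represents.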
