Nonnegative matrix factorization with Gaussian process priors.

Schmidt MN, Laurberg H - Comput Intell Neurosci (2008)

Bottom Line: We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries.

View Article: PubMed Central - PubMed

Affiliation: Department of Informatics and Mathematical Modelling, Technical University of Denmark, Richard Petersens Plads, DTU-Building 321, 2800 Lyngby, Denmark. mns@imm.dtu.dk

ABSTRACT
We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries. The method is demonstrated with an example from chemical shift brain imaging.

No MeSH data available.


Toy example: data matrix (upper left), underlying noise-free nonnegative data (upper right), and estimates using the four methods described in the text. The data has a fairly large amount of noise, and the underlying nonnegative factors are smooth in both directions. The LS-NMF and CNMF decompositions are nonsmooth since these methods do not model correlations in the factors. The GPP-NMF, which uses a smooth prior, finds a smooth solution. When using the correct prior, the solution is very close to the true underlying data.
© Copyright Policy - open-access




Mentions: We generated a 100 × 200 data matrix, Y, by taking a random sample from the GPP-NMF model with two factors. We chose the generating covariance function for both δ and η as a Gaussian radial basis function (RBF), φ(i, j) = exp(−(i − j)²/β²) (30), where i and j are two sample indices, and the length-scale parameter, which determines the smoothness of the factors, was β² = 100. We set the covariance between the two factors to zero, such that the factors were uncorrelated. For the matrix D, we used the rectified-Gaussian-to-Gaussian link function with s = 1; and for H, we used the exponential-to-Gaussian link function with λ = 1. Finally, we added independent Gaussian noise with variance σ_N² = 25, which resulted in a signal-to-noise ratio of approximately −7 dB. The generated data matrix is shown in Figure 1.
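The toy-data generation described above can be sketched as follows. This is a reconstruction under stated assumptions: the inverse-CDF forms of the two link functions (s·max(0, δ) for the rectified-Gaussian link with s = 1, and −log(1 − Φ(η))/λ for the exponential link with λ = 1) are inferred from the distribution names, and all variable names are illustrative rather than taken from the authors' code.

```python
import numpy as np
from scipy.special import ndtr  # standard normal CDF, Phi

rng = np.random.default_rng(0)

I, J, K = 100, 200, 2   # data size and number of factors
beta2 = 100.0           # RBF length-scale, beta^2
s, lam = 1.0, 1.0       # link-function parameters
sigma2_N = 25.0         # observation noise variance

def rbf_cov(n):
    """Gaussian RBF covariance phi(i, j) = exp(-(i - j)^2 / beta2)."""
    d = np.subtract.outer(np.arange(n), np.arange(n)).astype(float)
    return np.exp(-d ** 2 / beta2)

def gp_sample(n, k):
    """k independent zero-mean GP samples; jitter stabilizes the Cholesky."""
    L = np.linalg.cholesky(rbf_cov(n) + 1e-8 * np.eye(n))
    return L @ rng.standard_normal((n, k))

delta = gp_sample(I, K)  # underlying GP for D
eta = gp_sample(J, K)    # underlying GP for H (uncorrelated with delta)

# Rectified-Gaussian-to-Gaussian link: D = s * max(0, delta).
D = s * np.maximum(0.0, delta)
# Exponential-to-Gaussian link: H = -log(1 - Phi(eta)) / lam.
H = -np.log1p(-ndtr(eta)) / lam

X = D @ H.T  # noise-free nonnegative data, 100 x 200
Y = X + np.sqrt(sigma2_N) * rng.standard_normal((I, J))
```

The exact signal-to-noise ratio will vary with the random seed; the text's value of approximately −7 dB applies to the authors' particular draw.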

