Nonnegative matrix factorization with Gaussian process priors.

Schmidt MN, Laurberg H - Comput Intell Neurosci (2008)

Bottom Line: We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries.


Affiliation: Department of Informatics and Mathematical Modelling, Technical University of Denmark, Richard Petersens Plads, DTU-Building 321, 2800 Lyngby, Denmark. mns@imm.dtu.dk

ABSTRACT
We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries. The method is demonstrated with an example from chemical shift brain imaging.
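To make the model described in the abstract concrete, the following is a minimal sketch, not the authors' implementation: an underlying Gaussian process with an assumed squared-exponential covariance is mapped through a strictly increasing link function (exp, chosen purely for illustration) to produce smooth nonnegative factors D and H whose product approximates the data. All function names, the covariance choice, the link function, and the dimensions are assumptions made for this example.

    # Minimal illustration of the GPP-NMF idea (a sketch, not the authors' code):
    # nonnegative factors are obtained by mapping underlying Gaussian-process
    # draws through a strictly increasing link function.
    import numpy as np

    rng = np.random.default_rng(0)

    def se_covariance(n, length_scale, variance=1.0, jitter=1e-8):
        """Squared-exponential covariance matrix on a 1-D grid (assumed choice)."""
        t = np.arange(n)[:, None]
        K = variance * np.exp(-0.5 * ((t - t.T) / length_scale) ** 2)
        return K + jitter * np.eye(n)

    def gp_draw(K, n_samples):
        """Draw n_samples zero-mean GP realizations with covariance K."""
        L = np.linalg.cholesky(K)
        return L @ rng.standard_normal((K.shape[0], n_samples))

    n_rows, n_cols, rank = 100, 200, 2

    # Underlying GPs: one per column of D and one per row of H.
    delta = gp_draw(se_covariance(n_rows, length_scale=10.0), rank)  # (n_rows, rank)
    eta = gp_draw(se_covariance(n_cols, length_scale=20.0), rank)    # (n_cols, rank)

    # Strictly increasing link function; exp is used here purely for illustration.
    D = np.exp(delta)      # smooth, nonnegative basis factors
    H = np.exp(eta).T      # smooth, nonnegative activations, shape (rank, n_cols)

    X = D @ H + 0.1 * rng.standard_normal((n_rows, n_cols))  # noisy observations

    print(X.shape, D.min() >= 0, H.min() >= 0)

In the paper, the link function and the covariance functions are what encode the prior knowledge; the exp link and squared-exponential kernel above merely stand in for any strictly increasing link and smooth covariance.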

No MeSH data available.


Figure 2: Underlying nonnegative factors in the toy example: columns of D (left) and rows of H (right). The factors found by the LS-NMF and the CNMF algorithms are noisy, whereas the factors found by the GPP-NMF method are smooth. When using the correct prior, the factors found are very similar to the true factors.

Mentions: Plots of the estimated factors are shown in Figure 2. The factors estimated by the LS-NMF and the CNMF methods appear noisy and nonsmooth, whereas the factors estimated by the GPP-NMF are smooth. The factors estimated by the LS-NMF method have a positive bias because of the truncation of negative data. The GPP-NMF with the incorrect prior has too many local extrema in the D factor and too few in the H factor, due to the incorrect covariance functions. There are only minor differences between the result of the GPP-NMF with the correct prior and the underlying factors.

