, an application of PCA, using the plot of eigenvalues. A typical choice of the number of components in PCA is based on the "elbow" point; the existence of a flat plateau then indicates that PCA is not capturing the data efficiently; and, finally, a sudden drop reflects the capture of random noise, falling into the regime of over-fitting. For sequential NMF, the plot of eigenvalues is approximated by the plot of the fractional residual variance curves, where the curves decrease continuously and converge to a higher level than PCA, an indication that sequential NMF over-fits less.
However, if the noise is non-stationary, classical denoising algorithms usually perform poorly because the statistical properties of non-stationary noise are difficult to estimate. Schmidt et al. use NMF to do speech denoising under non-stationary noise, which is completely different from classical statistical approaches. The key idea is that a clean speech signal can be sparsely represented by a speech dictionary, but non-stationary noise cannot; similarly, non-stationary noise can be sparsely represented by a noise dictionary, but speech cannot.
, with the property that all three matrices have no negative elements. This non-negativity makes the resulting matrices easier to inspect. Also, in applications such as processing of audio spectrograms or muscular activity, non-negativity is inherent to the data being considered. Since the problem is not exactly solvable in general, it is commonly approximated numerically.
where forward modeling has to be adopted to recover the true flux. Forward modeling is currently optimized for point sources, but not for extended sources, especially for irregularly shaped structures such as circumstellar disks. In this situation, NMF has been an excellent method, over-fitting less thanks to the non-negativity and
, where there may be many users and many items to recommend, and it would be inefficient to recalculate everything when one user or one item is added to the system. The cost function for optimization in these cases may or may not be the same as for standard NMF, but the algorithms need to be rather different.
data and finding the genes most representative of the clusters. In the analysis of cancer mutations it has been used to identify common patterns of mutations that occur in many cancers and that probably have distinct causes. NMF techniques can identify sources of variation such as cell types, disease
The data imputation procedure with NMF consists of two steps. First, when the NMF components are known, Ren et al. (2020) proved that the impact from missing data during data imputation ("target modeling" in their study) is a second-order effect. Second, when the NMF components are unknown, the
in statistics. By first proving that the missing data are ignored in the cost function, and then proving that the impact from missing data can be as small as a second-order effect, Ren et al. (2020) studied and applied such an approach to the field of astronomy. Their work focuses on two-dimensional
In direct imaging, to reveal the faint exoplanets and circumstellar disks from the bright surrounding stellar light, which has a typical contrast from 10⁻⁴ to 10⁻¹⁰, various statistical methods have been adopted; however, the light from the exoplanets or circumstellar disks is usually over-fitted,
in the sense that astrophysical signals are non-negative. NMF has been applied to spectroscopic observations and direct imaging observations as a method to study the common properties of astronomical objects and to post-process the astronomical observations. The advances in the spectroscopic
Wahhaj, Zahed; Cieza, Lucas A.; Mawet, Dimitri; Yang, Bin; Canovas, Hector; de Boer, Jozua; Casassus, Simon; Ménard, François; Schreiber, Matthias R.; Liu, Michael C.; Biller, Beth A.; Nielsen, Eric L.; Hayward, Thomas L. (2015). "Improving signal-to-noise in the direct imaging of exoplanets and
Fractional residual variance (FRV) plots for PCA and sequential NMF; for PCA, the theoretical values are the contribution from the residual eigenvalues. In comparison, the FRV curves for PCA reach a flat plateau where no signal is captured effectively, while the NMF FRV curves decline
Current algorithms are sub-optimal in that they only guarantee finding a local minimum, rather than a global minimum of the cost function. A provably optimal algorithm is unlikely in the near future as the problem has been shown to generalize the k-means clustering problem which is known to be
non-negative matrix factorization has a long history under the name "self modeling curve resolution". In this framework the vectors in the right matrix are continuous curves rather than discrete vectors. Also early work on non-negative matrix factorizations was performed by a
Finnish group of
The algorithm for NMF denoising goes as follows. Two dictionaries, one for speech and one for noise, need to be trained offline. Once a noisy speech signal is given, we first calculate the magnitude of its short-time Fourier transform. Second, we separate it into two parts via NMF, such that one part can be sparsely
Depending on the way that the NMF components are obtained, the former step above can be either independent of or dependent on the latter. In addition, the imputation quality can be increased when more NMF components are used; see Figure 4 of Ren et al. (2020) for their illustration.
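A common way to realize such missing-data handling, sketched here under the usual weighted-NMF formulation rather than Ren et al.'s exact procedure, is to weight the squared-error cost with a binary mask over the observed entries; the multiplicative updates then only ever touch observed data:

```python
import numpy as np

def nmf_with_missing(V, M, r, n_iter=300, eps=1e-9, seed=0):
    """Minimize || M * (V - W @ H) ||_F^2 where M is a 0/1 mask
    (1 = observed), so missing entries contribute nothing to the cost."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    Vm = M * V                                   # zero out missing entries
    for _ in range(n_iter):
        H *= (W.T @ Vm) / (W.T @ (M * (W @ H)) + eps)
        W *= (Vm @ H.T) / ((M * (W @ H)) @ H.T + eps)
    return W, H

# Impute: keep observed values, fill the rest with the reconstruction.
rng = np.random.default_rng(3)
V = rng.random((30, 20))
M = (rng.random(V.shape) > 0.2).astype(float)    # roughly 20% missing
W, H = nmf_with_missing(V, M, r=5)
V_imputed = np.where(M == 1, V, W @ H)
```

The helper name and the mask-based formulation are illustrative assumptions; the point is only that the mask keeps unobserved entries out of both numerator and denominator of the updates.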
When multiplying matrices, the dimensions of the factor matrices may be significantly lower than those of the product matrix and it is this property that forms the basis of NMF. NMF generates factors with significantly reduced dimensions compared to the original matrix. For example, if
measurements. This kind of method was first introduced in
Internet Distance Estimation Service (IDES). Afterwards, as a fully decentralized approach, the Phoenix network coordinate system was proposed. It achieves better overall prediction accuracy by introducing the concept of weight.
Scalability: how to factorize million-by-billion matrices, which are commonplace in Web-scale data mining, e.g., see
Distributed Nonnegative Matrix Factorization (DNMF), Scalable Nonnegative Matrix Factorization (ScalableNMF), Distributed Stochastic Singular Value
Arora, Ge, Halpern, Mimno, Moitra, Sontag, Wu, & Zhu (2013) have given polynomial-time algorithms to learn topic models using NMF. The algorithm assumes that the topic matrix satisfies a separability condition that is often found to hold in these settings.
in sampled genomes. In human genetic clustering, NMF algorithms provide estimates similar to those of the computer program STRUCTURE, but the algorithms are more efficient computationally and allow analysis of large population genomic data sets.
(PCA) in astronomy. The contributions from the PCA components are ranked by the magnitude of their corresponding eigenvalues; for NMF, its components can be ranked empirically when they are constructed one by one (sequentially), i.e., learn the
Many standard NMF algorithms analyze all the data together; i.e., the whole matrix is available from the start. This may be unsatisfactory in applications where there are too many data to fit into memory or where the data are provided in
Hassani, Iranmanesh and
Mansouri (2019) proposed a feature agglomeration method for term-document matrices which operates using NMF. The algorithm reduces the term-document matrix into a smaller matrix more suitable for text clustering.
contains cluster membership indicators. This provides a theoretical foundation for using NMF for data clustering. However, k-means does not enforce non-negativity on its centroids, so the closest analogy is in fact with "semi-NMF".
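Schematically, the clustering is read off the coefficient matrix: the j-th data column joins the cluster with the largest entry in column j of H. A minimal sketch, in which `H` is a random placeholder standing in for a fitted coefficient matrix:

```python
import numpy as np

# Placeholder for the coefficients matrix of an NMF V ~ W @ H,
# with k = 3 components/clusters and n = 10 data points (columns).
H = np.random.default_rng(0).random((3, 10))

# Each data point is assigned to its dominant component.
labels = H.argmax(axis=0)   # one cluster label per column of V
```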
as a document archetype comprising a set of words where each word's cell value defines the word's rank in the feature: the higher a word's cell value, the higher the word's rank in the feature. A column in the coefficients matrix
represents an original document with a cell value defining the document's rank for a feature. We can now reconstruct a document (column vector) from our input matrix by a linear combination of our features (column vectors in
time in the dense case. Arora, Ge, Halpern, Mimno, Moitra, Sontag, Wu, & Zhu (2013) give a polynomial time algorithm for exact NMF that works for the case where one of the factors W satisfies a separability condition.
represent data sampled over spatial or temporal dimensions, e.g. time signals, images, or video, features that are equivariant w.r.t. shifts along these dimensions can be learned by
Convolutional NMF. In this case,
(SVM). However, SVM and NMF are related at a more intimate level than that of NQP, which allows direct application of the solution algorithms developed for either of the two methods to problems in both domains.
represented by the speech dictionary, and the other part can be sparsely represented by the noise dictionary. Third, the part that is represented by the speech dictionary will be the estimated clean speech.
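The three steps can be sketched as follows. This is a schematic illustration rather than Schmidt et al.'s actual implementation: the dictionaries below are random placeholders standing in for ones trained offline, and `fit_activations` is a hypothetical helper that runs multiplicative updates with the dictionary held fixed:

```python
import numpy as np

def fit_activations(V, W, n_iter=100, eps=1e-9, seed=0):
    """Estimate non-negative activations H with V ~ W @ H,
    keeping the dictionary W fixed (multiplicative updates on H only)."""
    rng = np.random.default_rng(seed)
    H = rng.random((W.shape[1], V.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
    return H

# Placeholder dictionaries; in practice their columns would be learned
# offline from clean speech and from noise recordings, respectively.
rng = np.random.default_rng(42)
n_freq, k_speech, k_noise = 64, 8, 8
W_speech = rng.random((n_freq, k_speech))
W_noise = rng.random((n_freq, k_noise))

V_noisy = rng.random((n_freq, 100))    # magnitude STFT of the noisy speech
W = np.hstack([W_speech, W_noise])     # stacked speech+noise dictionary
H = fit_activations(V_noisy, W)
V_clean_est = W_speech @ H[:k_speech]  # part explained by the speech dictionary
```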
Stein-O'Brien, Genevieve L.; Arora, Raman; Culhane, Aedin C.; Favorov, Alexander V.; Garmire, Lana X.; Greene, Casey S.; Goff, Loyal A.; Li, Yifeng; Ngom, Aloune; Ochs, Michael F.; Xu, Yanxun (2018-10-01).
Andrzej
Cichocki, Rafal Zdunek, Anh Huy Phan and Shun-ichi Amari: "Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-way Data Analysis and Blind Source Separation", Wiley,
To impute missing data in statistics, NMF can take missing data while minimizing its cost function, rather than treating these missing data as zeros. This makes it a mathematically proven method for
observations by
Blanton & Roweis (2007) takes into account the uncertainties of astronomical observations, which was later improved by Zhu (2016), where missing data are also considered and
Berry, Michael W.; Browne, Murray; Langville, Amy N.; Pauca, V. Paul; Plemmons, Robert J. (15 September 2007). "Algorithms and
Applications for Approximate Nonnegative Matrix Factorization".
Zhang, T.; Fang, B.; Liu, W.; Tang, Y. Y.; He, G.; Wen, J. (2008). "Total variation norm-based nonnegative matrix factorization for identifying discriminant representation of image patterns".
Cohen and
Rothblum 1993 problem: whether a rational matrix always has an NMF of minimal inner dimension whose factors are also rational. Recently, this problem has been answered negatively.
contains a monomial submatrix of rank equal to its rank was given by
Campbell and Poole in 1981. Kalofolias and Gallopoulos (2012) solved the symmetric counterpart of this problem, where
Other extensions of NMF include joint factorization of several data matrices and tensors where some factors are shared. Such models are useful for sensor fusion and relational learning.
of the NMF modeling coefficients, therefore forward modeling can be performed with a few scaling factors, rather than a computationally intensive data re-reduction on generated models.
{\displaystyle \mathbf {W} _{[i,j]}^{n+1}\leftarrow \mathbf {W} _{[i,j]}^{n}{\frac {(\mathbf {V} (\mathbf {H} ^{n+1})^{T})_{[i,j]}}{(\mathbf {W} ^{n}\mathbf {H} ^{n+1}(\mathbf {H} ^{n+1})^{T})_{[i,j]}}}}
This last point is the basis of NMF because we can consider each original document in our example as being built from a small set of hidden features. NMF generates these features.
{\displaystyle \mathbf {H} _{[i,j]}^{n+1}\leftarrow \mathbf {H} _{[i,j]}^{n}{\frac {((\mathbf {W} ^{n})^{T}\mathbf {V} )_{[i,j]}}{((\mathbf {W} ^{n})^{T}\mathbf {W} ^{n}\mathbf {H} ^{n})_{[i,j]}}}}
continuously, indicating a better ability to capture signal. The FRV curves for NMF also converge to higher levels than PCA, indicating the less-overfitting property of NMF.
with 10000 rows and 500 columns where words are in rows and documents are in columns. That is, we have 500 documents indexed by 10000 words. It follows that a column vector
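To make the storage arithmetic concrete: with, say, r = 10 hidden features (an illustrative choice, not one prescribed by the text), the two factors hold 10000·10 + 10·500 = 105,000 entries in place of the original 5,000,000. A minimal sketch:

```python
import numpy as np

m, n, r = 10000, 500, 10   # words x documents, with r hidden features
rng = np.random.default_rng(0)
V = rng.random((m, n))     # stand-in for the term-document matrix
W = rng.random((m, r))     # features matrix (words x features)
H = rng.random((r, n))     # coefficients matrix (features x documents)

print(V.size)              # 5000000 entries in the original matrix
print(W.size + H.size)     # 105000 entries in the two factors combined
```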
CĂ©dric FĂ©votte; Nancy Bertin & Jean-Louis Durrieu (2009). "Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis".
It was later shown that some types of NMF are an instance of a more general probabilistic model called "multinomial PCA". When NMF is obtained by minimizing the
Chistikov, Dmitry; Kiefer, Stefan; Marušić, Ines; Shirmohammadi, Mahsa; Worrell, James (2016-05-22). "Nonnegative Matrix Factorization Requires Irrationality".
Soummer, Rémi; Pueyo, Laurent; Larkin, James (2012). "Detection and Characterization of Exoplanets and Disks Using Projections on Karhunen-Loève Eigenimages".
, and shows that although the three techniques may be written as factorizations, they implement different constraints and therefore produce different results.
(1999). "The Multilinear Engine: A Table-Driven, Least Squares Program for Solving Multilinear Problems, including the n-Way Parallel Factor Analysis Model".
is defined on probability distributions). Each divergence leads to a different NMF algorithm, usually minimizing the divergence using iterative update rules.
C Ding, T Li, MI Jordan, Convex and semi-nonnegative matrix factorizations, IEEE Transactions on Pattern Analysis and Machine Intelligence, 32, 45-55, 2010
Collective (joint) factorization: factorizing multiple interrelated matrices for multiple-view learning, e.g. multi-view clustering, see CoNMF and MultiNMF
is constructed with the weights of various terms (typically weighted word frequency information) from a set of documents. This matrix is factored into a
may affect not only the rate of convergence, but also the overall error at convergence. Some options for initialization include complete randomization,
Naiyang Guan; Dacheng Tao; Zhigang Luo & Bo Yuan (July 2012). "Online Nonnegative Matrix Factorization With Robust Stochastic Approximation".
Hassani, Ali; Iranmanesh, Amir; Mansouri, Najme (2019-11-12). "Text Mining using Nonnegative Matrix Factorization and Latent Semantic Analysis".
. Proc. 28th international ACM SIGIR conference on Research and development in information retrieval (SIGIR-05). pp. 601–602. Archived from
Ren et al. (2018) proved the stability of NMF components when they are constructed sequentially (i.e., one by one), which enables the
Julian Becker: "Nonnegative Matrix Factorization with Adaptive Elements for Monaural Audio Source Separation: 1 ", Shaker Verlag GmbH, Germany,
tasks in order to predict novel protein targets and therapeutic indications for approved drugs and to infer pairs of synergistic anticancer drugs.
email dataset with 65,033 messages and 91,133 terms into 50 clusters. NMF has also been applied to citations data, with one example clustering
"DNA methylation profiling of medulloblastoma allows robust sub-classification and improved outcome prediction using formalin-fixed biopsies"
Ren, Bin; Pueyo, Laurent; Chen, Christine; Choquet, Elodie; Debes, John H; Duechene, Gaspard; Menard, Francois; Perrin, Marshall D. (2020).
Hafshejani, Sajad Fathi; Moaberfard, Zahra (November 2022). "Initialization for Nonnegative Matrix Factorization: a Comprehensive Review".
Naiyang Guan; Dacheng Tao; Zhigang Luo; Bo Yuan (June 2012). "NeNMF: An Optimal Gradient Method for Nonnegative Matrix Factorization".
Pentti Paatero; Unto Tapper; Pasi Aalto; Markku Kulmala (1991). "Matrix factorization methods for analysing diffusion battery data".
Andri Mirzal: "Nonnegative Matrix Factorizations for Clustering and LSI: Theory and Programming", LAP LAMBERT Academic Publishing,
Sitek; Gullberg; Huesman (2002). "Correction for ambiguous solutions in factor analysis using a penalized least squares objective".
. Proceedings of the 26th annual international ACM SIGIR conference on Research and development in information retrieval. New York:
Blanton, Michael R.; Roweis, Sam (2007). "K-corrections and filter transformations in the ultraviolet, optical, and near infrared".
NMF extends beyond matrices to tensors of arbitrary order. This extension may be viewed as a non-negative counterpart to, e.g., the
Andrzej Cichocki, Morten Mørup, et al.: "Advances in Nonnegative Matrix and Tensor Factorization", Hindawi Publishing Corporation,
is sparse with columns having local non-zero weight windows that are shared across shifts along the spatio-temporal dimensions of
"Sparse non-negative matrix factorizations via alternating non-negativity-constrained least squares for microarray data analysis"
; Unto Tapper; Olli Järvinen (1995). "Source identification of bulk wet deposition in Finland by positive matrix factorization".
for estimating individual admixture coefficients, detecting genetic clusters of individuals in a population sample or evaluating
Liu, W.X.; Zheng, N.N. & You, Q.B. (2006). "Nonnegative Matrix Factorization and its applications in pattern recognition".
"Analysis of the emission of very small dust particles from Spitzer spectro-imagery data using blind signal separation methods"
. Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '11. p. 1064.
{\textstyle {\frac {\mathbf {V} \mathbf {H} ^{\mathsf {T}}}{\mathbf {W} \mathbf {H} \mathbf {H} ^{\mathsf {T}}}}}
investigated the properties of the algorithm and published some simple and useful algorithms for two types of factorizations.
"Reconstruction of 4-D Dynamic SPECT Images From Inconsistent Projections Using a Spline Initialized FADS Algorithm (SIFADS)"
Zhu, Guangtun B. (2016-12-19). "Nonnegative Matrix Factorization (NMF) with Heteroscedastic Uncertainties and Missing data".
Ding, C.; He, X. & Simon, H.D. (2005). "On the equivalence of nonnegative matrix factorization and spectral clustering".
Jialu Liu; Chi Wang; Jing Gao & Jiawei Han (2013). "Multi-View Clustering via Joint Nonnegative Matrix Factorization".
Yun Mao; Lawrence Saul & Jonathan M. Smith (2006). "IDES: An Internet Distance Estimation Service for Large Networks".
Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases
and repeatedly using the resulting representation as input to convolutional NMF, deep feature hierarchies can be learned.
C. Boutsidis & E. Gallopoulos (2008). "SVD based initialization: A head start for nonnegative matrix factorization".
"Algorithms for nonnegative matrix and tensor factorizations: A unified view based on block coordinate descent framework"
"A framework for regularized non-negative matrix factorization, with application to the analysis of gene expression data"
Exact solutions for the variants of NMF can be expected (in polynomial time) when additional constraints hold for matrix
Jingu Kim & Haesun Park (2011). "Fast Nonnegative Matrix Factorization: An Active-set-like Method and Comparisons".
matrices; specifically, it includes the mathematical derivation, simulated data imputation, and application to on-sky data.
Pinoli; Ceddia; Ceri; Masseroli (2021). "Predicting drug synergism by means of non-negative matrix tri-factorization".
Lafrenière, David; Marois, Christian; Doyon, René; Barman, Travis (2009). "HST/NICMOS Detection of HR 8799 b in 1998".
"Positive matrix factorization: A non-negative factor model with optimal utilization of error estimates of data values"
DiPaola; Bazin; Aubry; Aurengo; Cavailloles; Herry; Kahn (1982). "Handling of dynamic sequences in nuclear medicine".
Arora, Sanjeev; Ge, Rong; Halpern, Yoni; Mimno, David; Moitra, Ankur; Sontag, David; Wu, Yichen; Zhu, Michael (2013).
Online: how to update the factorization when new data comes in without recomputing from scratch, e.g., see online CNSC
"Nonnegative Matrix Factorization Based on Alternating Nonnegativity Constrained Least Squares and Active Set Method"
Lin, Chih-Jen (2007). "On the Convergence of Multiplicative Update Algorithms for Nonnegative Matrix Factorization".
NMF, also referred to in this field as factor analysis, has been used since the 1980s to analyze sequences of images in
Ganesh R. Naik (Ed.): "Non-negative Matrix Factorization Techniques: Advances in Theory and Applications", Springer,
Ngoc-Diep Ho; Paul Van Dooren & Vincent Blondel (2008). "Descent Methods for Nonnegative Matrix Factorization".
In addition to the optimization step, initialization has a significant effect on NMF. The initial values chosen for
Alexandrov, Ludmil B.; Nik-Zainal, Serena; Wedge, David C.; Campbell, Peter J.; Stratton, Michael R. (2013-01-31).
authors proved that the impact from missing data during component construction is a first-to-second order effect.
Ceddia; Pinoli; Ceri; Masseroli (2020). "Matrix factorization-based technique for drug repurposing predictions".
3198:{\textstyle {\frac {\mathbf {W} ^{\mathsf {T}}\mathbf {V} }{\mathbf {W} ^{\mathsf {T}}\mathbf {W} \mathbf {H} }}}
"Detection and Characterization of Exoplanets using Projections on Karhunen Loeve Eigenimages: Forward Modeling"
There are different types of non-negative matrix factorizations. The different types arise from using different
matrix. The features are derived from the contents of the documents, and the feature-document matrix describes
NMF is also used to analyze spectral data; one such use is in the classification of space objects and debris.
estimation. That method is commonly used for analyzing and clustering textual data and is also related to the
is not explicitly imposed, the orthogonality holds to a large extent, and the clustering property holds too.
6434:; Deville, Y.; Smith, J. D.; Rapacioli, M.; Bernard, J. P.; Thomas, J.; Reach, W.; Abergel, A. (2007-07-01).
"On the equivalence between non-negative matrix factorization and probabilistic latent semantic indexing"
7513:"Clustering Initiated Factor Analysis (CIFA) Application for Tissue Classification in Dynamic Brain PET"
6732:
Berry, Michael W.; Browne, Murray (2005). "Email Surveillance Using Non-negative Matrix Factorization".
is enabled. Their method is then adopted by Ren et al. (2018) to the direct imaging field as one of the
3387:. However, as in many other data mining applications, a local minimum may still prove to be useful.
From the treatment of matrix multiplication above it follows that each column in the product matrix
Current research (since 2010) in nonnegative matrix factorization includes, but is not limited to,
A particular variant of NMF, namely Non-Negative Matrix Tri-Factorization (NMTF), has been used for
method, the optimal gradient method, and the block principal pivoting method among several others.
NMF has an inherent clustering property, i.e., it automatically clusters the columns of input data
Proceedings of the 12th ACM SIGKDD international conference on Knowledge discovery and data mining
. High-Performance Scientific Computing: Algorithms and Applications. Springer. pp. 311–326.
2497:) is added to NMF with the mean squared error cost function, the resulting problem may be called
"Nonnegative Matrix Factorization: An Analytical and Interpretive Tool in Computational Biology"
. Proc. European Conference on Machine Learning (ECML-02). LNAI. Vol. 2430. pp. 23–34.
"Discovering hierarchical speech features using convolutional non-negative matrix factorization"
"Non-Negative Matrix Factorization for Learning Alignment-Specific Models of Protein Evolution"
Lee and Seung proposed NMF mainly for parts-based decomposition of images. Their work compares NMF to
The factorization problem in the squared error version of NMF may be stated as: Given a matrix
Raul Kompass (2007). "A Generalized Divergence Measure for Nonnegative Matrix Factorization".
"Distributed Nonnegative Matrix Factorization for Web-Scale Dyadic Data Analysis on MapReduce"
Ding; Li; Peng; Park (2006). "Orthogonal nonnegative matrix t-factorizations for clustering".
NMF is applied in scalable Internet distance (round-trip time) prediction. For a network with
One specific application used hierarchical NMF on a small subset of scientific abstracts from
2472:{\displaystyle F(\mathbf {W} ,\mathbf {H} )=\left\|\mathbf {V} -\mathbf {WH} \right\|_{F}^{2}}
7917:"A receptor model using a specific non-negative transformation technique for ambient aerosol"
7134:
6334:
6118:
5291:
Fast coordinate descent methods with variable selection for non-negative matrix factorization
is symmetric and contains a diagonal principal submatrix of rank r. Their algorithm runs in
More recently other algorithms have been developed. Some approaches are based on alternating
. Advances in Neural Information Processing Systems 13: Proceedings of the 2000 Conference.
model with one layer of observed random variables and one layer of hidden random variables.
Vamsi K. Potluru; Sergey M. Plis; Morten Morup; Vince D. Calhoun & Terran Lane (2009).
dynamic medical imaging. Non-uniqueness of NMF was addressed using sparsity constraints.
Note that the updates are done on an element-by-element basis, not by matrix multiplication.
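As a concrete sketch, the element-wise multiplicative updates can be written in a few lines of NumPy. This is a minimal illustration, not a tuned implementation; the small `eps` added to the denominators is an implementation detail introduced here to avoid division by zero:

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-9, seed=0):
    """Approximate V (m x n, non-negative) as W @ H with W (m x r)
    and H (r x n), using multiplicative update rules."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T V) / (W^T W H), element by element
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W <- W * (V H^T) / (W H H^T), element by element
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage on a small random non-negative matrix
V = np.random.default_rng(1).random((20, 12))
W, H = nmf_multiplicative(V, r=4)
residual = np.linalg.norm(V - W @ H)  # shrinks as n_iter grows
```

Because the updates only ever multiply non-negative quantities, W and H stay non-negative throughout.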
) and an extension of the Kullback–Leibler divergence to positive matrices (the original
-th cluster. This centroid's representation can be significantly enhanced by convex NMF.
. There are many algorithms for denoising if the noise is stationary. For example, the
, k-means clustering, and more advanced strategies based on these and other paradigms.
has been a popular method due to the simplicity of implementation. This algorithm is:
and, if the factorization worked, it is a reasonable approximation to the input matrix
. Proc. ACM SIGKDD Int'l Conf. on Knowledge discovery and data mining. pp. 69–77.
by significantly less data, then one has to infer some latent structure in the data.
2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541)
4408:(Report). Max Planck Institute for Biological Cybernetics. Technical Report No. 193.
is called a nonnegative rank factorization (NRF). The problem of finding the NRF of
. Proceedings of the 2009 SIAM Conference on Data Mining (SDM). pp. 1218–1229.
More control over the non-uniqueness of NMF is obtained with sparsity constraints.
Two simple divergence functions studied by Lee and Seung are the squared error (or
". International Conference on Computer Vision (ICCV) Beijing, China, Oct., 2005.
Online Discussion Participation Prediction Using Non-negative Matrix Factorization
Thomas, L.B. (1974). "Problem 73-14, Rank factorization of nonnegative matrices".
Algorithmic: searching for global minima of the factors and factor initialization.
, then the above minimization is mathematically equivalent to the minimization of
"Mining the posterior cingulate: segregation between memory and pain components"
"On the Equivalence of Nonnegative Matrix Factorization and Spectral Clustering"
property is used to separate the stellar light and the light scattered from the
is a matrix with 10000 rows and 500 columns, the same shape as the input matrix
Yong Xiang: "Blind Source Separation: Dependent Component Analysis", Springer,
Vavasis, S.A. (2009). "On the complexity of nonnegative matrix factorization".
"Phoenix: A Weight-based Network Coordinate System Using Matrix Factorization"
(1999). "Learning the parts of objects by non-negative matrix factorization".
subtypes, population stratification, tissue composition, and tumor clonality.
Large-scale matrix factorization with distributed stochastic gradient descent
"Non-negative Matrix Factorization: Robust Extraction of Extended Structures"
It is useful to think of each feature (column vector) in the features matrix
(1997). "Least squares formulation of robust non-negative factor analysis".
Chao Liu; Hung-chih Yang; Jinliang Fan; Li-Wei He & Yi-Min Wang (2010).
A polynomial time algorithm for solving nonnegative rank factorization if
Matrix multiplication can be implemented as computing the column vectors of
Jen-Tzung Chien: "Source Separation and Machine Learning", Academic Press,
The contribution of the sequential NMF components can be compared with the
they become easier to store and manipulate. Another reason for factorizing
"Deciphering signatures of mutational processes operative in human cancer"
Exponential Family Harmoniums with an Application to Information Retrieval
Proceedings of the International Joint Conference on Neural Networks, 2003
The elements of the residual matrix can either be negative or positive.
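This mixed sign of the residual can be checked directly; a small NumPy sketch with random, purely illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((8, 5))   # data matrix
W = rng.random((8, 2))   # non-negative factor
H = rng.random((2, 5))   # non-negative factor
U = V - W @ H            # residual of the approximation V ~= WH
# W and H are non-negative, but the residual U is not sign-constrained:
has_mixed_signs = bool((U < 0).any() and (U > 0).any())
```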
Illustration of approximate non-negative matrix factorization: the matrix
TopicMF: Simultaneously Exploiting Ratings and Reviews for Recommendation
Max Welling & Markus Weber (2001). "Positive Tensor Factorization".
Proceedings of the 30th International Conference on Machine Learning.
Sparse nonnegative matrix approximation: new formulations and algorithms
"Generalized Nonnegative Matrix Approximations with Bregman Divergences"
NMF with the least-squares objective is equivalent to a relaxed form of
is a linear combination of the 10 column vectors in the features matrix
Dong Wang; Ravichander Vipperla; Nick Evans; Thomas Fang Zheng (2013).
Assume we ask the algorithm to find 10 features in order to generate a
Berman, A.; R.J. Plemmons (1974). "Inverses of nonnegative matrices".
"Using Data Imputation for Signal Separation in High Contrast Imaging"
is that if one's goal is to approximately represent the elements of
"Online Non-Negative Convolutive Pattern Learning for Speech Signals"
Fung, Yik-Hing; Li, Chun-Hung; Cheung, William K. (2 November 2007).
Ren, Bin; Pueyo, Laurent; Zhu, Guangtun B.; Duchêne, Gaspard (2018).
may be the same or different, as some NMF variants regularize one of
Proceedings of the 2013 SIAM International Conference on Data Mining
Fast Nonnegative Tensor Factorization with an Active-set-like Method
"Learning the parts of objects by non-negative matrix factorization"
can be anything in that space. Convex NMF restricts the columns of
"Scalable Nonnegative Matrix Factorization with Block-wise Updates"
"Fast and efficient estimation of individual ancestry coefficients"
3859:{\displaystyle \mathbf {\tilde {H}} =\mathbf {B} ^{-1}\mathbf {H} }
3642:, it is in fact equivalent to another instance of multinomial PCA,
8062:
6973:
Frichot E, Mathieu F, Trouillon T, Bouchard G, Francois O (2014).
can be used to transform the two factorization matrices by, e.g.,
Learning the parts of objects by non-negative matrix factorization
IEEE/ACM Transactions on Computational Biology and Bioinformatics
A practical algorithm for topic modeling with provable guarantees
"Projected Gradient Methods for Nonnegative Matrix Factorization"
based on the outbound scientific citations in the English Wikipedia.
In this simple case it will just correspond to a scaling and a
"Bayesian Inference for Nonnegative Matrix Factorisation Models"
$\mathbf{v}_i = \mathbf{W}\mathbf{h}_i\,,$
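The column-wise identity (each column of the product is W applied to the corresponding column of H) can be verified numerically; a minimal NumPy sketch with illustrative shapes:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((6, 3))
H = rng.random((3, 4))
V = W @ H
# Each column v_i of the product equals W times the i-th column of H.
for i in range(H.shape[1]):
    assert np.allclose(V[:, i], W @ H[:, i])
```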
Proceedings of the 23rd International World Wide Web Conference
Proceedings of the 19th International World Wide Web Conference
"Enter the Matrix: Factorization Uncovers Knowledge from Omics"
Nielsen, Finn Årup; Balslev, Daniela; Hansen, Lars Kai (2005).
Vol. 4. Portland, Oregon, USA: IEEE. pp. 2758–2763.
"PYNPOINT: an image processing package for finding exoplanets"
Document clustering based on non-negative matrix factorization
This greatly improves the quality of data representation of
Xiangnan He; Min-Yen Kan; Peichu Xie & Xiao Chen (2014).
$\mathbf{W}\mathbf{H} = \mathbf{W}\mathbf{B}\mathbf{B}^{-1}\mathbf{H}$
find nonnegative matrices W and H that minimize the function
Jiangtao Yin; Lixin Gao & Zhongfei (Mark) Zhang (2014).
Efficient Multiplicative updates for Support Vector Machines
Kenan Yilmaz; A. Taylan Cemgil & Umut Simsekli (2011).
Proc. SIAM Int'l Conf. Data Mining, pp. 606–610, May 2005.
problem, although it may also still be referred to as NMF.
$\mathbf{V} \simeq \mathbf{W}\mathbf{H}$
Nicolas Gillis: "Nonnegative Matrix Factorization", SIAM,
Machine Learning for Signal Processing, IEEE Workshop on
IEEE Transactions on Neural Networks and Learning Systems
Shoji Makino (Ed.): "Audio Source Separation", Springer,
Eggert, J.; Korner, E. (2004). "Sparse coding and NMF".
A Unifying Approach to Hard and Probabilistic Clustering
Wi-Iatw '07. IEEE Computer Society. pp. 284–287.
end-to-end links can be predicted after conducting only
they form another parametrization of the factorization.
If we furthermore impose an orthogonality constraint on
Here is an example based on a text-mining application:
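The dimensions used in this text-mining example (a 10000×500 term-document matrix factored with 10 features) can be sketched in NumPy with random stand-in data, since the actual corpus is not given here:

```python
import numpy as np

rng = np.random.default_rng(0)
n_terms, n_docs, k = 10000, 500, 10
V = rng.random((n_terms, n_docs))  # term-document input matrix
W = rng.random((n_terms, k))       # features matrix (one column per feature)
H = rng.random((k, n_docs))        # coefficients matrix (one column per document)
# Each document column of V is approximated by a non-negative
# combination of the 10 feature columns of W.
assert (W @ H).shape == V.shape
```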
$\mathbf{V} = \mathbf{W}\mathbf{H}\,.$
List of datasets in computer vision and image processing
(2008). "Nonnegative Matrix and Tensor Factorization".
NMF as a probabilistic graphical model: visible units (
is found analogously. The procedures used to solve for
"Comment-based Multi-View Clustering of Web 2.0 Items"
hosts, with the help of NMF, the distances of all the
$\mathbf{H}_{kj} > \mathbf{H}_{ij}$
with coefficients supplied by the coefficients matrix
Wind noise reduction using non-negative sparse coding
"Computing symmetric nonnegative rank factorizations"
Advances in Neural Information Processing Systems 18
$\mathbf{V} = \mathbf{W}\mathbf{H}$
Yang Chen; Xiao Wang; Cong Shi; et al. (2011).
Speech denoising has been a long lasting problem in
Let the input matrix (the matrix to be factored) be
which, when multiplied, approximately reconstruct
Schmidt, M.N., J. Larsen, and F.T. Hsiao. (2007). "
IEEE Transactions on Network and Service Management
International Journal of Data Science and Analytics
Tandon, Rashish; Sra, Suvrit (September 13, 2010).
found by a non-negative least squares solver, then
Computational and Mathematical Organization Theory
The factorization is not unique: a matrix and its
Boutchko; Mitra; Baker; Jagust; Gullberg (2015).
IEEE Journal of Biomedical and Health Informatics
Monthly Notices of the Royal Astronomical Society
Journal of Computational and Graphical Statistics
Nonnegative matrices in the Mathematical Sciences
$\mathbf{V} = (v_1, \dots, v_n)$
IEEE Journal on Selected Areas in Communications
Variational Extensions to EM and Multinomial PCA
Jingu Kim; Yunlong He & Haesun Park (2013).
SIAM Journal on Matrix Analysis and Applications
Algorithms for Non-negative Matrix Factorization
Another research group clustered parts of the
as linear combinations of the column vectors in
Chemometrics and Intelligent Laboratory Systems
Clustering of scientific citations in Wikipedia
Daniel D. Lee & H. Sebastian Seung (2001).
The sequential construction of NMF components (
Another type of NMF for images is based on the
Relation between PLSA and NMF and Implications
then amounts to the two non-negative matrices
(PLSA), a popular document clustering method.
$\mathbf{H}\mathbf{H}^{T} = I$
$\mathbf{H}\mathbf{H}^{T} = I$
List of datasets for machine-learning research
Journal of Cerebral Blood Flow and Metabolism
Lee, Daniel D.; Seung, H. Sebastian (1999).
Approximate non-negative matrix factorization
Computational Statistics & Data Analysis
Computational Statistics & Data Analysis
In astronomy, NMF is a promising method for
Specific approaches include the projected
We note that the multiplicative factors for
Different cost functions and regularizations
that minimize the error function (using the
Computational Intelligence and Neuroscience
Abdalah; Boutchko; Mitra; Gullberg (2015).
"Computing nonnegative rank factorizations"
3323:: in each step of such an algorithm, first
2280:. Furthermore, the resulting matrix factor
962:is represented by the two smaller matrices
6841:
6545:
Wei Xu; Xin Liu & Yihong Gong (2003).
(1971). "Self modeling curve resolution".
Dhillon, Inderjit S.; Sra, Suvrit (2005).
$\tilde{\mathbf{W}} = \mathbf{W}\mathbf{B}$
from a probability distribution with mean
using coefficients supplied by columns of
Eric Gaussier & Cyril Goutte (2005).
Leo Taslaman & Björn Nilsson (2012).
Pentti Paatero; Unto Tapper (June 1994).
-th column gives the cluster centroid of
NMF finds applications in such fields as
Generalized Coupled Tensor Factorization
Kalofolias, V.; Gallopoulos, E. (2012).
if it exists, is known to be NP-hard.
Convex non-negative matrix factorization
$\left\| V - WH \right\|_F,$
More specifically, the approximation of
-th column vector of the product matrix
researchers in the 1990s under the name
especially for the direct imaging of
gives the cluster centroids, i.e., the
gives the cluster membership, i.e., if
IEEE Transactions on Signal Processing
Hyunsoo Kim & Haesun Park (2007).
Amara, Adam; Quanz, Sascha P. (2012).
IEEE Transactions on Signal Processing
probabilistic latent semantic analysis
) was first used to relate NMF with
probabilistic latent semantic analysis
When the error function to be used is
Hsieh, C. J.; Dhillon, I. S. (2011).
NMF has been successfully applied in
Scalable Internet distance prediction
$\tilde{\mathbf{H}}$
$\tilde{\mathbf{W}}$
for measuring the divergence between
with 10000 rows and 10 columns and a
Jingu Kim & Haesun Park (2012).
SIAM Journal on Scientific Computing
IEEE Transactions on Neural Networks
$\sum_a W_{ia} h_a$
There are several ways in which the
becomes more sparse and orthogonal.
$(v_1, \dots, v_n)$
this suggests that the input data
can be significantly less than both
Vol. 4. pp. 2529–2533.
Association for Computing Machinery
Campbell, S.L.; G.D. Poole (1981).
C. Ding, X. He, H.D. Simon (2005).
in NMF are selected so the product
Algorithms for matrix decomposition
3694:NMF is an instance of nonnegative
2014:When the orthogonality constraint
14:
8369:
8324:Non-negative matrix factorization
6606:circumstellar disks with MLOCI".
6495:The Astrophysical Journal Letters
6209:Max Welling; et al. (2004).
5844:The Astrophysical Journal Letters
5741:. Vol. 4. pp. 606â610.
5080:A. Berman; R.J. Plemmons (1994).
4656:Ben Murrell; et al. (2011).
4221:
4038:applications. In this process, a
3985:of the NMF modeling process; the
3562:) are connected to hidden units (
3418:
2618:by computing the following, with
2078:Usually the number of columns of
1090:non-negative matrix factorization
1088:. It became more widely known as
997:non-negative matrix approximation
985:Non-negative matrix factorization
8306:
6682:10.1016/j.neuroimage.2005.04.034
6583:10.1111/j.1365-2966.2012.21918.x
4274:
3912:
3883:
3852:
3838:
3824:
3800:
3797:
3794:
3785:
3757:
3743:
3740:
3737:
3728:
3725:
3302:
3297:
3289:
3250:
3244:
3239:
3224:
3218:
3188:
3183:
3170:
3162:
3149:
3043:
3022:
3010:
2955:
2946:
2912:
2874:
2816:
2804:
2782:
2746:
2725:
2687:
2649:
2557:. By spatio-temporal pooling of
2449:
2446:
2438:
2421:
2413:
2382:
2096:will become an approximation to
2028:
2022:
1873:
1855:
1793:
1787:
1765:
1747:{\displaystyle W\geq 0,H\geq 0.}
1607:
1602:
1594:
1572:
1509:
1290:-th column vector of the matrix
1223:
1217:
1203:
1146:
1141:
1133:
8081:IEEE Signal Processing Magazine
7883:
7827:
7791:
7717:
7693:
7671:
7649:
7604:
7555:
7504:
7459:
7414:
7371:
7328:
7287:
7229:
7172:
7121:
7017:
6966:
6949:
6928:10.1109/tnsm.2011.110911.100079
6793:
6762:
6725:
6704:
6652:
6598:
6539:
6486:
6382:
6361:
6346:
6326:
6306:
6259:
6219:
6202:
6185:
6151:
6131:
6057:
5763:
5730:
5594:
5499:
5389:
5346:
5319:
5281:
5160:
4993:
4844:
4797:
4752:
4181:Non-stationary speech denoising
3972:methods of detecting exoplanets
3951:
3677:NMF can be seen as a two-layer
3667:contains cluster centroids and
2193:In standard NMF, matrix factor
1110:be the product of the matrices
J. Shen; G. W. Israël (1989).
5701:Journal of Global Optimization
5055:Linear and Multilinear Algebra
4708:
4649:
4639:Yang Bao; et al. (2014).
4632:
4620:Rainer Gemulla; Erik Nijkamp;
4394:
4163:
4157:
4029:
3458:
3446:
3086:
3074:
3070:
3060:
3038:
3005:
2998:
2986:
2982:
2972:
2950:
2942:
2929:
2917:
2907:
2891:
2879:
2843:
2831:
2827:
2793:
2777:
2774:
2767:
2755:
2751:
2736:
2720:
2717:
2704:
2692:
2682:
2666:
2654:
2583:may be found: Lee and Seung's
2454:
2433:
2425:
2409:
2290:Nonnegative rank factorization
2261:
2229:
1693:
1676:
1548:
1516:
7971:10.1016/S0169-7439(96)00044-5
7106:10.1093/bioinformatics/btm134
6711:Cohen, William (2005-04-04).
6300:10.1016/S0167-8655(01)00070-8
5000:Ding C, Li Y, Peng W (2008).
4775:10.1016/S0021-8502(05)80089-8
4339:Multilinear subspace learning
3705:
3471:-th component with the first
2638:as an index of the iteration.
2566:
2518:fashion. One such use is for
2508:
2501:due to the similarity to the
2304:is equal to its actual rank,
2102:. The full decomposition of
1412:with 10 rows and 500 columns.
1099:
1086:positive matrix factorization
7944:10.1016/0004-6981(89)90190-X
7643:10.1016/j.patcog.2007.09.010
7197:10.1016/j.celrep.2012.12.008
7053:10.1371/journal.pcbi.1000029
6608:Astronomy & Astrophysics
6525:10.1088/0004-637X/694/2/L148
6440:Astronomy & Astrophysics
5978:10.1016/0024-3795(81)90272-x
5477:10.1162/neco.2007.19.10.2756
5256:10.1371/journal.pone.0046331
5183:10.1016/j.neucom.2008.01.022
4882:10.1016/1352-2310(94)00367-T
4683:10.1371/journal.pone.0028898
3956:
3547:principal component analysis
3533:Relation to other techniques
3501:
3437:Principal Component Analysis
2389:{\displaystyle \mathbf {V} }
1772:{\displaystyle \mathbf {H} }
1579:{\displaystyle \mathbf {V} }
1191:can be computed as follows:
1024:into (usually) two matrices
8133:10.1162/neco.2008.04-08-771
6993:10.1534/genetics.113.160572
6769:Nielsen, Finn Ă
rup (2008).
6638:10.1051/0004-6361/201525837
6269:Pattern Recognition Letters
5937:10.3847/0004-637X/824/2/117
5874:10.1088/2041-8205/755/2/L28
5739:Proc. SIAM Data Mining Conf
4327:
Kullback–Leibler divergence
Kullback–Leibler divergence
Kullback–Leibler divergence
1185:. That is, each column of
8374:
8156:Ali Taylan Cemgil (2009).
7994:10.1162/neco.2007.19.3.780
7859:10.1137/1.9781611972832.28
7031:PLOS Computational Biology
6835:10.1016/j.csda.2006.11.006
6471:10.1051/0004-6361:20066282
6399:10.1109/IJCNN.2004.1381036
5794:10.1007/s41060-022-00370-9
5747:10.1137/1.9781611972757.70
5410:10.1109/IJCNN.2003.1224004
5367:10.1109/TNNLS.2012.2197827
5201:Non-negative sparse coding
5021:10.1016/j.csda.2008.01.011
4762:Journal of Aerosol Science
4624:; Yannis Sismanis (2011).
4360:Sources and external links
3321:non-negative least squares
2606:Then update the values in
2585:multiplicative update rule
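The multiplicative update rule for the Frobenius objective can be sketched as follows (a minimal NumPy implementation of the Lee and Seung updates; the function name `nmf_mu` is our own):

```python
import numpy as np

def nmf_mu(V, k, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - WH||_F."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates keep W and H non-negative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.random.default_rng(1).random((20, 12))
W, H = nmf_mu(V, 5)
err = np.linalg.norm(V - W @ H)
assert (W >= 0).all() and (H >= 0).all()
```

Because each update multiplies by a non-negative factor, non-negativity of the initial W and H is preserved throughout; the small `eps` guards against division by zero.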
2499:non-negative sparse coding
2222:of the input data vectors
2084:and the number of rows of
2061:, NMF is identical to the
1951:-th cluster. The computed
1826:Furthermore, the computed
1074:
8041:10.1007/s11434-005-1109-6
7907:
7392:10.1109/TCBB.2021.3091814
7349:10.1109/JBHI.2020.2991763
7255:10.1016/j.tig.2018.07.003
7148:10.1007/s00401-012-1077-2
6748:10.1007/s10588-005-5380-5
6013:10.1016/j.laa.2011.03.016
5906:The Astrophysical Journal
5715:10.1007/s10898-013-0035-4
5199:Hoyer, Patrik O. (2002).
5067:10.1080/03081087408817055
4564:The Astrophysical Journal
4496:The Astrophysical Journal
4283:This section needs to be
4193:is suitable for additive
8102:10.1109/MSP.2008.4408452
8020:Chinese Science Bulletin
7763:10.1109/tsp.2012.2222381
7582:10.1109/TMI.2014.2352033
7445:10.1109/tns.1982.4332188
6874:10.1109/JSAC.2006.884026
5623:10.1109/TSP.2012.2190406
4595:10.3847/1538-4357/ab7024
4527:10.3847/1538-4357/aaa1f2
4426:The Astronomical Journal
4364:
3774:If the two new matrices
3491:components constructed.
7922:Atmospheric Environment
7304:10.1145/1150402.1150420
6630:2015A&A...581A..24W
6462:2007A&A...469..575B
5900:Pueyo, Laurent (2016).
5530:10.1109/TNN.2007.895831
– via dl.acm.org.
5303:10.1145/2020408.2020577
4861:Atmospheric Environment
4187:audio signal processing
2520:collaborative filtering
1621:is achieved by finding
1061:audio signal processing
1053:missing data imputation
7569:IEEE Trans Med Imaging
7469:IEEE Trans Med Imaging
7024:Devarajan, K. (2008).
5444:Lin, Chih-Jen (2007).
4822:10.1002/ENV.3170050203
4209:Sparse NMF is used in
4170:
4141:
4114:
4088:Spectral data analysis
4057:of related documents.
3925:
3896:
3873:The non-negativity of
3860:
3808:
3765:
3700:support vector machine
3635:
3628:
Karhunen–Loève theorem
3485:
3465:
3415:
3310:
3271:
3199:
3098:
2855:
2632:
2524:recommendation systems
2473:
2390:
2268:
2167:into smaller matrices
2120:as well as a residual
2048:
2005:
1985:
1965:
1945:
1925:
1890:
1840:
1813:
1773:
1748:
1710:
1655:
1635:
1615:
1580:
1555:
1391:represents a document.
1241:
1158:
8074:; Rafal Zdunek &
7531:10.1038/jcbfm.2015.69
7135:Acta Neuropathologica
7128:Schwalbe, E. (2013).
6713:"Enron Email Dataset"
6138:Wray Buntine (2002).
5084:. Philadelphia: SIAM.
4171:
4142:
4140:{\displaystyle N^{2}}
4115:
3926:
3897:
3861:
3809:
3766:
3696:quadratic programming
3629:
3555:
3486:
3466:
3464:{\displaystyle (n+1)}
3412:
3311:
3272:
3200:
3099:
2856:
2633:
2474:
2391:
2269:
2049:
2006:
1986:
1966:
1946:
1926:
1924:{\displaystyle v_{j}}
1891:
1841:
1814:
1774:
1749:
1711:
1656:
1636:
1616:
1581:
1556:
1242:
1159:
1005:multivariate analysis
pp. 252–260.
pp. 126–135.
(10–12): 1824–1831.
4898:Daniel D. Lee &
4349:Tensor decomposition
4169:{\displaystyle O(N)}
4151:
4124:
4104:
4034:NMF can be used for
3931:applies at least if
3906:
3877:
3818:
3778:
3721:
3661:: the matrix factor
3588:
3475:
3443:
3285:
3209:
3141:
2869:
2644:
2622:
2484:total variation norm
2403:
2378:
2226:
2018:
1995:
1975:
1955:
1935:
1908:
1850:
1830:
1783:
1761:
1720:
1671:
1645:
1625:
1590:
1568:
1505:
1198:
1129:
8177:10.1155/2009/785152
8094:2008ISPM...25R.142C
8033:2006ChSBu..51....7L
7935:1989AtmEn..23.2289S
7745:2013ITSP...61...44W
7625:2008PatRe..41.1350B
7613:Pattern Recognition
7437:1982ITNS...29.1310D
7424:IEEE Trans Nucl Sci
7044:2008PLSCB...4E0029D
6574:2012MNRAS.427..948A
6517:2009ApJ...694L.148L
6378:. pp. 267â273.
6282:2001PaReL..22.1255W
6087:1999Natur.401..788L
6051:2012arXiv1212.4777A
6001:Linear Algebra Appl
5965:Linear Algebra Appl
5928:2016ApJ...824..117P
5866:2012ApJ...755L..28S
5660:2011SJSC...33.3261K
5615:2012ITSP...60.2882G
5396:Behnke, S. (2003).
5247:2012PLoSO...746331T
4971:. pp. 556â562.
4918:1999Natur.401..788L
4874:1995AtmEn..29.1705A
4720:Edward A. Sylvestre
4674:2011PLoSO...628898M
4586:2020ApJ...892...74R
4518:2018ApJ...852..104R
4448:2007AJ....133..734B
4390:. pp. 283â290.
4334:Multilinear algebra
4211:Population genetics
4205:Population genetics
4074:scientific journals
3995:circumstellar disks
3976:circumstellar disks
3963:dimension reduction
3543:vector quantization
2938:
2906:
2713:
2681:
2555:convolution kernels
2468:
2220:convex combinations
1497:Clustering property
1405:coefficients matrix
1065:recommender systems
1049:document clustering
8120:Neural Computation
7981:Neural Computation
7243:Trends in Genetics
5551:Hyunsoo Kim &
5454:Neural Computation
4900:H. Sebastian Seung
4166:
4137:
4110:
3968:parallel computing
3937:is a non-negative
3921:
3892:
3856:
3804:
3761:
3679:directed graphical
3659:K-means clustering
3652:latent class model
3648:maximum likelihood
3636:
3624:
3600:
3568:) through weights
3481:
3461:
3416:
3306:
3267:
3265:
3195:
3094:
2910:
2872:
2851:
2685:
2647:
2628:
2534:If the columns of
2469:
2431:
2386:
2264:
2044:
2001:
1981:
1961:
1941:
1921:
1886:
1836:
1821:K-means clustering
1809:
1769:
1744:
1706:
1651:
1631:
1611:
1576:
1551:
1237:
1154:
982:
8297:978-1-611976-40-3
7929:(10): 2289â2298.
7868:978-1-61197-262-7
7705:mahout.apache.org
7482:10.1109/42.996340
7343:(11): 3162â3172.
7099:(12): 1495â1502.
6858:(12): 2273â2284.
6408:978-0-7803-8359-3
6276:(12): 1255â1261.
6081:(6755): 788â791.
5756:978-0-89871-593-4
5678:10.1137/110821172
5588:10.1137/07069239x
5461:(10): 2756â2779.
5419:978-0-7803-7898-8
5146:10.1137/070709967
4912:(6755): 788â791.
4868:(14): 1705â1718.
4716:William H. Lawton
4304:
4303:
4215:genetic admixture
4113:{\displaystyle N}
4070:English Knowledge
3918:
3889:
3830:
3791:
3591:
3484:{\displaystyle n}
3263:
3193:
3092:
2849:
2631:{\displaystyle n}
2530:Convolutional NMF
2491:L1 regularization
2155:are smaller than
2004:{\displaystyle k}
1984:{\displaystyle k}
1964:{\displaystyle W}
1944:{\displaystyle k}
1839:{\displaystyle H}
1654:{\displaystyle H}
1634:{\displaystyle W}
8072:Andrzej Cichocki
8067:
8065:
8052:
8013:
7974:
7948:
7946:
7902:
7901:
7899:
7887:
7881:
7880:
7852:
7842:
7831:
7825:
7824:
7822:
7821:
7815:
7809:. Archived from
7804:
7795:
7789:
7788:
7786:
7785:
7779:
7773:. Archived from
7756:
7730:
7721:
7715:
7714:
7712:
7711:
7697:
7691:
7690:
7684:
7675:
7669:
7668:
7662:
7653:
7647:
7646:
7636:
7619:(4): 1350â1362.
7608:
7602:
7601:
7559:
7553:
7552:
7542:
7508:
7502:
7501:
7463:
7457:
7456:
7418:
7412:
7411:
7386:(4): 1956â1967.
7375:
7369:
7368:
7332:
7326:
7325:
7291:
7285:
7284:
7274:
7233:
7227:
7226:
7216:
7176:
7170:
7169:
7159:
7125:
7119:
7118:
7108:
7082:
7076:
7075:
7065:
7055:
7021:
7015:
7014:
7004:
6970:
6964:
6953:
6947:
6946:
6944:
6938:. Archived from
6921:
6901:
6892:
6886:
6885:
6867:
6845:
6839:
6838:
6818:
6812:
6811:
6809:
6797:
6791:
6790:
6788:
6766:
6760:
6759:
6729:
6723:
6722:
6720:
6719:
6708:
6702:
6701:
6665:
6656:
6650:
6649:
6623:
6602:
6596:
6595:
6585:
6567:
6543:
6537:
6536:
6510:
6490:
6484:
6483:
6473:
6455:
6453:astro-ph/0703072
6427:
6421:
6420:
6386:
6380:
6379:
6365:
6359:
6358:
6350:
6344:
6343:
6341:
6330:
6324:
6323:
6321:
6310:
6304:
6303:
6293:
6263:
6257:
6256:
6223:
6217:
6216:
6206:
6200:
6189:
6183:
6182:
6180:
6179:
6173:
6166:
6155:
6149:
6148:
6146:
6135:
6129:
6128:
6122:
6114:
6070:
6061:
6055:
6054:
6044:
6026:
6017:
6016:
5998:
5989:
5983:
5982:
5980:
5956:
5950:
5949:
5939:
5921:
5897:
5886:
5885:
5859:
5839:
5830:
5829:
5827:
5815:
5806:
5805:
5787:
5767:
5761:
5760:
5734:
5728:
5727:
5717:
5697:
5688:
5682:
5681:
5671:
5654:(6): 3261â3281.
5641:
5635:
5634:
5609:(6): 2882â2898.
5598:
5592:
5591:
5581:
5561:
5548:
5542:
5541:
5523:
5514:(6): 1589â1596.
5503:
5497:
5496:
5470:
5450:
5441:
5432:
5431:
5393:
5387:
5386:
5361:(7): 1087â1099.
5350:
5344:
5343:
5323:
5317:
5316:
5296:
5285:
5279:
5278:
5268:
5258:
5224:
5215:
5214:
5212:
5196:
5187:
5186:
5164:
5158:
5157:
5139:
5130:(3): 1364â1377.
5119:
5113:
5112:
5092:
5086:
5085:
5077:
5071:
5070:
5050:
5044:
5041:
5032:
5031:
5029:
5023:. Archived from
5015:(8): 3913â3927.
5006:
4997:
4991:
4984:
4973:
4972:
4966:
4955:
4946:
4945:
4895:
4886:
4885:
4848:
4842:
4841:
4801:
4795:
4794:
4756:
4750:
4749:
4712:
4706:
4705:
4695:
4685:
4653:
4647:
4646:
4636:
4630:
4629:
4617:
4608:
4607:
4597:
4579:
4555:
4540:
4539:
4529:
4511:
4487:
4468:
4467:
4441:
4439:astro-ph/0606170
4421:
4410:
4409:
4407:
4398:
4392:
4391:
4379:
4299:
4296:
4290:
4278:
4277:
4270:
4266:Current research
4244:drug repurposing
4175:
4173:
4172:
4167:
4146:
4144:
4143:
4138:
4136:
4135:
4119:
4117:
4116:
4111:
4051:feature-document
3936:
3930:
3928:
3927:
3922:
3920:
3919:
3911:
3901:
3899:
3898:
3893:
3891:
3890:
3882:
3865:
3863:
3862:
3857:
3855:
3850:
3849:
3841:
3832:
3831:
3823:
3813:
3811:
3810:
3805:
3803:
3793:
3792:
3784:
3770:
3768:
3767:
3762:
3760:
3755:
3754:
3746:
3731:
3698:, just like the
3672:
3666:
3633:
3631:
3630:
3625:
3623:
3622:
3613:
3612:
3599:
3579:
3573:
3567:
3561:
3527:
3523:
3517:
3511:
3490:
3488:
3487:
3482:
3470:
3468:
3467:
3462:
3434:
3428:
3401:
3395:
3373:gradient descent
3370:
3364:
3358:
3352:
3346:
3340:
3334:
3328:
3315:
3313:
5975::
5948:.
5934::
5926::
5916::
5884:.
5872::
5864::
5854::
5828:.
5822::
5804:.
5792::
5782::
5759:.
5745::
5726:.
5712::
5680:.
5676::
5658::
5633:.
5621::
5613::
5590:.
5586::
5540:.
5528::
5495:.
5475::
5430:.
5408::
5385:.
5365::
5315:.
5301::
5277:.
5253::
5245::
5239:7
5213:.
5207::
5185:.
5181::
5156:.
5144::
5134::
5111:.
5107::
5069:.
5065::
5059:2
5019::
4944:.
4924::
4916::
4884:.
4880::
4872::
4840:.
4820::
4814:5
4793:.
4773::
4748:.
4736::
4704:.
4680::
4672::
4666:6
4606:.
4592::
4584::
4574::
4538:.
4524::
4516::
4506::
4466:.
4454::
4446::
4436::
4297:)
4293:(
4287:.
4164:)
4161:N
4158:(
4155:O
4133:2
4129:N
4108:N
Non-negative matrix factorization (NMF) factorizes a non-negative matrix V into two non-negative matrices W and H:

    V ≈ WH

Matrix multiplication can be implemented as computing the column vectors of V as linear combinations of the column vectors in W, using coefficients supplied by the columns of H; that is, each column of V can be computed as v_i = W h_i, where v_i is the i-th column vector of V and h_i is the i-th column vector of H.

The dimensions of the factor matrices may be significantly lower than those of the product matrix: if V is an m × n matrix, then W is m × p and H is p × n, with p usually much smaller than both m and n.

In the full decomposition V = WH + U, the elements of the residual matrix U can be either negative or positive.

The factorization is not unique: a matrix B and its inverse can be used to transform the two factor matrices, e.g. WH = WBB^{-1}H. If the two new matrices W' = WB and H' = B^{-1}H are non-negative, they form another parametrization of the factorization.

W and H are commonly found by minimizing the squared Frobenius-norm error

    F(W, H) = ||V - WH||_F^2, subject to W ≥ 0, H ≥ 0.

The multiplicative update rules of Lee and Seung alternate between updating H and W, element by element:

    H^{(n+1)}[i, j] = H^{(n)}[i, j] × ((W^{(n)})^T V)[i, j] / ((W^{(n)})^T W^{(n)} H^{(n)})[i, j]

    W^{(n+1)}[i, j] = W^{(n)}[i, j] × (V (H^{(n+1)})^T)[i, j] / (W^{(n)} H^{(n+1)} (H^{(n+1)})^T)[i, j]

NMF also has an inherent clustering property: when the orthogonality constraint HH^T = I is imposed, minimizing the Frobenius-norm error is equivalent to a relaxed form of k-means clustering. The computed H gives the cluster membership: if H[k, j] > H[i, j] for all i ≠ k, the column v_j belongs to cluster k, and the columns of W give the cluster centroids.
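The Lee–Seung multiplicative updates can be sketched in a few lines of NumPy. This is an illustrative sketch, not an implementation from the article: the function name `nmf_multiplicative`, the toy matrices `W0`/`H0`, the small `eps` guard against division by zero, and the iteration count are all assumptions made for the example.

```python
import numpy as np

def nmf_multiplicative(V, p, n_iter=1000, eps=1e-9, seed=0):
    """Sketch of Lee-Seung multiplicative updates minimizing ||V - WH||_F^2,
    with V (m x n), W (m x p), H (p x n) all kept non-negative."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # random strictly positive initialization keeps the updates well-defined
    W = rng.random((m, p))
    H = rng.random((p, n))
    for _ in range(n_iter):
        # H update: H <- H * (W^T V) / (W^T W H)
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W update: W <- W * (V H^T) / (W H H^T), using the fresh H
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# toy usage: V is built from a known non-negative rank-2 factorization,
# so the residual should shrink toward zero
W0 = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
H0 = np.array([[1.0, 2.0, 0.0, 1.0], [0.0, 1.0, 3.0, 1.0]])
V = W0 @ H0
W, H = nmf_multiplicative(V, p=2)
err = np.linalg.norm(V - W @ H)
```

Because each update multiplies the current entries by a non-negative ratio, W and H stay non-negative throughout, which is the defining feature of this scheme.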
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.