Random forests or random decision forests are an ensemble learning method for classification, regression and other tasks that operates by constructing a multitude of decision trees at training time. For classification tasks, the output of the random forest is the class selected by most trees. For regression tasks, the mean or average prediction of the individual trees is returned. Random decision forests correct for decision trees' habit of overfitting to their training set.

The first algorithm for random decision forests was created in 1995 by Tin Kam Ho using the random subspace method, which, in Ho's formulation, is a way to implement the "stochastic discrimination" approach to classification proposed by Eugene Kleinberg.

An extension of the algorithm was developed by Leo Breiman and Adele Cutler, who registered "Random Forests" as a trademark in 2006 (as of 2019, owned by Minitab, Inc.). The extension combines Breiman's "bagging" idea and random selection of features, introduced first by Ho and later independently by Amit and Geman, in order to construct a collection of decision trees with controlled variance.
History

The general method of random decision forests was first proposed by Salzberg and Heath in 1993, with a method that used a randomized decision tree algorithm to generate multiple different trees and then combine them using majority voting. This idea was developed further by Ho in 1995. Ho established that forests of trees splitting with oblique hyperplanes can gain accuracy as they grow without suffering from overtraining, as long as the forests are randomly restricted to be sensitive to only selected feature dimensions. A subsequent work along the same lines concluded that other splitting methods behave similarly, as long as they are randomly forced to be insensitive to some feature dimensions. Note that this observation of a more complex classifier (a larger forest) getting more accurate nearly monotonically is in sharp contrast to the common belief that the complexity of a classifier can only grow to a certain level of accuracy before being hurt by overfitting. The explanation of the forest method's resistance to overtraining can be found in Kleinberg's theory of stochastic discrimination.

The early development of Breiman's notion of random forests was influenced by the work of Amit and Geman, who introduced the idea of searching over a random subset of the available decisions when splitting a node, in the context of growing a single tree. The idea of random subspace selection from Ho was also influential in the design of random forests. In this method a forest of trees is grown, and variation among the trees is introduced by projecting the training data into a randomly chosen subspace before fitting each tree or each node. Finally, the idea of randomized node optimization, where the decision at each node is selected by a randomized procedure rather than a deterministic optimization, was first introduced by Thomas G. Dietterich.

The proper introduction of random forests was made in a paper by Leo Breiman. This paper describes a method of building a forest of uncorrelated trees using a CART-like procedure, combined with randomized node optimization and bagging. In addition, this paper combines several ingredients, some previously known and some novel, which form the basis of the modern practice of random forests, in particular:

1. Using out-of-bag error as an estimate of the generalization error.
2. Measuring variable importance through permutation.

The report also offers the first theoretical result for random forests in the form of a bound on the generalization error, which depends on the strength of the trees in the forest and their correlation.
Algorithm

Preliminaries: decision tree learning

Decision trees are a popular method for various machine learning tasks. Tree learning "comes closest to meeting the requirements for serving as an off-the-shelf procedure for data mining", say Hastie et al., "because it is invariant under scaling and various other transformations of feature values, is robust to inclusion of irrelevant features, and produces inspectable models. However, they are seldom accurate".

In particular, trees that are grown very deep tend to learn highly irregular patterns: they overfit their training sets, i.e. have low bias, but very high variance. Random forests are a way of averaging multiple deep decision trees, trained on different parts of the same training set, with the goal of reducing the variance. This comes at the expense of a small increase in the bias and some loss of interpretability, but generally greatly boosts the performance in the final model.
Bagging

The training algorithm for random forests applies the general technique of bootstrap aggregating, or bagging, to tree learners. Given a training set X = x1, ..., xn with responses Y = y1, ..., yn, bagging repeatedly (B times) selects a random sample with replacement of the training set and fits trees to these samples:

For b = 1, ..., B:
1. Sample, with replacement, n training examples from X, Y; call these Xb, Yb.
2. Train a classification or regression tree fb on Xb, Yb.

[Figure] Illustration of training a Random Forest model. The training dataset (in this case, of 250 rows and 100 columns) is randomly sampled with replacement multiple times. Then, a decision tree is trained on each sample. Finally, for prediction, the results of all trees are aggregated to produce a final decision.

After training, predictions for unseen samples x' can be made by averaging the predictions from all the individual regression trees on x':

{\displaystyle {\hat {f}}={\frac {1}{B}}\sum _{b=1}^{B}f_{b}(x')}

or by taking the plurality vote in the case of classification trees.

This bootstrapping procedure leads to better model performance because it decreases the variance of the model, without increasing the bias. This means that while the predictions of a single tree are highly sensitive to noise in its training set, the average of many trees is not, as long as the trees are not correlated. Simply training many trees on a single training set would give strongly correlated trees (or even the same tree many times, if the training algorithm is deterministic); bootstrap sampling is a way of de-correlating the trees by showing them different training sets.

Additionally, an estimate of the uncertainty of the prediction can be made as the standard deviation of the predictions from all the individual regression trees on x':

{\displaystyle \sigma ={\sqrt {\frac {\sum _{b=1}^{B}(f_{b}(x')-{\hat {f}})^{2}}{B-1}}}.}

The number of samples/trees, B, is a free parameter. Typically, a few hundred to several thousand trees are used, depending on the size and nature of the training set. An optimal number of trees B can be found using cross-validation, or by observing the out-of-bag error: the mean prediction error on each training sample xi, using only the trees that did not have xi in their bootstrap sample. The training and test error tend to level off after some number of trees have been fit.
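A minimal sketch of the bagging procedure and of the averaged prediction and its standard deviation described above, using scikit-learn decision trees. The synthetic dataset and the value of B are illustrative assumptions, not prescribed values.

```python
# Bagging regression trees by hand; assumes NumPy and scikit-learn are installed.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 5))              # toy training set (assumption)
y = X[:, 0] + np.sin(3 * X[:, 1]) + rng.normal(0, 0.1, size=200)

B = 100                                           # number of bootstrap samples/trees
trees = []
for b in range(B):
    idx = rng.integers(0, len(X), size=len(X))    # sample n examples with replacement
    trees.append(DecisionTreeRegressor().fit(X[idx], y[idx]))

x_new = rng.uniform(0, 1, size=(1, 5))            # an unseen sample x'
preds = np.array([t.predict(x_new)[0] for t in trees])
f_hat = preds.mean()          # averaged prediction, (1/B) * sum of f_b(x')
sigma = preds.std(ddof=1)     # uncertainty estimate from the sigma formula above
print(f_hat, sigma)
```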
From bagging to random forests

The above procedure describes the original bagging algorithm for trees. Random forests also include another type of bagging scheme: they use a modified tree learning algorithm that selects, at each candidate split in the learning process, a random subset of the features. This process is sometimes called "feature bagging". The reason for doing this is the correlation of the trees in an ordinary bootstrap sample: if one or a few features are very strong predictors for the response variable (target output), these features will be selected in many of the B trees, causing them to become correlated. An analysis of how bagging and random subspace projection contribute to accuracy gains under different conditions is given by Ho.

Typically, for a classification problem with p features, √p (rounded down) features are used in each split. For regression problems the inventors recommend p/3 (rounded down) with a minimum node size of 5 as the default. In practice, the best values for these parameters should be tuned on a case-to-case basis for every problem.
ExtraTrees

Adding one further step of randomization yields extremely randomized trees, or ExtraTrees. While similar to ordinary random forests in that they are an ensemble of individual trees, there are two main differences: first, each tree is trained using the whole learning sample (rather than a bootstrap sample), and second, the top-down splitting in the tree learner is randomized. Instead of computing the locally optimal cut-point for each feature under consideration (based on, e.g., information gain or the Gini impurity), a random cut-point is selected. This value is selected from a uniform distribution within the feature's empirical range (in the tree's training set). Then, of all the randomly generated splits, the split that yields the highest score is chosen to split the node. Similar to ordinary random forests, the number of randomly selected features to be considered at each node can be specified. Default values for this parameter are √p for classification and p for regression, where p is the number of features in the model.
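A sketch contrasting ExtraTrees with an ordinary random forest, assuming scikit-learn (whose ExtraTreesClassifier uses the whole learning sample by default, i.e. bootstrap=False); the dataset is an illustrative assumption.

```python
# ExtraTrees vs. random forest; assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

extra = ExtraTreesClassifier(n_estimators=200, random_state=0)   # random cut-points
forest = RandomForestClassifier(n_estimators=200, random_state=0)

print(cross_val_score(extra, X, y, cv=5).mean())
print(cross_val_score(forest, X, y, cv=5).mean())
```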
Random forests for high-dimensional data

The basic random forest procedure may not work well in situations where there are a large number of features but only a small proportion of these features are informative with respect to sample classification. This can be addressed by encouraging the procedure to focus mainly on features and trees that are informative. Some methods for accomplishing this are:

Prefiltering: Eliminate features that are mostly just noise.
Enriched Random Forest (ERF): Use weighted random sampling instead of simple random sampling at each node of each tree, giving greater weight to features that appear to be more informative.
Tree Weighted Random Forest (TWRF): Weight trees so that trees exhibiting better accuracy are assigned higher weights.
Properties

Variable importance

Random forests can be used to rank the importance of variables in a regression or classification problem in a natural way. The following technique was described in Breiman's original paper and is implemented in the R package randomForest.
Permutation importance

The first step in measuring the variable importance in a data set {\displaystyle {\mathcal {D}}_{n}=\{(X_{i},Y_{i})\}_{i=1}^{n}} is to fit a random forest to the data. During the fitting process the out-of-bag error for each data point is recorded and averaged over the forest (errors on an independent test set can be substituted if bagging is not used during training).

To measure the importance of the j-th feature after training, the values of the j-th feature are permuted in the out-of-bag samples and the out-of-bag error is again computed on this perturbed data set. The importance score for the j-th feature is computed by averaging the difference in out-of-bag error before and after the permutation over all trees. The score is normalized by the standard deviation of these differences. Features which produce large values for this score are ranked as more important than features which produce small values. The statistical definition of the variable importance measure was given and analyzed by Zhu et al.

This method of determining variable importance has some drawbacks. For data including categorical variables with different numbers of levels, random forests are biased in favor of those attributes with more levels. Methods such as partial permutations and growing unbiased trees can be used to solve the problem. If the data contain groups of correlated features of similar relevance for the output, then smaller groups are favored over larger groups. Additionally, the permutation procedure may fail to identify important features when there are collinear features; in this case permuting groups of correlated features together is a remedy.
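A sketch of permutation importance with scikit-learn. Note that scikit-learn's permutation_importance permutes features on a supplied evaluation set rather than per-tree out-of-bag samples, so this is an approximation of the procedure described above; the dataset is an assumption.

```python
# Permutation importance on held-out data; assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=10,
                           n_informative=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# Larger mean score drop under permutation = more important feature.
for j in result.importances_mean.argsort()[::-1]:
    print(j, result.importances_mean[j], result.importances_std[j])
```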
Mean Decrease in Impurity Feature Importance

This feature importance for random forests is the default implementation in scikit-learn and R. It is described in the book "Classification and Regression Trees" by Leo Breiman. Variables which decrease the impurity during splits a lot are considered important:

{\displaystyle {\text{unnormalized average importance}}(x)={\frac {1}{n_{T}}}\sum _{i=1}^{n_{T}}\sum _{{\text{node }}j\in T_{i}|{\text{split variable}}(j)=x}p_{T_{i}}(j)\Delta i_{T_{i}}(j),}

where x indicates a feature, n_T is the number of trees in the forest, T_i indicates tree i, {\displaystyle p_{T_{i}}(j)={\frac {n_{j}}{n}}} is the fraction of samples reaching node j, and {\displaystyle \Delta i_{T_{i}}(j)} is the change in impurity in tree T_i at node j. As impurity measure for samples falling in a node, e.g. the Gini coefficient or the mean squared error can be used. The normalized importance is then obtained by normalizing over all features, so that the sum of normalized feature importances is 1.

The scikit-learn default implementation of Mean Decrease in Impurity Feature Importance is susceptible to misleading feature importances: the importance measure prefers high cardinality features, and it uses training statistics and therefore does not "reflect the ability of feature to be useful to make predictions that generalize to the test set".
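A short sketch of the impurity-based importances with scikit-learn, where the fitted attribute feature_importances_ already holds the normalized scores discussed above; the dataset is an assumption.

```python
# Mean-decrease-in-impurity importances; assumes scikit-learn is installed.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=8,
                           n_informative=3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

print(model.feature_importances_)        # normalized per-feature importances
print(model.feature_importances_.sum())  # sums to 1 across features
```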
Relationship to nearest neighbors

A relationship between random forests and the k-nearest neighbor algorithm (k-NN) was pointed out by Lin and Jeon in 2002. It turns out that both can be viewed as so-called weighted neighborhoods schemes. These are models built from a training set {\displaystyle \{(x_{i},y_{i})\}_{i=1}^{n}} that make predictions {\displaystyle {\hat {y}}} for new points x' by looking at the "neighborhood" of the point, formalized by a weight function W:

{\displaystyle {\hat {y}}=\sum _{i=1}^{n}W(x_{i},x')\,y_{i}.}

Here, {\displaystyle W(x_{i},x')} is the non-negative weight of the i'th training point relative to the new point x' in the same tree. For any particular x', the weights for points x_i must sum to one. Weight functions are given as follows:

In k-NN, the weights are {\displaystyle W(x_{i},x')={\frac {1}{k}}} if x_i is one of the k points closest to x', and zero otherwise.
In a tree, {\displaystyle W(x_{i},x')={\frac {1}{k'}}} if x_i is one of the k' points in the same leaf as x', and zero otherwise.

Since a forest averages the predictions of a set of m trees with individual weight functions W_j, its predictions are

{\displaystyle {\hat {y}}={\frac {1}{m}}\sum _{j=1}^{m}\sum _{i=1}^{n}W_{j}(x_{i},x')\,y_{i}=\sum _{i=1}^{n}\left({\frac {1}{m}}\sum _{j=1}^{m}W_{j}(x_{i},x')\right)\,y_{i}.}

This shows that the whole forest is again a weighted neighborhood scheme, with weights that average those of the individual trees. The neighbors of x' in this interpretation are the points x_i sharing the same leaf in any tree j. In this way, the neighborhood of x' depends in a complex way on the structure of the trees, and thus on the structure of the training set. Lin and Jeon show that the shape of the neighborhood used by a random forest adapts to the local importance of each feature.
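A sketch of the weighted-neighborhood view using a fitted scikit-learn forest: each tree gives weight 1/k' to training points sharing the leaf of x', and the forest averages those weights. Since scikit-learn trains each tree on a bootstrap sample, the reconstruction only approximately matches forest.predict; dataset and sizes are assumptions.

```python
# Forest predictions as averaged leaf-co-membership weights; assumes scikit-learn.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=5, random_state=0)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

x_new = X[:1]                          # treat one point as the query x'
train_leaves = forest.apply(X)         # (n_samples, n_trees) leaf indices
query_leaves = forest.apply(x_new)     # (1, n_trees)

weights = np.zeros(len(X))
for j in range(train_leaves.shape[1]):
    in_leaf = train_leaves[:, j] == query_leaves[0, j]
    weights[in_leaf] += 1.0 / in_leaf.sum()   # W_j(x_i, x') = 1/k' inside the leaf
weights /= train_leaves.shape[1]              # average over the m trees

print(weights @ y, forest.predict(x_new)[0])  # close, up to bootstrap details
```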
Unsupervised learning with random forests

As part of their construction, random forest predictors naturally lead to a dissimilarity measure among the observations. One can also define a random forest dissimilarity measure between unlabeled data: the idea is to construct a random forest predictor that distinguishes the "observed" data from suitably generated synthetic data. The observed data are the original unlabeled data and the synthetic data are drawn from a reference distribution. A random forest dissimilarity can be attractive because it handles mixed variable types very well, is invariant to monotonic transformations of the input variables, and is robust to outlying observations. The random forest dissimilarity easily deals with a large number of semi-continuous variables due to its intrinsic variable selection; for example, the "Addcl 1" random forest dissimilarity weighs the contribution of each variable according to how dependent it is on other variables. The random forest dissimilarity has been used in a variety of applications, e.g. to find clusters of patients based on tissue marker data.
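A minimal sketch of this idea with scikit-learn: synthetic data are drawn from a reference distribution (here, the product of the empirical marginals, obtained by permuting each feature independently), a forest separates observed from synthetic rows, and a proximity is read off as the fraction of trees placing two observed rows in the same leaf. All data and sizes are assumptions.

```python
# Unsupervised random-forest dissimilarity; assumes NumPy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                 # stand-in "observed" unlabeled data

# Reference distribution: permute each column independently (marginal product).
X_synth = np.column_stack([rng.permutation(col) for col in X.T])
X_all = np.vstack([X, X_synth])
y_all = np.r_[np.ones(len(X)), np.zeros(len(X_synth))]   # observed vs. synthetic

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_all, y_all)

leaves = forest.apply(X)                      # leaf index per tree, observed rows
proximity = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
dissimilarity = 1.0 - proximity               # usable by a clustering algorithm
print(dissimilarity.shape)
```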
Variants

Instead of decision trees, linear models have been proposed and evaluated as base estimators in random forests, in particular multinomial logistic regression and naive Bayes classifiers. In cases where the relationship between the predictors and the target variable is linear, the base learners may have an equally high accuracy as the ensemble learner.
Kernel random forest

In machine learning, kernel random forests (KeRF) establish the connection between random forests and kernel methods. By slightly modifying their definition, random forests can be rewritten as kernel methods, which are more interpretable and easier to analyze.

Leo Breiman was the first person to notice the link between random forest and kernel methods. He pointed out that random forests which are grown using i.i.d. random vectors in the tree construction are equivalent to a kernel acting on the true margin. Lin and Jeon established the connection between random forests and adaptive nearest neighbor, implying that random forests can be seen as adaptive kernel estimates. Davies and Ghahramani proposed Random Forest Kernel and show that it can empirically outperform state-of-art kernel methods. Scornet first defined KeRF estimates and gave the explicit link between KeRF estimates and random forest. He also gave explicit expressions for kernels based on centered random forest and uniform random forest, two simplified models of random forest. He named these two KeRFs Centered KeRF and Uniform KeRF, and proved upper bounds on their rates of consistency.
Notations and definitions

Given a training sample {\displaystyle {\mathcal {D}}_{n}=\{(\mathbf {X} _{i},Y_{i})\}_{i=1}^{n}} of [0, 1]^p × R-valued independent random variables distributed as the independent prototype pair {\displaystyle (\mathbf {X} ,Y)}, where {\displaystyle \operatorname {E} [Y^{2}]<\infty }, we aim at predicting the response Y, associated with the random variable X, by estimating the regression function {\displaystyle m(\mathbf {x} )=\operatorname {E} [Y\mid \mathbf {X} =\mathbf {x} ]}. A random regression forest is an ensemble of M randomized regression trees. Denote {\displaystyle m_{n}(\mathbf {x} ,\mathbf {\Theta } _{j})} the predicted value at point x by the j-th tree, where {\displaystyle \mathbf {\Theta } _{1},\ldots ,\mathbf {\Theta } _{M}} are independent random variables, distributed as a generic random variable {\displaystyle \mathbf {\Theta } }, independent of the sample {\displaystyle {\mathcal {D}}_{n}}. This random variable can be used to describe the randomness induced by node splitting and the sampling procedure for tree construction. The trees are combined to form the finite forest estimate {\displaystyle m_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})={\frac {1}{M}}\sum _{j=1}^{M}m_{n}(\mathbf {x} ,\Theta _{j})}. For regression trees, we have {\displaystyle m_{n}=\sum _{i=1}^{n}{\frac {Y_{i}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})}}{N_{n}(\mathbf {x} ,\Theta _{j})}}}, where {\displaystyle A_{n}(\mathbf {x} ,\Theta _{j})} is the cell containing x, designed with randomness {\displaystyle \Theta _{j}} and dataset {\displaystyle {\mathcal {D}}_{n}}, and {\displaystyle N_{n}(\mathbf {x} ,\Theta _{j})=\sum _{i=1}^{n}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})}} is the number of training points falling in that cell. Thus random forest estimates satisfy, for all {\displaystyle \mathbf {x} \in [0,1]^{d}}:

{\displaystyle m_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})={\frac {1}{M}}\sum _{j=1}^{M}\left(\sum _{i=1}^{n}{\frac {Y_{i}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})}}{N_{n}(\mathbf {x} ,\Theta _{j})}}\right).}
Preliminaries: Centered forests

Centered forest is a simplified model for Breiman's original random forest, which uniformly selects an attribute among all attributes and performs splits at the center of the cell along the pre-chosen attribute. The algorithm stops when a fully binary tree of level k is built, where {\displaystyle k\in \mathbb {N} } is a parameter of the algorithm.

Uniform forest

Uniform forest is another simplified model for Breiman's original random forest, which uniformly selects a feature among all features and performs splits at a point uniformly drawn on the side of the cell, along the preselected feature.
From random forest to KeRF

Random regression forest has two levels of averaging: first over the samples in the target cell of a tree, then over all trees. Thus the contributions of observations that are in cells with a high density of data points are smaller than that of observations which belong to less populated cells. In order to improve the random forest methods and compensate for the misestimation, Scornet defined KeRF by

{\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})={\frac {1}{\sum _{j=1}^{M}N_{n}(\mathbf {x} ,\Theta _{j})}}\sum _{j=1}^{M}\sum _{i=1}^{n}Y_{i}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})},}

which is equal to the mean of the Y_i's falling in the cells containing x in the forest. If we define the connection function of the M finite forest as

{\displaystyle K_{M,n}(\mathbf {x} ,\mathbf {z} )={\frac {1}{M}}\sum _{j=1}^{M}\mathbf {1} _{\mathbf {z} \in A_{n}(\mathbf {x} ,\Theta _{j})},}

i.e. the proportion of cells shared between x and z, then almost surely we have

{\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})={\frac {\sum _{i=1}^{n}Y_{i}K_{M,n}(\mathbf {x} ,\mathbf {x} _{i})}{\sum _{\ell =1}^{n}K_{M,n}(\mathbf {x} ,\mathbf {x} _{\ell })}},}

which defines the KeRF.
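A sketch of the KeRF estimate for a forest fitted with scikit-learn: instead of averaging per-tree means, it pools the responses over all (tree, cell) pairs containing x, matching the definition above up to scikit-learn's bootstrap sampling. Data and sizes are assumptions.

```python
# KeRF estimate from a fitted forest; assumes NumPy and scikit-learn.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=5, random_state=0)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

x_new = X[:1]
train_leaves = forest.apply(X)        # cells A_n(X_i, Theta_j), one column per tree
query_leaves = forest.apply(x_new)    # cells containing x, one per tree

same_cell = train_leaves == query_leaves          # indicator X_i in A_n(x, Theta_j)
kerf = y @ same_cell.sum(axis=1) / same_cell.sum()  # ratio of pooled sums
print(kerf, forest.predict(x_new)[0])             # KeRF vs. random forest estimate
```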
Centered KeRF

The construction of Centered KeRF of level k is the same as for centered forest, except that predictions are made by {\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})}; the corresponding kernel function, or connection function, is

{\displaystyle K_{k}^{cc}(\mathbf {x} ,\mathbf {z} )=\sum _{k_{1},\ldots ,k_{d},\sum _{j=1}^{d}k_{j}=k}{\frac {k!}{k_{1}!\cdots k_{d}!}}\left({\frac {1}{d}}\right)^{k}\prod _{j=1}^{d}\mathbf {1} _{\lceil 2^{k_{j}}x_{j}\rceil =\lceil 2^{k_{j}}z_{j}\rceil },\qquad {\text{ for all }}\mathbf {x} ,\mathbf {z} \in [0,1]^{d}.}

Uniform KeRF

Uniform KeRF is built in the same way as uniform forest, except that predictions are made by {\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})}; the corresponding kernel function, or connection function, is

{\displaystyle K_{k}^{uf}(\mathbf {0} ,\mathbf {x} )=\sum _{k_{1},\ldots ,k_{d},\sum _{j=1}^{d}k_{j}=k}{\frac {k!}{k_{1}!\ldots k_{d}!}}\left({\frac {1}{d}}\right)^{k}\prod _{m=1}^{d}\left(1-|x_{m}|\sum _{j=0}^{k_{m}-1}{\frac {\left(-\ln |x_{m}|\right)^{j}}{j!}}\right){\text{ for all }}\mathbf {x} \in [0,1]^{d}.}
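A direct, brute-force evaluation of the centered KeRF connection function given above, using only the Python standard library; it enumerates all compositions of k into d parts, so it is only suitable for small d and k. The function name and test values are illustrative.

```python
# Exact evaluation of the centered KeRF kernel K_k^cc for small d and k.
import itertools
import math

def centered_kerf_kernel(x, z, k):
    d = len(x)
    total = 0.0
    # Enumerate all (k_1, ..., k_d) with k_1 + ... + k_d = k.
    for ks in itertools.product(range(k + 1), repeat=d):
        if sum(ks) != k:
            continue
        coeff = math.factorial(k)
        for kj in ks:
            coeff //= math.factorial(kj)        # multinomial coefficient
        match = all(math.ceil(2**kj * xj) == math.ceil(2**kj * zj)
                    for kj, xj, zj in zip(ks, x, z))
        total += coeff * (1.0 / d) ** k * match
    return total

print(centered_kerf_kernel([0.3, 0.7], [0.35, 0.6], k=3))
```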
Relation between KeRF and random forest

Predictions given by KeRF and random forests are close if the number of points in each cell is controlled. Assume that there exist sequences {\displaystyle (a_{n}),(b_{n})} such that, almost surely,

{\displaystyle a_{n}\leq N_{n}(\mathbf {x} ,\Theta )\leq b_{n}{\text{ and }}a_{n}\leq {\frac {1}{M}}\sum _{m=1}^{M}N_{n}(\mathbf {x} ,\Theta _{m})\leq b_{n}.}

Then almost surely,

{\displaystyle |m_{M,n}(\mathbf {x} )-{\tilde {m}}_{M,n}(\mathbf {x} )|\leq {\frac {b_{n}-a_{n}}{a_{n}}}{\tilde {m}}_{M,n}(\mathbf {x} ).}

Relation between infinite KeRF and infinite random forest

When the number of trees M goes to infinity, we have infinite random forest and infinite KeRF. Their estimates are close if the number of observations in each cell is bounded. Assume that there exist sequences {\displaystyle (\varepsilon _{n}),(a_{n}),(b_{n})} such that, almost surely,

{\displaystyle \operatorname {E} [N_{n}(\mathbf {x} ,\Theta )]\geq 1,}

{\displaystyle \operatorname {P} [a_{n}\leq N_{n}(\mathbf {x} ,\Theta )\leq b_{n}\mid {\mathcal {D}}_{n}]\geq 1-\varepsilon _{n}/2,}

{\displaystyle \operatorname {P} [a_{n}\leq {\frac {1}{M}}\sum _{m=1}^{M}N_{n}(\mathbf {x} ,\Theta _{m})\leq b_{n}\mid {\mathcal {D}}_{n}]\geq 1-\varepsilon _{n}/2.}

Then almost surely,

{\displaystyle |m_{\infty ,n}(\mathbf {x} )-{\tilde {m}}_{\infty ,n}(\mathbf {x} )|\leq {\frac {b_{n}-a_{n}}{a_{n}}}{\tilde {m}}_{\infty ,n}(\mathbf {x} )+n\varepsilon _{n}\left(\max _{1\leq i\leq n}Y_{i}\right).}
Consistency results

Assume that {\displaystyle Y=m(\mathbf {X} )+\varepsilon }, where {\displaystyle \varepsilon } is a centered Gaussian noise, independent of X, with finite variance {\displaystyle \sigma ^{2}<\infty }. Moreover, X is uniformly distributed on [0, 1]^d and m is Lipschitz. Scornet proved upper bounds on the rates of consistency for centered KeRF and uniform KeRF.

Consistency of centered KeRF

Providing {\displaystyle k\rightarrow \infty } and {\displaystyle n/2^{k}\rightarrow \infty }, there exists a constant {\displaystyle C_{1}>0} such that, for all n, {\displaystyle \mathbb {E} [{\tilde {m}}_{n}^{cc}(\mathbf {X} )-m(\mathbf {X} )]^{2}\leq C_{1}n^{-1/(3+d\log 2)}(\log n)^{2}}.

Consistency of uniform KeRF

Providing {\displaystyle k\rightarrow \infty } and {\displaystyle n/2^{k}\rightarrow \infty }, there exists a constant {\displaystyle C>0} such that {\displaystyle \mathbb {E} [{\tilde {m}}_{n}^{uf}(\mathbf {X} )-m(\mathbf {X} )]^{2}\leq Cn^{-2/(6+3d\log 2)}(\log n)^{2}}.
Disadvantages

While random forests often achieve higher accuracy than a single decision tree, they sacrifice the intrinsic interpretability present in decision trees. Decision trees are among a fairly small family of machine learning models that are easily interpretable along with linear models, rule-based models, and attention-based models. This interpretability is one of the most desirable qualities of decision trees. It allows developers to confirm that the model has learned realistic information from the data and allows end-users to have trust and confidence in the decisions made by the model. For example, following the path that a decision tree takes to make its decision is quite trivial, but following the paths of tens or hundreds of trees is much harder. To achieve both performance and interpretability, some model compression techniques allow transforming a random forest into a minimal "born-again" decision tree that faithfully reproduces the same decision function.

If it is established that the predictive attributes are linearly correlated with the target variable, using random forest may not enhance the accuracy of the base learner. Furthermore, in problems with multiple categorical variables, random forest may not be able to increase the accuracy of the base learner.
1184:, or bagging, to tree learners. Given a training set
9971:
9969:
9814:
9812:
9810:
9808:
2913:
913:Glossary of artificial intelligence
24:
10082:
10062:
9755:
9673:10.1061/(ASCE)IS.1943-555X.0000602
9639:
9416:
9382:Tolosi L, Lengauer T (July 2011).
9143:Zhu R, Zeng D, Kosorok MR (2015).
8995:
8482:
8440:
8402: – Machine learning technique
8390: – Machine learning algorithm
8384: – Method in machine learning
8150:
8109:
7855:
7814:
7695:
7489:
7406:
7367:
7293:
7265:
7233:
7229:
7206:
7144:
7119:
7076:
7040:
7010:
6929:Assume that there exist sequences
6687:
6599:
6506:Assume that there exist sequences
6080:
6061:
5638:
5619:
5378:
5359:
5247:
5065:which is equal to the mean of the
5038:
4927:
4863:
4844:
4770:
4732:
4603:
4584:
4475:
4395:
4359:{\displaystyle {\mathcal {D}}_{n}}
4345:
4316:
4264:
4210:
4172:
4052:
3984:
3965:
3923:{\displaystyle {\mathcal {D}}_{n}}
3909:
3892:{\displaystyle \mathbf {\Theta } }
3671:
3592:
3567:
3400:
3225:sharing the same leaf in any tree
2704:is the non-negative weight of the
2357:is the change in impurity in tree
2318:
2092:
1766:
25:
10272:
10208:
10129:Lecture Notes in Computer Science
9997:Sagi, Omer; Rokach, Lior (2020).
9661:Journal of Infrastructure Systems
9005:Pattern Analysis and Applications
3376:
3373:is a parameter of the algorithm.
3366:{\displaystyle k\in \mathbb {N} }
2432:Relationship to nearest neighbors
1850:To measure the importance of the
10089:
9704:Expert Systems with Applications
8355:
8252:
8235:
7984:
7967:
7716:
7661:
7610:
7504:
7421:
7382:
7258:
7112:
7033:
6877:
6794:
6755:
6679:
6592:
6448:
6139:
6131:
6053:
5968:
5960:
5874:
5697:
5689:
5611:
5550:
5521:
5512:
5456:
5447:
5351:
5301:
5279:
5239:
5218:
5212:
5169:
5161:
5100:
5030:
5003:
4996:
4919:
4836:
4762:
4724:
4697:
4690:
4576:
4510:
4467:
4440:
4433:
4387:
4294:
4256:
4202:
4164:
4137:
4130:
4074:. For regression trees, we have
4044:
3957:
3885:
3857:
3836:
3793:
3762:
3753:
3695:
3687:
3661:
3633:
3553:{\displaystyle (\mathbf {X} ,Y)}
3537:
3422:
See also

Decision tree learning
Ensemble learning
Gradient boosting
Non-parametric statistics
Randomized algorithm

References

Altmann A, Toloşi L, Sander O, Lengauer T (2010). "Permutation importance: a corrected feature importance measure". Bioinformatics. doi:10.1093/bioinformatics/btq134.
Amaratunga, D.; Cabrera, J.; Lee, Y.S. (2008). "Enriched Random Forest". Bioinformatics. 24: 2010–2014.
Amit, Yali; Geman, Donald (1997). "Shape quantization and recognition with randomized trees". Neural Computation. 9 (7): 1545–1588. doi:10.1162/neco.1997.9.7.1545.
Arlot S, Genuer R (2014). "Analysis of purely random forests bias". arXiv preprint.
"Beware Default Random Forest Importances" (web article).
Breiman, Leo (2000). "Some infinity theory for predictor ensembles". Technical Report 579, Statistics Dept. UCB.
Breiman, Leo (2001). "Random Forests". Machine Learning. 45 (1): 5–32. doi:10.1023/A:1010933404324.
Breiman L, Ghahramani Z (2004). "Consistency for a simple model of random forests". Statistical Department, University of California at Berkeley. Technical Report.
Breiman, Leo (2017). Classification and Regression Trees. New York: Routledge. doi:10.1201/9781315139470.
Davies, Alex; Ghahramani, Zoubin (2014). "The Random Forest Kernel and other kernels for big data from random partitions". arXiv:1402.4293.
Deng, H.; Runger, G.; Tuv, E. (2011). "Bias of importance measures for multi-valued attributes and solutions". Proceedings of the 21st International Conference on Artificial Neural Networks (ICANN). pp. 293–300.
Denisko D, Hoffman MM (2018). "Classification and interaction in random forests". Proceedings of the National Academy of Sciences of the United States of America. 115 (8): 1690–1692. doi:10.1073/pnas.1800256115.
Dessi, N.; Milia, G.; Pes, B. (2013). "Enhancing random forests performance in microarray data classification". Conference paper, pp. 99–103. doi:10.1007/978-3-642-38326-7_15.
Dietterich, Thomas (2000). "An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization". Machine Learning. 40: 139–157. doi:10.1023/A:1007607513941.
Geurts P, Ernst D, Wehenkel L (2006). "Extremely randomized trees". Machine Learning. 63: 3–42. doi:10.1007/s10994-006-6226-1.
Ghosh D, Cabrera J (2022). "Enriched random forest for high dimensional genomic data". IEEE/ACM Transactions on Computational Biology and Bioinformatics. 19 (5): 2817–2828. doi:10.1109/TCBB.2021.3089417.
Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2008). The Elements of Statistical Learning (2nd ed.). Springer.
Heath, D.; Kasif, S.; Salzberg, S. (1993). "k-DT: A multi-tree learning method". Proceedings of the Second Intl. Workshop on Multistrategy Learning.
Ho, Tin Kam (1995). "Random Decision Forests". Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, QC, 14–16 August 1995. pp. 278–282.
Ho, Tin Kam (1998). "The Random Subspace Method for Constructing Decision Forests". IEEE Transactions on Pattern Analysis and Machine Intelligence. 20 (8): 832–844. doi:10.1109/34.709601.
Ho, Tin Kam (2002). "A Data Complexity Analysis of Comparative Advantages of Decision Forest Constructors". Pattern Analysis and Applications. 5 (2): 102–112. doi:10.1007/s100440200009.
James, Gareth; Witten, Daniela; Hastie, Trevor; Tibshirani, Robert (2013). An Introduction to Statistical Learning. Springer. pp. 316–321.
Kleinberg E (1990). "Stochastic Discrimination". Annals of Mathematics and Artificial Intelligence. 1 (1–4): 207–239. doi:10.1007/BF01531079.
Kleinberg E (1996). "An Overtraining-Resistant Stochastic Modeling Method for Pattern Recognition". Annals of Statistics. 24 (6): 2319–2349. doi:10.1214/aos/1032181157.
Kleinberg E (2000). "On the Algorithmic Implementation of Stochastic Discrimination". IEEE Transactions on Pattern Analysis and Machine Intelligence. 22 (5). doi:10.1109/34.857004.
Li, H. B.; Wang, W.; Ding, H. W.; Dong, J. (2010). "Trees weighting random forest method for classifying high-dimensional noisy data". 2010 IEEE 7th International Conference on E-Business Engineering.
Liaw, Andy; Wiener, Matthew (2002). "Classification and Regression by randomForest". R News. 2/3: 18.
Lin, Yi; Jeon, Yongho (2002). Random forests and adaptive nearest neighbors (Technical report). Technical Report No. 1055. University of Wisconsin.
Lin, Yi; Jeon, Yongho (2006). "Random forests and adaptive nearest neighbors". Journal of the American Statistical Association. 101 (474): 578–590. doi:10.1198/016214505000001230.
Ortiz-Posadas, Martha Refugio (2020). Pattern Recognition Techniques Applied to Biomedical Problems. Springer Nature.
Painsky A, Rosset S (2017). "Cross-Validated Variable Selection in Tree-Based Methods Improves Predictive Performance". IEEE Transactions on Pattern Analysis and Machine Intelligence. 39 (11): 2142–2153. doi:10.1109/tpami.2016.2636831.
Piryonesi, S. Madeh (2019). The Application of Data Analytics to Asset Management: Deterioration and Climate Change Adaptation in Ontario Roads (Doctoral dissertation).
Piryonesi, S. Madeh; El-Diraby, Tamer E. (2020). "Role of Data Analytics in Infrastructure Asset Management: Overcoming Data Size and Quality Problems". Journal of Transportation Engineering, Part B: Pavements. 146 (2): 04020022. doi:10.1061/JPEODX.0000175.
Piryonesi, S. Madeh; El-Diraby, Tamer E. (2021). "Using Machine Learning to Examine Impact of Type of Performance Indicator on Flexible Pavement Deterioration Modeling". Journal of Infrastructure Systems. 27 (2): 04021005. doi:10.1061/(ASCE)IS.1943-555X.0000602.
Prinzie, Anita; Van den Poel, Dirk (2007). "Random Multiclass Classification: Generalizing Random Forests to Random MNL and Random NB". In Roland Wagner; Norman Revell; Günther Pernul (eds.). Database and Expert Systems Applications: DEXA 2007. Lecture Notes in Computer Science. Vol. 4653. pp. 349–358. doi:10.1007/978-3-540-74469-6_35.
Prinzie, A.; Van den Poel, D. (2008). "Random Forests for multiclass classification: Random MultiNomial Logit". Expert Systems with Applications. 34 (3): 1721–1732. doi:10.1016/j.eswa.2007.01.029.
Sagi, Omer; Rokach, Lior (2020). "Explainable decision forest: Transforming a decision forest into an interpretable tree". Information Fusion. doi:10.1016/j.inffus.2020.03.013.
Scornet, Erwan (2015). "Random forests and kernel methods".
Shi, T.; Horvath, S. (2006). "Unsupervised Learning with Random Forest Predictors". Journal of Computational and Graphical Statistics. doi:10.1198/106186006X94072.
Shi T, Seligson D, Belldegrun AS, Palotie A, Horvath S (2005). "Tumor classification by tissue microarray profiling: random forest clustering applied to renal cell carcinoma". Modern Pathology. doi:10.1038/modpathol.3800322.
Smith, Paul F.; Ganesh, Siva; Liu, Ping (2013). "A comparison of random forest regression and multiple linear regression for prediction in neuroscience". Journal of Neuroscience Methods. doi:10.1016/j.jneumeth.2013.08.024.
Strobl C, Boulesteix AL, Augustin T (2007). "Unbiased split selection for classification trees based on the Gini index". Computational Statistics & Data Analysis. doi:10.1016/j.csda.2006.12.030.
Tolosi L, Lengauer T (2011). "Classification with correlated features: unreliability of feature ranking and solutions". Bioinformatics. 27 (14): 1986–94. doi:10.1093/bioinformatics/btr300.
Vidal, Thibaut; Schiffer, Maximilian (2020). "Born-Again Tree Ensembles". International Conference on Machine Learning. PMLR: 9743–9753.
Winham, Stacey; Freimuth, Robert; Biernacka, Joanna (2013). "A weighted random forests approach to improve predictive performance". Statistical Analysis and Data Mining. 6. doi:10.1002/sam.11196.
Ye, Y.; Li, H.; Deng, X.; Huang, J. (2008). "Feature weighting random forest for detection of hidden web search interfaces". Journal of Computational Linguistics and Chinese Language Processing. 13: 387–404.
Zhu R, Zeng D, Kosorok MR (2015). "Reinforcement Learning Trees". Journal of the American Statistical Association. 110 (512): 1770–1784. doi:10.1080/01621459.2015.1036994.
"RANDOM FORESTS" trademark of Health Care Productivity, Inc., U.S. registration number 3185828, registered 2006/12/19. Justia Trademarks.

External links

Random Forests classifier description (Leo Breiman's site)
Documentation for R package randomForest (discussion of the use of the random forest package for R)
Permutation importance example in scikit-learn: https://scikit-learn.org/stable/auto_examples/inspection/plot_permutation_importance.html
10268:
10203:
10193:
10183:
10166:(8): 1690–1692.
10150:
10093:
10092:
10077:
10076:
10066:
10060:
10059:
10057:
10033:
10027:
10026:
9994:
9988:
9987:
9985:
9973:
9964:
9963:
9961:
9945:
9936:
9935:
9933:
9921:
9915:
9914:
9896:
9887:(474): 578–590.
9876:
9870:
9869:
9863:
9858:
9856:
9848:
9837:
9831:
9830:
9828:
9816:
9803:
9802:
9762:
9753:
9752:
9726:
9720:
9719:
9710:(3): 1721–1732.
9699:
9693:
9692:
9652:
9637:
9636:
9626:
9611:Modern Pathology
9602:
9596:
9595:
9569:
9549:
9543:
9542:
9540:
9524:
9515:
9509:
9503:
9502:
9482:
9476:
9475:
9447:
9441:
9440:
9438:
9437:
9423:
9414:
9413:
9403:
9379:
9373:
9372:
9346:
9326:
9320:
9319:
9309:
9291:
9282:
9276:
9275:
9247:
9241:
9240:
9230:
9206:
9200:
9199:
9189:
9183:
9182:
9172:
9140:
9134:
9130:
9124:
9121:
9115:
9112:
9106:
9103:
9097:
9094:
9088:
9085:
9079:
9078:
9076:
9061:Machine Learning
9058:
9049:
9043:
9042:
9040:
9039:
9033:
9027:. Archived from
9002:
8993:
8987:
8986:
8976:
8970:
8969:
8967:
8951:Machine Learning
8941:
8935:
8924:
8918:
8917:
8915:
8914:
8908:
8902:. Archived from
8885:
8876:(7): 1545–1588.
8865:
8852:
8843:
8842:
8835:
8829:
8826:
8820:
8819:
8817:
8815:
8810:
8801:
8792:
8791:
8789:
8765:Machine Learning
8760:"Random Forests"
8752:
8739:
8738:
8736:
8730:. Archived from
8713:
8695:
8686:
8677:
8676:
8666:
8657:(6): 2319–2349.
8640:
8631:
8630:
8628:
8622:. Archived from
8605:
8596:(1–4): 207–239.
8585:
8576:
8567:
8566:
8547:Friedman, Jerome
8535:
8516:
8515:
8489:
8480:
8467:
8466:
8464:
8462:
8457:on 17 April 2016
8456:
8449:
8438:
8411:
8362:interpretability
8351:
8349:
8348:
8343:
8341:
8340:
8319:
8318:
8290:
8268:
8267:
8255:
8238:
8229:
8221:
8216:
8215:
8207:
8200:
8188:
8186:
8185:
8180:
8162:
8160:
8159:
8154:
8146:
8145:
8136:
8121:
8119:
8118:
8113:
8087:
8085:
8084:
8079:
8077:
8076:
8055:
8054:
8029:
8013:
8012:
8000:
7999:
7987:
7970:
7961:
7953:
7948:
7947:
7939:
7932:
7920:
7918:
7917:
7912:
7900:
7898:
7897:
7892:
7884:
7883:
7867:
7865:
7864:
7859:
7851:
7850:
7841:
7826:
7824:
7823:
7818:
7788:
7786:
7785:
7780:
7768:
7766:
7765:
7760:
7758:
7757:
7729:
7727:
7726:
7721:
7719:
7707:
7705:
7704:
7699:
7691:
7690:
7674:
7672:
7671:
7666:
7664:
7652:
7650:
7649:
7644:
7632:
7630:
7629:
7624:
7613:
7581:
7579:
7578:
7573:
7568:
7564:
7563:
7562:
7552:
7526:
7525:
7507:
7499:
7498:
7487:
7486:
7478:
7474:
7472:
7471:
7462:
7461:
7460:
7448:
7447:
7437:
7432:
7424:
7416:
7415:
7404:
7403:
7395:
7385:
7377:
7376:
7361:
7346:
7344:
7343:
7338:
7330:
7325:
7324:
7303:
7302:
7297:
7296:
7286:
7285:
7261:
7253:
7252:
7237:
7236:
7224:
7223:
7197:
7195:
7194:
7189:
7181:
7176:
7175:
7154:
7153:
7148:
7147:
7137:
7136:
7115:
7107:
7106:
7094:
7093:
7067:
7065:
7064:
7059:
7036:
7028:
7027:
6999:
6997:
6996:
6991:
6986:
6985:
6967:
6966:
6948:
6947:
6924:
6922:
6921:
6916:
6896:
6894:
6893:
6888:
6880:
6872:
6871:
6860:
6859:
6851:
6847:
6845:
6844:
6835:
6834:
6833:
6821:
6820:
6810:
6805:
6797:
6789:
6788:
6777:
6776:
6768:
6758:
6750:
6749:
6734:
6722:
6720:
6719:
6714:
6709:
6708:
6696:
6695:
6694:
6682:
6676:
6675:
6665:
6660:
6645:
6637:
6632:
6631:
6622:
6619:
6617:
6616:
6595:
6587:
6586:
6574:
6573:
6557:
6555:
6554:
6549:
6544:
6543:
6525:
6524:
6489:
6487:
6486:
6481:
6476:
6475:
6451:
6446:
6443:
6441:
6437:
6436:
6434:
6426:
6425:
6420:
6416:
6415:
6410:
6409:
6400:
6380:
6377:
6370:
6369:
6359:
6344:
6339:
6338:
6329:
6312:
6307:
6292:
6291:
6286:
6282:
6274:
6267:
6265:
6261:
6260:
6245:
6244:
6234:
6226:
6223:
6216:
6215:
6205:
6200:
6182:
6181:
6163:
6162:
6142:
6134:
6125:
6117:
6101:
6099:
6098:
6093:
6088:
6087:
6069:
6068:
6056:
6048:
6047:
6036:
6035:
6027:
6009:
6007:
6006:
6001:
5996:
5995:
5971:
5963:
5958:
5955:
5949:
5948:
5944:
5943:
5934:
5933:
5932:
5931:
5908:
5907:
5898:
5897:
5896:
5895:
5877:
5870:
5865:
5850:
5849:
5844:
5840:
5832:
5825:
5823:
5819:
5818:
5803:
5802:
5792:
5784:
5781:
5774:
5773:
5763:
5758:
5740:
5739:
5721:
5720:
5700:
5692:
5683:
5675:
5659:
5657:
5656:
5651:
5646:
5645:
5627:
5626:
5614:
5606:
5605:
5594:
5593:
5585:
5574:
5572:
5571:
5566:
5546:
5544:
5543:
5538:
5536:
5534:
5530:
5529:
5524:
5515:
5507:
5506:
5490:
5485:
5469:
5465:
5464:
5459:
5450:
5442:
5441:
5426:
5425:
5415:
5410:
5394:
5386:
5385:
5367:
5366:
5354:
5346:
5345:
5334:
5333:
5325:
5314:
5312:
5311:
5306:
5304:
5292:
5290:
5289:
5284:
5282:
5270:
5268:
5267:
5262:
5260:
5259:
5255:
5254:
5242:
5234:
5233:
5221:
5215:
5208:
5203:
5188:
5180:
5172:
5164:
5156:
5155:
5133:
5131:
5130:
5125:
5113:
5111:
5110:
5105:
5103:
5091:
5089:
5088:
5083:
5081:
5080:
5064:
5062:
5061:
5056:
5051:
5050:
5046:
5045:
5033:
5025:
5024:
5012:
5011:
5006:
4999:
4993:
4992:
4982:
4977:
4961:
4956:
4941:
4939:
4935:
4934:
4922:
4914:
4913:
4903:
4898:
4879:
4871:
4870:
4852:
4851:
4839:
4831:
4830:
4819:
4818:
4810:
4799:
4797:
4796:
4791:
4789:
4785:
4784:
4782:
4778:
4777:
4765:
4757:
4756:
4746:
4745:
4744:
4740:
4739:
4727:
4719:
4718:
4706:
4705:
4700:
4693:
4687:
4686:
4676:
4673:
4668:
4647:
4642:
4627:
4619:
4611:
4610:
4592:
4591:
4579:
4571:
4570:
4548:
4546:
4545:
4540:
4538:
4537:
4513:
4498:
4496:
4495:
4490:
4488:
4487:
4483:
4482:
4470:
4462:
4461:
4449:
4448:
4443:
4436:
4429:
4424:
4403:
4402:
4390:
4382:
4381:
4365:
4363:
4362:
4357:
4355:
4354:
4349:
4348:
4334:
4332:
4331:
4326:
4324:
4323:
4307:
4305:
4304:
4299:
4297:
4285:
4283:
4282:
4277:
4272:
4271:
4259:
4251:
4250:
4234:
4232:
4231:
4226:
4224:
4222:
4218:
4217:
4205:
4197:
4196:
4186:
4185:
4184:
4180:
4179:
4167:
4159:
4158:
4146:
4145:
4140:
4133:
4127:
4126:
4116:
4113:
4108:
4090:
4089:
4073:
4071:
4070:
4065:
4060:
4059:
4047:
4039:
4038:
4028:
4023:
4008:
4000:
3992:
3991:
3973:
3972:
3960:
3952:
3951:
3929:
3927:
3926:
3921:
3919:
3918:
3913:
3912:
3898:
3896:
3895:
3890:
3888:
3876:
3874:
3873:
3868:
3866:
3865:
3860:
3845:
3844:
3839:
3827:-th tree, where
3826:
3824:
3823:
3818:
3806:
3804:
3803:
3798:
3796:
3784:
3782:
3781:
3776:
3771:
3770:
3765:
3756:
3748:
3747:
3731:
3729:
3728:
3723:
3711:
3709:
3708:
3703:
3698:
3690:
3664:
3646:
3644:
3643:
3638:
3636:
3624:
3622:
3621:
3616:
3604:
3602:
3601:
3596:
3585:
3584:
3559:
3557:
3556:
3551:
3540:
3525:
3523:
3522:
3517:
3515:
3507:
3506:
3478:
3476:
3475:
3470:
3467:
3462:
3444:
3443:
3431:
3430:
3425:
3410:
3409:
3404:
3403:
3372:
3370:
3369:
3364:
3362:
3345:is built, where
3344:
3342:
3341:
3336:
3248:
3244:
3242:
3241:
3236:
3224:
3222:
3221:
3216:
3214:
3213:
3197:
3191:
3189:
3188:
3183:
3178:
3177:
3167:
3163:
3159:
3148:
3147:
3135:
3134:
3124:
3119:
3104:
3096:
3088:
3083:
3065:
3064:
3051:
3040:
3039:
3027:
3026:
3016:
3011:
2995:
2990:
2975:
2967:
2962:
2961:
2953:
2943:
2941:
2940:
2935:
2933:
2932:
2916:
2908:
2904:
2900:
2893:
2891:
2890:
2885:
2883:
2881:
2870:
2862:
2851:
2850:
2825:
2821:
2817:
2810:
2808:
2807:
2802:
2800:
2792:
2784:
2773:
2772:
2750:
2742:
2740:
2739:
2734:
2732:
2731:
2715:
2711:
2707:
2703:
2701:
2700:
2695:
2690:
2679:
2678:
2654:
2652:
2651:
2646:
2641:
2640:
2627:
2616:
2615:
2599:
2594:
2576:
2575:
2567:
2557:
2553:
2549:
2547:
2546:
2541:
2539:
2538:
2530:
2520:
2518:
2517:
2512:
2509:
2504:
2486:
2485:
2473:
2472:
2446:
2440:
2407:Gini coefficient
2396:
2394:
2393:
2388:
2376:
2374:
2373:
2368:
2356:
2354:
2353:
2348:
2337:
2336:
2335:
2334:
2310:
2308:
2307:
2302:
2290:
2288:
2287:
2282:
2280:
2275:
2274:
2265:
2251:
2250:
2249:
2248:
2227:
2225:
2224:
2219:
2207:
2205:
2204:
2199:
2197:
2196:
2180:
2178:
2177:
2172:
2170:
2169:
2153:
2151:
2150:
2145:
2133:
2131:
2130:
2125:
2111:
2110:
2109:
2108:
2082:
2081:
2080:
2079:
2064:
2048:
2045:
2043:
2038:
2037:
2022:
2019:
2011:
2010:
2009:
1999:
1984:
1982:
1981:
1969:
1955:
1952:
1909:
1907:
1906:
1901:
1889:
1887:
1886:
1881:
1869:
1867:
1866:
1861:
1845:out-of-bag error
1842:
1840:
1839:
1834:
1831:
1826:
1808:
1807:
1795:
1794:
1776:
1775:
1770:
1769:
1705:
1703:
1702:
1697:
1685:
1683:
1682:
1677:
1665:
1663:
1662:
1657:
1655:
1650:
1631:information gain
1612:
1605:
1604:
1603:
1596:
1589:
1562:
1555:
1546:out-of-bag error
1541:cross-validation
1538:
1534:
1528:
1526:
1525:
1520:
1515:
1512:
1501:
1500:
1499:
1490:
1489:
1481:
1472:
1461:
1460:
1447:
1442:
1426:
1425:
1410:
1393:
1391:
1390:
1385:
1380:
1369:
1368:
1358:
1353:
1338:
1330:
1325:
1324:
1316:
1304:
1300:
1291:
1284:
1277:
1267:
1260:
1253:
1249:
1245:
1238:
1234:
1219:
1212:
1205:
1201:
1194:
1187:
1092:out-of-bag error
960:
953:
946:
907:Related articles
784:Confusion matrix
537:Isolation forest
482:Graphical models
261:
260:
213:Learning to rank
208:Feature learning
46:Machine learning
37:
36:
21:
10276:
10275:
10271:
10270:
10269:
10267:
10266:
10265:
10256:Decision theory
10231:
10230:
10211:
10206:
10147:
10114:
10113:
10112:
10094:
10090:
10085:
10083:Further reading
10080:
10067:
10063:
10034:
10030:
9995:
9991:
9974:
9967:
9946:
9939:
9922:
9918:
9894:10.1.1.153.9168
9877:
9873:
9861:
9859:
9850:
9849:
9838:
9834:
9817:
9806:
9763:
9756:
9749:
9727:
9723:
9700:
9696:
9667:(2): 04021005.
9653:
9640:
9603:
9599:
9567:10.1.1.698.2365
9550:
9546:
9538:10.1.1.153.9168
9525:
9518:
9510:
9506:
9499:
9483:
9479:
9472:
9448:
9444:
9435:
9433:
9425:
9424:
9417:
9394:(14): 1986–94.
9380:
9376:
9327:
9323:
9307:10.1.1.525.3178
9289:
9283:
9279:
9258:(2): 04020022.
9248:
9244:
9207:
9203:
9190:
9186:
9141:
9137:
9131:
9127:
9122:
9118:
9113:
9109:
9104:
9100:
9095:
9091:
9086:
9082:
9056:
9050:
9046:
9037:
9035:
9031:
9000:
8994:
8990:
8977:
8973:
8942:
8938:
8925:
8921:
8912:
8910:
8906:
8863:
8853:
8846:
8837:
8836:
8832:
8827:
8823:
8813:
8811:
8808:
8802:
8795:
8753:
8742:
8734:
8693:
8687:
8680:
8641:
8634:
8626:
8583:
8577:
8570:
8563:
8536:
8519:
8487:
8481:
8470:
8460:
8458:
8454:
8447:
8439:
8428:
8424:
8409:
8378:
8358:
8336:
8332:
8286:
8279:
8275:
8263:
8259:
8251:
8234:
8222:
8217:
8206:
8205:
8196:
8194:
8191:
8190:
8168:
8165:
8164:
8141:
8137:
8132:
8127:
8124:
8123:
8101:
8098:
8097:
8094:
8072:
8068:
8025:
8018:
8014:
8008:
8004:
7995:
7991:
7983:
7966:
7954:
7949:
7938:
7937:
7928:
7926:
7923:
7922:
7906:
7903:
7902:
7879:
7875:
7873:
7870:
7869:
7846:
7842:
7837:
7832:
7829:
7828:
7806:
7803:
7802:
7799:
7774:
7771:
7770:
7753:
7749:
7735:
7732:
7731:
7715:
7713:
7710:
7709:
7686:
7682:
7680:
7677:
7676:
7660:
7658:
7655:
7654:
7638:
7635:
7634:
7609:
7595:
7592:
7591:
7588:
7558:
7554:
7536:
7531:
7527:
7521:
7517:
7503:
7488:
7477:
7476:
7475:
7467:
7463:
7456:
7452:
7443:
7439:
7438:
7436:
7428:
7420:
7405:
7394:
7393:
7392:
7381:
7366:
7362:
7357:
7355:
7352:
7351:
7326:
7320:
7316:
7298:
7292:
7291:
7290:
7281:
7277:
7257:
7248:
7244:
7232:
7228:
7219:
7215:
7204:
7201:
7200:
7177:
7171:
7167:
7149:
7143:
7142:
7141:
7132:
7128:
7111:
7102:
7098:
7089:
7085:
7074:
7071:
7070:
7032:
7023:
7019:
7008:
7005:
7004:
6981:
6977:
6962:
6958:
6943:
6939:
6934:
6931:
6930:
6910:
6907:
6906:
6903:
6876:
6861:
6850:
6849:
6848:
6840:
6836:
6829:
6825:
6816:
6812:
6811:
6809:
6801:
6793:
6778:
6767:
6766:
6765:
6754:
6739:
6735:
6730:
6728:
6725:
6724:
6704:
6700:
6690:
6686:
6678:
6677:
6671:
6667:
6661:
6650:
6636:
6627:
6623:
6620: and
6618:
6612:
6608:
6591:
6582:
6578:
6569:
6565:
6563:
6560:
6559:
6539:
6535:
6520:
6516:
6511:
6508:
6507:
6500:
6495:
6471:
6467:
6447:
6442:
6427:
6421:
6411:
6405:
6401:
6396:
6386:
6382:
6381:
6379:
6365:
6361:
6360:
6349:
6340:
6334:
6330:
6325:
6318:
6314:
6308:
6297:
6287:
6273:
6269:
6268:
6256:
6252:
6240:
6236:
6235:
6227:
6225:
6211:
6207:
6201:
6190:
6177:
6173:
6158:
6154:
6153:
6138:
6130:
6118:
6113:
6107:
6104:
6103:
6083:
6079:
6064:
6060:
6052:
6037:
6026:
6025:
6024:
6022:
6019:
6018:
6015:
5991:
5987:
5967:
5959:
5954:
5939:
5935:
5927:
5923:
5922:
5918:
5903:
5899:
5891:
5887:
5886:
5882:
5878:
5873:
5872:
5866:
5855:
5845:
5831:
5827:
5826:
5814:
5810:
5798:
5794:
5793:
5785:
5783:
5769:
5765:
5759:
5748:
5735:
5731:
5716:
5712:
5711:
5696:
5688:
5676:
5671:
5665:
5662:
5661:
5641:
5637:
5622:
5618:
5610:
5595:
5584:
5583:
5582:
5580:
5577:
5576:
5560:
5557:
5556:
5553:
5525:
5520:
5519:
5511:
5496:
5492:
5486:
5475:
5470:
5460:
5455:
5454:
5446:
5431:
5427:
5421:
5417:
5411:
5400:
5395:
5393:
5381:
5377:
5362:
5358:
5350:
5335:
5324:
5323:
5322:
5320:
5317:
5316:
5300:
5298:
5295:
5294:
5278:
5276:
5273:
5272:
5250:
5246:
5238:
5229:
5225:
5217:
5216:
5211:
5210:
5204:
5193:
5179:
5168:
5160:
5145:
5141:
5139:
5136:
5135:
5119:
5116:
5115:
5099:
5097:
5094:
5093:
5076:
5072:
5070:
5067:
5066:
5041:
5037:
5029:
5020:
5016:
5007:
5002:
5001:
5000:
4995:
4994:
4988:
4984:
4978:
4967:
4957:
4946:
4930:
4926:
4918:
4909:
4905:
4899:
4888:
4883:
4878:
4866:
4862:
4847:
4843:
4835:
4820:
4809:
4808:
4807:
4805:
4802:
4801:
4773:
4769:
4761:
4752:
4748:
4747:
4735:
4731:
4723:
4714:
4710:
4701:
4696:
4695:
4694:
4689:
4688:
4682:
4678:
4677:
4675:
4669:
4658:
4653:
4649:
4643:
4632:
4618:
4606:
4602:
4587:
4583:
4575:
4560:
4556:
4554:
4551:
4550:
4533:
4529:
4509:
4507:
4504:
4503:
4478:
4474:
4466:
4457:
4453:
4444:
4439:
4438:
4437:
4432:
4431:
4425:
4414:
4398:
4394:
4386:
4377:
4373:
4371:
4368:
4367:
4350:
4344:
4343:
4342:
4340:
4337:
4336:
4319:
4315:
4313:
4310:
4309:
4293:
4291:
4288:
4287:
4267:
4263:
4255:
4246:
4242:
4240:
4237:
4236:
4213:
4209:
4201:
4192:
4188:
4187:
4175:
4171:
4163:
4154:
4150:
4141:
4136:
4135:
4134:
4129:
4128:
4122:
4118:
4117:
4115:
4109:
4098:
4085:
4081:
4079:
4076:
4075:
4055:
4051:
4043:
4034:
4030:
4024:
4013:
3999:
3987:
3983:
3968:
3964:
3956:
3941:
3937:
3935:
3932:
3931:
3914:
3908:
3907:
3906:
3904:
3901:
3900:
3884:
3882:
3879:
3878:
3861:
3856:
3855:
3840:
3835:
3834:
3832:
3829:
3828:
3812:
3809:
3808:
3792:
3790:
3787:
3786:
3766:
3761:
3760:
3752:
3743:
3739:
3737:
3734:
3733:
3717:
3714:
3713:
3694:
3686:
3660:
3652:
3649:
3648:
3632:
3630:
3627:
3626:
3610:
3607:
3606:
3580:
3576:
3565:
3562:
3561:
3536:
3531:
3528:
3527:
3511:
3502:
3498:
3484:
3481:
3480:
3463:
3452:
3439:
3435:
3426:
3421:
3420:
3405:
3399:
3398:
3397:
3395:
3392:
3391:
3388:
3379:
3358:
3350:
3347:
3346:
3330:
3327:
3326:
3322:
3317:
3297:
3281:
3265:
3256:
3246:
3230:
3227:
3226:
3209:
3205:
3203:
3200:
3199:
3195:
3173:
3169:
3152:
3143:
3139:
3130:
3126:
3120:
3109:
3095:
3094:
3090:
3084:
3073:
3060:
3056:
3044:
3035:
3031:
3022:
3018:
3012:
3001:
2991:
2980:
2966:
2952:
2951:
2949:
2946:
2945:
2928:
2924:
2922:
2919:
2918:
2914:
2906:
2902:
2899:
2895:
2874:
2869:
2855:
2846:
2842:
2834:
2831:
2830:
2823:
2819:
2816:
2812:
2791:
2777:
2768:
2764:
2756:
2753:
2752:
2748:
2727:
2723:
2721:
2718:
2717:
2713:
2709:
2705:
2683:
2674:
2670:
2662:
2659:
2658:
2636:
2632:
2620:
2611:
2607:
2595:
2584:
2566:
2565:
2563:
2560:
2559:
2555:
2551:
2550:for new points
2529:
2528:
2526:
2523:
2522:
2505:
2494:
2481:
2477:
2468:
2464:
2456:
2453:
2452:
2444:
2438:
2434:
2382:
2379:
2378:
2362:
2359:
2358:
2330:
2326:
2325:
2321:
2316:
2313:
2312:
2296:
2293:
2292:
2270:
2266:
2264:
2244:
2240:
2239:
2235:
2233:
2230:
2229:
2213:
2210:
2209:
2208:indicates tree
2192:
2188:
2186:
2183:
2182:
2165:
2161:
2159:
2156:
2155:
2139:
2136:
2135:
2104:
2100:
2099:
2095:
2075:
2071:
2070:
2066:
2044:
2039:
2033:
2029:
2018:
2017:
2005:
2001:
2000:
1989:
1977:
1973:
1968:
1951:
1949:
1946:
1945:
1941:
1895:
1892:
1891:
1875:
1872:
1871:
1855:
1852:
1851:
1827:
1816:
1803:
1799:
1790:
1786:
1771:
1765:
1764:
1763:
1761:
1758:
1757:
1754:
1737:
1732:
1712:
1691:
1688:
1687:
1671:
1668:
1667:
1649:
1647:
1644:
1643:
1619:
1607:
1601:
1600:
1598:
1594:
1587:
1575:
1569:
1561:
1557:
1554:
1550:
1536:
1532:
1502:
1495:
1491:
1480:
1479:
1465:
1456:
1452:
1443:
1432:
1427:
1424:
1416:
1413:
1412:
1408:
1373:
1364:
1360:
1354:
1343:
1329:
1315:
1314:
1312:
1309:
1308:
1302:
1298:
1295:
1290:
1286:
1283:
1279:
1276:
1272:
1266:
1262:
1259:
1255:
1251:
1247:
1243:
1236:
1232:
1218:
1214:
1211:
1207:
1203:
1202:with responses
1200:
1196:
1193:
1189:
1185:
1162:
1156:
1130:
1124:
1119:
1046:
964:
935:
934:
908:
900:
899:
860:
852:
851:
812:Kernel machines
807:
799:
798:
774:
766:
765:
746:Active learning
741:
733:
732:
701:
691:
690:
616:Diffusion model
552:
542:
541:
514:
504:
503:
477:
467:
466:
422:Factor analysis
417:
407:
406:
390:
353:
343:
342:
263:
262:
246:
245:
244:
233:
232:
138:
130:
129:
95:Online learning
60:
48:
35:
32:
23:
22:
15:
12:
11:
5:
10274:
10264:
10263:
10258:
10253:
10251:Decision trees
10248:
10243:
10229:
10228:
10218:
10210:
10209:External links
10207:
10205:
10204:
10151:
10145:
10115:
10095:
10088:
10087:
10086:
10084:
10081:
10079:
10078:
10061:
10028:
9989:
9965:
9937:
9916:
9871:
9862:|journal=
9832:
9804:
9754:
9747:
9721:
9694:
9638:
9597:
9560:(1): 118–138.
9544:
9516:
9504:
9497:
9477:
9470:
9442:
9415:
9388:Bioinformatics
9374:
9321:
9277:
9242:
9221:(10): 1340–7.
9215:Bioinformatics
9201:
9184:
9135:
9125:
9116:
9107:
9098:
9089:
9080:
9044:
9011:(2): 102–112.
8988:
8971:
8958:(2): 139–157.
8936:
8934:, pp. 138-149.
8919:
8883:10.1.1.57.6069
8844:
8830:
8821:
8793:
8740:
8737:on 2018-01-18.
8711:10.1.1.33.4131
8704:(5): 473–490.
8678:
8632:
8629:on 2018-01-18.
8603:10.1.1.25.6750
8568:
8561:
8539:Hastie, Trevor
8517:
8498:(8): 832–844.
8483:Ho TK (1998).
8468:
8425:
8423:
8420:
8419:
8418:
8412:
8403:
8397:
8391:
8385:
8377:
8374:
8357:
8354:
8339:
8335:
8331:
8328:
8325:
8322:
8317:
8314:
8311:
8308:
8305:
8302:
8299:
8296:
8293:
8289:
8285:
8282:
8278:
8274:
8271:
8266:
8262:
8258:
8254:
8250:
8247:
8244:
8241:
8237:
8233:
8228:
8225:
8220:
8213:
8210:
8203:
8199:
8178:
8175:
8172:
8152:
8149:
8144:
8140:
8135:
8131:
8111:
8108:
8105:
8093:
8090:
8075:
8071:
8067:
8064:
8061:
8058:
8053:
8050:
8047:
8044:
8041:
8038:
8035:
8032:
8028:
8024:
8021:
8017:
8011:
8007:
8003:
7998:
7994:
7990:
7986:
7982:
7979:
7976:
7973:
7969:
7965:
7960:
7957:
7952:
7945:
7942:
7935:
7931:
7910:
7890:
7887:
7882:
7878:
7857:
7854:
7849:
7845:
7840:
7836:
7816:
7813:
7810:
7798:
7795:
7778:
7756:
7752:
7748:
7745:
7742:
7739:
7718:
7697:
7694:
7689:
7685:
7663:
7642:
7622:
7619:
7616:
7612:
7608:
7605:
7602:
7599:
7587:
7584:
7571:
7567:
7561:
7557:
7551:
7548:
7545:
7542:
7539:
7535:
7530:
7524:
7520:
7516:
7513:
7510:
7506:
7502:
7497:
7494:
7491:
7484:
7481:
7470:
7466:
7459:
7455:
7451:
7446:
7442:
7435:
7431:
7427:
7423:
7419:
7414:
7411:
7408:
7401:
7398:
7391:
7388:
7384:
7380:
7375:
7372:
7369:
7365:
7360:
7348:
7347:
7336:
7333:
7329:
7323:
7319:
7315:
7312:
7309:
7306:
7301:
7295:
7289:
7284:
7280:
7276:
7273:
7270:
7267:
7264:
7260:
7256:
7251:
7247:
7243:
7240:
7235:
7231:
7227:
7222:
7218:
7214:
7211:
7208:
7198:
7187:
7184:
7180:
7174:
7170:
7166:
7163:
7160:
7157:
7152:
7146:
7140:
7135:
7131:
7127:
7124:
7121:
7118:
7114:
7110:
7105:
7101:
7097:
7092:
7088:
7084:
7081:
7078:
7068:
7057:
7054:
7051:
7048:
7045:
7042:
7039:
7035:
7031:
7026:
7022:
7018:
7015:
7012:
6989:
6984:
6980:
6976:
6973:
6970:
6965:
6961:
6957:
6954:
6951:
6946:
6942:
6938:
6914:
6902:
6899:
6886:
6883:
6879:
6875:
6870:
6867:
6864:
6857:
6854:
6843:
6839:
6832:
6828:
6824:
6819:
6815:
6808:
6804:
6800:
6796:
6792:
6787:
6784:
6781:
6774:
6771:
6764:
6761:
6757:
6753:
6748:
6745:
6742:
6738:
6733:
6712:
6707:
6703:
6699:
6693:
6689:
6685:
6681:
6674:
6670:
6664:
6659:
6656:
6653:
6649:
6643:
6640:
6635:
6630:
6626:
6615:
6611:
6607:
6604:
6601:
6598:
6594:
6590:
6585:
6581:
6577:
6572:
6568:
6547:
6542:
6538:
6534:
6531:
6528:
6523:
6519:
6515:
6499:
6496:
6494:
6491:
6479:
6474:
6470:
6466:
6463:
6460:
6457:
6454:
6450:
6440:
6433:
6430:
6424:
6419:
6414:
6408:
6404:
6399:
6395:
6392:
6389:
6385:
6376:
6373:
6368:
6364:
6358:
6355:
6352:
6348:
6343:
6337:
6333:
6328:
6324:
6321:
6317:
6311:
6306:
6303:
6300:
6296:
6290:
6285:
6280:
6277:
6272:
6264:
6259:
6255:
6251:
6248:
6243:
6239:
6233:
6230:
6222:
6219:
6214:
6210:
6204:
6199:
6196:
6193:
6189:
6185:
6180:
6176:
6172:
6169:
6166:
6161:
6157:
6152:
6148:
6145:
6141:
6137:
6133:
6129:
6124:
6121:
6116:
6112:
6091:
6086:
6082:
6078:
6075:
6072:
6067:
6063:
6059:
6055:
6051:
6046:
6043:
6040:
6033:
6030:
6014:
6011:
5999:
5994:
5990:
5986:
5983:
5980:
5977:
5974:
5970:
5966:
5962:
5952:
5947:
5942:
5938:
5930:
5926:
5921:
5917:
5914:
5911:
5906:
5902:
5894:
5890:
5885:
5881:
5876:
5869:
5864:
5861:
5858:
5854:
5848:
5843:
5838:
5835:
5830:
5822:
5817:
5813:
5809:
5806:
5801:
5797:
5791:
5788:
5780:
5777:
5772:
5768:
5762:
5757:
5754:
5751:
5747:
5743:
5738:
5734:
5730:
5727:
5724:
5719:
5715:
5710:
5706:
5703:
5699:
5695:
5691:
5687:
5682:
5679:
5674:
5670:
5649:
5644:
5640:
5636:
5633:
5630:
5625:
5621:
5617:
5613:
5609:
5604:
5601:
5598:
5591:
5588:
5564:
5552:
5549:
5533:
5528:
5523:
5518:
5514:
5510:
5505:
5502:
5499:
5495:
5489:
5484:
5481:
5478:
5474:
5468:
5463:
5458:
5453:
5449:
5445:
5440:
5437:
5434:
5430:
5424:
5420:
5414:
5409:
5406:
5403:
5399:
5392:
5389:
5384:
5380:
5376:
5373:
5370:
5365:
5361:
5357:
5353:
5349:
5344:
5341:
5338:
5331:
5328:
5303:
5281:
5258:
5253:
5249:
5245:
5241:
5237:
5232:
5228:
5224:
5220:
5214:
5207:
5202:
5199:
5196:
5192:
5186:
5183:
5178:
5175:
5171:
5167:
5163:
5159:
5154:
5151:
5148:
5144:
5123:
5102:
5079:
5075:
5054:
5049:
5044:
5040:
5036:
5032:
5028:
5023:
5019:
5015:
5010:
5005:
4998:
4991:
4987:
4981:
4976:
4973:
4970:
4966:
4960:
4955:
4952:
4949:
4945:
4938:
4933:
4929:
4925:
4921:
4917:
4912:
4908:
4902:
4897:
4894:
4891:
4887:
4882:
4877:
4874:
4869:
4865:
4861:
4858:
4855:
4850:
4846:
4842:
4838:
4834:
4829:
4826:
4823:
4816:
4813:
4788:
4781:
4776:
4772:
4768:
4764:
4760:
4755:
4751:
4743:
4738:
4734:
4730:
4726:
4722:
4717:
4713:
4709:
4704:
4699:
4692:
4685:
4681:
4672:
4667:
4664:
4661:
4657:
4652:
4646:
4641:
4638:
4635:
4631:
4625:
4622:
4617:
4614:
4609:
4605:
4601:
4598:
4595:
4590:
4586:
4582:
4578:
4574:
4569:
4566:
4563:
4559:
4536:
4532:
4528:
4525:
4522:
4519:
4516:
4512:
4486:
4481:
4477:
4473:
4469:
4465:
4460:
4456:
4452:
4447:
4442:
4435:
4428:
4423:
4420:
4417:
4413:
4409:
4406:
4401:
4397:
4393:
4389:
4385:
4380:
4376:
4353:
4347:
4322:
4318:
4296:
4275:
4270:
4266:
4262:
4258:
4254:
4249:
4245:
4221:
4216:
4212:
4208:
4204:
4200:
4195:
4191:
4183:
4178:
4174:
4170:
4166:
4162:
4157:
4153:
4149:
4144:
4139:
4132:
4125:
4121:
4112:
4107:
4104:
4101:
4097:
4093:
4088:
4084:
4063:
4058:
4054:
4050:
4046:
4042:
4037:
4033:
4027:
4022:
4019:
4016:
4012:
4006:
4003:
3998:
3995:
3990:
3986:
3982:
3979:
3976:
3971:
3967:
3963:
3959:
3955:
3950:
3947:
3944:
3940:
3917:
3911:
3887:
3864:
3859:
3854:
3851:
3848:
3843:
3838:
3816:
3795:
3774:
3769:
3764:
3759:
3755:
3751:
3746:
3742:
3721:
3701:
3697:
3693:
3689:
3685:
3682:
3679:
3676:
3673:
3670:
3667:
3663:
3659:
3656:
3635:
3614:
3594:
3591:
3588:
3583:
3579:
3575:
3572:
3569:
3549:
3546:
3543:
3539:
3535:
3514:
3510:
3505:
3501:
3497:
3494:
3491:
3488:
3466:
3461:
3458:
3455:
3451:
3447:
3442:
3438:
3434:
3429:
3424:
3419:
3416:
3413:
3408:
3402:
3387:
3384:
3378:
3377:Uniform forest
3375:
3361:
3357:
3354:
3334:
3321:
3318:
3316:
3313:
3304:kernel methods
3296:
3293:
3289:kernel methods
3285:kernel methods
3280:
3277:
3264:
3261:
3255:
3252:
3234:
3212:
3208:
3181:
3176:
3172:
3166:
3162:
3158:
3155:
3151:
3146:
3142:
3138:
3133:
3129:
3123:
3118:
3115:
3112:
3108:
3102:
3099:
3093:
3087:
3082:
3079:
3076:
3072:
3068:
3063:
3059:
3054:
3050:
3047:
3043:
3038:
3034:
3030:
3025:
3021:
3015:
3010:
3007:
3004:
3000:
2994:
2989:
2986:
2983:
2979:
2973:
2970:
2965:
2959:
2956:
2931:
2927:
2911:
2910:
2901:is one of the
2897:
2880:
2877:
2873:
2868:
2865:
2861:
2858:
2854:
2849:
2845:
2841:
2838:
2827:
2818:is one of the
2814:
2798:
2795:
2790:
2787:
2783:
2780:
2776:
2771:
2767:
2763:
2760:
2730:
2726:
2693:
2689:
2686:
2682:
2677:
2673:
2669:
2666:
2644:
2639:
2635:
2630:
2626:
2623:
2619:
2614:
2610:
2606:
2603:
2598:
2593:
2590:
2587:
2583:
2579:
2573:
2570:
2536:
2533:
2508:
2503:
2500:
2497:
2493:
2489:
2484:
2480:
2476:
2471:
2467:
2463:
2460:
2433:
2430:
2429:
2428:
2425:
2415:
2414:
2409:
2404:
2386:
2366:
2346:
2343:
2340:
2333:
2329:
2324:
2320:
2300:
2278:
2273:
2269:
2263:
2260:
2257:
2254:
2247:
2243:
2238:
2217:
2195:
2191:
2168:
2164:
2143:
2123:
2120:
2117:
2114:
2107:
2103:
2098:
2094:
2091:
2088:
2085:
2078:
2074:
2069:
2063:
2060:
2057:
2054:
2051:
2046:split variable
2042:
2036:
2032:
2028:
2025:
2016:
2008:
2004:
1998:
1995:
1992:
1988:
1980:
1976:
1972:
1967:
1964:
1961:
1958:
1940:
1937:
1936:
1935:
1932:
1929:
1899:
1879:
1859:
1830:
1825:
1822:
1819:
1815:
1811:
1806:
1802:
1798:
1793:
1789:
1785:
1782:
1779:
1774:
1768:
1753:
1750:
1736:
1733:
1731:
1728:
1727:
1726:
1723:
1720:
1711:
1708:
1695:
1675:
1653:
1618:
1615:
1571:Main article:
1568:
1565:
1559:
1552:
1518:
1511:
1508:
1505:
1498:
1494:
1487:
1484:
1478:
1475:
1471:
1468:
1464:
1459:
1455:
1451:
1446:
1441:
1438:
1435:
1431:
1423:
1420:
1383:
1379:
1376:
1372:
1367:
1363:
1357:
1352:
1349:
1346:
1342:
1336:
1333:
1328:
1322:
1319:
1294:
1293:
1288:
1281:
1274:
1269:
1264:
1257:
1230:
1216:
1209:
1198:
1191:
1158:Main article:
1155:
1152:
1126:Main article:
1123:
1120:
1118:
1115:
1103:
1102:
1099:
1045:
1042:
991:decision trees
983:classification
971:Random forests
966:
965:
963:
962:
955:
948:
940:
937:
936:
933:
932:
927:
926:
925:
915:
909:
906:
905:
902:
901:
898:
897:
892:
887:
882:
877:
872:
867:
861:
858:
857:
854:
853:
850:
849:
844:
839:
834:
832:Occam learning
829:
824:
819:
814:
808:
805:
804:
801:
800:
797:
796:
791:
789:Learning curve
786:
781:
775:
772:
771:
768:
767:
764:
763:
758:
753:
748:
742:
739:
738:
735:
734:
731:
730:
729:
728:
718:
713:
708:
702:
697:
696:
693:
692:
689:
688:
682:
677:
672:
667:
666:
665:
655:
650:
649:
648:
643:
638:
633:
623:
618:
613:
608:
607:
606:
596:
595:
594:
589:
584:
579:
569:
564:
559:
553:
548:
547:
544:
543:
540:
539:
534:
529:
521:
515:
510:
509:
506:
505:
502:
501:
500:
499:
494:
489:
478:
473:
472:
469:
468:
465:
464:
459:
454:
449:
444:
439:
434:
429:
424:
418:
413:
412:
409:
408:
405:
404:
399:
394:
388:
383:
378:
370:
365:
360:
354:
349:
348:
345:
344:
341:
340:
335:
330:
325:
320:
315:
310:
305:
297:
296:
295:
290:
285:
275:
273:Decision trees
270:
264:
250:classification
240:
239:
238:
235:
234:
231:
230:
225:
220:
215:
210:
205:
200:
195:
190:
185:
180:
175:
170:
165:
160:
155:
150:
145:
143:Classification
139:
136:
135:
132:
131:
128:
127:
122:
117:
112:
107:
102:
100:Batch learning
97:
92:
87:
82:
77:
72:
67:
61:
58:
57:
54:
53:
42:
41:
33:
9:
6:
4:
3:
2:
10273:
10262:
10259:
10257:
10254:
10252:
10249:
10247:
10244:
10242:
10239:
10238:
10236:
10226:
10222:
10219:
10216:
10213:
10212:
10201:
10197:
10192:
10187:
10182:
10177:
10173:
10169:
10165:
10161:
10157:
10152:
10148:
10142:
10138:
10134:
10130:
10126:
10122:
10117:
10116:
10110:
10109:
10108:
10107:Random forest
10102:
10098:
10074:
10073:
10065:
10056:
10051:
10047:
10043:
10039:
10032:
10024:
10020:
10016:
10012:
10008:
10004:
10000:
9993:
9984:
9979:
9972:
9970:
9960:
9959:10.1.1.618.90
9955:
9951:
9944:
9942:
9932:
9927:
9920:
9912:
9908:
9904:
9900:
9895:
9890:
9886:
9882:
9875:
9867:
9854:
9846:
9842:
9836:
9827:
9822:
9815:
9813:
9811:
9809:
9800:
9796:
9792:
9788:
9784:
9780:
9776:
9772:
9768:
9761:
9759:
9750:
9744:
9740:
9736:
9732:
9725:
9717:
9713:
9709:
9705:
9698:
9690:
9686:
9682:
9678:
9674:
9670:
9666:
9662:
9658:
9651:
9649:
9647:
9645:
9643:
9634:
9630:
9625:
9620:
9617:(4): 547–57.
9616:
9612:
9608:
9601:
9593:
9589:
9585:
9581:
9577:
9573:
9568:
9563:
9559:
9555:
9548:
9539:
9534:
9530:
9523:
9521:
9514:31. Aug. 2023
9513:
9508:
9500:
9494:
9490:
9489:
9481:
9473:
9467:
9463:
9459:
9455:
9454:
9446:
9432:
9428:
9422:
9420:
9411:
9407:
9402:
9397:
9393:
9389:
9385:
9378:
9370:
9366:
9362:
9358:
9354:
9350:
9345:
9340:
9336:
9332:
9325:
9317:
9313:
9308:
9303:
9299:
9295:
9288:
9281:
9273:
9269:
9265:
9261:
9257:
9253:
9246:
9238:
9234:
9229:
9224:
9220:
9216:
9212:
9205:
9197:
9196:
9188:
9180:
9176:
9171:
9166:
9162:
9158:
9154:
9150:
9146:
9139:
9129:
9120:
9111:
9102:
9093:
9084:
9075:
9070:
9066:
9062:
9055:
9048:
9034:on 2016-04-17
9030:
9026:
9022:
9018:
9014:
9010:
9006:
8999:
8992:
8984:
8983:
8975:
8966:
8961:
8957:
8953:
8952:
8947:
8940:
8933:
8929:
8923:
8909:on 2018-02-05
8905:
8901:
8897:
8893:
8889:
8884:
8879:
8875:
8871:
8870:
8862:
8858:
8851:
8849:
8840:
8834:
8825:
8807:
8800:
8798:
8788:
8783:
8779:
8775:
8771:
8767:
8766:
8761:
8757:
8751:
8749:
8747:
8745:
8733:
8729:
8725:
8721:
8717:
8712:
8707:
8703:
8699:
8692:
8685:
8683:
8674:
8670:
8665:
8660:
8656:
8652:
8651:
8646:
8639:
8637:
8625:
8621:
8617:
8613:
8609:
8604:
8599:
8595:
8591:
8590:
8582:
8575:
8573:
8564:
8562:0-387-95284-5
8558:
8554:
8553:
8548:
8544:
8540:
8534:
8532:
8530:
8528:
8526:
8524:
8522:
8513:
8509:
8505:
8501:
8497:
8493:
8486:
8479:
8477:
8475:
8473:
8453:
8446:
8445:
8437:
8435:
8433:
8431:
8426:
8416:
8413:
8407:
8404:
8401:
8398:
8395:
8392:
8389:
8386:
8383:
8380:
8379:
8373:
8371:
8367:
8363:
8356:Disadvantages
8353:
8337:
8329:
8326:
8323:
8312:
8309:
8306:
8303:
8300:
8297:
8294:
8287:
8283:
8280:
8276:
8272:
8269:
8264:
8245:
8242:
8226:
8223:
8218:
8208:
8176:
8173:
8170:
8142:
8138:
8133:
8129:
8103:
8089:
8073:
8065:
8062:
8059:
8048:
8045:
8042:
8039:
8036:
8033:
8026:
8022:
8019:
8015:
8009:
8005:
8001:
7996:
7977:
7974:
7958:
7955:
7950:
7940:
7908:
7888:
7885:
7880:
7876:
7847:
7843:
7838:
7834:
7808:
7794:
7792:
7776:
7754:
7746:
7743:
7740:
7692:
7687:
7683:
7640:
7620:
7617:
7603:
7600:
7597:
7582:
7569:
7565:
7559:
7555:
7549:
7546:
7543:
7540:
7537:
7528:
7522:
7518:
7514:
7511:
7495:
7492:
7479:
7468:
7464:
7457:
7453:
7449:
7444:
7440:
7433:
7412:
7409:
7396:
7389:
7373:
7370:
7363:
7334:
7331:
7327:
7321:
7317:
7313:
7310:
7307:
7299:
7287:
7282:
7278:
7274:
7262:
7249:
7245:
7238:
7225:
7220:
7216:
7209:
7199:
7185:
7182:
7178:
7172:
7168:
7164:
7161:
7158:
7150:
7138:
7133:
7129:
7125:
7116:
7103:
7099:
7095:
7090:
7086:
7079:
7069:
7055:
7052:
7049:
7037:
7024:
7020:
7013:
7003:
7002:
7001:
6982:
6978:
6971:
6963:
6959:
6952:
6944:
6940:
6926:
6912:
6897:
6884:
6868:
6865:
6862:
6852:
6841:
6837:
6830:
6826:
6822:
6817:
6813:
6806:
6785:
6782:
6779:
6769:
6762:
6746:
6743:
6740:
6736:
6710:
6705:
6701:
6697:
6691:
6683:
6672:
6668:
6662:
6657:
6654:
6651:
6647:
6641:
6638:
6633:
6628:
6624:
6613:
6609:
6605:
6596:
6583:
6579:
6575:
6570:
6566:
6540:
6536:
6529:
6521:
6517:
6503:
6490:
6477:
6472:
6464:
6461:
6458:
6452:
6438:
6431:
6428:
6422:
6417:
6406:
6402:
6393:
6390:
6387:
6383:
6374:
6371:
6366:
6362:
6356:
6353:
6350:
6346:
6335:
6331:
6322:
6319:
6315:
6309:
6304:
6301:
6298:
6294:
6288:
6283:
6278:
6275:
6270:
6262:
6257:
6253:
6249:
6246:
6241:
6237:
6231:
6228:
6220:
6217:
6212:
6208:
6202:
6197:
6194:
6191:
6187:
6183:
6178:
6174:
6170:
6167:
6164:
6159:
6155:
6150:
6146:
6135:
6122:
6119:
6114:
6110:
6084:
6076:
6073:
6070:
6065:
6057:
6044:
6041:
6038:
6028:
6010:
5997:
5992:
5984:
5981:
5978:
5972:
5964:
5950:
5940:
5936:
5928:
5924:
5919:
5912:
5904:
5900:
5892:
5888:
5883:
5867:
5862:
5859:
5856:
5852:
5846:
5841:
5836:
5833:
5828:
5820:
5815:
5811:
5807:
5804:
5799:
5795:
5789:
5786:
5778:
5775:
5770:
5766:
5760:
5755:
5752:
5749:
5745:
5741:
5736:
5732:
5728:
5725:
5722:
5717:
5713:
5708:
5704:
5693:
5680:
5677:
5672:
5668:
5642:
5634:
5631:
5628:
5623:
5615:
5602:
5599:
5596:
5586:
5562:
5551:Centered KeRF
5548:
5526:
5516:
5503:
5500:
5497:
5493:
5487:
5482:
5479:
5476:
5472:
5461:
5451:
5438:
5435:
5432:
5428:
5422:
5418:
5412:
5407:
5404:
5401:
5397:
5390:
5382:
5374:
5371:
5368:
5363:
5355:
5342:
5339:
5336:
5326:
5251:
5243:
5230:
5226:
5222:
5205:
5200:
5197:
5194:
5190:
5184:
5181:
5176:
5165:
5152:
5149:
5146:
5142:
5121:
5077:
5073:
5052:
5042:
5034:
5021:
5017:
5013:
5008:
4989:
4985:
4979:
4974:
4971:
4968:
4964:
4958:
4953:
4950:
4947:
4943:
4931:
4923:
4910:
4906:
4900:
4895:
4892:
4889:
4885:
4880:
4875:
4867:
4859:
4856:
4853:
4848:
4840:
4827:
4824:
4821:
4811:
4786:
4774:
4766:
4753:
4749:
4736:
4728:
4715:
4711:
4707:
4702:
4683:
4679:
4670:
4665:
4662:
4659:
4655:
4650:
4644:
4639:
4636:
4633:
4629:
4623:
4620:
4615:
4607:
4599:
4596:
4593:
4588:
4580:
4567:
4564:
4561:
4557:
4534:
4526:
4523:
4520:
4514:
4500:
4479:
4471:
4458:
4454:
4450:
4445:
4426:
4421:
4418:
4415:
4411:
4407:
4399:
4391:
4378:
4374:
4351:
4320:
4268:
4260:
4247:
4243:
4214:
4206:
4193:
4189:
4176:
4168:
4155:
4151:
4147:
4142:
4123:
4119:
4110:
4105:
4102:
4099:
4095:
4091:
4086:
4082:
4056:
4048:
4035:
4031:
4025:
4020:
4017:
4014:
4010:
4004:
4001:
3996:
3988:
3980:
3977:
3974:
3969:
3961:
3948:
3945:
3942:
3938:
3915:
3862:
3852:
3849:
3846:
3841:
3814:
3767:
3757:
3744:
3740:
3719:
3691:
3683:
3680:
3674:
3668:
3654:
3612:
3589:
3581:
3577:
3570:
3544:
3541:
3508:
3503:
3495:
3492:
3489:
3464:
3459:
3456:
3453:
3440:
3436:
3432:
3427:
3411:
3406:
3383:
3374:
3355:
3352:
3332:
3312:
3309:
3305:
3301:
3292:
3290:
3286:
3276:
3274:
3270:
3260:
3251:
3232:
3210:
3206:
3192:
3179:
3174:
3170:
3164:
3156:
3153:
3149:
3144:
3140:
3131:
3127:
3121:
3116:
3113:
3110:
3106:
3100:
3097:
3091:
3085:
3080:
3077:
3074:
3070:
3066:
3061:
3057:
3048:
3045:
3041:
3036:
3032:
3023:
3019:
3013:
3008:
3005:
3002:
2998:
2992:
2987:
2984:
2981:
2977:
2971:
2968:
2963:
2954:
2929:
2925:
2878:
2875:
2871:
2866:
2859:
2856:
2852:
2847:
2843:
2836:
2828:
2796:
2793:
2788:
2781:
2778:
2774:
2769:
2765:
2758:
2746:
2745:
2744:
2728:
2724:
2687:
2684:
2680:
2675:
2671:
2664:
2655:
2642:
2637:
2633:
2624:
2621:
2617:
2612:
2608:
2601:
2596:
2591:
2588:
2585:
2581:
2577:
2568:
2531:
2506:
2501:
2498:
2495:
2482:
2478:
2474:
2469:
2465:
2450:
2442:
2426:
2423:
2422:
2421:
2418:
2413:
2410:
2408:
2405:
2403:
2400:
2399:
2398:
2384:
2364:
2341:
2331:
2327:
2322:
2298:
2276:
2271:
2267:
2261:
2255:
2245:
2241:
2236:
2215:
2193:
2189:
2166:
2162:
2141:
2121:
2115:
2105:
2101:
2096:
2086:
2076:
2072:
2067:
2061:
2058:
2052:
2034:
2030:
2026:
2023:
2014:
2006:
2002:
1996:
1993:
1990:
1986:
1978:
1974:
1970:
1965:
1959:
1933:
1930:
1927:
1923:
1922:
1921:
1918:
1917:
1911:
1897:
1877:
1857:
1848:
1846:
1828:
1823:
1820:
1817:
1804:
1800:
1796:
1791:
1787:
1777:
1772:
1749:
1747:
1743:
1724:
1721:
1718:
1717:
1716:
1707:
1693:
1673:
1651:
1640:
1636:
1635:Gini impurity
1632:
1628:
1624:
1614:
1610:
1591:
1585:
1581:
1574:
1564:
1548:
1547:
1542:
1529:
1516:
1509:
1506:
1503:
1496:
1482:
1476:
1469:
1466:
1457:
1453:
1444:
1439:
1436:
1433:
1429:
1421:
1418:
1405:
1402:
1397:
1394:
1377:
1374:
1365:
1361:
1355:
1350:
1347:
1344:
1340:
1334:
1331:
1326:
1317:
1306:
1270:
1254:; call these
1241:
1240:
1229:
1227:
1223:
1183:
1175:
1171:
1166:
1161:
1151:
1149:
1145:
1140:
1138:
1135:
1129:
1114:
1112:
1108:
1100:
1097:
1093:
1089:
1088:
1087:
1085:
1081:
1077:
1072:
1070:
1065:
1061:
1055:
1052:
1041:
1039:
1035:
1031:
1030:Minitab, Inc.
1027:
1023:
1019:
1014:
1012:
1008:
1003:
1001:
997:
992:
988:
984:
980:
976:
972:
961:
956:
954:
949:
947:
942:
941:
939:
938:
931:
928:
924:
921:
920:
919:
916:
914:
911:
910:
904:
903:
896:
893:
891:
888:
886:
883:
881:
878:
876:
873:
871:
868:
866:
863:
862:
856:
855:
848:
845:
843:
840:
838:
835:
833:
830:
828:
825:
823:
820:
818:
815:
813:
810:
809:
803:
802:
795:
792:
790:
787:
785:
782:
780:
777:
776:
770:
769:
762:
759:
757:
754:
752:
751:Crowdsourcing
749:
747:
744:
743:
737:
736:
727:
724:
723:
722:
719:
717:
714:
712:
709:
707:
704:
703:
700:
695:
694:
686:
683:
681:
680:Memtransistor
678:
676:
673:
671:
668:
664:
661:
660:
659:
656:
654:
651:
647:
644:
642:
639:
637:
634:
632:
629:
628:
627:
624:
622:
619:
617:
614:
612:
609:
605:
602:
601:
600:
597:
593:
590:
588:
585:
583:
580:
578:
575:
574:
573:
570:
568:
565:
563:
562:Deep learning
560:
558:
555:
554:
551:
546:
545:
538:
535:
533:
530:
528:
526:
522:
520:
517:
516:
513:
508:
507:
498:
497:Hidden Markov
495:
493:
490:
488:
485:
484:
483:
480:
479:
476:
471:
470:
463:
460:
458:
455:
453:
450:
448:
445:
443:
440:
438:
435:
433:
430:
428:
425:
423:
420:
419:
416:
411:
410:
403:
400:
398:
395:
393:
389:
387:
384:
382:
379:
377:
375:
371:
369:
366:
364:
361:
359:
356:
355:
352:
347:
346:
339:
336:
334:
331:
329:
326:
324:
321:
319:
316:
314:
311:
309:
306:
304:
302:
298:
294:
293:Random forest
291:
289:
286:
284:
281:
280:
279:
276:
274:
271:
269:
266:
265:
258:
257:
252:
251:
243:
237:
236:
229:
226:
224:
221:
219:
216:
214:
211:
209:
206:
204:
201:
199:
196:
194:
191:
189:
186:
184:
181:
179:
178:Data cleaning
176:
174:
171:
169:
166:
164:
161:
159:
156:
154:
151:
149:
146:
144:
141:
140:
134:
133:
126:
123:
121:
118:
116:
113:
111:
108:
106:
103:
101:
98:
96:
93:
91:
90:Meta-learning
88:
86:
83:
81:
78:
76:
73:
71:
68:
66:
63:
62:
56:
55:
52:
47:
44:
43:
39:
38:
30:
19:
10163:
10159:
10124:
10105:
10104:
10103:profile for
10100:
10071:
10064:
10045:
10041:
10031:
10006:
10002:
9992:
9949:
9919:
9884:
9880:
9874:
9853:cite journal
9841:Breiman, Leo
9835:
9777:(1): 85–91.
9774:
9770:
9730:
9724:
9707:
9703:
9697:
9664:
9660:
9614:
9610:
9600:
9557:
9553:
9547:
9528:
9507:
9487:
9480:
9452:
9445:
9434:. Retrieved
9431:explained.ai
9430:
9391:
9387:
9377:
9334:
9330:
9324:
9297:
9293:
9280:
9255:
9251:
9245:
9218:
9214:
9204:
9194:
9187:
9152:
9148:
9138:
9128:
9119:
9110:
9101:
9092:
9083:
9064:
9060:
9047:
9036:. Retrieved
9029:the original
9008:
9004:
8991:
8981:
8974:
8955:
8949:
8939:
8931:
8927:
8922:
8911:. Retrieved
8904:the original
8873:
8867:
8833:
8824:
8812:. Retrieved
8769:
8763:
8732:the original
8701:
8697:
8654:
8648:
8624:the original
8593:
8587:
8551:
8495:
8491:
8459:. Retrieved
8452:the original
8443:
8368:models, and
8359:
8095:
7800:
7708:. Moreover,
7590:Assume that
7589:
7349:
6928:
6904:
6505:
6501:
6016:
6013:Uniform KeRF
5554:
4501:
4335:and dataset
3389:
3380:
3323:
3298:
3282:
3266:
3257:
3193:
2912:
2656:
2448:
2435:
2419:
2416:
1942:
1919:
1915:
1912:
1849:
1755:
1746:randomForest
1745:
1738:
1713:
1638:
1626:
1622:
1620:
1608:
1592:
1576:
1544:
1530:
1406:
1398:
1395:
1307:
1296:
1221:
1179:
1173:
1169:
1141:
1136:
1131:
1104:
1073:
1056:
1047:
1022:Adele Cutler
1015:
1004:
1000:training set
974:
970:
969:
837:PAC learning
524:
373:
368:Hierarchical
300:
292:
254:
248:
10009:: 124–138.
9300:: 483–501.
8772:(1): 5–32.
8189:such that,
3300:Leo Breiman
2829:In a tree,
1111:correlation
1076:Leo Breiman
1018:Leo Breiman
996:overfitting
981:method for
721:Multi-agent
658:Transformer
557:Autoencoder
313:Naive Bayes
51:data mining
29:Random tree
10235:Categories
10055:2003.11132
9826:1502.03836
9436:2023-10-25
9344:1512.03444
9038:2015-11-13
8913:2008-04-01
8422:References
8366:rule-based
8096:Providing
7801:Providing
6493:Properties
2020:node
1730:Properties
1617:ExtraTrees
1597:features,
1235:= 1, ...,
1009:using the
1007:Tin Kam Ho
987:regression
706:Q-learning
604:Restricted
402:Mean shift
351:Clustering
328:Perceptron
256:regression
158:Clustering
153:Regression
10075:(Thesis).
10023:216444882
9983:1407.3939
9954:CiteSeerX
9931:1402.4293
9889:CiteSeerX
9689:233550030
9681:1076-0342
9562:CiteSeerX
9533:CiteSeerX
9302:CiteSeerX
9272:216485629
8878:CiteSeerX
8756:Breiman L
8706:CiteSeerX
8620:206795835
8598:CiteSeerX
8512:206420153
8370:attention
8327:
8310:
8281:−
8270:≤
8243:−
8212:~
8151:∞
8148:→
8110:∞
8107:→
8063:
8046:
8020:−
8002:≤
7975:−
7944:~
7856:∞
7853:→
7815:∞
7812:→
7791:Lipschitz
7696:∞
7684:σ
7641:ε
7621:ε
7547:≤
7541:≤
7519:ε
7490:∞
7483:~
7450:−
7434:≤
7407:∞
7400:~
7390:−
7368:∞
7318:ε
7314:−
7308:≥
7288:∣
7275:≤
7266:Θ
7239:
7234:Θ
7226:≤
7210:
7169:ε
7165:−
7159:≥
7139:∣
7126:≤
7120:Θ
7096:≤
7080:
7050:≥
7041:Θ
7014:
6941:ε
6856:~
6823:−
6807:≤
6773:~
6763:−
6698:≤
6688:Θ
6648:∑
6634:≤
6606:≤
6600:Θ
6576:≤
6453:∈
6394:
6388:−
6372:−
6347:∑
6323:−
6295:∏
6250:…
6188:∑
6168:…
6151:∑
6081:Θ
6074:…
6062:Θ
6032:~
5973:∈
5946:⌉
5916:⌈
5910:⌉
5880:⌈
5853:∏
5808:⋯
5746:∑
5726:…
5709:∑
5639:Θ
5632:…
5620:Θ
5590:~
5527:ℓ
5477:ℓ
5473:∑
5398:∑
5379:Θ
5372:…
5360:Θ
5330:~
5248:Θ
5223:∈
5191:∑
5039:Θ
5014:∈
4965:∑
4944:∑
4928:Θ
4886:∑
4864:Θ
4857:…
4845:Θ
4815:~
4771:Θ
4733:Θ
4708:∈
4656:∑
4630:∑
4604:Θ
4597:…
4585:Θ
4515:∈
4476:Θ
4451:∈
4412:∑
4396:Θ
4317:Θ
4265:Θ
4211:Θ
4173:Θ
4148:∈
4096:∑
4053:Θ
4011:∑
3985:Θ
3978:…
3966:Θ
3886:Θ
3858:Θ
3850:…
3837:Θ
3763:Θ
3684:∣
3675:
3593:∞
3571:
3509:×
3356:∈
3107:∑
3071:∑
2999:∑
2978:∑
2958:^
2582:∑
2572:^
2535:^
2319:Δ
2093:Δ
2027:∈
2015:∑
1987:∑
1507:−
1486:^
1477:−
1430:∑
1419:σ
1341:∑
1321:^
1117:Algorithm
1026:trademark
998:to their
865:ECML PKDD
847:VC theory
794:ROC curve
726:Self-play
646:DeepDream
487:Bayes net
278:Ensembles
59:Paradigms
10200:29440440
9843:(2000).
9799:13195700
9791:24012917
9633:15529185
9584:27594168
9410:21576180
9361:28114007
9237:20385727
9179:26903687
9067:: 3–42.
8900:12470146
8859:(1997).
8855:Amit Y,
8814:15 March
8758:(2001).
8549:(2008).
8382:Boosting
8376:See also
7633:, where
4235:, where
3560:, where
3263:Variants
3157:′
3049:′
2879:′
2860:′
2782:′
2688:′
2625:′
2377:at node
1744:package
1584:features
1470:′
1409:x′
1401:variance
1378:′
1064:subspace
288:Boosting
137:Problems
10191:5828645
10168:Bibcode
10097:Scholia
9952:(670).
9911:2469856
9369:5381516
9170:4760114
9025:7415435
8857:Geman D
8774:Bibcode
8728:3563126
8673:1425956
3807:by the
3295:History
2402:entropy
1633:or the
1627:optimal
1599:√
1213:, ...,
1195:, ...,
1154:Bagging
1144:overfit
1084:bagging
1051:feature
1044:History
1034:bagging
870:NeurIPS
687:(ECRAM)
641:AlexNet
283:Bagging
10198:
10188:
10143:
10099:has a
10021:
9956:
9909:
9891:
9797:
9789:
9745:
9687:
9679:
9631:
9592:245216
9590:
9582:
9564:
9535:
9495:
9468:
9408:
9367:
9359:
9304:
9270:
9235:
9177:
9167:
9023:
8898:
8880:
8726:
8708:
8671:
8618:
8600:
8559:
8510:
8461:5 June
4366:, and
3308:i.i.d.
2657:Here,
2134:where
1916:et al.
1639:random
1137:et al.
1134:Hastie
1090:Using
977:is an
663:Vision
519:RANSAC
397:OPTICS
392:DBSCAN
376:-means
183:AutoML
10101:topic
10050:arXiv
10019:S2CID
9978:arXiv
9926:arXiv
9907:S2CID
9821:arXiv
9795:S2CID
9685:S2CID
9588:S2CID
9580:JSTOR
9365:S2CID
9339:arXiv
9290:(PDF)
9268:S2CID
9057:(PDF)
9032:(PDF)
9021:S2CID
9001:(PDF)
8907:(PDF)
8896:S2CID
8864:(PDF)
8809:(PDF)
8735:(PDF)
8724:S2CID
8694:(PDF)
8627:(PDF)
8616:S2CID
8584:(PDF)
8508:S2CID
8488:(PDF)
8455:(PDF)
8448:(PDF)
1637:), a
1038:Geman
885:IJCAI
711:SARSA
670:Mamba
636:LeNet
631:U-Net
457:t-SNE
381:Fuzzy
358:BIRCH
10196:PMID
10141:ISBN
9866:help
9787:PMID
9743:ISBN
9677:ISSN
9629:PMID
9493:ISBN
9466:ISBN
9406:PMID
9357:PMID
9233:PMID
9175:PMID
8816:2013
8557:ISBN
8463:2016
8174:>
8122:and
7886:>
7827:and
7769:and
7693:<
5293:and
3590:<
3271:and
1231:For
1080:CART
1060:tree
1020:and
895:JMLR
880:ICLR
875:ICML
761:RLHF
577:LSTM
363:CURE
49:and
10186:PMC
10176:doi
10164:115
10133:doi
10046:119
10011:doi
9899:doi
9885:101
9779:doi
9775:220
9735:doi
9712:doi
9669:doi
9619:doi
9572:doi
9458:doi
9396:doi
9349:doi
9312:doi
9260:doi
9256:146
9223:doi
9165:PMC
9157:doi
9153:110
9069:doi
9013:doi
8960:doi
8930:In
8888:doi
8782:doi
8716:doi
8659:doi
8608:doi
8500:doi
8324:log
8307:log
8060:log
8043:log
7789:is
7534:max
3479:of
2894:if
2811:if
2747:In
1278:on
973:or
621:SOM
611:GAN
587:ESN
582:GRU
527:-NN
462:SDL
452:PGD
447:PCA
442:NMF
437:LDA
432:ICA
427:CCA
303:-NN
10237::
10194:.
10184:.
10174:.
10162:.
10158:.
10139:.
10127:.
10123:.
10044:.
10040:.
10017:.
10007:61
10005:.
10001:.
9968:^
9940:^
9905:.
9897:.
9883:.
9857::
9855:}}
9851:{{
9807:^
9793:.
9785:.
9773:.
9769:.
9757:^
9741:.
9708:34
9706:.
9683:.
9675:.
9665:27
9663:.
9659:.
9641:^
9627:.
9615:18
9613:.
9609:.
9586:.
9578:.
9570:.
9558:15
9556:.
9519:^
9464:.
9429:.
9418:^
9404:.
9392:27
9390:.
9386:.
9363:.
9355:.
9347:.
9335:39
9333:.
9310:.
9298:52
9296:.
9292:.
9266:.
9254:.
9231:.
9219:26
9217:.
9213:.
9173:.
9163:.
9151:.
9147:.
9065:63
9063:.
9059:.
9019:.
9007:.
9003:.
8956:40
8954:.
8948:.
8894:.
8886:.
8872:.
8866:.
8847:^
8796:^
8780:.
8770:45
8768:.
8762:.
8743:^
8722:.
8714:.
8702:22
8700:.
8696:.
8681:^
8669:MR
8667:.
8655:24
8653:.
8647:.
8635:^
8614:.
8606:.
8592:.
8586:.
8571:^
8545:;
8541:;
8520:^
8506:.
8496:20
8494:.
8490:.
8471:^
8429:^
8352:.
8088:.
7921:,
6391:ln
4549:,
4499:.
3247:x'
3196:x'
2907:x'
2903:k'
2824:x'
2714:x'
2710:x'
2558::
2552:x'
2311:,
2228:,
1748:.
1611:/3
1411::
1305::
1303:x'
1299:x'
1285:,
1261:,
1250:,
1239::
1206:=
1188:=
1113:.
1071:.
1002:.
985:,
890:ML
10227:)
10225:R
10202:.
10178::
10170::
10149:.
10135::
10111:.
10058:.
10052::
10025:.
10013::
9986:.
9980::
9962:.
9934:.
9928::
9913:.
9901::
9868:)
9864:(
9829:.
9823::
9801:.
9781::
9751:.
9737::
9718:.
9714::
9691:.
9671::
9635:.
9621::
9594:.
9574::
9541:.
9501:.
9474:.
9460::
9439:.
9412:.
9398::
9371:.
9351::
9341::
9318:.
9314::
9274:.
9262::
9239:.
9225::
9181:.
9159::
9077:.
9071::
9041:.
9015::
9009:5
8968:.
8962::
8916:.
8890::
8874:9
8841:.
8818:.
8790:.
8784::
8776::
8718::
8675:.
8661::
8610::
8594:1
8565:.
8514:.
8502::
8465:.
8338:2
8334:)
8330:n
8321:(
8316:)
8313:2
8304:d
8301:3
8298:+
8295:6
8292:(
8288:/
8284:2
8277:n
8273:C
8265:2
8261:]
8257:)
8253:X
8249:(
8246:m
8240:)
8236:X
8232:(
8227:f
8224:u
8219:n
8209:m
8202:[
8198:E
8177:0
8171:C
8143:k
8139:2
8134:/
8130:n
8104:k
8074:2
8070:)
8066:n
8057:(
8052:)
8049:2
8040:d
8037:+
8034:3
8031:(
8027:/
8023:1
8016:n
8010:1
8006:C
7997:2
7993:]
7989:)
7985:X
7981:(
7978:m
7972:)
7968:X
7964:(
7959:c
7956:c
7951:n
7941:m
7934:[
7930:E
7909:n
7889:0
7881:1
7877:C
7848:k
7844:2
7839:/
7835:n
7809:k
7777:m
7755:d
7751:]
7747:1
7744:,
7741:0
7738:[
7717:X
7688:2
7662:X
7618:+
7615:)
7611:X
7607:(
7604:m
7601:=
7598:Y
7570:.
7566:)
7560:i
7556:Y
7550:n
7544:i
7538:1
7529:(
7523:n
7515:n
7512:+
7509:)
7505:x
7501:(
7496:n
7493:,
7480:m
7469:n
7465:a
7458:n
7454:a
7445:n
7441:b
7430:|
7426:)
7422:x
7418:(
7413:n
7410:,
7397:m
7387:)
7383:x
7379:(
7374:n
7371:,
7364:m
7359:|
7335:,
7332:2
7328:/
7322:n
7311:1
7305:]
7300:n
7294:D
7283:n
7279:b
7272:]
7269:)
7263:,
7259:x
7255:(
7250:n
7246:N
7242:[
7230:E
7221:n
7217:a
7213:[
7207:P
7186:,
7183:2
7179:/
7173:n
7162:1
7156:]
7151:n
7145:D
7134:n
7130:b
7123:)
7117:,
7113:x
7109:(
7104:n
7100:N
7091:n
7087:a
7083:[
7077:P
7056:,
7053:1
7047:]
7044:)
7038:,
7034:x
7030:(
7025:n
7021:N
7017:[
7011:E
6988:)
6983:n
6979:b
6975:(
6972:,
6969:)
6964:n
6960:a
6956:(
6953:,
6950:)
6945:n
6937:(
6913:M
6885:.
6882:)
6878:x
6874:(
6869:n
6866:,
6863:M
6853:m
6842:n
6838:a
6831:n
6827:a
6818:n
6814:b
6803:|
6799:)
6795:x
6791:(
6786:n
6783:,
6780:M
6770:m
6760:)
6756:x
6752:(
6747:n
6744:,
6741:M
6737:m
6732:|
6711:.
6706:n
6702:b
6692:m
6684:,
6680:x
6673:n
6669:N
6663:M
6658:1
6655:=
6652:m
6642:M
6639:1
6629:n
6625:a
6614:n
6610:b
6603:)
6597:,
6593:x
6589:(
6584:n
6580:N
6571:n
6567:a
6546:)
6541:n
6537:b
6533:(
6530:,
6527:)
6522:n
6518:a
6514:(
6478:.
6473:d
6469:]
6465:1
6462:,
6459:0
6456:[
6449:x
6439:)
6432:!
6429:j
6423:j
6418:)
6413:|
6407:m
6403:x
6398:|
6384:(
6375:1
6367:m
6363:k
6357:0
6354:=
6351:j
6342:|
6336:m
6332:x
6327:|
6320:1
6316:(
6310:d
6305:1
6302:=
6299:m
6289:k
6284:)
6279:d
6276:1
6271:(
6263:!
6258:d
6254:k
6247:!
6242:1
6238:k
6232:!
6229:k
6221:k
6218:=
6213:j
6209:k
6203:d
6198:1
6195:=
6192:j
6184:,
6179:d
6175:k
6171:,
6165:,
6160:1
6156:k
6147:=
6144:)
6140:x
6136:,
6132:0
6128:(
6123:f
6120:u
6115:k
6111:K
6090:)
6085:M
6077:,
6071:,
6066:1
6058:,
6054:x
6050:(
6045:n
6042:,
6039:M
6029:m
5998:.
5993:d
5989:]
5985:1
5982:,
5979:0
5976:[
5969:z
5965:,
5961:x
5951:,
5941:j
5937:z
5929:j
5925:k
5920:2
5913:=
5905:j
5901:x
5893:j
5889:k
5884:2
5875:1
5868:d
5863:1
5860:=
5857:j
5847:k
5842:)
5837:d
5834:1
5829:(
5821:!
5816:d
5812:k
5805:!
5800:1
5796:k
5790:!
5787:k
5779:k
5776:=
5771:j
5767:k
5761:d
5756:1
5753:=
5750:j
5742:,
5737:d
5733:k
5729:,
5723:,
5718:1
5714:k
5705:=
5702:)
5698:z
5694:,
5690:x
5686:(
5681:c
5678:c
5673:k
5669:K
5648:)
5643:M
5635:,
5629:,
5624:1
5616:,
5612:x
5608:(
5603:n
5600:,
5597:M
5587:m
5563:k
5532:)
5522:x
5517:,
5513:x
5509:(
5504:n
5501:,
5498:M
5494:K
5488:n
5483:1
5480:=
5467:)
5462:i
5457:x
5452:,
5448:x
5444:(
5439:n
5436:,
5433:M
5429:K
5423:i
5419:Y
5413:n
5408:1
5405:=
5402:i
5391:=
5388:)
5383:M
5375:,
5369:,
5364:1
5356:,
5352:x
5348:(
5343:n
5340:,
5337:M
5327:m
5302:z
5280:x
5257:)
5252:j
5244:,
5240:x
5236:(
5231:n
5227:A
5219:z
5213:1
5206:M
5201:1
5198:=
5195:j
5185:M
5182:1
5177:=
5174:)
5170:z
5166:,
5162:x
5158:(
5153:n
5150:,
5147:M
5143:K
5122:M
5101:x
5078:i
5074:Y
5053:,
5048:)
5043:j
5035:,
5031:x
5027:(
5022:n
5018:A
5009:i
5004:X
4997:1
4990:i
4986:Y
4980:n
4975:1
4972:=
4969:i
4959:M
4954:1
4951:=
4948:j
4937:)
4932:j
4924:,
4920:x
4916:(
4911:n
4907:N
4901:M
4896:1
4893:=
4890:j
4881:1
4876:=
4873:)
4868:M
4860:,
4854:,
4849:1
4841:,
4837:x
4833:(
4828:n
4825:,
4822:M
4812:m
4787:)
4780:)
4775:j
4767:,
4763:x
4759:(
4754:n
4750:N
4742:)
4737:j
4729:,
4725:x
4721:(
4716:n
4712:A
4703:i
4698:X
4691:1
4684:i
4680:Y
4671:n
4666:1
4663:=
4660:i
4651:(
4645:M
4640:1
4637:=
4634:j
4624:M
4621:1
4616:=
4613:)
4608:M
4600:,
4594:,
4589:1
4581:,
4577:x
4573:(
4568:n
4565:,
4562:M
4558:m
4535:d
4531:]
4527:1
4524:,
4521:0
4518:[
4511:x
4485:)
4480:j
4472:,
4468:x
4464:(
4459:n
4455:A
4446:i
4441:X
4434:1
4427:n
4422:1
4419:=
4416:i
4408:=
4405:)
4400:j
4392:,
4388:x
4384:(
4379:n
4375:N
4352:n
4346:D
4321:j
4295:x
4274:)
4269:j
4261:,
4257:x
4253:(
4248:n
4244:A
4220:)
4215:j
4207:,
4203:x
4199:(
4194:n
4190:N
4182:)
4177:j
4169:,
4165:x
4161:(
4156:n
4152:A
4143:i
4138:X
4131:1
4124:i
4120:Y
4111:n
4106:1
4103:=
4100:i
4092:=
4087:n
4083:m
4062:)
4057:j
4049:,
4045:x
4041:(
4036:n
4032:m
4026:M
4021:1
4018:=
4015:j
4005:M
4002:1
3997:=
3994:)
3989:M
3981:,
3975:,
3970:1
3962:,
3958:x
3954:(
3949:n
3946:,
3943:M
3939:m
3916:n
3910:D
3863:M
3853:,
3847:,
3842:1
3815:j
3794:x
3773:)
3768:j
3758:,
3754:x
3750:(
3745:n
3741:m
3720:M
3700:]
3696:x
3692:=
3688:X
3681:Y
3678:[
3672:E
3669:=
3666:)
3662:x
3658:(
3655:m
3634:X
3613:Y
3587:]
3582:2
3578:Y
3574:[
3568:E
3548:)
3545:Y
3542:,
3538:X
3534:(
3513:R
3504:p
3500:]
3496:1
3493:,
3490:0
3487:[
3465:n
3460:1
3457:=
3454:i
3450:}
3446:)
3441:i
3437:Y
3433:,
3428:i
3423:X
3418:(
3415:{
3412:=
3407:n
3401:D
3360:N
3353:k
3333:k
3233:j
3211:i
3207:x
3180:.
3175:i
3171:y
3165:)
3161:)
3154:x
3150:,
3145:i
3141:x
3137:(
3132:j
3128:W
3122:m
3117:1
3114:=
3111:j
3101:m
3098:1
3092:(
3086:n
3081:1
3078:=
3075:i
3067:=
3062:i
3058:y
3053:)
3046:x
3042:,
3037:i
3033:x
3029:(
3024:j
3020:W
3014:n
3009:1
3006:=
3003:i
2993:m
2988:1
2985:=
2982:j
2972:m
2969:1
2964:=
2955:y
2930:j
2926:W
2915:m
2898:i
2896:x
2876:k
2872:1
2867:=
2864:)
2857:x
2853:,
2848:i
2844:x
2840:(
2837:W
2820:k
2815:i
2813:x
2797:k
2794:1
2789:=
2786:)
2779:x
2775:,
2770:i
2766:x
2762:(
2759:W
2749:k
2729:i
2725:x
2706:i
2692:)
2685:x
2681:,
2676:i
2672:x
2668:(
2665:W
2643:.
2638:i
2634:y
2629:)
2622:x
2618:,
2613:i
2609:x
2605:(
2602:W
2597:n
2592:1
2589:=
2586:i
2578:=
2569:y
2556:W
2532:y
2507:n
2502:1
2499:=
2496:i
2492:}
2488:)
2483:i
2479:y
2475:,
2470:i
2466:x
2462:(
2459:{
2445:k
2443:(
2439:k
2385:j
2365:t
2345:)
2342:j
2339:(
2332:i
2328:T
2323:i
2299:j
2277:n
2272:j
2268:n
2262:=
2259:)
2256:j
2253:(
2246:i
2242:T
2237:p
2216:i
2194:i
2190:T
2167:T
2163:n
2142:x
2122:,
2119:)
2116:j
2113:(
2106:i
2102:T
2097:i
2090:)
2087:j
2084:(
2077:i
2073:T
2068:p
2062:x
2059:=
2056:)
2053:j
2050:(
2041:|
2035:i
2031:T
2024:j
2007:T
2003:n
1997:1
1994:=
1991:i
1979:T
1975:n
1971:1
1966:=
1963:)
1960:x
1957:(
1898:j
1878:j
1858:j
1829:n
1824:1
1821:=
1818:i
1814:}
1810:)
1805:i
1801:Y
1797:,
1792:i
1788:X
1784:(
1781:{
1778:=
1773:n
1767:D
1742:R
1694:p
1674:p
1652:p
1609:p
1602:p
1595:p
1588:B
1560:i
1558:x
1553:i
1551:x
1537:B
1533:B
1517:.
1510:1
1504:B
1497:2
1493:)
1483:f
1474:)
1467:x
1463:(
1458:b
1454:f
1450:(
1445:B
1440:1
1437:=
1434:b
1422:=
1382:)
1375:x
1371:(
1366:b
1362:f
1356:B
1351:1
1348:=
1345:b
1335:B
1332:1
1327:=
1318:f
1292:.
1289:b
1287:Y
1282:b
1280:X
1275:b
1273:f
1268:.
1265:b
1263:Y
1258:b
1256:X
1252:Y
1248:X
1244:n
1237:B
1233:b
1222:B
1217:n
1215:y
1210:1
1208:y
1204:Y
1199:n
1197:x
1192:1
1190:x
1186:X
1174:n
1170:n
1098:.
959:e
952:t
945:v
525:k
374:k
301:k
259:)
247:(
31:.
20:)
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.