
Deep learning


It was believed that pre-training DNNs using generative models of deep belief nets (DBN) would overcome the main difficulties of neural nets. However, it was discovered that replacing pre-training with large amounts of training data for straightforward backpropagation, when using DNNs with large, context-dependent output layers, produced error rates dramatically lower than the then-state-of-the-art Gaussian mixture model (GMM)/hidden Markov model (HMM) systems, and also lower than those of more advanced generative model-based systems. The nature of the recognition errors produced by the two types of systems was characteristically different, offering technical insights into how to integrate deep learning into the existing highly efficient, run-time speech decoding systems deployed by all major speech recognition systems. Analysis around 2009-2010, contrasting the GMM (and other generative speech models) with DNN models, stimulated early industrial investment in deep learning for speech recognition. That analysis was done with comparable performance (less than 1.5% difference in error rate) between discriminative DNNs and generative models. In 2010, researchers extended deep learning from TIMIT to large-vocabulary speech recognition, by adopting large output layers of the DNN based on context-dependent HMM states constructed by decision trees.
The need for training data does not stop once an ANN is trained. Rather, there is a continued demand for human-generated verification data to constantly calibrate and update the ANN. For this purpose, Facebook introduced the feature that once a user is automatically recognized in an image, they receive a notification. They can choose whether or not they would like to be publicly labeled on the image, or tell Facebook that it is not them in the picture. This user interface is a mechanism to generate "a constant stream of verification data" to further train the network in real time. As Mühlhoff argues, the involvement of human users to generate training and verification data is so typical for most commercial end-user applications of deep learning that such systems may be referred to as "human-aided artificial intelligence".

Frank Rosenblatt (1958) proposed the perceptron, an MLP with three layers: an input layer, a hidden layer with randomized weights that did not learn, and an output layer. He later published a 1962 book that also introduced variants and computer experiments, including a version with four-layer perceptrons "with adaptive preterminal networks" where the last two layers have learned weights (here he credits H. D. Block and B. W. Knight). The book cites an earlier network by R. D. Joseph (1960) "functionally equivalent to a variation of" this four-layer system (the book mentions Joseph over 30 times). Should Joseph therefore be considered the originator of proper adaptive multilayer perceptrons with learning hidden units? Unfortunately, the learning algorithm was not a functional one, and fell into oblivion.

For recurrent neural networks, in which a signal may propagate through a layer more than once, the CAP depth is potentially unlimited. No universally agreed-upon threshold of depth divides shallow learning from deep learning, but most researchers agree that deep learning involves CAP depth higher than two. CAP of depth two has been shown to be a universal approximator in the sense that it can emulate any function; beyond that, more layers do not add to the function-approximating ability of the network. Deep models (CAP > two) are nonetheless able to extract better features than shallow models, and hence extra layers help in learning features effectively.

The deep backward stochastic differential equation (deep BSDE) method combines deep learning with backward stochastic differential equations and is particularly useful for solving high-dimensional problems in financial mathematics. By leveraging the powerful function approximation capabilities of deep neural networks, deep BSDE addresses the computational challenges faced by traditional numerical methods in high-dimensional settings. Specifically, traditional methods such as finite difference methods or Monte Carlo simulations often struggle with the curse of dimensionality, where computational cost increases exponentially with the number of dimensions. Deep BSDE methods instead employ deep neural networks to approximate solutions of high-dimensional partial differential equations (PDEs), effectively reducing the computational burden.

Incorporating physics-informed neural networks (PINNs) into the deep BSDE framework enhances its capability by embedding the underlying physical laws directly into the neural network architecture. This ensures that the solutions not only fit the data but also adhere to the governing stochastic differential equations. PINNs leverage the power of deep learning while respecting the constraints imposed by the physical models, resulting in more accurate and reliable solutions for financial mathematics problems.

Researchers at the University of Texas at Austin (UT) developed a machine learning framework called Training an Agent Manually via Evaluative Reinforcement, or TAMER, which proposed new methods for robots or computer programs to learn how to perform tasks by interacting with a human instructor. Building on TAMER, a new algorithm called Deep TAMER was later introduced in 2018 during a collaboration between the U.S. Army Research Laboratory (ARL) and UT researchers. Deep TAMER used deep learning to provide a robot with the ability to learn new tasks through observation. Using Deep TAMER, a robot learned a task with a human trainer, watching video streams or observing a human perform a task in person. The robot later practiced the task with the help of some coaching from the trainer, who provided feedback such as "good job" and "bad job".

Boltzmann machines were designed for unsupervised learning of deep generative models, but they were more computationally expensive than backpropagation. The Boltzmann machine learning algorithm, published in 1985, was briefly popular before being eclipsed by the backpropagation algorithm in 1986. A 1988 network became state of the art in protein structure prediction, an early application of deep learning to bioinformatics.

Long short-term memory (LSTM), published in 1995, can learn "very deep learning" tasks with long credit assignment paths that require memories of events that happened thousands of discrete time steps before. That LSTM was not yet the modern architecture, which required a "forget gate", introduced in 1999; the variant with forget gates became the standard RNN architecture.

Recurrent neural networks (RNN) were further developed in the 1980s. Recurrence is used for sequence processing, and when a recurrent network is unrolled, it mathematically resembles a deep feedforward layer. Consequently, RNNs have similar properties and issues as feedforward networks, and the developments of the two families had mutual influences. Two early influential RNN works were the Jordan network (1986) and the Elman network (1990).
In further reference to the idea that artistic sensitivity might be inherent in relatively low levels of the cognitive hierarchy, a published series of graphic representations of the internal states of deep (20-30 layers) neural networks attempting to discern, within essentially random data, the images on which they were trained demonstrate a visual appeal: the original research notice received well over 1,000 comments, and was the subject of what was for a time the most frequently accessed article on The Guardian's website.
Deep learning has been shown to produce competitive results in medical applications such as cancer cell classification, lesion detection, organ segmentation and image enhancement. Modern deep learning tools demonstrate high accuracy in detecting various diseases and the helpfulness of their use by specialists to improve diagnosis efficiency.
In 2023, Murray et al. developed a deep learning architecture capable of determining whether a defendant should be tried as a child or an adult. Their software was able to estimate subject age with significant accuracy. The same team has developed architectures capable of performing ante-mortem age determination from radiographic data as well.
A deep neural network (DNN) is an artificial neural network with multiple layers between the input and output layers. There are different types of neural networks, but they always consist of the same components: neurons, synapses, weights, biases, and functions. These components as a whole function similarly to the human brain, and can be trained like any other machine learning algorithm.
Artificial neural networks (ANNs) are computing systems inspired by the biological neural networks that constitute animal brains. Such systems learn (progressively improve their ability) to do tasks by considering examples, generally without task-specific programming. For example, in image recognition, they might learn to identify images that contain cats by analyzing example images that have been manually labeled as "cat" or "no cat" and using the analytic results to identify cats in other images. They have found most use in applications difficult to express with a traditional computer algorithm using rule-based programming.
A main criticism concerns the lack of theory surrounding some methods. Learning in the most common deep architectures is implemented using well-understood gradient descent. However, the theory surrounding other algorithms, such as contrastive divergence, is less clear (e.g., does it converge? If so, how fast? What is it approximating?).
Google's neural machine translation uses an example-based machine translation method in which the system "learns from millions of examples". It translates "whole sentences at a time, rather than pieces". Google Translate supports over one hundred languages. The network encodes the "semantics of the sentence rather than simply memorizing phrase-to-phrase translations". GT uses English as an intermediate between most language pairs.
As deep learning moves from the lab into the world, research and experience show that artificial neural networks are vulnerable to hacks and deception. By identifying patterns that these systems use to function, attackers can modify inputs to ANNs in such a way that the ANN finds a match that human observers would not recognize. For example, an attacker can make subtle changes to an image such that the ANN finds a match even though the image looks to a human nothing like the search target. Such manipulation is termed an "adversarial attack".
Although a systematic comparison between the human brain organization and the neuronal encoding in deep networks has not yet been established, several analogies have been reported. For example, the computations performed by deep learning units could be similar to those of actual neurons and neural populations. Similarly, the representations developed by deep learning models are similar to those measured in the primate visual system both at the single-unit and at the population levels.
Advances in both machine learning algorithms and computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer. By 2019, graphics processing units (GPUs), often with AI-specific enhancements, had displaced CPUs as the dominant method for training large-scale commercial cloud AI.
For example, a DNN that is trained to recognize dog breeds will go over the given image and calculate the probability that the dog in the image is a certain breed. The user can review the results and select which probabilities the network should display (above a certain threshold, etc.) and return the proposed label. Each mathematical manipulation as such is considered a layer, and complex DNNs have many layers, hence the name "deep" networks.
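The following minimal Python sketch (with hypothetical breed names and raw scores, not taken from any real model) illustrates how such per-class probabilities can be produced with a softmax and filtered by a user-chosen display threshold:

```python
import numpy as np

def softmax(scores):
    """Convert raw network outputs (logits) into probabilities that sum to 1."""
    exp = np.exp(scores - np.max(scores))  # subtract max for numerical stability
    return exp / exp.sum()

# Hypothetical final-layer outputs for four dog breeds (illustrative values only).
breeds = ["beagle", "boxer", "collie", "poodle"]
logits = np.array([2.1, 0.3, 1.4, -0.5])

probs = softmax(logits)
threshold = 0.20  # user-chosen display threshold

for breed, p in sorted(zip(breeds, probs), key=lambda pair: -pair[1]):
    if p >= threshold:
        print(f"{breed}: {p:.2f}")  # only sufficiently likely breeds are shown
```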
The first working deep learning algorithm was the Group method of data handling, a method to train arbitrarily deep neural networks, published by Alexey Ivakhnenko and Lapa in 1965. They regarded it as a form of polynomial regression, or a generalization of Rosenblatt's perceptron. A 1971 paper described a deep network with eight layers trained by this method, which is based on layer-by-layer training through regression analysis. Superfluous hidden units are pruned using a separate validation set. Since the activation functions of the nodes are Kolmogorov-Gabor polynomials, these were also the first deep networks with multiplicative units or "gates".
In 2016, researchers used one ANN to doctor images in trial-and-error fashion, identify another's focal points, and thereby generate images that deceived it. The modified images looked no different to human eyes. Another group showed that printouts of doctored images, subsequently photographed, successfully tricked an image classification system. One defense is reverse image search, in which a possible fake image is submitted to a site such as TinEye that can then find other instances of it. A refinement is to search using only parts of the image, to identify images from which that piece may have been taken.
Large-scale automatic speech recognition is the first and most convincing successful case of deep learning. LSTM RNNs can learn "very deep learning" tasks that involve multi-second intervals containing speech events separated by thousands of discrete time steps, where one time step corresponds to about 10 ms. LSTM with forget gates is competitive with traditional speech recognizers on certain tasks.
Atomically thin semiconductors are considered promising for energy-efficient deep learning hardware where the same basic device structure is used for both logic operations and data storage. In 2020, Marega et al. published experiments with a large-area active channel material for developing logic-in-memory devices and circuits based on floating-gate field-effect transistors (FGFETs).
The 2009 NIPS Workshop on Deep Learning for Speech Recognition was motivated by the limitations of deep generative models of speech, and by the possibility that, given more capable hardware and large-scale data sets, deep neural nets might become practical.
Traditional weather prediction systems solve a very complex system of partial differential equations. GraphCast is a deep learning-based model, trained on a long history of weather data, that predicts how weather patterns change over time. It is able to predict weather conditions for up to 10 days globally, at a very detailed level, and in under a minute, with precision similar to state-of-the-art systems.
Image reconstruction is the reconstruction of the underlying images from image-related measurements. Several works showed the superior performance of deep learning methods compared to analytical methods for various applications, e.g., spectral imaging and ultrasound imaging.
DNNs are typically feedforward networks in which data flows from the input layer to the output layer without looping back. At first, the DNN creates a map of virtual neurons and assigns random numerical values, or "weights", to the connections between them. The weights and inputs are multiplied and return an output between 0 and 1. If the network does not accurately recognize a particular pattern, an algorithm adjusts the weights. That way the algorithm can make certain parameters more influential, until it determines the correct mathematical manipulation to fully process the data.
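A minimal sketch of this process, assuming a tiny 3-input network with one hidden layer, sigmoid activations, and a single training example (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))  # squashes any input into (0, 1)

# Random initial "weights" on the connections of a 3-input, 4-hidden, 1-output net.
W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 1))

def forward(x):
    h = sigmoid(x @ W1)        # weights and inputs are multiplied and summed
    return h, sigmoid(h @ W2)  # final output lies between 0 and 1

x, target, lr = np.array([[0.5, -1.0, 2.0]]), 1.0, 0.5
for _ in range(100):
    h, y = forward(x)
    grad_out = (y - target) * y * (1 - y)        # error scaled by sigmoid slope
    gW2 = h.T @ grad_out                          # how to adjust output weights
    gW1 = x.T @ (grad_out @ W2.T * h * (1 - h))   # how to adjust hidden weights
    W1 -= lr * gW1                                # the adjustment step itself
    W2 -= lr * gW2
print(float(forward(x)[1]))  # approaches the target as the weights are adjusted
```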
Recommendation systems have used deep learning to extract meaningful features for a latent factor model for content-based music and journal recommendations. Multi-view deep learning has been applied for learning user preferences from multiple domains. The model uses a hybrid collaborative and content-based approach and enhances recommendations in multiple tasks.
As of 2017, neural networks typically had a few thousand to a few million units and millions of connections. Despite this number being several orders of magnitude less than the number of neurons in a human brain, these networks can perform many tasks at a level beyond that of humans (e.g., recognizing traffic signs or playing Go).
Although CNNs trained by backpropagation had been around for decades, and GPU implementations of NNs for years, including CNNs, faster implementations of CNNs on GPUs were needed to progress on computer vision. Later, as deep learning became widespread, specialized hardware and algorithm optimizations were developed specifically for deep learning.
Finding the appropriate mobile audience for mobile advertising is always challenging, since many data points must be considered and analyzed before a target segment can be created and used in ad serving by any ad server. Deep learning has been used to interpret large, many-dimensioned advertising datasets. Many data points are collected during the request/serve/click internet advertising cycle. This information can form the basis of machine learning to improve ad selection.
Dropout regularization randomly omits units from the hidden layers during training. This helps to exclude rare dependencies. Finally, data can be augmented via methods such as cropping and rotating, so that smaller training sets can be increased in size to reduce the chances of overfitting.
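A minimal sketch of both techniques, assuming NumPy and using inverted dropout plus a few fixed augmentation transforms for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5):
    """Randomly omit hidden units during training (inverted dropout)."""
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)  # rescale so the expected value is unchanged

def augment(image):
    """Enlarge a training set with cropped, rotated, and flipped variants."""
    h, w = image.shape
    crop = image[2:h - 2, 2:w - 2]  # crop (offsets fixed here for brevity)
    rot = np.rot90(image)           # 90-degree rotation
    flip = np.fliplr(image)         # horizontal flip, another common choice
    return [crop, rot, flip]

hidden = rng.random((4, 8))
print(dropout(hidden).shape)              # same shape, ~half the units zeroed
print(len(augment(np.zeros((28, 28)))))   # 3 extra variants per original image
```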
The impact of deep learning in industry began in the early 2000s, when CNNs already processed an estimated 10% to 20% of all the checks written in the US, according to Yann LeCun. Industrial applications of deep learning to large-scale speech recognition started around 2010.
2747:, neural networks employ a hierarchy of layered filters in which each layer considers information from a prior layer (or the operating environment), and then passes its output (and possibly the original input), to other layers. This process yields a self-organizing stack of 601:). The first representational layer may attempt to identify basic shapes such as lines and circles, the second layer may compose and encode arrangements of edges, the third layer may encode a nose and eyes, and the fourth layer may recognize that the image contains a face. 1537:
Subsequent run of the network on an input image (left): The network correctly detects the starfish. However, the weakly weighted association between ringed texture and sea urchin also confers a weak signal to the latter from one of two intermediate nodes. In addition, a shell that was not included in the training gives a weak signal for the oval shape, also resulting in a weak signal for the sea urchin output. These weak signals may result in a false positive result for sea urchin. In reality, textures and outlines would not be represented by single nodes, but rather by associated weight patterns of multiple nodes.
Some deep learning architectures display problematic behaviors, such as confidently classifying unrecognizable images as belonging to a familiar category of ordinary images (2014) and misclassifying minuscule perturbations of correctly classified images (2013).
The initial success in speech recognition was based on small-scale recognition tasks based on TIMIT. The data set contains 630 speakers from eight major dialects of American English, where each speaker reads 10 sentences. Its small size lets many configurations be tried. More importantly, the TIMIT task concerns phone-sequence recognition, which, unlike word-sequence recognition, allows weak phone bigram language models. This lets the strength of the acoustic modeling aspects of speech recognition be more easily analyzed. Error rates, including these early results and measured as percent phone error rate (PER), have been summarized since 1991.
Deep architectures include many variants of a few basic approaches. Each architecture has found success in specific domains. It is not always possible to compare the performance of multiple architectures, unless they have been evaluated on the same data sets.
Deep learning is closely related to a class of theories of brain development proposed by cognitive neuroscientists in the early 1990s. These developmental theories were instantiated in computational models, making them predecessors of deep learning systems. These developmental models share the property that various proposed learning dynamics in the brain (e.g., a wave of nerve growth factor) support self-organization somewhat analogous to the neural networks utilized in deep learning models.
In November 2023, researchers at Google DeepMind and Lawrence Berkeley National Laboratory announced that they had developed an AI system known as GNoME. This system has contributed to materials science by discovering over 2 million new materials within a relatively short timeframe. GNoME employs deep learning techniques to efficiently explore potential material structures, achieving a significant increase in the identification of stable inorganic crystal structures. The system's predictions were validated through autonomous robotic experiments, demonstrating a noteworthy success rate of 71%. The data of newly discovered materials is publicly available through the Materials Project database, offering researchers the opportunity to identify materials with desired properties for various applications. This development has implications for the future of scientific discovery and the integration of AI in materials science research, potentially expediting material innovation and reducing costs in product development. The use of AI and deep learning suggests the possibility of minimizing or eliminating manual lab experiments and allowing scientists to focus more on the design and analysis of unique compounds.
Typically, neurons are organized in layers. Different layers may perform different kinds of transformations on their inputs. Signals travel from the first (input), to the last (output) layer, possibly after traversing the layers multiple times.
Another commonly used evaluation set is the MNIST data set. MNIST is composed of handwritten digits and includes 60,000 training examples and 10,000 test examples. As with TIMIT, its small size lets users test multiple configurations. A comprehensive list of results on this set is available.
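Assuming a Python environment with TensorFlow installed (Keras bundles a copy of MNIST), the split described above can be checked directly:

```python
# Assumes the TensorFlow/Keras packages are installed; Keras ships MNIST directly.
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
print(x_train.shape)  # (60000, 28, 28): 60,000 training examples
print(x_test.shape)   # (10000, 28, 28): 10,000 test examples
```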
Deep learning algorithms can be applied to unsupervised learning tasks. This is an important benefit because unlabeled data are more abundant than labeled data. Examples of deep structures that can be trained in an unsupervised manner are deep belief networks.
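As an illustration of unsupervised training, the following sketch fits a tiny linear autoencoder (a deep-structure building block reduced here to one encode/decode pair) purely from unlabeled data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabeled data: 200 samples in 8 dimensions lying near a 2-D subspace.
basis = rng.normal(size=(2, 8))
data = rng.normal(size=(200, 2)) @ basis + 0.01 * rng.normal(size=(200, 8))

# Linear autoencoder: encode 8 -> 2, decode 2 -> 8. It is trained only to
# reconstruct its own input, so no labels are needed.
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))
lr = 0.01
for _ in range(500):
    code = data @ W_enc
    err = code @ W_dec - data                 # reconstruction error drives learning
    g_dec = code.T @ err / len(data)
    g_enc = data.T @ (err @ W_dec.T) / len(data)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(np.mean((data @ W_enc @ W_dec - data) ** 2))  # small after training
```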
Word embedding can be thought of as a representational layer in a deep learning architecture that transforms an atomic word into a positional representation of the word relative to other words in the dataset; the position is represented as a point in a vector space. Using word embedding as an RNN input layer allows the network to parse sentences and phrases using an effective compositional vector grammar. A compositional vector grammar can be thought of as a probabilistic context-free grammar (PCFG) implemented by an RNN. Recursive auto-encoders built atop word embeddings can assess sentence similarity and detect paraphrasing. Deep neural architectures provide the best results for constituency parsing, sentiment analysis, information retrieval, spoken language understanding, machine translation, contextual entity linking, writing style recognition, text classification, and others.
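A minimal sketch of the "position in a vector space" idea, using hypothetical 4-dimensional embeddings (invented values, not from a trained model) and cosine similarity:

```python
import numpy as np

# Toy embeddings: each word is a point in a vector space, and nearby points
# indicate related words. Values below are illustrative only.
emb = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.75, 0.70, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, near 0 for unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(emb["king"], emb["queen"]))  # high: nearby positions
print(cosine(emb["king"], emb["apple"]))  # low: distant positions
```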
In 2019, OpenAI estimated the hardware computation used in the largest deep learning projects from AlexNet (2012) to AlphaZero (2017) and found a 300,000-fold increase in the amount of computation required, with a doubling-time trendline of 3.4 months.
In 2017, graph neural networks were used for the first time to predict various properties of molecules in a large toxicology data set. In 2019, generative neural networks were used to produce molecules that were validated experimentally all the way into mice.
Deep learning-based image recognition has become "superhuman", producing more accurate results than human contestants. This first occurred in 2011 in recognition of traffic signs, and in 2014, with recognition of human faces.
Generative adversarial networks, based in part on the principle of artificial curiosity, became state of the art in generative modeling during the 2014-2018 period. Excellent image quality was achieved by Nvidia's StyleGAN (2018), based on the Progressive GAN by Tero Karras et al. Here the GAN generator is grown from small to large scale in a pyramidal fashion. Image generation by GAN reached popular success, and provoked discussions concerning deepfakes.
The original goal of the neural network approach was to solve problems in the same way that a human brain would. Over time, attention focused on matching specific mental abilities, leading to deviations from biology such as backpropagation, or passing information in the reverse direction and adjusting the network to reflect that information.
Deep learning-trained vehicles now interpret 360° camera views. Another example is Facial Dysmorphology Novel Analysis (FDNA) used to analyze cases of human malformation connected to a large database of genetic syndromes.
Each connection (synapse) between neurons can transmit a signal to another neuron. The receiving (postsynaptic) neuron can process the signal(s) and then signal downstream neurons connected to it. Neurons may have state, generally represented by real numbers, typically between 0 and 1. Neurons and synapses may also have a weight that varies as learning proceeds, which can increase or decrease the strength of the signal that they send downstream.
The classic universal approximation theorem concerns the capacity of feedforward neural networks with a single hidden layer of finite size to approximate continuous functions. The first proof concerned sigmoid activation functions and was generalised to feed-forward multi-layer architectures in 1991 by Kurt Hornik. Recent work also showed that universal approximation holds for non-bounded activation functions such as the rectified linear unit (ReLU).
Closely related to the progress that has been made in image recognition is the increasing application of deep learning techniques to various visual art tasks. DNNs have proven themselves capable, for example, of identifying the style period of a given painting; of neural style transfer, capturing the style of a given artwork and applying it in a visually pleasing manner to an arbitrary photograph or video; and of generating striking imagery based on random visual input fields.
In the deep learning approach, features are not hand-crafted; instead, the model discovers useful feature representations from the data automatically. This does not eliminate the need for hand-tuning; for example, varying numbers of layers and layer sizes can provide different degrees of abstraction.
A large percentage of candidate drugs fail to win regulatory approval. These failures are caused by insufficient efficacy (on-target effect), undesired interactions (off-target effects), or unanticipated toxic effects.
5276:". David E. Rumelhart, James L. McClelland, and the PDP research group. (editors), Parallel distributed processing: Explorations in the microstructure of cognition, Volume 1: Foundation. MIT Press, 1986. 2889:
on which they were trained demonstrate a visual appeal: the original research notice received well over 1,000 comments, and was the subject of what was for a time the most frequently accessed article on
The field takes inspiration from biological neuroscience and is centered around stacking artificial neurons into layers and "training" them to process data. The adjective "deep" refers to the use of multiple layers (ranging from three to several hundred or thousands) in the network. Methods used can be either supervised, semi-supervised or unsupervised.
An epigenetic clock is a biochemical test that can be used to measure age. Galkin et al. used deep neural networks to train an epigenetic aging clock of unprecedented accuracy using more than 6,000 blood samples. The clock uses information from 1,000 CpG sites.
Deep belief networks were developed for generative modeling. They are trained by training one restricted Boltzmann machine, then freezing it and training another one on top of the first one, and so on, then optionally fine-tuning using supervised backpropagation. They could model high-dimensional probability distributions, such as the distribution of MNIST images, but convergence was slow.
In 2011, a deep GPU-based CNN by Ciresan et al. achieved for the first time superhuman performance in a visual pattern recognition contest, outperforming traditional methods by a factor of 3. It then won more contests. Its authors also showed how max-pooling CNNs on GPU improved performance significantly.
The time delay neural network (TDNN) was introduced in 1987 by Alex Waibel to apply CNN to phoneme recognition. It used convolutions, weight sharing, and backpropagation. In 1988, Wei Zhang applied a backpropagation-trained CNN to alphabet recognition. In 1989, Yann LeCun et al. created a CNN for recognizing handwritten ZIP codes on mail.
Another group showed that certain sounds could make the Google Now voice command system open a particular web address, and hypothesized that this could "serve as a stepping stone for further attacks (e.g., opening a web page hosting drive-by malware)".
Backpropagation is an efficient application of the chain rule, derived by Gottfried Wilhelm Leibniz in 1673, to networks of differentiable nodes. The terminology "back-propagating errors" was actually introduced in 1962 by Rosenblatt, but he did not know how to implement it, although Henry J. Kelley had a continuous precursor of backpropagation in 1960 in the context of control theory.
Others point out that deep learning should be looked at as a step towards realizing strong AI, not as an all-encompassing solution. Despite the power of deep learning methods, they still lack much of the functionality needed to realize this goal entirely. Research psychologist Gary Marcus noted: "Realistically, deep learning is only part of the larger challenge of building intelligent machines. Such techniques lack ways of representing causal relationships (...) have no obvious ways of performing logical inferences, and they are also still a long way from integrating abstract knowledge, such as information about what objects are, what they are for, and how they are typically used. The most powerful A.I. systems, like Watson (...) use techniques like deep learning as just one element in a very complicated ensemble of techniques, ranging from the statistical technique of Bayesian inference to deductive reasoning."
In 2021, J. Feldmann et al. proposed an integrated photonic hardware accelerator for parallel convolutional processing. The authors identify two key advantages of integrated photonics over its electronic counterparts: (1) massively parallel data transfer through wavelength division multiplexing in conjunction with frequency combs, and (2) extremely high data modulation speeds. Their system can execute trillions of multiply-accumulate operations per second, indicating the potential of integrated photonics in data-heavy AI applications.
Cerebras Systems has also built a dedicated system to handle large deep learning models, the CS-2, based on the largest processor in the industry, the second-generation Wafer Scale Engine (WSE-2).
The principle of elevating "raw" features over hand-crafted optimization was first explored successfully in the architecture of deep autoencoders on the "raw" spectrogram or linear filter-bank features in the late 1990s, showing its superiority over the Mel-Cepstral features that contain stages of fixed transformation from spectrograms. The raw features of speech, waveforms, later produced excellent larger-scale results.
Goertzel hypothesized that these behaviors are due to limitations in their internal representations and that these limitations would inhibit integration into heterogeneous multi-component artificial general intelligence (AGI) architectures. These issues may possibly be addressed by deep learning architectures that internally form states homologous to image-grammar decompositions of observed entities and events.
Earlier machine learning approaches used hand-crafted feature engineering to transform the data into a more suitable representation for a classification algorithm to operate on.
The debut of DNNs for speaker recognition in the late 1990s and speech recognition around 2009-2011, and of LSTM around 2003-2007, accelerated progress in eight major areas.
A variety of approaches have been used to investigate the plausibility of deep learning models from a neurobiological perspective. On the one hand, several variants of the backpropagation algorithm have been proposed in order to increase its processing realism. Other researchers have argued that unsupervised forms of deep learning, such as those based on hierarchical generative models and deep belief networks, may be closer to biological reality. In this respect, generative neural network models have been related to neurobiological evidence about sampling-based processing in the cerebral cortex.
The word "deep" in "deep learning" refers to the number of layers through which the data is transformed. More precisely, deep learning systems have a substantial credit assignment path (CAP) depth. The CAP is the chain of transformations from input to output. CAPs describe potentially causal connections between input and output. For a feedforward neural network, the depth of the CAPs is that of the network, namely the number of hidden layers plus one (as the output layer is also parameterized).
DNNs can model complex non-linear relationships. DNN architectures generate compositional models where the object is expressed as a layered composition of primitives. The extra layers enable composition of features from lower layers, potentially modeling complex data with fewer units than a similarly performing shallow network. For instance, it was proved that sparse multivariate polynomials are exponentially easier to approximate with DNNs than with shallow networks.
Neural networks have been used for implementing language models since the early 2000s. LSTM helped to improve machine translation and language modeling.
A key advance for the deep learning revolution was hardware, especially GPUs. Some early work dated back to 2004. In 2009, Raina, Madhavan, and Andrew Ng reported a deep belief network with 100 million parameters trained on 30 Nvidia GeForce GTX 280 GPUs, an early demonstration of GPU-based deep learning. They reported up to 70 times faster training.
In 1948, Alan Turing produced work on "Intelligent Machinery" that was not published in his lifetime, containing "ideas related to artificial evolution and learning RNNs".
The universal approximation theorem for deep neural networks concerns the capacity of networks with bounded width where the depth is allowed to grow. Lu et al. proved that if the width of a deep neural network with ReLU activation is strictly larger than the input dimension, then the network can approximate any Lebesgue-integrable function; if the width is smaller than or equal to the input dimension, then a deep neural network is not a universal approximator.
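A compact statement of this width dichotomy (a paraphrase of Lu et al.; their sufficiency result uses width n + 4 for input dimension n) might be written as:

```latex
% Paraphrase of Lu et al.'s bounded-width result for ReLU networks
% (input dimension n; the width bound n + 4 is their sufficiency constant).
\text{For every } f \in L^{1}(\mathbb{R}^{n}) \text{ and } \varepsilon > 0,\;
\exists \text{ a ReLU network } F \text{ of width } \leq n + 4 \text{ with }
\int_{\mathbb{R}^{n}} \lvert f(x) - F(x) \rvert \, \mathrm{d}x < \varepsilon;
\text{ no such guarantee holds for networks of width } \leq n.
```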
Error rates on these benchmarks, as well as on a range of large-vocabulary speech recognition tasks, have steadily improved. Convolutional neural networks were superseded for ASR by LSTM trained with connectionist temporal classification (CTC), but are more successful in computer vision.
In medical informatics, deep learning was used to predict sleep quality based on data from wearables and to predict health complications from electronic health record data.
13497: 10226:
Brocardo, Marcelo Luiz; Traore, Issa; Woungang, Isaac; Obaidat, Mohammad S. (2017). "Authorship verification using deep belief network systems".
In 2014, the state of the art was training "very deep neural networks" with 20 to 30 layers. Stacking too many layers led to a steep reduction in training accuracy, known as the "degradation" problem. In 2015, two techniques were developed to train very deep networks: the Highway Network was published in May 2015, and the residual neural network (ResNet) in December 2015. ResNet behaves like an open-gated Highway Net.
Paul Werbos applied backpropagation to neural networks in 1982 (his 1974 PhD thesis, reprinted in a 1994 book, did not yet describe the algorithm). In 1986, David E. Rumelhart et al. popularised backpropagation.
These applications include learning methods such as "Shrinkage Fields for Effective Image Restoration", which trains on an image dataset, and Deep Image Prior, which trains on the image that needs restoration.

In the 1980s, backpropagation did not work well for deep learning with long credit assignment paths. To overcome this problem, in 1991, Jürgen Schmidhuber proposed a hierarchy of RNNs pre-trained one level at a time by self-supervised learning, where each RNN tries to predict its own next input, which is the next unexpected input of the RNN below. This "neural history compressor" uses predictive coding to learn internal representations at multiple self-organizing time scales. This can substantially facilitate downstream deep learning. The RNN hierarchy can be collapsed into a single RNN, by distilling a higher-level chunker network into a lower-level automatizer network.
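A highly simplified sketch of the history-compressor idea, replacing the trained RNN predictor with a lookup table for brevity: only inputs the lower level fails to predict are passed up, shortening the sequence the higher level must model:

```python
# Sketch only: a frequency-free "what usually follows what" table stands in for
# a trained low-level RNN; surprising symbols form the higher level's input.
class Level:
    def __init__(self):
        self.last, self.table = None, {}

    def step(self, symbol):
        predicted = self.table.get(self.last) == symbol
        if self.last is not None:
            self.table[self.last] = symbol  # learn the observed transition
        self.last = symbol
        return predicted

low, high_sequence = Level(), []
for s in "abababXabab":
    if not low.step(s):
        high_sequence.append(s)  # unpredicted inputs are passed up a level

print(high_sequence)  # shorter than the input: predictable steps are compressed away
```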
Another group showed that certain psychedelic spectacles could fool a facial recognition system into thinking ordinary people were celebrities, potentially allowing one person to impersonate another. In 2017, researchers added stickers to stop signs and caused an ANN to misclassify them.
Most modern deep learning models are based on multi-layered neural networks, in which a hierarchy of layers is used to transform input data into a slightly more abstract and composite representation. For example, in an image recognition application, the raw input may be an image (represented as a matrix of pixels).
The speaker recognition team led by Larry Heck reported significant success with deep neural networks in speech processing in the 1998 NIST Speaker Recognition benchmark. It was deployed in the Nuance Verifier, representing the first major industrial application of deep learning.
DNNs are prone to overfitting because of the added layers of abstraction, which allow them to model rare dependencies in the training data.
7192:"Keynote talk: 'Achievements and Challenges of Deep Learning - From Speech Analysis and Recognition To Language and Multimodal Processing'" 3963: 2946:
tricked an image classification system. One defense is reverse image search, in which a possible fake image is submitted to a site such as
In this setup, one network generates candidate patterns while a second network learns by gradient descent to predict the reactions of the environment to these patterns. This was called "artificial curiosity". In 2014, this principle was used in generative adversarial networks (GANs).
LeNet-5 (1998), a 7-level CNN by Yann LeCun et al. that classifies digits, was applied by several banks to recognize hand-written numbers on checks digitized in 32x32 pixel images.
Alternatively, engineers may look for other types of neural networks with more straightforward and convergent training algorithms. CMAC (cerebellar model articulation controller) is one such kind of neural network. It does not require learning rates or randomized initial weights. The training process can be guaranteed to converge in one step with a new batch of data, and the computational complexity of the training algorithm is linear with respect to the number of neurons involved.
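A minimal CMAC-style sketch (1-D input, a few overlapping tilings, least-mean-squares correction; the structure and constants here are illustrative, not a reference implementation):

```python
import numpy as np

N_LAYERS, N_BINS = 4, 16
weights = np.zeros((N_LAYERS, N_BINS))

def active_tiles(x):
    """Map x in [0, 1) to one tile per layer; each layer uses a shifted grid."""
    return [(l, int(x * N_BINS + l / N_LAYERS) % N_BINS) for l in range(N_LAYERS)]

def predict(x):
    return sum(weights[l, b] for l, b in active_tiles(x))  # sum of active weights

def train(x, target, beta=1.0):
    err = target - predict(x)
    for l, b in active_tiles(x):          # spread the correction over active tiles
        weights[l, b] += beta * err / N_LAYERS

train(0.3, 2.0)
print(predict(0.3))  # 2.0: with beta = 1, a single update drives the error to zero
```

With a full correction gain the error on the just-presented data vanishes after one step, which is the one-step convergence property the paragraph above describes; the per-example cost is linear in the number of active units.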
In computer experiments conducted by Amari's student Saito, a five-layer MLP with two modifiable layers learned internal representations to classify non-linearly separable pattern classes. Subsequent developments in hardware and hyperparameter tunings have made end-to-end stochastic gradient descent the currently dominant training technique.
Simplified example of training a neural network in object detection: The network is trained by multiple images that are known to depict starfish and sea urchins, which are correlated with "nodes" that represent visual features. The starfish match with a ringed texture and a star outline, whereas most sea urchins match with a striped texture and oval shape. However, the instance of a ring-textured sea urchin creates a weakly weighted association between them.
Most deep learning systems rely on training and verification data that is generated and/or annotated by humans. It has been argued in media philosophy that not only low-paid clickwork (e.g. on Amazon Mechanical Turk) is regularly deployed for this purpose, but also implicit forms of human microwork that are often not recognized as such.
Deep learning architectures can be constructed with a greedy layer-by-layer method. Deep learning helps to disentangle these abstractions and pick out which features improve performance.
In 2012, Andrew Ng and Jeff Dean created an FNN that learned to recognize higher-level concepts, such as cats, only from watching unlabeled images taken from YouTube videos.
Deep learning is also being used with physics-informed neural networks to solve partial differential equations in both forward and inverse problems in a data-driven manner. One example is reconstructing fluid flow governed by the Navier-Stokes equations. Using physics-informed neural networks does not require the often expensive mesh generation that conventional CFD methods rely on.
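A minimal sketch of the physics-informed loss, assuming PyTorch is installed and substituting a simple advection equation for the far costlier Navier-Stokes system:

```python
# The network u(x, t) is penalized for violating u_t + c * u_x = 0; collocation
# points are sampled randomly, so no mesh is needed.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)
c = 1.0
xt = torch.rand(256, 2, requires_grad=True)  # random (x, t) collocation points

u = net(xt)
grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
u_x, u_t = grads[:, 0:1], grads[:, 1:2]
physics_loss = torch.mean((u_t + c * u_x) ** 2)  # residual of the governing PDE
physics_loss.backward()  # trainable like any other loss (add data terms as needed)
print(float(physics_loss))
```

In a full PINN this residual term is combined with data and boundary-condition losses, so the fitted solution both matches observations and obeys the governing equations.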
AlphaFold, a deep-learning-based system, achieved a level of accuracy significantly higher than all previous computational methods.
7398:; Chen, Yu-Hsin; Yang, Tien-Ju; Emer, Joel (2017). "Efficient Processing of Deep Neural Networks: A Tutorial and Survey". 7375: 3725:
Human Behavior and Another Kind in Consciousness: Emerging Research and Opportunities: Emerging Research and Opportunities
In "data poisoning", false data is continually smuggled into a machine learning system's training set to prevent it from achieving mastery.

In 1993, a neural history compressor solved a "very deep learning" task that required more than 1,000 subsequent layers in an RNN unfolded in time.
790: 384: 356: 351: 245: 9030:
L.P Heck and R. Teunen. "Secure and Convenient Transactions with Nuance Verifier". Nuance Users Conference, April 1998.
6294: 3428: 2246: 1829:
Special electronic circuits called deep learning processors were designed to speed up deep learning algorithms. Deep learning processors include neural processing units (NPUs) in Huawei cellphones and cloud computing servers such as tensor processing units (TPU) in the Google Cloud Platform.
Yoshua Bengio, Geoffrey Hinton and Yann LeCun were awarded the 2018 Turing Award for "conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing."
How deep learning is a subset of machine learning and how machine learning is a subset of artificial intelligence (AI)
14829: 14456: 14263: 14119: 14048: 13978: 13628: 11422: 11076: 10640: 10562:"Using transcriptomics to guide lead optimization in drug discovery projects: Lessons learned from the QSTAR project" 9721: 9532: 9265: 9046: 8866:
Bengio, Yoshua; Boulanger-Lewandowski, Nicolas; Pascanu, Razvan (2013). "Advances in optimizing recurrent networks".
8423: 8077: 7952: 6655:(2006). "Connectionist temporal classification: Labelling unsegmented sequence data with recurrent neural networks". 6310: 6034: 5825: 5325: 5203: 5010: 3098: 3058: 2283: 2279: 1786: 974:
In 1990, Wei Zhang implemented a CNN on optical computing hardware. In 1991, a CNN was applied to medical image object segmentation and breast cancer detection in mammograms.
548: 433: 316: 262: 228: 95: 13835: 11614: 10619: 8060:
4316:
1781:
DNNs must consider many training parameters, such as the size (number of layers and number of units per layer), the learning rate, and initial weights. Sweeping through the parameter space for optimal parameters may not be feasible due to the cost in time and computational resources. Various tricks, such as batching (computing the gradient on several training examples at once rather than individual examples), speed up computation. The large processing capabilities of many-core architectures (such as GPUs or the Intel Xeon Phi) have produced significant speedups in training, because of the suitability of such processing architectures for the matrix and vector computations.
843:
14784: 10434: 6648: 3750: 2654: 2468: 2364: 1204: 870:. In computer experiments conducted by Amari's student Saito, a five layer MLP with two modifiable layers learned 749: 398: 302: 148: 10381: 7019: 4464:
4408: 2830:
Deep learning has attracted both criticism and comment, in some cases from outside the field of computer science.
989: 10708: 7094:"New types of deep neural network learning for speech recognition and related applications: An overview (ICASSP)" 5723: 3263: 2910: 2500: 993: 671: 80: 10881: 8644: 7896:
6338: 5455: 5408: 5362:"Parallel distributed processing model with local space-invariant interconnections and its optical architecture" 5361: 2627:. Using physics informed neural networks does not require the often expensive mesh generation that conventional 2611:
The United States Department of Defense applied deep learning to train robots in new tasks through observation.
528:
Deep learning architectures have been applied to fields including computer vision, speech recognition, natural language processing, machine translation, bioinformatics, drug design, medical image analysis, climate science, material inspection and board game programs, where they have produced results comparable to and in some cases surpassing human expert performance.
14971: 14911: 14509: 14103: 10932: 9781: 9558:(30 September 1991). "Several Improvements to a Recurrent Error Propagation Network Phone Recognition System". 7657:
6718: 3065: 3036: 3032: 2853: 2675:
2455:
An estimator based on deep neural networks was proposed and called the Neural Joint Entropy Estimator (NJEE). Such an estimation provides insights on the effects of input random variables on an independent random variable.
1439:
In 2015, Google's speech recognition improved by 49% through an LSTM-based model, which they made available through Google Voice Search on smartphones.
1401: 1319: 1130:
Both shallow and deep learning (e.g., recurrent nets) of ANNs for speech recognition have been explored for many years. These methods never outperformed non-uniform internal-handcrafting Gaussian mixture model/hidden Markov model (GMM-HMM) technology based on generative models of speech trained discriminatively. Key difficulties have been analyzed, including gradient diminishing and weak temporal correlation structure in neural predictive models. Additional difficulties were the lack of training data and limited computing power.
1085: 748:, respectively. More specifically, the probabilistic interpretation considers the activation nonlinearity as a 481: 14082: 13105:"Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream" 9610: 8385: 5808:
3373:
2088:
Other types of deep models, including tensor-based models and integrated deep generative/discriminative models, have also been developed.
531:
Early forms of neural networks were inspired by information processing and distributed communication nodes in biological systems, particularly the human brain. However, current neural networks do not intend to model the brain function of organisms, and are generally seen as low-quality models for that purpose.
14504: 14193: 11459:"Liver Cancer Detection Using Hybridized Fully Convolutional Neural Network Based on Deep Learning Framework" 10313:"Deep Learning for Natural Language Processing: Theory and Practice (CIKM2014 Tutorial) - Microsoft Research" 9227: 5657:"Learning complex, extended sequences using the principle of history compression (based on TR FKI-148, 1991)" 2975:, potentially leading attackers and defenders into an arms race similar to the kind that already defines the 2922: 2819: 2815: 2620: 2464: 1349:
In October 2012, AlexNet by Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton won the large-scale ImageNet competition by a significant margin over shallow machine learning methods. Further incremental improvements included the VGG-16 network by Karen Simonyan and Andrew Zisserman and Google's Inceptionv3.
847: 477: 3135:(the embedding of annotation or computation tasks in the flow of a game), (2) "trapping and tracking" (e.g. 817:
In 1925, Wilhelm Lenz and Ernst Ising created the Ising model, which is essentially a non-learning RNN architecture consisting of neuron-like threshold elements. In 1972, Shun'ichi Amari made this architecture adaptive.
14946: 14343: 14300: 14253: 14248: 13435: 12510:
9134:
6796: 5314:
2939: 2628: 2436: 2072: 1708: 1685: 1451: 1450:
Deep learning is part of state-of-the-art systems in various disciplines, particularly computer vision and automatic speech recognition (ASR).
1120: 1108: 897: 757: 556: 552: 13866: 13334:"A Google DeepMind Algorithm Uses Deep Learning and More to Master the Game of Go | MIT Technology Review" 12806:"Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons" 5310: 4896:
4289:
3047: 14997: 14293: 14219: 13394:
10837:
5196:
3601:
2624: 2369: 2223: 1913:
1887: 1617:, or passing information in the reverse direction and adjusting the network to reflect that information. 1238: 1091:
During 1985-1995, inspired by statistical mechanics, several architectures and methods were developed by Terry Sejnowski, Peter Dayan, Geoffrey Hinton, and others, including the Boltzmann machine, the restricted Boltzmann machine, the Helmholtz machine, and the wake-sleep algorithm.
886: 875: 863: 682: 501: 267: 218: 115: 13163: 9513:
Garofolo, J.S.; Lamel, L.F.; Fisher, W.M.; Fiscus, J.G.; Pallett, D.S.; Dahlgren, N.L.; Zue, V. (1993).
Hinton, G.; Deng, L.; Yu, D.; Dahl, G.; Mohamed, A.; Jaitly, N.; Senior, A.; Vanhoucke, V.; Nguyen, P.; Sainath, T.; Kingsbury, B. (2012). "Deep Neural Networks for Acoustic Modeling in Speech Recognition".
In Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K. I.; Culotta, Aron (eds.),
Ng, Andrew; Dean, Jeff (2012). "Building High-level Features Using Large Scale Unsupervised Learning".
"Roles of Pre-Training and Fine-Tuning in Context-Dependent DBN-HMMs for Real-World Speech Recognition".
Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations
GPUs, an early demonstration of GPU-based deep learning. They reported up to 70 times faster training.
Neural networks entered a lull, and simpler models that use task-specific handcrafted features such as Gabor filters and support vector machines (SVMs) became the preferred choices in the 1990s and 2000s.
in 1986, and to artificial neural networks by Igor Aizenberg and colleagues in 2000, in the context of Boolean threshold neurons.
625: 90: 73: 31: 11052:
Proceedings of the 5th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics
Advances in Neural Information Processing Systems 22 (NIPS'22), December 7th–10th, 2009, Vancouver, BC
6613: 6580:"Acoustic Modeling with Deep Neural Networks Using Raw Time Signal for LVCSR (PDF Download Available)" 5960: 4138:"Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback" 15022: 14880: 14519: 14350: 14173: 13250: 11883: 11514:
Lyakhov, Pavel Alekseevich; Lyakhova, Ulyana Alekseevna; Nagornov, Nikolay Nikolaevich (2022-04-03).
Schmidhuber, Jürgen (1991). "A possibility for implementing curiosity and boredom in model-building neural controllers".
Contributions to Perceptron Theory, Cornell Aeronautical Laboratory Report No. VG-1196-G-7, Buffalo
Page 150 ff demonstrates credit assignment across the equivalent of 1,200 layers in an unfolded RNN.
features that contain stages of fixed transformation from spectrograms. The raw features of speech, waveforms, later produced excellent larger-scale results.
Sak, Haşim; Senior, Andrew; Rao, Kanishka; Beaufays, Françoise; Schalkwyk, Johan (September 2015).
2017 2nd International Conference on Communication Systems, Computing and IT Applications (CSCITA)
Huang, Po-Sen; He, Xiaodong; Gao, Jianfeng; Deng, Li; Acero, Alex; Heck, Larry (1 October 2013).
"Long Short-Term Memory recurrent neural network architectures for large scale acoustic modeling".
distinguishes five types of "machinic capture" of human microwork to generate training data: (1) gamification (the embedding of annotation or computation tasks in the flow of a game), (2) "trapping and tracking" (e.g. CAPTCHAs for image recognition or click-tracking on Google search results pages), (3) exploitation of social motivations (e.g. tagging faces on Facebook to obtain labeled facial images), (4) information mining (e.g. by leveraging quantified-self devices such as activity trackers) and (5) clickwork.
3025: 2926: 2703: 2381: 1861: 1826: 1661: 1380: 1216: 1054: 967: 613: 465: 437: 53: 10965:
Feng, X.Y.; Zhang, H.; Ren, Y.J.; Shang, P.H.; Zhu, Y.; Liang, Y.C.; Guan, R.C.; Xu, D. (2019).
10054:
Socher, R.; Perelygin, A.; Wu, J.; Chuang, J.; Manning, C.D.; Ng, A.; Potts, C. (October 2013).
a way that mimics functions of the human brain, and can be trained like any other ML algorithm.
Around the same time, deep learning started impacting the field of art. Early examples included Google DeepDream (2015) and neural style transfer (2015), both of which were based on pretrained image classification neural networks such as VGG-19.
In 2003, LSTM became competitive with traditional speech recognizers on certain tasks. In 2006, Alex Graves, Santiago Fernández, Faustino Gomez and Schmidhuber combined it with connectionist temporal classification (CTC) in stacks of LSTMs.
Importantly, a deep learning process can learn which features to optimally place at which level on its own.
In 2017, Covariant.ai was launched, which focuses on integrating deep learning into factories.
2275: 1838: 1749: 1718: 1579: 1300: 1197: 1177: 1058: 1024: 840: 517: 163: 11020:"A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems" 6026: 2589:
announced that they had developed an AI system known as GNoME. This system has contributed to materials science by discovering over 2 million new materials within a relatively short timeframe.
in neural networks. The probabilistic interpretation was introduced by researchers including Hopfield, Widrow and Narendra, and popularized in surveys such as the one by Bishop.
activation is strictly larger than the input dimension, then the network can approximate any Lebesgue integrable function; if the width is smaller or equal to the input dimension, then a deep neural network is not a universal approximator.
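Stated informally, a sketch of the width-bounded result this passage describes (due to Lu et al., "The Expressive Power of Neural Networks: A View from the Width", cited in this article; the exact width bound depends on the formulation):

```latex
% Width-bounded universal approximation for ReLU networks (informal sketch):
% every integrable function can be approximated in the L^1 sense by a deep
% ReLU network whose width only slightly exceeds the input dimension n.
\forall f \in L^{1}(\mathbb{R}^{n}),\ \forall \varepsilon > 0 :\quad
\exists\ \text{ReLU network } F \text{ of width} \le n + 4
\ \text{with}\ \int_{\mathbb{R}^{n}} \lvert f(x) - F(x) \rvert \, dx < \varepsilon .
```

Depth rather than width does the approximating here: the result trades the single wide hidden layer of the classic theorem for an unbounded number of narrow layers.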
703: 560: 489: 458: 14824: 9663: 7533: 6652: 5903: 5862: 5844: 5781: 5627: 5158: 4496: 1409: 1304: 1065: 1004: 14976: 14931: 14377: 14322: 14168: 14163: 13276: 13206: 12817: 12589: 12307: 12163:"High-Resolution Multi-Spectral Imaging With Diffractive Lenses and Learned Reconstruction" 12115: 12048: 11991: 11926: 11781: 11470: 11359: 9411: 9101: 8563: 8453: 8308: 7339: 7053: 6977: 6418: 6145: 6136:; Neal, Radford (1995-05-26). "The wake-sleep algorithm for unsupervised neural networks". 5467: 5420: 5373: 5232: 4533: 4435: 4149: 3972: 3850: 3799: 3328: 3213: 3072: 2917:(visual or linguistic) from training data would be equivalent to restricting the system to 2647: 2504: 2480: 2472: 2377: 2351: 2339:. AtomNet was used to predict novel candidate biomolecules for disease targets such as the 1871: 1633: 1463: 1346: 1145:
researchers moved away from neural nets to pursue generative modeling. An exception was at SRI International in the late 1990s.
1116: 997: 718: 686: 105: 10154: 6918: 6853: 2983:
software by repeatedly attacking a defense with malware that was continually altered by a genetic algorithm.
662:
threshold neurons, although the history of the term's first appearance is apparently more complicated.
Shen, Yelong; He, Xiaodong; Gao, Jianfeng; Deng, Li; Mesnil, Gregoire (1 November 2014).
http://googleresearch.blogspot.co.at/2015/08/the-neural-networks-behind-google-voice.html
Sohl-Dickstein, Jascha; Weiss, Eric; Maheswaranathan, Niru; Ganguli, Surya (2015-06-01).
also published adversarial neural networks that contest with each other in the form of a zero-sum game, where one network's gain is the other network's loss.
890: 769: 710: 644: 609: 568: 505: 469: 450: 257: 13622:"Are there Deep Reasons Underlying the Pathologies of Today's Deep Learning Algorithms?" 13280: 13210: 12821: 12593: 12311: 12119: 12052: 12037:"Hidden fluid mechanics: Learning velocity and pressure fields from flow visualizations" 11995: 11930: 11812: 11785: 11769: 11645: 11474: 11363: 10093:"A Latent Semantic Model with Convolutional-Pooling Structure for Information Retrieval" 9415: 9105: 8567: 8312: 8026: 7343: 7057: 6981: 6422: 6149: 5736:
Diploma thesis. Institut f. Informatik, Technische Univ. Munich. Advisor: J. Schmidhuber
5698: 5471: 5424: 5377: 5287: 5266: 5236: 4537: 4439: 4153: 3976: 3803: 3332: 1165:
reported significant success with deep neural networks in speech processing in the 1998 National Institute of Standards and Technology Speaker Recognition evaluation.
14936: 14514: 14097: 13960: 13674: 13653: 13417: 13308: 13232: 13139: 13116: 13104: 13085: 13072: 13039: 13020: 12969: 12918: 12861: 12848: 12805: 12786: 12735: 12722: 12687: 12668: 12545: 12430: 12297: 12265: 12231: 12200: 12174: 12138: 12105: 12093: 12069: 12036: 12017: 11960: 11860: 11695: 11596: 11550: 11515: 11496: 11457:
Dong, Xin; Zhou, Yizhao; Wang, Lantian; Peng, Jingfeng; Lou, Yanbo; Fan, Yiqun (2020).
11428: 11383: 11349: 11321: 11295: 11205: 11172: 11153: 11140: 11113: 11082: 10995: 10966: 10911: 10862: 10817: 10759: 10731: 10542: 10413: 10289: 10262: 10243: 10174: 10055: 9962: 9848: 9753: 9688: 9644: 9602: 9495: 9469: 9432: 9399: 9318: 9117: 9091: 9052: 9024: 8909: 8871: 8848: 8800: 8728: 8663: 8623: 8599: 8553: 8507: 8340: 8278: 8179: 8135: 8107: 8041: 7979: 7897: 7858: 7827: 7802: 7782: 7756: 7734: 7711: 7690: 7658: 7637: 7499: 7465: 7399: 7267: 7069: 6821: 6754: 6442: 6377: 6169: 6100: 6014:"Chapter 6: Information Processing in Dynamical Systems: Foundations of Harmony Theory" 5941: 5915: 5882: 5679: 5535: 5094: 4981: 4938: 4897: 4874: 4522:"Neural networks and physical systems with emergent collective computational abilities" 4500: 4343: 4325: 4172: 4137: 4035: 3988: 3931: 3704: 3678: 3636: 3610: 3559: 3406: 3378: 3352: 3291: 3203: 3193: 3170:
Mühlhoff argues that in most commercial end-user applications of Deep Learning such as
3152: 2877: 2525: 2452: 2393: 2344: 2265: 2250: 2093: 2056:
Feature processing by deep models with solid understanding of the underlying mechanisms
1907: 1822: 1621: 1227: 1142: 1127: 939: 905: 882: 867: 818: 773: 699: 675: 497: 307: 13930: 12897: 12880: 12218:
Bernhardt, Melanie; Vishnevskiy, Valery; Rau, Richard; Goksel, Orcun (December 2020).
10063:
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing
7427:. ICML '09. New York, NY, USA: Association for Computing Machinery. pp. 873–880. 6556: 6526: 6213: 5976: 5502: 4556: 4521: 4018:
Hornik, Kurt (1991). "Approximation Capabilities of Multilayer Feedforward Networks".
3313: 3128: 2380:
variables. The estimated value function was shown to have a natural interpretation as customer lifetime value.
1368:
The success in image classification was then extended to the more challenging task of generating descriptions (captions) for images, often as a combination of CNNs and LSTMs.
15002: 14990: 14794: 14446: 14317: 14310: 14072: 14044: 14025: 13986: 13964: 13952: 13407: 13300: 13292: 13224: 13144: 13077: 13059: 13012: 13004: 12961: 12953: 12910: 12902: 12853: 12835: 12804:
Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang (3 November 2011).
12778: 12770: 12727: 12709: 12660: 12625: 12620: 12607: 12577: 12537: 12492: 12488: 12455: 12422: 12333: 12325: 12269: 12257: 12249: 12204: 12192: 12143: 12074: 12021: 12009: 11952: 11944: 11864: 11852: 11836: 11817: 11799: 11699: 11586: 11555: 11537: 11500: 11488: 11418: 11403:"Deep Convolutional Neural Networks for Detecting Cellular Changes Due to Malignancy" 11375: 11325: 11313: 11210: 11192: 11145: 11086: 11072: 11000: 10866: 10854: 10593: 10534: 10294: 9829: 9692: 9606: 9528: 9499: 9487: 9437: 9042: 8994: 8899: 8838: 8749: 8720: 8712: 8499: 8332: 8324: 8264: 8139: 8125: 7876: 7618: 7608: 7491: 7483: 7436: 7137: 7073: 6995: 6946: 6938: 6899: 6813: 6788: 6746: 6738: 6579: 6499: 6369: 6306: 6276: 6241: 6233: 6229: 6214:"Predicting the secondary structure of globular proteins using neural network models" 6192: 6161: 6092: 6087: 6030: 5980: 5945: 5933: 5821: 5757: 5631: 5604: 5483: 5436: 5389: 5248: 5199: 5098: 5006: 4985: 4973: 4942: 4930: 4777: 4740: 4679: 4671: 4592: 4561: 4371: 4266: 4177: 4093: 4063: 4031: 3935: 3923: 3864: 3729: 3696: 3628: 3563: 3396: 3344: 3295: 3283: 3238: 2984: 2914: 2873: 2743:
somewhat analogous to the neural networks utilized in deep learning models. Like the
2740: 2727: 2715: 2599: 2595: 2590: 2554: 2066: 1925: 1809: 1773:-regularization) can be applied during training to combat overfitting. Alternatively 1700:
As with ANNs, many issues can arise with naively trained DNNs. Two common issues are overfitting and computation time.
1586: 1146: 1112: 1104: 1012: 971: 851: 741: 586: 572: 532: 445: 85: 13089: 13024: 12973: 12922: 12790: 11964: 11600: 10546: 10382:"Zero-Shot Translation with Google's Multilingual Neural Machine Translation System" 10247: 9648: 9121: 8913: 8852: 8804: 8667: 8511: 8121: 6758: 6461: 5750:"Gradient flow in recurrent nets: the difficulty of learning long-term dependencies" 5683: 5539: 4347: 3708: 2987:
until it tricked the anti-malware while retaining its ability to damage the target.
2852:
Others point out that deep learning should be looked at as a step towards realizing strong AI, not as an all-encompassing solution.
2845:
how fast? What is it approximating?) Deep learning methods are often looked at as a black box, with most confirmations done empirically rather than theoretically.
1531: 14747: 14737: 14544: 14338: 14288: 14283: 14226: 14214: 13942: 13874: 13713: 13421: 13399: 13284: 13267: 13236: 13214: 13134: 13130: 13126: 13067: 13051: 12996: 12945: 12892: 12865: 12843: 12825: 12762: 12739: 12717: 12699: 12672: 12652: 12615: 12597: 12549: 12529: 12484: 12434: 12414: 12385: 12315: 12241: 12184: 12133: 12123: 12064: 12056: 11999: 11934: 11844: 11807: 11789: 11687: 11578: 11545: 11527: 11478: 11432: 11410: 11387: 11367: 11305: 11229:"DeepMind's protein-folding AI has solved a 50-year-old grand challenge of biology" 11200: 11184: 11157: 11135: 11125: 11064: 11056: 10990: 10980: 10846: 10583: 10573: 10524: 10439: 10284: 10274: 10235: 10178: 10166: 9941: 9903: 9821: 9678: 9636: 9594: 9563: 9520: 9479: 9427: 9419: 9328: 9109: 9056: 9034: 8891: 8830: 8792: 8732: 8704: 8695:
Hochreiter, Sepp; Schmidhuber, Jürgen (1 November 1997). "Long Short-Term Memory".
8655: 8491: 8480:"LSTM Recurrent Networks Learn Simple Context Free and Context Sensitive Languages" 8316: 8117: 7868: 7600: 7529: 7519:"Flexible, High Performance Convolutional Neural Networks for Image Classification" 7518: 7503: 7475: 7428: 7347: 7259: 7061: 6985: 6930: 6891: 6825: 6805: 6730: 6552: 6522: 6495: 6434: 6426: 6381: 6361: 6353: 6268: 6225: 6153: 6104: 6082: 6074: 5972: 5925: 5874: 5813: 5791: 5671: 5656: 5596: 5527: 5475: 5428: 5381: 5240: 5086: 5074: 5056: 5038: 4965: 4922: 4854: 4806: 4773: 4700: 4663: 4617: 4584: 4551: 4541: 4443: 4335: 4298: 4220: 4167: 4157: 4039: 4027: 3992: 3980: 3913: 3856: 3807: 3688: 3620: 3551: 3410: 3388: 3356: 3336: 3275: 3160: 2805: 2759: 2684: 2558: 2546: 2538: 2456: 2373: 2271: 2101: 2076: 1921: 1433: 1358: 1081: 1073: 931: 893:. The rectifier has become the most popular activation function for deep learning. 836: 805:(RNN). RNNs have cycles in their connectivity structure, FNNs don't. In the 1920s, 729: 694: 636: 579: 564: 429: 223: 158: 143: 13312: 12161:
Oktem, Figen S.; Kar, Oğuzhan Fatih; Bezek, Can Deniz; Kamalabadi, Farzad (2021).
11173:"Using recurrent neural network models for early detection of heart failure onset" 9640: 8659: 8344: 7271: 7263: 7252:"Conversational speech transcription using context-dependent deep neural networks" 6680: 6446: 6173: 5886: 5865:(2010). "Formal Theory of Creativity, Fun, and Intrinsic Motivation (1990-2010)". 4795: 3996: 3918: 3901: 3640: 1928:-sequence recognition, which, unlike word-sequence recognition, allows weak phone 608:. Prior to deep learning, machine learning techniques often involved hand-crafted 14860: 14804: 14626: 14268: 14188: 12830: 12449: 11691: 10578: 10561: 10124:"Learning Deep Structured Semantic Models for Web Search using Clickthrough Data" 10019: 9825: 9512: 9374:"Cerebras launches new AI supercomputing processor with 2.6 trillion transistors" 9152: 7604: 7517:
Ciresan, D. C.; Meier, U.; Masci, J.; Gambardella, L.M.; Schmidhuber, J. (2011).
7351: 7158:"Deng receives prestigious IEEE Technical Achievement Award - Microsoft Research" 7131: 7023: 6861: 6687: 6462:"Artificial Neural Networks and their Application to Speech/Sequence Recognition" 6125: 6058: 6054: 5929: 5777: 5749: 5731: 5326:"Shift-invariant pattern recognition neural network and its optical architecture" 5273: 5026: 4838: 4734: 4588: 4260: 4123: 4057: 3837: 3692: 3156: 2957: 2755: 2582: 2570: 2460: 2137: 1834: 1620:
Neural networks have been used on a variety of tasks, including computer vision, speech recognition, machine translation, social network filtering, playing board and video games, and medical diagnosis.
1614: 1477: 1425: 1342: 1334: 1289: 1223: 1100: 1092: 1046: 923: 911: 659: 493: 100: 13403: 12376:
Galkin, F.; Mamoshina, P.; Kochetov, K.; Sidorenko, D.; Zhavoronkov, A. (2020).
11745:"Google DeepMind's materials AI has already discovered 2.2 million new crystals" 11582: 11483: 11458: 11371: 9567: 8895: 8834: 6934: 6809: 3464:"Google's AlphaGo AI wins three-match series against the world's best Go player" 1277:
The deep learning revolution started around CNN- and GPU-based computer vision.
14834: 14799: 14789: 14614: 14372: 14198: 14058: 13262: 13258: 12949: 12245: 12219: 12162: 11848: 11794: 11309: 11171:
Choi, Edward; Schuetz, Andy; Stewart, Walter F.; Sun, Jimeng (13 August 2016).
10279: 10170: 10056:"Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank" 9774:"MNIST handwritten digit database, Yann LeCun, Corinna Cortes and Chris Burges" 9598: 9483: 9028: 8750:"Learning Precise Timing with LSTM Recurrent Networks (PDF Download Available)" 8708: 8104:
2021 International Conference on Computer Communication and Informatics (ICCCI)
7848: 7369: 6895: 6734: 6701:
Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks
5600: 4339: 4162: 3144: 2707: 2479:
over the possible classes of random variable Y, given input X. For example, in image classification tasks, the NJEE maps a vector of pixels' color values to probabilities over possible image classes.
2418: 2408: 2305: 2261: 2232: 2167: 1883: 1853: 1790: 1689: 1679: 1629: 1543: 1459: 1405: 1354: 1338: 1242: 1072:, where one network's gain is the other network's loss. The first network is a 927: 765: 745: 690: 509: 13878: 13556: 12766: 12656: 12533: 12418: 12004: 11979: 11939: 11914: 10850: 10469:
Boitet, Christian; Blanchon, Hervé; Seligman, Mark; Bellynck, Valérie (2010).
9683: 9423: 9113: 6990: 6965: 6681:
An application of recurrent neural networks to discriminative keyword spotting
6365: 6272: 6078: 5878: 5675: 5219:
Rumelhart, David E.; Hinton, Geoffrey E.; Williams, Ronald J. (October 1986).
4859: 4842: 4810: 4621: 4447: 3860: 3812: 3787: 3392: 3279: 1664:
are exponentially easier to approximate with DNNs than with shallow networks.
15039: 14779: 14759: 14676: 14062: 13990: 13956: 13947: 13296: 13063: 13008: 12957: 12906: 12839: 12774: 12713: 12704: 12664: 12611: 12329: 12253: 12196: 12188: 12094:"Solving high-dimensional partial differential equations using deep learning" 12013: 11948: 11803: 11541: 11492: 11196: 9631:
Deng, L.; Platt, J. (2014). "Ensemble Deep Learning for Speech Recognition".
8868:
2013 IEEE International Conference on Acoustics, Speech and Signal Processing
8827:
2013 IEEE International Conference on Acoustics, Speech and Signal Processing
8780: 8716: 8479: 8328: 7487: 7065: 6999: 6942: 6742: 6373: 6337:
Waibel, A.; Hanazawa, T.; Hinton, G.; Shikano, K.; Lang, K. J. (March 1989).
6280: 6237: 6062: 6020: 5984: 5608: 5252: 4675: 4302: 3927: 3287: 3233: 3112: 2796:
video games using only pixels as data input. In 2015 they demonstrated their AlphaGo system, which learned the game of Go well enough to beat a professional Go player.
2422: 1858: 1782: 1743: 1678:, in which data can flow in any direction, are used for applications such as 1564: 1473: 1428:(2015) eclipsed GANs in generative modeling since then, with systems such as 1258: 1131: 1069: 822: 761: 725: 13261:; Lillicrap, Timothy; Leach, Madeleine; Kavukcuoglu, Koray; Graepel, Thore; 12602: 12320: 12285: 12128: 12060: 11532: 11060: 11050: 10641:"Multi-task Neural Networks for QSAR Predictions | Data Science Association" 9847:
Chaochao Lu; Xiaoou Tang (2014). "Surpassing Human Level Face Recognition".
9333: 9306: 9038: 7432: 7133:
Automatic Speech Recognition: A Deep Learning Approach (Publisher: Springer)
6707:, Neural Information Processing Systems (NIPS) Foundation, 2009, pp. 545–552 6430: 6298: 6157: 5959:
Ackley, David H.; Hinton, Geoffrey E.; Sejnowski, Terrence J. (1985-01-01).
5561: 5500: 3848: 14865: 14696: 14111: 13304: 13228: 13148: 13081: 13055: 13016: 12965: 12936:
Olshausen, B; Field, D (1 August 2004). "Sparse coding of sensory inputs".
12914: 12857: 12782: 12731: 12541: 12496: 12426: 12337: 12261: 12147: 12078: 11856: 11821: 11653:. Computer Vision and Pattern Recognition (CVPR), 2014 IEEE Conference on. 11559: 11379: 11317: 11214: 11188: 11149: 11004: 10858: 10597: 10538: 10344:"Found in translation: More accurate, fluent sentences in Google Translate" 10298: 9833: 9491: 9441: 8818: 8503: 8336: 8235: 7622: 7495: 7425:
Proceedings of the 26th Annual International Conference on Machine Learning
7395: 7041: 6950: 6903: 6817: 6784: 6750: 6133: 5937: 5440: 5409:"Image processing of human corneal endothelium based on a learning network" 5393: 5220: 4683: 4651: 4579:
Nakano, Kaoru (1971). "Learning Process in a Model of Associative Memory".
4546: 4181: 3826: 3700: 3632: 3348: 3164: 3132: 3123:) is regularly deployed for this purpose, but also implicit forms of human 3116: 2906: 2891: 2242: 2191:
Visual art processing of Jimmy Wales in France, with the style of Munch's "The Scream" applied using neural style transfer
2160: 2105: 1879: 1712: 1602: 1575: 1485: 1231: 1193: 970:
on mail. Training required 3 days. In 1990, Wei Zhang implemented a CNN on optical computing hardware.
901: 806: 733: 655: 441: 297: 13621: 12629: 11414: 9257: 8724: 8415: 7420: 6657:
Proceedings of the International Conference on Machine Learning, ICML 2006
6612:
Graves, Alex; Eck, Douglas; Beringer, Nicole; Schmidhuber, Jürgen (2003).
6245: 6165: 6096: 5817: 5795: 5562:"Attractor dynamics and parallelism in a connectionist sequential machine" 5487: 4977: 4934: 4565: 3624: 2714:. The aging clock was planned to be released for public use in 2021 by an 14961: 14732: 14641: 14636: 14258: 14236: 11254:"DeepMind solves 50-year-old 'grand challenge' with protein folding A.I." 11130: 10263:"Precision information extraction for rare disease epidemiology at scale" 10195:
Gao, Jianfeng; He, Xiaodong; Yih, Scott Wen-tau; Deng, Li (1 June 2014).
8024: 7872: 7479: 6299:"A real-time recurrent error propagation network word recognition system" 6129: 6050: 5432: 5385: 5305: 5154: 3749:
Bengio, Yoshua; Lamblin, Pascal; Popovici, Dan; Larochelle, Hugo (2007).
3669:
Schmidhuber, J. (2015). "Deep Learning in Neural Networks: An Overview".
2857: 2698:
and predicts people with certain conditions older than healthy controls:
2439:, according to the sequence of the amino acids that make it up. In 2020, 2414: 2340: 2336: 1701: 1362: 1308: 1173: 1096: 955: 935: 830: 814: 810: 536: 521: 513: 326: 311: 13288: 11068: 9946: 9929: 9908: 9891: 9722:"How Skype Used AI to Build Its Amazing New Language Translator | WIRED" 9664:"Phone Recognition with Hierarchical Convolutional Deep Maxout Networks" 9514: 8320: 7597:
Medical Image Computing and Computer-Assisted Intervention – MICCAI 2013
6438: 6261:
International Journal of Pattern Recognition and Artificial Intelligence
5330:
Proceedings of Annual Conference of the Japan Society of Applied Physics
3340: 2634: 926:
had a continuous precursor of backpropagation in 1960 in the context of
900:(CNNs) with convolutional layers and downsampling layers began with the 14855: 14814: 14809: 14722: 14631: 14539: 14451: 14431: 13717: 12224:
IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control
12035:
Raissi, Maziar; Yazdani, Alireza; Karniadakis, George Em (2020-02-28).
11407:
2017 IEEE International Conference on Computer Vision Workshops (ICCVW)
10053: 10018:
Socher, Richard; Bauer, John; Manning, Christopher; Ng, Andrew (2013).
9524: 7599:. Lecture Notes in Computer Science. Vol. 7908. pp. 411–418. 7016: 7012: 5090: 4969: 4926: 4224: 3984: 3758:. Advances in neural information processing systems. pp. 153–160. 3555: 3039: in this section. Unsourced material may be challenged and removed. 2991: 2990:
In 2016, another group demonstrated that certain sounds could make the Google Now voice command system open a particular web address.
2849:, with most confirmations done empirically, rather than theoretically. 2748: 2550: 2399:
content-based approach and enhances recommendations in multiple tasks.
2332:
of environmental chemicals in nutrients, household products and drugs.
2325: 2309: 2192: 2109: 1875: 1518: 1481: 1444: 1207:, Santiago Fernández, Faustino Gomez, and Schmidhuber combined it with 1162: 979: 959: 942:
et al. popularised backpropagation but did not cite the original work.
934:'s master thesis (1970). G.M. Ostrovski et al. republished it in 1971. 915: 525: 11956: 10588: 8495: 7855:
2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
7371:
High performance convolutional neural networks for document processing
41:
Representing images on multiple layers of abstraction in deep learning
14850: 14819: 14717: 14561: 14524: 14461: 14415: 14410: 14395: 13498:"DARPA is funding projects that will try to open up AI's black boxes" 13254: 12282: 11978:
Mao, Zhiping; Jagtap, Ameya D.; Karniadakis, George Em (2020-03-01).
10611: 10239: 10197:"Learning Continuous Phrase Representations for Translation Modeling" 9773: 9304: 8796: 7822:
He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (10 Dec 2015).
7330:
Oh, K.-S.; Jung, K. (2004). "GPU implementation of neural networks".
6357: 5810:
9th International Conference on Artificial Neural Networks: ICANN '99
5531: 5479: 5244: 4667: 3124: 2972: 2965: 2869: 2865: 2846: 2801: 2744: 2542: 2440: 1890: 1387: 1383:(ResNet) in Dec 2015. ResNet behaves like an open-gated Highway Net. 1315: 1285: 1057:
connections to solve the vanishing gradient problem. This led to the long short-term memory (LSTM).
821:
made this architecture adaptive. His learning RNN was republished by John Hopfield in 1982.
681:
The classic universal approximation theorem concerns the capacity of
582: 361: 125: 13219: 13194: 13000: 12562:
S. Blakeslee, "In brain's early growth, timetable may be critical",
10529: 10512: 10380:
Schuster, Mike; Johnson, Melvin; Thorat, Nikhil (22 November 2016).
8034:
Proceedings of the 32nd International Conference on Machine Learning
4762:"Heuristic self-organization in problems of engineering cybernetics" 3436:
NIPS 2012: Neural Information Processing Systems, Lake Tahoe, Nevada
3014: 2808:
uses a neural network to translate between more than 100 languages.
2721: 1211:(CTC) in stacks of LSTMs. In 2009, it became the first RNN to win a 14752: 14584: 13904: 13767:"Hackers Have Already Started to Weaponize Artificial Intelligence" 13557:
Alexander Mordvintsev; Christopher Olah; Mike Tyka (17 June 2015).
12302: 12236: 12179: 12110: 11354: 11300: 10916: 10822: 10736: 10418: 9582: 9474: 9323: 9096: 8929:"Improving DNNs for LVCSR using rectified linear units and dropout" 8628: 8604: 8112: 8046: 7984: 7902: 7863: 7832: 7807: 7594: 7560: 7404: 7298:"Recent Advances in Deep Learning for Speech Research at Microsoft" 5920: 5042: 4902: 4612:
Nakano, Kaoru (1972). "Associatron-A Model of Associative Memory".
4505: 4330: 3171: 3148: 2778: 2695: 2492: 2329: 2317: 2237: 1868: 1846: 1688:(CNNs) are used in computer vision. CNNs also have been applied to 1539: 1514: 1421: 1417: 1181: 850:, a method to train arbitrarily deep neural networks, published by 737: 198: 120: 14066: 13679: 13658: 13396:
2008 7th IEEE International Conference on Development and Learning
13121: 12378:"DeepMAge: A Methylation Aging Clock Developed with Deep Learning" 10985: 10755:"Toronto startup has a faster way to discover effective medicines" 10435:"An Infusion of AI Makes Google Translate More Powerful Than Ever" 9967: 9853: 9758: 9167:"Deep Neural Networks for Acoustic Modeling in Speech Recognition" 8876: 8865: 8817: 8558: 8184: 7787: 7779:
Very Deep Convolutional Networks for Large-Scale Image Recognition
7761: 7739: 7716: 7695: 7663: 7642: 7470: 7455: 7421:"Large-scale deep unsupervised learning using graphics processors" 5566:
Proceedings of the Annual Meeting of the Cognitive Science Society
3683: 3615: 3383: 2979:
defense industry. ANNs have been trained to defeat ANN-based anti-
1808:
Since the 2010s, advances in both machine learning algorithms and computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer.
1126:
Both shallow and deep learning (e.g., recurrent nets) of ANNs for
14875: 14712: 14666: 14589: 14489: 14484: 14436: 13694:
Zhu, S.C.; Mumford, D. (2006). "A stochastic grammar of images".
13365:"A.I. Researchers Leave Elon Musk Lab to Begin Robotics Start-Up" 12578:"A more biologically plausible learning rule for neural networks" 12451:
Rethinking Innateness: A Connectionist Perspective on Development
12405:
Utgoff, P. E.; Stracuzzi, D. J. (2002). "Many-layered learning".
11715:"Deep learning: the next frontier for money laundering detection" 11114:"Sleep Quality Prediction From Wearable Data Using Deep Learning" 6917:
Hinton, Geoffrey E.; Osindero, Simon; Teh, Yee-Whye (July 2006).
6614:"Biologically Plausible Speech Recognition with LSTM Neural Nets" 6512: 5776: 5501:
LeCun, Yann; Léon Bottou; Yoshua Bengio; Patrick Haffner (1998).
5346:, "Backpropagation Applied to Handwritten Zip Code Recognition", 3494:"Study urges caution when comparing neural networks to the brain" 3429:"ImageNet Classification with Deep Convolutional Neural Networks" 3136: 2980: 2976: 2797: 2711: 2488: 2121: 2092:
All major commercial speech recognition systems (e.g., Microsoft Cortana, Xbox, Skype Translator, Amazon Alexa, Google Now, Apple Siri, Baidu and iFlyTek voice search, and a range of Nuance speech products) are based on deep learning.
1917: 1682:. Long short-term memory is particularly effective for this use. 1598: 1372:(captions) for images, often as a combination of CNNs and LSTMs. 1330: 1323: 1040: 366: 12881:"Linear summation of excitatory inputs by CA1 pyramidal neurons" 12390: 12377: 12375: 11400: 9889: 9307:"In-Datacenter Performance Analysis of a Tensor Processing Unit" 8597: 8027:"Deep Unsupervised Learning using Nonequilibrium Thermodynamics" 7226:
NIPS Workshop on Deep Learning and Unsupervised Feature Learning
6679:
Santiago Fernandez, Alex Graves, and Jürgen Schmidhuber (2007).
2213:
generating striking imagery based on random visual input fields.
1007:
proposed a hierarchy of RNNs pre-trained one level at a time by self-supervised learning, where each RNN tries to predict its own next input.
37: 14890: 14870: 14742: 14534: 10559: 10513:"Trial watch: Phase II and phase III attrition rates 2011-2012" 9587:
IEEE/ACM Transactions on Audio, Speech, and Language Processing
9459: 8416:"The power of deeper networks for expressing natural functions" 6647: 3748: 3375:
2012 IEEE Conference on Computer Vision and Pattern Recognition
2947: 2507:
and outperforms other methods in case of large alphabet sizes.
2484: 1929: 1830: 1814: 1590: 1429: 1413: 1395: 1350: 670:
Deep neural networks are generally interpreted in terms of the universal approximation theorem or probabilistic inference.
11767: 10468: 8384:
Szegedy, Christian; Toshev, Alexander; Erhan, Dumitru (2013).
7916: 7847:
He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (2016).
5265:
Rumelhart, David E., Geoffrey E. Hinton, and R. J. Williams. "
5077:(1976). "Taylor expansion of the accumulated rounding error". 4117:
The Expressive Power of Neural Networks: A View from the Width
3603:
IEEE Transactions on Pattern Analysis and Machine Intelligence
2451:
Deep neural networks can be used to estimate the entropy of a stochastic process; the resulting estimator is called the neural joint entropy estimator (NJEE).
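As a rough illustration of classifier-based entropy estimation (a toy sketch only, not the NJEE construction from the cited work; all names and constants below are assumptions): the mean cross-entropy of a model predicting Y from X upper-bounds the conditional entropy H(Y|X), so a trained network's log-loss can serve as an entropy estimate.

```python
# Toy sketch: estimate an upper bound on H(Y|X) in nats via the average
# cross-entropy of a softmax classifier trained by gradient descent.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 5000, 4, 3                      # samples, input dim, classes of Y
X = rng.normal(size=(n, d))
true_W = rng.normal(size=(d, k))
logits = X @ true_W
probs = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
Y = np.array([rng.choice(k, p=p) for p in probs])

W = np.zeros((d, k))                      # softmax regression parameters
onehot = np.eye(k)[Y]
for _ in range(500):
    P = np.exp(X @ W)
    P /= P.sum(1, keepdims=True)
    W -= 0.1 * X.T @ (P - onehot) / n     # gradient of mean cross-entropy

P = np.exp(X @ W)
P /= P.sum(1, keepdims=True)
cross_entropy = -np.log(P[np.arange(n), Y]).mean()
print(f"estimated H(Y|X) <= {cross_entropy:.3f} nats")
```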
752:. The probabilistic interpretation led to the introduction of 14691: 14671: 14661: 14656: 14651: 14646: 14609: 14441: 13529:"Is "Deep Learning" a Revolution in Artificial Intelligence?" 13528: 12217: 11913:
Raissi, M.; Perdikaris, P.; Karniadakis, G. E. (2019-02-01).
11018:
Elkahky, Ali Mamdouh; Song, Yang; He, Xiaodong (1 May 2015).
10930: 10225: 9197:"GPUs Continue to Dominate the AI Accelerator Market for Now" 8995:"A Practical Guide to Training Restricted Boltzmann Machines" 8621: 6346:
IEEE Transactions on Acoustics, Speech, and Signal Processing
4426:
Brush, Stephen G. (1967). "History of the Lenz-Ising Model".
2793: 2231:
Other key techniques in this field are negative sampling and word embedding.
2117: 1594: 1455: 1281:
optimizations were developed specifically for deep learning.
1254: 1176:
features in the late 1990s, showing its superiority over the
1154: 975: 963: 598: 590: 13979:"Facebook Can Now Find Your Face, Even When It's Not Tagged" 13672: 13393: 10815: 9801: 9560:
Cambridge University Engineering Department Technical Report
8258: 7516: 6611: 5159:"Applications of advances in nonlinear sensitivity analysis" 5063:(Masters) (in Finnish). University of Helsinki. p. 6–7. 4499:(2022). "Annotated History of Modern AI and Deep Learning". 3427:
Krizhevsky, Alex; Sutskever, Ilya; Hinton, Geoffrey (2012).
2435:
Deep neural networks have shown unparalleled performance in predicting protein structure from the sequence of the amino acids that make it up.
2320:. Research has explored use of deep learning to predict the 930:. The modern form of backpropagation was first published in 547:
Most modern deep learning models are based on multi-layered neural networks such as convolutional neural networks and transformers.
492:. These architectures have been applied to fields including 14681: 12576:
Mazzoni, P.; Andersen, R. A.; Jordan, M. I. (15 May 1991).
12286:"Learning skillful medium-range global weather forecasting" 11912: 10159:
IEEE Transactions on Audio, Speech, and Language Processing
9400:"Logic-in-memory based on an atomically thin semiconductor" 9350:"Cerebras Hits the Accelerator for Deep Learning Workloads" 9025:"Scaling deep learning on GPU and knights landing clusters" 7419:
Raina, Rajat; Madhavan, Anand; Ng, Andrew Y. (2009-06-14).
6336: 2925:
and is a basic goal of both human language acquisition and artificial intelligence (AI).
2726:
Deep learning is closely related to a class of theories of brain development (specifically, neocortical development) proposed by cognitive neuroscientists in the early 1990s.
2113: 2097: 1467: 1269: 1123:, an early application of deep learning to bioinformatics. 714: 13164:"Facebook's 'Deep Learning' Guru Reveals the Future of AI" 13040:"An emergentist perspective on the origin of number sense" 7731: 7185: 7183: 2971:
ANNs can however be further trained to detect attempts at
1585:
An ANN is based on a collection of connected units called artificial neurons (analogous to biological neurons in a biological brain).
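A minimal sketch of such a unit (illustrative values only, assumed for this example): an artificial neuron forms a weighted sum of incoming signals, adds a bias, and applies a non-linear activation before passing the result on.

```python
# One artificial neuron: weighted sum of inputs plus bias, through tanh.
import numpy as np

def neuron(inputs, weights, bias):
    return np.tanh(np.dot(weights, inputs) + bias)  # tanh as the non-linearity

x = np.array([0.5, -1.2, 3.0])   # signals arriving from upstream neurons
w = np.array([0.4, 0.1, -0.6])   # connection ("synapse") weights
print(neuron(x, w, bias=0.2))    # the signal sent to downstream neurons
```

Stacking layers of such units, each feeding the next, is what produces the multi-layered networks discussed throughout this article.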
1521:, which are correlated with "nodes" that represent visual 14057: 13742:"Deep Learning of Recursive Structure: Grammar Induction" 11884:"Army researchers develop new algorithms to train robots" 11837:"Google AI and robots join forces to build new materials" 11288:
IEEE Transactions on Neural Networks and Learning Systems
10882:"A Molecule Designed By AI Exhibits 'Druglike' Qualities" 9750: 9081: 8781:"Gradient-based learning applied to document recognition" 8446:"Is Artificial Intelligence Finally Coming into Its Own?" 8054: 7526:
International Joint Conference on Artificial Intelligence
6258: 5503:"Gradient-based learning applied to document recognition" 4115:
Lu, Z., Pu, H., Wang, F., Hu, Z., & Wang, L. (2017).
4081: 4079: 3956:"Approximations by superpositions of sigmoidal functions" 3849:
Aizenberg, I.N.; Aizenberg, N.N.; Vandewalle, J. (2000).
3426: 2619:
Physics informed neural networks have been used to solve partial differential equations in both forward and inverse problems in a data-driven manner.
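A minimal physics-informed sketch (a toy problem chosen for illustration, not taken from the cited work): a small network u(x) is trained so that its autograd derivative satisfies u'(x) = -u(x) with u(0) = 1, by penalizing the equation residual rather than fitting data.

```python
# Toy PINN: penalize the ODE residual u' + u and the boundary condition u(0)=1.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    x = torch.rand(64, 1, requires_grad=True)      # collocation points in [0, 1]
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    residual = du + u                              # u' + u should vanish
    x0 = torch.zeros(1, 1)
    loss = (residual ** 2).mean() + (net(x0) - 1.0).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

print(net(torch.tensor([[1.0]])).item())  # should approach exp(-1), about 0.3679
```

The boundary term anchors the otherwise underdetermined solution; richer PDEs add more residual terms of the same shape.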
2569:
Deep learning is being successfully applied to financial fraud detection, tax evasion detection, and anti-money laundering.
2257:(token classification), text classification, and others. 1288:
reported a 100M deep belief network trained on 30 Nvidia
829:
were published by Kaoru Nakano in 1971. Already in 1948,
685:
with a single hidden layer of finite size to approximate
12034: 11615:"Colorizing and Restoring Old Images with Deep Learning" 11338: 10375: 10373: 10371: 10369: 10260: 10153:
Mesnil, G.; Dauphin, Y.; Yao, K.; Bengio, Y.; Deng, L.;
9868:
Nvidia Demos a Car Computer Trained with "Deep Learning"
9023:
You, Yang; Buluç, Aydın; Demmel, James (November 2017).
8825:(2013). "Deep convolutional neural networks for LVCSR". 8153:
Sak, Hasim; Senior, Andrew; Beaufays, Francoise (2014).
7709: 7368:
Chellapilla, Kumar; Puri, Sidd; Simard, Patrice (2006),
6542: 2503:
holds. It is shown that this method provides a strongly
2296:
post-mortem matching and determination of subject sex.
2287:
English as an intermediate between most language pairs.
2166:
A common evaluation set for image classification is the MNIST database data set.
2155:
Richard Green explains how deep learning is used with a
1454:(ASR). Results on commonly used evaluation sets such as 464:
Some common deep learning network architectures include fully connected networks, deep belief networks, recurrent neural networks, convolutional neural networks, generative adversarial networks, transformers, and neural radiance fields.
13249: 12803: 12160: 11980:"Physics-informed neural networks for high-speed flows" 11513: 11177:
Journal of the American Medical Informatics Association
8297: 7977: 7754: 7180: 7087: 7085: 7083: 6408: 5218: 3372: 3127:
that are often not recognized as such. The philosopher
2642:
is a numerical method that combines deep learning with
10152: 9846: 9671:
EURASIP Journal on Audio, Speech, and Music Processing
9583:"Convolutional Neural Networks for Speech Recognition" 6339:"Phoneme recognition using time-delay neural networks" 6019:. In Rumelhart, David E.; McLelland, James L. (eds.). 5807: 5267:
Learning Internal Representations by Error Propagation
4895: 4360: 4076: 3600: 3312:
LeCun, Yann; Bengio, Yoshua; Hinton, Geoffrey (2015).
2335:
AtomNet is a deep learning system for structure-based rational drug design.
2050:
Scale-up/out and accelerated DNN training and decoding
635:
Deep learning architectures can be constructed with a greedy layer-by-layer method.
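A rough sketch of the greedy layer-by-layer idea (a hypothetical autoencoder variant; the classical versions used RBMs or denoising autoencoders, and every name and width below is an assumption): each layer is trained in isolation to reconstruct the features of the stack below it, then frozen while the next layer is trained on top.

```python
# Greedy layer-wise construction with toy autoencoders.
import torch

def pretrain_layer(inputs, hidden_dim, steps=500):
    d = inputs.shape[1]
    enc, dec = torch.nn.Linear(d, hidden_dim), torch.nn.Linear(hidden_dim, d)
    opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)
    for _ in range(steps):
        h = torch.tanh(enc(inputs))
        loss = ((dec(h) - inputs) ** 2).mean()     # reconstruction error
        opt.zero_grad(); loss.backward(); opt.step()
    return enc, torch.tanh(enc(inputs)).detach()   # frozen features for the next layer

x = torch.randn(256, 20)                  # toy unlabeled data
stack, feats = [], x
for width in (16, 8, 4):                  # grow the network one layer at a time
    enc, feats = pretrain_layer(feats, width)
    stack.append(enc)
print([layer.out_features for layer in stack])  # [16, 8, 4]
```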
13436:"Talk to the Algorithms: AI Becomes a Faster Learner" 13103:
Güçlü, Umut; van Gerven, Marcel A. J. (8 July 2015).
12752: 12575: 11984:
Computer Methods in Applied Mechanics and Engineering
11170: 10426: 10379: 10366: 9980: 9978: 9927: 8694: 8000:"Prepare, Don't Panic: Synthetic Media and Deepfakes" 7367: 5958: 5700:
Habilitation thesis: System modeling and optimization
5221:"Learning representations by back-propagating errors" 4051: 4049: 2640:
Deep backward stochastic differential equation method
2635:
Deep backward stochastic differential equation method
1752: 1721: 13038:
Zorzi, Marco; Testolin, Alberto (19 February 2018).
11977: 11278: 10934:
Advances in Neural Information Processing Systems 26
10729: 10017: 9890:
G. W. Smith; Frederic Fol Leymarie (10 April 2017).
8543:"Sequence to Sequence Learning with Neural Networks" 8420:
International Conference on Learning Representations
8075: 7564:
Advances in Neural Information Processing Systems 25
7080: 7035: 7033: 7031: 6673: 6485: 6049: 5311:
Phoneme Recognition Using Time-Delay Neural Networks
5289:
Phoneme Recognition Using Time-Delay Neural Networks
4291:
IEEE Transactions on Systems Science and Cybernetics
2730:(specifically, neocortical development) proposed by 2573:, tax evasion detection, and anti-money laundering. 2561:, which trains on the image that needs restoration. 654:
was introduced to the machine learning community by Rina Dechter
13651: 11048: 9397: 8101: 7971: 7363: 7361: 7091: 6124: 5029:(1960). "Gradient theory of optimal flight paths". 4877:(1967). "A theory of adaptive pattern classifier". 4126:. Neural Information Processing Systems, 6231-6239. 2792:developed a system capable of learning how to play 2372:has been used to approximate the value of possible 2358: 2128:speech products, etc.) are based on deep learning. 1867:In 2021, J. Feldmann et al. proposed an integrated 1080:over output patterns. The second network learns by 12353:"GraphCast: A breakthrough in Weather Forecasting" 11572: 9975: 8540: 8383: 7588: 5867:IEEE Transactions on Autonomous Mental Development 5650: 5648: 4803:IEEE Transactions on Systems, Man, and Cybernetics 4614:IEEE Transactions on Systems, Man, and Cybernetics 4046: 3139:for image recognition or click-tracking on Google 2921:that operates on concepts in terms of grammatical 2446: 1765: 1734: 846:The first working deep learning algorithm was the 578:Fundamentally, deep learning refers to a class of 13559:"Inceptionism: Going Deeper into Neural Networks" 11677: 10964: 10335: 9954: 9228:"AI is changing the entire nature of computation" 8390:Advances in Neural Information Processing Systems 8152: 7953:"GAN 2.0: NVIDIA's Hyperrealistic Face Generator" 7945: 7777:Simonyan, Karen; Zisserman, Andrew (2015-04-10), 7776: 7028: 6916: 6783: 6212:Qian, Ning; Sejnowski, Terrence J. (1988-08-20). 4695: 4693: 3311: 2722:Relation to human cognitive and brain development 2614: 2516:specialists to improve the diagnosis efficiency. 2495:size of Y. NJEE uses continuously differentiable 1184:, later produced excellent larger-scale results. 1149:in the late 1990s. Funded by the US government's 908:in 1979, though not trained by backpropagation. 563:or latent variables organized layer-wise in deep 15037: 13905:"Whose intelligence is artificial intelligence?" 13797:"How hackers can force AI to make dumb mistakes" 13195:"Google AI algorithm masters ancient game of Go" 11647:Shrinkage Fields for Effective Image Restoration 11456: 10787:"Startup Harnesses Supercomputers to Seek Cures" 10510: 10410: 10404: 9516:TIMIT Acoustic-Phonetic Continuous Speech Corpus 8964:"Data Augmentation - deeplearning.ai | Coursera" 7895: 7358: 7092:Deng, L.; Hinton, G.; Kingsbury, B. (May 2013). 7039: 7006: 6919:"A Fast Learning Algorithm for Deep Belief Nets" 6789:"A Fast Learning Algorithm for Deep Belief Nets" 5839: 5837: 4999:Leibniz, Gottfried Wilhelm Freiherr von (1920). 4284: 4282: 3498:MIT News | Massachusetts Institute of Technology 3422: 3420: 3262:Schulz, Hannes; Behnke, Sven (1 November 2012). 3143:), (3) exploitation of social motivations (e.g. 2491:layer with number of nodes that is equal to the 2299: 2204:identifying the style period of a given painting 1311:CNNs on GPU improved performance significantly. 13102: 12582:Proceedings of the National Academy of Sciences 12509: 12404: 12098:Proceedings of the National Academy of Sciences 11770:"Scaling deep learning for materials discovery" 10669:"Toxicology in the 21st century Data Challenge" 10504: 8645:"Recurrent neural network based language model" 8617: 8615: 8593: 8591: 8477: 8078:"Google voice search: faster and more accurate" 7418: 7394: 5898: 5896: 5752:. In Kolen, John F.; Kremer, Stefan C. (eds.). 5748:Hochreiter, S.; et al. (15 January 2001). 
5724:Untersuchungen zu dynamischen neuronalen Netzen 5690: 5645: 4526:Proceedings of the National Academy of Sciences 2537:Deep learning has been successfully applied to 1901: 996:(1990), which applied RNN to study problems in 13467:"In defense of skepticism about deep learning" 12935: 12474: 11111: 10940:. Curran Associates, Inc. pp. 2643–2651. 10705:"NCATS Announces Tox21 Data Challenge Winners" 10680:"NCATS Announces Tox21 Data Challenge Winners" 10228:International Journal of Communication Systems 10146: 10121: 9155:. Neural Processing Letters 22.1 (2005): 1-16. 9022: 8413: 7857:. Las Vegas, NV, USA: IEEE. pp. 770–778. 7656: 7570:. Curran Associates, Inc. pp. 2843–2851. 7295: 7125: 7123: 7121: 6506: 5997:: CS1 maint: DOI inactive as of August 2024 ( 5622: 5620: 5618: 4732: 4690: 4635:Turing, Alan (1948). "Intelligent Machinery". 3781: 3779: 3307: 3305: 3219:List of datasets for machine-learning research 2804:well enough to beat a professional Go player. 2217: 2037:Hierarchical Convolutional Deep Maxout Network 1470:. but are more successful in computer vision. 772:and popularized in surveys such as the one by 14127: 14020:Bishop, Christopher M.; Bishop, Hugh (2024). 13836:"AI Is Easy to Fool—Why That Needs to Change" 13037: 12986: 12685: 11279:Shalev, Y.; Painsky, A.; Ben-Gal, I. (2022). 11017: 10553: 10090: 10065:. Association for Computational Linguistics. 7846: 7821: 7800: 6607: 6605: 6211: 5961:"A learning algorithm for boltzmann machines" 5851:. MIT Press/Bradford Books. pp. 222–227. 5834: 5754:A Field Guide to Dynamical Recurrent Networks 5259: 5105: 4279: 4262:Machine Learning: A Probabilistic Perspective 3417: 2425:annotations and gene-function relationships. 1711:methods such as Ivakhnenko's unit pruning or 406: 14141: 14019: 13615: 13613: 13611: 13489: 12642: 10194: 10020:"Parsing With Compositional Vector Grammars" 9960: 9455: 9453: 9451: 8612: 8588: 8536: 8534: 8532: 8478:Gers, Felix A.; Schmidhuber, Jürgen (2001). 8473: 8471: 8360:A Guide to Deep Learning and Neural Networks 8283:: CS1 maint: multiple names: authors list ( 7850:Deep Residual Learning for Image Recognition 7840: 7824:Deep Residual Learning for Image Recognition 7325: 7323: 6880:"Learning multiple layers of representation" 6719:"Learning multiple layers of representation" 6287: 5893: 5299: 4837: 4513: 4491: 4489: 4487: 4485: 4483: 4481: 4479: 4459: 4457: 4403: 4401: 4315: 3964:Mathematics of Control, Signals, and Systems 3596: 3594: 3592: 3590: 3588: 3520: 3518: 3516: 3514: 3261: 2564: 2524:Finding the appropriate mobile audience for 1299:by Dan Ciresan, Ueli Meier, Jonathan Masci, 689:. In 1989, the first proof was published by 12091: 10836: 10749: 10747: 9928:Blaise Agüera y Arcas (29 September 2017). 9624: 9580: 9145:Continuous CMAC-QRLS and its systolic array 8821:; Mohamed, Abdel-Rahman; Kingsbury, Brian; 8541:Sutskever, L.; Vinyals, O.; Le, Q. (2014). 8386:"Deep neural networks for object detection" 8071: 8069: 7118: 6538: 6536: 6191:. Cambridge, Massachusetts: The MIT Press. 5902: 5861: 5843: 5718: 5716: 5714: 5696: 5654: 5626: 5615: 5123: 5119: 5117: 4992: 4831: 4789: 4787: 4495: 4318:Applied and Computational Harmonic Analysis 4213:Foundations and Trends in Signal Processing 4199: 4197: 4195: 4193: 4191: 4059:Neural Networks: A Comprehensive Foundation 3820: 3776: 3752:Greedy layer-wise training of deep networks 3668: 3302: 1264: 878:the currently dominant training technique. 
14134: 14120: 13693: 13586:"Yes, androids do dream of electric sheep" 12556: 12167:IEEE Transactions on Computational Imaging 10672: 10604: 10464: 10462: 9766: 7748: 7725: 7703: 7213: 6854:Learning multiple layers of representation 6846: 6641: 6602: 6453: 5747: 5279: 5187: 5147: 5073: 5055: 5019: 4793: 4759: 4711: 4699: 4649: 4628: 4605: 4572: 4089:Fundamentals of Artificial Neural Networks 3742: 3664: 3662: 3660: 3658: 3656: 3654: 3652: 3650: 3534:Foundations and Trends in Machine Learning 2290: 2059:Adaptation of DNNs and related deep models 2013:Convolutional DNN w. Heterogeneous Pooling 413: 399: 13946: 13707: 13678: 13657: 13608: 13583: 13218: 13155: 13138: 13120: 13071: 12896: 12878: 12847: 12829: 12721: 12703: 12619: 12601: 12523: 12389: 12350: 12319: 12301: 12235: 12178: 12137: 12127: 12109: 12068: 12003: 11938: 11811: 11793: 11549: 11531: 11482: 11353: 11299: 11274: 11272: 11204: 11139: 11129: 10994: 10984: 10915: 10909: 10821: 10735: 10587: 10577: 10528: 10417: 10288: 10278: 10219: 9984: 9966: 9945: 9907: 9852: 9840: 9815: 9757: 9682: 9630: 9574: 9473: 9448: 9431: 9391: 9332: 9322: 9095: 8986: 8920: 8885: 8875: 8636: 8627: 8603: 8557: 8529: 8468: 8183: 8171: 8111: 8045: 7983: 7901: 7862: 7831: 7806: 7786: 7760: 7738: 7715: 7694: 7662: 7641: 7469: 7403: 7320: 7249: 6989: 6664: 6402: 6186: 6086: 6011: 5919: 5521: 5336: 4955: 4916: 4901: 4867: 4858: 4555: 4545: 4504: 4476: 4454: 4398: 4329: 4288: 4254: 4252: 4250: 4248: 4206:"Deep Learning: Methods and Applications" 4171: 4161: 4111: 4109: 3917: 3852:Multi-Valued and Universal Binary Neurons 3811: 3682: 3614: 3585: 3545: 3511: 3382: 3099:Learn how and when to remove this message 3004: 2868:(...) have no obvious ways of performing 2644:Backward stochastic differential equation 2510: 2387: 728:interpretation derives from the field of 13928: 13619: 12686:Testolin, Alberto; Zorzi, Marco (2016). 12468: 12085: 11112:Sathyanarayana, Aarti (1 January 2016). 10744: 10341: 10190: 10188: 9930:"Art in the Age of Machine Intelligence" 9923: 9921: 9919: 9892:"The Machine as Artist: An Introduction" 9885: 9883: 9881: 9719: 9554: 9164: 8772: 8351: 8192: 8146: 8066: 7910: 7671: 7388: 6777: 6690:. Proceedings of ICANN (2), pp. 220–229. 6651:; Fernández, Santiago; Gomez, Faustino; 6533: 6479: 6293: 5801: 5741: 5711: 5114: 5067: 5049: 4949: 4910: 4784: 4726: 4581:Pattern Recognition and Machine Learning 4519: 4364:Pattern Recognition and Machine Learning 4309: 4188: 3209:List of artificial intelligence projects 2825: 2280:Google Neural Machine Translation (GNMT) 2186: 2182: 2141: 1798:cerebellar model articulation controller 1692:for automatic speech recognition (ASR). 1268: 732:. It features inference, as well as the 709:The universal approximation theorem for 36: 14022:Deep learning: foundations and concepts 13924: 13922: 13920: 13918: 12692:Frontiers in Computational Neuroscience 12503: 12398: 11878: 11876: 11874: 11643: 10459: 9985:Socher, Richard; Manning, Christopher. 8744: 8742: 8642: 7815: 7677: 7629: 6699:Graves, Alex; and Schmidhuber, Jürgen; 5855: 5494: 4998: 4733:Ivakhnenko, A. G.; Lapa, V. G. (1967). 4419: 4135: 4085: 4013: 4011: 4009: 3953: 3899: 3721: 3647: 3184:Applications of artificial intelligence 2968:and caused an ANN to misclassify them. 2783:automatically tagging uploaded pictures 2660: 2463:. Practically, the DNN is trained as a 2075:and how to design them to best exploit 1643: 1039:in an RNN unfolded in time. 
The "P" in 14: 15038: 14038: 13902: 13864: 13830: 13828: 13826: 13824: 13822: 13526: 13495: 13464: 13192: 12441: 12351:Sivakumar, Ramakrishnan (2023-11-27). 11834: 11269: 10663: 10047: 10027:Proceedings of the ACL 2013 Conference 9744: 9655: 9506: 9347: 9311:ACM SIGARCH Computer Architecture News 8992: 8236:"2018 ACM A.M. Turing Award Laureates" 8198: 7794: 7635: 7329: 7219: 7129: 6963: 6877: 6717:Hinton, Geoffrey E. (1 October 2007). 6716: 6459: 5559: 5353: 5317: 5285: 5193: 5153: 5025: 4796:"Polynomial theory of complex systems" 4736:Cybernetics and Forecasting Techniques 4717: 4634: 4611: 4578: 4258: 4245: 4203: 4106: 4055: 4017: 3949: 3947: 3945: 3785: 3524: 3151:to obtain labeled facial images), (4) 2785:with the names of the people in them. 2773: 1569:are computing systems inspired by the 1161:. The speaker recognition team led by 14115: 13867:"The scientist who spots fake videos" 12879:Cash, S.; Yuste, R. (February 1999). 12447: 11742: 11712: 11251: 10797:from the original on 24 December 2015 10185: 10072:from the original on 28 December 2016 9916: 9878: 9720:McMillan, Robert (17 December 2014). 9581:Abdel-Hamid, O.; et al. (2014). 9137: 9128: 8778: 8414:Rolnick, David; Tegmark, Max (2018). 8259:Ferrie, C., & Kaiser, S. (2019). 7933:from the original on 22 November 2019 7534:10.5591/978-1-57735-516-8/ijcai11-210 5582: 5453: 5406: 5359: 5323: 4873: 4847:The Annals of Mathematical Statistics 4463: 4425: 2669: 2587:Lawrence Berkeley National Laboratory 2519: 2195:" applied using neural style transfer 1640:recognizing faces, or playing "Go"). 1209:connectionist temporal classification 14972:Generative adversarial network (GAN) 13929:Mühlhoff, Rainer (6 November 2019). 13915: 13846:from the original on 11 October 2017 13807:from the original on 11 October 2019 13777:from the original on 11 October 2019 13508:from the original on 4 November 2019 13477:from the original on 12 October 2018 13362: 13161: 12092:Han, J.; Jentzen, A.; E, W. (2018). 11871: 11625:from the original on 11 October 2019 11030:from the original on 25 January 2018 10972:Journal of Medical Internet Research 10767:from the original on 20 October 2015 10697: 10612:"Merck Molecular Activity Challenge" 10447:from the original on 8 November 2020 10432: 10342:Turovsky, Barak (15 November 2016). 10207:from the original on 27 October 2017 10134:from the original on 27 October 2017 10103:from the original on 27 October 2017 9661: 9177:from the original on 1 February 2016 8974:from the original on 1 December 2017 8926: 8739: 8484:IEEE Transactions on Neural Networks 8177: 8006:from the original on 2 December 2020 7650: 7308:from the original on 12 October 2017 6878:Hinton, Geoffrey E. (October 2007). 6765:from the original on 11 October 2013 6043: 5812:. Vol. 1999. pp. 850–855. 5131:. IDSIA, Switzerland. Archived from 4354: 4006: 3900:Fradkov, Alexander L. (2020-01-01). 3842: 3527:"Learning Deep Architectures for AI" 3368: 3366: 3189:Comparison of deep learning software 3037:adding citations to reliable sources 3008: 2576: 2532: 2131: 1965:Hidden Trajectory (Generative) Model 1787:Sweeping through the parameter space 793:of artificial neural network (ANN): 13997:from the original on 10 August 2019 13819: 13446:from the original on 28 August 2018 12564:The New York Times, Science Section 11894:from the original on 28 August 2018 10879: 10633: 9806:. Selected Papers from IJCNN 2011. 9795: 9225: 8811: 8688: 7554: 7510: 7449: 6787:; Osindero, S.; Teh, Y. W. (2006). 5198:. New York: John Wiley & Sons. 
Retrieved 13167: 13157: 13112: 13108: 13098: 13047: 13043: 13033: 12992: 12988: 12982: 12941: 12937: 12931: 12888: 12884: 12874: 12813: 12809: 12799: 12758: 12754: 12748: 12695: 12691: 12681: 12648: 12644: 12638: 12585: 12581: 12571: 12563: 12558: 12515: 12511: 12505: 12480: 12476: 12470: 12450: 12443: 12410: 12406: 12400: 12381: 12371: 12360:. Retrieved 12356: 12346: 12293: 12289: 12278: 12227: 12223: 12213: 12170: 12166: 12156: 12101: 12097: 12087: 12044: 12040: 12030: 11987: 11983: 11973: 11922: 11918: 11908: 11896:. Retrieved 11887: 11840: 11830: 11777: 11773: 11763: 11752:. Retrieved 11748: 11738: 11727:. Retrieved 11718: 11708: 11683: 11679: 11673: 11662:. Retrieved 11646: 11639: 11627:. Retrieved 11618: 11609: 11574: 11568: 11523: 11519: 11509: 11466: 11462: 11452: 11441:. Retrieved 11406: 11396: 11345: 11341: 11334: 11291: 11287: 11260:. Retrieved 11256: 11247: 11236:. Retrieved 11232: 11223: 11180: 11176: 11166: 11121: 11117: 11107: 11095:. Retrieved 11069:11311/964622 11051: 11044: 11032:. Retrieved 11023: 11013: 10976: 10970: 10960: 10949:. Retrieved 10933: 10926: 10905: 10894:. Retrieved 10885: 10875: 10842: 10838: 10832: 10811: 10799:. Retrieved 10790: 10781: 10769:. Retrieved 10758: 10725: 10713:. Retrieved 10709:the original 10699: 10688:. Retrieved 10674: 10665: 10653:. Retrieved 10644: 10635: 10624:. Retrieved 10615: 10606: 10569: 10565: 10555: 10520: 10516: 10506: 10497: 10485:. Retrieved 10478:the original 10449:. Retrieved 10438: 10428: 10406: 10394:. Retrieved 10385: 10356:. Retrieved 10347: 10337: 10325:. Retrieved 10316: 10307: 10270: 10266: 10256: 10231: 10227: 10221: 10209:. Retrieved 10200: 10162: 10158: 10148: 10136:. Retrieved 10127: 10117: 10105:. Retrieved 10096: 10086: 10074:. Retrieved 10062: 10049: 10038:. Retrieved 10026: 10013: 10001:. Retrieved 9956: 9937: 9933: 9899: 9895: 9871: 9863: 9842: 9807: 9803: 9797: 9786:. Retrieved 9777: 9768: 9746: 9734:. Retrieved 9725: 9715: 9704:. Retrieved 9674: 9670: 9657: 9632: 9626: 9615:. Retrieved 9590: 9586: 9576: 9559: 9550: 9538:. Retrieved 9515: 9508: 9468:(2): 52–58. 9465: 9461: 9410:(2): 72–77. 9407: 9403: 9393: 9382:. Retrieved 9380:. 2021-04-20 9377: 9368: 9357:. Retrieved 9353: 9343: 9314: 9310: 9300: 9291: 9282: 9270:. Retrieved 9261: 9252: 9240:. Retrieved 9231: 9221: 9209:. Retrieved 9200: 9191: 9179:. Retrieved 9170: 9160: 9139: 9130: 9087: 9083: 9077: 9065:. Retrieved 9029: 9018: 9007:. Retrieved 8998: 8988: 8976:. Retrieved 8967: 8958: 8947:. Retrieved 8935: 8922: 8867: 8861: 8826: 8813: 8788: 8784: 8774: 8762:. Retrieved 8754:ResearchGate 8753: 8700: 8696: 8690: 8679:. Retrieved 8651: 8638: 8579:. Retrieved 8549: 8520:. Retrieved 8487: 8483: 8458:. Retrieved 8454:the original 8449: 8439: 8428:. Retrieved 8419: 8409: 8398:. Retrieved 8389: 8379: 8369:, retrieved 8359: 8353: 8304: 8300: 8293: 8260: 8254: 8243:. Retrieved 8239: 8230: 8219:. Retrieved 8207: 8194: 8173: 8162:the original 8148: 8103: 8097: 8086:. Retrieved 8056: 8037: 8033: 8020: 8008:. Retrieved 7994: 7973: 7961:. Retrieved 7956: 7947: 7935:. Retrieved 7919: 7912: 7891: 7854: 7849: 7842: 7823: 7817: 7796: 7778: 7772: 7750: 7727: 7705: 7686: 7673: 7652: 7631: 7596: 7590: 7579:. Retrieved 7563: 7556: 7545:. Retrieved 7525: 7512: 7461: 7457: 7451: 7424: 7414: 7390: 7380:, retrieved 7370: 7335: 7331: 7310:. Retrieved 7301: 7291: 7280:. Retrieved 7255: 7245: 7234:. Retrieved 7225: 7215: 7204:. Retrieved 7195: 7170:. Retrieved 7161: 7152: 7136:. Springer. 7132: 7108:. Retrieved 7052:(6): 82–97. 
7049: 7045: 7008: 6973: 6970:Scholarpedia 6969: 6959: 6926: 6922: 6912: 6887: 6883: 6873: 6865: 6848: 6837:. Retrieved 6801: 6795: 6779: 6767:. Retrieved 6726: 6722: 6712: 6704: 6700: 6695: 6675: 6656: 6649:Graves, Alex 6643: 6632:. Retrieved 6620: 6592:. Retrieved 6584:ResearchGate 6583: 6574: 6565: 6548: 6544: 6518: 6514: 6508: 6491: 6487: 6481: 6470:. Retrieved 6455: 6439:1721.1/51891 6417:(3): 75–80. 6414: 6410: 6404: 6393:. Retrieved 6349: 6345: 6332: 6321:. Retrieved 6302: 6295:Robinson, T. 6289: 6264: 6260: 6254: 6221: 6217: 6207: 6188: 6182: 6141: 6137: 6130:Dayan, Peter 6120: 6070: 6066: 6051:Peter, Dayan 6045: 6021: 6007: 5993:cite journal 5968: 5964: 5954: 5911: 5907: 5870: 5866: 5857: 5848: 5809: 5803: 5786: 5772: 5753: 5743: 5735: 5699: 5692: 5667: 5663: 5638: 5592: 5588: 5578: 5569: 5565: 5555: 5543:. Retrieved 5513: 5509: 5496: 5463: 5459: 5449: 5416: 5412: 5402: 5369: 5365: 5355: 5347: 5343: 5338: 5329: 5319: 5309: 5301: 5288: 5281: 5261: 5228: 5224: 5214: 5195: 5189: 5177:. Retrieved 5165: 5155:Werbos, Paul 5149: 5137:. Retrieved 5133:the original 5107: 5082: 5078: 5069: 5060: 5051: 5034: 5030: 5021: 5001: 4994: 4961: 4958:Biol. Cybern 4957: 4951: 4918: 4912: 4891: 4882: 4878: 4869: 4850: 4846: 4833: 4822:. Retrieved 4802: 4769: 4765: 4755: 4735: 4728: 4719: 4713: 4704: 4659: 4655: 4645: 4636: 4630: 4613: 4607: 4580: 4574: 4529: 4525: 4515: 4469: 4465: 4431: 4427: 4421: 4412: 4389:. Retrieved 4370:. Springer. 4363: 4356: 4321: 4317: 4311: 4294: 4290: 4261: 4236:. Retrieved 4216: 4212: 4145: 4141: 4131: 4088: 4058: 4023: 4019: 3997:the original 3968: 3962: 3909: 3905: 3895: 3886: 3874:. Retrieved 3851: 3844: 3827:Rina Dechter 3822: 3795: 3792:Scholarpedia 3791: 3767:. Retrieved 3751: 3744: 3724: 3717: 3674: 3670: 3606: 3602: 3575:. Retrieved 3568:the original 3540:(1): 1–127. 3537: 3533: 3502:. Retrieved 3500:. 2022-11-02 3497: 3488: 3476:. Retrieved 3467: 3458: 3447:. Retrieved 3435: 3374: 3324: 3320: 3271: 3267: 3257: 3169: 3133:gamification 3110: 3095: 3086: 3076: 3069: 3062: 3055: 3043: 3031:Please help 3026:verification 3023: 2996: 2989: 2970: 2955: 2951: 2944: 2936: 2933:Cyber threat 2903: 2892:The Guardian 2890: 2887: 2863: 2851: 2843: 2829: 2813: 2810: 2787: 2777: 2768: 2753: 2725: 2688: 2673: 2664: 2652: 2638: 2618: 2610: 2580: 2568: 2536: 2523: 2514: 2450: 2434: 2427: 2412: 2397: 2368: 2349: 2334: 2314: 2294: 2270: 2259: 2243:vector space 2236: 2230: 2227: 2198: 2177: 2173: 2165: 2106:Amazon Alexa 2091: 2045: 1915: 1911: 1897:Applications 1880:multiplexing 1866: 1851: 1820: 1807: 1795: 1780: 1713:weight decay 1706: 1699: 1684: 1674: 1670: 1666: 1655: 1651: 1647: 1638: 1619: 1611: 1607: 1603:real numbers 1584: 1563: 1559: 1555: 1554: 1486:Turing Award 1472: 1449: 1438: 1400: 1390:(2015), and 1385: 1374: 1367: 1328: 1313: 1296: 1294: 1283: 1279: 1276: 1251: 1247: 1243:MNIST images 1224:Geoff Hinton 1221: 1202: 1191: 1178:Mel-Cepstral 1171: 1140: 1125: 1090: 1063: 1045: 1032: 1028: 1020: 1002: 984: 949: 910: 902:Neocognitron 895: 880: 857: 845: 835: 813:created the 807:Wilhelm Lenz 788: 736:concepts of 734:optimization 723: 708: 680: 669: 656:Rina Dechter 651: 649: 641: 634: 621: 619: 605: 603: 577: 557:transformers 546: 530: 486:transformers 463: 459:unsupervised 425: 424: 298:Chinese room 187:Applications 153: 15014:Categories 14962:Autoencoder 14917:Transformer 14785:Alex Graves 14733:OpenAI Five 14637:IBM Watsonx 14259:Convolution 14237:Overfitting 14001:22 November 13471:Gary Marcus 12391:10.14336/AD 12173:: 489–504. 11925:: 686–707. 11888:EurekAlert! 
11749:VentureBeat 11526:(7): 1819. 11463:IEEE Access 11124:(4): e125. 11097:23 November 10076:21 December 9810:: 333–338. 9540:27 December 9378:VentureBeat 9317:(2): 1–12. 9090:: 197–227. 8978:30 November 8652:Interspeech 8010:25 November 7196:Interspeech 7110:27 December 7042:Sainath, T. 6976:(5): 5947. 6659:: 369–376. 5031:ARS Journal 4839:Robbins, H. 3876:27 December 3798:(5): 5947. 3577:3 September 2958:psychedelic 2858:Gary Marcus 2749:transducers 2415:autoencoder 2341:Ebola virus 2326:off-targets 1702:overfitting 1632:filtering, 1519:sea urchins 1432:(2022) and 1363:Inceptionv3 1353:network by 1309:max-pooling 1205:Alex Graves 1174:filter-bank 1097:Peter Dayan 1033:automatizer 956:Alex Waibel 946:1980s-2000s 936:Paul Werbos 918:derived by 862:trained by 831:Alan Turing 815:Ising model 811:Ernst Ising 785:Before 1980 758:regularizer 537:human brain 514:drug design 327:Turing test 303:Friendly AI 74:Major goals 15003:Technology 14856:EleutherAI 14815:Fei-Fei Li 14810:Yann LeCun 14723:Q-learning 14706:Decisional 14632:IBM Watson 14540:Midjourney 14432:TensorFlow 14279:Activation 14232:Regression 14227:Clustering 14089:2021-05-09 13889:2017-10-11 13850:11 October 13811:11 October 13781:11 October 13752:2015-05-10 13638:2015-05-10 13543:2017-06-14 13512:2 November 13481:11 October 13348:30 January 13255:Huang, Aja 12362:2024-05-19 12303:2212.12794 12237:2006.14395 12180:2008.11625 12111:1707.02568 11990:: 112789. 11754:2023-12-19 11729:2018-07-15 11686:: 105048. 11664:2018-01-01 11629:11 October 11443:2019-11-12 11355:1702.05747 11301:2012.11197 11262:2024-05-10 11238:2024-05-10 10951:2017-06-14 10917:1504.01840 10896:2019-09-05 10823:1704.01212 10801:9 November 10771:9 November 10737:1510.02855 10690:2015-03-05 10626:2020-07-16 10616:kaggle.com 10589:1942/18723 10523:(8): 569. 10487:1 December 10451:12 October 10419:1609.08144 10273:(1): 157. 10040:2014-09-03 10003:26 October 9788:2014-01-28 9706:2019-04-01 9617:2018-04-20 9475:2002.00281 9384:2022-08-03 9359:2022-08-03 9324:1704.04760 9181:23 October 9097:1702.07908 9009:2017-06-13 8949:2017-06-13 8681:2017-06-13 8629:1512.00103 8605:1602.02410 8581:2017-06-13 8550:Proc. NIPS 8522:2020-02-25 8430:2021-01-05 8400:2017-06-13 8371:2020-11-16 8245:2024-08-07 8221:2017-06-13 8208:Google.com 8113:2102.04029 8088:2016-04-09 8047:1503.03585 7985:1710.10196 7963:October 3, 7903:1508.06576 7864:1512.03385 7833:1512.03385 7808:1502.01852 7581:2017-06-13 7547:2017-06-13 7405:1703.09039 7382:2021-02-14 7282:2017-06-14 7236:2017-06-14 7206:2017-06-12 7013:Yann LeCun 6839:2011-07-20 6634:2016-04-09 6472:2017-06-12 6395:2019-09-24 6323:2017-06-12 5921:1906.04493 5545:October 7, 4903:1710.05941 4853:(3): 400. 4824:2019-11-05 4766:Automatica 4506:2212.11279 4391:2017-08-06 4331:1505.03654 4238:2014-10-18 4148:(1): 138. 3769:2019-10-06 3677:: 85–117. 3504:2023-12-06 3468:TechCrunch 3449:2017-05-24 3250:References 3163:) and (5) 3089:April 2021 3059:newspapers 2992:Google Now 2966:stop signs 2838:See also: 2551:inpainting 2465:classifier 2310:Toxicology 2193:The Scream 2114:Apple Siri 2110:Google Now 2063:Multi-task 1888:integrated 1876:wavelength 1864:(FGFETs). 
1696:Challenges 1658:primitives 1482:Yann LeCun 1458:(ASR) and 1445:smartphone 1404:(GAN) by ( 1239:fine-tuned 1163:Larry Heck 1115:, and the 1025:distilling 980:Yann LeCun 960:Yann LeCun 916:chain rule 801:(MLP) and 606:on its own 583:algorithms 526:board game 451:supervised 332:Regulation 286:Philosophy 241:Healthcare 236:Government 138:Approaches 14886:MIT CSAIL 14851:Anthropic 14820:Andrew Ng 14718:AlphaZero 14562:VideoPoet 14525:AlphaFold 14462:MindSpore 14416:SpiNNaker 14411:Memristor 14318:Diffusion 14294:Rectifier 14274:Batchnorm 14254:Attention 14249:Adversary 14098:cite book 13991:1059-1028 13965:209363848 13957:1461-4448 13704:CiteSeerX 13680:1312.6199 13659:1412.1897 13450:29 August 13297:0028-0836 13178:26 August 13122:1411.6422 13064:0962-8436 13009:1546-1726 12958:0959-4388 12907:0896-6273 12840:1553-7358 12775:2397-3374 12714:1662-5188 12665:0899-7667 12612:0027-8424 12520:CiteSeerX 12330:0036-8075 12270:220055785 12254:1525-8955 12205:235340737 12197:2333-9403 12022:212755458 12014:0045-7825 11949:0021-9991 11898:29 August 11865:265503872 11804:1476-4687 11700:204092079 11542:2072-6694 11501:220733699 11493:2169-3536 11348:: 60–88. 11326:229339809 11197:1067-5027 11087:207217210 10867:201716327 9968:1402.3722 9940:(4): 18. 9854:1404.3840 9812:CiteSeerX 9759:1412.5567 9693:217950236 9607:206602362 9500:211010976 8882:CiteSeerX 8877:1212.0901 8717:0899-7667 8559:1409.3215 8329:1476-4687 8279:cite book 8185:1410.4281 8140:231846518 7937:20 August 7788:1409.1556 7762:1411.2539 7740:1411.4952 7717:1411.4555 7696:1409.4842 7664:1409.1556 7643:1112.6209 7488:0899-7667 7471:1003.0358 7074:206485943 7000:1941-6016 6943:0899-7667 6743:1364-6613 6661:CiteSeerX 6374:0096-3518 6281:0218-0014 6238:0022-2836 5985:0364-0213 5946:216056336 5914:: 58–66. 5796:Q98967430 5609:0364-0213 5518:CiteSeerX 5253:1476-4687 5099:122357351 4986:206775608 4943:206775608 4676:1939-1471 3936:235081987 3928:2405-8963 3684:1404.7828 3616:1206.5538 3564:207178999 3542:CiteSeerX 3384:1202.2745 3296:220523562 3288:1610-1987 3165:clickwork 3125:microwork 3119:(e.g. on 3117:clickwork 2973:deception 2896:website. 2854:strong AI 2847:black box 2788:Google's 2745:neocortex 2696:CpG sites 2543:denoising 2441:AlphaFold 2079:of speech 1891:photonics 1878:division 1857:based on 1755:ℓ 1724:ℓ 1574:manually 1422:deepfakes 1320:Jeff Dean 1316:Andrew Ng 1314:In 2012, 1286:Andrew Ng 1182:waveforms 1064:In 1991, 1021:collapsed 1015:to learn 881:In 1969, 797:(FNN) or 650:The term 614:discovers 571:and deep 362:AI winter 263:Military 126:AI safety 15040:Category 14994:Portals 14753:Auto-GPT 14585:Word2vec 14389:Hardware 14306:Datasets 14208:Concepts 14083:Archived 13995:Archived 13911:: 38–39. 
13883:Archived 13844:Archived 13805:Archived 13775:Archived 13629:Archived 13594:Archived 13563:Archived 13537:Archived 13506:Archived 13475:Archived 13444:Archived 13373:Archived 13305:26819042 13229:26819021 13172:Archived 13149:26157000 13090:39281431 13082:29292348 13025:16970545 13017:26906502 12974:16560320 12966:15321069 12923:14663106 12915:10069343 12858:22096452 12791:24504018 12783:31024135 12732:27468262 12542:10097006 12497:12662587 12427:12396572 12338:37962497 12262:32746211 12148:30082389 12079:32001523 11965:57379996 11892:Archived 11857:38030771 11822:38030720 11813:10700131 11723:Archived 11655:Archived 11623:Archived 11601:35350962 11560:35406591 11437:Archived 11380:28778026 11318:36155469 11215:27521897 11150:27815231 11091:Archived 11028:Archived 11005:31127715 10942:Archived 10890:Archived 10859:31477924 10795:Archived 10765:Archived 10684:Archived 10649:Archived 10620:Archived 10598:25582842 10547:20246434 10539:23903212 10445:Archived 10396:23 March 10390:Archived 10358:23 March 10352:Archived 10321:Archived 10299:36855134 10248:40745740 10205:Archived 10132:Archived 10101:Archived 10067:Archived 10031:Archived 9994:Archived 9902:(4): 5. 9834:22386783 9782:Archived 9730:Archived 9697:Archived 9649:15641618 9611:Archived 9492:33408373 9442:33149289 9354:Datanami 9266:Archived 9236:Archived 9205:Archived 9175:Archived 9149:Archived 9122:14135321 9061:Archived 9003:Archived 8972:Archived 8968:Coursera 8940:Archived 8914:12485056 8853:13816461 8805:14542261 8758:Archived 8672:Archived 8668:17048224 8572:Archived 8516:Archived 8512:10192330 8504:18249962 8424:Archived 8394:Archived 8365:archived 8337:26819042 8212:Archived 8082:Archived 8004:Archived 7928:Archived 7687:Cvpr2015 7623:24579167 7572:Archived 7538:Archived 7496:20858131 7376:archived 7306:Archived 7276:Archived 7230:Archived 7200:Archived 7172:16 March 7166:Archived 7101:Archived 7020:Archived 6951:16764513 6904:17921042 6858:Archived 6830:Archived 6818:16764513 6763:Archived 6759:15066318 6751:17921042 6684:Archived 6625:Archived 6588:Archived 6466:Archived 6386:Archived 6317:Archived 6297:(1992). 5938:32334341 5792:Wikidata 5728:Archived 5684:18271205 5540:14542261 5441:20706526 5394:20577468 5308:et al., 5270:Archived 5170:Archived 5157:(1982). 5059:(1970). 4815:Archived 4703:(1962). 4684:13602029 4382:Archived 4348:12149203 4229:Archived 4182:28743932 4120:Archived 3834:Archived 3760:Archived 3709:11715509 3701:25462637 3633:23787338 3472:Archived 3440:Archived 3349:26017442 3178:See also 3149:Facebook 3137:CAPTCHAs 2907:Goertzel 2779:Facebook 2607:Military 2541:such as 2493:alphabet 2350:In 2017 2282:uses an 2238:word2vec 1918:dialects 1869:photonic 1821:Special 1804:Hardware 1791:batching 1744:sparsity 1523:features 1515:starfish 1436:(2022). 1430:DALL·E 2 1418:StyleGAN 1377:training 1326:videos. 1088:(GANs). 
1055:residual 770:Narendra 762:Hopfield 738:training 551:such as 543:Overview 385:Glossary 379:Glossary 357:Progress 352:Timeline 312:Takeover 273:Projects 246:Industry 209:Finance 199:Deepfake 149:Symbolic 121:Robotics 96:Planning 14876:Meta AI 14713:AlphaGo 14697:PanGu-Σ 14667:ChatGPT 14642:Granite 14590:Seq2seq 14569:Whisper 14490:WaveNet 14485:AlexNet 14457:Flux.jl 14437:PyTorch 14289:Sigmoid 14284:Softmax 14149:General 13771:Gizmodo 13600:20 June 13569:20 June 13422:5613334 13277:Bibcode 13237:4460235 13207:Bibcode 13140:6605414 13073:5784047 12866:7504633 12849:3207943 12818:Bibcode 12740:9868901 12723:4943066 12673:2376781 12630:1903542 12590:Bibcode 12550:5818342 12435:1119517 12308:Bibcode 12290:Science 12139:6112690 12116:Bibcode 12070:7219083 12049:Bibcode 12041:Science 11992:Bibcode 11957:1595805 11927:Bibcode 11782:Bibcode 11551:8997449 11520:Cancers 11471:Bibcode 11433:4728736 11388:2088679 11360:Bibcode 11206:5391725 11158:3821594 11141:5116102 11034:14 June 10996:6555124 10715:5 March 10655:14 June 10327:14 June 10290:9972634 10211:14 June 10179:1317136 10138:14 June 10107:14 June 9736:14 June 9433:7116757 9412:Bibcode 9272:11 June 9242:11 June 9211:11 June 9102:Bibcode 9067:5 March 9057:8869270 8764:13 June 8733:1915014 8725:9377276 8564:Bibcode 8460:10 July 8309:Bibcode 7504:1918673 7340:Bibcode 7312:14 June 7054:Bibcode 6978:Bibcode 6826:2309950 6769:12 June 6594:14 June 6419:Bibcode 6382:9563026 6246:3172241 6166:7761831 6146:Bibcode 6138:Science 6105:1890561 6097:7584891 6027:194–281 5738:, 1991. 5488:8058017 5468:Bibcode 5421:Bibcode 5374:Bibcode 5233:Bibcode 4978:7370364 4935:7370364 4566:6953413 4534:Bibcode 4436:Bibcode 4173:5527101 4150:Bibcode 4040:7343126 3993:3958369 3973:Bibcode 3800:Bibcode 3478:17 June 3411:2161592 3357:3074096 3329:Bibcode 3073:scholar 2981:malware 2977:malware 2860:noted: 2798:AlphaGo 2712:obesity 2489:Softmax 2122:iFlyTek 2094:Cortana 1825:called 1775:dropout 1599:synapse 1591:neurons 1576:labeled 1567:systems 1331:AlexNet 1324:YouTube 1041:ChatGPT 1029:chunker 780:History 754:dropout 742:testing 695:sigmoid 660:Boolean 367:AI boom 345:History 268:Physics 14891:Huawei 14871:OpenAI 14773:People 14743:MuZero 14605:Gemini 14600:Claude 14535:DALL-E 14447:Theano 14075:  14047:  14028:  13989:  13963:  13955:  13871:Nature 13706:  13420:  13410:  13379:5 July 13313:515925 13311:  13303:  13295:  13268:Nature 13235:  13227:  13199:Nature 13147:  13137:  13088:  13080:  13070:  13062:  13023:  13015:  13007:  12972:  12964:  12956:  12921:  12913:  12905:  12885:Neuron 12864:  12856:  12846:  12838:  12789:  12781:  12773:  12738:  12730:  12720:  12712:  12698:: 73. 
12671:  12663:  12628:  12618:  12610:  12548:  12540:  12522:  12495:  12458:  12433:  12425:  12357:Medium 12336:  12328:  12268:  12260:  12252:  12203:  12195:  12146:  12136:  12077:  12067:  12020:  12012:  11963:  11955:  11947:  11863:  11855:  11841:Nature 11820:  11810:  11802:  11774:Nature 11698:  11599:  11589:  11558:  11548:  11540:  11499:  11491:  11431:  11421:  11386:  11378:  11324:  11316:  11213:  11203:  11195:  11156:  11148:  11138:  11085:  11075:  11003:  10993:  10865:  10857:  10596:  10545:  10537:  10297:  10287:  10246:  10177:  9832:  9814:  9691:  9647:  9605:  9531:  9498:  9490:  9462:Nature 9440:  9430:  9404:Nature 9262:OpenAI 9120:  9055:  9045:  8936:ICASSP 8912:  8902:  8884:  8851:  8841:  8803:  8731:  8723:  8715:  8666:  8510:  8502:  8345:515925 8343:  8335:  8327:  8301:Nature 8267:  8138:  8128:  7879:  7621:  7611:  7502:  7494:  7486:  7439:  7272:398770 7270:  7140:  7072:  7017:Online 6998:  6949:  6941:  6902:  6824:  6816:  6757:  6749:  6741:  6663:  6447:357467 6445:  6380:  6372:  6309:  6303:ICASSP 6279:  6244:  6236:  6195:  6174:871473 6172:  6164:  6103:  6095:  6033:  5983:  5944:  5936:  5887:234198 5885:  5824:  5794:  5760:  5682:  5607:  5538:  5520:  5486:  5439:  5392:  5344:et al. 5342:LeCun 5251:  5225:Nature 5202:  5179:2 July 5139:14 Sep 5097:  5009:  4984:  4976:  4941:  4933:  4743:  4682:  4674:  4595:  4564:  4557:346238 4554:  4374:  4346:  4269:  4180:  4170:  4096:  4066:  4038:  3991:  3934:  3926:  3867:  3831:Online 3732:  3707:  3699:  3641:393948 3639:  3631:  3562:  3544:  3409:  3399:  3355:  3347:  3321:Nature 3294:  3286:  3075:  3068:  3061:  3054:  3046:  2948:TinEye 2929:(AI). 2900:Errors 2874:Watson 2834:Theory 2553:, and 2485:pixels 2473:matrix 2469:vector 2432:data. 
2328:, and 2126:Nuance 1939:Method 1930:bigram 1831:Huawei 1815:OpenAI 1414:Nvidia 1396:VGG-19 1351:VGG-16 1341:, and 1303:, and 1297:DanNet 1037:layers 774:Bishop 766:Widrow 637:greedy 599:pixels 595:tensor 488:, and 317:Ethics 14957:Mamba 14728:SARSA 14692:LLaMA 14687:BLOOM 14672:GPT-J 14662:GPT-4 14657:GPT-3 14652:GPT-2 14647:GPT-1 14610:LaMDA 14442:Keras 13983:Wired 13961:S2CID 13675:arXiv 13654:arXiv 13632:(PDF) 13625:(PDF) 13418:S2CID 13309:S2CID 13233:S2CID 13168:Wired 13117:arXiv 13086:S2CID 13021:S2CID 12970:S2CID 12919:S2CID 12862:S2CID 12787:S2CID 12736:S2CID 12669:S2CID 12621:51674 12546:S2CID 12431:S2CID 12298:arXiv 12266:S2CID 12232:arXiv 12201:S2CID 12175:arXiv 12106:arXiv 12018:S2CID 11961:S2CID 11861:S2CID 11696:S2CID 11658:(PDF) 11651:(PDF) 11597:S2CID 11497:S2CID 11429:S2CID 11384:S2CID 11350:arXiv 11322:S2CID 11296:arXiv 11284:(PDF) 11154:S2CID 11083:S2CID 10945:(PDF) 10938:(PDF) 10912:arXiv 10886:Wired 10863:S2CID 10818:arXiv 10732:arXiv 10543:S2CID 10481:(PDF) 10474:(PDF) 10440:Wired 10414:arXiv 10244:S2CID 10175:S2CID 10070:(PDF) 10059:(PDF) 10034:(PDF) 10023:(PDF) 9997:(PDF) 9990:(PDF) 9963:arXiv 9849:arXiv 9754:arXiv 9726:Wired 9700:(PDF) 9689:S2CID 9667:(PDF) 9645:S2CID 9603:S2CID 9496:S2CID 9470:arXiv 9319:arXiv 9232:ZDNet 9118:S2CID 9092:arXiv 9053:S2CID 8943:(PDF) 8932:(PDF) 8910:S2CID 8872:arXiv 8849:S2CID 8801:S2CID 8729:S2CID 8675:(PDF) 8664:S2CID 8648:(PDF) 8624:arXiv 8600:arXiv 8575:(PDF) 8554:arXiv 8546:(PDF) 8508:S2CID 8341:S2CID 8215:(PDF) 8204:(PDF) 8180:arXiv 8165:(PDF) 8158:(PDF) 8136:S2CID 8108:arXiv 8042:arXiv 8030:(PDF) 7980:arXiv 7931:(PDF) 7924:(PDF) 7898:arXiv 7859:arXiv 7828:arXiv 7803:arXiv 7783:arXiv 7757:arXiv 7735:arXiv 7712:arXiv 7691:arXiv 7683:(PDF) 7659:arXiv 7638:arXiv 7575:(PDF) 7568:(PDF) 7541:(PDF) 7522:(PDF) 7500:S2CID 7466:arXiv 7400:arXiv 7268:S2CID 7104:(PDF) 7097:(PDF) 7070:S2CID 6833:(PDF) 6822:S2CID 6792:(PDF) 6755:S2CID 6628:(PDF) 6617:(PDF) 6443:S2CID 6389:(PDF) 6378:S2CID 6342:(PDF) 6170:S2CID 6101:S2CID 6017:(PDF) 5942:S2CID 5916:arXiv 5883:S2CID 5704:(PDF) 5680:S2CID 5660:(PDF) 5635:(PDF) 5536:S2CID 5506:(PDF) 5293:(PDF) 5173:(PDF) 5162:(PDF) 5095:S2CID 4982:S2CID 4939:S2CID 4898:arXiv 4818:(PDF) 4799:(PDF) 4501:arXiv 4385:(PDF) 4368:(PDF) 4344:S2CID 4326:arXiv 4232:(PDF) 4209:(PDF) 4036:S2CID 4000:(PDF) 3989:S2CID 3959:(PDF) 3932:S2CID 3763:(PDF) 3756:(PDF) 3705:S2CID 3679:arXiv 3637:S2CID 3611:arXiv 3571:(PDF) 3560:S2CID 3530:(PDF) 3443:(PDF) 3432:(PDF) 3407:S2CID 3379:arXiv 3353:S2CID 3317:(PDF) 3292:S2CID 3080:JSTOR 3066:books 2794:Atari 2118:Baidu 2040:16.5 2032:17.8 2024:18.3 2016:18.7 2008:20.0 2000:20.7 1992:21.7 1984:22.4 1976:23.4 1968:24.8 1960:25.6 1952:26.1 1926:phone 1593:in a 1562:) or 1540:shell 1460:MNIST 1456:TIMIT 1255:TIMIT 1188:2000s 1155:DARPA 1141:Most 976:LeNet 964:LeNet 791:types 591:image 436:with 229:Music 224:Audio 14881:Mila 14682:PaLM 14615:Bard 14595:BERT 14578:Text 14557:Sora 14104:link 14073:ISBN 14045:ISBN 14026:ISBN 14003:2019 13987:ISSN 13953:ISSN 13852:2017 13813:2019 13783:2019 13602:2015 13571:2015 13514:2017 13483:2018 13452:2018 13408:ISBN 13381:2019 13350:2016 13301:PMID 13293:ISSN 13225:PMID 13180:2017 13145:PMID 13078:PMID 13060:ISSN 13013:PMID 13005:ISSN 12962:PMID 12954:ISSN 12911:PMID 12903:ISSN 12854:PMID 12836:ISSN 12779:PMID 12771:ISSN 12728:PMID 12710:ISSN 12661:ISSN 12626:PMID 12608:ISSN 12538:PMID 12493:PMID 12456:ISBN 12423:PMID 12334:PMID 12326:ISSN 12258:PMID 12250:ISSN 12193:ISSN 12144:PMID 12075:PMID 12010:ISSN 11953:OSTI 11945:ISSN 11900:2018 11853:PMID 
11818:PMID 11800:ISSN 11631:2019 11587:ISBN 11556:PMID 11538:ISSN 11489:ISSN 11419:ISBN 11376:PMID 11314:PMID 11257:CNBC 11211:PMID 11193:ISSN 11146:PMID 11099:2015 11073:ISBN 11036:2017 11001:PMID 10855:PMID 10803:2015 10773:2015 10717:2015 10657:2017 10594:PMID 10535:PMID 10489:2016 10453:2017 10398:2017 10360:2017 10329:2017 10295:PMID 10213:2017 10140:2017 10109:2017 10078:2023 10005:2014 9934:Arts 9896:Arts 9830:PMID 9738:2017 9675:2015 9542:2023 9529:ISBN 9488:PMID 9438:PMID 9274:2020 9244:2020 9213:2020 9183:2015 9069:2018 9043:ISBN 8980:2017 8900:ISBN 8839:ISBN 8766:2017 8721:PMID 8713:ISSN 8500:PMID 8462:2018 8333:PMID 8325:ISSN 8285:link 8265:ISBN 8126:ISBN 8012:2020 7965:2019 7939:2019 7877:ISBN 7619:PMID 7609:ISBN 7492:PMID 7484:ISSN 7437:ISBN 7314:2017 7174:2018 7138:ISBN 7112:2023 6996:ISSN 6947:PMID 6939:ISSN 6900:PMID 6814:PMID 6771:2017 6747:PMID 6739:ISSN 6596:2017 6370:ISSN 6307:ISBN 6277:ISSN 6242:PMID 6234:ISSN 6193:ISBN 6162:PMID 6093:PMID 6031:ISBN 5999:link 5981:ISSN 5934:PMID 5822:ISBN 5758:ISBN 5605:ISSN 5547:2016 5484:PMID 5437:PMID 5390:PMID 5249:ISSN 5200:ISBN 5181:2017 5141:2024 5007:ISBN 4974:PMID 4931:PMID 4741:ISBN 4680:PMID 4672:ISSN 4593:ISBN 4562:PMID 4372:ISBN 4267:ISBN 4178:PMID 4094:ISBN 4064:ISBN 3924:ISSN 3878:2023 3865:ISBN 3730:ISBN 3697:PMID 3629:PMID 3579:2015 3480:2018 3397:ISBN 3345:PMID 3284:ISSN 3052:news 2997:In " 2762:and 2585:and 2343:and 2308:and 2120:and 2098:Xbox 2073:CNNs 2065:and 1560:ANNs 1517:and 1480:and 1468:LSTM 1357:and 1318:and 1196:and 1167:NIST 1153:and 966:for 950:The 887:ReLU 809:and 768:and 740:and 724:The 715:ReLU 693:for 555:and 14622:NMT 14505:OCR 14500:HWR 14452:JAX 14406:VPU 14401:TPU 14396:IPU 14220:SGD 13943:doi 13875:doi 13714:doi 13400:doi 13285:doi 13273:529 13215:doi 13203:529 13135:PMC 13127:doi 13068:PMC 13052:doi 13048:373 12997:doi 12946:doi 12893:doi 12844:PMC 12826:doi 12763:doi 12718:PMC 12700:doi 12653:doi 12616:PMC 12598:doi 12530:doi 12485:doi 12415:doi 12386:doi 12316:doi 12294:382 12242:doi 12185:doi 12134:PMC 12124:doi 12102:115 12065:PMC 12057:doi 12045:367 12000:doi 11988:360 11935:doi 11923:378 11845:doi 11808:PMC 11790:doi 11778:624 11688:doi 11684:188 11579:doi 11546:PMC 11528:doi 11479:doi 11411:doi 11368:doi 11306:doi 11201:PMC 11185:doi 11136:PMC 11126:doi 11065:hdl 11057:doi 10991:PMC 10981:doi 10847:doi 10584:hdl 10574:doi 10525:doi 10285:PMC 10275:doi 10236:doi 10167:doi 9942:doi 9904:doi 9822:doi 9679:doi 9637:doi 9595:doi 9564:doi 9521:doi 9480:doi 9466:589 9428:PMC 9420:doi 9408:587 9329:doi 9147:". 9110:doi 9035:doi 8892:doi 8831:doi 8793:doi 8705:doi 8656:doi 8492:doi 8317:doi 8305:529 8118:doi 7869:doi 7601:doi 7530:doi 7476:doi 7429:doi 7348:doi 7260:doi 7062:doi 6986:doi 6931:doi 6892:doi 6856:". 6806:doi 6731:doi 6553:doi 6523:doi 6496:doi 6435:hdl 6427:doi 6362:hdl 6354:doi 6269:doi 6226:doi 6222:202 6154:doi 6142:268 6083:hdl 6075:doi 5973:doi 5926:doi 5912:127 5875:doi 5814:doi 5726:". 5672:doi 5597:doi 5528:doi 5476:doi 5429:doi 5382:doi 5241:doi 5229:323 5087:doi 5039:doi 4966:doi 4923:doi 4855:doi 4807:doi 4774:doi 4664:doi 4618:doi 4585:doi 4552:PMC 4542:doi 4444:doi 4336:doi 4299:doi 4221:doi 4168:PMC 4158:doi 4028:doi 3981:doi 3914:doi 3857:doi 3808:doi 3689:doi 3621:doi 3552:doi 3389:doi 3337:doi 3325:521 3276:doi 3147:on 3035:by 2942:". 2880:to 2700:IBD 2629:CFD 2471:or 2413:An 2378:RFM 2264:to 2159:in 2083:RNN 1920:of 1443:on 1424:. 1416:'s 1333:by 1232:Teh 1151:NSA 756:as 702:'s 674:or 597:of 457:or 219:Art 15042:: 14100:}} 14096:{{ 14081:. 14061:; 13993:. 13985:. 
13981:. 13959:. 13951:. 13939:22 13937:. 13933:. 13917:^ 13907:. 13881:. 13873:. 13869:. 13838:. 13821:^ 13799:. 13769:. 13712:. 13698:. 13627:. 13610:^ 13592:. 13588:. 13535:. 13531:. 13504:. 13500:. 13473:. 13469:. 13438:. 13416:. 13406:. 13371:. 13367:. 13336:. 13307:. 13299:. 13291:. 13283:. 13271:. 13253:; 13231:. 13223:. 13213:. 13201:. 13197:. 13170:. 13166:. 13143:. 13133:. 13125:. 13113:35 13111:. 13107:. 13084:. 13076:. 13066:. 13058:. 13046:. 13042:. 13019:. 13011:. 13003:. 12993:19 12991:. 12968:. 12960:. 12952:. 12942:14 12940:. 12917:. 12909:. 12901:. 12889:22 12887:. 12883:. 12860:. 12852:. 12842:. 12834:. 12824:. 12812:. 12808:. 12785:. 12777:. 12769:. 12757:. 12734:. 12726:. 12716:. 12708:. 12696:10 12694:. 12690:. 12667:. 12659:. 12647:. 12624:. 12614:. 12606:. 12596:. 12586:88 12584:. 12580:. 12544:. 12536:. 12528:. 12516:20 12514:. 12491:. 12479:. 12429:. 12421:. 12411:14 12409:. 12384:. 12380:. 12355:. 12332:. 12324:. 12314:. 12306:. 12292:. 12288:. 12264:. 12256:. 12248:. 12240:. 12228:67 12226:. 12222:. 12199:. 12191:. 12183:. 12169:. 12165:. 12142:. 12132:. 12122:. 12114:. 12100:. 12096:. 12073:. 12063:. 12055:. 12043:. 12039:. 12016:. 12008:. 11998:. 11986:. 11982:. 11959:. 11951:. 11943:. 11933:. 11921:. 11917:. 11890:. 11886:. 11873:^ 11859:. 11851:. 11843:. 11839:. 11816:. 11806:. 11798:. 11788:. 11776:. 11772:. 11747:. 11721:. 11717:. 11694:. 11682:. 11617:. 11595:. 11585:. 11554:. 11544:. 11536:. 11524:14 11522:. 11518:. 11495:. 11487:. 11477:. 11465:. 11461:. 11435:. 11427:. 11417:. 11405:. 11382:. 11374:. 11366:. 11358:. 11346:42 11344:. 11320:. 11312:. 11304:. 11292:PP 11290:. 11286:. 11271:^ 11231:. 11209:. 11199:. 11191:. 11181:24 11179:. 11175:. 11152:. 11144:. 11134:. 11120:. 11116:. 11089:. 11081:. 11071:. 11063:. 11026:. 11022:. 10999:. 10989:. 10977:21 10975:. 10969:. 10888:. 10884:. 10861:. 10853:. 10843:37 10841:. 10789:. 10763:. 10757:. 10746:^ 10682:. 10647:. 10643:. 10618:. 10614:. 10592:. 10582:. 10570:20 10568:. 10564:. 10541:. 10533:. 10521:12 10519:. 10515:. 10461:^ 10443:. 10437:. 10388:. 10384:. 10368:^ 10350:. 10346:. 10319:. 10315:. 10293:. 10283:. 10271:21 10269:. 10265:. 10242:. 10232:30 10230:. 10203:. 10199:. 10187:^ 10173:. 10163:23 10161:. 10130:. 10126:. 10099:. 10095:. 10061:. 10029:. 10025:. 9992:. 9977:^ 9936:. 9932:. 9918:^ 9898:. 9894:. 9880:^ 9828:. 9820:. 9808:32 9780:. 9776:. 9728:. 9724:. 9695:. 9687:. 9677:. 9673:. 9669:. 9643:. 9609:. 9601:. 9591:22 9589:. 9585:. 9527:. 9494:. 9486:. 9478:. 9464:. 9450:^ 9436:. 9426:. 9418:. 9406:. 9402:. 9376:. 9352:. 9327:. 9315:45 9313:. 9309:. 9290:. 9260:. 9234:. 9230:. 9199:. 9173:. 9169:. 9116:. 9108:. 9100:. 9088:75 9086:. 9059:. 9051:. 9041:. 9027:. 9001:. 8997:. 8970:. 8966:. 8938:. 8934:. 8908:. 8898:. 8890:. 8880:. 8847:. 8837:. 8799:. 8789:86 8787:. 8783:. 8756:. 8752:. 8741:^ 8727:. 8719:. 8711:. 8699:. 8670:. 8662:. 8650:. 8614:^ 8590:^ 8570:. 8562:. 8552:. 8548:. 8531:^ 8514:. 8506:. 8498:. 8488:12 8486:. 8482:. 8470:^ 8448:. 8418:. 8388:. 8363:, 8339:. 8331:. 8323:. 8315:. 8303:. 8281:}} 8277:{{ 8238:. 8206:. 8134:. 8124:. 8116:. 8080:. 8068:^ 8038:37 8036:. 8032:. 7955:. 7875:. 7867:. 7853:. 7826:. 7781:, 7689:. 7685:. 7617:. 7607:. 7536:. 7528:. 7524:. 7498:. 7490:. 7482:. 7474:. 7462:22 7460:. 7435:. 7423:. 7374:, 7360:^ 7346:. 7336:37 7334:. 7322:^ 7304:. 7300:. 7274:. 7266:. 7254:. 7228:. 7224:. 7198:. 7194:. 7182:^ 7160:. 7120:^ 7082:^ 7068:. 7060:. 7050:29 7048:. 7030:^ 6994:. 6984:. 6972:. 6968:. 6945:. 6937:. 6927:18 6925:. 6921:. 6898:. 6888:11 6886:. 6882:. 
6864:. 6828:. 6820:. 6812:. 6802:18 6800:. 6794:. 6761:. 6753:. 6745:. 6737:. 6727:11 6725:. 6721:. 6619:. 6604:^ 6586:. 6582:. 6549:31 6547:. 6535:^ 6519:31 6517:. 6490:. 6441:. 6433:. 6425:. 6415:26 6413:. 6384:. 6376:. 6368:. 6360:. 6350:37 6348:. 6344:. 6315:. 6301:. 6275:. 6265:07 6263:. 6240:. 6232:. 6220:. 6216:. 6168:. 6160:. 6152:. 6140:. 6132:; 6128:; 6099:. 6091:. 6081:. 6069:. 6061:; 6057:; 6053:; 6029:. 5995:}} 5991:{{ 5967:. 5963:. 5940:. 5932:. 5924:. 5910:. 5895:^ 5881:. 5869:. 5836:^ 5820:. 5790:, 5780:; 5734:. 5713:^ 5678:. 5666:. 5662:. 5647:^ 5637:. 5617:^ 5603:. 5593:14 5591:. 5587:. 5568:. 5564:. 5534:. 5526:. 5514:86 5512:. 5508:. 5482:. 5474:. 5464:21 5462:. 5458:. 5435:. 5427:. 5417:30 5415:. 5411:. 5388:. 5380:. 5370:29 5368:. 5364:. 5328:. 5247:. 5239:. 5227:. 5223:. 5164:. 5116:^ 5093:. 5083:16 5081:. 5035:30 5033:. 4980:. 4972:. 4962:36 4960:. 4937:. 4929:. 4883:EC 4881:. 4851:22 4849:. 4845:. 4813:. 4801:. 4786:^ 4768:. 4764:. 4692:^ 4678:. 4670:. 4660:65 4658:. 4654:. 4591:. 4560:. 4550:. 4540:. 4530:79 4528:. 4524:. 4478:^ 4468:. 4456:^ 4442:. 4432:39 4430:. 4411:. 4400:^ 4380:. 4342:. 4334:. 4322:43 4320:. 4293:. 4281:^ 4247:^ 4227:. 4215:. 4211:. 4190:^ 4176:. 4166:. 4156:. 4144:. 4140:. 4108:^ 4078:^ 4048:^ 4034:. 4022:. 4008:^ 3987:. 3979:. 3967:. 3961:. 3944:^ 3930:. 3922:. 3910:53 3904:. 3863:. 3806:. 3794:. 3790:. 3778:^ 3703:. 3695:. 3687:. 3675:61 3673:. 3649:^ 3635:. 3627:. 3619:. 3607:35 3605:. 3587:^ 3558:. 3550:. 3536:. 3532:. 3513:^ 3496:. 3466:. 3438:. 3434:. 3419:^ 3405:. 3395:. 3387:. 3365:^ 3351:. 3343:. 3335:. 3323:. 3319:. 3304:^ 3290:. 3282:. 3272:26 3270:. 3266:. 3167:. 2894:'s 2884:. 2802:Go 2710:, 2706:, 2702:, 2549:, 2545:, 2384:. 2347:. 2324:, 2268:. 2116:, 2112:, 2108:, 2104:, 2100:, 2096:, 1845:. 1628:, 1624:, 1582:. 1476:, 1447:. 1398:. 1365:. 1337:, 1261:. 1226:, 1219:. 1111:, 1107:, 1099:, 1095:, 1000:. 776:. 764:, 706:. 678:. 647:. 575:. 520:, 516:, 512:, 508:, 504:, 500:, 496:, 484:, 480:, 476:, 472:, 468:, 461:. 453:, 14135:e 14128:t 14121:v 14106:) 14053:. 14034:. 14005:. 13967:. 13945:: 13892:. 13877:: 13854:. 13815:. 13785:. 13755:. 13720:. 13716:: 13700:2 13683:. 13677:: 13662:. 13656:: 13641:. 13604:. 13573:. 13546:. 13516:. 13485:. 13454:. 13424:. 13402:: 13383:. 13352:. 13315:. 13287:: 13279:: 13239:. 13217:: 13209:: 13182:. 13151:. 13129:: 13119:: 13092:. 13054:: 13027:. 12999:: 12976:. 12948:: 12925:. 12895:: 12868:. 12828:: 12820:: 12814:7 12793:. 12765:: 12759:1 12742:. 12702:: 12675:. 12655:: 12649:8 12632:. 12600:: 12592:: 12552:. 12532:: 12499:. 12487:: 12481:9 12464:. 12437:. 12417:: 12394:. 12388:: 12365:. 12340:. 12318:: 12310:: 12300:: 12272:. 12244:: 12234:: 12207:. 12187:: 12177:: 12171:7 12150:. 12126:: 12118:: 12108:: 12081:. 12059:: 12051:: 12024:. 12002:: 11994:: 11967:. 11937:: 11929:: 11902:. 11867:. 11847:: 11824:. 11792:: 11784:: 11757:. 11732:. 11702:. 11690:: 11667:. 11633:. 11603:. 11581:: 11562:. 11530:: 11503:. 11481:: 11473:: 11467:8 11446:. 11413:: 11390:. 11370:: 11362:: 11352:: 11328:. 11308:: 11298:: 11265:. 11241:. 11217:. 11187:: 11160:. 11128:: 11122:4 11101:. 11067:: 11059:: 11038:. 11007:. 10983:: 10954:. 10920:. 10914:: 10899:. 10869:. 10849:: 10826:. 10820:: 10805:. 10775:. 10740:. 10734:: 10719:. 10693:. 10659:. 10629:. 10600:. 10586:: 10576:: 10549:. 10527:: 10491:. 10455:. 10422:. 10416:: 10400:. 10362:. 10331:. 10301:. 10277:: 10250:. 10238:: 10215:. 10181:. 10169:: 10142:. 10111:. 10080:. 10043:. 10007:. 9971:. 9965:: 9950:. 9944:: 9938:6 9912:. 9906:: 9900:6 9857:. 
9851:: 9836:. 9824:: 9791:. 9762:. 9756:: 9740:. 9709:. 9681:: 9651:. 9639:: 9620:. 9597:: 9570:. 9566:: 9544:. 9523:: 9502:. 9482:: 9472:: 9444:. 9422:: 9414:: 9387:. 9362:. 9337:. 9331:: 9321:: 9294:. 9276:. 9246:. 9215:. 9185:. 9124:. 9112:: 9104:: 9094:: 9071:. 9037:: 9012:. 8982:. 8952:. 8916:. 8894:: 8874:: 8855:. 8833:: 8807:. 8795:: 8768:. 8735:. 8707:: 8701:9 8684:. 8658:: 8632:. 8626:: 8608:. 8602:: 8584:. 8566:: 8556:: 8525:. 8494:: 8464:. 8433:. 8403:. 8347:. 8319:: 8311:: 8287:) 8273:. 8248:. 8224:. 8188:. 8182:: 8142:. 8120:: 8110:: 8091:. 8050:. 8044:: 8014:. 7988:. 7982:: 7967:. 7941:. 7906:. 7900:: 7885:. 7871:: 7861:: 7836:. 7830:: 7811:. 7805:: 7785:: 7767:. 7765:. 7759:: 7745:. 7743:. 7737:: 7722:. 7720:. 7714:: 7699:. 7693:: 7667:. 7661:: 7646:. 7640:: 7625:. 7603:: 7584:. 7550:. 7532:: 7506:. 7478:: 7468:: 7445:. 7431:: 7408:. 7402:: 7354:. 7350:: 7342:: 7316:. 7285:. 7262:: 7239:. 7209:. 7176:. 7146:. 7114:. 7076:. 7064:: 7056:: 7002:. 6988:: 6980:: 6974:4 6953:. 6933:: 6906:. 6894:: 6842:. 6808:: 6773:. 6733:: 6669:. 6637:. 6598:. 6559:. 6555:: 6529:. 6525:: 6502:. 6498:: 6492:7 6475:. 6449:. 6437:: 6429:: 6421:: 6398:. 6364:: 6356:: 6326:. 6283:. 6271:: 6248:. 6228:: 6201:. 6176:. 6156:: 6148:: 6107:. 6085:: 6077:: 6071:7 6039:. 6001:) 5987:. 5975:: 5969:9 5948:. 5928:: 5918:: 5889:. 5877:: 5871:2 5830:. 5816:: 5766:. 5706:. 5686:. 5674:: 5668:4 5641:. 5611:. 5599:: 5572:. 5570:8 5549:. 5530:: 5490:. 5478:: 5470:: 5443:. 5431:: 5423:: 5396:. 5384:: 5376:: 5332:. 5255:. 5243:: 5235:: 5208:. 5183:. 5143:. 5101:. 5089:: 5045:. 5041:: 5015:. 4988:. 4968:: 4945:. 4925:: 4906:. 4900:: 4863:. 4857:: 4827:. 4809:: 4780:. 4776:: 4770:6 4749:. 4722:. 4686:. 4666:: 4639:. 4624:. 4620:: 4601:. 4587:: 4568:. 4544:: 4536:: 4509:. 4503:: 4470:C 4450:. 4446:: 4438:: 4415:. 4394:. 4350:. 4338:: 4328:: 4305:. 4301:: 4295:5 4275:. 4241:. 4223:: 4217:7 4184:. 4160:: 4152:: 4146:8 4102:. 4072:. 4042:. 4030:: 4024:4 3983:: 3975:: 3969:2 3938:. 3916:: 3880:. 3859:: 3816:. 3810:: 3802:: 3796:4 3772:. 3738:. 3711:. 3691:: 3681:: 3643:. 3623:: 3613:: 3581:. 3554:: 3538:2 3507:. 3482:. 3452:. 3413:. 3391:: 3381:: 3359:. 3339:: 3331:: 3298:. 3278:: 3102:) 3096:( 3091:) 3087:( 3077:· 3070:· 3063:· 3056:· 3029:. 2952:. 2312:. 1759:1 1746:( 1728:2 1715:( 1558:( 1462:( 1134:/ 414:e 407:t 400:v 310:/ 34:. 20:)
