| problem string | solution string | n_hop int64 | dataset string | split string | __index_level_0__ int64 |
|---|---|---|---|---|---|
| The node content<br>null<br>1-hop neighbor's text information:Bits-back coding software guide: Abstract \| In this document, I first review the theory behind bits-back coding (aka. free energy coding) (Frey and Hinton 1996) and then describe the interface to C-language software that can be used for bits-back coding. This meth... | 6 | 1 | cora | train | 2 |
| The node content<br>null<br>1-hop neighbor's text information: Genetic Algorithms in Search, Optimization and Machine Learning. : Angeline, P., Saunders, G. and Pollack, J. (1993) An evolutionary algorithm that constructs recurrent neural networks, LAIR Technical Report #93-PA-GNARLY, Submitted to IEEE Transactions on Neural... | 3 | 2 | cora | train | 4 |
| The node content<br>null<br>1-hop neighbor's text information: "Measures for performance evaluation of genetic algorithms," : This paper proposes four performance measures of a genetic algorithm (GA) which enable us to compare different GAs for an op timization problem and different choices of their parameters' values. The p... | 3 | 2 | cora | train | 5 |
| The node content<br>null<br>1-hop neighbor's text information: Generalization in reinforcement learning: Successful examples using sparse coarse coding. : On large problems, reinforcement learning systems must use parameterized function approximators such as neural networks in order to generalize between similar situations a... | 5 | 1 | cora | train | 6 |
| The node content<br>null<br>1-hop neighbor's text information: Grounding robotic control with genetic neural net-works. : Technical Report AI94-223 May 1994 Abstract An important but often neglected problem in the field of Artificial Intelligence is that of grounding systems in their environment such that the representations... | 3 | 1 | cora | train | 7 |
| The node content<br>null<br>1-hop neighbor's text information: A model for projection and action. : In designing autonomous agents that deal competently with issues involving time and space, there is a tradeoff to be made between guaranteed response-time reactions on the one hand, and flexibility and expressiveness on the ot... | 6 | 2 | cora | train | 8 |
| The node content<br>null<br>1-hop neighbor's text information:Interpretable Neural Networks with BP-SOM: Back-propagation learning (BP) is known for its serious limitations in generalising knowledge from certain types of learning material. BP-SOM is an extension of BP which overcomes some of these limitations. BP-SOM is a co... | 1 | 1 | cora | train | 9 |
| The node content<br>null<br>1-hop neighbor's text information: Generalization in reinforcement learning: Successful examples using sparse coarse coding. : On large problems, reinforcement learning systems must use parameterized function approximators such as neural networks in order to generalize between similar situations a... | 5 | 1 | cora | train | 10 |
| The node content<br>null<br>1-hop neighbor's text information: Grounding robotic control with genetic neural net-works. : Technical Report AI94-223 May 1994 Abstract An important but often neglected problem in the field of Artificial Intelligence is that of grounding systems in their environment such that the representations... | 3 | 2 | cora | train | 11 |
| The node content<br>null<br>1-hop neighbor's text information: Toward efficient agnostic learning. : In this paper we initiate an investigation of generalizations of the Probably Approximately Correct (PAC) learning model that attempt to significantly weaken the target function assumptions. The ultimate goal in this directio... | 4 | 2 | cora | train | 12 |
| The node content<br>null<br>1-hop neighbor's text information: Maximizing the robustness of a linear threshold classifier with discrete weights. Network: : Quantization of the parameters of a Perceptron is a central problem in hardware implementation of neural networks using a numerical technology. An interesting property of... | 3 | 2 | cora | train | 14 |
| The node content<br>null<br>1-hop neighbor's text information: Learning to play the game of chess. : This paper presents NeuroChess, a program which learns to play chess from the final outcome of games. NeuroChess learns chess board evaluation functions, represented by artificial neural networks. It integrates inductive neur... | 5 | 2 | cora | train | 16 |
| The node content<br>null<br>1-hop neighbor's text information: Learning one-dimensional geometric patterns under one-sided random misclassification noise. :<br>1-hop neighbor's text information: "A General Lower Bound on the Number of Examples Needed for Learning," : We prove a lower bound of ( 1 * ln 1 ffi + VCdim(C) * ) on t... | 4 | 2 | cora | train | 17 |
| The node content<br>null<br>1-hop neighbor's text information: Genetic Algorithms in Search, Optimization and Machine Learning. : Angeline, P., Saunders, G. and Pollack, J. (1993) An evolutionary algorithm that constructs recurrent neural networks, LAIR Technical Report #93-PA-GNARLY, Submitted to IEEE Transactions on Neural... | 3 | 1 | cora | train | 18 |
| The node content<br>null<br>1-hop neighbor's text information: (1995) Linear space induction in first order logic with RELIEFF, : Current ILP algorithms typically use variants and extensions of the greedy search. This prevents them to detect significant relationships between the training objects. Instead of myopic impurity f... | 0 | 1 | cora | train | 19 |
| The node content<br>null<br>1-hop neighbor's text information: Learning to predict by the methods of temporal differences. : This article introduces a class of incremental learning procedures specialized for prediction\|that is, for using past experience with an incompletely known system to predict its future behavior. Wherea... | 5 | 2 | cora | train | 20 |
| The node content<br>null<br>1-hop neighbor's text information: Genetic Algorithms in Search, Optimization and Machine Learning. : Angeline, P., Saunders, G. and Pollack, J. (1993) An evolutionary algorithm that constructs recurrent neural networks, LAIR Technical Report #93-PA-GNARLY, Submitted to IEEE Transactions on Neural... | 5 | 2 | cora | train | 22 |
| The node content<br>null<br>1-hop neighbor's text information: Learning to play the game of chess. : This paper presents NeuroChess, a program which learns to play chess from the final outcome of games. NeuroChess learns chess board evaluation functions, represented by artificial neural networks. It integrates inductive neur... | 5 | 2 | cora | train | 23 |
| The node content<br>null<br>1-hop neighbor's text information: "Measures for performance evaluation of genetic algorithms," : This paper proposes four performance measures of a genetic algorithm (GA) which enable us to compare different GAs for an op timization problem and different choices of their parameters' values. The p... | 3 | 2 | cora | train | 24 |
| The node content<br>null<br>1-hop neighbor's text information: Learning to predict by the methods of temporal differences. : This article introduces a class of incremental learning procedures specialized for prediction\|that is, for using past experience with an incompletely known system to predict its future behavior. Wherea... | 5 | 1 | cora | train | 25 |
| The node content<br>null<br>1-hop neighbor's text information:Solving Combinatorial Optimization Tasks by Reinforcement Learning: A General Methodology Applied to Resource-Constrained Scheduling: This paper introduces a methodology for solving combinatorial optimization problems through the application of reinforcement learn... | 5 | 2 | cora | train | 26 |
| The node content<br>null<br>1-hop neighbor's text information: Why experimentation can be better than "perfect guidance". : Many problems correspond to the classical control task of determining the appropriate control action to take, given some (sequence of) observations. One standard approach to learning these control rules... | 5 | 1 | cora | train | 27 |
| The node content<br>null<br>1-hop neighbor's text information: Maximizing the robustness of a linear threshold classifier with discrete weights. Network: : Quantization of the parameters of a Perceptron is a central problem in hardware implementation of neural networks using a numerical technology. An interesting property of... | 3 | 1 | cora | train | 28 |
| The node content<br>null<br>1-hop neighbor's text information: The Structure-Mapping Engine: Algorithms and Examples. : This paper describes the Structure-Mapping Engine (SME), a program for studying analogical processing. SME has been built to explore Gentner's Structure-mapping theory of analogy, and provides a "tool kit" ... | 2 | 2 | cora | train | 29 |
| The node content<br>null<br>1-hop neighbor's text information: Using Markov chains to analyze GAFOs. : Our theoretical understanding of the properties of genetic algorithms (GAs) being used for function optimization (GAFOs) is not as strong as we would like. Traditional schema analysis provides some first order insights, but... | 3 | 2 | cora | train | 30 |
| The node content<br>null<br>1-hop neighbor's text information: Introduction to the Theory of Neural Computa 92 tion. : Neural computation, also called connectionism, parallel distributed processing, neural network modeling or brain-style computation, has grown rapidly in the last decade. Despite this explosion, and ultimatel... | 1 | 2 | cora | train | 31 |
| The node content<br>null<br>1-hop neighbor's text information: Genetic Algorithms in Search, Optimization and Machine Learning. : Angeline, P., Saunders, G. and Pollack, J. (1993) An evolutionary algorithm that constructs recurrent neural networks, LAIR Technical Report #93-PA-GNARLY, Submitted to IEEE Transactions on Neural... | 3 | 2 | cora | train | 32 |
| The node content<br>null<br>1-hop neighbor's text information: Genetic Algorithms in Search, Optimization and Machine Learning. : Angeline, P., Saunders, G. and Pollack, J. (1993) An evolutionary algorithm that constructs recurrent neural networks, LAIR Technical Report #93-PA-GNARLY, Submitted to IEEE Transactions on Neural... | 3 | 2 | cora | train | 34 |
| The node content<br>null<br>1-hop neighbor's text information: Using Markov chains to analyze GAFOs. : Our theoretical understanding of the properties of genetic algorithms (GAs) being used for function optimization (GAFOs) is not as strong as we would like. Traditional schema analysis provides some first order insights, but... | 3 | 1 | cora | train | 35 |
| The node content<br>null<br>1-hop neighbor's text information: Grounding robotic control with genetic neural net-works. : Technical Report AI94-223 May 1994 Abstract An important but often neglected problem in the field of Artificial Intelligence is that of grounding systems in their environment such that the representations... | 3 | 2 | cora | train | 36 |
| The node content<br>null<br>1-hop neighbor's text information: Using Markov chains to analyze GAFOs. : Our theoretical understanding of the properties of genetic algorithms (GAs) being used for function optimization (GAFOs) is not as strong as we would like. Traditional schema analysis provides some first order insights, but... | 3 | 2 | cora | train | 38 |
| The node content<br>null<br>1-hop neighbor's text information: The Use of Explicit Goals for Knowledge to Guide Inference and Learning. : Combinatorial explosion of inferences has always been a central problem in artificial intelligence. Although the inferences that can be drawn from a reasoner's knowledge and from available... | 2 | 2 | cora | train | 40 |
| The node content<br>null<br>1-hop neighbor's text information: Learning to play the game of chess. : This paper presents NeuroChess, a program which learns to play chess from the final outcome of games. NeuroChess learns chess board evaluation functions, represented by artificial neural networks. It integrates inductive neur... | 5 | 2 | cora | train | 41 |
| The node content<br>null<br>1-hop neighbor's text information: Maximizing the robustness of a linear threshold classifier with discrete weights. Network: : Quantization of the parameters of a Perceptron is a central problem in hardware implementation of neural networks using a numerical technology. An interesting property of... | 3 | 1 | cora | train | 42 |
| The node content<br>null<br>1-hop neighbor's text information: A Comparison of Full and Partial Predicated Execution Support for ILP Processors. : One can effectively utilize predicated execution to improve branch handling in instruction-level parallel processors. Although the potential benefits of predicated execution are h... | 0 | 1 | cora | train | 44 |
| The node content<br>null<br>1-hop neighbor's text information: (1995) Linear space induction in first order logic with RELIEFF, : Current ILP algorithms typically use variants and extensions of the greedy search. This prevents them to detect significant relationships between the training objects. Instead of myopic impurity f... | 0 | 2 | cora | train | 45 |
| The node content<br>null<br>1-hop neighbor's text information: (1992) Generic Teleological Mechanisms and their Use in Case Adaptation, : In experience-based (or case-based) reasoning, new problems are solved by retrieving and adapting the solutions to similar problems encountered in the past. An important issue in experienc... | 2 | 2 | cora | train | 46 |
| The node content<br>null<br>1-hop neighbor's text information: The Estimation of Probabilities in Attribute Selection Measures for Decision Structure Induction in Proceeding of the European Summer School on Machine Learning, : In this paper we analyze two well-known measures for attribute selection in decision tree induction... | 0 | 2 | cora | train | 47 |
| The node content<br>null<br>1-hop neighbor's text information: Generalization in reinforcement learning: Successful examples using sparse coarse coding. : On large problems, reinforcement learning systems must use parameterized function approximators such as neural networks in order to generalize between similar situations a... | 5 | 1 | cora | train | 48 |
| The node content<br>null<br>1-hop neighbor's text information: A case study in dynamic belief networks: monitoring walking, fall prediction and detection. :<br>1-hop neighbor's text information: A theory of inferred causation. : This paper concerns the empirical basis of causation, and addresses the following issues: We propos... | 6 | 2 | cora | train | 49 |
| The node content<br>null<br>1-hop neighbor's text information: Neuro-dynamic Programming. :<br>1-hop neighbor's text information: Dynamic Programming and Markov Processes. : The problem of maximizing the expected total discounted reward in a completely observable Markovian environment, i.e., a Markov decision process (mdp), mo... | 6 | 2 | cora | train | 50 |
| The node content<br>null<br>1-hop neighbor's text information: A practical Bayesian framework for backpropagation networks. : A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks. The framework makes possible: (1) objective comparisons between solutions using alternati... | 1 | 2 | cora | train | 51 |
| The node content<br>null<br>1-hop neighbor's text information:Fast Online Q(): Q()-learning uses TD()-methods to accelerate Q-learning. The update complexity of previous online Q() implementations based on lookup-tables is bounded by the size of the state/action space. Our faster algorithm's update complexity is bounded by t... | 5 | 1 | cora | train | 52 |
| The node content<br>null<br>1-hop neighbor's text information: Neuronlike adaptive elements that can solve difficult learning control problems. : Miller, G. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. The Psychological Review, 63(2):81-97. Schmidhuber, J. (1990... | 3 | 2 | cora | train | 53 |
| The node content<br>null<br>1-hop neighbor's text information: Using Markov chains to analyze GAFOs. : Our theoretical understanding of the properties of genetic algorithms (GAs) being used for function optimization (GAFOs) is not as strong as we would like. Traditional schema analysis provides some first order insights, but... | 3 | 1 | cora | train | 54 |
| The node content<br>null<br>1-hop neighbor's text information:Robust Value Function Approximation by Working Backwards Computing an accurate value function is the key: In this paper, we examine the intuition that TD() is meant to operate by approximating asynchronous value iteration. We note that on the important class of di... | 5 | 2 | cora | train | 55 |
| The node content<br>null<br>1-hop neighbor's text information: Adapting the evaluation space to improve global learning. :<br>1-hop neighbor's text information: Adaptation in constant utility nonstationary environments. : Environments that vary over time present a fundamental problem to adaptive systems. Although in the worst ... | 3 | 1 | cora | train | 56 |
| The node content<br>null<br>1-hop neighbor's text information: Graphical Models in Applied Multivariate Statistics. :<br>1-hop neighbor's text information: Using path diagrams as a structural equation modeling tool. :<br>1-hop neighbor's text information: A theory of inferred causation. : This paper concerns the empirical basis ... | 6 | 2 | cora | train | 59 |
| The node content<br>null<br>1-hop neighbor's text information: Introduction to the Theory of Neural Computa 92 tion. : Neural computation, also called connectionism, parallel distributed processing, neural network modeling or brain-style computation, has grown rapidly in the last decade. Despite this explosion, and ultimatel... | 3 | 1 | cora | train | 61 |
| The node content<br>null<br>1-hop neighbor's text information:Computational complexity reduction for BN2O networks using similarity of states: Although probabilistic inference in a general Bayesian belief network is an NP-hard problem, inference computation time can be reduced in most practical cases by exploiting domain kno... | 6 | 1 | cora | train | 63 |
| The node content<br>null<br>1-hop neighbor's text information: Learning in the presence of malicious errors, : In this paper we study an extension of the distribution-free model of learning introduced by Valiant [23] (also known as the probably approximately correct or PAC model) that allows the presence of malicious errors ... | 4 | 2 | cora | train | 64 |
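Every preview row follows the same six-column schema (`problem`, `solution`, `n_hop`, `dataset`, `split`, `__index_level_0__`). A minimal sketch of grouping such rows by `n_hop` in plain Python, using the scalar fields of the first three preview rows above (the long `problem` strings are omitted for brevity):

```python
# Three rows copied from the preview, keeping only the scalar columns.
# "solution" is stored as a string; "n_hop" and "__index_level_0__" are int64.
rows = [
    {"solution": "6", "n_hop": 1, "dataset": "cora", "split": "train", "__index_level_0__": 2},
    {"solution": "3", "n_hop": 2, "dataset": "cora", "split": "train", "__index_level_0__": 4},
    {"solution": "3", "n_hop": 2, "dataset": "cora", "split": "train", "__index_level_0__": 5},
]

# Group example indices by hop count, e.g. to compare 1-hop vs 2-hop subsets.
by_hop = {}
for row in rows:
    by_hop.setdefault(row["n_hop"], []).append(row["__index_level_0__"])

print(by_hop)  # {1: [2], 2: [4, 5]}
```

The same grouping applies unchanged to the full split once it is loaded; only the `rows` list here is a hand-copied excerpt.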