index | text | year | No |
|---|---|---|---|
100 | APPLICATIONS OF ERROR BACK-PROPAGATION TO PHONETIC CLASSIFICATION Hong C. Leung & Victor W. Zue Spoken Language Systems Group Laboratory for Computer Science Massachusetts Institute of Technology Cambridge, MA 02139 ABSTRACT This paper is concerned with the use of error back-propagatio... | 1988 | 19 |
101 | MODELING THE OLFACTORY BULB - COUPLED NONLINEAR OSCILLATORS Zhaoping Li, J. J. Hopfield, Division of Physics, Mathematics and Astronomy; Division of Biology; and Division of Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, CA 91125, USA; AT&T Bell Laborato... | 1988 | 2 |
102 | AN ANALOG SELF-ORGANIZING NEURAL NETWORK CHIP James R. Mann, MIT Lincoln Laboratory, 244 Wood Street, Lexington, MA 02173-0073; Sheldon Gilbert, 4421 West Estes, Lincolnwood, IL 60646 ABSTRACT A design for a fully analog version of a self-organizing feature map neural network has been compl... | 1988 | 20 |
103 | A BIFURCATION THEORY APPROACH TO THE PROGRAMMING OF PERIODIC ATTRACTORS IN NETWORK MODELS OF OLFACTORY CORTEX Bill Baird Department of Biophysics U.C. Berkeley ABSTRACT A new learning algorithm for the storage of static and periodic attractors in biologically inspired recurrent analog neural... | 1988 | 21 |
104 | HETEROGENEOUS NEURAL NETWORKS FOR ADAPTIVE BEHAVIOR IN DYNAMIC ENVIRONMENTS Hillel J. Chiel Biology Dept. & CAISR CWRU Randall D. Beer Dept. of Computer Engineering and Science and Center for Automation and Intelligent Systems Research Case Western Reserve University Cleveland, OH 44106 ... | 1988 | 22 |
105 | PROGRAMMABLE ANALOG PULSE-FIRING NEURAL NETWORKS Alan F. Murray and Alister Hamilton, Dept. of Elec. Eng., University of Edinburgh, Mayfield Road, Edinburgh, EH9 3JL, United Kingdom. ABSTRACT... | 1988 | 23 |
106 | AN ANALOG VLSI CHIP FOR THIN-PLATE SURFACE INTERPOLATION John G. Harris California Institute of Technology Computation and Neural Systems Option, 216-76 Pasadena, CA 91125 ABSTRACT Reconstructing a surface from sparse sensory data is a well-known problem in computer vision. This paper describe... | 1988 | 24 |
107 | SIMULATION AND MEASUREMENT OF THE ELECTRIC FIELDS GENERATED BY WEAKLY ELECTRIC FISH Brian Rasnow, Christopher Assad, Mark E. Nelson and James M. Bower, Divisions of Physics, Electrical Engineering, and Biology, Caltech, Pasadena, 91125 ABSTRACT The weakly electric fish, Gnathonemus petersii, ... | 1988 | 25 |
108 | TRAINING MULTILAYER PERCEPTRONS WITH THE EXTENDED KALMAN ALGORITHM Sharad Singhal and Lance Wu Bell Communications Research, Inc. Morristown, NJ 07960 ABSTRACT A large fraction of recent work in artificial neural nets uses multilayer perceptrons trained with the back-propagation algorithm descri... | 1988 | 26 |
109 | AN ELECTRONIC PHOTORECEPTOR SENSITIVE TO SMALL CHANGES IN INTENSITY T. Delbrück and C. A. Mead 256-80 Computer Science California Institute of Technology Pasadena, CA 91125 ABSTRACT We describe an electronic photoreceptor circuit that is sensitive to small changes in incident light inten... | 1988 | 27 |
110 | TRAINING A 3-NODE NEURAL NETWORK IS NP-COMPLETE Avrim Blum MIT Lab. for Computer Science Cambridge, Mass. 02139 USA Ronald L. Rivest MIT Lab. for Computer Science Cambridge, Mass. 02139 USA ABSTRACT We consider a 2-layer, 3-node, n-input neural network whose nodes compute linear... | 1988 | 28 |
111 | AUTOMATIC LOCAL ANNEALING Jared Leinbach Department of Psychology Carnegie-Mellon University Pittsburgh, PA 15213 ABSTRACT This research involves a method for finding global maxima in constraint satisfaction networks. It is an annealing process but, unlike most others, requires no anne... | 1988 | 29 |
112 | EFFICIENT PARALLEL LEARNING ALGORITHMS FOR NEURAL NETWORKS Alan H. Kramer and A. Sangiovanni-Vincentelli Department of EECS U.C. Berkeley Berkeley, CA 94720 ABSTRACT Parallelizable optimization techniques are applied to the problem of learning in feedforward neural networks. In addition t... | 1988 | 3 |
113 | ADAPTIVE NEURAL NET PREPROCESSING FOR SIGNAL DETECTION IN NON-GAUSSIAN NOISE Richard P. Lippmann and Paul Beckman MIT Lincoln Laboratory Lexington, MA 02173 ABSTRACT A nonlinearity is required before matched filtering in minimum error receivers when additive noise is present which is imp... | 1988 | 30 |
114 | LEARNING THE SOLUTION TO THE APERTURE PROBLEM FOR PATTERN MOTION WITH A HEBB RULE Martin I. Sereno Cognitive Science C-015 University of California, San Diego La Jolla, CA 92093-0115 ABSTRACT The primate visual system learns to recognize the true direction of pattern motion using local... | 1988 | 31 |
115 | GENESIS: A SYSTEM FOR SIMULATING NEURAL NETWORKS Matthew A. Wilson, Upinder S. Bhalla, John D. Uhley, James M. Bower. Division of Biology California Institute of Technology Pasadena, CA 91125 ABSTRACT We have developed a graphically oriented, general purpose simulation system to facilitate the... | 1988 | 32 |
116 | NEURAL ARCHITECTURE Valentino Braitenberg Max Planck Institute Federal Republic of Germany While we are waiting for the ultimate biophysics of cell membranes and synapses to be completed, we may speculate on the shapes of neurons and on the patterns of their connections. Much of this will be si... | 1988 | 33 |
117 | LEARNING BY CHOICE OF INTERNAL REPRESENTATIONS Tal Grossman, Ronny Meir and Eytan Domany Department of Electronics, Weizmann Institute of Science Rehovot 76100 Israel ABSTRACT We introduce a learning algorithm for multilayer neural networks composed of binary linear threshold elements. Whereas existin... | 1988 | 34 |
118 | Adaptive Neural Networks Using MOS Charge Storage D. B. Schwartz, R. E. Howard and W. E. Hubbard AT&T Bell Laboratories Crawfords Corner Rd. Holmdel, N.J. 07733 Abstract MOS charge storage has been demonstrated as an effective method to store the weights in VLSI implementations of neural network ... | 1988 | 35 |
119 | IMPLICATIONS OF RECURSIVE DISTRIBUTED REPRESENTATIONS Jordan B. Pollack Laboratory for AI Research Ohio State University Columbus, OH 43210 ABSTRACT I will describe my recent results on the automatic development of fixed-width recursive distributed representations of variable-sized hierarchical data... | 1988 | 36 |
120 | BACKPROPAGATION AND ITS APPLICATION TO HANDWRITTEN SIGNATURE VERIFICATION Dorothy A. Mighell Electrical Eng. Dept. Info. Systems Lab Stanford University Stanford, CA 94305 Timothy S. Wilkinson Electrical Eng. Dept. Info. Systems Lab Stanford University Stanford, CA 94305 AB... | 1988 | 37 |
121 | THEORY OF SELF-ORGANIZATION OF CORTICAL MAPS Shigeru Tanaka Fundamental Research Laboratories, NEC Corporation 1-1 Miyazaki 4-Chome, Miyamae-ku, Kawasaki, Kanagawa 213, Japan ABSTRACT We have mathematically shown that cortical maps in the primary sensory cortices can be reproduced by using three ... | 1988 | 38 |
122 | AN ADAPTIVE NETWORK THAT LEARNS SEQUENCES OF TRANSITIONS C. L. Winter Science Applications International Corporation 5151 East Broadway, Suite 900 Tucson, Arizona 85711 ABSTRACT We describe an adaptive network, TIN2, that learns the transition function of a sequential system from observations of... | 1988 | 39 |
123 | THE BOLTZMANN PERCEPTRON NETWORK: A MULTI-LAYERED FEED-FORWARD NETWORK EQUIVALENT TO THE BOLTZMANN MACHINE Eyal Yair and Allen Gersho Center for Information Processing Research Department of Electrical & Computer Engineering University of California, Santa Barbara, CA 93106 ABSTRACT The c... | 1988 | 4 |
124 | DYNAMICS OF ANALOG NEURAL NETWORKS WITH TIME DELAY C.M. Marcus and R.M. Westervelt Division of Applied Sciences and Department of Physics Harvard University, Cambridge, Massachusetts 02138 ABSTRACT A time delay in the response of the neurons in a network can induce sustained oscillation and ch... | 1988 | 40 |
125 | ON THE K-WINNERS-TAKE-ALL NETWORK E. Majani Jet Propulsion Laboratory California Institute of Technology R. Erlanson, Y. Abu-Mostafa Department of Electrical Engineering California Institute of Technology ABSTRACT We present and rigorously analyze a generalization of the Winner-Take-All Ne... | 1988 | 41 |
126 | DOES THE NEURON "LEARN" LIKE THE SYNAPSE? RAOUL TAWEL Jet Propulsion Laboratory California Institute of Technology Pasadena, CA 91109 Abstract. An improved learning paradigm that offers a significant reduction in computation time during the supervised learning phase is described. It is based on extend... | 1988 | 42 |
127 | USE OF MULTI-LAYERED NETWORKS FOR CODING SPEECH WITH PHONETIC FEATURES Yoshua Bengio, Regis Cardin and Renato De Mori Computer Science Dept. McGill University Montreal, Canada H3A2A7 ABSTRACT Piero Cosi Centro di Studio per le Ricerche di Fonetica, C.N.R., Via Oberdan, 10, 3512... | 1988 | 43 |
128 | ELECTRONIC RECEPTORS FOR TACTILE/HAPTIC SENSING Andreas G. Andreou Electrical and Computer Engineering The Johns Hopkins University Baltimore, MD 21218 ABSTRACT We discuss synthetic receptors for haptic sensing. These are based on magnetic field sensors (Hall effect structures) fabricated... | 1988 | 44 |
129 | A PROGRAMMABLE ANALOG NEURAL COMPUTER AND SIMULATOR Paul Mueller, Jan Van der Spiegel, David Blackman, Timothy Chiu, Thomas Clare, Joseph Dao, Christopher Donham, Tzu-pu Hsieh, Marc Loinaz, Dept. of Biochem. Biophys., Dept. of Electrical Engineering, University of Pennsylvania, Philadelphia, PA ... | 1988 | 45 |
130 | AN INFORMATION THEORETIC APPROACH TO RULE-BASED CONNECTIONIST EXPERT SYSTEMS Rodney M. Goodman, John W. Miller Department of Electrical Engineering Caltech 116-81 Pasadena, CA 91125 Padhraic Smyth Communication Systems Research Jet Propulsion Laboratories 238-420 4800 Oak Grove Drive ... | 1988 | 46 |
131 | A CONNECTIONIST EXPERT SYSTEM THAT ACTUALLY WORKS Gary Bradshaw Psychology Richard Fozzard Computer Science Louis Ceci Computer Science University of Colorado Boulder, CO 80302 fozzard@boulder.colorado.edu ABSTRACT The Space Environment Laboratory in Boulder has collaborated ... | 1988 | 47 |
132 | CRICKET WIND DETECTION John P. Miller Neurobiology Group, University of California, Berkeley, California 94720, U.S.A. A great deal of interest has recently been focused on theories concerning parallel distributed processing in central nervous systems. In particular, many researchers have becom... | 1988 | 48 |
133 | ALVINN: AN AUTONOMOUS LAND VEHICLE IN A NEURAL NETWORK Dean A. Pomerleau Computer Science Department Carnegie Mellon University Pittsburgh, PA 15213 ABSTRACT ALVINN (Autonomous Land Vehicle In a Neural Network) is a 3-layer back-propagation network designed for the task of road following. Cur... | 1988 | 49 |
134 | NEURAL NETWORKS THAT LEARN TO DISCRIMINATE SIMILAR KANJI CHARACTERS Yoshihiro Mori Kazuhiko Yokosawa ATR Auditory and Visual Perception Research Laboratories 2-1-61 Shiromi Higashiku Osaka 540 Japan ABSTRACT A neural network is applied to the problem of recognizing Kanji characters.... | 1988 | 5 |
135 | "FAST LEARNING IN MULTI-RESOLUTION HIERARCHIES" John Moody Yale Computer Science, P.O. Box 2158, New Haven, CT 06520 Abstract A class of fast, supervised learning algorithms is presented. They use local representations, hashing, and multiple scales of resolution to approximate functions which are pie... | 1988 | 50 |
136 | Mapping Classifier Systems Into Neural Networks Lawrence Davis BBN Laboratories BBN Systems and Technologies Corporation 10 Moulton Street Cambridge, MA 02238 January 16, 1989 Abstract Classifier systems are machine learning systems incorporating a genetic algorithm as the learning mechanism.... | 1988 | 51 |
137 | A NETWORK FOR IMAGE SEGMENTATION USING COLOR Anya Hurlbert and Tomaso Poggio Center for Biological Information Processing at Whitaker College Department of Brain and Cognitive Science and the MIT AI Laboratory Cambridge, MA 02139 (hurlbert@wheaties.ai.mit.edu) ABSTRACT We propose a parallel ... | 1988 | 52 |
138 | TRAINING A LIMITED-INTERCONNECT, SYNTHETIC NEURAL IC M.R. Walker, S. Haghighi, A. Afghan, and L.A. Akers Center for Solid State Electronics Research Arizona State University Tempe, AZ 85287-6206 mwalker@enuxha.eas.asu.edu ABSTRACT Hardware implementation of neuromorphic algorithms is hampered... | 1988 | 53 |
139 | A MODEL FOR RESOLUTION ENHANCEMENT (HYPERACUITY) IN SENSORY REPRESENTATION Jun Zhang and John P. Miller Neurobiology Group, University of California, Berkeley, California 94720, U.S.A. ABSTRACT Heiligenberg (1987) recently proposed a model to explain how sensory maps could enhance resolution th... | 1988 | 54 |
140 | TEMPORAL REPRESENTATIONS IN A CONNECTIONIST SPEECH SYSTEM Erich J. Smythe 207 Greenmanville Ave, #6 Mystic, CT 06355 ABSTRACT SYREN is a connectionist model that uses temporal information in a speech signal for syllable recognition. It classifies the rates and directions of formant center... | 1988 | 55 |
141 | NEURAL NETWORK STAR PATTERN RECOGNITION FOR SPACECRAFT ATTITUDE DETERMINATION AND CONTROL Phillip Alvelda, A. Miguel San Martin The Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109 ABSTRACT Currently, the most complex spacecraft attitude determination ... | 1988 | 56 |
142 | A MASSIVELY PARALLEL SELF-TUNING CONTEXT-FREE PARSER Eugene Santos Jr. Department of Computer Science Brown University Box 1910, Providence, RI 02912 esj@cs.brown.edu ABSTRACT The Parsing and Learning System (PALS) is a massively parallel self-tuning context-free parser. It is capable of p... | 1988 | 57 |
143 | NEURAL NET RECEIVERS IN MULTIPLE-ACCESS COMMUNICATIONS Bernd-Peter Paris, Geoffrey Orsak, Mahesh Varanasi, Behnaam Aazhang Department of Electrical and Computer Engineering Rice University Houston, TX 77251-1892 ABSTRACT The application of neural networks to the demodulation of spread-spe... | 1988 | 58 |
144 | A COMPUTATIONALLY ROBUST ANATOMICAL MODEL FOR RETINAL DIRECTIONAL SELECTIVITY Norberto M. Grzywacz, Center Biol. Inf. Processing, MIT, E25-201, Cambridge, MA 02139; Franklin R. Amthor, Dept. Psychol., Univ. Alabama Birmingham, Birmingham, AL 35294 ABSTRACT We analyze a mathematical mod... | 1988 | 59 |
145 | COMPUTER MODELING OF ASSOCIATIVE LEARNING DANIEL L. ALKON, FRANCIS QUEK, THOMAS P. VOGL 1. Laboratory for Cellular and Molecular Neurobiology, NINCDS, NIH, Bethesda, MD 20892 2. Environmental Research Institute of Michigan a) P.O. Box 8G18, Ann Arbor, MI 48107 b) 1501 Wilson Blvd., Suite ... | 1988 | 6 |
146 | STATISTICAL PREDICTION WITH KANERVA'S SPARSE DISTRIBUTED MEMORY David Rogers Research Institute for Advanced Computer Science MS 230-5, NASA Ames Research Center Moffett Field, CA 94035 ABSTRACT A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is pre... | 1988 | 60 |
147 | LEARNING SEQUENTIAL STRUCTURE IN SIMPLE RECURRENT NETWORKS David Servan-Schreiber, Axel Cleeremans, and James L. McClelland Departments of Computer Science and Psychology Carnegie Mellon University Pittsburgh, PA 15213 ABSTRACT We explore a network architecture introduced by Elman (1988) for pre... | 1988 | 61 |
148 | A BACK-PROPAGATION ALGORITHM WITH OPTIMAL USE OF HIDDEN UNITS Yves Chauvin Thomson-CSF, Inc (and Psychology Department, Stanford University) 630, Hansen Way (Suite 250) Palo Alto, CA 94306 ABSTRACT This paper presents a variation of the back-propagation algorithm that makes optimal use of a netw... | 1988 | 62 |
149 | GEMINI: GRADIENT ESTIMATION THROUGH MATRIX INVERSION AFTER NOISE INJECTION Yann Le Cun, Conrad C. Galland and Geoffrey E. Hinton Department of Computer Science University of Toronto 10 King's College Rd Toronto, Ontario M5S 1A4 Canada ABSTRACT Learning procedures that measure how random p... | 1988 | 63 |
150 | A SELF-LEARNING NEURAL NETWORK A. Hartstein and R. H. Koch IBM - Thomas J. Watson Research Center Yorktown Heights, New York ABSTRACT We propose a new neural network structure that is compatible with silicon technology and has built-in learning capability. The thrust of this network work is a new s... | 1988 | 64 |
151 | NEURAL NETWORKS FOR MODEL MATCHING AND PERCEPTUAL ORGANIZATION Gene Gindi, EE Department, Yale University, New Haven, CT 06520; Eric Mjolsness, CS Department, Yale University, New Haven, CT 06520; P. Anandan, CS Department, Yale University, New Haven, CT 06520 ABSTRACT We int... | 1988 | 65 |
152 | NEURONAL MAPS FOR SENSORY-MOTOR CONTROL IN THE BARN OWL C.D. Spence, J.C. Pearson, J.J. Gelfand, and R.M. Peterson David Sarnoff Research Center Subsidiary of SRI International CN5300 Princeton, New Jersey 08543-5300 W.E. Sullivan Department of Biology Princeton University Princeton... | 1988 | 66 |
153 | PERFORMANCE OF SYNTHETIC NEURAL NETWORK CLASSIFICATION OF NOISY RADAR SIGNALS S. C. Ahalt F. D. Garber I. Jouny A. K. Krishnamurthy Department of Electrical Engineering The Ohio State University, Columbus, Ohio 43210 ABSTRACT This study evaluates the performance of the multilayer-perceptro... | 1988 | 67 |
154 | Connectionist Learning of Expert Preferences by Comparison Training Gerald Tesauro IBM Thomas J. Watson Research Center PO Box 704, Yorktown Heights, NY 10598 USA Abstract A new training paradigm, called the "comparison paradigm," is introduced for tasks in which a network must learn to choos... | 1988 | 68 |
155 | WINNER-TAKE-ALL NETWORKS OF O(N) COMPLEXITY J. Lazzaro, S. Ryckebusch, M.A. Mahowald, and C. A. Mead California Institute of Technology Pasadena, CA 91125 ABSTRACT We have designed, fabricated, and tested a series of compact CMOS integrated circuits that realize the winner-take-all function. These ... | 1988 | 69 |
156 | LINKS BETWEEN MARKOV MODELS AND MULTILAYER PERCEPTRONS H. Bourlard & C.J. Wellekens, Philips Research Laboratory, Brussels, B-1170 Belgium; Int. Comp. Science Institute, Berkeley, CA 94704 USA. ABSTRACT Hidden Markov models are widely used for automatic speech recognition. They inh... | 1988 | 7 |
157 | NEURAL NETWORK RECOGNIZER FOR HAND-WRITTEN ZIP CODE DIGITS J. S. Denker, W. R. Gardner, H. P. Graf, D. Henderson, R. E. Howard, W. Hubbard, L. D. Jackel, H. S. Baird, and I. Guyon AT&T Bell Laboratories Holmdel, New Jersey 07733 ABSTRACT This paper describes the construction of a system that recog... | 1988 | 70 |
158 | ANALOG IMPLEMENTATION OF SHUNTING NEURAL NETWORKS Bahram Nabet, Robert B. Darling, and Robert B. Pinter Department of Electrical Engineering, FT-10 University of Washington Seattle, WA 98195 ABSTRACT An extremely compact, all analog and fully parallel implementation of a class of shunting recurrent... | 1988 | 71 |
159 | Range Image Restoration using Mean Field Annealing Griff L. Bilbro Wesley E. Snyder Center for Communications and Signal Processing North Carolina State University Raleigh, NC Abstract A new optimization strategy, Mean Field Annealing, is presented. Its application to MAP restoration o... | 1988 | 72 |
160 | ANALYZING THE ENERGY LANDSCAPES OF DISTRIBUTED WINNER-TAKE-ALL NETWORKS David S. Touretzky School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 ABSTRACT DCPS (the Distributed Connectionist Production System) is a neural network with complex dynamical properties. ... | 1988 | 73 |
161 | NEURAL APPROACH FOR TV IMAGE COMPRESSION USING A HOPFIELD TYPE NETWORK Martine NAILLON Jean-Bernard THEETEN Laboratoire d'Electronique et de Physique Appliquée 3 Avenue DESCARTES, BP 15 94451 LIMEIL BREVANNES Cedex FRANCE. ABSTRACT A self-organizing Hopfield network has been de... | 1988 | 74 |
162 | SPEECH RECOGNITION: STATISTICAL AND NEURAL INFORMATION PROCESSING APPROACHES John S. Bridle Speech Research Unit and National Electronics Research Initiative in Pattern Recognition Royal Signals and Radar Establishment Malvern UK Automatic Speech Recognition (ASR) is an artificial percept... | 1988 | 75 |
163 | CONVERGENCE AND PATTERN STABILIZATION IN THE BOLTZMANN MACHINE Moshe Kam, Dept. of Electrical and Computer Eng., Drexel University, Philadelphia PA 19104; Roger Cheng, Dept. of Electrical Eng., Princeton University, NJ 08544 ABSTRACT The Boltzmann Machine has been introduced as a means to perform ... | 1988 | 76 |
164 | LINEAR LEARNING: LANDSCAPES AND ALGORITHMS Pierre Baldi Jet Propulsion Laboratory California Institute of Technology Pasadena, CA 91109 What follows extends some of our results of [1] on learning from examples in layered feed-forward networks of linear units. In particular we examine what happens when th... | 1988 | 77 |
165 | DIGITAL REALISATION OF SELF-ORGANISING MAPS Nigel M. Allinson, Martin J. Johnson, Kevin J. Moon, Department of Electronics, University of York, York YO1 5DD England ABSTRACT A digital realisation of two-dimensional self-organising feature maps is presented. The method is based on su... | 1988 | 78 |
166 | Further Explorations in Visually-Guided Reaching: Making MURPHY Smarter Bartlett W. Mel Center for Complex Systems Research Beckman Institute, University of Illinois 405 North Mathews Street Urbana, IL 61801 ABSTRACT MURPHY is a vision-based kinematic controller and path planner based ... | 1988 | 79 |
167 | SKELETONIZATION: A TECHNIQUE FOR TRIMMING THE FAT FROM A NETWORK VIA RELEVANCE ASSESSMENT Michael C. Mozer Paul Smolensky Department of Computer Science & Institute of Cognitive Science University of Colorado Boulder, CO 80309-0430 ABSTRACT This paper proposes a means of using the knowledg... | 1988 | 8 |
168 | SCALING AND GENERALIZATION IN NEURAL NETWORKS: A CASE STUDY Subutai Ahmad, Center for Complex Systems Research, University of Illinois at Urbana-Champaign, 508 S. 6th St., Champaign, IL 61820; Gerald Tesauro, IBM Watson Research Center, PO Box 704, Yorktown Heights, NY 10598 ABSTRACT Th... | 1988 | 80 |
169 | A MODEL OF NEURAL OSCILLATOR FOR A UNIFIED SUBMODULE A.B. Kirillov, G.N. Borisyuk, R.M. Borisyuk, Ye.I. Kovalenko, V.I. Makarenko, V.A. Chulaevsky, V.I. Kryukov Research Computer Center USSR Academy of Sciences Pushchino, Moscow Region 142292 USSR ABSTRACT A new model of a controlled neuron... | 1988 | 81 |
170 | AN OPTIMALITY PRINCIPLE FOR UNSUPERVISED LEARNING Terence D. Sanger MIT AI Laboratory, NE43-743 Cambridge, MA 02139 (tds@wheaties.ai.mit.edu) ABSTRACT We propose an optimality principle for training an unsupervised feedforward neural network based upon maximal ability to reconstruct the input da... | 1988 | 82 |
171 | NEURAL ANALOG DIFFUSION-ENHANCEMENT LAYER AND SPATIO-TEMPORAL GROUPING IN EARLY VISION Allen M. Waxman, Michael Seibert, Robert Cunningham and Jian Wu; Laboratory for Sensory Robotics, Boston University, Boston, MA 02215; Machine Intelligence Group, MIT Lincoln Laboratory, Lexington, MA ... | 1988 | 83 |
172 | OPTIMIZATION BY MEAN FIELD ANNEALING Griff Bilbro (ECE Dept., NCSU, Raleigh, NC 27695), Reinhold Mann (Eng. Physics and Math. Div., Oak Ridge Natl. Lab., Oak Ridge, TN 37831), Thomas K. Miller (ECE Dept., NCSU, Raleigh, NC 27695), Wesley E. Snyder, David E. Van den Bout, Mark White ECE... | 1988 | 84 |
173 | AN APPLICATION OF THE PRINCIPLE OF MAXIMUM INFORMATION PRESERVATION TO LINEAR SYSTEMS Ralph Linsker IBM T. J. Watson Research Center, Yorktown Heights, NY 10598 ABSTRACT This paper addresses the problem of determining the weights for a set of linear filters (model "cells") so as to maximize ... | 1988 | 85 |
174 | SPREADING ACTIVATION OVER DISTRIBUTED MICROFEATURES James Hendler Department of Computer Science University of Maryland College Park, MD 20742 ABSTRACT One attempt at explaining human inferencing is that of spreading activation, particularly in the structured connectionist paradigm. This ... | 1988 | 86 |
175 | Consonant Recognition by Modular Construction of Large Phonemic Time-Delay Neural Networks Alex Waibel Carnegie-Mellon University Pittsburgh, PA 15213; ATR Interpreting Telephony Research Laboratories Osaka, Japan Abstract In this paper we show that neural networks for speech recognition can b... | 1988 | 87 |
176 | CONSTRAINTS ON ADAPTIVE NETWORKS FOR MODELING HUMAN GENERALIZATION M. Pavel, Mark A. Gluck, Van Henkle, Department of Psychology, Stanford University, Stanford, CA 94305 ABSTRACT The potential of adaptive networks to learn categorization rules and to model human performance is studied by ... | 1988 | 88 |
177 | Self Organizing Neural Networks for the Identification Problem Manoel Fernando Tenorio and Wei-Tsih Lee, School of Electrical Engineering, Purdue University, W. Lafayette, IN 47907, tenorio@ee.ecn.purdue.edu ABSTRACT... | 1988 | 89 |
178 | A PASSIVE SHARED ELEMENT ANALOG ELECTRICAL COCHLEA Joe Eisenberg, Bioeng. Group, U.C. Berkeley; David Feld, Dept. Elect. Eng., 207-30 Cory Hall, U.C. Berkeley, Berkeley, CA 94720; Edwin Lewis, Dept. Elect. Eng., U.C. Berkeley ABSTRACT We present a simplified model of the micromech... | 1988 | 9 |
179 | LEARNING WITH TEMPORAL DERIVATIVES IN PULSE-CODED NEURONAL SYSTEMS Mark Gluck, David B. Parker, Eric S. Reifsnider, Department of Psychology, Stanford University, Stanford, CA 94305 Abstract A number of learning models have recently been proposed which involve calculations of temporal differenc... | 1988 | 90 |
180 | USING BACKPROPAGATION WITH TEMPORAL WINDOWS TO LEARN THE DYNAMICS OF THE CMU DIRECT-DRIVE ARM II K. Y. Goldberg and B. A. Pearlmutter School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 ABSTRACT Computing the inverse dynamics of a robot arm is an active area of r... | 1988 | 91 |
181 | STORING COVARIANCE BY THE ASSOCIATIVE LONG-TERM POTENTIATION AND DEPRESSION OF SYNAPTIC STRENGTHS IN THE HIPPOCAMPUS Patric K. Stanton and Terrence J. Sejnowski Department of Biophysics Johns Hopkins University Baltimore, MD 21218 ABSTRACT In modeling studies of memory based on neural n... | 1988 | 92 |
182 | DYNAMIC, NON-LOCAL ROLE BINDINGS AND INFERENCING IN A LOCALIST NETWORK FOR NATURAL LANGUAGE UNDERSTANDING Trent E. Lange Michael G. Dyer Artificial Intelligence Laboratory Computer Science Department University of California, Los Angeles Los Angeles, CA 90024 ABSTRACT This paper introduce... | 1988 | 93 |
183 | FIXED POINT ANALYSIS FOR RECURRENT NETWORKS Mary B. Ottaway, Patrice Y. Simard, Dana H. Ballard, Dept. of Computer Science, University of Rochester, Rochester NY 14627 ABSTRACT This paper provides a systematic analysis of the recurrent backpropagation (RBP) algorithm, introducing a number of new r... | 1988 | 94 |
184 | Comparing the Performance of Connectionist and Statistical Classifiers on an Image Segmentation Problem Sheri L. Gish, W. E. Blanz, IBM Almaden Research Center, 650 Harry Road, San Jose, CA 95120 ABSTRACT In the development of an image segmentation system for real time ... | 1989 | 1 |
185 | Neural Network Weight Matrix Synthesis Using Optimal Control Techniques O. Farotimi, A. Dembo, T. Kailath, Information Systems Lab., Electrical Engineering Dept., Stanford University, Stanford, CA 94305 ABSTRACT Given a set of input-output training samples,... | 1989 | 10 |
186 | Combining Visual and Acoustic Speech Signals with a Neural Network Improves Intelligibility T.J. Sejnowski The Salk Institute and Department of Biology The University of California at San Diego San Diego, CA 92037 B.P. Yuhas M.H. Goldstein... | 1989 | 100 |
187 | The Computation of Sound Source Elevation in the Barn Owl Clay D. Spence John C. Pearson David Sarnoff Research Center CN5300 Princeton, NJ 08543-5300 ABSTRACT The midbrain of the barn owl contains a map-like representation of sound source direction which is used to p... | 1989 | 101 |
188 | Collective Oscillations in the Visual Cortex Daniel Kammen & Christof Koch Computation and Neural Systems Caltech 216-76 Pasadena, CA 91125 Philip J. Holmes Dept. of Theor. & Applied Mechanics Cornell University Ithaca, NY 14853 ABSTRACT The firing patt... | 1989 | 11 |
189 | Neural networks: the early days J.D. Cowan Department of Mathematics, Committee on Neurobiology, and Brain Research Institute, The University of Chicago, 5734 S. Univ. Ave., Chicago, Illinois 60637 ABSTRACT A short account is given of various investigations of neural network prop... | 1989 | 12 |
190 | Operational Fault Tolerance of CMAC Networks Michael J. Carter Franklin J. Rudolph Adam J. Nucci Intelligent Structures Group Department of Electrical and Computer Engineering University of New Hampshire Durham, NH 03824-3591 ABSTRACT The performance sen... | 1989 | 13 |
191 | Effects of Firing Synchrony on Signal Propagation in Layered Networks G. T. Kenyon, E. E. Fetz, R. D. Puff, Department of Physics FM-15, Department of Physiology and Biophysics SJ-40, University of Washington, Seattle, Wa. 9... | 1989 | 14 |
192 | Acoustic-Imaging Computations by Echolocating Bats: Unification of Diversely-Represented Stimulus Features into Whole Images. James A. Simmons Department of Psychology and Section of Neurobiology, Division of Biology and Medicine Brown University, Providence, RI 02912. ABSTRACT ... | 1989 | 15 |
193 | Time Dependent Adaptive Neural Networks Fernando J. Pineda Center for Microelectronics Technology Jet Propulsion Laboratory California Institute of Technology Pasadena, CA 91109 ABSTRACT A comparison of algorithms that minimize error functions to train the trajectories of recurre... | 1989 | 16 |
194 | NEURAL NETWORK VISUALIZATION Jakub Wejchert Gerald Tesauro IBM Research T.J. Watson Research Center Yorktown Heights NY 10598 ABSTRACT We have developed graphics to visualize static and dynamic information in layered neural network learning systems. Emp... | 1989 | 17 |
195 | Discovering the Structure of a Reactive Environment by Exploration Michael C. Mozer Department of Computer Science and Institute of Cognitive Science University of Colorado Boulder, CO 80309-0430 ABSTRACT Jonathan Bachr... | 1989 | 18 |
196 | A Self-organizing Associative Memory System for Control Applications Michael Hormel Department of Control Theory and Robotics Technical University of Darmstadt Schlossgraben 1 6100 Darmstadt/W.-Germany ABSTRACT The CMAC storage scheme has been used as a basis for a softwar... | 1989 | 19 |
197 | Analog Circuits for Constrained Optimization John C. Platt Computer Science Department, 256-80 California Institute of Technology Pasadena, CA 91125 ABSTRACT This paper explores whether analog circuitry can adequately perform constrained optimi... | 1989 | 2 |
198 | Complexity of Finite Precision Neural Network Classifier Amir Dembo, Kai-Yeung Siu, Thomas Kailath, Inform. Systems Lab., Stanford University, Stanford, Calif. 94305 ABSTRACT ... | 1989 | 20 |
199 | Neuronal Group Selection Theory: A Grounding in Robotics Jim Donnett and Tim Smithers Department of Artificial Intelligence University of Edinburgh 5 Forrest Hill Edinburgh EH1 2QL SCOTLAND ABSTRACT In this paper, we discuss a current attempt at applying the orga... | 1989 | 21 |