
Brain 0 Project
the problem of a brain-like universal learning computer

Victor Eliashberg
Palo Alto, California

  About the Brain 0 Project

As a systems engineer interested in brain modelling and cognitive modelling, I have been trying to "reverse-engineer" the basic principles of organization of a brain-like universal learning computer since the late sixties. This web site is an attempt to communicate some insights resulting from this long-term project. I refer to this project as the Brain 0 project, the attribute "0" denoting the state of the human brain at the beginning of learning (t=0). I argue that there exists a relatively short formal representation of Brain 0, because this representation is encoded in some form in the human genome. Given a powerful enough "initial approximation," it is possible to learn a great deal about Brain 0 from the analysis of basic psychological and neurobiological observations. (See the following paper.)


  1. Eliashberg, V. (2003). Cognitive system (Man,World): the big picture. Web publication, Palo Alto, California (.html file).

    This paper takes a broader look at the cognitive system (MAN,WORLD) to explain the general methodology of the Brain 0 project. The outlined big picture helps to understand the motivation for the computational models discussed in the other papers. The brain (particularly its neocortex) is viewed as a "nonclassical symbolic system" in which the probabilities of sequential "symbolic" processes are controlled by massively parallel "dynamical" processes -- this general vision is formalized as the "concept of E-machine." The term "nonclassical" implies a loose association with the formalism of quantum mechanics: in both cases, the probabilities of discrete processes are controlled by transformations of continuous functions.

    Interestingly enough, there exists an efficient formal representation of the whole probabilistic-symbolic-dynamical behavior of an E-machine. At the same time, in nontrivial cases, it is practically impossible to find separate formal representations of either the symbolic or the dynamical projections of this behavior. The whole behavior turns out to be a simpler mathematical object than its parts. To get a preliminary intuition as to how this is possible, recall the general relationship between a complex analytic function and its real and imaginary parts. As another example, consider classical electrodynamics: there exists an efficient formal representation of the "whole" behavior of the electromagnetic field (the Maxwell equations), yet, in the case of nontrivial external constraints, it is practically impossible to find separate representations of either the electric or the magnetic projections of this behavior. The metaphor "neocortex as an E-machine" suggests that, in a much more sophisticated form, the same holds for the physical phenomenon of human context-dependent behavior. An attempt to find separate formal representations of the parts of this behavior corresponding to different contexts leads to a combinatorial explosion in the number of partial models needed to cover the whole context-dependent behavior.
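The core idea -- a symbolic process whose branch probabilities are modulated by a continuous, decaying "residual excitation" state -- can be sketched in a few lines. The following toy model is my illustration, not Eliashberg's formalism: the class name, the decay/boost parameters, and the update rule are all invented for this sketch.

```python
import random

# Toy illustration of the E-machine idea (all names and parameters are
# hypothetical): a symbolic process whose branch probabilities are
# biased by a continuous "residual excitation" (E-state) vector that
# is updated in parallel and decays over time.

class ToyEMachine:
    def __init__(self, rules, decay=0.9, boost=1.0):
        # rules: symbol -> list of possible next symbols
        self.rules = rules
        self.e_state = {s: 0.0 for s in rules}  # residual excitation per symbol
        self.decay, self.boost = decay, boost

    def step(self, symbol):
        candidates = self.rules[symbol]
        # Symbolic branch: probabilities depend on the current E-state,
        # so the same rules behave differently in different "contexts".
        weights = [1.0 + self.e_state[c] for c in candidates]
        choice = random.choices(candidates, weights=weights)[0]
        # Parallel "dynamical" process: all excitations decay,
        # the chosen branch is reinforced.
        for s in self.e_state:
            self.e_state[s] *= self.decay
        self.e_state[choice] += self.boost
        return choice

random.seed(0)
rules = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
m = ToyEMachine(rules)
seq = ["A"]
for _ in range(10):
    seq.append(m.step(seq[-1]))
```

Note that the sequence of symbols and the evolution of the E-state are generated by one compact rule, even though neither projection alone admits a simple standalone description -- a loose analogue of the "whole simpler than its parts" point above.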

  2. Eliashberg, V. (1979). The Concept of E-machine and the Problem of Context-Dependent Behavior. Palo Alto, CA. Copyright 1980 by Victor Eliashberg, TXu40-302, US Copyright Office, 1-158 (.pdf file, 8.5 MB).

    ABSTRACT. The concept of a "non-classical" symbolic processor (E-machine) is developed as an attempt to match the known general pattern of "analogo-discrete" information processes in the associative neural networks of the brain. An E-machine performs two types of processes: 1) serial symbolic processes similar, in a sense, to those in the Turing machine, and 2) quasi-parallel non-symbolic processes resembling those in analog computers for simulating partial differential equations. The latter are interpreted as transformations of some residual excitation states (E-states) in the associative neural structures of the brain and are employed as a tool for controlling the probabilities of different branches of the serial symbolic processes depending on context. Several extreme cases of the behavior of E-machines are investigated to establish a link between the concept of E-machine and some basic concepts from the area of classical symbolic machines. This link provides an insight into more complex forms of E-machine behavior that have no good classical counterparts. Some psycho-cybernetical applications of the concept of E-machine are discussed, primarily for understanding the effects of so-called context-dependent behavior, which are difficult to express in classical symbolic terms. Several specially simplified examples of such behavior are demonstrated by computer simulation. The main goal of the study is to promote the idea that the symbolic transformations underlying man's behavior may be of the same non-classical type as those performed by E-machines. This would give a general explanation of why it is so difficult to find their descriptions in terms of traditional algorithms for processing symbolic information.

  3. Eliashberg, V. (1981). The concept of E-machine: On brain hardware and the algorithms of thinking. Proceedings of the Third Annual Meeting of the Cognitive Science Society, U.C. Berkeley 289-291.   .pdf file.

    This paper presents a simple example of an E-machine implemented as a three-layer associative neural network with neuromodulation. It is shown that an E-machine can be dynamically reconfigured into a combinatorial number of different symbolic machines by changing its dynamical E-states.

  4. Eliashberg, V. (1989). Context-sensitive associative memory: "Residual excitation" in neural networks as the mechanism of STM and mental set. Proceedings of IJCNN-89, June 18-22, 1989, Washington, D.C. vol. I, 67-75 (.pdf file).

    This paper discusses several models of context-sensitive associative memory that address the problems of short-term memory, temporal associations and temporal context (mental set).

  5. Eliashberg, V. (1990a). Universal learning neurocomputers. Proceedings of the Fourth Annual Parallel Processing Symposium, California State University, Fullerton, April 4-6, 1990, 181-191 (.pdf file).

    This paper gives a psychological interpretation of the basic types of behavior (roughly corresponding to Chomsky's hierarchy of formal languages) and discusses the general levels of computing power needed to implement these types of behavior. It then introduces the concept of a universal learning neurocomputer (type 0) arranged on the principle of E-machine.

  6. Eliashberg, V. (1990b). Molecular dynamics of short-term memory. Mathematical and Computer Modeling in Science and Technology, vol. 14, 295-299 (.pdf file).

    This paper treats a single protein molecule (such as an ion channel) as an abstract microscopic probabilistic machine (a first-order Markov system). The formalism is used to reformulate the classical Hodgkin-Huxley theory in system-theoretical terms and to simulate the generation of a nerve spike. It is pointed out that this approach can be naturally extended to represent various phenomena of neuromodulation and cellular short-term memory.
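The "single molecule as a first-order Markov system" viewpoint can be illustrated with the standard Hodgkin-Huxley potassium gating particle. The sketch below is my own minimal rendering of that viewpoint, not the paper's actual formalism: one n-gate is simulated as a two-state Markov chain whose transition rates are the classical HH alpha_n/beta_n expressions (V in mV relative to rest).

```python
import math
import random

# Classical Hodgkin-Huxley rate functions for the potassium n-gate
# (V in mV relative to resting potential, rates in 1/ms).
def alpha_n(v):
    return 0.01 * (10.0 - v) / (math.exp((10.0 - v) / 10.0) - 1.0)

def beta_n(v):
    return 0.125 * math.exp(-v / 80.0)

def simulate_gate(v, steps=100000, dt=0.01, seed=1):
    """Fraction of time a single n-gate spends open at clamped voltage v,
    simulated as a first-order Markov chain (closed <-> open)."""
    rng = random.Random(seed)
    is_open, open_time = False, 0
    for _ in range(steps):
        if is_open:
            if rng.random() < beta_n(v) * dt:   # open -> closed
                is_open = False
        else:
            if rng.random() < alpha_n(v) * dt:  # closed -> open
                is_open = True
        open_time += is_open
    return open_time / steps

# At steady state the empirical open fraction of the stochastic
# single-molecule machine approaches the deterministic HH value
# n_inf = alpha / (alpha + beta).
v = 25.0
n_inf = alpha_n(v) / (alpha_n(v) + beta_n(v))
est = simulate_gate(v)
```

The deterministic HH equations then reappear as the large-ensemble average of many such independent microscopic machines, which is the bridge the paper draws between molecular dynamics and the macroscopic spike.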

  7. Eliashberg, V. (1993). A relationship between neural networks and programmable logic arrays. Proceedings of the International Conference on Neural Networks, San Francisco, CA, March 28 - April 1, 1993, vol. III, 1333-1337 (.pdf file).

    This paper demonstrates a remarkable similarity between the topological structure of some popular neural networks and that of the Programmable Logic Arrays (PLA).
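The structural analogy can be made concrete with a tiny example. The sketch below is my illustration of the general idea, assuming nothing about the paper's specific construction: a PLA is an AND-plane feeding an OR-plane, which is the same two-layer crossbar topology as a simple feedforward network of threshold units. Here XOR of two inputs is computed by a two-term PLA whose plane matrices play the role of weight matrices.

```python
# Inputs are presented with their complements: [x1, ~x1, x2, ~x2],
# as in a real PLA input column.
# Each AND-plane row selects which literals one product term uses.
AND_PLANE = [
    [1, 0, 0, 1],   # term 1:  x1 AND NOT x2
    [0, 1, 1, 0],   # term 2:  NOT x1 AND x2
]
# Each OR-plane row selects which product terms one output sums.
OR_PLANE = [
    [1, 1],         # output = term1 OR term2  (i.e. XOR)
]

def pla_eval(x1, x2):
    literals = [x1, 1 - x1, x2, 1 - x2]
    # AND plane: a term fires iff every selected literal is 1 --
    # like a neuron whose threshold equals its number of inputs.
    terms = [int(all(l for l, sel in zip(literals, row) if sel))
             for row in AND_PLANE]
    # OR plane: an output fires iff any selected term fires --
    # like a neuron with threshold 1.
    outputs = [int(any(t for t, sel in zip(terms, row) if sel))
               for row in OR_PLANE]
    return outputs[0]
```

Replacing the 0/1 entries of the two plane matrices with real-valued weights, and the all/any rules with thresholded sums, turns the same crossbar topology into the familiar two-layer neural network.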

  8. Eliashberg, V. (2002). What Is Working Memory and Mental Imagery? A Robot that Learns to Perform Mental Computations. Web publication, Palo Alto, California (.html file).

    This paper discusses a simple robot that can be trained (in an experiment of forced motor training) to perform, in principle, any algorithm using an external memory aid (a tape similar to that of a Turing machine). After performing a sufficient number of computations with the use of the external memory device, the robot learns to perform similar mental computations using the corresponding imaginary memory device. The robot's brain is organized as a complex E-machine consisting of two primitive E-machines: one is responsible for motor control, the other for working memory and mental imagery. The model is implemented as a user-friendly program, EROBOT, for Microsoft Windows.

    This paper is also available from the Cornell/Los Alamos pre-print server

  9. Eliashberg, V. (2005). Ensembles of membrane proteins as statistical mixed-signal computers. IJCNN 2005 Proceedings. (.pdf file).

    The paper develops the formalism introduced in Eliashberg (1990b). It is shown how ensembles of membrane proteins could provide a robust statistical implementation of a class of mixed-signal computers combining the dynamical capabilities of analog computers with the sequencing capabilities of state machines. It is suggested that such molecular computers account for the main volume of the brain's hardware computations. This "low-level" approach is consistent with the "high-level" metaphor "the neocortex as an E-machine." It is not consistent with the notion of the brain as a "distributed connectionist system."

  10. Eliashberg, V. (2009). A nonclassical symbolic theory of working memory, mental computations, and mental set. This paper is also available from the Cornell/Los Alamos pre-print server.

    Conventional computers process symbolic information by manipulating symbols in a read/write memory -- call it a RAM buffer. This classical symbolic computational paradigm encounters serious problems as a metaphor for the symbolic level of information processing in the human brain. It is unlikely that the brain has a counterpart of a conventional RAM buffer. Even if such a buffer existed it would be too small and too slow to allow the brain to efficiently process symbolic information in a traditional way. How can the brain produce such cognitive phenomena as working memory, mental computations, and language without a RAM buffer?