Michael Brooks looks at the coming together of cognitive neuroscience and computing
The aim of this brief article is to stimulate discussion of a new scientific convergence.
Two Oxford physicists - David Deutsch and Chiara Marletto - have remarked upon the trend: “When we consider some of the most striking phenomena permitted by the laws of physics – from human reasoning to computer technologies and the replication of genes – we find that information plays a central role.”
Biogeneticist J Craig Venter has written recently that we have now entered what he calls “the digital age of biology”. Here, he says, “the once-distinct domains of computer codes and those that program life are beginning to merge”.
Philosopher Daniel Dennett has identified something similar: his chapter on programming in the book Intuition Pumps and Other Tools for Thinking is an excellent primer for the non-specialist, and makes several salient points on convergence. “Our present transistor-based systems,” he says, “are only the current ‘hardware’ that runs our ‘software’. Researchers all over the world are racing to replace them with ... biological mechanisms using DNA and proteins.”
The fascinating nature of this particular convergence is also noted by Edinburgh physiologist Jamie A Davies. Exceptionally valuable insights into human development have, he says, “been contributed by researchers in fields that might seem at first to have nothing to do with the topic, such as mathematics, physics, computer science and even philosophy.” Microsoft and UCL have announced a joint programme to explore “the convergence of carbon- and silicon-based life forms – the ‘interactome’”.
It is exciting that mathematicians and philosophers are working with biologists and computer scientists. But our excitement should go deeper than that. For the last fifty years we have thought of computers only as useful information processors. However, just as Galileo looked through his telescopes and Hooke through his microscopes, computers also seem to give us a means of seeing things we have not been able to see before, creating a new opportunity.
This is not just about collaboration between fields; it seems there is genuine convergence. It is becoming clear that many of our most complex challenges are, in fact, just one challenge: understanding information processing, whether it is carried out by the human brain, the universe, DNA or a silicon-based computer.
Research is uncovering remarkable similarities between three computing systems in particular: the coding, structure and systems used by DNA to build living organisms; the processes that enable the brain to develop memory, learning and thinking; and the way we program computers to carry out ever more sophisticated tasks. It seems likely that greater exchange of ideas and closer cooperation will enable these disciplines to learn from each other and expand everyone’s knowledge.
Of the three systems, computing is the one we understand best because we have built its code from the ground up. In his 1936 paper ‘On Computable Numbers’, computer pioneer Alan Turing showed that it should be possible to perform almost any calculation with a machine using just three instructions: add, subtract and conditionally jump to another instruction. These instructions are compiled into a coding system we call ‘software’ (to differentiate it from the physical ‘hardware’). Thus was born the science of software, and it seems that studying how computer code works might be our best hope of gaining insights into the functioning of the other two systems.
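The surprising power of that tiny instruction set is easy to demonstrate. The sketch below runs a toy machine whose only operations are add, subtract and a conditional jump; the instruction names, register layout and program are illustrative inventions, not Turing's own notation.

```python
# A toy machine with only three instructions: ADD, SUB and a
# conditional jump (JNZ: jump if a register is non-zero).

def run(program, registers):
    """Execute a list of (op, *args) tuples until we fall off the end."""
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":            # ADD r, n: registers[r] += n
            registers[args[0]] += args[1]
        elif op == "SUB":          # SUB r, n: registers[r] -= n
            registers[args[0]] -= args[1]
        elif op == "JNZ":          # JNZ r, target: jump if registers[r] != 0
            if registers[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return registers

# Multiply 6 by 7 using nothing but add, subtract and conditional jump.
program = [
    ("ADD", "acc", 7),   # 0: acc += 7
    ("SUB", "n", 1),     # 1: n -= 1
    ("JNZ", "n", 0),     # 2: repeat while n is not zero
]
print(run(program, {"acc": 0, "n": 6})["acc"])  # prints 42
```

Even multiplication, which the machine has no instruction for, falls out of looping the instructions it does have.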
Our growing familiarity with the concept of software has perhaps inured us to the fact that software is little more than information management - and this is a task also carried out by the machinery of life.
Eight years after Turing’s breakthrough, the Austrian physicist Erwin Schrödinger published a controversial book called What is Life? Here, he applied the rules of physics to the thorny issues confronting biologists of the day and came up with a remarkable hypothesis. Life, he said, had to obey the laws of physics that govern the entire universe. That must mean that chromosomes contain “some kind of code-script determining the entire pattern of the individual’s future development”. He even suggested that the code could be as basic as a binary code of the kind that Turing had used to create the era of mechanical computing.
Following Crick and Watson’s discovery of the structure of DNA, we now know that Schrödinger was right. Strands of DNA are made up of billions of base pairs, each formed from two of four nucleotides (guanine, cytosine, thymine and adenine) held in the double helix structure. Triplets of these bases form ‘codons’. Patterns of codons specify a ‘gene’, with ‘start’ and ‘stop’ codons to define its beginning and end and determine which sections are used for which purpose. Thousands of genes make up the twenty-three pairs of chromosomes that specify the complete human genome.
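The codon structure described above is regular enough to be read in software: scan a DNA string in triplets, treating ATG as the standard ‘start’ codon and TAA, TAG and TGA as ‘stop’ codons. This is a minimal sketch; the sequence below is invented for illustration.

```python
# Scan a DNA string in triplets, from the first 'start' codon (ATG)
# to the next 'stop' codon, and return the codons in between.

START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def find_gene(dna):
    """Return the codons between the first start codon and the next stop."""
    i = dna.find(START)
    if i == -1:
        return []
    codons = []
    for j in range(i, len(dna) - 2, 3):   # step through triplets in frame
        codon = dna[j:j + 3]
        if codon in STOPS:
            break
        codons.append(codon)
    return codons

print(find_gene("GGCATGGCTTACTGACC"))  # prints ['ATG', 'GCT', 'TAC']
```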
This is the code of life. But, as with an unpowered electronic computer, this DNA code can do nothing by itself; it is inert. Only when the cell’s chemical machinery acts on it is the code accessed. This happens through the creation of RNA, which translates genes into chains of the twenty amino acids. These chains fold into proteins, including the enzymes, which in turn build and run the cells that carry out every function in the whole body. Thus, through this hierarchy of structures, or subroutines, one long string of nucleotides is converted into the most complex systems on earth.
Unlike computer code, the genetic code appears to have the ability to change itself at random. These mutations can arise from copying errors when the code is replicated in the nucleus. If such an ‘error’ leads to a significant physical change which enhances the organism’s chances of survival, the mutated gene will be passed on and become part of the species’ genetic inheritance.
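That mutate-and-select loop can be sketched in a few lines: copy a ‘genome’ with occasional random errors, and keep a copy only when it scores at least as well on a (deliberately simple) fitness test. The target string and scoring are illustrative assumptions, not a model of real genetics.

```python
import random

BASES = "GCTA"
TARGET = "GATTACA"   # stands in for 'well adapted to its environment'

def fitness(genome):
    """Count positions where the genome matches the target."""
    return sum(a == b for a, b in zip(genome, TARGET))

def replicate(genome, error_rate=0.05):
    """Copy the genome, with a small chance of error at each base."""
    return "".join(random.choice(BASES) if random.random() < error_rate else b
                   for b in genome)

random.seed(1)
genome = "CCCCCCC"
for generation in range(5000):
    child = replicate(genome)
    if fitness(child) >= fitness(genome):   # the advantage is inherited
        genome = child

print(fitness(genome), genome)
```

Random copying errors plus a survival filter are enough to push the genome steadily towards the target, with no designer in sight.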
To fully understand a code we need to know the function for which it was designed. Computer code is written to enable lumps of inorganic materials to perform large, often repetitive calculations and tasks quickly and accurately. DNA’s main function is to reproduce life forms that will be successful in their environments.
The brain’s coding appears to have the purpose of extending the efficacy of DNA’s replication function. Our brains provide three extra resources to this end. First, it seems that inherited DNA coding has equipped some organisms’ brains with hardwired instructions that give them instinctual knowledge of a few survival tactics specific to their native environment. Second, the brain can strengthen the organism’s chances of success by coordinating its motor resources to respond to danger and to seek out and exploit opportunities for sustaining life. Third, it enables the organism to learn from experience.
This multi-functional system we call the brain, then, hosts the most interesting and difficult-to-access coding system of all. The basic mechanisms are broadly understood. The brain’s neurons connect to one another across tiny gaps called ‘synapses’. Messages, in the form of electrochemical pulses, are input through ‘dendrite’ filaments, which run to the nucleus from every organ in the body and from other neurons. Output pulses travel from the nucleus along ‘axon’ filaments to every muscle, gland, organ and other neurons to stimulate an action response.
There are some similarities with better-known forms of computing. Though the neuron nuclei initiate signals, the operation of the synapses is reminiscent of transistors (with analogue rather than digital signals). However, as we understand it, the brain has one significant difference from DNA and digital computers: an ability to grow new ‘hardware’ in the form of new links between neurons and the sensory organs monitoring the environment. The psychologist and neuroscientist Donald Hebb described this principle, now often summarised as “neurons that fire together, wire together”.
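Hebb’s rule can be written as a bare-bones update: when two connected units are active at the same time, the weight of the synapse between them is increased. The learning rate and activity patterns below are illustrative assumptions.

```python
# A minimal sketch of Hebbian learning: strengthen the connection
# between any two units that are active ('fire') at the same time.

def hebbian_update(weights, activity, rate=0.1):
    """Increase weights[i][j] whenever units i and j fire together."""
    n = len(activity)
    for i in range(n):
        for j in range(n):
            if i != j and activity[i] and activity[j]:
                weights[i][j] += rate
    return weights

# Three units; units 0 and 1 repeatedly fire together, unit 2 stays silent.
w = [[0.0] * 3 for _ in range(3)]
for _ in range(10):
    hebbian_update(w, [1, 1, 0])

print(round(w[0][1], 1), w[0][2])  # prints 1.0 0.0
```

After ten co-activations the 0–1 link has strengthened while the link to the silent unit is unchanged: the network’s ‘wiring’ now reflects its history.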
By the time it is born, a child has grown many billions of neurons. Half of these connect up all the organs throughout the body, with the other half concentrated in the brain. It has been known for some time that a mature adult brain contains many trillions of neural connections. Recent research counting neuron nuclei suggests that a mature brain has some eighty-six billion neurons. In effect, the brain continuously grows its own network hardware: one estimate suggests we grow between a thousand and a million new neural links or structures every second.
How does this happen? When a stimulus from a sensory organ is received (as an electrochemical pulse travelling up a dendrite), the neuron transmits pulses across the network, attempting to trigger a pathway formed by an earlier instance of the same experience. If it finds an existing path (a ‘yes’), it activates, and strengthens, the response used for the earlier experience: it learns. If it doesn’t (a ‘no’), it triggers the growth of a new link (a ‘conditional jump’), laying the new experience into the brain’s memory stores. Eric Kandel won his Nobel Prize for demonstrating this basic learning process in the very simple nervous system of the sea slug Aplysia.
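The ‘yes’/‘no’ process just described can be sketched with a plain dictionary standing in for the brain’s pathways: a familiar stimulus reuses and strengthens an existing path, while an unfamiliar one triggers the growth of a new link. The stimuli and responses are invented for illustration.

```python
# A sketch of the stimulus-response process: look up a pathway,
# strengthen it if it exists, or grow a new link if it doesn't.

def process(pathways, stimulus, new_response):
    if stimulus in pathways:                          # 'yes': path exists
        response, strength = pathways[stimulus]
        pathways[stimulus] = (response, strength + 1)  # strengthen: learn
        return response
    pathways[stimulus] = (new_response, 1)             # 'no': grow new link
    return new_response

pathways = {}
process(pathways, "loud noise", "startle")   # new experience: grow a link
process(pathways, "loud noise", "ignore")    # familiar: reuse and strengthen

print(pathways)  # prints {'loud noise': ('startle', 2)}
```

Note that the second, familiar stimulus does not overwrite the stored response; it simply reinforces the path laid down by the first experience.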
Interestingly, this choice (“yes”, or a conditional jump) mirrors the operation of the computing machine that Turing envisioned. The result, though, is a massive relational database of trillions of connections linking up related experiences. Very quickly, it has so much information stored in its neurons that some sort of hierarchy of focus has to develop; this allows the organism to respond to imminent danger as fast as possible, while, at other times allowing the system to pause, reflect and develop more efficient responses.
There is much left to discover, but there is good reason for optimism in the convergence of the research into biological, computer-based and neurological information processing. The more we learn to develop our computers, the more this can inform our understanding of biogenetics and cognitive neuroscience. Learning more about the way DNA is able to grow the hardware of our bodies, in turn, helps us to learn more about how the brain grows all the neural networks and structures that enable us to learn, to think and to be creative. The feedback loop goes on: the more we learn about both DNA and the brain, the better equipped we will be to conceive and design new generations of computers.
In summary, semiconductors, nucleotides and neurons all create incredibly complex structures out of very simple basic units capable of being switched ‘on’ or ‘off’. We know that the elegant simplicity of this architecture can create computers capable of supporting open heart surgery and putting Man on the moon. The same basic architecture allows DNA to specify the creation of new life. The human brain’s ability to create complexity from simple structures enables the creation of minds capable of designing those computers and understanding DNA.
As we probe further into the links between these systems, we can be confident that their remaining secrets will be uncovered and harnessed for specific applications. There are many medical, technological and humanitarian reasons to pursue this convergence; it may be the key to massive improvements in healthcare, to manufacturing transformed by artificial intelligence, and to innovations that enable us to overcome challenges such as climate change. We have uncovered a route to a better world, and must now encourage philosophers, physicists, computer scientists, psychologists and biologists to grab the opportunity with both hands.
David Deutsch and Chiara Marletto, ‘Reconstructing physics: The universe is information’, New Scientist, 21 May 2014, p. 30
J Craig Venter, Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, Little, Brown (2013), p. 2
Jamie A Davies, Life Unfolding: How the Human Body Constructs Itself, Oxford University Press (2014), p. 4