Prosthetic and Neural Implants

It is probably worth clarifying a number of general points regarding the discussion of neural implants. In very general terms, implants will be described as belonging to two major classes, as follows:

  • Prosthetic Implants - Motor and Sensory
  • Neural Implants - Passive and Active

What we will mean by a prosthetic implant is a device that connects to an existing motor or sensory system. Today, a prosthetic limb tries to replicate some of the basic movements of the original limb, but the connections between the prosthetic limb and the brain remain broken. In most cases, we know that the brain functions required to move the original limb still exist, although 'neural memory' does decay if not used. Therefore, in principle, a prosthetic limb that supports a neural interface, via connections to the original nerve fibres, could offer the hope of an artificial limb controlled by the brain. In the case of a sensory prosthetic, such as a cochlear implant, the implant would restore the audio input of the ear onto the original nerve fibres. Again, provided the brain function existed in the first place, some level of hearing could be restored.

In contrast, a neural implant will be described as a device connected directly to the brain or brain stem, which operates in one of two modes. A passive implant would intercept brain activity and convert that activity into a digital stream or pattern. This information could then be used as an input by an external AI system to infer some mental state, or possibly memories, within the brain. An active implant would provide the return channel, allowing information to be input directly into the brain. However, as with prosthetic implants, this information would only be meaningful if the brain supported an appropriate configuration of neurons to interpret the pattern superimposed on it via the implants. That said, one of the greatest attributes of the neural network within the human brain is its ability to adapt and learn, and it has been shown, in principle at least, that neurons in the brain can adapt to new information received via neural implants and learn to perform new functions.
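To make the passive mode a little more concrete, the sketch below reduces it to its simplest software form: a sampled neural signal is converted into a digital feature (here, mean spectral power in two frequency bands) and mapped onto a coarse mental state. Every name, threshold and band boundary here is an illustrative assumption, not a description of any real device.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power of `signal` within the [low, high] Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def infer_state(signal, fs=256):
    """Classify a coarse mental state from relative band power.
    Slow delta-dominant activity (0.5-4 Hz) is read as 'sleeping',
    faster alpha/beta-dominant activity (8-30 Hz) as 'awake'."""
    delta = band_power(signal, fs, 0.5, 4.0)
    beta = band_power(signal, fs, 8.0, 30.0)
    return "sleeping" if delta > beta else "awake"

# Synthetic stand-ins for an intercepted signal: a slow 2 Hz wave
# versus a faster 20 Hz wave, sampled for two seconds at 256 Hz.
t = np.arange(0, 2, 1 / 256)
print(infer_state(np.sin(2 * np.pi * 2 * t)))   # delta-dominant -> sleeping
print(infer_state(np.sin(2 * np.pi * 20 * t)))  # beta-dominant -> awake
```

Even this toy example shows the asymmetry the text describes: classifying a gross state from an intercepted signal is a well-understood pattern-recognition task, whereas the active mode has no comparably simple formulation.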

At this point, we need to be clear that this is only a statement of requirements rather than a prediction. All too often, even the news that a primitive implant has been developed triggers dramatic reports along the lines of 'The Cyborgs have arrived'. We live in a world where some initial ideas are reported and promoted in a way that causes all perspective on any real progress to be lost. Today, our current reality is that humans receive information through their input senses and transmit information through the motor control of their muscles, which includes the ability to sign, talk, write, draw and even type badly with one finger. In a rational state of mind, the higher thought processes of the brain interpret the information and then trigger an appropriate motor response. Within the totality of the process we call life, explicit knowledge is acquired and implicit knowledge inferred from previous experience, or logic laid down in memories constructed from billions of neurons. It is probably fair to say that we currently have only a rudimentary understanding of how any of these processes really function, especially the higher thought processes.

So will it ever be possible to construct a two-way brain-computer interface (BCI)?

In all honesty, it is not clear that anybody can yet fully answer this question, but could anybody have predicted how the human genome would be mapped in 1953, when Crick and Watson first discovered the double-helix structure of DNA? Then, as now, breakthroughs occur in increments, some due to brilliance, others to hard work and persistence. Clearly, developments such as weak AI and advances in nano-technology, linked to a new generation of brain scanners, should not be overlooked. Such technologies could help transform our understanding of the brain and support the ability to insert neural implants over the next few decades. However, let us also be realistic about the difficulties, as the ability to both extract coherent thoughts and memories from, and insert them directly into, a living brain is still in the realms of science fiction.

Of course, this does not preclude more pragmatic, stepwise approaches to implementing a two-way brain-computer interface (BCI) using both prosthetic and neural implants. It is highlighted that we are now entering the realm of almost pure speculation, but the diagram above tries to illustrate a configuration of prosthetic and neural implants that would function as a two-way BCI. Initially, neural implants would be limited to the passive mode, which only allows the brain to be monitored to determine its overall mental state, e.g. awake or sleeping. However, by learning to control certain mental states, it might be possible for the brain to actively control the basic operational state of other implants, e.g. on/off.
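The idea of using a voluntarily held mental state as an on/off switch can be sketched as a simple control loop: if a monitored state is sustained for long enough, another implant toggles. The state labels and the three-sample hold time below are hypothetical choices made purely for illustration.

```python
class ImplantSwitch:
    """Toy controller: toggles an implant's power state when the wearer
    sustains a recognisable mental state for `hold_samples` readings."""

    def __init__(self, trigger_state="focus", hold_samples=3):
        self.trigger_state = trigger_state
        self.hold_samples = hold_samples
        self.count = 0
        self.powered_on = False

    def observe(self, state):
        """Feed one monitored mental-state sample; toggle on a sustained hold."""
        if state == self.trigger_state:
            self.count += 1
            if self.count == self.hold_samples:
                self.powered_on = not self.powered_on
        else:
            self.count = 0          # the hold was broken; start again
        return self.powered_on

switch = ImplantSwitch()
for state in ["rest", "focus", "focus", "focus", "rest"]:
    switch.observe(state)
print(switch.powered_on)   # True (toggled on after three 'focus' samples)
```

The point of the hold time is robustness: requiring the state to persist filters out the momentary fluctuations a passive monitor would otherwise misread as commands.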

The key assumption is that prosthetic implants continue to develop to a point where, within a few decades, they become capable not only of inputting visual and auditory signals onto the appropriate nerve fibres, but also of intercepting and storing these signals, which can then be shared or played back. So the visual or auditory signal patterns being input into one person's prosthetic implants may have been intercepted from another person's implants and transferred through a BCI via a central AI system for distribution. While this approach would allow Homo Cyberneticus to share images or sounds, the input could equally be generated by an AI computer simulation and so provide a form of extended or artificial reality. It is also highlighted that information being output as speech would be picked up and distributed via the local auditory implants, so that all sights and sounds could become a shared experience.
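The intercept-share-playback pipeline just described can be sketched as a simple publish-subscribe hub, with the 'central AI system' reduced to a store that logs intercepted sensory frames and redistributes them to other implants. Every class and method name here is a hypothetical placeholder for technology that does not yet exist.

```python
from collections import defaultdict

class SensoryHub:
    """Toy central distributor: logs intercepted frames and forwards
    them to any implants subscribed to the same sensory channel."""

    def __init__(self):
        self.log = []                       # (person, channel, frame) history
        self.subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self.subscribers[channel].append(callback)

    def publish(self, person, channel, frame):
        """An implant intercepts a frame ('visual' or 'auditory') and shares it."""
        self.log.append((person, channel, frame))
        for callback in self.subscribers[channel]:
            callback(person, frame)         # deliver to other implants

    def playback(self, person, channel):
        """Replay everything a given person's implants previously intercepted."""
        return [f for p, c, f in self.log if p == person and c == channel]

# One person's auditory implant receives what another person's implant heard.
hub = SensoryHub()
received = []
hub.subscribe("auditory", lambda person, frame: received.append((person, frame)))
hub.publish("alice", "auditory", "birdsong")
print(received)                             # [('alice', 'birdsong')]
print(hub.playback("alice", "auditory"))    # ['birdsong']
```

Note that the same `publish` path serves both cases the text mentions: a frame intercepted from another person's implants and a frame generated by an AI simulation are indistinguishable to the receiving implant.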

How does this approach differ from virtual reality?

Today, virtual reality can be experienced by receiving images, via special glasses, which are synchronised to hand, head or eye movements. Sound can also be integrated into the effect. It is expected that this technology will continue to develop and be exploited by Homo Computerus to facilitate greater interaction with informational AI systems. However, this is essentially a receive-only system, although some positional information can be fed back into it. In contrast, even the initial hybrid implant system described would allow Homo Cyberneticus to record its own sensory information, which could be preserved and indexed within its extended AI system. To some extent, all the information that the brain is receiving, and subsequently interpreting in order to learn and experience life, would also be acquired by the extended AI system. However, the memory of the AI system would be permanent, less subjective, capable of intelligent cross-indexing and recallable in the form of playback. In essence, a memory could almost be physically relived through playback via the sensory implants. In addition, the implants could be extended to provide telemetry monitoring of both brain and body functions, and by virtue of the physical integration of the implants, access to extended AI capabilities would always be available. As an approach, the combination of prosthetic and neural implants might also prove more practical for a number of reasons:

  • Extracting coherent high-level thoughts via passive implants may prove to be very difficult, although inserting coherent thoughts via active implants may prove to be even more difficult. One of the key problems is that human thought is closely intertwined with sensory interpretation, which is being constantly updated by fragmented differential updates.

  • Prosthetic implants on the optic and auditory nerves could effectively bypass this problem by intercepting the raw sensory data, which the brain's thought processes interpret as information and then store as knowledge. However, the AI system that also receives this data may evolve to carry out enhanced processing functions that would be impossible for the brain to perform, as well as highlight anomalies in behavioural responses and generally act as a second expert opinion on any information received.

  • Current developments in artificial neural networks suggest that they could one day be capable of decoding and encoding the sensory data patterns associated with the primary senses. Therefore, the key problem may be the nano-implants required to extract and insert sensory data of sufficient quality and resolution. However, while this is a considerable challenge for future generations of researchers, it does not appear impossible in concept.

  • The system of prosthetic implants can evolve in line with the capabilities of AI systems, nano-technology and our understanding of neuro-physiology. For example, initial implants may do little more than monitor vital life functions and the physical location of a person via the Global Positioning System (GPS).
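The last bullet describes the most modest starting point, and it is the only one that maps onto technology we already understand: a telemetry frame reporting vital signs and a GPS fix, with values checked against a safe range. The field names and thresholds below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class TelemetryFrame:
    """One hypothetical reading from an initial monitoring implant."""
    heart_rate: int        # beats per minute
    body_temp: float       # degrees Celsius
    latitude: float        # GPS fix
    longitude: float

# Assumed safe ranges, purely for illustration.
SAFE_RANGES = {"heart_rate": (40, 180), "body_temp": (35.0, 38.5)}

def check_vitals(frame):
    """Return the list of vital signs outside their assumed safe range."""
    alerts = []
    for field, (low, high) in SAFE_RANGES.items():
        value = getattr(frame, field)
        if not low <= value <= high:
            alerts.append(field)
    return alerts

frame = TelemetryFrame(heart_rate=190, body_temp=36.8,
                       latitude=51.5074, longitude=-0.1278)
print(check_vitals(frame))   # ['heart_rate']
```

The evolutionary argument of the bullet is visible even here: nothing in this frame format would need to change as later generations of implants added richer channels alongside the basic vital signs.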

OK, even if we could sell this idea to the pioneering technophiles, what about acceptance by the rest of society?

It is probably true to say that most people today would not be queuing up to undergo what appears to be a very radical, and potentially risky, procedure without some very good reasons. Therefore, our predictions must extend beyond the mere technical ability to provide such functions and include some seriously compelling benefits that can be understood by a broader section of society.