From digital to data biomimicry

When considering where we are now with modern computing, it is worth reflecting briefly on how we got here. To quote Maya Angelou, “I have great respect for the past. If you don't know where you've come from, you don't know where you're going”.

From IBM to transformers

Founded in 1911 as the Computing-Tabulating-Recording Company (CTR), the business was renamed International Business Machines (IBM) in 1924. The company launched its first large-scale electronic computer for commercial use in 1952. It is often suggested that the English scientist and science fiction author Arthur C. Clarke based the seemingly sentient “HAL” of 1968's 2001: A Space Odyssey on the IBM acronym (each letter of HAL directly precedes I, B and M in the alphabet), although Clarke denied the connection. New developments at IBM paved the way for a series of improvements, and by the 1980s IBM was the dominant computer manufacturer, providing a range of large mainframes, minicomputers and workstations. Notably, all of its product lines at this point had tightly integrated operating systems, processors and applications.

Then a giant geek leap took place in the 1980s, when the company designed a low-cost personal computer (PC) built around an independently sourced microprocessor (“Intel inside”). It was paired with an operating system from Microsoft that turned the computing power into something anyone could use without needing to learn to write code, and further applications could be added from specialist third-party suppliers. Modules could now be chosen according to the user's requirements.

The adoption of this approach meant that IBM effectively launched a global market of “clone” PCs, which is still pretty much how we see things now.

However, the launch of ChatGPT in November 2022 brought generative AI into the mainstream and opened a whole new ball-game, with every established and start-up software company now utilising the technology and developing generative AI products built on large language models (LLMs).

As we are now experiencing for ourselves, these models are built on neural network architectures known as transformers, which can process massive amounts of text in seconds to detect language patterns.

Feeding AI

The idea is that by feeding ever larger amounts of data into these training models and expanding the algorithms, both yield and accuracy improve. However, as previously reported in Total Health, the methodology requires increasingly powerful processors in rapidly expanding data centres, which demand more and more space and, above all, power. The other big intrinsic implication is that only the biggest and most powerful companies can afford to innovate continually by this method. In addition, Total Health have also highlighted the concern over systemic human biases within the training material, which make it extremely difficult to guarantee trustworthy results. This naturally places an even bigger emphasis on the need for authoritative sources of information.

So where are we heading now?

A shift towards natural biology - or biomimicry

DNA puts the world's data demands into context: millions of years of natural evolution can't be wrong. We are approaching the limit of the planet's data storage capacity relative to the amount of data society is generating. Humans currently generate about 5 zettabytes of new data a year (1 zettabyte is 1,000 exabytes), and it is estimated that, on our current trajectory, the planet will have generated 160 zettabytes of data by 2025. Doing the maths, all 160 zettabytes could be stored in about 350 grams of DNA - less than a pound - and it would last thousands of years.
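For anyone who wants to check that arithmetic, here is a quick back-of-envelope sketch in Python. It assumes DNA's often-quoted theoretical maximum density of roughly 455 exabytes per gram - a figure not given in the article itself, so treat it as an illustrative assumption rather than what any real system achieves today:

```python
# Back-of-envelope check of the DNA storage figures above.
# The density is an assumption: ~455 exabytes per gram is an
# often-quoted theoretical maximum; practical systems store far less.

ZB_IN_EB = 1_000                      # 1 zettabyte = 1,000 exabytes
world_data_zb = 160                   # projected global data, in zettabytes
dna_density_eb_per_g = 455            # assumed theoretical density

grams = (world_data_zb * ZB_IN_EB) / dna_density_eb_per_g
print(f"{grams:.0f} g of DNA")        # ~352 g
print(f"{grams / 453.6:.2f} lb")      # ~0.78 lb, i.e. under a pound
```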

Is building biomimetic data centres realistic?

Early results from several research groups indicate that biological storage may well be practical. Multiple centres have described how DNA is being explored as a long-term solution for preserving digital information for future generations.

Another biomimicry development involves neuromorphic computing. Conventional systems use silicon chips for digital processing, while the brain uses chemicals to stimulate and transfer electrical signals via cells called neurones. Nature shows us that new insights can literally spark new approaches across connections (synapses). This would be a dramatically different pathway for AI. The computer inside our heads solves problems beyond the capability of even the largest AI models while drawing only around 20 watts. Computer hardware that more closely mimics the brain could therefore help us match both its power and its energy efficiency.

As Singularity University reports, the neural networks powering modern AI are already loosely modelled on the brain, but only at a very rudimentary level. Neuromorphic computers extend this biological realism in the hope of more closely replicating some of the brain's most attractive qualities. They use spiking neural networks, in which information is carried in the timing of spikes between neurones.
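To make the idea concrete, here is a minimal sketch of a “leaky integrate-and-fire” neurone, the classic building block of spiking networks. The parameters and inputs are illustrative, not taken from any particular neuromorphic chip:

```python
# A minimal leaky integrate-and-fire neurone. Charge builds up as input
# arrives, leaks away over time, and a spike fires (then resets) only
# when the accumulated potential crosses a threshold.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return the time steps at which the neurone spikes."""
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # integrate input, leak charge
        if potential >= threshold:              # fire once threshold is crossed
            spike_times.append(t)
            potential = 0.0                     # reset after the spike
    return spike_times

# The *timing* of the output spikes, not a single numeric value,
# is what carries the information.
print(simulate_lif([0.3, 0.3, 0.6, 0.0, 0.0, 0.9, 0.9]))  # -> [2, 6]
```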

Power is nothing without control

In a spiking neural network, rather than every neurone producing an output each time the network runs, connecting neurones fire only briefly, when they have important information to transmit, which means far fewer neurones draw power at any one time. Crucially, the cores are not always on, as they would be in a conventional computer: they are ‘event-based’, so the demand for power is effectively controlled.
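A toy Python sketch of that event-based idea: work is only done for neurones with a spike pending, so quiet parts of the network consume nothing. The wiring, weights and threshold here are all invented for illustration:

```python
# Event-driven processing: idle neurones are never visited.
from collections import deque

connections = {"A": ["B"], "B": [], "C": [], "D": []}  # hypothetical wiring
threshold = 1.0
potential = {name: 0.0 for name in connections}

events = deque([("A", 1.2)])   # one input spike arriving at neurone A
work_done = 0

while events:                  # only pending spikes cost any work
    name, charge = events.popleft()
    work_done += 1
    potential[name] += charge
    if potential[name] >= threshold:
        potential[name] = 0.0                  # fire and reset
        for target in connections[name]:
            events.append((target, 0.6))       # made-up spike weight

print(f"{work_done} events for {len(connections)} neurones")  # 2 events, 4 neurones
```

Neurones C and D are never touched at all, which is the source of the power savings described above.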

Some ways to go yet

However, the main challenge facing neuromorphic computing is that it operates in fundamentally different ways from existing AI systems, which makes it difficult to translate between the two disciplines. A lack of software tools and supporting infrastructure also makes it hard to get started.
