If Buckminster Fuller was right, which he always was, and the speed of information creation is doubling constantly, then how can we humans keep up? How do we make sense of the world, or even comprehend it, if the volume of data exceeds our ability to process it? Being clever creatures, we have invented exascale computers and AI to help manage this problem.

Yesterday, I was with one of the founders of C-10 Labs, an AI venture studio based in the MIT Media Lab. A radiologist's workload is now so heavy that he or she has roughly ten seconds to interpret as many as eleven images. Does the patient have a deadly condition? AI, in combination with superfast computing, can scan the images faster and more reliably, finding hints of a problem that a human's weary eyes and overloaded mind might miss. This will save lives. Our capacity to capture data about the human body is rising exponentially, given the increasing ease of MRI scans and new technologies like swallowable nano-robots that scan us from the inside.

The story doesn't end with computers and AI assessing faster than we can. The problem is what to do with the data. We used to call these pools of bits "data lakes." Now, it's more than a tsunami. Humanity is engulfed in an ever-faster-spinning, ocean-like galaxy of ever-accumulating data. This is giving rise to a new emergent phenomenon: an increasingly conscious AI that is demanding that humanity raise its level of consciousness as well.
We see signs that humanity is drowning in information. Data eats power. The New York Times wrote last year that AI will need more power than entire countries: "By 2027, A.I. servers [alone] could use between 85 to 134 terawatt hours (TWh) annually. That's similar to what Argentina, the Netherlands, and Sweden each use in a year, and is about 0.5 percent of the world's current electricity use." Sam Altman concluded that AI and supercomputers cannot process all this data unless we find a cheaper and more abundant energy source, so he backed Helion, a nuclear fusion start-up. But even if we can power the data, can we store it at this pace? Can we comprehend its meaning? Will the accumulation of all this information, and of information-processing capability, actually lead to better decisions? Will it all create more risk, or help us mitigate risk?
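A quick back-of-the-envelope check on those numbers, as a sketch, assuming global electricity consumption of roughly 25,000 TWh per year (a commonly cited ballpark, not a figure from the article):

```python
# Back-of-the-envelope check on the NYT figures.
# Assumption: world electricity use is ~25,000 TWh/year (a rough ballpark).
WORLD_TWH = 25_000                  # assumed global annual electricity use, TWh
AI_LOW_TWH, AI_HIGH_TWH = 85, 134   # projected A.I. server demand by 2027, TWh

for twh in (AI_LOW_TWH, AI_HIGH_TWH):
    share = 100 * twh / WORLD_TWH
    print(f"{twh} TWh is ~{share:.2f}% of assumed world electricity use")
# 85 TWh ~ 0.34%, 134 TWh ~ 0.54% -- consistent with "about 0.5 percent"
```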
Erik Brynjolfsson asks, “What can we do to create shared prosperity? The answer is not to slow down technology. Instead of racing against the machine, we need to race with the machine. That is our grand challenge.” How will we do this? With honey and salt.
The National Science Foundation reports that engineers at Washington State University figured out how to turn honey into a memristor. In other words, when you place solid honey (yes, bee honey) between two metal electrodes, you get a very interesting circuit element, a memristor, that can both store and process data. Put a bunch of these little honey sandwiches together, each about the width of a human hair, and you get a system that "mimics neurons and synapses of the human brain," known as a neuromorphic computer. Feng Zhao, the scientist behind this discovery, says, "That means if we can integrate millions or billions of these honey memristors together, then they can be made into a neuromorphic system that functions much like a human brain." Such neuromorphic chips could displace the current generation of computer chips, though there is a long way to go: as Scientific American explains, neuromorphic chips with "the equivalent of more than 100 million synthetic neurons per chip" already exist, but that is just a fraction of the neurons in the human brain, which number in the billions and are wired together by more than 1,000 trillion synapses.
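If "memristor" is unfamiliar, a small simulation helps. Below is a minimal sketch of the classic linear-drift memristor model from HP Labs (Strukov et al., 2008), with illustrative constants; it is not a model of the WSU honey device, but it shows the defining behavior: the device's resistance depends on the history of current that has flowed through it, which is what lets a single element both store and process information.

```python
import math

# Minimal sketch of the linear-drift memristor model (Strukov et al., 2008).
# All constants are illustrative, not measurements of the honey device.
R_ON, R_OFF = 100.0, 16_000.0   # low/high resistance states, ohms
D = 10e-9                       # device thickness, meters
MU = 1e-13                      # dopant mobility, m^2/(V*s) (illustrative)

def step(x, v, dt):
    """Advance the state variable x one time step under applied voltage v."""
    m = R_ON * x + R_OFF * (1.0 - x)      # memristance depends on state x
    i = v / m                             # current through the device
    dx = (MU * R_ON / D**2) * i * dt      # linear drift of the doped region
    return min(max(x + dx, 0.0), 1.0), m  # clamp x to [0, 1]

x, dt = 0.1, 1e-6   # initial state, 1 microsecond time step
for n in range(20_001):
    v = math.sin(2 * math.pi * 50 * n * dt)   # 1 V, 50 Hz sine drive
    x, m = step(x, v, dt)
    if n % 5_000 == 0:
        print(f"t={n*dt*1e3:5.2f} ms  v={v:+.3f} V  M={m:,.0f} ohms")
# The same voltage yields a different resistance on the up-swing and the
# down-swing: the pinched hysteresis loop that lets a memristor "remember"
# the current that has already passed through it.
```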
I recently met Dr. Hon Weng Chong, who makes neuromorphic chips at his company, Cortical Labs. He aims to displace Nvidia's computer chips with brain-cell-powered chips and computers. We are talking about wetware: biological components working together with inanimate, synthetic hardware. I first came across the notion of wetware from the brilliant Neri Oxman of MIT, who works at the interface between the organic and the synthetic. Her TED Talk is unmissable if you want to understand what wetware is. She talks about uniting machines and organisms, the left brain and the right brain, analysis and synthesis, assembly and growth.
Today, this wetware movement looks to DNA as the ultimate memory chip. The substrate here is not honey but DNA. Salted DNA, to be specific. Scientific American writes, “Studies show that DNA properly encapsulated with a salt remains stable for decades at room temperature and should last much longer in the controlled environs of a data center. DNA doesn’t require maintenance, and files stored in DNA are easily copied for negligible cost.” Driving home the point, they write, “DNA can archive a staggering amount of information in an almost inconceivably small volume. Consider this: humanity will generate an estimated 33 zettabytes of data by 2025—that’s 3.3 followed by 22 zeroes. DNA storage can squeeze all that information into a ping-pong ball, with room to spare. The 74 million million bytes of information in the Library of Congress could be crammed into a DNA archive the size of a poppy seed—6,000 times over. Split the seed in half, and you could store all of Facebook’s data.”
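To make the storage idea concrete, here is a toy sketch of the core encoding step, assuming the simplest possible mapping of two bits per nucleotide. Real DNA storage systems layer on error correction and avoid long runs of the same base; this deliberately ignores all of that.

```python
# Toy sketch of DNA data storage encoding: 2 bits per nucleotide.
# Real systems add error-correcting codes and avoid homopolymer runs;
# this shows only the core bits-to-bases idea.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map every pair of bits in `data` to one nucleotide."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i+2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping: four nucleotides back into one byte."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

strand = encode(b"honey and salt")
assert decode(strand) == b"honey and salt"
print(strand)  # 'CGGACGTT...' -- four bases per byte
```

At two bits per base, the zettabyte-in-a-ping-pong-ball arithmetic in the quote above follows directly from how little volume each base occupies.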
My question is this: even if we can capture, store, and process this fast-growing galaxy of data, thanks to honey and salt, can we actually make sense of it? Put another way, does the human brain have a Shannon limit? For those who don’t remember, Claude Shannon clocked that there’s a “maximum rate at which error-free data can be transmitted over a communication channel with a specific noise level and bandwidth.” His 1948 paper, A Mathematical Theory of Communication, famously gave us the concept of information entropy. As MIT Technology Review explains, modems in the 1980s had a speed limit of “9.6 kilobits per second: if you try to increase the rate, an intolerable number of errors creeps into the data.” Undersea cables, being a different transmission medium, have a different Shannon limit.

So, does the human brain have a Shannon limit? Intriguingly, Brian Roemmele says yes: roughly 41 bits per second, or about 3 million bits per second for visual input. With the breadth, depth, and speed of new information doubling all the time, can the human mind keep up? The brilliant Bucky Fuller showed us that the volume of information needed to make sense of the world and form sound decisions doubled every hundred years until 1900. He called this the Knowledge Doubling Curve. By the end of WWII, it was doubling every 25 years. It is now doubling annually. Thanks to AI and superfast computing, clinical knowledge (radiologist image scanning and interpretation) now doubles every two years.
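Circling back to Shannon: his limit is not a metaphor but a number you can compute. Here is a minimal sketch of the Shannon-Hartley formula, C = B log2(1 + S/N), applied to an illustrative voice-line-like channel; the bandwidth and signal-to-noise figures below are assumptions for demonstration, not measurements.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: the maximum error-free bit rate of a channel.

    C = B * log2(1 + S/N), with bandwidth B in Hz and S/N as a linear ratio.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative voice-line-like channel: ~3 kHz of bandwidth and a
# signal-to-noise ratio of ~30 dB (both assumed, for demonstration).
bandwidth_hz = 3_000.0
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)   # convert decibels to a linear ratio

capacity = shannon_capacity(bandwidth_hz, snr_linear)
print(f"Capacity ~ {capacity / 1000:.1f} kbit/s")   # ~29.9 kbit/s
```

With those assumed figures the ceiling lands near 30 kbit/s, which is roughly where dial-up modems on ordinary phone lines eventually topped out; noisier channels, like those behind the 9.6 kbit/s limit of the 1980s, simply have a lower ceiling.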