
The emergence of artificial intelligence has brought us ChatGPT, smart cars, and groundbreaking drug discovery. It has also brought an astronomical and unsustainable power bill. Today's large AI models consume electricity on a par with entire neighborhoods, largely because the hardware they run on was designed to be fast rather than efficient.

In stark contrast stands the ultimate biological computer: the human brain, which carries out complex learning and recognition tasks on roughly 20 watts, about as much power as a dim lightbulb.

(Image Credit: The Yang Lab at USC)

Researchers at the University of Southern California (USC) have now taken a major step toward closing this gap with a breakthrough in neuromorphic (brain-inspired) computing, one that works not by modeling the brain mathematically, but by physically replicating how the brain operates. The new hardware rests on a radical architectural shift: from the electron-based computing of silicon to a system built on the motion of atoms and charged particles, just as in a real neuron.

The Secret Ingredient: Atoms, Not Electrons

For decades, digital computers have run on the speed of electrons moving through silicon. Electrons are fast, but they are poor at long-term, adaptive memory and require constant power. The continuous shuttling of data between processor and memory, known as the von Neumann bottleneck, is the primary reason AI systems consume so much power.

The USC team, led by Professor Joshua Yang, has addressed this with a new component called the diffusive memristor.

In the simplest terms:

  • Physical Emulation: Rather than using complex silicon circuits to simulate a neuron's behavior, the new system relies on the physical movement of ions (charged atoms, in this case silver ions within oxide materials) to generate electrical pulses.
  • Biological Fidelity: This is the same kind of ion movement that governs learning and communication in your own brain, where ions such as sodium and potassium cross a synapse to pass information along. By exploiting the same physics, the artificial neuron reproduces the analog dynamics that make biological learning so effective.

The 1M1T1R Advantage: Leaner AI by Orders of Magnitude

This shift to so-called iontronic computing makes the hardware itself much simpler.

  • Miniaturization: Traditional digital systems need tens or even hundreds of transistors to mimic the complex behavior of a single biological neuron. The new iontronic neuron has only three parts: one diffusive memristor, one transistor, and one resistor (1M1T1R).
  • Footprint: When stacked vertically, this minimalist architecture fits within the physical area of a single standard transistor. That enormous reduction in size is expected to let the chips shrink by orders of magnitude.
  • Power Leap: Most importantly, the design dramatically cuts power consumption. The new artificial neuron operates at roughly a picojoule per spike, and the researchers believe that with further scaling it can reach the attojoule-per-spike level, the range of estimated efficiencies of biological neurons.

Because such a small chip combines processing and memory in one place, it minimizes the energy spent moving data around, eliminating the communication bottleneck that plagues current AI hardware.
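The gap between the picojoule and attojoule levels mentioned above is a factor of a million. A quick back-of-envelope calculation makes the scale concrete; the spike counts below are hypothetical, chosen only for illustration, and are not figures from the USC work.

```python
# Illustrative energy-per-spike comparison (assumed figures, not measurements).
PICOJOULE = 1e-12  # joules, roughly the demonstrated level
ATTOJOULE = 1e-18  # joules, the projected biological-scale level

# Hypothetical workload: one billion neurons each firing 10 spikes.
spikes = 1e9 * 10

energy_pj = spikes * PICOJOULE  # energy at ~1 pJ/spike
energy_aj = spikes * ATTOJOULE  # energy at ~1 aJ/spike

print(f"At ~1 pJ/spike: {energy_pj:.3g} J")
print(f"At ~1 aJ/spike: {energy_aj:.3g} J")
print(f"Reduction factor: {energy_pj / energy_aj:.0f}x")
```

The same workload drops from hundredths of a joule to hundredths of a microjoule, which is what makes battery-powered, always-on AI plausible.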

More Than Efficiency: Faithful to the Brain

The real power of this breakthrough is its fidelity to how the brain actually works. The researchers have shown that the 1M1T1R neuron exhibits six basic neuronal properties important to robust, decentralized learning, including:

  • Leaky Integration: It sums incoming signals over time, gradually accumulating potential while slowly leaking it away.
  • Threshold Firing: It emits a spike only when the accumulated potential crosses a certain threshold.
  • Refractory Period: It enforces a brief, self-regulating rest after each spike, which stabilizes the network.
  • Intrinsic Plasticity: It adjusts its own behavior based on recent activity, which enhances learning.

Because these analog properties are implemented physically in the hardware, the chips are intrinsically better at learning and adapting from just a few examples, much as a child can learn a new object after seeing only a handful of instances. Existing computer systems, by contrast, require thousands of training images to do the same.
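The neuronal behaviors listed above can be sketched in software as a simple leaky integrate-and-fire loop. This is a toy model for intuition only, not the USC device physics; every parameter below is invented for demonstration.

```python
# Toy leaky integrate-and-fire neuron showing leaky integration,
# threshold firing, a refractory period, and a crude form of
# intrinsic plasticity (an adaptive threshold). Illustrative only.

def simulate(inputs, leak=0.9, threshold=1.0, refractory=2, adapt=0.1):
    v = 0.0          # membrane-like potential
    rest = 0         # refractory countdown (steps of enforced silence)
    thr = threshold  # adaptive threshold (intrinsic plasticity)
    spikes = []
    for x in inputs:
        if rest > 0:          # refractory period: ignore input briefly
            rest -= 1
            spikes.append(0)
            continue
        v = leak * v + x      # leaky integration: old potential decays
        if v >= thr:          # threshold firing
            spikes.append(1)
            v = 0.0           # reset after the spike
            rest = refractory
            thr += adapt      # recent activity raises the threshold
        else:
            spikes.append(0)
            thr = max(threshold, thr - adapt * 0.1)  # slow recovery
    return spikes

# A steady sub-threshold input still fires, because potential accumulates.
print(simulate([0.5] * 10))  # → [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]
```

Note the two zeros after each spike: that is the refractory period at work, and the second spike arrives one step later than the first because the threshold adapted upward.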

The Future: Wearables, Robotics, and AGI

The implications of this invention are far-reaching. Because the chips are extremely small and consume so little energy, they are ideal for:

  • Edge Computing: Putting advanced AI on tiny, power-constrained devices such as advanced sensors, robots, and new wearable technology.
  • Bio-Hybrid Systems: Because the device runs on ion dynamics, the hardware can interface far more naturally with medical devices such as next-generation neuroprosthetics and state-of-the-art Brain-Computer Interfaces (BCIs) for controlling cursors or limbs.
  • Artificial General Intelligence (AGI): By removing the enormous power overhead and enabling active, decentralized learning in hardware, iontronic computing is seen as a significant stride toward the eventual development of AGI.

The current prototypes use silver ions, and more work is needed to develop materials fully compatible with today's large-scale manufacturing processes, but the architectural breakthrough stands. The future of computing may depend less on fast electrons and more on slow, efficient atoms, finally giving AI the brain it deserves.
