
Artificial Intelligence Needs a Brain Transplant to Avoid Imminent Collapse

The Good News: We Have the Programming Language for Coding Smarter Neurons

AI Is Dying of Success

AI is riding high, but it’s burning itself out in the process. The massive computational overhead, skyrocketing temperatures, and resource-devouring appetite needed to pull off even basic cognitive realism are unsustainable. The balance sheets are deep in the red, and if this keeps up, AI could be running itself into an early grave.

Take OpenAI — basically the golden child of AI right now. It’s got a massive user base — 300 million active users weekly as of late 2024 — but guess what? It’s still bleeding money. We’re talking $5 billion in projected losses for 2024, with the red ink expected to hit a jaw-dropping $44 billion by 2028 (see Chart 1, below).

Chart 1. Losses are mounting on OpenAI’s debt-laden books, propped up by Microsoft, bringing the business closer to financial collapse with each new AI model upgrade released. Sources: Business Insider, Reuters

The numbers don’t lie: every shiny new large language model (LLM) they roll out just adds fuel to the spending fire. Training these beasts costs billions. To even think about breaking even, they’d have to charge subscribers hundreds of euros a month — yeah, not happening in this lifetime. The wild part? Despite becoming less profitable with every new upgrade, investors are doubling down, betting on the opposite. This casino-like gamble could lead to a massive AI bubble, potentially triggering another AI winter (see Chart 2, below).

Chart 2. The market is driven by irrational tech beliefs, fueling another giant tech bubble while ignoring the financial burden — funny enough, the miracle might actually be closer than we think. Sources: Business Insider, Financial Times, MarketWatch

And this isn’t just OpenAI’s headache. The whole AI game is built on the same shaky foundation. Microsoft, for example, is burning $20 per user per month just to keep GitHub Copilot afloat. Across the board, AI companies are drowning in costs, with operations eating up 60–80% of their budgets.

The truth is, AI wasn’t designed to sustain itself — it was designed to guzzle resources. Unless we hit pause and rebuild the foundations, this entire industry might just implode under the weight of its own success.

Why Artificial Neural Networks Are the Clunky Pickup Trucks of AI

Artificial Neural Networks (ANNs) have been running the AI show since Frank Rosenblatt introduced the perceptron back in 1958, nearly seven decades ago. But here’s the catch: ANNs are brute-force wannabes of biological neurons. They skip all the clever tricks that make brains efficient. It’s like trying to emulate a Formula 1 racer with a clunky pickup truck.

Take spatial and temporal summation, for example. In real brains, neurons cleverly mix multiple inputs at once (spatial) and accumulate signals over time (temporal). This isn’t just a “nice to have” — it’s the secret sauce behind efficient, powerful thinking. Biological neurons handle tasks with fewer resources, while ANNs pile on layers and connections like a digital Tower of Babel, just to get comparable results.
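To make that concrete, here’s a minimal NumPy sketch (every constant and name is illustrative, not from any library or paper): the dot product is the spatial part, mixing all inputs at one instant, and the running, slowly leaking total is the temporal part.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=4)           # 4 synapses onto one neuron
spikes = rng.integers(0, 2, (10, 4))   # 10 timesteps of 0/1 input spikes

leak = 0.9        # fraction of charge kept per step (illustrative)
potential = 0.0
for t in range(10):
    spatial = weights @ spikes[t]            # spatial summation: mix inputs at once
    potential = leak * potential + spatial   # temporal summation: accumulate over time
    print(f"t={t}  input={spatial:+.2f}  membrane={potential:+.2f}")
```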

Table 1 shows how mind-blowing the novel smarter LIF neurons are, comparing their advanced features with those of biological neurons and highlighting their advantages over the older ANN-based neurons. They emerge as the clear successors for overcoming the current dead end in ANN evolution. Source: the author’s own elaboration based on data available under a free public license

Then there’s plasticity — how real neurons adapt dynamically through mechanisms like spike-timing-dependent plasticity (STDP). In contrast, with ANNs, introducing new knowledge means retraining everything, like re-teaching a toddler the alphabet whenever a new letter shows up.
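For the curious, here’s what a pair-based STDP rule looks like in a few lines of Python; the constants are illustrative textbook-style values, not parameters from any particular system. The synapse strengthens when the presynaptic spike comes first (causal) and weakens when it comes second (acausal), so learning happens locally, with no global retraining.

```python
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: weight change from relative spike timing (ms)."""
    dt = t_post - t_pre
    if dt > 0:   # pre fired before post: causal pairing, potentiate
        return a_plus * np.exp(-dt / tau)
    else:        # pre fired after post: acausal pairing, depress
        return -a_minus * np.exp(dt / tau)

print(stdp_dw(t_pre=10.0, t_post=15.0))  # ~ +0.0078 (strengthen)
print(stdp_dw(t_pre=15.0, t_post=10.0))  # ~ -0.0093 (weaken)
```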

On top of that, ANNs rely on continuous activation functions, which guzzle compute power while being biologically unrealistic. It’s functional, sure, but compared to real neurons, it’s like trying to reach Mars with a paper airplane instead of a spacecraft.

It’s Time for a Brain Transplant and to Build Novel Mind-Blowing Architectures

We’ve got a smarter foundation for building ANNs: meet Spiking Neural Networks (SNNs) with LIF neurons. These systems fire sharp, event-driven bursts (spikes) and process data in real time, just like the brain. The challenge? Transitioning from traditional ANNs to these novel SNNs can be a bit tricky.

Here’s the smarter upgrade: the Linear Leaky Integrate-and-Fire (LIF) neuron model with Weight Mapping. Think of a spiking neuron’s membrane potential as a battery that charges with every input spike and fires once it’s full. No more need for the overly complicated continuous activation functions our current, old-school ANNs depend on.
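To see the battery analogy in code, here’s a minimal LIF neuron in plain Python/NumPy (all parameter values are illustrative): the membrane potential charges with each weighted input, leaks a little every step, and the neuron fires and resets only when the threshold is crossed. No continuous activation function in sight.

```python
import numpy as np

class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron (illustrative parameters)."""
    def __init__(self, tau=10.0, threshold=1.0, v_reset=0.0, dt=1.0):
        self.decay = np.exp(-dt / tau)   # per-step leak factor
        self.threshold = threshold
        self.v_reset = v_reset
        self.v = 0.0                     # membrane potential: the "battery"

    def step(self, current):
        self.v = self.decay * self.v + current   # leak, then charge
        if self.v >= self.threshold:             # fire only on crossing
            self.v = self.v_reset                # discharge the battery
            return 1
        return 0

neuron = LIFNeuron()
inputs = [0.3, 0.3, 0.3, 0.0, 0.6, 0.6]   # weighted input per timestep
print([neuron.step(i) for i in inputs])   # -> [0, 0, 0, 0, 1, 0]
```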

Figure 1. A Leaky Integrate-and-Fire (LIF) spiking network efficiently processes a candle in the darkness. Brighter parts of the image, like the candle’s flame, trigger higher spike frequencies (action potentials) than darker areas. This spiking code incorporates features like temporal summation (spike rates over time) and spatial summation (integrating multiple inputs) — capabilities current AI ANNs lack. Unlike ANNs, LIF neurons fire only when their membrane potential crosses a threshold, saving energy by activating only when needed. This allows LIF networks to capture detailed features with far fewer computational resources, making them the efficient, brain-like alternative to today’s resource-heavy AI. Source: author’s own elaboration based on public data from a Frontiers paper.

Well, ready to proceed with the AI-Brain transplant? Follow these easy steps:

  1. Convert Your Existing ANN
    No need to reinvent the wheel: with the right parameter mapping, you can plop your ANN’s weights and biases right into LIF parameters. Instant gratification — no draining compute cycles retraining from scratch (a toy sketch of the mapping follows this list).
  2. Train LIF Parameters Directly
    If LIF neurons are truly the new “ANN nodes,” why not optimize membrane capacitance and conductance like ordinary network weights? It’s the same backprop dance, but now you’re tuning actual biological-like properties (see the surrogate-gradient sketch after Fig. 2 below). That’s next-level control.
  3. Dynamic Activation Functions
    Traditional ReLUs? They’re so last decade. Now, you can tweak thresholds or membrane capacitances on the fly, creating a “dynamic ReLU” that fine-tunes itself mid-flight. Picture your network easily throttling its own firing rates depending on the task; the sketch after Fig. 2 makes the threshold itself a trainable parameter.
  4. (Bonus) Build Brainy Columns
    Feeling ambitious? Take a page from biology and stack your LIF neurons into columnar microcircuits — the very structural blueprint that powers higher-level cognition in real cortices (see figure 2 below). But here’s the crazy part: by tweaking not just the neuron parameters but the architecture itself, you can go beyond even the human brain’s limitations, dialing in entirely new “super-cognitive” features. Instead of flattened layers, imagine layered columns handling feedback loops, predictions, and more — where your AI doesn’t just save power, it levels up its cognitive prowess. Let’s think of it as the ultimate evolution beyond brute-force matrix ops, plunging into a realm of self-organizing, column-based intelligence that could outthink both classical ANNs and biological brains.
Fig 2. Spiking Neural Networks (SNNs) with LIF Neurons enable new mind-blowing architectures based on the columnar cortical arrangement of living brains with high cognitive capabilities. In the image on the far right, you can see two biological examples of this architecture. Source: The author’s own elaboration based on publicly available images licensed under Creative Commons BY-NC-ND.
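As for steps 2 and 3, one widely used trick (an assumption on our part; the article doesn’t pin down a method) is the surrogate gradient: keep the hard spike in the forward pass, but swap in a smooth derivative on the backward pass so backprop can tune the membrane decay and the firing threshold like ordinary weights. A minimal PyTorch sketch:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in forward; smooth 'fast sigmoid' slope in backward."""
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * x.abs()) ** 2   # surrogate derivative

# Membrane decay and firing threshold as trainable parameters (steps 2 and 3).
decay = torch.nn.Parameter(torch.tensor(0.9))
threshold = torch.nn.Parameter(torch.tensor(1.0))

inputs = torch.rand(20)              # toy input current over 20 timesteps
v, spikes = torch.tensor(0.0), []
for i in inputs:
    v = decay * v + i                # leaky integration
    s = SurrogateSpike.apply(v - threshold)
    v = v - s * threshold            # soft reset on spike
    spikes.append(s)

loss = (torch.stack(spikes).mean() - 0.25) ** 2   # target a 25% firing rate
loss.backward()
print(decay.grad, threshold.grad)    # gradients land on the neuron's own parameters
```

Once gradients land on decay and threshold, any optimizer can update them exactly like weights, which is all the “dynamic ReLU” of step 3 really asks for.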

The result? You can convert your trusty old ANN into a smarter, far more energy- and time-efficient spiking neural network with LIF neurons, without losing performance. Finally, real-time adaptation, energy efficiency, and biologically inspired features are within reach. As we mentioned in a previous article here on Medium, “AI Does NOT Need Intelligence to Become Intelligent,” it just needs to be more efficient and open to improvements in its dynamic architecture.

In short, we’re no longer just scaling up old-school ANNs — we’re throwing them an entire cerebral upgrade. And that’s how you transform “hey, my AI can classify dogs vs. cats” into “whoa, my AI is legitimately thinking.”

Bottom Line: Experiments Back the New AI Brain

Recent experiments are proving it: whether it’s single neurons or deep convolutional neural networks (CNNs), LIF-based spiking networks can match the performance of traditional ANNs, especially with proper parameter mapping. But here’s the thing: they do it with energy efficiency several orders of magnitude greater, while enabling cognitive superpowers that our current ANN-based AI can only dream of.

ANNs have been the workhorses of AI for decades, driving breakthroughs in image recognition, speech processing, and more. But compared to the elegance and efficiency of these new LIF-based AI “brains,” they’re practically stuck in the computational Stone Age.

Picture this: same network size, but your LIF spiking model only fires when necessary, instead of running full-blast 24/7 like a standard ANN. That translates into major energy savings, particularly on event-driven tasks.
