    The AI industry’s massive bet on transformer models may not be enough for true AGI

    By wildgreenquest@gmail.com | May 1, 2026 | 3 min read



    Welcome to AI Decoded, Fast Company’s weekly newsletter that breaks down the most important news in the world of AI. You can sign up to receive this newsletter every week via email here.

    Are the biggest AI labs betting on the wrong horse?

    Big AI companies are betting nearly all of their R&D and capital expenditure on the idea that pre-trained transformer models can deliver AI with human-level general intelligence. This approach relies heavily on backpropagation, the standard algorithm used to train deep neural networks.
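The update rule at the heart of backpropagation can be sketched in a few lines. This is a toy illustration only (a single weight fit by gradient descent, with all values invented), not any lab's training code:

```python
import numpy as np

# Toy example of the gradient-descent update at the core of backpropagation:
# fit y = 2x with a single weight w by minimizing squared error.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x

w = 0.0   # the trainable parameter
lr = 0.1  # learning rate
for _ in range(50):
    pred = w * x
    # dL/dw for L = mean((pred - y)^2) is mean(2 * (pred - y) * x)
    grad = np.mean(2.0 * (pred - y) * x)
    w -= lr * grad  # move the weight against the gradient

print(round(w, 3))  # converges toward the true value 2.0
```

Training a frontier transformer applies essentially this step, repeated across billions of parameters and trillions of tokens, which is where the compute bill comes from.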

    Ben Goertzel, who coined the term “AGI” with his 2005 book Artificial General Intelligence (co-written with DeepMind cofounder Shane Legg), is skeptical. “The commercial AI industry is just betting everything on copying GPT [generative pre-trained transformers] in various permutations, which in my view is a waste of resources because all these LLMs are kind of doing about the same thing.”

    “When something works, everyone wants to double and triple down on what worked,” he says. But this concentration of resources around a single paradigm may be risky. Transformer models require billions of dollars in compute to train, along with enormous ongoing computational resources to operate. So far, major AI labs have continued to see intelligence gains from adding more compute and training data. But as models grow larger, those gains are becoming increasingly expensive, raising the possibility that the returns may eventually no longer justify the cost. And because the financial stakes are so high, labs have little room to invest seriously in fundamentally different approaches.

    Goertzel argues that scale alone is not enough without the right underlying algorithms. In his view, a major limitation of transformer models is that they cannot continually learn from new experiences and update their internal parameters in real time the way humans do. Instead, they revert to their baseline parameters with each new interaction, without meaningfully learning from prior exchanges.
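The distinction Goertzel draws can be made concrete with a toy contrast: a model whose weights are frozen at inference time versus one that updates after every exchange. All classes, numbers, and the learning rule here are invented for illustration; they do not describe any lab's actual architecture:

```python
class FrozenModel:
    """Weights are fixed after pre-training; inference never changes them."""
    def __init__(self, w):
        self.w = w

    def respond(self, x):
        return self.w * x  # self.w is untouched by the interaction


class ContinualLearner:
    """Takes an online gradient step after each exchange, so it keeps learning."""
    def __init__(self, w, lr=0.1):
        self.w = w
        self.lr = lr

    def respond(self, x, target):
        pred = self.w * x
        # online update on squared error: the model changes with every interaction
        self.w -= self.lr * 2.0 * (pred - target) * x
        return pred


frozen = FrozenModel(w=1.0)
learner = ContinualLearner(w=1.0)
for x in [1.0, 1.0, 1.0]:
    frozen.respond(x)
    learner.respond(x, target=3.0)  # repeated feedback that the answer is 3

print(frozen.w)   # still 1.0: nothing carried over between interactions
print(learner.w)  # has moved toward 3.0
```

Today's deployed LLMs behave like the first class: whatever happens in a conversation, the parameters reset to their pre-trained values for the next user.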

    Researchers at Google DeepMind, Microsoft, and Ilya Sutskever’s Safe Superintelligence are exploring alternative neural network architectures that may enable continual learning, Goertzel says. “DeepMind has incredible diversity within their AI team” and possesses a “deep bench” of experience with alternate AI paradigms, he says.

    The result is an AI landscape in which massive compute resources are largely devoted to refining existing methods rather than pursuing fundamentally different architectures that may be better suited to the kind of human-level generalization required for true AGI. Goertzel remains optimistic that AGI could emerge within the next few years, but he believes it will likely require moving beyond simply scaling current LLMs.



