    AI Founders Are Chasing The Wrong Thing

    By wildgreenquest@gmail.com | March 28, 2026 | 6 Mins Read


    Opinions expressed by Entrepreneur contributors are their own.

    Key Takeaways

    • The loudest constraint often distracts founders from the real limiting factor.
    • AI success depends more on infrastructure logistics than headline technologies like GPUs.
    • Winning founders identify binding constraints and optimize around what actually blocks progress.

    If you want a quick lesson in “the bottleneck isn’t always what’s loudest,” look at the California power market.

    For years, politicians articulated the need for green energy production. That meant installing renewables like solar and wind, and incentivizing their production via subsidies. But after all of that effort, the evening price in California actually increased.

    The reason is subtle, but the summary is that the real bottleneck turned out to be batteries: the ability to save that cheap daytime power and use it at night.

    Now take that same mental model and apply it to AI.

    Everyone is talking about GPUs. What’s advertised most broadly right now is that companies are building out their own data centers. But if you’re actually trying to build, you’ll keep running into the same question: How should AI companies try to get the power or the compute that they need?

    This is where I think founders get misled. They copy whatever the market is obsessing over, and they mistake “what’s loud” for “what’s tight.” In optimization, there’s a more useful concept: binding constraints. What’s the thing that’s stopping you?

    Here are three steps I use to find it.

    1. Start with the objective function, then ask the only question that matters

    I’m going to get a little bit technical here, because this is how I think about the real bottlenecks.

    For any pricing problem, you start with an objective function — what’s the price that maximizes total welfare? Total welfare is the area between the supply curve and the demand curve. In other words, we want to give power to the people who want it the most, and we want to produce it for the lowest cost possible.

    Then you ask the only question that matters: what are the constraints?

    In a simplified production cost model for power, there are three main constraints:

    1. Supply has to equal demand.
    2. No transmission line can exceed its capacity.
    3. No generator produces more power than its capacity.

    The model doesn’t capture startup costs, shutdown costs, reliability or regional detail, but it’s a useful lens because it spits out a set of shadow prices.

    Those shadow prices are the point. They tell you what’s actually tight.
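
    The simplified dispatch model above can be sketched in a few lines with SciPy's LP solver. The generator names, costs, and capacities below are made-up numbers for illustration, and the transmission constraint (constraint 2) is omitted for brevity:

```python
# Minimal dispatch LP: meet demand at least cost with two generators.
# All numbers are illustrative, not real market data.
from scipy.optimize import linprog

costs = [10, 50]              # $/MWh: cheap solar, expensive gas
demand = 100                  # MW

A_eq = [[1, 1]]               # constraint 1: supply equals demand
b_eq = [demand]
bounds = [(0, 60), (0, 200)]  # constraint 3: per-generator capacity limits

res = linprog(costs, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")

print(res.x)  # dispatch: solar at its 60 MW cap, gas covers the remaining 40
# The dual value of the supply=demand row is the shadow price of demand.
# Its magnitude here is 50, the gas cost, because solar's capacity binds:
# serving one more MW means dispatching more gas.
print(res.eqlin.marginals)
```

    Reading the duals is the whole trick: a nonzero shadow price on a constraint tells you it is tight, and roughly how much relaxing it would move the objective.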

    Founders can do the same thing without building an optimization model.

    Write down your objective function in one sentence, then list the constraints that can stop it. List the problems that can actually prevent the outcome.

    If your objective is to “ship an AI product that people use,” your constraints might be: you can’t get power, you can’t get compute, you can’t build out a data center fast enough to matter, you can’t get the agreement structure to work, or you can’t make the unit economics work.

    The reason this step matters is simple: if you don’t know what’s tight, you’ll build the wrong thing.
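
    The exercise above can be a back-of-the-envelope table rather than a full model. Here is a sketch with entirely hypothetical numbers: list what each constraint requires against what you can actually get, and see which ones bind:

```python
# For each constraint: (what the objective needs, what you can actually get).
# All names and numbers are hypothetical, for illustration only.
constraints = {
    "power (MW)":                      (40, 25),
    "GPUs (thousands)":                (10, 12),
    "build timeline (months allowed)": (18, 24),
}

# A constraint binds when what you can get falls short of what you need.
binding = {name: need - have
           for name, (need, have) in constraints.items()
           if have < need}
print(binding)  # the shortfalls that actually block the objective
```

    With these numbers only power binds; GPUs and the timeline have slack, so effort spent on them would not move the objective.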

    2. Use logistics as your reality check

    Logistics has a nice way of forcing you to deal with reality. It doesn’t care about narratives. You either procure the supply that you need, or you don’t.

    Operations researchers in big tech companies are used to this frame of thinking. They’ll literally write up a linear program or mixed integer optimization to represent how to organize their data center. This degree of modeling might be surprising to smaller companies.

    And when there’s no way to satisfy all of your constraints, the optimization tells you that. Of course, there’s a cost to fixing it. You might have to relax some of your constraints. You can’t always get everything that you want.
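
    As a sketch of that behavior, the same toy dispatch model (illustrative numbers again) goes infeasible when demand exceeds total capacity, and the solver says so; relaxing a capacity constraint restores a solution:

```python
# When constraints can't all be satisfied, the LP reports infeasibility.
# Numbers are illustrative, not real market data.
from scipy.optimize import linprog

costs = [10, 50]
A_eq, b_eq = [[1, 1]], [300]   # demand of 300 MW...
bounds = [(0, 60), (0, 200)]   # ...against only 260 MW of total capacity

res = linprog(costs, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.status)  # 2 -> infeasible: no way to satisfy all constraints

# Relax the capacity constraint (e.g. procure more gas) and re-solve.
res = linprog(costs, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 60), (0, 300)], method="highs")
print(res.status, res.x)  # 0 -> feasible again, at a cost
```

    The fix isn’t free: relaxing the bound is exactly the “cost to fixing it” the text describes.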

    That’s the whole point: constraints show up in the numbers. If you’re building in AI, stop asking, “What is everyone doing?” and start asking, “What would make my objective function move?” In your business, the objective might be deployment timelines, inference cost, latency or the ability to actually get compute online.

    This is why I like the phrase “binding constraint.” It forces honesty. It forces you to say, “What is the thing that’s stopping me?”

    3. Treat compute like a menu

    I’ve seen people reach for a default answer because it’s what’s advertised most broadly: “build a data center.” But there might be other ways to get the compute you need.

    When I think about AI companies, I translate the same question into founder language: How should AI companies try to get the power or the compute that they need?

    One way is that you get your own GPUs. You find the power. Another option is a data center build-out, which also needs GPUs. You could also use inference providers that already exist, or rent data centers, or get them from somewhere like Crusoe.

    It’s a fairly common build, buy or joint-venture consulting problem. And I like framing it this way because it forces you to stop pretending there’s one default path, especially when “what’s advertised most broadly” becomes the roadmap.

    So take the menu of options and force a decision by asking two questions of each one: is it the cheapest way to solve our binding constraint, and what new constraint does it introduce? Owning GPUs solves one problem, and then you’re back to “find the power.”

    A data center sounds direct, but if the timeline doesn’t work, then you have a new binding constraint. Renting or using inference providers can avoid one binding constraint, but you might trade it for another.
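
    The menu can be written down the same way as the constraints. The options, costs, and follow-on constraints below are hypothetical, purely to show the shape of the decision:

```python
# For each sourcing option: (assumed $/GPU-hour, the constraint it
# introduces next). All entries are hypothetical, not market data.
options = {
    "own GPUs":            (1.20, "find the power"),
    "build a data center": (0.90, "timeline to energize the site"),
    "rent capacity":       (2.00, "availability and contract terms"),
    "inference provider":  (2.50, "per-token cost and vendor lock-in"),
}

# Cheapest way to clear the current binding constraint...
cheapest = min(options, key=lambda o: options[o][0])
print(cheapest, "->", options[cheapest][1])
# ...but choosing it just surfaces the next binding constraint.
```

    The point of the table isn’t the numbers; it’s that every row has a second column. No option removes constraints outright; each one trades the current bottleneck for a new one.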

    Stop copying the loud bottleneck

    If you remember one thing, make it this: the limiter isn’t always what everyone is talking about.

    The first step is always the same: define the objective function, then ask what the constraints are. The commodities markets are a good teacher because the name of the game has always been procurement. Supply has to equal demand, you can only ship so much, and you can only produce so much yourself. When something is tight, the price tells you. When something is loose, the price tells you that too.

    AI infrastructure is heading into the same kind of reality. You can obsess over chips, but you still have to find the power. And once you see the problem through that lens, the roadmap gets a lot clearer: pick your option on the build, buy or joint-venture spectrum, be honest about what’s actually tight, and plan around the constraint that’s real, not the loud one.



