Competing in AI When Infrastructure Is Controlled by Hyperscalers

Building AI Products With Limited Resources in a Centralized Landscape

With AI infrastructure increasingly concentrated among hyperscalers, smaller businesses and startups face an uphill battle, but they don’t have to be squeezed out. Senior Executive AI Think Tank leaders outline actionable strategies, from open-source models to domain specialization, that small businesses can use to build competitive AI products with limited resources.

by AI Editorial Team on November 3, 2025

As major players like OpenAI, Google, Amazon and Anthropic continue to dominate AI infrastructure, smaller businesses and startups face a growing concern: how to compete in a landscape shaped by centralized compute, model development and vast resources. Major tech firms have invested billions in foundation models and own substantial portions of the infrastructure underlying generative AI. This can make it challenging for smaller companies not only to get off the ground but to get ahead.

The Senior Executive AI Think Tank brings together seasoned experts in machine learning, generative AI and enterprise AI applications who believe that smaller firms can still win—in different ways. This article explores their insights on how startups should pivot from trying to match scale to leveraging agility, domain expertise and smarter infrastructure choices.

Embrace Open-Source and Modular Architectures

As hyperscalers consolidate AI infrastructure, startups face a tough question: How can they compete when compute and capital are stacked against them? Aravind Nuthalapati, Cloud Technology Leader for Data and AI at Microsoft, argues that the answer lies not in matching resources, but in using them more strategically.

“AI centralization by hyperscalers challenges smaller firms, but agility and focus can level the field,” Nuthalapati says. “Startups should leverage open-source models like Mistral or Llama to fine-tune cost-effectively and retain control. Use modular, API-driven architectures that stay cloud-agnostic and integrate lightweight tools for orchestration.”

This approach—treating AI models as components rather than dependencies—empowers smaller companies to stay nimble. Instead of relying on proprietary black-box systems, startups can use open frameworks to fine-tune and deploy models across multiple environments.
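As a loose illustration of this component-style approach (the class and provider names below are hypothetical, not from any specific vendor SDK), application code can depend on a thin interface so that an open-source model and a hosted API remain interchangeable:

```python
from abc import ABC, abstractmethod


class TextModel(ABC):
    """Provider-agnostic interface: any model is just a swappable component."""

    @abstractmethod
    def generate(self, prompt: str) -> str: ...


class OpenWeightsModel(TextModel):
    """Placeholder for a self-hosted open model (e.g. a fine-tuned Llama or Mistral)."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint  # wherever the startup serves the model

    def generate(self, prompt: str) -> str:
        # In practice: POST the prompt to self.endpoint and return the completion.
        return f"[open-model@{self.endpoint}] {prompt}"


class HostedAPIModel(TextModel):
    """Placeholder for a hyperscaler-hosted API, used only where it adds value."""

    def __init__(self, api_name: str):
        self.api_name = api_name

    def generate(self, prompt: str) -> str:
        # In practice: call the vendor's SDK here.
        return f"[{self.api_name}] {prompt}"


def answer(model: TextModel, question: str) -> str:
    # Application logic depends only on the interface, so models can be
    # swapped per task, per cost profile or per cloud without rewrites.
    return model.generate(question)
```

Because nothing above the `TextModel` interface knows which backend is in use, a team can fine-tune and self-host where control matters and fall back to a hosted API where it doesn’t, without touching application code.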


Focus on Applied Innovation, Not Infrastructure Lock-In

The centralization of AI capabilities at companies like OpenAI, Google and Amazon may look like a closed door, but Mo Ezderman, Director of AI at Mindgrub Technologies, believes it’s an open invitation to think differently. He draws a parallel to how FAAMG once dominated the cloud, mobile and data markets, yet left room for thousands of innovators to thrive atop their platforms.

“AI centralization in firms like OpenAI, Google and Amazon isn’t a new pattern; it mirrors how FAAMG dominated cloud, mobile and data infrastructure over the last decade,” Ezderman says. “These companies build foundational tech and monetize access, but they aren’t positioned to solve every business problem.”

For smaller firms, the lesson is clear: Focus on application-level creativity, not infrastructure mimicry. “Smaller companies can compete by focusing on applied innovation, using these tools creatively to address niche, high-impact challenges,” he adds.

Build Domain Specialization and Unique Data Moats

If hyperscalers own the infrastructure, then startups must own the insight. Jim Liddle, Chief Innovation Officer of Data Intelligence and AI at Nasuni, says startups need to stop thinking of AI giants as competitors—and start treating them as utilities.

“The large AI vendors aren’t the competition—they’re the infrastructure. You don’t compete with the power company; you build things that need electricity,” Liddle says. “Smart startups treat foundation models as a commodity and spend their cash on what the giants can’t replicate, such as solving problems in a particular vertical or domain in which they have deep experience.”

That means trading breadth for depth. According to Liddle, “The idea of building your own AI model is a capital bonfire.” Instead, firms should direct capital toward proprietary data collection and problem-specific optimization.


Precision, Trust and Ecosystem Over Scale

While large AI firms chase generality, smaller ones can win with precision. Divya Parekh, Founder of The DP Group, believes startups need to focus on intimacy, not imitation.

“Small companies often think they’re losing because they don’t have the budgets of hyperscalers,” Parekh says. “You don’t need more compute; you need more clarity. Start by owning a problem no one else is obsessing over. Then collect the data others can’t, because proximity beats scale.”

Her philosophy centers on building sharp ecosystems of partners, data and users that hyperscalers can’t easily replicate. “The companies that win will not build the biggest models. They will create the sharpest ecosystems,” she adds.

Agility, Fine-Tuning and Cost-Efficient Paths

For Aditya Vikram Kashyap, Vice President of Firmwide Innovation at Morgan Stanley, the secret to competing with hyperscalers isn’t brute force—it’s focus.

“Startups don’t need to outspend hyperscalers to compete—they need to out-innovate them,” Kashyap says. “True leverage comes from building on open-source models, fine-tuning them for domain-specific problems, and tapping into cloud credits and modular APIs rather than trying to replicate full-stack infrastructure.”

This focus on adaptability rather than scale gives smaller firms an edge. “What smaller firms lack in compute power, they make up for in agility: the ability to move fast, specialize deeply and build trust with users,” he adds.

Partnerships, Niches and Feedback-Driven Development

Even with limited capital, startups can amplify their impact through collaboration. Roman Vinogradov, VP of Product at Improvado, stresses the value of ecosystem-building and customer iteration.

“Smaller businesses can find their niche by focusing on specific problems that larger firms might overlook,” Vinogradov says. “Leverage open-source AI tools and platforms to build prototypes quickly without heavy investments. Collaborate with research institutions or universities for access to resources and fresh talent.”

He encourages startups to prioritize speed and feedback over perfection. “Agile iterations based on real user needs can set you apart in a crowded market,” he adds.


Out-Specialize, Don’t Out-Scale

Mohan Krishna Mannava, Data and AI Leader at Texas Health, argues that hyperscaler centralization demands not defiance, but precision. He believes smaller firms can compete effectively by targeting high-value niches and building data moats.

“The structural challenge posed by the AI centralization within hyperscalers demands a shift in strategy,” Mannava explains. “Smaller businesses must prioritize out-specializing rather than trying to out-scale. This is best achieved by adopting a hybrid infrastructure approach—leveraging open-source models fine-tuned on proprietary data.”

By narrowing their focus, startups can train models that outperform general-purpose AIs within specific verticals.

Actionable Strategies for Leaders

  • Embrace open-source and modular architectures. Leverage open-source models and cloud-agnostic orchestration to stay lean and flexible.
  • Focus on applied innovation, not infrastructure lock-in. Competing with hyperscalers doesn’t mean building everything—just solving the problems they aren’t addressing.
  • Build domain specialization and data moats. Be the best in a narrow vertical rather than try to be everything.
  • Prioritize precision, trust and ecosystem over scale. Clarity of insight and customer intimacy beat sheer size when resources are limited.
  • Fine-tune open models and harness agility. Build on open-source models, fine-tune them for your domain and tap into cloud credits rather than replicating full-stack infrastructure.
  • Leverage partnerships, feedback loops and value-focused development. Zero in on collaboration, iterative design and delivering real value rather than tech for its own sake.
  • Out-specialize instead of out-scale. Focus on hybrid infrastructures and proprietary data to outperform hyperscaler systems in tightly defined domains.

The Future Belongs to the Focused

In an AI landscape increasingly dominated by a few hyperscalers, smaller businesses and startups might feel the odds are stacked against them, but the industry’s shift toward open models, modular infrastructure and domain-specialized solutions levels the field in unexpected ways. The verdict from the Senior Executive AI Think Tank: you don’t have to outspend the big players, but you do have to out-think them.

By embracing open-source models, zeroing in on a niche, fine-tuning for domain relevance, building trust and leveraging agility and partnerships, smaller firms can not only survive but thrive. The future of AI competition isn’t about who has the biggest model—it’s about who has the sharpest focus, the clearest domain insight and the tightest connection to the end user.
