Thought leadership
What DeepSeek Actually Means for Enterprise AI
Iain Mackie
DeepSeek R1 runs 20-50x cheaper than OpenAI's equivalent — but for enterprises, the real story isn't the model. It's what you build around it.
This week, the world lost its mind over DeepSeek.
Stock markets shed a trillion dollars. Pundits declared it "AI's Sputnik moment." The coverage has been wall-to-wall geopolitics — US vs China, open source vs closed, who spent what and who got blindsided.
It's a great story. But if you're a business leader trying to work out what AI can actually do for your organisation, most of this commentary misses the point entirely.
I was quoted in The Independent this week alongside other UK tech leaders on what DeepSeek means for the industry. My take was simple: "Model companies represent just one facet of the AI market's potential. While language models are powerful, the real impact will come from technologies that drive AI adoption to solve high-value problems."
I want to expand on that — because the enterprise implications of DeepSeek are far more interesting than the headlines suggest.
The cost story matters more than the geopolitics
Here's the number that should actually get your attention. DeepSeek's R1 model runs at roughly $0.55 per million input tokens. OpenAI's comparable reasoning model, o1, costs $15 per million. That's somewhere between 20x and 50x cheaper, depending on the task.
The reported training cost? Around $5.6 million — a fraction of what OpenAI, Google, and Anthropic have spent on their flagship models.
Whether those numbers hold up to scrutiny is almost beside the point. What DeepSeek has demonstrated is that the cost of building and running capable AI models is falling fast. The assumption that you need billions of dollars and tens of thousands of GPUs to compete? It's crumbling.
For enterprises weighing up AI investments, this changes the equation. The barrier to entry just dropped.
But the model isn't the product
And here's where I think most of the DeepSeek commentary goes wrong.
There's a fixation on foundation models — who built the best one, who's cheapest, who's open source. It's the wrong question for most businesses. Foundation models are commoditising. DeepSeek just accelerated that process.
Think of it like this: nobody chooses a bank based on which cloud provider hosts their servers. The infrastructure matters, but it's not the value. The value sits in the layers above — the data, the domain expertise, the workflows, the integration with systems people actually use.
The same is true for AI. At Malted, we build smaller, purpose-built language models for enterprise clients. We've been saying for a while that the race to build the biggest general-purpose model isn't the race that matters for most organisations. What matters is whether the AI solves a specific, high-value problem — reliably, affordably, and within the constraints of your data and your industry.
DeepSeek just made that argument a lot easier to win.
What this means if you're adopting AI right now
If you're a business leader watching the DeepSeek fallout and wondering what to do about it, here's my advice.
First, stop waiting for the "right" model. The model landscape will keep shifting. DeepSeek won't be the last surprise. Your competitive advantage won't come from picking the best foundation model — it'll come from understanding your own data and use cases deeply enough to build something useful on top of whatever model fits.
Second, think smaller. Not every problem needs GPT-4 or DeepSeek R1. Smaller, fine-tuned models trained on your domain data can outperform general-purpose giants on the tasks that actually matter to your business — at a fraction of the cost and with far better control over your data.
Third, focus on adoption, not capability. The models are already impressive. The bottleneck in most organisations isn't the AI — it's the data pipelines, the governance, the change management, the integration work. That's where the real investment should go.
The efficiency era is here
DeepSeek isn't the end of the AI race. It's the starting gun for a different one — a race toward efficiency, specificity, and practical value.
For businesses, that's unambiguously good news. The question was never "which model is biggest?" It was always "which approach solves your problem?" DeepSeek just made it a lot harder for anyone to argue otherwise.
