Why Optionality Is the Right Strategy for Data & AI

Blog by Ana Valle

With the growing strategic importance of Data and AI to modern businesses, choosing the right technologies for your organization has never been more crucial. But a fast-paced, ever-changing market can create a lot of uncertainty and make these choices feel overwhelming.

Choosing the wrong solution could mean getting stuck with something sub-optimal, or facing costly, time-consuming restructuring to adapt to new tools. At the same time, delaying these decisions could mean falling behind competitors’ capabilities. But what if there were a way to innovate quickly and leverage the latest technologies without committing to a single technology stack?

If the market is constantly changing, your data and AI strategy should also allow you to easily change, adapt and experiment with new tools. Maintaining optionality should be the highest priority for organizations that want to leverage the best of data and AI in this competitive and fast-moving landscape.

A buyer’s market

The good news for those making strategic decisions in data & AI is that it is increasingly becoming a buyer’s market. Competition between AI models from the largest providers is intense, and heavy investment pressures those providers to compete for customer acquisition and retention, driving down costs and accelerating innovation. Additionally, cloud and data platform services are increasingly commoditized, with little differentiation between the main vendors’ offerings in many regards. While the major vendors may present themselves as a one-stop-shop for everything related to data and AI, independent data platform providers bet on open formats and interoperability, allowing you to “mix and match” and avoid single-vendor lock-in.

However, this favourability towards buyers comes with a caveat. The data and AI landscape is not only competitive, but also complex, interconnected, and fast-moving. In this context, making a bet on one vendor or approach is increasingly risky. This is where optionality becomes critical.

The ability to adapt

Optionality is simply about keeping your options open and being able to adapt to this complex and fast-moving environment. It means not being locked into one ecosystem or solution when another becomes best in class or offers new capabilities. It means spreading risk should one provider become unable to deliver the service you need. And it also means having more power when negotiating special deals and rates with providers, since the threat of switching is real.

This requires a change of mindset when considering strategic choices in data and AI. When making these choices, we are used to focusing on aspects like technical capabilities or cost, but building your strategy with optionality in mind means weighing additional factors in your decision:

1. Commercial Trends and Incentives: It’s crucial to analyse not just the tool itself, but the commercial incentives of the vendors behind it.
2. Lock-in Degree: For every choice, it’s important to consider the degree of lock-in and the effort required to switch directions in the future, if necessary.

Too often, organizations focus purely on what a technology can do. It’s equally important to understand why a vendor is offering it, how it’s monetized, and where lock-in might be introduced.

Implementing optionality

So how can we put optionality into practice?

The core principle is to build a robust, vendor-agnostic foundation, and connect whichever tools you choose on top of that. This way, even when the tools change, the foundation stays the same.

Firstly, prioritize open standards and open source over vendor-specific formats. Open file and table formats like Iceberg, Delta and Parquet, as well as open processing frameworks like Spark, are widely used and compatible, giving you much greater flexibility than vendor-specific solutions.

When designing data flows and pipelines, be cautious with fully managed or automated approaches provided by vendors. Although automation can greatly help in some instances, building your own metadata-driven data automation framework gives you more control over your data, prevents lock-in and allows you to adapt faster in case you need to switch any tools in the future.
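To make the metadata-driven idea concrete, here is a minimal sketch in plain Python. All names here (`PIPELINE`, `run_pipeline`, the step types) are illustrative: the point is that each transformation is described as data, and a small generic runner interprets it, so swapping the underlying engine means rewriting the runner, not every pipeline.

```python
# Pipeline steps expressed as metadata, not engine-specific code.
PIPELINE = [
    {"step": "filter", "column": "amount", "min": 100},
    {"step": "rename", "from": "amount", "to": "revenue"},
]

def run_pipeline(rows, steps):
    """Apply each metadata-described step to a list of dict records."""
    for step in steps:
        if step["step"] == "filter":
            rows = [r for r in rows if r[step["column"]] >= step["min"]]
        elif step["step"] == "rename":
            rows = [
                {(step["to"] if k == step["from"] else k): v for k, v in r.items()}
                for r in rows
            ]
        else:
            raise ValueError(f"unknown step: {step['step']}")
    return rows

data = [{"amount": 50}, {"amount": 250}]
result = run_pipeline(data, PIPELINE)  # [{'revenue': 250}]
```

A production framework would target Spark or SQL rather than Python lists, but the principle is the same: the pipeline definitions are portable metadata, and only the thin runner is tied to a particular tool.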

Another way to introduce optionality is by offering built-in choices for users. For example, if you’ve developed an application that uses large language models, the ability to switch models on the fly gives users greater flexibility depending on their needs and goals. This is only achievable with a robust, vendor-agnostic data platform foundation.
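One common way to enable this kind of switching is a thin abstraction layer: the application depends on a single interface rather than any one provider’s SDK. The sketch below uses hypothetical provider classes as stand-ins; in practice each would wrap a real client library.

```python
from typing import Protocol

class ChatModel(Protocol):
    """The one interface the application depends on."""
    def complete(self, prompt: str) -> str: ...

# Stand-in providers; real adapters would call the vendors' SDKs.
class ProviderA:
    def complete(self, prompt: str) -> str:
        return f"[model-a] {prompt}"

class ProviderB:
    def complete(self, prompt: str) -> str:
        return f"[model-b] {prompt}"

MODELS: dict[str, ChatModel] = {"model-a": ProviderA(), "model-b": ProviderB()}

def answer(prompt: str, model_name: str) -> str:
    # Switching models is a lookup, not a code change.
    return MODELS[model_name].complete(prompt)
```

With this shape, adding or swapping a model means registering one more adapter, and users can pick a model per request without the application being rewritten.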

Focus on what matters

The AI landscape will continue to change, new models will emerge, and platforms will converge and diverge. It is easy to get distracted or feel paralyzed by choice. To cut through the noise, organizations need to focus on their goals, have a clear view of the technical possibilities and understand the commercial realities. Rather than trying to pick a single “winner”, building your data & AI estate with optionality gives you resilience, flexibility and the freedom to take advantage of a rapidly evolving industry.

Find out more

Contact Ana Valle. Ana is a Data & AI Consultant at Thorogood based in Brazil.
