In the previous chapter, we saw the advantage of embedding LLMs in a data warehouse: it empowers the user with new ways to interact with the data while eliminating the need to move it to external systems for analysis — an activity that doesn’t add any value but often can’t be avoided.
Some might argue that integrating AI directly into a data warehouse is unnecessary, especially given that many so-called "AI" services today are just repackaged versions of ChatGPT with little added value.
But is it really overkill in this case? Let's consider it from a different perspective.
If I ask you to close your eyes and picture a long summer road trip in a car, I'm pretty sure that, by default, you would imagine a car with A/C.
A car without A/C would still get you to your destination, sure. But who would give up the comfort of air conditioning? We're so used to it that it has become a commodity.
We are going to witness a similar process for AI in data warehouses. It may seem redundant today, but soon everyone will realize that it makes the ride a lot smoother and more convenient for the user.
And it won't be long before LLMs become the default way of interacting with a data warehouse.
We have made the case for having AI within the data warehouse; now let's see what Cortex AI brings to the table.