Generative AI Challenges and Opportunities for Modern Enterprises

By Coral Trivedi

Generative AI (GenAI), machine learning (ML), and large language models (LLMs) are becoming increasingly important to modern enterprises, but achieving measurable value from AI is still a challenge. Part of the issue is that a well-trained AI model relies on a large amount of data, and for many companies, organizing and making use of all that data is a daily struggle. To maximize the value of AI, companies need a well-organized data stack: once data sources are consolidated, it becomes much easier to build valuable generative AI use cases. Here are a few examples already adding value today.

AI in Software Development and Data Science

As LLMs go, GPT-4 is an impressive generalist, with broad-ranging knowledge spanning world history, computer programming, Middle Eastern cuisine, and beyond. That's not surprising, as it was largely trained on webpages scraped from the internet. But what most companies need are specialized models focused on their vertical market and trained on their internal data, not the internet. The a16z post "What Builders Talk About When They Talk About AI" explained that enterprises don't really need more chatbots. Companies need GPTs that can efficiently provide insight with high accuracy and precision. It doesn't matter whether the AI can summarize Shakespeare – it matters whether it can accurately predict a potential customer's lifetime value.

Ali Ghodsi of Databricks noted that his customers "want to have specialized models that are cheaper, smaller, and have really high accuracy and performance." For a domain like manufacturing that requires extreme accuracy, you're better off training a smaller model on a specialized, domain-specific dataset. The resulting model will be faster, cheaper, and more accurate.

With a more comprehensive dataset, we're seeing how companies can prototype new software and iterate quickly. At my company, we use generative AI to help create prototype connectors that move data from cloud apps, databases, streaming sources, and enterprise applications into a data warehouse or data lake. Creating connectors for new SaaS applications is challenging when platforms and schemas change so quickly. Using GPT-4, we've been able to get a customer up and running while we do the longer-term work of building full-featured, robust connectors.
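To make the idea concrete, here is a minimal sketch of the core job a prototype connector performs: flattening a nested SaaS API record into the flat column/value rows a warehouse table expects. The payload shape and column names are hypothetical, not any specific product's schema.

```python
# Sketch of a prototype connector's core task: flatten a nested SaaS
# API record into warehouse-friendly columns. The event shape below
# is an illustrative assumption, not a real API payload.

def flatten(record: dict, parent: str = "") -> dict:
    """Recursively turn nested JSON into flat column names and values."""
    row = {}
    for key, value in record.items():
        column = f"{parent}_{key}" if parent else key
        if isinstance(value, dict):
            row.update(flatten(value, column))  # descend into nested objects
        else:
            row[column] = value
    return row

event = {"id": 42, "customer": {"name": "Acme", "plan": "pro"}}
row = flatten(event)
# row == {"id": 42, "customer_name": "Acme", "customer_plan": "pro"}
```

A full-featured connector adds type mapping, incremental sync, and error handling on top of this, which is exactly the longer-term work that follows the prototype.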

Instant Intelligence

One of the use cases I find fascinating is how GenAI is being used for search and summarization. Every big company has multiple data repositories – Atlassian, Slack, SharePoint, Teams, Google Drive, Gmail, or a mix of all of the above. For the most part, these massive stores of organizational knowledge remain largely untapped. That will soon change as companies recognize the competitive advantage of tapping into this data and leveraging it with AI. Retrieval-augmented generation (RAG), which enables LLMs to retrieve facts from outside sources such as internal documents or the internet, is an exciting development we have yet to fully capitalize on.
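The RAG pattern itself is simple: retrieve the most relevant internal documents, then prepend them to the prompt so the model answers from company facts rather than its training data. The sketch below uses naive keyword overlap for scoring; the document names, contents, and prompt format are all hypothetical stand-ins for a real vector store and LLM call.

```python
# Minimal retrieval-augmented generation (RAG) sketch. The document
# store, scoring method, and prompt wording are illustrative
# assumptions, not any specific vendor's API.

def retrieve(query: str, docs: dict[str, str], k: int = 1) -> list[str]:
    """Rank internal documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda name: len(q_terms & set(docs[name].lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: dict[str, str]) -> str:
    """Prepend retrieved context so the LLM answers from internal facts."""
    context = "\n".join(docs[name] for name in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = {
    "hr_policy": "Employees accrue 15 vacation days per year.",
    "sales_faq": "Enterprise contracts renew annually in January.",
}
prompt = build_prompt("How many vacation days do employees get?", docs)
```

Production systems replace the keyword overlap with embedding similarity over a vector index, but the retrieve-then-augment structure is the same.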

Along with these enterprise apps, there are domain-specific repositories – such as the trading history at a finance company, or retail orders and customer profiles – that need to be integrated into the training dataset. Training an LLM on this data makes it easy to ask questions in plain English that uncover information from across an organization's entire data stack. But that data needs to be organized and categorized first so training can make sense of it, and the more data available, the better the results.

This problem is especially challenging in a change data capture environment, where financial or transaction data arrives around the clock and is constantly updating. When data schemas change, data can get miscategorized or even lost to the ether. If the LLM is going to help automate tasks, create new product ideas, or brainstorm new concepts, it needs to be up to date. Unfortunately, many companies struggle just to get their data into one place.
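One common safeguard against the schema-change problem is drift detection: compare each incoming record against the expected schema and flag new or missing fields instead of silently miscategorizing them. This is a minimal sketch with made-up field names, not a real CDC pipeline.

```python
# Sketch of schema-drift detection in a change-data-capture stream:
# flag fields the schema gained or lost so records aren't silently
# miscategorized. Field names are hypothetical.

EXPECTED = {"order_id", "amount", "currency"}

def check_drift(record: dict) -> dict:
    """Report fields added to or missing from the expected schema."""
    fields = set(record)
    return {
        "added": sorted(fields - EXPECTED),    # new fields to route or map
        "missing": sorted(EXPECTED - fields),  # dropped fields to investigate
    }

drift = check_drift({"order_id": 1, "amount": 9.99, "channel": "web"})
# drift == {"added": ["channel"], "missing": ["currency"]}
```

In practice a pipeline would log this drift report and quarantine the record rather than load it, keeping the downstream training data clean.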

AI Up-Levels Roles and Facilitates Collaboration 

For a long time, there's been a need for entry-level software engineers who can write basic code without focusing on the bigger picture: data architecture and design patterns, integration with other platforms, or designing a system for maximum performance.

As Dylan Field of Figma put it, "The best designers are starting to think much more about code, and the best developers are thinking much more about design." GenAI is enabling designers and developers to cross into each other's traditional domains and add value – that's going to make development much faster. Meanwhile, smart devs are studying systems design patterns to move themselves higher up the value chain.

Ultimately, the fusion of generative AI, large language models, and machine learning will transform enterprise operations. From software development to marketing strategy, generative AI will have a dramatic impact by creating new code, prototyping ideas, and breaking down silos between designers and coders – without giving away proprietary data. The key will lie in balancing the versatility of AI with an essential foundation of data management. If we can keep the underlying data centralized and integrated, we can kick off this next era of technology, making people more productive and enterprises more effective.