Prioritizing Data: Why a Solid Data Management Strategy Will Be Critical in 2024 – TDWI

This year, smart enterprises will see beyond hype and strategically position themselves for success by using data as a foundational asset to deliver growth, innovation, and competitive advantage.
In 2023, data leaders and enthusiasts were enamored of — and often distracted by — initiatives such as generative AI and cloud migration.
The generative AI buzz and interest in cloud migration shouldn't be ignored, but as with any technology that depends on a data strategy, data and analytics professionals must be crystal clear about their priorities and confident that the projects they pursue will advance their business goals.
As companies in almost every market segment work to modernize their data management practices and drive better business outcomes, several trends will emerge this year: a more realistic understanding of AI, recognition of the role semantic metadata plays in data fabrics, and the rapid acceleration of knowledge graph adoption, driven by large language models (LLMs) and the convergence of labeled property graphs (LPGs) and resource description frameworks (RDFs).
I expect to see the following data and knowledge management trends emerge in 2024.
Trend #1: Organizations will (finally) manage the hype around AI
As the deafening noise around generative AI reaches a crescendo, organizations will be forced to temper the hype and foster a realistic and responsible approach to this disruptive technology. Whether it’s an AI crisis around the shortage of GPUs, climate effects of training large language models (LLMs), or concerns around privacy, ethics, bias, and/or governance, the challenges will worsen before they get better, leading many to wonder if it’s worth applying generative AI in the first place.
Although corporate pressures may prompt organizations to “do something with AI,” being data-driven must come first and remain top priority. After all, ensuring foundational data is organized, shareable, and interconnected is just as critical as asking whether generative AI models are trusted, reliable, deterministic, explainable, ethical, and free from bias.
Before deploying generative AI solutions to production, organizations must protect their intellectual property and plan for potential liability issues: although generative AI can replace people in some cases, there is no professional liability insurance for LLMs. Business processes that involve generative AI will therefore still require extensive human-in-the-loop involvement, which can offset any efficiency gains.
In 2024, expect to see vendors accelerate enhancements to their product offerings by adding new interfaces focused on meeting the generative AI market trend. However, organizations need to be aware that these may be nothing more than bolted-on Band-Aids. Addressing challenges such as data quality and ensuring unified, semantically consistent access to accurate, trustworthy data will require setting a clear data strategy as well as taking a realistic, business-driven approach. Without this, organizations will continue to pay a “bad data tax” as AI/ML models will struggle to get past a proof of concept and ultimately fail to deliver on the hype.
Trend #2: Knowledge graph adoption accelerates as LLMs and technology converge
A key factor slowing down knowledge graph (KG) adoption is the extensive (and expensive) process of developing the necessary domain models. LLMs can optimize several tasks, such as updating taxonomies, classifying entities, and extracting new properties and relationships from unstructured data. Done correctly, LLMs could lower information extraction costs because the proper tools and methodology can manage the quality of text analysis pipelines and bootstrap or evolve KGs at a fraction of the effort currently required. LLMs will also make it easier to consume KGs by applying natural language querying and summarization.
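As a rough illustration of the bootstrapping idea above, the sketch below grows a knowledge graph from (subject, predicate, object) triples and answers simple pattern queries. Here `extract_triples` is a hypothetical stand-in for an LLM text-analysis pipeline; a real system would call a model, then validate its output against the domain taxonomy before accepting new facts.

```python
# Minimal sketch: bootstrapping a knowledge graph (KG) from extracted triples.

def extract_triples(text: str) -> list[tuple[str, str, str]]:
    """Hypothetical LLM extraction step: returns (subject, predicate, object)
    triples found in unstructured text. Hard-coded here for illustration."""
    return [
        ("TDWI", "publishes", "data management research"),
        ("knowledge graph", "is_queried_by", "natural language interface"),
    ]

class KnowledgeGraph:
    def __init__(self) -> None:
        self.triples: set[tuple[str, str, str]] = set()

    def add_document(self, text: str) -> None:
        # Bootstrap or evolve the graph from facts extracted out of raw text.
        self.triples.update(extract_triples(text))

    def query(self, subject=None, predicate=None, obj=None):
        # Simple triple-pattern match (None acts as a wildcard) -- the same
        # idea that underlies SPARQL basic graph patterns over RDF data.
        return [
            (s, p, o)
            for (s, p, o) in self.triples
            if subject in (None, s)
            and predicate in (None, p)
            and obj in (None, o)
        ]

kg = KnowledgeGraph()
kg.add_document("...unstructured source text...")
print(kg.query(subject="TDWI"))
```

The pattern-matching `query` method hints at why LLM-driven natural language interfaces fit well here: a question can be translated into one or more triple patterns and resolved against the graph.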


This article was autogenerated from a news feed of high-quality news and research sources selected by CDO TIMES; no editorial review was conducted beyond that by CDO TIMES staff.
