Blog by: Callum Lawn, Lead Business Intelligence Consultant, Arreoblue
The Omnipresence of Generative AI
Going into Big Data LDN 2024, I set myself an unusual challenge: to avoid every Generative AI session possible. Little did I know how daunting this task would prove to be. In a LinkedIn post during the event, I shared my initial observations:
“This year at Big Data London I’ve given myself the challenge of avoiding every Generative AI session possible – it’s proving harder than I expected. I’d estimate that 95% of the exhibitors are shoehorning AI into their propositions, with a frightening lack of attention given to the engineering fundamentals required to underpin successful AI systems.”
The pervasiveness of Generative AI at the conference was nothing short of overwhelming. From the exhibition floor to the presentation halls, it seemed that every corner of Big Data LDN was buzzing with AI-related topics. Vendors, speakers, and attendees alike were caught up in a whirlwind of AI enthusiasm, with most conversations inevitably circling back to the latest developments in generative models and their potential applications.
This AI-centric focus, while exciting, raised some concerns. As someone deeply involved in data engineering, I couldn’t help but notice a significant gap between the lofty promises of vendors and the robust infrastructure required to support these advanced systems. The exhibition floor was flooded with highly curated demos and “AI-powered” solutions, but discussions about the fundamental engineering challenges were few and far between.
In the following sections, we’ll dive into the key themes we observed at Big Data LDN 2024, examining both the opportunities and the potential pitfalls in this AI-driven landscape. We’ll explore why, amidst this “AI gold rush”, it’s crucial to remember the importance of solid engineering foundations in building successful, scalable AI systems.
Engineering Fundamentals: The Overlooked Foundation
As Big Data LDN 2024 progressed, I couldn’t help but draw parallels between the current AI frenzy and the self-service analytics boom we witnessed a few years ago. Both trends share a common thread: they represent the “next shiny thing” that CXOs are racing to implement in their organisations. However, this eagerness often comes at the cost of overlooking a critical component: the engineering foundation necessary to support these initiatives.
Looking back on the self-service analytics bubble, promises of democratised data access and user-friendly dashboards captivated leadership teams across industries. CXOs, sold on the prospect of data-driven decision-making at all levels of their organisations, rushed to implement these solutions.
However, many of these initiatives faced significant challenges or outright failure. Why? Because the underlying data infrastructure, governance frameworks, and data quality processes were not robust enough to support the increased demand and complexity that self-service analytics introduced.
Fast forward to today, and we see history repeating itself with AI. At Big Data LDN, it was evident that AI has become the new object of desire for executives. The promise of AI-powered insights, automated decision-making, and increased operational efficiency would appeal to anyone. However, just as with self-service analytics, there’s a dangerous tendency to underestimate the engineering complexities involved.
What’s often missing from the glossy presentations and eye-catching demos is a serious discussion about the infrastructure needed to make AI initiatives successful and sustainable. This includes:
- Data Engineering: Robust, scalable engineering practices are crucial for feeding AI models with high-quality, timely data.
- Data Governance and Quality: AI models are only as good as the data they’re trained on. Strong governance and quality assurance processes are essential.
- Security and Privacy Considerations: AI implementations often move your data in ways we’ve never had to consider before, sometimes crossing borders and becoming subject to policies outside the region where your data normally resides.
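To make the data-quality point concrete, here is a minimal sketch of the kind of quality gate a pipeline might run before a batch of records is allowed to feed a downstream AI model. The thresholds, field names, and record shape are all invented for illustration; real governance frameworks would be far richer than this.

```python
from datetime import datetime, timedelta

# Hypothetical illustration: a minimal data-quality gate run before
# records feed a downstream AI model. Field names and thresholds
# are invented for this sketch.

REQUIRED_FIELDS = {"customer_id", "amount", "updated_at"}
MAX_NULL_RATE = 0.05            # tolerate at most 5% missing values per field
MAX_STALENESS = timedelta(days=1)

def quality_gate(records, now):
    """Return (passed, issues) for a batch of dict-shaped records."""
    issues = []
    if not records:
        return False, ["empty batch"]

    # Completeness check: each required field must be mostly populated.
    for field in sorted(REQUIRED_FIELDS):
        null_count = sum(1 for r in records if r.get(field) is None)
        if null_count / len(records) > MAX_NULL_RATE:
            issues.append(f"{field}: null rate above threshold")

    # Freshness check: the newest record must be recent enough.
    timestamps = [r["updated_at"] for r in records if r.get("updated_at")]
    if not timestamps or now - max(timestamps) > MAX_STALENESS:
        issues.append("batch is stale")

    return (len(issues) == 0), issues
```

Even a toy gate like this illustrates the principle: checks on completeness and freshness belong in the pipeline itself, upstream of any model, rather than being discovered after an AI initiative has already shipped bad answers.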
Ignoring these foundational elements in the rush to implement AI solutions is a recipe for failure. Just as many self-service analytics initiatives crumbled under the weight of poor data quality and inadequate infrastructure, hastily implemented AI projects are likely to face similar fates.
As data professionals, it’s our responsibility to advocate for a balanced approach. While embracing the potential of AI, we must emphasise the critical importance of solid engineering practices. The excitement around AI should be tempered with a realistic assessment of an organisation’s current data maturity and a concrete plan to build the necessary foundational elements.
In the AI era, perhaps more than ever, the adage holds true: “During a gold rush, sell shovels.” As I mentioned in my LinkedIn post, at Arreoblue, we focus on helping organisations establish the engineering foundations they need to support their enterprise AI ambitions. This approach, while perhaps less glamorous than promising overnight AI transformation, is crucial for long-term success in the AI landscape.
It’s a Platform-as-a-Service now…
The landscape of data analytics platforms is evolving rapidly, with a clear trend towards more comprehensive, integrated solutions. This shift was evident at Big Data LDN 2024, where several key developments caught my attention:
Microsoft’s Fabric offering, despite its slower than anticipated adoption, has clearly influenced the direction of the analytics market. At the conference, it was apparent that many challengers in the analytics space are now pivoting towards similar platform-as-a-service offerings. These competitors are expanding their scope, aiming to provide end-to-end solutions that look increasingly similar to Fabric’s comprehensive approach.
While the intent is clear, the execution varies significantly among vendors. Some have managed to create cohesive, well-integrated platforms that genuinely simplify the data analytics workflow. Others, however, seem to be struggling with the complexity of offering such a wide range of services under one umbrella. This disparity in quality and integration was evident in the demonstrations and presentations throughout the event.
Perhaps the most significant implication of this trend is the continued blurring of lines between traditional data engineering and data analyst roles. As these platforms aim to provide more accessible, end-to-end solutions, they’re effectively pushing for a convergence of skills that were once distinct.
In response to this shift, we’re likely to see growth in the Business Intelligence (BI) Engineering role. This hybrid position bridges the gap between deep technical expertise and business-focused analytics. BI Engineers will be crucial in leveraging these new platform offerings effectively, ensuring that organisations can maximise the value of their data assets while maintaining the necessary technical rigour.
As the industry continues to evolve towards these integrated platforms, it will be interesting to see how roles and skill sets adapt. For now, it’s clear that versatility and a broad understanding of the data lifecycle will be key assets for professionals in the Business Intelligence space.
Copilot-like experiences and LLM Integration
Another trend at Big Data LDN 2024 was the integration of Large Language Models (LLMs) into Business Intelligence tools. Vendors were eager to showcase their AI-enhanced offerings, but the results were decidedly mixed:
- Some integrations showed promise, offering genuinely useful features like natural language querying or automated insight generation.
- Others appeared to be hastily implemented, with limited functionality that added little value to the core BI experience.
- Many vendors presented highly curated demos using small, carefully prepared datasets.
While impressive at first glance, these demonstrations often failed to address the complexities and challenges of real-world, large-scale data environments, raising questions about the scalability and practicality of these LLM integrations in enterprise settings.
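One reason curated demos break down at enterprise scale is that model-generated queries cannot simply be trusted against production data. Below is a heavily simplified sketch of a natural-language-querying flow with a guardrail: the “LLM” is a stub returning canned SQL, and the table names and allow-list are invented for illustration.

```python
import re

# Hypothetical sketch of a natural-language-to-SQL flow with a guardrail.
# The "LLM" is stubbed with canned responses; table names are invented.

ALLOWED_TABLES = {"sales", "customers"}

def fake_llm_to_sql(question):
    """Stand-in for a real LLM call that translates a question to SQL."""
    canned = {
        "total revenue": "SELECT SUM(amount) FROM sales",
        "staff salaries": "SELECT salary FROM payroll",
    }
    return canned.get(question, "SELECT 1")

def guardrail(sql):
    """Reject generated SQL that references tables outside the allow-list."""
    tables = set(re.findall(r"\bFROM\s+(\w+)", sql, flags=re.IGNORECASE))
    return tables <= ALLOWED_TABLES

def answer(question):
    sql = fake_llm_to_sql(question)
    if not guardrail(sql):
        return "refused: query touches restricted tables"
    return sql  # in a real system, this would run against the warehouse
```

Even this toy version hints at the engineering that the polished demos glossed over: access control, query validation, and auditability all sit between a chat box and a production data platform.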
Perhaps the most eyebrow-raising aspect of the LLM integrations was the inclusion of what can only be described as gimmicks:
- Some vendors proudly showcased voice-to-text features, allowing users to literally “talk” to their BI tools.
This raised the question: who was asking for this feature? In a professional setting, the practicality and usefulness of verbally querying a BI tool seem limited at best.
Many of these flashy features seemed to prioritise novelty over genuine utility. The focus on these gimmicks often overshadowed discussions about more critical aspects of BI, such as data quality, governance, and scalable architecture.
As BI tools continue to evolve and incorporate AI technologies, it’s crucial to maintain a critical perspective:
- We must distinguish between genuinely useful AI integrations and superficial add-ons.
- The focus should remain on how these technologies can solve real business problems and improve decision-making processes.
Building a Solid Foundation for Enterprise AI
As we reflect on the trends and observations from Big Data LDN 2024, it’s clear that the data industry is at a critical juncture. The pervasive enthusiasm for AI, particularly generative AI, is reshaping the landscape of data analytics and business intelligence. However, as we’ve explored throughout this article, this excitement comes with its own set of challenges and potential pitfalls.
To summarise the key trends and discussion points briefly:
- AI was everywhere, even where its value proposition was tenuous at best.
- The parallels between the current AI boom and the self-service analytics trend, both often overlooking crucial engineering fundamentals.
- The evolution of platform offerings, mirroring Microsoft’s Fabric and blurring the lines between data engineering and analyst roles.
- The integration of LLMs into BI tools, ranging from promising innovations to questionable gimmicks.
Amidst this AI gold rush, it is crucial to remember the importance of solid engineering foundations. As I mentioned in my LinkedIn post during the conference:
“At Arreoblue we can’t sell you Nvidia Blackwell GPU’s, but we can absolutely support you in establishing the engineering foundations you need in order to support your enterprise AI ambitions.”
While we cannot ignore the potential of AI, we must approach its implementation with a clear-eyed focus on the underlying infrastructure and engineering practices that will make or break these initiatives.
As data professionals, our role is more critical than ever. We need to:
- Advocate for a balanced approach that doesn’t sacrifice engineering rigour for the sake of AI hype.
- Guide organisations in building robust data foundations that can support advanced AI applications.
- Bridge the gap between cutting-edge AI capabilities and practical, scalable implementations.
- Continuously educate ourselves and our teams on both the potential and limitations of AI in data analytics.
The future of data analytics and AI is undoubtedly exciting, but it’s up to us to ensure that this future is built on a solid, sustainable foundation. By focusing on the fundamentals, we can help our organisations navigate the complexities of AI adoption and unlock its true potential.
About Arreoblue
Arreoblue is a forward-thinking solutions provider specialising in tailor-made strategies to optimise business processes and foster growth. With our Assess, Accelerate and Amplify methodology, our experts will utilise their decades of experience to empower your people with a platform for success and a solution that works FOR you.
To find out more, get in touch with one of our dedicated team today at info@arreoblue.com