We provide a range of data services to help our clients uncover the true story their data is telling them and facilitate data-driven decision making.
Our expertise spans 5 key technical areas, from data visualisation to data science. Explore our solutions based on your technical requirements or by your business function.
Data Strategy
From data storage and management to processes and analysis, we’ll help you get the most out of your data. We’ll audit your commercial requirements, review your KPI frameworks and design a custom solution that helps you achieve your goals.
Data Platforms
With a variety of cloud data platforms (Azure, GCP & AWS) in the market, it can be tough to choose the right platform for you and your business. As experts in data, we can guide you to the right solution based on your business needs.
Data Visualisation
We are technology agnostic and experts in a wide range of tools that can help you better visualise your data, from Power BI and Looker to Qlik Sense and Tableau.
Data Science & AI
Our team of data scientists are experts in helping businesses gain actionable insights from their data. We can build you custom AI solutions using the latest machine learning techniques.
Data People
Looking for an agency that can act as an extension of your in-house team, or provide a fully managed service? We can step in and help fill your skills gap.
Ecommerce
Understanding the performance of your eCommerce site is essential for growth. We can provide those insights, and optimise performance, through a variety of techniques such as CRO, A/B testing, tagging and tracking, and visualisation.
Marketing
We’ll help you understand the true performance of your marketing campaigns and which channels are driving the best ROI, so you know where best to allocate your investment.
Operations
We are experts in helping organisations use their data to optimise their business operations, including logistics, channel management and financial reporting.
Finance
We are experts in helping organisations use their data to optimise their finance processes, including automated financial reporting and revenue forecasting.
We’re on a mission to build a world-class organisation that helps businesses transform the way they manage and utilise data. Find out more about us, our people and how we’re planning on achieving our mission.
Looking to launch a career in data and analytics at a company that values learning, development and social connections? Check out our current vacancies today.
Ingrained into our everyday lives through technologies such as facial recognition, digital assistants and smart cars, the era of AI is well and truly upon us, and there are no signs of its substantial growth stagnating. In fact, the AI market size is projected to reach $407 billion by 2027, representing an annual growth rate of 37.3% from 2023 to 2030 [1].
Alongside this, businesses are recognising the potential of AI and increasingly leveraging it to streamline their operations, enhance data-driven decision making through data analysis, automate repetitive tasks and improve customer service. To put this in context, according to Gov.uk, almost half a million UK businesses had adopted at least one AI technology in their operations at the start of 2022 [2].
And yet, whilst the AI industry has continued to advance and adoption has increased, there has been little progress in mitigating AI-associated risks, despite growing concerns within organisations about the cybersecurity and regulatory compliance of artificial intelligence [3].
Now, don’t get us wrong: we’re not convinced we’re going to have an I, Robot situation on our hands any time soon. However, it cannot be denied that there are potential risks associated with the use of AI technology, and an urgent need for regulation to address these concerns.
This is where the Frontier Model Forum comes into play…
Introducing The Frontier Model Forum
The Frontier Model Forum (FMF) is a newly announced partnership aimed at promoting the responsible and safe development of AI models.
Formed by Microsoft, Google, OpenAI and Anthropic, this new industry body has set out to cover four core objectives:
Advancing AI safety research
Identifying best practices
Collaborating with policymakers, academics, civil society and companies
Supporting efforts to develop applications that can help meet society’s greatest challenges
Whilst these four tech giants have founded the FMF, they aim to establish an Advisory Board by inviting member organisations to contribute towards its strategy and priorities. Organisations that wish to join the forum will need to meet the following membership criteria:
Develop and deploy frontier models (large-scale ML models that are capable of performing an extensive range of tasks that go beyond what is currently possible with even the most advanced existing models)
Demonstrate strong commitment to frontier model safety
Are prepared to contribute towards advancing the FMF’s efforts
The Frontier Model Forum then aims to leverage the collective technical and operational knowledge of its member companies to benefit the overall AI ecosystem. This includes driving progress in technical evaluations and benchmarks, as well as creating a public repository of solutions to promote industry best practices and standards. Through these collaborative efforts, the Forum seeks to contribute to the advancement and development of the AI industry as a whole.
“Companies creating AI technology have a responsibility to ensure that it is safe, secure, and remains under human control. This initiative is a vital step to bring the tech sector together in advancing AI responsibly and tackling the challenges so that it benefits all of humanity.” Brad Smith, Vice Chair & President, Microsoft.
Our thoughts
From our perspective, AI presents a range of risks – job displacement, security and privacy concerns, bias and discrimination, to name a few. However, we believe the primary concern related to AI revolves around the absence of regulation and the lack of clear guidelines. This is why we consider the launch of the Frontier Model Forum to be a highly encouraging and indispensable development, one which will help to mitigate risks, establish industry-recognised standards and reduce potential negative social impact.
By bringing together experts and industry leaders, it will foster a collective effort to:
Reduce potential negative impact
Safeguard society’s interest
Ensure the responsible and ethical use of AI
The Frontier Model Forum has the potential to shape the future of AI in a way that minimises risks, enhances transparency, and creates a more secure and accountable environment for AI development and deployment, so we can continue to reap the benefits of AI and drive further progress in the field, all whilst effectively managing the associated risks.
At Ipsos Jarmany, we’re conscious of all the media hype around artificial intelligence (AI), and how the discourse has been mixed, to say the least. The idea that general-purpose AI will be the biggest event in human history may feel like hyperbole, and it’s probably too early to call. But what’s certain is that it’s going to change all our lives and is already transforming business.
In this blog, we want to touch on the opportunity that AI presents organisations, but more importantly we’ll get into what you need to do to make sure your business can leverage AI to the max.
What is the business opportunity of AI?
Just so we understand what we’re talking about here: AI will have made the world $15.7 trillion richer by 2030 [1]. It will also have given a boost of 26% or more to the GDP of local economies by the same date [2].
Those figures may actually be conservative, bearing in mind how quickly AI and its adoption are advancing. But regardless of how many trillions of dollars AI generates, there’s plenty for business to get excited about. Indeed, McKinsey found that, way back in 2021, 27% of the companies it spoke to in an AI-related survey said 5% or more of their profits were already down to AI.
The difference between generative and non-generative AI
So what do we mean by AI? There are actually two kinds. Generative AI produces new content, like chatbot responses, that imitates human creativity. Non-generative, or predictive, AI forecasts outcomes based on patterns in historical data.
It’s generative AI and ChatGPT from OpenAI in particular that’s been grabbing all the headlines recently, which is unsurprising since Microsoft pumped a massive $10 billion into the continued development of this natural language processing tool back in January.
In practice, generative AI will work alongside non-generative AI, at times in unison, to enhance outcomes. Right now, these two types of AI are revolutionising businesses, from sales & marketing departments to logistics and inventory, accounting & finance and human resources.
Whether it’s boosting efficiency by removing repetitive tasks like writing emails or summarising large documents, or improving supply chains by showing how much of anything should be stored where and when, AI is there to give your business an edge.
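To make that distinction concrete with a toy sketch of our own (the numbers and text below are invented, and nothing like a production model), a predictive model extrapolates from a historical pattern, while a generative one samples new content from learned patterns:

```python
import random

# Predictive: forecast the next value from a historical pattern
history = [100, 104, 108, 112]  # invented sales figures
step = (history[-1] - history[0]) / (len(history) - 1)
forecast = history[-1] + step
print(forecast)  # 116.0

# Generative: learn word-to-word transitions, then sample new text
corpus = "data drives decisions and data drives growth".split()
transitions = {}
for a, b in zip(corpus, corpus[1:]):
    transitions.setdefault(a, []).append(b)

random.seed(0)
word, sentence = "data", ["data"]
for _ in range(4):
    word = random.choice(transitions.get(word, ["data"]))
    sentence.append(word)
print(" ".join(sentence))  # a new phrase sampled from the learned patterns
```

The first half reproduces a known pattern; the second produces output that never appeared verbatim in its training text, which is the essence of the generative approach.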
How difficult is it to use AI in a business?
You won’t be surprised to learn that successful adoption of AI depends on how much effort you put in beforehand. There are plenty of obstacles to making AI work for a company, but for every issue there is a solution, and we’re going to walk you through the key ones now.
We recommend establishing an AI Framework for Success. Make it a checklist that you work through, learn and share with colleagues, so that everyone with a stake in making AI a success is aligned. Remember: AI adoption is a team game, and you don’t want anyone from across the company going off-piste.
The Ipsos Jarmany AI Framework for Success
We’re going to split the framework broadly in two. There are the structural parts that you have to get right, covering data, architectures, legal requirements and skillsets for example. Plus, there are the softer parts, which cover things like sensitivities and ethics.
AI Framework for Success—1st Phase:
Time to make sure you have the correct foundation for AI:
What’s your AI mission statement?
Sounds obvious, but you’d be shocked by the number of companies we’ve come across that launch into AI without a clear vision of the revolution’s ultimate goals. Get together, agree and write down what you want AI to achieve for the business. Decide what you want the main benefits to be—enhance user experience, improve topline revenue or reduce internal costs?
Check your data quality
You need to audit your current data sources to ensure you have enough data and that it’s in the right place, clean enough and essentially fit for purpose. It’s worth spending a moment on this because you also need to consider how accessible your data is. Your systems data needs to be able to flow freely in order for AI to work. The last thing you need is data silos.
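As an illustrative sketch of what that audit can look like in practice (the field names and sample rows here are our own invented examples, not a prescribed standard), even a few lines of code counting missing values and duplicate keys will surface the obvious quality problems:

```python
import csv
import io

def audit_records(rows, key_field):
    """Count missing values per field and duplicate keys in a list of dicts."""
    missing = {}
    seen, duplicates = set(), 0
    for row in rows:
        for field, value in row.items():
            if value in ("", None):
                missing[field] = missing.get(field, 0) + 1
        key = row[key_field]
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"missing": missing, "duplicates": duplicates, "total": len(rows)}

# A tiny in-memory "export" standing in for a real data source
raw = "order_id,customer,amount\n1,Ada,100\n2,,250\n2,Grace,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))
report = audit_records(rows, key_field="order_id")
print(report)  # one missing 'customer' value, one duplicate order_id
```

Running checks like this per source, before any modelling starts, is what turns “is our data fit for purpose?” from a hunch into a number.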
Do you have enough performance?
Along with your data, you need to audit your infrastructure to find out whether you have the basic computing capabilities to process large amounts of data for AI. Sure, the availability of AI services on public clouds like Azure offering massive amounts of compute and storage can help you here but see what you have in-house before you take that step.
Who is on the AI team?
We all know how labour shortages are hurting IT at the moment, so you need to count the number of hands you have available for your AI taskforce. If you’re short, then we recommend training for those who want to join up and, for the longer term, thinking about bringing in AI specialists.
AI Framework for Success—2nd Phase:
You’ve put a check against everything structural, so now it’s time to move into the second, softer phase, which is just as important.
Data governance, ethics and bias?
Governance is going to need some thought because to train AI algorithms, for example, you need large quantities of data, making storage and security of major importance.
Racial and gender biases are also a known problem with AI unless work is done to iron out discriminatory assumptions in algorithms, often associated with low-quality data. Set down standards that will help control the problem, and check out the UK Government’s white paper on its approach to AI regulation and the EU’s AI Act for guidance.
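One simple first-pass bias check, sketched here with invented data rather than any prescribed standard, is to compare a model’s positive-outcome rate across groups; a large gap flags a potential problem worth investigating before the model goes anywhere near production:

```python
def outcome_rates(records):
    """Positive-outcome rate per group -- a first-pass disparity check."""
    counts = {}
    for group, positive in records:
        total, pos = counts.get(group, (0, 0))
        counts[group] = (total + 1, pos + int(positive))
    return {g: pos / total for g, (total, pos) in counts.items()}

# Invented model decisions: (group, was the outcome positive?)
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]
rates = outcome_rates(records)
print(rates)  # group A approx 0.67, group B approx 0.33: a gap worth investigating
```

A disparity like this doesn’t prove discrimination on its own, but it tells you exactly where to start digging into the training data and the algorithm’s assumptions.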
Deal with employee concerns
Your personnel will have legitimate worries over how AI is going to impact them. The question of whether they will lose their jobs is the elephant in the room that you’ll need to address first and foremost. You need to correct many of the negative assumptions about AI and communicate the benefits, reinforcing that it will free them from mundane, repetitive and manual tasks so they can work on more interesting stuff.
Walk before you run
Everyone comes to AI nowadays with preconceived ideas – and it’s most likely that internal stakeholders will have massive expectations for AI in general. After all, they read the news, right? While it’s great to have high-level interest in a project, you have to manage people’s expectations at the start.
Therefore, consider a proof of concept to test that your AI model is working before going big. Use just a small sample of data to demonstrate the model’s effectiveness to the people that really matter before launching anything wide scale across the business.
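Here’s a minimal sketch of that proof-of-concept mindset, using an invented sample of our own: hold back part of the data, fit the simplest possible baseline, and measure the error that any “real” model would have to beat before going wide:

```python
from statistics import mean

def evaluate_baseline(series, holdout=3):
    """Score the simplest possible forecaster (the training mean)
    on a held-out tail of the series."""
    train, test = series[:-holdout], series[-holdout:]
    forecast = mean(train)
    mae = mean(abs(actual - forecast) for actual in test)
    return forecast, mae

# Invented weekly sales figures for the proof of concept
weekly_sales = [120, 130, 125, 128, 131, 127, 129, 126]
forecast, mae = evaluate_baseline(weekly_sales)
print(f"baseline forecast={forecast:.1f}, MAE={mae:.1f}")
# A candidate AI model should beat this baseline error on the
# same holdout before being rolled out across the business.
```

The point isn’t the forecaster itself; it’s that a small, measurable demonstration gives the stakeholders who matter something concrete to judge before you scale up.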
Summing Up
With so much excitement around AI – and its transformative power for business – we could forgive anyone for not wanting to hold things up with questions like ‘Are we AI ready?’, because quite frankly that’s incredibly boring, and who wants to be a killjoy?
But asking that question and following a framework like the one we’ve shared is incredibly rewarding in the long term and is the best way to get the most out of your AI investment.
Still, even with your AI Framework for Success, the time and expertise needed to get everything lined up can be a challenge; and so, at Ipsos Jarmany, we’ve created a team of AI specialists that can deliver AI in the most time-effective and cost-efficient way possible.
If this blog has triggered some questions, thoughts or ideas, speak to us today and let’s see how we can get your business on the path to a best-practice adoption of AI.
Microsoft, a leader in the technology industry, recently announced the launch of Microsoft Fabric, a comprehensive analytics solution that promises to revolutionise the way businesses store, manage and analyse their data; in turn, streamlining their data processes so businesses can extract timely and valuable insights, more efficiently.
In this blog, we will take a closer look at Microsoft Fabric and explore its features and benefits as well as discussing our thoughts. So, whether you’re a data scientist, analyst, or business leader, we’re here to demonstrate how it can help you unlock the full potential of your data.
So, let’s get to it.
What is Microsoft Fabric?
Microsoft Fabric is a comprehensive, all-in-one data analytics solution that encompasses a whole suite of data services, including data engineering & transformation, data science, real-time analytics, and business intelligence. It brings together the suite of existing products within the Microsoft stack, such as Data Factory, Power BI, and Synapse, to deliver a seamlessly unified experience that serves your end-to-end analytical needs.
By integrating a variety of different data services, Fabric offers a simplified user experience which can be customised based on each business’s needs, eliminating the need for multiple vendors. It also enables businesses to centralise their admin and governance whilst providing users with a familiar and easy-to-learn experience.
What Are The Key Features?
#1 Data Lake
One of the key features of Microsoft Fabric is its data lake, also known as OneLake.
OneLake provides a centralised repository for all enterprise data and is the foundation of all services available on Fabric. By providing a unified storage solution, data scientists and analysts can more easily access and analyse data from various sources, including structured, semi-structured, and unstructured data.
Microsoft Fabric’s data lake is designed to handle massive amounts of data, making it an ideal solution for businesses with large volumes of data, whilst also simplifying the management of big data.
#2 Data Engineering
Another important feature of Microsoft Fabric is its data engineering capabilities. With Microsoft Fabric, businesses can design, build and maintain infrastructures, allowing them to more easily transform and process their data, in turn making it easier to analyse and derive insights.
Additionally, Microsoft Fabric provides a range of other data engineering capabilities, including:
Creating and managing data lakehouses
Designing data pipelines that feed into your lakehouse
Using notebooks to write code for data ingestion, preparation and transformation
All in all, these engineering capabilities allow businesses to better prepare their data for analysis.
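To give a feel for the shape of such a transformation step, here is a plain-Python sketch with invented field names (this is not Fabric’s actual notebook API): ingest raw data, drop incomplete rows rather than guess, and aggregate into something analysis-ready:

```python
import csv
import io
from collections import defaultdict

def transform(raw_csv):
    """Ingest raw CSV, drop incomplete rows, cast types, and
    aggregate revenue per region -- the classic prepare step."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row["region"] or not row["revenue"]:
            continue  # skip incomplete records instead of imputing
        totals[row["region"]] += float(row["revenue"])
    return dict(totals)

# Invented raw extract, including one incomplete row
raw = "region,revenue\nEMEA,100.0\nAPAC,\nEMEA,50.5\nAMER,75.0\n"
print(transform(raw))  # {'EMEA': 150.5, 'AMER': 75.0}
```

In a Fabric notebook the ingestion and storage would be handled by the platform’s own tooling, but the prepare-then-aggregate pattern is the same.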
#3 Business Intelligence
Microsoft is already widely known for their popular business intelligence and data visualisation tool, Power BI, so it will come as no surprise that real-time analytics and BI has been incorporated into the features of Fabric.
This capability enables users to:
Monitor and analyse data in real-time
Build interactive dashboards
Manage ad hoc reporting
Implement predictive analytics
And much more.
This feature helps businesses to gain real-time valuable insights into their operations so they can make more informed decisions and can respond quickly to changes in the market.
#4 Copilot and Data Activator
Another exciting feature of Microsoft Fabric is the integration of the newly announced Copilot and Data Activator.
Copilot is Microsoft’s new artificial intelligence tool that can aid productivity by automating repetitive tasks, writing code, creating visualisations, summarising insights, and much more.
Data Activator is a no-code tool for analysing data and then automating alerts & actions off the back of those insights. This could include notifying sales managers when inventory dips below a certain threshold, alerting finance teams when a customer is in arrears with their payments, or automatically creating support tickets if an error is triggered.
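Data Activator itself is no-code, but the underlying pattern is easy to see in a few lines. This is purely our own illustrative sketch of threshold-driven alerting, not Data Activator’s implementation; the rules and readings are invented:

```python
def check_alerts(readings, rules):
    """Evaluate each metric reading against its rule and collect alerts."""
    alerts = []
    for metric, value in readings.items():
        rule = rules.get(metric)
        if rule and rule["trigger"](value):
            alerts.append(f"{metric}: {rule['message']} (value={value})")
    return alerts

# Invented rules mirroring the scenarios above
rules = {
    "inventory": {"trigger": lambda v: v < 50, "message": "reorder stock"},
    "days_overdue": {"trigger": lambda v: v > 30, "message": "customer in arrears"},
}
print(check_alerts({"inventory": 42, "days_overdue": 12}, rules))
# ['inventory: reorder stock (value=42)']
```

The value of a tool like Data Activator is that business users can configure rules of this kind, and the downstream actions they trigger, without writing the code themselves.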
Our Thoughts on Microsoft Fabric
Now that we’ve explored some of the key features of Microsoft Fabric, we’re going to give you the run-down of what we think of this new unified platform.
The Benefits
One interface to access all components of Fabric
Existing knowledge of Microsoft products can be utilised
Strong, centralised governance of data access
Git integration for robust source control
Simplified billing
Whilst we’re big fans of the Microsoft technology stack, we won’t deny that there are a few contrasting elements that need ironing out before Microsoft Fabric has our full backing.
Firstly, the application has a few bugs which impact the user experience – no doubt due to the sheer number of integrated services and level of capacity, but something we imagine will be resolved as uptake increases and it’s phased out of preview.
Whilst the promise of exciting AI features is enticing, many of these features are not yet available, which is a little disappointing given the current AI hype and the market’s eagerness to leverage these types of tools.
Lastly, stand-alone Microsoft Fabric is currently only available on a pay-as-you-go basis, making it a more expensive, and therefore less feasible, option for businesses that are more price sensitive. Later this year, ‘reserved capacity’ SKUs are due, which will bring down the cost of dedicated compute resources.
Get In Contact
Overall, Microsoft Fabric is a great unified analytics solution if you’re looking for a system that offers a suite of services for data processing, analysis, and visualisation, all in one place. And, with features like Copilot, Data Activator and the integration of Power BI, there’s no doubt it will make it much easier, and more streamlined, for businesses to extract valuable insights from their data. Microsoft Fabric is certainly something we’ll be keeping our eye on as it’s phased out of preview and more readily available.
If you’d like to find out more about Microsoft Fabric, or how you can leverage other Microsoft products to advance your data capabilities, then get in touch with the team today.
The name data platform couldn’t be more mundane, but it would be a mistake to judge this technology by what it’s called. Ingesting, processing, analysing and presenting huge quantities of information—data platforms are turning around the fortunes of many organisations today and helping them thrive in some pretty tough markets.
In this blog, we’re going to get into which cloud is best for your data platform. We’re not going to debate whether cloud is your best option, because quite frankly we’re discounting an on-premises infrastructure from the start.
What we’re going to do is help you figure out which of the Big 3—Amazon Web Services (AWS), Google Cloud Platform (GCP) and Azure—is right for your data platform. And even give you an alternative to boot if none of the three cuts it.
Let’s get started.
What are the advantages and disadvantages of the Big 3 Clouds?
So what are the pros and cons of AWS, GCP and Azure? Before we answer that let’s make a couple of things clear. If you approach that question by going through the Big 3 service-by-service, you’re wasting your time.
It’s a mistake because by focusing on each cloud’s services capabilities, you’re missing the bigger picture and may end up having to back-track and rethink your original choice further down the line. You’ll see why later.
The Big 3 Defined
Amazon Web Services (AWS)
Part of Amazon, AWS has more than one million active users and offers more than 200 fully featured cloud services. It accounts for 41.5% of the cloud market and has 5x more cloud infrastructure deployed than its 14 leading competitors combined. In people’s minds, it stands out for AI and ML services; Azure might wonder where that idea comes from, but there really isn’t a cloud that does better in these areas than AWS.
Google Cloud Platform (GCP)
GCP is the smallest of the Big 3, with 9% cloud market share. Despite being the smallest, its revenue growth is healthy and has consistently been up to 45% per annum. In addition, its global network is one of the biggest. You get seamless integration with all Google products, and it packs a fully managed data warehouse, called BigQuery, which is highly rated and could be a central part of your data platform.
Microsoft Azure
If we renamed Azure ‘the Microsoft Cloud’, you’d get an instant feel for what we’re talking about here: it’s Microsoft’s own public cloud offering, and it’s growing fast. It’s crucially important to Microsoft, delivering revenue of $28.5 billion – up by 22% – in the company’s third-quarter results, released in April 2023. It offers everything a data platform could need and is well known for being simple to work with.
How do I distinguish between the Big 3?
Had we created this blog 8 years ago, you would have seen the word maturity dotted around in a number of places. Back then, people spoke about some of these clouds being more mature than others; and hence offering a broader range of services to meet a company’s specific needs.
Maturity is no longer relevant, and unless your business is very, very niche, trying to separate the Big 3 on their service offerings isn’t worth it.
When it comes to compute power, data storage options, networking, security and compliance, all of the Big 3 have what you want. They all offer tonnes of services—many of which you’ll probably never need.
Location, however, could be an issue. Depending on your industry, you’ll need to comply with a host of regulatory standards around cloud usage, one of which is where your data is situated.
That may sound odd because we’re talking about global cloud providers and thus your data will be everywhere, right? Correct, but while access is ubiquitous, your data will be stored on physical devices somewhere out there—and it’s where those devices sit that counts.
Hence, you need to check where the AWS, GCP or Azure data centre is located that will be storing your data and then you’ll know if that cloud is the one for you. The good news is that all the Big 3 are really up on the regulatory needs of multiple industries, including public sector, and they have teams that can provide you with all the information you need to know if you’ll be on the right side of your industry’s watchdogs.
The Big 3’s key points of difference
There is a way to think about AWS, GCP and Azure so you can start to draw lines between them. Sure, these are going to be very broad statements, but they are no less true for being light on detail:
AWS – the best place to build and run open-source software.
GCP – a great choice if you’re already using solutions within the Google Stack.
Azure – integrates seamlessly with your existing Microsoft technology.
Perhaps that’s all you really need to know. Maybe you can stop reading here. What’s certain is that these points are going to have a bearing when we get more into the details.
The Pros and Cons of AWS, GCP and Azure
With our broad brushstrokes in place, we now can start focusing the discussion a bit more on the advantages and disadvantages. We’ll show you how to properly evaluate each cloud, based on the premise that they all have the infrastructure, compute, storage and networking etc, you need.
Legacy Investment – this is such a crucial point—and so often overlooked—because if you’re heavily invested in Microsoft or Google, it makes no sense whatsoever not to leverage all that legacy.
Skillsets – this builds on the previous bullet, because if, for example, you already have Microsoft skills in-house, then adopting and working with a cloud like Azure is going to be much easier and less costly in terms of training. Of course, the same argument can apply to AWS and open source. Therefore, you need to audit what skills you have internally as part of the decision-making process.
Community – a reflection of their size, both AWS and Azure have much larger online communities than GCP. These communities provide advice and resources to resolve challenges and boost developers’ skillsets. The Azure Community, for example, has approximately 182,000 members, and Microsoft employees regularly participate in its online forums.
Politics – no we’re not joking; politics does play a role in any cloud decision. It doesn’t always happen, but we often see senior managers having an emotional connection with certain platforms, often Azure, since their experience of Microsoft goes back years. So which way does the wind blow in your company? AWS, GCP, Azure? What’s your sense?
Are AWS, GCP and Azure my only options?
We focused our blog on the Big 3 because they are the ones the vast majority of businesses choose from. Nevertheless, they aren’t your only options.
Ask your IT team about a Modern Data Stack as an alternative to the Big 3 and see what members say. A Modern Data Stack is an assembly of software tools and technologies running across different cloud platforms to collect, process, store, and analyse data.
To be honest, the idea has been around for more than a decade and it’s often used for niche cloud projects; however, a Modern Data Stack comes with a sense of freedom. What we mean by that is you get the independence to run a particular workload on a particular cloud. Your IT team chooses whichever one is best suited to the job you want to do.
Parting thoughts
On balance, and based on our experience, we think you have to go a long way to beat Azure. It fits so well with legacy Microsoft infrastructures. There’s nothing that AWS and GCP pack that Azure doesn’t, unless it’s for something niche that probably wouldn’t be relevant to your business anyway.
Indeed, Azure carries Microsoft’s DNA, which makes it easy to learn and intuitive. There’s generally less coding required. What’s more, the whole community thing continues to grow so the support is out there if you need it, both in terms of gazillions of documents and online forums.
Boiled down to just three things, Azure is great on price, ease of use and ease of integration. Not bad really.
We hope this blog proves useful in helping you choose the right cloud for your platform. That said, our team of consultants at Ipsos Jarmany is available to continue the conversation and give you a deeper insight into the Big 3 and how to find the cloud that is best for your business.
Talk to us today and have an honest conversation about how to select the right cloud for your data platform.
It may come as no surprise that there was a particular focus on generative AI, ChatGPT, and leveraging OpenAI’s capabilities, with Microsoft aiming to enhance its offerings and maintain its market-leading position. However, these developments raise concerns about a potential single-source dependency, prompting speculation about a possible Microsoft acquisition of OpenAI.
In this blog post, we will delve into Microsoft’s 5 key announcements.
#1 Copilot: Microsoft’s Generative AI Assistant
Microsoft unveiled Copilot, an innovative feature that incorporates generative AI technology into its core operating systems and Office 365 products. Copilot acts as an assistant within Office apps and also resides as a taskbar button, assisting users with various tasks on their PCs. While the demos were impressive, it will be interesting to see how this performs in the real world, and whether it will be widely accepted and utilised or become the next generation of ‘Clippy’ for the AI era.
#2 Bing & ChatGPT: Augmenting Knowledge With Bing Search
ChatGPT’s main limitation lies in its knowledge being restricted to information before September 2021. To address this issue, Microsoft plans to integrate Bing Search with ChatGPT, allowing the search results from Bing to supplement ChatGPT’s responses and keep it up to date. Additionally, Microsoft aims to ensure interoperability between ChatGPT plugins and Bing, enabling integration of the results. Although similar to Google’s approach with Bard, the vast user base of ChatGPT suggests the potential for a significant increase in Bing Search usage.
#3 Azure AI Studio: Building Custom Models and Ensuring Safety
Microsoft introduced Azure AI Studio, a platform that empowers developers to build their own models and create functionalities on top of them. This initiative also emphasises the importance of AI safety, allowing developers to test applications and mitigate any potential issues that may arise.
#4 Microsoft Fabric: A Complete End-To-End Analysis Platform
Microsoft Fabric, a direct competitor to Snowflake, offers a comprehensive solution for data engineering, storage, warehousing and analytics. Fabric introduces OneLake (a centralised, simplified storage service), Data Activator (a system for building complex, data-driven alerts) and the integration of Copilot into Power BI to help build eye-catching reports from natural language prompts.
#5 Single-Source Dependency and the Potential Acquisition of OpenAI
Microsoft’s commitment to infusing generative AI across its product range is a strategic move aimed at reclaiming market share from Google in productivity and search. However, this strategy also poses a significant risk—a single-source dependency on OpenAI. If OpenAI were to cease supplying Microsoft with technology, it could impact the company’s core business and profitability, leading to a potential decline in its share price. Consequently, acquiring OpenAI becomes a critical consideration for Microsoft to mitigate this risk.
Overall, it’s clear that Microsoft continues to make strides in AI, with the integration of OpenAI’s technology into its products standing testament to this and demonstrating the value Microsoft is placing on this type of technology. With millions of users every day across its suite of products and services, this focus on AI holds promise for enhanced functionality and an improved user experience, and we can’t wait to see it evolve further.
Many companies are sitting on mountains of data and information, but few are extracting the gold that lies within it, which we think is crazy. In this blog, we’re going to show you how you can maximise its benefit to allow your business to thrive.
You’ll learn that every successful data-led organisation is built on an effective data strategy. We’ll explain:
What a data strategy really is
The benefits of having a data strategy
Why you really should have a data strategy
Ipsos Jarmany’s 5 steps to building an effective data strategy
Let’s get to it.
What actually is a data strategy?
A data strategy is basically a plan that, if implemented properly, will allow you, as a business, to leverage the power of all the data and information at your disposal quickly and effectively. This power then allows the business to make the most informed decisions possible and act quickly to maximise commercial performance.
Sounds simple, but the difference between a great data strategy and a poor one can have a massive impact on your business. Research shows that businesses with a strong data strategy can perform over 2.5x better than those with a poor data strategy.1
What are the benefits of having a data strategy, and why should you have one?
You might say to yourself: “I already have loads of data so surely I just need to take a quick look at it and it will give me the answers I need to run my business” …if only life was that simple!
When we start working with our clients, we often see that they are facing a variety of challenges, including:
Incomplete and untrustworthy data which results in more arguments than insights
Inadequate data cleansing compounding already questionable data
Inefficient data management processes slowing down their speed of decision making
Insufficient use of available 3rd-party data that would give colour and relevance to their internal 1st-party data
An over-reliance on human beings, rather than technology and AI, to do relatively simple and mundane tasks. (A machine will never get bored of these tasks, will often do them better, and will be quicker, with far fewer mistakes or human error).
Once you have your data sorted so it’s clean, accurate, timely and in a format where you can readily understand and interpret it, you need to ask yourself what’s next and how can you use this information?
You’ll be surprised by how many instances there are where good data and insights can help turbo charge your business. Below is a small subset of the main areas where a data-driven business can drive a massive commercial advantage:
Increased Sales – a cohesive data strategy can help you identify opportunities to optimise marketing efforts. Businesses that strategically use data to inform business decisions can outperform their peers in sales growth by 85%.2
Increased Profits – this can be achieved by streamlining operational logistics and through cost analysis. According to the Business Application Research Center (BARC), data-driven sales reduced the overall cost of operations by 10%.3
Greater client satisfaction – Businesses that personalise the customer experience using data can increase the customer lifetime value by 2.5x on average.4
Decreased Risk – this can be achieved through better management of regulatory requirements and data breaches. According to IBM the average cost of a data breach in 2022 was $4.35 million and 83% of organisations reported more than one breach.5
Ipsos Jarmany’s 5 steps to building an effective data strategy
A data strategy is essentially a plan that allows you to quickly and effectively leverage the power of all the data and information available to you as a business. In turn, this allows you to make the best business decisions to drive growth and operational efficiencies.
We’ve consolidated the core steps you need to take to help you define your data strategy:
1. Define the questions that need to be answered to allow the company to meet its strategic objectives and respond to tactical challenges. This could be based on goals relating to revenue growth, increased profit, market share growth or cost reduction.
2. Define the gaps between what you have today and where you want to get to. In particular, you need to consider the following 4 areas:
Data – Do you even have all the raw data you need? Are you set-up to collect the data from your business operations required to make the right decisions? Are you maximising the benefit of 3rd party data sets that are available to you? Do you have the right quality, breadth and depth in your data?
Technology – What data technology do you already have in your tech stack? Does it have the functionality to complete the tasks required by your business? Are you restrained in your options by significant previous investments in certain tech stacks (Azure, GCP, AWS)? Finally, are you making the most of the recent advances in technology, in particular AI? (Whilst this last question is key to consider, you must always remember to have the enablers of AI in place, such as good data and a clear strategic need, to really leverage its true power).
Internal Capability – Do you have the right people with the right skills to enable you to leverage your investment in data and technology so you can transform that data into valuable information?
Culture – All of the above points are redundant if you don’t have an organisational culture that is programmed to accept that data needs to be an intrinsic part of the decision support structure. Ask yourself if you have buy-in from the right stakeholders and how you can embed a greater level of acceptance and interest towards data and data-driven insights from your organisation.
3. Define the plan – Once you have defined the objectives that need to be met and the current gaps you face it is important to create the plan to address them. Below are the key factors every good plan needs to contain:
Incremental wins – Better data and insights can start driving benefits to your business almost instantly. Therefore, no data strategy should wait until the transformation is 100% complete before launching. That could mean months of missed opportunity and eventually result in a flop. At Ipsos Jarmany we think a staged delivery focus is best. We usually advise 3-month milestones to deliver specific commercial advantages that build on themselves over time. This means you start getting a return on your investment sooner, and it also allows you to flex the strategy slightly over time if the needs of the business change. This approach significantly reduces the chances of the business ending up with a BI white elephant that isn’t fit for purpose.
Leverage previous investments as much as possible – Don’t reinvent the wheel or spend time and money in areas where you don’t need to, unless it results in greater commercial benefit. (New and shiny isn’t always best).
Spend money wisely – Technology, especially AI, is rapidly advancing so investing in the right tech could provide significant commercial advantages to your organisation. However, as always make sure the fundamentals are in place first. (Sometimes new and shiny is the right way forward).
Don’t neglect your people – Bring them on the journey and remind them of the benefits to them. It’s a support function, not a threat; training can create your citizen data analysts.
4. Review progress – It’s important to constantly monitor the progress of implementing a data strategy. We always advise sticking to the 3-month cadence mentioned above so you can work in shorter-term sprints, ensure everything is on track, and tweak the strategy when necessary.
5. Repeat the above – The needs of any business change over time, especially if it is going through a period of transformational change. Therefore, whilst we talk about working in 3-month sprints, we believe that any data strategy should go through a deep review every 2-3 years. This gives you time to implement a strategy but isn’t so long that the plan becomes irrelevant and falls out of line with the changing needs and focus of the business.
What’s next?
So, there you go—a successful data strategy framework in five steps, as promised.
We don’t mind confessing to you that negotiating each step can be tricky if you don’t have enough experience and expertise at your disposal. Therefore, the wisest move can often be to work with experts who create data strategies for a living.
At Ipsos Jarmany, we have the talent to support you in building and implementing a successful data strategy. We’ll help deliver your strategy, and collect and structure your data so it can be analysed and modelled to answer your business questions and deliver your business objectives as quickly and as cost-effectively as possible.
Talk to us today and have an honest conversation about how to get your data strategy moving.
Identifying trends and patterns from raw data is hard and has nothing to do with a person’s intelligence. But spotting those signs in shapes and colours is much easier and can be achieved surprisingly quickly.
Therefore, the rise of data visualisation tools as part of the broader business intelligence (BI) world is no surprise. These tools not only speed up decision-making processes but also improve the decisions themselves, helping viewers interpret data more accurately.
All this brings us to Microsoft Power BI – the most complete data visualisation technology in the market, according to the Gartner Magic Quadrant for BI and Analytics Platforms 2023 – and something that millions of people are using every day to extract insights from within their data. Let us walk you through it.
What is Power BI?
Microsoft Power BI aggregates your data and then represents it visually for you to analyse and share. Forrester calls it Microsoft’s augmented business intelligence platform, infused with the power of AI (which we’ll get to later on).
In essence, Power BI is a collection of software services, apps and connectors. What that means is you can connect data from multiple sources across your business, including Excel spreadsheets, visualise it in a dashboard or a report, share with colleagues and uncover what’s important to you in no time.
Some common types of data visualisation:
Bar and column charts
Doughnut charts
Decomposition tree
Funnel charts
Gauge charts
KPIs
What makes Power BI different from other BI solutions with data visualisation tools? Ask a senior consultant who works with Power BI and has experience of other solutions and you’ll hear words such as more intuitive, adaptable, unified and interactive.
The truth is that because it’s Microsoft, Power BI has a look-and-feel that many of you will recognise and like. If you use Excel then making the step up to Power BI will feel like a natural development.
How Much Does Power BI Cost?
The solution comprises 3 basic elements:
Power BI Desktop – a Windows desktop application.
Power BI Service – a software-as-a-service offering.
Power BI Mobile – apps for Windows, iOS and Android devices.
In terms of licensing:
Power BI Desktop is free.
Power BI Pro is £8.20 per user/month
Power BI Premium Per User (PPU) is £16.40 per user/month
Power BI Premium is £4,105.60 per capacity/month
You can find out more about the differences between each package here.
Why Is Power BI Popular?
It’s unlikely you’ll find any area of your operations that Power BI won’t support; hence you’ll see Power BI providing insights to teams across:
Finance
HR
Production
Planning
Warehouse
Supply chain
Logistics
Sales
Marketing
It’s also true that new Power BI use cases will occur as the solution gets more tightly woven into your operations. Soon enough you’ll be building reports and dashboards delivering niche views on everything from expenses to specific project plans and progress on individual targets.
Power BI reports tend to feature historic data sets, delivering a snapshot of your organisation over a set period rather than just in real-time. Nevertheless, your Power BI reports can aggregate and visualise data on key parts of your operation in just the same way as your Power BI dashboards, from Finance to HR and Customer Profitability to Ecommerce sales.
Power BI dashboards organise and visualise your data in real-time. You can create alerts when figures change and hit a chosen threshold. Here are a couple of dashboard examples:
Ecommerce – You can see how your online sales channels are performing day-to-day to gain a deeper understanding of how your products are performing. Insights could include: sales by category, most returned product and reasons for returns and sales over specific periods.
Marketing – You can visualise the effectiveness of your campaigns and the performance of segments and channels. For example, marketing spend by products, channel performance and campaign success rates.
Once you’re creating your reports and dashboards, you can start using some of the value-adding features in Power BI to distribute your insights and isolate the data that’s most important to your company.
Power BI Apps – allows you to bundle your reports, dashboards, spreadsheets and datasets and distribute them to individuals or large groups across your organisation in one go.
Power BI Metrics – with Metrics, you can publish the performance metrics that are most important to your business in a single pane within Power BI. The main idea here is that Metrics promotes accountability, alignment and visibility for your teams.
How To Become A Power BI Expert
Power BI is promoted as a self-service tool, meaning people with little or no technical background can become data heroes in just a short while.
Because it shares so much with Microsoft Excel, many people will get a head start on learning the basics, and the drag-and-drop functionality simplifies the process of connecting multiple data sources.
As you’d expect, Microsoft also offers plenty of Power BI training, with online workshops, documentation, and sample dashboards and reports.
At some point, you should think about learning DAX (Data Analysis Expressions), developed by Microsoft for platforms such as Power BI. It’s been referred to as Excel formulas on steroids and is crucial if you want to get the full value of Power BI, helping you create new information from data that is already in your model.
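To give a feel for what a DAX measure does, here is a rough analogy in Python rather than DAX, purely for illustration (the sales table and figures are hypothetical): a measure derives a new number, such as a margin percentage, from columns already in your model.

```python
# A hypothetical sales table, standing in for a table in your Power BI model.
sales = [
    {"product": "A", "revenue": 1200.0, "cost": 800.0},
    {"product": "B", "revenue": 950.0, "cost": 700.0},
    {"product": "C", "revenue": 430.0, "cost": 410.0},
]

# A DAX measure along the lines of
#   Margin % = DIVIDE(SUM(revenue) - SUM(cost), SUM(revenue))
# creates new information from existing columns; the Python equivalent:
total_revenue = sum(row["revenue"] for row in sales)
total_cost = sum(row["cost"] for row in sales)
margin_pct = (total_revenue - total_cost) / total_revenue

print(f"Margin %: {margin_pct:.1%}")
```

In Power BI the same measure would then re-evaluate automatically under whatever filters a report applies, which is where DAX earns its "Excel formulas on steroids" reputation.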
“If you’re familiar with Office 365, you’re going to be able to pick up Power BI quite quickly.”
Building Power BI Dashboards And Reports
You can create visualisations (referred to as visuals) in reports using visual types directly from the visualisation pane in Power BI. Furthermore, there is a growing number of pre-packaged custom visuals available through third parties that might be enough for what you need.
You simply download the custom visuals into your Power BI system and off you go.
Common sense will tell you to be wary of downloading anything unless it’s from a trusted source, in which case you’re better off using custom visuals that have been certified by Microsoft. There are many available in the Microsoft AppSource community site.
To cut down on the effort to extract useful data insights, Power BI has added its own AI Insights feature, which covers Text Analytics, Vision and Azure Machine Learning. It gives you access to a collection of pre-trained learning models that enhance your data preparation efforts. Using this capability, which requires Power BI Premium, you can enrich your data and gain a clearer view of data patterns.
Avoid Common Mistakes In Power BI
As you’d expect, there are best practices that you should follow to extract the full potential of Power BI for your organisation. Here are some top ones:
Spend a bit of time thinking carefully about what your dashboard or report is for.
When starting out, avoid introducing too much data because it can slow down the performance of your dashboard.
Remember you want your data visual to be used by colleagues, so think of them and don’t over complicate the report, making the information difficult to digest.
Top 5 Power BI Tips
Now you know some of the common mistakes, we’ll leave you with some top tips as shared by our own Power BI experts:
Have a clear purpose in mind – there are so many data visualisation possibilities, so be certain of what you’re trying to say and who you’re trying to say it to.
Keep your visualisations simple – it’s worth reviewing your data visual multiple times as it evolves, asking yourself: Can I make it clearer or can anything be removed?
Do some proper benchmarking comparisons – your data also needs context so include benchmarking to show performance against a set of standards.
Annotate your reports using Tooltips and buttons – both provide additional information on visuals, such as contextual data or, in the case of buttons, making them more interactive.
Do a training course – Power BI may be aimed at non-technical people, but there is so much to it and it’s such a powerful tool that to get the most out of this technology it’s definitely worth getting some formal guidance.
Speed Up Your Power BI Development
With all this information, we hope it’s clearer what Power BI is and how it can help your organisation speed up and improve the effectiveness of its decision-making. We also hope you’ve got a sense of why Power BI is a leader in the data visualisation market and why, with continued development such as the integration of AI, that position isn’t likely to change any time soon.
What’s also true, however, is that without the internal experience and expertise of Power BI to hand, you’re going to need to invest time and money in developing those skillsets; and that partnering with an organisation that can plug those skills straight into your operation may be more time and cost effective.
At Ipsos Jarmany, we’ve built a first-class team of Power BI consultants that can help your business harness the power of data effectively. Whether you are looking for a fully outsourced team or support for your in-house team, we can provide you with seamless expertise at a competitive cost.
If you’d like to know more about how Ipsos Jarmany could help you maximise the value of Power BI to drive smarter decision-making across your company, contact us today.
What is Econometric Modelling? And What Are The Benefits?
Econometric modelling uses statistical analysis to discover how changes in activities are likely to affect sales and turnover, so you can predict future impact and make better-informed decisions. Most typically, it’s used in marketing to provide valuable insights into how well a campaign or marketing activity may perform and the factors that will drive the most ROI. For example, you may be thinking about launching a new promotional campaign, a sales discount or a loyalty scheme. Econometric modelling will help you to:
Understand how different variables, like price and distribution channels, will impact your performance
Determine the optimal allocation of resources across your different marketing activities
Forecast your future demand
Identify different customer segments and their responsiveness to marketing activities
Evaluate market conditions and competitive factors that may impact consumer behaviour
And much more.
For businesses, complex econometric models can help to answer questions about what really drives a company’s main KPIs, such as volume, value, market share and gross margin. After all, few companies really understand the external forces that affect their industries or their brands. As well as helping you to answer these vital questions, econometric modelling can also help you to:
Save money
Drive better, faster results
Make data-informed decisions
Make your business more profitable
Marketing Mix Models; A Subset Of Econometric Modelling
Marketing mix modelling is one way to use econometric methods; this type of model uses aggregated data to analyse all marketing inputs over time to arrive at an optimal allocation of resources. For example, what’s the correct amount to spend on television advertising compared to radio or the internet? Should a company invest money in more salespeople or in more advertising? What is the impact of promotional spending? At what point does diminishing return set in? With the right approach you can find the right answers. Marketing mix models have been used historically but were phased out with the rise of individual tracking. However, privacy changes, like Google’s Privacy Sandbox and the phasing out of third-party cookies, have reduced the ability for businesses to use individual tracking, which in turn has led to the return of the marketing mix model.
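The "point of diminishing return" question above can be made concrete with a toy response curve. A common simplification in marketing mix work is that sales respond to spend roughly logarithmically, so each extra pound buys a little less than the last; the numbers below are invented purely for illustration, not a fitted model.

```python
import math

def sales_response(spend, base=1000.0, lift=300.0):
    # Toy diminishing-returns curve: sales grow with the log of spend,
    # so the marginal return of each extra pound keeps shrinking.
    return base + lift * math.log1p(spend)

# Marginal return of the next £1,000 at increasing spend levels.
for spend in (1_000, 10_000, 100_000):
    marginal = sales_response(spend + 1_000) - sales_response(spend)
    print(f"at £{spend:>7,}: next £1,000 adds ~{marginal:.1f} units")
```

A real marketing mix model estimates curves like this from historical data for each channel, then allocates budget where the marginal return is still highest.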
Implementing Econometric Models
The first step to making econometric models, like marketing mix models, work is, of course, to have good data. At Ipsos Jarmany, we recommend having at least 3 years’ worth of data to input into the model. Limiting this to just 1 year, for example, would mean the model would be unable to identify any trends or patterns, and the output would simply match last year’s trends since there is only one reference point. Basically, the more data, the better. These are the steps you should follow:
Define all the parts of the marketing mix that might have an impact on sales.
Review the state of your existing marketing data on these activities and close the gaps where they exist.
Set-up ongoing processes to collect, clean and store the data; and develop the history that will help provide the patterns the model will identify.
Begin modelling.
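The "begin modelling" step can start small. As a minimal sketch, assuming synthetic data and a single marketing input, here is the kind of fit a full marketing mix model generalises: ordinary least squares, in closed form, relating weekly sales to ad spend and then forecasting a planned week.

```python
import random

random.seed(42)

# Synthetic history: 156 weeks (~3 years) of ad spend (£k) and noisy sales.
spend = [random.uniform(5, 50) for _ in range(156)]
sales = [200 + 4.0 * s + random.gauss(0, 10) for s in spend]

# Ordinary least squares for one predictor, in closed form:
# slope = cov(spend, sales) / var(spend); intercept from the means.
n = len(spend)
mean_x = sum(spend) / n
mean_y = sum(sales) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, sales))
    / sum((x - mean_x) ** 2 for x in spend)
)
intercept = mean_y - slope * mean_x

# Forecast sales for a planned £30k week.
forecast = intercept + slope * 30
print(f"estimated uplift per £1k: {slope:.2f}; forecast at £30k: {forecast:.0f}")
```

With roughly 3 years of history the estimate recovers the underlying relationship well; a real model would add more channels, seasonality and diminishing-returns terms, but the principle is the same.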
With everything in place, econometric models can enable businesses to forecast demand by examining all the economic factors involved. For example, econometric analysis revealed that the growth in the number of women working in the US played a major role in the growth of the restaurant industry from 1950 to 2000. But other variables were at work too: rising incomes made eating out more affordable, and greater levels of car ownership, especially among teenagers and college students, translated into higher restaurant sales. Understanding the economic variables that underlie demand makes it possible to forecast the future of an entire industry. What happens to your company if the price of oil plummets, or if more women re-enter the workforce after having families? It’s obviously not a simple and straightforward analysis, but having the right data and knowing how the global winds of change are shifting can stop a business from suffering huge setbacks. Just ask BlackBerry or Kodak about the impact of the smartphone revolution.
Find Out More
Econometric modelling can deliver a massive benefit to businesses that want to forward plan and avoid major disruption. But, it’s critical that you have the right foundations in place before you begin econometric modelling. If your inputs are sub-optimal, your outputs will be sub-optimal too. Get in touch with our experts and we’ll explain how we can bring this benefit to you.
As data continues to become crucial to all sorts of businesses, the need to understand, analyse, visualise, and use data grows more imperative.
However, without a data visualisation tool or analytics solution to view this data, businesses can quickly become overwhelmed. Data analytics solutions, business intelligence (BI) programs, and data visualisation tools are now essentials — rather than optional extras.
That’s why 54% of enterprises consider BI and other data-based solutions to be critical to their work now and in the future. By understanding the insights within their data, businesses can make better informed, data-driven decisions. But with a range of tools out there, which one is best?
In this blog, we’ll look at Looker, Power BI and Tableau — the three leading BI and data visualisation tools — to help decide which is best for you.
At a glance: Looker vs Tableau vs Power BI
Looker
Looker is a browser-based data analytics and visualisation tool. Founded in 2012, Looker was acquired by Google in 2019 and is now part of Google’s cloud platform. It also uses its own modelling language, LookML, a modular language that allows data and calculations to be reused. Alongside this, Looker’s Data Dictionary is a searchable directory for all metrics and descriptions in a Looker data warehouse.
Advantages
Looker’s unique approach to data offers some interesting advantages:
Cloud-based & browser-based: Looker offers the useful combination of being part of Google’s Cloud Platform and being completely accessible via a browser. Google Cloud offers an advanced level of security and a flexible way to manage data. With direct access through a browser, these benefits are offered without the need for software installation and manual updates.
Easy Git Integration: Looker can integrate with the popular version control system Git, enabling multiple people to work on multiple visualisations simultaneously. With Looker, users can see changes made to data-modelling layers, and jump back to them anytime. They can also create different version strands for developers to work on. This setup is easy and provides a benefit not offered by other data visualisation tools.
Connects with multiple data sources: Looker integrates with more than 50 different data sources due to LookML, Looker’s data modelling language. LookML’s flexible modelling language means it can analyse and visualise data from multiple sources, including Google Cloud, Microsoft Azure, Amazon Web Services and on-premises databases.
Self-serve capabilities: LookML also offers the ability to define dimensions, metrics, aggregates and relationships. These can then be used seamlessly in data visualisations, providing self-service analytics whilst also enabling the data to be reused. Looker also offers an Explore feature that enables users to self-serve their data through drag-and-drop functions, individual dashboards, and the ability to add additional fields to aid further data exploration.
Disadvantages
Limited range of visualisations: Despite Looker’s popularity, the variety of visualisations offered with the basic program is somewhat limited. The gap is even starker when comparing these capabilities to Looker’s competitors, Tableau and Power BI. It should be noted that Looker does offer the ability to build custom visualisations, which can go some way to mitigating this issue.
More expensive than direct competitors: In theory, Looker’s pricing model is ideal, with cost tailored to the company in question. In practice, however, Looker is more expensive than its direct competitors, Tableau and Power BI.
Steep learning curve: Looker’s unique modelling language requires users to have at least a basic understanding of coding – in particular programming languages like SQL. The theory behind LookML is sound: a modelling language that is easier to pick up. In practice, however, it is more difficult if a business lacks the right in-house expertise or training.
Looker’s ability to integrate with other systems, thanks to its unique LookML coding language, means that enterprise businesses can make use of data stored in already-present third-party software. Features like Looker Blocks — pre-built data models designed to fit common analytics patterns — streamline this integration, offering pre-built code that can more easily be embedded.
Looker is also a powerful beginner platform. Its systems are easy to learn, and the code is easily understood. While its visualisations might not be as sophisticated as its competitors’, it also offers visualisation with real-time analysis and the ability to customise.
Tableau
Tableau formerly held the title of the undisputed king of Premium BI tools and has only recently gained rivals in Looker and Power BI. With quick implementation, ungoverned analytics and data can become accessible and easily shared throughout an organisation.
Tableau has recently been acquired by Salesforce, leading to simple integration with Salesforce users, as well as other programs such as MuleSoft and Slack.
Advantages
Interactive data visualisations: Tableau provides interactive data visualisation benefits, helping to turn unstructured statistical information into logical and intuitive visualisations. Filtering and selection provide options for further analysis and ease of understanding.
Adaptable to large amounts of data: Unlike other platforms that place a limit on data model size, Tableau can handle very large amounts of data without any impact on performance.
Intuitive user interface: Developer and non-dev users alike can easily use Tableau due to its intuitive user interface (UI). Non-dev users can use all the basic facilities of Tableau; however, specialists might be needed to extend the platform’s functionality. Tableau’s simplicity is also coupled with its ability to operate reliably on big data thanks to its columnar data model.
Compatibility: Tableau is compatible with multiple data sources, enabling businesses to connect with, access and blend data from multiple sources into one visualisation for easy data analysis. Tableau is also compatible with multiple scripting languages, such as Python or R, to maximise potential output.
Mobile support: Tableau has a mobile app for both iOS and Android systems. This app has the same functionality as the desktop and online software, allowing users to analyse data remotely. Moreover, the Tableau dashboard can be customised to each application, meaning functionality can be maximised to the individual’s separate mobile and desktop needs.
Disadvantages
Inflexible pricing: Tableau’s pricing doesn’t change on a case-by-case basis, despite the fact that most companies have individual needs. Tableau’s sales model requires purchasing an extended licence from the start. Many companies would rather start with a specific set of features and adjust the pricing later for further features if necessary.
Poor after-sales support: Thanks to Tableau’s maturity, there are many online forums where users can discuss its features; however, many threads focus on a lack of support and maintenance. To resolve issues, Tableau’s support team sometimes advises purchasing a new feature, which can become costly.
Favoured towards Salesforce: Depending on an enterprise business’s requirements, this might not have a big impact. However, the nature of Salesforce’s acquisition means that Tableau’s development will now be skewed more towards Salesforce integration; Tableau is no longer an independent BI tool.
Tableau is designed with businesses in mind, rather than an IT department or developer. Tableau’s user interface is considered to be the easiest to use of its direct competitors. Its ease of usage means that you do not have to be an expert in programming languages or coding, empowering teams across an organisation to become more data-driven and data-literate in their decision making.
Power BI
Microsoft Power BI integrates well with Microsoft products and systems; however, a recent uptick in adoption likely comes from the free version of Power BI that is available to anybody. This free version is reliable for individual analysts, but the premium version unlocks important functionality such as sharing reports, dashboards and analytical apps.
Advantages
Microsoft’s tool offers the following advantages:
Large range of visualisations: Power BI has a great number of standard visuals to populate your reports, each with a wide variety of format options. Power BI is backed up by integrations with Microsoft Office and can harness the power of Excel to create easy data visualisations. Moreover, if the desired option is unavailable, users can also build their own custom visuals.
User-friendly interface: Power BI is extremely intuitive to navigate and user-friendly. Users with little dashboard experience can navigate the platform as easily as those with expertise. This is partly due to their natural language query tool, which allows people to ask simple questions to easily navigate to the data they wish to visualise.
Lower cost: Power BI is relatively low in cost compared to other leading platforms. A trial version of Power BI is available to everyone, while Power BI Pro is included in some Office 365 business and enterprise plans. This has caused a shift in the market, causing other BI vendors to become much more competitive in their licensing options.
Easy to learn: Power BI might be the easiest to use of the three platforms. Though you will need expert support to truly get the most out of your data, those who are familiar with Excel will be able to start using Power BI’s data visualisation tools quickly.
Disadvantages
Limited customisations: Though Power BI offers a range of visualisations to choose from, it can be difficult to customise any of them. There are basic formatting options available but this can prove limiting for businesses looking to create bespoke visualisations with limited Power BI experience.
Potential learning difficulties: As covered, while Power BI is simple to get to grips with at the beginning, it will require added training further down the line. This especially applies when performing analysis over your datasets, as it will likely require tools that are external to Power BI, like DAX Studio.
Data security: Power BI offers advanced encryption capabilities using Azure. However, as it’s a cloud-based tool, some stakeholders may feel uneasy about the security and privacy of their data. Businesses will have to ensure that they have the full breadth of knowledge of Power BI’s encryption services to fulfil their business case.
It’s clear that Power BI offers good integration capabilities, especially with other Microsoft products, allowing data analysis and visualisations to be shared across teams. It offers the reliability of other Microsoft products, and even integrates with other data analytics tools.
Using a data consultancy to make the most out of your tools
Power BI, Tableau and Looker offer high-quality BI and data visualisation solutions for businesses in 2023. What is ‘best’ for your business is relative — but what’s not relative is that in order to maximise your ROI from these platforms and harness the power of your data, you need to get the best out of these tools.
Without in-house expertise or the right training, the steep learning curve and technical know-how required to maximise its potential can hurt your ROI, and squander the potential within your data. This is where Ipsos Jarmany can help.
With our consultancy services, we’ll help you find the right platform for your business. Once matched to the correct tool, we’ll help you maximise the insights you get from your data and make intelligent business decisions. Ipsos Jarmany’s team of data scientists are seasoned experts who understand that no two businesses have the same needs.
Whether you need help selecting a platform, getting the most out of data visualisations, creating a data strategy or something else, Ipsos Jarmany’s data consultancy experts can help.
In 1597 Sir Francis Bacon famously said, “knowledge itself is power.”1 Four centuries later, his words are proving to be more accurate than ever, as knowledge in the form of big data delivers an increasing amount of power to businesses.
Tech giants like Google and Facebook have made it abundantly clear that, to them, big data is a goldmine of insights. Therefore, forward-thinking organisations need to invest in and develop a comprehensive data strategy to improve how they obtain, store, manage, share, and use their data.
However, many businesses struggle to make data work for them. A McKinsey survey found that 47% of business leaders feel that data and analytics have fundamentally transformed their industries, yet they still have difficulty putting data to work for their organisations.2
While new technologies allow organisations to collect lots of data, raw data in and of itself has little value. Instead, the value arises when that data is presented in a way that provides actionable insights, informing business leaders on the best course of action.
That’s why in this blog post we’re going to be looking at how data visualisation improves decision making. Let’s dive straight in.
What is data visualisation?
Data visualisation is the final part of a process that includes the collection, cleansing and analysis of information from numerous data sources. This final stage is all about creating a pictorial representation of that data which can then function as a single source of truth for businesses.
The goal behind creating these visualisations is to tell a compelling story using raw data whilst keeping crucial KPIs in mind. With the help of data visualisation, key insights and information, such as trends and patterns, can be digested and understood by stakeholders much more quickly.
Types of data visualisation
When it comes to visualising their data to help communicate the story behind it to their stakeholders, there are a number of things businesses need to consider. Chief among these is the category of visualisation they want to focus their efforts on, either:
Data exploration: Data exploration helps to uncover insights and identify patterns that need further attention.
Data explanation: By presenting an easy-to-understand graph or illustration, data explanation helps an audience better understand the results of that data.
Understanding which of these two ends a given visualisation is intended to serve is essential to the success of an overarching data strategy.
While there are just two broad categories of data visualisation, there are a number of specific types of visualisations that organisations can deploy to better understand their data. These include:
2D area visualisations: 2D area data visualisations are typically geospatial, as they relate to the relative position of things on the earth’s surface.
Temporal visualisations: Temporal visualisations have a start and finish time and elements that may overlap.
Multidimensional charts: Multidimensional charts are those with two or more dimensions that help explore correlations and discover causality, which is why these are amongst the most commonly used visualisations.
Hierarchical charts: Hierarchical data sets are the arrangement of groups in which larger groups encompass smaller sets, allowing users to drill down or drill up to conduct in-depth analysis.
Network visualisations: Network data visualisations show how data points are related within a wider network.
How does data visualisation improve decision-making?
Data visualisation helps decision-makers see the big picture. From understanding trends and patterns to highlighting issues and areas of concern, data visualisation is crucial to obtaining enhanced oversight over business operations.
Research has shown that organisations that leverage their customer behaviour data to generate insights and make data-driven decisions can outperform their peers by as much as 85% in sales growth.3
Consequently, any organisation with an eye on the future needs to make sense of its data through data visualisation techniques and tools to enlighten its decision-making processes. Without effective visualisation, organisations are relying more on guesswork and interpretation when it comes to making crucial decisions.
Benefits of data visualisation
Whilst the primary benefit of data visualisation centres around making better business decisions, it’s worth digging into some of the more specific benefits it can help organisations obtain. These include:
Improving speed: Many bad decisions are just good choices with bad timing, as timing is an often overlooked aspect of decision-making. Data visualisation can help businesses draw insights from vast amounts of data in real-time, increasing response times to challenges.
More accurate numbers: Although data provides decision-makers with potentially all the information they need, it’s usually not presented in an easily digestible format. Data visualisation simplifies the information, boosting our comprehension of the data and reducing the need to fill the gaps with our biases, making our decisions more accurate. However, in order to ensure accuracy, it’s pivotal that the data used within visualisations is of the highest quality.
Simplified communication: Once executives and other decision-makers use data to decide on a specific direction, that decision must be communicated to the team responsible for implementation. While the decision may seem obvious, other stakeholders may not fully understand the reasoning behind it, thereby reducing efficiency. With data visualisation, decision-makers could use graphs and charts to communicate the reasons behind the decision clearly.
Identify benchmarks and trends: An effective visualisation makes it easier than ever before for users to recognise relationships and patterns within their data. By exploring these patterns, users are able to focus on specific areas that need attention to help drive their business forward.
Empowering collaboration: Data visualisation helps organisations by presenting data in a universally understood form, empowering people to contribute to decision-making with their perspectives. Approaching any challenge from multiple perspectives enables decision-makers to make better choices.
Understand the story behind your data: Ultimately, all of these benefits of data visualisation lead to one key outcome — a more comprehensive understanding of the story behind a business’s data. Armed with this knowledge, businesses can make better informed decisions that help to drive outcomes and business success in the long term.
Data visualisation tools
Cutting-edge data visualisation tools are essential for converting raw data into actionable insights. As a result, identifying and deploying the right tools is vital for businesses looking to uncover valuable insights that can help drive growth.
Fortunately, there is now a range of data visualisation tools available to businesses looking to harness the power of their data. The most popular among these include:
Domo: Domo is a cloud software company specialising in business intelligence tools and data visualisation.
Dundas BI: Dundas Data Visualization, Inc. is a software company specialising in data visualisation and dashboard solutions.
Infogram: Infogram is a web-based data visualisation and infographics platform.
Looker: Part of the Google Cloud Platform following a 2019 acquisition, Looker markets a data exploration and discovery business intelligence platform.
Microsoft Power BI: Power BI is an interactive data visualisation software developed by Microsoft with a primary focus on business intelligence.
Qlik: Qlik is a business analytics platform that provides software products such as business intelligence and data integration.
Sisense: Sisense is a business intelligence software company best known for embedded analytics.
Tableau: Tableau Software is an interactive data visualisation software company focused on business intelligence specialising in visualisation techniques.
Even if businesses have access to one or more of these tools, that isn’t enough to ensure effective visualisations. Remember, collecting, sorting, cleansing and analysing data before it gets fed into a cutting-edge tool is essential to ensuring accurate and relevant insights.
And that’s not all. On top of that, businesses also need knowledge, skills and expertise to ensure that tools such as those outlined above are used correctly and therefore produce results that drive positive outcomes.
Enhance your decision-making with data visualisation
Data visualisation has a track record of driving progress. For example, the 1854 Cholera Outbreak Map of London marked the locations of outbreaks, revealing that affected households used the same drinking water wells. Examination of these wells demonstrated a connection between cholera and contaminated water.4 These results helped the city eradicate cholera and contributed to Louis Pasteur’s discovery of modern germ theory.
Over a hundred years later, businesses are looking to leverage data to ensure both growth and prosperity. A comprehensive data strategy that facilitates visualisations that enhance decision-making processes has therefore become essential to long-term success.
However, that requires access to significant knowledge, expertise and cutting-edge tools, all of which can be difficult to obtain and retain in-house. That’s where data analytics providers like Ipsos Jarmany come in. We’re here to ensure that your business can establish a successful data strategy that delivers insights through stimulating visualisations.
So, if you’re ready to start using your data to predict needs, deliver efficiencies, connect people and achieve growth targets, get in touch with us today.
Whilst first-party data can provide rich and meaningful insights on your customers and can feed into machine learning, it often lacks breadth, especially if your business isn’t able to collect, store and manage valuable high quality first-party data efficiently.
This is where third-party data comes in.
Third-party data refers to data that is collected by organisations outside of your company and can be used to gain valuable insights into your target audience, industry, or market.
In this blog post, we’ll explore the reasons why third-party data is so important and how it can benefit businesses of all sizes.
#1 Close the gaps in your data
A lot of organisations are collecting their own first-party data to help derive actionable insights and gain a greater understanding of their customers to then guide decision making.
This could be:
Website data
Social data
Marketing data
Operations data
Sales data
Whilst this first-party data can be very high value, unless you have a large quantity of it, it often lacks validity and is not enough to base high-level decisions on. This impacts the quality and reliability of your analysis.
In this scenario, third-party data can be used to close the gaps and enhance the value of your insights and findings. Put simply, third-party data cannot match an organisation’s first-party data; however, it can help you build on the insights you already have. First-party data lays the foundations; third-party data builds on them and allows you to broaden your data ecosystem.
#2 Greater context into customer behaviour
Even if your business is a well-oiled machine when it comes to collecting first-party data, this is often useless if you don’t understand the macro-economic factors driving consumer behaviour.
This could include:
Geographical trends
Demographic changes
Environmental changes
Political news
Market share/size information
By utilising third-party data, you can obtain insights that will help you to understand current behaviour and predict future behaviour, so you can calculate any impact on business operations, and gain greater insights into supply and demand shifts.
#3 Understanding your target audience
Third-party data can help you better understand your target audience and their behaviours, interests, and preferences. This information can be used to create more targeted marketing campaigns and to develop more effective customer engagement strategies. For example, if you’re selling athletic clothing, you might use third-party data to learn more about your customers’ exercise habits, which can help you create content and promotions that resonate with them.
#4 Strengthen Indirect Sales Insights
Third-party data is also pivotal if your business operates through indirect sales channels, as it enables you to gain insights into your sales activity through each third-party retailer. Without it, you only have a partial understanding of your sales performance.
For example, if you were a company selling computers direct to the consumer, but also indirectly through a retailer, you would have access to certain information, such as the number of units you supply to the retailer, the product price point and the locations where the units are sold. However, you’d be missing a range of insights, such as how the retailer’s discount and marketing schemes impact sales, whether users purchase online or in person, or whether certain regions sell better than others.
This is where you can really benefit from utilising third-party data to gain more granularity into your indirect sales performance.
#5 Improving marketing and advertising efforts
Third-party data can also be used to improve your marketing and advertising efforts by providing a more complete picture of your target audience. As a result, you’ll be able to offer a deeper level of personalisation to help your ads resonate more with your target audience.
For example, you can use third-party data to create more effective targeting strategies for your digital ads, such as targeting based on demographics, interests, or purchase history.
This information can also be used to improve your email marketing campaigns by personalising your messages and making them more relevant to your subscribers.
#6 Making informed business decisions
Ultimately, third-party data can provide you with valuable insights into your industry and market that can be used to make informed business decisions. It allows you to assess the competitive landscape, identify market trends, determine the best target audience for your product and predict future customer behaviour. Combined with your first-party data, this information can provide you with a complete picture that will then guide your business in terms of pricing, distribution, product positioning and much more.
In conclusion, third-party data is a valuable tool that can help businesses to close the gaps in their data, gain greater context into customer behaviour, build a better understanding of their target audience, strengthen indirect sales insights, improve their marketing and advertising efforts, and ultimately make informed business decisions. Whether you’re a small business just starting out or a large corporation looking to stay ahead of the competition, incorporating third-party data into your data strategy is essential for success.
How Ipsos Jarmany can help you
Managing your third-party data can be a minefield, especially in a privacy-conscious world with increasing regulations around data protection and misuse. It can also be a struggle to integrate third-party data with your existing data, and to use it to build and feed machine learning models for enhanced insights. Additionally, this type of data management requires a specialised skillset, which is often very time-consuming and expensive to build internally. As a result, leaning on a specialist agency with expertise in storing, managing and transforming data to gain actionable insights is often the favoured approach.
Get in contact with us today if you’d like to explore how we can help you manage your data, use techniques such as web scraping to obtain more insights, and then build machine learning models to help you drive business growth.
You might already be familiar with GA4 — many businesses have been using it alongside UA for the last two years. Alternatively, you might know next to nothing about it. Whatever the case, getting to grips with GA4 is important to your business.
Google Analytics is one of the most popular analytics tools, with over half (55%) of online businesses using it to gain visibility into key website metrics.1 Understanding how the latest version works should be a priority.
But don’t worry, we’ve got you covered. In this GA4 guide, we’ll explain everything you need to know to get you ready for the shift to GA4 — and leverage it to gain a deeper understanding of your customers. But first, let’s answer an important question.
What is GA4?
GA4 is an analytics service that allows you to measure traffic, engagement, and performance across your websites and apps (known as properties), giving you the insights you need to improve all three.
Launched in 2020, GA4 is the fourth and latest version of Google Analytics. It was designed to phase out and ultimately replace the previous version, Universal Analytics, which was built when the digital world was very different from today.
GA4 provides data insights throughout the customer lifecycle, making it a useful tool for businesses or marketers seeking to understand how customers behave before, during, and after conversion. As you’d expect from a modern data analytics platform, GA4 also offers machine learning insights and data science analysis.
GA4 vs Universal Analytics
Up until October 2020, Universal Analytics was the default version used when a new Google Analytics property was created. After that date, the default version became GA4.
Google now plans to phase UA out completely. From July 2023, UA will stop processing new hits, although users will still be able to access data for their Universal Analytics properties for another six months.
Universal Analytics 360 (also referred to as Google Analytics 360), on the other hand, is used by bigger, enterprise-sized businesses. UA 360 is a scaled-up, paid version of UA with extended capabilities. UA 360 has higher data limits, service level agreements and, support-wise, a dedicated account manager and implementation support.
Like UA, however, UA 360 is being sunsetted, albeit from the later date of July 2024.
What’s the difference between GA4 and UA?
There are several key differences between GA4 and UA. In this section, we’ll highlight the most important ones to understand.
Events vs sessions
GA4 uses a fundamentally different model for measuring data compared to its predecessor. UA’s measurement model was based on sessions, comprising any number of user interactions (known as hits) within a specific time period. These could include page views, clicks, and transactions, for example.
GA4’s data collection model, on the other hand, is based on events, with any user interaction qualifying as a separate event.
The change to events, however, has also led to some ‘missing’ metrics and reports, in particular bounce rate. The bounce rate metric in UA is replaced with ‘engaged sessions’, which counts sessions that last 10 seconds or longer, have at least one conversion event, or have at least two page or screen views.
Other valuable metrics available in UA, such as views per session and average session duration, while harder to access in GA4, have recently been made available through customisable reports.
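To make the event model concrete, here is a minimal sketch of how interactions are recorded with gtag.js, GA4’s standard web tag. The ‘page_view’ and ‘sign_up’ names follow Google’s documented pattern, but the parameters shown are illustrative; `dataLayer` is a plain local array here so the sketch runs outside a browser (in a real page it would be `window.dataLayer`).

```javascript
// Sketch of GA4's event-based measurement model using the gtag.js pattern.
// In a real page, dataLayer lives on window; a local array keeps this runnable.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// In UA a page view was a 'hit' inside a session;
// in GA4 it is just another named event with parameters.
gtag('event', 'page_view', { page_title: 'Pricing' });

// Any other interaction is modelled the same way
// (the parameter values here are illustrative):
gtag('event', 'sign_up', { method: 'email' });
```

Every call lands in the data layer as a separate event, which is the unit GA4 reports on, rather than being folded into a session-level hit count.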
Multiple devices
UA was designed for a world where desktop reigned supreme. Since Google Analytics first launched in 2005, however, the world has changed drastically. Today, people access digital services across a range of devices, with mobile becoming increasingly popular in recent years. GA4 is designed to track the users of today, seamlessly collecting data across multiple devices.
Machine learning
GA4 has machine learning (ML) capabilities, enabling it to use current and historical data to predict how your users might behave in the future. The resulting insights allow you to see the probability of customers purchasing something or churning, for example. UA, on the other hand, has no ML capabilities.
Data protection and security
GA4 anonymises IP addresses automatically, guarding against the identification and misuse of personal information and protecting personal privacy. This brings GA4, unlike Universal Analytics, in line with GDPR requirements.
Future-proofing
Compared with UA, GA4 focuses on tracking user IDs rather than cookies. Reducing the reliance on cookies helps future-proof GA4 and moves away from UA’s focus on tracking page visits and sessions through cookies. This will help improve the quality of, and access to, insights across multiple platforms.
Google Ads
GA4 enjoys a deeper integration with Google Ads, allowing you to measure app and web integrations together. This ultimately provides a deeper level of insight than UA.
More reporting
With GA4, you get reporting options across the customer lifecycle, with reports focusing on acquisition, engagement, monetisation, and retention (more on this later). With UA, on the other hand, you only get reporting for acquisition.
GA4 is taking over
GA4 is designed to meet the needs of businesses in 2023, enabling them to understand how their customers behave across platforms and journeys. UA was designed for an era of desktop dominance and cookie-related data — ideas that are slowly becoming obsolete.
With UA being phased out completely by summer, now’s the time to switch to Google Analytics 4 — if you haven’t already. This means you should:
ensure you have a centralised archive of historical data you can draw from
set up GA4 event tracking
and transition all your existing UA properties to GA4.
Setting up GA4
As set out in the Google Support guide, GA4 is relatively simple to set up — if you know how.2 In this section, we’ll walk you through the process step by step.
Log in to your Google Analytics account.
Check which version you are currently using. If you can see three columns (Account, Property, and View), you are using UA. If you can see just two columns (Account and Property), you are already using GA4.
Assuming you are still using UA, select ‘GA4 Setup Assistant’ under the Property column.
Click ‘Get Started’ to set up a Google Analytics 4 property. Alternatively, if you already have a GA4 property that isn’t connected to your Google Analytics account, select ‘Connect Properties’ and follow the instructions.
If you are already using gtag.js tags, select ‘Enable data collection using your existing tags.’ If you are using Google Tag Manager or the old analytics.js tags, you’ll need to add gtag.js tags yourself.
Click ‘Create property’.
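Once the property exists, the collection snippet that Google generates follows a standard shape. As a sketch (assuming the documented gtag.js bootstrap; ‘G-XXXXXXXX’ is a placeholder for your real measurement ID):

```javascript
// Sketch of the standard GA4 gtag.js bootstrap. In a real page this is
// preceded by loading the library:
//   <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXX"></script>
// 'G-XXXXXXXX' is a placeholder measurement ID.
const dataLayer = [];                 // window.dataLayer in the browser
function gtag() { dataLayer.push(arguments); }

gtag('js', new Date());               // record when the tag fired
gtag('config', 'G-XXXXXXXX');         // start collecting for this property
```

If you chose ‘Enable data collection using your existing tags’, Google wires this up for you; otherwise this is the shape of the gtag.js code you add to each page.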
Once you’re up and running, you can set up a range of capabilities designed to help you track and obtain data, including:
Configure Custom Events
Configure User IDs
Configure Enhanced Measurements
Activate Google Signals
Link to Google Ads
Define Audiences
Import or set up Conversions
Using GA4
Tracking across multiple platforms
One key benefit of GA4 is the ability to track data across multiple platforms — something that was virtually impossible in UA. In practice, this means that GA4 tracks website and app data for one property. So if a user visits your site using a laptop and a mobile, the data for the various sessions is consolidated under one user rather than two. This helps you keep track of the same user across multiple devices and sessions.
Cross-platform tracking provides a much more complete view of user behaviour, allowing you to understand how customers engage with your website or app, as well as the different devices they are using to access them. You get to see the entire customer journey — from acquisition through engagement and retention — across various platforms.
To set up cross-platform tracking, you need to use the appropriate gtag.js script to create unique user IDs. These IDs can then be configured to track users across platforms.
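As a sketch of that configuration step, the documented gtag.js pattern is to pass your own authenticated identifier via the `user_id` config parameter, so that sessions from different devices roll up to one user. The measurement ID and the id value below are placeholders, and `dataLayer` is a local array so the sketch runs outside a browser.

```javascript
// Sketch: cross-platform tracking via GA4's User-ID feature.
// The same internal id is sent from every platform after login.
const dataLayer = [];                 // window.dataLayer in the browser
function gtag() { dataLayer.push(arguments); }

function onUserLogin(internalId) {
  // 'G-XXXXXXXX' is a placeholder measurement ID; internalId should be
  // your own persistent, non-PII identifier for the signed-in user.
  gtag('config', 'G-XXXXXXXX', { user_id: internalId });
}

onUserLogin('u-102938');              // illustrative id
```

The same identifier is then set in your iOS and Android apps, letting GA4 stitch those sessions to a single user.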
Using Events
As we touched on earlier in this Google Analytics 4 guide, GA4’s data collection model is based on events. Sessions dominated UA — and they’re still used to a degree — but events are how you track almost everything in GA4.
Put simply, all user actions on your site or app now qualify as events. So to understand and track events is to understand and track user behaviour and engagement. You can choose which events you want visibility over, and how you track them is up to you.
Broadly speaking, events fall into four different categories in GA4:
Automatically captured events: These events, such as when a user clicks on an ad or when a free trial is converted to a paid subscription, are automatically tracked by default, without you having to do anything.
Enhanced measurement events: These are events that you can enable in GA4, allowing you to measure interactions with your content. Enhanced measurement events can be toggled on and off by going to the Admin column, selecting Data Streams, then Web, and then Enhanced Measurement.
Recommended events: These events require additional context to function effectively, meaning you’ll have to set them up yourself. They include ‘login’ events (when a user logs in), ‘search’ (when a user searches your content) and ‘share’ (when a user shares your content).
Custom events: These are events that are specific to your business, website, or app and not already known or measured by GA4. With custom events, you define the name and the set of parameters for each event.
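If you're sending custom events via gtag.js, the call itself is a one-liner. In this sketch the event name and parameters are invented for illustration, and the standard dataLayer bootstrap is stubbed inline so the example runs on its own.

```javascript
// Stub of the standard GA4 bootstrap so this sketch is self-contained.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Hypothetical custom event: a user redeems a loyalty voucher.
gtag('event', 'voucher_redeemed', {
  voucher_code: 'SPRING10', // custom parameter (made up for this example)
  value: 10,
  currency: 'GBP'
});
```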
How to create a custom event in GA4
To create a custom event in GA4, simply follow these steps:
Select the Admin icon in the bottom left of your screen
Go to the Property column and select Events
From here, select Create Event
Choose the data stream for which you want to deploy the event (assuming you have more than one)
Click Select
Follow the rest of the set-up prompts to complete the process
Getting the most out of your GA4 reports
As you’d expect from a data analytics tool, GA4 provides a range of reports and data visualisations designed to help you understand your data — and act upon it. In this section, we’ll explain everything you need to know to get the most out of your GA4 reports.
Reports snapshot
As the name suggests, the reports snapshot provides an overview of the most popular metrics in one single, easy-to-read dashboard. This is where you go if you need an at-a-glance view of how your property is performing. Data sets in the reports snapshot include things like:
User behaviour
New users by channel
Number of sessions by channel
Users by country
User activity over time
Views by page and screen
Top events
Top conversions
Top-selling products
The data sets in the snapshot are pulled from other reports, and the reports snapshot is customisable, allowing you to focus on the insights that matter to you most. To customise your report snapshot, you’ll need to follow these steps:
Select Library from the bottom of the left navigation bar (note: you’ll need admin rights to do this — this option isn’t available in a demo account)
Select Reports
Select Create a new report
Select Create an Overview Report
Follow the set-up steps to complete
Real-time Overview Reports
With GA4, you also get access to real-time reports, allowing you to see how customers are using your website in real time and track their journey through the sales funnel. Real-time reports offer a range of metrics, including:
Geo-maps, showing where current users are based
Number of users in the last 30 minutes
Users by source, showing how your users arrived at your site
Users by audience
Views by page title and screen name
Event count by event name
Conversions by event name
Lifecycle reports
GA4 breaks down the customer lifecycle into four stages — acquisition, engagement, monetization, and retention — with corresponding reports for each. Let’s take a look at what they offer.
Acquisition: See how new users found your website or app, allowing you to understand which channels and campaigns are proving the most successful.
Engagement: Explore how users interact with and navigate through your website or app, with metrics covering a range of events.
Monetization: Get a full breakdown of how your website or app is generating money, covering e-commerce, subscriptions, and ad revenue.
Retention: Understand the frequency and duration of users’ interactions with your website or app after their first visit — and how valuable they are to you over their lifecycle.
Together, these reports give you a complete picture of how users behave across all stages of the customer journey, as well as the value they bring through engagement. Ultimately, this helps you refine your campaigns, content, and UX to improve customer acquisition and retention — and ultimately drive more revenue.
Other reports
In addition to those highlighted above, GA4 comes with a range of other reports designed to give you a complete picture of your users and how they interact with your website or app.
For example, the Tech report in Google Analytics 4 analyses the technology that people use when visiting your website or app, including the platform, operating system, screen resolution, and app version.
Meanwhile, the Demographics report breaks down your users by their age, location, gender, and affinity category, which includes acquisition, behaviour and conversion metrics — giving you greater insight into your customer base.
Making the most of GA4
If you rely on the Google Analytics platform, it's time to start planning your switch from UA to GA4. At Ipsos Jarmany, we recommend a test-and-trial period of at least six months, which gives you time to identify any nuances in your reporting and reconcile them before you rely on GA4 day to day. With July's sunset date coming fast, this is no longer a choice but a necessity, so it's critical that you get to grips with GA4 as quickly as possible.
That said, getting started with Google Analytics 4 can involve a steep learning curve, and migrating from UA to GA4 can be tricky for those without the technical know-how. For businesses with multiple brands, websites and properties, successfully merging them in GA4 for a complete view adds another layer of complexity. That's why it pays to work with a technology partner with proven expertise in migration, implementation and support, like Ipsos Jarmany.
As an analytics and data consultancy, we can help you seamlessly migrate to GA4, providing the support and expertise you need to get up and running fast and maximise its potential. The change is coming; make sure you're prepared for it with our expert help.
This is where business intelligence and visualisation tools come in. They allow businesses to turn complex data sets into clear visualisations, and then act on them. The result is smarter decision-making, more streamlined processes, and a competitive advantage over businesses that fail to capitalise on this opportunity.
In this article, we’ll take an in-depth look at one of the most popular data visualisation tools on the market: Looker. Read on to learn about:
Looker’s data visualisation capabilities
Its key features and how they are used
The pros and cons of choosing Looker over one of its competitors
How your business can get the most out of this powerful tool
What is Looker?
Looker is a data analytics and visualisation tool. It enables businesses to analyse and explore their data through rich visualisations, helping them to turn raw data into actionable insights that drive smarter business decisions. It does so through powerful features such as integrated insights and data-driven workflows.
Launched in 2012, Looker was acquired by Google in 2019 for $2.6 billion and is now part of the Google Cloud Platform. It's a browser-based solution, so there's no need to worry about installation or maintenance.
While Looker is well known in the data visualisation world, direct competitors such as Microsoft Power BI, Tableau and Qlik might be more familiar to businesses, though Google's 2019 acquisition aims to change that.
As you’d expect, Looker shares some core features with other popular data visualisation and business intelligence tools, such as the ability to:
Build custom real-time dashboards
Connect to any SQL database
Create custom applications
Leverage embedded analytics
Access a range of customer support options
What modelling language does Looker use?
One of Looker’s key differentiators is LookML, its native modelling language. LookML is an SQL-based language, but it aims to improve on SQL’s shortcomings to help users write simplified and streamlined SQL queries.
LookML is a modular, reusable language, and collaboration tools such as version control mean that Looker users don't have to start a script from scratch or spend ages working out what changed and when.
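To give a flavour of the language, here's a minimal sketch of a LookML view; the table and field names are invented for illustration.

```lookml
# A minimal LookML view over a hypothetical analytics.orders table.
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, month, year]
    sql: ${TABLE}.created_at ;;
  }

  # Measures are reusable aggregates that Looker turns into SQL for you.
  measure: total_revenue {
    type: sum
    sql: ${TABLE}.revenue ;;
  }
}
```

Dimensions and measures defined once like this can then be reused across every Explore, Look and dashboard that references the view.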
Looker Blocks — pre-built data models designed to fit common analytics patterns — also prevent users from having to start from square one each time they want to create a data model. Users can select pre-existing models and modify them to their needs. This includes:
Analytics blocks
Source blocks
Data blocks
Data tool blocks
Embedded blocks
Viz blocks
Looker’s data visualisation
As the name suggests, Looker is all about data visualisation. In this section, we’ll run through some of its core data visualisation capabilities — and how you can use them to drive business success.
Looks and dashboards
Looks are visualisations created and saved by users. They are built in Looker's Explore section and can then be shared and reused across multiple dashboards.
Dashboards let users place and view multiple Looks, graphs or tables in a single place, so a range of related KPIs can be monitored together. Dashboards are interactive and customisable: for instance, you can put several Looks into one dashboard and add a filter that acts as a master control, affecting each Look within that dashboard in the same way.
Both Looks and dashboards can be shared with anyone, helping everyone get on the same page and view and understand the data easily.
Filtering looks and dashboards
Both Dashboards and Looks have filter functionality, and the filters follow a hierarchy that gives users flexibility and specificity. For example, a Dashboard filter for a particular year applies to every Look in that dashboard by default.
However, you can also choose which Looks within a dashboard are affected by that filter. You could set a dashboard filter for a particular year, then apply a separate filter to certain Looks and disable the default dashboard filter for them. In short, you can apply one filter to all the Looks in a dashboard, or apply different filters to individual Looks within an overall Dashboard filter.
Types of visualisations
Looker features a rich variety of visualisations that allow you to present, read, and understand data in different ways, including:
Cartesian charts, i.e. any chart plotted on x and y axes, including column, bar, line, and scatterplot charts
Pie and donut charts
Progression charts, including funnel charts and timelines
Text and tables, including single value charts, single record charts, and word clouds
Maps, including Google Maps
Custom visualisations
There are also 40 visualisations available via Looker Studio, previously known as Google Data Studio, as well as custom visualisations created by Looker’s partners. As mentioned above, Looker’s blocks — and Viz blocks in particular — can be used to quickly and easily create data visualisations.
Hosted by Looker, these visualisations can be added to your Looker instance, giving you seamless visualisations with powerful functionality, including the ability to drill down, download, embed, and schedule data.
Now you have a solid understanding of what Looker is and how it works, but how do you know if it’s the right choice for your business? In this section, we’ll look at some of the pros and cons of Looker visualisations.
Looker Pros:
#1 Cloud-based + browser-based
Looker has all the advantages you’d expect from a cloud-based data analytics platform, including advanced security, high performance, and seamless accessibility. And because you access it directly through your browser, you don’t need to worry about software installation or manual updates and maintenance.
#2 Easy Git integration
Looker allows users to integrate the popular version control system Git, enabling multiple people to work on visualisations simultaneously, record changes, and manage file versions. Looker users can see changes made to data-modelling layers, jump back to them at any time, and create different version strands in repositories that developers can then work on.
The integration isn't enabled by default, but it's easy to configure and provides a benefit that many other data visualisation tools don't.
#3 Connects with multiple data sources
Looker can connect with and visualise data from multiple disparate sources, including Google Cloud, Microsoft Azure, Amazon Web Services (AWS), on-premises databases, and a range of database software.
And as a browser-based Google product, Looker integrates easily with Google's entire suite of browser-based applications. This makes sharing Looker dashboards quick and easy, with no downloads and little setup required.
#4 Self-service analytics
Thanks to Looker’s LookML data-modelling language, users can define dimensions, metrics, aggregates, and relationships. These are then used to populate Looker’s data visualisations, providing users with seamless self-service analytics, while enabling them to reuse data and calculations.
Looker Cons:
#1 Limited range of visualisations
While Looker is a perfectly effective and highly popular data visualisation platform, the variety of out-of-the-box visualisations is somewhat limited, especially compared to competing data analysis and visualisation tools like Tableau. That said, the ability to build custom visualisations goes some way towards mitigating this issue.
#2 More expensive than direct competitors
When compared with its closest competitors — for instance, Microsoft Power BI and Tableau — Looker is the most expensive of the lot. Businesses looking to cut costs may be tempted to look at one of the cheaper, but no less popular, options on the market.
#3 A steep learning curve
Looker isn’t the type of product you can just pick up and play with from the start. Before you begin visualising data, you need to define a semantic model in LookML, which then translates into SQL. This is to ensure that the underlying data is all drawn from the same place and matches up.
LookML is designed to make things easier — and it does once you understand how it works — but without the right in-house expertise or outside training, it can be a while before you get the most out of Looker and improve your ROI.
Pros | Cons
Cloud-based + browser-based | Limited visualisations
Easy Git integration | High cost
Connects with multiple data sources | Steep learning curve
Self-service analytics | Requires expertise to maximise results
Suggested reading: While Looker is a solid choice for many businesses, there are other business intelligence and data visualisation tools on the market. For a closer look at one of Looker's direct competitors — Microsoft Power BI — check out this article: 11 Benefits of Using Power BI for Data Analytics
How to create visualisations in Looker
As a visualisation tool, Looker strives to make creating visualisations as easy as possible. Creating Looker visualisations involves the following simple steps:
Create and run a query in Looker
Click on the Visualisation tab
Select the visualisation type you want to use
Select Edit to configure and customise your visualisation
Now, let’s look at some key parts of this process in a bit more detail.
How to choose a visualisation type
Once you’ve created and run a query, click on the visualisation tab. You’ll then be able to choose a visualisation type by selecting one of the chart buttons at the top of the screen. To view more visualisation options, simply click on the three dots to the right of the chart buttons.
Each option displays your data in a different way, and some options are better suited to certain types of data than others. If you're measuring the change in a value over time, for example, you'll be well served by a cartesian chart, with the time-related data making up the x (or horizontal) axis. Meanwhile, if you want to visualise how values compare as proportions of a whole, a donut chart is your best bet.
How to customise visualisations
Once you’ve selected one of the visualisation types, you can play around with the configuration options to make the data more readable and customise the look and feel of the visualisation.
Each visualisation type has its own unique configuration options. In a column chart, for example, you can choose whether you want the data to be grouped or stacked, what kind of spacing you want between columns, the colour of each column, etc. Have a play around and see what works for you.
Creating multiple visualisation types
Looker also allows you to create multiple visualisation types within a single Look. For example, you might combine a column chart and a line chart in one Look to compare data or provide additional insight and context.
To do this, follow these steps:
Click on the Edit button to display the customisation options for a particular visualisation
Select the Series tab
Go to the Customizations section and click the arrow next to the particular series
Go to the Type box and select the visualisation type you want for that series
Getting the most out of Looker
Looker is a powerful BI and data visualisation tool that helps you start visualising your data and making intelligent, data-led business decisions. But you can only do that once you know how to use it and get the best out of it: the companies best able to understand their data are best positioned to use it to drive decision-making.
Without in-house expertise or the right training, the steep learning curve and technical know-how required to maximise its potential can hurt your ROI, and squander the potential within your data. This is where Ipsos Jarmany can help.
With our Looker consultancy services, we’ll help you to get the best out of the platform, ensuring that your business capitalises on its powerful data visualisation capabilities. Our team of experts has the experience you need to build visualisation solutions tailored to the unique needs and goals of your business, enabling you to:
Master Looker’s native language, LookML
Create bespoke visualisations that simplify complex data sets
Drive data-driven decision-making across your organisation.
To find out more about how Ipsos Jarmany could help you use Looker to drive business success, get in touch with one of our experts today.
Businesses that use their data to drive decision-making are 9x more likely to be profitable, so it’s no surprise that organisations are re-focusing their investments on data and technology.
With such substantial growth set for the tech industry, we’ve collated our top 5 data and technology predictions for 2023 to guide you on where and how you should be investing your funds.
#1 Data Democratisation
First up, we have data democratisation. We predict that data democratisation will be more widely adopted in 2023, with businesses starting to incorporate a 'data mesh' as part of their data strategy.
Data democratisation is the process of giving employees throughout your organisation access to the data relevant to their roles, irrespective of their technical or analytical background. This reduces gatekeeping and bottlenecks, improving efficiency.
With so much emphasis placed on the value data and actionable insights can drive in your business, it’s important that data is accessible for employees across all verticals. By empowering your entire workforce with data-driven insights, you’re enabling them to do their job more effectively and efficiently.
A recent study, conducted by McKinsey, found that companies that make data accessible to their entire workforce are 40x more likely to say analytics has a positive impact on revenue.
As an extension of this, we predict that firms will start adopting a 'data mesh' approach, whereby data is decentralised and each business team has in-house data-literate capability, enabling them to self-serve more easily. If you aren't already, you should be considering how to adopt data democratisation and a data mesh approach throughout your organisation.
#2 AI and Machine Learning
No surprises here: AI and machine learning are perennial trends that you can't look past. Applications are already live in many organisations today, from chatbots and automated responders to process and machinery automation and business forecasting models. However, we're expecting this to seriously ramp up in 2023.
According to IDC research, worldwide AI technology spend by governments and businesses is expected to exceed $500 billion in 2023. Gartner also predicts that in 2023, ML will penetrate even more business fields helping to increase efficiency and work security.
ChatGPT is a prime example of this. Released at the end of 2022 by OpenAI, this new generation chatbot has the ability to understand natural human language and generate detailed human-like responses. This advanced AI technology is already paving the way for next generation customer service chatbots within companies such as Meta, Canva and Shopify.
With the business world becoming increasingly competitive, factors such as personalisation will also set you apart from the competition in 2023 and beyond. Consumers want a personalised experience, and those that get one are 80% more likely to buy from a brand. AI and ML will help you achieve this so you can gain that competitive advantage.
Machine learning models provide businesses with the means to deliver a more scalable and accurate way of achieving unique experiences for individual users. They enable businesses to track and observe digital habits so they can then pre-empt future consumer behaviour.
If AI and ML aren't already part of your digital growth plan for 2023, they should be.
#3 Data Clean Rooms
The demise of third-party cookies was a hot topic in 2022, as the deadline for Google withdrawing support in its Chrome browser fast approaches. This places even greater emphasis on first-party data to provide in-depth customer insights. However, first-party data will only take you part of the way; consider data clean rooms as a new solution.
A data clean room is a piece of software that enables two parties, typically publishers and advertisers, to share anonymised customer data for joint analysis. This private data exchange enriches the insights you can draw from your first-party data so you can:
Understand how customers are interacting with other brands
Establish lookalike audiences
Avoid duplicate efforts across channels
Build new customer segments for targeting
Walled gardens are a common example of data clean rooms, with the likes of Google, Amazon and Facebook sharing their aggregated customer-level data with advertisers.
With third-party cookies posing huge attribution and insights challenges, we think this is going to be particularly important in 2023 to help brands bridge those gaps in a post-cookie world.
#4 Optimising IT Systems
Computing power and technology have come on leaps and bounds in the last few decades, with revolutionary new platforms and tools available and accessible to more people. However, ensuring you have the right systems in place is vital if you want to keep up with the pace of new and developing technologies, and the ever-increasing flow of data into your business.
When reviewing and updating your IT systems and data stacks, you should consider the four Vs of data:
Volume
Velocity
Variety
Veracity
2023 is going to be a pivotal year for enhancing IT solutions, with factors such as the metaverse, a greater need for automation, and systems that can cope with vast amounts of data driving this evolution. Investing in the right data stacks will therefore be essential.
Your IT systems should enable you to:
Analyse your data in real time
Adhere to privacy and security regulations
Ensure smoother automation
Collect, store and manage your big data
And much more.
Further to this, we predict that cloud storage will take strides in 2023 to enable all of the above points and more. If you have the right cloud storage solution in place, you’ll find it easier to scale up as your business grows, and your data will be much more secure.
An estimated 60% of the world's corporate data is stored in the cloud, and this number is likely to grow. As a result, you should be factoring cloud storage into your digital transformation strategy in 2023.
#5 Data as a Service
DaaS can be defined as “a data management strategy that is used to store data and analytics. DaaS companies are organisations that provide customers with a service surrounding data – meaning data management, data storage, and analytics are the main selling points of the software.”
All of the above points require specialist skills, expertise and experience to deploy, and this can be especially challenging to deliver and maintain internally due to skill shortages in the industry.
You can offset these challenges by partnering with DaaS providers and agencies. We predict that the majority of firms will tap into the expertise of such partners to help deliver their 2023 digital transformation strategy and manage their data and analytics, in turn freeing up internal resource to focus on higher-priority tasks.
It is estimated that the DaaS market will grow to $10.7 billion in 2023, further demonstrating the value that third-party providers can add to your business.
Get in touch
The digital world and the power of computing systems are advancing at an unparalleled pace, and having the right technology and data processes in place will be the driving factor in your business achieving growth.
Ensuring that you’re investing your efforts and finances in the right way will keep you ahead of the game, and we’re confident that if you focus on the 5 points we’ve listed in this blog to define your digital strategy in 2023, you can’t go far wrong.
If you’d like to discuss how Ipsos Jarmany can support you on your data and digital journey in 2023 then please contact us today.
In today’s digital world, data is the lifeblood of business. Whether you’re a small eCommerce retailer or a multinational corporation, data analytics and visualisation give you a competitive advantage by driving smarter decision-making. But for any data to work within an analytics or visualisation platform, you need to get the foundations right. That means effective data modelling.
In this article, we’ll look at some of the best practices for data modelling in Qlik — a popular analytics platform that provides powerful real-time business intelligence and data visualisation. Qlik’s two main solutions, both of which can be used for data modelling, are:
QlikView: A data analytics, visualisation, and reporting tool that helps businesses make sense of their data using charts and dashboards.
Qlik Sense: Launched in 2014, Qlik Sense is a modern, self-service data exploration tool that allows users to build custom dashboards via drag-and-drop functionality.
Why is good data modelling important?
Businesses today collect a vast amount of data from multiple sources. But the usefulness of raw data is limited; it becomes useful when it's transformed into an understandable and actionable format.
Data modelling provides the blueprint for how your data will be structured, visualised, and used. Without effective data models, platforms like QlikView and Qlik Sense can't perform at their best, resulting in sluggish performance. To get the most out of your data, you need to design and implement a data model that:
Reduces your system's memory usage by freeing up excess data
Creates high-quality visualisations in real time
Runs platforms, like Qlik, efficiently.
Qlik data model best practices
Data modelling can be a complex process. In this section, we’ll break down some of the data model best practices for QlikView and Qlik Sense, helping you get the most out of your data. Let’s dive in.
#1 Working with crosstables
A crosstable is a table consisting of columns and rows in a grid-like format. The top row contains one field, and the left-hand column contains another, with data populating the grid accordingly. See the example below.
Year | Jan | Feb | Mar | Apr | May | Jun | Jul
2019 | 56 | 34 | 60 | 48 | 84 | 80 | 74
2020 | 19 | 32 | 83 | 54 | 23 | 38 | 20
2021 | 33 | 37 | 43 | 29 | 20 | 09 | 11
While this may look appealing, it’s not the ideal format for data modelling in Qlik. If you load data this way, it would display a field for the year plus additional fields for every month, whereas you most likely need just three fields: the year, the month, and the respective values.
You can fix this problem by adding the crosstable prefix to the LOAD or SELECT statement. Here's an example:
CrossTable (Month, Sales) LOAD * FROM [ex1.xlsx] (ooxml, embedded labels);
What you get is this:
Year | Month | Sales
2019 | Jan | 56
2019 | Feb | 34
2019 | Mar | 60
2019 | Apr | 48
2019 | May | 84
2019 | Jun | 80
2019 | Jul | 74
This process enables efficient data structuring and is the same whether you are using QlikView or Qlik Sense.
#2 Star schema vs Snowflake schema
A star schema is the most efficient schema technique in both QlikView and Qlik Sense. A central fact table contains the relevant fields and keys, surrounded by dimension tables that hold the attributes of the fields in the central table; this is the easiest schema to understand for data modelling.
Snowflake schemas, though useful for more complex fields and data, are less efficient due to the additional, intermediary tables through which information needs to travel.
Pro Tip: Circular references or loops — tables with more than one path of association between two fields — should be eliminated to improve efficiency. Qlik Sense uses loosely coupled tables to break circular references.
#3 Join and keep
You can combine two data tables in Qlik using the join and keep prefixes in your script. Join is used to fully combine two tables, creating all possible combinations of values from the tables. As a result, joined tables can be huge and slow to process in Qlik.
This is where the keep functionality comes in. Instead of joining tables to create one large table, keep links the two tables together, reducing each to the rows that match the other while continuing to store them as separate tables. This reduces table size, ensuring faster processing times while freeing up memory.
The process here is the same for both QlikView and Qlik Sense.
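As a load-script sketch of left keep (the table names, field names and file paths are all invented, and the paths use Qlik Sense's lib:// style):

```qlik
// Load the fact table first.
Orders:
LOAD OrderID, ProductID, Amount
FROM [lib://Data/orders.qvd] (qvd);

// LEFT KEEP reduces Products to the rows that match Orders,
// but keeps the two tables stored separately.
LEFT KEEP (Orders)
Products:
LOAD ProductID, ProductName
FROM [lib://Data/products.qvd] (qvd);
```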
#4 Incremental load
Incremental load allows you to load only new or updated data, as opposed to loading the entire data set each time. The best and fastest way to go about an incremental load is by using QVD files.
Here’s how the basic process works in both QlikView and Qlik Sense:
New or updated data is loaded from the data source table. While this can be a slow process, only a limited number of records are actually loaded.
Existing/old data is loaded from the QVD file. This involves loading a lot of records but at a much faster speed.
You then create a new QVD file, containing both the old and new data, which you’ll use the next time you want to do an incremental load.
Repeat this for each table you want to load.
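The steps above can be sketched in a load script like this; the table, fields, file path and the vLastExecTime variable are all assumptions for illustration:

```qlik
Sales:
// 1. Load only records changed since the last run (slow source, few rows).
//    Assumes a database connection is already established in the script.
SQL SELECT SaleID, Amount, ModifiedDate
FROM Sales
WHERE ModifiedDate >= '$(vLastExecTime)';

// 2. Append the history from the QVD (many rows, but fast),
//    skipping any SaleID already loaded in step 1.
Concatenate (Sales)
LOAD SaleID, Amount, ModifiedDate
FROM [lib://Data/sales.qvd] (qvd)
WHERE NOT Exists(SaleID);

// 3. Overwrite the QVD with the combined data, ready for the next run.
STORE Sales INTO [lib://Data/sales.qvd] (qvd);
```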
Pro Tip: Using an 'As-Of calendar' lets users make previous-period calculations without loading the data multiple times, preventing data volumes from multiplying.
#5 Generic databases
To display attributes of different objects, you can store data in generic databases. These are essentially tables where field names are stored as values in one column, with field values stored in a second column. See the example below:
Object | Attribute | Value
Ball | Colour | Blue
Ball | Diameter | 30 cm
Ball | Weight | 250 g
Box | Colour | Red
Box | Length | 25 cm
Box | Width | 15 cm
Box | Weight | 400 g
As you can see, this table contains two objects: a ball and a box. While they share some common attributes, e.g. colour and weight, other attributes are specific to one or the other, e.g. diameter or length/width.
If you load this table as a generic database in Qlik Sense or QlikView, the attributes in the second column become tables of their own, allowing the data to be stored in a more compact way. See the examples below.
Colour: Blue, Red
Diameter: 30 cm
Weight: 250 g, 400 g
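Loading a table like this generically takes a single prefix in the load script; a minimal sketch, with the file name assumed:

```qlik
// The Generic prefix splits each Attribute value into a table of its own.
GenericDB:
Generic LOAD Object, Attribute, Value
FROM [lib://Data/attributes.xlsx] (ooxml, embedded labels);
```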
Pro Tip: Giving tables easy and intuitive names helps users easily filter data and fields using table names.
#6 Matching intervals to discrete data
By adding the intervalmatch prefix to a LOAD or SELECT statement in Qlik Sense or QlikView, you can link discrete numeric values from one table to different numeric intervals in another table.
This allows you to show, for example, how certain events actually took place compared to how they were expected to take place. It is particularly powerful in manufacturing, where production lines are scheduled to run at certain times, but due to breakdowns, delays, or other errors, they may run at different times.
There are a few important points to consider when using interval matching:
The discrete data points must already have been read in Qlik before using intervalmatch.
The table you want to be matched must always contain two fields, typically start and end.
Intervals are always closed, with endpoints included in the interval.
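The matching logic itself is simple, as this plain-Python sketch shows (illustrative only; the function name and sample data are not from Qlik). Each discrete value is linked to every closed interval that contains it.

```python
def interval_match(values, intervals):
    # Link each discrete value to every interval [start, end] containing it.
    matches = []
    for v in values:
        for start, end in intervals:
            if start <= v <= end:  # intervals are closed: endpoints included
                matches.append((v, (start, end)))
    return matches

# Scheduled production runs (start hour, end hour) vs. actual event times.
scheduled = [(8, 12), (13, 17)]
events = [9, 12, 18]
linked = interval_match(events, scheduled)
```

An event at hour 18 falls outside every scheduled run, so it links to nothing, immediately flagging it as out of schedule.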
#7 Using and loading hierarchy data
Hierarchy data can be displayed in Qlik Sense and QlikView in several ways, including adjacent nodes tables, expanded nodes tables, and ancestors tables. Let’s take a look at what each one offers.
Adjacent nodes tables: each node in the hierarchy is stored once and is linked to the node’s parent (see the examples below). Adjacent nodes tables are the simplest way to present hierarchy data. While good for maintaining unbalanced hierarchies, they aren’t suitable for detailed analysis.
NodeID | ParentNodeID | Title
1 | – | CEO
2 | 1 | Director
3 | 2 | Senior manager
4 | 3 | Manager
Expanded nodes tables: In this type of table, each level of the hierarchy is presented in its own separate field, making it easier to use in a tree structure (see example below).
Expanded nodes tables are more suitable for querying and analysis than adjacent nodes tables, but aren’t best suited for searches or selections as you need prior knowledge of each level you want to search for or select.
NodeID | ParentNodeID | Title | Title1 | Title2 | Title3 | Title4
1 | – | CEO | – | – | – | –
2 | 1 | Director | CEO | Director | – | –
3 | 2 | Senior Manager | CEO | Director | Senior Manager | –
4 | 3 | Manager | CEO | Director | Senior Manager | Manager
Ancestors table: This table solves the search/selection issues that come with expanded nodes tables, presenting hierarchy data in even greater detail. Ancestors tables show a unique record for each child-ancestor relation in the data, including keys and names for each child as well as for each ancestor.
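The step from an adjacent nodes table to an expanded one can be sketched in plain Python (illustrative only, not Qlik's hierarchy prefix): walk each node up to the root to recover the per-level path.

```python
# Adjacent nodes table from the example: NodeID -> ParentNodeID (None = root).
adjacent = {1: None, 2: 1, 3: 2, 4: 3}
titles = {1: "CEO", 2: "Director", 3: "Senior Manager", 4: "Manager"}

def path_to_root(node):
    # Walk upwards through the parents, then reverse so the root comes
    # first, matching the Title1..TitleN fields of an expanded nodes table.
    path = []
    while node is not None:
        path.append(titles[node])
        node = adjacent[node]
    return list(reversed(path))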
#8 Data cleansing
Sometimes, field values that represent the same thing may be written differently. For example, you could find the following common field values in different tables: UK, U.K., United Kingdom.
All three field values clearly mean the same thing, but the lack of consistency in their formatting means they could be interpreted as different values, leading to messy, inaccurate, or redundant data. This is why data cleansing is so important.
You can cleanse such data in Qlik Sense and QlikView using a mapping table, which maps the column values between different tables. This ensures that values that are written in different ways will consistently be recognised as the same value, not different ones.
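The mapping-table idea can be sketched in a few lines of Python (illustrative only; this mirrors the behaviour of a mapping table rather than using Qlik itself): variant spellings are normalised to one canonical value, and unmapped values pass through unchanged.

```python
# Mapping table: variant spelling -> canonical value.
country_map = {"UK": "UK", "U.K.": "UK", "United Kingdom": "UK"}

def cleanse(value):
    # Unmapped values fall back to themselves, so clean data is untouched.
    return country_map.get(value, value)
```

After cleansing, all three variants count as a single value in any aggregation or filter.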
#9 Mapping instead of joining
As we discussed in point #2, the join prefix is a powerful way to combine multiple tables in Qlik Sense and QlikView, but it often results in very large tables that can be a drag on performance. You can get around this problem by using mapping instead.
Let’s look at an example. The first table below presents a business’s order book. Imagine you needed to know which countries your customers are from, which is stored in the second table below.
OrderID | OrderDate | ShipperID | Freight | CustomerID
470 | 2022-11-01 | 1 | 62 | 2
471 | 2022-11-02 | 2 | 58 | 1
472 | 2022-11-02 | 1 | 32 | 3
473 | 2022-11-04 | 1 | 11 | 4

CustomerID | Name | Country
1 | GPP | USA
2 | ElectroCorp | Italy
3 | DataMesh | France
4 | Coopers | UK
To look up the country of a customer, you’d need to create a mapping table, like the one below:
CustomerID | Country
1 | USA
2 | Italy
3 | France
4 | UK
By applying the mapping table to the order table, you create a clear table, like this:
OrderID | OrderDate | ShipperID | Freight | CustomerID | Country
470 | 2022-11-01 | 1 | 62 | 2 | Italy
471 | 2022-11-02 | 2 | 58 | 1 | USA
472 | 2022-11-02 | 1 | 32 | 3 | France
473 | 2022-11-04 | 1 | 11 | 4 | UK
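The mapping approach can be sketched in plain Python (illustrative only; this shows the idea behind the technique rather than Qlik's own functions): a small lookup dict enriches the order rows without ever materialising a joined table.

```python
# Order table (reduced to the relevant columns for illustration).
orders = [
    {"OrderID": 470, "CustomerID": 2},
    {"OrderID": 471, "CustomerID": 1},
    {"OrderID": 472, "CustomerID": 3},
    {"OrderID": 473, "CustomerID": 4},
]

# The mapping table: CustomerID -> Country.
country_of = {1: "USA", 2: "Italy", 3: "France", 4: "UK"}

# Apply the mapping: each order gains a Country field via a cheap lookup,
# instead of joining the full customer table onto the orders.
enriched = [{**o, "Country": country_of[o["CustomerID"]]} for o in orders]
```

Only the two-column mapping is held alongside the orders, so memory use stays close to that of the original table.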
#10 Creating date intervals from single dates
In some cases, time intervals are not stored with a beginning and an end time, but rather a single field representing when something changed. Take this table below, for example, which shows different rates for two different currencies:
Currency | Change Date | Rate
EUR | – | 8.59
EUR | 28/01/2013 | 8.69
EUR | 15/02/2013 | 8.45
USD | – | 6.50
USD | 10/01/2013 | 6.56
USD | 03/02/2013 | 6.30
In this instance, the change date field is equivalent to the beginning date of an interval, and the end date is defined by the beginning of the next interval. The two empty rows in the change date column show the initial currency conversion rate, prior to the first change being made.
Additionally, there’s no end date column. To create a new table that has an end date column, you’ll need to follow the steps outlined in this article for Qlik Sense and this article for QlikView. Once that’s done, you will produce a table like this:
Currency | Rate | FromDate | ToDate
EUR | 8.45 | 15/02/2013 | 01/03/2013
EUR | 8.69 | 28/01/2013 | 14/02/2013
EUR | 8.59 | 01/01/2013 | 27/01/2013
USD | 6.30 | 03/02/2013 | 01/03/2013
USD | 6.56 | 10/01/2013 | 02/02/2013
USD | 6.50 | 01/01/2013 | 09/01/2013
Pro Tip: When working with multiple date fields, a master calendar with a canonical date avoids creating a separate calendar for each date field.
Making best practice normal practice
Data modelling is a complicated process. But to make the most of your data and powerful platforms like Qlik, effective data modelling is critical. Without a solid understanding of Qlik data model best practices, however, you could put unnecessary strain on the platform — and never truly unlock the insights in your data.
This can affect the speed and efficiency of your data processing, which in turn can impact the speed of your decision-making, the value of your data, and the ROI of your investment in the tool itself.
By working with a trusted data partner like Ipsos Jarmany, you can sidestep these issues altogether, ensuring that you get the most out of Qlik and, as a result, your data. Whether it’s supplementing your in-house team or providing a fully outsourced service, our experts are here to help you implement data modelling best practices with minimum hassle and maximum benefit.
If you’d like to find out more about how Ipsos Jarmany could help you unlock the power of Qlik, get in touch today and talk to one of our experts.
The sunsetting of third-party cookies has shaken up the marketing and advertising industry, with many now searching for alternative ways to identify and target audiences whilst balancing growing consumer demand for data privacy, security, and greater control and visibility of their data.
Alongside this, factors such as GDPR, Google’s privacy sandbox and Apple’s IOS14 update are further restraints that the advertising ecosystem needs to navigate.
In this blog, we’re going to discuss why first party data should be your first priority, what you should be doing to enhance your first party data strategy, and how it can help you to deliver a personalised customer-first experience whilst remaining fully compliant in a post-cookie world.
Let’s get to it.
The importance of first party data
Google’s current plan is for third-party cookies to be phased out of Chrome by the end of 2024. Most web browsers already block third-party cookies by default, but Google’s update will of course have the largest impact.
This means businesses will be unable to conduct cross-domain tracking and as a result will be unable to see:
What other websites the user has visited
Their end-to-end user journey
Other products or services they’ve purchased
Put simply, there will be much less data at our disposal, affecting our understanding of users and our ability to deliver a personalised experience. This places even more emphasis on what we do with our first party data. The use of first-party data is the number one lever for business growth and gaining competitive differentiation through personalised experiences.
And businesses that utilise their first party data can benefit from:
2 x incremental revenue
1.5x cost efficiency1
First party data powers personalisation
Consumers want a personalised experience, and those that get it are 80% more likely to buy from a brand.
Let’s take a look at Spotify as an example. Since 2016 they’ve been running their viral ‘Spotify Unwrapped’ campaign every December – using their first party data to create the ultimate personalised music experience, focused on telling you what you listen to…and don’t we all love it!
In fact, the 2020 campaign generated over 60 million shares from 90 million users and led to a 21% surge in downloads of the Spotify mobile app2. All just from centring a campaign around their first party data to build a buzz around their brand and generate customer loyalty.
Adopt a privacy-first approach
Whilst we’ve established the importance of first party data, building a database of loyal customers is far from straightforward. There’s been a shift in consumer perspective of data privacy, fuelled by GDPR, and so consumers are increasingly conscious of controlling who has access to their data. It’s therefore vital that brands adopt a privacy-first approach in their digital marketing to establish trust.
It’s also important to identify what your value exchange is. Ask yourself: what does the customer gain by consenting to share their information? Does it mean they’ll get access to a one-time 10% discount code, or perhaps content that matches their interests? Be transparent about it – tell your customers what the value exchange is.
By providing a positive privacy experience, not only are you more likely to gain first party consent, but a recent study also indicated that some companies could increase brand share by 43%3. That same study also found that a poor privacy experience was almost as damaging as a data privacy breach.
Use first party data to fuel machine learning
We’ve entered the ‘predictive era’ of digital marketing, whereby sophisticated predictive modelling and algorithms, such as artificial intelligence and machine learning, are increasingly used to pre-empt consumer behaviour and bridge the gap between observable and unobservable insights. It’s therefore vital that you have strong first party customer data to fuel your machine learning.
“As online advertising becomes more automated, your first party data plays a critical role in optimising towards your KPIs. Machine learning is only as good as what you ask it to optimise.”4
First party data can come from a plethora of sources, including:
Web
App
CRM
Social
Data modelling can also help you to connect these disparate data sources so you can see the true picture of your customer data and avoid viewing data sources in silos.
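As a purely illustrative sketch (the field names and sample data below are invented for the example), connecting two first-party sources can be as simple as stitching them on a shared customer key:

```python
# Hypothetical first-party sources: a CRM profile and web-analytics events.
crm = {"c1": {"name": "Acme", "segment": "Enterprise"}}
web_visits = [{"customer_id": "c1", "page": "/pricing"},
              {"customer_id": "c1", "page": "/contact"}]

# Join the sources on the customer key to build one unified view,
# rather than analysing each source in its own silo.
unified = {
    cid: {**profile,
          "visits": [v["page"] for v in web_visits
                     if v["customer_id"] == cid]}
    for cid, profile in crm.items()
}
```

In practice the sources are messier and the keys rarely line up this cleanly, which is where proper data modelling earns its keep.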
Get in touch
Consolidating your first party data and then using it to fuel machine learning sounds relatively straightforward, but in practice it’s more sophisticated than you may think and requires a specialised skillset. This skillset can be time-consuming and expensive to build internally – as a result, leaning on a specialist agency, like us (no apologies for the plug), to support you is often the best approach.
Get in contact with us today if you’d like to explore how we can help you manage your first party data and then build machine learning models to help you drive business growth.
Having data is one thing, but being able to utilise it to drive positive change is another. After all, data is simply a raw material. Turning it into actionable insights requires the right tools, processes, and expertise.
Get this right, and your business will have a significant competitive advantage over those that don’t. Data-driven businesses are:
23x more likely to acquire new customers
6x more likely to retain those customers
19x more likely to be profitable1
Data visualisation tools play a central role in the process of becoming more data-driven. By presenting the results of data analysis in a way that is clear, graphical, and actionable, they enable anyone to understand and act on data insights.
In this article, we’ll look at one of the most popular data visualisation tools on the market — Tableau — and how it could benefit your business.
What is Tableau?
Tableau is a data visualisation tool that helps organisations maximise the potential of their data and better inform their decision-making.
Launched in 2003, Tableau is the market leader in the data visualisation space, with a market share of around 18%, putting it just ahead of competitor Microsoft Power BI2. The company behind the tool, Tableau Software, was acquired by SaaS giant Salesforce in 2019 for over $15 billion.
So, what are the benefits of using Tableau? Let’s dive right in.
#1 Data visualisation
As the name suggests, data visualisation is the process of presenting data insights visually, allowing users to spot patterns, see trends, and understand and unpack insights.
Data visualisation is incredibly powerful because it allows us to process information faster. It’s far easier to understand graphs and charts than raw data and spreadsheets. As a result, anyone can leverage the power of data analytics to make better decisions — whether they’re a data expert or not.
Tableau brings together data from multiple sources and transforms it into easy-to-understand, customisable visualisations that empower teams to make better decisions.
#2 Interactive visualisations
Interactive visualisations allow users to explore and analyse data directly via customisable, responsive dashboards. This empowers users to drill down into the data and uncover new insights. This is particularly helpful for numerous teams using the same dashboard, as it enables them to drill down into the data as much or as little as they need.
With Tableau, visualisations are built through simple drag-and-drop functions to create dashboards and reports. Users can then interact with these visualisations through filtering and selection options.
This makes the process of data analysis more intuitive and inclusive, from those creating the visualisation to those viewing it, and much easier to understand at a glance than a messy, overwhelming Excel sheet.
#3 Easy implementation
Unlike tools such as SAP BusinessObjects and Domo, or programming languages such as Python or R, you don’t need to be able to code to use Tableau. Nor do you need to be a data expert or data scientist.
Instead, Tableau offers an intuitive, user-friendly interface that makes it relatively simple to use. This empowers teams from across an organisation to become more data-driven and data-literate, instead of having to rely on internal data teams to spoon-feed them insights. But ease of use doesn’t mean limited functionality, power, or flexibility: users can go further, drawing more out of the data with coding and more sophisticated techniques.
Tableau is also simple to set up, meaning you can start making data-driven decisions faster.
#4 Data source compatibility
Businesses today collect data across multiple sources — everything from files, spreadsheets, and databases to cloud-based applications. Tableau allows you to connect to, access, and blend data from multiple sources into single visualisations.
This means that you don’t have to create different types of visualisation for different data sources. Alternatively, you can choose to use a range of data sources but view them separately.
With Tableau, you get a complete view of all your data — from sources including SQL Server, Google Analytics and Salesforce — allowing you to streamline processes and make smarter decisions.
#5 Use multiple scripting languages
While you don’t need to be able to write code to use Tableau, it is possible to use programming languages such as Python and R to maximise Tableau’s potential and create more complex data flows and calculations.
Using Python scripts within Tableau enables users to:
Transform data
Run complex machine-learning pipelines
Query remote sources for information
Fix potential performance issues
Speed up computation
Python scripts do this in Tableau in two main ways:
Gather and process data, which can then be used in your reports
Produce bespoke visualisations for added power and flexibility.
To ensure compatibility for bespoke visualisations, you can import Python’s visualisation packages into Tableau, as Python is not a native Tableau language. While this process requires sufficient technical expertise and experience, it’s important to remember that Tableau can still be used ‘out of the box’ without needing coding experience (see point #3).
#6 Handle large amounts of data
Tableau is able to handle large data sets, processing millions of rows of data with ease, and performance holds up even as data volumes grow. Your Tableau dashboard will continue to offer interactive data visualisation, real-time insights, and more – without you having to worry about lag.
That said, Tableau isn’t used only by large enterprises that collect vast amounts of data; it’s a data visualisation tool for businesses of all sizes. Even if you have relatively little data to work with, Tableau will help you understand what’s happening in your business and make better decisions.
#7 Mobile support
In today’s world of remote work and flexibility, being able to access critical information via our mobile phones is critical. With mobile apps for iOS and Android, Tableau allows you to access data insights on the go.
What’s more, Tableau allows you to customise your dashboard for the device you’re working on — whether that’s your phone, tablet, or laptop. Tableau automatically recognises the type of device you are using to view a report and makes adjustments to its scale, optimising the viewing experience. As a result, you can view beautiful reports and data visualisations wherever you are.
#8 Mapping geodata
One of Tableau’s most interesting features is its geodata mapping, which enables it to produce geodata visualisations. Via instant geocoding, Tableau is able to use location data to create interactive maps, complete with built-in demographic data sets such as population, region name, income, etc.
Geodata mapping adds another dimension to traditional data visualisations, allowing you to see the ‘where’ as well as the ‘what’ and the ‘why.’
#9 Low cost
A look at the advantages and disadvantages of Tableau wouldn’t be complete without discussing the cost. Despite its rich functionality and powerful dashboards, Tableau costs less than some high-profile competitors, like Qlik. So not only does Tableau save you time when it comes to the initial set-up, it could also save you money.
Unlock the benefits of Tableau
Tableau makes data visualisation and analysis easier for the average user than most competitor tools. That said, we’re still talking about data science and analytics, which can be complicated areas to grasp.
If you’ve decided to go with a tool like Tableau, you want to ensure that you get a good return on your investment. If you don’t have the right knowledge or expertise to get the most out of Tableau, you might not even know what you’re missing — impacting both your ROI and your long-term profits.
Furthermore, when it comes to data, mistakes can be extremely costly. After all, being data-driven only works if you understand how to unlock the power of that data in the first place.
Working with Ipsos Jarmany’s experts
To get the most out of Tableau — and by extension your data — it helps to work with a trusted data partner. That’s where Ipsos Jarmany’s consultants can help. We can help you unlock the potential of Tableau, empowering your business to become more efficient and data-driven.
Our team of data scientists and analysts are seasoned Tableau experts who work closely with businesses to customise and configure the platform to their specific needs. After all, no two businesses are the same — how you use Tableau will differ from how other companies use it.
If you’d like to learn more about how Ipsos Jarmany could help your business maximise the potential of Tableau, get in touch today and talk to one of our experts.
Account based marketing is a strategic approach to B2B marketing, whereby a business pools its resources to target a specific set of customers. Campaigns are then personalised to these target customers and designed to establish communication, build relationships and ultimately win new business.
Whilst ABM has been used by companies in various forms for a couple of decades, many businesses are still missing out on a fundamental element which can help enhance their strategy and drive success… data.
Did you know that:
1/3 of businesses aren’t using data for better decision-making
57% of businesses do not have a single view of data on their top clients across marketing, sales and customer success (i.e. a consolidated view of account performance, activity and engagement stats)
32% are not using data to make better decisions across marketing, sales and customer success.
Here are 4 ways you can (and should) be using data to help bolster your ABM approach.
#1 Connect your First-Party Data
Before you can launch your ABM strategy, you need to have a good understanding of your current customer base so you can establish important information such as:
Who your contacts are in each account
How they currently interact with you
How often they interact with you
The type of content they are interacting with
To get a holistic view of your current customers you need to be consolidating your first party data from numerous data sets like CRM, website visits, social engagement, advertising impressions, product usage, emails and meetings.
Not only will this help you to understand your current relationship with existing customers, but it will also help to unlock insights into the type of content they are engaging with, so you can also deliver relevant content. This is hugely important as consumers are 80% more likely to buy from a company that offers them a personalised experience.
Whilst it sounds easy, it involves consolidating this data from numerous sources, cleaning it so it’s accurate, in-date and relevant, and then analysing and visualising this data so you can derive valuable insights.
If you’re in the early stages of mapping out your ABM strategy, getting a true picture of your current customers will help you to map out characteristics of your ideal customer profile (ICP) so you can pinpoint target customers that match your current business model.
#2 Augment Intelligence with Third-Party Data
Once you’ve collated your first-party data, it’s then important to augment this intelligence with third-party data. This includes researching and collating information on:
Industry size
Job titles of influencers and key decision makers in that organisation/business sector
Technographics
Intent data
Industry news that may be impacting their business and creating pain points you can help to solve
It’s important to deliver content that is relevant and personalised to customers, their business, and their industry. 76% expect a more personalised digital experience from companies, so using third party data will provide you with the macro insights to help you achieve this.
#3 Use AI to Derive Insights from your First and Third-Party Data
Once you’ve collated your first and third-party data, you can then make sense of this through AI. Data and analytics techniques, such as predictive analytics and propensity modelling, can help you derive information that will guide your ABM strategy. Specifically, these models can help you:
Identify and score buying patterns
Establish other customers that are showing similar patterns
Predict pipelines
This information will then help you to pinpoint opportunities within your existing accounts and new target accounts.
#4 Use Data Modelling to Understand the True Impact of your ABM Campaign
The previous points have focused on how data can help you with the planning stages of ABM, however it’s important to note that data can also derive useful insights post-ABM campaign too.
Cross-channel activation is a big part of any ABM campaign, but how do you know which marketing tactics are having the most impact and delivering the best ROI?
Data and analytics techniques, such as marketing mix modelling (MMM), can help with this. Not only will MMM tell you which elements of your campaign are working best, it will guide you towards the channels that are providing the best ROI vs the ones that aren’t, so you can better invest your marketing budget in the right channels.
Account Based Marketing is all about delivering a personalised, customer-first experience to the key customers you’re targeting – data can help you achieve this, and more. Speak to a member of the team today to find out how Ipsos Jarmany can support you on your ABM journey.