Evaluate Your Data Asset through Customer Journey Analytics

The secret to ongoing profitability is three little words: “love your customer”. This is not just because of the purchases customers make, but because of the behavioural data they leave behind. In the age of the data-driven business, this is where you will find insights that can be leveraged to acquire new customers and retain existing ones.

You need to do whatever is necessary to keep existing customers on board. But when you have aggressive growth targets to meet, the only way to achieve meaningful uplift is by acquiring new customers. To succeed you need to be more creative in the way that you analyse your customer information and understand and utilise your data assets. Any one of your existing customers is a goldmine of information - if you know how to unlock and analyse the underlying data. Especially if you have the capability to analyse all of your customers’ behaviour.

Looking at the bigger picture, you can identify common trends and experiences that can be leveraged to attract new clients. By building cohorts of customers based on similar behaviours, for instance, you can create marketing (and retention) strategies that are tailored to customer interests, preferences and behaviour profiles. Done properly, analytics can enable companies to reach that highly desired ‘segment of one’, whereby each customer is understood and serviced as an individual.
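To make the idea concrete, here is a minimal sketch (in Python, with entirely hypothetical customer fields and thresholds) of how customers might be grouped into behavioural cohorts for tailored campaigns:

```python
from collections import defaultdict

# Hypothetical customer records: an id plus two behavioural measures.
customers = [
    {"id": "c1", "monthly_visits": 12, "avg_basket": 85.0},
    {"id": "c2", "monthly_visits": 2,  "avg_basket": 15.0},
    {"id": "c3", "monthly_visits": 11, "avg_basket": 90.0},
    {"id": "c4", "monthly_visits": 3,  "avg_basket": 12.0},
]

def cohort_key(c):
    """Bucket each customer by visit frequency and spend level."""
    freq = "frequent" if c["monthly_visits"] >= 8 else "occasional"
    spend = "high-spend" if c["avg_basket"] >= 50 else "low-spend"
    return (freq, spend)

# Group customers sharing the same behaviour profile into one cohort.
cohorts = defaultdict(list)
for c in customers:
    cohorts[cohort_key(c)].append(c["id"])
```

Each cohort can then receive its own tailored marketing or retention strategy; real cohort-building would of course use many more behavioural dimensions and statistically derived (rather than hand-picked) thresholds.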

Understanding your customer journey is critical to gaining insight into customer behaviour. In order to do this successfully you must understand the data footprints that illustrate customer journeys - only then will you be able to measure performance and success.

Marketers have long known that customer journeys are multi-stage affairs. Advanced analytics on your data stores reveals that the journey is made up of data footprints left by customers along their paths. Where the entire journey is digital, tools like Google Analytics make it very easy to identify and follow these footprints, tracking clicks and page navigation across your website.

But if your customer journey crosses multiple channels - online, phone, social media - it becomes more difficult to create an accurate, comprehensive overview. Not least because each footprint will typically be recorded in a different system. You must have a way to query and aggregate each of these datasets to properly understand the various nuances of the journey.
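As an illustration, a simple way to stitch footprints from separate systems into one chronological journey might look like the following (the systems, fields and events here are purely hypothetical):

```python
from datetime import datetime

# Hypothetical event extracts from three separate channel systems.
web_events = [
    {"customer": "c1", "ts": "2023-05-01T10:00", "event": "viewed pricing page"},
]
phone_events = [
    {"customer": "c1", "ts": "2023-05-02T09:30", "event": "called support"},
]
social_events = [
    {"customer": "c1", "ts": "2023-05-01T18:45", "event": "mentioned brand on Twitter"},
]

def journey(customer_id, *sources):
    """Merge footprints from every channel into one chronological journey."""
    events = [e for src in sources for e in src if e["customer"] == customer_id]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["ts"]))

steps = [e["event"] for e in journey("c1", web_events, phone_events, social_events)]
```

In practice each source system has its own schema and identifiers, so the hard part is usually matching the same customer across systems, not the merge itself.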

What is customer journey mapping?

As we’ve already implied, customer journey mapping is the process of documenting how customers go from brand-new prospect, through a first purchase and ongoing consumption of the product/service, all the way to the next buying cycle. To fully understand your customer’s journey you must also identify the data assets that document (or ‘record’) their experiences, decisions and behaviours.

Here at Idiro this is done by carrying out a deep-dive data asset discovery project to identify what data assets are available within an organisation and how they might be used to drive value. Mining those data sets allows you to track customers across all of your channels, providing a granular view into every decision point and outcome. We then put these insights to work to understand which journeys are the most effective for achieving your commercial objectives: customer acquisition, customer value increase and customer retention.

Any business can perform a customer journey mapping exercise - even those still developing their analytics or customer management programs. All you need is access to skilled, experienced analytics experts, and their tried and tested methodologies.

Moving beyond Post-its

The leap from journey map to actionable insight is not always so straightforward, however. Sometimes your most valuable data asset is not the most obvious one - in most cases it will be a combination of multiple data sources.

All the data you need for behavioural analysis is available, but you may need specialist skills and tools to extract those insights and to perform data visualisation. Querying multiple data sets and collating results to piece together the fine details of the customer journey can be complicated - and potentially time-consuming.

Looking further afield

It may be that some of the data sets you identify exist outside your organisation. Examining these external sources of data can be difficult - especially when you don’t know exactly what you are looking for. Social media is a rich source of data relating to product/service experiences and referrals - but you need to know how to collect, aggregate and analyse relevant data.

The data asset audit of the journey map will also point you in the right direction - you can then outsource the hands-on analytics tasks to experts like Idiro who have the tools and experience to analyse internal and external data sets.

Going social

Another source of data ripe for analysis is social media. With hundreds of millions of active users sharing experiences, thoughts and glimpses of their everyday lives, Twitter is a great place to gain additional understanding of your target market, because its data is freely available to marketers for behavioural and intent analysis. And if you can begin matching social media profiles with contact names, you instantly gain a head start on your sales leads.

Social media analysis also provides a way to gauge customer sentiment towards any subject of interest. This could be your brand, your products and services, or your competitors. Sentiment analysis provides another point on the customer journey map – and some insights on how to guide new prospects towards your brand.

Are people complaining about their current supplier for instance? Do they use negative language in their status updates? These are clear indicators of an unhappy customer – and an opportunity to poach them.
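A very crude sketch of this kind of negative-language flagging might look like the following (the term list and posts are invented for illustration - production sentiment analysis uses far more sophisticated models than keyword matching):

```python
# Hypothetical list of terms that signal dissatisfaction with a supplier.
NEGATIVE_TERMS = {"terrible", "awful", "overcharged", "cancel", "fed up", "worst"}

def is_unhappy(post):
    """Flag a post that uses negative language about a current supplier."""
    text = post.lower()
    return any(term in text for term in NEGATIVE_TERMS)

posts = [
    "Fed up with my broadband provider, third outage this month",
    "Lovely weather for a walk today",
]

# Keep only the posts that show signs of an unhappy customer.
unhappy = [p for p in posts if is_unhappy(p)]
```

Flagged posts become candidate prospects - the people most likely to respond to a well-timed switching offer.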

Once you have identified specific individuals (or similar groups of individuals) you can use your customer journey map to target messaging and draw them into your sales funnel.

Other networks such as LinkedIn and Instagram offer similar opportunities – assuming you have the right social media analytics in place. Or a suitably experienced data mining partner.

A worthy investment

Never assume that the cost of predictive analytics and customer journey mapping is too high, or that you can simply “muddle” your way through. After all, you are entering a market that has incumbents – and you will have to entice most of your customers away from them.

To do this you will need to expand your data horizons to include third party information. Doing so will enrich your understanding of your marketplace and the potential customers that inhabit it. Not only will you better connect with new prospects, but the behavioural insights will provide another part of the puzzle for understanding existing clients, allowing you to further refine your customer retention strategies.

Businesses are quickly realising that advanced analytics is a crucial tool for managing the customer journey, using customers’ own behaviour to provide a better quality of service - and to maximise revenue-earning opportunities. Making better use of the data you have is vital if you are to love your existing customers - and to help you find new ones.

To learn more about advanced analytics and using third party data to enhance the accuracy and quality of the insights you generate, please call us now on +353 1671 9036.

In-house vs Outsourced – Building an analytics function that hits the ground running

We live in a data-driven economy and failure to build a data analytics competence of some kind leaves you at a competitive disadvantage. And we know that businesses need to become much smarter about how they use data to retain or attract customers.

One of the first choices you face is how to build out your analytics function – do you want to build a team in-house, or partner with external experts, or even choose a hybrid model? This decision will have wide-ranging consequences for your ability to exploit your data in the future.

DIY data analytics

In theory, building your data analytics capability in-house has one major advantage – you can begin analysing your data almost immediately. Obviously, you will still need to deploy predictive analytics tools, but you can save time that would be otherwise spent identifying potential partners and agreeing service contracts.

But this course of action assumes you already have data science and analytics skills in-house. If not, you will need to hire suitably-skilled staff. And that’s where you start to run into delays and risks.

Paradoxically, you need data science experience in order to hire your first data scientist - otherwise you cannot properly evaluate their technical skills. It is also incredibly important to realise that you cannot simply bolt data science onto existing operations - you must change your culture to be able to act on the insights being generated by your data science team. Too often, businesses make this mistake and never realise the full potential of their investment.

Expensive skills shortages

Data science skills are in very short supply, which helps to drive salaries up. According to Payscale.com, the average annual salary for a senior data scientist is currently €70,318 – and rising. And you’ll need a broad range of skills that are rarely found in one person – according to research from McKinsey, “Best-practice companies rarely cherry-pick one or two specialist profiles to address isolated challenges. Instead, they build departments at scale from the start.”

Although you will realise significant benefits, building an in-house team to turn your company’s data into money will involve a substantial initial outlay.

Instead of making new hires, you could retrain existing staff. But this will greatly increase the time to get your advanced analytics program up and running – and longer still until you see returns on your data analytics investments.

Why you should consider outsourcing

Keeping analytics in-house creates a huge burden on time and resources – at least during the initial stages of building your data analytics capability. Over time an in-house team will deliver value, but many CFOs will baulk at the time it takes to generate a return on investment.

Partnering with an external provider offers a much quicker return on investment because the entire process is shortened. And because your partner already has a suite of pre-configured analytics tools, they can begin unlocking value from your data almost immediately.

Outsourcing can be a transitional process too. One way to get the best of both worlds is to outsource all of your predictive analytics functions initially while you build an in-house data science team. As those capabilities come on stream, you can then start bringing functions back in-house.

Using third party consultancy also helps you avoid the staffing issues inherent in trying to maintain operations in-house. Your business doesn’t have to attract suitably skilled data scientists, or deal with rapidly increasing salary demands.

Outsourcing can be implemented in different ways too. Hybrid outsourcing allows you to split responsibilities with your analytics partner, for instance. Under this model you retain responsibility for some elements - for example, the underlying database infrastructure - while the outsourcer provides others, such as the hard-to-come-by modelling and analytics functions. The hybrid model is fully flexible because no two scenarios or deployments are ever the same.

This allows you to maximise the use of your own staff, minimise outsourcing costs without limiting your analytics projects, and obtain the skills you really need for data-driven operations.

Speed is everything

When it comes to improving the customer experience, speed is incredibly important. Giving people what they want, when they want it is a key aspect of all customer retention strategies.

As you roll out your data analytics program, speed needs to be a factor at every point – including before you even begin analysing data. The faster you can get your predictive analytics capability in place and generating insights, the quicker you can begin to realise a return on your investment. McKinsey even put a figure on operating profit improvements, suggesting that “first movers” account for around 25% of the gain. Why? Because they have more time than their competitors to integrate analytics with their workflows.

In reality, if your business has never used predictive analytics tools before, choosing to implement data-driven strategies in-house could be a mistake. Any initial cost saving will be quickly cancelled out by the extended time it takes to begin generating actionable insights. Far better to outsource the work to the experts initially, and have your outsourcing partner train your team and gradually hand over responsibility for analytics as it comes up to speed.

For more help and advice on finding the optimum mix between in-sourcing and outsourcing for your data analytics team, please get in touch.

Advanced Analytics, Customer Churn and the Appliance of Science

In 2010 Eric Schmidt (former CEO of Google) said “Every two days now we create as much information as we did from the dawn of civilization up until 2003." That’s something like five exabytes of data. According to IBM, the build out of the “internet of things” will lead to the doubling of knowledge every 12 hours. Let that sink in for a moment.

We take the digital era for granted these days - we’ve normalised its existence - but when you step back and think about its impact, it’s as remarkable as it is overwhelming.

With the collective knowledge of the entire history of civilisation available for dissection, human behaviour has been documented in its entirety.

We’ll leave the philosophical ramifications of all of this to others - this is a B2B article on advanced analytics, after all - but it’s worth taking in the bigger picture of just how much data is out there.

If we leave aside the focus on big data and the internet of things and apply advanced analytics to just a tiny speck of this information - your customer database - the insights gleaned from your customers’ behaviour will be decisive in the future success or failure of your company.

Customer Intimacy

To start with, let’s get the most obvious lesson out of the way - retained customers are far more valuable than new ones, because of the cost of acquiring new customers. Adobe once found that it takes seven new shoppers to equal the revenue of a single repeat customer.

So if you are running retention campaigns, your focus needs to be on your existing customer base. The development of programs to improve customer experience has been a direct result of this understanding. By delivering an exceptional experience, customers will not defect – or so the theory goes. But despite throwing millions of euros at “experiences”, customers continue to defect. If anything, they leave even more quickly and easily than ever before.

So what has gone wrong?

Net Promoter Score

Customer experience is a nebulous concept, but there has to be a way to assess its success. And so the famous “net promoter score” (NPS) was born. For a while marketers felt they had a good way of understanding satisfaction levels by simply asking customers what they thought.

Surveys were sacrosanct.

But there is a problem with surveys and the NPS when it comes to churn prediction – what customers say and what they do are two different things. According to a report published in Bloomberg Businessweek, 60% of defecting customers describe themselves as ‘very satisfied’ just before they leave.

To make matters worse, the evidence of their impending defection has always been available – if you know where to look.

The Appliance of Science - Applied Analytics

Your existing customer database is a veritable goldmine of data for analysing customer behaviours. Every interaction between brand and consumer creates a digital footprint, an indication of intent – if you know how to read them.

Applied analytics provides a way to spot trends and patterns based on past behaviours. By classifying and categorising customers based on commonalities, you can drill down into those behaviours and better understand customers as individuals.

By following the behavioural trail you can identify indicators of intent. A customer may not say they are leaving, but their behaviour provides clues about what they are thinking. Has there been an increase in calls to customer support? A use of increasingly negative language in their emails? A reduction in their use of your service? All the behavioural indicators are there in plain sight - but only if you know what to look for and how to analyse it.

Taking these indicators and comparing them to the behaviours of other customers, you can predict their next move.
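As a toy illustration, a rule-based version of this comparison might look like the following (the fields and thresholds are hypothetical - real churn prediction uses statistical models trained on many customers, not hand-written rules):

```python
def churn_indicators(baseline, recent):
    """Count behavioural warning signs by comparing recent activity
    against the customer's own historical baseline (illustrative thresholds)."""
    signals = []
    if recent["support_calls"] > 2 * baseline["support_calls"]:
        signals.append("support calls more than doubled")
    if recent["negative_emails"] > baseline["negative_emails"]:
        signals.append("more negative language in emails")
    if recent["usage_hours"] < 0.5 * baseline["usage_hours"]:
        signals.append("usage less than half of normal")
    return signals

# Hypothetical customer: long-run averages vs the most recent month.
baseline = {"support_calls": 1, "negative_emails": 0, "usage_hours": 40}
last_month = {"support_calls": 4, "negative_emails": 2, "usage_hours": 15}

flags = churn_indicators(baseline, last_month)
```

The more warning signs that fire together, and the more closely they match the pre-defection behaviour of past churners, the stronger the case for early intervention.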

And here is the thing - you can identify, understand, and predict behaviour right down to the individual.

You can uncover how any one customer feels about your service and your offering, and confidently predict how likely they are to leave, when they are likely to leave, why, and what offer will make them happy to stay.

Act Early, Reduce Costs

With refinement your analytics will begin to identify these behaviours much more quickly, allowing you to act earlier. The sooner you act, the easier it is to recover the relationship – and the cheaper the incentive you need to offer. Your analytics will even reveal which retention incentives have had the greatest success for similar customers previously, further increasing your chances of a positive outcome for both parties.

Instead of issuing surveys that can be ignored, or which capture inaccurate sentiment data, analytics use the actual behaviours of your existing customers to make extremely accurate inferences and predictions. Statistical patterns provide actionable insights in a way that the nebulous NPS scoring system cannot, which means that your attempts to improve customer experience will always be more effective because you better understand each customer as an individual.

Fads come and go, but predictive behaviour modelling is just that: predictable. All the answers are there, but very few have the expertise or the tools to spot them, track them, report on them and recommend actions.

Speak to one of our analytics experts to see how you can use advanced analytics to improve your customer experience and reduce churn.

Will my car pass the NCT?

EDIT 7/8/18: Our NCT work featured in the Sunday Independent: https://www.independent.ie/life/motoring/car-reviews/which-car-is-best-of-the-test-37185356.html 

In Ireland, every car over 4 years old requires a roadworthiness certificate, which it gets by passing the National Car Test (NCT). If you're buying a used car, it's important to know how likely that make and model is to pass the NCT - and if it fails, on what part of the test.

To help you find out this information, Idiro has analysed the results of the last 5 years' NCT tests - and we offer you two tools:

The NCT checker

Idiro has created a simple NCT car checker tool, available online. You’ll find it at www.Idiro.com/NCTchecker.

Just enter the make, model and year of the car in question to learn all about how these cars perform in the NCT. If your car is quite rare, like the Mazda MX-5, then we recommend that you select all the years of NCT tests. Otherwise, just leave 2017 ticked.

As you can see, Mazda MX-5s from the year 2000 have a 66.3% failure rate across 5 years of tests - just slightly worse than the average of 65.7% for all cars of that age. However, look at the detail - the MX-5 does much better than average for some elements (no failures for suspension!) but much worse for others (four times worse for emissions). That will help you know what to look for when buying a used car, and help you prepare your car to maximise its chance of passing the NCT.

Idiro's NCT checker - input form and results

Our handy NCT checker works on phones, tablets and PCs. Kudos to my colleague John Grant for building it.

Exploring all of this year's NCT results

For people who would like to dig deeper into the NCT results, we have produced an interactive dashboard as a demonstration of our data analytics skills.


Again, it uses data published by the Road Safety Authority covering 2017 NCT tests. Data has not been published on retests, so our dashboard covers the first NCT test that each car underwent in 2017.

For practical purposes, the data is filtered to show only the twenty most popular makes of vehicle tested in 2017, and for each of these, only models with at least 1,000 tests in the year. This was necessary because showing every make and model of car tested would make the dashboard so complex as to be unusable. As a result, of the total 1.4 million tests carried out in 2017, 1.1 million are represented in this dashboard.
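For the curious, the filtering logic described above can be sketched in a few lines of Python (the figures below are invented for illustration, not real NCT data):

```python
from collections import Counter

# Hypothetical rows of (make, model), one per test carried out.
tests = (
    [("Toyota", "Corolla")] * 1500
    + [("Toyota", "Prius")] * 400
    + [("Ford", "Focus")] * 1200
)

TOP_MAKES = 20
MIN_TESTS_PER_MODEL = 1000

# Step 1: keep only the most popular makes.
make_counts = Counter(make for make, _ in tests)
top_makes = {m for m, _ in make_counts.most_common(TOP_MAKES)}

# Step 2: within those makes, keep models with enough tests.
model_counts = Counter(tests)
filtered = [
    (make, model)
    for (make, model), n in model_counts.items()
    if make in top_makes and n >= MIN_TESTS_PER_MODEL
]
```

In this toy example the Prius falls below the 1,000-test threshold and is dropped, while the Corolla and Focus survive - exactly the trade-off the dashboard makes to stay usable.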

However, if you do want to look at makes and models of cars that are not shown in this analysis, you can download the full dataset from the RSA.

How to use this dashboard

Pro tip: To reset all your filters and return to the original screen, click your browser’s refresh button.

The RSA provides the ‘Year of Car’ of each car tested, which we understand to mean the year of first registration. You can filter by the age of cars using this slider. For example, if you want to see test results for all cars registered in 2010 or before, you simply drag the end buttons in the slider over to the desired year.

This is an interactive dashboard, so as you change one parameter, all of the graphs adjust to match your selection. For example, this bubble table shows the top 20 most popular makes of car with a 2010 or older registration. The bubble's size indicates how popular that make is, and its colour indicates the make's pass rate - from deep blue (high pass rate) to deep red (low pass rate).

Test volume and pass rate per make

In this example, we can see below that for vehicles registered in 2010 and older, Toyota is the most popular make and has a high pass rate.

Now let's click on 'Toyota'. As you can see, this changes all the charts in the dashboard - they now only show the details of Toyota models.

Toyota selected

Pro tip: To compare different car makes, hold down the CTRL key while you click on each make that you want to filter in the bubble chart “Overall Popularity & Passing % of Model”.

Now let's examine the different Toyota models. In the next graphic to the right, ‘First time pass rate by model’, you’ll see the pass rate of each Toyota model.

First time pass rate for Toyotas

In the table on the far right, entitled ‘Most Popular Model & Age’, you’ll see each model in the Toyota range. The Prius is the Toyota with the highest pass rate, but it isn’t the most popular - that honour goes to the Corolla (which, as you can see, has been around since 1980).

As you scroll down, you can see the ‘First Time Failure By Year’ graph which shows the number of cars tested (blue bars) and failure rates (red line) for each year of registration. As you can see, younger cars are much more likely to pass the NCT. To look at failure rates over time in each category within the test, you can filter by category in the drop-down menu.

To the right you'll see the ‘First Time Failure By Category’ table, which shows the percentage of cars that fail each category within the NCT test. This image displays what caused Toyota cars to fail their NCT.

As you can see, the dashboard allows you to dig deep into the 2017 NCT test results. Here again is the link to the dashboard:


This dashboard works best on PCs, rather than mobiles. Kudos to colleagues Paul Goldsberry and John Grant for building it.


We do hope you find these tools useful. To discuss how Idiro's analytics skills can help your business, drop us an email at info@idiro.com.  To download the source data from RSA.ie, click here.


Idiro shortlisted in TWO categories at the Technology Ireland software awards

We are delighted to announce that Idiro has been shortlisted for awards in two categories of the prestigious Technology Ireland software awards.  Our two categories are:

  • Digital Technology Services Project of the Year, for our analytics project in the South Pacific
  • Technology Innovation of the Year, for Red Sqirl, Idiro's advanced analytics platform for Big Data

Idiro's CEO, Aidan Connolly commented: "It is an honour to be shortlisted for these awards and it is testament to the ingenuity and hard work by the team". 

The awards ceremony is on Friday 24th November and our fingers are crossed.

Analysis of NCT test results helps car buyers choose wisely


Today Idiro has published a data visualisation dashboard (here) allowing you to explore the 2016 NCT test results. 

Update 21/8/17: The Sunday Independent ran a story yesterday on Idiro's dashboard - see here http://www.independent.ie/life/motoring/car-reviews/put-your-car-to-the-test-36049372.html 

Update 30/8/17: Our dashboard has been picked up by many other media including RTÉ: https://www.rte.ie/lifestyle/living/2017/0830/900977-is-this-the-worst-car-model-for-nct/

About this dashboard and the data

This interactive report is published by Idiro Analytics as a demonstration of our data visualisation capability. The data used is published by RSA.ie and covers NCT tests conducted in 2016. The data covers the first NCT test that each car underwent in 2016 - data has not been published on retests for cars that fail.

The data is filtered to show only the twenty most popular makes of vehicle tested in 2016, and for each of these, only models with at least 1,000 tests carried out. This was necessary because showing every make and model of car tested would make the dashboard unusable. As a result, of the total 1.4 million tests carried out in 2016, 1.1 million are represented in this dashboard.

If you wish to examine the data further or want to look at makes / models of cars that are not shown in this analysis, the full dataset is available for download at RSA.ie.

How to use this dashboard

Pro tip: To reset all your filters and return to the original screen, click your browser’s refresh button.

Pro tip: To compare different car makes, hold down the CTRL key and click on each make you want to filter in the bubble chart “Overall Popularity & Passing % of Model”.


The RSA provides the ‘Year of Car’ of each car tested.  You can filter by the age of cars using this slider. 

If you want to see cars from 2010 or before that were tested in 2016, you simply drag the end buttons in the slider (at the top) over to the desired year.

As the year changes, so do the interactive charts. This bubble chart shows the top 20 most popular makes of car with a 2010 or older registration that were tested in 2016.

The size of each bubble indicates how popular that make is, and its colour indicates the pass rate - from deep blue (high pass rate) to deep red (low pass rate).

For example, we can see below that for vehicles registered in 2010 and older, Toyota is the most popular make and has a high pass rate in 2016.

In the next graphic to the right, ‘First time pass rate by model’, you’ll see the different Toyota models that were tested.

In the next table, ‘Most Popular Model & Age’, you’ll see the most popular Toyota models among those tested, along with their ages.

Although the Prius has the highest pass rate of the Toyota models, it isn’t the most popular - the Corolla is. As you scroll down the dashboard, you can see the failure rate of each car tested in the NCT, and the cause of failure.


Using the ‘First Time Failure By Year’ drop-down menu, you can filter failures by category, or choose ‘Total’ to see all categories combined. In the image below, the blue bars indicate the number of cars tested from each registration year, and the red line shows the failure rate. Remember that pass/fail thresholds can vary according to the age of the car.

As you scroll across, you’ll see the ‘First Time Failure By Category’ table, which shows the percentage of cars that fail each category within the NCT test.

This graph displays what caused Toyota cars to fail their NCT. 



Here's the link to the dashboard: https://public.tableau.com/profile/idiro.analytics#!/vizhome/NCT2016Top20MakesResults/NCT2016-20MostPopularMakes

To contact Idiro about this blog post or about how Idiro's analytics can help your business, drop us an email at info@idiro.com.  To download the source data from RSA.ie, click here.  



Twenty Numbers that Define Kenny’s Leadership in the Past Six Years

Google, homelessness and a shrinking unemployment rate: a look at the figures that will come to define Kenny’s legacy—for better and worse.

  • 2,277: days in power on 1st June 2017.
  • €197,000: Enda Kenny’s average salary between the 2011 election and the end of 2016. 
  • 14.4%: the unemployment rate in February 2011 when Enda Kenny was elected Taoiseach. 
  • 6.2%: the unemployment rate in April 2017. 
  • 2.59%: average inflation rate in 2011. 
  • 0.01%: average inflation rate in 2016. 
  • 7: words [the homeless] “don't want to come off the streets” - Enda Kenny’s opinion on the homeless in 2016.
  • 4,588,252: the population of Ireland in 2011. 
  • 4,761,865: the population of Ireland in 2016.
  • 3.8%: increase in the population of Ireland between 2011 and 2016. 
  • 173,613: increase in population from 2011 to 2016.
  • €13,000,000,000: Apple’s Irish Tax bill.
  • 2.7: Doctors per 1,000 population in 2013 
  • 20: Ireland’s rank in 2015 for disposable income within the 38 OECD countries. 
  • 3,808: the number of homeless people in Ireland as of April 2011.
  • 7,472: the number of homeless people in Ireland as of March 2017.
  • 679: drug related deaths in 2013.
  • 26: seats lost by Fine Gael in the general election 2016.
  • €22,600,000,000: Google’s EMEA revenue from controversial advertising sales business in Ireland in 2015.
  • €47,800,000: tax paid by Google in Ireland in 2015.


Big data – will it solve your marketing problems?

As ever, Tom Fishburne has a point.  Increasingly, organisations are turning to their data to improve decision-making and commercial results - but buying big data infrastructure won't solve your marketing problems.  In many ways, installing the big data infrastructure is the easy bit.  The real challenge, as Idiro has found time and time again, is turning all that data into money.  For this you need people with the BI and analytics skills to mine all that newly-available data for dashboards, insights and predictions.  And of course the organisation needs to be ready to change - to try new ways of using data to drive commercial activity - and it needs to be prepared to fail.  Samuel Beckett said:

'Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.'

With the right analytics partner, the journey to excellence in data-driven marketing should be a lot easier than Beckett paints it - but nevertheless, it takes skill and a ruthless focus on the results.  However, the results from using your organisation's data to drive its business are nearly always well worth the effort.