One of the first choices you face is how to build out your analytics function – do you want to build a team in-house, or partner with external experts, or even choose a hybrid model? This decision will have wide-ranging consequences for your ability to exploit your data in the future.
DIY data analytics
In theory, building your data analytics capability in-house has one major advantage – you can begin analysing your data almost immediately. Obviously, you will still need to deploy predictive analytics tools, but you can save time that would be otherwise spent identifying potential partners and agreeing service contracts.
But this course of action assumes you already have data science and analytics skills in-house. If not, you will need to hire suitably skilled staff. And that’s where you start to run into delays and risks.
Paradoxically, you need data science experience in order to hire your first data scientist – otherwise you cannot properly evaluate their technical skills. It is also incredibly important to realise that you cannot simply bolt data science onto existing operations – you must change your culture to be able to act on the insights being generated by your data science team. Too often, businesses make this mistake and never realise the full potential of their investment.
Expensive skills shortages
Data science skills are in very short supply, which is helping to drive salaries up. According to Payscale.com, the average annual salary for a senior data scientist is currently €70,318 – and rising. And you’ll need a broad range of skills that are rarely found in one person – according to research from McKinsey, “Best-practice companies rarely cherry-pick one or two specialist profiles to address isolated challenges. Instead, they build departments at scale from the start.”
Although you will realise significant benefits, building an in-house team to turn your company’s data into money will involve a substantial initial outlay.
Instead of making new hires, you could retrain existing staff. But this will greatly increase the time to get your advanced analytics program up and running – and longer still until you see returns on your data analytics investments.
Why you should consider outsourcing
Keeping analytics in-house places a huge burden on time and resources – at least during the initial stages of building your data analytics capability. Over time an in-house team will deliver value, but many CFOs will baulk at the time it takes to generate a return on investment.
Partnering with an external provider offers a much quicker return on investment because the entire process is shortened. And because your partner already has a suite of pre-configured analytics tools, they can begin unlocking value from your data almost immediately.
Outsourcing can be a transitional process too. One way to get the best of both worlds is to outsource all of your predictive analytics functions initially while you build an in-house data science team. As those capabilities come on stream, you can then start bringing functions back in-house.
Using a third-party consultancy also helps you avoid the staffing issues inherent in maintaining operations in-house. Your business doesn’t have to attract suitably skilled data scientists, or deal with rapidly increasing salary demands.
Outsourcing can be implemented in different ways too. Hybrid outsourcing, for instance, allows you to split responsibilities with your analytics partner. Under this model you retain responsibility for some elements – for example, the underlying database infrastructure – while the outsourcer provides others, such as the hard-to-come-by modelling and analytics functions. The hybrid model is fully flexible because no two scenarios or deployments are ever the same.
This allows you to maximise the use of your own staff resources and minimise outsourcing costs without limiting your analytics projects, while obtaining the skills you really need for data-driven operations.
Speed is everything
When it comes to improving the customer experience, speed is incredibly important. Giving people what they want, when they want it is a key aspect of all customer retention strategies.
As you roll out your data analytics program, speed needs to be a factor at every point – including before you even begin analysing data. The faster you can get your predictive analytics capability in place and generating insights, the quicker you can begin to realise a return on your investment. McKinsey even put a figure on operating profit improvements, suggesting “first movers” account for around 25% of the gain. Why? Because they have more time to integrate analytics with workflows than their competitors.
In reality, if your business has never used predictive analytics tools before, choosing to implement data-driven strategies in-house could be a mistake. Any initial cost saving will be quickly cancelled out by the extended time it takes to begin generating actionable insights. Far better to outsource the work to the experts initially, and have your outsource partner train your team and gradually hand over responsibility for analytics as it comes up to speed.
For more help and advice on finding the optimum mix between in-sourcing and outsourcing for your data analytics team, please get in touch.
We take the digital era for granted these days – we’ve normalised its existence – but when you step back and think about its impact, it’s as remarkable as it is overwhelming.
With the collective knowledge of the entire history of civilisation available for dissection, human behaviour has been documented in its entirety.
We’ll leave the philosophical ramifications of all of this to others – this is a B2B article on advanced analytics, after all – but it’s worth taking in the bigger picture of just how much data is out there.
If we leave aside the focus on big data and the internet of things and apply advanced analytics to just a tiny speck of this information – your customer database – the insights gleaned from customer behaviour will be decisive in the future success or failure of your company.
To start with, let’s get the most obvious learning out of the way – retained customers are way more valuable than new ones, due to the costs of acquiring new customers. Adobe once found that it takes seven new shoppers to equal the revenue of a single repeat customer.
So if you are running retention campaigns, your focus needs to be on your existing customer base. The development of programs to improve customer experience has been a direct result of this understanding. By delivering an exceptional experience, customers will not defect – or so the theory goes. But despite throwing millions of euros at “experiences”, customers continue to defect. If anything, they leave even more quickly and easily than ever before.
So what has gone wrong?
Net Promoter Score
Customer experience is a nebulous concept, but there has to be a way to assess its success. And so the famous “net promoter score” (NPS) was born. For a while marketers felt they had a good way of understanding satisfaction levels by simply asking customers what they thought.
Surveys were sacrosanct.
But there is a problem with surveys and the NPS regarding churn prediction – what customers say and what they do are two different things. According to a report published in Bloomberg Businessweek, 60% of defecting customers describe themselves as ‘very satisfied’ just before they leave.
To make matters worse, the evidence of their impending defection has always been available – if you know where to look.
The Appliance of Science – Applied Analytics
Your existing customer database is a veritable goldmine of data for analysing customer behaviours. Every interaction between brand and consumer creates a digital footprint, an indication of intent – if you know how to read it.
Applied analytics provide a way to spot trends and patterns based on past behaviours. By classifying and categorising customers based on commonalities, you can drill down into those behaviours and better understand customers as individuals.
By following the behavioural trail you can identify indicators of intent. A customer may not say they are leaving, but their behaviour provides clues about what they are thinking. Has there been an increase in calls to customer support? Increasingly negative language in their emails? A reduction in their use of your service? All the behavioural indicators are there in plain sight – but only if you know what to look for and how to analyse them.
Taking these indicators and comparing them to the behaviours of other customers, you can predict their next move.
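To make the idea concrete, here is a minimal sketch of behaviour-based churn scoring in Python. The feature names and figures are invented for illustration, and scikit-learn’s logistic regression stands in for whatever model your team or analytics partner actually uses:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical behavioural indicators per customer (names and values are illustrative)
features = ["support_calls_30d", "email_sentiment", "usage_change_pct"]
history = pd.DataFrame({
    "support_calls_30d": [0, 5, 1, 7, 2, 6],
    "email_sentiment":   [0.6, -0.4, 0.2, -0.7, 0.1, -0.5],  # -1 negative .. +1 positive
    "usage_change_pct":  [5, -40, 0, -60, -5, -35],          # % change in service usage
    "churned":           [0, 1, 0, 1, 0, 1],                 # known past outcomes
})

# Fit a simple model on customers whose outcome we already know
model = LogisticRegression()
model.fit(history[features], history["churned"])

# Score a current customer: estimated probability they are about to defect
current = pd.DataFrame([[4, -0.3, -30]], columns=features)
churn_risk = model.predict_proba(current)[0][1]
print(f"Churn risk: {churn_risk:.0%}")
```

In practice these features would be engineered from CRM records, support-call logs and usage data, and the model would be trained on thousands of historical outcomes rather than a handful.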
And here is the thing – you can identify, understand, and predict behaviour right down to the individual.
You can uncover how any one customer feels about your service and your offering, and confidently predict how likely they are to leave, when they are likely to leave, why they are likely to leave, and what offer will make them happy to stay.
Act Early, Reduce Costs
With refinement your analytics will begin to identify these behaviours much more quickly, allowing you to act earlier. The sooner you act, the easier it is to recover the relationship – and the cheaper the incentive you need to offer. Your analytics will even reveal which retention incentives have had the greatest success for similar customers previously, further increasing your chances of a positive outcome for both parties.
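The incentive-matching step can be sketched the same way. Given a (hypothetical) log of past retention offers, a simple aggregation reveals which incentive has worked best for each customer segment – the segment and incentive names below are invented for illustration:

```python
import pandas as pd

# Hypothetical log of past retention offers and their outcomes
offers = pd.DataFrame({
    "segment":   ["heavy_user", "heavy_user", "light_user", "light_user", "heavy_user", "light_user"],
    "incentive": ["discount",   "upgrade",    "discount",   "upgrade",    "upgrade",    "discount"],
    "retained":  [1,            1,            0,            1,            1,            1],
})

# Success rate of each incentive within each customer segment
success = (offers.groupby(["segment", "incentive"])["retained"]
                 .mean()
                 .unstack())
print(success)

# Best-performing incentive per segment
best = success.idxmax(axis=1)
print(best)
```

A real system would also weigh the cost of each incentive against the customer’s lifetime value, so the recommendation is the cheapest offer likely to work rather than simply the most effective one.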
Instead of issuing surveys that can be ignored, or which capture inaccurate sentiment data, analytics use the actual behaviours of your existing customers to make extremely accurate inferences and predictions. Statistical patterns provide actionable insights in a way that the nebulous NPS scoring system cannot, which means that your attempts to improve customer experience will always be more effective because you better understand each customer as an individual.
Fads come and go, but predictive behaviour modelling is just that…predictable. All the answers are there, but very few have the expertise or the tools to spot them, track them, report on them and recommend actions.
Speak to one of our analytics experts to see how you can use advanced analytics to improve your customer experience and reduce churn.
EDIT 7/8/18: Our NCT work featured in the Sunday Independent: https://www.independent.ie/life/motoring/car-reviews/which-car-is-best-of-the-test-37185356.html
In Ireland, every car over 4 years old requires a roadworthiness certificate, which it gets by passing the National Car Test (NCT). If you’re buying a used car, it’s important to know how likely that make and model is to pass the NCT – and if it fails, on what part of the test.
To help you find out this information, Idiro has analysed the results of the last 5 years’ NCT tests – and we offer you two tools:
The NCT checker
Idiro has created a simple NCT car checker tool, available online. You’ll find it at www.Idiro.com/NCTchecker.
Just enter the make, model and year of the car in question to learn all about how these cars perform in the NCT. If your car is quite rare, like the Mazda MX-5, then we recommend that you select all the years of NCT tests. Otherwise, just leave 2017 ticked.
As you can see, Mazda MX-5s from the year 2000 have a 66.3% failure rate across 5 years of tests – just slightly worse than the average of 65.7% for all cars of that age. However, look at the detail – the MX-5 does much better than average for some elements (no failures for suspension!) but much worse for others (four times worse for emissions). That will help you know what to look for when buying a used car, and help you prepare your car to maximise its chance of passing the NCT.
Our handy NCT checker works on phones, tablets and PCs. Kudos to my colleague John Grant for building it.
Exploring all of this year’s NCT results
For people who would like to dig deeper into the NCT results, we have produced an interactive dashboard as a demonstration of our data analytics skills.
Again, it uses data published by the Road Safety Authority covering 2017 NCT tests. Data has not been published on retests, so our dashboard covers the first NCT test that each car underwent in 2017.
For practical purposes, the data is filtered to show only the twenty most popular makes of vehicle tested in 2017, and for each of these, only models with at least 1,000 tests in the year. This was necessary because showing every make and model of car tested would make the dashboard so complex as to be unusable. As a result, of the total 1.4 million tests carried out in 2017, 1.1 million tests are represented in this dashboard.
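For readers who download the raw data, the filtering described above can be sketched in pandas. The column names and thresholds here are illustrative stand-ins (the real RSA file uses its own column names and the full 20-make / 1,000-test cut-offs):

```python
import pandas as pd

# Illustrative subset of the results data (real RSA column names may differ)
tests = pd.DataFrame({
    "make":   ["TOYOTA", "TOYOTA", "TOYOTA", "FORD", "FORD", "RARECAR"],
    "model":  ["COROLLA", "COROLLA", "YARIS", "FOCUS", "FOCUS", "ONEOFF"],
    "passed": [1, 0, 1, 1, 0, 1],
})

TOP_MAKES = 2   # stand-in for the top-20 cut
MIN_TESTS = 2   # stand-in for the 1,000-test threshold

# Keep only the most popular makes...
top_makes = tests["make"].value_counts().head(TOP_MAKES).index
popular = tests[tests["make"].isin(top_makes)]

# ...and, within those, only models with enough tests to be meaningful
counts = popular.groupby(["make", "model"])["passed"].transform("size")
filtered = popular[counts >= MIN_TESTS]

# First-time pass rate per surviving make/model
print(filtered.groupby(["make", "model"])["passed"].mean())
```

The same two-step filter (popularity, then minimum sample size) is what keeps small-sample models from showing misleading pass rates.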
However, if you do want to look at the makes and models of cars that are not shown in this analysis, you can download the full dataset from the RSA.
How to use this dashboard
Pro tip: To reset all your filters and return to the original screen, click your browser’s refresh button.
The RSA provides the ‘Year of Car’ of each car tested, which we understand to mean the year of first registration. You can filter by the age of cars using this slider. For example, if you want to see test results for all cars registered in 2010 or before, you simply drag the end buttons in the slider over to the desired year.
This is an interactive dashboard, so as you change one parameter, all of the graphs adjust to match your selection. For example, this bubble table shows the top 20 most popular makes of car with a 2010 or older registration. The bubble’s size indicates how popular that make is, and its colour indicates the make’s pass rate – from deep blue (high pass rate) to deep red (low pass rate).
In this example, we can see below that for vehicles registered in 2010 and older, Toyota is the most popular make and has a high pass rate.
Now let’s click on ‘Toyota’. As you can see, this changes all the charts in the dashboard – they now only show the details of Toyota models.
Pro tip: To compare different car makes, hold down the CTRL key while you click on each make that you want to filter in the bubble chart “Overall Popularity & Passing % of Model”.
Now let’s examine the different Toyota models. In the next graphic to the right, ‘First-time pass rate by model’, you’ll see the pass rate of each Toyota model.
In the table on the far right, entitled ‘Most Popular Model & Age’, you’ll see each model in the Toyota range. The Prius is the Toyota with the highest pass rate, but it isn’t the most popular Toyota – that honour goes to the Corolla (the data includes Corollas going back to 1980).
As you scroll down, you can see the ‘First Time Failure By Year’ graph which shows the number of cars tested (blue bars) and failure rates (red line) for each year of registration. As you can see, younger cars are much more likely to pass the NCT. To look at failure rates over time in each category within the test, you can filter by category in the drop-down menu.
To the right, you’ll see the ‘First Time Failure By Category’ table, which shows the percentage of cars that fail each category within the NCT test. This image displays what caused Toyota cars to fail their NCT.
As you can see, the dashboard allows you to dig deep into the 2017 NCT test results. Here again is the link to the dashboard:
We do hope you find these tools useful. To discuss how Idiro’s analytics skills can help your business, drop us an email at firstname.lastname@example.org. To download the source data from RSA.ie, click here.