A new generation of analytical software appears to offer freedom from drudge work for analysts and greater productivity for business users. So will automation really deliver the promised benefits, or are there downside risks? David Reed finds out.
Imagine you own a string of karaoke bars. Data on your customers is easy to come by - the venue is pre-booked, drinks and food go through an EPOS system, even the songs being sung can be tracked from the electronic catalogue. What you really want to do is ensure you optimise the revenue from each booking and incentivise your sales staff to keep the drinks flowing.
But how to do that in real-time without employing a skilled (and expensive) data analyst to prepare the data and reporting? Lucky Voice solved the challenge by adopting the business intelligence application Tableau, which now allows on-the-fly querying by bar managers and instant reporting of sales trends.
Data on the performance of each of the Lucky Voice karaoke bars, from revenue to song catalogue via the demand for drinks and food, is made available through interactive dashboards. As a result, services can be tailored to meet needs and increase the take. Staff are able to review their sales performance via the in-bar reservation system and compete with one another to see who sells the most. A competitive incentive programme has helped boost food and drinks revenues in each bar.
“Developers used to provide the reports, however, changes were slow and difficult and you couldn’t play with the data. Using Tableau, the team can ask incidental questions on the fly without engaging with the developers. Freeing up development time has saved us a significant sum of money,” says Nick Thistleton, managing director, Lucky Voice.
This is a prime example of how data-driven insights are increasingly being delivered via automated, self-service solutions, especially in low-risk environments, without the need for an analyst to get involved. With demand for analytics rising in a market where the supply of skilled analysts is very limited, it seems likely that automation of these kinds of tasks will only increase.
“There is an appetite for the democratisation of data,” says Chris Love, consultant at The Information Lab, which implements Tableau as well as other new-generation analytical applications such as Alteryx. “We are seeing that across industries, partly as a generational thing. Within our lives now, everything is becoming easier - you don’t need the skills you used to in order to understand these solutions.”
Speed-to-market is one of the drivers of this demand for the diffusion of analytical access. Businesses can no longer wait for a backroom insight team to explore the data, test and build a model and then operationalise it as part of a process which used to take months. Instead, they want in-the-moment data access and reports - even insights and forecasts - on which to base decisions.
Love says that use of downstream tools can help to transform the nature of business problems and how they are addressed. “We have been working with an electricity company that thought it had a GIS problem in trying to ensure it laid cables where demand was high. They understood the data, knew all about the infrastructure, but not necessarily the territory,” he says.
The Information Lab introduced Alteryx as the analytical interface for business users, allowing them to run queries without having to learn new skills or rely on high-level backroom analysts. “Now they are using that tool, they no longer see it as a GIS problem requiring specialist resources, they just run through queries with no particular domain expert overseeing them,” says Love.
What this next-generation solution - and others like it - aims to provide is the ability to distribute data access and insight routines far beyond the analyst’s usual footprint. A super-user does the heavy lifting of setting up complex models and defines the parameters within which downstream users can conduct their own queries.
But Love notes, “there are a lot of risks in making that more available - the risk is ignoring the science and the demands of the research that has gone into those areas. The fear for me is people making decisions based on data that can deteriorate if you ignore the way it needs to be done.”
A paradox of automated analytics and reporting tools is the way they both remove the obstacle of data integration and normalisation, yet potentially increase the risk that this has not been done properly. Analysts will routinely spend between 50 and 80 per cent of their time simply assembling data before building any models - even setting up reports demands a lot of data alignment. Removing that burden frees up a lot of value-adding skills and time. But if an expert is no longer involved in data assembly, it may not be done correctly.
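To make that chore concrete, here is a purely illustrative Python sketch - the venues, dates and formats are invented for this article, not drawn from any system mentioned in it - of the kind of alignment work that has to happen before two everyday exports can even be joined for a report:

```python
# Hypothetical sketch of the "data assembly" chore described above:
# two exports describing the same venues disagree on naming and date
# formats, and must be aligned before any reporting can happen.
import pandas as pd

sales = pd.DataFrame({
    "Venue Name": ["Soho ", "islington", "Brighton"],
    "Date": ["01/03/2024", "01/03/2024", "02/03/2024"],   # UK day-first dates
    "Revenue": [4200.0, 3100.0, 2800.0],
})
bookings = pd.DataFrame({
    "venue": ["SOHO", "Islington", "Brighton"],
    "booking_date": ["2024-03-01", "2024-03-01", "2024-03-02"],  # ISO dates
    "rooms_booked": [18, 12, 9],
})

# Normalise the join keys: trim and title-case venue names, parse dates consistently.
sales["venue"] = sales["Venue Name"].str.strip().str.title()
sales["date"] = pd.to_datetime(sales["Date"], dayfirst=True)
bookings["venue"] = bookings["venue"].str.strip().str.title()
bookings["date"] = pd.to_datetime(bookings["booking_date"])

# Only after this alignment can the two sources be joined for a report.
report = sales[["venue", "date", "Revenue"]].merge(
    bookings[["venue", "date", "rooms_booked"]],
    on=["venue", "date"], how="inner",
)
print(report)
```

Trivial as it looks, it is exactly this kind of step that is skipped, or silently done wrong, when no expert owns the assembly.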
“The development of these tools does give people better access to data,” agrees Tina Christison, strategy director, data and CRM at RAPP. “It also means the analysts can focus more on the analytical work than more basic chores, because they do tend to get involved in the provision of information that isn’t doing analytics.”
Her company has used Tableau and also Pentaho, a web-based business intelligence and data visualisation tool. “It does require a different approach to set up, requiring web expertise and Java knowledge, so we are moving away from having analysts involved in setting it up,” she says.
Christison points out that simply accessing data has itself become easier as a result of changes to IT architectures, but that certain categories - such as web logs or call centre records - are still challenging. With enough effort, that can be overcome.
A more significant question is the extent to which end users are capable of adopting these new solutions. “It depends on the maturity of the client. Some are at the leading edge of their analytics journey, but others are still at Analytics 1.0. To get them up to level 3.0 takes a lot of work, but we can start them on that journey,” she says.
In this type of scenario, a third-party data services provider can play an important role in establishing the processes, technical framework and back office analytical heavy lifting while the end user develops internal capabilities and knowledge. For those agencies and consultancies, being able to automate standardised or repeatable tasks is a boon.
Their own approach to managing analytics is also changing. “We still use SAS and SPSS, but we have also been looking at R - not least because it is free. It is shifting us away from perpetual licences and towards being able to tap into a developer community. Also, to put a graduate through a training course on SAS costs about £1,600 - and there are not as many being run,” says Christison.
Well-established analytical solutions of that sort are not going away any time soon. For one thing, there is a considerable sunk cost in those platforms and the training of experts to use them. For another, a culture exists that cares deeply about understanding where data has come from and how it has been put together.
“If you are a classic data company with a roomful of SQL developers who can do the ‘data wrangling’, there is a level of expectation among that old school because they care about it. If there is no common key to link different data sets together, for example, you will hit a dead end,” points out Charles Ping, chief executive of Fuel Data.
One of the most significant challenges in the use of big data is precisely how to find those links with other data sets which add meaning and usability. A web log can show all the different customer journeys through a site, but if the only identifier is a cookie which does not track back in some way to a specific consumer, just how much value does that give a marketer?
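A purely hypothetical sketch, with invented cookie and customer IDs, makes the point: unless a separate mapping links cookies back to known customers, much of the web log simply cannot be attributed, however neatly it is loaded:

```python
# Hypothetical illustration of the "common key" problem: web-log rows are
# keyed only by a cookie ID, so they are useful to a marketer only if a
# mapping ties those cookies back to known customers.
import pandas as pd

web_log = pd.DataFrame({
    "cookie_id": ["c1", "c1", "c2", "c3"],
    "page": ["/home", "/pricing", "/home", "/checkout"],
})

# Mapping captured at login or email click-through; cookie c3 was never matched.
cookie_to_customer = pd.DataFrame({
    "cookie_id": ["c1", "c2"],
    "customer_id": [1001, 1002],
})

joined = web_log.merge(cookie_to_customer, on="cookie_id", how="left")
unmatched = joined["customer_id"].isna().mean()
print(joined)
print(f"{unmatched:.0%} of page views cannot be attributed to a known customer")
```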
Ping also points out that the more strategic questions which end-users want analysts to answer can rarely be tackled with data that has been automatically joined. “Suppose you want to understand the contribution to brand awareness or ad performance from the presence of your logo on a partner’s ad, such as when Dell includes ‘Powered by Intel’. That is an analytical job which an automated solution won’t be able to do,” he says.
He also sees the trend away from established analytical software towards open-source programs like R as changing the profile of the analytics function. “You could tap into PhDs, but in the commercial world, they have got to get up to speed on the business and then apply their academic background to the particular objective,” says Ping.
At that level, it is unlikely that automation will be necessary, although these super-analysts may help downstream users to deploy new tools with fewer risks. Setting parameters - such as ensuring that the most variable item of data does not get included too early in a regression or CHAID model - could be built into a rules engine once the models have been created and the data conditioned.
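What such a guard rail might look like is sketched below; the check, the threshold and the field names are illustrative assumptions for this article, not a description of any product mentioned in it:

```python
# Illustrative only: a simple pre-flight rule of the kind a super-user might
# bake into a rules engine, flagging inputs whose variability exceeds a
# threshold before they reach a downstream model run.
import statistics

def coefficient_of_variation(values):
    mean = statistics.mean(values)
    return statistics.stdev(values) / mean if mean else float("inf")

def check_inputs(candidate_fields, max_cv=1.5):
    """Return the fields a downstream user may include, plus any warnings."""
    allowed, warnings = [], []
    for name, values in candidate_fields.items():
        cv = coefficient_of_variation(values)
        if cv > max_cv:
            warnings.append(
                f"'{name}' is highly variable (CV={cv:.2f}); hold back for analyst review"
            )
        else:
            allowed.append(name)
    return allowed, warnings

# Hypothetical candidate predictors for a downstream model run.
fields = {
    "monthly_spend": [120, 130, 115, 125, 118],
    "one_off_refunds": [0, 0, 950, 0, 3],
}
allowed, warnings = check_inputs(fields)
print(allowed)    # ['monthly_spend']
print(warnings)   # flags 'one_off_refunds' as too volatile to auto-include
```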
If automated analytics helps to expand the total number of users able to ask train-of-thought questions, it will only be a good thing. Equally, in-the-moment reporting can support more dynamic business decision making that should improve profitability. That could mean analysts themselves are able to become more strategic and have to dirty their hands less.
For Ping, the issue is not the tools involved, but the way they are used. As he says: “Analysts need a mix of SQL, common sense and strategic knowledge. More important still is to be like a dog with a bone and don’t stop chewing until they get the answer they want.”