New IT systems can save money and help drive revenues, but only if you pay attention to the data they are using. As new research suggests half of all companies have not woken up to the business impact of poor data quality, David Reed examines how to get the best out of technology by putting the best data into it.
Competitive advantage is always a concern for companies. So is keeping business as usual on track and operations running as efficiently as possible. In both cases, one of the major components is IT. Put the right systems in place and you not only keep the lights on, you can also get them to burn brighter.
Little wonder then that IT budgets remain one of the few growth areas in the business plan. Investment by small and medium-sized businesses (SMBs) rose by 9 per cent in the second half of 2011 compared to the first half, according to the latest annual “State of SMB IT” survey by Spiceworks. Based on data gathered from 1,200-plus IT professionals in companies employing under 1,000 staff, it found average annual IT budgets of $143,000, up from the $132,000 average in the first six months of last year.
“SMBs are making strategic technology investments with expanded budgets - pointing to a stronger market for IT products and services among small and mid-sized businesses in 2012,” said Jay Hallberg, co-founder and vice-president for marketing at Spiceworks.
Spending during a recession has to show a positive effect on the bottom line, either through controlling costs and maintaining revenues, or by allowing the business to expand in some way. (Some IT budget is clearly replacement or consolidation, but the principle remains the same.)
But what if buying systems did not result in the expected improvements? By focusing too much on the technology, are companies risking their investments by failing to take into account the impact that information can have on performance?
That was a conclusion reached in a study released last October. It found that clean, trusted data was one of the top three challenges faced by IT executives, along with compliance and timely access. A lack of standardised data contributed to the application performance problems reported by 85 per cent of the companies surveyed. The survey of 146 executives at large organisations (with a turnover above $100 million) found average data duplication rates of 20 to 30 per cent.
Ovum, which carried out the survey “Optimising Enterprise Applications: The Data Connection”, commented that, “all efforts to improve application management and delivery will be in vain if the underlying data and its management strategy are flawed - no matter how well architected the application platform or how effective the development team.”
Strikingly, the survey found that over 50 per cent of large enterprises do not have a data focus, with no information lifecycle management strategy, data quality or master data management programmes in place.
“Simply put, without sound data, and an even sounder data management strategy, your investment in business applications will diminish over time as your portfolio grows and expands across the enterprise,” said Tony Baer, principal analyst at Ovum. “There is no silver bullet as every enterprise and problem are different. But the common thread remains: effective rationalisation and management of data are at the heart of any challenge to optimise the application portfolio.”
That is not just a call to arms for data managers to ensure they are running appropriate data quality routines. It should be seen as a broader opportunity to get data management higher up the agenda by focusing on its impacts across the business.
It is clear from the investment plans of SMBs that IT has the attention and confidence of the board. If data managers can show their own area of responsibility is intimately linked to the performance of that technology, then senior management is likely to get interested.
Bert Oosterhof, director of technology EMEA at Informatica, believes this could be a breakout moment for data management. “Management may be looking to reduce the number of applications it is running, but there are also a lot of new types of application popping up. With the growth of lean management models, there is a focus on trying to minimise the number of enterprise systems,” he notes.
“So it is time now not to resolve point solution issues, but to move to a higher level and put in the right strategy around data. Data management needs to move up higher to the data governance level,” says Oosterhof. “That is a business issue and data management needs to take ownership of data together with IT.”
If the issues around data quality, standardisation and consistency are addressed under the data governance umbrella, they automatically become an enterprise-wide concern. That makes it more likely that a senior sponsor can be won for projects and also that connections with performance (and even strategic KPIs) get made.
Helping to reduce the data woes experienced in the business can only lead to greater efficiency and ROI on systems. Oosterhof notes that large portions of the IT budget go into “keeping the lights on”, but that this leads to high levels of redundancy in systems and over-specification of storage capacity.
“If you can reduce the cost of operations and running existing applications, that makes money available for implementing data governance and other new things the company might want to do, like social and mobile,” he says. An information lifecycle management strategy can be central to that, since it moves redundant data out of Tier One operating systems and storage, which are expensive, into lower level archives, which are much cheaper.
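The lifecycle approach Oosterhof describes boils down to an age-based tiering policy. As a minimal sketch (the tier names, thresholds and record layout here are hypothetical, not drawn from any particular product), it might look like this:

```python
from datetime import date, timedelta

# Hypothetical tiering policy: records untouched for longer move out of
# expensive Tier One storage into progressively cheaper tiers.
TIER_RULES = [
    (timedelta(days=90), "tier-1"),   # hot: accessed in the last 90 days
    (timedelta(days=365), "tier-2"),  # warm: accessed in the last year
]

def assign_tier(last_accessed: date, today: date) -> str:
    """Return the storage tier for a record based on its last access date."""
    age = today - last_accessed
    for max_age, tier in TIER_RULES:
        if age <= max_age:
            return tier
    return "archive"  # cold: anything older goes to low-cost archive storage

today = date(2012, 1, 31)
print(assign_tier(date(2012, 1, 10), today))  # recently used -> tier-1
print(assign_tier(date(2010, 6, 1), today))   # redundant data -> archive
```

In practice the thresholds would be set per application and per regulatory retention requirement, but the principle is the same: only data that earns its keep stays on Tier One.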
Another key play is to consider entirely new models that could substantially reduce the total cost of ownership while keeping business-critical data available. Cloud-based services are one of the three big drivers that Oosterhof sees in the market, alongside mobile devices and social media. “The cloud is very disruptive of the traditional data management approach,” he points out.
Data quality concerns generally tend to get picked up by two of the major budget holders in a business - IT during system consolidations or migrations, and marketing during campaign preparation. In both cases, there are clear cost implications from data which is not fit for purpose. What makes the marketing impact so visible is that expenditure on targeting incorrect records is an avoidable cost, whereas most IT implementations are going to happen regardless of the underlying data quality.
The difference tends to be how those functions talk about the problem. “We work with clients to improve the performance of their email programmes through improving the relevance of messages, their timeliness and content,” says Jill Brittlebank, director of strategy and analytics at e-Dialog. “So there is a reliance on data to create the business rules that drive campaigns.”
Relevance is an aggregate term used by marketers that bundles a number of critical data dimensions, such as demographics, preferences and permissions. At its simplest, a campaign for women-only car insurance should not be sent to men, while an email to somebody who prefers printed communications risks being ignored.
Brittlebank points out that simple data errors - or absence of standards - can have a big impact. “For one client, we were looking at their gender field to send images based on whether they were male or female,” she says. “We discovered they only had that data for a percentage of the file, which adds a layer of complexity because then you have to come up with a generic image.”
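A business rule of the kind Brittlebank describes is easy to sketch. The snippet below is illustrative only (the field name, codes and image paths are assumptions, not e-Dialog's actual implementation): it selects a gender-specific campaign image and falls back to a generic one wherever the gender field is missing or unstandardised.

```python
# Hypothetical campaign rule: pick an image by gender, with a generic
# fallback for incomplete or non-standard records.
IMAGES = {"F": "images/female_offer.jpg", "M": "images/male_offer.jpg"}

def select_image(record: dict) -> str:
    """Map a customer record to a campaign image, tolerating messy data."""
    # Normalise whatever is in the field: "Female", "f", " M " all resolve.
    gender = (record.get("gender") or "").strip().upper()[:1]
    return IMAGES.get(gender, "images/generic_offer.jpg")

print(select_image({"gender": "Female"}))  # -> images/female_offer.jpg
print(select_image({"gender": ""}))        # -> images/generic_offer.jpg
```

The fallback branch is exactly the "layer of complexity" she mentions: every record without a standardised gender value dilutes the campaign down to the generic creative.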
Not only does that kind of data issue add cost, it also limits the extent to which marketing can be targeted and future products and services planned. For an insurance provider, for example, working out the potential market size within its customer base for a female-only product is only possible if it can correctly identify each customer's gender.
One way in which Brittlebank’s company has helped to tackle such data errors is by examining records to see if data has been entered, but in the wrong field - a surprisingly common practice. Asking customers during relevant interactions is another key strategy, although one which may require the consent of customer services or web operations. “You can self-serve questions to customers where data is missing so they only see a request if their profile is incomplete,” she notes.
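Checking for data entered in the wrong field typically means testing whether a value matches the pattern of a different field. As a hedged sketch (the field names and patterns below are assumptions for illustration, not a description of any vendor's tooling):

```python
import re

# Illustrative patterns: a value in a name field that looks like a phone
# number or an email address has probably been misfielded.
PHONE_RE = re.compile(r"^[\d\s()+-]{7,}$")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def misfielded(record: dict) -> list:
    """Return (field, suspected_type) pairs for values that look misplaced."""
    issues = []
    for field in ("first_name", "last_name"):
        value = record.get(field, "")
        if PHONE_RE.match(value):
            issues.append((field, "phone"))
        elif EMAIL_RE.match(value):
            issues.append((field, "email"))
    return issues

print(misfielded({"first_name": "0207 946 0123", "last_name": "Smith"}))
# -> [('first_name', 'phone')]
```

Records flagged this way are also natural candidates for the self-service approach Brittlebank describes: only customers whose profiles fail the checks need ever see a request to complete them.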
“The key is being able to generate value,” adds Brittlebank. Few companies are willing to invest in data quality as a standalone activity. But link it to the performance of core operating systems or essential activities like marketing and it becomes more interesting. Demonstrating the downside cost to an investment that has already been made, such as a new IT solution, is also likely to make the board sit up and take notice of data quality.