There is a surprising lack of standardisation around big data, especially compared with other high-profile technology areas such as the internet of things or cyber security. At the international level, welcome big data standards are in development, providing an overview of the topic and delving into the big data reference architecture. In the meantime, and to complement this work, BSI commissioned external market research to better understand the needs for big data standardisation in the UK.
The report identified business challenges that organisations currently face when undertaking big data projects. One broad area was technology. At the heart of many organisations sit decades-old IT systems; often cumbersome, disjointed and inflexible, they present a tangible barrier to growth in big data usage. This is a particular challenge when it comes to creating a single, holistic data source. For many organisations, investment in big data will need to be accompanied by investment in IT systems and will probably require the use of cloud technology.
Cultural challenges also present issues. While IT talent is certainly necessary to execute complicated big data initiatives, such talent is wasted if there is no strategic imperative from senior management to ask the right questions. Another cultural challenge concerns businesses adopting a different attitude towards data. Rather than seeing data sources as pieces of property owned by individual functions within the business, organisations must instead treat data as a single, unifying company resource. This requires co-operation and collaboration between all organisational functions - something that isn’t always easy.
From these and other challenges, the research identified a number of potential areas where there is the greatest need for standardisation, including:
“How-to” guides for big data projects: while big data is not a new area, many organisations are approaching such projects for the first time or in new domains. Best practice could help organisations to formulate projects, determine who should be involved, properly define objectives and ensure quality checks are in place.
Meta-data: meta-data is generally seen to be growing in importance, and the research found that it is increasingly the main emphasis of big data analytics. However, many organisations struggle to capture and store meta-data in a usable and consistent format. Furthermore, there is a lack of guidance in areas such as how to ensure meta-data quality and how long meta-data should be retained.
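To make the capture-and-store problem concrete, a consistent meta-data format can be as simple as a fixed record schema applied to every data set. The sketch below is purely illustrative: the field names, values and retention figure are assumptions for the example, not drawn from any published standard or from the research itself.

```python
from dataclasses import dataclass, asdict
from datetime import date

# Illustrative meta-data record. All field names and the retention policy
# are assumptions made for this sketch, not part of any standard.
@dataclass
class DatasetMetadata:
    source: str            # originating system or business function
    owner: str             # function accountable for the data set
    created: str           # ISO 8601 date the data was captured
    schema_version: str    # version of the record layout, for consistency
    quality_checked: bool  # whether quality checks have been run
    retention_days: int    # how long the meta-data itself is retained

record = DatasetMetadata(
    source="crm_export",
    owner="marketing",
    created=date(2016, 3, 1).isoformat(),
    schema_version="1.0",
    quality_checked=True,
    retention_days=365,
)

# A fixed schema means every record can be serialised the same way,
# which is the consistency problem the research highlights.
print(asdict(record))
```

Agreeing such a schema across functions is exactly the kind of question - which fields, what quality checks, what retention period - that standardised guidance could answer.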
Big data communications: over the past few years, there have been numerous instances of big data initiatives failing to take off due to public resistance. Many big data experts and industry professionals believe that much of the problem lies in a failure to adequately explain to customers and the public the potential societal benefits of big data analytics. Many consumers have reacted with fear to initiatives that, presented differently, could have been viewed as fantastic opportunities. Standardisation could help to develop best practice for how big data initiatives should be explained and communicated, ensuring that a positive case for big data is presented to the public.
Terms and conditions: building the public’s trust in the use of their data is essential. However, T&Cs are often confusing, ambiguous and wordy. From a business perspective, an organisation with clear and easy-to-understand T&Cs is believed to be at a competitive advantage. Standards could help by providing best practice for ensuring that T&Cs are simple to understand and support informed consent before data is used in big data projects.
For a more detailed version of the research, click here.