Impact of real-time data analysis on information law

By 2025, both the role and the volume of data generated worldwide will have grown dramatically, with nearly a fifth of it marked as ‘critical’ to daily life, and nearly a tenth as ‘hypercritical’. At the same time, the proportion of data that is stored and analysed could rise from around 1% today to 37% by 2025, implying up to $5tn in annual benefits. However, current infrastructures will be unable to deal with this volume of information in a timely manner.

Indeed, volume alone suggests that sending everything to the cloud will be uneconomic. By 2025, as the impact of 5G is felt, nearly 30% of the data generated is forecast to be real-time, and over 50% of data could be managed autonomously.

What does it mean?

All time-sensitive data could therefore be processed in near real time at the edge of networks such as the IoT, with the bulk of non-critical data still sent to the cloud. If the future of computing is at the edge, IT strategies and infrastructure must reflect that computing will increasingly be embedded in IoT devices, in many cases outside traditional organisational or IT boundaries.
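As a minimal sketch of what such a split might look like in practice, the routine below routes incoming readings by latency sensitivity: anything with a tight latency budget is acted on locally at the edge, while the rest is queued for bulk upload to the cloud. All names here (route, handle_at_edge, cloud_batch) and the threshold are hypothetical illustrations, not any particular platform's API.

```python
from collections import deque

# Hypothetical threshold: readings that must be acted on within this
# window are processed at the edge rather than shipped off-site.
EDGE_LATENCY_BUDGET_S = 0.1

cloud_batch = deque()  # non-critical data, uploaded in bulk later


def handle_at_edge(reading: dict) -> None:
    """Act on time-sensitive data locally, in near real time."""
    # e.g. trip an actuator, raise a local alert, update a local model
    print(f"edge action for sensor {reading['sensor_id']}")


def route(reading: dict) -> None:
    """Send critical readings to edge logic; queue the rest for the cloud."""
    if reading.get("latency_budget_s", float("inf")) <= EDGE_LATENCY_BUDGET_S:
        handle_at_edge(reading)
    else:
        cloud_batch.append(reading)  # analysed later, off the critical path


route({"sensor_id": "brake-7", "latency_budget_s": 0.02, "value": 3.1})
route({"sensor_id": "hvac-2", "latency_budget_s": 60.0, "value": 21.5})
print(f"{len(cloud_batch)} reading(s) queued for cloud upload")
```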

While businesses’ IT and data strategies are at stake, the broader shift is likely to reshape consumer journeys and redraw both business and operating models. Real-time consumer engagement – such as an insurance company offering a contextual, time-limited micro-policy for your upcoming ski trip – is likely to boom. Many organisations will look to position themselves as advisors, build new services or even switch business models completely.
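To make the micro-policy example concrete, here is a hedged sketch of how a contextual, time-limited offer might be generated from a trip event. The event fields, flat daily rate and 24-hour offer window are invented for illustration; a real insurer would price from risk models and customer data.

```python
from datetime import datetime, timedelta


def quote_micro_policy(event: dict) -> dict:
    """Build a time-limited cover offer from a contextual trip event."""
    start = datetime.fromisoformat(event["start"])
    end = datetime.fromisoformat(event["end"])
    days = (end - start).days or 1
    return {
        "product": f"{event['activity']} cover",
        "valid_from": start.isoformat(),
        "valid_to": end.isoformat(),
        "premium": round(days * 4.50, 2),  # hypothetical flat daily rate
        # contextual offers expire quickly; 24 hours is an assumption
        "offer_expires": (datetime.now() + timedelta(hours=24)).isoformat(),
    }


offer = quote_micro_policy(
    {"activity": "ski trip", "start": "2025-02-10", "end": "2025-02-15"}
)
print(offer)
```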

Collaboration and ecosystem partnerships will gain prominence, as many organisations lack the direct-to-consumer reach needed to engage with the 100 billion or so IoT-connected devices expected by 2030. Fundamental shifts in risk management are likely: security and privacy controls will need to be built in at the edge, as an intrinsic part of every device and network. By 2025, the average data generated per person each year is expected to reach almost 300 gigabytes. About 90% of this data is forecast to be vulnerable to unauthorised capture or theft, and less than half of it is likely to be secured.
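As one illustration of what building controls in at the edge can mean, the sketch below signs each telemetry payload on the device with a pre-provisioned key before it leaves the network, so tampering in transit is detectable. The key handling and payload shape are simplified assumptions, not a production design, and this shows integrity only; confidentiality would add encryption on top.

```python
import hashlib
import hmac
import json
import os

# In practice each device would hold a unique key provisioned at
# manufacture; a random key stands in for that here.
DEVICE_KEY = os.urandom(32)


def seal_payload(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiver can detect tampering."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "tag": tag}


def verify_payload(sealed: dict) -> bool:
    """Recompute the tag on receipt and compare in constant time."""
    body = json.dumps(sealed["body"], sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])


msg = seal_payload({"device": "meter-42", "kwh": 1.7})
print(verify_payload(msg))  # True; flips to False if msg["body"] is altered
```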

The risk of retroactive judgement for misusing or mishandling data, allied to sheer volume, will likely create a need for regulated third-party data markets. Common forms of dataflow will be necessary if partners are to work collaboratively, and common third-party security standards will be required. Those at the leading edge could transform industries, but care must be taken that it does not become the bleeding edge.
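A common form of dataflow could be as simple as a shared record schema that every partner in such a market emits and validates. The fields below – provenance, consent and retention metadata travelling alongside the payload – are an assumption about what regulators might require, not an existing standard.

```python
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone


@dataclass
class SharedRecord:
    """Illustrative inter-partner record: a payload plus the metadata a
    regulated third-party data market would plausibly demand."""
    source_org: str          # who generated the data
    subject_consent_id: str  # the consent under which the data flows
    retention_days: int      # how long a recipient may keep it
    payload: dict
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


record = SharedRecord(
    source_org="insurer-a",
    subject_consent_id="consent-9f2c",
    retention_days=30,
    payload={"trip": "ski", "risk_band": "B"},
)
print(asdict(record))
```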
