Big Data: The #10YearChallenge


Written by Corinium on Feb 20, 2019 5:52:57 PM

The viral #10YearChallenge spread across social media like wildfire, with people showing the world what they looked like 10 years ago compared to today. We took a look back too – but at how the world of data & analytics has changed.

A 10-year challenge on almost any aspect of technology and IT would unearth just how quickly the industry is advancing; so much so that each decade can feel more like a 20-year period. One of the most intriguing questions, though, is not just about the technology, but about the data at hand, which everyone is now talking about as the most valuable asset for many enterprises.

So how did things look ten years ago? Flashing back to 2009, there were news stories of data breaches, calls for improvements on the data protection front and for the government to handle data better, and concerns from privacy campaigners about Whitehall initiatives such as a potential national ID register. In that sense, not much has changed at all!

However, what has changed is what businesses are now capable of doing with data, and the volume of data that exists. A look at Gartner’s top 10 strategic technologies for 2009 gives us a clue as to why: among the technologies the analyst firm saw as having potential for significant impact on the enterprise were cloud computing, social software and social networking, and business intelligence. This was merely the beginning of wider adoption of cloud computing by large enterprises, and the shift came at a time when businesses were being told they would no longer be restricted to working with structured data. Cloud is now a key platform of choice, and most organisations use at least some cloud services. It lets businesses store far more data and back up more of it for disaster recovery, while the compute afforded by cloud infrastructure helps them process that data, sometimes in real time, to extract insight from it.

Research a decade ago estimated that unstructured data made up between 80 and 90 per cent of all organisational data, and Gartner had suggested that the worldwide volume of unstructured data was doubling every month. The issue was that organisations were not yet able to extract value from this data – but many were starting to.

Fast forward 10 years, and artificial intelligence and machine learning have already replaced big data as the industry’s biggest buzzwords, with good reason: there’s no smoke without fire, and AI and machine learning products are genuinely making an impact in businesses, with both structured and unstructured data. Take this example from back in 2008, when anti-fraud specialists suggested that the most effective way of tackling online fraud – a relatively new phenomenon at the time – was to centralise detection systems. As Dimitris Vlitas, Principal AI Lead at Data Practitioners, says: “there was no evidence to suggest that by 2010 we would be able to talk about AI as we currently do”.

Nowadays, businesses are using AI and machine learning to detect fraud, with multiple systems connected through APIs – there is no longer a push for a centralised system. In fact, there are now calls to keep data in separate parts of the business, to prevent criminals from taking whole organisations hostage.
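To make that concrete, here is a minimal sketch in Python of the kind of machine-learning fraud detection described above, using scikit-learn. Everything in it is an assumption for illustration – the synthetic transaction features, the toy labelling rule and the thresholds are ours, not any vendor’s method – but it shows the basic pattern: train a classifier on historical transactions, then score new ones as they arrive from other systems via an API.

```python
# Illustrative sketch only: synthetic data and a toy labelling rule,
# not a production fraud model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical transaction features: amount, hour of day,
# distance from the customer's usual location (km)
n = 5000
X = np.column_stack([
    rng.exponential(50, n),
    rng.integers(0, 24, n),
    rng.exponential(10, n),
])

# Toy labelling rule (our assumption): large late-night or far-away
# transactions are labelled as fraud
y = (((X[:, 0] > 150) & (X[:, 1] >= 22)) | (X[:, 2] > 60)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Score a single new transaction arriving from another system via an API
new_txn = [[300.0, 23, 5.0]]  # amount, hour, distance
print("fraud probability:", model.predict_proba(new_txn)[0][1])
```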

Another great example of AI and data use is the Serious Fraud Office’s use of a tool that helped it trawl through 31 million documents in the infamous Rolls-Royce fraud case, checking which materials were covered by legal professional privilege – a task that previously had to be done by independent barristers. The system was able to process more than half a million documents a day, at speeds 2,000 times faster than a human lawyer.
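Those reported figures are worth a quick back-of-the-envelope check, treating “half a million documents a day” and “2,000 times faster” as exact purely for illustration:

```python
# Rough arithmetic on the SFO's reported processing rates
docs_total = 31_000_000                 # documents in the Rolls-Royce case
machine_per_day = 500_000               # "more than half a million documents a day"
human_per_day = machine_per_day / 2000  # reportedly 2,000x slower

print(docs_total / machine_per_day, "machine-days")               # ~62 days
print(docs_total / human_per_day / 365, "person-years of review")  # ~340 years
```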

F1 racing is another industry that shows how far technology has come in embracing data to improve products and services. Graeme Hackland, CIO at Williams F1, recently tweeted about the data explosion in the sport. In 2004, he said, the organisation needed more than one DVD (about 4.5GB) to hold the data generated at the track, such as telemetry, photos, weather, track and strategy data. By 2018, it was transferring 8TB of data from the track to its engineers, including car telemetry, photos, video, audio, weather, tyre data and competitor analysis, amongst other things. F1 cars are now fitted with so many sensors feeding these different data streams that teams can react in real time to help their drivers, while broadcasters can give viewers predictions on how likely one driver is to overtake another.
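Hackland’s own numbers make the scale of that growth easy to quantify. Assuming the 2004 figure was roughly one 4.5GB DVD per race weekend, a quick calculation:

```python
# Growth in track-side data volume at Williams F1, per Hackland's figures
dvd_2004_gb = 4.5            # ~1 DVD of track data in 2004
race_2018_gb = 8 * 1024      # 8TB transferred from track to engineers in 2018

print(f"{race_2018_gb / dvd_2004_gb:.0f}x more data")  # roughly 1,800x in 14 years
```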

If all of this has happened over the course of about 10 years, imagine what could happen in the next decade!


Join TCI at the Evolution of Data and Analytics Conference, where the technologies, strategies and methods for placing your organisation ahead in the data game will be dissected and investigated.

To register as a delegate, email Wendy Dhlamini at Wendy@tci-sa.co.za.

Sponsorship & exhibition enquiries are handled by Jason Joseph (jason@tci-sa.co.za), or call 011 803 1553.


For more than 10 years, TCI has been a registered and preferred conference organiser for all major banks and financial institutions.
Vendor details are available on request.
