The coronavirus Covid-19 pandemic is spreading rapidly around the world, and scientists are racing to find a vaccine. Big Data analytics, data visualisation and artificial intelligence are also being recruited to track the spread of the virus, with the aim of helping governments and public authorities make better decisions about how to prevent and reduce infection. The virulent nature of Covid-19 means that time is of the essence.
For example, ZDNet reports that the Center for Systems Science and Engineering at Johns Hopkins University “is running an online dashboard that tracks the spread of the deadly coronavirus, as it makes its way across the globe… The data is visualised through a real-time geographic information system (GIS) powered by Esri.”
For the livestreamed dashboard, data is collated from the World Health Organisation (WHO) and from centres for disease control in the US, China and Europe to show all confirmed and suspected cases of coronavirus Covid-19, along with the numbers of recovered patients and deaths.
Improving contact tracing
Big data can also be used to make contact tracing more efficient. In Indonesia, for example, there is no time to wait for a vaccine that may be at least 18 months away. So, in the meantime, the country’s government and health authorities need to be able to “record the applied medication, treatments and the patients’ responses to find out statistically which treatment is the most effective”, writes Alexander Senaputra, technical adviser for PT Geoservices, in the Jakarta Post on 17th March 2020.
He adds: “This approach is similar to efforts being made to find a cure for cancer in countries with advanced medical systems. This is where all patients’ data — especially those who recover — are taken and processed by algorithm to find something in common that gives doctors a lead about the best medication.”
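The idea of processing patient records “by algorithm to find something in common” can be sketched in miniature. The records, treatment names and figures below are entirely hypothetical, purely to illustrate comparing recovery rates across treatments:

```python
from collections import defaultdict

# Hypothetical patient records as (treatment, recovered) pairs.
# These names and outcomes are illustrative, not real clinical data.
records = [
    ("drug_a", True), ("drug_a", True), ("drug_a", False),
    ("drug_b", True), ("drug_b", False), ("drug_b", False),
]

def recovery_rates(records):
    """Return the fraction of patients who recovered, per treatment."""
    totals = defaultdict(int)
    recovered = defaultdict(int)
    for treatment, did_recover in records:
        totals[treatment] += 1
        if did_recover:
            recovered[treatment] += 1
    return {t: recovered[t] / totals[t] for t in totals}

rates = recovery_rates(records)
# The treatment with the highest recovery rate gives doctors a lead.
best = max(rates, key=rates.get)
```

Real analyses would of course control for patient age, comorbidities and treatment timing, but the principle is the same: aggregate outcomes per treatment and compare.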
Data analysis can be deployed to track the average time it takes patients to recover while receiving treatment. Health authorities can also use the data to predict how many more beds will be required when a spike in Covid-19 infections leads to more patients needing hospital treatment.
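A minimal sketch of this kind of calculation, using made-up figures: the average length of stay, combined with a daily admission rate, gives a steady-state bed-occupancy estimate (an application of Little’s law).

```python
from statistics import mean

# Hypothetical days-to-recovery for a sample of discharged patients.
recovery_days = [10, 14, 12, 9, 15]

# Average hospital stay in days.
avg_stay = mean(recovery_days)

# Little's law: if N patients are admitted per day and each stays
# avg_stay days on average, steady-state occupancy is N * avg_stay beds.
new_admissions_per_day = 30
beds_needed = new_admissions_per_day * avg_stay
```

During a spike, feeding a rising admissions figure into the same formula shows how quickly bed demand outpaces capacity.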
China: Big data and AI
In China, where the global pandemic originated, artificial intelligence and Big Data have been deployed across its cities. Shawn Yuan, writing for Al Jazeera on 1st March 2020, says thermal scanners were introduced to spot people showing symptoms such as a high temperature. These temperature checks can inform passengers, transport operators and health authorities so that preventative action can be taken to reduce the spread of the virus.
The belief is that the development of technology can enable the authorities to fight the disease in a way that was not possible during the SARS outbreak in 2003. However, much depends on the quality of the data that’s collated and how it’s defined. Data also needs to be collated from a wide variety of sources, shared and backed up.
The obstacle: latency
However, latency and packet loss can make the synchronisation of databases inefficient, and slow networks can reduce the accuracy of ‘real-time’ data. Accurate real-time data would help everyone get back to normal as soon as possible and speed up the march towards finding a vaccine against the coronavirus. A lack of real-time data modelling, by contrast, could lead to the wrong decisions being made – including over when to reduce lockdown measures in order to kickstart the economy in each country across the world. Some countries have begun, at the time of writing (20th April 2020), to take these tentative steps. The results are being closely watched.
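The effect of latency and packet loss on long-distance transfers can be illustrated with the well-known Mathis approximation for steady-state TCP throughput, which is roughly MSS/RTT × 1.22/√p (p being the packet-loss rate). The figures below are illustrative, but they show why a transcontinental link can be so much slower than a local one at the same loss rate:

```python
import math

def mathis_throughput_bps(mss_bytes, rtt_s, loss_rate):
    """Approximate steady-state TCP throughput (Mathis et al.):
    throughput ~= (MSS / RTT) * (1.22 / sqrt(p)), in bits per second."""
    return (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss_rate))

# 1460-byte segments, 0.1% packet loss: compare a 10 ms local RTT
# with a 150 ms intercontinental RTT.
local = mathis_throughput_bps(1460, 0.010, 0.001)
wan = mathis_throughput_bps(1460, 0.150, 0.001)
# The long path achieves only 1/15th of the local throughput,
# no matter how much raw bandwidth the link provides.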
David Trossell, CEO and CTO of Bridgeworks, reveals the types of data that are crucial to decision-making: “Key data to help form this decision has to include at least: number of infections, number of deaths, number of survivors, number of tests, outcome of tests, drug trials, locational data - on a global basis!” In other words, governments, scientists and health authorities across the world should ideally share this data to beat back the virus.
This data is invaluable, so it’s important to protect it against ransomware and to bring it together in a timely manner to ensure accurate analysis. Trossell adds: “Intelligence is at the heart of decision-making and that is driven by data. Big Data. Not so long ago, we were all talking about Big Data in terms of the four key pillars – Volume, Velocity, Variety and Veracity. Each is key, but if we want to look at this on a global basis, velocity is going to play a key part.”
The preservation of data is essential, and this isn’t just about real-time data. To enable the right decisions to be made, there is also a need to be able to analyse historic data. All the data the scientists, government, health authorities and other interested parties generate now could be useful in the future – helping to prevent a future pandemic. “Unfortunately, we’ve already seen cyber-attacks on medical organisations just when they are distracted elsewhere”, says Trossell before advising: “Offsite air-gapped back-ups are critical; the more the merrier in my mind, and with the right technology this is now highly possible.”
“The problem with off-site air-gapped data storage, and also where data has to be transferred across any distance between organisations, is one that many see as impossible to implement efficiently, due to latency and packet loss.” However, one way to achieve this is by deploying a WAN data acceleration solution such as PORTrockIT.
There are also concerns about data accuracy. “As we‘ve seen in the UK, daily figures can be skewed because of the lack of velocity in the data: sometimes these cover a period of several days, and sometimes the time between events and central reporting can be over five days, which makes the decision of how and when to lift the lockdown problematic”, he comments.
Trossell adds: “So, if we’re going to combine big data analysis with AI, we’ve got to meet the 4 pillars of Big Data, especially the velocity pillar; and to crack the latency problem we need a different approach to transporting data not only efficiently but securely.”
The issue of data accuracy is exacerbated by different governments and authorities using different data models, making it a bit like comparing apples and pears. Trossell explains: “This has always been the problem with any data, and digital data in particular. A common reporting format would be extremely useful for the electronic gathering of data – perhaps it is something the WHO should look into for the next emergency – as we all know there will be others.”
Trossell concludes that the pandemic is likely to change the way people work, with more people continuing to use technology to work from home. Yet, he notes, we are very social beings, and the lack of contact with others is causing many mental health concerns.
UK Cabinet Office
Meanwhile, joining the fight against Covid-19, Bridgeworks has written to the UK’s Cabinet Office to make its PORTrockIT products available free of charge for a year to “any Health Organisation or Medical Research Establishment engaged in this Covid-19 work.”
In the letter, the company says: “PORTrockIT massively accelerates the transfer of vast quantities of data over a very long distance, in a manner that is unique and which overcomes the problems of latency, packet loss and congestion on the line, in a way that no other organisation in the world has come anywhere near matching.”
The solution uses machine learning, artificial intelligence and parallelisation to mitigate wide area network (WAN) latency and packet loss. While this can’t change a scenario where poor-quality data leads to poor decisions, it can make real-time big data analysis more accurate and enable large volumes of data to be shared, backed up and transferred across the globe – making it quicker and easier for researchers to collaborate against Covid-19.
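PORTrockIT’s internals are proprietary, but the general parallelisation idea can be sketched: when latency caps what a single stream can achieve, running many independent streams side by side recovers much of the link’s capacity. The figures below are purely illustrative assumptions:

```python
def parallel_throughput_bps(stream_bps, n_streams, link_bps):
    """Aggregate throughput of n independent streams, each capped at
    stream_bps by latency, on a link with link_bps total capacity."""
    return min(stream_bps * n_streams, link_bps)

# Hypothetical numbers: a 3 Mbit/s per-stream ceiling (set by RTT and
# loss, not bandwidth) on a 1 Gbit/s link.
single = parallel_throughput_bps(3e6, 1, 1e9)       # one stream
sixty_four = parallel_throughput_bps(3e6, 64, 1e9)  # 64 parallel streams
```

The single stream leaves almost all of the link idle; 64 streams in parallel use 64 times as much of it, until the link’s own capacity becomes the limit.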
Graham Jarvis, freelance business and technology journalist