by Mary Shacklett in Big Data
We have learned a lot about big data in action during the COVID-19 crisis. Going forward, these lessons will make it easier for enterprises and vendors to deliver better big data projects and products.
Here are five big data lessons learned from the COVID-19 crisis:
1. Visualization is paramount
By now, most of us are very familiar with seeing COVID-19 spread maps on TV and the internet. The maps identify hot spots throughout the world and report case counts state by state in the US.
These geographic pictorial maps are built by layering statistical data onto mapping engines, combining structured statistics with unstructured, map-based visuals.
The maps work because we can easily relate to the geography they represent, as well as to the virus statistics superimposed on it. In this case, presenters are using a near-ideal visualization of the information: it ensures that the messages they are communicating about COVID-19 spread and hot spots can be easily understood by audiences.
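The join the article describes, statistics layered onto geography, can be sketched in a few lines. This is a minimal illustration, not any particular mapping engine's API; the region names, coordinates, counts, and color thresholds are all invented for the example.

```python
# Sketch: merging structured case statistics with geographic coordinates
# into the render-ready records a mapping engine would color on a map.
# All regions, coordinates, counts, and thresholds are illustrative.

cases = {"Region A": 12500, "Region B": 340, "Region C": 4800}
coords = {"Region A": (40.7, -74.0), "Region B": (34.1, -118.2), "Region C": (41.9, -87.6)}

def severity(count):
    """Bucket raw counts into the color bands a map legend might use."""
    if count >= 10000:
        return "red"
    if count >= 1000:
        return "orange"
    return "yellow"

def map_records(cases, coords):
    """Join the two sources into one record per region."""
    return [
        {"region": r, "lat": coords[r][0], "lon": coords[r][1],
         "cases": n, "color": severity(n)}
        for r, n in cases.items()
    ]

for rec in map_records(cases, coords):
    print(rec)
```

The structured side (counts) and the visual side (coordinates, color) only become meaningful together, which is the point the visualization lesson makes.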
2. Big data is an enabler
Sometimes the value of big data is not so much in the data it presents as in the capabilities it offers.
During the COVID-19 crisis, we have seen big data play an important role as an enabler because of its ability to process video, audio, and other types of non-standard information in ways that structured data processing can't.
A prime example is the widespread use of telemedicine that has facilitated virtual doctor appointments between home-bound patients and their healthcare providers. Businesses with home-bound employees are also using video conferencing as a means of staying in touch and conducting virtual meetings.
This flow of big data to participants around the world has created opportunities for real-time collaboration and information exchanges that COVID-19 "lockdowns" would otherwise have prevented.
3. Integration and aggregation play key roles
The best big data apps are those that unlock data value from both structured and unstructured data. In the fight against COVID-19, Internet of Things (IoT)-based devices, such as thermometers and contact tracers, can be combined with statistical data to track virus outbreaks so the outbreaks can be mitigated. To produce these comprehensive COVID-19 tracing and detection engines, data scientists must choose the best sources of data to integrate and aggregate so they can create a composite picture of what's going on. Big data toolsets enable them to do this.
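The aggregation step described above can be sketched simply: combine an IoT feed (thermometer readings) with structured statistics (reported cases) and flag zones where both signals agree. The zones, readings, and thresholds below are invented for illustration; real tracing systems use far richer sources.

```python
# Sketch: aggregating IoT thermometer readings with reported case counts
# to flag possible outbreak zones. Data and thresholds are illustrative.

# Simulated per-zone smart-thermometer readings (the IoT feed)
thermometer_readings = {
    "zone-1": [98.6, 101.2, 100.9, 99.1],
    "zone-2": [98.4, 98.7, 98.5],
}
# Reported case counts (the structured statistical source)
reported_cases = {"zone-1": 57, "zone-2": 3}

FEVER_F = 100.4  # common clinical fever threshold, in Fahrenheit

def outbreak_signals(readings, cases, fever_share=0.4, case_floor=25):
    """Flag a zone when a large share of readings show fever AND
    reported cases exceed a floor -- a composite of both sources."""
    flagged = []
    for zone, temps in readings.items():
        share = sum(t >= FEVER_F for t in temps) / len(temps)
        if share >= fever_share and cases.get(zone, 0) >= case_floor:
            flagged.append(zone)
    return flagged

print(outbreak_signals(thermometer_readings, reported_cases))  # ['zone-1']
```

Requiring both signals before flagging a zone is the "composite picture" the section describes: neither source alone is as reliable as the two integrated.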
4. Big data projects are mission-critical
There are at least 78 COVID-19 vaccine projects underway worldwide. None of these vaccine formulations and trials could occur without the assistance of big data processing. This squarely positions big data processing as a mission-critical application, indispensable for developing a vaccine that can end the pandemic.
Being able to process data at big data speed for drug and vaccine formulation, and then using big data in-memory processing to connect and communicate with IoT and automated systems on manufacturing floors, will also determine how quickly mass quantities of vaccine can be distributed to the population. This IoT processing helps companies analyze product manufacturing and quality, and it speeds product to market.
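One way to picture that in-memory quality check is a rolling window over line-sensor readings that alerts when output drifts out of spec. This is a toy sketch, not any vendor's system; the vial-fill target, tolerance, window size, and readings are all invented for illustration.

```python
from collections import deque

# Sketch: an in-memory sliding window over manufacturing-line sensor
# readings that flags quality drift as it happens. All numbers are
# illustrative.

class FillVolumeMonitor:
    """Alerts when the rolling mean fill volume drifts outside spec."""
    def __init__(self, target_ml=0.5, tolerance_ml=0.02, window=5):
        self.target = target_ml
        self.tol = tolerance_ml
        self.readings = deque(maxlen=window)  # keeps only the last N readings

    def observe(self, volume_ml):
        """Ingest one reading; return True if the line is out of spec."""
        self.readings.append(volume_ml)
        avg = sum(self.readings) / len(self.readings)
        return abs(avg - self.target) > self.tol

monitor = FillVolumeMonitor()
stream = [0.50, 0.51, 0.49, 0.55, 0.56, 0.57, 0.58]
print([monitor.observe(v) for v in stream])
```

Because the window lives entirely in memory, each reading is evaluated as it arrives rather than after a batch job, which is what lets this kind of check keep pace with a production line.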
5. Collaboration is vital
COVID-19 is a worldwide pandemic. Logic dictates that the best approach to a global health crisis is global cooperation and information exchange so we can find a cure. This kind of information collaboration can easily be supported by big data projects, as it has been in the open source community.
The key is countries being willing to share what they know so we can collectively find a cure. In the current political environment, that is not happening, but the big data collaboration tools are there. The lesson going forward is that we can achieve results faster when we use those tools and collaborate.
Image: Ca-ssis, Getty Images/iStockphoto