The global news timeline has been flooded with COVID-19 since the early days of 2020, and for obvious reasons. The pandemic has claimed the lives of over 1,196,000 people globally to date, and the spread of the coronavirus has disrupted lives, communities, and businesses worldwide. This health crisis, as many experts point out, is here to stay and will have a long-lasting and profound impact on the world as we know it.
While the scientific world still grapples with the full extent of the virus and works diligently towards a vaccine, it also pays to understand the technical capability required to deliver the actionable, valuable data insights that give country leaders and health experts an edge in the fight against this virus and other disasters.
Disaster Tech, a public benefit corporation that builds technology to analyse, visualise, and communicate risk, believes there are unique and comprehensive approaches to using data to augment the methods states and counties use to fight crises such as the global pandemic and, more importantly, to translate those methods to other known natural disasters such as wildfires and hurricanes.
The Right Answers Require the Right Data
An important element that every organisation, whether a state or local government agency, a federal agency, or a private corporation, struggles with when reacting and responding to an emergency like the unprecedented COVID-19 pandemic is access to data that can support the types of analysis it is trying to do.
One of the things Disaster Tech focused on early in the process was the ability to centralise and collect disparate data sources from multiple federal agencies, online data collections, and proprietary data hubs, as well as forming partnerships with universities and medical and research institutions, to deliver a single source of truth and authoritative data. In doing so, the team can perform valuable analysis in the event of a health crisis, a hurricane, or a wildfire, and deliver valuable answers to questions such as “What are the best ways to relocate at-risk people during a pandemic?” or “What is the best way to get supplies to and from impacted communities?” These critical questions can only be answered if the necessary data sets exist in the first place.
Accordingly, it’s essential to feed these data sets into an analytical platform. In disaster and emergency response scenarios, it usually comes down to situational awareness: how emergency responders, private entities, and state or federal agencies can accurately assess and identify what is at risk or has been damaged or harmed, whether infrastructure, assets, or people. For example, with a hurricane approaching, are particular transformers, substations, or power poles at risk, and, given that, which communities of people will potentially be impacted? Analysing the data to understand what is likely to be impacted first makes the response to serious crises more effective.
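To make this concrete, the snippet below sketches the kind of geospatial at-risk check such a platform runs continuously, using the open-source shapely library. The storm track, buffer distance, and substation locations here are all hypothetical stand-ins for what a real platform would pull from forecast and asset data.

```python
# A minimal sketch, with hypothetical coordinates: flag grid assets
# that fall inside a storm's forecast risk corridor.
from shapely.geometry import LineString, Point

# Hypothetical forecast track of the hurricane as (lon, lat) waypoints
storm_track = LineString([(-80.1, 25.8), (-80.9, 27.1), (-81.6, 28.6)])

# Buffer the track (here ~0.5 degrees) to approximate the at-risk zone
risk_zone = storm_track.buffer(0.5)

# Hypothetical substation locations: name -> (lon, lat)
substations = {
    "Substation A": Point(-80.8, 27.0),
    "Substation B": Point(-83.0, 29.5),
}

# Keep only the assets whose location falls inside the risk zone
at_risk = [name for name, loc in substations.items() if risk_zone.contains(loc)]
print("At-risk assets:", at_risk)  # -> ['At-risk assets:', ['Substation A']]
```

In production the same point-in-polygon test would run over millions of assets against a live forecast feed, which is where the scale problem described next comes in.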
In many of these scenarios, the main challenge is identifying the right analytical platform: one that can analyse data at volume and scale in the shortest possible time. Traditional platforms typically struggle with the volume, speed, and real-time nature of the data, taking too long to glean intelligence on what could be or has been impacted; by the time a result is available, the situation has likely already changed, especially in a disaster scenario.
Even after disaster strikes, an analytical platform can provide damage assessments that allow agencies to act fast. In the case of a hurricane, for example, insights into how many people have potentially lost homes and are displaced and in need of shelter can be crucial for emergency responders. Supply chain shipments also play a critical role in disaster management, as federal relief and other resources, including emergency vehicles, personnel, and medical supplies, are often dispatched to impacted locations in the aftermath. A robust analytical platform reduces uncertainty for first responders by identifying, for example, untraversable roads, so that critical supply shipments arrive where they need to be without additional hurdles and delays.
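As an illustration of the routing side of this problem, here is a minimal sketch using the open-source networkx library. The road network, travel times, and the flooded segment are hypothetical; a real platform would derive them from live damage assessments.

```python
# A minimal sketch: reroute a supply shipment around a road segment
# that damage assessments have flagged as impassable.
import networkx as nx

# Hypothetical road network; edge weights are travel times in minutes
roads = nx.Graph()
roads.add_weighted_edges_from([
    ("depot", "junction_1", 20),
    ("junction_1", "shelter", 15),
    ("depot", "junction_2", 30),
    ("junction_2", "shelter", 25),
])

# A damage report marks the junction_1 approach to the shelter as flooded
roads.remove_edge("junction_1", "shelter")

# Recompute the best remaining route for the shipment
route = nx.shortest_path(roads, "depot", "shelter", weight="weight")
print(route)  # -> ['depot', 'junction_2', 'shelter']
```

In practice the graph would hold millions of real road segments and be re-evaluated continuously as new damage reports stream in.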
Under the Hood
Whether in a pandemic or not, emergency scenarios generally involve analysing massive datasets to identify the key data points that drive decisions in real time, in rapidly changing situations. And government agencies are especially data-rich. Not only do most have decades of historical data, but with 5G on the horizon, the data available to those agencies will increase drastically: think billions of rows of streaming, historical, and geospatial data, among others.
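To see why streaming matters here, consider the difference between re-scanning decades of history on every update and maintaining a rolling aggregate incrementally. The sketch below keeps a five-minute count of outage reports per county as events arrive; the event shape and county names are hypothetical.

```python
# A minimal sketch of incremental stream processing: maintain a rolling
# five-minute count of outage reports per county, updated per event,
# instead of re-querying the full history on every update.
from collections import Counter, deque
import time

WINDOW_SECONDS = 300
events = deque()    # (timestamp, county) pairs currently inside the window
counts = Counter()  # live per-county totals for the window

def ingest(county, ts=None):
    ts = ts if ts is not None else time.time()
    events.append((ts, county))
    counts[county] += 1
    # Expire events that have fallen out of the five-minute window
    while events and events[0][0] < ts - WINDOW_SECONDS:
        _, old_county = events.popleft()
        counts[old_county] -= 1

ingest("Lee")
ingest("Collier")
ingest("Lee")
print(counts.most_common(1))  # -> [('Lee', 2)]
```

A platform built for this workload applies the same idea across billions of rows, with the windowed state distributed and queryable alongside the historical data.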
One of the biggest challenges is being able to analyse this tremendous trove of data, which requires a technology platform built to equip emergency responders with the comprehensive intelligence needed to act instantly during a crisis. Static analytics, downsampled datasets, and even dashboards are no longer enough. Legacy architectures, passive analytics (like reports and traditional BI), and limited budgets have not been conducive to innovation across multiple industries, including disaster management.
However, Disaster Tech took a novel approach by enlisting the expertise of the Kinetica team. Built on years of advanced data analytics and machine learning experience, the Kinetica Streaming Data Warehouse can take in billions of streaming and historical data points, orchestrate and analyse them, and visualise them in three-dimensional space, so that Disaster Tech and other public sector organisations can use active analytics to respond to today’s needs and plan better for future disasters.
The resulting platform is built on Microsoft Azure and leverages NVIDIA GPUs to meet the needs of any disaster management scenario, from delivering high-resolution geospatial analytics to consuming real-time streaming data, to offering high-speed analytics that can be driven by machine learning and artificial intelligence.
Location Intelligence: Key to Emergency Response
Location-based intelligence is crucial in emergency response situations. Emergency responders need to be able to understand the geographic context of any given crisis. For example, a visual of a downed power line is not helpful unless first responders know where the danger exists and utility crews know where to send field repair technicians.
In disaster management scenarios, being able to combine geospatial, graph, streaming, and historical data analytics at scale, and to integrate geospatial analytics into map-based applications for enhanced situational context, can vastly improve decision-making.
The combination of real-time streaming data query, native geospatial operations, and advanced map-based visualisations opens opportunities for disaster management teams to perform analyses that were previously difficult or impossible. For example, they can better analyse how emergency response support to affected areas could be impacted by road conditions or weather patterns, or how utilities can react to a fire to keep the public connected and safe.
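A simplified example of one such analysis: intersect road geometry with a weather footprint to find segments that need re-checking before dispatch. Again using shapely, with hypothetical geometries; the flagged segments could then be removed from a routing graph like the one sketched earlier.

```python
# A minimal sketch: flag hypothetical road segments whose geometry
# crosses a storm cell polygon from a weather feed.
from shapely.geometry import LineString, Polygon

# Hypothetical storm cell footprint (lon/lat bounding box as a polygon)
storm_cell = Polygon([(-82.0, 28.0), (-81.0, 28.0), (-81.0, 29.0), (-82.0, 29.0)])

# Hypothetical road segments: name -> geometry
road_segments = {
    "I-75 north": LineString([(-81.5, 27.5), (-81.5, 29.5)]),
    "SR-60 east": LineString([(-80.5, 27.9), (-79.8, 27.9)]),
}

# Any segment that intersects the storm cell needs re-checking
blocked = [name for name, seg in road_segments.items() if seg.intersects(storm_cell)]
print("Segments to re-check before routing:", blocked)  # -> ['I-75 north']
```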
Protecting a community from the worst impacts of a disaster can truly be achieved only if we resolve the technology gaps in emergency management and public health. If COVID-19 has taught us anything, it’s that there’s a growing need to be nimble and to pivot, using the right technology platform to understand and address key vulnerabilities, and ultimately to help reduce risks and build more resilient communities. This will require a transformation not just in technology, but also in education and policy, in order to develop a culture where we can get ahead of the next global pandemic or natural disaster.
Author: Mathew Hawkins, Sr Director, Solutions Engineering at Kinetica