Whole new markets and applications are opening up thanks to advances in meteorology, but standards will be needed if the sector is to avoid siloing and errors, says Simon Chester
A change in the weather is afoot. No, I’m not talking about some rain later today: meteorology as a whole is changing as new and improved observation and analysis technologies enable new insights and create new markets. It comes at a time when private enterprise is playing a major role in what was once a government-only domain.
As processing capacity and observation technology improve, meteorology is shifting from ‘forecasts’ to ‘predictive Earth models’, and increased observation density has enabled detailed, ‘hyper-local’ weather information, opening up whole new markets and applications.
However, to maximise the benefits of this shift, data interoperability will play a critical role. Without broadly accepted, open standards to make weather data and information findable, accessible, interoperable and reusable (FAIR), the proliferation of new players and technologies will lead to data silos and inefficiencies, and possibly to inconsistencies and errors in the predictive models that integrate these heterogeneous data sources.
With this in mind, on 25 August last year, OGC and a community of location experts, meteorologists and other weather professionals from the public and private sectors held a multi-session virtual ‘Weather Revolution Workshop’ to discuss the present and future state of the industry and its relationship to geospatial, and to help identify – and start solving – the challenges surrounding the sharing of weather information.
From the public sector were panellists from Environment and Climate Change Canada/Meteorological Service of Canada, the European Centre for Medium-Range Weather Forecasts, the UK Met Office, the US NOAA National Weather Service, the US Federal Aviation Administration and the US Air Force. From the private sector were panellists representing ClimaCell, CustomWeather, LiveRoad Analytics, RiskPulse, Spire, TruWeather, The Weather Network and Weather Source.
The workshop underscored the potential transformative impact of these changes on society and business, and recognised the value of making meteorological data available to decision-makers outside the discipline. The mix of participants from across the globe, and from private and public organisations, provided different perspectives, with several overarching themes bubbling to the fore.
Current conditions
Panellists representing met offices in North America and Europe spoke of the major transition under way as they move to the next generation of meteorological and climate services. In line with this, markets are growing increasingly “weather hungry”, demanding detailed information so that they can plan their daily operations in a manner that reduces risk and improves efficiency.
The weather sector comprises more than just governmental met offices, however: there is now a rapidly expanding body of observations from both public and private sector sources. New observation approaches offered by the private sector range from global observation coverage, such as Spire’s satellite-based radio occultation, to IoT/sensor networks, including LiveRoad Analytics’ use of ‘connected’ vehicles (for example, a localised activation of windshield wipers is a good indicator of rain) or sensors installed in telecommunications infrastructure. Together, these approaches provide useful indicators that can drive hyper-local weather observations and forecasts for the marketplace.
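To make that idea concrete, here is a minimal, hypothetical sketch – not drawn from any panellist’s actual system – of how windshield-wiper reports from connected vehicles might be binned into a coarse grid to yield a hyper-local rain indicator. The report format, grid resolution and threshold are all illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical connected-vehicle reports: (latitude, longitude, wipers_on).
# In a real system these would stream in from an IoT/telematics feed.
reports = [
    (51.5074, -0.1278, True),
    (51.5080, -0.1270, True),
    (51.5100, -0.1300, False),
    (51.6000, -0.2000, False),
]

CELL_SIZE = 0.01  # grid cell size in degrees (~1 km); an assumed resolution

def cell_for(lat: float, lon: float) -> tuple:
    """Map a point to a coarse grid cell."""
    return (round(lat / CELL_SIZE), round(lon / CELL_SIZE))

# Count wiper activations per cell as a crude precipitation proxy.
totals = defaultdict(int)
active = defaultdict(int)
for lat, lon, wipers_on in reports:
    key = cell_for(lat, lon)
    totals[key] += 1
    active[key] += wipers_on

# A cell where most vehicles have their wipers running likely indicates rain.
for key, n in totals.items():
    fraction = active[key] / n
    if fraction > 0.5:
        print(f"cell {key}: probable precipitation ({fraction:.0%} of {n} vehicles)")
```

In practice such signals would be fused with radar, gauges and other observations rather than used alone, but even this toy aggregation shows how everyday device telemetry can become a weather observation.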
Such a proliferation of new observation techniques has meant that the sector has had to move from primarily linear to heterogeneous computing environments, namely cloud processing and high-performance computing. A challenge noted by several panellists is the ongoing optimisation of this heterogeneous processing to ensure that it can continue to scale.
Linked to this is the increased application of Big Data analytics, machine learning (ML) and artificial intelligence (AI) as met offices shift their emphasis from forecasts to predictive Earth models and explore ways to deal with the rapid increase in data availability, volume, and sources.
Panellists underscored the need to advance standards that are aligned with, and responsive to, the reality of a modern paradigm of IoT, ML/AI and non-linear, heterogeneous processing. Open standards, such as those created by OGC, were noted as essential to ensure interoperability as the complexity of data and processing increases in tandem with the diversity of technologies, platforms and data sources. Standards simplify the integration, processing and sharing of data.
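As one illustration of what such interoperability looks like in practice, the OGC API – Environmental Data Retrieval (EDR) standard defines a common query pattern that any conformant weather service can expose. The sketch below uses a hypothetical server URL and collection name; the point is that the same request shape works regardless of which provider runs the server.

```python
import requests  # third-party HTTP library

# Hypothetical EDR endpoint and collection name; any conformant server
# exposes the same /collections/{id}/position query pattern.
BASE = "https://weather.example.com/edr"
COLLECTION = "surface-obs"

resp = requests.get(
    f"{BASE}/collections/{COLLECTION}/position",
    params={
        "coords": "POINT(-0.1278 51.5074)",   # lon lat, as WKT
        "parameter-name": "air_temperature",  # parameter to retrieve
        "datetime": "2021-08-25T12:00:00Z",   # single time instant
        "f": "CoverageJSON",                  # requested response encoding
    },
    timeout=30,
)
resp.raise_for_status()
coverage = resp.json()  # CoverageJSON document with values plus metadata
# Exact response structure depends on the encoding the server returns.
print(coverage["ranges"]["air_temperature"]["values"])
```

Because the query pattern and response encodings are standardised, a client written against one provider’s service can be pointed at another’s without rework – precisely the kind of friction the panellists want to remove.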
Greater interoperability and a move towards FAIR data principles that promote data availability beyond just meteorological experts aren’t purely technical problems, however: an ongoing theme of the workshop was the need for a greater focus on addressing weather and climate in terms of facts (observations) versus opinions (models, predictions, warnings and forecasts). While observations can be expressed well to users in terms of accuracy, currency and fitness for purpose, more needs to be done to ensure that the metadata, policies and technologies involved in communicating opinions properly inform users and decision-makers of the provenance, context, uncertainty/confidence and risks of use. A major ongoing challenge is improving the community’s ability to help decision-makers understand and act on the expressions of probability and risk provided to them by the weather sector.
All panellists agreed that there is a strong need for deeper partnerships and closer engagement between the public and private sectors as industry capabilities grow and create opportunities for change. Such partnerships will allow both sectors to better optimise the overall weather community business model. Many also felt that the key challenge facing the weather community is a rebalancing of responsibilities among private and public sector players in order to avoid duplication and conflicts of interest.
A fine forecast for the future
The workshop identified areas in which to further strengthen OGC’s contribution to improving the interoperability of weather data and information. From fast-evolving observation, prediction and modelling capabilities to data access, analytics and data provenance/quality, the ideas and challenges expressed at the workshop will help fuel OGC activities to develop, test, demonstrate and adopt interoperability solutions and community best practices that support the growing use and value of weather information.
However, OGC’s work is only as robust as the community that drives it. Come engage with – and become a part of – the OGC community to collaboratively address these challenges. Such engagement could be in the form of participation in our meteorology and oceanography domain working group (DWG) or other related DWGs, such as the artificial intelligence in geoinformatics DWG, or through sponsorship or participation in a future Innovation Program initiative.
With a robust community come robust standards that power innovation by encouraging diversity in the marketplace: an outlook that any predictive model will agree is ‘fine and sunny’.
Simon Chester is communication manager at OGC (www.ogc.org)