GWPrime

Enhanced Viewing Experience

Over the years, GIS has improved and scaled up to cope with maturing data, offering a totally different Earth Observation experience.

By Nicholas Duggan

In the last century, scientists mapped the world through ground observations. It was the sighting of veil-type clouds in the polar stratosphere that led scientists to set up the Halley Bay Observatory in Antarctica, which subsequently enabled them to observe a hole in the ozone layer. Little did they know that these ground observations with a Dobson spectrophotometer (which estimates total ozone by comparing the relative intensity of the Ultraviolet B (UVB) radiation reaching the ground with that of Ultraviolet A (UVA) radiation) would lead to scientists from across the world coming together to address a global challenge.

Those who grew up in the 1980s and ’90s will not forget hearing the news of a growing, ominous ozone hole, which was going to become the size of Paris if the world didn’t change its ways. For many, the news that the world wasn’t going to last forever came as a shock; our self-indulgent ways needed to change. Today’s headlines are a reflection of those earlier news stories: there are daily reports about rising tides and extreme heat, and the urgent need to consider our carbon footprint.

The ozone crisis brought scientists together, although work was still very disparate and siloed because of the challenge of sharing large or complex data such as imagery. This was a time before the Internet and digitalization, and large amounts of data were shared on physical hard drives. When new information was discovered (especially at a global level), it would often take a long time to be validated. It wasn’t until the start of this century, when the Internet had matured and Geographic Information Systems (GIS) had become widely available, that data from sources across the globe could be shared in hours rather than weeks.

Science modules of the Halley VI Antarctic Research Station

Change in demand

At the same time, ESA (the European Space Agency) and NASA had started to launch satellites into orbit with the intention of capturing imagery from space, one of the first instruments being GOME — the Global Ozone Monitoring Experiment. This was a high-resolution spectrometer that could capture data at a global scale over time. At that time, this was ground-breaking, even though it captured data in 40km cells — a far cry from the 1km or better data that we use today.

As the data matured, so did the demand for GIS and for ways of managing it. Satellite sensors improved, Remote Sensing became easier, and national mapping agencies started to digitize their paper products to provide topographic context to the emerging climate and meteorological data. For a geospatial specialist, it was a time akin to herding cats — data was evolving quickly and new definitive sources were appearing by the day. Without Cloud storage, many companies were stacking hard drives and filling servers with data. Back then, it was not uncommon to visit another GIS team and see a shelf full of hard drives labelled with different datasets.

A different picture

With time, GIS enabled larger and more complex Remote Sensing. With tools like QGIS and ArcMap incorporating statistical modeling and more complex raster spatial analysis tools, it became possible to gain insights on a global scale. In 2005, the Group on Earth Observations (GEO) was created to facilitate access to Earth Observation (EO) data, in response to the rising demand for such data in policy-making.

Modern EO in GIS is a totally different picture. There is data available in near real-time from almost all authoritative bodies, and agencies such as NASA and ESA operate multiple satellites that deliver high-resolution data in GIS-ready formats. Take Sentinel 5P as an example: it provides data for stratospheric ozone, air quality, and climate change monitoring at a horizontal resolution of roughly 7km × 3.5km per cell — far more detailed than the 40km GOME cells of the 1990s.
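
To give a feel for how readily this data can be used, here is a minimal sketch that opens a downloaded Sentinel 5P Level-2 nitrogen dioxide product in Python with xarray. The file name is hypothetical, and the group and variable names assume the standard Level-2 NetCDF layout, so adjust them to the product you actually download.

    # Minimal sketch: reading a Sentinel-5P Level-2 NO2 product with xarray.
    # The file name below is hypothetical; use a product downloaded from the
    # Copernicus data portal.
    import xarray as xr

    PRODUCT_FILE = "S5P_OFFL_L2__NO2____example.nc"  # hypothetical file name

    # Level-2 variables live in the "PRODUCT" group of the NetCDF file.
    ds = xr.open_dataset(PRODUCT_FILE, group="PRODUCT")

    no2 = ds["nitrogendioxide_tropospheric_column"]  # mol/m^2
    qa = ds["qa_value"]

    # Keep only good-quality retrievals (qa_value > 0.75 is the commonly
    # recommended threshold for the NO2 product).
    good = no2.where(qa > 0.75)

    print(good.mean().item(), "mol/m^2 (mean of good-quality pixels)")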

Today, EO data is close to saturation; there is information and data everywhere. In fact, it is starting to become difficult to validate sources. Even software providers like Esri have data hubs where they share current and relevant EO data.

Multiple data types

What is amazing is that with Cloud connectivity and high-speed Internet access, it is possible to get near real-time data on any part of the global geophysical system. For wave, sea-level or current data, there are buoys that return real-time readings through the NOAA (National Oceanic and Atmospheric Administration) or NOC (National Oceanography Centre) websites, alongside satellite data from NASA. For climate data, there are air-quality sensors all over the globe whose readings can be interpolated and accessed hourly from sources like the UK air-quality network, or from Sentinel 5 satellite data on the NASA website. Further, there are elevation data, imagery data, light-emission data, and even coastal-erosion data, all available almost instantly on a global scale.
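
As a small illustration of how accessible this has become, the sketch below pulls the latest observation from a NOAA National Data Buoy Center station over plain HTTP. The station ID is just an example, and the parsing assumes the standard real-time text format NDBC publishes; treat it as a starting point rather than a finished client.

    # Minimal sketch: fetching near real-time buoy observations from NOAA's
    # National Data Buoy Center (NDBC). Station 41001 is an example ID.
    import urllib.request

    STATION = "41001"  # example choice; any active NDBC station ID works
    URL = f"https://www.ndbc.noaa.gov/data/realtime2/{STATION}.txt"

    with urllib.request.urlopen(URL, timeout=30) as resp:
        lines = resp.read().decode("utf-8").splitlines()

    # The file starts with two header rows (column names, then units),
    # followed by the observations, newest first.
    header = lines[0].lstrip("#").split()
    latest = lines[2].split()
    record = dict(zip(header, latest))

    print(f"Station {STATION} at {record['YY']}-{record['MM']}-{record['DD']} "
          f"{record['hh']}:{record['mm']} UTC")
    print("Wave height (m):", record.get("WVHT"))
    print("Wind speed (m/s):", record.get("WSPD"))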

GIS has improved and scaled up to cope with this data explosion. Running QGIS in the Cloud puts servers with far more cores and RAM behind the analysis, processing the data faster than any desktop machine, while Esri’s Cloud platform, ArcGIS Online, allows larger modeling and analysis jobs to be run in the Cloud. GIS providers have also integrated programming and statistical languages such as Python and R. This means that models and scripts can be geospatially enabled, or geospatial context data can be brought into a framework with which the user is already familiar.
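
To give a flavour of what that Python integration looks like, the short sketch below opens an EO raster with the open-source rasterio library and summarizes one band with NumPy. The file name is hypothetical; the same pattern works for any georeferenced GeoTIFF exported from QGIS or ArcGIS.

    # Minimal sketch: summarizing an Earth Observation raster in Python.
    # Assumes an exported GeoTIFF; the file name is hypothetical.
    import numpy as np
    import rasterio

    RASTER_FILE = "sentinel_scene.tif"  # hypothetical export from a GIS package

    with rasterio.open(RASTER_FILE) as src:
        band = src.read(1, masked=True)   # first band, nodata masked out
        print("CRS:", src.crs)
        print("Pixel size:", src.res)

    print("Valid pixels:", band.count())
    print("Mean value:", float(band.mean()))
    print("95th percentile:", float(np.percentile(band.compressed(), 95)))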

Emergence of GeoAI

Geospatial Artificial Intelligence (GeoAI) and Machine Learning (ML) have also started to reduce the amount of work required to gain insights and run models from this data. By using ML to learn different patterns and models, we can use GeoAI and regression to make much more detailed predictions. And by applying the Artificial Intelligence (AI) technique of super resolution to sharpen imagery, a sensor’s effective resolution can be pushed several times beyond its native cell size.
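
As a rough sketch of the regression idea, the example below trains a random-forest model on a small table of EO-derived features to predict a target variable. The feature names and the data are synthetic and purely illustrative; in practice the inputs would come from rasters and sensor networks like those described above.

    # Minimal sketch: a regression model on EO-derived features.
    # The features and data here are synthetic and purely illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 500

    # Hypothetical per-cell features extracted from EO data.
    X = np.column_stack([
        rng.uniform(0, 1, n),      # e.g. vegetation index
        rng.uniform(280, 320, n),  # e.g. surface temperature (K)
        rng.uniform(0, 100, n),    # e.g. distance to coast (km)
    ])
    # Synthetic target loosely derived from the features, plus noise.
    y = 2.0 * X[:, 0] - 0.05 * (X[:, 1] - 300) + 0.01 * X[:, 2] + rng.normal(0, 0.1, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    print("R^2 on held-out cells:", round(model.score(X_test, y_test), 3))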

The value in Earth Observation data lies not only in understanding climate risk, but also in how it informs so many other datasets. In construction, data on carbon emissions, weather, flooding and more is important for understanding the impact on a project. The same goes for event planning, transport planning, and even farming. EO data is becoming invaluable for cutting costs, lowering carbon impact, and understanding the effect of industries on the world around us.
