Harnessing the Satellite Boom
Every year, more and more satellites are launched into Earth orbit, and improvements in sensor resolution and computing capability have produced an abundance of high-quality satellite data.
In the past, such data was often treated as a scarce commodity: rarely shared, and usually carrying a large price tag. In the last few years, however, data sources have expanded tremendously. Technological developments and political lobbying are creating a culture of shared and increasingly open information, which benefits both public agencies and private industry. Government agencies all over the world now provide their data freely through open-data initiatives.
Where satellites are concerned, a good example is the Copernicus programme, operated by the European Commission in partnership with the European Space Agency. It is currently the largest Earth observation programme and is designed to provide a comprehensive picture of the Earth, with all of the data collected by its different clusters of satellites available via open access. Companies involved in insurance and disaster risk assessment are rightly taking advantage of this rich data source, harnessing it to significantly broaden the scope of their data analyses.
Among the varied risks we look at, flooding is probably one of the most challenging perils to model. Thanks to these advancements in data collection and imaging, flood modelling has improved immensely in recent years. Flood models rely on a wide range of variables, many of which are derived from satellite imagery: elevation, precipitation, soil moisture levels and the detection of flood-prone areas are just some of the things we look at.
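To make the idea concrete, here is a minimal, purely illustrative sketch of how raster layers like those above could be combined into a flood-susceptibility score. The grids, weights and threshold are all made up for the example; a real flood model is far more sophisticated and calibrates its parameters against observed events.

```python
import numpy as np

# Toy 2x2 rasters (made-up values): elevation in metres, rainfall in mm,
# and soil moisture as a 0-1 fraction. Real models use many more inputs.
elevation = np.array([[12.0, 10.0], [3.0, 1.5]])
rainfall = np.array([[80.0, 95.0], [120.0, 140.0]])
soil_moisture = np.array([[0.3, 0.4], [0.7, 0.9]])

def flood_susceptibility(elev, rain, moisture):
    """Weighted score in [0, 1]: low, wet, rain-soaked cells score highest."""
    # Normalise each layer to 0-1, inverting elevation (lower ground = higher risk)
    elev_n = 1.0 - (elev - elev.min()) / (elev.max() - elev.min())
    rain_n = (rain - rain.min()) / (rain.max() - rain.min())
    # Illustrative weights; a calibrated model would fit these to observations
    return 0.4 * elev_n + 0.35 * rain_n + 0.25 * moisture

score = flood_susceptibility(elevation, rainfall, soil_moisture)
flood_prone = score > 0.6  # threshold flags the most susceptible cells
```

In this toy grid, the low-lying, rain-soaked cell scores close to 1 and is flagged as flood-prone, while the high, dry cell scores near 0.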
In the past, these models relied on fairly coarse satellite data, generally captured at a resolution of about 90 metres. Now, data is captured at resolutions of 30 metres or finer: three times sharper in each dimension, which means nine times as many pixels covering the same area as just a few years ago.
These developments have led to more catastrophe models being released, with higher levels of accuracy and at increasingly affordable prices.
A Need for Speed
Nowadays, there are satellites capable of monitoring fires, precipitation levels and ground motion from earthquakes in near real time, at resolutions down to 30 cm. The deployment of interferometers (devices that measure small displacements and surface irregularities by comparing the phase of different wave signals, usually light or radio waves) and image reclassification techniques allows insurers to map natural and man-made catastrophes as they unfold over time.
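The interferometry idea above fits in a few lines of maths: for a repeat-pass radar, the phase difference between two acquisitions maps to ground displacement along the line of sight, with a factor of 4π per wavelength because the signal travels to the ground and back. The sketch below uses a C-band wavelength of roughly 5.5 cm (as on Sentinel-1) and a made-up phase shift for illustration.

```python
import math

# Wavelength of a C-band radar (~5.5 cm); the phase shift is illustrative.
wavelength_m = 0.055
phase_shift_rad = math.pi / 2  # quarter-cycle phase difference between passes

# Repeat-pass interferometry: two-way travel gives 4*pi of phase per wavelength
displacement_m = (phase_shift_rad * wavelength_m) / (4 * math.pi)
# A quarter-cycle phase change corresponds to about 7 mm of line-of-sight motion
```

Millimetre-scale sensitivity from centimetre-scale wavelengths is exactly why interferometry is so useful for tracking ground motion after earthquakes.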
In the past, when a large natural catastrophe occurred, insurance companies often relied on external modelling companies to provide data, often with a waiting time of several days. With the advent of free, open satellite imagery specially developed for catastrophe detection, however, post-event analysis is now much faster, and often much more accurate too.
For example, it is now possible to map wildfires at a resolution of 375 m, with a delay of just three hours between the acquisition and delivery of the data. During the recent Canadian wildfires, the fast deployment of these datasets meant ultra-quick estimates of the affected areas.
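A rough affected-area estimate from a 375 m active-fire product is simple arithmetic: count the detected pixels and multiply by the ground area each one represents. The mask below is a small made-up example, not real fire data.

```python
import numpy as np

# Each True pixel stands for one 375 m x 375 m active-fire detection.
pixel_size_m = 375
fire_mask = np.array([
    [0, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
], dtype=bool)

burning_pixels = int(fire_mask.sum())
# Convert pixel size to km before squaring to get the area per pixel
area_km2 = burning_pixels * (pixel_size_m / 1000.0) ** 2
# 6 detections * ~0.14 km^2 each = ~0.84 km^2 affected
```

In practice the same calculation is run over whole scenes with millions of pixels, which is why estimates can follow so quickly after acquisition.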
Challenges and Limitations
Despite these vast improvements, however, data quality can still be a challenge. Synthetic Aperture Radar (SAR) data, often used to detect floods, is a good example. Despite reaching resolutions down to just a few centimetres, backscatter data (the signals reflected from different surfaces back to the satellite) often suffers from heavy speckle noise, made worse in built-up areas by microwaves bouncing off vertical surfaces such as buildings. As a result, SAR-derived imagery is less accurate in urban environments, where it is often needed most.
Another constraint comes from the availability of resources. The size of the datasets and the complexity of turning raw remote-sensing data into processed imagery often require large amounts of data storage, substantial computing power, specialist software and dedicated expert personnel.
Looking to the Future
The abundance of data from different satellites, each with slightly different capabilities and limitations, means that exploiting this data is not without its challenges. Nonetheless, there is no doubt that data derived from satellites is revolutionizing the way we work, allowing us to confront natural catastrophes faster and more responsively.
In the future, developing specialist know-how is going to be paramount for all risk assessment professionals trying to make best use of the enormous amount of data we have at our disposal.
This data can enable us to develop a clearer and deeper understanding of the dynamics of complex natural catastrophes. Vast increases in the availability and quality of satellite data are helping (re)insurers to price risks more accurately and create increasingly detailed risk models. And perhaps most importantly, these enhancements are helping companies and local governments in high-risk areas to better understand the nature of the threats they face, and to take more informed actions to prepare for an event and respond quickly and effectively in its aftermath.
Want to know more? You can reach Giacomo at email@example.com