Published on 2 August 2021
Near-real-time mapping of water bodies from satellite imagery plays a critical role in water management. Continuous monitoring of environmental change over time, such as estimating water availability or predicting floods and droughts, is essential to human activities including agriculture, hydrology and water management, and Copernicus Sentinel-2 data are lending a hand.
Researchers from the SpaceTimeLab of University College London (UCL) and the Joint Research Centre (JRC) of the European Commission recently carried out a study on convolutional neural networks (CNNs) for water segmentation, using Copernicus Sentinel-2 red, green, blue (RGB) composites as well as spectral indices derived from the near-infrared (NIR) and short-wavelength infrared (SWIR) bands.
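For readers unfamiliar with band-derived spectral indices, the sketch below shows two indices commonly used for water mapping from Sentinel-2: NDWI (green vs. NIR) and MNDWI (green vs. SWIR). This is an illustration of the general technique, not the study's exact method; the mapping of Sentinel-2 bands B3 (green), B8 (NIR) and B11 (SWIR) to the index formulas is standard, but the toy reflectance values are invented for demonstration.

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters' Normalized Difference Water Index: (G - NIR) / (G + NIR).
    Open water tends toward +1; vegetation and bare soil trend negative."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-10)  # epsilon avoids divide-by-zero

def mndwi(green, swir):
    """Xu's Modified NDWI: (G - SWIR) / (G + SWIR).
    Often separates water from built-up surfaces better than NDWI."""
    green = np.asarray(green, dtype=float)
    swir = np.asarray(swir, dtype=float)
    return (green - swir) / (green + swir + 1e-10)

# Toy surface reflectances for two pixels (hypothetical values):
# pixel 0 behaves like water (green > NIR/SWIR), pixel 1 like vegetation.
green = np.array([0.10, 0.05])  # Sentinel-2 band B3
nir   = np.array([0.02, 0.30])  # Sentinel-2 band B8
swir  = np.array([0.01, 0.20])  # Sentinel-2 band B11

print(ndwi(green, nir))   # water pixel positive, vegetation pixel negative
print(mndwi(green, swir))
```

A per-pixel threshold (often around 0) on such an index is the kind of traditional spectral method the study compares CNNs against.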
Their study aimed to determine whether a single CNN based on RGB image classification can effectively segment water on a global scale and outperform traditional spectral methods. Additionally, the study evaluated the extent to which smaller datasets covering very complex patterns (e.g. harbour megacities) can be used to improve globally applicable CNNs within a specific region.
Data from the Sentinel-2 satellites of the European Union's Copernicus Programme were used. The mission comprises a constellation of two polar-orbiting satellites placed in the same sun-synchronous orbit, phased at 180° to each other. It monitors variability in land surface conditions, and its wide swath (290 km) and frequent revisit (10 days at the equator with one satellite, 5 days with two satellites under cloud-free conditions, resulting in 2-3 days at mid-latitudes) support monitoring of changes on Earth's surface.