Spatial-temporal fraction map fusion with multi-scale remotely sensed images


Bibliographic Details
Main Authors: Zhang, Yihang, Foody, Giles M., Ling, Feng, Li, Xiaodong, Ge, Yong, Du, Yun, Atkinson, Peter M.
Format: Article
Published: Elsevier 2018
Subjects:
Online Access:https://eprints.nottingham.ac.uk/51875/
_version_ 1848798594210988032
author Zhang, Yihang
Foody, Giles M.
Ling, Feng
Li, Xiaodong
Ge, Yong
Du, Yun
Atkinson, Peter M.
author_facet Zhang, Yihang
Foody, Giles M.
Ling, Feng
Li, Xiaodong
Ge, Yong
Du, Yun
Atkinson, Peter M.
author_sort Zhang, Yihang
building Nottingham Research Data Repository
collection Online Access
description Given the common trade-off between the spatial and temporal resolutions of current satellite sensors, spatial-temporal data fusion methods can be applied to produce fused remotely sensed data with synthetic fine spatial resolution (FR) and a high repeat frequency. Such fused data are required to provide a comprehensive understanding of Earth's surface land cover dynamics. In this research, a novel Spatial-Temporal Fraction Map Fusion (STFMF) model is proposed to produce a series of fine-spatial-temporal-resolution land cover fraction maps by fusing coarse-spatial-fine-temporal and fine-spatial-coarse-temporal fraction maps, which may be generated from multi-scale remotely sensed images. STFMF has two main stages. First, FR fraction change maps are generated using kernel ridge regression. Second, an FR fraction map for the prediction date is produced using a temporal-weighted fusion model. Compared with two established spatial-temporal fusion approaches, the spatial-temporal super-resolution land cover mapping model and the spatial-temporal image reflectance fusion model, STFMF has the following characteristics and advantages: (1) it takes account of the mixed-pixel problem in FR remotely sensed images; (2) it uses fraction maps directly as input, which can be generated from a range of satellite images or other suitable data sources; (3) it focuses on estimating fraction changes through time and can, therefore, predict land cover change more accurately. Experiments using synthetic multi-scale fraction maps simulated from Google Earth images, as well as synthetic and real MODIS-Landsat images, were undertaken to test the performance of the proposed STFMF approach against two benchmark spatial-temporal reflectance fusion methods: the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM) and the Flexible Spatiotemporal Data Fusion (FSDAF) model.
In both visual and quantitative evaluations, STFMF generated more accurate FR fraction maps and provided more spatial detail than ESTARFM and FSDAF, particularly in areas with substantial land cover change. STFMF has great potential to produce accurate fine-spatial-temporal-resolution time-series fraction maps that can support studies of land cover dynamics at the sub-pixel scale.
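The two-stage idea described in the abstract (kernel ridge regression to estimate FR fraction changes, then a temporal-weighted fusion of predictions from two base dates) can be sketched as below. This is a minimal illustrative sketch only, not the authors' published STFMF implementation: the function names, the Gaussian kernel choice, the regularisation parameter, and the inverse-temporal-distance weights are all assumptions.

```python
import numpy as np

def krr_fit(X, y, sigma=1.0, lam=1e-3):
    """Fit Gaussian-kernel ridge regression: alpha = (K + lam*I)^-1 y.

    X: (n, d) training inputs (e.g. coarse-resolution fraction changes);
    y: (n,) training targets (e.g. FR fraction changes).
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return X, alpha, sigma

def krr_predict(model, Xq):
    """Predict targets for query inputs Xq with a fitted KRR model."""
    X, alpha, sigma = model
    d2 = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return K @ alpha

def temporal_weighted_fusion(pred_from_t1, pred_from_t3, t1, t2, t3):
    """Fuse two FR fraction predictions for date t2, one propagated from
    base date t1 and one from base date t3, weighting each prediction
    inversely by its temporal distance to t2 (assumes t1 < t2 < t3)."""
    w1 = 1.0 / abs(t2 - t1)
    w3 = 1.0 / abs(t3 - t2)
    return (w1 * pred_from_t1 + w3 * pred_from_t3) / (w1 + w3)
```

As a usage illustration, one would fit `krr_fit` on paired coarse/fine fraction changes between the base dates, apply `krr_predict` to coarse changes observed up to the prediction date, and combine the forward and backward predictions with `temporal_weighted_fusion`.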
first_indexed 2025-11-14T20:22:15Z
format Article
id nottingham-51875
institution University of Nottingham Malaysia Campus
institution_category Local University
last_indexed 2025-11-14T20:22:15Z
publishDate 2018
publisher Elsevier
recordtype eprints
repository_type Digital Repository
spelling nottingham-518752020-05-04T19:48:38Z https://eprints.nottingham.ac.uk/51875/ Spatial-temporal fraction map fusion with multi-scale remotely sensed images Zhang, Yihang Foody, Giles M. Ling, Feng Li, Xiaodong Ge, Yong Du, Yun Atkinson, Peter M. Elsevier 2018-08-30 Article PeerReviewed Zhang, Yihang, Foody, Giles M., Ling, Feng, Li, Xiaodong, Ge, Yong, Du, Yun and Atkinson, Peter M. (2018) Spatial-temporal fraction map fusion with multi-scale remotely sensed images. Remote Sensing of Environment, 213. pp. 162-181. ISSN 0034-4257 Land cover; Fraction maps; Spatial-temporal fusion; Spectral unmixing; Super-resolution mapping https://www.sciencedirect.com/science/article/pii/S0034425718302311 doi:10.1016/j.rse.2018.05.010
spellingShingle Land cover; Fraction maps; Spatial-temporal fusion; Spectral unmixing; Super-resolution mapping
Zhang, Yihang
Foody, Giles M.
Ling, Feng
Li, Xiaodong
Ge, Yong
Du, Yun
Atkinson, Peter M.
Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title_full Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title_fullStr Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title_full_unstemmed Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title_short Spatial-temporal fraction map fusion with multi-scale remotely sensed images
title_sort spatial-temporal fraction map fusion with multi-scale remotely sensed images
topic Land cover; Fraction maps; Spatial-temporal fusion; Spectral unmixing; Super-resolution mapping
url https://eprints.nottingham.ac.uk/51875/