A Bayesian data fusion approach to spatio-temporal fusion of remotely sensed images

Bibliographic Details
Main Authors: Xue, J., Leung, Yee-Hong, Fung, T.
Format: Journal Article
Published: MDPI AG 2017
Online Access:http://hdl.handle.net/20.500.11937/66831
building Curtin Institutional Repository
description Remote sensing provides rich sources of data for the monitoring of land surface dynamics. However, single-sensor systems are constrained from providing spatially high-resolution images with high revisit frequency due to inherent sensor design limitations. To obtain images high in both spatial and temporal resolution, a number of image fusion algorithms, such as the spatial and temporal adaptive reflectance fusion model (STARFM) and enhanced STARFM (ESTARFM), have recently been developed. To capitalize on the information available in a fusion process, we propose a Bayesian data fusion approach that incorporates the temporal correlation information in the image time series and casts the fusion problem as an estimation problem in which the fused image is obtained by the maximum a posteriori (MAP) estimator. The proposed approach provides a formal framework for the fusion of remotely sensed images with a rigorous statistical basis; it imposes no requirements on the number of input image pairs; and it is suitable for heterogeneous landscapes. The approach is empirically tested with both simulated and real Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) images. Experimental results demonstrate that the proposed method outperforms STARFM and ESTARFM, especially for heterogeneous landscapes. It produces surface reflectances highly correlated with those of the reference Landsat images. It gives spatio-temporal fusion of remotely sensed images a solid theoretical and empirical foundation that may be extended to solve more complicated image fusion problems.
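To illustrate the MAP estimation idea at the heart of the abstract, the following is a minimal per-pixel sketch under assumed Gaussian models. It is not the paper's method: the actual approach also exploits temporal correlation across the image time series and the fine/coarse sensor relationship, neither of which is modeled here. All names (`map_fuse`, `sigma_prior`, `sigma_obs`) are illustrative assumptions.

```python
import numpy as np

# Assumed toy model for this sketch:
#   prior:      x ~ N(x_prior, sigma_prior^2)  (e.g. reflectance from a past fine-resolution image)
#   likelihood: y | x ~ N(x, sigma_obs^2)      (e.g. a noisier coarse-resolution observation)
# With both densities Gaussian, the MAP estimate is the precision-weighted average:
#   x_map = (x_prior / sigma_prior^2 + y / sigma_obs^2) / (1 / sigma_prior^2 + 1 / sigma_obs^2)

def map_fuse(x_prior, sigma_prior, y_obs, sigma_obs):
    """Per-pixel MAP estimate for a Gaussian prior and Gaussian likelihood."""
    w_prior = 1.0 / sigma_prior ** 2  # precision of the prior
    w_obs = 1.0 / sigma_obs ** 2      # precision of the observation
    return (w_prior * x_prior + w_obs * y_obs) / (w_prior + w_obs)

# Fine-resolution prior reflectances and a noisier coarse observation of the same pixels.
x_prior = np.array([0.10, 0.20, 0.30])
y_obs = np.array([0.14, 0.18, 0.36])
fused = map_fuse(x_prior, sigma_prior=0.02, y_obs=y_obs, sigma_obs=0.04)
print(fused)  # each fused value lies between the prior and the observation,
              # pulled toward the more precise (lower-variance) source
```

Because the prior here is assumed four times more precise than the observation, the fused reflectance sits 80% of the way toward the prior; a model with temporal correlation, as described in the abstract, would let that weighting vary with how recently the fine-resolution image was acquired.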
id curtin-20.500.11937-66831
institution Curtin University Malaysia
doi 10.3390/rs9121310
rights http://creativecommons.org/licenses/by/4.0/