Remote sensing satellites provide two types of images: multispectral images with low spatial resolution and panchromatic images with high spatial resolution. These are fused after enlarging the low-resolution images to the size of the panchromatic ones. Panchromatic images have good spatial resolution but only one spectral band, whereas multispectral images typically have four or eight bands but roughly four times lower spatial resolution. Image fusion of this type seeks to combine the spatial detail of the panchromatic image with the spectral content of the multispectral image, yielding an image with both high spatial and high spectral resolution. Several techniques have been developed to perform this fusion; among those with low computational cost are the EIHS, Brovey, and averaging algorithms. To compare them, in this work natural color photographs are used, from which high-resolution monochromatic and lower-resolution chromatic images are derived to emulate the satellite scenario. The low-resolution color images were interpolated using three satellite image interpolation techniques, and the fusion techniques were evaluated by computing the quantitative spectral and spatial ERGAS indices and the RMSE. The EIHS and Brovey techniques were found to produce artifacts because the color component values can fall outside the representation interval [0, 255]. After correcting this issue, the EIHS and Brovey methods, in that order, produced the lowest RMSE, followed by the averaging method. Since this ranking proved inconsistent with the one given by the mean ERGAS, a new normalized mean ERGAS is proposed that gives a better indication of fusion quality and agrees with the RMSE ranking.
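To make the out-of-range artifact concrete, the Brovey fusion mentioned above can be sketched as follows. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the function name is illustrative, the multispectral image is assumed to be already interpolated to the panchromatic size, and clipping to the 8-bit interval [0, 255] stands in for whatever correction the work actually applies.

```python
import numpy as np

def brovey_fuse(ms, pan):
    """Brovey fusion: scale each multispectral band by PAN / intensity.

    ms  : (H, W, B) multispectral image, already interpolated to PAN size
    pan : (H, W) panchromatic image
    """
    intensity = ms.sum(axis=2)
    # Guard against division by zero in dark regions.
    ratio = pan / np.maximum(intensity, 1e-6)
    fused = ms * ratio[..., None]
    # Without this clipping, component values can leave [0, 255],
    # which is the source of the artifacts discussed above.
    return np.clip(fused, 0.0, 255.0)
```

The clipping step is where the correction happens: the ratio `pan / intensity` can exceed 1 wherever the panchromatic value is brighter than the summed bands, pushing individual components past 255.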
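The spectral ERGAS index used for the comparison has a standard definition: ERGAS = 100 (h/l) sqrt((1/N) Σ_k MSE_k / μ_k²), where h/l is the ratio of panchromatic to multispectral pixel sizes, N the number of bands, and μ_k the mean of band k of the reference. A sketch of this standard form (the function name and default ratio are assumptions; the normalized mean ERGAS proposed in the work itself is not reproduced here):

```python
import numpy as np

def ergas(fused, reference, ratio=0.25):
    """Standard spectral ERGAS between a fused image and a reference.

    fused, reference : (H, W, B) arrays
    ratio            : h / l, PAN pixel size over MS pixel size
                       (1/4 for the common 4:1 resolution ratio)
    """
    bands = reference.shape[2]
    acc = 0.0
    for k in range(bands):
        mse_k = np.mean((fused[..., k] - reference[..., k]) ** 2)
        mu_k = reference[..., k].mean()
        acc += mse_k / mu_k ** 2
    return 100.0 * ratio * np.sqrt(acc / bands)
```

Lower values indicate better spectral fidelity; a fused image identical to the reference yields ERGAS = 0.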