

DRISHTI: SAR Image Colorization System Using GAN and CNN
Abstract
This work involves the design of a machine learning–driven generative model for the colorization of Synthetic Aperture Radar (SAR) images. Although SAR imagery is valuable for remote sensing because of its all-weather, day-and-night imaging capability, it is typically grayscale and difficult to interpret. To address this, a Multidomain Cycle-Consistency Generative Adversarial Network (MC-GAN) will be developed to translate grayscale SAR images into realistic, terrain-specific colored images. The system encodes terrain types as mask vectors and applies a multidomain classification loss to enforce correct, consistent colorization across farmland, desert, urban, and rural scenes. An L1 cycle-consistency loss enables training on unpaired SAR–optical datasets, overcoming the limitation of traditional paired-data methods such as pix2pix. An application framework will provide visualized outputs, making interpretation easier for non-experts and supporting applications such as terrain classification, remote sensing data fusion, and visualization. The expected results are more realistic and interpretable SAR imagery, reduced dependence on paired datasets, and improved accuracy and reliability in remote sensing analysis.
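The two loss components named above can be illustrated concretely. The following is a minimal NumPy sketch, not the actual implementation: the terrain list, function names, and shapes are hypothetical, and the reconstruction passed to the cycle loss stands in for the output of the SAR → optical → SAR generator round trip described in the abstract.

```python
import numpy as np

# Hypothetical terrain domains; the paper's actual label set may differ.
TERRAINS = ["farmland", "desert", "urban", "rural"]

def terrain_mask_vector(terrain: str) -> np.ndarray:
    """One-hot mask vector identifying the target terrain domain,
    concatenated to the generator input to condition colorization."""
    vec = np.zeros(len(TERRAINS), dtype=np.float32)
    vec[TERRAINS.index(terrain)] = 1.0
    return vec

def l1_cycle_consistency_loss(sar: np.ndarray, reconstructed: np.ndarray) -> float:
    """L1 cycle loss: mean absolute error between the input SAR image and
    its reconstruction after the SAR -> optical -> SAR round trip. Because
    it compares an image only with its own reconstruction, no paired
    optical ground truth is needed."""
    return float(np.mean(np.abs(sar - reconstructed)))

def domain_classification_loss(logits: np.ndarray, terrain: str) -> float:
    """Multidomain classification loss: cross-entropy between the
    discriminator's terrain logits and the target domain label."""
    log_probs = logits - np.log(np.sum(np.exp(logits)))  # log-softmax
    return float(-log_probs[TERRAINS.index(terrain)])

# Toy usage: a perfect reconstruction yields zero cycle loss.
sar = np.zeros((64, 64), dtype=np.float32)
print(l1_cycle_consistency_loss(sar, sar))      # 0.0
print(terrain_mask_vector("urban"))             # [0. 0. 1. 0.]
```

In a full training loop these terms would be weighted and summed with the adversarial loss; the weights are tuning choices not specified in the abstract.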