A Case Study on Image Co-Registration of Hyper-Spectral and Dual (L & S) Band SAR Data and Ore Findings Over Zewar Mines, India

Main Article Content

Dipanjan Dutta
Tamesh Halder
Abhishek Penchala
Kandukoori Vamshi Krishna
Grajula Prashnath
Debashish Chakravarty

Abstract

Image co-registration is the technique of superimposing two or more images so that, in each image, the same pixel corresponds to the same location in the target scene. It is a crucial stage in the enhancement of satellite imagery, since different frequency bands capture different features. Image fusion then superimposes co-registered images acquired by several sensors to obtain a single, superior image that incorporates elements from all sources; co-registration therefore yields a more detailed single image than several separate images with distinct attributes. We estimate pixel offsets between potentially coherent image pairs on many match patches evenly dispersed over the two scenes. This study applies existing fusion methods to ASAR (Airborne Synthetic Aperture Radar) images in the S-band and L-band to interpret urban, forest, and agricultural areas. AVIRIS hyperspectral data further indicates mining potential over the ore-bearing region. The identification of the ore region, together with fusion-based co-registration, thus supports remote sensing workflows that complement drone-based surveys.
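The abstract outlines two core steps: estimating pixel offsets on match patches evenly dispersed over the two scenes, and fusing the co-registered images. The minimal sketch below illustrates that idea in Python with FFT-based phase cross-correlation and a simple weighted-average fusion. It is not the fusion pipeline evaluated in the paper; the file names, patch size, sampling step, and fusion weight are illustrative assumptions, and the two amplitude images are assumed to be already resampled to a common grid.

```python
import numpy as np

def patch_offset(ref_patch, mov_patch):
    # Phase cross-correlation: the peak of the inverse FFT of the normalized
    # cross-power spectrum gives the integer (row, col) shift between patches.
    f_ref = np.fft.fft2(ref_patch)
    f_mov = np.fft.fft2(mov_patch)
    cross = f_ref * np.conj(f_mov)
    cross /= np.abs(cross) + 1e-12
    corr = np.abs(np.fft.ifft2(cross))
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    # Shifts beyond half the patch size wrap around to negative values.
    size = np.array(corr.shape, dtype=float)
    peak[peak > size / 2] -= size[peak > size / 2]
    return peak

def coregister_offset(ref_img, mov_img, patch=128, step=256):
    # Estimate offsets on match patches evenly dispersed over the scene and
    # reduce them to one global translation (the median is robust to patches
    # with little texture or low coherence).
    offsets = []
    rows, cols = ref_img.shape
    for r in range(0, rows - patch + 1, step):
        for c in range(0, cols - patch + 1, step):
            offsets.append(patch_offset(ref_img[r:r + patch, c:c + patch],
                                        mov_img[r:r + patch, c:c + patch]))
    return np.median(np.array(offsets), axis=0)

def fuse(ref_img, aligned_img, weight=0.5):
    # Placeholder fusion: a weighted average of the co-registered amplitudes.
    return weight * ref_img + (1.0 - weight) * aligned_img

# Hypothetical usage with placeholder file names for the two SAR amplitude images.
l_band = np.load("l_band_amplitude.npy")
s_band = np.load("s_band_amplitude.npy")
dr, dc = coregister_offset(l_band, s_band)
s_aligned = np.roll(s_band, shift=(int(round(dr)), int(round(dc))), axis=(0, 1))
fused = fuse(l_band, s_aligned)
```

In practice, the per-patch offsets would be fitted to a polynomial warp rather than reduced to a single median shift, and the averaging step would be replaced by one of the fusion methods compared in the study.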


Article Details

How to Cite
Dutta, D., Halder, T., Penchala, A., Vamshi Krishna, K., Prashnath, G., & Chakravarty, D. (2024). A Case Study on Image Co-Registration of Hyper-Spectral and Dual (L & S) Band SAR Data and Ore Findings Over Zewar Mines, India. International Journal of Emerging Science and Engineering (IJESE), 12(6), 17-25. https://doi.org/10.35940/ijese.A8055.12060524
Section
Articles
Author Biographies

Abhishek Penchala, Department of Mining Engineering, IIT Kharagpur (West Bengal), India.



Kandukoori Vamshi Krishna, Department of Mining Engineering, Zewar Mines, Zewar (Rajasthan), India.





Debashish Chakravarty, Department of Mining Engineering, IIT Kharagpur (West Bengal), India.




References

M. Costantini et al., "Automatic Coregistration of SAR and Optical Images Exploiting Complementary Geometry and Mutual Information," IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium, 2018, pp. 8877-8880, doi: 10.1109/IGARSS.2018.8519242. https://doi.org/10.1109/IGARSS.2018.8519242

A. Plyer, E. Colin-Koeniguer and F. Weissgerber, "A New Coregistration Algorithm for Recent Applications on Urban SAR Images," in IEEE Geoscience and Remote Sensing Letters, vol. 12, no. 11, pp. 2198-2202, Nov. 2015, doi: 10.1109/LGRS.2015.2455071. https://doi.org/10.1109/LGRS.2015.2455071

Scheffler, Daniel & Hollstein, André & Diedrich, Hannes & Segl, Karl & Hostert, Patrick. (2017). AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data. Remote Sensing, 9(7), 676. https://doi.org/10.3390/rs9070676

D. P. Bavirisetti and R. Dhuli, "Fusion of Infrared and Visible Sensor Images Based on Anisotropic Diffusion and Karhunen-Loeve Transform," in IEEE Sensors Journal, vol. 16, no. 1, pp. 203-209, Jan.1, 2016, doi: 10.1109/JSEN.2015.2478655. https://doi.org/10.1109/JSEN.2015.2478655

Shreyamsha Kumar, B. K. (2015). Image fusion based on pixel significance using cross bilateral filter. Signal, Image and Video Processing, 9, 1193-1204. https://doi.org/10.1007/s11760-013-0556-9

Liu, Yu & Chen, Xun & Cheng, Juan & Peng, Hu & Wang, Z.. (2017). Infrared and visible image fusion with convolutional neural networks. International Journal of Wavelets, Multiresolution and Information Processing. 16. 10.1142/S0219691318500182. https://doi.org/10.1142/S0219691318500182

Zhou, Zhiqiang & Dong, Mingjie & Xie, Xiaozhu & Gao, Zhifeng. (2016). Fusion of infrared and visible images for night-vision context enhancement. Applied Optics, 55, 6480-6490. https://doi.org/10.1364/AO.55.006480

Civardi, Gaia Letizia & Bechini, Michele & Colombo, Alessandro & Quirino, Matteo & Piccinin, Margherita & Lavagna, Michelle. (2022). VIS-TIR Imaging for Uncooperative Objects Proximity Navigation: a Tool for Development and Testing.

Bavirisetti, Durga & Xiao, Gang & Zhao, Junhao & Dhuli, Ravindra & Liu, Gang. (2019). Multi-scale Guided Image and Video Fusion: A Fast and Efficient Approach. Circuits, Systems, and Signal Processing. 38. 10.1007/s00034-019-01131-z. https://doi.org/10.1007/s00034-019-01131-z

Durga Prasad Bavirisetti, Ravindra Dhuli, Two-scale image fusion of visible and infrared images using saliency detection, Infrared Physics & Technology, Volume 76, 2016, Pages 52-64, ISSN 1350-4495. https://doi.org/10.1016/j.infrared.2016.01.009

Ma, Jinlei & Zhou, Zhiqiang & Wang, Bo & Zong, Hua. (2017). Infrared and visible image fusion based on visual saliency map and weighted least square optimization. Infrared Physics & Technology, 82. https://doi.org/10.1016/j.infrared.2017.02.005

Z. Liu, E. Blasch, Z. Xue, J. Zhao, R. Laganiere, and W. Wu, "Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: A comparative study," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, pp. 94–109, 2012. https://doi.org/10.1109/TPAMI.2011.109

X. Zhang, P. Ye and G. Xiao, "VIFB: A Visible and Infrared Image Fusion Benchmark," 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Seattle, WA, USA, 2020, pp. 468-478, doi: 10.1109/CVPRW50498.2020.00060. https://doi.org/10.1109/CVPRW50498.2020.00060

D. P. Bavirisetti, G. Xiao and G. Liu, "Multi-sensor image fusion based on fourth order partial differential equations," 2017 20th International Conference on Information Fusion (Fusion), 2017, pp. 1-9, doi: 10.23919/ICIF.2017.8009719. https://doi.org/10.23919/ICIF.2017.8009719

Jiayi Ma, Chen Chen, Chang Li, Jun Huang, Infrared and visible image fusion via gradient transfer and total variation minimization, Information Fusion, 2016.

Lloyd Haydn Hughes, Diego Marcos, Sylvain Lobry, Devis Tuia, Michael Schmitt, "A deep learning framework for matching of SAR and optical imagery", ISPRS Journal of Photogrammetry and Remote Sensing, Volume 169, 2020, Pages 166-179, ISSN 0924-2716. https://doi.org/10.1016/j.isprsjprs.2020.09.012

Aritro Pal Choudhury, Tamesh Halder, Rintu Kumar Gayen, Arundhati Misra Ray, Debashish Chakravarty, "C-band and L-band AirSAR image fusion technique using anisotropic diffusion", Materials Today: Proceedings, Volume 58, Part 1, 2022, Pages 433-436. https://doi.org/10.1016/j.matpr.2022.02.393

Pajares, Gonzalo & de la Cruz, Jesús. (2004). A wavelet-based image fusion tutorial. Pattern Recognition. 37. 1855-1872. 10.1016/j.patcog.2004.03.010. https://doi.org/10.1016/S0031-3203(04)00103-7

Ehlers, Manfred & Klonus, Sascha & Johan Åstrand, Pär. "Multi-sensor image fusion for pansharpening in remote sensing." International Journal of Image and Data Fusion, pp. 25-45. Taylor & Francis, ISSN 1947-9832. https://doi.org/10.1080/19479830903561985

M. Neumann, L. Ferro-Famil, and E. Pottier, "A general model-based polarimetric decomposition scheme for vegetated areas," in Proceedings of the 4th International Workshop on Science and Applications of SAR Polarimetry and Polarimetric Interferometry (ESRIN), Frascati, Italy, 26–30, Citeseer (2009).

Safy, M., Eltanany, A. S., & Amein, A. S. (2019). SAR Images Co-registration Based on Gradient Descent Optimization. In International Journal of Innovative Technology and Exploring Engineering (Vol. 9, Issue 2, pp. 2361–2367). https://doi.org/10.35940/ijitee.b6226.129219

Abdel-Wahab, A. M., Abdel-Gawad, A. K., & Awad, A. A. D. I. (2020). Urban Expansion Classification using the Change Detection of High-Resolution Images, for Jeddah Province. In International Journal of Recent Technology and Engineering (IJRTE) (Vol. 8, Issue 6, pp. 5080–5092). Blue Eyes Intelligence Engineering and Sciences Publication - BEIESP. https://doi.org/10.35940/ijrte.f9813.038620

Image Enhancement based on Fusion using 2D LPDCT and Modified PCA. (2019). In International Journal of Engineering and Advanced Technology (Vol. 8, Issue 6S3, pp. 1482–1492). https://doi.org/10.35940/ijeat.f1264.0986s319

Sharma, K., & Garg, N. (2021). An Extensive Review on Image Segmentation Techniques. In Indian Journal of Image Processing and Recognition (Vol. 1, Issue 2, pp. 1–5). https://doi.org/10.54105/ijipr.b1002.061221

Nasir, F. M., & Watabe, H. (2020). Validation of the Image Registration Technique from Functional Near Infrared Spectroscopy (fNIRS) Signal and Positron Emission Tomography (PET) Image. In International Journal of Management and Humanities (Vol. 4, Issue 9, pp. 63–69). https://doi.org/10.35940/ijmh.i0877.054920
