Underwater image enhancement using multiple processing techniques based on DCP, CLAHE, CNNs, and U-Net
DOI: https://doi.org/10.37868/hsd.v7i2.1596

Abstract
Over the last decade, interest in the underwater world has grown, driven by its abundant resources and diverse aquatic species, which serve as sources of food and energy. Making underwater scenes naturally visible is difficult because of color loss in the blue and red channels, together with darkness, haze, refraction, and scattering. Effective monitoring and control of underwater environments therefore depends on image enhancement. This work develops a fusion algorithm that combines several techniques in sequence: haze removal; luminance improvement; noise reduction with edge preservation; extraction of fine details; multi-level analysis to enhance illumination; a trained model that extracts and enhances image features for better visibility; and final detail highlighting and sharpening. The enhanced images are then evaluated against the originals using standard quality metrics (PSNR, SSIM, RMSE, VIF). Under these criteria, the proposed method achieves good results compared with modern algorithms.
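To make the pipeline's first and last stages concrete, here is a minimal sketch of the dark channel prior (DCP) dehazing step and the PSNR/RMSE metrics, written in NumPy. This is an illustrative reconstruction, not the authors' implementation: the patch size, `omega`, and `t0` values are common textbook defaults, and the atmospheric-light heuristic (mean color of the brightest dark-channel pixels) is one standard choice.

```python
import numpy as np

def dark_channel(img, patch=15):
    """Dark channel: per-pixel min over RGB, then a local min filter.
    img is a float array in [0, 1] with shape (H, W, 3)."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    h, w = mins.shape
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def dehaze_dcp(img, omega=0.95, t0=0.1, patch=15):
    """DCP dehazing sketch; omega/t0 are conventional defaults (assumption)."""
    dc = dark_channel(img, patch)
    # Atmospheric light A: mean color of the brightest 0.1% dark-channel pixels
    n = max(1, int(dc.size * 0.001))
    idx = np.argsort(dc.ravel())[-n:]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission map t(x) = 1 - omega * dark_channel(I / A), floored at t0
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.clip(t, t0, 1.0)[..., None]
    # Recovered radiance J = (I - A) / t + A
    return np.clip((img - A) / t + A, 0.0, 1.0)

def rmse(ref, out):
    """Root-mean-square error between reference and enhanced images."""
    return float(np.sqrt(np.mean((ref - out) ** 2)))

def psnr(ref, out, peak=1.0):
    """Peak signal-to-noise ratio in dB for images scaled to [0, peak]."""
    mse = np.mean((ref - out) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

In the full method described above, this dehazing output would then pass through CLAHE, denoising, and the CNN/U-Net stages before the metric comparison.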
License
Copyright (c) 2025 Ameen A. Noor, Nur Intan Raihana Ruhaiyem

This work is licensed under a Creative Commons Attribution 4.0 International License.