Title: PAS-MEF: Multi-Exposure Image Fusion Based on Principal Component Analysis, Adaptive Well-Exposedness and Saliency Map
Authors: Karakaya, Diclehan; Ulucan, Oguzhan; Turkan, Mehmet
Date Issued: 2022
Date Available: 2023-06-16
Type: Conference Object
Conference: 47th IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 22-27, 2022, Singapore
ISBN: 978-1-6654-0540-9
ISSN: 1520-6149
DOI: https://doi.org/10.1109/ICASSP43922.2022.9746779
Handle: https://hdl.handle.net/20.500.14365/1940
Scopus ID: 2-s2.0-85131252300
Language: English
Access: Open Access
Keywords: high dynamic range; multi-exposure image fusion; principal component analysis; saliency map; guided filtering

Abstract: High dynamic range (HDR) imaging makes it possible to capture natural scenes much as human observers perceive them. With regular low dynamic range (LDR) capture and display devices, significant detail can be lost because of the huge dynamic range of natural scenes. To minimize this information loss and produce high-quality HDR-like images for LDR screens, this study proposes an efficient multi-exposure fusion (MEF) approach with a simple yet effective weight-extraction method relying on principal component analysis, adaptive well-exposedness and saliency maps. These weight maps are then refined through a guided filter, and fusion is carried out via pyramidal decomposition. Experimental comparisons with existing techniques demonstrate that the proposed method produces very strong statistical and visual results.
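The abstract outlines a weight-map-plus-pyramid fusion pipeline. A minimal sketch of that general idea is given below; it is not the authors' PAS-MEF method. It uses only a simplified Mertens-style well-exposedness weight (omitting the paper's PCA, saliency, and guided-filter refinement steps), operates on grayscale images, and assumes image dimensions divisible by 2^(levels-1).

```python
import numpy as np

def well_exposedness(img, sigma=0.2):
    """Gaussian weight peaking at mid-gray 0.5 (Mertens-style term)."""
    return np.exp(-((img - 0.5) ** 2) / (2.0 * sigma ** 2))

def downsample(img):
    """2x2 average pooling (assumes even dimensions)."""
    return 0.25 * (img[::2, ::2] + img[1::2, ::2]
                   + img[::2, 1::2] + img[1::2, 1::2])

def upsample(img, shape):
    """Nearest-neighbor upsampling, cropped to the target shape."""
    up = img.repeat(2, axis=0).repeat(2, axis=1)
    return up[:shape[0], :shape[1]]

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(downsample(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = [gp[i] - upsample(gp[i + 1], gp[i].shape) for i in range(levels - 1)]
    lp.append(gp[-1])  # coarsest level kept as-is
    return lp

def fuse_exposures(images, levels=3, eps=1e-12):
    """Fuse grayscale exposures: per-pixel weights + pyramidal blending."""
    weights = np.stack([well_exposedness(im) for im in images])
    weights /= weights.sum(axis=0) + eps  # normalize weights per pixel
    # Blend Laplacian pyramids of images with Gaussian pyramids of weights
    fused = None
    for im, w in zip(images, weights):
        lp = laplacian_pyramid(im, levels)
        wp = gaussian_pyramid(w, levels)
        blended = [l * g for l, g in zip(lp, wp)]
        fused = blended if fused is None else [f + b for f, b in zip(fused, blended)]
    # Collapse the fused pyramid back into a single image
    out = fused[-1]
    for lev in range(levels - 2, -1, -1):
        out = fused[lev] + upsample(out, fused[lev].shape)
    return out
```

Feeding this an under- and an over-exposed rendering of the same scene yields a single image whose mid-tone detail is drawn from whichever exposure is better exposed at each pixel; the pyramidal blending avoids the seams that naive per-pixel averaging of weights would produce.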