Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/1431
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ulucan, Oguzhan | -
dc.contributor.author | Ulucan, Diclehan | -
dc.contributor.author | Turkan, Mehmet | -
dc.date.accessioned | 2023-06-16T14:11:36Z | -
dc.date.available | 2023-06-16T14:11:36Z | -
dc.date.issued | 2023 | -
dc.identifier.issn | 0165-1684 | -
dc.identifier.issn | 1872-7557 | -
dc.identifier.uri | https://doi.org/10.1016/j.sigpro.2022.108774 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14365/1431 | -
dc.description.abstract | The visual system enables humans to perceive all details of the real world with vivid colors, while high dynamic range (HDR) technology aims at capturing natural scenes, through a large dynamic range of color gamut, in a way closer to human perception. Especially for traditional low dynamic range (LDR) devices, HDR-like image generation is an attractive research topic. Blending a stack of input LDR exposures is called multi-exposure image fusion (MEF). MEF is a very challenging problem and is highly prone to halo effects, ghosting, and motion blur when there are spatial discontinuities between input exposures. To overcome these artifacts, MEF keeps the best-quality regions of each exposure via a weight characterization scheme. This paper proposes an effective weight map extraction framework which relies on principal component analysis, adaptive well-exposedness, and saliency maps. The characterized maps are later refined by a guided filter and a blended output image is obtained via pyramidal decomposition. Comprehensive experiments and comparisons demonstrate that the developed algorithm generates very strong statistical and visual results for both static and dynamic scenes. In addition, the designed method is successfully applied to the visible-infrared image fusion problem without any further optimization. (c) 2022 Elsevier B.V. All rights reserved. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.relation.ispartof | Signal Processing | en_US
dc.rights | info:eu-repo/semantics/openAccess | en_US
dc.subject | High dynamic range | en_US
dc.subject | Multi-exposure image fusion | en_US
dc.subject | Principal component analysis | en_US
dc.subject | Saliency map | en_US
dc.subject | Well-exposedness | en_US
dc.subject | Quality Assessment | en_US
dc.title | Ghosting-free multi-exposure image fusion for static and dynamic scenes | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.sigpro.2022.108774 | -
dc.identifier.scopus | 2-s2.0-85138033973 | en_US
dc.department | İzmir Ekonomi Üniversitesi | en_US
dc.authorid | Ulucan, Oguzhan/0000-0003-2077-9691 | -
dc.authorid | Turkan, Mehmet/0000-0002-9780-9249 | -
dc.authorwosid | Ulucan, Oguzhan/AAY-8794-2020 | -
dc.authorwosid | Turkan, Mehmet/AGQ-8084-2022 | -
dc.authorscopusid | 57212583565 | -
dc.authorscopusid | 57891531000 | -
dc.authorscopusid | 57219464964 | -
dc.identifier.volume | 202 | en_US
dc.identifier.wos | WOS:000862589400003 | en_US
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US
dc.identifier.scopusquality | Q1 | -
dc.identifier.wosquality | Q2 | -
item.openairetype | Article | -
item.cerifentitytype | Publications | -
item.grantfulltext | open | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
crisitem.author.dept | 05.06. Electrical and Electronics Engineering | -
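
The abstract above outlines a weight-map-driven multi-exposure fusion pipeline: per-exposure weight maps derived from principal component analysis, adaptive well-exposedness, and saliency, refined with a guided filter, and blended via pyramidal decomposition. The sketch below is a minimal, hypothetical illustration of that general pipeline shape, not the authors' implementation: it substitutes Mertens-style well-exposedness weights and a simple global-contrast saliency cue, omits the PCA-based and adaptive components of the paper, and assumes aligned (static) exposures, so no de-ghosting is performed. All function names and parameters (`sigma`, `levels`, the guided-filter radius and epsilon) are illustrative assumptions; the guided-filter step needs `opencv-contrib-python` (`cv2.ximgproc`) and falls back to a Gaussian blur if it is unavailable.

```python
# Hypothetical sketch of a weight-map-based multi-exposure fusion pipeline,
# in the spirit of the method summarized in the abstract (NOT the authors' code).
import cv2
import numpy as np

def well_exposedness(img, sigma=0.2):
    """Per-pixel weight favoring intensities near mid-gray (Mertens-style)."""
    w = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))
    return np.prod(w, axis=2)  # combine the three color channels

def saliency(img):
    """Simple global-contrast saliency: distance of each pixel to the mean color."""
    mean = img.reshape(-1, 3).mean(axis=0)
    return np.linalg.norm(img - mean, axis=2)

def gaussian_pyramid(img, levels):
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels - 1):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)
    lp.append(gp[-1])
    return lp

def fuse_exposures(exposures, levels=5):
    """Blend a stack of aligned LDR exposures (float32 RGB in [0, 1]) into one image."""
    weights = []
    for img in exposures:
        w = well_exposedness(img) * (saliency(img) + 1e-6)
        # Edge-aware refinement of the weight map with a guided filter
        # (requires opencv-contrib); otherwise fall back to a Gaussian blur.
        try:
            w = cv2.ximgproc.guidedFilter(img.astype(np.float32),
                                          w.astype(np.float32), 8, 1e-3)
        except AttributeError:
            w = cv2.GaussianBlur(w.astype(np.float32), (0, 0), 2.0)
        weights.append(w)
    weights = np.stack(weights) + 1e-12
    weights /= weights.sum(axis=0, keepdims=True)  # normalize across exposures

    # Blend Laplacian pyramids of the exposures with Gaussian pyramids of the weights.
    fused_pyr = None
    for img, w in zip(exposures, weights):
        lp = laplacian_pyramid(img.astype(np.float32), levels)
        gp_w = gaussian_pyramid(w.astype(np.float32), levels)
        contrib = [l * gw[..., None] for l, gw in zip(lp, gp_w)]
        fused_pyr = contrib if fused_pyr is None else [f + c for f, c in zip(fused_pyr, contrib)]

    # Collapse the fused Laplacian pyramid back to a single image.
    out = fused_pyr[-1]
    for lvl in reversed(fused_pyr[:-1]):
        out = cv2.pyrUp(out, dstsize=(lvl.shape[1], lvl.shape[0])) + lvl
    return np.clip(out, 0.0, 1.0)
```

Hypothetical usage: load a stack of aligned LDR exposures as float32 RGB arrays scaled to [0, 1] and call `fuse_exposures([img1, img2, img3])`; the returned array can be rescaled to 8-bit for display.
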
Appears in Collections: Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Size | Format
477.pdf | 12.78 MB | Adobe PDF

Scopus Citations: 21 (checked on Nov 20, 2024)
Web of Science Citations: 15 (checked on Nov 20, 2024)
Page view(s): 74 (checked on Nov 25, 2024)
Download(s): 44 (checked on Nov 25, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.