Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/4725
Full metadata record
DC Field | Value | Language
dc.contributor.author | Yıldız, B. | -
dc.contributor.author | Çağdaş, G. | -
dc.contributor.author | Zincir, I. | -
dc.date.accessioned | 2023-06-19T20:56:19Z | -
dc.date.available | 2023-06-19T20:56:19Z | -
dc.date.issued | 2023 | -
dc.identifier.issn | 0961-3218 | -
dc.identifier.uri | https://doi.org/10.1080/09613218.2023.2204418 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14365/4725 | -
dc.description | Article; Early Access | en-US
dc.description.abstract | The paper presents a novel method for classifying architectural spaces through machine learning (ML), based on the topological and visual relationships required by the functions of the spaces (private spaces such as bedrooms and bathrooms have fewer visual and physical relationships owing to privacy, while common spaces such as living rooms have greater visual connection and physical accessibility). The proposed model was applied to single- and two-storey residential plans by leading architects of the 20th century. Among the five ML models whose performances were evaluated comparatively, the best results were obtained with Cascade Forward Neural Networks (CFNN), and the average model accuracy was 93%. The features affecting the classification models were examined using SHAP values, which revealed that width, control, 3D visibility and 3D natural daylight luminance were among the most influential. The results of the five ML models indicate that topological and 3D visual relationship features enable automated classification of architectural space function with very high accuracy. The findings show that the classification model can be an important part of developing more efficient and adaptive floor plan design, building management and effective reuse strategies. © 2023 Informa UK Limited, trading as Taylor & Francis Group. | en_US
dc.language.iso | en | en_US
dc.publisher | Routledge | en_US
dc.relation.ispartof | Building Research and Information | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Architectural space classification | en_US
dc.subject | artificial intelligence | en_US
dc.subject | floor plan analysis | en_US
dc.subject | machine learning | en_US
dc.subject | Classification (of information) | en_US
dc.subject | Feedforward neural networks | en_US
dc.subject | Floors | en_US
dc.subject | Topology | en_US
dc.subject | Architectural space | en_US
dc.subject | Classification models | en_US
dc.subject | Floor plan analysis | en_US
dc.subject | Floorplans | en_US
dc.subject | Machine learning models | en_US
dc.subject | Machine learning techniques | en_US
dc.subject | Machine-learning | en_US
dc.subject | Spatial relations | en_US
dc.subject | Visual-spatial | en_US
dc.subject | Machine learning | en_US
dc.title | Architectural space classification considering topological and 3D visual spatial relations using machine learning techniques | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1080/09613218.2023.2204418 | -
dc.identifier.scopus | 2-s2.0-85158819483 | en_US
dc.department | İzmir Ekonomi Üniversitesi | en_US
dc.authorscopusid | 57212454250 | -
dc.authorscopusid | 6602952073 | -
dc.authorscopusid | 55575855800 | -
dc.identifier.wos | WOS:001006177300001 | en_US
dc.institutionauthor | | -
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US
dc.identifier.scopusquality | Q1 | -
dc.identifier.wosquality | Q2 | -
item.grantfulltext | embargo_20300101 | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
item.openairetype | Article | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
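The pipeline described in the abstract above (per-room topological/visual features, a trained classifier, feature-importance analysis) can be sketched as follows. This is a hypothetical illustration on synthetic data, not the authors' code: a scikit-learn RandomForestClassifier stands in for the CFNN used in the paper, impurity-based feature importances stand in for SHAP values, and all feature distributions and room labels are invented for the example.

```python
# Hypothetical sketch of the abstract's approach (synthetic data, stand-in model).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# 0 = private space (bedroom/bathroom), 1 = common space (living room)
labels = rng.integers(0, 2, n)

# Synthetic features named after those the abstract reports as influential:
# width, control value, 3D visibility, 3D daylight level. Common spaces are
# modelled as wider, better connected, more visible and brighter.
width = rng.normal(3.0 + 2.0 * labels, 0.4)
control = rng.normal(0.5 + 1.0 * labels, 0.15)
visibility = rng.normal(10.0 + 30.0 * labels, 4.0)
daylight = rng.normal(200.0 + 300.0 * labels, 50.0)
X = np.column_stack([width, control, visibility, daylight])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

acc = clf.score(X_te, y_te)  # classification accuracy on held-out rooms
importances = dict(zip(["width", "control", "visibility", "daylight"],
                       clf.feature_importances_))
```

On this deliberately separable synthetic data the classifier reaches high accuracy; in the paper, by contrast, the reported 93% average comes from a CFNN evaluated on real residential plans, with SHAP used for the importance analysis.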
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Size | Format
4725.pdf (embargoed until 2030-01-01) | 3.6 MB | Adobe PDF
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.