Daylighting and Quality View Prediction for Atriums: A LEED-IEQ Demonstration Approach

Michal Gryko [1], Sammar Z. Allam [1][3], Gabriella Rossi [2], Hesham Shawky [1], David Leon [1]

1 Institute for Advanced Architecture of Catalonia (IAAC), UPC, Spain
2 CITA / Royal Danish Academy
3 College of Architecture and Design, Effat University, Saudi Arabia
Keywords: Atrium Buildings, LEED-IEQ, Environmental Simulation, ANN, Kohonen Map
Abstract

Atrium spaces form an integral part of many public buildings globally and have a great impact, both psychologically and environmentally, on the spaces they serve. The ability to rapidly assess physical form and environmental impact simultaneously, while adhering to internationally recognized green building rating systems, could prove invaluable in early-stage design. To fulfill the indoor environmental quality (IEQ) requirements of green building rating systems such as LEED, certain performance criteria must be demonstrated, which can be time-consuming to calculate. Through the generation of datasets recording key geometric features of atriums and their relationship to environmental predictions, including daylighting, thermal comfort and view quality, a database is created that allows machine learning algorithms to predict ratings more quickly than manual simulations. In this case, regression using artificial neural networks (ANN) is used to predict the environmental output of a varied range of standard atrium types and, as a result, to calculate which criteria are met and the resulting points score. The ANN is trained using 'relu' as the activation function, 'Adam' as the optimizer, and 'mean_squared_error' as the loss function. In combination with parametric modeling software such as Grasshopper and live connections to machine learning libraries such as TensorFlow through plugins like Hops, a trained and simulated atrium model can provide an instant LEED score as the atrium is configured by the designer.

1. Introduction
In the context of the climate change and global warming the world has been witnessing over the last decade, a holistic perspective on architectural design and building construction is essential to address the increasing demand for energy and to approach green building design pragmatically. Buildings and construction account for 34% of total energy consumption. Furthermore, carbon emissions from buildings amount to almost one third of total carbon emissions, as stated by the European Environment Agency (EEA).
Figure 1. LEED rating system intervention through design process phases

Moreover, artificial intelligence (AI) is a rapidly evolving tool that encompasses performance prediction towards optimization. Consequently, integrating AI-based tools into building energy performance prediction is a beneficial instrument for refining building design throughout the design process. AI-aided green building design can predict the energy consumption and carbon emissions of buildings prior to construction and support architects' decision making in fine-tuning design alternatives from a sustainability perspective.
Green building design metrics comply with international rating systems such as the Leadership in Energy and Environmental Design (LEED). LEED consists of nine criteria for evaluating buildings' design, their sustainable performance, and their impact on the ecosystem. Indoor environmental quality (IEQ) is one of the criteria addressing healthy buildings. Furthermore, IEQ has gained significance and an increased number of points from version 4 to v4.1 in Operations and Maintenance, as delineated in figure 2. This increase reflects the impact of IEQ on green and sustainable building design and operation.
Figure 2. Indoor Environmental Quality points share for LEED v4 O+M and LEED v4.1 O+M
Poor indoor air quality affects productivity and health. The World Health Organization (WHO) describes a healthy building as one that addresses the physical, psychological and social health and well-being of people. Since the COVID-19 pandemic, a common concern has been raised regarding IEQ and achieving safe interior spaces as a more easily controlled environment. Concurrently, IEQ is one of the key categories in green building certification schemes such as LEED and BREEAM. The LEED EQ category for existing buildings comprises:

• Prerequisite: Minimum indoor air quality performance
• Prerequisite: Environmental tobacco smoke control
• Prerequisite: Green cleaning policy
• Credit (2 points): Indoor air quality management program
• Credit (2 points): Enhanced indoor air quality strategies
• Credit (1 point): Thermal comfort
• Credit (2 points): Interior lighting
• Credit (4 points): Daylight and quality views
• Credit (1 point): Green cleaning—custodial effectiveness assessment
• Credit (1 point): Green cleaning—products and materials
• Credit (1 point): Green cleaning—equipment
• Credit (2 points): Integrated pest management
• Credit (1 point): Occupant comfort survey
Harvard University has also drawn a holistic picture highlighting the key factors and elements of healthy buildings, as delineated in figure 3. It covers acoustic and water contamination issues that are not addressed by earlier building certifications such as LEED and BREEAM.
Figure 3. The 9 Foundations of Healthy Buildings. Source: Healthy Buildings, School of Public Health, Harvard University. Available from: https://www.hsph.harvard.edu/joseph-allen/healthy-buildings/ [accessed 27 March 2022]

Evidently, IEQ is an inter-connected process. Building form and opening articulation affect daylighting and indoor thermal comfort, and hence energy efficiency, as well as air quality and contaminant distribution through air circulation, and ultimately overall energy performance. The use of neural networks (NN) with meta-data and hyperparameters across many dimensions is therefore worth investigating and experimenting with towards IEQ.

A healthy environment is a terminology that intervenes in multiple Sustainable Development Goals (SDGs), from cities to the indoor environment. Chronic disease, in addition to possible pandemics such as COVID-19, represents a direct hazard and obliges architects to reconsider the building design process and its elements. Healthy buildings comprise multiple layers, including indoor thermal comfort, indoor air quality, interior lighting and daylighting. At the same time, a design approach that achieves comprehensive and balanced comfort and health for indoor spaces requires recent technologies, smart tools and algorithmic computation.

Indoor air quality fundamentally depends on meter readings and measurements to monitor VOC levels and other toxics or contaminants. Sensors and the Internet of Things (IoT) play a significant role not only in monitoring air pollutants but also in enhancing indoor thermal comfort. Air circulation through natural ventilation is essential to remove VOCs and other air contaminants and to achieve thermal comfort.

Hence, the aim of this research is to address the indoor environmental quality (IEQ) of healthy buildings through indoor air circulation using natural or hybrid techniques and passive design strategies. The main objective is to adopt an AI approach with data-driven models that reflect building opening design and the hierarchy of indoor spaces to generate a pool of IEQ-measured designs, utilizing recent algorithmic computation in design and geometry and linking it to simulation readings. This research investigates the effectiveness of an AI-based approach in achieving a multi-dimensional IEQ objective through air circulation enhancement via building design and opening articulation. Neural networks can handle n-dimensional inputs to achieve multi-objective measurement and parametric manipulation in design. Building design and openings also affect indoor thermal comfort, daylighting and hence energy consumption; this n-dimensionality is projected through the IEQ-oriented design model in this research.
2. Methodology

In order to achieve the rapid prediction of LEED scoring for common atrium design scenarios, a five-step process is undertaken (figure 4). The initial step is the generation of the geometric generative model, which acts as a base for all the environmental calculations. With this in place, simulations are performed across the generative models utilizing the Grasshopper plugins Ladybug, Honeybee and Climate Studio. The core dataset is extracted from this process and then further amplified using an interpolation method based on Kohonen self-organizing networks. With this extensive dataset, deep learning ANN and SL regression algorithms are utilized to generate accurate predictions from the data. The result is a live prediction using Hops, which feeds the data back into Rhino with a live link to the geometry. Within the Grasshopper interface, the average user can configure common atrium types and receive immediate LEED score results.
Figure 4: Five-step process for interactive atrium design and LEED score evaluation

2.1 The Parametric Model

As a common typology, atriums are designed in a large variety of sizes and types. The focus of this paper is the fully enclosed atrium, one of the most common types. For the environmental prediction to be versatile and accurate, the selection and creation of the main geometric features is an essential step. For the initial studies, a larger selection of features is created based on the attributes expected to affect environmental scoring the most. The 18 selected features include critical data such as floor count, heights, window sizing, atrium depth, various areas and ground floor details. Although these features enable an in-depth recreation of a typical atrium, the large number of features proved to reduce the accuracy of the machine learning prediction in later stages due to weak correlations, as elaborated in the next chapter. This resulted in the need to reduce and refine the inputs describing the geometry from 18 features to 9 (figure 5). A trial-and-error method could be taken to determine the most effective features by purely evaluating the prediction accuracy of the deep learning regression algorithms; however, dimensionality reduction using Principal Component Analysis (PCA) is the more efficient method. This linear algebra technique can be used to automatically perform dimensionality reduction. It works by identifying patterns in data based on the correlation between features, which can be visually analysed through plots including Explained Variance Ratio (EVR) plots and correlation heatmaps. Through this process of applying and evaluating PCA, directions of maximum variance in the high-dimensional data are identified and projected onto fewer dimensions, creating less noise in the prediction.
Figure 5: Atrium feature selection and optimization through Principal Component Analysis

When applied iteratively to the original 18 selected atrium features, the dimensionality was reduced to 9 features, which improved the predictive capabilities of the regression algorithms, as explained in the results chapter. In summary, the strongest features were identified as the glazing and atrium ratios and the atrium width, due to their strong impact on daylight entry into the building, augmented by the floor heights and depths.
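As an illustration of this step, the minimal sketch below shows how the explained variance ratio and a correlation matrix can be inspected with scikit-learn and pandas; the dataset file name and target columns are hypothetical placeholders, not the study's actual file.

```python
# Minimal PCA sketch (assumptions: scikit-learn/pandas available and a
# "atrium_dataset.csv" file holding the 18 geometric features per sample).
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("atrium_dataset.csv")                 # hypothetical dataset file
features = df.drop(columns=["sDA", "ASE", "views"])    # drop prediction targets

# PCA is scale-sensitive, so standardize the features first.
X = StandardScaler().fit_transform(features)

pca = PCA().fit(X)
evr = pca.explained_variance_ratio_.cumsum()
print("cumulative explained variance:", evr.round(3))

# Keep enough components to explain ~95% of the variance
# (roughly 9 components in the study described above).
n_components = int((evr < 0.95).sum()) + 1
print("components retained:", n_components)

# Correlation matrix used to spot redundant features.
print(features.corr().round(2))
```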
2.2 Environmental Simulation

The parametric model provides the geometry upon which all the necessary environmental simulations are performed. As explained in the introduction, the LEED criteria are quite extensive and require an in-depth analysis for a complete evaluation. For the scope of this paper, the credits for thermal comfort, daylight and quality views were selected, as they are largely affected by geometric design and provide a good basis for future expansion to other LEED evaluation categories. Two different plugins are employed for the simulations. The first involves the Grasshopper plugins Ladybug and Honeybee, used to output percentage data evaluating the quality of views, daylighting and glare. All these components are linked to the solid and glazed generative model to generate a general understanding of these environmental conditions (figure 6).
Figure 6: Primary script for the parametric atrium models and environmental dataset creation.
The three main categories of analysis take guidance from the LEED EQ categories. Daylighting is calculated separately on the ground and typical levels, as these often differ. The result is the Spatial Daylight Autonomy (sDA) percentage, a yearly metric describing the percentage of space that receives sufficient daylight as dictated by the LEED criteria, used to determine the building's daylight performance. The data created is the percentage of floor area that receives at least 300 lux for at least 50% of the annual occupied hours, in this case between 9 am and 6 pm. Similarly, Annual Sunlight Exposure (ASE) simulations are performed separately on the ground and upper levels. This analysis outputs the percentage of space that receives excessive direct sunlight, stipulated as 1000 lux or more for at least 250 occupied hours per year. The final simulation, for view quality, is determined by achieving a direct line of sight to the outdoors via vision glazing for 50% of all regularly occupied floor area, with a percentage of the floor area reaching this criterion. A second strategy was tested for the simulation of the previous metrics with the addition of thermal comfort: the Grasshopper plugin Climate Studio was employed due to its ability to deliver more reliable results for thermal comfort, daylight autonomy and views. However, in this study it took considerably longer to run than the previous simulation tools, making production of the large dataset required for the machine learning algorithms difficult.

2.3 Dataset Interpolation

The quality and quantity of data used in machine learning algorithms have a direct impact on the quality of the results. Generally, at least 3000 data samples are required to acquire adequate results, and the greater the number of samples, the more data the deep learning algorithm has for calculating predictions. The primary constraint on achieving these large datasets is time: environmental simulations on complex geometry in particular can become very time-consuming, although there are techniques to reduce the time required. For this paper, two dataset creation techniques are utilized. The first involves a Python script that randomizes all the parameters of the atrium model each time a simulation is run and saves the data to a CSV file, which can then be cleaned and processed with Python libraries such as Pandas and TensorFlow. In this case, the Python randomizer script is set to between 2500 and 3000 iterations, with the key geometric features changing each iteration and the key features and environmental percentages from the simulations being saved out. The second approach uses only a few extreme samples and interpolates them with Kohonen self-organizing networks to produce thousands of samples rapidly (figure 7). The effectiveness of the data interpolation depends on the quality of the initial data samples. By identifying the four most extreme designs on all scales, a more accurate interpolation is created. To aid in this process, the genetic optimizer Wallacei is utilized with fitness goals of the highest area, daylight and views and the lowest glare values. The highest- and lowest-achieving options are then interpolated with the Kohonen map to produce a larger dataset much more quickly than through manual creation. A minimal sketch of this interpolation step is given after figure 7.
Figure 7: Combination of extreme design samples selected by the genetic optimizer Wallacei and interpolation of the data with Kohonen maps
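The sketch below illustrates that interpolation idea, assuming the MiniSom library as the self-organizing-map implementation (the paper does not specify which SOM library was used); the sample values, map size and training settings are illustrative assumptions.

```python
# Kohonen-map interpolation sketch (assumption: MiniSom is used as the SOM
# implementation; the extreme samples come from the Wallacei optimization).
import numpy as np
from minisom import MiniSom

# A handful of extreme design samples: rows = samples,
# columns = normalized geometric features + simulated percentages.
extreme_samples = np.array([
    [0.10, 0.95, 0.20, 0.88, 0.05],   # hypothetical normalized values
    [0.90, 0.15, 0.80, 0.35, 0.60],
    [0.20, 0.80, 0.55, 0.70, 0.10],
    [0.85, 0.25, 0.30, 0.40, 0.75],
])

# A 200 x 200 map yields 40,000 codebook vectors, matching the dataset
# size reported for iterations 4 and 5.
som = MiniSom(x=200, y=200, input_len=extreme_samples.shape[1],
              sigma=50.0, learning_rate=0.5, random_seed=42)
som.random_weights_init(extreme_samples)
som.train(extreme_samples, num_iteration=5000)

# The trained codebook vectors interpolate smoothly between the extremes.
interpolated = som.get_weights().reshape(-1, extreme_samples.shape[1])
print(interpolated.shape)   # (40000, 5)
```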
2.4 Machine Learning

The primary purpose of utilizing machine learning in the proposed system is to predict simulation results rapidly. A regression model is needed to achieve this, and both shallow and deep learning algorithms are explored. Utilizing Python TensorFlow libraries, linear regression (SL regression) and regression artificial neural networks (ANN) are employed. Linear regression, as a simpler supervised learning technique, learns the relationship between the features and the target and is effective for systems with fewer targets. Artificial neural networks for regression, on the other hand, learn the complex non-linear relationship between the features and target, due to the presence of an activation function in each layer. This contrasts with linear regression, which can only learn the linear relationship between the features and target. Five main iterations are created, each time changing the regression model, predictions and features based on data analysis and PCA. This iterative approach, evaluating the results with covariance graphs or EVR (explained variance ratio) plots, feature pair plots and histograms, guides which features and machine learning algorithm to change in order to improve the results. The next chapter explains this process in depth and how the LEED score prediction is achieved.
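As a sketch of the deep-learning side, the model below mirrors the architecture described for the final iteration (three dense layers of 36 neurons, 'relu' activation, 'Adam' optimizer and 'mean_squared_error' loss); the file names, variable names and train/test split are illustrative assumptions rather than the study's exact setup.

```python
# ANN regression sketch matching the reported architecture.
# X (n_samples x 9 features) and y (n_samples x targets) are assumed to be
# loaded from the interpolated dataset described above.
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

X = np.load("atrium_features.npy")     # hypothetical file names
y = np.load("atrium_targets.npy")
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(36, activation="relu", input_shape=(X.shape[1],)),
    tf.keras.layers.Dense(36, activation="relu"),
    tf.keras.layers.Dense(36, activation="relu"),
    tf.keras.layers.Dense(y.shape[1]),   # one output per predicted percentage
])
model.compile(optimizer="adam", loss="mean_squared_error")
model.summary()
```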
2.5 LEED Scoring

The final step involves implementing the ML model for prediction through Hops, connecting it to live geometry in Rhino to show real-time environmental analysis instantly. This step provides both the simple evaluation and the ability for the average user to configure their own atrium design and receive instant predictions. To achieve this, a script for LEED IEQ points utilizes the environmental percentage predictions from the ANN model. If the view percentage is 75 or more, 1 point is added to the result. For daylighting, between 55 and 75 percent gains 2 points, while more than 75 percent gains 3 points. For ASE (glare), a percentage of less than 10% adds a point. All these conditions are combined in the final prediction for the selected geometry to achieve the goal of rapid LEED scoring (figure 8).
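A minimal sketch of that scoring logic is shown below; the thresholds follow the rules stated above, while the function name and argument ordering are illustrative assumptions.

```python
# LEED IEQ point scoring sketch based on the thresholds described above.
def leed_ieq_points(sda_percent: float, ase_percent: float,
                    view_percent: float) -> int:
    """Convert predicted percentages into LEED IEQ daylight/view points."""
    points = 0
    # Daylight credit: sDA between 55% and 75% -> 2 points, above 75% -> 3.
    if sda_percent > 75:
        points += 3
    elif sda_percent >= 55:
        points += 2
    # Glare: ASE must stay below 10% to add a point.
    if ase_percent < 10:
        points += 1
    # Quality views: at least 75% of floor area with a view -> 1 point.
    if view_percent >= 75:
        points += 1
    return points

print(leed_ieq_points(sda_percent=68.0, ase_percent=7.5, view_percent=81.0))  # 4
```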
Figure 8: LEED IEQ points live prediction output utilizing the Hops plugin to provide a live link to the machine learning algorithms
3. Results
Five main testing iterations have taken place, each time changing the regression model, predictions and features based on data analysis and PCA. Two AI models were examined: an ANN regression and an SL regression. ANN architectures for different datasets, simulation-sourced and Kohonen-map-augmented, were explored. From the first iteration and its analysis we could tell that the data was unevenly distributed and had too many features, but we could identify ones to remove through low-correlation analysis, such as balcony width, building height and various area features that duplicated information. In the initial deep learning tests the prediction accuracy was very low, due to inadequate dense layers and no clear features correlating with the predictions, as can be seen in the truth-versus-prediction plot at the bottom right.
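The kind of diagnostic plots used to prune features can be produced as in the sketch below, assuming pandas, seaborn and matplotlib are available and reusing the same hypothetical dataset file and column names as earlier.

```python
# Feature-diagnostics sketch: correlation heatmap, pair plots and histograms
# of the kind used to spot weakly correlated or duplicated features.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("atrium_dataset.csv")   # hypothetical dataset file

# Correlation heatmap: near-zero rows/columns flag removable features.
sns.heatmap(df.corr(), cmap="coolwarm", center=0, annot=False)
plt.tight_layout()
plt.savefig("correlation_heatmap.png")
plt.close()

# Pair plots and histograms reveal uneven distributions in the raw data.
sns.pairplot(df[["atrium_width", "glazing_ratio", "sDA"]])  # hypothetical columns
plt.savefig("pairplot.png")
plt.close()

df.hist(bins=30, figsize=(12, 8))
plt.savefig("histograms.png")
```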
Figure 9: Iterations of deep learning using different algorithms and AI architectures

Iterations have been explored using different datasets: iterations 1 to 3 used a dataset from the randomizer, while iterations 4 and 5 used a Kohonen-map interpolated dataset of 40,000 samples. After dimensionality reduction of the features in the 5th iteration, a minimum MSE loss of 0.00016 is achieved.

3.1 Iteration 1
Figure 10: Iteration 1 used ANN regression with a 5-layer architecture.
Over the iterations, we improved the geometry and features along with the sequential layers of the regression model. For the 4th and 5th iterations we used the Kohonen map as a way of increasing the sample size and the number of interpolated steps between values, giving better correlations as seen in the heatmap and scatter plots. However, the covariance (EVR) plot reached a plateau between 6 and 8 components.
Figure 11: Iteration 1B used ANN regression with a 3-layer architecture and the same number of samples as iteration 1.

3.2 Iteration 2

In addition, the team explored a different supervised learning algorithm, shallow linear regression (SL regression), using 2969 samples (the same dataset as iteration 2) with 11 features to predict the daylight autonomy of atrium buildings. The SL regression applies a standard scaler to the dataset to improve the model results; a minimal sketch of this baseline is shown below. However, the truth-versus-prediction graphs show a polynomial rather than linear correlation, hence the team decided to continue training ANNs.
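A minimal sketch of such an SL regression baseline, assuming scikit-learn and hypothetical dataset files for the 11 features and the daylight autonomy target:

```python
# Shallow linear-regression baseline sketch with a standard scaler.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

X = np.load("atrium_features_11.npy")   # hypothetical file names
y = np.load("atrium_sda.npy")
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Scaling + linear regression combined in one pipeline.
sl_model = make_pipeline(StandardScaler(), LinearRegression())
sl_model.fit(X_train, y_train)

pred = sl_model.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))
# A curved truth-vs-prediction scatter here indicates the non-linear
# relationship that motivated switching to the ANN regression model.
```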
Figure 12: Iteration 2 used ANN regression with a 4-layer architecture, a different number of samples and 11 features.
Figure 13: Iteration 2B used SL regression with the same number of samples as iteration 2 and 11 features.

3.3 Iteration 3

In iteration 3, another prediction, quality view, is added to meet the LEED-IEQ scoring categories. Here, 11 features are used to train the ANN regression for 6 predictions. The MSE loss is 0.0012. However, the covariance scatter plot shows some outliers, so we decided to increase the dataset size and continue training.
Figure 14: Iteration 3 used ANN regression with a 3-layer architecture, a different number of samples, 11 features and 6 predictions.
3.4 Iteration 4

In iterations 4 and 5, a Kohonen-map interpolated dataset of 40,000 samples is used to train the ANN regression. The PCA analysis of the features in the covariance (EVR) plot reaches a plateau at nearly 8 components, so the team trained a model with the 11 features for iteration 4, while dimensionality reduction was applied before training iteration 5.
Figure 15: Iteration 4 used ANN regression with the same architecture as before but with samples generated by the Kohonen map algorithm.

Iteration 5 uses the 40,000-sample dataset and is trained with an ANN regression model with 3 dense layers of 36 neurons, using 'relu' as the activation function, 'Adam' as the optimizer and 'mean_squared_error' as the loss function. Finally, Hops is used to visualize and test, and the results show successful predictions of values within the min/max range used to train the models.
Figure 16: Iteration 5 used samples generated from the Kohonen map; following the PCA analysis, a reduced number of features is used compared with iteration 4.
3.5 Iteration 5

As a result of the dimensionality reduction to 9 features, iteration 5 trains the ANN regression on the reduced feature set. Moreover, the model is trained with doubled hyperparameters, at 400 epochs. Consequently, an optimum MSE loss of 0.00016 is reached, the minimum of all iterations.
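Continuing the earlier model sketch, the final training run might look as follows; the epoch count comes from the text, while the batch size and validation split are assumptions.

```python
# Training sketch for iteration 5: 400 epochs on the reduced 9-feature dataset,
# reusing the `model`, X_train/X_test and y_train/y_test from the earlier sketch.
history = model.fit(X_train, y_train,
                    epochs=400,            # doubled epoch count from the text
                    batch_size=64,         # assumed batch size
                    validation_split=0.2,
                    verbose=0)

test_mse = model.evaluate(X_test, y_test, verbose=0)
print("test MSE:", test_mse)   # the paper reports a minimum MSE of 0.00016
```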
Figure 17: Iteration 5 used ANN regression with the same architecture as iteration 4 and samples generated by the Kohonen map algorithm, but with the reduced features resulting from the PCA analysis.
3.6 Implementation: Hops and LEED Scoring

The final step involved implementing the ML model for prediction through Hops, connecting it to live geometry in Rhino and showing real-time environmental analysis instantly.
Figure 18: Implementation to test prediction with the final selected ANN architecture.

A script was added to convert the predicted analysis into LEED score categories and assign the points accordingly, with Hops used for the prediction test. The script for LEED IEQ point calculation uses the iteration 5 trained ANN regression model: if the view percentage is 75 or more, 1 point is gained; for daylight, 55 to 75 percent gains 2 points and more than 75 percent gains 3 points; for ASE (glare), the percentage should be less than 10% to gain a point.
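The sketch below shows how such a Hops endpoint could expose the trained model to Grasshopper, assuming the ghhops-server and Flask packages; the saved model path, component name, inputs and feature ordering are illustrative and not the study's actual script.

```python
# Hops endpoint sketch: serves ANN predictions and LEED points to Grasshopper.
# Assumptions: ghhops-server + Flask installed, model saved as "atrium_ann.keras"
# (hypothetical path), and the scoring thresholds from section 2.5.
import numpy as np
import tensorflow as tf
from flask import Flask
import ghhops_server as hs

app = Flask(__name__)
hops = hs.Hops(app)
model = tf.keras.models.load_model("atrium_ann.keras")

def leed_ieq_points(sda, ase, views):
    # Same thresholds as the scoring sketch in section 2.5.
    pts = 3 if sda > 75 else (2 if sda >= 55 else 0)
    if ase < 10:
        pts += 1
    if views >= 75:
        pts += 1
    return pts

@hops.component(
    "/atrium_ieq",
    name="AtriumIEQ",
    description="Predict sDA/ASE/view percentages and LEED IEQ points",
    inputs=[
        hs.HopsNumber("AtriumWidth", "W", "Atrium width"),
        hs.HopsNumber("GlazingRatio", "G", "Glazing ratio"),
        hs.HopsNumber("FloorHeight", "H", "Typical floor height"),
        # ...remaining geometric features would follow here
    ],
    outputs=[
        hs.HopsNumber("sDA", "D", "Predicted spatial daylight autonomy %"),
        hs.HopsNumber("ASE", "A", "Predicted annual sunlight exposure %"),
        hs.HopsNumber("Views", "V", "Predicted quality view %"),
        hs.HopsNumber("Points", "P", "LEED IEQ points"),
    ],
)
def atrium_ieq(width, glazing, height):
    features = np.array([[width, glazing, height]])   # illustrative feature vector
    sda, ase, views = model.predict(features)[0][:3]
    return float(sda), float(ase), float(views), float(leed_ieq_points(sda, ase, views))

if __name__ == "__main__":
    app.run()
```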
Figure 19: Implementation to interpret the predicted parameters into LEED IEQ indices and gained points.
4. Recommendations
After running all these iterations, we learned a great deal about how to create better predictions. To further develop this project, we would rework the original Grasshopper script to improve the features and ranges and add more influential geometry such as walls and facades. In addition, we would hope to add further analysis to provide rapid LEED evaluations against the remaining criteria.
Figure 20: Recommendations for future work per category.
5. Conclusions
As the world becomes more environmentally conscious, the benefit of identifying the environmental impact of atrium designs in real time cannot be overstated. Particularly with respect to global standards such as the LEED criteria against which many projects are measured, this prediction tool acts to augment designs from the outset. The three tested criteria of daylighting, quality views and glare reduction are only the starting point for the accreditation. Adding other criteria for assessing indoor environmental quality increases the depth of the analysis, but simultaneously the complexity of the design to be assessed. This would require additional parameters of materiality, occupancy and other measurements to be included in the design options as both metadata and geometry. A balance needs to be struck between the flexibility of early-stage design and the reliable feedback of a LEED score showing the effect of design changes. Aside from increasing the scope of the design analysis, there is still room for improvement in the data generation and AI algorithms used. The process of evaluation and refinement explained in the results chapter demonstrates how the iterations result in better predictions, both through the improved data generated and the parameters used to train the ANN regression model. Improvements range from adding geometry that directly relates to the predictions, such as facades, to improving dataset creation through stronger correlations. The inclusion of local site geometry and other local climate factors would strengthen the datasets and the accuracy of the LEED predictions, and the ability to configure the location and geography of options would be another invaluable addition. With this wide scope of development, a versatile toolset can be provided to the environmentally conscious designer, enabling rapid evaluations across the rest of the LEED criteria.
References

1. Global share of buildings and construction energy consumption, 2017. International Energy Agency and the United Nations Environment Programme, 2018.
2. Healthy Buildings, School of Public Health, Harvard University. Available from: https://www.hsph.harvard.edu/joseph-allen/healthy-buildings/ [accessed 27 March 2022].
3. Indoor Environmental Quality & LEED v4, USGBC. Available from: https://www.usgbc.org/articles/indoor-environmental-quality-and-leed-v4 [accessed 27 March 2022].
4. Healthy Buildings, Healthy People: A Vision for the 21st Century, United States Environmental Protection Agency (EPA). Available from: https://www.epa.gov/indoor-air-quality-iaq/healthy-buildings-healthy-people-vision-21st-century [accessed 27 March 2022].
5. Zhang, A., Bokel, R., Van den Dobbelsteen, A., Sun, Y., Huang, Q., Zhang, Q. (2017). "The Effect of Geometry Parameters on Energy and Thermal Performance of School Buildings in Cold Climates of China." Sustainability 9, no. 10: 1708. https://doi.org/10.3390/su9101708
6. Abbas, F. (2016). Potential Influence of Courtyard on Indoor Environment Conditions of Office Buildings. MSc, Civil and Environmental Engineering, Universiti Tun Hussein Onn Malaysia.
7. Fang, Y., Cho, S. (2019). Design optimization of building geometry and fenestration for daylight and energy performance. Solar Energy, Volume 191, Pages 7-18. https://doi.org/10.1016/j.solener.2019.08.039. Available from: https://www.sciencedirect.com/science/article/abs/pii/S0038092X19308199
10. Allam, A.S., Bassioni, H., Ayoub, M., Kamel, W. (2022). "Investigating the performance of genetic algorithm and particle swarm for optimizing daylighting and energy performance of offices in Alexandria, Egypt." Smart and Sustainable Built Environment, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/SASBE-11-2021-0202
11. Ciardiello, A., Rosso, F., Dell'Olmo, J., Ciancio, V., Ferrero, M., Salata, F. (2020). Multi-objective approach to the optimization of shape and envelope in building energy design. Applied Energy, Volume 280, 115984, Elsevier. https://doi.org/10.1016/j.apenergy.2020.115984
12. Golbazi, M., Aktas, C.B. (2016). "Analysis of Credits Earned by LEED Healthcare Certified Facilities." Procedia Engineering, Volume 145, Pages 203-210, ISSN 1877-7058. https://doi.org/10.1016/j.proeng.2016.04.062
13. EEA. Greenhouse gas emissions from energy use in buildings in Europe. Retrieved on 21 August 2022 from: https://www.eea.europa.eu/data-and-maps/indicators/greenhouse-gas-emissions-from-energy/assessment
14. Salem, N., Hussein, S. (2019). Data dimensional reduction and principal components analysis. Procedia Computer Science, 163, pp. 292-299. https://doi.org/10.1016/j.procs.2019.12.111
15. Sarzeaud, O., Stephan, Y. (2000). Fast Interpolation Using Kohonen Self-Organizing Neural Networks. 1872, pp. 126-139. https://doi.org/10.1007/3-540-44929-9_11
16. Mohamed, Z.E. (2019). Using the artificial neural networks for prediction and validating solar radiation. J Egypt Math Soc 27, 47. https://doi.org/10.1186/s42787-019-0043-8
17. Chen, Y., Gao, Q., Wang, X. (2021). Inferential Wasserstein Generative Adversarial Networks. Machine Learning Statistics, arXiv, Cornell University. https://doi.org/10.48550/arXiv.2109.05652
18. Feri, L.E., Ahn, J., Lutfillohonov, S., Kwon, J. (2021). "A Three-Dimensional Microstructure Reconstruction Framework for Permeable Pavement Analysis Based on 3D-IWGAN with Enhanced Gradient Penalty." Sensors 21, no. 11: 3603. https://doi.org/10.3390/s21113603
19. AI Generated Interior Design, Michael Hasey. Retrieved on 10/96/2022 from: http://www.michaelhasey.com/gan_interior
20. Zhang, T., Liu, Q., Wang, X., Ji, X., Du, Y. (2022). A 3D reconstruction method of porous media based on improved WGAN-GP. Computers and Geosciences. https://doi.org/10.1016/j.cageo.2022.105151
21. Gyimah, K., Amos-Abanyie, S., Koranteng, C. (2021). Simulation based Indoor Environmental Quality analysis of an existing window used in a tropical warm humid climate. Proceedings of the 16th IBPSA Conference, Rome, Italy, Sept. 2-4, 2019.
22. A One-Stop Shop for Principal Component Analysis, Towards Data Science. Available from: https://towardsdatascience.com/a-one-stop-shop-for-principal-component-analysis-5582fb7e0a9c
23. Sarzeaud, O., Stephan, Y. (2000). Data interpolation using Kohonen networks. Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), Volume 6, IEEE Xplore. https://doi.org/10.1109/IJCNN.2000.859396
24. Kazanasmaz, T., Günaydin, M., Binol, S. (2009). Artificial neural networks to predict daylight illuminance in office buildings. Building and Environment, Volume 44, Issue 8, Pages 1751-1757.