

Energy and Buildings 62 (2013) 350–359


Variations in results of building energy simulation tools, and their impact on BREEAM and LEED ratings: A case study

Yair Schwartz a,∗, Rokia Raslan b

a The Bartlett School of Graduate Studies, University College London, Central House, 14 Upper Woburn Place, London WC1H 0NN, United Kingdom
b CBES, The Bartlett School of Graduate Studies, University College London, Central House, 14 Upper Woburn Place, London WC1H 0NN, United Kingdom

Article info

Article history:
Received 8 January 2013
Received in revised form 22 February 2013
Accepted 13 March 2013

Keywords:
BREEAM
LEED
Building energy performance simulation tools
Energy performance and credits
Results variability

Abstract

The increased awareness of building energy consumption and sustainability has resulted in the development of various means of predicting performance and rating sustainability. The Building Research Establishment Environmental Assessment Method (BREEAM) and Leadership in Energy and Environmental Design (LEED) are the most commonly used performance rating systems.

To predict energy consumption and award relevant energy performance credits, these systems use computer-based Building Performance Simulation tools (BPS). Predictive inconsistencies between BPS tools have been acknowledged in various studies. The probability of achieving different ratings by using different BPS tools or rating systems raises questions concerning the ability to rate 'sustainability' in a consistent manner. To investigate this, a case-study based inter-model comparative analysis was implemented to examine the extent of the variation in the results produced by three of the most widely used BPS tools (Tas, EnergyPlus and IES), and to assess their influence and impact on overall BREEAM and LEED scores.

Results showed that different simulation tools resulted in different energy consumption figures, but this had only a minor effect on BREEAM or LEED energy performance credit scores. Nonetheless, due to the differences between BREEAM and LEED assessment procedures, the case-study building was awarded a considerably different rating level in each.

© 2013 Elsevier B.V. All rights reserved.

1. Introduction

The increase in global energy consumption has led to worldwide concern with regard to the political, environmental, economic and social implications associated with it [1]. Projections of future global population and economic growth predict a steady increase in global energy demand [2]. Energy use in the built environment is responsible for more than 40% of global energy consumption [3] and is considered a key contributor to anthropogenic climate change [4]. The promotion of energy efficiency in buildings using sustainability-driven mitigation measures [5] is therefore a key element in ensuring a more sustainable future [6].

In the built environment, in addition to policy-driven legislative mitigation measures, a number of voluntary standards and building assessment rating systems have been developed by independent bodies. Although numerous rating systems have been developed around the world, the UK-developed Building Research Establishment Environmental Assessment Method (BREEAM) and the US-developed Leadership in Energy and Environmental Design (LEED) have emerged as the leading systems in terms of worldwide use [7].

These sustainable building assessment rating systems are largely market-driven, since they rely on market recognition of the value of sustainable buildings [8]. Various studies have highlighted both the importance and the impact of building rating systems [9,10]. For example, in terms of energy performance, a study [11] showed that sustainable building rating systems do succeed in decreasing energy use in buildings: the energy consumption of LEED-certified buildings was 25–30% lower than the U.S. national average. In terms of economic impact, the value of certified buildings is believed to increase by 11% for new buildings and by almost 7% for renovated ones, and the rent value increases by more than 6% for new buildings and nearly 20% for existing ones [12]. Governments across the world also encourage project entrepreneurs to use rating systems by offering tax incentives [13].

While many of the rating systems share the general aim of assessing the sustainability of a development, each system adjusts itself to the economic and cultural environment it was originally designed to work in. Since they define and measure 'sustainability' differently, it has been found that each of these rating systems may award a building different rating scores [14]. Furthermore, to assess the credits awarded in energy-related categories, which have the most significant impact on the overall rating [15], rating systems utilize any of a number of computer-based Building Performance Simulation tools (BPS).

∗ Corresponding author. Permanent address: 26th Lone Tree Avenue, Impington, Cambridgeshire CB24 9PG, United Kingdom. Tel.: +44 07721 872250.
E-mail addresses: y.schwartz.11@ucl.ac.uk (Y. Schwartz), r.raslan@ucl.ac.uk (R. Raslan).


Table 1
Comparative overview of BREEAM and LEED.

Certifying body
  BREEAM 2011: BRE – Building Research Establishment
  LEED 2009: USGBC – U.S. Green Building Council

Scope of accredited buildings
  BREEAM 2011: Over 200,000 buildings [20]
  LEED 2009: Nearly 45,000 commercial buildings; nearly 19,000 certified residential units and 75,000 registered residential units [21]

Schemes
  BREEAM 2011: New construction; Refurbishment; Code for Sustainable Homes; Communities; In-use
  LEED 2009: New construction and major renovations; Existing buildings; Commercial interiors; Core and shell; Schools; Retail; Healthcare; Homes; Neighborhood development

Latest version (new construction)
  BREEAM 2011: BREEAM New Construction 2.0:2011 [22]
  LEED 2009: LEED 2009 New Construction and Major Renovations (updated November 2011) [23]

Main parameter for reduction
  BREEAM 2011: Annual CO2 emissions
  LEED 2009: Annual energy cost

Categories, available credits and weights (where applicable)
  BREEAM 2011: Management 22 (w = 12); Health and Wellbeing 10 (w = 15); Energy (Ene) 30 (w = 19); Transport 9 (w = 8); Water 9 (w = 6); Materials 12 (w = 12.5); Waste 7 (w = 7.5); Land use and ecology 10 (w = 10); Pollution 13 (w = 10); Innovation 10 (w = 10)
  LEED 2009: Sustainable sites 26; Water efficiency 10; Energy and atmosphere (EA) 35; Materials and resources 14; Indoor environmental quality 15; Innovation in design 6

Rating scale (%)
  BREEAM 2011: Outstanding >85; Excellent 70; Very good 55; Good 45; Pass 30; Unclassified <30
  LEED 2009: Platinum >80; Gold 60–79; Silver 50–59; Certified 40–49
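The weighting and rating-scale rows of Table 1 describe, in effect, a weighted-sum score that is then mapped onto a rating band. The following short Python sketch is not part of the original paper; it is a minimal, hedged illustration of that reading of Table 1, with hypothetical category results, and it treats the bonus Innovation category like any other category for simplicity.

```python
# Illustrative only: maps hypothetical category results onto the rating
# bands listed in Table 1. Not an implementation of the official schemes.
# Note: in the real BREEAM scheme Innovation acts as bonus credits; it is
# treated here like any other category for simplicity.

BREEAM_WEIGHTS = {            # category weights (w) from Table 1, in %
    "Management": 12, "Health and Wellbeing": 15, "Energy": 19,
    "Transport": 8, "Water": 6, "Materials": 12.5, "Waste": 7.5,
    "Land use and ecology": 10, "Pollution": 10, "Innovation": 10,
}

# Thresholds as listed in Table 1, treated here as minimum scores.
BREEAM_BANDS = [(85, "Outstanding"), (70, "Excellent"), (55, "Very good"),
                (45, "Good"), (30, "Pass"), (0, "Unclassified")]
LEED_BANDS = [(80, "Platinum"), (60, "Gold"), (50, "Silver"), (40, "Certified")]


def breeam_score(fraction_of_credits_per_category):
    """Weighted sum of per-category credit fractions (0-1), as a percentage."""
    return sum(BREEAM_WEIGHTS[cat] * frac
               for cat, frac in fraction_of_credits_per_category.items())


def band(score, bands, below="Not certified"):
    """Return the first band whose minimum threshold the score meets."""
    for threshold, name in bands:
        if score >= threshold:
            return name
    return below


if __name__ == "__main__":
    # Hypothetical example: 60% of the available credits in every category.
    demo = {cat: 0.60 for cat in BREEAM_WEIGHTS}
    s = breeam_score(demo)                                  # 0.60 * 110 = 66.0
    print(f"BREEAM {s:.1f}% -> {band(s, BREEAM_BANDS)}")    # Very good
    print(f"LEED 62 points -> {band(62, LEED_BANDS)}")      # Gold
```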

A fundamental issue that has been highlighted in various studies is the predictive variability found between BPS tools, e.g. [16,17]. In considering these issues, the probability of achieving different ratings as a result of using different standards or BPS tools raises questions concerning the ability to measure sustainability in a credible and consistent manner. Furthermore, in view of the wider impacts highlighted above, this inconsistency could potentially have significant economic implications for future developments.

The purpose of this study is to examine and quantify the potential impact of using different BPS tools on the scores achieved in building sustainability rating systems. The study will focus on examining the most up-to-date versions of the two most commonly used building rating systems for the new non-domestic sector (BREEAM-NC and LEED-NC/MR) and will examine the results generated by three widely used BPS tools (Tas, EnergyPlus and IES-VE). To enable this, a comparative inter-model format will be employed using a single case-study building, and the following questions will be investigated: (i) How do energy load predictions vary with the use of different BPS tools? (ii) How does the use of different BPS tools affect BREEAM and LEED energy credit scores? (iii) What, if any, is the impact of these variations on the overall BREEAM and LEED score?

2. Background

2.1. Building rating systems: BREEAM and LEED

The BREEAM and LEED systems are both available in a range of schemes geared toward a number of different building typologies. For large non-domestic developments, the most commonly used variations are BREEAM New Construction (BREEAM NC) and LEED New Construction/Major Renovation (LEED NC/MR) [18,19].

Both systems are subdivided into a number of environmental categories and sub-categories, each of which has an allocated number of points/credits. Table 1 summarizes and compares the main features of each system. Although they share many features in common, a number of significant differences exist between them. These include:

• Assessment aims: BREEAM focuses on decreasing CO2 emissions caused by energy use in buildings, whereas LEED's main concern is reducing annual expenditure on energy in buildings.
• Application of weightings: While BREEAM's categories have different weightings in the overall BREEAM score calculation, LEED does not use any weighting system.

A comparison between the two rating systems found that a development with a standard-level LEED rating (Silver or Gold) is considered to have low performance under BREEAM (between "Pass" and "Very good")¹ [14]. It has also been suggested that LEED is more suited to climates where mechanical ventilation and HVAC are widely used and to places where the infrastructure that encourages driving exists. On the other hand, BREEAM is considered to be more supportive of facilities that encourage pedestrian and cycle-based modes of transport, as well as of more efficient water consumption [24].

¹ Since this study, BREEAM Outstanding has been introduced.

Table 2
BREEAM and LEED energy credit assessment procedure.

Maximum sub-category credits available
  BREEAM 2011 – Ene-01: 15 (a) (out of a total of 27 (b))
  LEED 2009 – EA Credit-1: 19 (out of a total of 35)

Calculation method
  BREEAM 2011 – Ene-01: EPR-NC (energy performance ratio). Measures the performance improvement of the proposed design's energy consumption, energy demand and CO2 emissions against those of a 'Notional building', and against the NCM's actual building database.
  LEED 2009 – EA Credit-1: Performance improvement ratio. Measures the operational energy cost of the proposed design against that of a 'Baseline building'.

'Base case' building defined by
  BREEAM 2011 – Ene-01: 'Notional building' – Part L of the UK Building Regulations [26] and the NCM 2010 [27]
  LEED 2009 – EA Credit-1: 'Baseline building' – ASHRAE 90.1-2007 [29]

Minimum improvement required
  BREEAM 2011 – Ene-01: None
  LEED 2009 – EA Credit-1: 10%

(a) Contributing approximately 10 overall BREEAM credits after weighting [25].
(b) Though the maximum BREEAM Ene category score in most cases is 30, the maximum score for 'multi-residential buildings' is 27 [25].

2.2. BREEAM and LEED energy credit calculation

For both BREEAM and LEED, energy performance is a key component of sustainable design. Accordingly, in both of these systems it is the energy consumption categories – Ene for BREEAM and EA for LEED – that are the most influential in terms of their impact on the overall assessment score. The most influential sub-categories in each rating system are:

• For BREEAM: Ene-01 Reduction of CO2 emissions (15 available credits, contributing approximately 10 overall BREEAM credits after weighting) [25].
• For LEED: EA Credit-1 Optimize Energy Performance (19 available credits).

The 'Ene-01' and 'EA Credit-1' sub-categories grant credits for buildings that demonstrate a performance improvement over a specific target. As described in Table 2, for BREEAM this target is related to UK building regulations requirements, and for LEED it is associated with ASHRAE performance standards. To enable this, the performance of the 'Designed building' is compared to that of a generated 'Base case' building by using any of the approved simulation tools according to each standard. The 'Base case' building shares some characteristics with the 'Designed building' (shape, size, activities, etc.), but includes adjusted specifications for building fabric and service systems. For each standard it is referred to as follows:

• For BREEAM: The 'Base case' building is known as the 'Notional building' and is defined in accordance with Part L of the UK Building Regulations [26] and the NCM 2010 [27].
• For LEED: The 'Base case' building is known as the 'Baseline building' and is defined in accordance with the specifications outlined in ASHRAE 90.1-2007.

Some characteristics of the 'Base case' and 'Designed' buildings differ between the two standards. A study by Ng et al. [28] shows that BREEAM's 'Notional building' consumes 18% less energy than LEED's 'Baseline building'. The 'performance improvement' calculation for BREEAM uses the EPR-NC (Energy Performance Ratio for New Construction) – a figure which is directly influenced by energy consumption, delivered energy and CO2 emissions. The LEED 'performance improvement' is simply a cost-savings calculation.

2.3. The use of building energy simulation tools in performance modeling and assessment

Given the increasing importance of energy efficiency, a variety of BPS tools have been developed and adopted with the aim of supporting the decision-making process for energy-efficient design. The advantages associated with the application of BPS tools have been widely discussed in the relevant literature; some examples can be listed as follows:

• The encouragement of new design concepts and strategies through the evaluation and development of appropriate hardware components [30,31].
• The improvement of the environmental performance of buildings through the provision of an effective mechanism for optimizing internal environmental conditions [30].
• The facilitation of the application of a holistic approach to assessing the overall performance of design proposals [32–35].
• The support of the exploration of innovative approaches to satisfying performance requirements [36,37].

Following the increasing international trend for the integration of BPS tools in the design process [38], both the BREEAM and LEED systems now rely on the use of BPS tools for the calculation of various credits, most notably the main building energy performance credits in the energy consumption categories: BREEAM 'Ene-01' and LEED 'EA Credit-1'.

For each of the rating systems, the tools used for this calculation should be accredited based on a specified accreditation procedure. For BREEAM Ene-01, the calculation procedure requires the use of any of the three classes of building energy calculation software approved by the appropriate schemes for implementing the NCM (as described in Table 3). For LEED EA Credit-1 calculations, while the USGBC does not certify or formally approve simulation software packages for compliance with ASHRAE requirements or for generating an appropriate baseline model, the software used needs to be approved by the rating authority. The qualified simulation program must meet a list of specifications, including requirements that the tool must be able to [29]:

• Calculate 8760 h of building operation to simulate annual energy use.
• Model hourly variations in occupancy, lighting power, miscellaneous equipment power, thermostat set points, and HVAC system operation.

Table 3 lists a number of commonly used tools that have been approved for use for the calculation of the relevant energy performance credits for each of the rating systems.

It is assumed that the approved BPS tools will produce equivalent results, and therefore that the use of different simulation tools will not lead to a major rating score difference [40]. To maintain the credibility of these systems, it is integral that the BPS tools used for BREEAM and LEED credit calculations produce valid and consistent results. However, the investigation of predictive variations in BPS tools has been the subject of various studies that have explored both the extent of results variability between tools and the difference between predicted and actual building performance.
Table 3
Accreditation requirements and approved tools used for the BREEAM and LEED energy credit assessment procedure.

Accreditation requirements
  BREEAM 2011 – Ene-01: Tools accredited by the appropriate body/procedure for implementing the NCM.
  LEED 2009 – EA Credit-1: Tools must comply with ASHRAE requirements for generating an appropriate baseline model.

Approved tools
  BREEAM 2011 – Ene-01: The following software tool classes can be used:
    • SBEM – Simplified Building Energy Model – calculates monthly average energy consumption in buildings (excluding dwellings).
    • FI-SBEMs – front-end interfaces to the Simplified Building Energy Model, including tools such as Carbon Checker, DesignBuilder and CYMAP.
    • DSM – Dynamic Simulation Models – enable hourly energy load calculations [39]; these include tools such as IES-VE, Tas and Hevacomp Simulator (EnergyPlus engine). (a)
  LEED 2009 – EA Credit-1: Commonly used approved tools include:
    • EnergyPlus
    • DOE-2-based software (eQUEST, VisualDOE)
    • IES-VE
    • Tas
    • EnerSim

(a) Although only Tas and IES are explicitly approved for implementing the NCM, EnergyPlus is the calculation engine for one of the other approved DSM tools (Hevacomp) and has therefore been included in the comparative analysis.

These studies have found that the range of performance results calculated by different simulation tools varies [41–56] and is influenced by various factors such as:

• The reliability and accuracy of physical input data.
• User skill in data interpretation and tool use.
• The calculation method (algorithms) used.
• The impact of weather file data.
• The applicability of the tool to the building and climate being analyzed.

Based on the above, it is therefore essential to examine the potential for predictive variability between tools and to quantify the associated impact this may have on the scores achieved in building sustainability rating systems. Even though previous studies have examined the difference in BREEAM and LEED scores [7,57], no examination of the influence of various simulation tools on the overall score has been carried out. Similarly, the majority of studies that have examined the variation in the results of BPS tools did not fully consider the influence on the design of an actual building.

3. Research design

3.1. Research approach

Software testing can be conducted through a variety of approaches (analytical, empirical and comparative) that differ according to the objectives and scope of the test [58]. This study aimed to examine performance results generated by three of the most widely used BPS tools – Tas, EnergyPlus and IES-VE – on an actual design. Table 4 highlights the main features of each of these tools and their user base.

In order to examine the potential for results variability, an inter-model comparative analysis was carried out using a single case-study building, as discussed below. The principle behind comparative analysis is the evaluation of tools by conducting a series of test runs and then comparing the results. The tests can either be carried out by simulating a single case in various tools (the inter-model approach), or by simulating various cases in a single tool.

An important prerequisite for conducting an inter-model comparative analysis is to ascertain that the models used for the simulations are set on an equal footing, to enable the determination of tool-influenced variability. This can be achieved through the harmonization of model input data and by ensuring the consistency of specification and the accuracy of key modeling assumptions such as the building geometry, the thermal and physical properties of the envelope, and the weather files. As such, for this study a single model (geometry) with identical envelope parameters and systems was used for each of the three tools (as described in detail in Section 3.3).

Fig. 1. 'Average floors' model (left) and a typical 'Average floor' in the three tools (right).

3.2. The case-study building

The case-study building selected for the study is a 14,000 sqm, 32-storey student residential hall located in London, UK. The building comprises 470 studios and several common areas. Its bottom floors, as well as the south-west façades of its top storeys (24th to 32nd floors), are fully glazed. The east façade of the building has 10% glazing, and the other façades between 16% and 20% (Figs. 1 and 2). Tables 5 and 6 describe the building's characteristics – materials, ventilation systems, etc.

Due to the complexity of the case study, the model was simplified through the use of one 'average floor' for every 2–5 adjacent storeys. The performance of each was then multiplied by the number of floors it represented to account for the overall performance.

Fig. 2. A typical 3–8 'Average floor' floor plan (left) and west elevation of the building (right).
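The 'average floor' simplification described in Section 3.2 amounts to simulating one representative floor per group of similar storeys and multiplying its results by the number of storeys it stands for. The sketch below is not taken from the paper; it is a minimal Python illustration of that bookkeeping, and all per-floor demand figures and floor groupings are hypothetical.

```python
# Illustrative bookkeeping for the 'average floor' simplification:
# each representative floor's simulated result is multiplied by the
# number of storeys it represents. All numbers below are hypothetical.

from dataclasses import dataclass


@dataclass
class AverageFloor:
    name: str
    storeys_represented: int
    heating_kwh: float   # simulated annual heating demand for one floor
    cooling_kwh: float   # simulated annual cooling demand for one floor


def whole_building_demand(floors):
    """Scale each representative floor by the storeys it stands for."""
    heating = sum(f.heating_kwh * f.storeys_represented for f in floors)
    cooling = sum(f.cooling_kwh * f.storeys_represented for f in floors)
    return heating, cooling


if __name__ == "__main__":
    floors = [
        AverageFloor("ground/podium", 2, 41_000.0, 12_000.0),
        AverageFloor("typical 3-8",   6, 28_000.0,  9_500.0),
        AverageFloor("typical 9-23", 15, 27_500.0,  9_800.0),
        AverageFloor("glazed 24-32",  9, 33_000.0, 16_000.0),
    ]
    heat, cool = whole_building_demand(floors)
    print(f"Heating: {heat:,.0f} kWh/a, Cooling: {cool:,.0f} kWh/a")
```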

Table 4
BPS tools used in the study.

Tas (EDSL)
  Description: A commercial dynamic simulation tool developed in the 1970s at Cranfield University. Tas uses a CAD-linked 3D modeller for building geometry input and modification and the Tas 9.2.1 simulation engine. Functions such as compliance calculations are performed via a dedicated program, Macros.
  Version used: 9.2.1
  User base: Most commonly used in the UK. International users in approximately 700 different companies, many with multiple licenses [59].

EnergyPlus
  Description: A multi-platform whole-building energy simulation program used by the U.S. Department of Energy for building energy performance assessments. The tool is free to use and widely used for research applications.
  Version used: 7.1
  User base: Large international user base with approximately 100,000 individual licenses [60].

IES-VE
  Description: A suite of commercial building analysis tools (the VE suite) that uses the dynamic simulation engine ApacheSim. Geometry is taken from the <VE> 3D building model. 3D BIM imports and shares information with ApacheCalc and ApacheSim via the VE Integrated Data Model.
  Version used: 6.4.0.10
  User base: One of the most popular simulation tools used worldwide today. Used in over 80 countries; used by 19 of the top 20 engineering firms, 18 of the top 20 consultancy firms and 15 of the top 20 architectural design firms in the UK [61].

Table 5
A comparison of the modeled 'Designed', 'Notional' and 'Baseline' building properties.

Simulations' orientation
  Designed building: As designed
  Notional building (BREEAM): As designed
  Baseline building (LEED): 4 simulations at 90° rotation intervals

Model geometry
  Designed: As designed | Notional: As designed | Baseline: As designed

Thermal zones
  Designed: Each room is a thermal zone | Notional: As designed building | Baseline: As designed building

Fenestration
  Designed: As designed
  Notional: 1.5 m × full façade width, or 40% of wall area, whichever is smaller; sill height 1.1 m
  Baseline: Same as designed building, or 40% of wall area, whichever is smaller

Glass–frame ratio
  Designed: 10% of window area | Notional: 10% of window area
  Baseline: Not mentioned in ASHRAE 90.1-2007; modeled as 10% of window area

Domestic hot water – energy source
  Designed: As designed: gas boiler | Notional: As designed | Baseline: ASHRAE 90.1-2007 sec. 7.4.1: electricity

Domestic hot water – efficiency
  Designed: SCoP 90% | Notional: minimum SCoP 83.6% | Baseline: minimum SCoP 93%

Ventilation rate
  Designed: 2 ACH at occupancy times (a) | Notional: according to NCM's database (b) | Baseline: As designed

Infiltration (c)
  Designed: 5 m3/h per m2 envelope at 50 Pa; 0.35 ach | Notional: 5 m3/h per m2 envelope at 50 Pa; 0.35 ach | Baseline: Same as designed building

Internal gains
  Designed: Hourly data by SPPARC | Notional: As designed | Baseline: As designed

Thermostat (d)
  Designed: 20–25 °C | Notional: As designed | Baseline: As designed

U-value (W/m² K)
  Source: Designed: As designed | Notional: NCM 2010 | Baseline: ASHRAE 90.1-2007
  External wall: 0.24 | 0.26 | 0.365
  Internal wall: 1.8 | 1.8 | 0.365
  Fenestration (e): 1.65 | 1.8 | 2.56
  Ground floor: 0.2 | 0.22 | 0.2
  Upper floor/ceiling: 0.19 | 1 | 0.214
  Periphery (f): 0.005 | 0.005 | 0.005

(a) Contributing approximately 10 overall BREEAM credits after weighting [25].
(b) Since this study did not have access to NCM's database, ventilation rates were calculated according to CIBSE Guide A [62] – "Minimum ventilation rates for air quality".
(c) CIBSE Guide A, Chapter 4.7.2 [62] – empirical data for air infiltration.
(d) SPPARC, 2010 [63].
(e) 90% glass and 10% frame.
(f) Used as the floor and ceiling construction in all 'average floor' models.
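The BPS tools compared here perform dynamic hourly simulation, so the U-values in Table 5 do not enter a simple steady-state formula in practice. Purely as a rough illustration of how the fabric differences between the 'Designed', 'Notional' and 'Baseline' specifications pull transmission losses apart, the sketch below applies a steady-state Q = ΣU·A·ΔT calculation; the element areas and the temperature difference are hypothetical assumptions, not values from the paper.

```python
# Rough steady-state transmission loss Q = sum(U * A) * dT, for illustration
# only; the BPS tools in this study use dynamic hourly simulation instead.
# U-values follow Table 5; the areas and dT are hypothetical.

U_VALUES = {                     # W/m2K per Table 5
    "Designed": {"wall": 0.24,  "window": 1.65, "ground": 0.20, "upper_floor": 0.19},
    "Notional": {"wall": 0.26,  "window": 1.80, "ground": 0.22, "upper_floor": 1.00},
    "Baseline": {"wall": 0.365, "window": 2.56, "ground": 0.20, "upper_floor": 0.214},
}

AREAS_M2 = {"wall": 5200.0, "window": 1800.0, "ground": 450.0, "upper_floor": 450.0}
DELTA_T = 20.0                   # K, hypothetical indoor-outdoor difference


def transmission_loss_kw(spec):
    """Sum of U*A over the envelope elements, times dT, in kW."""
    ua = sum(U_VALUES[spec][element] * area for element, area in AREAS_M2.items())
    return ua * DELTA_T / 1000.0


if __name__ == "__main__":
    for spec in U_VALUES:
        print(f"{spec:9s}: {transmission_loss_kw(spec):6.1f} kW")
```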

It is important to note that the case-study building was only used as an enabling tool to implement a study in which the focus of the analysis was the performance of the BPS tools. The comparison of the performance of this single case-study building, as calculated by different BPS tools, enables an examination of the performance results and the differences between them. Consequently, the findings of this study that relate to the influence of BPS tools on BREEAM and LEED ratings can therefore be generalized.

Table 6
A comparison of the modeled 'Designed', 'Notional' and 'Baseline' detailed HVAC specifications.

Energy source
  Designed building (both schemes): As designed – heating: hot water gas boiler; cooling: electric; centralized VAV system
  Notional building (BREEAM – NCM 2010): As designed
  Baseline building (LEED – ASHRAE 90.1-2007): ASHRAE 90.1-2007 G3.1.1 – heating: hot water fossil fuel boiler; cooling: electric heat pump

Efficiency
  Designed building (BREEAM – NCM 2010): heating SCoP 90%; cooling SSEER 3.6
  Notional building (BREEAM – NCM 2010): heating SCoP 79.2%; cooling SSEER 3.6
  Designed building (LEED – ASHRAE 90.1-2007): heating SCoP 90%; cooling SSEER 2.72 (Section 6.4.1)
  Baseline building (LEED – ASHRAE 90.1-2007): minimum efficiencies per ASHRAE 90.1-2007 G3.1.2 and Section 6.4.1 – heating SCoP 82% (6.8.1F); cooling SSEER 2.72 (6.8.1D)

3.3. Study implementation

This study was implemented in two stages:

1. The comparison of variations in predicted loads calculated by different BPS tools.
2. The examination of the influence of the results produced by different BPS tools on BREEAM and LEED ratings.

To replicate the actual assessment process as closely as possible and to ensure the consistency of model specification and analysis approach, the following principles were applied in the implementation of the study:

• Model generation: The models were constructed separately in each simulation tool, with the exception of the EnergyPlus model, which was built in the SketchUp interface to the tool.
• Use of plugins/external software: For the BREEAM assessment, the designated Part L compliance tools in Tas and IES were not used, since EnergyPlus does not have an equivalent plug-in. Instead, the raw outputs – energy demands – generated by each BPS tool were used to manually calculate the rating scores.
• Use of weather files: While BREEAM requires the use of an official CIBSE Test Reference Year (TRY) weather file, LEED allows the use of any weather file with hourly weather data representing the site. For this study, the designated weather files were used in each tool: in the case of Tas and IES, the official 'GBR London-Heathrow CIBSE-TRY.TWD' (1995); for EnergyPlus, 'cntr Heathrow TRY.EPW' (1970) was instead used, as no CIBSE weather file was available for it.² Examination of monthly degree days showed that the TWD had warmer temperatures.
• Calculation of results: For all BPS tools, the energy consumption for domestic hot water was calculated manually, since it was not possible to calculate it directly in Tas.

For each rating system, a detailed procedure describing the modeling and simulation process is prescribed. In addition to describing simulation instructions, these procedures include detailed specifications of the 'Designed' and 'Base case' buildings to be used. These include details of such aspects as building orientation, materials, fenestration and glazing ratios, thermal zone division, and thermal gains and losses, as well as general building systems and detailed HVAC specifications. While some characteristics are based on the actual 'Designed building' parameters, others are prescribed by the NCM (for the BREEAM 'Notional building') or ASHRAE (for the LEED 'Baseline building').

For this study, the differences between the 'Designed', 'Notional' and 'Baseline' building properties are described in Table 5, and the detailed HVAC system specifications of each are outlined in Table 6.

² In order to examine the influence of the weather file on the performance predictions, and since it was not possible to import the TWD into EnergyPlus, the EPW file was later imported into IES and Tas.

4. Results and discussion

This section discusses the results of the study. It starts with an energy consumption prediction comparison, followed by an examination of some aspects that led to these results. Finally, it calculates and compares the relevant BREEAM and LEED energy credits.

4.1. Predicted energy consumption performance

Fig. 3 shows the predicted energy consumption of the 'Designed', 'Notional' and 'Baseline' buildings, calculated by the different BPS tools. The graph shows that Tas consistently resulted in the lowest total energy consumption, which was nearly 35% less than that calculated by IES. Despite these differences, all tools generally followed the same trend: the 'Baseline building' had the highest heating and lowest cooling demands, while the 'Designed building' had the lowest heating and highest cooling demands.

Fig. 3. Total energy consumption.

In an aim to explain the variance in the results, the 'Designed building' was further examined. This highlighted two main contributing factors to the variation:

4.1.1. The difference in the way the tools interpret the construction element areas
Although all models were based on the same DWG files, each tool calculated the floor area differently. While Tas considers the area of the floor as it is in reality – it subtracts the thickness of the constructions – in EnergyPlus and IES the graphic representation of a wall is a single line. As a result, the Tas model's floor area was 6.55% smaller than that of EnergyPlus or IES. An example of the implication of this difference can be seen in the lighting loads: the Tas model's lighting loads were found to be 5.4% lower than in EnergyPlus. This suggests that the floor area calculated by the different tools affects the heat transfer calculation and results in different energy demand predictions.

4.1.2. The use of different weather files in the simulations
Tas and IES were simulated with the TWD, while EnergyPlus was simulated with the EPW. In an aim to eliminate the potential influence of using different weather files, the Tas and IES simulations were conducted using an imported EPW weather file. Fig. 4 shows that even when using an identical weather file, Tas had considerably lower heating loads – 32% lower than IES. Also, EnergyPlus had significantly higher cooling loads – around four times higher than IES and three times higher than Tas.

Fig. 4. The impact of different weather files on energy performance predictions.
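The weather-file comparison above rests on a monthly degree-day check (Section 3.3, footnote 2). The sketch below is not from the paper; it shows one common way of computing heating degree days from an hourly dry-bulb temperature series, which is sufficient to compare how 'cold' two weather files are. The 15.5 °C base temperature and the synthetic temperature series are assumptions, not data from the TWD or EPW files used in the study.

```python
# One common heating-degree-day calculation from hourly dry-bulb
# temperatures: average each day's temperatures and accumulate the
# shortfall below a base temperature. A base of 15.5 C is a typical UK
# convention; the hourly series here is synthetic, not a real TRY file.

from collections import defaultdict
import math


def heating_degree_days(hourly, base_c=15.5):
    """hourly: iterable of (day_index, dry_bulb_C) tuples. Returns total HDD."""
    by_day = defaultdict(list)
    for day, temp in hourly:
        by_day[day].append(temp)
    hdd = 0.0
    for temps in by_day.values():
        mean = sum(temps) / len(temps)
        hdd += max(0.0, base_c - mean)
    return hdd


if __name__ == "__main__":
    # Synthetic year: a sinusoidal annual cycle with a small diurnal swing.
    series = [(d, 10.0 + 8.0 * math.cos(2 * math.pi * (d - 200) / 365)
                  + 3.0 * math.cos(2 * math.pi * h / 24))
              for d in range(365) for h in range(24)]
    print(f"Synthetic HDD (base 15.5 C): {heating_degree_days(series):.0f}")
```

Running the same function over two hourly series (one per weather file) and comparing the totals is essentially the monthly degree-day check described in Section 3.3.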

Fig. 5. The influence of floor area and weather file on heating and cooling load simulation results.

To further investigate the influence of the differences between calculated floor area, weather files and simulation platforms on the variability of results, a comparison between four simulations was carried out using the case-study models. These used different combinations of simulation platform, modeling platform and weather file. For simulations 3 and 4 (Fig. 5), different simulation engines were used while all other factors were kept identical. The results also show that simulation (2) used more heating and less cooling than simulation (1). This can be attributed to the fact that the EPW weather file has colder temperatures than the TWD. A comparison between simulations (2) and (3) confirms that the model geometry did have a significant influence on the predicted loads.

4.2. Calculated energy credits

4.2.1. BREEAM Ene-01 credits: 'Designed'/'Notional' building performance
In order to calculate the BREEAM Ene-01 score, the EPR-NC parameters (energy demand, energy delivered and CO2 emissions) of both the 'Designed' and 'Notional' buildings were calculated in each BPS tool.

The results illustrated in Fig. 6 and Table 7 show that, despite the variation in total energy consumption between the BPS tools, the calculated performance improvements of the 'Designed building' compared with the 'Notional building' for each consumption parameter (demand/delivered/CO2 emissions) were similar in all three BPS tools (approximately 2% improvement). As a consequence, the Ene-01 score in all three tools was within the same range (6–7 Ene-01 credits). The difference between the overall BREEAM scores for each tool was under 1%. Fig. 6 also illustrates the impact of the transition from the original BPS tool outputs – 'energy demand' – into 'energy delivered' and 'CO2 emissions' values. In all BPS tools this change increased the calculated performance difference between the 'Designed' and 'Notional' buildings from 2% to approximately 10%.

Fig. 6. EPR-NC components in the 'Designed' and 'Notional' buildings.

Table 7
BREEAM Ene-01 score.

Tool         EPR-NC   (a) Ene-01     (b) Maximum Ene score   (c) = (a)/(b)   (d) Section   Contribution to overall
                      score gained   for BREEAM NC           (%)             weighting     BREEAM score (%) = (c) × (d)
Tas          0.37     6              27                      22.2            0.19          4.28
EnergyPlus   0.39     6              27                      22.2            0.19          4.28
IES          0.43     7              27                      25.9            0.19          4.92
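The last column of Table 7 follows from a single piece of arithmetic: the fraction of the maximum Ene credits achieved, multiplied by the 19% category weighting. The snippet below reproduces that arithmetic only; it is not the full BREEAM calculation, and the small gap between its result for six credits and the 4.28% reported in Table 7 presumably reflects rounding upstream in the paper.

```python
# Reproduces the arithmetic of Table 7: contribution to the overall BREEAM
# score = (Ene-01 credits gained / maximum Ene credits) * category weighting.
# The seven-credit row matches Table 7 exactly (4.92%); the six-credit rows
# give 4.22% against the 4.28% printed in the table, presumably a rounding
# difference in the paper's intermediate figures.

ENE_WEIGHTING = 0.19          # Energy category weighting (Table 1)
MAX_ENE_CREDITS = 27          # multi-residential maximum (Table 2, note b)


def breeam_energy_contribution(ene01_credits):
    """Percentage points contributed to the overall BREEAM score."""
    return ene01_credits / MAX_ENE_CREDITS * ENE_WEIGHTING * 100


if __name__ == "__main__":
    for tool, credits in [("Tas", 6), ("EnergyPlus", 6), ("IES", 7)]:
        print(f"{tool:10s}: {breeam_energy_contribution(credits):.2f}%")
```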
4.2.2. LEED EA Credit-1: 'Designed'/'Baseline' building performance
For LEED EA Credit-1, the performance improvement (evaluated by energy expense) was calculated for both the 'Designed' and 'Baseline' buildings by each BPS tool. Fig. 7(a) shows that, despite the differences in the overall predicted energy demand generated by the various BPS tools, the performance improvement between the 'Designed' and 'Baseline' buildings was similar in all three tools (around 3%). Though transforming the 'energy demand' into 'total energy expenses' (Fig. 7(b)) increased the performance improvement from 3% to 8%, none of the BPS tools showed an improvement of more than 10% (the minimum value required by LEED). Therefore, none of the simulations was able to achieve any LEED EA Credit-1 points (Table 8).

Fig. 7. Performance improvement components in the 'Designed' and 'Baseline' buildings. Left: (a) energy demand. Right: (b) total energy expense.

Table 8
LEED EA Credit-1 score.

Tool         Performance       Minimum 10% improvement   EA Credit-1   Contribution to overall
             improvement (%)   requirement met?          score         LEED score (%)
Tas          8.2               No                        0             0
EnergyPlus   7.5               No                        0             0
IES          8.3               No                        0             0
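Table 8 turns on a single gate: LEED 2009 EA Credit-1 awards no points unless the modelled energy-cost improvement over the ASHRAE 90.1-2007 baseline reaches 10%. The sketch below applies that gate to the improvement percentages in Table 8; the point scale used above the gate is a simplified placeholder and does not reproduce the official LEED point table.

```python
# Minimal check of the LEED 2009 EA Credit-1 gate reflected in Table 8:
# no points are awarded unless the energy-cost improvement over the
# ASHRAE 90.1-2007 baseline reaches 10%. The mapping above the gate is a
# simplified placeholder and does NOT reproduce the official LEED scale.


def ea_credit_1_points(improvement_pct, minimum_pct=10.0):
    """Return 0 below the 10% gate; otherwise a simplified, capped scale."""
    if improvement_pct < minimum_pct:
        return 0
    # Placeholder scale: 1 point per 2% above the gate, capped at 19 points.
    return min(19, 1 + int((improvement_pct - minimum_pct) // 2))


if __name__ == "__main__":
    for tool, impr in [("Tas", 8.2), ("EnergyPlus", 7.5), ("IES", 8.3)]:
        print(f"{tool:10s}: {impr:.1f}% improvement -> "
              f"{ea_credit_1_points(impr)} EA Credit-1 points")
```

All three improvement values in Table 8 fall below the gate, so every tool returns zero points, which is the outcome reported in the paper.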

Table 9 summarizes the scores of the energy sections within the overall BREEAM and LEED scores, as calculated by each BPS tool.

4.2.3. NCM 2010 and ASHRAE 90.1-2007 comparison
A comparison between NCM 2010 and ASHRAE 90.1-2007 shows that, although the NCM and ASHRAE 'Designed building' energy demands were practically identical in each BPS tool (Fig. 8a), the differences between the 'Designed building' system properties in each standard (HVAC and domestic hot water systems, materials, etc.) resulted in the ASHRAE 'Designed building' delivered energy being approximately 0.3% higher than that in the NCM (Fig. 8b).

A comparison between the NCM 2010 'Notional' and ASHRAE 90.1-2007 'Baseline' buildings shows that the 'Baseline building' under Tas and EnergyPlus had higher total demands than the 'Notional building', whereas the IES results showed an inverse trend (Fig. 8c). After applying the efficiency coefficients prescribed by each system, however, the picture changed: all tools resulted in the 'Notional building' having worse performance than the 'Baseline building' (Fig. 8d). This demonstrates the significant contribution of the NCM efficiency coefficients to the final BREEAM score.

Fig. 8. NCM/ASHRAE performance comparison. Top left: (a) 'Designed buildings' energy demand. Top right: (b) 'Designed buildings' energy delivered. Bottom left: (c) 'Notional'/'Baseline' buildings energy demand. Bottom right: (d) 'Notional'/'Baseline' buildings energy delivered.
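The change in ranking described in Section 4.2.3 appears when demands are converted into delivered energy using the seasonal efficiencies that each standard assigns to its systems (Table 6). The sketch below is a simplified illustration of that conversion with hypothetical demand figures; it is not the NCM or ASHRAE procedure itself.

```python
# Simplified demand-to-delivered-energy step: divide heating demand by the
# seasonal heating efficiency (SCoP) and cooling demand by the seasonal
# cooling efficiency (SSEER). Efficiencies follow Table 6; the demand
# figures are hypothetical. This illustrates the conversion only and is
# not the NCM or ASHRAE calculation procedure.

SYSTEMS = {
    "Designed":          {"scop": 0.90,  "sseer": 3.60},
    "Notional (NCM)":    {"scop": 0.792, "sseer": 3.60},
    "Baseline (ASHRAE)": {"scop": 0.82,  "sseer": 2.72},
}


def delivered_kwh(heating_demand, cooling_demand, scop, sseer):
    """Delivered (purchased) energy implied by the seasonal efficiencies."""
    return heating_demand / scop + cooling_demand / sseer


if __name__ == "__main__":
    heat, cool = 900_000.0, 350_000.0      # kWh/a, hypothetical demands
    for name, eff in SYSTEMS.items():
        d = delivered_kwh(heat, cool, eff["scop"], eff["sseer"])
        print(f"{name:18s}: {d:,.0f} kWh delivered")
```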

Table 9
Contribution of Ene-01 and EA Credit-1 to the overall BREEAM and LEED scores (%).

Simulation tool   BREEAM   LEED
Tas               4.28     0
EnergyPlus        4.28     0
IES-VE            4.92     0

5. Findings and conclusion

This study highlights a number of important findings which are relevant to BPS tool analysts in general, and to BREEAM- and LEED-assessed projects in particular. The main conclusions of this work can be listed as:

i. Load prediction variation by different BPS tools – This study has illustrated that comparative accuracy and consistency between different BPS tools cannot be guaranteed: the tools produced up to a 35% difference in total loads. As illustrated in the study carried out, the use of different weather files does have a considerable influence on the total predicted loads but, more importantly, the way different tools calculate floor areas also significantly affects results. Other possible contributing reasons for the variations might be:
   • Algorithm differences – Though the thermodynamic calculations should lead to a similar result, differences in syntax complexity and algorithm structure might lead to different outputs.
   • Differences in the required input data – Although the study implementation methodology aimed to standardize the data inputs used for each of the tools, some variation between the sets of inputs required by each tool remained due to the inherent nature of each tool.
   • Human error – While various measures were taken to minimize the likelihood of user input error, the possibility of its occurrence cannot be completely eliminated.
ii. BPS influence on BREEAM and LEED ratings – In examining the influence of different BPS tools on the ratings of a case-study building in BREEAM and LEED, this study has found that although different BPS tools resulted in different energy consumption loads, this had only a minor effect on the BREEAM and LEED ratings (Table 9). This is because both BREEAM and LEED express 'performance improvement' as a ratio between the performance of the 'Designed' building and that of a 'Base case' building. Assuming both buildings are "built" and simulated by the same tool using the same weather file, the major contributors to the overall rating will be the parametric differences between the buildings (U-values, glazing ratio, system efficiencies, etc.).
iii. Comparing BREEAM and LEED rating scores – This study showed that building rating scores are strongly affected by the rating scheme that is employed: while the examined building achieved between 4.28 and 4.92 BREEAM credits (40–46% of Ene-01's available credits), it did not achieve any LEED credits. This is due both to the different system efficiency coefficients (defined by the NCM and ASHRAE) and to the fact that LEED sets a 10% improvement as a minimum requirement.

Due to the increasing use of relatively newly developed tools for quantifying 'sustainable design' – both BPS software and sustainable building rating systems – and since the potential effects of BPS inconsistencies on the credibility of rating systems had not been investigated thoroughly, we believe that this study has highlighted the importance of understanding the differences between BREEAM and LEED. This is not only because these two standards are among the most popular global sustainable building rating systems, but also because of the recent USGBC announcement (March 2012) acknowledging the similarities between LEED and BREEAM and confirming their recognition of credits from BREEAM's New Construction and most updated international schemes [64].

6. Further research

In extending the research presented in this paper, a body of further work aiming to examine the load predictions generated by different tools in more detail is planned. This analysis will be undertaken with the view that the findings may be used to improve the accuracy of these tools.

While the variations in results for this case study did not have a major impact on BREEAM or LEED, they might influence the results of other standards that aim to evaluate absolute energy consumption.

Acknowledgements

The data for the case-study building used in this study was supplied courtesy of SPPARC studio (updated September 2010). At the time of writing, the building was still under design.

The work undertaken for this study was in part supported by the Kenneth Lindsay Fund.

References

[1] IPCC, Climate Change 2007: Synthesis Report, Summary for Policymakers, Valencia, Spain, 2007. Available from: http://www.ipcc.ch/pdf/assessment-report/ar4/syr/ar4 syr spm.pdf (accessed 27.12.12).
[2] OPEC, World Oil Outlook, OPEC, Vienna, 2010, ISBN 978-3-9502722-1-5. Available from: http://www.opec.org/opec web/static files project/media/downloads/publications/WOO 2010.pdf (accessed 27.12.12).
[3] UNEP, Buildings – Investing in Energy and Resource Efficiency, UNEP, 2011. Available from: http://www.unep.org/greeneconomy/Portals/88/documents/ger/9.0 Buildings.pdf (accessed 27.12.12).
[4] D. Urge-Vorsatz, L.D.D. Harvey, S. Mirasgedis, M.D. Levine, Mitigating CO2 emissions from energy use in the world's buildings, Building Research & Information 35 (4) (2007) 379–398.
[5] USGBC, LEED | U.S. Green Building Council (online), 2012. http://new.usgbc.org/leed (accessed 10.12.12).
[6] S. Sorrell, Making the link: climate policy and the reform of the UK construction industry, Energy Policy 31 (2003) 865–878.
[7] Y. Roderick, D. McEwan, C. Wheatley, C. Alonso, Comparison of energy performance assessment between LEED, BREEAM and GREEN STAR, in: Eleventh International IBPSA Conference, 27–30 July 2009, Glasgow, Integrated Environmental Solutions Limited, Glasgow, 2009, pp. 1167–1176.
[8] USGBC, Foundations of LEED, USGBC Council Board of Directors, 2009. Available from: http://www.usgbc.org/ShowFile.aspx?DocumentID=6103 (accessed 27.12.12).
[9] N.C. Kibert, C.J. Kibert, Sustainable Development and the U.S. Green Building Movement – Profitable Development Projects Can Be Good for the Planet, Too, Hein Online, 22 Prob. & Prop. 21, 2008.
[10] S. Dermisi, Effect of LEED ratings and levels on office property assessed and market values, Journal of Sustainable Real Estate 1 (1) (2009).
[11] C. Turner, M. Frankel, Energy Performance of LEED for New Construction Buildings, USGBC, 2008.
[12] USGBC, LEED is Good for Business, 2012. Available from: http://new.usgbc.org/leed/applying-leed/leed-for-business (accessed 27.12.12).
[13] C. Goulding, J. Goldman, N. Demarino, LEED Building Tax Opportunities, Corporate Business Taxation Monthly, 2008. Available from: http://www.energytaxsavers.com/articles/LEED-Building-EPAct-179D-Tax-Opportunities.pdf (accessed 27.12.12).
[14] T. Saunders, A Discussion Document Comparing International Environmental Assessment Methods for Buildings, BRE Global, 2008. Available from: http://www.dgbc.nl/images/uploads/rapport vergelijking.pdf (accessed 27.12.12).
[15] K.R. Bunz, G.P. Henze, D.K. Tiller, Survey of sustainable building practices in North America, Europe and Asia, Journal of Architectural Engineering 12 (1) (2006) 33–62. Available from: http://www.rpd-mohesr.com/uploads/custompages/survey%20of%20sustainable%20building%20design.pdf (accessed 27.12.12).
[16] J. Neymark, R. Judkoff, Building Energy Simulation Test and Diagnostic Method for Heating, Ventilating, and Air-Conditioning Equipment Models (HVAC BESTEST), Volume 1: Cases E100–E200, International Energy Agency, 2002.
[17] A. Brun, C. Spitz, E. Wurtz, L. Mora, Behavioural comparison of some predictive tools used in a low-energy building, in: Proceedings of the Eleventh International IBPSA Conference (BS'09), International Building Performance Simulation Association, Glasgow, Scotland, 2009, pp. 1185–1190.
[18] T. Bevan, How the new BREEAM 2011 version measures, quantifies and evaluates the key life cycle environmental impacts of new buildings at the design and construction stages, in: CIBSE Technical Symposium, DeMontfort University, Leicester, UK, 6th and 7th September 2011, 2011.
[19] AIA, How Changes to LEED Will Benefit Existing and Historic Buildings, Knowledge Communities, 2012. Available from: http://www.aia.org/practicing/groups/kc/AIAS076321 (accessed 27.12.12).
[20] BRE Global, The World's Foremost Environmental Assessment Method and Rating System for Buildings, 2011. Available from: http://www.breeam.org/filelibrary/BREEAM Brochure.pdf (accessed 27.12.12).
[21] USGBC, Press Release: U.S. Green Building Council announces the LEED green building program to recognize energy credits from BREEAM, Washington, DC, 2012. Available from: https://www.usgbc.org/Docs/News/LEED%20Intl%20Roundtable%20Meeting%20in%20Paris%20031612.pdf (accessed 27.12.12).
[22] BREEAM, BREEAM New Construction Non-domestic Buildings Technical Manual, BRE Global (SD5073-2.0:2011), Watford, 2011.
[23] USGBC, LEED 2009 for New Construction and Major Renovations rating system with alternative compliance paths for projects outside the U.S., Washington, DC, 2011. Available from: http://www.usgbc.org/ShowFile.aspx?DocumentID=8868 (accessed 27.12.12).
[24] INBUILT, BREEAM versus LEED, Inbuilt Ltd, Kings Langley, 2010. Available from: http://www.ukpassivhaus.org/media/406565/breeamvsleed.pdf (accessed 27.12.12).
[25] S. Barlow, Guide to BREEAM, RIBA Publishing, 2012, ISBN 978 1 85946 425 0.
[26] UK Building Regulations, Building Regulations 2010, Approved Document L2A, Conservation of Fuel and Power, NBS, London, 2010, ISBN 978 1 85946 326 0.
[27] NCM, National Calculation Methodology (NCM) Modelling Guide (for buildings other than dwellings in England and Wales), Building Research Establishment Ltd, 2011.
[28] S.T. Ng, T. Chen, J.M.W. Wong, Variability of building environmental assessment tools on evaluating carbon emissions, Environmental Impact Assessment Review 38 (2013) 131–141.
[29] ASHRAE, ASHRAE Standard: Energy Standard for Buildings Except Low-Rise Residential Buildings, ASHRAE, Atlanta, 2007, ISSN 1041-2336.
[30] CIBSE, Building Energy and Environmental Modelling, Applications Manual AM11: 1998, Chartered Institution of Building Services Engineers, 1998.
[31] J.W. Hand, Removing Barriers to the Use of Simulation in the Building Design Professions, PhD Thesis, 1998.
[32] J. Hensen, N. Nakahara, Energy and building performance simulation: current state and future issues, Energy and Buildings 33 (2001) vii–ix.
[33] P. Wilde, Computational Support for the Selection of Energy Saving Building Components, DUP Science, Delft, The Netherlands, 2004.
[34] D.B. Crawley, J.W. Hand, M. Kummert, B.T. Griffith, Contrasting the capabilities of building energy performance simulation programs, Joint Report, US Department of Energy, University of Strathclyde, University of Wisconsin & NREL, 2005.
[35] D. Spekkink, Performance Based Design: an explanation, Building Research & Information: Special Issue on Performance Based Building, 2005.
[36] BCA, Performance-Based Building Regulations (online), 2004. http://www.bca.gov.sg/Performance-Based/others/FAQPerformanceBased.pdf
[37] P.A. Strachan, Simulation support for performance assessment of building components, Building and Environment 43 (2008) 228–236.
[38] J. Clarke, D. Tang, A co-operating solver approach to building simulation, in: Proceedings of the Bi-Annual Conference of IBPSA-ESIM2004, International Building Performance Simulation Association, Vancouver, Canada, 2004, pp. 213–220.
[39] UK GOV, DCLG Approved Software for the Production of Non Domestic Energy Performance Certificates, 2012. Available from: https://www.gov.uk/government/uploads/system/uploads/attachment data/file/43614/121109 - Table for website - list of DCLG approved Software 2 .pdf (accessed 11.02.12).
[40] W.L. Lee, Benchmarking energy use of building environmental assessment schemes, Energy and Buildings 45 (2012) 326–334.
[41] P.R. Rittelman, A.S. Faruq, Design Tool Survey, IEA SHC Task 8 Passive & Hybrid Solar Low Energy Buildings, Subtask C Design Methods, 1985.
[42] R. Judkoff, J. Neymark, International Energy Agency Building Energy Simulation Test (BESTEST) and Diagnostic Method, NREL/TP-472-6231, National Renewable Energy Laboratory, Golden, CO, 1995.
[43] R. Judkoff, J. Neymark, Home Energy Rating System Building Energy Simulation Test (HERS BESTEST), NREL/TP-472-7332, National Renewable Energy Laboratory, Golden, CO, 1995.
[44] G. Guyon, The role of the user in results obtained from simulation software program, in: Proc. BS'97, The International Building Simulation Conference, Prague, 1997.
[45] F. Karlsson, P. Rohdin, M. Persson, Measured and predicted energy demand of a low energy building: important aspects when using building energy simulation, Building Service Engineering Research and Technology 28 (2007) 223–235.
[46] T. Kalema, G. Johannesson, P. Pylsy, P. Hagengran, Accuracy of energy analysis of buildings: a comparison of a monthly energy balance method and simulation methods in calculating the energy consumption and the effect of thermal mass, Journal of Building Physics 32 (2008) 101–130.
[47] R. Judkoff, D. Wortman, B. O'Doherty, J. Burch, A methodology for validating building energy analysis simulations, Technical Report NREL/TP-550-42059, April 2008.
[48] R. Raslan, M. Davies, Results variability in accredited building energy performance compliance demonstration software in the UK: an inter-model comparative study, Journal of Building Performance Simulation 3 (1) (2009) 63–85.
[49] T. Maile, M. Fischer, J. Haymaker, V. Bazjanac, Formalizing approximations, assumptions, and simplifications to document limitations in building energy performance simulation, Comparing Measured and Simulated Building Energy Performance Data (2010) 96.
[50] H. Radhi, A comparison of the accuracy of building energy analysis in Bahrain using data from different weather periods, Renewable Energy 34 (3) (2009) 869–875.
[51] Y. Sun, Y. Heo, M.H.Y. Tan, H. Xie, C.F. Jeff Wu, G. Augenbroe, Uncertainty quantification of microclimate variables in building energy models, Journal of Building Performance Simulation (2013), http://dx.doi.org/10.1080/19401493.2012.757368.
[52] L. Wang, P. Mathew, X. Pang, Uncertainties in energy consumption introduced by building operations and weather for a medium-size office building, Energy and Buildings 53 (October) (2012) 152–158.
[53] B. John, Reducing the variability between novice modelers: results of a tool for human performance modeling produced through human-centered design, in: Proceedings of the 19th Conference on Behavior Representation in Modeling and Simulation, Charleston, SC, 21–24 March 2010, 2010.
[54] H. Brohus, P. Heiselberg, A. Simonsen, K.C. Sorensen, Uncertainty of energy consumption assessment of domestic buildings, in: Eleventh International IBPSA Conference, Glasgow, Scotland, July 27–30, 2009.
[55] Y. Yildiz, Z.D. Arsan, Identification of the building parameters that influence heating and cooling energy loads for apartment buildings in hot-humid climates, Energy 36 (July (7)) (2011) 4287–4296.
[56] G.R. Newsham, S. Mancini, B.J. Birt, Do LEED-certified buildings save energy? Yes, but..., Energy and Buildings 41 (August (8)) (2009) 897–905, ISSN 0378-7788.
[57] R.A. Fenner, R. Ryce, A comparative analysis of two building rating systems. Part 1: Evaluation, Engineering Sustainability 161 (ES1) (2007).
[58] M.J. Witte, R.H. Henninger, J. Glazer, Seventh International IBPSA Conference, 13–15 July 2001, Rio de Janeiro, Building Simulation, 2001, pp. 353–360.
[59] B. Haytack (ben@edsl.net), 4 December 2012, Subject: Ticket (SAL-9021304815), E-mail to Y. Schwartz (y.schwartz.11@ucl.ac.uk), personal communication.
[60] D.B. Crawley, EnergyPlus: DOE's Next Generation Simulation Program, U.S. Department of Energy, 2010. Available from: http://www1.eere.energy.gov/buildings/pdfs/eplus webinar 02-16-10.pdf (accessed 27.12.12).
[61] E. Cramp (edwina.cramp@iesve.com), 5 December 2012, Subject: IES Support Enquiry, E-mail to Y. Schwartz (y.schwartz.11@ucl.ac.uk), personal communication.
[62] The Chartered Institution of Building Services Engineers London, CIBSE Guide A: Environmental Design, CIBSE Publications, Norwich, 2006, ISBN-10: 1-903287-66-9, ISBN-13: 978-1-903287-66-8.
[63] SPPARC Architects, Quill Design & Access Statement, SPPARC Architects, London, 2010.
[64] USGBC, Press Release: U.S. Green Building Council Announces the LEED Green Building Program to Recognize Energy Credits from BREEAM, Washington, DC, 2012. Available from: https://www.usgbc.org/Docs/News/LEED%20Intl%20Roundtable%20Meeting%20in%20Paris%20031612.pdf (accessed 27.12.12).