




eWork and eBusiness in

Architecture, Engineering and Construction
Edited by

Attila Dikbaş
Istanbul Technical University, Turkey
Raimar Scherer
University of Technology, Dresden, Germany



Copyright 2004 Taylor & Francis Group plc, London, UK

All rights reserved. No part of this publication or the information contained herein may be
reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic,
mechanical, by photocopying, recording or otherwise, without prior written permission from the
publishers.
Although all care is taken to ensure the integrity and quality of this publication and the information
herein, no responsibility is assumed by the publishers or the authors for any damage to property or
persons as a result of operation or use of this publication and/or the information contained herein.
Published by: A.A.Balkema Publishers, a member of Taylor & Francis Group plc
http://balkema.tandf.co.uk/ and http://www.tandf.co.uk/
This edition published in the Taylor & Francis e-Library, 2006.
To purchase your own copy of this or any of Taylor & Francis or Routledge's collection of
thousands of eBooks please go to http://www.ebookstore.tandf.co.uk/.
ISBN 0-203-02342-0 Master e-book ISBN

ISBN 0 415 35938 4 (Print Edition)

eWork and eBusiness in Architecture, Engineering and Construction – Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 0 415 35938 4

Table of Contents




Keynote papers
The future forces of change for the construction sector – a global perspective
Vectors, visions and values
Help wanted: project information officer
The next generation of eBusiness and eWork – what is needed for the systemic
innovation? An executive summary of the EU supporting research and


Product modelling technology

Virtual building maintenance: enhancing building maintenance using 3D-GIS
and 3D laser scanner (VR) technology
V.Ahmed, Y.Arayici, A.Hamilton & G.Aouad
Supporting standard data model mappings
Virtual building environments (VBE) – applying information modeling to
A persistence interface for versioned object models
D.G.Beer, B.Firmenich, T.Richter & K.Beucke
Semantic parameterized interpretation: a new software architecture for
conceptual design systems
Harmonization of ISO 12006-2 and IFC – a necessary step towards





A novel modelling approach for the exchange of CAD information in civil

Integration of product models with document-based information
Aligning IFC with the emerging ISO 10303 modular architecture. Can the AEC
community take advantage of it?
R.Jardim-Gonçalves, K.Farinha & A.Steiger-Garção
Optimization of project processing in the steel construction domain
E.Holtzhauer & H.Saal
Location sensing for self-updating building models
O.Icoglu & A.Mahdavi
Modeling cast in place concrete construction alternatives with 4D CAD
R.P.M.Jongeling, T.Olofsson & M.Emborg
Pilot implementation of a requirements model
A.Kiviniemi & M.Fischer
A combined product-process model for building systems control
FIDE: XML-based data model for the Spanish AEC sector
J.M.Molina & M.Martinez
A framework for concurrent structure analysis in building industry
A.Niggl, R.Romberg, E.Rank, R.-P.Mundani & H.-J.Bungartz
IFC supported distributed, dynamic & extensible construction products
information models
M.Nour & K.Beucke
Product definition in collaborative building design and manufacturing
H.Oumeziane, J.C.Bocquet & P.Deshayes
Implementation of the ICT in the Slovenian AEC sector
T.Pazlar, M.Dolenc & J.Duhovnik
Adding sense to building modelling for code certification and advanced
I.A.Santos, F.Farinha, F.Hernández-Rodríguez & G.Bravo-Aranda
Towards engineering on the grid
Ž.Turk, M.Dolenc, J.Nabrzyski, P.Katranuschkov, E.Balaton, R.Balder &
Managing long transactions in model server based collaboration
M.Weise, P.Katranuschkov & R.J.Scherer
A software generation process for user-centered dynamic building system models
G.Zimmermann & A.Metzger
Process modelling technology








Embedded commissioning for building design

Ö.Akın, M.T.Turkaslan-Bulbul, I.Gursel, J.H.Garrett Jr, B.Akinci & H.Wang
The development of a technical office organization structure for enhancing
performance and productivity in fast track construction projects
T.A.H.Barakat, A.R.J.Dainty & D.J.Edwards
Innovative production planning system for bespoke precast concrete products
V.Benjaoran, N.Dawood & R.Marasini
Process and information flow in mass customisation of multi-story housing
T.Olofsson, L.Stehn & E.Cassel-Engqvist
RoadSim: an integrated simulation system for road construction management
S.Castro & N.Dawood
Connet Turkey – gateway to construction in Europe
A.Dikbaş, S.Durusoy, H.Yaman, L.Tanaçan & E.Taş
Modelling collaborative processes for Virtual Organisations in the building
M.Keller, P.Katranuschkov & K.Menzel
Process modelling in building engineering
M.König, A.Klinger & V.Berkhahn
Space competition on construction sites: assignment and quantification utilising
4D space planning tools
Z.Mallasi & N.Dawood
Project planning: a novel approach through a universal e-engineering Hub – a
case study of seismic risk analysis
G.Augenbroe, Z.Ren, C.J.Anumba, T.M.Hassan & M.Mangini
A decision support model for material supply management for the construction
J.Perdomo, W.Thabet & R.Badinelli
Modeling processes and processing product model information based on Petri
U.Rueppel, U.F.Meissner & S.Greb
A building material information system: BMIS – in the context of CONNET
Turkey project
E.Taş, L.Tanaçan, H.Yaman & A.Dikbaş








Managing changes in the AEC industry – how can ontologies help?
Q.Y.Cai & F.F.Ng
An ontology-driven approach for monitoring collaborative design knowledge
Y-C.Lai & M.Carlsen
Setting up the open semantic infrastructure for the construction sector in
Europe – the FUNSIEC project
C.Lima, B.Fiès, C.Ferreira da Silva & S.Barresi


Practical use of the semantic web: lessons learned and opportunities found
R.V.Rees, W.V.Vegchel & F.Tolman
Supporting ontology management through self-describing concepts


eWork and eBusiness

An assessment methodology for eBusiness and eCommerce in the AEC sector
A.Grilo, R.Mal & R.Jardim-Gonçalves
The digital dormer – applying for building permits online
J.P.van Leeuwen, A.J.Jessurun & E.de Wit
An inquiry into building product information acquisition and processing
A.Mahdavi, G.Suter, S.Häusler & S.Kernstock
Usefulness and ease-of-use assessment of a project management tool for the
construction industry
B.Otjacques, G.Barrère, F.Feltz & M.Naaranoja
Development and implementation of a functional architecture for an e-engineering Hub in construction
Z.Ren, C.J.Anumba, T.M.Hassan & G.Augenbroe
Legal and contractual issues – are they considered in RTD achievements
M.A.Shelbourn, T.M.Hassan & C.D.Carter
Modeling of ERP system solutions for the construction industry
M.O.Tatari, B-Y.Ryoo & M.J.Skibniewski
Construction informatics themes in the framework 5 programme




Collaborative working
Virtual pools of resources eliminate idle or missing equipment in AEC
G.Antoniadis & K.Menzel
DIVERCITY: distributed virtual workspace for enhancing communication and
collaboration within the construction industry
Y.Arayici & G.Aouad
Cooperation and product modelling systems
S.Blokpoel, R.R.M.Jongeling & T.Olofsson
Linking early design decisions across multiple disciplines
R.Drogemuller, J.Crawford & S.Egan
State of the art of the implementation of Information Management Systems in the
construction industry in Spain
N.Forcada, M.Casals & X.Roca
Agent-enabled Peer-To-Peer infrastructure for cross-company teamwork
A.Gehre, P.Katranuschkov & R.J.Scherer





Virtual communities: design for collaboration and knowledge creation

I.L.Kondratova & I.Goldfarb
The design frameworka web environment for collaborative design in the
building industry
Collaborative work practices in Turkey, five case studies
Architecture for collaborative business process management – enabling dynamic
S.Zang, O.Adam, A.Hofer, C.Hammer, M.Jerrentrup & S.Leinenbach
Comprehensive information exchange for the construction industry




Mobile computing
Mapping site processes for the introduction of mobile IT
S.L.Bowden, A.Dorr, A.Thorpe & C.J.Anumba
Mobile field data entry for concrete quality control information
Issues of context sensitivity in mobile computing: restrictions and challenges in
the construction sector
K.Menzel, K.Eisenblätter & M.Keller
A context based communication system for construction
D.Rebolj, A.Magdič & N.Č.Babič
MOBIKO – mobile cooperation in the construction industry based on wireless



Knowledge management
Support for requirement traceability in design computing: an integrated approach
with building data modeling
I.Özkaya & Ö.Akın
Interlinking unstructured text information with model-based project data: an
approach to product model based information mining
S.-E.Schapke & R.J.Scherer
Live capture and reuse of project knowledge in construction: a proposed strategy
C.E.Udeaja, J.M.Kamara, P.M.Carrillo, C.J.Anumba, N.Bouchlaghem & H.Tan
Development of product family structure for high-rise residential buildings using
industry foundation classes
T.Wallmark & M.M.Tseng




Construction site and project management

Assistance to building construction coordination by images
S.Kubicki, G.Halin & J.-C.Bignon
Gesprecons: eSafety and risk prevention in the construction sector
J.M.Molina, M.Martinez & I.García
Intelligent Construction Sites (ICSs)
T.Mills, Y.Jung & W.Thabet
Organizational point of view for the use of information technology in
construction projects
Virtual reality at the building site: investigating how the VR model is
experienced and its practical applicability
S.Woksepp, O.Tullberg & T.Olofsson
Evaluating competitiveness in construction industry: an alternative frame
A.Y.Toprakli, A.Dikbaş & Y.Sey




Seismic risk and environmental management

Analyses of Izmit earthquake by means of remotely sensed data: a case study,
Yalova city
S.Kaya, F.Bektas, C.Goksel & E.Saroglu
Do phased Environmental Management Systems actually benefit SMEs?
L.L.Hopkinson & C.Snow
Software based knowledge integration for seismic risk management
R.Pellegrini & P.Salvaneschi
Real-time earthquake prediction algorithms
S.Radeva, R.J.Scherer & D.Radev



IT supported architectural design

Hybrid approach to solve space planning problems in building services
G.Bi & B.Medjdoub
A computational architectural design approach based on fractals at early design
Ö.Ediz & G.Çağdaş
APSIS architectural plan layout generator by exhaustive search
B.Kisacikoglu & G.Çağdaş
Architectural parametric design and mass customization
S.Boer & K.Oosterhuis





A model for hierarchical floorplan geometries based on shape functions
G.Zimmermann & G.Suter


E-learning and education

Parametric representation of functional building elements with reference to
architectural education
M.Aygün & Ö.Çetiner
Life long learning for improved product and process modeling support
E-learning with puzzle collages
C-H.Lin, T-W.Chang, L-C.Yang & S-C.Chen


Author index





The global community has stepped into the next revolutionary phase of the long-term
evolution of the information society and is now facing a new, challenging phenomenon:
Ambient Intelligence – providing and getting the right information to the right people in
the right configuration at the right time, anywhere. Our business processes have started to
change. New working methods are available and asked for; new forms of organization
have been proven to be efficient and effective – the visions of the previous decade have
been conquering practice. Ambient intelligence is the final keystone for a breakthrough
and an industry-wide business revolution, in particular for our one-of-a-kind, multi-shareholder and hence very complex projects.
Intelligent management of the right information has become the focus of research.
Computing power is now available on the Web, and basic technologies – like P2P, Grid,
Agents and Web services – have been developed to maturity by the informatics
community for application in AEC/FM. Apply them to your benefit – this is the offer of the
informatics community, and also the challenge.
Making intelligence happen requires more than solely utilizing the basic technologies
and computing power on the Web. It means algorithms, either numerical or reasoning
ones, and it means enhanced semantic data structures in which information and
knowledge are integrated and can be retrieved on request – when, where and how
desired. Intelligence does not mean merely powerful numerical algorithms for solving
and simulating complex engineering systems, as understood in computational
mechanics. In this context intelligence means autonomous problem specification,
decision preparation for problem solving and, to some extent, even problem solving itself.
Such systems, not necessarily located on one computer and possibly distributed
throughout the Web, should be capable of recognizing, deciding, retrieving and providing
any piece of information, not only explicitly stored data, and at the same time support
co-operation with the end-user in order to serve him or her intelligently and politely. Data
structures, and hence product and process modelling, are as important as the respective
algorithms to make this happen, in particular for recognising the context, which is the
prerequisite for any autonomous action. Data structures, i.e. data schemata, must carry
meaning; semantics must be more than an identifier. They have to encapsulate knowledge
about the objects. This knowledge must be re-usable in a flexible way and provide for
reasoning to interrelate it with knowledge about other objects and their status, described
by the object data, in order to build up the current context. By recognizing the effective
state and crystallizing the particular problems and the various actors in an instantaneous
process, we are finally able to provide the right, focused information. This makes
ambient intelligence happen.
Research on and the building of ontologies, besides product data models, has increasingly
been the focus of research activities in AEC/FM. However, do ontologies really replace
product data models? Or, if not, do they subsume them? It is neither. Ontologies
extend product models by adding a new functionality, namely carrying knowledge, which is
simply another objective. The main objective of product models is the very generic
representation of real-world objects as well as their respective general relationships,
forming a generic object net from the singular units – the objects – to model a very generic
skeleton for any kind of application. Other extensions to the generic product model are
already on the way. For instance, product models are favoured as the anchor for
project documents, structuring the document information space. Data and text mining
methods are increasingly applied to identify the representative semantic items of
documents and map them to the semantics of the product model in order to interpret
the meaning of the document, i.e. recognizing its information content and further
multi-interlinking it with the product model. Again, being accessible via a VR building
environment, ambient intelligence makes document information tangible. The user is no
longer required to search for the right document in order to get the right information; he
only has to identify the building object in his VR model, and the information system
provides him with the right information at any place and any time. The power of this
automatic selectiveness depends upon the capacity and power of the underlying
context-sensitive system – and again, context-sensitivity is first of all determined by
logical reasoning on ontologies based on product and process models. We can subsume
generic product models and ontologies, as well as any other knowledge-related extensions
of product models, as intelligent product models.
In recent years, the quality of product models has reached a level that allows for the
design of reasoning systems to check the consistency of architecture and engineering
systems as well as their conformity with building codes and guidelines. The few existing
and very successful examples have to be considered first attempts; looking at the great
variety of reasoning methods provided by basic informatics, this new area has just been
touched on. However, the results gained are more than promising. The consistency
checking methods are an important pre-requisite for co-operative and concurrent working,
namely for the consistency problems arising from long-term transactions in complex
databases, as is the case in our AEC/FM databases. We now have the confidence that they
can be handled, but practically sufficient solutions still need considerable research and
development efforts to cope with the whole AEC/FM domain.
In this context, the numerical and reasoning algorithms are utilised in a new, separate
information process, namely the information configuration process, so that we can now
distinguish among processes on three different levels. Besides modelling the tangible
work processes, such as the production, organizational, planning and controlling
processes, we have to consider the intangible communication processes supporting
formal information management and information logistics, as well as the configuration
processes that determine, e.g., the users' information needs, critical notification events or
the optimal configuration and presentation of the information. In the future our research
efforts will more and more shift from basic product and tangible process modelling to
enhanced intelligent product modelling and information process modelling.
In recent years, new business concepts and modelling techniques have been developed
for the virtual enterprise that have demonstrated their proficiency in several best-practice
cases. Again, ambient intelligence, and additionally mobile computing, are expected to
provide a push to flexibly adopt formal business models in AEC/FM practice. It will be
of utmost importance to the industry to extend these organisational models to efficient
autonomous teamwork across enterprises, anywhere and in any team. Flexible systems
and automatic configuration methods are required to install immediately operable virtual
teams within short lead times, supported by sound organization structures, team-focused
information spaces and corresponding information logistics. Virtual enterprises will no
longer be limited to strategic alliances providing interoperability on a corporate and/or
product level, but will also be able to significantly reduce the management cost of true
inter-enterprise collaboration on the team level.
Focusing on a few selected but outstanding topics of today's research on product and
process modelling, the papers of ECPPM 2004 give a very good overview of the current
state of the art in practice, of emerging new business models and of the cutting-edge
technologies available for architecture, engineering and construction. They thus provide a
solid foundation for exploring the outlined possibilities of applying ambient intelligence
in our domain.
The Istanbul Technical University, Turkey, has been selected to host the ECPPM in
2004. After holding ECPPM 2002 in the former candidate state of Slovenia, the EAPPM
thereby again takes a clear stand for integrating researchers from all over Europe and
aligning the various activities in product and process modelling for a better future.
Today, Turkey is a potential new EU member state of great importance and an agile
economy. Moreover, it is the bridge between Europe and Asia and has been a melting pot
of cultures for more than 3000 years.
In Istanbul, ECPPM 2004 again introduces a new platform to share knowledge and
transform it into an active, functional asset ready to be shared, integrated and traded.
The latest research results and business applications in the areas of eWork and eBusiness,
product and process modelling, collaborative working, mobile computing, knowledge
management and ontologies will enable research and industry organisations to develop
new lines of services and usher in a new breed of research areas. The committees of
ECPPM 2004 have selected the best papers and organized attractive sessions for their
presentation. The number of abstracts submitted was again unusually high and their
quality was remarkable.
Numerous people have made the conference and the proceedings possible. We thank the
authors, the scientific committee members and the ITU Project Management Centre for
their contribution, support and encouragement in compiling this book. Sincere gratitude
to each and all of them.
Attila Dikbaş and Raimar Scherer
Istanbul and Dresden, June 2004

ECPPM 2004 Organization

Attila Dikbaş, Istanbul Technical University, Turkey
Raimar Scherer, University of Technology Dresden, Germany
Žiga Turk, University of Ljubljana, Slovenia
Gülsün Sağlamer, Istanbul Technical University, Turkey
Nüzhet Dalfes, Istanbul Technical University, Turkey
Yildiz Sey, Istanbul Technical University, Turkey
Amor, R., University of Auckland, New Zealand
Andersen, T., FMRI Consultant, Denmark
Augenbroe, G., Georgia Institute of Technology, USA
Bjoerk, B-C, Swedish School of Economics and Business Administration, Finland
Böhms, M., TNO, Netherlands
Cervenka, J., Cervenka Consulting, Czech Republic
Christiansson, P., Aalborg University, Denmark
Çağdaş, G., Istanbul Technical University, Turkey
Dağ, H., Istanbul Technical University, Turkey
Drogemuller, R., CSIRO, Australia
Ekholm, A., Lund University, Sweden
Fischer, M., Stanford University, USA
Froese, T., University of British Columbia, Canada
Fruchter, R., Stanford University, USA
Giritli, H., Istanbul Technical University, Turkey
Goncalves, R., Universidade Nova Lisboa, Portugal
Haas, W., Haas+Partner Ingenieurges. mbH, Germany
Kalay, Y., Berkeley University, USA
Kanoğlu, A., Istanbul Technical University, Turkey
Katranuschkov, P., TU Dresden, Germany
Lemonnier, A., ADEI, Spain
Menzel, K., TU Dresden, Germany
Mitchell, J., Graphisoft, Hungary

Moore, L., University of Wales, EG-SEA-AI, UK

Rezgui,Y., University of Salford, UK
Salamer, A., Istanbul Technical University, Turkey
Skibniewski, M., Purdue University, USA
Smith, I., Federal Inst. of Tech., IABSE WC6, Switzerland
Steinmann, R., Nemetschek, Germany
Thomas, K., Waterford Institute of Technology, Ireland
Tzanev, D., University of Sofia, Bulgaria
Baslo, M., Istanbul Technical University, Project Management Center
Ergun, Z.N., Istanbul Technical University, Project Management Center
Amor, R., University of Auckland, New Zealand
Andersen, T., FMRI Consultant, Denmark
Anumba, C., Loughborough Uni., UK
Augenbroe, G., Georgia Institute of Technology, USA
Bjoerk, B-C., Swedish School of Economics and Business Administration, Finland
Böhms, M., TNO, Netherlands
Borkowski, A., Polish Acad. of Sciences, Poland
Cervenka, J., Cervenka Consulting, Czech Republic
Christiansson, P., Aalborg University, Denmark
Coyne, R., University of Edinburgh, UK
Drogemuller, R., CSIRO, Australia
Ekholm, A., Lund University, Sweden
Fischer, M., Stanford University, USA
Froese, T., University of British Columbia, Canada
Fruchter, R., Stanford University, USA
Garas, F., Consultant, UK
Garrett, Jr., J., Carnegie Mellon University, USA
Goncalves, R., Universidade Nova Lisboa, Portugal
Gudnason, G., Icelandic Building Research, Iceland
Haas, W., Haas+Partner Ingenieurges. mbH, Germany
Hannus, M., VTT Technical Res. Centre of Finland
Howard, R., Technical University of Denmark
Juli, R., Obermayer Planen+Beraten, Germany
Kalay, Y., Berkeley University, USA
Katranuschkov, P., TU Dresden, Germany
Llambrich, A., ADEI, Spain
Lemonnier, A., ADEI, Spain
Liebich, T., AEC3, IAI, Germany
Mangini, M., Geodeco S.p.A., Italy
Martinez, M., AIDICO Constr. Tech. Inst., Spain
Mitchell, J., Graphisoft, Hungary
Moore, L., University of Wales, EG-SEA-AI, UK
Nolan, J., European Commission, Belgium
Rezgui,Y., University of Salford, UK

Skibniewski, M., Purdue University, USA

Smith, I., Federal Inst. of Tech., IABSE WC6, Switzerland
Sozen, Z., Istanbul Technical University, Turkey
Steinmann, R., Nemetschek, Germany
Storer, G., Consultant, UK
Tzanev, D., University of Sofia, Bulgaria
Vanier, D., National Research Council, Canada
Winzenholler, J., Autodesk, Germany
Wix, J., AEC3, IAI, UK
Zarli, A., CSTB, France
Akkoyun, I., ITU, Project Management Center
Artan, D., ITU, Project Management Center
Aslay, Z., ITU, Project Management Center
Baslo, M., ITU, Project Management Center
Çağdaş, G., ITU, Faculty of Architecture
Çelik, Ö., ITU, Project Management Center
Dağ, H., ITU, Informatics Institute
Erdem, A., ITU, Faculty of Architecture
Ergun, Z. N., ITU, Project Management Center
Gökçe, Umut, ITU Project Management Center, TU Dresden
Göksel, Ç., ITU, Faculty of Civil Engineering
Ilter, T., ITU, Project Management Center
Oraz, G., ITU, Faculty of Architecture
Öney, E., ITU, Faculty of Architecture
Sanal, A., ITU, Faculty of Architecture
Tali, R., ITU, Faculty of Civil Engineering
Yaman, H., ITU, Faculty of Architecture

Keynote papers


The future forces of change for the construction sector – a global perspective
Roger Flanagan
The University of Reading, UK
ABSTRACT: All organisations, whether they are engineering and design
consultants, contractors, or manufacturers and suppliers in the
construction sector, need a strategy to survive, grow and succeed in a
rapidly changing world. This paper identifies nine drivers that are
impacting construction organisations. These drivers emanate from
political, environmental, technological, social and economic changes
impacting the global economy. In facing change, there is a need to balance
the internal juxtaposition of change and continuity. The error made by
some organisations is that they see all the new technology and materials
and feel they must be used as soon as possible. Stopping to develop a
strategy is important; it provides the framework to implement a plan for
the future whilst maintaining the goals and the direction of the
organisation.
The challenge for all organisations is facing, managing and implementing change, whilst
at the same time ensuring profitability and maintaining customer satisfaction.
Construction organisations need to recognise today the opportunities of tomorrow.
Realism must prevail; construction is predominantly a local business using mainly local
labour and complying with local requirements. The developed countries will have
different needs from developing and newly industrialised countries. For example, India's
need is to have an efficient industry that can provide work for its people, whereas in the
USA, with its higher cost base, the need is to build efficiently by exploiting technology,
more mechanisation, and off-site pre-fabrication wherever possible.
Our lives have been transformed by electronics and information technology but, most
of all, by the process of change itself. Knowledge has become pivotal and globalisation
has changed the face of competition.
Local issues will always be important, but construction sectors around the world are
not immune from the global issues that impact upon the economy, the demand for their
services, and the quality of life. Drivers can be defined as those forces that cannot be
changed and are an inevitable result of development in the broadest sense. The drivers of
change involve social, technological, economic, environmental and political trends. Many
countries have undertaken futures studies and Foresight studies with the aim of
identifying the drivers that will influence construction in the next 20 years. Studies from
10 developed countries (Australia, Canada, Finland, France, Germany, Ireland,
Singapore, Sweden, UK and USA) were analysed, from which nine key drivers were
identified for the purposes of this paper; it is possible to identify many more drivers. Each
country is influenced by local needs and challenges, with different emphasis between the
developed and developing world. However, organisations need to consider the drivers of
change and ask: 'How will the drivers affect our business in the future? Are they a threat
or an opportunity? How should we react to the challenge?'
1 Urbanisation, growth of cities, and transportation
2 Ageing population
3 Rapid technological and organisational change
4 Environmental and climate change
5 Shift from public to private
6 The knowledge economy and information overload
7 Technologies for tomorrow
8 People, safety and health
9 Vulnerability, security, corruption and crime
2.1 Urbanisation
The move from rural to urban communities, and the change from agricultural to industrial
societies in all parts of the world, is putting pressure on infrastructure and changing
patterns of settlement. Between 1990 and 2025 the number of people living in urban
areas is projected to double to more than 5 billion (UN, 1996).
In 1800, only 2% of the world's population was urbanised; this rose to over 30% in
1950, and 47% in 2000 – an urban population growing three times faster than the
population as a whole. Figure 1 shows that the percentage of urbanisation is predicted to
be over 60% by 2030.
Growing urbanisation creates congestion, puts pressure on utilities, and results in
many social issues. In many cities built since the Industrial Revolution there is a decaying
infrastructure that is not meeting increased demand. By 1900 only 12 cities had 1 million
or more inhabitants, by 1950, this had grown to 83 cities. In 2004, there are over 410
cities with over 1 million people (UN). The current stock of infrastructure cannot cope,
and modification, modernisation and refurbishment will be required to the existing
infrastructure, with particular emphasis on the environmental impact. This dilemma is
typical of many countries around the world.


Figure 1. The growth of urbanisation

(The Population Institute, 2004).
People are more mobile, using roads, rail and air more frequently. In the UK, the average
person travelled 5 miles per day in 1950, and 28 miles in 2001. Projections suggest this
could reach 60 miles a day by 2025 (Cabinet Office, 2001). New airport development is
fraught with social and environmental problems as airport development increases
urbanisation, putting pressure on available land. Increased airport capacity will involve
new regional airports with technology to cope with noise levels.


2.1.1 Growth of cities

Congestion is an increasing problem in urban areas, impacting the economy and the
environment. European research showed that congestion costs between 1% and 2% of
GDP (Cabinet Office, 2001). New methods of car parking will be required on streets and
in car parks. Automatic (electro-mechanical) parking without manual assistance is being
used in congested city centres, based on an underground silo system making maximum
use of limited space (Trevipark, 2004).
2.1.2 Transportation
Modernisation and retrofit is required for existing transport infrastructure. Engineers will
retrofit roads with new technologies rather than reconstruct them; interactive vehicle-road
systems will be widespread.
Underground road construction will be inevitable as cities become more crowded.
According to one report, it is anticipated that 10% of the trunk road network in the UK
will be tunnelled by 2050. However, the report highlights the cost of tunnel
maintenance – about 10 times that of an equivalent surface road. Restricting tunnels to
cars and lighter vehicles can improve operation and reduce construction cost by around
40% (RAC Foundation, 2002). This trend is also evident elsewhere, for example in the
Göta Tunnel in Sweden and the Big Dig in Boston. Tunnelling must in future be seen as a
viable option if all social and environmental costs are included.
Light rail systems and people movers will be used increasingly in urban areas. Rail infrastructure is in need of renewal and improvement to take account of higher speeds, greater density of use, improved safety measures and modernisation of control systems. Maglev (magnetically levitated) trains, which allow speeds of up to 350 km per hour, have experienced a long period of research, but development and application are now proceeding. For example, China is considering 250 km of rail extensions north and south of Shanghai using a maglev system.
Greater demand management is needed, including price tolling and inter-modality, maintenance planning and durability. Advanced transport telematics (ATT) will become prevalent, specifically concerned with improving safety and efficiency in all forms of transport and reducing damage to the environment. ATT allows efficient management and improvements in many areas of road transportation, such as demand management and automatic debiting; driver information and guidance; pedestrian and vehicle safety; monitoring of vehicle emissions; trip planning; integrated urban traffic control; public transport; and freight transport.
2.2 Ageing population
The developed world has an ageing population whilst populations are getting younger in
the developing world. According to the United Nations, the number of persons aged 60
years or older was estimated to be nearly 600 million in 2000 and is projected to grow to
almost 2 billion by 2050, at which time the population of older persons will be larger than
the population of children (0-14 years) for the first time in human history (UN, 2004).


The majority of the world's older persons reside in Asia (53%) while Europe has the next
largest share (25%). Figures 2 and 3 show the percentage of population over 60 in
different countries across the world.

Figure 2. Percentage of population over 60, 2002 (UN, 2004).
Figure 3. Percentage of population over 60, 2050 (UN, 2004).
One of every 10 persons is now aged 60 years or older; by 2050, the United Nations
projects that 1 person of every 5 and, by 2150, 1 of every 3 will be aged 60 years or
older. The percentage is much higher in the more developed than in the less developed regions, but the pace of ageing in developing countries is more rapid, and their transition from a young to an old age structure will be more compressed in time.
Few facilities are built to cope with an ageing population, so infrastructure will need to
be built for an inclusive population and to meet a growing need for more healthcare
facilities. An increasing number of people with severe disabilities are living longer and
wanting to live independently. Design companies and construction organisations will
need to think and work differently to meet this demand.
2.3 Rapid technological and organisational change
The new kind of economy will create many more business opportunities, but the rate of change will make it more difficult for an organisation to profit from an investment before a new competitor or development erodes the temporary competitive advantage. We are
more used to the idea of firms seeking an environment in which they can put down roots
and flourish, than to the idea of firms being created for an intentionally brief life to
exploit an idea before being washed away by a new wave of innovation (Chatham House
Forum, 1998).
Technology enables almost anything to be done; deciding what to do becomes the
critical skill. In the broadest sense of technology, our capacity to perform tasks, and our
ability to perceive and interact with complicated, remote, huge or tiny, abstract or
concrete things will be unprecedented. Personal computers will not be the main source of
information. Instead of buying a computer, most people will buy devices with computers
in them (embedded systems): those devices will talk to each other (interoperability). The
big breakthrough will come when all communication technologies become integrated.
Then you will have an all-in-one device that communicates.
Agile, knowledge-deploying firms may be able to build sustainable positions in the new environment, but they will do so in an innovative way. The electronics industry talks of 'co-opetition': co-operation merged with frenzied competition. In design consultancy businesses, the high cost of developing the integration of CAD and visualisation will mean that development and application costs will be shared between competitors.
2.4 Environmental and climate change
There is an increasing environmental awareness by governments, industry, clients and the
general public. Global environmental problems are high on political agendas with
increasing environmental legislation at a national, supranational and international level.
Ozone depletion, pollution, depletion of resources, and global warming are all common
topics of concern.
Climate change will affect physical and biological systems in many parts of the world.
The earth's climate is predicted to change because human activities are altering the chemical composition of the atmosphere through the build-up of greenhouse gases, primarily carbon dioxide, methane, and nitrous oxide. The heat-trapping property of these gases is undisputed. Although uncertainty exists about exactly how the earth's climate responds to these gases, global temperatures are rising.
A change in a regional climate could alter forests, crop yields, and water supplies.
Flooding of settlements near low-lying coastal areas and rivers will be prevalent, causing severe damage to buildings and infrastructure and putting greater pressure on the repair and maintenance sector of the industry. Energy demand is expected to increase for space cooling and decrease for space heating, according to location. Energy supply may be disrupted in the same way as other infrastructure.
2.5 Shift from public to private
There is an increasing trend towards private funding of public infrastructure. Infrastructure projects such as power, telecommunications, water and sewerage, and transport facilities have a number of characteristics: they lack portability, are rarely convertible to other uses, and investments in them are difficult to reverse. Infrastructure projects require very large capital investments, and have long development and payback periods.
There has been a change in the forms of financing over the last few decades with a
shift from public to private sector financing. For example, the UK government
implemented a Private Finance Initiative (PFI) and there has been a major privatisation of
utilities companies. The number of BOT, BOOT, BOO, and public/private partnerships
has increased. The public good nature of infrastructure projects makes them sensitive to
public opinion and political pressure. The mechanisms to attract private finance into infrastructure provision are becoming more complex and more acceptable, with multilateral development agencies and institutional investors embracing the BOT concept.
The message for construction is that there is no shortage of projects around the world; there is a shortage of bankable projects. This new form of procurement will grow in size,
importance, and complexity. Ways will have to be found for large companies and SMEs
to meet the challenges of the shift from public to private.
2.6 The knowledge economy and information overload
The know-how of people is one of the critical determinants of competitiveness, both at a
company and national level. Rapid technological changes mean that the traditional skill
bases are no longer enough and the future will be characterised by skill shortages and
skill gaps. High obsolescence of knowledge will have to be tackled in the context of an
increasingly ageing workforce. There will be at least 1 billion university graduates in
2020 compared with a few million in 1920. There will be several billion more
sophisticated customers by 2020, who will be better informed and more demanding than ever before (Chatham House Forum, 1998).
Learning matters, for individuals, companies, industry and the economy as a whole.
The tradition has been to measure success by economic growth and by the level of
capital. In today's knowledge economy, knowledge capital is more important. 'Knowledge capital is the source of economic value added by the organization, over and above the return on its financial assets' (Strassmann, 1998). Investment in education and training helps form human capital, 'the knowledge, skills, competencies, and attributes embodied in individuals that facilitate the creation of personal, social, and economic well-being' (OECD, 2001), which is a vital element in assuring economic growth and individual advancement and reducing inequality.


Technology gives us more and more access to information, so life gets more and more
chaotic. Information chaos prevails and we need to help people find the information that
they want, when they want it.
2.7 Technologies for tomorrow
Technology is a word that frightens some, excites others and prompts a feeling of
inevitability in the rest. There have been major advances in materials and technologies in
general. Extensive research has been undertaken into the use of composite materials,
providing lightweight, strong materials that do not rely on the earths non-renewable
resources. For example, soya and castor seed oils that are cheaper, bio-degradable and an
economic multiplier of using local products (ACRES, 2002). Many of the new/smart
materials are finding their way into the construction sector, having been first developed
for other industries such as automotive, aeronautic and defence.
These new materials, combined with the incorporation of intelligence, herald exciting
scientific advances. Smart or intelligent materials or structures are those that recognise
their environment and any changes and can adapt to meet those changes. System
integration, mass and energy reduction are just some of the benefits of using smart
materials. The technology of intelligent or smart materials uses the knowledge of a
number of different technologies such as materials science, biotechnology, biomimetics,
nanotechnology, molecular electronics, neural networks and artificial intelligence.
Four new technologies are considered in this paper:
1 Biomimetics
2 Smart materials and structures
3 Nanotechnology
4 Embedded intelligence: the application of information and communication technologies
2.7.1 Biomimetics
Biomimetics has been defined as the abstraction of good design from nature. This relatively new science advocates a radical approach of copying nature: biomimicry.
Biomimetics needs the collaboration of the scientist and the engineer. The biologist
understands the organisms and systems within nature whilst the engineer looks at the
design, the strength and durability characteristics. Nature has already produced smart
materials, ones that interact with their environment, responding to changes in a number of
ways. For example, plants have the ability to respond to changes in temperature, sunlight
etc. in order to make maximum use of their environment. The feathers of a penguin are
perfectly designed to be light but able to keep the bird warm in sub-zero temperatures.
Imagine a cladding that could do the samelight and strong with efficient insulation that
adapts to the environment.
Biomimetic engineering could provide clothing that is light, responsive and strong and
could be used in harsh site conditions. Mimicking nature could produce new designs in
civil engineering that are lighter, stronger and with greater adaptability to a changing
environment. New adhesives, based on those produced in nature (the blue mussel), could
revolutionise the building process. Buildings could be glued together, giving stronger, faster and cleaner construction techniques. The possibilities for the use of biomimetics
appear to be endless, but the research needed to achieve effective, efficient and viable
materials will not happen overnight.
2.7.2 Smart materials and structures
Extensive research has been undertaken into the use of composite materials, providing
lightweight, strong materials that do not rely on the earths non-renewable resources.
These new materials, combined with the incorporation of intelligence, herald exciting
scientific advances.
Smart or intelligent materials or structures are those that recognise their environment
and any changes and can adapt to meet those changes. System integration, mass and
energy reduction are just some of the benefits of using smart materials.
The technology of intelligent or smart materials uses the knowledge of a number of
different technologies such as materials science, biotechnology, biomimetics,
nanotechnology, molecular electronics, neural networks and artificial intelligence. These
technologies are inter-related. Just as stone implements triggered the Stone Age, alloys of
copper and tin triggered the Bronze Age and iron smelting triggered the Iron Age, the
new generation of materials will have a revolutionary effect.
Smart materials can be further defined as (Jain and Sirkis, 1994):
Materials functioning as both sensors and actuators.
Materials that have multiple responses to one stimulus in a co-ordinated fashion.
Passively smart materials, with self-repairing or standby characteristics to withstand sudden changes.
Actively smart materials utilising feedback.
Smart materials and systems reproducing biological functions in load-bearing structural systems.
Sensor materials should have the ability to feed back stimuli, such as thermal, electrical and magnetic signals, to the motor system in response to changes in the thermo-mechanical characteristics of smart structures (Jain and Sirkis, 1994). Actuators should also react to the same stimuli, but their reaction should be to change shape, stiffness, position, natural frequency, damping and/or other mechanical characteristics.
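The sense-compare-actuate loop described here can be sketched as a deliberately simplified toy program; the strain signal, target value and gain below are invented purely for illustration and are not taken from Jain and Sirkis:

```python
# Toy sketch of a "smart structure" feedback loop: a sensor reports
# strain, and an actuator adjusts stiffness in proportion to the
# deviation from a target. All numbers are hypothetical.

def read_strain(step):
    """Stand-in for a real strain sensor (synthetic microstrain signal)."""
    return 100.0 + 40.0 * (step % 5)

def adjust_stiffness(current, strain, target=150.0, gain=0.01):
    """Actuator model: raise stiffness when strain exceeds the target."""
    error = strain - target
    return current + gain * error  # simple proportional response

stiffness = 1.0  # arbitrary starting stiffness
for step in range(10):
    stiffness = adjust_stiffness(stiffness, read_strain(step))

print(f"stiffness after 10 cycles: {stiffness:.2f}")
```

The point is only the shape of the loop (sense, compare, actuate); a real smart structure would respond to thermal, electrical or magnetic stimuli, as the text describes, rather than a synthetic signal.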
2.7.3 Nanotechnology
Nano as a prefix to any measure means one billionth. For example, a nanosecond is one billionth of a second; a nanometre is one billionth of a metre. The essence of nanotechnology is the ability to create large structures from the bottom up, that is, by starting with materials at a molecular level and building them up. The structures created, nanostructures, are the smallest human-made objects whose building blocks are understood from first principles in terms of their biological, chemical and physical properties.
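The prefix arithmetic can be checked directly (a throwaway illustration; the quantities chosen are arbitrary):

```python
# "Nano" denotes one billionth of the base unit.
NANOS_PER_UNIT = 1_000_000_000

def to_base_units(value_in_nano):
    """Convert a nano-prefixed quantity to base units."""
    return value_in_nano / NANOS_PER_UNIT

# Two billion nanoseconds is two seconds:
seconds = to_base_units(2_000_000_000)

# A 50 nm nanostructure, expressed in metres:
metres = to_base_units(50)
```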
Diamond is lightweight, very strong and has a number of material properties that would make it an ideal choice of material for many items, from aeroplanes to cars. However, although its versatility and strength are ideal, its cost and availability are not. Nanotechnology may provide the answer to this by taking manufacturing down to the atomic scale. Manufactured products are made from atoms: if the atoms in coal are rearranged, the result is diamond; if the atoms in sand are rearranged, computer chips are born; and rearranging the atoms in dirt, water and air produces grass (Merkle, 1997). A shatterproof diamond could be purpose-grown to provide an ideal component in the electronics, manufacturing, and construction sectors.
2.7.4 Embedded intelligence
A number of industrial applications are beginning to emerge that exploit the newly emerging Internet capabilities of embedded systems. Embedded systems differ markedly from desktop systems, being fitted with just enough functionality to handle a specific application, enabling them to be produced at low cost. Such systems have more limited processing speed, CPU power, display capability and persistent storage. The challenge for developers is to produce embedded systems that are able to provide network functionality within these constraints.
The future is one in which all electronic devices are ubiquitous and networked, with every object, whether physical or electronic, electronically tagged with information pertinent to it. The use of physical tags will allow remote, contactless interrogation of their contents, enabling all physical objects to act as nodes in a networked physical world. This technology will benefit supply chain management and inventory control, product tracking and location identification, and human-computer and human-object interfaces. In the construction sector, auto-ID technologies will have a huge impact on the supply chain, the design and construction process, and facilities management (Marsh et al., 1997).
2.8 People, safety and health
2.8.1 People in a two-speed world
We have a two-speed world with a widening gap between the haves and the have-nots. Large areas of the world have missed out on the information revolution, threatening to widen the gap between rich and poor (see Figure 4). We need to bridge the digital divide. According to a World Bank report, a global explosion of knowledge is underway which may lift hundreds of millions of the world's poor out of poverty, or it may create a widening knowledge gap in which poor countries lag further and further behind.
The richest 20% of the world's people consume 86% of all goods and services while the poorest 20% consume just 1.3%. The richest 20% consume 45% of all meat and fish, 58% of all energy used and 84% of all paper, have 74% of all telephone lines and own 87% of all vehicles.
The three richest people in the world have assets that exceed the combined gross domestic product of the 48 least developed countries.
Two-thirds of India's 90 million lowest-income households live below the poverty line.



The estimated additional cost of achieving and maintaining universal access to basic education for all, basic health care for all, reproductive health care for all women, adequate food for all, and clean water and safe sewers for all is roughly US$40 billion a year, or less than 4% of the combined wealth of the 225 richest people in the world.
The message for construction organisations is that more focus will be required on
regional markets. For example, China has the knowledge and capacity to build innovative
and complex structures, but it lacks the finance and the managerial efficiency. Hence,
finance and managerial systems help to bridge the gap. The developing world needs
appropriate technology, rather than leading edge advanced technology. Local power
generation, waste water treatment, and fresh water supply will need to be designed for
local provision. Affordability is key, both of the capital plant and of the community's ability to pay for the service.

Figure 4. The widening gap between the richest and the rest.
Human capital is an increasingly important asset; the tacit knowledge of a business rests
within its workers. Therefore, the health and work environment of construction workers
needs to become more important. The overall cost of accidents and near misses on a
typical building site can amount to some 8.5% of the contract price; applied to the UK's £84bn annual output, this is a significant cost (Minister of State for Work, Department for Work and Pensions, 13 September 2003). An HSE report calculated that one third of all work fatalities happen in construction, and that construction workers are six times more likely to be killed at work than employees in other sectors (HSE, 2003).
New construction processes will lead to greater mechanical assistance for construction
workers and the elimination of dirty, dangerous and debilitating activities through the
provision of advanced mechanisation. They will benefit safety (due to better ways of
working) and job satisfaction (due to changes in the nature of the work accompanied by
new rules for site management procedures).
Short-term contracts, self-employment and job mobility will increase, creating
demands for personal pensions and rental stock. Teleworking will increase, but human
interaction will remain fundamentally important.



New employment patterns are emerging, and the old idea of the employer and employee is becoming obsolete. No one can feel secure in the sense of lifetime employment. Only those who learn new skills will achieve long-term employability. Service providers are growing in importance, with outsourcing to specialist providers.
2.8.2 Safety
A focus on safety from a design and construction perspective by companies is
encouraged by insurance companies and legislation, and is important to employers,
employees, and public attitudes. Ultimately, safety by design will be viewed as part of the
normal design process. Accident and illness prevention plans need to be built into
schemes at the design stage in response to design-led safety information required by
clients. Scheme safety requirements will also include information feedback reporting to
originating scheme designers and to a master industry reference database.
Training, advances and greater use of personal protective equipment and clothing, and
using technology will combine to make the construction process safer. Better safety
policies and regulations will control risks associated with construction sites and
environmental decisions.
Virtual reality will simulate site working environments for safety training and to help minimise vehicle movements and risks in general. Modular design, offsite prefabrication, intelligent site vehicles and use of robotics will reduce the number of traditional tradesmen required, leading to fewer people on-site and a reduction in accidents. Automation will also reduce the need for scaffolding and the number of people working at height. More off-site work could tackle the problems of quality, safety and speed of construction.
2.8.3 Health
Over 1 billion people in the world are without safe drinking water. Almost 3 billion people in developing countries (roughly half the world's population) are without adequate sanitation. Technology has the solutions to provide safe drinking water, but cost is the issue.
2.9 Vulnerability, security, corruption and crime
Different designs are being studied that minimise the impact of bomb-related threats.
Structures are being designed such that a column collapse would only result in the
collapse of a single floor or area without causing the collapse of the floors below it.
Reinforcement of the columns in existing buildings by the use of fibre glass or carbon fibre materials is being researched, as is how to minimise the impact of shattered glass. Experts are investigating the effects of the introduction of an aerosol agent into the heating, ventilation, and air-conditioning (HVAC) system, and the development and installation of devices designed to kill microorganisms or filter harmful agents.



2.9.1 Corruption
Levels of investment, both foreign and domestic, depend on the quality of a country's business environment. The business environment is, among other things, a function of the rule of law, in particular the stability of rules and regulations governing business transactions, political stability and transparency. Corruption increases the uncertainty of doing business because it erodes the rule of law and is associated with high levels of bureaucratic red tape. Some describe corruption as a tax that adds to the cost of doing business. Various business surveys have concerned themselves with the prevalence of corruption in everyday business operations. An empirical analysis of transition economies in Eastern Europe and Central Asia showed that investment levels in countries with high levels of corruption were 6 percentage points lower on average than in countries with medium levels of corruption (21% and 27% respectively) (The World Bank, 2000).
2.9.2 Crime
Crime is a growing industry with crime and terrorism becoming increasingly important
for the built environment. The events of September 11th have highlighted the importance
of life safety. Prior to that, building protection related to terrorism primarily focused on
the threat of bombs detonated inside vehicles. There is now a more extensive range of
threats, particularly those of a biological and chemical nature.
The best way to predict the future is to create it; ignore the future at your peril!
We have enormous potential for the future. This includes technology, improvements in
communication, availability of capital, and increases in the quantity and availability of
information and knowledge. These require a capacity to invent and seize opportunities,
and innovative thinking. Innovation is the means by which firms can exploit change as an
opportunity for a different business or service and gain a competitive advantage.
The drivers above relate to a snapshot in time; they will change over time in importance and impact. The impact on the developing world will be different from that on the developed world. For example, in the developing world the results of desertification, deforestation, hunger and deprivation will all ultimately impact the developed world. For design and construction organisations they represent both a threat and an opportunity.
REFERENCES
ACRES (2002) Affordable composites from renewable sources. University of Delaware, Center for Composite Materials, USA.
Cabinet Office (2001) Transport: trends and challenges. Performance and Innovation Unit, Cabinet Office, Her Majesty's Government, 13 November 2001.
Chatham House Forum (1998) Open Horizons: Report from the Chatham House Forum. Royal Institute of International Affairs, London. ISBN 1-86203-094-4.
Jain, A.K. and Sirkis, J.S. (1994) Continuum damage mechanics in piezoelectric ceramics. In Garcia, E., Cudney, H. and Dasgupta, A. (eds), Adaptive Structures and Composite Materials: Analysis and Application. ASME 1994 International Mechanical Engineering Congress and Exposition, Chicago, November 6-11, pp. 47-58.
Marsh, L., Flanagan, R. and Finch, E. (1997) Enabling technologies: a primer on bar coding for construction. The Chartered Institute of Building. ISBN 1 85380 081 3.
Merkle, R.C. (1997) It's a small, small, small, small world. MIT Technology Review, Feb/March.
OECD (2001) The Well-Being of Nations: The Role of Human and Social Capital. Paris: Organisation for Economic Co-operation and Development (OECD).
RAC Foundation (2002) Motoring towards 2050: an independent inquiry. RAC Foundation for Motoring, London.
Strassmann, P.A. (1998) The value of knowledge capital. American Programmer, 11(3), pp. 3-10.
The Population Institute (2004) Website: www.population-institute.org
UN (1999) World Urbanization Prospects: The 1999 Revision. Population Reference Bureau, UN.
UN (2004) World population trends on web site:
UN Population Division (1996) World Urbanization Prospects. New York, 1996.
Urban Task Force (2000) Our Towns and Cities: the Future. Urban White Paper, Office of the Deputy Prime Minister, London, UK, 183pp.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Vectors, visions and values

Research Institute for the Built & Human Environment, University of
Salford, Salford, UK
ABSTRACT: This paper explores the circumstances which are coming
together to produce revolutionary change to the way in which construction
processes are being exercised. It argues that we are close to the tipping
point where a small change can have a dramatic effect. It then goes on to
explore the nature of change in this context and the issues related to
research and development which will aid this process. It argues that for change to occur there must be technological development which can be measured and can give a sense of direction; then there needs to be a vision to provide the will to make things happen; and lastly the change must be in line with the values which a particular society holds dear.
Vectors, visions and values lie at the heart of the changes which the research community must address, but perhaps the greatest of these are the values.

One of the pre-occupations of this age is the desire to see into the future. This is
understandable because the speed of change is so great that if you do not prepare then
you begin to lose out in some way. This is particularly true of organisations and the
concept of the learning organisation (Senge, 1990) to prepare for change is now an
established metaphor for this preparatory process. We need to learn in advance in order
that when change occurs we have the tools and culture to adapt to its requirements.
This has been taken a stage further with foresight studies where the scientific and
technological base of whole countries has been marshalled to examine future possibilities
and to prepare a research agenda to match. Over thirty countries have undertaken such
exercises over the past thirty years and many have found it enormously helpful. In many
cases it has been the process that seems to have been the great benefit. To get several
hundred experts to engage in such a process begins to change the culture of the country
towards a desire for self improvement.
Within such foresight exercises there have often been sector groups looking at the needs and possibilities for major industries, and of course construction, being one of the major manufacturing industries of the world, has received due attention. Flanagan and Jewell (2003) summarise the results of such exercises (see Table 1). Some aspects need to be
interpreted because, for example, Information Technology may be assumed by some
countries to be embedded in all the various aspects and therefore it does not necessarily
require to be shown as a separate item. However it is, of course, a major issue. Likewise, the improvements in process, whether design, manufacture, assembly or occupation, can be found within many of the assumptions made about where improvements will occur.
This kind of exploration, sometimes using scenario planning (Ratcliffe, 2004), is healthy for any discipline and reveals the maturity of the industry in terms of its realisation of, and willingness to, change. It can be argued that once a corporate view takes hold, caused by sufficient people seeking and adopting the new view, change can be rapid and revolutionary. It may be that construction is reaching such a point when it comes to the adoption of Information Technology and process improvement.
Malcolm Gladwell (2001) in his international best seller entitled The Tipping Point
identifies a phenomenon whereby an activity or a technology suddenly emulates the kind
of behaviour that we see when we talk of an epidemic in medical terms. It is a significant
point in time when there is a dramatic moment when everything can change at once. The
situation moves from incremental to revolutionary change in what appears to the observer
a very short space of time. Gladwell attempts to identify three characteristics required for
this phenomenon. Firstly, contagiousness, where the concept or idea suddenly becomes accepted wisdom and produces a new paradigm which the vast majority follow.

Table 1. Comparison of Foresight issues from various countries (Flanagan and Jewell, 2003). [The body of the table was lost in extraction; the columns were Australia, Canada, Finland, France, Germany, Ireland, Singapore, Sweden, UK and USA, and the recoverable row topics include repair and maintenance, processes, project management, environment/whole life, and codes.]
Secondly, a period where little causes can have big effects; and thirdly, where change happens not gradually but at one dramatic moment. He applies this to many instances where social behaviour becomes revolutionary, but the same can also be said of technology.
It was the introduction of the personal computer which suddenly made the power of
that computational machine available to the masses which in turn led to changes in
communications and the way people undertook many of their normal activities whether it
be leisure, or communication with friends or purchasing travel tickets or discovering
knowledge. The world changed in the space of less than one working lifetime to
something quite new. Partly it was contagious as the word was passed on as to what this
technology could do for the everyday life of people and once imparted it was difficult to
stop. Partly it was the fact that a relatively small but significant piece of software, the
internet, enabled people to access knowledge and interact with it through the machine at
their office or their home. Partly it was the dramatic possibilities which were seen suddenly by so many that helped create a critical mass of activity which brought the
investment, intellectual capital and imagination to produce the information infrastructure
we have today. Of course there were many factors which aided and abetted the change
but viewed from a distance these major drivers created an epidemic in human behaviour
which still continues today.




So what happened to the Construction Industry and the application of Information
Technology? Here is an industry which appears ripe for reaping the rewards of improved
communication. It requires vast stores of inter-disciplinary knowledge, it can be aided
enormously by visual imaging of a finished product and the simulation of performance
when at the present time the cost of physical prototyping is just too prohibitive. The recent short-term forecasts for when the industry might get its act together, e.g. when its supply chain will come on-line, have all proved much too optimistic. There have been
significant mini epidemics, for example when contractors of all sizes suddenly found the
benefit of the mobile phone to communicate in a geographically distant and often dirty
and noisy environment. The industry was one of the first to take this technology on board
in a big way. But what about the big changes, where collaborative working in design, manufacture and operation is seen and exercised through a virtual model for the benefit of all stakeholders in the process; where remote sensing and control allows machines to manage and direct activity in what are often dirty and hazardous environments; where ordering and purchasing of all resources can be done electronically; where it is possible to try before you buy and know what you are going to get and why? The industry is sometimes described as the world's largest, but here you see this great industry locked into its craft technology, which in principle has not changed for millennia. The management of large projects has become more complex, as have some of the structures now designed (Gehry, 2002), which in many cases could not be built without the support of computer technology. However, the wide-scale adoption of
the machine to harness its power in a way that can be seen in, say, the aircraft industry, is
just not in place despite the excellent aspirations and investment made by enlightened
clients such as British Airports Authority. Where there is movement it comes from
collaboration between individuals such as the way in which the Frank Gehry Partnership
has worked with Dassault Systemes to adapt software originally designed for aircraft
design to meet the aspirations of one of the world's great architects. It is interesting to see
that it was another industry that provided what was needed to achieve a new free form
structure which has excited the world.
These breakthroughs are relatively minor outbreaks of a benign driver which pave the
way for what might be. The epidemic is still to come. There are signs that mass breakout is possible soon, and this conference identifies the work of some of the "thought leaders" in the field. It addresses what is happening, what might happen, what should happen and what should definitely not happen! Although the term "thought leader" seems to have Orwellian overtones, it does capture one important aspect. It identifies the power of
thought and the imagination to provide visions of the possible. This aids the first
ingredient of the tipping point, that of contagion. So what of the other two ingredients?
If we can identify little causes which can have big effects then we may be well on the
way to radical change. A view of the industrial and social world we live in provides the following trends which, coming together, might provide the spark for ingredient number two. As with all epidemics it is impossible to predict, but somewhere in the soup



of ideas and developments lurks a minor change which will revolutionise the way the
construction industry works.
Convergence: The last decade has seen a massive change in digital technologies, with all forms of media, whether visual imagery, radio, television, audio, personal computing or telephone communications, coming together in one digital representation. Mobile phones today have the capacity to bring most of these aspects together. It does not end there. Society across the world is changing and
despite resistance in some quarters there is much more sharing of knowledge leading
to a common or converging viewpoint which may in the long run lead to globalisation
of values. The seduction by western values is seen by many to be one of the downsides
of such open access which is controlled by a few. Will the construction industry come
together in a way we have never seen before?
Connectivity: Alongside the convergence through technologies has been the vast
increase in communication and the access we have in the developed world to all forms
of information. We can now be connected any time, any place, anywhere, and with the development of ambient computing this is going to extend still further. With connectivity comes contact, access and the inability to hold on to and protect specialist information for more than a short period. The hold of the professions and their fortresses of knowledge, protected by their examination systems and barriers to entry, begins to weaken, and the boundaries between domains of knowledge disappear. Connectivity
allows us to change quickly and for the virus of change to move through the
population unfettered, unleashing a contagion of ideas which can tip us into a new and
unknown situation.
Culture: As the technologies converge and connectivity allows the spread of the
contagious idea then it needs a receptive culture within which it is easy to breed. The
present generation of university leavers are the first cohort of graduates who have been
through the complete school system where information technology was an integral
part of the curriculum from the very first year of entry into education. To them it is the norm, whereas previous generations had to learn and absorb it, and re-learn systems to embrace change. Information technology is now endemic in society as a whole, and it is even stranger to be outside it than to be in it.
Creativity: Do computers release creativity or constrain it? In past generations the need
to standardise and formalise to use the machine was prevalent. Now this is changing as
the nature of the machine becomes more flexible and adaptive. There is still a long
way to go and the culture has changed so that there is mutual give and take between
machine and user to which both are becoming more accustomed. The games industry
is a leading example where the users speak the language and seldom seem to have to
read any rule book before they can participate at a high level. This natural take up
needs to extend to industries like construction.
Content improvement: As the content of what is provided through the technology
improves so it is more likely that more people will want to use it. When that content of
knowledge or access becomes indispensable for normal living then the technology also
becomes indispensable. In the developed nations we are getting close to this situation
as our finances, employment, consumer activity and so on are built around electronic processing. The construction industry has some way to go: it is a laggard in the race towards electronic business and falls sharply behind transport, banking and other sectors.
Collaborative working: When the stakeholders need to work together for maximum
efficiency and they are geographically separated then the drive for integrated
communication and sharing becomes paramount. In addition the real benefits often
arise when the stakeholders work together and it is just not possible for one organisation to act alone. Airline ticket booking would not be as successful if each company developed its own system which could not speak to the others. Where the benefit is of this nature it may be necessary for government or a major player in the software industry to take the lead. In addition there must be willingness for all parties to work together in pre-competitive research to establish the necessary common standards.
Content: With the growing developments in the hard technologies comes an increased impetus to provide the content for users to find the technology even more useful. The
entertainment industry has been one of the first to realise the potential for extra
services and education is following close behind, often using the same technology. It
has been argued that the distribution networks required for the content may create a
monopoly of knowledge, not unlike the half a dozen or so global film distributors who
control the films made available to us for general viewing. This could be dangerous as
we then leave the access to knowledge and the values that the knowledge conveys in
the hands of a few.
Cost reduction: As quickly as a new refinement to the technology takes hold then an
improved version is produced. This highly competitive market creates a leapfrogging
effect which sometimes leaves the purchaser bewildered and unable to invest without
substantial risk. However the overall impact is for more computing power to become
available to each individual which in turn enables him or her to do more for the same
cost and in some cases to be more flexible in their use of the technology, thus
removing some of the barriers to use.
Common Standards: This may be a temporary factor in the tipping point agenda. The
technology is moving so fast that the hurdles we see now to inter-operability are likely
to disappear and the issue will become unimportant. However for the time being the
move towards standards for inter-operability such as the Industry Foundation Classes
(IFCs) is opening the opportunity to exchange information and to integrate processes
together. This in turn allows the collaborative working around a single model which
has long been the holy grail of the IT model builders.
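To make the common-standards point concrete, the sketch below shows what file-level interoperability via a neutral format can look like: a minimal, hypothetical reader that pulls the instance id and entity type out of a single line of an IFC STEP file. The line content, GUID and attribute values are invented for illustration, and real IFC lines carry many more attributes than shown here.

```python
import re

# A single (simplified) data line from an IFC STEP file, the neutral text
# format behind the Industry Foundation Classes. Values are hypothetical.
line = "#12=IFCWALL('2O2Fr$t4X7Zf8NOew3FLOH',#5,'Basic Wall',$,$);"

def parse_step_line(step_line: str):
    """Extract the instance id, entity type and raw arguments of one line."""
    match = re.match(r"#(\d+)=([A-Z0-9]+)\((.*)\);", step_line)
    if match is None:
        raise ValueError("not a STEP data line")
    instance_id, entity, raw_args = match.groups()
    return int(instance_id), entity, raw_args

instance_id, entity, _ = parse_step_line(line)
print(instance_id, entity)  # prints: 12 IFCWALL
```

Because the format is an open standard rather than a proprietary file, any IFC-aware application can dispatch on the entity type in the same way, which is precisely what allows heterogeneous tools to collaborate around a single model.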
We may well find within the above list that key activity which will tip the balance and
bring the construction industry to the fore in e-business. It is likely to be a combination of
many of the above but one new development could well take us into a new digital
craftsmanship to replace the old. If this is about to happen, and many think the time is ripe, then we need to consider future possibilities and what it might be like to live in this new
world. What will be the advantages and the pitfalls? To do this we need to consider the
manner in which we approach the subject. This can be considered under three headings
namely, vectors, visions and values. All three share a degree of inter-dependence but all
three have significant lessons to teach us.



A vector is defined in one dictionary as "a quantity completely specified by magnitude and direction". It is the realm of quantitative research and the fruitful field of the PhD student. By measuring and refining and postulating and experimenting we see how we might change the status quo and determine what factors contribute to our understanding of an item or aspect. Even in qualitative research we often seek the quantification of our findings through surveys and other mechanisms, although in most cases we would hesitate to say we have completely specified the magnitude. It is also the field of systems which specify how things behave, often in an integrated way. It is the mechanism by which scientific method has enabled us to advance. By nature it tends to be reductionist, reducing the problem to something which we can handle and understand, although often we lose the impact on other aspects of the world we live in. It is often the field of the short term, dealing with problems as we see them today.
If we take the information technology developments we can see how harnessing the
technology coupled with an understanding of science to aid in imagination, manufacture
and use has produced significant developments. If we link this with the idea of direction
then we move into what will form the next research agenda.
The European Fifth Framework project called ROADCON (Strategic RTD Roadmap for ICT in Construction, 2001) attempted to identify where we are now in terms of IT and the roadmap of where we should be going: the direction. This is a summary of what was listed:
Current: These are dedicated to specific engineering functions and project/building life cycle stages.
Future: Total life cycle appraisal supported by user-friendly functional applications
and persistent data ensuring holistic decision making.
Products and Components
Current: Have little added value to the building operation.
Future: A mixture of high and low value components acting intelligently.
Knowledge re-use
Current: Experience and previous solutions are available in personal and departmental archives, but new solutions are regularly re-invented in every project.
Future: Industry-wide sharing of experiences and fundamental understanding of complex systems interacting at all levels.
Information access
Current: Company and project data available via LANs and web based technologies.
Future: Ambient access provided, anytime, anywhere, by industry wide
communications infrastructure, distributed and embedded systems, ambient
intelligence and mobile computing.
Project Information and Communication technologies



Current: Based on ICTs which augment the creation and sharing of human-interpretable information.
Future: Based on model based ICT enabling context awareness, automation,
simulation and visualisation based on computer interpretable data.
Nature of technology
Current: Invasive technology where the user has to adapt to proven and emerging technologies.
Future: Technology is human-centred based around design and build paradigms
promoted by ICTs that enhance the social condition of individuals in the society.
Data Exchange
Current: Available at file level between different applications and companies based
mainly on proprietary formats at low semantic level.
Future: Flexible inter-operability between heterogeneous ICT systems which allows
seamless interaction between all stakeholders.
Business processes
Current: Business processes are driven by lowest cost, but there is a growing
awareness of customer perceived value which is not supported by prevailing
business models.
Future: Performance driven process assuring compliance with clients' requirements
and emphasis on customer perceived value.
Collaborative teams
Current: Teamwork between distributed experts in participating companies is
supported by web-enabled document management systems in project web sites.
Future: Virtual teams combine distributed competences via global collaboration
environments that support cultural, linguistic, social and legal transparency.
Systems Flexibility
Current: ICTs require customisation to meet the varying needs of users and have to be tailor-made for new situations, requiring manual maintenance, configuration and support.
Future: Adaptive systems are created which learn from their own use and user
behaviour, and are able to adapt to new situations without manual maintenance,
configuration and support.
The above list suggests where development might take place to overcome some of the
difficulties we face today and provide a working environment which is more finely
attuned to the needs of human beings. It is technology-centred, looking for technical solutions. To obtain these solutions, quantitative measures are needed for science to produce the tools and the technology to make use of scientific discovery. Much will be
based on an understanding of the natural sciences and the engineering necessary to make
the science useful. In this broad sense the work is in the realm of the vector, quantities



completely specified in magnitude and direction. Without measurement, and direction through an understanding of what is possible, these advances could not take place. But is
this all? What else needs to be considered? Here we come to the realm of the vision.
It is possible to have an over-abundance of technical solutions without change occurring. In the early days of information technology the term "solutions in search of a problem" was often used. By this was meant that the technology was advancing so fast that it was outstripping the ability of society to assimilate it in a meaningful way. Often its use was lost on the community it was meant to benefit, or worse, the creator had designed something which genuinely had no use for the foreseeable future. In the former
case it is critical that society has some vision of what it wants to achieve in order for it to
take advantage of the new tools. To do this it needs a vision.
One dictionary definition of vision is "intelligent foresight". In this sense, then, the
intelligence gathered from the vectors can be used to give an insight into the future. The
difference between foresight and forecasting is that forecasting attempts to predict the
future (whether it is events, technological advances or expected dates for occurrence)
whereas foresight tries

Table 2. Visions and themes for the Australian Construction Industry 2020 (Hampson & Brandon).
to provide guidelines for policy makers about the directions they should follow. In one case (forecasting) the industry asks "how do I respond to these events?", knowing it is powerless to do anything about them; in the other (foresight) the industry asks "what do I need to do to achieve these goals?" It is the difference between saying the future is inevitable and we just have to predict what will happen, and saying that we can influence the future, that we are not just helpless bystanders.
The most recent foresight study is the Construction 2020 Vision arranged through the CRC for Construction Innovation, based at Queensland University of Technology, Brisbane, Australia,
involving all the major organisations and professions in the industry. Several hundred
people attended workshops and completed questionnaires in which they identified their
vision for an improved Australian Construction industry. The final summary report of
these deliberations (Hampson & Brandon, 2004) reveals the integrated nature of the
aspirations of the industry. Table 2 shows the broad outline of the nine visions or themes
distilled from all the responses made. On one axis it can be seen that it is the environment in which construction takes place which is the key issue. (Environment here means the complex of social and cultural conditions affecting the nature of an individual or community.) These include the needs of the workforce, a sustainable environment, responding to clients' needs, an improved business environment, and
research and development. On the other axis are the technologies which might well aid
the improvement in the environments identified and these include process issues and
those related to ICTs. The strength of relationship does of course vary between the two.
What is interesting here is that it is not the technologies which dominate. In fact in the
analysis of responses it was the improved business environment and environmentally
sustainable construction which headed the list. The technologies, although considered
important, were seen as a means by which the other issues could be achieved. In other
words, it was the people issues which were really considered to be important, whether it
was now (as in the case of the business environment) or in the future (as in the case of
sustainable development).
This suggests that visions of the future, as expressed in people's aspirations, are more
about quality of life rather than mere technological advance. This may well be something
we should note as we invest our time and energy into issues of self improvement. In fact
the drive is towards values rather than solutions to current technological problems. This is
also more evident as people are asked to look into the longer term future rather than the
short and medium term. What we see is a shift to values the more we leave the baggage
of the present behind.
At the heart of values are the belief systems to which we hold. These in turn arise from, or are created by, the culture in which we live. In democratic societies, at least, these
are partially enshrined in the legislation and regulation which the people have determined
to represent those values. Whilst in past times these matters were largely stable and often
confined by national or other boundaries, this is not so true today. The internet and other
technologies do not recognise such boundaries and can pose a threat to those who hold
strong beliefs. We are moving into a period when values are becoming a key issue in
development and world politics as globalisation begins to be the mantra of the many.



When we consider the research agenda for countries the question of values is often
forgotten in our desire to improve the systems and technologies with which we work.
When the Australian community calls for a better business environment, what is it calling
for? Does it mean more profits for all and if so does this mean that someone else will
suffer? Does it mean a fairer distribution of risk, in which case who wins and who loses, assuming the present system is unsatisfactory? Does it mean that those with technology win and those without lose? It is a very complex issue, but one that is fundamental to the
well-being of the people we seek to serve. Our research cannot be undertaken, and a new
tool produced, without considering whether people want it, whether it has negative as
well as positive contributions to make or whether it supports or undermines the values of
the society in which it is to be used.
These matters are critical in the information sciences. Knowledge is not neutral; it empowers some and can disempower others. At the same time, the technologies used to convey knowledge use models which, by definition, are not fully representative of the object or system they try to represent. They represent the item but they do not convey it
in its entirety. We are moving to the creation of a virtual world where we aim to create
reality within a machine. As we move in this direction we begin to touch on some very
key and sensitive issues. How do we really know that this new world truly reflects our
own? Even if it does, are we interpreting it in the right way? In the real world a mistake affects only a small number of individuals, and changes can be made and the model adapted.
Computer models on the other hand are designed to be used time and time again by many
people who do not necessarily communicate with each other. Mistakes become fossilized
and values become frozen to the point where an oppressive tool may have been created.
The author, as a programmer many years ago, was concerned by some of the knowledge he was placing in computer programmes. In several programming languages the expression IF…THEN was common: IF a certain set of circumstances existed THEN a certain action was taken. At the time we were writing into the programme well-recognised techniques and best practice, but what if our knowledge increased, or society did not want to implement that action when that set of circumstances occurred? In a simple program it could be changed, but not before many had used it or still continued to use it. In a complex program the piece of knowledge became embedded so deep that it was often impossible to find it, extract it and change it. It became part of the system and it was almost impossible to detect the manner in which it influenced the full model or system.
This became even more acute when Knowledge Based Systems came into being. We
captured the knowledge of experts and we made that available to those who were less
expert. The knowledge of the expert and to some extent his or her value system was now
built into the model. We tried to devise ways round this by designating some knowledge
as stable (but who says so) and some as unstable and therefore made more explicit and
easy to change. This can work in relatively small systems dealing with focussed
applications but the trend is towards integrated systems and greater intelligence for the
machine. In other words we will be leaving more of the decision making to the machine.
What algorithms will the machine use and how many of these will represent values?
When we begin to have a conversation with the machine, how do we know what
mechanisms it is using to guide us towards a particular solution?
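One way of making the stable/unstable distinction explicit, sketched here with invented rule content and field names, is to hold the rules as inspectable data rather than burying them in code, so that each rule carries its source and its claimed stability:

```python
# Hypothetical rule base: each rule is data, tagged with its provenance and
# whether it is considered "stable", so users can inspect and revise it.
RULES = [
    {"name": "max_span_to_depth", "limit": 20.0,
     "source": "office practice 1984", "stable": False},
]

def check(span_m: float, depth_m: float, rules=RULES) -> list[str]:
    """Apply every applicable rule and report which ones are violated."""
    findings = []
    for rule in rules:
        if rule["name"] == "max_span_to_depth" and span_m / depth_m > rule["limit"]:
            findings.append(f"{rule['name']} exceeded (source: {rule['source']})")
    return findings

print(check(12.0, 0.5))  # the rule, and where it came from, stays in view
```

This does not answer who decides what is stable, but it at least keeps the embedded value judgements visible and changeable instead of frozen into the program logic.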



This is but one example of where technology is taking us into the value systems arena,
although some would argue that we have been there for some time. These are not trivial
matters. As we allow machines to intrude on our privacy and on our decision-making are
we going to be constantly challenging its reasoning powers as we do in debate and
conversation? How will we get all of us to buy in to what it is doing when the users are
not a coherent homogenous group who can exercise some kind of democratic power? We
are already talking about jacking computers directly into the brain. This raises even greater questions about the point at which the brain ceases to be human and becomes a machine, and about who provides its value system, man or machine?
This must seem like science fiction to some but it is coming upon us fast. In our
research we must ask the question about what we are creating and how this really ties in
with the aspirations of our fellow human beings to have a reasonable quality of life. There should come a point when every piece of research, but particularly research in terms of knowledge and processes, should require a set of questions to be asked about how it
impacts upon the society which it seeks to serve.
This paper has attempted to raise some fundamental questions about the research we do,
particularly in the field of information and communication technologies, but also in the
way we do it. It has recognised the great debt we owe to scientific method and the
reductionist approach which has provided advances from which we have all benefited.
This is the realm of the vector where measurement reigns supreme. It has also recognised
the importance of looking to the future to provide further direction to our efforts. Here
studies are finding increasingly that it is quality of life issues which now dominate, rather
than technology. Technology is seen as an enabler but needs to be kept in its place. The need
to know the general direction we are heading in is a key to investment and efficient
utilisation of resources. The faster the speed of change then the greater the need to
envision where we are going. With an increase in speed, so must the headlights become
stronger! As we move toward quality of life then we begin to embrace the values of
people and their aspirations. The vision for the future must address these issues. Finally
these values need to be subject to constant debate and exploration and the technology
must be sufficiently transparent and flexible to adopt the conclusions of the debate or else
we will create a monster of horrific proportions.
Whether these approaches result in the tipping point is unknown. It is likely that it is the combination of scientific method, scenario planning and a response to values which
will provide the changes that will see Construction move in a way which has been seen
by many industries. These issues around construction are now reaching a crescendo of
movement which seems to suggest that this point is near and we need to consider what
part each of us should play.
In conclusion, vectors underpin our understanding of the future and provide material
for our visions; visions allow us to provide scenarios in which we can mould events and
seek to match the aspirations of society; values underpin all that we do and unless these
are part of the foregoing processes then we may be undermining the very progress we are
trying to achieve. Values should dominate if the tipping point is to provide us with a



technological base which will be human-centred and serve humankind and our industry.
Flanagan, R. & Jewell, C. 2003. A Review of Recent Work on Construction Futures. London: CRISP Commission 02/06, Construction Research and Strategy Panel.
Gehry, J. 2002. Gehry Talks: Architecture + Process. USA: Universe Press.
Gladwell, M. 2001. The Tipping Point. UK: Abacus.
Hampson, K. & Brandon, P. 2004. Construction 2020: A Vision for Australia's Property and Construction Industry. Brisbane: CRC for Construction Innovation, QUT.
Ratcliffe, J. 2004. Imagineering the Future: the prospective process through scenario thinking for strategic planning and management; a tool for exploring IT futures. In Brandon, P., Heng Li, Shaffii, N. & Shen, Q. (eds), Designing, Managing and Supporting Construction Projects Through Innovation and IT Solutions. Malaysia: CIDB.
ROADCON: Strategic RTD Roadmap for ICT in Construction. 2001. European Fifth Framework Project (IST-2001-37278).
Senge, P. 1990. The Fifth Discipline: the Art and Practice of the Learning Organisation. London: Random House.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Help wanted: project information officer

University of British Columbia, Vancouver, BC, Canada
ABSTRACT: Innovations in information technologies promise significant
improvements in the effectiveness and efficiency of designing and
managing construction projects. Yet the new demands that these
information technologies create for expertise and management tasks may
be more than typical project personnel can accommodate. This paper
explores the potential for introducing a new role into the project team: that of a project information officer. The paper is organized in the form of a hypothetical job description for such a position. It first describes the duties of the project information officer relating to the implementation of
an information management plan, the specific project systems to be used,
and new approaches to overall project management. The paper then
discusses the organizational role, skills and qualifications, and the
compensation and evaluation issues for the position.

Current trends in information technology (IT) are yielding a wide range of new
computer-based tools to support the architecture, engineering, construction and
facilities management (AEC/FM) industries: everything from project collaboration
Web sites to virtual building environments. These tools promise great increases in the
effectiveness and efficiency of designing and managing construction projects. However,
no one claims that these improvements will come without cost in terms of new skills and
work tasks that will be required of many of the project participants. These new
demands often fall on senior project designers and managers. Yet the reality
of the AEC/FM industry is that these people will rarely be in a position to take on
such demands. They can typically be characterized as busy, highly effective people,
and in the spirit of putting first things first (Covey 1990), taking the time to learn and
implement new IT will rarely be at the top of their priority list, regardless of the expected
benefits. Moreover, trends towards the integration of information resources create new
requirements for project-wide information coordination, which must be administered by
someone. To address these practical barriers to IT innovation, we suggest that a new role
is required for AEC/FM projects: that of a Project Information Officer. This paper
explores the anticipated roles and requirements of the Project Information Officer in the
form of a hypothetical job description for such a person.




A position is available for a Project Information Officer (PIO) for a large AEC/FM
project. The PIO will be responsible for the overall information management on the
project, information technology strategy and implementation, information integration and
coordination for the project, and related training activities for project participants.
The duties required of the PIO are organized into three main categories: implementing an
information management plan, project systems and areas of expertise, and assisting in the
implementation of a unified approach to project management. Each of these is discussed
in the following sections.
3.1 Implementing an information management plan for the project
The PIO will be responsible for implementing an overall information management plan
for the project. The information management plan will address three primary elements:
project tasks, information transactions, and overall integration issues. For each of these
elements, the plan will analyze information requirements, design information
management solutions, and produce specific information management deliverables. Each
of these tasks is described more fully below. The level of detail required for the
breakdown of project tasks and transactions described below will be as needed to achieve
an effective overall project information management system. In general, this will be at a
level where distinct work packages interact with each other, not the level at which work
is carried out within the work packages themselves (for example, it will address the type
and form of design information that must be sent to the general contractor, but not the
way that individual designers must carry out their design tasks).
3.1.1 Elements of an information management framework
The information management plan is based upon an overall information management
framework that adopts an underlying process model for AEC/FM projects. This model
views projects in terms of the following elements (illustrated in Figure 1):
A collection of tasks carried out by project participants (all tasks required to design and
construct the facility, including tasks relating to archiving project information,
providing information to facility users/operators, etc.).
A collection of transactions that communicate information between tasks.
A collection of integration issuesissues relating to the interactions between the tasks
and transactions as a whole rather than as a set of individual elements. This also
includes issues relating to information integration across organizational boundaries,



integration of legacy and existing technology with plans for new and future
technology, and so on.
The model considers these elements across all project participants. The
information management tasks described below are carried out for each of these
project elements.
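The framework's three elements lend themselves to a simple data model. The sketch below is illustrative only (class and field names such as `Task`, `Transaction`, and `unresolved_inputs` are ours, not part of the paper's framework): it records tasks with their information inputs and outputs, transactions between tasks, and one integration-level check, namely finding task inputs that no transaction supplies.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    inputs: set = field(default_factory=set)    # information required by the task
    outputs: set = field(default_factory=set)   # information the task produces

@dataclass
class Transaction:
    sender: Task
    receiver: Task
    content: str  # the information item communicated

def unresolved_inputs(tasks, transactions):
    """Integration check: task inputs not supplied by any transaction."""
    supplied = {(t.receiver.name, t.content) for t in transactions}
    return {(task.name, item) for task in tasks
            for item in task.inputs
            if (task.name, item) not in supplied}

design = Task("architectural design", outputs={"design model"})
build = Task("construction", inputs={"design model", "schedule"})
txns = [Transaction(design, build, "design model")]
print(unresolved_inputs([design, build], txns))  # the schedule has no sender
```

An analysis of this kind would be run at the work-package level described above, not within individual tasks.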
3.1.2 Analysis of information management elements
As the first step in developing the information management plan, the PIO will analyze
each element (tasks, transactions, and integration issues) to assess the overall information
requirements, as follows:
Define each task, transaction, or integration issue, including identifying participants,
project phase, etc. This should correspond largely to an overall project plan, and thus it
may not need to be done as a distinct activity.
Assess the significant information requirements for each element: Determine, in general
terms, the type of information required for carrying out the tasks, the information
communicated in the transactions, or the requirements for integration issues. With
traditional information technologies, information requirements generally correspond to
specific paper or electronic documents. With newer information technologies,
however, information requirements can involve access to specific data sources (such
as shared databases) that do not correspond to traditional documents.
Assess tool requirements: Determine key software applications used in carrying out
tasks, communication technologies used for transactions, or standards used to support
integration.

Figure 1. Elements of an information
management framework that considers
projects in terms of tasks, transactions,
and overall integration issues. From an
information perspective, tasks are
associated with computer applications
and transactions are associated with
communication technologies.
Assess the information outputs: Determine the significant information produced by each
task. This typically corresponds to information required as inputs to other tasks.
3.1.3 Design of information management elements
Given the analysis of the project information requirements (as established in the previous
section) the PIO will design the information management strategies and solutions for the
project and formalize the requirements that the overall plan will place on each of the
project elements, as follows:
Formalize information input and output requirements: the requirements analyzed
previously will be formalized as the information required as inputs for each task, and
the information that each task must commit to producing.
Requirements for tools, technologies, standards, etc.: establish the basic requirements,
constraints, and recommendations for the software tools used to support individual
tasks, communication technologies used for transactions, and data standards adopted
to support integration, etc.
Staffing requirements: define roles and responsibilities relating to information management.
Work practices and procedures: establish requirements and constraints on how various
work tasks are carried out in order to assure information management requirements
can be met.
In deciding from among alternative solutions for the above strategies, an overall cost-benefit analysis approach will be followed. This may not be a straightforward process,
however, since the costs involved in improving information management elements may
be incurred by parties that are different from those that receive the resulting benefits.
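The caveat above, that costs may land on different parties than the benefits, can be made concrete with a toy tally. All figures and party names below are invented for illustration:

```python
# Hypothetical figures: each candidate solution lists per-party costs and benefits.
options = {
    "shared model server": {
        "costs":    {"architect": 40, "contractor": 10},
        "benefits": {"architect": 15, "contractor": 60},
    },
}

def net_positions(option):
    """Net benefit (benefit minus cost) for each party involved."""
    parties = set(option["costs"]) | set(option["benefits"])
    return {p: option["benefits"].get(p, 0) - option["costs"].get(p, 0)
            for p in parties}

for name, option in options.items():
    nets = net_positions(option)
    total = sum(nets.values())
    losers = [p for p, n in nets.items() if n < 0]
    # Positive overall, yet the architect is a net loser: the PIO may need
    # to rebalance costs before the solution is acceptable to all parties.
    print(name, total, losers)
```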
3.1.4 Deliverables of the project information officer
The PIO will be responsible for the following specific outputs:
Information management plan documents: The requirements for information
management strategies and solutions as described in the previous section will be
formalized into a documented information management plan for the project. This will
include the minimum requirements that individual tasks and participants must meet,
and additional optional recommendations.
Implementation of the information management plan: The PIO will be responsible for
all aspects of implementing the information management plan. This includes
coordination among all key project participants (for example, holding regular
information management coordination meetings), carrying out administrative duties
for the plan, monitoring conformance and results, and so on.



Training: The PIO will organize the training necessary for project participants to carry
out the information management plan. This will be especially necessary in the case of
new information technology.
Provide project information technology resources: The PIO will be responsible for
acquiring and supporting any information technology resources (computing hardware
and software) that are best provided for the project as a whole, as opposed to
individual participants (for example, this may include a project collaboration web site,
but not specific CAD software).
Provide information management and technology support for project participants: The
PIO will act as a resource available to all key project participants on issues relating to
information management and technology.
3.2 Project systems and areas of expertise
It is anticipated that the specific types of information systems used during the project will be
as described in the following list. The PIO is required to have a basic expertise in all of
these areas, and to include each of them within the information management plan.
Project document management and collaboration web site: a web site will be
established for the project that will act as the central document management and
collaboration vehicle for the project. This will include user accounts for all project
participants, access control for project information, online forms and workflows,
messaging, contact list, etc. A commercial service will be used to create and host the site.
Classification systems, project breakdown structures and codes, and folder structures:
much of the project information will be organized according to various forms of
classification systems. These range from the use of industry-standard numbering
schemes for specification documents, to the use of a project work breakdown
structure, to the creation of a hierarchical folder structure for documents placed on the
project web site. The PIO must have familiarity with relevant industry classification
systems such as OCCS (OCCS Development Committee 2004), and will be
responsible for establishing the project classification systems.
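One routine chore implied above is deriving the project web site's folder hierarchy from the chosen classification table. The following is a minimal sketch under stated assumptions: the table fragment merely mimics OmniClass-style numbering and is not taken from the real OCCS tables, and `folder_for` is our own illustrative helper.

```python
from pathlib import PurePosixPath

# Illustrative classification fragment (codes and titles are invented).
TABLE = {
    "21-02": "Substructure",
    "21-02 10": "Foundations",
    "21-02 10 10": "Standard Foundations",
}

def folder_for(code):
    """Build a nested folder path from each prefix of a classification code."""
    parts = code.split()
    prefixes = [" ".join(parts[:i + 1]) for i in range(len(parts))]
    names = [f"{p} {TABLE[p]}" for p in prefixes if p in TABLE]
    return PurePosixPath("project-docs", *names)

print(folder_for("21-02 10 10"))
# -> project-docs/21-02 Substructure/21-02 10 Foundations/21-02 10 10 Standard Foundations
```

Generating the folder tree from the classification table, rather than by hand, keeps the web site structure and the project codes from drifting apart.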
Model-based interoperability: many of the systems described below work with model-based project data, and have the potential to exchange this data with other types of
systems. The project will adopt a model-based interoperability approach for data
exchange for the lifecycle of the project. The PIO must be familiar with the relevant
data exchange standards, in particular the IFCs (International Alliance for
Interoperability 2004), and must establish specific requirements and policies for
project data interoperability. The PIO must also establish a central repository for the
project model-based data (a model server).
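IFC model data is exchanged as STEP (ISO 10303-21) text files. As a rough, stdlib-only illustration of the kind of sanity check a PIO might run on an exchange file, the sketch below tallies entity types in a STEP fragment; the sample content is invented, and a production workflow would use a proper IFC toolkit and a model server rather than regular expressions.

```python
import re
from collections import Counter

def entity_counts(step_text):
    """Tally IFC entity types in a STEP physical file's DATA section.
    Instance lines look like: #35=IFCWALL('GlobalId',...);"""
    return Counter(re.findall(r"#\d+\s*=\s*(IFC\w+)", step_text))

sample = """
#1=IFCPROJECT('0abc',#2,'Office block',$,$,$,$,(#20),#7);
#35=IFCWALL('1abc',#2,'W-01',$,$,#40,#50,$);
#36=IFCWALL('2abc',#2,'W-02',$,$,#41,#51,$);
"""
print(entity_counts(sample))  # Counter({'IFCWALL': 2, 'IFCPROJECT': 1})
```

Even a crude count like this lets the PIO compare what a sending application claims to have exported against what a receiving application actually imported.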
Requirements management system: a requirements management tool will be
used to capture significant project requirements through all phases of the project
and to assure that these requirements are satisfied during the design and execution
of the work.
Model-based architectural design: The architectural design for the building will
be carried out using model-based design tools (e.g., object-based CAD). Although
this improves the effectiveness of the architectural design process, the primary



motivation here is the use of the resulting building information model as input to
many of the downstream activities and systems.
Visualization: using the building information model, which includes full 3-D
geometry, there will be extensive use of visualization to capture requirements and
identify issues with the users, designers, and builders. This may include high-end
virtual reality environments (e.g., immersive 3-D visualization), on-site
visualization facilities, etc.
Model-based engineering analysis and design: the building information model
will be used as preliminary input for a number of specialized engineering analysis
and design tools for structural, building systems, sustainability, etc.
Project costs and value engineering: the building information model will be
used as input to cost estimating and value engineering systems. These will be
used at numerous points through the lifecycle of the project (with varying degrees
of accuracy).
Construction planning and control: the project will make use of systems for
effective schedule planning and control, short interval planning and production
engineering, operation simulation, resource planning, etc. Again, the systems will
make use of the building information model and will link into other project
information for purposes such as 4-D simulation.
E-procurement: project participants will make use of on-line electronic systems
to support all aspects of procurement, including E-bidding/tendering, project
plan rooms, etc.
E-transactions: on-line systems will be available for most common project
transactions, such as requests for information, progress payments claims, etc.
These will be available through the project web site.
E-legal strategy: project policies and agreements will be in place to address
legal issues relating to the electronic project transactions.
Handoff of project information to facilities management and project archives:
systems and procedures will be in place to ensure that a complete and efficient
package of project information is handed off from design and construction to
ongoing facilities operation and management, as well as maintained as archives of
the project.
3.3 Assist in the implementation of a unified approach to project management
It has been argued that there is a fundamental mismatch between emerging IT solutions
for AEC/FM and current project management practices (Froese and Staub-French 2003).
The IT solutions rely on a high degree of integration and collaboration, whereas current
practice makes heavy use of decomposition and modularization to minimize
interdependencies between project tasks and participants. IT developers must strive to
accommodate current practice, yet project management practice may also need to adapt
in order to take full advantage of the capabilities offered by emerging IT solutions.
The proposed project will adopt these modifications and use a unified approach to
project management. In this unified approach, the work is still carried out by defining
distinct work packages and assigning these to individual project participant groups.



However, there is much more emphasis on a continuously evolving project deliverable,

where this deliverable is initially the building information model, which expands with
new information over time until, during the construction phase, the building information
model is used to drive the production of the physical building itself. The focus of the
individual work tasks, then, is to draw necessary information from the building
information model and add new content back into the building information model or new
components into the physical building. Information technology plays a critical
component of this new unified approach to project management, since it relies on the
ability for project participants to collaborate on the building information model. More
specifically, the approach uses the following standard views as the primary conceptual
structures that are shared and are common to all project participants (these are illustrated
in Figure 2):
The project lifecycle view: a time-based view that organizes project information
into well-defined project phases.
The workflow view: a process-based view that organizes project information into
work packages and tasks.
The product/deliverable view: views project information in terms of the specific
information or physical components of the overall project.

Figure 2. Schematic illustration of
three primary views of project
information and their interrelationships
in a unified approach to
project management.
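The three shared views can be pictured as three different groupings over one shared pool of information items. The sketch below is purely illustrative (the items, phases, and package names are invented, and real systems would index the building information model itself rather than flat records):

```python
# Each record tags one information item with the three coordinates that the
# unified approach shares across all project participants.
items = [
    {"item": "wall layout", "phase": "design", "work_package": "architecture",
     "component": "exterior walls"},
    {"item": "wall layout", "phase": "construction", "work_package": "framing",
     "component": "exterior walls"},
    {"item": "HVAC loads", "phase": "design", "work_package": "mech-engineering",
     "component": "air handling"},
]

def view(records, axis):
    """Group the same items along one of the three standard views."""
    grouped = {}
    for r in records:
        grouped.setdefault(r[axis], []).append(r["item"])
    return grouped

print(view(items, "phase"))         # project lifecycle view
print(view(items, "work_package"))  # workflow view
print(view(items, "component"))     # product/deliverable view
```

The point of the structure is that the same item ("wall layout") appears under every view, so each participant can navigate to it along whichever axis matches their work.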
The PIO will work with project managers to help design and implement a unified
approach to project management that can fully leverage the opportunities offered
by the project IT.




The PIO may be an employee of the project owner, lead designer, or lead contractor
organizations, or may work as an independent consultant/contractor. Regardless of
employer, the PIO will be considered to be a resource to the project as a whole, not to an
individual project participant organization. The PIO will be a senior management-level
position within the project organization (i.e., not a junior technology support position).
The PIO will report to the owners project representative and will work with an
information management committee consisting of project managers and information
specialists from key project participants. Depending upon the size of the project, the PIO
will have an independent staff. In addition to the information management committee,
liaison positions will be assigned within each project participant organization.
Candidates for the position of PIO will be required to have a thorough understanding of
the AEC/FM industry, information management and organizational issues, data
interoperability issues, and best practices for software tools and procedures for all of the
major project systems described previously. Candidates will be expected to possess a
master's degree relating to construction IT and experience with information management
on at least one similar project.
Advanced construction IT offers great promise for improving the project effectiveness
and efficiency while reducing risk. Not all of these benefits directly reduce costs, yet the
overall assumption is that the costs of the PIO position will be fully recovered through
project cost savings. This will not be a direct measure, but will be assessed on an overall
qualitative basis through an information management review processes that examines the
following questions of the information management and technology for the project:
To what degree was waste (any non-value-adding activity) reduced?
What new functionality was available?
How efficient and problem-free was the information management and
technology relative to projects with similar levels of IT in the past?
What was the level of service and management effectiveness offered by the PIO?
What is the potential for future improvements gained by the information
management practices on this project (i.e., recognizing the long learning curve
that may be associated with new IT)?



The description of a PIO role and an overall project information management context
described in this paper is preliminary, incomplete, and overly idealistic. Many of the
tasks and technologies described here are currently in place on construction projects.
However, the position of a project information officer and information management
procedures of the nature described here could go a long way towards easing some of the
significant practical barriers that stand between emerging IT solutions and real
improvements to construction projects. Next steps would include collecting best practices
for information management on construction projects, further development and
refinement of an information management process, and greater inclusion of overall
information management practices as part of IT research and development projects.
Covey, S. (1990). The 7 Habits of Highly Effective People, Fireside: New York.
Froese, T. and Staub-French, S. (2003). A Unified Approach to Project Management, 4th Joint
Symposium on Information Technology in Civil Engineering, ASCE, Nashville, USA, Nov.
International Alliance for Interoperability (2004). IAI International Home Page, URL:
http://www.iai-international.org/iai_international/ (accessed June 3, 2004).
OCCS Development Committee (2004). OCCS Net, The Omniclass Construction Classification
System, web page at: http://www.occsnet.org/ (accessed June 24, 2004).


The next generation of eBusiness and eWork: what is needed for the systemic innovation?
An executive summary of the EU supporting research and innovation
Head of Unit DG Information Society, New Working Environments
ABSTRACT: The presentation builds on three main pillars. First, the
policy context of the EU is described, in the context of industrial
competition and the new innovation processes. Second, the presentation
looks in detail at the drivers that the knowledge, networked economy
brings to sustainable economic growth. Third, it describes the IST
research programme, as well as the new thinking of the EU regarding
new research policy instruments favouring the full deployment of the
European research capacity.
The EU has set its policy goals towards 2010 in the well-known Lisbon Agenda. It is an
important document from several perspectives: it sets the ambition of Europe towards
sustainable growth, competitiveness and high-quality jobs. The timeline was set to
incorporate the enlargement process which is halfway done now, increasing the social
cohesion policy implications.
However, the real issue is how to achieve the Lisbon goals. In the speech it is shown
that the need for breaking from past paradigms is evident, and this view is backed up
by some recent studies on the productivity growth due to new paradigms. The growth
reported from using modern ICT technologies in an innovative, systemic way is
several tens of percent, on average.
This brings challenges to organisational behaviour and its future development.
Virtual organisations have been talked about for tens of years, but do organisations
really exist that capture the advantage of being small, and thus flexible, while at the
same time being large, and thus effective? Not so many. Perhaps the
construction sector itself is leading the way towards the new organisation formats.
Nor can we forget the role of entirely new forms of organising work, such as in
and by professional communities rather than fixed organisations. Networking and
connectivity, together with advanced collaboration tools, lead to entirely new possibilities
to build virtual (e.g. design) teams across traditional boundaries, and even continents. The
24-hour continuous work paradigm is very close, and promising. What is required
though on policy and legislation levels? Do we need to reconsider the IPR issues when
approaching these new paradigms not to inhibit innovation?



Some key projects running in the field of new working paradigms are described to
illustrate the possibilities the technology is providing already today, not to talk about
The third part of the speech contains precise information on the experiences the EU
has from past projects in the field, the achievements and also the experiences of the first
and second call of the IST programme where themes like networked business and mobile
work were present.
The third call, which is closing these days, is elaborated from the perspective of the
long-term goals of the EU in constructing new undertakings and instruments to better
capture the whole innovation process, which has moved from sequential to strongly
parallel, and has become more dynamic and multidisciplinary than ever before.
Here a new approach to the building of the research and innovation agenda is
described. The unit New Working Environments of DG Information Society has started a
set of research communities, interacting in a multidisciplinary way. The communities
consist of industrial and research actors, policy makers and those who are needed to
cover the whole innovation process. This set of communities, called ami@work (Ambient
Intelligence at work), is now in the start-up phase, but already encompasses more than
600 actively involved people. The site can be found at www.amiatwork.com,
and you are all invited to participate in those communities closest to you.
An example of supporting the innovation process on a national basis is also described, to
illustrate the interactions needed for full efficiency. Industrial and research community
participation is encouraged to make the interactions and thus the whole innovation
process more effective.
The fourth IST call, which is to be published at the yearly IST conference (IST 2004,
held this year in The Hague, The Netherlands), is discussed, as is the whole IST research
work programme 2005-2006, which invests some 1.8 billion euro in IST research over
the forthcoming two years. New themes are approaching, and the background thinking
leading to them is described.
Last but not least, the path towards the 7th EU research Framework Programme is
described. The proposal from the Commission is to double the research investment, to
match the Lisbon Agenda goal of research investment reaching 3% of GDP.
New instruments like technology platforms are debated, as well as the balance
between the long-term (individual) research versus the current collaborative research
schemes. The state of the play and the rationale of the choices will be discussed. Also the
next steps opening participation possibilities for the industry and research sectors are
described to encourage the common way to meet the sustainable growth goals.

Product modelling technology


Virtual building maintenance: enhancing

building maintenance using 3D-GIS and 3D
laser scanner (VR) technology
V.Ahmed, Y.Arayici, A.Hamilton & G.Aouad
School of Construction and Property Management, University of Salford,
Greater Manchester, United Kingdom
ABSTRACT: The renovation and refurbishment market is rapidly
expanding within the construction industry, bringing the role of the
Facilities Management (FM) department to the forefront. Operating and
maintaining a facility however, takes the biggest proportion of the
lifecycle cost of a building, which can be costly and time consuming. The
widespread use of advanced technologies within the construction
industry can drive productivity gains by promoting a free flow of
information between departments, divisions, offices, and sites, and
between organisations, their contractors and partners.
The paper describes a scope in the INTELCITIES project undertaken by
75 partners including 18 cities (Manchester, Rome, Barcelona, etc),
20 ICT companies (Nokia, IBM, CISCO, Microsoft, etc) and 38 research
institutes (University of Salford from UK, CSTB from France, UPC from
Spain, etc) across Europe to pool advanced knowledge and experience of
electronic government, planning systems, and citizen participation across
Europe. The scope includes capturing digital data of existing buildings
using 3D laser scanning equipment and showing how this data can be
used as an information base for enhancing the refurbishment process and maintenance.
Furthermore, the paper discusses the state of the art for operating and
maintaining facilities, describing the prevailing methods of building
maintenance, highlighting their limitations with proposed alternatives,
such as 3D Geographic Information Systems (3D GIS), to enable the
spatial analysis and static visualisation of critical query outputs, and 3D
laser scanning technology for obtaining the digital information of existing
buildings for construction maintenance.

The new-construction market has been shrinking, while the renovation and refurbishment
market is rapidly expanding in the construction industry (Mahdjoubi and Ahmed, 2004).
Operating and maintaining a facility takes the biggest proportion of the lifecycle cost of a
building. The growing emphasis on lifecycle considerations through new forms of project



relationships, together with the increasing refurbishment, retrofit and renovation of

existing buildings (instead of new build) is bringing the role of the Facilities Management
(FM) department to the forefront.
Furthermore, previous research had shown that there would be no substantial change
in aggregate demand for housing over the next decade (Simmonds and Clark, 1999).
Therefore, organisations need to be able to quantify costs and communicate management
information about their facility and infrastructure (Wix, 2003). To do this, they are
turning to new information technologies to drive productivity gains. The most successful
companies promote a free flow of information between stakeholders.
Typically, construction facilities require maintenance and occasional repairs on a
regular basis, due to deterioration and aging. This is to keep them functional and in a
satisfactory appearance. In fact, many organisations own a large variety of buildings and
other types of constructed facilities, which need regular maintenance, occasional
renovation and rehabilitation, and sometimes reconstruction of new facilities. Often,
these organisations face a crucial dilemma, regarding the urgency and prioritisation of
works and associated costs (Rosenfeld and Shohet, 1996). However, not many companies
have utilised information technology to increase the efficiency of the refurbishment
process for building maintenance.
The above issues are addressed in the INTELCITIES project, which has a focus on the
prevailing methods of building maintenance, highlighting their benefits and limitations.
The paper also describes a proposed approach for the use of 3D Geographic Information
Systems (GIS) and a 3D laser scanning system, to enable the analysis and static
visualisation of critical query outputs for building maintenance.
The INTELCITIES (Intelligent Cities) Project is a research and development project that
aims to help achieve the EU policy goal of the knowledge society. The INTELCITIES
project brings together the combined experience and expertise of key players from across
Europe, focusing on e-Government, e-Planning and e-Inclusion, e-Land Use Information
Management, e-Regeneration, Integration and Interoperability, Virtual Urban Planning,
etc, (http://www.intelcitiesproject.com/).
The overall aim is to advance the possibilities of e-Governance of cities to a new level
through the development of a prototype of the IOSCP (Integrated Open System City
Platform), as a clear and easily accessible illustration of a shared civic place in virtual
space continuously available to all, whether officials, decision-makers and other
professionals, such as planners, developers, politicians, designers, engineers, transport
and utility service providers, as well as individual citizens, community groups/networks
and businesses, through a wide range of interfaces.
This paper focuses on the e-Regeneration work package of the project. The objectives
of the package are to:
1. Produce a city vision for the post industrial city in the knowledge society and set of
targets for systems to enhance regeneration.
2. Produce a system to support improved decision making about strategic planning of



3. Produce a system to support development planning processes and that engage citizens
in planning regeneration.
4. Show how these systems could be integrated with other city systems.
5. Report on how a holistic approach to all elements of building, refurbishment and urban
planning and design can lead to successful, sustainable cities.
Objective 5 specifically is addressed in this paper. The task defined to achieve it is to capture building data using laser scanner technology and to investigate how this technology can enhance the refurbishment and maintenance process.
Figure 1 illustrates the vision, which goes beyond the INTELCITIES project, for the use of a 3D laser scanner in the maintenance and refurbishment process. Within the INTELCITIES project itself, the aim is to show how laser scanner technology can be used for building refurbishment and maintenance.
In Figure 1, the first step centres on creating VR models according to the requirements of their intended uses, such as building redesign and renovation, building survey and evaluation, reverse engineering, fabrication and construction inspection, health and safety, and urban planning and analysis.
In the second step, integration is the main concern: the laser scanning system will be integrated with GPS systems to link Ordnance Survey (OS) or local authority data, with the GIS system to accomplish the full integration of VR and GIS, and with the Workbench to interactively analyse the VR models produced by the laser scanning system.

Figure 1. How laser scanner technology can be used for the maintenance and refurbishment process in the INTELCITIES project.
The third step aims at building data integration: developing a conceptual model of an nD modelling system (Lee et al, 2003) and associating it with other data structures, including relational and object-oriented databases, to illustrate how data can be integrated to support intelligent city and construction systems.

eWork and eBusiness in architecture, engineering and construction


The rest of the paper considers DSS (Decision Support System) and delves into the
integration of the 3D laser scanning technology with GIS system for building
maintenance. In the next section, the existing methods of building maintenance and their
limitations are explained in order to justify the integration of the 3D-GIS and the 3D laser
scanner systems.
Planning and control of building maintenance works are commonly performed using
traditional media, such as paper-based plans and sketches. Other techniques have also emerged, such as decision support systems (DSS) and integrated environments.
In recent years, major efforts were devoted to the development of decision support
systems (DSS) to address building maintenance issues. Several of these systems have
been developed to assist managers and decision-makers in planning building maintenance
activities. Each DSS has its own functionality and is designed for a unique purpose. These tools range from renovation design to the initiation of renovation projects.
Rosenfeld and Shohet (1996) have developed a unique DSS, which is capable of
suggesting various building/facility-upgrading alternatives. This system was
demonstrated on a 25-year-old dining facility in a military base that had suffered serious
structural damage due to foundation problems. This system has proved valuable for the
maintenance work. It provided managers with alternatives depending on the input
criteria, including full descriptions of building evaluation and end-results. However, it
only provides information on the general condition of the facility, including costs and
subsequently life span of facility depending on how much money is available or what
alternative is chosen.
Underwood and Alshawi (2000) developed an integrated construction environment for
the UK construction industrythe Simultaneous Prototyping for an Integrated
Construction Environment (SPACE). MAINTenance ForeCASTing in an Integrated Construction Environment (MAINCAST) (Underwood and Alshawi, 2000) is an extension of SPACE that forecasts the building element maintenance of a project as part of a fully integrated environment. MAINCAST was developed to assist the facility manager/owner (client) in facility/project management by automatically generating detailed maintenance valuations, outlining the required maintenance during every operational year of the project's life, etc.
However, these media suffer from several limitations. Firstly, it is difficult to identify
the refurbishment and renovation tasks. Secondly, it is also difficult to monitor the various tasks because of their complexity. The Rosenfeld and Shohet DSS, for instance, does not enable managers and decision-makers to view the facility and see or locate the damaged elements. Overall, the main
limitation of these DSS systems is related to their output. They usually provide the results
in a text format or tables and, in some cases, bar charts. This form of output is often not
appropriate for decision-makers to visualise the results of their queries, especially when
lay-clients are involved in the communication process. These tools have yet to adopt
spatial analysis techniques such as GIS technology in their operation. A GIS enables the spatial analysis and static visualisation of critical query outputs. Enache (1994) was critical of the failure of current DSS systems to make use of advances in GIS technology. In addition, these tools do not allow decision-makers to visualise the final changes before starting the maintenance work. Clearly, there is a need to improve the management of information and tasks about building maintenance.
Geographic Information Systems (GIS) are collections of computing techniques and
databases that support the gathering, analysis and display of large volumes of spatially
referenced data (USEPA, 2002).
The innovation, on the other hand, consists of a laser scanner controlled by a laptop computer. The scanner is targeted at the physical objects to be scanned and the laser
beam is directed over the object in a closely spaced grid of points. By measuring the time
of laser flight, which is the time of travel of the laser from the scanner to the physical
objects and back to the scanner, the scanner determines the position in three-dimensional
space of each scanned point on the object. The result is a cloud of thousands of points in three-dimensional space that is a dimensionally accurate representation of the existing object (Schofield, 2001). This information can then be converted into a 3D CAD
model that can be manipulated using CAD software, and to which the design of new
equipment can be added.
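The time-of-flight calculation described above can be sketched as follows. This is a minimal illustration, not any scanner vendor's actual API; the function name and the example beam angles are assumptions made for the sketch.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def point_from_time_of_flight(round_trip_s, azimuth_rad, elevation_rad):
    """Convert a round-trip laser travel time and the beam's angles
    into a Cartesian point relative to the scanner origin."""
    distance = SPEED_OF_LIGHT * round_trip_s / 2.0  # one-way distance
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# A round trip of roughly 66.7 ns corresponds to a point about 10 m away.
x, y, z = point_from_time_of_flight(66.7e-9, 0.0, 0.0)
```

Repeating this for a closely spaced grid of beam directions yields the point cloud that is subsequently converted into a 3D CAD model.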
3D laser scanning is currently used in a variety of sectors, ranging from industrial applications for process automation in the automotive industry, steel industry, robotics, etc., to mining, archaeology, surveying, urban planning and railway, tunnel and bridge construction (Arayici et al, 2003).
In recent years, however, the emerging GIS systems have presented organisations and
management sectors with significant advances in making informed decisions. Ehler,
Cowen, and Mackey (1995) argued that linking GIS with DSS systems has enabled the
user to make well-informed decisions, based on the problem at hand. Also, Modis (2001)
reported that tools based on GIS technology have offered managers and
decision-makers substantial benefits, including usability, accuracy, and efficiency.
Consequently, organisations around the world are reaping considerable benefits by
capitalising on spatial technology solutions. GIS applications in DSS provide an enhanced means of resolving complex geo-analytical problems.
Furthermore, systems based on 3D GIS technology are starting to supersede the early GIS systems (Jordan, 2000; Song et al, 2002, 2003). Although still in its infancy, this emerging technology could clearly support the planning process of building maintenance projects. The 3D modelling capability of GIS could also enable managers to foresee changes and modifications in an improved manner. However, despite the evident advantages of 3D technology for this type of planning and construction work, its full benefits cannot be realised without improved visualisation of the output. Indeed, the results of 3D GIS systems are usually displayed as a static cardboard model, which does not allow users to explore and rapidly visualise the results of their queries.

Combining 3D GIS with advances in laser scanner VR technology could provide decision-makers with more robust tools to visualise the 3D GIS environment in real time. Verbree et al. (1999) argued that VR technology offers new and exciting opportunities to visualise 3D GIS data that, in turn, improve DSS usability and enable users to walk through 3D environments. It allows them to see building elements and appreciate proposed changes in a real-time environment. Sidjanin (1998) demonstrated that linking GIS and VR offered great capabilities for decision-making, as it could produce real-time and realistic visualisation of spatial data. In addition, a VR interface could improve understanding of GIS spatial analyses and handling of queries on the data, as well as navigation through the dynamic map model and use of GIS functions.
Similarly, the ability to rapidly sketch and visualise design ideas has been stressed as
an important task in urban design (Smith, 1998). Hence the VENUE Project was
conceived as a means of experimenting with links between GIS and 3D visualisation
tools (ibid). The project demonstrated that a set of urban features can be visualised as building block outlines in 2D in ArcView (based on Ordnance Survey base data). Removing building sub-divisions and generalising line vertices enabled the production of a 3D VRML (Virtual Reality Modelling Language) model by assigning a height attribute in ArcView. This approach operates on a macro scale in relation to buildings but can be extended
to a more detailed micro scale application suitable for building maintenance (Camara and
Raper, 1999).
In line with the foregoing, it can be established that several approaches have successfully linked 3D GIS with laser scanner VR technology as a means of enhancing decision support. These successful developments further expose the possibility of employing this combination to enhance current building maintenance DSS. The following section describes a proposed methodology, which is partly inspired by the work of Mahdjoubi and Ahmed (2004).
The aim of this section is to propose a framework which includes a series of analytical
tools that will enable various stakeholders in the building maintenance sector to make
informed decisions relating to building maintenance works. This framework, which is
depicted in Figure 2, includes:
1) The development and population of a geo-spatial project database with the digital data
of existing building captured with the laser scanning equipment.
2) The analysis of complex building maintenance options within a knowledge repository environment; digital building data captured by the laser scanner is retrieved with the 3D GIS system for the analysis.
3) The visualisation of the project information through a range of different interconnected graphic windows. The laser scanner VR model can be visualised on different platforms, including the workbench.
The geo-spatial project database will describe the geometries of both the building frame
and its components. Simple open geometric descriptions will be used, but each entry will

also be associated with data on inventory information such as name, supplier, date
installed/replaced, number of previous replacements, etc.
The procedure for the development will be based on establishing a robust object-oriented database management system (OODMS). The system will enable the capture of all geo-spatial information of the building frame and components using the laser scanner.
Figure 2. The Virtual Building Maintenance System Framework.

Inventory information relating to each frame and component will also be captured within the relational structure of the database. Such information will be accessible in real time, with some of the attributes (e.g. component supplier information) hyperlinked to the World Wide Web.
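A minimal sketch of such a database is shown below using SQLite; the table and column names are hypothetical, chosen only to illustrate how a geometry reference and inventory attributes can sit side by side in one record.

```python
import sqlite3

# In-memory sketch of the geo-spatial project database: each building
# component stores a reference to its scanned geometry plus inventory
# attributes such as name, supplier and installation date.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE component (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        geometry_ref TEXT,        -- link to the scanned 3D geometry
        supplier TEXT,
        supplier_url TEXT,        -- hyperlink to supplier information
        date_installed TEXT,
        replacements INTEGER DEFAULT 0
    )
""")
conn.execute(
    "INSERT INTO component (name, geometry_ref, supplier, date_installed)"
    " VALUES (?, ?, ?, ?)",
    ("window_frame_A1", "scan_0042", "Acme Glazing", "1998-05-01"),
)

# An example maintenance query: components never replaced since install.
rows = conn.execute(
    "SELECT name, supplier FROM component WHERE replacements = 0"
).fetchall()
```

In the proposed framework the `geometry_ref` column would point into the point-cloud-derived model rather than a plain string, but the relational pattern is the same.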
Following the development of the OODMS, the information captured will be linked to a knowledge repository developed purely for rule-based and/or case-based interpretation of possible building maintenance schedules. This component of the VBM system will facilitate the generation of alternatives based on user-specified queries.
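The rule-based interpretation could work along these lines; the rules, thresholds and actions below are invented for illustration and are not taken from the paper.

```python
# Hypothetical rule base: each rule pairs a predicate on a component
# record with a suggested maintenance action. The first matching rule
# wins, mimicking a simple rule-based maintenance scheduler.
RULES = [
    (lambda c: c["age_years"] > 20 and c["condition"] == "poor",
     "replace immediately"),
    (lambda c: c["age_years"] > 10,
     "schedule inspection within 12 months"),
    (lambda c: True,
     "no action required"),
]

def suggest_action(component):
    """Return the first matching rule's action for a component record."""
    for predicate, action in RULES:
        if predicate(component):
            return action

action = suggest_action({"age_years": 25, "condition": "poor"})
```

A case-based variant would instead retrieve the most similar past maintenance case; the rule-based form is shown here only because it is the shorter sketch.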
GIS software will be used to generate and analyse thematic developments relating to
the building properties and associated maintenance management strategies. ArcGIS 3D Analyst, for example, enables users to effectively visualise and analyse surface data.
Using the 3D spatial analysis capabilities of the tool, a range of possible scenarios of a
building can be evaluated. Surfaces can be viewed from multiple viewpoints, queried,
interrogated for visibility and viewed for the creation of a realistic perspective image.
Furthermore, the evaluation can also be extended to display static images of building
components that require immediate or 'near-future' maintenance based on the real-time information captured within the OODMS and the knowledge repository. However, a final
selection of the most appropriate software will be based on the most suitable
representation (i.e. raster or vector) of the captured data.
The VR environment will be developed using the laser scanner technology, which also provides data models in different formats, including the Virtual Reality Modelling Language (VRML). This approach is complementary to previous work done on the information infrastructure developed through the OSCON, VIRCON and HyCon projects and the ongoing research for the nD modelling project (Lee et al, 2003). Therefore, the
results of the spatial analysis obtained within the 3D GIS environment can be evaluated

in real time, with the option of viewing building maintenance alternatives developed from querying the knowledge repository.
The research has potential benefits and practical applications for the construction industry and its professions. It will provide better support for the evaluation and visualisation of building maintenance works so that informed policies can be effectively targeted. It will
benefit construction companies, facility and estate managers, and all those concerned
with building maintenance issues.
The ultimate beneficiaries of this work will be professionals and stakeholders of the construction industry involved with building maintenance, through:
improved predictability of building maintenance requirements,
reduced maintenance planning and execution time,
increased safety,
increased productivity.

This paper provides an overview of the e-Regeneration package of the INTELCITIES
project, which aims at helping achieve the EU policy goal of the knowledge society.
The INTELCITIES project brings together the combined experience and expertise of key players from across Europe, focusing on a number of built and human environment issues, including e-Government, e-Planning and e-Inclusion, e-Land Use Information Management, e-Regeneration, Integration and Interoperability, Virtual Urban Planning, etc. (http://www.intelcitiesproject.com/).
This project recognises the need for integrating visualisation techniques and systems
for building maintenance and refurbishment. In particular, the vision for the use of laser scanner equipment for building refurbishment and maintenance is addressed (see Figure 1), and a framework for integrating such 3D GIS and laser scanner systems is developed to assist the flow of information. Lastly, the beneficiaries of such integration are outlined.
For the time being, the integration of 3D GIS and laser scanner technology is being conceptually modelled. Once this is completed, it will be implemented.
Arayici, Y., Hamilton, A., Hunter, G. (2003) Reverse Engineering in Construction. Proceedings of the World of Geomatics 2003 Conference: Measuring, Mapping and Managing, Telford, UK.
Camara, A.S., Raper, J. (1999) Spatial Multimedia and Virtual Reality. London: Taylor & Francis.
Ehler, G., Cowen, D., Mackey, H. (1995) Design and Implementation of a Spatial Decision Support System for Site Selection. ESRI International User Conference, May 22–26, 1995, Palm Springs, California: ESRI.
Enache, M. (1994) Integrating GIS with DSS: A Research Agenda. URISA Conference, Milwaukee, Wisconsin, August 1994.
Jordan, L. (2000) Web Accessible 3D Viewing: Next Step for GIS. Virtualising the 3D Real World: multi-view interface for 3D GIS. Computer & Graphics, 23, pp. 497–506.
Lee, A., Marshall-Ponting, A.J., Aouad, G., Wu, S., Koh, W.W.I., Fu, C., Cooper, R., Betts, M., Kagioglou, M., Fisher, M. (2003) Developing a Vision of nD-enabled Construction. Construct IT, University of Salford, UK.
Mahdjoubi, L., Ahmed, V. (2004) Virtual Building Maintenance: Enhancing Building
Maintenance using 3D GIS and Virtual Reality (VR) Technology, Conference of Designing,
Managing, and Supporting Construction Projects through Innovation and IT solutions
(INCITE2004), February 2004, Langkawi, Malaysia
Modis (2001) IT Resource Management. <http://www.modisit.com/gis/> (accessed on 28
Rosenfeld, Y., Shohet, I.M. (1996) Initiation of Renovation Projects: Techno-Economic Decision-Support Model. Application of the performance concept in building, International
Sidjanin, P. (1998) Visualisation of GIS Data in VR Related to Cognitive Mapping of Environment. Proceedings of the IEEE Conference on Information Visualisation, July 1998, London: IEEE Computer Society.
Schofield, W. (2001) Engineering Surveying, 5th Edition: Theory and Examination Problems for Students. ISBN 0-7506-4987-9.
Simmonds, P., Clark, J. (1999) UK Construction 2010: Future Trends and Issues, briefing paper.
Smith, A. (1998) The Venue Project: Adding 3D Visualisation Capabilities to GIS. Society, pp.
Song, Y., Hamilton, A., Trodd, N.M. (2002) Technical Design Issues of Linking Geospatial Technology for 3D Visualisation, Interaction and Analysis. In: Proceedings of the Conference on GIS Research in the UK, pp. 256–262.
Song, Y., Hamilton, A., Trodd, N. (2003) Developing an Internet-based Geographic Visual Information System. In: Proceedings of the GIS Research in the UK 2003 Conference, 9th–11th April 2003, City University, London.
Underwood, J., Alshawi, M. (2000) Forecasting Building Element Maintenance within an Integrated Construction Environment. Automation in Construction, 9, pp. 169–184.
USEPA (2002) GIS-Visualization (VIS) Integration Efforts. United States Environmental Protection Agency.
Verbree, E., Van Maren, G., Germs, R., Jansen, F., Kraak, M. (1999) Interaction in virtual world views: linking 3D GIS with VR. International Journal of Geographical Information Science, 13(4), pp. 385–396.
Wix, J. (2003) Domain/Facilities Management within the International Alliance of Interoperability,

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Supporting standard data model mappings

Department of Computer Science, University of Auckland, Auckland, New Zealand
ABSTRACT: Very little work has been done on specifying a standard
mapping between the overlapping semantic specifications in the
standardized data models used in architecture, engineering and
construction (A/E/C, e.g., IAI-IFC and ISO-STEP standards). However,
several companies have developed bespoke mappings from these
standards into their design tools, and back out again. With this approach it
is difficult to understand how complete their mappings are, and what
assumptions are made in the development of the mappings. Yet for
semantic mappings, as distinct from mappings over geometric
representations, this has a profound implication for the correctness of the
resultant data. In this paper the development of a suite of mapping support
tools is discussed to illustrate the level of support required to ensure
semantically correct mappings across data models.

1.1 Problem statement
The specification of a semantically correct mapping between any two standard data
models used in the A/E/C industries is an enormous task. Data models have in the order
of 500 entities and many thousands of relationships and attributes (e.g., IFC 2.x, IAI
2004). The mere task of sitting down and describing which entities are related to each
other is daunting, let alone managing to encompass the full semantic coverage of the
contents of each of these entities. Yet without some definition of a mapping to be
implemented it is basically impossible to guarantee the correctness of any implemented
translator for a standard data model.
It is clear that human experts are needed to perform this task, knowledgeable in both
schemas being mapped between. Yet even for such experts the management problem of
describing a mapping over such large schema forces a requirement for some
computerized support. This support comes in the form of notations and environments to
specify what is equivalent between two schema in a form that can then be used to
generate the code to actually perform the mapping.
In the last decade there was an active research community developing approaches to
mapping languages in engineering domains (Khedro et al 1996; Verhoef et al 1995;
Eastman 1999: Chapter 11). Several of those efforts have been pursued in the
development of the ISO mapping standard EXPRESS-X (Hardwick and Denno 2000),
and in the development of mapping tables (ISO 1993).
These mapping approaches are now being utilised on a wide range of standard data
models available from ISO 10303 STEP, ISO 13584 parts libraries and catalogs, CIS/2 (Crowley and Watson, 2000) and the IAI's IFCs (IAI 2002). However, every mapping
between two of these standard data models will be duplicating the work of previous
attempts. If it were possible to specify a mapping in an easily comprehensible manner,
and there were tools that industry experts could use to agree on the correctness of the
defined mapping, then a consensus on major mappings between schema could be
developed and published in much the same way that standard schema are published
today. This paper examines what tools would be required to reach this position.
To manage the task of developing a mapping between two data models there is a
requirement for a range of support functions for the specifier. These include:
A graphical mapping notation to enable the specifier to visually comprehend the
mapping being described between subsets of the data models.
A mapping specification environment to enable navigation through, and partitioning of,
the space of mappings specified. Such a tool can also determine what has, or has not,
been mapped between.
Automated mapping support to enable a significant proportion of the mappings required
between two schemas to be automatically determined.

Figure 1. VML-G: a graphical mapping formalism.
A mapping interpreter to allow evolving mappings to be tested on partial sets of data.
A verifier to check the correctness of the developing mapping specification. Such a
verifier would offer support from basic syntactic checking across the data models
through to a more comprehensive semantic analysis of the proposed mapping.
The development of such a support environment is described in the following sections.
With this environment in place it is then possible to move on to providing standard
mappings between the major standard schemas which exist in our domain.

eWork and eBusisness in architecture, engineering and construction


In order to describe the equivalences which exist between data structures in two different
schema it is necessary to have a notation for the specification. A range of notations have been developed and utilised, ranging from direct specification within a standard programming language (such as C or Java), through ISO mapping tables (ISO 1993), to the evolving ISO mapping language EXPRESS-X (Hardwick and Denno 2000).
In many respects these approaches are analogous to the use of the EXPRESS language
to specify the conceptual data structures for a schema for a particular domain. These
approaches provide for a complete and detailed specification of how the mapping
between portions of the schemas will have to be realised.
However, they do not provide a way to gain an overview of the mappings which have
been developed between two schema or the completeness of any particular mapping.
Where schema have several hundred classes in them this is of major concern to the
specifier. In the same way that EXPRESS-G is used as a high-level notation for
describing the basic structures within a schema, and to view various subsets of a schema,
a graphical mapping formalism will allow a high-level overview of the mapping between
schemas to be presented.
A range of graphical formalisms have been developed at the University of Auckland to
represent mappings to different classes of users. Figure 1 shows a programmer level
formalism for specifying mappings between UML styled class diagrams in two schema.
The VML-G language (Amor 1997) shown in Figure 1 uses a wiring approach to
denote a mapping between attributes, or classes, in a schema and an icon representing
that particular mapping. The mapping icon provides three areas in order to separate
general mappings between attributes and classes from the specification of invariants,
which direct when the mapping is applicable, and initialisers, which describe starting
values for particular attributes of a newly created object. As can be seen in Figure 1 the
specification of the actual mapping is hidden from view and presented as a classification: either a straight equivalence (=), an equation (eqn), a functional equivalence (func), or a procedurally described equivalence (proc).
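The structure of such a mapping icon can be modelled roughly as follows. This is an illustrative Python sketch of the concepts (classified equivalences, invariants, initialisers), not the actual VML or VML-G notation; all class and attribute names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Equivalence:
    """One wired attribute correspondence, classified by kind."""
    source_attr: str
    target_attr: str
    kind: str                 # "=", "eqn", "func" or "proc"
    transform: callable = None  # used for the non-trivial kinds

@dataclass
class Mapping:
    """A mapping 'icon': invariants gate applicability, initialisers
    seed the new object, equivalences carry attribute values across."""
    source_class: str
    target_class: str
    invariants: list = field(default_factory=list)
    initialisers: dict = field(default_factory=dict)
    equivalences: list = field(default_factory=list)

    def apply(self, source_obj):
        if not all(inv(source_obj) for inv in self.invariants):
            return None  # invariant failed: mapping not applicable
        target = dict(self.initialisers)
        for eq in self.equivalences:
            value = source_obj[eq.source_attr]
            target[eq.target_attr] = eq.transform(value) if eq.transform else value
        return target

# Example: map a wall with thickness in mm to one with thickness in m.
wall_map = Mapping(
    "WallMM", "WallM",
    invariants=[lambda o: o["thickness_mm"] > 0],
    initialisers={"schema": "target"},
    equivalences=[Equivalence("thickness_mm", "thickness_m", "eqn",
                              lambda mm: mm / 1000.0)],
)
result = wall_map.apply({"thickness_mm": 250})
```

The point of the classification is visible even in this toy: a reviewer can check that every source attribute is wired somewhere without reading the transform bodies.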
By examining such a graphical mapping specification it is very easy to verify that all
attributes are being handled in the mapping, and by examining the invariants across
several mappings it is possible to verify that all possible conditions are being modelled. It
also allows a high-level specification of the equivalences between portions of a schema
without concentrating on the detail of how to achieve the mapping.
The author contends that any textual mapping notation needs to be supported by a
graphical formalism which allows for a high-level overview of the mappings which are
being specified.
If a simple textual notation is used to describe a mapping then it can be developed in any
textual editor. However, if a graphical formalism is going to be utilised to specify a
mapping then it needs to be supported by a more comprehensive specification environment. Such a specification environment must allow for both graphical and textual notations to be viewed and the consistency between these views to be maintained under edits to either view.

Figure 2. A business level mapping specification environment.
In Figure 1 the specification environment for VML-G allows for classes from the
related schema to be viewed within a window, for a mapping icon to be placed in the
window, and for wiring from attributes and classes to be drawn to the mapping icon. In
this environment each window represents a particular mapping, and by navigating the
various windows a specifier can examine the full set of mappings developed. The
specifier can also switch to a textual view to see the full mapping specification and any
edits made to the textual view are propagated back into the graphical view. From a
programmer level support perspective this is very useful, but it is not tied to real data to help with checking.
In Figure 2 a business level specification environment is shown (Li et al 2002). Within
this environment the schemas being mapped between are visualized as business forms
and a wiring approach is used to specify the mappings between various fields in the
forms. In this environment the specifier can view not just the data schema in a format
close to its business use, but also exemplar data within each of the fields. Tied to this is

the ability to run each of the partial mappings specified and hence to view the result of
the application of the specified mappings in the other business form.
While the tools highlighted in Figures 1 and 2 clearly provide for greater comprehension
and checking of the mappings which are being described it is also clear that detailing the
mappings between schema which comprise several hundred classes is going to take a
long time.
In order to ease this workload it is useful to consider approaches which will allow for
the automated specification of the mapping (or a portion of the mapping) between two
schema. This is an area of ongoing research with many approaches being considered (see
Rahm and Bernstein 2001 for a survey of approaches).
This sort of tool is also useful to handle mapping between versions of particular
product models. For example, the IAI have produced six versions of the IFC in the last
seven years and the CIS/2 LPM is expected to be updated every year. The mappings between consecutive versions of a particular schema tend to be fairly minor, which makes for an easier problem when considering automated mapping creation. This problem is
also closely related to that of schema evolution in object-oriented databases (Banerjee et
al 1987, Lerner and Habermann 1990, Eastman 1992, Deux 1990, Zicari 1992, and
Atkinson et al 2000).
A previous student developed a hybrid mapper utilizing structure and name
comparison to automate the creation of mappings for IFC versions (Amor and Ge 2002).
This demonstrated that approximately 80% of an IFC schema could be automatically mapped to the next version.

Figure 3. Voting in an automated mapping tool.

An examination of points where this hybrid mapper failed illustrated that different approaches to identifying mappings performed well in different settings. To explore how this might be utilized in automated mapping creation there has been a project (Bossung 2003) to develop an infrastructure allowing multiple matchers to vote on their proffered mapping for particular parts of an inter-schema mapping. Figure 3 shows a screen snapshot of this tool, where three matching tools (a Levenshtein matcher, a partial name matcher, and a type matcher) bid with their mapping for a partial structure match. With this tool the user can examine the highest ranked mappings for any portion of the schema and select between the mappings being offered. Further work on this tool is looking at re-matching mappings based on selections (changes) made by the user of the tool.
In order that the specified mappings can be enacted it is necessary to generate code for each mapping. Within a tool which supports visualization of mapped exemplar data this takes the form of an interpreter for individual snippets of the mapping. Every mapping tool must also be able to generate the full mapping specification in some target language.
With a high-level mapping specification language, as demonstrated in Figures 1 and 2,
the mapping code generator allows for the creation of code in a number of target languages, from bespoke languages such as VML (Amor 1997) and EXPRESS-X (Hardwick and Denno 2000), through generic mapping languages such as XSLT, to programming languages such as C and Java.
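The generation step can be sketched as follows: a tiny, invented spec format is turned into source text for a mapper function, which is then compiled and run. Real generators target languages such as XSLT, C or Java; everything below is a hypothetical illustration of the principle.

```python
# Invented mapping spec: source class, target class, and per-attribute
# (source, target, expression) triples, where the expression may refer
# to the bound name `value`.
spec = {
    "source": "WallMM",
    "target": "WallM",
    "attrs": [("thickness_mm", "thickness_m", "value / 1000.0")],
}

def generate_mapper(spec):
    """Emit Python source text for a mapper function from the spec."""
    lines = [f"def map_{spec['source']}_to_{spec['target']}(src):",
             "    dst = {}"]
    for src_attr, dst_attr, expr in spec["attrs"]:
        lines.append(f"    value = src[{src_attr!r}]")
        lines.append(f"    dst[{dst_attr!r}] = {expr}")
    lines.append("    return dst")
    return "\n".join(lines)

code = generate_mapper(spec)
namespace = {}
exec(code, namespace)  # compile the generated mapper into a callable
result = namespace["map_WallMM_to_WallM"]({"thickness_mm": 250})
```

The same spec could equally drive an interpreter over individual snippets, which is what a visual environment needs when showing mapped exemplar data live.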
As detailed in the introduction, developing a high-level mapping provides the specifier
with a way to ensure that the semantics of the data in the two schemas will match.
However, given the size of the schemas being developed, this is still a difficult process.
The provision of a graphical formalism helps in checking, as do support environments
which map exemplar data based on the developing mapping. But ensuring that a correct
mapping has been developed requires a comprehensive testing regime based around non-trivial exemplars.
While the IAI and ISO do have certification processes and testing suites, the approach
is certainly not as rigorous as would be expected in a field such as software testing.
The author suggests that the testing of a round-trip mapping should be considered
the main form of verification for mapping specifications. While implemented translators
for geometric models (e.g. DXF, IGES) are known, and assumed, to have errors, this
is not such an issue, as human interpretation is used to determine the semantics of a
translated geometric model. For an object-based model, however, such errors are far more
serious. A round-trip mapping which does not preserve individual objects and their
original parameters has changed the specification of the building for almost any tool
which uses this data.
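A round-trip test of this kind can be expressed very simply. The forward and backward mapping functions below are hypothetical stand-ins for generated translator code between two schema versions:

```python
def forward(obj):
    # Hypothetical A -> B mapping: rename attributes between schema versions.
    return {"globalId": obj["guid"], "overallHeight": obj["height"]}

def backward(obj):
    # Hypothetical B -> A mapping, intended to invert forward().
    return {"guid": obj["globalId"], "height": obj["overallHeight"]}

def round_trip_preserved(objects):
    """True only if every object and every original parameter survives A -> B -> A."""
    return all(backward(forward(o)) == o for o in objects)

walls = [{"guid": "2a1b", "height": 3.2}, {"guid": "9f4c", "height": 2.7}]
ok = round_trip_preserved(walls)
```

Any object or attribute dropped or altered by either direction of the mapping makes the comparison fail, which is exactly the loss of specification the text warns about.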

eWork and eBusiness in architecture, engineering and construction


The development of mappings between two schemas is a large and very important process
when developing translators for the various standard schemas being used in the
construction industries. Ensuring correct specifications requires not just an expert in the
various schemas being manipulated, but also a range of support tools to help the specifier
through the process. A range of these support tools has been developed and described
briefly in this paper.
However, to take the industry to the next stage, where it can have confidence in the
translators and mappings which exist, further rigor must be injected into the
mapping testing process. It is also recommended that a range of certified mappings be
developed between the main standard schemas being developed for the industry, as well
as between the various versions of each schema which have been produced.
REFERENCES

Amor, R.W. & Ge, C.W. 2002. Mapping IFC Versions, Proceedings of the EC-PPM Conference on eWork and eBusiness in AEC, Portoroz, Slovenia, 9–11 September, 373–377.
Amor, R. 1997. A Generalised Framework for the Design and Construction of Integrated Design Systems, PhD thesis, Department of Computer Science, University of Auckland, Auckland, New Zealand, 350 pp.
Atkinson, M.P., Dmitriev, M., Hamilton, C. & Printezis, T. 2000. Scalable and Recoverable Implementation of Object Evolution for the PJama1 Platform. Persistent Object Systems, 9th International Workshop, POS-9, Lillehammer, Norway, 6–8 September, 292–314.
Banerjee, J., Kim, W., Kim, H. & Korth, H. 1987. Semantics and Implementation of Schema Evolution in Object-Oriented Databases, Proceedings of the 1987 ACM SIGMOD International Conference on Management of Data, San Francisco, USA, 311–322.
Bossung, S. 2003. Semi-automatic discovery of mapping rules to match XML Schemas, Department of Computer Science, The University of Auckland, New Zealand, 71 pp.
Crowley, A. & Watson, A. 2000. CIMsteel Integration Standards, Release Two, 5 Volumes, Steel Construction Institute and Leeds University, UK.
Deux, O. 1990. The story of O2. IEEE Transactions on Knowledge and Data Engineering (TKDE), 2(1), March, 91–108.
Eastman, C.M. 1999. Building Product Models, CRC Press, Orlando, FL, USA.
Eastman, C.M. 1992. A data model analysis of modularity and extensibility in building databases, Building and Environment, 27(2), 135–148.
Hardwick, M. & Denno, P. 2000. The EXPRESS-X Language Reference Manual, ISO TC184/SC4/WGH N117, 2000-06-28.
IAI. 2002. International Alliance for Interoperability, web site, last accessed 18/6/2004.
ISO 1993. Guidelines for the documentation of mapping tables, ISO TC184/SC4/WG4 M105.
Khedro, T., Eastman, C., Junge, R. & Liebich, T. 1996. Translation Methods for Integrated Building Engineering, ASCE Conference on Computing, Anaheim, CA, June.
Lerner, B.S. & Habermann, A.N. 1990. Beyond schema evolution to database reorganization, Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA), Ottawa, Canada, October, 67–76.



Li, Y., Grundy, J.C., Amor, R. & Hosking, J.G. 2002. A data mapping specification environment using a concrete business form-based metaphor, In Proceedings of the 2002 International Conference on Human-Centric Computing, IEEE CS Press, 158–167.
Rahm, E. & Bernstein, P.A. 2001. A survey of approaches to automatic schema matching, The International Journal on Very Large Data Bases (VLDB), 10(4), 334–350.
Verhoef, M., Liebich, T. & Amor, R. 1995. A Multi-Paradigm Mapping Method Survey, CIB W78-TG10 Workshop on Modeling of Buildings through their Life-cycle, Stanford University, California, USA, 21–23 August, 233–247.
Zicari, R. 1992. A framework for schema updates in an object-oriented database system. Morgan Kaufmann Series in Data Management Systems, 146–182.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Virtual building environments (VBE): applying information modeling to buildings
Lawrence Berkeley National Laboratory, University of California,
Berkeley, CA, U.S.A.
ABSTRACT: A Virtual Building Environment (VBE) is a place where
building industry project staffs can get help in creating Building
Information Models (BIM) and in the use of virtual buildings. It consists
of a group of industry software that is operated by industry experts who
are also experts in the use of that software. The purpose of a VBE is to
facilitate expert use of appropriate software applications in conjunction
with each other to efficiently support multidisciplinary work. This paper
defines BIM and virtual buildings, and describes VBE objectives, set-up
and characteristics of operation. It informs about the VBE Initiative and
the benefits from a couple of early VBE projects.

Most manufacturers thoroughly test their products before delivering them to the market.
In some industries the testing is done on physical prototypes, in others it is done virtually;
in many industries it is done both ways. Manufacturers know exactly how their products
are going to perform or hold up before these products are built and sold. For example, car
manufacturers extensively test prototypes (both virtually and on the track) and know how
the new models will perform before they start manufacturing them, even if they do not
disclose all information in public. NASA thoroughly tested all new space vehicles in the
Gemini, Apollo and Space Shuttle Programs in simulation and launched them only after
all problems were proven in simulation to have been solved; when unforeseen problems
developed later (such as during the Apollo 13 mission), NASA was able to analyze and
resolve problems in simulation.
In most developed countries the building construction industry, sometimes called the
Architecture-Engineering-Construction-Operations (AECO) or Architecture-Engineering-Construction-Facilities Management (AEC/FM) industry, is the second largest industry.
Yet virtually no testing of the primary product of the industry, the building, is done
before irrevocable and often very costly decisions are made. True, pre-manufactured
components are often tested at least in some ways before delivery; mock-ups of critical
parts of a building are built and tested at times; various types of the building's
performance are occasionally simulated; and extensive visual simulations (including
"walk-throughs" and "fly-bys") are becoming the norm. But no testing of the whole
building, in all of its aspects of performance, is performed before the building is delivered.
Commissioning constitutes only a partial substitute for testing and is of little help if the
building in question needs substantial redesign and/or reconstruction to perform as
originally expected.
The product conception-construction-delivery process in most other industries follows
the design-test/verify-manufacture-deliver-warranty script. In contrast, the AECO
industry seems to employ a convince-build-pray modus operandi. The designers
convince the client, by demonstrating a few selected performance aspects (usually cost
and image) he/she can understand, that the building will work to the client's expectations,
but the designers cannot guarantee it; the builders build the building, and then everyone
waits to see how the building will work once it is occupied and in use. They hope for the
best, but fear the worst. At best, everybody is eventually relieved (even if some feelings
have been hurt in the process); at worst, almost everybody involved faces legal action.
Given that the standard of practice for how buildings are procured and delivered has
not substantially changed in more than a hundred years, no other modus operandi can
really be expected. Most (building design) decisions are made without testing their effect
first. Whatever testing takes place in the design and construction phases is limited to only
a few aspects of performance (i.e. it is intended to aid only specific types of decisions); it
is infrequent and usually lacks follow-up. It is slow and often delays the building
procurement and delivery process it is expected to expedite. It is often very costly and is
seldom required, or even accounted for, in contracts.
Obviously, tests and comprehensive verification of performance are very difficult to
attain when each product is essentially a very costly one of a kind, and when it takes a
long and laborious multidisciplinary effort to design and build it. It does not help that the
product is also very complex and complicated, and that smaller scale partial replicas can
reproduce only a few of the performance aspects of the product, and even those mostly
only as approximations. In similar situations, decision makers in other industries use
software that can simulate the product performance that is of interest: they build
virtual products they can experiment with and test with computers. It is clear that the
AECO industry will be able to test its product (i.e. buildings) in a comprehensive manner
only virtually: it will have to first build virtual buildings, test them (and make the
necessary design and planning modifications), and only then physically construct them.
The industry has been using industry-specific software more than just occasionally for
about 20 years, ever since the first versions of AutoCAD appeared on the market and in
schools of architecture and engineering. Today, industry use of software extends well
beyond CAD, with downstream applications that model performance relevant to or
resulting from different parts of a building's life cycle. However, unless they belong to an
integrated suite of software tools, these applications have little to do with each other:
they are unaware of each other, often describe essentially the same data in different ways,
and do not exchange or share data. This results in unnecessary generation of
duplicate data, and causes many unnecessary errors and omissions, costs and delays
(Bazjanac 2001).
The creation of virtual buildings and their productive use in experimentation and
testing will require additional software and, more importantly, organized coordination
among all software that may be used. Some leaders of the AECO industry have realized
this and have formed a slew of new organizations and consortia in the last decade

eWork and eBusisness in architecture, engineering and construction


designed to bring new technology and software interoperability to the industry.

These include the International Alliance for Interoperability (IAI 1995), the Building
Lifecycle Interoperable Software consortium (BLIS 2000), the Continental Automated
Buildings Association (CABA 2002), FIATECH, formed to bring technology to capital
projects (2002), the Construction Users RoundTable (CURT 2003), and the Open Standards
Consortium for Real Estate (OSCRE 2003), to name just a few in North America.
Perhaps one of the most important of these to date is the IAI, because it developed the
Industry Foundation Classes (IFC), the first open, object-oriented, comprehensive data
model of buildings that provides rules and protocols for definitions that span the entire life
cycle of a building. IFC are also the only such model that is an international standard
(ISO/PAS 16739). All major CAD vendors have developed their own internal "intelligent"
data models of buildings; these are designed to support the work of, and data exchange
within, a particular vendor's suite of tools, and are thus limited in scope, dissimilar and
proprietary. Nonetheless, together with non-proprietary developments, these are all
beginning to move the users of industry-specific software from defining buildings as sets
of lines and text that must be interpreted by the observer to defining buildings as
information models. Definition of buildings as information models will be the foundation
for creating virtual buildings with software that can seamlessly access data from the
information model, manipulate and use them, generate new data, and return them to the
information model.
2.1 Building information models
Building information modeling (BIM), used as a verb, is the act of creating a Building
Information Model (BIM, a noun). While the term was apparently originally used internally
by Autodesk staff, Jerry Laiserin was the first to publicize it widely in the industry
(Laiserin 2002).
Used as a noun, a BIM is an instance of a populated data model of buildings that
contains multidisciplinary data specific to a particular building, which it describes
unambiguously. It is a static representation of that building (i.e. it uniquely defines that
building in a section of time); it contains "raw" data that define the building from
the point of view of more than one discipline. Data contained in a BIM are also rich:
they define all the information pertinent to a particular building component. A three-dimensional surface model of building geometry alone that is used only in visualization
is usually not a BIM. A BIM includes all relationships and inheritances for each of the
building components it describes; in that sense it is "intelligent". A data set that defines
only a single view of a building (i.e. that describes a specific single type of
performance), such as a data set that includes all data a structural engineer
may need for structural calculations (but nothing more), is, by itself, not a BIM.



2.2 Virtual buildings

A virtual building is a BIM deployed in software. It simulates the behavior or
performance of a building or building component(s) entirely within a computer system,
without any physical construction of the building or any of its components. A virtual
building constitutes the use of data that are contained in a BIM to reproduce the behavior
or performance of a building or building component(s) with accuracy appropriate to the
reason for reproduction. The BIM is deployable by a suite of software that can reproduce
behavior or performance in a comprehensive way and, as appropriate, over time. A virtual
building is a dynamic building representation, even if a particular single view of the
building is static.
Any software that can access and use data contained in a BIM to simulate some form
of behavior or performance of the building can be part of a virtual building software
suite. Different software within the same suite can depict behavior or performance at
different levels of detail, as long as each is appropriate for the view it represents.
The software is operated by qualified professionals who are experts in both the use of
a particular software application that is part of that suite and in the industry discipline that
application belongs to. Virtual building operators need this dual expertise to properly
resolve or interpret issues that arise from limitations of software, lack of reliable data
and/or professional conventions.
It takes an enormous amount of data to define everything even in small and simple
buildings. The amount of data to describe a building increases manifold with increase in
building size and complexity. The temptation to reduce the amount of data by discarding
data in which one has no interest is countered by the realization that each datum is
potentially of interest to someone else.
As explained above, a BIM contains data that define building status in a section of time.
To reduce the physical size of a BIM, instead of replicating data available externally
(such as manufacturers' product data for some of the building components), it includes
only pointers to external databases where such data are available. When a
building definition depends on results generated by a software application, instead of
capturing the entire (usually large) submission from that software, the BIM contains only
data that enable the regeneration of the submission (i.e. the BIM captures only the data
needed to reproduce the input for that software).
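The idea can be illustrated with a minimal data structure; the keys, URL and tool name are assumptions made for the sketch, not part of any BIM schema:

```python
# A BIM entry that references external data instead of replicating it,
# and stores only the data needed to regenerate a simulation input.
bim_component = {
    "id": "door-017",
    "type": "IfcDoor",
    # Pointer to a manufacturer's product database (hypothetical URL):
    "product_data_ref": "https://example.com/products/D-4410",
    # Input-regeneration data for a simulation tool, not the tool's output:
    "simulation_input": {"tool": "EnergyPlus", "parameters": {"u_value": 1.8}},
}

def is_lightweight(component):
    """Check that the entry carries references and inputs, not bulky results."""
    return "product_data_ref" in component and "simulation_output" not in component
```

The bulky product catalog entry and the simulation results live outside the BIM; only the pointer and the regeneration data are stored.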
Virtual buildings generate enormous amounts of data on their own. These are
measured and/or simulated time-based data that are critical to the definition of building
behavior and performance. When used in a virtual building, a BIM also includes pointers
to the databases that keep such data externally.
The sheer amount of data that define a building can pose problems in data exchange.
Standard file exchange is impractical when the file includes a complete BIM. Model
servers, which facilitate partial model exchange (i.e. exchange of only some of the data),
can solve that problem: the (very large) BIM file resides in a model server, and
clients query for and extract only the data needed by a particular application. The extracted
data, given proper authorization, can come from any part of the BIM: a specific
individual datum or data sets that represent a particular view (or a set of views) of the
building. Depending on the location of the model server, the data can be accessed directly
or via web services.
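A partial model exchange of this kind can be sketched with an in-memory stand-in for a model server; the query interface below is an assumption for illustration, not the API of any real model server:

```python
# A toy "model server" holding a (tiny) BIM, keyed by entity type.
MODEL = {
    "walls": [{"id": "w1", "height": 3.0, "u_value": 0.35},
              {"id": "w2", "height": 3.0, "u_value": 0.28}],
    "doors": [{"id": "d1", "width": 0.9}],
}

def query(model, entity, fields):
    """Return only the requested fields of the requested entity type."""
    return [{f: obj[f] for f in fields} for obj in model[entity]]

# A thermal application extracts only wall U-values, not the whole BIM:
thermal_view = query(MODEL, "walls", ["id", "u_value"])
```

The client receives a small view tailored to its discipline, while the complete BIM stays resident on the server.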
File exchange usually requires implementation of the same version of the building
data model by both the generating and receiving software. Data model versioning is
typically irrelevant to model servers.
Despite recent efforts by the leading CAD vendors and the new industry organizations to
promote building information modeling as the way to define buildings, an overwhelming
majority of building procurement projects is still done the same old way: by defining
and representing buildings in "dumb" 2-D and text documents and with little, if any, use
of contemporary IT. This is true even though technology now exists that can
make professional work in most of the industry disciplines much more efficient and
effective than it is today. The industry in general is resisting efforts to change toward
information modeling and the creation and use of virtual buildings. The causes for this
resistance are found in several pragmatic reasons: steep learning curves, lack of time and
adequate funding, and shortcomings of software.
Most software applications that are specific to the AECO industry share a common
characteristic: their proper use requires intricate knowledge of the application and
expertise in the corresponding industry discipline. Obtaining such a level of knowledge and
expertise for all industry software that is used on a given project is very difficult and
often prohibitively expensive. End users tasked with creating a real-life project
BIM and using interoperable and non-interoperable software with it can face very steep
learning curves, and software that is sometimes at best in beta status and cannot easily do
what the user expects it to do. They are under pressure to meet tight regular deadlines,
and seldom have any meaningful additional resources to do their work on a given project
in a way different than before. After briefly trying information modeling, their typical
response is: "This does not work (for me/for this project/for my office/at all)." They then
revert to the old ways of working and to using "dumb" software.
Often overlooked is the impact of the old way of procuring buildings on
multidisciplinary teams that are assembled to work on a given project. As currently
practiced, their work is unnecessarily difficult: Communication is far from efficient, data
exchange is costly and riddled with errors and omissions, data sharing is not practiced and
is often practically impossible, and their work is behind schedule almost by definition.
In addition, their group experience and knowledge is seldom reused in another project
and is usually lost after the project is over. The work of multidisciplinary teams could
become much more efficient and effective with the use of BIM and virtual buildings.
BIM development and the use of virtual buildings today usually require help. This
help is now beginning to be available in the form of Virtual Building Environments, which
are designed to assist end users of industry software and serve as a breakthrough
mechanism to get building information modeling and virtual buildings deployed in the
industry.
4.1 What is a virtual building environment?
A Virtual Building Environment (VBE) is a place where a group of industry software is
operated by industry experts who are also experts in the use of that software. The primary
purpose of a VBE is to facilitate expert use of appropriate software applications in
conjunction with each other. A VBE employs software applications that, as a group,
define a building, its parts, its behavior and its performance. It involves simultaneous or
near-simultaneous simulation and display of data generated by multiple sources. A VBE
facilitates the manipulation of data that are used in the planning, design, construction and
operation of a building. It makes it possible to conduct experiments on the building or its
parts, without first erecting them. In summary, a VBE is a physical place (i.e. a location)
that facilitates the expert creation and use of virtual buildings.
Ideally, a VBE follows a building's entire life cycle, and the selection of software
changes correspondingly from that related to design, to that related to construction, to
that related to commissioning, to that related to operation and maintenance, and
eventually to that related to demolition. The selection of software and participating
experts supports broad definitions of design, construction and operations. For example,
the construction and maintenance processes can be planned and modeled along with the
building itself to evaluate constructability and maintainability early in a project.
Similar to a selected group of software, a VBE involves a group of experts. Group
members have the experience, expert knowledge and skills in both software applications
and industry disciplines the software is related to. They understand the relevance, the
meaning and the quality of data used in a particular industry project, as well as the
implications of decisions made in the use of software. They can solve problems and
define tasks appropriate to specific applications, and can create work-arounds within a
particular application if the application cannot deal with the problem or data as defined.
When a VBE is employed in a specific industry project, the group of experts includes
those who have expertise, knowledge and skills relevant to the particular project. From
the VBE perspective this group of experts is temporarily extended (for the duration of the
project) by staff or others from organization(s) that are working on the particular project
or are involved with it. From the project perspective these experts join the project team
temporarily to assist the team so it can more effectively use software needed for the
project, create the BIM and test its designs, solutions and/or plans in a virtual building.
4.2 VBE objectives
Other industries, such as the automotive and aerospace industries, have reaped significant
benefits from the use of IT. Virtual building environments should help the industry
experience and demonstrate explicit benefits from the use of contemporary IT in the
building procurement process: the use of groups of software to solve multidisciplinary
problems, the use of comprehensive project data repositories that contain all project data
(including historical data), the automation and semi-automation of repetitive tasks,
prompt access to expert knowledge, instantaneous distribution of complete data sets to all
who need them, seamless and instantaneous multi-directional exchange or sharing of
interdisciplinary project information, virtual collaboration, concurrent engineering, and
much more.
They can provide support to many different types of industry projects that can benefit
from the use of virtual buildings. These include, among many others: architectural,
engineering and interior design projects, to test design alternatives, refine decisions,
control building cost, and explain and communicate results; new construction and
reconstruction projects, to foresee and prevent problems in construction and its
sequencing, detect insufficient or missing information, and test and explain cost-effective
substitutions and/or deviations from design documents and specifications; energy
conservation projects, to test alternatives in heating, cooling and illuminating a building,
as well as alternative building energy management strategies; building security and
safety training, to explore "what-if" scenarios, prepare first-responder teams to provide
the most effective response in different emergency and disaster situations, and test
different response plans; and capital facilities projects, to minimize risk to owners and
operators by providing much more complete and reliable information about a given
building's design, construction and operation throughout its life cycle.
A VBE can be described as a "resource center" or a "center of excellence" that can
serve as:
(a) an industry-specific software deployment center for industry projects,
(b) a center of education, and
(c) a knowledge and technology development center.
4.2.1 Software deployment center for projects
As a software deployment center a VBE provides immediate expert help in the use of
established and new industry software. VBE experts can help industry project staffs
select the right software for the project, "hold (their) hands" (i.e. demonstrate how to use
the software to accomplish specific tasks) as they start using the software, and advise
them in the selection of proper choices they may make in the use of software.
Some of the industry software on the market today is still in initial stages of maturity.
Such software cannot successfully perform all of the tasks end users expect it to perform,
or it cannot perform its tasks if a building is unusual (i.e. not trivial), complex,
complicated or large. VBE experts can find ways around some of these problems (i.e.
develop "work-arounds") and report specific software shortcomings and their causes
directly to software developers, so the software can be improved. In that way a VBE can
become an important factor in making industry software more mature and robust.
Some of the interfaces of otherwise robust industry software to a data model (such as
IFC) or other software may not work properly under all circumstances. VBE experts can
detect and identify causes of such software interface problems and work with interface
authors to correct them.
Many organizations are hesitant to acquire costly new software their staffs do not
know how to use, even if it is recommended to them for a specific project. A VBE
provides an opportunity to use such software for the project without purchasing it,
experience firsthand the benefits from using it, and learn how to use it before purchasing
it. Thus a VBE can help spread and expedite the use of productive software in the
industry.

4.2.2 Center of education

A VBE provides opportunities to accumulate relevant knowledge, and to share knowledge
and learn. Few industry professionals currently have the knowledge and experience
needed to operate groups of software at the level required when dealing with complex
and complicated issues and problems. All too often project personnel are unable, hesitant
or not in a position to start learning on their own. A VBE provides opportunities for
members of commercial design and engineering office staffs, construction managers,
building operators and officials, code checking and enforcement officials, and others to
create a BIM and operate a virtual building under expert supervision while they are
productively working on their project. This provides them with the initial experience of
successfully doing so, which in turn may lead to the formation of an in-house partial VBE
in their organizations.
Industry-wide use of BIM-generating and virtual buildings software requires the
support of professional consultants who are at ease with this technology. A VBE provides
a framework to teach a new generation of such consultants by including them in the work
on VBE projects (i.e. industry projects temporarily conducted at a VBE). It provides
opportunities to consultants to join project teams and learn new skills (by participating,
without compensation, in the work on such projects) which they will then be able to
competently offer to the industry.
The professional industry workforce will have to develop additional skills that will enable
it to effectively utilize the new technology in daily work. With very few exceptions, these
skills are not taught systematically in today's professional schools, and many of their
graduates do not know or understand this technology and are ill at ease with it. The lack
of faculty members at institutions of higher learning who are knowledgeable in this area
of industry technology is one of the main reasons for that. A VBE provides opportunities
for faculty on leave of absence to work directly on such projects, and to learn and
assemble information needed for the development of new courses and curricula.
4.2.3 Knowledge and technology development center
A VBE also serves as a center of knowledge that is needed to identify and solve problems
that arise from the use of the new technology. These include problems encountered in
information modeling of complex buildings, in massive or selective data exchange, in
finding work-arounds, and in support of newly emerging industry tasks, to name a few.
A VBE provides a framework for research and development that will help software
developers deliver more useful software.
If not properly staged and controlled, intense exchange of project data can actually be
counterproductive. Issues of staging and control of industry data exchange are not yet
well understood. A VBE provides opportunities to determine the proper sequencing
between project information developed in different industry disciplines and that needed
by software applications not in those disciplines. Without proper sequencing of data
exchange, the use of some software may yield meaningless results.
The limits of data exchange, data sharing and interoperability among industry
software are not clear at present. A VBE provides opportunities to explore and learn what
these limits might be, and to explore and define ways of circumventing such limitations.



As a technology development center, a VBE provides talent to engage in limited
software development when such development is needed and is not provided by the
industry. When a group of critical industry software (i.e. "mission critical" software that
is the primary software used in the professional work of an industry discipline) would be
well served by new middleware, the talent at a VBE may develop and disseminate such
middleware. If a mission-critical application lacks the interface to make it
interoperable and its vendor cannot afford to develop it, a VBE may provide a framework
to develop the interface. When manufacturers cannot agree on a common format for their
products, a VBE may provide a framework to develop common product databases for
access by industry software.
As the need arises to make additional mission critical software interoperable, existing
data models of buildings (such as IFC) will have to be extended to expand their
functionality. A VBE may provide the necessary framework and expertise to develop and
implement new data model extension schemata.
To promote the idea and stimulate the formation of virtual building environments, the
Center for Integrated Facilities Engineering (CIFE) at Stanford University, Lawrence
Berkeley National Laboratory (LBNL) and VTT (Technical Research Center of Finland)
kicked off the VBE Initiative at the end of June 2002. The Initiative is an attempt to plan
and create initial virtual building environments that will eventually spread worldwide the
expert interdisciplinary deployment of multiple industry-specific software.
Since most governments, with the noted exceptions of those in Finland and Singapore,
have shown little real interest in and support for changing how the AECO industry operates,
the change will have to come from within the industry. The AECO industry will have to
experience and learn the benefits from developing and using BIM and virtual buildings
step by step, one project and project delivery staff at a time. It will have to learn how to
improve the way it operates today. The VBE Initiative is a pragmatic strategy to start that change.
The main goal of the Initiative is to propagate and operate virtual building
environments, create a global network of virtual building environments, and to promote
the opportunities these offer to AECO firms and organizations: opportunities to bring their
real life projects to a VBE, to have VBE experts help their staffs do their project tasks
more effectively, and to learn new things in the process, all at minimal (affordable) cost.
Those who take advantage of these opportunities will have a chance to experience how to
reduce professional and overall project delivery costs while increasing the value of their
work and product, deliver the building sooner, or operate the building at a much lower
cost with measurably fewer problems.
To be able to properly support the experts that are needed for the operation of a VBE,
institutions that host a VBE need modest longer-term funding not related to or coming
from industry projects. Thus, another goal of the VBE Initiative is to stimulate seed
funding for virtual building environments.
The Initiative has several additional goals that may affect and change how the industry
will operate in the future. These range from showing how to change and/or enhance
current industry processes to enabling industry software interoperability, and from
providing help to real life industry projects to educating professionals. Some goals,
such as assisting real life industry projects, are short term; others, such as helping
educate new generations of professionals, are longer term.
Because experts and talent needed to operate a VBE are still scarce, initial virtual
building environments are by necessity hosted at academic institutions and research
laboratories. If the VBE network is successful, the skill and expertise will gradually shift
to the industry; the support virtual building environments can provide now will become
less unique and the need for it will gradually diminish. With time, as the AECO industry
in general becomes more skillful and trusting in the use of the new software and
technology and effectively changes its work processes, the need to hold hands and
assist project staffs, and to serve as centers of education may completely dissipate.
The success of the Initiative so far has varied from country to country. The government in
Finland established a VBE at the Tampere University of Technology, and the major
Finnish property owner, Senaatti, has started several VBE pilot projects there. The
Commonwealth Scientific & Industrial Research Organization (CSIRO) has started a
number of pilot projects in Australia.
Seed funding for virtual building environments has been developing very slowly in the
USA. The work of experts at quasi-VBE facilities is instead mostly funded from requests
for specific expert service not otherwise available to the industry. In these cases the
multidisciplinary work is typically limited to only a small number of disciplines and
software (i.e. a small number of views of BIM), such as architecture, visualization,
mechanical engineering and building energy performance simulation and assessment, or
quantity take-off, construction process scheduling and visualization. Opportunities to
provide effective VBE support will greatly increase in the USA once a vendor supplies
interoperable cost estimating software whose results the industry can trust.

Figure 1. Aurora II design alternatives A (top) and B, compared for construction and life cycle costs. (By courtesy of Jiri Hietanen of the Tampere University of Technology.)
Still mostly in initial phases of development, existing virtual building environments
seem to share a global characteristic: a limited number of resident experts. By necessity, all
virtual building environments in existence so far started small, offering VBE experts
in only a few industry disciplines. When needed, other VBE experts are hired as
external consultants (if available); this increases the cost to the project and sometimes
delays its progress.
In June 2004, two years after the kick-off of the Initiative, its original authors
organized a VBE workshop at Stanford University. The workshop showed that several
pilot VBE projects are now in progress in different parts of the world. On two projects

that started earlier than others the first phase of work has been completed (both dealt with
the schematic phase of architectural design).
The Aurora II VBE project at the Tampere University of Technology compared two design
alternatives (Fig. 1) at the project development stage when conventional methods of
analysis, based on schedules of spaces and some qualitative information from the
building program, yield construction cost estimates expected to be within 20% and life
cycle cost estimates within 25% of later actual costs. By creating virtual buildings for
the two alternatives and discussing them simultaneously with the project client in an iRoom (a three-screen interactive workspace originally developed at Stanford University),
VBE experts decreased the inaccuracy of the early construction cost estimate to within
3% and the life cycle cost estimate to within 5% (Laitinen & Hietanen 2004).

Figure 2. Glare test for a typical office

in the e-Lab building using photometrically accurate lighting simulation.
The VBE project for the e-Lab at the LBNL (Bazjanac 2002) used virtual building
experiments to demonstrate various types of energy performance of typical office and
laboratory spaces, as well as the building envelope. Using a suite of 10 different directly
and indirectly interoperable simulation and visualization tools, it showed the future
occupants of these spaces and the client (US Department of Energy) in advance how these
spaces will function (Fig. 2).
When CAD software started being embraced in the AECO industry some 20 years ago, it
changed the nature of professional staffs in architecture and engineering. Offices replaced

large numbers of pencil-and-paper draftspersons with fewer (albeit somewhat higher
paid) CAD operators, who produced more drawings faster. This increased the volume of
work and output per payroll unit and soon made offices that switched to CAD more
competitive in the market. Information modeling and virtual buildings will inevitably
change the office landscape once again.
The most important new job position will be the Virtual Building Coordinator. This
position will require substantial knowledge in modeling and software use relevant to the
different industry disciplines that are part of that office's business. Qualified candidates
will often be hard to find and pricey; this role may have to be filled by specialist consultants.
The CAD draftsperson will be replaced by a BIM modeler. This position will require
skills in the use of computers and BIM authoring software.
The current VBE experts will evolve into high-powered consultants: cost estimators,
energy performance analysts, construction managers, etc. They will have the ability to
determine the right course of action to resolve a specific project issue, modify and/or
interpret information in external databases, create workarounds for software in their
specialties, effectively communicate and explain information, and more.
Another new job position will be that of a BIM keeper. The holder of this job will be
responsible for the maintenance, safeguarding and administration of a BIM throughout
the lifetime of the BIM. This position will require at least modest skills in information
modeling and substantial knowledge of collaborative engineering.
Some of the technical issues that surfaced in initial BIM authoring and the use of BIM
accessing software, such as data incompatibility, data model and software limitations, and
problems in file based exchange, were reported earlier (Bazjanac 2002).
It is now becoming increasingly clear that there are other major obstacles that are
slowing down the process of moving toward industry wide use of information modeling
and virtual buildings. These include poor quality of some of the BIM authoring and
accessing software (that is buggy, immature and/or not robust), difficulties in reaching
industry wide agreements on the definition of BIM views and/or on the implementation of
standard data model definitions in software, the small number of interoperable industry
specific software, issues in data sequencing when populating a BIM, and problems in
managing different resolutions of the same data as needed by different software.
The complete lack of aids for end users is glaring: there are no manuals, templates,
case studies published in sufficient detail, nor anything else to guide a newcomer in the
initial use of this technology. Missing also is a better understanding of measurable
benefits from the use of information modeling and virtual buildings, and of ways to
measure them. Recent work at CIFE (Fischer & Ju 2004) is beginning to address these issues.

The AECO industry is finally beginning to use IT, BIM and virtual buildings more
effectively. Virtual building environments are a strategy to spread the use of this
technology throughout the industry. A VBE provides opportunities to organizations in the
industry to get help: a structure (an organized way to do it), software, facilities and
experts who can guide the work related to building information modeling and virtual
buildings required by real life industry projects.
It is now up to industry organizations to bring their projects to VBE centers and take
advantage of these opportunities. The VBE Initiative represents the beginning of a global
VBE network that will hopefully help the entire industry take advantage of the new
technology. That will lead to different sets of industry processes, thorough testing before
building, and (eventually) much better designed, built and working buildings.
The author wishes to thank Ari Ahonen from Tekes (the Finnish National Technology
Agency), Prof. Martin Fischer from Stanford University, Jiri Hietanen from the Tampere
University of Technology, Arto Kiviniemi and Tapio Koivu from VTT and Stephen E.
Selkowitz from LBNL for their ideas and direct and indirect contributions in the
formulation of virtual building environments.
This work was partly supported by the Assistant Secretary for Energy Efficiency and
Renewable Energy, Office of Building Technology, Building Technologies Program of
the U.S. Department of Energy under Contract No. DE-AC03-76SF00098.
Bazjanac, V. 2001. Acquisition of building geometry in the simulation of energy performance. In
R.Lamberts et al. (eds), Building Simulation 2001, Proc. intern. conf., Rio de Janeiro, Vol. 1:
305–311. ISBN 85-901939-2-6.
Bazjanac, V. 2002. Early lessons from deployment of IFC compatible software. In Ž.Turk &
R.Scherer (eds), eWork and eBusiness in Architecture, Engineering and Construction, Proc.
fourth Euro. conf. product process modelling, Portorož, SLO: 9–16. Balkema. ISBN 90-5809-507-X.
BLIS. 2000. Building Lifecycle Interoperable Software. http://www.blis-project.org/
CABA. 2002. Continental Automated Buildings Association. http://www.caba.org/
CURT. 2003. Construction Users Round Table. http://www.curt.org/
FIATECH. 2002. http://www.fiatech.org/
Fischer, M. & G.Ju. 2004. Case studies of the implementation and valuation of Virtual Building
Modeling (VBM). CIFE SEED project (in progress), Stanford University.
http://www.stanford.edu/~fischer/VBECIFE0604
IAI. 1995. International Alliance for Interoperability. http://www.iai-international.org/
Laiserin, J. 2002. Comparing pommes and naranjas. The LaiserinLetter(tm), Issue 15.

Laitinen, J. & J.Hietanen. 2004. Aurora II project. VBE workshop. Stanford University.
http://www.stanford.edu/~fischer/VBECIFE0604
OSCRE. 2003. Open Standards Consortium for Real Estate. http://www.oscre.org/

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

A persistence interface for versioned object

Daniel G.Beer 1, Berthold Firmenich 2, Torsten Richter 1 & Karl Beucke 1
1 Informatik im Bauwesen, Bauhaus-Universität Weimar, Germany
2 CAD in der Bauinformatik, Bauhaus-Universität Weimar, Germany
ABSTRACT: Object models of current planning applications are very
complex, huge and individually structured. Complexity increases if they
are versioned for distributed processing. Such models have to be made
persistent. This paper proposes a persistence concept for such purposes.
The concept is independent of a specific persistence schema and
database. For a pilot implementation, alternatives are discussed briefly and
a specific persistence schema for a chosen database is described.

1.1 Distributed processing of common planning material
The distributed synchronous and asynchronous processing of common planning
material is the focus of current research (DFG 2004). New publications in the field of
CAD show concepts for the distribution of the planning material abstracted as objects and
object versions (Firmenich 2002): Work starts by loading a valid structured subset of
object versions from a common project into the planner's private workspace (Beer &
Firmenich 2003). Object versions of the workspace can be modified or deleted and new
object versions can be created independently from the project and the network with the
help of an integrated planning application (Beer et al. 2004). At the end of the work (long
transaction: from hours up to days) object versions are stored in the common project.
Operations to ensure consistency are described in (Firmenich 2002). A corresponding
system architecture for the distributed processing is shown in Figure 1. It is based upon a
project-workspace approach.
1.2 Objective and content
The unversioned model of the planning application has to be stored into (loaded from)
the persistent model of the common project.
The second section describes the general structure of an object oriented planning
application model. This is the basis for the persistence mechanism. Requirements on the
persistence mechanism are deduced.

Figure 1. System architecture for the distributed processing.
The persistent model of the project will be described in the third section. Versioning and
the selection of subsets that influence the persistence schema have to be considered.
Possible persistence schemas and data stores will be discussed and compared with each
other. A suitable persistence schema and data store are chosen for a pilot implementation.
The description of the persistence layer and a pilot implementation in the fourth
section concludes this contribution.
2.1 Object types and structuring
The model of a planning application is very complex. There are many objects
representing the numerous elements of a complex building. These objects are instances of a
wide range of classes, as we can see for example in the IFC model (IFC 2004). This huge
variety of classes also results from the different planning applications for specific planning
tasks involved in the common project. Thus, the objects are structured diversely. It is not
certain that all applications use a common labelling, and it cannot even be assumed that
there is one.

Figure 2. Types of object attributes.

2.2 Versioning
Most planning applications do not support versioning. The workspace manages an
unversioned model so that such applications can be used. The transformation between the
unversioned model of the workspace and the versioned model of the project is described
in (Beer et al. 2004).
2.3 Attributes
As shown in Figure 2, objects (a) can have attributes of different types:
Atomic values (f1) cannot be decomposed any further. Examples are strings or numbers as
well as objects that can be represented by strings, for example dates.
Objects (f2). There is a 1:1 relation to another object (b).
(Un)ordered sets of objects (f3). There is a 1:n relation to other objects belonging to a
set (s).
(Un)ordered relations of objects (f4). There is a 1:n relation to object tuples belonging
to a relation (r). To reduce complexity, only binary relations are considered in this
contribution. Relations of higher order can be treated analogously.
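The four attribute types can be pictured in a short Java sketch; the class and field names here are invented for illustration and are not part of the paper's model:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical planning object with the four attribute types of Section 2.3.
class Wall {
    String name = "w1";                                    // f1: atomic value
    Material material;                                     // f2: 1:1 relation to another object
    Set<Opening> openings = new LinkedHashSet<>();         // f3: 1:n relation (set of objects)
    List<Opening[]> adjacent = new ArrayList<>();          // f4: binary relation (ordered pairs)
}
class Material { String label; Material(String l) { label = l; } }
class Opening  { String id;    Opening(String i)  { id = i; } }

public class AttributeTypes {
    public static void main(String[] args) {
        Wall w = new Wall();
        w.material = new Material("concrete");
        Opening door = new Opening("d1"), window = new Opening("win1");
        w.openings.add(door);
        w.openings.add(window);
        w.adjacent.add(new Opening[] { door, window });    // ordered pair (d1, win1)
        System.out.println(w.name + " " + w.material.label + " " + w.openings.size());
    }
}
```
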
2.4 Inheritance
Objects can inherit attributes from a super class. This has to be considered if objects
extending a super class have to be made persistent.
3.1 Versioning
Each object of the planning application model is stored as an object version in the
common project (Beer et al. 2004). The mathematical foundations of this model are
described in (Firmenich 2002). An example is given in Figure 3.

Figure 3. Versioned model of the project (example).
Object versions (a4, b2) are elements of set M. The version history is stored as ordered
pairs of object versions ((a2, a4)) in the relation V. Dependencies between object versions
are stored as ordered pairs of object versions in relation B.
These three sets can be interpreted as a graph. The version history is shown by dashed
arrows, the dependencies are shown by continuous arrows.
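The three sets described above can be sketched in a few lines of Java; the version names (a2, a4, b2) follow the paper's example, while the Pair helper and the overall representation are assumptions of this sketch:

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Minimal sketch of the versioned project model of Section 3.1:
// M holds object versions, V the version-history pairs, B the dependency pairs.
public class VersionedModel {
    static class Pair {
        final String from, to;
        Pair(String from, String to) { this.from = from; this.to = to; }
    }
    public static void main(String[] args) {
        Set<String> M = new LinkedHashSet<>(Arrays.asList("a2", "a4", "b2"));
        List<Pair> V = Arrays.asList(new Pair("a2", "a4"));  // a4 is a revision of a2
        List<Pair> B = Arrays.asList(new Pair("b2", "a4"));  // b2 depends on a4
        // Interpreted as a graph: the versions in M are nodes, V and B are edge sets.
        for (Pair p : V) System.out.println(p.from + " -> " + p.to + " (version history)");
        for (Pair p : B) System.out.println(p.from + " -> " + p.to + " (dependency)");
    }
}
```
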
3.2 Subsets
A uniform access to differently structured objects is needed for the selection of subsets
from the project model. Therefore the native attributes should be used as the least
common base of all objects. As shown below (see Subsection 4.2), an object wrapper can
unify attributes with a common semantics.
A set algebra, or rather a language, is a flexible and easy way to describe subsets
(Beer et al. 2003). The feature logic (Zeller 1997) is a set algebra that uses attributes, or
so-called features. Thus, feature logic is well suited for object access via
attributes. Table 1 shows important operations that are partially based upon object attributes.
3.3 Persistence schema
A persistence schema tailored to feature logic is given in (Firmenich 2002). The schema
is shown in entity relationship notation in Figure 4. With the help of this schema, the
operations shown in Table 1 can be implemented straightforwardly.
Objects are stored as elements of a domain (domain). They are identified via a
generated name. Atomic features are elements of the domain as well. Values of atomic
features are stored in Atom. Features are stored in Feature. The connection between
object, feature and value is a relation of Relslot.
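A rough in-memory analogue of this schema can make the idea concrete. The names Domain, Atom, Feature and Relslot come from the paper; the Java representation below (string keys, maps) is an assumption of this sketch, not the relational implementation:

```java
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Illustrative in-memory analogue of the feature logic schema of Figure 4.
public class FeatureStore {
    final Set<String> domain = new LinkedHashSet<>();     // objects and values
    final Set<String> atoms = new LinkedHashSet<>();      // atomic values
    final Set<String> features = new LinkedHashSet<>();   // feature names
    // Relslot: connects an object and a feature to a value.
    final Map<String, String> relslot = new LinkedHashMap<>();

    void slot(String object, String feature, String value) {
        domain.add(object);
        domain.add(value);
        atoms.add(value);
        features.add(feature);
        relslot.put(object + "/" + feature, value);
    }

    public static void main(String[] args) {
        FeatureStore s = new FeatureStore();
        s.slot("a", "type", "Wall");   // object a has feature type with value Wall
        s.slot("a", "name", "w1");
        System.out.println(s.relslot.get("a/type"));   // prints "Wall"
    }
}
```
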

The schema requires a decomposition of objects into the object's identifier (name),
the object's attributes and the attributes' values. Solutions for mapping objects to
Table 1. Feature logic operations.

Operation      Meaning
               Value of the feature f of the object set S
f: S           Objects whose feature f has the value S
f: T           Objects that have a feature f
               Objects that have no feature f
               Features f and g have the same values
               Features f and g have different values
               Complement of S
               Either R or S
               R and S
               If R then S
               Only if R then S
Term equiv.    Equality of R and S

Figure 4. Feature logic persistence schema.
tables (Keller 1997) are known. They are discussed briefly and compared with the feature
logic schema in the following. This is done for all supported attribute types (see
Subsection 2.3).
Generally, queries against the data store are generated from feature logic terms with
the help of an interpreter (Richter et al. 2003). The feature logic persistence schema is
tuned for that purpose. Thus, the user has to speak feature logic instead of formulating
query expressions against the data store.
Mappings for the different types of attributes, object versions and inherited attributes
are discussed briefly and compared to the feature logic persistence schema in the
following paragraphs. Advantages and disadvantages of the feature logic persistence
schema are discussed in the last paragraph of this subsection.
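As a toy illustration of this idea, the selection term "f: S" from Table 1 (objects whose feature f has a given value) might be translated into SQL against the Relslot table. The column names and the SQL shape below are assumptions for this sketch, not the generated queries of (Richter et al. 2003):

```java
// Sketch: translating a feature logic selection term into a SQL query
// against an assumed Relslot(object, feature, value) table.
public class TermToSql {
    static String select(String feature, String value) {
        return "SELECT object FROM Relslot WHERE feature = '" + feature
             + "' AND value = '" + value + "'";
    }
    public static void main(String[] args) {
        // The term  type: Wall  becomes:
        System.out.println(select("type", "Wall"));
    }
}
```

A real interpreter would of course parse composite terms (intersection, complement, implication) and emit parameterised queries rather than concatenated strings.
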
3.3.1 Mapping of atomic attributes
The feature logic schema is used for mapping atomic attributes (f) to tables: They are
stored with their values (x, shown by a rectangular shape) as strings in different tables
(Fig. 5). The feature type stores the type of the object.

Figure 5. Mapping of atomic attributes.

Figure 6. Mapping of object attributes.

3.3.2 Mapping of object attributes

Example (Fig. 6): Object a has an object attribute h of type B with value b.
There are different concepts for mapping object attributes, i.e. 1:1 relations or so-called
references, to tables:
Single table aggregation (Fig. 6a) (Keller 1997)
Mapping: The attributes of the referenced object are stored in the same table as
the attributes of the referencing object to avoid references.
Advantages: Objects can be retrieved with a single table access.
Disadvantages: Attribute values of multiply referenced objects are stored
multiple times. If the reference is deeper than one hierarchy step, every change of
referenced types causes an adaptation of all referencing types. A dot notation or a
similar naming convention for the attributes of the referenced objects has to be
used (for example h.g). Furthermore, it is hard to formulate queries using
reference types.
Foreign key aggregation (Fig. 6b) (Keller 1997)
Mapping: A separate table for referenced types is used. Object identifiers (b) link
from the referencing to the referenced object.

Figure 7. Mapping of set attributes.

Advantages: This schema is more flexible and easier to maintain than variant
a. Referenced objects can easily be queried.
Disadvantages: A join operation or at least two database accesses are
needed. It is difficult to navigate from referenced objects back to the referencing objects.
Feature logic schema (Fig. 6c) (Firmenich 2002)

Mapping: The mapping is similar to variant b. The difference is that all objects
are stored in one table. Tables Atom, Domain and Feature are omitted in the example.
3.3.3 Mapping of set attributes
A set includes a number of other objects (elements). An example is shown in Figure 7.
There are different concepts for mapping set attributes to tables:
Foreign key association (Fig. 7a) (Keller 1997)
Mapping: The identifier of the set is inserted into the set element table (attribute
Advantages: The mapping schema does not collide with normal forms. Hence, it
allows reasonable maintenance costs. The space consumption is nearly optimal.
Disadvantages: Reading a set costs a join operation or two read operations.
Feature logic schema (Fig. 7b) (Firmenich 2002)
Mapping: Set elements are represented by tuples (elmc, Fig. 7c) that are included
in the set (s). They have a reference to the set element itself (c). The specific
features elm (element) and in (included) are used. Sorted sets (sequences) are
possible with the help of the specific feature i (index). Tables Atom, Domain
and Feature are omitted in the example.
3.3.4 Mapping of relation attributes
A binary relation includes a number of pairs of other objects. An example is shown in
Figure 8. The feature logic schema is used for mapping binary relations to tables:
Feature logic schema (Fig. 8a) (Firmenich 2002)
Mapping: Elements of the relations are represented by tuples (elmru, Fig. 8b) that
are included

Figure 8. Mapping of relation attributes.
Figure 9. Mapping of object versions.

in the relation. They have a reference to the elements (r and u). The specific
features in (included), src (source) and dst (destination) are used. Sorted
relations are possible with the help of the specific feature i (index). Tables
Atom, Domain and Feature are omitted in the example.
3.3.5 Mapping object versions
Object versions are part of the persistent model. An example is shown in Figure 9. The
feature logic schema is used for mapping the version history to tables:
Feature logic schema (Fig. 9) (Firmenich 2002)
Mapping: A revision (b2) of an object version (b1) is stored as the value of the
specific feature rev (revision). Tables Atom, Domain and Feature are omitted in
the example.
3.3.6 Mapping of inheritance
Objects can inherit attributes from a super class (Fig. 10). There are different concepts for
mapping object inheritance to tables:
One table for one inheritance tree (Fig. 10a) (Keller 1997)
Mapping: The union of all attributes of all classes in the inheritance tree is used as
the columns of a single table. Null values represent unused fields in each record.
Advantages: Reading/writing of base class descendants needs only one database
operation. Schema evolution is straightforward and easy as long as the inheritance
hierarchy does not become too deep.

Figure 10. Mapping of inheritance.

Formulating queries for all objects of the inheritance tree is fairly easy.
Disadvantages: The waste of space depends on the inheritance depth. The deeper
the hierarchy and the bigger the difference between the union of all attributes and
the attributes of an average object, the bigger the waste of space.
One table for every class (Fig. 10b) (Keller 1997)
Mapping: The attributes of each class are mapped to a separate table. An object
identifier links derived classes with their parent.
Advantages: The pattern provides a very flexible mapping. The space
consumption is nearly optimal. Schema evolution is straightforward.
Disadvantages: The mapping is performance expensive in terms of database
operations. The costs rise with the depth of the inheritance hierarchy. Queries are
hard to formulate as the mapping generally requires accessing more than one table.
One table for one inheritance path (Fig. 10c) (Keller 1997)
Mapping: The attributes of each class as well as the attributes inherited from
parent classes are mapped to separate tables.
Advantages: The mapping needs one database operation to read or write an
object. The space consumption is optimal.

Disadvantages: A polymorphic scan of all objects given by their super class
involves several tables. Inserting a new subclass means updating all polymorphic
search queries, but the structure of the tables remains untouched. Adding or
deleting attributes of a super class results in changes to the tables of all derived
classes. Queries are hard to formulate as the mapping generally requires accessing
more than one table.
Feature logic schema (Fig. 10d) (Firmenich 2002)
Mapping: The attributes of each class as well as the attributes inherited from
parent classes are mapped to the same table. Tables Atom, Domain and Feature
are omitted in the example.
3.3.7 Feature logic schema
The feature logic schema was chosen for the pilot implementation.
There is only one schema for all classes. It is independent of the typing. Thus, the
schema is very flexible and easy to maintain.
There is no redundancy. The schema fulfils the requirements of the normal forms.
Objects can be used in different sets or relations with the help of tuple elements. They
are stored only once.
Queries can be formulated very easily with the help of an extended feature logic (Beer et
al. 2003). The query is transformed with the help of a database independent compiler
(Richter et al. 2003).
References can be traversed via recursive queries (Price 2002). They are encapsulated
by operations of an extended feature logic (Beer et al. 2003).
Objects can be retrieved only via multiple table accesses.
Set and relation elements have to be loaded in a separate step after the set or the
relation itself. This results in multiple queries. They may be optimised and executed as
batches by the data store.
3.4 Data store
Different types of data stores for a pilot implementation have been investigated.
Files/file system: The file input/output mechanism is relatively slow compared to RAM
access. Furthermore, there are no adequate methods for queries on files. That is why
file formats like XML are not suitable.
Relational database management systems (RDBMS) have been proven in commercial
use and are technologically well established. They are well suited for the handling of
mass data. RDBMS offer very flexible and easy data access via a standardised query
language (SQL). RDBMS can store relations in tables. Thus, the object oriented
application model has to be mapped to the persistent relational model. The relational
model corresponds with the feature logic persistence schema.

XML databases offer new concepts. They are very generic, so such databases are
slower than relational database management systems. XML databases do not offer
better access to the stored objects than RDBMS.
Object oriented database management systems (ODBMS). Currently available systems
do not support the flexible selection of subsets with the help of a powerful query language.
Java Data Objects (JDO) is a persistence standard for Java objects. It defines interfaces
that allow for data store independent persistency. Different types of data stores can be
used depending on the commercial implementation that is used. However, the
selection of subsets is limited.
Specific databases for engineering applications are in the focus of current research
(Biltchouk & Pahl 2003). The objective is direct access to engineering objects.
Such data stores may be used in the future.
An RDBMS has been used for the pilot implementation.
4.1 Properties
Flexibility: Existing objects without persistency convention as well as new objects
designed for this purpose are supported (see Subsection 4.2).
Independency: The persistence layer is independent of the data store used by a pilot
implementation. The proposed system architecture (Fig. 11) shows interfaces and
adapter classes that implement general functionality.
Performance: Memory access is much faster than file system access. Furthermore, the
storage of objects can be done in parallel with the running application so that the
loading of objects has to be optimised.
Maintenance: The schema used is a very simple schema for all classes. That is why
redundancy is no problem for maintenance.
4.2 Attribute access
Objects to be made persistent have to publish all necessary attributes to be stored.
Attributes have to be read for storing into the project model and to be written for loading
objects from the project model. There are different concepts for implementation:
Reflection: The programming language Java (Horstmann & Cornell 2002) offers a
mechanism called reflection that delivers all public attributes. Private attributes cannot
be accessed, so objects cannot be made persistent completely. Public attributes may
not be sufficient to recreate the object.
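The limitation can be demonstrated with plain Java reflection (an illustrative sketch; the Wall class and its attributes are hypothetical):

```java
import java.lang.reflect.Field;

public class ReflectionLimits {
    // A class with one public and one private attribute.
    static class Wall {
        public double height = 2.8;
        private String material = "brick"; // invisible to getFields()
    }

    // Class.getFields() returns only the public fields, so the private
    // attribute cannot be read this way and the object cannot be
    // recreated completely from the reflected state.
    static int publicFieldCount(Class<?> cls) {
        return cls.getFields().length;
    }

    public static void main(String[] args) {
        System.out.println(publicFieldCount(Wall.class)); // prints 1: only 'height'
    }
}
```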

A persistence interface for versioned


Figure 11. Persistence concept.

Interface: Objects to be stored have to implement an interface that defines methods for
attribute access. This can be done in the source code (manually) or the compiled code.
This is called source/byte code enhancing. The mapping information has to be
provided in a separate file (Jordan & Russell 2003). However, existing classes
sometimes cannot be changed, so they cannot implement an interface.
Conventions: Naming conventions, like JavaBeans for reusable software
components, offer standardised attribute and method access. However, most existing
classes do not fulfil such conventions.
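A minimal sketch of such a convention at work, assuming plain JavaBeans getter naming (the helper below is ours, not part of the paper):

```java
public class BeanConvention {
    // Derive the JavaBeans attribute name from a getter method name:
    // "getHeight" -> "height". Classes that do not follow this
    // convention cannot be handled generically this way.
    static String attributeName(String getterName) {
        if (!getterName.startsWith("get") || getterName.length() <= 3) {
            throw new IllegalArgumentException("not a JavaBeans getter: " + getterName);
        }
        return Character.toLowerCase(getterName.charAt(3)) + getterName.substring(4);
    }

    public static void main(String[] args) {
        System.out.println(attributeName("getHeight")); // prints "height"
    }
}
```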
Wrapper: Classes that wrap existing classes and fulfil a given interface are well
suited. The mapping between public attributes, attributes behind methods and
persistent attributes is very flexible. Only those attributes needed to recreate an object
have to be stored. They can differ from the object's attributes and they can be named
in a standardised way to have a common view on common attribute names. Specific
attribute names outside a standard are possible, too. It is assumed that the persistence
wrapper class returns the attributes of the wrapped class as well as the inherited
attributes of its super class(es).
For the access to the object's attributes, the wrapper concept in combination with the
interface concept is used. Thus, it supports existing objects as well as newly designed
objects. Wrapper classes for all application defined classes implement a specific
persistence interface PersistenceCapable (Listing 1, simplified: no exceptions, most
important methods).
The PersistenceCapable interface defines methods for the exchange of attributes and
their values via strings. Thus, the type conversion is not a generic task of the database but
can be done in the wrapper object individually. This improves maintainability because of
the independency from the data types offered by the database used. Furthermore, the
wrapper concept allows for a common structure of differently structured objects.
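A reduced sketch of this string-based exchange (the Beam and PCBeam classes are hypothetical stand-ins for an application class and its wrapper):

```java
public class WrapperSketch {
    // An existing application class that cannot be changed.
    static class Beam {
        private int length; // millimetres
        int getLength() { return length; }
        void setLength(int l) { length = l; }
    }

    // Persistence wrapper: converts the typed attribute to and from
    // strings, so type conversion happens here, not in the database.
    static class PCBeam {
        final Beam beam;
        PCBeam(Beam b) { beam = b; }
        String getAtomicProperty(String name) {
            if (name.equals("length")) return Integer.toString(beam.getLength());
            throw new IllegalArgumentException(name);
        }
        void setAtomicProperty(String name, String value) {
            if (name.equals("length")) beam.setLength(Integer.parseInt(value));
            else throw new IllegalArgumentException(name);
        }
    }

    public static void main(String[] args) {
        PCBeam pc = new PCBeam(new Beam());
        pc.setAtomicProperty("length", "8400"); // restore from persistent string
        System.out.println(pc.getAtomicProperty("length")); // prints "8400"
    }
}
```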



Standardised attribute names (e.g. IFC conformant names) are possible by changing the
original names, as are specific context dependent attribute names.
interface PersistenceCapable {
    // Atomic properties
    PropIterator getAtomicProperties();
    void setAtomicProperty(
        String name, String value
    );
    // Aggregations
    PropIterator getAggregationProperties();
    void setAggregationProperty(
        String name, String objID
    );
}

interface PropIterator {
    // Iteration
    boolean iterate();
    // Property information
    String getName();
    Class getType();
    Object getValue();
    boolean isChangeable();
}

Listing 1. PersistenceCapable and PropIterator interfaces.






4.3 System architecture

The persistence concept (Fig. 11) has three layers: the Atomizer, the PersistenceSchema
and the DatabaseManager. They are managed by the PersistenceManager. The simplified
interface is shown in Listing 2.
The concept is independent from the persistence schema and the database used. A
specific PersistenceManager class manages all persistence classes and instantiates
specific classes for the specific schema and database used. PersistenceCapable classes can
be registered to wrap existing objects (method registerPCClass). The PersistenceManager
can store (a set of) objects and version relations (e.g. method storeObject) as well as load
a subset of the persistent model described by a query (method load). The return values
are persistent identifiers created by the project (case store) and pairs of persistent
identifiers and associated objects (case load), respectively. When storing (a set of) objects, the
PersistenceManager calls the Atomizer's decompose methods (e.g. method
decomposeObject) to add the object information to the persistent schema managed by the
PersistenceSchema.
The Atomizer (Listing 3) has the task of a serialiser. It decomposes the object graph
into object types and attributes with atomic values. The object graph is traversed by
object references (Fig. 2). Sets and binary relations are supported. They have to be
distinguished because there are different container classes (Set, Map) defined by Java. A
specific persistence schema is managed (method setPersistenceSchema). The created
persistent identifier of an already decomposed object is used for all other references of this
object. The Atomizer can store a schema (method store) and load a subset of the
persistent model described by a query (method load). The return values are void (store)
and pairs of persistent identifiers and associated objects (load), respectively. The class
information is stored as the value of a specific feature type to instantiate the object. The
Atomizer interface is implemented by the AtomizerImpl class.
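The identity-preserving traversal can be sketched as follows (a simplified stand-in for the Atomizer, not the paper's implementation; persistent identifiers are plain strings here):

```java
import java.util.ArrayList;
import java.util.IdentityHashMap;
import java.util.List;
import java.util.Map;

public class AtomizerSketch {
    // A tiny object graph node with one reference attribute.
    static class Node {
        Node ref;
    }

    static final Map<Object, String> IDS = new IdentityHashMap<>();
    static final List<String> ROWS = new ArrayList<>();
    static int next = 0;

    // Decompose the graph: each object gets exactly one persistent id;
    // a second visit reuses the id instead of storing the object again.
    static String decompose(Node n) {
        if (n == null) return "null";
        String id = IDS.get(n);
        if (id != null) return id;        // already decomposed
        id = "obj" + (next++);
        IDS.put(n, id);
        ROWS.add(id + ".ref=" + decompose(n.ref));
        return id;
    }

    public static void main(String[] args) {
        Node a = new Node(), b = new Node(), c = new Node();
        a.ref = b;
        c.ref = b;                         // shared reference to b
        decompose(a);
        decompose(c);
        System.out.println(ROWS);          // b stored once, referenced twice
    }
}
```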
interface PersistenceManager {
    // Registration
    void registerPCClass(
        Class pcCls, Class objCls
    );
    // Persistence functionality
    String storeObject(Object o);
    String storeSet(Set s);
    void storeObjectVersionRelation(
        String PID1, String PID2
    );
    Map load(String query);
}

Listing 2. PersistenceManager interface.

interface Atomizer {
    // Persistence schema
    void setPersistenceSchema(
        PersistenceSchema ps
    );
    // Decomposition and storage
    String decomposeObject(Object o);
    String decomposeSet(Set s);
    void store();
    // Loading and recomposition
    Map load(String query);
}

Listing 3. Atomizer interface.

The PersistenceSchema (Listing 4) is called by the Atomizer (e.g. method
addObjectSchema) to store a specific type of attribute (Fig. 2) and create specific schema
information (Figs 5-10). The schema can be stored (method storeSchema). When loading
subsets from the persistent model described by a query, a new schema is created (method
createSchema). Objects represented by the schema can be received via method
getObjects. Attributes and attribute values for a specific object identified by the persistent
identifier (objID) are returned by method getProperties. The PersistenceSchema is
implemented by the PersistenceSchemaAdapter class. For the feature logic schema used (Fig.
4) a specific class PersistenceSchemaFL was implemented. This allows for changing the
schema used.
interface PersistenceSchema {
    // Create schema from object tree
    String addObjectSchema(Class type);
    void addAtomicPropertySchema(
        String objID, String name, String val
    );
    void addNonAtomicPropertySchema(
        String objID, String name, String vID
    );
    String addSetSchema(Class type);
    void addSetElementSchema(
        String setID, String elmID
    );
    void addObjectVersionRelationSchema(
        String PID1, String PID2
    );
    // Store/load schema to/from database
    void storeSchema();
    void createSchema(String query);
    // Schema information
    Set getObjects();
    Map getProperties(String objID);
}

Listing 4. PersistenceSchema interface.

interface DatabaseManager {
    // Initialisation
    void init(String driver, String url,
        String user, String pwd
    );
    // Reading from database
    ResultSet load(String sql);
    ResultSet loadBatch(
        String sql, String[] batch
    );
    // Writing to database
    void store(String sql);
    void storeBatch(
        String sql, String[][] arguments
    );
}

Listing 5. DatabaseManager interface.



The DatabaseManager (Listing 5) defines methods for accessing a database. The
DatabaseManagerAdapter implements relational database access. Specific classes (e.g.
DatabaseManagerOracle) support specific database access.
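The adapter idea can be sketched as follows (a reduced, hypothetical version of the Listing 5 interface; an in-memory class stands in for a concrete manager such as DatabaseManagerOracle, so the sketch runs without a database server):

```java
import java.util.ArrayList;
import java.util.List;

public class AdapterSketch {
    // Reduced version of the DatabaseManager interface from Listing 5
    // (ResultSet replaced by a plain list so the sketch is self-contained).
    interface DatabaseManager {
        void store(String sql);
        List<String> load(String sql);
    }

    // Stand-in for a concrete class such as DatabaseManagerOracle:
    // it records statements instead of talking to a server, but the
    // calling code depends only on the interface.
    static class InMemoryDatabaseManager implements DatabaseManager {
        final List<String> log = new ArrayList<>();
        public void store(String sql) { log.add(sql); }
        public List<String> load(String sql) { return new ArrayList<>(log); }
    }

    public static void main(String[] args) {
        DatabaseManager db = new InMemoryDatabaseManager();
        db.store("INSERT INTO features VALUES ('obj1','length','8400')");
        System.out.println(db.load("SELECT * FROM features").size()); // prints 1
    }
}
```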
4.4 Pilot implementation
The interfaces described above have been implemented. An Oracle database and the
feature logic schema are used.

Figure 12. Example: Object a to be made persistent.
class PCA implements PersistenceCapable {
    // Attribute of type A
    private A a;
    // Object instantiation
    void createObject() {a = new A(...);}
    // Atomic attribute access
    PropIterator getAtomicProperties() {
        return new PropertyIterator() {
            int i = 0;
            boolean iterate() {return i++ < 1;}
            String getName() {return "f";}
            Class getType() {
                return getValue().getClass();
            }
            Object getValue() {
                return a.getf();
            }
        };
    }
    // Object attribute access
    PropIterator getAggregationProperties() {
        return new PropertyIterator() {
            int i = 0;
            boolean iterate() {return i++ < 1;}
            String getName() {return "g";}
            Class getType() {
                return getValue().getClass();
            }
            Object getValue() {
                return a.getg();
            }
        };
    }
}
Listing 6. Persistence wrapper for class A.

4.5 Persistence wrapper: an example
It is assumed that we have the situation described in Figure 12a: an object a of type A
with atomic attribute f (value v), object attribute g (reference to object b) and set
attribute h (reference to set s with objects c, d and e as elements) has to be stored.
A wrapper class specifically for class A has to be provided (Figure 12b). The wrapper
has to implement the PersistenceCapable interface and manage objects of type A to ask
for attributes and create new instances, as shown in extracts in Listing 6.
Furthermore, the wrapper has to be registered with the project via its registerPCClass
method. A wrapper class can be registered for several application classes to
manage these classes in common.
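Such a registration could look as follows (a minimal sketch with placeholder classes; not the paper's registry implementation):

```java
import java.util.HashMap;
import java.util.Map;

public class RegistrySketch {
    static class A {}
    static class B {}
    static class PCWrapper {}   // stands in for a PersistenceCapable wrapper

    // Application class -> wrapper class, as built up by registerPCClass.
    static final Map<Class<?>, Class<?>> REGISTRY = new HashMap<>();

    static void registerPCClass(Class<?> pcCls, Class<?> objCls) {
        REGISTRY.put(objCls, pcCls);
    }

    public static void main(String[] args) {
        // One wrapper class registered for several application classes.
        registerPCClass(PCWrapper.class, A.class);
        registerPCClass(PCWrapper.class, B.class);
        System.out.println(REGISTRY.get(B.class).getSimpleName()); // prints "PCWrapper"
    }
}
```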
ACKNOWLEDGEMENTS
The authors gratefully acknowledge the financial support of the project interCAD by the
German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) within the
scope of the priority program Network-based Co-operative Planning Processes in
Structural Engineering (SPP 1103). We also want to thank the members of the working
group Distributed product models for the productive co-operation.



REFERENCES
Beer, D.G. & Firmenich, B. 2003. Freigabestände von strukturierten Objektversionsmengen
in Bauprojekten. In Digital Proceedings des Internationalen Kolloquiums über Anwendungen
der Informatik und Mathematik in Architektur und Bauwesen (IKM). Weimar: Bauhaus-Universität Weimar. www.uni-weimar.de/ikm (17 May 2004).
Beer, D.G., Firmenich, B. & Beucke, K. 2003. Motivation für eine Sprache zur Handhabung
strukturierter Objektversionsmengen. In Kaapke, K. & Wulf, A. (eds), 15. Forum Bauinformatik:
128-135. Aachen: Shaker.
Beer, D.G., Firmenich, B., Richter, T. & Beucke, K. 2004. A Concept for CAD Systems with
Persistent Versioned Data Models. In Digital Proceedings of the Tenth International Conference
on Computing in Civil and Building Engineering (ICCCBE-X). Weimar: Bauhaus-Universität Weimar.
Biltchouk, I. & Pahl, P.J. 2003. Entwicklung einer Datenbasis für Anwendungen im Bauwesen. In
Digital Proceedings des Internationalen Kolloquiums über Anwendungen der Informatik und
Mathematik in Architektur und Bauwesen (IKM). Weimar: Bauhaus-Universität Weimar.
DFG 2004. DFG priority program 1103: Network-based co-operative planning processes in
structural engineering. http://www.iib.bauing.tu-darmstadt.de/dfg-spp1103/en/index.html (17
May 2004).
Firmenich, B. 2002. CAD im Bauplanungsprozess: Verteilte Bearbeitung einer strukturierten
Menge von Objektversionen. Aachen: Shaker.
Horstmann, C.S. & Cornell, G. 2002. Core Java 2 - Expertenwissen. München: Markt+Technik.
IFC 2004. International Alliance for Interoperability: Industry foundation classes (IFC).
http://www.iai-international.org/iai_international/Technical_Documents/iai_documents.html (17
May 2004).
Jordan, D. & Russell, C. 2003. Java data objects. Beijing [u.a.]: O'Reilly.
Keller, W. 1997. Mapping Objects to Tables. In Proceedings of the Second European Conference on
Pattern Languages of Programming (EuroPLoP '97). Siemens Technical Report 120/SW1/FB.
Munich: Siemens.
Loney, K. & Theriault, M. 2001. Oracle 8i DBA-Handbuch. München [u.a.]: Hanser.
Price, J. 2002. Oracle9i JDBC programming. New York [u.a.]: McGraw-Hill/Osborne.
Richter, T., Firmenich, B. & Beucke, K. 2003. Ein Java-Paket zur Verarbeitung von
Datenstrukturen in beliebigen Datenquellen. In Kaapke, K. & Wulf, A. (eds), 15. Forum
Bauinformatik: 136-146. Aachen: Shaker.
Zeller, A. 1997. Configuration Management with Version Sets - A Unified Software Versioning
Model and its Applications. Braunschweig: Fachbereich Mathematik und Informatik.

eWork and eBusiness in Architecture, Engineering and Construction - Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Semantic parameterized interpretation: a new

software architecture for conceptual design
Informatics and Mathematical Modelling, Technical University of Denmark
ABSTRACT: Class-based CAD systems make up the majority of systems for
building design. However, such systems lack dynamics in two respects:
objects cannot be specialized incrementally, and introducing new classes
or properties requires code restructuring and recompilation. The two sorts
of dynamics are, however, attractive as they are necessary to support
incremental design.
The present paper suggests a new software architecture which offers the
two kinds of dynamics. It does so by understanding the design system as
an interpreter of conceptual models. In these models, free terms can be
introduced when these are needed for expressing the current design. The
meanings of the terms are defined in a semantics which becomes a
parameter to the interpretation mechanism.
The result of an interpretation is a certain perspective, a view, on the
model. Such a view can be a representation for visualisation, equations for
stress analysis, or other information derived from the model.

The introduction of object-orientation has had a tremendous impact on the foundation and
software architectures for computer aided design systems (CAD systems). Such systems
are today mostly object-oriented and founded on built-in class hierarchies for the kinds of
objects that can be added to a design. Examples are classes of walls, ceilings, beams,
doors, and furniture. A design in this respect is a configuration of objects, and
designing is a process of composition in which objects of classes are sequentially added
to the design configuration.
Class-based languages make up the majority of object-oriented languages. In class-based
languages, information is represented in objects by means of properties. Objects are
instances of classes and contain methods for representing and manipulating object data.
Classes, being type specifications for objects, group objects by definitions (Abadi and
Cardelli 1996).
An object, in object-orientation, is a named record of properties. Some of these
represent intrinsic and extrinsic properties like material and position in space,
respectively. Other properties denote functions (methods) for technical calculations,

administrative procedures, and visualisation.
In building design, the objects often represent potential artefacts. That is, man-made
physical things which may come into presence in future. Most mechanical and
architectural designing concerns artefacts. However, designing an artefact is not just a
process of selecting a predefined object class or determining values for a set of
predefined properties. That would obstruct the creative process of designing, as it
limits the space of possible artefacts to design.
In (Ekholm 1995; Ekholm and Fridqvist 1998; Fridqvist 2000; van Leeuwen 1999; Eir
and Ekholm 2002), foundations for conceptual design systems have been explored.
Lately, such foundations, together with the incremental process of designing, have
been modelled using mathematical abstractions in order to formally capture the basic
structures, the intrinsics, of the design process (Eir 2004a; Eir 2004b; Eir 2004c). Such
intrinsic domain models are believed to serve as solid foundations for requirements and
design of software systems (Bjørner 1997).
It is argued that properties of objects are fundamental ontological entities in artefact
descriptions, and that the design process can be seen as an incremental process which
emphasizes the notion of multiple inheritance of properties. This understanding appears
to be a suitable abstraction of design as a problem solving process. Furthermore,
understanding design as an incremental process of set-inclusion of properties, appears to
be a model capturing the essence of designing at early as well as at late stages.
The approach is called property-orientation in contrast to object-orientation. An
essential difference between property-orientation and the class-centered approach of
object-orientation is that property-orientation maintains the property of subsumption
In incremental designing, a solution to a technical problem is approached by
incrementally ascribing properties to design objects. The process goes from the rough
sketches of e.g. dimensions to more and more detailed descriptions. In this understanding,
designing is incrementally ascribing a collection of properties to each object. This
collection gives the object in question certain dispositions such that it offers certain
functionality in itself or in a complex.
However, trying to introduce incremental designing in an object-orientation
environment, yields a problem. Classes are types in the language in which the design
program is written and changing a class by adding a property, requires code restructuring
and recompilation.
Thereby, object-oriented design systems lack dynamics in two respects. The first is
that the systems do not support incremental design; i.e. a process in which objects are
being specialised incrementally by their kinds. In object-orientation, each object is born
with a number of properties; those specified by its class. The second respect is that
introducing new sorts of properties means restructuring the class hierarchy which again
may require code restructuring and recompilation of the design program code. As a
result, over time the class system may become complicated and even inconsistent (Eir
In this paper, we suggest a new software architecture which supports incremental
design. The idea is to separate the conceptual models of artefacts from the perspectives
that can be put on such models.



In object-oriented CAD systems, certain methods represent different kinds of semantic

information. One method could be a paint method which encodes a wall object and
directs the graphical framework on how to display such objects. Another method could be
one for calculating stress for wall objects of the class. A third method could be one which
lists the bill of material for making a wall of this kind.
The three methods are all relying on a conceptual model for wall objects. In the first
case for describing shape, in the second for making engineering calculations, and in the
third for making quantitative summations. The core information is the conceptual data,
and each method puts on its own perspective by interpreting these conceptual data.
The software architecture which is presented in this paper, aims at making this
distinction explicit. It is founded on the idea that conceptual models are interpreted in
various ways; each defining a specific perspective on the object in question. The principle
thus moves class specifications to a dynamic level, thereby facilitating that properties can
be introduced freely when these are needed in order to express the current design in mind.
However, in order to succeed, the meanings of the terms that are taken to denote these
properties, need to be specified. This is done in a semantic specification which becomes a
parameter to the interpretation mechanism. The architecture is specified formally using
the RAISE Specification Language (RAISE Language Group 1992).
Our approach commits to a certain ontological picture of objects and properties. The aim
is to separate entities which are intimately connected with design objects, from the ones
which are not. The material of an object is intimately connected to an object as the
existence of the property of being made of a certain material does not depend on the
existence of other objects. In contrast, the load-bearing ability of a beam is something
which is only potential. It comes into play only if other objects are present.
This perspective draws on Shoemaker's ontology, which is an epistemological quest
for justifying how we can know that an object has certain properties and what it means
that it has these properties (Shoemaker 1997). In short, our application of the ontological
picture is this: the meaning of an object is its intrinsic (and, if necessary, extrinsic)
properties, and the meaning of having these properties (and thus the causal meaning of
the object) is the collection of so-called causal dispositions. We can think of these
dispositions as the functionality offered by the object. From a collection of intrinsic
properties, we derive that the object in question has certain dispositions. If a beam has
certain dimensions and is made of a certain material, the beam possesses a specific
strength and flexibility. However, these are only potential until the object is placed in a
causal environment where laws apply. The notion of functionality is this combination of
intrinsic properties which are positioned in the causal environment through extrinsic
properties like position in space.
The specification of intrinsic properties states the ontological knowledge intimately
connected to the object. Its functionality, behaviour, and possible application is an
interpretation of these properties. Such an interpretation may define what the object looks
like from various angles, or it may be the physical-law interpretation of the properties.



Often, what is grasped when reading a design or requirements specification, is a

collection of images constituting the meaning of what is read. Making various different
interpretations of the same conceptual model corresponds to giving a collection of such
images. This collection can be seen as our computerized way of representing the
meaning of an artefact as an alternative to having it as a causally measurable thing.
Artefacts like buildings can often be represented as conceptual models which list the
properties of the artefact in mind.
Consider the I-profile on Figure 1, which shall serve as our intuition in the following:
The properties of the I-profile can be represented by the following conceptual model:
[width ↦ 400, height ↦ 600, length ↦ 8400, thA ↦ 90, thB ↦ 140, material ↦
steel, sort ↦ IPE]
Assuming a certain way of understanding the names in the model, we can make the
drawing on Figure 1. A conceptual model is meant for interpretation and many different
perspectives can be put on this model. Such perspectives include representation of
visualisation, calculations for stress analysis, etc.
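A minimal sketch of this substitution step (in Java rather than AutoLISP; the attribute names follow the model above, the helper itself is ours):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TemplateSubstitution {
    // Conceptual model of the I-profile: attribute -> value.
    static final Map<String, String> MODEL = new LinkedHashMap<>();

    // Replace each free term of the template by its model value.
    // Whole-word matching keeps 'thA' from matching inside other names.
    static String saturate(String template, Map<String, String> model) {
        String view = template;
        for (Map.Entry<String, String> e : model.entrySet()) {
            view = view.replaceAll("\\b" + e.getKey() + "\\b", e.getValue());
        }
        return view;
    }

    public static void main(String[] args) {
        MODEL.put("width", "400");
        MODEL.put("height", "600");
        String template = "(list line (/ width 2) (/ height 2))";
        System.out.println(saturate(template, MODEL));
        // prints "(list line (/ 400 2) (/ 600 2))"
    }
}
```

Evaluating the remaining arithmetic sub-expressions would be the task of the calculus semantics introduced later in the paper.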
We shall now apply a language-oriented understanding of software systems. In this
understanding design models are written in a formal language. This language needs to be
extensible as we require that new properties can be introduced when these are necessary
for expressing the current design idea. Properties are simply names (attributes) mapping
to value names. Thus [material ↦ steel] and [length ↦ 3400] would correspond to
attribute-value pairs for properties of a certain beam object, or of a beam class in general.
However, even though such names may make sense in our normal understanding of
artefacts, they do not in computing systems until we formally specify the meaning of
them (Eir 2004c). In class-based systems, the meanings of collections of properties are
embedded within certain class methods. E.g. a paint method specifies how intrinsic
properties like those of dimension, material and form, are to be expressed visually. The
method defines a sort of interpretation of the data for doing so. However, adding the
desired dynamics concerning properties calls for separating such semantic information
from the design program.

Figure 1. Steel profile.



We can get a representation for visualising the I-profile by using the conceptual model as
a substitution of free terms in an AutoLISP expression like:

(list
  (list line (- (/ width 2)) (/ height 2)
             (/ width 2) (/ height 2))
  (list line (/ width 2) (/ height 2)
             (/ width 2) (- (/ height 2) thA))
  (list line (/ width 2) (- (/ height 2) thA)
             (- (/ width 2) thB) (- (/ height 2) thA))
  (list line (- (/ width 2) thB) (- (/ height 2) thA)
             (- (/ width 2) thB) (- thA (/ height 2)))
  (list line (- (/ width 2) thB) (- thA (/ height 2))
             (/ width 2) (- thA (/ height 2)))
  (list line (/ width 2) (- thA (/ height 2))
             (/ width 2) (- (/ height 2)))
  (list line (/ width 2) (- (/ height 2))
             (- (/ width 2)) (- (/ height 2)))
  (list line (- (/ width 2)) (- (/ height 2))
             (- (/ width 2)) (- thA (/ height 2)))
  (list line (- (/ width 2)) (- thA (/ height 2))
             (- thB (/ width 2)) (- thA (/ height 2)))
  (list line (- thB (/ width 2)) (- thA (/ height 2))
             (- thB (/ width 2)) (- (/ height 2) thA))
  (list line (- thB (/ width 2)) (- (/ height 2) thA)
             (- (/ width 2)) (- (/ height 2) thA))
  (list line (- (/ width 2)) (- (/ height 2) thA)
             (- (/ width 2)) (/ height 2)))

Applying the substitution and evaluating algebraic expressions gives what we shall call a
view on the model. The language in which it is written, we call a target language (t:j).
Entering the view in AutoCAD makes this tool display a visualisation of it, as shown on
Figure 2.
Using the substitution, we can also get a MATLAB expression of the shear stress
distribution as a function of the thickness of the flanges:
x1 = -height/2 : step : -height/2 + thA;
xm = -height/2 + thA : step : height/2 - thA;



x2 = height/2 - thA : step : height/2;
X = [x1 xm x2];
S = [S1 Sm S2];
T = 0 : 10 : thA;

Figure 2. Visualisation of I-Profile.



Figure 3. Visualisation of stress in the flanges.

Applying the procedure mesh to the result in MATLAB makes this tool display a stress
visualisation as shown on Figure 3.
The expressions having free terms to be bound, we call templates. A specification
which determines the conditions for applying a certain template, is called a semantics.
Thereby, a semantics defines the meanings of sets of properties in terms of target
language expressions.
In general, different semantic specifications may apply to the same model, and
different models may use the same semantics. This principle is depicted on Figure 4,
where mi denote conceptual models, Sj denote semantic specifications, and vij denote the
resulting views. The process of producing a target expression from a model using a
semantics, we consider a process of abstract interpretation (or just interpretation). In
fact, an artefact model can be interpreted in as many ways as there are kinds of
information to be derived from it.



Figure 4. Models and views.

The class of design systems we outline in this paper relies on a principle we call semantic
parameterized interpretation. The design systems belonging to this class follow a
common structure of components and connectors between components; they have the
same software architecture (Bjørner 2003). This architecture is centered around an
interpretation mechanism which utilizes a semantics in order to encode the meaning of
objects from a design model, as target language expressions.
The principle is based on the assumption that design models are expressed in a formal
language. The meaning of a model, i.e. a perspective of what it may express, arises as a
function of the meanings of the objects in the model. The meaning of an object is the set
of properties ascribed to the object.
The formal design model is expressed in terms which need interpretation. In order to
do so, the meaning of the terms is specified in a semantic specification which binds sets
of free terms to target expressions. Interpretation is now made in two stages. The first
stage gives a target expression for each object in the model. However, such a target
expression may be unsaturated in the sense that it may contain free terms and
unevaluated expressions. The second stage aims at: (i) substituting such free terms with
property values to which the terms, as attributes, are bound, (ii) evaluating expressions,
and (iii) combining the resulting views into one. The second stage involves term rewriting
and evaluation of expressions. It requires what we shall call a calculus semantics
CS which basically is a canonical set of rewriting rules.
A conceptual model which represents an artefact in a design process, we call an artefact
model. An artefact model is a mapping from object identifiers (x:X) to mappings
from property designators (attributes) (a:A) to sets of values (vs:VS). The mapping from
attributes to value sets is called a property pattern (π:Π). Formally, we write:

An example is:

The empty set of properties corresponds to absurdum; i.e. the impossible or conflicting
value of a property. It may appear as a result of combining two conflicting design
A semantics defines the meanings of sets of properties. A semantic specification states
expressions in target languages for a collection of property patterns. These are the
patterns which are used to determine which target expressions represent the meaning of
which kinds of objects.
A semantics, in this respect, is represented by an environment (e:ENV) which maps
property patterns (π:Π) to unsaturated target expressions



6.1 Target saturation

The target expressions stated in a semantic environment are unsaturated in the sense that
they may contain placeholders; i.e. free terms, and unevaluated arithmetic expressions.
These free terms need to be bound to the values of properties; a process we shall call
target saturation.
The process is performed using a so-called calculus semantics (cs:CS). A calculus
semantics is a function type which builds a term-rewriter and evaluator from a rule








and makes it saturated by using an artefact model for term

substitution. Evaluation is performed according to the rule specification. The signature of
the calculus semantics is:

As an example of a rule specification, consider the saturation of prefix addition of two

properties with attributes a1 and a2 of an object (x:X):

We use the syntactical braces to emphasize that the expression on the left-hand side is
considered syntax, whereas the expression on the right-hand side is semantics.
We have now presented the necessary main components for semantic parameterized
interpretation. The following sections aim at defining a number of necessary
preconditions for performing interpretation of artefact models.
6.2 Completeness
We specify a criterion for the applicability of a semantics in interpretation of an artefact
model. We say that the semantics is complete concerning that model.
A semantics represented by (e:ENV) is complete concerning a model if and only if,
for all objects (x:X) of the model, the semantics contains a property
pattern (π:Π) which subsumes the property pattern of x. Formally, we write:

A property pattern (π1:Π) subsumes another property pattern (π2:Π) if and only if π2
contains at least the attributes of π1 and the value sets of π2 are subsets of the value sets of
π1 for such common attributes. We say that π1 subsumes π2; written π1 <: π2. E.g. the
property pattern π1 = [height ↦ {200, 220, 240}] subsumes the pattern
π2 = [width ↦ 100, height ↦ 220]. Thus, π2 is more specialized than π1. The subsumption
operation <: is defined on pairs of property patterns and pairs of properties, respectively:
6.3 Well-constrainedness
Artefact models are meant for representing designs during a process of incremental
specialisation. In such a process, it is convenient to specify several allowed values for
properties. E.g. we may state that a window should be either 80 cm or 120 cm wide.
However, when interpreting models, it is convenient that all attributes of an object are
ascribed only singleton value sets. For simplicity, we consider this rather strict criterion
to be necessary in order for interpretations to be defined.
We define the notion of wellconstrained design models, as introduced in (Galle
1989), by the following predicate:
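In code, a minimal sketch of this predicate (our own Python rendering under the dict-of-value-sets representation, not the paper's formal definition) could read:

```python
def well_constrained(model: dict) -> bool:
    """True when every attribute of every object has a singleton value set."""
    return all(len(values) == 1
               for pattern in model.values()
               for values in pattern.values())

rough = {"window1": {"width": {80, 120}}}               # two widths still allowed
final = {"window1": {"width": {80}, "height": {120}}}   # fully specialised
print(well_constrained(rough), well_constrained(final))  # False True
```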

6.4 Non-ambiguity
The notion of subsumption defines a partial order of property patterns. In such an
ordering, ambiguity may occur if two distinct property patterns have common properties.
In such cases, it may not be possible to determine which entry in the semantics to use.
Therefore, it is required that such problems of ambiguity are resolved by introducing
lattice meet as the combination of both entries. The most specialized property pattern
being the one we strive at findingis represented by lattice meet. We define the
following predicate for stating that a semantic environment is non-ambiguous:



Lattice meet on pairs of property patterns is defined as:

Thus, the ordering of property patterns for a semantics must (at least) span a semilattice.
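Assuming the dict-of-value-sets representation used above, the meet of two property patterns can be sketched as follows (an illustration of our own, not the paper's formal definition):

```python
def meet(p1: dict, p2: dict) -> dict:
    """Most specialized pattern subsumed by both p1 and p2: the union of
    their attributes, intersecting value sets for shared attributes."""
    result = {}
    for attr in p1.keys() | p2.keys():
        if attr in p1 and attr in p2:
            # shared attribute: intersect the allowed value sets
            # (an empty intersection signals incompatible patterns)
            result[attr] = p1[attr] & p2[attr]
        else:
            result[attr] = p1[attr] if attr in p1 else p2[attr]
    return result

a = {"sort": {"window"}, "width": {80, 120}}
b = {"width": {120, 160}}
print(meet(a, b) == {"sort": {"window"}, "width": {120}})  # True
```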
We have now presented the necessary mechanisms for performing semantic
parameterized interpretation of artefact models. The result is a view expressed in a target
language. This view is the result of interpreting each object in a model according to
the semantics. This is done by selecting a property pattern in the domain of the semantic
environment. This pattern has to comply with the properties ascribed to the object. The
result is an unsaturated target expression for that object. The target expression is made
saturated by means of the calculus semantics. The target expressions, representing the
meaning of objects in the artefact model, are finally combined.
Figure 5 depicts the principle of semantic parameterized interpretation.



The views of the objects are combined in a compositional way by folding using a binary
operation on view pairs:

The function select picks the most specialized property pattern of those that comply with
the pattern of the current object:
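The select-and-fold scheme can be sketched end-to-end in code (an illustrative Python rendering of our own; the two-entry semantics and the toy line/box target language are invented for the example):

```python
from functools import reduce

def subsumes(p1, p2):
    """p1 <: p2 - p2 has at least p1's attributes, with smaller value sets."""
    return all(a in p2 and p2[a] <= p1[a] for a in p1)

def select(env, obj_pattern):
    """Pick the most specialized entry whose pattern complies with the object."""
    cands = [(p, t) for p, t in env if subsumes(p, obj_pattern)]
    return next((p, t) for p, t in cands
                if all(subsumes(q, p) for q, _ in cands))

def interpret(model, env, combine):
    views = []
    for pattern in model:
        _, template = select(env, pattern)
        # target saturation: bind free terms to the singleton property values
        views.append(template.format(**{a: next(iter(v))
                                        for a, v in pattern.items()}))
    return reduce(combine, views)  # fold with a binary operation on view pairs

env = [({"sort": {"wall"}},   "LINE {length}"),
       ({"sort": {"window"}}, "BOX {width} {height}")]
model = [{"sort": {"wall"},   "length": {400}},
         {"sort": {"window"}, "width": {100}, "height": {220}}]
print(interpret(model, env, lambda a, b: a + "\n" + b))
```

Each object's view is produced independently and the views are then combined, which is exactly the compositional principle discussed below.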

Figure 5. Semantic parameterized interpretation.




Besides offering the two kinds of dynamics concerning properties, the principle of
semantic parameterized interpretation makes system integration and data exchange more
flexible. It does so by cutting the ontological entities down to a minimum: properties and
objects denoting sets of properties. These sorts of entities are common to all sorts of
interpretations in civil engineering applications. Each application makes its own
interpretations of the conceptual data model. Thereby we gain flexibility, as application-specific information is obtained as the result of interpreting conceptual models within each
application. E.g., a design system may focus on interpretations for visualisation whereas a
planning tool may focus on economy and work procedures.
However, the architecture which has been specified in this paper is highly abstract. It
builds upon a compositional principle which says that the meaning of a whole is a
function of the meanings of its constituents. There are, however, situations in which we
wish to interpret a conceptual model according to a scheme which does not fit this
principle. E.g., the principle excludes views which require that the meanings of objects
are calculated from the meanings of other objects (in a non-cyclic way). Furthermore, the
principle excludes views which require optimisation in order to be produced. However, in
many cases, it is possible to write expressions in some functional language such that the
views can still be defined in a compositional way.
In the paper, we have not made any distinction between intrinsic properties and
extrinsic properties. In (Eir 2004c), we have argued that ontologically it is desirable to
do so, but technologically it may be inconvenient. As an example, introducing binary
relations between objects (instead of having these as unary extrinsic position properties)
may often require that free parameters be resolved through an optimisation
process during interpretation. Introducing mechanisms for such resolution makes the
software architecture complicated, as it involves the introduction of free parameters for
unbound properties, and combinatorial algorithms for their resolution. In general, such
issues are not important when considering the abstraction level at which we have
presented the software architecture. However, they may be of interest when exploring the
limits of the architecture as well as in further prototype development.
In this paper, we have specified a new software architecture for conceptual design
systems. The idea for the architecture has arisen from the observation that today's design
systems lack dynamics with respect to incremental specialisation of design objects and
with respect to dynamic evolution of class systems for such objects.
The architecture presented aims at introducing these dynamics by means of what is
called semantic parameterised interpretation. In systems based on this principle, design
models (artefact models) are written in a formal language. These design models refer to
names of properties and values of these. The meanings of these names are not statically



defined as in object-oriented class systems. The meanings of the names are given by a
semantics which becomes a parameter to an interpreter mechanism.
Artefact models are now interpreted according to a semantics and a calculus for
evaluating expressions. The result is a view of the model, which is a written specification
in some target language. It is essential to our principle that the same artefact model
can be subject to many different interpretations, a principle which is highly important in
today's distributed and many-sorted realm of construction. Thus, we have shown how
views for graphical presentation in AutoCAD and stress calculations in MATLAB can
be defined as interpretations of the conceptual design data.
The interpretation of models is based on the properties of the objects, such that objects
with the same properties have the same meaning, if using the same semantics. However,
this induces problems if objects are described by the same set of properties even though
they are intended to be conceptually distinct. It is then convenient to introduce special
properties which aim at making such distinctions, e.g. a property with the attribute sort.
This property does not, however, denote a class name, as it is optional and can be eliminated if
the current distinction is not of interest.
A design tool with the presented architecture is generic in the sense that the
semanticsas a parameterdetermines the sorts of entities that can be designed. Thus, a
semantics specialises the architecture into a certain application.
It may be argued that the problem we have tried to solve could be solved much more easily by
means of database schemas. In such schemas, objects can be introduced as rows and
properties of objects can be introduced as columns. There are, however, two main reasons
why a database approach is not satisfactory. First, dynamically changing the attributes of
a database schema, known as schema evolution, is hard when it comes to keeping
such schemas consistent. Second, the results of database operations and queries are either
database schemas, tuples, or the field values of relations. It is essential to our approach
that we are able to define the meaning of artefact models as expressions in many different
incomparable target languages.
We believe that the introduction of semantic parameterized interpretation can serve as
inspiration for future research in this area, and that we have emphasized important issues
and solutions relevant in the interdisciplinary field of design and computing.
REFERENCES

Abadi, M. and Cardelli, L. (1996). A Theory of Objects. Springer.
Bjørner, D. (1997, October). Domains as a prerequisite for requirements and software: Domain
perspectives and facets, requirements aspects and software views. In Proceedings US DoD/ONR
Workshop, Bernried.
Bjørner, D. (2003). What is a Method? An Essay on Some Aspects of Software Engineering.
Programming methodology. Monographs in Computer Science, pp. 175-203.
Eir, A. (2004a). Construction Informatics: issues in engineering, computer science and ontology.
Ph.D. thesis, Chapter 5. Informatics and Mathematical Modelling, Technical University of
Denmark, pp. 105-142.
Eir, A. (2004b). Construction Informatics: issues in engineering, computer science and ontology.
Ph.D. thesis, Chapter 6. An algebraic specification of incremental, conceptual building design,
pp. 143-178. Informatics and Mathematical Modelling, Technical University of Denmark.
Eir, A. (2004c). Construction Informatics: issues in engineering, computer science and ontology.
Ph.D. thesis, Chapter 7. Semantic parameterized interpretation as a foundation for conceptual
design systems, pp. 179-224. Informatics and Mathematical Modelling, Technical University of
Denmark.
Eir, A. and Ekholm, A. (2002). From rough to final designs by incremental set-inclusion of
properties. In eWork and eBusiness in Architecture, Engineering, Construction, proceedings
of the ECPPM 2002, pp. 293-300. Swets & Zeitlinger Publishers.
Ekholm, A. (1995). A conceptual framework for classification of construction works. ITcon 1.
Ekholm, A. and Fridqvist, S. (1998, June). A dynamic information system for design applied to the
construction context. In B.-C. Björk and A. Jägbeck (Eds.), Proceedings of the CIB W78
workshop The life-cycle of Construction IT, pp. 219-232.
Fridqvist, S. (2000). Property-Oriented Information Systems for Design. Ph.D. thesis, Division of
Architectural and Building Design Methods, Lund University.
Galle, P. (1989). Computer methods in architectural problem solving: Critique and proposals. In
CAAD: Education - Research and Practice (eCAADe).
The RAISE Language Group (1992). The RAISE Specification Language. Prentice Hall / CRI A/S.
Shoemaker, S. (1997). Causality and properties. In D.H. Mellor and A. Oliver (Eds.), Properties,
Oxford Readings in Philosophy, pp. 228-254. Oxford University Press.
van Leeuwen, J. (1999). Modelling Architectural Design Information by Features: an approach to
dynamic product modelling for application in architectural design. Ph.D. thesis, Eindhoven
University of Technology, The Netherlands.

eWork and eBusiness in Architecture, Engineering and Construction - Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Harmonization of ISO 12006-2 and IFC: a necessary step towards interoperability
Lund Institute of Technology, Lund, Sweden
ABSTRACT: There are two major candidates for Core Ontologies for the
construction and facilities management sector: the ISO 12006-2
Framework for classification of information, and the Industry Foundation
Classes, IFC. ISO 12006-2 has been developed to harmonize different
national and regional classification systems. The IFC are intended to
enable effective information sharing within the AEC/FM industry. The
standards have similar objectives but show fundamental differences in
semantics and structure. This work compares the standards and points at
similarities and differences, firstly in order to understand their structure,
and secondly to initiate a discussion about the need and the possibility to
co-ordinate them. The starting point of IFC was to reject classification,
and therefore a harmonization with ISO 12006-2 would require a major
shift of approach. Development of a common meta model, a generic
domain model, and a coordinated domain framework are considered
necessary tasks.


The demand for standardised concepts and terminology rapidly increases in the
construction and facilities management sector. Internationalisation of the industry and an
increasing use of information systems are decisive factors in this development. A
generally agreed ontology is a prerequisite for effective information exchange and
interoperability in any field of knowledge (Lima 2004). The development of the semantic
web with agent based information retrieval is a current example, where interoperability is
enabled through ontology development and standardisation (Berners-Lee et al. 2001).
An ontology consists of concepts that describe objects of interest in a domain. The
ontology for the construction and facilities management sector comprises concepts for
describing construction entities, their design, production, use and management, as well as
man using and experiencing the built environment. Internationally agreed ontologies in
the sector are scarce; the post-war world-wide spread of the SfB building classification
system was an exception.
Classification systems are cornerstones in ontology development; they concern both
concepts and terminology and have a decisive influence in establishing a common
language for actors in a sector. Lately the interest in ontology for the construction and
facilities management sector has grown, at first connected with the interest in product



modelling and now with the emergence of XML-based information exchange (Tolman).
The construction and facilities management sector is traditionally national and
regional in character. Today, there are two major candidates for core ontologies common
to the sector: ISO 12006-2:2001, Building construction - Organization of information
about construction works - Part 2: Framework for classification of information (ISO 2002),
and Industry Foundation Classes, IFC, developed by the International Alliance for
Interoperability, IAI (IAI 2000).
ISO 12006-2 defines a framework of generic classes of interest in construction and
facilities management. It is intended to be used as a starting point for development of
detailed classification tables. Tables that adhere to the principles laid out in the standard
are assumed to be similar and possible to translate between. ISO 12006-2, with its roots in
the SfB-system, has recently been applied in the development of building classification
systems like the British UNICLASS (RIBA 1997), The Swedish BSAB 96 (The Swedish
Building Centre 1999) and the North American OCCS (OCCS 2003).
The scope of ISO 12006-2 is the complete life cycle of construction works, but it does not
specifically consider the needs of ICT-based interoperability. IFC addresses
interoperability requirements and has a similar scope concerning both construction and
facilities management. IFC consists of a framework of classes and models and is intended
to be used as a resource for development of schemas in model oriented information
systems. Although its aim is not to develop a generic building classification, its
framework of classes is similar to those of ISO 12006-2.
An ontology for the construction and facilities management sector must be common to
the worlds of classification and product modelling. Already in the introduction of product
modelling research the idea of harmonization with building classification was suggested
by Björk in the Unified Approach Model (Björk 1992). This model was later integrated
into the IRMA model (Luiten et al. 1993). Both are compatible with the basic structure of
ISO 12006-2.
Both ISO 12006-2 and IFC aim to establish a foundation for the development
of effective information systems for the construction and facilities management sector.
However, there are marked differences in semantics and structure of the systems. The
aim of this research is to compare the structure of the standards, to point at similarities
and differences, in order firstly to understand why these standards are so different, and
secondly to initiate a discussion about the need and the possibility to co-ordinate them.
The following sections analyse and compare the structure of ISO 12006-2 and IFC,
discuss information requirements in critical processes, compare with other standards,
and reflect on a strategy for harmonizing the FST and IFC.
The ISO 12006-2 standard has been developed as a step in harmonizing different
national and regional building classification systems. It is intended to be used as a
framework for developing building classification systems by organisations on a national
or regional basis. An underlying assumption is that the ISO-standard in the long run will
enable the development of common tables at an international level.



ISO 12006-2 defines a framework and a set of recommended table titles supported
by definitions, but not the detailed content of these tables (ISO 2002:6). It is based on
many years of practical experience, and is also shown to be compatible with scientific
ontology and systems theory (Ekholm 1996).
ISO 12006-2 identifies the main classes that are of interest to the construction sector's
building classification for purposes of CAD, specification, product information and cost
information systems (ISO 2002:4). The scope of the standard is the complete life cycle of
construction works within building and civil engineering. It lists recommended tables
according to particular views or principles of specialisation and gives
examples of entries that may occur in these tables (ibid:6).

Figure 1. The high-level process model in ISO 12006-2.
The ISO standard has not been expressed in a formal data definition language. The
standard illustrates objects and relations in an informal schema which for reasons of
space is not shown here. The relations between objects are depicted with arrows
representing subclass relations and other associations between classes and properties. The
schema together with the definitions in the standard are sufficient as a background for
representing the standard in a more formal way in EXPRESS-G diagrams, which make a
comparison with IFC easier. In the following text the ISO Framework Standard will be
named FST for short, and the classes of the standard will be given short names to fit
within the schema boxes.
2.1 The FST Construction Object
The most generic entity in the FST is the Construction Object, defined as an object of
interest to the construction industry. The FST identifies four main classes of Construction
Object: Construction Resource, Construction Process, Construction Result, and
Property/Characteristic. These are related in a generic process model stating that
Construction Resources are used in Construction Processes that will result in
Construction Results, and all these objects have Properties/Characteristics. Every class in
the standard is a subclass of one these four. Relations are not treated explicitly in the
standard but may be treated as mutual properties of the related objects. The EXPRESS-G
schema in Figure 1 illustrates these most generic classes.
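The generic process model can be rendered, purely as an illustration, in code. The following sketch uses Python dataclasses of our own devising; only the class names come from the standard, and the attribute and relation names are assumptions:

```python
# Illustrative rendering of the FST generic process model: Construction
# Resources are used in Construction Processes that result in Construction
# Results, and all construction objects carry Properties/Characteristics.
from dataclasses import dataclass, field

@dataclass
class ConstructionObject:
    # Property/Characteristic, e.g. composition, surface, thermal
    properties: dict = field(default_factory=dict)

@dataclass
class ConstructionResource(ConstructionObject):
    pass

@dataclass
class ConstructionResult(ConstructionObject):
    pass

@dataclass
class ConstructionProcess(ConstructionObject):
    uses: list = field(default_factory=list)        # ConstructionResource
    results_in: list = field(default_factory=list)  # ConstructionResult

brick = ConstructionResource(properties={"material": "clay"})
wall = ConstructionResult(properties={"function": "space separation"})
masonry = ConstructionProcess(uses=[brick], results_in=[wall])
```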



The FST does not suggest any classification for properties but gives examples from
the CIB Master List, e.g. composition, surface and sensory, thermal etc. Generally,
building classification systems do not handle geometrical properties, since they are
supposed to be used together with drawings or models that contain this information.
2.2 FST Construction Process
Among the processes defined by the FST, Construction entity life-cycle stage is an overall
process related to the construction entity, e.g. design or production. See Figure 2.

Figure 2. Processes in ISO 12006-2.

Figure 3. Resources in ISO 12006-2.

Project stage is defined
as a period of time in the duration of a construction project identified by the overall
character of the construction processes within it. According to this definition, Project
stage is a part-process of Construction entity life-cycle stage, e.g. building
programming or detailed design. Management Process is a planning or administrative
process. A Work Process leads to a Work Result which is a result classified
according to process or kind of work.



2.3 FST Construction Resource

Resources in the FST are shown in Figure 3. A Construction Product is a resource
intended for permanent incorporation in a construction entity. Members of Construction
Aid are resources like tools and machinery, not intended for incorporation in a
permanent manner in a construction entity. The properties of a Construction Product
are basic to the properties of the built parts of the construction entity.
2.4 FST Construction Result
The FST identifies four main classes of result: Construction Complex, e.g. airport and
motorway, which consist of one or more Construction Entity, e.g. building and bridge,
and Construction Entity Part, e.g. wall and road surface. A Space, e.g. a room or
roadway, is contained within or otherwise associated with a building or other
construction entity (ibid:9). See Figure 4.

Figure 4. Construction Result according to ISO 12006-2.

The Result classes identified in
the FST seem limited in the sense that they describe material results. However, a possible
interpretation is that also information like design results, e.g. ideas and abstract models,
representing concrete results are possible members of these classes.
The generic result classes Construction Complex, Construction Entity and
Construction Entity Part are related by a part-of relationship in a compositional
hierarchy. The result classes are abstract and only intended to be instantiated after a
first division into subclasses based on different views on the physical reality they



represent. According to the FST, a Construction Complex is classified by function-or-user activity. A Construction Entity is classified either by form or by function-and-user
activity. Construction Entity Part is classified by function as Element, by type of
work as Work Result and as Designed Element by subdividing Element by Work
Result. Space can be classified by enclosure, e.g. outdoors or indoors, by user function-or-activity, or by a combination of these. Space in the FST has no relation with
Construction Entity Part. A relation like enclose or composed of would seem
relevant according to (Ekholm and Fridqvist 2000). The subclasses based on separate
views are included in Figure 4.
From the example of Designed Element it is easy to imagine the need for other
combined classes e.g. Designed Construction Entity and Designed Space. The
Swedish BSAB 96 has a classification table for construction entities that could best be
described as Designed Construction Entity. It is a combination of Construction Entity
by Form, e.g. tunnels, bridges and buildings, and Construction Entity by Function
(The Swedish Building Centre 1999). The difference in view is motivated by the purpose
of the classification, if it is of importance to identify Construction Entities by the main
construction method, e.g. girder bridge, arch bridge, or truss bridge, or by function-or-user activity as railroad bridge, motor vehicle bridge or pedestrian bridge. A similar
subdivision is possible for Space, e.g. indoors or outdoors specify form, e.g. kind of
enclosure, and living room or kitchen specify function-or-user activity.
3.1 The objective of IFC
The IFC constitute a framework for sharing information between different disciplines
within the AEC/FM industry throughout the project lifecycle (IAI 2000:2). The main
purpose of the IFC is to enable effective information exchange between information
systems, so-called interoperability. This concerns both semantic definitions and object
exchange formats. The semantic definitions of the IFC concern, just as ISO 12006-2,
objects of interest in construction and facilities management. However, IFC does not
adhere to the ISO standard and has different definitions and a different general structure. The
documentation of IFC does not present a theoretical background for its structure or
choice of model classes.
IFC has gone through several practical tests that confirm its applicability, and it is
integrated in an increasing number of applications. However, with the exception of an
earlier study by the present author (Ekholm 1999), IFC has never been the subject of a
detailed critical analysis concerning its relation to building classification.
3.2 Conceptual layers
The organization principle for the IFC framework provides for a modular structure of
models (ibid:5). The models are structured into conceptual layers of different scope.
There are four conceptual layers where sets of model schemata are defined (ibid:5):



1. Resource classes.
2. Kernel and Core Extension classes.
3. The Interoperability classes.
4. The Domain classes.
The Resource layer contains classes that are applicable to most of the classes in other
layers, e.g. geometry, date and time, material and cost. Resources could be understood as
representing generic properties of domain objects.
The Core layer consists of the Kernel and the Core Extensions. The Kernel provides
all the basic concepts required for IFC models. In an early version of the standard the
Kernel is explained as a kind of Meta Model that provides the platform for all model
extensions (IAI 1997:6). In a later version the Kernel is explained as a template model
that defines the form in which all other schema within the model are developed. The
Kernel is the foundation of the Core Model (IAI 2000:8). The Kernel is independent of
the AEC/FM domain.

Figure 5. The IFC template model.

Figure 6. IfcObject.



3.3 The IFC Kernel

IFC uses EXPRESS as data definition language. The basic data units in EXPRESS are
entities, relationships and attributes (Schenck and Wilson 1992:26). The IFC apply these
units as a starting point to define the Kernel objects consisting of IfcRoot with the
subclasses IfcObject, IfcRelationship, and IfcPropertyDefinition (IAI 2000:12). See
Figure 5.
IfcRoot is the most generic entity; it has name, ID, description and history. IfcObject
represents concrete and conceptual objects in the domain. Among examples are wall,
space, grid, work task, cost item, labour resource, actor, and person. IfcRelationship
represents relations between members of IfcObject, and relations to support modelling for
database implementation. IfcPropertyDefinition represents different attributes of domain
The fact that relationships are treated separate from properties seem odd since
relations are mutual properties, e.g. position and before are properties based on
relations between two or more things.
The immediate subclasses of the template model constitute a second level in the
Kernel. Subclasses of IfcObject are shown in Figure 6.
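The Kernel template model can be sketched in code for illustration. The following is our own simplified Python rendering, not IFC-conformant EXPRESS, and the attribute names are abbreviations of the schema's:

```python
# Illustrative sketch of the Kernel template model: IfcRoot carries identity
# and descriptive attributes; IfcObject, IfcRelationship and
# IfcPropertyDefinition are its three subtypes.
from dataclasses import dataclass

@dataclass
class IfcRoot:
    global_id: str         # unique identifier
    name: str = ""
    description: str = ""  # ownership/history attributes omitted for brevity

class IfcObject(IfcRoot):
    """Concrete and conceptual domain things: wall, space, work task, actor."""

class IfcRelationship(IfcRoot):
    """Relations between members of IfcObject."""

class IfcPropertyDefinition(IfcRoot):
    """Attribute definitions ascribed to domain objects."""

wall = IfcObject("guid-0001", name="wall-1")
```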
In contrast with the FST, the IFC classes are not related in an explicit definition or
model and one may wonder whether the selection is complete or if the classes are
mutually exclusive, as they would be in a classification system.
To compare, for example the IfcResource is not equivalent to the FST Resource. An
IfcResource is defined as information needed to represent the costs, schedule, and other
impacts from the use of a thing in a process. It is not intended to use IfcResource to
model the general properties of the things themselves. This is radically different from
the standpoint of the FST where a Resource like FST Construction Product is defined as
a commodity that may be incorporated into a construction entity in a permanent
manner. The IfcResource is an attribute, representing properties of resources, while the
FST Resource is a class concept referring to concrete resources. The FST Resource class
may be used independently of other classes while the IfcResource requires an instance of
IfcProduct to be applied.
An IfcProduct is defined as a physical item incorporated into an AEC/FM project
either directly as supplied or through construction/assembly of other products. IFC does
not distinguish between the FST classes FST Construction Product and FST Element.
Within the second level of the Kernel, the IfcRelationship class is specialised into five
categories, IfcRelAssigns, IfcRelConnects, IfcRelDecomposes, IfcRelAssociates, and
IfcRelDefines. These relate IfcObject to different other IfcObject, e.g. IfcRelAssigns may
be used for an arbitrary relation between objects, IfcRelConnects may represent a
physical coupling, IfcRelDecomposes represents the part-of relation and IfcRelDefines is
used for relating Property Sets or Type objects with an object instance. Each relationship
is further specialised according to the specific object that it relates.
The same reflections as for the IfcObject subclasses are relevant to make for the
IfcRelationship classes: are the different kinds of relationship theoretically well-founded,
is the selection exhaustive?



3.4 The Core Extensions

The classes described above constitute the IFC Kernel. The next level is the Core
Extensions layer, which consists of specialisations of the Kernel classes IfcControl,
IfcProcess and IfcProduct. The subclasses of IfcProduct are IfcElement,
IfcSpatialStructureElement, IfcAnnotation, IfcGrid and IfcPort. Figure 7 shows the
subclasses of IfcElement and IfcSpatialStructureElement.
Subclasses of IfcElement are defined as components of an AEC product. The names
indicate that they are identified by function and thus similar to the different FST
Elements. However, this is not the intention as shown below in section 5.1.
IfcSpatialStructureElement classes are only spatially defined. In the technical
documentation of IFC 2x2, a spatial enclosure hierarchy shows IfcSite, IfcBuilding,
IfcBuilding seen as a section of a building, and IfcBuildingStorey related through
IfcRelAggregates, a subclass of IfcRelDecomposes (IAI 2003:102).

Figure 7. IfcElement and IfcSpatialStructureElement.
In FST, Construction Complexes, Construction Entities and Construction Entity Parts
are related in a compositional hierarchy, as illustrated in Figure 4. A spatial hierarchy of
enclosure similar to IFC's could be developed in parallel to the compositional hierarchy.
The FST does not mention the concept of construction site explicitly, but in principle it



could be seen as a construction complex consisting of related construction entities like
roads, buildings, pavements etc. The other kinds of space would be derived from a spatial
view on construction entities and construction entity parts.
The FST does not specify how relations between different kinds of spaces are handled.
For example, the relation between a room and a building storey is not covered by the
FST. IFC needs to support this kind of specification but could be improved by applying a
more generic view of the concept of space and how it is related to buildings and parts of
buildings. Examples of relevant analyses of the concept of space are presented in
(Ekholm and Fridqvist 2000) together with a proposed definition of space relevant for
both classification and product modelling. Here, the concept of space is included in a
theoretical framework that also considers other aspect views on the building, e.g.
functional systems and their parts.
Although IfcGroup is not an IfcProduct, it has two subclasses in IFC Product
Extension, IfcSystem and IfcZone. IfcGroup could be understood as a generic class
describing an arbitrary aggregate of members of IfcObject. Functionally related parts of a
collection may be represented together as IfcSystem. Similarly if the collection consists
of adjacent spaces the collection may be represented as IfcZone.
The seemingly ad hoc position of these classes in the Core Extension may be
explained as a consequence of the lack of a theoretical foundation for the development of
the IFC framework. The generic concept of system should be defined already at the most
generic, ontological level of the framework, e.g. stating that any object may be a system
composed of parts.
3.5 The Interoperability Layer
The next lower level is called the Interoperability Layer. It contains classes common to
different actors and disciplines in the construction and facilities management sectors.
Here one may find, for example, IfcWall, IfcBeam, and IfcElectricalAppliance. The
classes of the interoperability layer are intended to be generic in scope. One example is
the class IfcWindow which is a leaf node in IFC, i.e. it is not subclassed in the
standard. Further detailing is achieved through assigning Property Sets, e.g. sets that specify different numbers of glazing panes, opening types, framing arrangements etc.
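The property-set mechanism described above can be sketched in code. This is an illustrative Python sketch, not the normative IFC schema: the class name Product and the property set name Pset_WindowCommon_Example with its properties are assumptions made for the example.

```python
# Sketch: further detailing of a leaf class such as IfcWindow is done by
# attaching property sets rather than by subclassing. Class and property
# names here are illustrative, not the normative IFC definitions.

class Product:
    """Generic placeholder for a model element (cf. IfcProduct)."""
    def __init__(self, name):
        self.name = name
        self.property_sets = {}   # pset name -> {property: value}

    def assign_pset(self, pset_name, properties):
        self.property_sets[pset_name] = dict(properties)

window = Product("IfcWindow")
window.assign_pset("Pset_WindowCommon_Example", {
    "GlazingPanes": 2,           # double glazing
    "OpeningType": "side-hung",  # opening arrangement
    "FrameMaterial": "wood",
})

print(window.property_sets["Pset_WindowCommon_Example"]["GlazingPanes"])  # 2
```

The point of the sketch is that the leaf class itself stays generic; all product-specific detail lives in the attached property sets.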
The classes in this level are similar to those in classification tables of national and
regional classification systems. However, the classes are not intended to be equivalent to
those in classification as will be shown in section 5.1.
4.1 Views on Construction Entities
The separation of classes from spatial, functional, and compositional views, and the possibility to combine these, is characteristic of several processes in construction and facilities management. The difference in view is motivated by the purpose of using the information, for example whether it is of importance to identify a construction entity by main construction method or by function-or-user activity.

eWork and eBusiness in architecture, engineering and construction


The example given above in relation to the FST described a bridge as a girder bridge,
arch bridge, or truss bridge, and as a railroad bridge, motor vehicle bridge or pedestrian
bridge, respectively. The functional-or-user activity view on the Construction Entity may
be of specific interest in the brief stage or in the facilities management stage. Similarly,
the compositional view may be of interest during systems design, detailed design,
production and maintenance, where knowledge of composition and constituent materials
are necessary.
There is no equivalent in IFC to the FST classes Construction Entity by Form and
Construction Entity by function-or-user activity. The only IFC class in this level is the
IfcBuilding, a subclass of IfcSpatialStructureElement, a semantically spatial concept.
4.2 Views in design
The FST reflects the idea of design as a process where functional requirements are met
with technical solutions and concrete work results. There is a need for separate classes for
these views, since they concern different stages and actors in the construction process.
The FST classes for construction entity parts are Element, Designed Element and
Work Result.
During design, building classification supports the successive determination of
properties of the designed object. At first the designed object is identified through a
spatial view, location and geometry are determined. Next, the object is functionally
determined and can be classified as Element. When the technical solution of a part has
been determined it may be classified as
Designed Element and Work Result. In principle the sequence is the same in drawing-based design and 3D-model based CAD: the designer starts by defining design objects, i.e. building parts, by geometry, and successively determines function and technical solution. However, the main 3D-modelling CAD applications integrate the first two steps and require a designer to instantiate a design object from an Element class with predefined geometry parameters, e.g. a wall as a vertical plate. In this case the instantiated object is already determined by function according to the definition of the Element class.
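The successive determination of properties during design described above can be illustrated with a minimal sketch. The class and attribute names follow the FST terminology loosely; the concrete strings and the DesignObject class are assumptions for illustration only.

```python
# Sketch of the successive classification during design: a design object is
# first determined spatially (location and geometry), then classified
# functionally as an Element, and finally assigned a technical solution
# (Designed Element / Work Result). Attribute values are illustrative.

class DesignObject:
    def __init__(self, geometry, location):
        self.geometry = geometry        # spatial determination
        self.location = location
        self.element = None             # functional classification
        self.work_result = None         # technical solution

    def classify_as_element(self, element):
        self.element = element

    def assign_work_result(self, work_result):
        self.work_result = work_result

wall = DesignObject(geometry="vertical plate 6.0 x 2.7 m", location="storey 2")
wall.classify_as_element("Element: space-enclosing wall")
wall.assign_work_result("Work Result: brick masonry wall")

print(wall.element, "/", wall.work_result)
```

The sketch also mirrors the observation in the text that mainstream CAD applications collapse the first two steps: instantiating `wall` from a wall-shaped Element class already fixes its function.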
4.3 Views in specification and cost calculation
In order to develop a specification or cost calculation using the FST, each Element is specified by Work Results including the resources used, e.g. labour and material. Table 1 illustrates a specification using the Swedish classification system BSAB 96 from a prototype test of information transfer from a product model to cost calculation using IFC and BSAB 96 (Nilsson & Eriksson 2002).

Table 1. The structure of a specification based on BSAB 96.

Work Result (WR)
  Roof carcass
    Beam framework                length (m)
      Glue-laminated wood beam    length (m)
      …                           amount (no)
      Angular fittings            amount (no)
IFC cannot handle cost calculation in this way since it does not identify classes based
on different views. Instead, cost calculation is enabled by associating instances of
IfcProduct, e.g. IfcBuildingElement, with IfcConstructionResource and related IfcCostItem (IAI 2003). It would seem more relevant to use predefined classes like the FST Work Result to handle this.
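The association-based cost calculation described for IFC can be sketched as follows. This is a simplified Python illustration of the idea, not the IFC data model itself; the element, resource names and rates are invented for the example.

```python
# Sketch of association-based cost calculation: an IfcBuildingElement-like
# instance is linked to construction resources and a cost item, instead of
# being classified by a predefined Work Result class. All names and rates
# are illustrative assumptions.

beam = {"type": "IfcBuildingElement", "name": "glulam beam", "length_m": 6.0}

resources = [  # cf. IfcConstructionResource
    {"name": "glue-laminated wood", "unit": "m", "rate": 42.0},
    {"name": "carpentry labour", "unit": "m", "rate": 15.0},
]

def cost_item(element, resources):
    """Total cost from quantity times resource rates (cf. IfcCostItem)."""
    qty = element["length_m"]
    return sum(qty * r["rate"] for r in resources)

print(cost_item(beam, resources))  # 6.0 * (42.0 + 15.0) = 342.0
```

In the FST alternative argued for in the text, the dictionary `beam` would instead be classified directly by a Work Result class that already carries the resource breakdown.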
Applications for design, specification, and cost calculation might require that objects
emerging from different views are concatenated during the processes. This requires
support for multiple inheritance. An obstacle for IFC could be that it only allows single
inheritance (IAI 2000:39).
4.4 Views in other standards
The recognition of the relevance of distinguishing classes from different views is not unique to the FST; rather, it is common in other standards. For example, STEP AP 221/EPISTLE, used for Product Data Management, distinguishes between a functional physical object, which represents a functional view on an object in the domain, and a materialized physical object, which includes both a functional and a compositional view (EPISTLE 2004).
Another industry standard, IEC 61346 Industrial systems, installations and equipment and industrial products, developed for classification of technical objects for reasons similar to the FST's, distinguishes between objects identified from three different views: the functional (function), the compositional (product) and the spatial (location) (IEC 2000).
5.1 Classification and product modelling
As a starting point for the development of IFC, the relevance of building classification for
product modelling was questioned since it only allows a user to categorize elements
according to primary functional role or as part of a system (IAI 1997:215). The
developers of IFC intended to avoid this by defining model elements, functional roles,
and systems separately so that an element can assume multiple roles and/or be a member
of multiple systems.
The development of IFC has been guided by these principles. As a consequence the
IFC Core Extension and Interoperability classes are not intended to be equivalent to
classification classes, but should be seen as some kind of placeholders for information
about the modelled instance. The properties of the instance are determined through
associations with GeometryResources, PropertySets and other classes in IFC.

Accordingly, in order for an IFC instance to be classified as an FST Element it would need to be assigned a Property Set equivalent to that of the Element definition.
In prototype tests of IFC this has not been tried out; instead, the IFC class names have guided the interpretation of the IFC classes as functional elements. Where such IFC classes have been missing, the IfcProxy class has been applied to represent, among others, Work Result classes (Tarandi 2003).
The problem with the IFC approach is the idea that model elements may be identified independently of e.g. a spatial, functional, or compositional view. Philosophers and scientists generally agree that it is not possible to have knowledge about the world "as it is", but only as we see it. Popper says that "If we wish to study a thing, we are bound to select certain aspects of it" (Popper 2002:71). We see the world through our concepts, and these are by definition classes (Bunge 1979:169). In principle we cannot refer to anything without at the same time seeing it as something. Even to call something a "thing" is to classify it as having existence. When we call something a "wall" we immediately include the thing in the functionally defined class of enclosing/dividing things. It is impossible to focus on an object without at the same time assigning it to a class. Similarly, an instance is by definition a member of the class it is instantiated from.
If IFC had applied its principles it would enable a model element to be instantiated at a generic level, independent of functional, compositional, or spatial definition. But this is not supported: e.g. all classes from IfcRoot down to IfcBuildingElement are abstract and cannot be instantiated (IAI 2003:114).
In practice IFC has not succeeded in establishing the intended separation between model elements and classification. The IFC classes have, to a large extent, names similar to those used in classification systems. An example is IfcWall, which also in IFC is defined by its functional role as enclosing; instances of it are not independent of functional role. This would not have been problematic if IFC had acknowledged the fact and adhered to the FST or another classification framework.
In fact, building classification supports precisely the process which IFC strives for. As explained above, classification classes must be seen as part of the information that is determined in the process alongside the geometry information expressed by drawing objects. This is an important argument for revising the IFC class structure in adherence to the FST.
5.2 Integrating FST and IFC, is there a possible strategy?
Recently, based on the experiences of the Workshop on eConstruction, the need for a strategy for the development of a unified building construction model has been stressed (Wix 2004:32). The analysis presented here suggests that the harmonization of building classification, represented by ISO 12006-2, and product modelling, represented by the IFC, should be an essential part of this work.
What would be the reason for harmonising the FST and IFC? Classification systems adhering to the FST are used in daily practice in several countries for both manual and computer-based information structuring. IFC specifically addresses questions of interoperability and represents a considerable investment of time and money. If IFC and the FST were harmonized, it would facilitate and speed up the integration of everyday practice with object-based information management.

Would it be possible to integrate these standards? The FST and IFC both lack an explicit theoretical foundation, and establishing a common ground would effectively support an integration process. Compared with the FST, the IFC framework is more ad hoc, which makes it harder to understand, apply and develop. A framework for information systems in the construction and facilities management sector should be both theoretically well founded and practically applicable. The former will increase the versatility and life span of the standard.
The FST and IFC support slightly different processes, but, as shown, there is a
significant overlap between the frameworks. The FST is developed to support
specification, cost calculation, CAD-layering, PDM-systems, brief development, etc. for
the construction and facilities management processes. IFC has a similar scope, but the
needs of CAD-systems and the definition of CAD objects were initially in focus.
How could the harmonisation be accomplished? A starting point would be to abandon
the IFC strategy of defining model elements, functional roles, and systems separately
and acknowledge the need for a framework based on views and classification. Then, it
would be necessary to define a meta model based on generic principles for modelling
domain objects starting, not from the EXPRESS language, but from very generic
ontological theories, e.g. a general theory of systems and properties. This would include
the definition of objects from different views. An attempt in this direction may be found
in (Ekholm and Fridqvist 2000). A next consideration would be to build a generic domain
model similar to that of FST or the IRMA that defines the main classes, including
objectified relationships, needed to build the model schemas. The overall aim would be to
develop a framework for object oriented information exchange for construction and
facilities management that would be both scientifically well founded, and applicable and
acceptable for the processes that are to be supported.
REFERENCES

Berners-Lee, T. et al. 2001. The Semantic Web. Scientific American, May 2001. Accessed at http://www.sciam.com/article.cfm?articleID000481441OD21C7084A9809EC588EF21
Björk, B.-C. 1992. A Unified Approach for Modelling Construction Information. Building and Environment, Vol. 27, No. 2, pp. 173-194.
Bunge, M. 1983. Epistemology & Methodology I: Exploring the world. Vol. 5 of Treatise on Basic
Philosophy. Dordrecht and Boston: Reidel.
Ekholm, A. 1999. Co-ordination of classifications for product modelling and established building classifications. In: Durability of Building Materials & Components 8, Vol. 4: Information Technology in Construction. Proceedings of the CIB W78 Workshop at the 8dbmc conference, May 30-June 3, Vancouver, Canada. (Eds.) Lacasse, M.A. & Vanier, D.J. Ottawa: NRC Research Press.
Ekholm, A. 1996. A conceptual framework for classification of construction works. Electronic Journal of Information Technology in Construction (ITcon), Vol. 1, 1996. http://itcon.org/.
Ekholm, A. & Fridqvist, S. 2000. A concept of space for building classification, product modelling and design. Automation in Construction 9(3).
EPISTLE 2004. EPISTLE Core Model 4.5. Accessed 2004-04-01 at http://www.epistle.ws/, http://www.infowebml.org/ECM4.5/ECM4.5.html and http://www.btinternet.com/~Chris.Angus/epistle/standards/ap221.html.

IAI 2003. IFC 2x Edition 2. Model Implementation Guide. Version 1.6. (Ed.) Liebich, T.
International Alliance for Interoperability, June 30, 2003.
IAI 2000. IFC Technical Guide. (Eds.) Liebich, T. & Wix, J. International Alliance for
Interoperability, October 27, 2000.
IAI 1997. Industry Foundation Classes - Release 1.0 Specifications, Volume 2: IFC Object Model for AEC Projects. International Alliance for Interoperability, January 30, 1997.
IEC 2000. IEC 61346-3: Industrial systems, installations and equipment and industrial products - Structuring principles and reference designations - Part 3: Application guidelines. 2000-05-05. International Electrotechnical Commission, IEC.
ISO 2002. Svensk Standard SS-ISO 12006-2:2001, Building construction - Organization of information about construction works - Part 2: Framework for classification of information. Language: Swedish and English. Geneva: International Organisation for Standardization.
Lima, C. 2004. Final draft CWA4 proposal: European eConstruction Ontology, version 2004-03-26. Workshop on eConstruction N083, Nederlands Normalisatie-Instituut.
Luiten, G. et al. 1993. An Information Reference Model for Architecture, Engineering, and
Construction. Proceedings of the First International Conference on the Management of
Information Technology for Construction, Singapore, August 1993.
Nilsson, K. & Eriksson, V. 2002. Pilotprojekt NCC Produktion plus överlämnande. Koppling mellan produktmodell och kalkyl- och förvaltningssystem. Slutrapport 2002-12-17. IT Bygg & Fastighet 2002.
OCCS 2003. Overall Construction Classification System. Accessed 2004-04-01 at
Popper, K. (2002). The Poverty of Historicism. First published 1957. London: Routledge.
RIBA 1997. Uniclass. Construction Project Information Committee. London: RIBA Publications.
Schenck, D. & Wilson, P. 1992. Information Modelling: The EXPRESS Way. Oxford: Oxford
University Press.
The Swedish Building Centre 1999. BSAB 96. The Swedish construction industry classification
system. Stockholm: The Swedish Building Centre.
Tarandi, V. 2003. Implementering av produktmodeller baserade på IFC och BSAB. Slutrapport 2002-12-18. IT Bygg & Fastighet 2002. Accessed 2004-06-07 at:
Tolman, F. et al. 2000. The bcXML Baseline. eConstruct IST-1999-10303, WP1, T1100, D101, Rev. 4. Accessed 2004-06-07 at: http://www.bcxml.org/6Public/bcXML_CD/PublicDeliverables/d101_v4.pdf.
Wix, J. 2004. Draft CWA3: European eConstruction Meta-schema, version 2004-03-01. Workshop on eConstruction N068, Nederlands Normalisatie-Instituut.

eWork and eBusiness in Architecture, Engineering and Construction - Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

A novel modelling approach for the exchange of CAD information in civil engineering
Berthold Firmenich
CAD in der Bauinformatik, Bauhaus-Universität Weimar, Germany
ABSTRACT: Existing CAD systems are predominantly incompatible.
Standardization attempts describe the structure of the data to be
exchanged. The information exchange is realized by data transformation.
Due to incompatible data schemes this process normally (1) results in the
loss of data that (2) accumulates during each information exchange. In the
traditional approach information is exchanged in an evaluated form. This
paper presents a novel solution approach that describes the information to
be exchanged in an unevaluated form: Instead of exchanging objects and
attributes, the applied operations are exchanged. This approach is denoted
as operative modelling. The generation and the description of the applied
operations are presented. The advantages of the novel approach over the
traditional approach are clarified by means of typical scenarios and
examples from civil engineering.

Existing CAD systems are predominantly incompatible. Standardization attempts
describe the structure of the data to be exchanged. However, existing CAD systems are
normally not based upon standardized data structures: The completion of their specific
tasks very often requires the usage of optimized data structures. Thus, the information
exchange is characterized by two data transformations: One transformation from the
native format of the source system to the standard exchange format and one
transformation from the standard exchange format to the native format of the destination
system. Due to incompatible data schemes this process normally (1) results in the loss of
data that (2) accumulates during each information exchange.
Available CAD systems store their models in an evaluated form as objects and attributes.
In this paper the new research project opCAD is introduced: The objective of this project
is the standardization of CAD in civil engineering by a language for unevaluated CAD
models. Such an unevaluated model is not described by objects and attributes, but by the
operations that have created the model. The unevaluated model is called an operative model.
In available CAD systems the user stores the result of the construction process at
certain points in time. In the new approach the sequence of operations executed in the
construction is described by a language to be developed in the research project. This
operative model is continuously and implicitly stored by the system. The recorded
activity of the construction process can be played back later.
The main objective of the research project is the systematic and complete description
of a practical language for the operative modelling of CAD in civil engineering.
3.1 State-of-the-art
Existing standardization attempts focus upon the description of the results of the planning process.
The international organizations OMG (Object Management Group) and ODMG
(Object Database Management Group) define interfaces for the usage of objects in a
distributed environment and in object oriented databases (Serain 1999, Eastman 1999).
The SQL language has been standardized by the American National Standards
Institute (ANSI). SQL is a language for relational data models. The description of objects
and operations by the SQL language would be very cumbersome.
The ODMG has provided, with OQL (Object Query Language), a standardized language for the selection and manipulation of object models. OQL instructions can optionally be embedded in programs or be used by experienced users as an ad-hoc query language. OQL allows the selection of objects and their manipulation by calling methods; attribute values can be modified. The application of OQL by a user would require a deep understanding of object-oriented methods. In addition, the formulation of operative models in the OQL language would be a tedious task.
Another solution approach consists of the formulation of scripts for CAD. The Tcl/Tk (Tool Command Language/Tool Kit) language is commonly used as a scripting language. Although the language allows the formulation of CAD macros, no standardization has taken place until now. However, Tcl/Tk could be an environment for the implementation of the operative language (Ousterhout 1995).
Currently some CAD producers offer solutions on the basis of GDL (Geometric
Description Language) (GDL 2004): GDL allows, for instance, the import of catalogue
elements into CAD systems via the Internet. However, the scripting language contained
in GDL is not as open and extensible as Tcl/Tk: an implementation of the operational
language on this basis would be almost impossible.
STEP (STandard for the Exchange of Product Model Data) is the informal description of ISO 10303 (International Organization for Standardization), a whole family of international standards for exchange, data management in databases and implementation (Haas, Ilieva, Kessoudis 2002). The German automobile industry tries to
improve the quality of the data exchange by using the two dimensional subset STEP-CDS
(Haas 1999).
The IAI (International Alliance for Interoperability) attempts to provide a universal interoperable data basis for all phases in the lifetime of a building. The corresponding data
model is the IFC (Industry Foundation Classes). Interoperability here means that all planners involved would use this data basis. Currently the data exchange of a subset of the originally planned scope has been realised (Steinmann, Liebich 2002).
State-of-the-art concerning STEP and IFC in civil engineering is the communication
of the planners by the exchange of evaluated data. By contrast, our research project
focuses upon unevaluated operative CAD models.
XML (eXtensible Markup Language) is an open standard of the W3C (World Wide Web Consortium) for the format of documents. It is suitable for the textual description of evaluated object models. However, the usage of XML as a language for operative modelling would be rather tedious.
3.2 Recent fields of research in Germany
Net-distributed processes are currently in the focus of research in Germany. These objectives are being explored in the Priority Programme 1103 Net-distributed planning processes in Structural Engineering of the German Research Foundation (Firmenich 2004). As many as 14 research projects are participating in the research programme. There are no overlaps between the opCAD project and the Priority Programme.
3.3 Traditional workflow scenario
Available CAD systems have data structures that are optimised for their own tasks. A
data exchange requires a transformation according to a common data scheme agreed
upon. Very often, the data cannot be completely transformed in practice due to the
incompatibility of the agreed data scheme. Even worse, the loss of information
accumulates with each data exchange. This problem is explained below using the
workflow of a typical co-operative scenario in the planning process:
Planner A: Local work (Fig. 1a)
During project work planner A has reached an intermediate state M0A of the building instance. As agreed, this version of the planning material shall be technically complemented by planner B.
Planner A: Generation of the exchange data (Fig. 1a)
The CAD systems of planners A and B are incompatible with one another. For the data exchange planner A has to generate the exchange data Z0 from the native data M0A according to the agreed data scheme. Not all information can be stored: this information is described by the difference set M0A\Z0.
Planner B: Import of the exchange data (Fig. 1b)
Planner B receives the exchange data Z0 and transforms it according to his own native data scheme into the data model M0B. The set Z0\M0B contains the non-transformable information. The accumulated loss of information is described by the set (M0A\Z0) ∪ (Z0\M0B).



Figure 1a. Traditional workflow of the data exchange: Planner A.
Planner B: Local work (Fig. 1b)
Planner B complements the building version M0B by some technical issues and stores the product instance as M1B in his own native format. This data has to be transferred to planner A.
Planner B: Generation of the exchange data (Fig. 1b)
Planner B generates the exchange data Z1 from the native data M1B. During this process the information set M1B\Z1 gets lost. The accumulated loss of information is now (M0A\Z0) ∪ (Z0\M0B) ∪ (M1B\Z1).
Planner A: Import of the exchange data (Fig. 1c)
Planner A transforms the exchange data Z1 into his own native data M1A. The information set Z1\M1A gets lost during this step. At the end of the workflow the set of the accumulated information loss is (M0A\Z0) ∪ (Z0\M0B) ∪ (M1B\Z1) ∪ (Z1\M1A).
The problem of exchanging evaluated data
The scenario illustrates the general problem of a cooperation based on the exchange of evaluated data, i.e. the accumulated loss of data during the numerous required data transformations.
Some of the projects involved in the current Priority Programme 1103 explore the
applicability of versioned building instances in the planning process. Available results
show that such systems will play an important role in future distributed applications. The
exchange of versioned building instances is considered a currently relevant field of
science because available specifications do not support this topic.
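The accumulating loss in the traditional scenario of section 3.3 can be sketched with plain set arithmetic. This is a toy Python illustration under invented assumptions: the feature names and the contents of the native and exchange schemes are made up for the example.

```python
# Set-based sketch of the accumulating loss in the traditional workflow:
# each transformation to/from the exchange format can only carry the
# intersection, so the loss grows with every exchange. Feature names invented.

m0a = {"geometry", "material", "layer", "native_annotation"}
schema = {"geometry", "material", "layer"}          # agreed exchange scheme
native_b = {"geometry", "material", "cost_link"}    # planner B's native scheme

z0 = m0a & schema                 # exchange data generated by planner A
m0b = z0 & native_b               # what planner B can actually import

loss_a = m0a - z0                 # lost on export: M0A \ Z0
loss_total = m0a - m0b            # accumulated loss after one transfer

print(sorted(loss_total))         # ['layer', 'native_annotation']
```

Each further transformation (B exporting Z1, A importing it) would only enlarge `loss_total`, which is exactly the accumulation problem the paper's operative approach avoids.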

Figure 1b. Traditional workflow of the data exchange: Planner B.

Figure 1c. Traditional workflow of the data exchange: Planner A.
The proposed approach for the data exchange allows version modelling, in which two completely different approaches are known. In a version-oriented model each single version is explicitly stored. In a change-oriented model the changes are stored instead of the versions: versions have to be generated implicitly by applying the stored changes. In (Zeller 1997) it is shown that the version-oriented and the change-oriented approaches are equivalent in result. This is important for the proposed solution since it allows the exchange of versioned data.
4.1 Improvement of data exchange
The result of the CAD processing is a new version of the building instance. The
traditional data exchange is based upon the version-oriented approach (Figure 2) that can
be described mathematically as a graph
G := (M, V)

The elements of the node set M represent versions; the elements of the edge set V ⊆ M × M represent the relationships between two versions. Differences between two subsequent versions can only be obtained by a comparison between the two versions.

This method leads to unsatisfying results because the semantics of the change between
the two versions cannot be reconstructed.
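The version-oriented model and its weakness can be sketched directly. This is a minimal Python illustration, not the paper's formalism: the version contents and identifiers are invented, and the state-based comparison shows why the semantics of the editing operation is lost.

```python
# Sketch of the version-oriented model G = (M, V): versions are stored
# explicitly, and a change can only be recovered by comparing two versions.
# The comparison yields only the differing state, not the operation itself.

versions = {                      # node set M: version id -> evaluated model
    "m0": {"wall_w": "solid"},
    "m1": {"wall_w": "solid with recess r"},
}
edges = [("m0", "m1")]            # edge set V, a subset of M x M

def diff(a, b):
    """Purely state-based comparison; the editing operation stays unknown."""
    return {k: (versions[a].get(k), versions[b].get(k))
            for k in versions[a].keys() | versions[b].keys()
            if versions[a].get(k) != versions[b].get(k)}

print(diff("m0", "m1"))  # {'wall_w': ('solid', 'solid with recess r')}
```

The diff tells us *what* differs but not *why*: whether the recess came from a difference-set operation or from redrawing the wall cannot be reconstructed from the two states.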
This problem can be explained by an example from solid modelling. The consistent adjustment of a duct inside a building requires a recess r inside the wall w. Technically, this requirement has to be realized by a difference set operation w\r. In a B-Rep model the semantics of the operation recess planning could not be reconstructed subsequently. In a CSG model, however, the operation and its semantics can be reconstructed because this information is part of the data structure.
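The B-Rep/CSG contrast above can be made concrete with a tiny sketch. This is an illustrative Python fragment with invented names (Difference, describe); it is not a real solid-modelling kernel.

```python
# Sketch of the contrast drawn above: a CSG model keeps the construction
# operation (here the difference "wall minus recess") in its data structure,
# while a B-Rep model would only store the resulting evaluated boundary.

class Difference:
    """CSG node: the operation and its operands remain reconstructable."""
    def __init__(self, left, right):
        self.left, self.right = left, right

    def describe(self):
        return f"({self.left} \\ {self.right})"

wall_with_recess = Difference("wall w", "recess r")
print(wall_with_recess.describe())   # (wall w \ recess r)
```

From the `Difference` node the semantics "a recess was planned in this wall" can be read back at any time; an evaluated boundary representation of the same solid would not carry that information.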
The solution approach presented is based upon the change-oriented approach (Figure 3). As in the version-oriented approach, the node set M contains the versions. However, these versions need not necessarily be stored in an evaluated form. It should be noted that for performance reasons the

Figure 2. Version-oriented approach.

Figure 3. Change-oriented approach.

versions could additionally be explicitly stored. Each edge (mi, mj) ∈ V is assigned the change δij between the source version mi and the destination version mj. The original version m0 represents a special case: it is established by a change δx0 applied to the empty virtual version x. The changes δij can be explicitly exchanged between the users and can then be applied to versions.
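The change-oriented model can be sketched as a replay mechanism. This is an illustration only, not the opCAD language: changes are modelled as simple Python callables keyed by the edge they belong to, and the version contents are invented.

```python
# Sketch of the change-oriented model: only the changes are stored; any
# version is generated implicitly by replaying the changes along a path
# starting at the empty virtual version x.

changes = {
    ("x", "m0"): lambda m: {**m, "wall_w": "solid"},          # change x -> m0
    ("m0", "m1"): lambda m: {**m, "wall_w": "with recess r"}, # change m0 -> m1
}

def replay(path):
    """Apply the stored changes along a version path, beginning empty."""
    model = {}                                   # empty virtual version x
    for edge in zip(path, path[1:]):
        model = changes[edge](model)
    return model

print(replay(["x", "m0", "m1"]))   # {'wall_w': 'with recess r'}
```

Because the callables themselves are stored, the semantics of each step remains available, which is exactly what the state-based version graph loses.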

The method described is state-of-the-art in Software-Configuration-Management

(SCM): Changes that transfer a source version into a destination version are named a
patch in SCM. This is described for instance in (Fogel, Bar 2002).
The solution approach described is also based upon the change-oriented approach.
However, there are serious differences between CAD and SCM. In SCM, changes are
described by character strings to be inserted, overwritten or deleted at a certain position.
Due to the complexity of the planning process, the required methods for CAD models
cannot be specified easily. This task is in the focus of the opCAD research project.
In the design and bidding phase specific proposals are developed frequently and have
an important impact on the building contract. Figure 3 shows that two variants m1 and m3
have been derived from version m0. A second version m2 has been derived from variant
m1. As is generally known, merging is a serious problem since the merged version has more than one input edge. For instance, version m4 has been merged from the two source versions m2 and m3 by the application of the two changes δ24 and δ34.
State-of-the-art methods barely offer functionality to handle variants: the merging process cannot be reconstructed based on the destination version. In the proposed solution approach this is handled completely differently: instead of the destination version, the operations to be applied to the source versions are recorded.
4.2 Novel workflow scenario
The proposed solution approach can be used both in an unversioned and in a versioned planning environment. Again, the scenario from section 3.3 is used to describe the
change-oriented approach and its

Figure 4a. Novel workflow of the data exchange: Planner A.

Figure 4b. Novel workflow of the data exchange: Planner B.

advantages over the version-oriented approach:

Planner A: Local work (Fig. 4a)
During project work planner A has reached an intermediate state M0A of the building instance. According to agreements, this version of the planning material shall be technically complemented by planner B. The change δx0 has been continuously and automatically recorded during interactive work.
Data exchange between planner A and B (Fig. 4b)
Not the evaluated version M0A, but the changes δx0 applied during the interactive work of planner A are exchanged. With the help of the changes δx0, planner B generates the native building instance M0B in his software system. Possibly, some of the changes described in δx0 cannot be realized in M0B. The following set describes this information:

VB (6)
Planner B: Local work (Fig. 4b)
Planner B complements the building instance M0B by his own technical components and stores it as version M1B. During work, the change δ01 between M0B and M1B is continuously and automatically recorded by the CAD system.
Data exchange between planner B and A (Fig. 4c)
Planner A still owns the version M0A. Therefore it is sufficient to exchange the change δ01 and to apply it to this version. From this operation the new version M1A evolves. If the change δ01 cannot be represented completely in the native model of planner A, then the set of non-transferable information can be expressed as:

VA (7)
It should be noted that the loss of information during the workflow process does not
accumulate: The loss

Figure 4c. Novel workflow of the data exchange: Planner A.

A novel modelling approach for the exchange


Figure 4d. Novel workflow of the data

exchange: Planner C.
of information VB described in equation (6) is not contained in set VA described in
equation (7).
Data exchange between planner A and C (Fig. 4d)
Since planner C does not own the evaluated version MQA, the changes of sequence (x0,
01) are applied subsequently to the empty version x: This leads to the version M1C. The
set of untransferable information is
The loss of information does not accumulate since information that cannot be stored in
the native system of planner B (set VB according to equation (6)) may be absolutely
storable in the system of planner C.
Advantages of the solution approach
A change δij between two versions Mi and Mj can be formulated by the language to be specified in this research project. The change δij need not be transformed for the data exchange: it remains unchanged forever. The receiver of the change δij has an interpreter that issues the required actions to transform version Mi into version Mj. During this procedure a loss of information occurs if the result of a specific change cannot be represented by the destination model. However, the loss of information does not accumulate, since the exchanged changes δij are conserved in their original form.
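The mechanics of exchanging changes rather than versions can be sketched in a few lines of Python. The operation format, the element names and the `supported` capability sets are illustrative assumptions, not the opCAD language itself:

```python
# Sketch of the change-oriented exchange (assumed data structures, not
# the opCAD language): a version is a dict of element-id -> properties,
# and a change is a list of recorded operations that an interpreter
# applies to a version.

def apply_change(version, change, supported):
    """Apply a recorded change to a copy of `version`; operations whose
    element type the receiving system cannot represent go into the loss
    set V instead of being silently dropped."""
    new = dict(version)
    loss = []
    for op, key, value in change:
        if value is not None and value.get("type") not in supported:
            loss.append((op, key, value))   # untransferable information
            continue
        if op in ("add", "modify"):
            new[key] = value
        elif op == "delete":
            new.pop(key, None)
    return new, loss

# Planner A records delta0 during interactive work; planner B's system
# supports only walls, so part of delta0 is lost there (the set VB).
delta0 = [("add", "w1", {"type": "wall"}),
          ("add", "f1", {"type": "freeform"})]
M0B, VB = apply_change({}, delta0, supported={"wall"})

# Planner C receives the original delta0, not B's lossy copy, so the
# information in VB is not lost again: losses do not accumulate.
M1C, VC = apply_change({}, delta0, supported={"wall", "freeform"})
```

Because the original change list is what travels, each receiver starts from the full recorded information, which is exactly why the losses do not compound along the workflow.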
4.3 Change generation
Available CAD systems support the recording of the applied operations: normally, these journals are described by proprietary macro languages. Due to the absence of standardized languages, such journals cannot be used for the data exchange.
Inside a graphical user interface, journaling is a complicated task. An example is the input of a digitized point inside a graphical window with a pointing device. If this input is stored in device coordinates, then the replay of this procedure in a window with another coordinate system leads to completely different results. It should be noted that these problems do not occur if the evaluated model is exchanged.
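The pitfall can be made concrete with a small sketch; the viewport parameters and the transformation are hypothetical, but they illustrate why a digitized point should be journaled in world rather than device coordinates:

```python
# Sketch of the journaling pitfall: replaying a point recorded in device
# (pixel) coordinates gives different results in a differently
# configured window; the world coordinates are viewport-independent.

def device_to_world(px, py, viewport):
    """Map a device pixel to world coordinates for a viewport given as
    (world origin x, world origin y, pixels per world unit)."""
    ox, oy, scale = viewport
    return (ox + px / scale, oy + py / scale)

click = (120, 80)   # the digitized point, in device coordinates

# Replaying the raw pixel pair in two windows yields two different
# world points ...
world_a = device_to_world(*click, (0.0, 0.0, 10.0))
world_b = device_to_world(*click, (5.0, 5.0, 20.0))

# ... so the journal should record world_a itself, not the pixel pair.
```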
4.4 Other applications
Operative modelling could also be applied to other problem domains. For instance, queries and manipulations of the CAD data model could be handled with the help of the language for operative modelling.
In database technology the standardized SQL language serves to query and to process the relational data model. The users of a database system formulate their ad-hoc queries with the help of the SQL language (Date 2000). It is also possible to embed SQL statements inside application programs. Today, SQL is a complete programming language for relational database systems.
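As a concrete illustration of this analogy, here is an SQL query embedded in an application program, using Python's built-in sqlite3 module and a hypothetical table of building elements:

```python
import sqlite3

# The SQL analogy, sketched with an in-memory database: an ad-hoc query
# over a hypothetical table of building elements, embedded in an
# application program much as operative-modelling statements could be.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE element (id TEXT, kind TEXT, storey INTEGER)")
con.executemany("INSERT INTO element VALUES (?, ?, ?)",
                [("w1", "wall", 1), ("d1", "door", 1), ("w2", "wall", 2)])

# The query is formulated declaratively and executed by the database's
# interpreter -- the same division of labour envisaged for the CAD language.
walls = [row[0] for row in
         con.execute("SELECT id FROM element WHERE kind = 'wall' ORDER BY id")]
con.close()
```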
Analogously, the language for operative modelling could be used to solve problems in the CAD domain. A simple language would be highly desirable, since existing modern programming environments can only be handled by specialists. The simplicity of the language to be specified is a major objective.
Another important application is the archiving of operative CAD models with the help of the standardized language. Experience shows that the syntax and semantics of an unevaluated model do not change as frequently as those of an evaluated model. Beyond that, the archived data can be interpreted by users, not only by program interpreters.
The advantages of the proposed solution approach are shown in an example from 3D solid modelling. The available solid modeller ACIS (Corney & Lim 2001) describes solids with a BRep data structure.
The BRep data structure describes the topology and the geometry of the solid boundary. It allows one to distinguish between points inside the solid, points outside the solid and points on the boundary of the solid. The ACIS modeller allows solids to be saved and restored as a textual representation in so-called SAT files (Standard ACIS Text).
The ACIS modeller has an interface for rapid prototyping: the Scheme language allows programs to be written that are executed by an interpreter. While the Scheme language allows the formulation of a program for an unevaluated description of a solid, the SAT file represents an evaluated description of the solid.

Figure 5. Example from Structural Engineering.

Figure 6. Building description with the Scheme language.
A specific example from Structural Engineering is shown in Figure 5. The example consists of a building b with a wall w1, a door d and another wall named w2. The resulting solid of the building can be represented by the following set operation:

b = (w1 − d) ∪ w2 (8)
Equation 8 can be formulated in the Scheme language. Figure 6 shows that the operations can be formulated in a program consisting of just four lines of code. In contrast, the evaluated description of the building's solid is shown in Figure 7. It should be noted that this file consists of 288 lines of code; most of them have been omitted here for brevity.
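The contrast between the short unevaluated description and the long evaluated one can be sketched with point-membership predicates; the box dimensions below are illustrative assumptions, not taken from the paper:

```python
# Point-membership sketch of the unevaluated (operative) description:
# solids are predicates, combined by set operations. Box dimensions are
# illustrative only.

def box(x0, y0, x1, y1):
    return lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1

def union(a, b):
    return lambda p: a(p) or b(p)

def subtract(a, b):
    return lambda p: a(p) and not b(p)

w1 = box(0, 0, 4, 3)    # first wall
d = box(1, 0, 2, 2)     # door opening cut out of w1
w2 = box(4, 0, 8, 3)    # second wall

# The whole building as one short, unevaluated expression -- compare the
# 288-line evaluated SAT description of the same solid.
b = union(subtract(w1, d), w2)
```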



5 Conclusions
At present, the exchange of information in civil engineering is predominantly based upon the version-oriented approach. Due to incompatible data structures, this transformation process is characterized by (1) a certain loss of information that (2) accumulates with each transformation process.
In this paper another solution approach has been proposed. The basic idea is that, instead of the building instance itself, the changes that lead to this building instance should be exchanged. A prerequisite for the change-oriented approach is a method for the formal description of the changes.

Figure 7. Evaluated building description in a SAT file.
It is proposed that the changes should be described in the form of the executed operations: this concept is denoted as operative modelling. A programming language for the formal description of operative models is being developed in the opCAD research project. In order to be usable by engineers, this language should ideally be rather simple.
It was pointed out that the novel approach has advantages over the traditional approach. In particular, the loss of data during the document workflow does not accumulate, and the procedure can readily be used in a versioned environment. The proposed language for operative modelling could be used by engineers and programmers for the formulation of solutions to their specific technical problems. The language is also considered to have advantages in the area of long-term compulsory archiving.
Acknowledgements
The author gratefully acknowledges the support of this project by the German Research Foundation.

References
Corney, J. & Lim, T. 2001. 3D modeling with ACIS. Stirling: Saxe-Coburg.
Date, C.J. 2000. An introduction to database systems. Reading: Addison-Wesley.
Eastman, C.M. 1999. Building product models: computer environments supporting design and construction. Boca Raton: CRC Press.
Firmenich, B. 2004. Product Models in Network Based Co-operation in Structural Engineering. In: Proceedings of the Tenth International Conference on Computing in Civil and Building Engineering (ICCCBE-X). Weimar: Bauhaus-Universität, Universitätsverlag.
Fogel, K. & Bar, M. 2002. Open Source-Projekte mit CVS. Bonn: mitp.
GDL 2004. GDL Alliance. http://www.gdlalliance.com/ (1-Jun-2004).
Haas, W. 1999. Datenaustausch und Datenintegration: STEP und IAI als Beiträge zur Standardisierung. Frankfurt: ACS.
Haas, W., Ilieva, D. & Kessoudis, K. 2002. Erfahrungen beim Einsatz Web-basierter Planmanagementsysteme im Planungsalltag. In: Bauen mit Computern: Kooperation in IT-Netzwerken. Düsseldorf: VDI Verlag.
Ousterhout, J.K. 1995. Tcl und Tk: Entwicklung grafischer Benutzerschnittstellen für das X Window System. Bonn: Addison-Wesley.
Serain, D. 1999. Middleware. London: Springer-Verlag.
Steinmann, R. & Liebich, T. 2002. IAI, Industrie Allianz für Interoperabilität: Stand der weltweiten Aktivitäten. In: Bauen mit Computern: Kooperation in IT-Netzwerken.
Zeller, A. 1997. Configuration Management with Version Sets. Dissertation, Fachbereich Mathematik und Informatik, Technische Universität Braunschweig.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Integration of product models with document-based information

University of British Columbia, Vancouver, BC, Canada

ABSTRACT: Recent trends in information technologies for architecture, engineering, construction, and facilities management rely on model-based applications and interoperability. Yet even in the most optimistic scenarios for these building information model approaches, much project information will remain as traditional unstructured documents. There have also been many information technology advances to support document-based information, but only limited work to inter-relate these two significant bodies of project information. This paper introduces the issue of integrating model-based and document-based information, and in particular, the need to synchronize human and computer communication channels in project transactions. The paper then outlines a series of technical approaches: cross-referencing, text processing and data mining, hybrid document types, and a presentation layer for building information models.

1 Introduction

Much of the recent research and development into information technologies for the
construction industry has focused on model-based systems. These systems use structured
data models of facilities and their associated construction projects to support a range of
application tools and the integration of information across the project lifecycle. However,
in even the most optimistic scenario for model-based approaches, the vast majority of
current project information exists in the form of unstructured documents. At present, there is very little linkage between the technologies for working with unstructured documents and model-based technologies. This paper discusses issues relating to the integration of these two branches of information technology.
After introducing model-based and document-based technologies, the paper will show
how a consideration of human-computer communication channels leads to the
requirement for synchronized model-based and document-based technologies. Next, the
basic techniques for establishing linkages between the two technologies are described.
These include references from an object-oriented project model to external documents, or
references to model objects from document meta-data. Text processing approaches will
be discussed as another relevant approach to integrating the two technologies. The paper
will also discuss specific types of documents that can span between unstructured
document-based and model-based approaches, such as 2D CAD and project
specifications. The paper will then discuss issues relating to the generation of static and
dynamic documents from project data models in the form of a presentation layer



component of model-based technologies. This presentation layer can help ensure that the receiver of a project data model also receives the specific interpretation of the information that the originator intended to send. This could have a significant impact on enabling the use of model-based approaches for carrying out specific information transactions.
This paper lays out a series of approaches to the integration of model-based and document-based information. Our work is in the early stages of forming a research project, so this paper outlines the technical issues, but does not report a detailed literature review or research results.
2 Model-based technologies

Many of the leading edge information technologies emerging to support the construction
industry rely on model-based techniques. In general, we consider model-based
technologies to be any IT that organizes information into elements associated with
semantic meaning that reasonably corresponds to the semantics of the actual construction
project, i.e., the data objects model the real-world objects. The primary example of this is the move from purely geometric CAD, where the system works with geometric primitives (lines, etc.) and it is left to users to interpret these primitives as real-world elements (walls, etc.), to object-based CAD systems where both the system and the user work with elements that represent walls, slabs, doors, windows, etc.
Any software tool that uses model-based techniques, then, can be described as a
model-based application. Thus, object-based CAD, estimating, scheduling, and structural
or HVAC analysis software can be model-based, since they organize their information
around elements that correspond to real-world elements (e.g., building components, costs,
tasks, beams, and heat sources, respectively). On the other hand, word processor
documents, photographs, spreadsheets, and traditional CAD are generally not model-based, since they organize their information around elements that do not correspond to
real-world elements (e.g., words, raster images, cells, and lines, respectively).
Moreover, there is a trend towards comprehensive and integrated model-based
approaches, where a variety of model-based tools can exchange data and collectively
develop detailed, multi-purpose data models of construction projects. The key issue here
is that, since the software captures some of the semantic meaning of each element of
information, the potential exists to inter-relate all of this information. This integration can
greatly leverage the value of the individual applications and data sets. Ultimately, this
could lead to an approach where most project information and communication is centred
on a virtual project model that is developed and managed in parallel with the
development and management of the actual physical project. This approach, referred to
by various terms such as building information models and virtual design and
construction, requires both model-based applications and model-based interoperability.
Model-based interoperability, in which project data models serve as a common language
for exchanging data between applications, is typified by the Industry Foundation Classes
(IFC) data standard (International Alliance for Interoperability, 2004; Kam et al., 2003).



3 Document-based technologies

In spite of the interest in model-based technologies, even the most optimistic scenarios
must concede that a large proportion of project information will remain in the traditional
form of unstructured documents for many years to come. Here too, there have been
significant advances in supporting information technology. The vast majority of project
documents are now produced electronically using various computer applications, while
the consumption (viewing), transmission and storage of documents are a combination of
electronic and paper-based. Electronic document management systems provide a
comprehensive range of features to manage project documents, including conversion
between paper and electronic documents (printing, scanning, and character recognition),
sharing and distribution, storage, versioning, indexing and searching, tracking, etc.
From an IT perspective, these documents contain unstructured data, yet much of the IT
used to support document management makes use of document metadata, or structured
data about documents. Document meta-data can range from simple information such as
document type and creation date (which exists for virtually any electronic document), to
extensive industry-specific information that links documents to related people and roles,
type of work, contractual relationships, project phase, etc. Again, interoperability
becomes important and standards have been developed. Examples include the Dublin
Core as a general document meta-data standard (Dublin Core Metadata Initiative, 2004)
and various construction-industry-specific meta-data standards (International
Electrotechnical Commission, 1999). While meta-data provides opportunities for
organizing and managing documents, it can be difficult to capture and maintain
meaningful meta-data, particularly if it requires additional data entry from end users.
Other information technology trends that are not directly related to document
management per se, but that are closely related, include project web portals (which act as
a central collaboration site for project teams and often include extensive document
management features), and workflow management systems (which can be used to define
typical work processes, manage the assignment and progress of tasks, and automate much
of the information handling requirements).
4 Communication channels in project transactions

Figure 1 illustrates key elements and information interfaces in an IT environment. Within
the construction industry, most design and management tasks are fairly well-supported by
computer tools. However, these are not isolated activities; rather, they are highly
collaborative, involving large numbers of project participants operating in a highly
fragmented and dynamic environment. Correspondingly, IT solutions involve not only
stand-alone computer applications, but must be viewed as elements in an overall
technical and social system.
Within this system, information flows between individual users and their computer
based tools (data entry from the user to the computer, and data interpretation from the
computer to the user). Information also flows between users (as direct face-to-face or telephone conversations, or via exchanged documents), and between different computer applications (as shared computer data).

Figure 1. Elements and information interfaces for an individual participant and the overall system of a construction project.
At present, information sharing typically involves a project participant entering project
data into a computer application to produce useful project information, creating a paper
or electronic document containing the information, and distributing the document to
others (via mail, fax, courier, or e-mail), after which other participants interpret the
document and re-enter relevant information into their own computer applications. Thus, there is little systems integration and interoperability, and the data exchange that does occur is inefficient, time-consuming, error-prone, and a barrier to greater computer use.
The inefficiency of this approach to exchanging information between computer
systems (from computer application to human-interpreted documents and back into a
second computer application) is improved by using direct computer-to-computer data
sharing. However, it is not sufficient to rely on computer-based data sharing alone, since
this creates the opposite effect. A user working with one application may interpret some
project information as having certain significance for the project (e.g., the design does not meet certain user requirements, the costs are over budget, or the work method is infeasible). If the same project information is successfully communicated to a different computer application used by another project participant, there is no assurance that the
second user will interpret the same information in the same way. That is, they may have
the same data available to them, but they may not recognize the design, cost, or work
method problems.
As a further example, a project architect and a general contractor could collaborate to
develop a detailed and complete IFC-based building information model for a building
project. The architect might then send the product model to the structural engineer to
inform them of some design changes, or the general contractor might send it to a
subcontractor to bid on a work package. In both cases, the product model might be



extremely beneficial, e.g., effectively feeding data into structural analysis or estimating
software. However, the building model can be a large data set that is beyond the scope of
any single application to effectively convey to the user. With the building model alone,
the structural engineer might not clearly understand what parts of the building had
changed, why, and what additional requirements existed; the subcontractor might not
clearly understand what scope of work was included in the work package. For the overall communication to be effective, both the computer-to-computer data sets and the human-to-human interpretation of the data must be exchanged.
This illustrates the fact that efficient project communication must take place along all of the communication channels: human-to-human, human-to-computer, and computer-to-computer. Furthermore, these different communication channels should be coordinated; in effect, a project communication from one user to another could say "here is a data set and here is a document that describes how I expect you to interpret this data."
Previous research into model-based interoperability has not addressed this
coordination of human-based and computer-based communication channels. Our view is
that a successful solution to project information management requires this type of
coordination, and that this is achieved by inter-relating the model-based and document-based information technologies. The remainder of this paper introduces a variety of basic
technical approaches for establishing this inter-relationship.
5 Cross-referencing

Perhaps the most basic approach to integrating model-based and document-based
technologies is the use of cross references from one to the other. In most cases,
documents can be identified by some type of unique identifier. Project model standards
such as the IFCs include mechanisms to associate a reference to an external document
(via the document identifier) with any project object. If, for example, the document is
stored in an electronic document management system on a project collaboration web site,
this reference information could allow the user of a CAD system to select a component of
the building in the CAD system and directly access any associated document (such as the
manufacturers specifications for a window). A similar mechanism can be used in the
reverse order, where documents refer to individual objects in a project model by their
unique ID. This reference could be embedded in the unstructured document itself (similar
to including a hyperlink URL in a word processor document), or it could be part of the
structured metadata associated with a document (e.g., request-for-information notices
stored in a project collaboration web site could be indexed according to the building
components in question).
This approach of representing cross-references is technically straightforward, yet it provides a sufficient representation for various types of computer applications to create quite effective integration of model-based and document-based information. Some of the challenges include capturing the cross-reference relationships in the first place, and managing a large collection of cross-referenced project information. Another challenge is
that this approach creates a relationship between a specific model object and a specific
document, from which systems may need to infer more specific or more general
relationships. For example, a relationship may be established between a change-order



notice and a specific room in a building; yet the document should be retrieved if a user is looking for all change documents associated with a specific wall within the room, or associated with the entire floor of the building.
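The ID-based cross-referencing and the hierarchy inference described above can be sketched as follows; the object IDs, document names and parent links are hypothetical, not the IFC schema itself, and the more-specific (descendant) direction would need an inverse child index:

```python
# Sketch of ID-based cross-references: documents reference model
# objects by unique ID, and retrieval walks up the spatial hierarchy,
# so a query on a wall also finds documents attached to its room or
# floor. All identifiers here are illustrative.

parents = {"wall_7": "room_201", "room_201": "floor_2",
           "floor_2": "building_1"}
doc_refs = {"change_order_42": ["room_201"],
            "window_spec_9": ["wall_7"]}

def docs_for(object_id):
    """Documents referencing the object or any of its ancestors."""
    chain = {object_id}
    while object_id in parents:
        object_id = parents[object_id]
        chain.add(object_id)
    return sorted(doc for doc, refs in doc_refs.items()
                  if chain.intersection(refs))
```

A query on the wall thus retrieves both the document attached to the wall itself and the change order attached to its enclosing room.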
6 Text processing and data mining

One opportunity to link model-based and document-based information is to use text
processing techniques on documents (or other types of data mining techniques on other
types of data) to extract significant words and a limited amount of semantic information
from the documents. This semantic information can then be associated with the
documents as anything from a set of simple keywords to a structured data model of the
document content, which could then be mapped to other model-based data sets. Text
processing and data mining of construction documents have been examined by a number
of researchers (Schapke et al. 2002; Caldas and Soibelman, 2002).
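A minimal sketch of the idea: scan a document for identifiers that match objects in the project model and emit candidate cross-references. Real systems (see the cited work) use far richer classification; the naming convention below is a hypothetical assumption:

```python
import re

# Sketch: extract candidate model-object references from unstructured
# document text by matching a hypothetical ID convention.

model_objects = {"W-101": "wall", "D-17": "door", "RM-201": "room"}

def candidate_links(text):
    """Model-object IDs mentioned in the document text."""
    tokens = set(re.findall(r"[A-Z]+-\d+", text))
    return sorted(tokens & model_objects.keys())

rfi = "RFI: please clarify the fire rating of wall W-101 beside door D-17."
links = candidate_links(rfi)
```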
7 Hybrid document types

Certain systems represent information in a manner that spans between model-based and
unstructured-document-based (i.e., hybrid approaches). An example would be a
specification-authoring system that allows the user to organize the specification
documents for a project into a highly structured hierarchy of sections. Each section is
made up of a passage of unstructured text interspersed with certain words, phrases, names
of products, etc., that are structured data fields linked to an underlying data model and
database. In an integrated scenario, these data fields could be mapped to elements in an
overall project data model. We describe this as a hybrid type of document since the
information is only complete and usable when the unstructured text and the structured
data fields are considered together.
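A hybrid section of this kind can be sketched as prose carrying structured fields; the field names, product name and value below are illustrative assumptions:

```python
# Sketch of a hybrid specification section: unstructured prose carrying
# structured fields (marked {like_this}) bound to an underlying data
# model. All field names and values are illustrative.

section_text = ("All exterior glazing shall be {product}, with a "
                "U-value not exceeding {u_value} W/m2K.")
fields = {"product": "ThermoPane 2000", "u_value": "1.1"}

# The section is only complete when prose and fields are combined; the
# same field values could equally be mapped to objects in a project model.
rendered = section_text.format(**fields)
```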
8 A presentation layer for model data

Section 4 described the need to communicate both the computer-to-computer data sets
and the human-to-human interpretation of the data as part of an effective transaction. For
model-based information, this might be done by deriving document-based versions of the
model data and incorporating these documents within the model itself. We describe this
approach as a presentation layer within the model.
The documents could be produced by any application that can produce a useful view
or presentation of the information contained in the data model, possibly by reading parts
of the data model and requiring additional effort from a user to manipulate, interpret, and
supplement the model data.
A simple example would be a text document that lists the key changes made between
two versions of a data model. Another example would be a CAD program that reads the
data model and uses the model geometry to produce a 2D cross-section of a segment of a
building, which a designer then embellishes to produce an annotated design detail



drawing. A final example would be a bill of quantities report in which certain quantities
are linked directly into an enumeration of objects within the data model.
Presentation layer documents can be either static (capturing a specific view of the data
set at a specific point in time), or dynamic (presenting a specific view of the data that is
automatically updated when the data model changes). In a static approach, for example,
the 2D design drawings described above might be written to an Acrobat (PDF) file embedded within the data model. In a dynamic approach, an XSLT template document could be created that would produce an HTML version of the bill of quantities report (mentioned above) at any time by applying the template to an XML version of the data model.
With any of these approaches, the presentation-layer documents give snapshots (specific views) of the model data intended to convey specific information for specific purposes in human-to-human communications that accompany model-based transactions.
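A dynamic presentation-layer view can be sketched as a small transform over an XML version of the model; the paper suggests an XSLT template, and this plain-code transform of a hypothetical model illustrates the same idea:

```python
import xml.etree.ElementTree as ET

# Sketch of a dynamic view: an HTML bill-of-quantities table regenerated
# from an XML version of a hypothetical data model. Element names and
# volumes are illustrative.
model_xml = """<model>
  <object id="w1" type="wall" volume="4.5"/>
  <object id="w2" type="wall" volume="6.0"/>
  <object id="d1" type="door" volume="0.75"/>
</model>"""

def bill_of_quantities(xml_text):
    """Total volumes per object type, rendered as an HTML table."""
    totals = {}
    for obj in ET.fromstring(xml_text).iter("object"):
        kind = obj.get("type")
        totals[kind] = totals.get(kind, 0.0) + float(obj.get("volume"))
    rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>"
                   for k, v in sorted(totals.items()))
    return f"<table>{rows}</table>"

# Re-running the transform after any model change keeps the view current.
html = bill_of_quantities(model_xml)
```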
9 An example scenario

The following describes a possible scenario that would incorporate many of the issues
and techniques discussed in this paper:
On a given building construction project, many of the key participants have agreed to
use IFC-compatible model-based design and management tools.
A project web portal is adopted that incorporates strong document management features
as well as a model-server that can host a shared IFC building information model and
interface directly with participants IFC-enabled software applications.
One company acts as the information manager on behalf of the overall project, with one individual acting as a project information officer.
Throughout the project, participants make regular use of the portal, contributing project
information into the portal and accessing information from the portal.
Many of the information transactions are ad hoc in terms of who is using the
information, what information is used, and for what purpose. However, many other
transactions follow formalized transaction templates or specifications (e.g., requests
for information, progress payment claims, etc.).
The formalized transaction templates specify the content and form of information to be
included in the transaction. These can include complete or partial building models as
well as various types of documents.
Objects within the building information model and documents within the document
management system cross-reference each other, and tools on the project web portal
allow users to navigate between these different types of information.
The data required to create these cross references is acquired from user input, is
inferred from the context in which the information is entered into the system, and is
captured from within certain documents from some text processing capabilities built
into the portal site.
At any time, the entire building information model (or a specific subset), can be
exported from the portal site. This model can include the documents embedded in (and
cross-linked to) the model.



10 Conclusions

This paper has argued that model-based information technologies and document-based
information technologies should be integrated and used to support transactions that
combine human-to-human and computer-to-computer communications. The technical
approaches outlined to achieve this integration include cross-referencing, text processing
and data mining, hybrid document types, and a presentation layer within data models.
Solutions based on these approaches need not be complex, but some attention to this issue and recognition within an overall information management approach could significantly improve the effectiveness of project information and communication.
References
Caldas, C. and Soibelman, L. (2002), Automated Classification Methods: Supporting the Implementation of Pull Techniques for Information Flow Management, Proceedings IGLC-10, Gramado, Brazil, http://www.cpgec.ufigs.br/norie/iglc10/papers/99-Caldas&Soibelman.pdf
Dublin Core Metadata Initiative. Dublin Core Metadata Initiative (home page).
International Electrotechnical Commission (1999), Project IEC 62045 Ed. 1: Management data (meta data) associated with documents, URL: http://tc3.iec.ch/txt/169.htm (accessed June 3, 2004).
International Alliance for Interoperability (2004), IAI International Home Page, http://www.iai-international.org/iai_international/ (accessed June 3, 2004).
Kam, C., Fischer, M., Hänninen, R., Karjalainen, A. and Laitinen, J. (2003), The product model and Fourth Dimension project, Electronic Journal of Information Technology in Construction, Vol. 8, pp. 137-166.
Schapke, S.-E., Menzel, K., and Scherer, R. (2003), Towards Organisational Memory Systems in the Construction Industry, eSM@RT and CISEMIC Conference, http://cib.bau.tu-dresden.de/~sven/Publications/20020903_PubFinal_eSmart2002_SES.pdf


Aligning IFC with the emerging ISO 10303 modular architecture. Can the AEC community take advantage of it?

Ricardo Jardim-Gonçalves, Fátima Farinha*, Adolfo Steiger-Garção
Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia, UNINOVA, Portugal
*Universidade do Algarve, IST, Portugal
ABSTRACT: International Alliance for Interoperability (IAI) is a world
wide consortium aiming to define the requirements for software
interoperability in the AEC/FM industry. The deliverables of IAI are the
specifications of Industry Foundation Classes (IFC), an object oriented
software library for application development. The IFC model is described in
EXPRESS and based on the general architecture of ISO 10303 (STEP) to
describe its reference model. However, during the last few years, STEP
has been reorganised, now adopting a modular architecture, already with
encouraging results. Should IFC be aligned with the emerging ISO10303
modular architecture, with the AEC community gaining advantages from it? This
paper discusses a modular architecture for IFC, aligned with the work
developed for ISO10303, examining the benefits that such proposal can
bring to AEC. The paper catches results from the research developed
within CEN/ISSS and IMS projects close to ISO TC184/SC4, when
developing one of the first modular STEP Application Protocols, and
extends them to investigate the potentialities of this new architecture in the AEC sector.

Nowadays, business success depends on the seamless integration of enterprises' internal
processes and relies on collaboration with outsiders. With the advent of electronic data
exchange, heterogeneous data models and processes need to be integrated in order to
provide interoperability between systems. This has been identified as hard to
achieve, mainly because each application adopts its own distinct data structure and
semantics (Chen, 2000).
The principal recognized difficulties preventing interoperability between applications
are, in their genesis, problems related to: (i) data model compatibility and mapping; (ii)
different languages and methodologies for model representation; (iii) correctness of the
semantics of the data exchanged; (iv) readiness for model reusability and (v) accurate
conformance and interoperability checking between applications (Jardim-Goncalves,

The adoption of a common standard model, e.g., IFC or the STEP Application Protocols,
could help face this problem. However, these models should not be completely static, in
order to enable reusability and dynamic adaptation over time.
This flexibility requires the definition and agreement of common semantics, to
represent uniformly the meaning of the data to be instantiated and exchanged between applications.
The Building and Construction (B&C) industry involves a long process directed at the
satisfaction of clients' and customers' needs through the provision of quality products that
fulfil their purpose at a reasonable cost. It comprises complex activities that involve the
combined efforts of several specialists from different disciplines.
Contrary to what is seen in other manufacturing industries, the B&C artefact is
almost always a one-off product. This main difference between the creation of a single
item and mass production has led to the adoption, by the B&C community, of a conservative
technological attitude, and progress has not been as fast as in other industries (Jardim-Goncalves, 2003).
The life cycle of an artefact is long and normally it involves a large number of
participants with different experience and knowledge, often located in different
geographic areas. In B&C the life cycle of a product is basically a sequential process
comprising various stages in which the following stage does not begin before the
previous one is concluded.
Nevertheless, B&C projects allow concurrency between the different stages. For
example, excavation can begin before the design and planning stage is fully completed,
allowing a reduction of the overall project time, while phased occupation can begin
before completion of the external painting, providing an earlier return on investment.
If one considers, for instance, the design and planning stage, the processes
used are essentially sequential. However, concurrent procedures can easily be adopted in
a large number of situations, even for small projects.
The major obstacles identified as blocking such an approach are caused by two main factors:
1. Engineering data is not interoperable;
2. Interaction between different participants is neither represented nor correlated.
A unified and interoperable model that integrates all the information and
knowledge related to the different stages and allows participants to access all the
information is a requirement.
In this scenario, the end user can access the system's data directly or through
the project manager, controlled by the system's managing applications and rules.
However, for practitioners in these industrial environments, time and
material planning is still very often done manually at the construction site on a
standalone basis, mostly assisted by data received by phone call or fax
(on paper).
Regarding the utilization of state-of-the-art ICT in the B&C industry, adherence is still
very poor (Filos, 2000) (prodAEC, 2004). Most established ways are concentrated on the
individual use of CAD tools, scheduling/planning applications, and automation of certain
pre-fabrication processes.

Each application usually runs in isolation, without any capability to exchange
data automatically, driving companies toward proprietary solutions, without a
complete integration of applications and of product, process and business data.
Also, little work has been undertaken in the development of overall control
architectures for building sites, whereas the development of such control architectures for
other industrial sectors has advanced significantly in recent years.
The implementation of computer integrated environments including design, planning,
production, business and control to an outdoor building site can then be seen as a top
stage of ICT integration into the construction industry.
Reengineering is the fundamental rethinking and redesign of business processes
to achieve dramatic improvements in critical, contemporary measures of performance,
such as cost, quality, service and speed (Blockley & Godfrey, 2000).
The existence of effective seamless ways to exchange information (schedules,
resources, materials, cost, cash flow) between the different parties involved in building
and construction projects is a critical success factor. It can avoid project time and cost
overruns and assure better quality.
Aware of these facts, twelve companies interested in being able to work together
without being concerned about the applications each was using founded the
International Alliance for Interoperability (IAI). The key deliverable is the IFC Object
Model (IAI/IFC, 2001) that provides a formal specification of requirements that can be
used by software developers in creating compliant applications.
The IFC model, described in EXPRESS (ISO10303-11, 1994), is based on previous work
done by the International Organization for Standardization Technical Committee 184,
Sub-Committee 4 (ISO TC184/SC4), namely ISO 10303 (STEP, the STandard for the
Exchange of Product model data).
Both organizations are working in liaison on developing neutral definitions of
information that can be shared electronically, where STEP is concerned with all industry
sectors including B&C whilst IAI is vertically focused on B&C industry sector.
STEP publishes a proposal for a methodology for the development, implementation and
validation of an open architecture for the exchange and sharing of product data, together with a
set of public data models identified as Application Protocols (APs), i.e., data models
valid for use in the scope of one vertical application, ready to be adopted by
industry and covering the most important activities of the manufacturing process and
product life cycle.
The development of an Application Protocol (AP) follows a V-shaped
methodology. Figures 1 and 2 depict this methodology in two different views,
respectively: 1) time vs. industry/standardization bodies, and 2) level of implementation
vs. application context.
The Application Activity Model (AAM) is the first phase of the development of an
AP. Its purpose is to identify the relevant activities that are carried out in the scope of an
AP, together with the applicable information flow between them.

An AAM consists of a structured graphical representation of the main activities
defined in the scope of an AP, providing a global vision of the activities to be performed
in the system, together with the information flow, actors, processes, inputs and outputs.
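The activity/information-flow structure of an AAM can be sketched informally as follows. This is only an illustrative rendering, not part of any standard; the activity names and data items ("design", "drawings", etc.) are invented, and information flows are derived wherever one activity's output feeds another's input.

```python
# Sketch of an AAM as a set of activities with inputs and outputs;
# information flows are derived from matching outputs to inputs.
from dataclasses import dataclass

@dataclass(frozen=True)
class Activity:
    name: str
    inputs: frozenset
    outputs: frozenset

def information_flows(activities):
    """Return (source, target, datum) triples linking outputs to inputs."""
    flows = []
    for src in activities:
        for dst in activities:
            if src is dst:
                continue
            for datum in src.outputs & dst.inputs:
                flows.append((src.name, dst.name, datum))
    return flows

design = Activity("design", frozenset({"client brief"}), frozenset({"drawings"}))
planning = Activity("planning", frozenset({"drawings"}), frozenset({"schedule"}))
site = Activity("site works", frozenset({"drawings", "schedule"}), frozenset({"building"}))

flows = information_flows([design, planning, site])
```

A real AAM would also carry the actors and controls of each activity (as in IDEF0 diagrams); the sketch keeps only the information-flow aspect.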

Figure 1. V approach for AP development: time vs. industry/standardization bodies.

Figure 2. V approach for AP development: level of implementation vs. application context.
IDEF0/SADT is an accepted methodology for developing AAMs, which should be done in
strong connection with the end-users, e.g., industrial experts.
Based on the AAM, the modeling experts, working together with the end-users, develop
and validate the Application Reference Model (ARM). The ARM is the data model that
describes the structure required to represent the information in the scope of the AP,
stating the information requirements and constraints in the application context.
This model is expressed using a normalized modeling language (e.g., ISO10303-11
EXPRESS), and should be structured using entities, attributes, assertions and
relationships, described so as to be easily understood by the industrial experts. Figure 3 depicts
the architecture for the development of one AP at reference level.
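To make the entity/attribute/assertion structure of an ARM concrete, the fragment below renders a hypothetical entity in Python rather than EXPRESS. All names (`Beam`, `span`, `depth`) are invented for illustration; the point is that an EXPRESS WHERE rule corresponds to a boolean constraint checked on each instance.

```python
# Hypothetical ARM fragment: an entity with attributes and
# WHERE-rule-style assertions, sketched as a Python dataclass.
from dataclasses import dataclass

@dataclass
class Beam:
    span: float   # m
    depth: float  # m

    def where_rules(self):
        """EXPRESS WHERE rules become boolean checks on the instance."""
        return {
            "positive_span": self.span > 0,
            "positive_depth": self.depth > 0,
        }

    def is_valid(self):
        return all(self.where_rules().values())

ok = Beam(span=6.0, depth=0.4).is_valid()
bad = Beam(span=6.0, depth=-0.4).is_valid()
```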
The Application Integrated Model (AIM) represents the implementable AP, when it
interprets the ARM using normative resources. The AIM acts as a formal data model
which defines the structure and content for the neutral data exchange (Figure 4).
Interpretation is a major task, performed by standardization experts, who map the
industrial concepts at ARM level to the available standard entities, released as standard
Integrated Resources (IRs). IRs are sets of entity schemas identified as
common concepts across many industrial APs.

Figure 3. Architecture for AP development at reference level.

Figure 4. Architecture for interpretation of one AP.
These resources were developed by standardization experts and made available ready for
reuse, representing generalized models suitable for adoption across many APs as the
basis of a platform for interoperability, extended to satisfy the information
requirements and constraints of an application reference model within an Application Protocol.
For the development of the AIM, when an IR concept is identified at an AP's ARM
level, the respective IR should be adopted and integrated in the implementable model
through a mapping process.
The mapper matches and extends the semantics from the model at
ARM level to the AIM, reusing the IRs. When a group of IRs suitable for
reuse by many APs is identified, it can be interpreted and integrated in an application-oriented
super-IR, identified as an Application Interpreted Construct (AIC).
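The interpretation step described above can be sketched as a lookup against a catalogue of Integrated Resources. This is a simplification for illustration only: the concept and entity names below are invented, and real STEP interpretation involves far richer mapping rules than a dictionary lookup.

```python
# Sketch of ARM-to-AIM interpretation: each application concept in the
# ARM is matched to a standard IR entity where one exists; concepts
# without a match must be modelled explicitly in the AIM.
IR_CATALOGUE = {
    "geometry": "geometric_representation_item",
    "material": "material_designation",
    "person":   "person_and_organization",
}

def interpret(arm_concepts):
    mapped, unmapped = {}, []
    for concept in arm_concepts:
        if concept in IR_CATALOGUE:
            mapped[concept] = IR_CATALOGUE[concept]   # reuse the IR
        else:
            unmapped.append(concept)                  # needs new modelling
    return mapped, unmapped

mapped, unmapped = interpret(["geometry", "material", "site_log"])
```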
An AP can be organized in several views, representing sub-scopes of usability
depending on the extent and objectives of its usage, i.e., the AP's Conformance Classes
(CCs). Each CC is defined by selecting a set of the ARM's entities that cover a specific
sub-scope of the AP. For example, a CC from an AP for geometrical representation can be
one that supports only 2D geometrical representation.

Figure 5. Global architecture of one integrated AP.
The organization of an AP in CCs is important for certification of the software
implementing the AP. This implies that an application, to be compliant with an AP, is not
forced to implement it fully: it can implement only one or more of its CCs, and
conform only to them.
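The conformance relationship reduces to a set-inclusion test: an application conforms to a CC when it implements every entity selected for that CC. The sketch below uses invented CC and entity names purely to illustrate this rule; real conformance testing also runs the Abstract Test Suites discussed later.

```python
# Sketch of conformance-class checking: a CC is a subset of the AP's
# entities, and an application conforms to a CC when it implements
# every entity in that subset.
CONFORMANCE_CLASSES = {
    "cc1_2d_geometry": {"point_2d", "line_2d"},
    "cc2_3d_geometry": {"point_3d", "line_3d", "surface"},
}

def conforms_to(implemented, cc_name):
    """True iff the implemented entity set covers the CC's entity set."""
    return CONFORMANCE_CLASSES[cc_name] <= implemented

# A hypothetical application implementing only part of the AP:
app_entities = {"point_2d", "line_2d", "point_3d"}
```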
Figure 5 depicts the complete integrated architecture established for development of
one AP, showing the major roles and relationships between its components.
This architecture is divided into two major blocks. The lower level, representing the
implementation specification of the model, includes the AIM, built using selected
standard IRs and AICs. This level is independent of the industrial scope.
The upper level represents the conceptual specification of the AP. This level uses the
AAM defining the APs application context, scope and functional requirements as the

entry point for the development of the ARM, which specifies in a formal modeling
language the application domain and the information requirements for electronic data exchange.
The ARM is built with several schemas of entities, usually known as Application
Objects (AOs), that can be clustered by functionality, defining Units of
Functionality (UoFs). This level is dependent on the industrial scope of the AP.
The CCs are defined to organize the AP into sets of conformance requirements for
implementation. To certify an application as compliant with a complete AP, or with a set of
CCs of an AP, standard Abstract Test Suites (ATSs) need to be defined for verification
and validation tests. These ATSs should also be part of the standard AP.
The two described blocks of this architecture are linked through a mapping procedure
that describes the reference path for complete semantics matching between the industrial
model representation and the neutral independent one.
An Application Protocol represents a referential standard model within a specific
application scope. However, the typically large size of an AP leads to complex
models, which become standards that are difficult to reuse. They need to be organized in
well-defined, simple, scope-limited modules.
To respond to this need, modularization is an increasingly important research activity
related to the development of standards. It is expected to be a major contribution for
supporting solutions for interoperability.
Application modules were recently introduced to the STEP architecture and are
considered the key components for the new generation of APs, intending to make them
more interoperable, cheaper, quicker to develop, and easier to understand and to manage.
In this new architecture, each module is seen as an atomic self-contained AP with a
respective reference and interpreted model (ISO10303, 2004).
The inclusion of the application reference model in a module is a key point of this
modularization approach, because it extends the Application Interpreted Construct (AIC)
concept of the classical STEP architecture towards a representation easier for the user to
understand, while keeping the implementation advantages for the implementer
(ISO10303, 2001; ISO10303, 2002).
Thus, the need to create flexible models to support very large combinations of systems
can be sustained by a set of selected application modules, as the basis to develop a
complete new AP.
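Assembling an AP from self-contained modules can be pictured as a transitive closure over module dependencies followed by a union of their entity sets. The module and entity names below are invented for illustration; STEP application modules carry considerably more than an entity list.

```python
# Sketch: a modular AP assembled from atomic application modules.
# Each module carries its own entities and may depend on other modules.
MODULES = {
    "product_identification": {"deps": [], "entities": {"product", "version"}},
    "geometry":               {"deps": [], "entities": {"point", "curve"}},
    "building_element":       {"deps": ["product_identification", "geometry"],
                               "entities": {"wall", "beam"}},
}

def compose_ap(selected):
    """Return all entities of the selected modules and their dependencies."""
    seen, stack, entities = set(), list(selected), set()
    while stack:
        name = stack.pop()
        if name in seen:
            continue
        seen.add(name)
        entities |= MODULES[name]["entities"]
        stack.extend(MODULES[name]["deps"])
    return entities

ap = compose_ap(["building_element"])
```

Selecting one high-level module pulls in exactly the modules it needs, which is the reusability argument made above: an application never has to carry the complete global model.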
Compared with most of the existing Application Protocols, which are typically complex
and compacted into big models, the granularity of this novel standard architecture makes
system representations more flexible, interoperable and independent, better
supporting new model representations and their respective implementations.
This is a major advantage, providing specialized and autonomous structures
ready to be reused to support emerging industrial and business modeling requirements.
Figure 6 depicts the architecture of a modular AP.

Figure 6. Architecture of a modular AP.

The IFC model architecture has been developed using a set of principles governing its
organization and structure (IFC, 1999b). These principles are focused on basic
requirements for interoperability between applications operating in the B&C industrial
sector and respective integration of data, and provide a structure for an integrated model
together with a suitable framework for exchange and sharing of information between
different disciplines within the AEC/FM industry.
This architecture provides a structure identified as the model schemata, and it has
four conceptual layers, which use a strict referencing principle. Within each conceptual
layer a set of model schemata is defined.
The first conceptual layer is the resource layer. It provides resource classes used by
classes in the higher levels. The second conceptual layer, i.e., the core layer, provides a
core project model. This core contains the kernel and several core extensions.
The interoperability level is the third conceptual layer. It provides a set of modules
defining common concepts and objects across multiple application types and AEC
industry domains. Finally, the fourth and highest layer is the domain layer. It provides a
set of modules tailored for specific AEC industry domain or application type.
There are three possible ways to share data using IFCs. These are:
1. By creating a physical file of information that may be shared across a network, by
email or on a physical medium such as a floppy disk. The EXPRESS language
specification view of the IFC Object Model determines the structure of the file, and the
syntax of the file is determined by ISO10303 part 21 or, more recently, by XML;
2. By placing information in a database which has an interface defined according to the
ISO10303 part 22 (Standard Data Access Interface) for putting in and getting out data.
The EXPRESS language specification view of the IFC Object Model determines the
structure of the information sent to or received from the database. Presently, a number
of software applications work using shared databases (also known as project model servers);
3. By using software interfaces that can expose the information content of defined groups
of attributes within an object. Software interfaces allow for direct communication
between applications without the need for an intermediate file or database.
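Option 1 can be illustrated with the general shape of an ISO 10303-21 ("SPF") exchange structure. The snippet below is only a toy: the two instance lines are fabricated, their attribute lists are not valid IFC, and the regular expression reads back nothing more than instance-id/entity-type pairs. Real IFC files require a full Part 21 parser.

```python
# Toy illustration of a Part 21 exchange structure and a minimal
# reader that maps instance ids (#n) to entity type names.
import re

SPF = """ISO-10303-21;
HEADER;
FILE_DESCRIPTION((''),'2;1');
ENDSEC;
DATA;
#1=IFCWALL('guid-0001',$,$);
#2=IFCDOOR('guid-0002',$,$);
ENDSEC;
END-ISO-10303-21;
"""

def instance_types(text):
    """Map instance ids to their entity type names (toy parser)."""
    return dict(re.findall(r"#(\d+)=([A-Z0-9_]+)\(", text))

types = instance_types(SPF)
```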

Both the STEP and IFC architectures are based on EXPRESS. However, IFC is simpler,
as it does not consider the interpreted model, having its implementations at the
reference level. The existence of a unique reference model makes the IFC model easier
to implement and its model components easier to reuse.
Modularization is a major achievement of the STEP architecture, because it
allows modules representing atomic aspects of the product life cycle that already
include the respective interpretation. This facilitates the reuse of such existing
components/modules, enabling an immediate implementation at AIM level.
Module development also requires such components to be independent of each
other, acting as autonomous atoms. Therefore, these atomic modules can circulate among
the applications adopted for the product life cycle independently, using a bounded
information description, i.e., without the need to carry along the complete global
model (e.g., the AP).
IFC is already developed in a well-structured architecture, in several layers, each
composed of a set of schemas called modules. However, these schemata have strong
dependencies on one another, not yet having the required atomicity. This makes
potential extensions of the IFCs difficult, whether as part of future IFC releases or by
external parties interested in such extensions, i.e., the joining of external modules.
The reuse of the existing modules for the creation of new schemas at the Application
Domains layer will be easier using the proposed approach, i.e., complete
modularization will enable dynamic development of new modules based on the IFC
modules, i.e., extension or reuse, especially based on the models in the Kernel and
Resources layers.
This dynamism will help to face the heterogeneity problem of an application wishing to
use the IFC modules when IFC cannot yet offer an integrated, built-in, complete solution
as an intrinsic part of its standard.
Indeed, it will not be possible to cover all requirements foreseen by all applications
operating in the B&C area, including their many application views. In this way, and on
a case-by-case basis, each application can adopt IFC modules and, when necessary,
create new IFC-based modules through extension and reuse of the existing IFCs.
The AEC community can take important advantage of this strategy, allowing sustained
growth and enrichment of the existing IFCs as an open standard platform,
where external contributions are absorbed, provided they are developed and validated
according to the IFC architecture and quality assurance rules. With time, and with such a
distributed effort, the IFCs would become more and more complete, and be adopted by users
and application developers.

The authors would like to thank all the national and international organizations that
supported the international projects that resulted in the development of framework
presented in this paper, the European Commission, CEN/ISSS, Ministry of Industry of
Portugal, IPQ (the Portuguese Standardisation Body) and ISO TC184/SC4. Also, the authors
express recognition to the project partners and colleagues who worked and contributed
in the international research and development projects developing the modular ISO10303
(STEP) AP236, and to Ricardo Olavo for his assistance at UNINOVA in the modularization work.
prodAEC, 2004, http://www.prodaec.net/.
Blockley, D. & Godfrey, P. 2000. Doing it differently: Systems for rethinking construction. UK:
Thomas Telford.
Chen, Q. 2000. Inter-enterprise collaborative business process management. Technical report, HP
Labs Palo Alto, http://www.hpl.hp.com/techreports/2000/HPL-2000107.pdf.
Filos, E. 2000. Moving construction towards the digital economy. In Goncalves et al. (eds.), 3rd
ECPPM conference, Lisbon, pp. 3-10, September 2000, Rotterdam: Balkema.
IAI/IFC 2001. International Alliance for Interoperability. Industry Foundation Classes.
IFC 1999a. An Introduction to the International Alliance for Interoperability and the Industry
Foundation Classes. March 1999, IAI.
IFC 1999b. IFC Object Model Architecture Guide. March 1999, IAI.
ISO 10303, 2001, Standard for the Exchange of Product Data (STEP), ISO TC184/SC4 N1 113,
Guidelines for the content of Application Protocols that use application modules, International
Organization for Standardization.
ISO 10303, 2002, Standard for the Exchange of Product Data (STEP), ISO TC184/SC4, N535,
Guidelines for the development and approval of STEP application protocols, International
Organization for Standardization.
ISO 10303, 2004, Standard for the Exchange of Product Data (STEP), ISO TC184/SC4, Parts
1xxx: Application Modules, International Organization for Standardization.
ISO 10303-1 1994. Part 1: Overview and fundamental principles. International Organization for
Standardization. http://www.tc184-sc4.org/.
ISO 10303-11 1994. Product data representation and exchange. Part 11: Description methods: The
EXPRESS language reference manual. International Organization for Standardization.
ISO 10303-21 1994. Product data representation and exchange. Part 21: Implementation methods:
Clear text encoding of the exchange structure. International Organization for Standardization.
Jardim-Goncalves, R. & Steiger-Garcao, A. 2001. Agile Manufacturing: 21st Century
Manufacturing Strategy, Chapter 48: Putting the pieces together using standards. Elsevier
Science Publishers, pp. 735-757.
Jardim-Goncalves, R., Farinha, F. & Steiger-Garcao, A. 2003. A metamodel based environment to
assist integrating one-off production in B&C. International Journal of Internet and Enterprise
Management, Vol. 1, No. 2, April-June 2003.
eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Optimization of project processing in the steel construction domain
E.Holtzhauer & H.Saal
Lehrstuhl für Stahl- und Leichtmetallbau, Karlsruhe, Germany
ABSTRACT: Today, most of the work in the A/E/C domain is
supported by computer applications, ranging from pure building
applications to general company administration software. Because of the
diversity of occurring tasks within the company as well as within the
construction project, many problems with regard to data and information
management have to be solved, which are still the subject of current
research activities. This paper deals with the requirements of the German
steel construction industry and proposes an approach to optimize the
project processes in the sector, combining existing partial solutions.

The A/E/C domain is characterized by the required cooperation between partners from
very diverse fields of activity, knowledge and terminology. This phenomenon is also
reflected by the applied software tools which were developed for one specific application
area. The design and construction process of a building is project-oriented, and thus
associates those partners in heterogeneous workgroups within the companies as well as
within the whole project. The different views on the project induce problems with regard
to information integrity and distribution. These are accentuated by parallel planning
steps on one and the same object, which are often necessary due to time pressure. Finally,
losses due to inefficiencies in project processing in the building industry are estimated at up
to 40% of the total costs. Hence, the optimization of the project communication and the
rationalization of the project processes reveal a big potential to ensure the
competitiveness of enterprises within the sector. This insight is mirrored in past
and current developments of IT technologies for the A/E/C domain. On the basis of the
requirements of the German steel construction companies, the specifications of an
integrated project processing system are elaborated. Because of the numerous existing
software standards, the number of new applications should be kept minimal. Therefore,
the actual state of the art in design and construction practice and developments of current
research activities are analyzed. From there, a methodology for optimized project
processing, especially for the steel construction domain, is proposed. The model is then
evaluated with regard to its applicability over time and to other A/E/C domains.

2.1 Project and company views
The organization of a company is human-centered, i.e. its structure is subdivided
according to the competences and tasks of its departments. Therefore one person, or at least
one division, works simultaneously on several construction projects. All systems supporting
project processing should take this organizational structure into account (Katranuschkov et
al. 2002). The tasks of the diverse departments are generally supported by incompatible
and fragmented software applications, which rely on different concepts and database
structures. The German steel construction industry (DStV 2004) considers the creation of
integrated solutions, which interconnect all company actors and all project partners
as well as their respective software, to be one of the most important challenges. Such solutions
should on the one hand be able to represent the entire operational and organizational course
of business in the company and handle all project-oriented processes centrally. On the
other hand, the mentioned applications should be able to integrate themselves in the
project context. This would allow consistent information flows within the company for
the diverse departments, e.g. marketing, project management, technical design, etc., and
over the project for the diverse involved partners, e.g. architect, civil engineer, steel constructor.
2.2 Data and information management
Because of different terminologies and software tools, the main difficulty of integrated
solutions consists in managing data and information. From the company's view, the
shared data has to be kept consistent and thus updated in all databases on each event.
Further, only the data relevant for at least two departments should be exchanged or held
centrally. Because of the different levels of confidentiality of the information, a
differentiated rights management system must be available. Any developed solution
should rely on existing standard software applications to keep expenses and
implementation efforts as low as possible. On the other hand, it should be kept very
flexible because of the various software systems and business structures of the diverse
companies. At present, most companies are not prepared to store all relevant project data in
external project spaces, e.g. for reasons of confidentiality or security. To avoid double
storage of information, the data management systems should partially open themselves to
the project space, to provide relevant information to, and get it from, the involved
partners. Therefore the sharing of information within the project should be similar to the
company-internal model, but on another layer. Figure 1 shows this concept with the
example of a company with two departments working on one project. Because of parallel
planning steps, keeping the data unique, consistent and up to date is more complex. In
fact, real-time updating is not possible because of the possible interferences
between the project partners while planning.
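The two sharing rules just stated — only data needed by at least two departments is held centrally, and access is filtered by confidentiality level — can be sketched as follows. The item records, department names and numeric clearance levels are invented for illustration; a real rights management system would be far more fine-grained.

```python
# Sketch of the sharing rules: an item enters the shared store only if
# at least two departments need it, and a reader sees it only if their
# clearance meets the item's confidentiality level.
def shared_items(items):
    return [it for it in items if len(it["needed_by"]) >= 2]

def visible_to(items, department, clearance):
    return [it["name"] for it in shared_items(items)
            if department in it["needed_by"] and clearance >= it["confidentiality"]]

items = [
    {"name": "cost estimate", "needed_by": {"marketing", "management"}, "confidentiality": 2},
    {"name": "CAD model",     "needed_by": {"design", "production"},    "confidentiality": 1},
    {"name": "internal memo", "needed_by": {"management"},              "confidentiality": 3},
]
```

Under these rules the internal memo never reaches the shared store, and the CAD model is visible only to design and production staff with sufficient clearance.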

Figure 1. Data sharing model within company and within project.
2.3 Document vs. data exchange
Pure document exchange delivers only semantic content to the concerned actor. He
has to analyze the information and perform the necessary tasks emanating from it.
In most cases he will have to re-enter this information in his own software.
This causes time losses and increases possible sources of error with regard to data
consistency over the whole project. Therefore data exchange is to be favored over pure
document exchange. The user of the software then becomes the checker of the model,
while the data itself is taken over directly by the software from the exchange file.

2.4 Process automation

Although most of the building projects are unique, there are some common processes.
From the project view this could be the notification of changes to the concerned partners,
from the company view a specific task, e.g. transfer of CAD-data to NC-production. An
integrated project processing system should allow the definition of workflows and handle
them at least partially. For example, if the architect changes geometric information about
the building, this information should be communicated automatically to the concerned
project partners. The degree of automation obviously depends on the information
format: in the case of compatible data, the change could be imported directly and
automatically by the concerned program, requiring only the planner's authorization to
save it.
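The notification workflow just described can be sketched with a simple observer pattern: partners subscribe to building objects, a geometry change is pushed to everyone concerned, and partners with a compatible data format additionally queue an automatic import awaiting the planner's authorization. All class and partner names are invented for illustration.

```python
# Sketch of the change-notification workflow as an observer pattern.
class BuildingObject:
    def __init__(self, name):
        self.name, self.subscribers = name, []

    def subscribe(self, partner):
        self.subscribers.append(partner)

    def change_geometry(self, description):
        # push the change to every concerned partner
        for partner in self.subscribers:
            partner.notify(self.name, description)

class Partner:
    def __init__(self, name, compatible_format=False):
        self.name, self.compatible = name, compatible_format
        self.inbox, self.pending_imports = [], []

    def notify(self, obj, description):
        self.inbox.append((obj, description))      # document-level notice
        if self.compatible:                        # direct data import,
            self.pending_imports.append((obj, description))  # awaits authorization

beam = BuildingObject("beam B12")
engineer = Partner("civil engineer", compatible_format=True)
fabricator = Partner("steel fabricator", compatible_format=False)
beam.subscribe(engineer)
beam.subscribe(fabricator)
beam.change_geometry("length 6.0 m -> 6.2 m")
```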
3.1 Internet based project management systems
Internet based project management (IBPM) systems provide a virtual project space for
the involved partners. These internet platforms have 3 main functionalities. First they
work as a central project server, where all documents are stored, and thus are available
for further tasks, e.g. facility management. The whole project communication is
performed over those platforms. Assumed that the project partners check it, this ensures a
better information flow. Finally IBPM-systems allow the definition of workflows and
thus a better control of the project process. The main problem remains that most of the
companies will not source out all their documents and thus may hold them twice. Further
the virtual project spaces do not allow the systematic exchange of data, but only of
3.2 Peer-to-peer applications
In contrast to virtual project spaces, peer-to-peer applications only interconnect the local
computers of the partners over the internet, without any central data storage. The user of
a peer-to-peer network can release documents or files in this network and make them
accessible to other users while they remain on his own server. The major advantage is
that no additional memory space is required on the internet server, but each file may be
stored in the system of each partner. Peer-to-peer networks are commonly used for
private applications, but have not yet established themselves for project processing in the
steel construction domain. The different principles of peer-to-peer networking and virtual
workspaces on the internet are shown in Figure 2 by the example of partner A, who adds
a new document to the project.

Optimization of project processing in the steel


Figure 2. Exchange principles of virtual project spaces and P2P networks.

eWork and eBusiness in architecture, engineering and construction


3.3 Product models

Product models intend to provide compatibility between a series of computer applications
and thus allow the direct data exchange mentioned in 2.3. They consist of a neutral
standard which defines all needed objects and is implemented in the applications. With
regard to the German steel construction domain, the Produktschnittstelle Stahlbau (DStV
2002) is to be mentioned. It is one of the few interfaces which is widely used in design
practice and covers steel-construction-specific tasks from the rough design over the
structural analysis to the detailing. In view of a global approach to the A/E/C domain, the
Industry Foundation Classes (IAI 2002) are the most promising product model. Their purpose
is to cover all domains of expertise, but the model is not yet ready to be applied on a large
scale in the steel construction domain. Figure 3 shows the concept of product modeling by
the example of the steel construction domain.
3.4 Object-oriented modeling

Object-oriented modeling is the basis for very adaptive simulation due to its layered
structure, as shown in Figure 4. The objects are instances of classes of the model. The
meta-model defines those classes. In order to evaluate a knowledge model, its structure
has to be analyzed (Firmenich 2004).

Figure 3. Product model as a link between the diverse project partners of the steel construction domain.

Figure 4. The layered structure in object-oriented programming.
This programming concept is also used for the development of product models: the
classes of layer 1 are defined and implemented, and the diverse programs generate the
instantiated objects in exchange files. The re-use of an existing meta-model allows the
compatible extension of an existing knowledge model.
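The layered idea, a meta-model defining classes whose instances populate exchange files, can be sketched as follows. This is a minimal illustration with hypothetical names, not the authors' implementation:

```python
# Illustrative sketch of the layered structure: a meta-model defines classes
# (layer 1), and the programs generate instances of those classes (layer 2).
# Class and attribute names are invented for this example.

meta_model = {                    # the meta-model: class definitions
    "Beam": ["profile", "length"],
}

def make_instance(class_name, **attributes):
    """Create an instance of a class defined in the meta-model."""
    allowed = meta_model[class_name]
    if not set(attributes) <= set(allowed):
        raise ValueError("attribute not in class definition")
    return {"class": class_name, **attributes}

# Re-using the meta-model rules allows a compatible model extension:
meta_model["Column"] = ["profile", "height"]

beam = make_instance("Beam", profile="IPE 200", length=6.0)
column = make_instance("Column", profile="HEB 240", height=3.5)
```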
4.1 General approach
The basic idea of the approach for an integrated project processing system is to define
basic entities of a company or a project and link them. From there on, processes can be
modeled easily, e.g. "if the entity 'file' is changed, then inform all linked entities of the
type 'project partner'". To keep the approach human-centered, a tool to model the
company's structure has to be defined first. This tool is then connected with a project
platform where all exchanges between the involved actors occur. The company-internal
entities must also be linked with the project platform entities. Because of the diversity of
the fields of activity of the involved partners, the handled model of the platform has to
remain simple and cannot represent all entities of all partners.
4.2 Adaptive company model
To define any process and control it with any software, first of all the basic entities of a
company or a project have to be defined. Later they will be linked together. This makes it
possible to assign any information in any format to an actor, a process, etc. The project
processing tool has to offer resources to define those entities and to link them. Furthermore,
it should have a simple messaging tool to allow at least very simple workflows, e.g.



notifying about new versions. The proposed main classes to represent a company with its
business structure are listed below:
1 Structure: defines the company's subdivisions, e.g. departments and employees
2 Project: defines projects and their subclasses
3 Partner: defines the partner types of the company, e.g. customers or suppliers
4 Product: defines the diverse products or services of the company
5 Tool: names the hard- and software tools of the company
6 Document: defines document types.
Hence, each entity in the company is handled and can be addressed by the project
processing system. Some of the above listed classes and their subclasses already exist in
the IFC2x2 Model (IAI 2002). Because of the potential of this product model, its scheme
will be used as a basis for our system. If required classes are not available, they will be
formulated in accordance with the IFC meta-model. Furthermore, the software should
also allow the development of strictly company-internal classes by the user, i.e.
meta-model rules should also be implemented in the system. In this way the system allows
exact images of the company's structure. The pre-defined classes must be those which are
required within the project environment.

Figure 5. Example of a simplified company model, instances and links.
Once a company model is defined, the entities can be linked to each other. The
management of those links has to be performed with the developed project processing
tool. If those links are available, workflows can be easily defined in the system. Figure 5
shows the principles introduced above. Once the company's entities are defined in a
neutral model, they can be instanced and linked. Those instances and links can be
addressed by workflows.
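A minimal sketch of this entity-and-link principle is given below, with hypothetical entity and class names; a workflow rule then simply follows the links:

```python
# Sketch of a simplified company model: entities are instanced, linked,
# and addressed by workflow rules. All identifiers are illustrative.

entities = {}          # entity id -> {"class": ..., "name": ...}
links = []             # pairs of linked entity ids

def add_entity(eid, cls, name):
    entities[eid] = {"class": cls, "name": name}

def link(a, b):
    links.append((a, b))

def linked(eid, cls):
    """Names of all entities of a given class linked to entity eid."""
    ids = {b for a, b in links if a == eid} | {a for a, b in links if b == eid}
    return [entities[i]["name"] for i in ids if entities[i]["class"] == cls]

add_entity("d1", "Document", "frame_A.dwg")
add_entity("p1", "Partner", "structural engineer")
add_entity("p2", "Partner", "steel fabricator")
link("d1", "p1")
link("d1", "p2")

# Workflow rule: when the document changes, notify every linked Partner.
to_notify = linked("d1", "Partner")
```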
Provided that the concerned application software has accessible database structures,
more complex workflows, like automatic updating, can also be defined.
4.3 The project platform
To open the company to the project environment, its internal organization and process
model has to be connected with a project platform. The latter should be used for project
communication and document or file exchange. Because of the volume of accruing data
and its diversity, the idea of one unique file containing and treating all information is
not realistic. A better approach is to provide a simplified building model on the project
platform, which represents basic subdivisions of a building (Petersen & Diaz 2004).
Within the scope of this investigation, the building is split into basic elements of the steel
construction domain. Those construction entities are then linked with the concerned project
partners. With the company-internal linking concept presented in 4.2, this step is
sufficient: the entity in the project space is automatically linked with all concerned
entities within the company. The information attached to the simplified model is unique
and stored on the internet platform. The real documents are accessible via links to local
servers. This also allows a better access control for the document's owner.

To guarantee data consistency, versions of detailed documents have to correspond to the
version of the related coarse element on the platform. The more detailed data are connected
to one coarse element, the more consistency checks are necessary. On the one hand this
leads to an optimization problem for the simplified model's accuracy. On the other hand
this situation is a very strong argument for the use of product-model-based exchange files,
where checks can be performed automatically in a relatively simple way due to the common
file format, and the information can be taken over directly by the software. In a first step, the
common standard for data exchange can be the Produktschnittstelle Stahlbau, because it
is already used in practice. But, because of their larger scope, the advantages of the IFC
are evident in this case.

Figure 6 shows the concept of the proposed project processing system with its internet
platform and the peer-to-peer networks. In the example, the civil engineer performs a
change on frame A. The notification processes run over the common internet platform.
The internal links lead to the detailed data. Two alternatives are presented with regard to
the updating of information in the detailed files. In one case, the concerned planner in the
steel construction company accesses the changed file of the structural engineer over the
peer-to-peer connection and performs the modifications afterwards on his own file. In the
second case, both file formats are compatible and a direct exchange can be performed.
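The version-correspondence rule can be sketched as follows, assuming a hypothetical record layout for coarse platform elements and detailed documents:

```python
# Sketch of the consistency check: each detailed document must match the
# version of its related coarse element on the platform. The data layout
# is invented for illustration.

coarse_elements = {"frame A": {"version": 3}}

detailed_documents = [
    {"name": "frame_A_statics.pdf", "element": "frame A", "version": 3},
    {"name": "frame_A_details.nc",  "element": "frame A", "version": 2},
]

def inconsistent(coarse, documents):
    """Return the documents whose version differs from their coarse element."""
    return [d["name"] for d in documents
            if d["version"] != coarse[d["element"]]["version"]]

stale = inconsistent(coarse_elements, detailed_documents)
```

With a common, product-model-based file format, such a check could compare the actual information contents instead of mere version numbers.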


Figure 6. Proposed information flows of the project processing system.




The approach to a project process management system presented above shall ensure data
integrity within A/E/C companies and within their projects. The system shall provide a
simple messaging system. Provided that the partners are covered by the company
model, an easy and systematic notification of changes is possible. If the used standard
software applications have open database structures, the updating of data can be automated,
since their information contents are also instanced by the mentioned model. Thus, the
approach offers an immediate possibility to enhance data consistency through better
communication between the concerned actors and, in a further step, automation of
updating is possible. The adaptation of the method is up to the system administrator. With
the use of a simplified model on the shared project platform, interdependencies between
diverse project partners can be defined or detected more easily. Furthermore, the data
volume on the web server is limited. The peer-to-peer approach for the actual file sharing
allows a better access control within the company, which enhances security and confidentiality.
The concept of the presented system is based on the improvement of existing project
communication systems. It does not depend on any product model standard, but only on
the abilities of the already existing and applied software applications of the companies.
Thus it offers a progressive approach to process optimization. Because of the use of
existing standards, it can be developed and applied in practice relatively rapidly. The first
step is to provide a company modeling tool and a virtual project space. Then the process
model can be extended steadily.
The applicability to other domains is ensured by the general and adaptive approach of
the project processing system. But as the process automation potential depends on
the scope of product models, its efficiency also depends on the latter.
This investigation shows that an answer to the requirements of the industry can be found
by combining and slightly extending existing technologies. In a first step, data
consistency and communication within companies and within building projects can be
improved. In a further step, several repetitive processes can be automated, and thus the
efficiency of the sector improved.
The Lehrstuhl für Stahl- und Leichtmetallbau is developing a prototype system in
cooperation with industry partners, which will be tested on real building projects.
Deutscher Stahlbauverband (DStV) 2002. Standardbeschreibung Produktschnittstelle Stahlbau,
Teil 1, Teil 2, Teil 3 (Standard description for product interface steel construction).
Deutscher Stahlbauverband (DStV), Arbeitsausschuss Informationstechnologie, Ad-hoc
Arbeitsgruppe Ablauforganisation 2004. Grundsätzliche Anforderungen an ein integriertes
Management- und Informationssystem für kleine und mittlere (Stahl-)Baufirmen. Not published.
Firmenich, B. 2004. Product Models in Network Based Cooperation in Structural Engineering.
In Karl Beucke et al. (eds), Proceedings of the Xth International Conference on Computing in Civil
and Building Engineering, Weimar, 02-04 June 2004.
International Alliance for Interoperability 2002. IFC 2x2 Final Documentation.
http://www.iai-ev.de/spezifikation/Ifc2x2/index.htm.
Katranuschkov, P., Scherer, R.J. & Turk, Z. 2002. Multi-project, multi-user, multi-integration: the
IST for CE integration approach. In Ziga Turk & Raimar Scherer (eds), ECPPM 2002, eWork
and eBusiness in Architecture, Engineering and Construction; Proc. intern. conf., Portoroz,
9-11 September 2002.
Petersen, M. & Diaz, J. 2004. Integrated Planning of Buildings based on Computer Models in
Project Communication Systems. In Karl Beucke et al. (eds), Proceedings of the Xth International
Conference on Computing in Civil and Building Engineering, Weimar, 02-04 June 2004.

eWork and eBusiness in Architecture, Engineering and Construction – Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Location sensing for self-updating building models

O.Icoglu & A.Mahdavi
Department of Building Physics and Building Ecology, Vienna University
of Technology, Vienna, Austria
ABSTRACT: Emerging technologies in building automation have the
potential to increase the quality and cost effectiveness of services in the
building industry. However, insufficient range of collected data and
models of the physical and behavioral aspects of the facilities limit the
capabilities of building automation systems. We describe a project for
improving building services by collecting comprehensive data from
variable sources and generating high-resolution models of buildings. In
this context, location sensing is critical not only for data collection, but
also for constructing models of buildings as dynamic environments. We
first examine a range of existing location sensing technologies from the
building automation perspective. We then outline the implementation of a
specific location sensing system together with respective test results.

Building automation is expected to improve building performance by reducing the
operation and maintenance costs of buildings (e.g. for heating, cooling, and lighting),
improving environmental performance, augmenting human comfort, and providing higher
safety levels. However, data collection and monitoring activities in current building
automation systems are rather limited: the focus is mostly on service systems such as
elevators and office equipment. There is a lack of systematic and scalable approaches to
comprehensive facility state monitoring throughout buildings' life cycle. To achieve a
higher level of building automation technology, collected data must cover not only the
state of systems such as elevators, but also the state of room enclosure surfaces, furniture,
doors, operable windows, and other static or dynamically changing building entities.
Toward this end, we focus on generating comprehensive and self-updating models of the
physical and behavioral aspects of facilities over their life cycle (Mahdavi 2001a, 2001b,
2003, Mahdavi & Suter 2002). Thereby, we are developing and implementing a prototype
sensor-supported self-updating building model for simulation-based building operation
support (Mahdavi 2001b).
To deliver a proof of concept for the feasibility of the system, we focus on lighting
controls in a test space. The control scenario is as follows: at regular time intervals, an
Executive Control Unit (ECU) considers possible changes in the states of control devices
(e.g. the dimming positions of the electrical light fixtures, the position of window
shades). The ECU then requests a lighting simulation program to predict the implications



of device states for the lighting performance of the space (i.e. light availability and
distribution) in the immediate future time interval. Based on the comparison of the
predicted performance levels with desired (user-based) objective functions, the ECU
initiates the transition to the most desirable control state. For this scenario to work, the
underlying model generation system must consider a wide range of space and system
characteristics, including the spatial and material properties of a space as well as the state
of the luminaires and furniture. Specifically, the lighting simulator requires an accurate
and up-to-date model of both internal space and external conditions (i.e. the sky
luminance pattern) to run the necessary simulations. This implies the need for a location
sensing system to provide real-time identification and location data for the construction
of a space model. This information is subsequently used by the ECU to construct a 3D
object model in the system database. The resulting model can be used for lighting
simulations. Similar models can be constructed to inform other applications for building
operation and facility management support.
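The described control cycle, enumerating candidate device states, simulating their implications, and choosing the state closest to the objective, can be sketched roughly as below. The simulator here is a stand-in stub, not the lighting simulation program actually used in the project:

```python
# Sketch of the ECU control cycle. simulate_illuminance is a toy stand-in
# for the real lighting simulation; the numbers are invented.

def simulate_illuminance(dimming, shade_open):
    # Stub: daylight contribution plus dimmed electric light (lux).
    return 300.0 * shade_open + 500.0 * dimming

def ecu_step(target_lux):
    """Enumerate candidate device states and return the one whose
    predicted performance is closest to the desired objective."""
    candidates = [(d / 10, s) for d in range(11) for s in (0.0, 1.0)]
    return min(candidates,
               key=lambda c: abs(simulate_illuminance(*c) - target_lux))

best_dimming, best_shade = ecu_step(target_lux=500.0)
```

In the real system the candidate evaluation is of course driven by the self-updating space model described below, not by a closed-form stub.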
The challenge in constructing a model is that the building infrastructure is not a static
entity and may change in multiple ways during its life cycle. In office buildings, an
indicator of these dynamics is churn, that is, the number of office moves during a given
year. Depending on the flexibility of a building's systems, churn can involve significant
infrastructure changes. According to one study on churn, freestanding furniture changes
daily to monthly, and modular partitions about once a year (Ryburg 1996). The ability to track
such changes automatically is necessary for the viability of simulation-based building
control. In our prototype, this task is performed by a location sensing system.
Prior to the implementation of a location sensing system, the available technologies were
examined from the building automation perspective (Brunner et al. 2004). A suitable
system must be capable of identifying individual objects and returning their locations.
Furthermore, it should require minimum maintenance and be scalable to adapt itself to
changes in a facility. In addition to these basic requirements, accuracy, unobtrusiveness
(minimal installation and maintenance necessary), cost, scalability, and identification
capability are also considered among the primary evaluation criteria.
Most currently available location systems use tags, small items affixed to the actual
objects to be tracked. Location information is obtained by signal exchange between these
tags and a sensor infrastructure (sensors, readers). Even more so than in other ubicomp
applications, building model applications call for rather small, long-lived tags that require
no batteries or any other maintenance. Moreover, systems based on devices that obtain or
calculate position information internally (called localized location computation) are not
meaningful in building model applications, unless the location information is fed back to
the overall system.
Among the available technologies, the ones that exploit electromagnetic and radio
frequency, ultrasound, and optical/vision-based methods are especially noteworthy.
The electromagnetic and radio frequency category includes technologies based on the
measurement of electromagnetic or radio frequency signals' field strength, distortion,
time-of-flight, or frequency. Ultrasound-based systems typically consist of battery-powered
tags and a set of transponder stations communicating with them; position information is
obtained by

measuring time-of-flight of acoustic signals. Vision-based technologies utilize visual
attributes of the objects or the tags attached to them and use computer vision methods to
extract the identification and location data.
Based on this technical review (Brunner et al. 2004), it is concluded that there is no
perfect location system for self-updating building models today. Vision-based methods
appear to be the most appropriate solutions that can form a basic infrastructure for our
requirements, because they are software-supported and open for modifications and
improvements. The latest developments in distributed programming, software agents, and
high-power processors also make vision-based solutions more promising. We have
therefore adopted such a system, as described in the following section.
3.1 System framework
Our system is designed as a vision-based location sensing system that uses a combination of
visual markers (tags) and video cameras. Among the reviewed vision-based methods, the
algorithm proposed in TRIP (Lopez 2002, Lopez et al. 2002) offers a suitable solution for
location sensing in building environments. The TRIP algorithm uses optimized image
processing and computer vision methods, and benefits from low-cost, black-and-white
tags. It obtains in real time the identification and location (both position and orientation
data) of an object to which the visual tag is attached.
Our assessment criteria (see section 2) emphasize that the location system should also
provide fine-grained spatial information at a high update rate while being unobtrusive and
scalable in terms of sensing the locations of many objects in a wide area. To meet these
requirements, our Location Sensing System (LSS) is designed in a distributed structure,
with the software components tied together by the Internet. Communication and data
sharing is ruled by the Distributed Component Object Model (DCOM) protocol that
enables these software components to communicate over a network (DCOM 2004). The
distributed structure of the LSS enables scalability and incremental growth, in addition to
enhanced performance derived from parallel operation.
Figure 1 shows the framework of the LSS. It comprises four main software
components: Application Server, Database Server, User Interface Server, and Target
Recognition and Image Processing (TRIP) Clients. The Application Server is the central
unit that controls the distributed components of the system, and performs resource
sharing and load balancing among the clients. Resources are the available computing and
sensor devices in the system. The Network cameras (NetCam) are used as sensors that
own dedicated IP addresses, and act like network devices.
TRIP Clients are the consumers of sensor and computing resources that run these
client programs. A TRIP Client acquires images from the network cameras and applies
image processing to extract the identification and location of the tagged objects. TRIP
Client programs are implemented on different computers (computing resources) that may



be distributed across a facility, and their number can be increased as needed. The results
obtained from multiple clients are combined in the Application Server, which is also
responsible for controlling the status of the cameras and TRIP Clients. It informs the
operator (a facility manager, for example) of inactive sources and dynamically assigns
active cameras to active TRIP Clients by taking their workload feedback into
consideration. This arrangement minimizes operator overhead.

Figure 1. Structure of the LSS.

All data regarding TRIP Clients, cameras, object information and system parameters
are stored in the XML (Extensible Mark-up Language) format. The Database Server
provides remote access to XML data for other components of the LSS. The User
Interface Server is responsible for the communication between an operator and the
system. It is a web based CGI (Common Gateway Interface) program that can be
accessed from any computer on the network. It presents combined location sensing
results (object identifications and locations) to the operator and enables the adjustment of
system parameters from web browsers.
3.2 Processing steps in the LSS
The primary goal of the LSS is to collect visual data from the sensors, and extract object
identification and location information in a wide area. Figure 2 demonstrates the process
flow that takes place in the system to convert images to location data. Raw camera data is
acquired and subsequently transferred to the Image Processing unit through the Camera



Interface. Object IDs and locations are extracted by the Image Processing unit and are
conveyed subsequently to the Coordinate Translation. Coordinate Translation transforms
the location data with respect to camera coordinates to the location data with respect to
room coordinates. Multiple Image Processing and Coordinate Translation units run in
parallel in the LSS, as they are implemented within distributed TRIP Clients. The ID and
location data extracted from various camera devices are combined in the Object Fusion
phase utilizing current and previous object information. Final results are transformed into
data packets for convenient data communication, and transferred to the main system,
ECU, for model construction. These data are also transferred to the Object History
database for the further processing of the Object Fusion.

Figure 2. Process flow in the LSS.



3.2.1 Camera Interface

Network cameras are used as sensor devices for capturing images of the environment,
where each network camera owns a dedicated IP address and acts like a network device.
Network cameras are inexpensive, off-the-shelf products. They have embedded web
servers that convey images to consumers in wide areas through HTTP.
The Camera Interface is executed in the TRIP Clients for each network camera assigned
by the Application Server. This unit acts like a hardware interface and isolates the software
parts of the system from the hardware units. Thus, the effect of any hardware change on
the overall system is minimized. The Camera Interface performs the communication and
image acquisition by establishing a connection to the network camera as a web client,
where the required communication parameters are retrieved from the Camera Database
(Fig. 2).

Figure 3. Sample visual tag with TRIP code: 10 2011 221210001 (even-parity=10, radius length=58 mm, ID=18795) (Lopez et al. 2002).
3.2.2 Image Processing
Image Processing, executed within the TRIP Client component, acquires images from the
Camera Interface and applies image processing algorithms for location sensing (Fig. 2).
The system works with circular black-and-white TRIP-coded tags that can be generated
with an ordinary laser printer (Fig. 3). The patterns on the circular partitions of the tag
determine the position of the synchronization sector (starting point, 1 partition), the
even-parity bits (2 partitions), the actual length of the radius of the tag in millimeters
(4 partitions), and the ID code (9 partitions).
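Reading the sector patterns as strings of ternary digits, the radius field can be deciphered as sketched below. This is an illustrative decode only; the exact code layout and parity validation follow Lopez (2002):

```python
# Sketch of deciphering TRIP code fields, treating each field as a string
# of ternary digits (most significant digit first). The parity handling
# of the real system is omitted here.

def decode_ternary(digits):
    """Interpret a string of ternary digits as an integer."""
    value = 0
    for d in digits:
        value = value * 3 + int(d)
    return value

def parse_trip_code(parity_digits, radius_digits, id_digits):
    return {
        "parity": parity_digits,
        "radius_mm": decode_ternary(radius_digits),
        "id": decode_ternary(id_digits),
    }

# The sample tag in Figure 3 encodes its 58 mm radius as the string "2011":
tag = parse_trip_code("10", "2011", "221210001")
```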
The Image Processing unit enhances the original TRIP system (Lopez 2002, Lopez et
al. 2002) by integrating two additional algorithms that make it suitable for building
environments. The original system was implemented on images captured by digital
cameras that provide uncompressed, high-quality data. In the real world, however,
working on raw images is not applicable in distributed environments such as buildings.
Network cameras, like other digital video devices, are designed to convey images as fast
as possible for user convenience and therefore apply compression to the original images
prior to transmission. To overcome the compression artifacts, enhancement algorithms
are integrated as described in the following.

Original Image Processing method
The original method comprises target recognition and pose extraction algorithms that
turn the input images of tags into location and identification data. The target
recognition algorithm determines the identification and geometric properties of the
projections of TRIP tags. Since the projection of a circle in an image generates an ellipse,
the TRIP tags' circular patterns are observed as elliptical, and the parameters of these
ellipses are extracted from the image. The outermost ellipse of a detected tag is marked
as the reference ellipse, and its parameters are used for the pixel sampling procedure of
TRIP code deciphering. The intensity values of the pixels at point locations around the
reference ellipse determine the entire TRIP code. The TRIP code is finally validated with
the even-parity check bits (Lopez 2002).
Target recognition returns the ID number, the radius of the tag, the position of the
synchronization sector, and the parameters of the reference ellipse for each identified
TRIP tag. The pose extraction algorithm takes these values as input in order to
determine the 3D position and orientation of TRIP tags with respect to the camera. The
algorithm implements a transformation that back-projects the elliptical border of a tag
lying on the camera image plane into its actual circular form lying on the centre of the
target plane. The reverse projection makes the camera image plane parallel to the
target plane and retrieves the 2D orientation of the TRIP tag by giving out the angles
around the camera's axes X and Y, α and β respectively. The position of the
synchronization sector is used to extract the final component of the orientation, the angle
around the Z-axis, γ. The distance between the camera and the target plane, d, is computed
using the radius length of the tag. Thus, in addition to the orientation, the position vector
[Px, Py, d]T is also generated, where Px and Py are computed from the central point of the
reference ellipse.

Enhanced Image Processing method
In our application, network cameras apply a wavelet transformation with a 10:1
compression ratio. This process generates smoothed input images for the TRIP Clients
and causes the tag images to lose sharpness. To compensate for this, TRIP Clients apply an
adaptive sharpening algorithm (Battiato et al. 2003) to the input image prior to target
recognition (Fig. 4). This method first restores the original image by an un-sharp masking
process, which is naturally affected by noise and compression-based ringing artifacts.
The algorithm then minimizes these artifacts by combining the adaptively restored image
with the original.
In addition to camera artifacts, an increase in the distance of the tags from the camera
reduces the pixel resolution of the tag images and makes the TRIP codes harder to decipher,
even though the tags are detected and the reference ellipses are extracted properly. To solve
this problem, edge-adaptive zooming (Battiato et al. 2000) is applied locally to spurious
TRIP tags from which the TRIP code could not be deciphered or validated. Edge-adaptive
zooming, as opposed to its counterparts such as bilinear and cubic interpolation, enhances
the discontinuities and sharp luminance variations in the tag images. This procedure is
repeated until the target recognition succeeds or the zoomed image region loses its details
(Fig. 4). The latter case indicates a false alarm or an unidentified tag.
3.2.3 Coordinate Translation and Object Fusion
Coordinate Translation is executed within TRIP Clients after Image Processing (Fig. 2).
The outcomes of Image Processing are a position vector and orientation angles with
respect to the coordinates of the camera from which the processed image is acquired.
Coordinate Translation converts these to the location data with respect to room coordinates.

Figure 4. Enhanced image processing.

The location of each camera with respect to the room coordinate system is stored in
the Camera Database. This location data involves the coordinate-rotation vector [αR, βR,
γR]T that overlaps the axes of the coordinate systems and thus, when combined with the
original rotations (α, β, γ), gives out the orientation values with respect to the room. The
location data also involves the coordinate-translation vector [Tx, Ty, Tz]T, which aligns the
origin of the camera coordinate system to the origin of the room coordinate system, and
eventually gives out the position vector with respect to the room when combined with the
original position vector, [Px, Py, d]T.
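A simplified sketch of this camera-to-room transform is given below, reduced to a single rotation about the Z-axis for brevity (the full method combines all three rotation angles):

```python
# Sketch of translating a tag position from camera coordinates to room
# coordinates: rotate into the room frame, then add the camera's
# translation vector. Only the Z rotation is shown; values are invented.

from math import cos, sin, radians

def camera_to_room(p_cam, rot_z_deg, t):
    """Rotate p_cam about the Z-axis by rot_z_deg, then translate by t."""
    a = radians(rot_z_deg)
    x, y, z = p_cam
    xr = x * cos(a) - y * sin(a)
    yr = x * sin(a) + y * cos(a)
    return (xr + t[0], yr + t[1], z + t[2])

# A tag 2 m in front of a camera rotated by 90 degrees about Z and
# mounted at room position (5, 1, 3):
p_room = camera_to_room((2.0, 0.0, 0.0), 90.0, (5.0, 1.0, 3.0))
```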
Object Fusion, implemented in the Application Server, combines the object
identification and location data acquired from parallel-running TRIP Clients (Fig. 2). The
same object may be detected with multiple cameras, each of which is assigned to
different TRIP Clients. This may generate repeated records in the system. Object Fusion
combines these reiterated data based on identification codes and room coordinate
locations. Furthermore, TRIP Clients attach time stamps on each extracted objects
location data. In cases of inconsistency, Object Fusion uses previous object data to
perceive the correct identification or location information, and generates the final, unique
object information.
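A minimal sketch of this fusion step might look as follows; the record layout, the distance threshold, and the fall-back rule are our assumptions guided by the description above, not the system's actual implementation:

```python
def fuse(detections, previous, dist_threshold=0.5):
    # detections: list of {'id', 'pos': (x, y, z) room coords, 'time'} records
    # produced by parallel TRIP Clients; previous: dict of last accepted
    # records keyed by tag id. Returns one fused record per tag id.
    by_id = {}
    for d in detections:
        by_id.setdefault(d['id'], []).append(d)

    fused = {}
    for tag_id, records in by_id.items():
        records.sort(key=lambda r: r['time'], reverse=True)
        newest = records[0]
        # Consistent if every camera's estimate lies near the newest one.
        consistent = all(
            sum((a - b) ** 2 for a, b in zip(r['pos'], newest['pos'])) ** 0.5
            <= dist_threshold for r in records)
        if consistent or tag_id not in previous:
            fused[tag_id] = newest
        else:
            # Conflicting estimates: fall back on the previously accepted record.
            fused[tag_id] = previous[tag_id]
    return fused
```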
To evaluate the performance of the LSS, a demonstrative test was performed to observe
the identification and location accuracy. The test configuration was

Table 1. Percentage of identified tags as a function of distance and angle for the original and enhanced methods.
designed to address system limitations. One limitation is the distance of the tags from
the camera. An increase in distance reduces the resolution of the tags that makes pixel
sampling unable to locate the circular regions within the tag image. A second limitation is
the incidence angle between the normals of the target plane and the image plane. As the
incidence angle increases, the elliptical properties of the tag image projections become
more difficult to acquire.
In our test, 16×16 cm tags were located at 3 different distance values (2, 3, 4 m) and,
for each distance, 3 different incidence angles were evaluated (0°, 30°, 60°). Thirty
sequential readings for each location were recorded using the TRIP Client program and a
network camera with a 1/3" CCD sensor and 720×486 resolution. As camera artifacts affect
input images in changing magnitudes and spatial values, multiple samples were taken for
each designated location.
The test was performed with the original and enhanced Image Processing methods
as described above. Identification percentages are given from the sequential reading
results in Table 1.
In addition to identification, location results were also observed. For the enhanced
method, the position and orientation data are within a maximum error range of 10 cm

and 10 degrees for 3 m distance. The error rates show an increase to 20 cm and 35
degrees for objects located within 3 to 4 m distance.
A single TRIP Client program running on a 2 GHz processor achieves an average
update speed of 5 image frames per second. The update speed of the overall system is
dependent on the number of cameras and TRIP Clients employed and it is configurable
based on the facility requirements and budget. Increasing the number of cameras also
increases the amount of space being covered by the LSS. However, this eventually
increases the image production rate and reduces the overall update speed. The resulting
drawback can be compensated for by adding new TRIP Clients. Multiple TRIP Clients
executed in the system share the workload and may increase the update speed per camera
up to its maximum frame transfer rate.
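Using the figures above (about 5 processed frames per second per client), the coverage/speed trade-off can be estimated roughly as follows; the camera frame-rate cap is a hypothetical parameter, not a measured value:

```python
def per_camera_update_rate(n_cameras, n_clients, client_fps=5.0,
                           camera_max_fps=30.0):
    # Clients share the processing load of all cameras; the aggregate
    # capacity n_clients * client_fps is spread over the cameras and is
    # capped by each camera's own frame transfer rate.
    return min(n_clients * client_fps / n_cameras, camera_max_fps)
```

Four cameras served by a single client would each be updated about 1.25 times per second; adding clients raises this until the cameras' own frame rate becomes the limit.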
We have presented a location sensing system to support self-updating building models for
building automation applications. The implemented system has some drawbacks inherited
from the general disadvantages of the visual methods as it requires line-of-sight between
the camera and tags, and its performance is dependent on the cameras image quality.
However, the results obtained from our location sensing system suggest that vision based
location sensing, when enhanced with software methods and integrated with appropriate
hardware, is a promising technology suitable for spatial domains such as facilities and
The implemented sensing system is still open for improvements. We expect that in the
future the tag size can be reduced and the effective distance of the system can be
augmented. Integrating pan/tilt units that can feed their position data back to the system
will also increase the effectiveness of the cameras. In addition, currently, coordinate
translation data has to be updated for any displacement of the cameras. Utilizing
reference tags will facilitate the automatic calculation of coordinate translation data,
allowing the relocation of cameras without manual system reconfiguration.
The research presented in this paper is supported by a grant from FWF (Austrian Science
Foundation), project number P15998-N07. The research team includes, in addition to the
authors, G.Suter, K.Brunner, B.Spasojevic, L.Lambeva, and M.Mohamadi.
Battiato, S., Castorina, A., Guarnera, M. & Vivirito, P. 2003. An adaptive global enhancement pipeline for low cost imaging sensors. IEEE International Conference on Consumer Electronics, Los Angeles, June 2003. 398–399.
Battiato, S., Gallo, G. & Stanco, F. 2000. A new edge-adaptive algorithm for zooming of digital images. IASTED Signal Processing and Communications, Marbella, September 2000. 144–149.

Brunner, K., Icoglu, O., Mahdavi, A. & Suter, G. 2004. Location-sensing technologies for self-updating building models. (to be published).
DCOM. 2004. Microsoft COM technologies: Information and resources for the Component Object Model-based technologies. http://www.microsoft.com/com/
Lopez de Ipina, D. 2002. Visual sensing and middleware support for sentient computing. PhD dissertation, University of Cambridge, 2002.
Lopez de Ipina, D., Mendonca, P.S. & Hopper, A. 2002. Visual sensing and middleware support for sentient computing. Personal and Ubiquitous Computing. 6(3): 206–219.
Mahdavi, A. 2001a. Aspects of self-aware buildings. International Journal of Design Sciences and Technology. Paris. 9(1): 35–52.
Mahdavi, A. 2001b. Simulation-based control of building systems operation. Building and Environment. 36(6): 789–796.
Mahdavi, A. 2003. Computational building models: Theme and four variations (Keynote). Augenbroe, G. & Hensen, J. (eds). Proceedings of the Eighth International IBPSA Conference, Eindhoven, 2003. 1: 3–18.
Mahdavi, A. & Suter, G. 2002. Sensorgestützte Modelle für verbesserte Gebäudeservices. FWF proposal, 2002.
Ryburg, J. 1996. New churn rates: people, walls, and furniture in restructuring companies. Facility Performance Group, Inc. 1996.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Modeling cast in place concrete construction

alternatives with 4D CAD
R.P.M.Jongeling, T.Olofsson & M.Emborg*
eBygg, Centre for Information Technology in Construction, Luleå University of Technology, Luleå, Sweden
*Betongindustri AB, Heidelberg Cement Group, Stockholm, Sweden
ABSTRACT: This paper presents the results from a 4D modeling study in
which 4D CAD was used to compare two different construction methods
for a cast in place concrete structure. Evaluating construction alternatives
in a virtual environment is considered to be an effective and cost efficient
way to introduce and evaluate innovative construction processes and
products. Results from the study show the potential of using multiple
virtual prototypes, but also show limitations regarding required modeling
effort and evaluation of results. Improvements are suggested that facilitate
and partly automate the 4D modeling process. Evaluation of the outcome
of 4D models can be improved by extending the graphical outcome of the
modeling process with a variety of analyses. Research and development is
needed to quantify results from 4D models, in order to allow comparison
of construction alternatives on a range of different criteria.

The planning process in the construction industry is focused on organizations and work
breakdown instead of construction operations, flows and supply chain management
(Ballard 2001). Decision-making is often based on practice, general information and
assumptions, resulting in sub-optimal solutions that often have a negative impact on the
total project costs. Product and process innovations are not easily adapted in this practice.
A widely used method for process modeling is the Critical Path Method (CPM). The
method concentrates mainly on the temporal aspect of construction planning and is seen
as one-dimensional (Heesom 2004). Construction projects have unique spatial
configurations and the spatial nature of projects is very important for planning decisions
(Akbas 2004). CPM schedules do not provide any information pertaining to the spatial
context of project components and require users to look at 2D drawings to conceptually
associate components with the related activities (Koo 2000, 2003). This approach limits
evaluation and comparison of alternative solutions.
4D modeling is a new process method in which 3D CAD models are visualized in a 4-dimensional environment. Construction plans can be represented graphically by adding
the time dimension to the 3D model to allow project planners to simulate and analyze
what-if scenarios before commencing work execution on site (Mallasi 2002). Koo (2000)
identifies 4D modeling as a tool to convey planning information (visualization tool),

enhance collaboration among project participants (integration tool), and to support users
to conduct additional analyses (analysis tool).
Although geometrical data and temporal data are present in commercial 4D CAD
software, the utilization of these models has so far mainly concentrated on the
visualization of construction processes, rather than on integration and analyses of
construction operations. However, recent research efforts in 4D modeling show an
increasing interest in providing an integrated 4D environment to support a variety of
analyses of, for example, work spaces, work flows and use of resources (Akbas 2004,
Akinci 2002, Heesom 2003, Li 2003, Mallasi 2002).
In this paper we describe the relevance of 4D modeling as an instrument to introduce
construction innovations and to evaluate construction alternatives. We report the
implementation and findings from a 4D modeling project conducted by a Swedish ready
mixed concrete supplier, in which 4D was used as a tool for visualization and integration.
The use of an Internet-based collaborative engineering environment to support interactive
decision-making processes is introduced, which was used to create 4D models. The main
findings and results from the study provide a base for future research aimed at defining
methods for 4D comparative analyses of construction alternatives.
4D modeling formed the basis for a marketing project conducted by a Swedish ready
mixed concrete supplier. Motivation to conduct this project followed from routine
problems in the supplier's business process. The supplier's marketing department
experienced difficulties in introducing new products and production processes to its
clients: project developers and contractors.
An example is the introduction of permanent form work systems. Paper-based brochure material and 2D engineering drawings are commonly used to
communicate product characteristics. The process of applying these products on site is
communicated via spread sheets and CPM schedules. These media and tools are
considered to be insufficient to create a common understanding in project teams about
advantages and disadvantages of new products and production technologies.
An alternative approach for introducing innovations is performing a full scale
study in a pilot project. However, such projects are often expensive and resource
intensive. Furthermore, there is a risk that the efficiency and effectiveness of innovations
are not clearly shown, caused by a lack of experience, and other parameters such as
weather conditions that are beyond control of the project organization (Jongeling 2004).
In an effort to adequately support the introduction and evaluation of construction
innovations a pilot study was initiated in which 4D simulations were made to evaluate
two construction alternatives of a residential construction project.
The first alternative, the 0-Reference scenario, represented today's common practice for
cast in place concrete construction. The objective of this scenario was to represent
typical sequenced and concurrent activities on a construction site that are related to
casting of concrete walls and slabs.
The second alternative provided an industrialized approach to cast in place concrete
construction. A number of innovative production technologies formed the basis for

this alternative. The objective was to visualize the potential for permanent form work
systems in combination with the use of prefabricated carpets of reinforcement and self
compacting concrete. Such a combination of innovative production technologies had
not been applied in actual projects and the possibility to evaluate these methods in a
virtual environment was considered to be very useful.
The simulations, by realistically visualizing typical construction activities, provided a
basis for evaluating the differences between traditional and permanent
form work systems. The period of construction was used as a primary comparator.
Typical construction activities were defined as operations needed for the installation of
form work, reinforcement and concrete. Original 2D architectural drawings from a
typical residential construction project were used to create a two-storey 3D CAD model.
The 4D experiments were conducted after the actual construction of the project was
finished and have no direct relation to the construction process by the contractor.
Modeled scenarios were based on estimates of product and process data acquired from
experienced practitioners at the supplier and other actors in the Swedish construction
industry. Many assumptions and simplifications were made in planning and representing these
processes in a 4D CAD environment. For example, the planning of resource use was not
directly addressed in this study.
A main interest in the study was to investigate the possible and practical use of the
software tools and 4D simulations in practice. For this reason software tools were
selected that were to some degree already familiar to both the supplier and other project
participants. The 3D models were created by using AutoCAD Architectural Desktop
(ADT) as client software to an Internet-based database management system, developed
by Enterprixe Ltd (Enterprixe 2002). The system stores all project data in a central
database, to which project members have concurrent access over the Internet. Depending
on access rights, users can view or edit information by loading data from the server to
local software clients. The server keeps track of the modeling work and modifications by
connected users, who can check out and check in parts of the project.
The contact between the concrete supplier and its clients is often highly interactive and
intense during certain stages of a construction project. The interoperability environment
provided by the modeling system was therefore considered useful to support these
processes and to ensure consistency of data. The possibility to create a user defined
object hierarchy for 3D CAD objects was another reason to apply the modeling system.
This functionality facilitates the creation of multiple parallel CAD object hierarchies that
can represent different construction alternatives.
The following section describes the method and findings of the modeling work of the
main 4D model components for both construction alternatives.

Figure 1. Main object hierarchy in AutoCAD Architectural Desktop, used as client software to the Enterprixe model server.
The project database contained four CAD object groups per construction alternative:
Building Model;
Casting Sequences;
Form Work; and
Reinforcement.
In addition, the project database contained imported CPM schedule information for both construction alternatives.
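In data terms, the project database can be pictured as one set of object groups per construction alternative; the group names follow the sections of this study, while the structure itself is our sketch (the actual Enterprixe schema is not published):

```python
CAD_GROUPS = ('Building Model', 'Casting Sequences', 'Form Work', 'Reinforcement')

def empty_alternative():
    # One (initially empty) object list per CAD group, plus the CPM
    # schedule imported from MS Project.
    groups = {name: [] for name in CAD_GROUPS}
    groups['CPM Schedule'] = []
    return groups

# Both construction alternatives live side by side in the central database.
project_db = {alt: empty_alternative()
              for alt in ('0-Reference', 'Industrialized')}
```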
4.1 Building and production model
The building model served as a base for the project and included the geometry of
building components. This model was based on architectural 2D drawings and was the
same for both alternatives. A production CAD model, based on the building model, was
created consisting of form work elements, reinforcement bars and casting sequences for
the concreting work, Figure 1. The objects in the 3D CAD production model were linked
to activities and work packages whereas the building model objects were only created to
represent the geometry of the building. For this reason many objects from the building

model had to be split and regrouped in order to suit the object hierarchy of the production model.
4.2 Schedule
CPM schedules were created on two levels of detail in MS Project. The first level, the
master level, contained the main work processes of the project. The level of detail in the
master schedule was appropriate in conveying the overall daily work flow in the project,
but was too abstract to represent construction operations on site. For example, activities
such as installing and dismantling form work were not included in this schedule. The
master schedule was subsequently detailed into a second level of detail in order to
represent work flows more accurately. This level of detail included time frames of 10
minutes and led to CPM schedules with a large number of activities for the various work
flows. For example, the activity 'cast concrete walls phase one' from the master schedule
was broken down into sub-activities: 'install traditional form work side A', 'install
reinforcement bars', 'install traditional form work side B' and 'cast concrete'. These sub-activities were further detailed to standard sections of form work and sections of
reinforcement and walls. A similar approach was applied to detailing the construction
operations for the slabs.
The final CPM schedule for both construction alternatives contained over 1500
activities. The CPM schedule was imported to the project database and manually linked
to the respective objects and object groups. Color settings for the visualization of
activities were made after the linking of activities to the 3D CAD models was completed.
The order of installation of all 3D CAD components was determined in the CPM
environment without a direct relation or visual check in the 3D model. Interdependencies
between different work flows, i.e. form work, reinforcement and concreting, could only
be checked after the CPM schedule was linked to the 3D CAD models. By browsing the
time planning of the two models errors could be detected, such as work space conflicts
and erroneous work flow directions. These observations led to updates in both the 3D
CAD model and the CPM schedule that in many cases had to be restructured and relinked in order to work properly. Schedule changes were difficult to manage due to the
large number of activities and the large number of dependencies between different
activities. The process of linking and updating the CPM schedule and the 3D CAD model
was labor intensive.
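The bookkeeping behind this manual linking can be illustrated with a small sketch; the function and record names are ours, not the Enterprixe API, and the real system linked activities interactively rather than in batch:

```python
def link_schedule(activities, objects):
    # activities: list of dicts with at least 'name' and 'group' (the CAD
    # object group the activity applies to); objects: dict mapping group
    # name -> list of CAD object ids. Returns the links plus the activities
    # whose objects are missing; these are the cases that forced re-linking.
    links, unlinked = {}, []
    for act in activities:
        ids = objects.get(act['group'])
        if ids:
            links[act['name']] = ids
        else:
            unlinked.append(act['name'])
    return links, unlinked
```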
4.3 Casting sequences
To define casting sequences for concreting work the building model was split into a
number of work packages. Each of these work packages represented the concrete casting
work for one day and included sections of walls and slabs. The sequences were planned
in a traditional 2D paper-based way as this was deemed more flexible than planning the
sequences in a 4D environment. Sequences for the casting process of walls were planned
by using colored lines drawn into 2D drawings; each line representing casting work for
one day. The length and distribution of lines were manually determined in five iterations
in which a set of decision criteria was used. Main criteria were: required form work,

volume of concrete, work flow direction and possible work space conflicts. These criteria
were considered implicitly and were not quantified in the decision-making process.
In many cases the 3D building model components had to be split into a number of sub-components, i.e. production objects, to represent the appropriate size of the individual
concreting activities. This was a complex process as the planning for casting work kept
changing during the modeling work.
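Of the criteria above, only the concrete volume lends itself to a compact illustration; a greedy grouping of sections into one work package per casting day might be sketched as follows (a hypothetical function, not how the planners actually worked):

```python
def daily_casting_packages(sections, daily_capacity_m3):
    # sections: list of (section_id, concrete_volume_m3) in work flow order.
    # Greedily fills one work package per casting day up to the daily
    # concrete volume; form work, flow direction and work space conflicts
    # are ignored in this sketch.
    packages, current, used = [], [], 0.0
    for sec_id, volume in sections:
        if current and used + volume > daily_capacity_m3:
            packages.append(current)
            current, used = [], 0.0
        current.append(sec_id)
        used += volume
    if current:
        packages.append(current)
    return packages
```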
4.4 Form work
Form work differed between the two alternatives. The 0-Reference alternative was based on
the use of traditional temporary form work. Traditional form work elements in the model
purely had a visual purpose and were abstract representations of the actual form work
elements. Form elements were grouped in the 3D CAD model by sections of standard
length of elements. Modeling and grouping form work into work packages was a labor
intensive process. Casting sequences determined the size and distribution of work
packages in the 4D model. As noted in the previous section, the planning for casting
work kept changing and subsequently the work packages containing 3D CAD form work
elements had to be changed to keep the model consistent.
A permanent form work system was used for the industrialized alternative. This form
work consists of cement-bonded particle boards, which remain in the building after
construction. As opposed to the 0-Reference alternative, where casting sequences
determined form work activities, the permanent form work objects determined the
planning for the industrialized construction alternative. 2D drawings were sent to a
supplier of permanent form work systems, who manually planned wall and slab form
elements in the drawings. Geometric data was manually extracted from drawings and
used in spread sheets to calculate required resources and element costs. Hand-written
element ID-numbers on 2D drawings were the key in linking 2D drawings and spread
sheets together. All the form work elements for walls and slabs were modeled and
organized in the 3D CAD model according to element ID-numbers given in the 2D drawings.
For every form work element an activity was created in the CPM schedule that was
linked to the corresponding 3D CAD element in the database. For the two storey building
used in the project this implied 180 form work element activities for slabs and 170
activities for wall elements. The order of work for installation of form work elements did
not follow the ID-numbers and was made in the CPM schedule by changing order of
activities. Changing order of activities for form work often had an impact on
approximately 70 shoring activities, 60 reinforcement activities and 80 concreting activities.
This constrained the rapid evaluation of different work flow directions. It took
approximately 15 iterations before an installation order was found that visually satisfied
the planning criteria for work spaces and work space conflicts.
4.5 Reinforcement
Dimensions, locations, and number of reinforcement bars were determined by visual
analyses of the 3D CAD model. The reinforcement bars served a visual purpose and were

not modeled for reasons of structural analyses. Reinforcement bars for slabs were
distributed over equal distances and modeled in one direction in order to minimize 3D
modeling work. The purpose of these bars was to show the reinforcement activity work
zones rather than the actual reinforcement components.
Planning the installation of reinforcement bars was done with the assumption of a
constant production rate, i.e. every reinforcement bar would take the same amount of
time. Specific geometrical situations were not taken into account. The relation between
the geometry of reinforcement bars and the productivity of installation could only be
assessed visually. Considering specific production rates by analyzing the 3D and 4D CAD model
in detail was beyond the scope of the project, but could have contributed to the accuracy
of the simulations.
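The constant-rate assumption is easy to state explicitly; in the sketch below (our illustration, using the 10-minute time frames from the detailed schedules as a default) every bar receives the same installation slot regardless of geometry:

```python
def reinforcement_schedule(n_bars, minutes_per_bar=10, start_minute=0):
    # Constant production rate: every reinforcement bar is assumed to take
    # the same time, regardless of its geometry (the study's simplification).
    return [(start_minute + i * minutes_per_bar,
             start_minute + (i + 1) * minutes_per_bar)
            for i in range(n_bars)]
```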
Both construction alternatives were simulated in parallel in a 4D CAD environment,
Figure 2. The two construction simulations showed to some extent similar work flows.
Activities for form work, reinforcement, and concrete were carried out concurrently,
enabling and constraining the execution of other activities.

Figure 2. Parallel simulation of construction alternatives in 4D CAD. (Left) 0-Reference alternative. (Right) Industrialized alternative.

The main differences between both alternatives were the dependencies between the
different work flows and the production rate. This observation could be made visually,
but was not supported by measured data from the 4D model.
Parallel visualization was considered very effective to visually explain the differences
between the two construction alternatives. The simulations were used in seminars to
which a variety of construction professionals were invited. It was generally agreed that
the 4D models helped to understand the different construction processes, but it was noted
that the models were limited in scope and non-interactive. Evaluation of alternative work
flow strategies or changes in productivity could not easily be managed in the 4D model.
The 3D CAD objects in the production model that had been created and grouped to
represent specific activities, i.e. form work, reinforcement and concreting, constrained the
rapid evaluation of alternatives. Changes in schedule often implied major changes in the
production model, involving considerable remodeling work.
The 4D models provided the actors from different disciplines an integrated visual
impression of the construction alternatives, but there was a general need to support this
visual analysis with data. We distinguish two types of actors here: specialists and managers.
Specialists from different disciplines were mainly interested in making specific data
analysis of the 4D models. In addition to the visual comparison in 4D, these actors were
interested in making data analyses by using a range of criteria. Examples of these criteria
are: changes in crew composition, distances between parallel work flows, amount of
equipment, crews, available work spaces, etc. Data needed to make this analysis could
not easily be extracted from the available 4D models. This was partly due to absence of
specific input data and partly due to the fact that output data, i.e. the 4D model, was
limited to mainly graphical information.
Managers viewed 4D models on a different level and considered Key Performance
Indicators of both alternatives. These indicators would summarize findings from the
various analyses by specialists in graphs that would provide them with a general
impression of the performance of different alternatives. Some of the indicators that were
suggested were rather specific, such as costs, resource use and project phase duration;
others were more abstract such as the efficiency of work space use. Data to support these
analyses could partly be obtained from the CPM schedule, but most of the needed data
was not readily available.
In summary, 4D models were considered very useful to visually compare construction
alternatives, but limited in the sense that it was difficult to interact with the models. Next
to the graphical output of 4D models, there was a need to quantify the results from the
analyses to allow comparison of construction alternatives on different criteria.
In this section we summarize the main findings from the conducted 4D modeling
experiments. We also suggest a number of directions for future research and development.

6.1 Building model and production model

The pilot study started by using an architectural model as a base model. This model was
too abstract to be used for representation of specific construction operations. In order to
create a production model from the architectural model:
Changes were made in the 3D CAD object hierarchy. 3D CAD objects were regrouped
to represent work packages and activities;
Certain objects had to be split and regrouped. This especially applied to large CAD
objects, such as slabs; and
3D CAD objects were added that were not present in the architectural model. An example is traditional form work. The objects solely served a visual purpose and were abstract representations of actual form work elements.
The detailed production model required considerable modeling effort and was, due to its complexity and interdependencies, rather difficult to manage. The complexity of the CPM schedule, which contained a large number of dependencies between activities, further constrained the management of the production model. Akbas (2004) notes that when models accurately represent construction operations, the model complexity increases significantly, and consequently so does the effort required to create and maintain these process models.
We consider two potential directions to address the issues related to creation and
maintenance of production models: a geometry-based process model (GPM) and
generation of CAD objects by feature-based 4D models.
6.1.1 Geometry-based process model
Akbas (2004) suggests a process model based on geometric techniques. The use of
geometry in this approach is not limited to visualization, but is integrally used to model
and simulate processes. This approach seeks to simplify process modeling work.
4D input models are decomposed into sub-systems. The 4D input models are based on
macro level CPM schedules. Every sub-system contains crew parameters, geometric
work locations and interactions. The

Figure 3. Geometry-based process model. A triangular mesh is used to plan and visualize work flows and work locations related to specific parts of 3D CAD objects. Differences in color in combination with arrows are used to indicate work flow directions.
approach then reduces this process model into queuing networks and uses discrete event
simulation to simulate construction operations. Each of these steps uses geometric
techniques, such as triangle meshes and geometry sorting.
Applying GPM to the comparison of construction alternatives in this study could have
reduced the modeling work related to splitting 3D CAD objects into production objects of
appropriate size. Based on 3D input models, GPM generates triangle meshes that can
represent parts of CAD objects, where each triangle represents a work location of a crew
in time.
GPM could also have reduced the level of detail of both the CPM schedules and 3D
models, and have facilitated the planning of work flow directions. As opposed to the
approach of the 4D experiments where work flows were planned in the CPM
environment, GPM uses the triangle mesh to define work flow directions, Figure 3.
Planning work flow directions in the actual production model is considered more effective than planning work flow directions with a CPM schedule.
As a future extension of the conducted 4D study, a number of 4D simulations are
planned with a developed GPM prototype, called GSim (Akbas 2004). The modeling
process and the results from the GSim simulation can then be compared with the results
from the conducted 4D study.
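As a simplified illustration of the geometric techniques GPM builds on, work locations represented by mesh triangles can be ordered along a chosen work flow direction by sorting on the projection of their centroids; this is our sketch, not the GSim implementation:

```python
def order_work_locations(triangles, flow_direction):
    # triangles: list of triangles, each three (x, y) vertices in plan view.
    # flow_direction: (dx, dy) vector giving the intended work flow.
    # Returns triangle indices sorted along the flow, i.e. the order in
    # which the corresponding work locations would be processed.
    def projection(tri):
        cx = sum(v[0] for v in tri) / 3.0
        cy = sum(v[1] for v in tri) / 3.0
        return cx * flow_direction[0] + cy * flow_direction[1]

    return sorted(range(len(triangles)), key=lambda i: projection(triangles[i]))
```

Reversing the flow vector reverses the work order, which is the kind of rapid what-if change that was difficult in the CPM-driven approach.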
6.1.2 Feature-based 4D model
Many production model objects are typically not included in 3D input models. A large
number of objects in the production model of the 4D study were related to temporary
structures and were mainly added for visual purposes. This required a considerable
modeling effort in which the context of the 4D production model was not explicitly taken
into account.
Adding CAD objects to represent for example temporary structures, such as traditional
form work elements, could be partly automated by using feature-based 4D models.
Feature modeling is an approach whereby modeling entities termed features are utilized
to provide improvements over common geometric modeling techniques (Kim 2004,
unpubl.). In order to apply feature modeling techniques to temporary structure generation
and planning, relevant features need to be identified and formalized. Research efforts to
date have mainly focused on developing feature ontology for scheduling and cost
estimation (Staub-French 2003), but have not taken into account the generation and
planning of temporary structures. Research work and developed prototype software by
Kim (2004, unpubl.) show the potential to generate and plan temporary structures rapidly
and efficiently.
The application of this feature-based approach to generate and plan temporary
structures in the conducted 4D study could have reduced the modeling work significantly
and could have contributed to the quality of the temporary structure plan.
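To make the idea concrete, a feature can be read as a rule that derives temporary-structure objects from a permanent model object. The sketch below is a minimal illustration under simplifying assumptions; the wall representation, class names and panel size are hypothetical and are not taken from the prototype by Kim (2004, unpubl.):

```python
import math
from dataclasses import dataclass

# Hypothetical, simplified geometry: a wall is described by its overall dimensions.
@dataclass
class Wall:
    name: str
    length: float     # m
    height: float     # m
    thickness: float  # m

@dataclass
class FormworkPanel:
    wall: str
    side: str   # which face of the wall the panel covers
    area: float # m^2

def formwork_feature(wall: Wall, panel_area: float = 2.5) -> list[FormworkPanel]:
    """Generate formwork panels for both faces of a wall.

    A 'feature' here is a rule that derives temporary-structure objects
    from a permanent model object, so they need not be drawn by hand.
    """
    panels = []
    face_area = wall.length * wall.height
    per_side = math.ceil(face_area / panel_area)
    for side in ("A", "B"):
        panels.extend(FormworkPanel(wall.name, side, panel_area) for _ in range(per_side))
    return panels

panels = formwork_feature(Wall("W-101", length=6.0, height=2.7, thickness=0.2))
print(len(panels))  # 14: ceil(16.2 / 2.5) = 7 panels per side, two sides
```

A real feature would also place each panel geometrically and link it to activities; the point of the sketch is only that the generation step can be automated instead of modeled manually.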

eWork and eBusiness in architecture, engineering and construction


In addition to experiments with GPM, a future study is planned to evaluate the temporary structure plan of the conducted 4D study with the feature-based approach proposed by Kim (2004, unpubl.).
6.2 Input and output data
One of the objectives of the 4D study was to explore the possibility to evaluate
construction innovations in a realistic virtual environment. The modeling process and
results from the 4D analysis show that virtual prototyping of certain construction
processes is possible, but that specific construction operations are difficult to represent.
The objective of the industrialized construction alternative was to show the potential of permanent form work in combination with prefabricated reinforcement carpets and self-compacting concrete.
Specifying activities for all permanent form work elements led to highly detailed and complicated CPM schedules that became difficult to maintain. The specification of activities with CPM schedules at this micro level, i.e. the object level, even raised questions about how realistic this process model was. The CPM schedule contained finish-to-start relationships and assumed sequential finality, i.e., predecessors had to be 100% complete before their successors could start (Tommelein 1999). This assumption proved to be too deterministic for simulation at the object level. A more abstract process model was considered to be more realistic.
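The sequential-finality assumption of finish-to-start links can be illustrated with a minimal CPM forward pass; the activity names and durations below are invented for illustration and are not taken from the study:

```python
# Minimal sketch of the finish-to-start assumption in a CPM forward pass
# (illustrative only; activities and durations are made up).
durations = {"form": 2, "reinforce": 1, "pour": 1, "cure": 3}
predecessors = {"form": [], "reinforce": ["form"], "pour": ["reinforce"], "cure": ["pour"]}

def forward_pass(durations, predecessors):
    early_start, early_finish = {}, {}
    for act in durations:  # dict preserves insertion order; assumed topological here
        # Sequential finality: an activity may start only when ALL of its
        # predecessors are 100% complete.
        early_start[act] = max((early_finish[p] for p in predecessors[act]), default=0)
        early_finish[act] = early_start[act] + durations[act]
    return early_finish

print(forward_pass(durations, predecessors))
# {'form': 2, 'reinforce': 3, 'pour': 4, 'cure': 7}
```

Applied per object, this logic forbids any overlap between an object's activities, which is exactly the rigidity the study found unrealistic at the object level.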
It was found that there was no clear way to realistically simulate the benefits of prefabricated reinforcement and self-compacting concrete. Benefits of these production technologies, such as a decrease in on-site activities, reduced resource use and an improved work environment, could not be represented in the 4D CAD models used. These specific production technologies required support by graphs and other non-CAD data that could not be obtained directly from the 4D models.
In combination with the study on creating a production model from a building model using GPM and feature-based modeling, the relation between different levels of detail of input data for 4D models and the possible output data will also be examined. Especially the quantification of 4D model results needs to be addressed. These results could provide better support for making specific comparisons between construction alternatives.
6.3 Application in practice
Evaluation of multiple design and construction alternatives by using virtual models in a
collaborative environment is considered useful to improve the performance of
construction projects (Ballard 2002, Zabelle 1999). Results from this study show to some extent the power of using multiple virtual prototypes, but at the same time show limitations regarding the required modeling effort and the evaluation of results. These limitations result mainly from shortcomings in the applied 4D modeling environment, but also from organizational shortcomings.
One point of departure for the 4D study was the possible practical use of the software tools and 4D simulations in actual projects. The client-server CAD environment used was considered very promising and valuable by the project participants, but too unstable and too slow for the presentation of 4D model results at seminars. Standard construction sequences were therefore recorded in AVI format. The 4D modeling functionality, specifically implemented for the 4D study in the client-server environment, was limited to basic functions and did not provide users with an interactive environment. It was also not possible to rapidly extract data from the 4D models for data analysis. Furthermore, the adopted approach for the creation and maintenance of the 4D models was considered too labor and resource intensive.
Evaluating multiple construction alternatives by using virtual models in a collaborative environment is fundamentally different from how most traditional construction projects are pursued. Understanding of the technology and the commitment of project teams to apply it in practice are examples of organizational prerequisites for deriving benefits from virtual prototyping in 4D. Organizational issues have not been addressed in this study, but are considered critical for application in practice.
The 4D modeling process and results show the potential, but also the limitations, of virtual prototyping in 4D. Evaluating construction alternatives in a virtual environment can be an effective and cost-efficient way to introduce and evaluate innovative construction processes and products, but it has to be supported by a variety of comparative analyses. For this reason, the mainly graphical outcome of today's 4D models has to be extended with methods that quantify the outcome of 4D models.
The modeling process and the required modeling effort currently limit the rapid evaluation of construction alternatives. Detailed 3D CAD models combined with detailed CPM schedules lead to complex 4D models that are difficult to manage and maintain. A number of technologies are suggested that could facilitate and partly automate the generation of 4D models. Feature-based modeling is suggested to support the semi-automatic generation and planning of certain 3D CAD objects, such as representations of temporary structures. Geometry-based process modeling can reduce the required level of detail of input 3D CAD models and CPM schedules, and can facilitate the planning of work flows.
Evaluation of construction alternatives in a collaborative environment is considered promising to support today's interactive decision-making processes. However, application in practice implies a number of technical and organizational challenges. Collaborative modeling systems have to support multiple detailed CAD models and have to provide a stable environment to manage and maintain these models. Adopting the technology in organizations and construction projects requires organizational changes and commitment. Research on organizational issues is needed to facilitate further application of virtual prototyping in collaborative environments.
The financial support from SBUF, the Development Fund of the Swedish Construction Industry, Betongindustri and the European Union's structural funds is acknowledged.

Akbas, R. 2004. Geometry-based modeling and simulation of construction processes. PhD Thesis, Stanford University, Stanford, CA.
Akinci, B., Fischer, M., and Kunz, J. 2002. Automated Generation of Work Spaces Required by Construction Activities. Journal of Construction Engineering and Management 128(4).
Ballard, G., Koskela, L., Howell, G., and Zabelle, T. 2001. Production system design in construction. 9th International Group for Lean Construction Conference, Singapore.
Ballard, G., Tommelein, I., Koskela, L., and Howell, G. 2002. Lean construction tools and techniques. In Best, R., and De Valence, G. (eds), Design and Construction: Building in Value: 227-255. Oxford: Elsevier Science Ltd.
Enterprixe 2002. Enterprixe White Paper. Enterprixe Software Ltd, Helsinki, Finland.
Heesom, D., and Mahdjoubi, L. 2003. Dynamic Interactive Visualization of Construction Space Using 4D Techniques. 3rd International Postgraduate Research Conference in the Built and Human Environment, Lisbon, Portugal.
Heesom, D., and Mahdjoubi, L. 2004. Trends of 4D CAD applications for construction planning. Construction Management and Economics, February 2004, 22: 171-182.
Jongeling, R., Olofsson, T., and Emborg, M. 2004. Product modelling for industrialized cast-in-place concrete structures. INCITE 2004 International Conference on Information Technology in Design and Construction, Langkawi, Malaysia.
Kim, J. 2004, unpubl. Generating temporary structures with feature-based 4D models. Department of Civil and Environmental Engineering, Stanford University, Stanford, CA.
Koo, B., and Fischer, M. 2000. Feasibility Study of 4D CAD in Commercial Construction. Journal of Construction Engineering and Management 126(4): 251-260.
Koo, B., and Fischer, M. 2003. Formalizing Construction Sequencing Constraints for Rapid Generation of Schedule Alternatives. Center for Integrated Facility Engineering, Stanford University, Stanford, CA.
Li, H., Ma, Z., Shen, Q., and Kong, S. 2003. Virtual experiment of innovative construction operations. Automation in Construction 12(5): 561-575.
Mallasi, Z., and Dawood, N. 2002. Registering Space Requirements of Construction Operations Using SitePECASO Model. CIB W78 Conference 2002: Distributing Knowledge in Building, Aarhus School of Architecture, Denmark.
Staub-French, S., Fischer, M., Kunz, J. and Paulson, B. 2003. A generic feature-driven activity-based cost estimation process. Advanced Engineering Informatics 17(1): 23-39.
Tommelein, I.D., Riley, D.R., and Howell, G.A. 1999. Parade Game: Impact of work flow variability on trade performance. Journal of Construction Engineering and Management.
Zabelle, T.R., and Fischer, M.A. 1999. Delivering value through the use of three dimensional computer modeling. 2nd Conference on Concurrent Engineering in Construction, Espoo, Finland.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds).
© 2004 Taylor & Francis Group, London, ISBN 0 415 35938 4

Pilot implementation of a requirements model

Arto Kiviniemi* & Martin Fischer
CIFE, Stanford University, Palo Alto, California, USA
*VTT Building and Transport, Helsinki, Finland
ABSTRACT: This paper reports on research at CIFE at Stanford
University aiming to formalize a conceptual model to enable an active
connection between the requirements for a building project and its
building product model based design solution. The research is part of
CIFE's Virtual Design and Construction (VDC) framework. The main
content of the model will be the clients' (owners' and end-users')
requirements, but the model will also cover some external requirements
set by the community and regulations. This paper compares two case
studies and presents the pilot implementation of a requirements database
based on the preliminary requirements model of room related client
requirements. Our observation is that even a simple active link between
the client requirements and design tools can increase the usage of
requirements documentation throughout the design process and facilitate
necessary updates of the client requirements.

1.1 Overall goal
The focus of our research is to create a conceptual model which enables active
connections between requirements for a building project and the building product model
based design solution. Our intuition is that the link could improve the requirements management process and also the quality of the end result: the building.
1.2 Problem description
A building program specifying a project's goals and requirements for all the spaces is the typical client requirements documentation in building projects, though there are also several other methods to capture client requirements. Regardless of the capturing method, the requirements, depending on the project type, consist of more or less detailed information about the required room properties: net area, activities, connections to other spaces, security, appropriate or desired materials, conditions like daylight, temperature, sound level, etc. Many requirements also cascade, e.g., create additional requirements for building elements bounding the space and for systems serving the space. Moreover, an important aspect of the design process is that some requirements can be in conflict; the project team must often prioritize and make trade-offs between different requirements,

which creates the need to update the requirements, and thus, manage and document the
changes to the requirements and the design solution.
In practice several factors make it virtually impossible for all the participants to know and remember all the relevant requirements, and especially their relationships to each other and to the design solutions.
The main reasons for this argument are:
Amount and complexity of project information.
Duration of projects.
Designers need to work simultaneously on many projects.
Changing stakeholders in different project phases.
Shifting design focus, e.g., moving from overall problem solving to detailed technical solutions.
After conceptual design the requirements documentation is usually not used actively in the current process, and often the evolving requirements are not even communicated to the whole project team (Kagioglou et al. 1998). Thus, the changes are compared to, and decisions made based on, the previous design solutions. The current design tools do not support recording of client requirements or designers' intent in the documents. Thus, the people deciding on the changes do not always even know the original intent, and the solution can drift away from the original goal without actual decisions to change the goal or an understanding of the contradiction between the proposed design and project goals (Kiviniemi and Fischer 2004a, Kiviniemi and Fischer 2004c).

Figure 1. Model hierarchy.

1.3 Research scope

Product models represent design solutions in a rich way (Froese 2002, BLIS 2004); the intent of our requirements model is to relate the client's business processes to the design solution represented in the form of product models (Figure 1). The requirements structure
is based on the traditional building programs. The direct requirements are limited to architectural design. The aggregation of indirect requirements from these direct requirements to the bounding elements, e.g., walls, windows and doors, is within the scope of our research.
Detailed requirements for other design areas, like MEP and structural engineering, are not in the scope of the research, but the connection from architectural design to these design areas will be addressed. However, only the need for such a connection from the architectural design will be analyzed and shown; the detailed content of these requirements is not in the scope of the research.
2.1 Draft model structure
To address these limitations, we developed a concept that divides a project's information model into four linked sub-models presented in Figure 1. The reasons for this separation are:
(1) Typically the design team produces several alternative design proposals, which all are
expected to meet the defined requirements. Thus, having one requirements model
linked to the alternative design models is a logical structure instead of duplicating the same requirements across different design alternatives, which would easily lead to requirements management problems. Similarly there can be several alternative
production models and finally a separate maintenance model. All these sub-models
should be connected to a virtual integrated product model, so that it is possible to
access the content of the different models and compare the alternatives at any stage of
the process. Our work focuses on the requirements model and its connection to the
design model(s).
(2) One requirements instance can relate to a number of separate instances with
identical requirements in the product model.
(3) The existing product model standard, Industry Foundation Classes (IFC) has been
developed to describe mainly design solutions. Its current structure does not support
requirements management well, neither does the internal structure of existing design
software. In practice this means that most of the requirements data cannot be
exchanged using current IFC files nor stored in the design software, but must be kept
in a separate database.
(4) The flexibility of the requirements structure is greater if the two models are separated
and connected with a thin link. In this case, for example the only property needed
for the link of space requirements is the ID in the space objects, which is supported by
almost any design software. For indirect requirements the functional demand is to

recognize the connection between bounding elements and spaces, which is supported
by some commercially available product model based software.
(5) Another reason for the separation is to make the distinction between requirements and
properties clear; for example sound insulation is a requirement in the requirements
database and a property in the product model.
A further important observation is that the requirements instances in the requirements
database have no physical locations, i.e., the requirements for bounding elements can
relate to one space only. In the product model the bounding elements are always either
between two spaces or part of the building envelope. This means that the requirements
for the bounding elements must be aggregated from the requirements of the related
spaces; they cannot be defined directly to the elements in the same manner as the space
requirements relate to the spaces.
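This aggregation can be sketched as follows; the property name, the example values and the aggregation rule (take the stricter, i.e. higher, sound insulation demand) are illustrative assumptions, not the actual model:

```python
# Sketch: derive an indirect requirement for a wall from the two spaces it
# bounds. The wall must satisfy the stricter of the two space requirements.
space_requirements = {
    "Office-101": {"sound_insulation_dB": 35},
    "MeetingRoom-102": {"sound_insulation_dB": 44},
}

def wall_requirement(space_a: str, space_b: str, prop: str) -> float:
    """A bounding element between two spaces has no requirement of its own;
    its requirement is aggregated from the requirements of the related spaces."""
    return max(space_requirements[space_a][prop],
               space_requirements[space_b][prop])

print(wall_requirement("Office-101", "MeetingRoom-102", "sound_insulation_dB"))  # 44
```

For an envelope element only one space contributes, but the principle is the same: the requirement is computed from the related space(s), never stated on the element directly.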
3.1 Test cases
To test the existing problems and possible solutions we used rapid prototyping. We analyzed two real building programs, implemented two test databases based on the results, and entered the project information. In the test cases the research concentrated on room related client requirements only; external requirements were not in the scope at this stage. The detailed results of the test cases are published in the ICCCBE-X proceedings
(Kiviniemi and Fischer 2004a) and also in a CIFE Working Paper (Kiviniemi and Fischer
2004c). This paper covers only the main conclusions from these case studies.
The two projects were the ICL Headquarters project in Helsinki built in 1994-1996 and the ongoing Lucas Center Expansion at Stanford University. The ICL Headquarters
is a large office building consisting mainly of standard office rooms, but including also
some special rooms and requirements. The Lucas Center Expansion is a small special
laboratory consisting mainly of unique rooms with very little repetition.
3.2 ICL Headquarters program
The ICL Headquarters building program was one document. The required areas were
constantly compared to actual design solutions and the requirements file was constantly
updated during the design process. The requirements documentation with respect to
required room areas was coherent. The only identified problem was related to the
structure used in the document: all classification codes and requirements were entered manually in each cell, which created the possibility of incoherent content and made updates more laborious. Use of references to one source of data, e.g., a simple inheritance structure, would prevent this problem.

3.3 Lucas Center Expansion

Available LCE project documentation consists of a set of design sketches, drawings and MS Excel spreadsheets from different project stages, the architect's requirements database (Claris FileMaker), meeting minutes, and technical specifications. At the time of the study (November 2003) the project was in the early construction stages.
LCE's Project Manager and Project Architect provided some insight into the project. The basic conclusion based on these interviews is that Stanford's projects are generally well managed and have clearly defined processes for the different stages. However, as is typical in the AEC industry, the requirements capturing process is somewhat fuzzy, based strongly on meetings where end-users and the project team interact, trying to find solutions to specific problems. The decisions are recorded in the meeting minutes, and
the room areas of each design stage are documented in MS Excel spreadsheets. The
reasoning behind the changes and proposed solution becomes tacit knowledge and is
stored only in the minds of the participants. The main problem categories in the
requirements documentation for the LCE project were:
Lacking or different identifications of the rooms.
Contradictions in the different documents.
Incoherent descriptions of the same requirements.
Wrong or missing information.
Instead of actual spatial requirements the documents recorded the areas of the rooms in
the design solution.
Documents specifying detailed technical requirements had no relation to the room
related requirements documentation.
3.4 Conclusions of the test results
The requirements documentation and process in the LCE project are a typical example of
practices on current construction projects. Different parts of the requirements are stored
in several documents, and there is no comprehensive document containing all needed
information. In addition, the names and IDs for the rooms are often ambiguous, and
similar requirements are formulated in different ways. This makes it difficult to connect
requirements to the correct room even manually, and the use of ICT to manage the
relations between the requirements and solutions is almost impossible.
Though many of the mistakes in the LCE project were small, and probably caused little, if any, real problems for the people who have been actively involved in the project, they are a clear indication of the general requirements management problem in the current process. For anyone who joins the project later, it is very difficult and time consuming, sometimes impossible, to find out which requirements are correct and still relevant. Furthermore, someone who wonders about the changes, like the growth in the size of the project, will have great difficulty finding an answer in the project documents.
Though only the required area information of the ICL Headquarters project was linked with the design solution, it provided some benefits. ICL's Project Manager states: "Still today, over 9 years later, ICL Headquarters is the only project where I got practically real-time information comparing actual areas to the building program on a detailed level, and was able to constantly verify that the project design stayed within the allocated limits." In addition, despite the simple approach taken in the ICL project, which only linked the requirements and the design information for comparing required and actual areas, the coherent requirements information suggests that a link between requirements and design tools and the constant use of requirements information in the process could improve requirements management.
To explore possible solutions to manage the room related requirements, we used rapid prototyping and implemented several different database structures to find a usable solution, which would:
Provide solutions to the problems identified in the LCE project.
Support inheritance of the room type requirements (ARqE) to rooms (IRqE) (Figure 2).
Enable in the next phase of the research a link between the requirements database and
the product model (Figure 2).
4.1 Requirements database tests
The user interface and database structure of the first pilot implementation were based mainly on the room program documents of the LCE project. The implementation was made in an MS Access 2002 database. The main criteria for the database structure were to provide a solution to the identified problems:
Unique IDs for the rooms, i.e., IRqE and all the rooms in the product model referencing it must share the same ID: unambiguous identification.
Use of requirements types (ARqE) and inheritance: efficient and easy maintenance and updating of repetitive requirements.
Use of user-definable enumerations (lists of values) instead of free text.
No default values, which might inadvertently set wrong requirements.
Functionality to compare area requirements with areas in design documents.
Functionality to link external documents to the requirements database, e.g., to include also complex descriptive requirements, not only short text and numerical values.

Figure 2. Draft concept to link detailed room related requirements to a product model.
As introduced in Figure 2, the key idea is the use of two main tables: RoomTypes (ARqE) and Rooms (IRqE). RoomType is an abstract requirements entity and Room is an instantiable requirements entity in the requirements database. Both have the same fields and references (Shared Requirements, ShR), with the following differences:
A Room can reference a RoomType to inherit its requirements, but the opposite relation is not possible.
A Room can have a relation to a department and other room(s), but a RoomType cannot have these relations (instance-specific properties, ISP).
The Rooms table contains NumberOfInstances and RoomName fields, which are not in the RoomTypes table (ISP).
Only RoomType has RoomTypeDescription and RoomTypeDoc fields (type-specific properties, TSP).
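The inheritance between the two tables can be sketched as follows; the field names and values here are illustrative only, not the actual pilot schema:

```python
# Sketch of the ARqE/IRqE idea: a Room may reference a RoomType and inherit
# its shared requirements, overriding them locally where needed.
room_types = {
    "StandardOffice": {"NoiseLevel_dB": 40, "MaxTemperature_C": 25},
}

rooms = {
    "R-2041": {"RoomType": "StandardOffice", "RequiredNetArea_m2": 12.0},
    "R-2042": {"RoomType": "StandardOffice", "RequiredNetArea_m2": 12.0,
               "NoiseLevel_dB": 35},  # instance overrides the inherited type value
}

def effective_requirements(room_id: str) -> dict:
    room = dict(rooms[room_id])
    type_name = room.pop("RoomType", None)
    merged = dict(room_types.get(type_name, {}))
    merged.update(room)  # instance-specific values win over inherited ones
    return merged

print(effective_requirements("R-2042"))
# {'NoiseLevel_dB': 35, 'MaxTemperature_C': 25, 'RequiredNetArea_m2': 12.0}
```

The benefit mirrors the one observed in the ICL program: repetitive requirements are stated once per type, and an update to the type propagates to all referencing rooms.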
The requirements used in the implementation are only one example of possible
requirements, and do not cover all possible building types or use cases. However, they
can be categorized in two main groups:

Single-value requirements (SVR), which can have only one value or reference for each
room, like, for example, noise level, maximum number of occupants, maximum
temperature, etc.

Figure 3. Elements of the pilot implementation and the relationship to existing software and processes.
Multi-value requirements (MVR), which can have a number of different values or
references in each room, like, for example, activities, equipment, windows, etc.
The separation of SVR and MVR defines the basic structure of the requirements database and is important for two reasons:
(1) If all requirements were defined and implemented as SVR types, the database structure would not allow the use of an unlimited number of requirements for each room, which is necessary for some requirement types as described above.
(2) If all requirements were defined and implemented as MVR types, the possibility to give multiple values for all properties could cause contradicting requirements, like several different maximum areas. In addition, the database structure would be more complicated, which could create performance problems, and the user interface to the data would be more difficult to understand and slower to use if all values were in sub-tables.
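One way to picture the SVR/MVR split in relational terms is the sketch below; the table and column names are illustrative assumptions, not the actual MS Access schema of the pilot:

```python
import sqlite3

# SVRs live as columns of the Rooms table (exactly one value per room);
# MVRs go into a sub-table with one row per value.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE Rooms (
    RoomID TEXT PRIMARY KEY,
    MaxOccupants INTEGER,          -- SVR: one value per room
    NoiseLevel_dB INTEGER          -- SVR
);
CREATE TABLE RoomActivities (      -- MVR: any number of values per room
    RoomID TEXT REFERENCES Rooms(RoomID),
    Activity TEXT
);
""")
db.execute("INSERT INTO Rooms VALUES ('R-101', 4, 40)")
db.executemany("INSERT INTO RoomActivities VALUES ('R-101', ?)",
               [("meetings",), ("video conferencing",)])
activities = [row[0] for row in db.execute(
    "SELECT Activity FROM RoomActivities WHERE RoomID='R-101' ORDER BY rowid")]
print(activities)  # ['meetings', 'video conferencing']
```

The column design makes contradictory SVR values structurally impossible, while the sub-table keeps MVRs open-ended, which is the trade-off described in points (1) and (2) above.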
The RoomTypes were not used in the LCE project database, because the rooms
are not based on any repetitive types; all rooms are defined as separate instances.
In contrast, the ICL project is based on strong use of room types and this caused
one change in the database structure. RequiredNetArea and MaxOccupants,

which were located in both the RoomTypes and Rooms tables in the LCE
test, would have caused extensive duplication of similar type definitions with
different area and occupant values. Thus, the database structure was changed so
that these requirements were removed from the RoomTypes table and changed
to instance-specific requirements in the Rooms table (Figure 4).
Otherwise the same database structure, which was used in the LCE project test, also
worked for the ICL Headquarters project and enabled recording of the requirements in a
usable format; 782 physical room instances are stored in 186 requirements instances
based on 51 RoomTypes. The maximum number of type references is 16, the average 3.8
and the median 2. The population of the database took about 3 hours, which can be seen
as a reasonable effort.
Figure 4 shows the 1-to-1 and 1-to-many relations in the final pilot database. RoomType and RoomID are the key links between the different tables.
This draft structure forces the user to define unique IDs for each requirements instance (Rooms), and all the free text requirements, like departments, adjacent rooms, equipment, activities, etc., are based on user-definable lists (enumerations), which prevents slightly different requirements descriptions or references to non-existing rooms, all of which were problems we identified in the LCE project data.

Figure 4. Pilot database structure.

Figure 5. Demo version of the requirements management user interface: room type definitions.
Based on the experiments with these two different room programs, the current
structure appears to be a sufficient basis for further work at this stage of our research.
Based on this structure, we developed a preliminary conceptual requirements model, which is published in the GCADS workshop proceedings (Kiviniemi and Fischer 2004b).
4.2 Connection to a product model
The actual connection of the requirements database to the product model based design software was not implemented; only a mock-up of a connection from the design application to the Access database was developed. By selecting objects in the design software, e.g., rooms and bounding elements, the user can see all the related requirements in the requirements database (Figure 5 and Figure 6).
RoomID is the connecting element between the requirements database and the
product model. The room instances in the product model are connected directly, but the
bounding elements related to the rooms are identified in the product model and the
connection to the requirements database is based on the RoomID of identified rooms.
The user interface mock-up shown in Figure 5 and Figure 6 illustrates how to access
the requirements database from the design software by adding a requirements view to its
user interface. Depending on the use scenario, the modifications of the requirements from
the design interface can be either allowed or denied; in some projects the client might

delegate the requirements management to the designers; in other projects it might be the task of the PM or the client's own representative. The selected database approach enables independent access control for the requirements database.
The research is still continuing. At the next stage we will analyze several other building programs to find a relevant structure for the requirements model.

Figure 6. Demo version of the requirements management user interface: wall requirements.

Some requirements are common to practically all buildings, like, for example, required area, activities in the space, connections to other spaces, etc. Some requirements are specific only to some building types, like, for example, exact limits for minimum and maximum temperature and moisture; they are common for laboratories, museums, demanding technical facilities, etc., but are not defined for most buildings (Table 1). We argue that not all these different types of requirements can be standardized. Thus the goal of our research is to identify a reasonable set of common requirements within the defined scope and create a flexible framework, a conceptual model, which also enables project-specific requirements.
We expect that our research will also create the basis for future research topics, like,
for example:

Different building types and process phases: The scope of our research covers only a few building types. Our intuition is that the same conceptual model could be applied to most buildings, but because of the different requirements the database and UI implementation might be different. In addition, our research covers only design, a short period of the process; the use of the requirements model in other parts of the process, like construction, FM, etc., is not covered in detail, though the same principles are possibly applicable.
Technical systems and other design areas: The designer's role in defining detailed technical requirements for technical systems is more dominant than in architectural design, and LBNL's research in this area provides another view on building requirements management (LBNL 2003). However, there is no link between the technical requirements for systems and the building product model. Our research identifies some connections to technical systems, but the formal link between these two requirements views will need further research, as do the requirements links and their management in other design areas, like structural engineering.
Requirements history: Another interesting related research area is the requirements
history: how the requirements evolve during the process. Our research proposes a
conceptual model for requirements, which will provide a conceptual basis for storing all
the requirements changes during the process in the database. How to implement such a
historical perspective of requirements management in detail, and which functionalities
the UI would need, could be interesting areas for further development.

Table 1. Requirements analysis results from the LCE and ICL project building programs.
(For each requirement the table records whether it is defined per room or per room-bounding
system, once or many times, and its coverage in the LCE and ICL building programs, in per
cent. The requirement groups are: identification and overall definition; individual
properties and requirements; basic properties, e.g. ceiling height; connections, activities,
furniture, equipment, doors and windows; and environmental conditions, e.g. light level and
change rate. The percentage values are not recoverable here.)

Verification: Some requirements are fuzzy, verbal or otherwise only human-interpretable
descriptions, but some have an exact content. Using the exact requirements for automated
verification, i.e., checking how well the design meets the requirements, is a potential
usage of the requirements model. Verification of the fuzzy requirements must include the
designer's interaction, but the designer's or project manager's confirmation that the
requirements are met could be part of the database and serve as a formal project history.
The goal of our research is to develop and test a method to create an active link between
requirements and product model based design tools. At this stage the research has
documented at least some aspects of the requirements management problem, and tested
one possible model for a requirements database structure. The final, anticipated scientific
contributions of the research are:
Documentation of the requirements management problem in the design process.


Documentation and analysis of the different requirements types.

Specification of a conceptual requirements model based on the analysis.
Specification for a link between client requirements and product model objects.
Specification for the aggregation of indirect requirements from the direct client
requirements to the building elements.
Extended BLIS view Room Program -> Architectural Design for the IFC product
model implementation.
The expected main contribution on a practical level is the development of a conceptual
model that enables software to support requirements management during the design
process and interaction between design solutions and requirements.
Froese, T. 2002. Current Status and Future Trends of Model Based Interoperability. eSM@rt 2002
Conference Proceedings, Part A, University of Salford, pp. 199–208.
Kagioglou, M., Cooper, R., Aouad, G., Hinks, J., Sexton, M. & Sheath, D. 1998. A Generic Guide
to the Design and Construction Process Protocol. University of Salford.
Kiviniemi, A. & Fischer, M. 2004a. Requirements Management Interface to Building Product
Models. Proceedings ICCCBE-X, p. 252.
Kiviniemi, A. & Fischer, M. 2004b. Room related requirements model concept in building product
model environment, PREMISS project. GCAD04 symposium, Carnegie Mellon University.
Kiviniemi, A. & Fischer, M. 2004c. Requirements management interface to building product
models. Stanford University, CIFE Working Paper.
LBNL 2003. Lawrence Berkeley National Laboratory, Design Intent Tool.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

A combined product-process model for building systems control

Department of Building Physics and Building Ecology, Vienna University
of Technology, Vienna, Austria
ABSTRACT: In order to satisfy the requirements of building systems
control applications, the underlying representation must combine building
product information with building controls information. Currently, there is
a lack of systematic building representations that would integrate aspects
of product and process. This paper explores ways of coupling a
control-oriented process model with a building product model. An approach is
described for the automated rule-based generation of a representation for
building systems control applications.

1.1 Problems in building systems control
Currently, building systems controllers (for heating, cooling, ventilation, lighting, etc.)
operate in terms of individual system components at several levels. While this allows, in
principle, for a distributed implementation of control logic, it also leads to the
configuration of environmental systems as isolated sub-systems. Thus, devices affecting
the same control zone are seldom integrated. Likewise, in most buildings, the level of
vertical integration of local and central systems is insufficient. These problems are
aggravated by the contingencies associated with the boundary conditions of system
operation, e.g. dynamic changes in the outdoor conditions as well as building occupants'
activities and control actions. There is generally a lack of representations for integrating
sub-systems with each other and with building product models. Most commercially
available environmental control systems for buildings do not offer explicit
representational frameworks either to systematically capture multiple control processes
and their interactions or to map those processes onto target building product model
entities (i.e. control impact zones).
1.2 Product and process
Numerous representational schemes (product models) have been proposed to describe
building elements, components, systems, and structures in a standardized fashion
(Augenbroe 1995, IAI 2004). Thereby, one of the main motivations has been to facilitate
high-fidelity information exchange between the agents involved in the building delivery
process (architects, engineers, construction people, manufacturers, facility managers,
users). The representational stance of most building product models is decompositional and static

(see Figure 1 as an example of a schematically illustrated building product model). In
contrast, building control requires representational systems capable of capturing
procedural sequences of events, decisions, and actions. Toward this end, the underlying
representation must combine detailed building product information with building controls
process information. Currently, there is a divide between the modes and styles of control
system representation in the building control industry and the representational habits in
architecture and building science. Specifically, there is a lack of systematic building
representations that would unify building product, behavior, and control process
information. This paper explores ways of coupling a control-oriented process model with
a building product model.
1.3 Paper overview
Section 2 provides a high-level instance of a combined process-product model. Section 3
demonstrates how the control process model can be created using a logical and coherent
method, so that it can be used for a diverse set of building control applications. A set of
rules is presented that allows for the automated generation of the control system model.
Section 4 demonstrates the application of the representational framework and its
rule-based generation using the concrete example of a test space. Section 5 includes the
conclusions of the paper.

Figure 1. SOM (shared object model), an example of a building product model developed
in the course of the SEMPER project (Mahdavi 1999, Mahdavi et al. 1999a).

2.1 Control as process
In contrast to the abundant literature on building product modeling, there is a lack of an
explicit ontology for the representation of building control processes. Table 1 includes a
glossary of fundamental terms (together with definitions and examples) that are relevant
to building systems control representation. A basic control process involves a controller,
a control device, and a controlled entity (see Figure 2). An example of such a process is
when the occupant (the controller) of a room opens a window (the control device) to
change the temperature (the control parameter) in the room (the controlled entity).
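
This basic control process can be sketched as a minimal object model. The class and attribute names below are illustrative assumptions for this sketch, not the SOM or any schema from the paper:

```python
from dataclasses import dataclass

# Minimal sketch of the basic control process of Figure 2, using the
# glossary terms of Table 1; all names here are illustrative.

@dataclass
class ControlledEntity:          # e.g. a room (control zone)
    name: str
    control_parameter: str       # e.g. "temperature"
    state: float = 0.0

@dataclass
class ControlDevice:             # e.g. a window
    name: str
    state: str = "closed"        # control-relevant device state

@dataclass
class Controller:                # decision-making agent, e.g. an occupant
    name: str
    def act(self, device: ControlDevice, new_state: str) -> None:
        # A control action is a change in the control device state.
        device.state = new_state

room = ControlledEntity("room", "temperature", state=26.0)
window = ControlDevice("window")
occupant = Controller("occupant")
occupant.act(window, "open")     # occupant opens the window to cool the room
print(window.state)              # open
```

The sketch deliberately leaves out sensors and meta-controllers, which are introduced in the following sections.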
2.2 A high-level combined product-process scheme
It is useful at this point to explore ways of coupling the basic process model depicted in
Figure 2 with an instance of the previously mentioned building product models (Figure
1). Figure 3 illustrates a high-level expression of such a combined building product and
control model. While certain instances of the product model such as building, section,
space and enclosure constitute the set of controlled entities in the process view, other
instances such as aperture or technical systems and devices fulfill the role of control
devices.
2.3 Zones, sensors, and devices
Both the basic control process model depicted in Figure 2 and its embellished version
with elements of a product model (Figure 3) are rather schematic. They must be extended
and augmented to capture the details of realistic control processes. To achieve a more
detailed and operationally effective representational integration of building product and
process aspects, this scheme must be realized within the context of a two-fold hierarchy,
one pertaining to the building product classes and the other pertaining to the controller
classes. Key to the communication between the two hierarchies is the Janus-faced notion
of the controlled entity (or control zone). From the product model point of view, a control
zone corresponds either directly to a product model entity (such as space) or to an
abstract entity derived by partition or aggregation of product model entities (e.g. a
workstation within a room, a collection of windows in a facade). From the control model
point of view, a zone is the abstract object of control (controlled entity) whose
control-relevant attribute (control parameter) is monitored via corresponding sensors.
As Figure 4 illustrates, zones can be viewed as fluid and reconfigurable entities whose
spatial extension may be mapped back to (properly partitioned or aggregated)
components of a building product model. Devices and sensors can be mapped back to a
product model in terms of physical objects (e.g. in terms of technical elements as per
Figure 1).

Table 1. Terms, definitions, and instances in building control (Mahdavi 2001a).

Controller: decision-making agent; influences the controlled entity via a control device.
Instances: people, software, thermostat.
Control objective: goal of the control action. Instance: maintaining a set-point.
Control device: is used to influence the controlled entity. Instances: window, luminaire, HVAC.
Actuator: interface between controller and control device. Instances: valve, dimmer, people.
Device state: control-relevant attribute of a control device. Instances: closed, open.
Controlled entity: object of control (assumed target or impact zone of a control device).
Instances: room, floor, building.
Control parameter: indicator of the control-relevant state of the controlled entity.
Instances: room temperature, illuminance on working plane.
Sensor: reports the state of control parameters, environmental and occupancy conditions,
and control devices. Instances: thermometer, illuminance sensor, electricity counter.
Control action: change in the control device state. Instances: opening windows, turning on lights.
Control state space: the logical space of possible positions of a (or a number of) control
device(s). Instances: possible positions of a valve, a dimmer, or a window.

Figure 2. A general control scheme.

Whilst the building product modeling community is quite experienced in dealing with
complex hierarchies of product model components, explicit hierarchical schemes for the
representation of control processes are far less developed. Strictly speaking, the
controller entity as shown in the basic schemes of Figures 2 and 3 applies only to a
device controller (DC), i.e. the dedicated controller of a specific device. These schemes
stipulate that a DC receives the controlled entity's state information directly from a
sensor and, utilizing a decision-making functionality (e.g. a rule or an algorithm that
encapsulates the relationship between the device state and its sensory implication), sets
the state of the device. Real-world building control problems are, however, much more
complex, as they involve the operation of multiple devices for each environmental system
domain and multiple environmental system domains (e.g., lighting, heating, cooling,
ventilation).
An appropriate representation of control processes must thus capture the relationships
between primary device controllers and various layers of higher-level controllers (or
meta-controllers).

Figure 3. A high-level building product and control process scheme.

Moreover, it must be created based on a logical and coherent method, allowing ideally for
the automated generation of the control process model. The next section of the paper
describes such a method.

Figure 4. Zones as the coupling links between building product and control process
hierarchies.

3.1 Device and meta-controllers
In principle, the complexity of building systems control could be substantially reduced if
distinct processes could be assigned to distinct control loops. However, controllers for
various systems and components are often interdependent. A controller may need the
information from another controller in order to devise and execute control decisions. For
example, the building lighting system may need information on the buildings thermal
status (e.g. heating versus cooling mode) in order to identify the most desirable
combination of natural and electrical lighting options. Moreover, two different controllers
may affect the same control parameter of the same impact zone. For example, the
operation of the window and the operation of the heating system can both affect the
temperature in a room. In such cases, controllers of individual systems cannot identify the
preferable course of action independently. Instead, they must rely on a higher-level
controller instance (i.e., a meta-controller), which can process information from both
systems toward a properly integrated control response.
We conclude that the multitude of controllers in a complex building controls scheme
must be coupled appropriately to facilitate an efficient building operation regime. Thus,
control system features are required to integrate and coordinate the operation of multiple
devices and their controllers. Toward this end, control functionalities must be distributed
among multiple higher-level controllers or meta-controllers (MCs) in a structured and
distributed fashion. The nodes in the network of device controllers (DCs) and
meta-controllers (MCs) constitute points of information processing and decision making.
In general, first-order MCs are required: (i) to coordinate the operation of identical,
separately-controllable devices and (ii) to enable cooperation between different devices in
the same environmental service domain. A simple example of the first case is shown in
Figure 5, where an MC is needed to coordinate the operation of two electric lights to
achieve interior illuminance goals in a single control zone. Figure 6 shows an example
for the second case: moveable blinds and electric lights are coordinated here to integrate
daylighting with electric lighting.

Figure 5. Meta-controller for individually controllable identical devices.

Figure 6. Meta-controller for different devices addressing the same control parameter.

Figure 7. Schematic floor plan of the test spaces.
In actual building control scenarios, one encounters many different combinations of the
above instances. Thus, the manner in which the control system functionality is distributed
among the controllers must be explicitly configured. The control process model must be
created using a logical, coherent, and reproducible method, so that it can be used for a
diverse set of building control applications. Ideally, the procedure for the generation of
such a control process model should be formalized and automated, given its complexity,
and given the required flexibility to dynamically accommodate changes over time in the
configuration of the controlled entities, control devices, and their respective controllers.
3.2 Generative rules
A set of constitutive rules has been developed and tested that allows for the automated
generation of the control system model (Mahdavi 2001a, Mahdavi 2003, Mertz &
Mahdavi 2003). Such a model can provide a template (or framework) of distributed nodes
which can contain various methods and algorithms for control decision making.
Specifically, four model generation rules are applied successively to the control problem,
resulting in a unique configuration of nodes that constitute the representational
framework for a given control context. The first three rules are generative in nature,
whereas rule 4 is meant to ensure the integrity of the generated model. The rules can be
stated as follows:
(i) Multiple devices of the same type that are differentially controllable and that affect the
same sensor necessitate a meta-controller.
(ii) More than one device of different types that affect the same sensor necessitates a
first-order meta-controller.
(iii) More than one first-order meta-controller affecting the same device controller
necessitates a second-order (higher-level) meta-controller.

(iv) If in the process a new node has been generated whose functionality duplicates that
of an existing node, then it must be removed. Any resulting isolated nodes must be
reconnected.
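
As a rough illustration, rules (i) and (ii) can be sketched as a grouping pass over a sensor-to-devices map. The function and the generated node names are hypothetical simplifications of the cited implementation (Mertz & Mahdavi 2003); the sample data anticipates the Space-1 devices of the test example in section 4:

```python
from collections import defaultdict

# Sketch of generative rules (i) and (ii), assuming the control problem is
# given as a mapping sensor -> list of (device_name, device_type) pairs.
# Node naming and data layout are illustrative, not from the cited work.

def generate_first_order_mcs(influences):
    """Return {meta_controller_name: coordinated device controllers}."""
    mcs = {}
    for sensor, devices in influences.items():
        by_type = defaultdict(list)
        for name, dev_type in devices:
            by_type[dev_type].append(name)
        # Rule (i): several separately controllable devices of one type
        # affecting the same sensor need a coordinating meta-controller.
        for dev_type, names in by_type.items():
            if len(names) > 1:
                mcs[f"MC-{dev_type}_{sensor}"] = names
        # Rule (ii): devices of different types affecting the same sensor
        # also need a (first-order) meta-controller.
        if len(by_type) > 1:
            types = "_".join(sorted(by_type))
            mcs[f"MC-{types}_{sensor}"] = [n for ns in by_type.values() for n in ns]
    return mcs

# Space-1 of the test example: four electric lights and the louver affect
# illuminance sensor E1; the louver and the local valve affect sensor t1.
influences = {
    "E1": [("EL1", "EL"), ("EL2", "EL"), ("EL3", "EL"), ("EL4", "EL"), ("Lo1", "Lo")],
    "t1": [("Lo1", "Lo"), ("Va1", "Va")],
}
for mc, controlled in generate_first_order_mcs(influences).items():
    print(mc, "->", controlled)
```

Rules (iii) and (iv) would then run over the generated MCs in the same fashion, promoting conflicts to second-order MCs, pruning nodes with duplicated functionality, and reconnecting any isolated nodes.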

4.1 Control system representation
The following example illustrates the application of the rules introduced in the previous
section (Mertz & Mahdavi 2003).
The scenario includes two adjacent rooms, each with four luminaires and one local
heating valve, which share a set of exterior moveable louvers for daylight and insolation
control (see Figure 7). Hot water is provided by the central system, which modulates the
pump and valve state to achieve the desired water supply temperature. In each space,
illuminance and temperature are to be maintained within the set-point range. This
configuration of spaces and devices stems from an actual building, namely the Intelligent
Workplace (IW) at Carnegie Mellon University, Pittsburgh, USA (Mahdavi et al. 1999b).
An effective way to define control zones (controlled entities) is to describe the
association between the sensors and devices. From the control system point of view,
controlled entities are represented by sensors, and the influence of devices on the
controlled entities is monitored via sensory information. In the present example, an
interior illuminance sensor (E) and a temperature sensor (t) are located in each space. The
sensors for Space-1 are called E1 and t1, and those for Space-2 are called E2 and t2. In
Space-1, both the louvers and electric lights can be used to meet the illumination
requirements. As shown in Figure 8, Sensor E1 is influenced by the louver state,
controlled by DC-Lo1, as well as by the state of four electric lights, each controlled by a
DC-EL. Similarly, both the local valve state and the louver state influence the
temperature in Space-1 (t1). Analogous assumptions apply to Space-2.
Once the associations of the devices and sensors have been determined and the control
zones (controlled entities) have been defined, the generation rules can be applied to the
control problem, resulting in the representation of Figure 9. A summary of the application
of Rules 1, 2, and 3 is shown in Table 2.
As to the application of rule 1, four nodes, namely DC-EL1, EL2, EL3, and EL4, are of
the same device type and all impact sensor E1. Thus, an MC is needed to coordinate their
action (MC-EL_1). Similarly, regarding the application of rule 2, both DC-Lo1 and
DC-Va1 impact the temperature of Space-1. Thus, MC-Lo_Va_1 is needed to coordinate
their action.

Table 2. Application of Rules 1, 2, and 3 toward the generation of a control model for a
test space (see text and Figures 7 to 9).

Application of Rule 1: DC-EL1, EL2, EL3, EL4 (affected sensor E1) -> MC-EL_1;
DC-EL5, EL6, EL7, EL8 (affected sensor E2) -> MC-EL_2.
Application of Rule 2: DC-Lo1, DC-Va1 (affected sensor t1) -> MC-Lo_Va_1;
DC-Lo1, DC-Va2 (affected sensor t2) -> MC-Lo_Va_2; MC-EL_1, DC-Lo1 (affected
sensor E1) -> MC-EL_Lo_1; MC-EL_2, DC-Lo1 (affected sensor E2) -> MC-EL_Lo_2.
Application of Rule 3: MC-EL_Lo_1, MC-EL_Lo_2, MC-Lo_Va_1, MC-Lo_Va_2
(affected device DC-Lo1) -> MC-II EL_Lo_Va_1.


Figure 8. Association between sensors and devices in the test spaces (cp. text and
Figure 7).

Figure 9. An automatically generated control model for the test spaces (cp. text and
Figures 7, 8).
As to rule 3, four MC nodes control the DC-Lo1 node. Thus, their actions must be
coordinated by an MC of second order, namely MC-II EL_Lo_Va_1. In this example,
Rules 1, 2, and 3 were applied to the control problem to construct the representation.
Using this methodology, a unique scheme of distributed, hierarchical control nodes
can be constructed. In certain cases, however, the control problem contains characteristics
that cause the model not to converge toward a single top-level controller. In these cases,
Rule 4 can be applied to ensure convergence.
Rule 4 is used to ensure that model functionality is not duplicated. Thereby, the means
of detecting a duplicated node lies in the node name. Since this may create hierarchically
isolated nodes, rule 4 also requires that such nodes be reconnected. The following
example illustrates the application of this rule.
Figure 9 shows a model that was constructed using rules 2 and 3. The application of
these rules is summarized in Table 3.
Rule 1 does not apply in this case because there are three distinct device types
involved. As to the application of rule 2, DC-EL1 and DC-BL1 both impact the state of
E1 and thus MC-BL_EL_1 is needed to negotiate between them. Three MC nodes are
created in this manner. When Rule 3 is applied, three second-order MCs are created. It is
apparent that the model will not converge. Moreover, the three nodes have the same
name: MC-BL_EL_Lo. This is an indication of duplicated functionality (of coordinating
devices BL, EL, and Lo). Thus, applying rule 4, nodes MC-BL_EL_Lo_2 and
MC-BL_EL_Lo_3 are removed, and node MC-BL_Lo_1, which is left without a parent
node, is connected to MC-BL_EL_Lo_1.

Figure 10. Application of Rule 4 (see also Table 3).
The above illustrations were meant to provide a basic understanding of the proposed
approach toward the rule-based generation of control-oriented building process models.
This model generation approach can be automated and enables, thus, the control system
to effectively respond to changes in spatial organization, control zone configuration, and
control components. Moreover, the systematic and explicit definition of sensors, zones,
devices, controllers, and their relationships, together with the aforementioned generative
rules provide a control systems design environment that has been shown to be
advantageous in terms of scalability and robustness. Various model generation attempts
pertaining to realistic building and system considerations (with multiple systems and
multiple spatial levels) have consistently led to reliable and robust control system models.

4.2 A note on model-based control methods

The integrated representation introduced in this paper is syntactic in nature. It does not
provide specific solutions to address issues of control semantics. Rather, it provides a
flexible framework (a template as it were) for implementing different kinds of control
semantics. Given its systematic hierarchical structure, the representation is specially
suited for the implementation of distributed and modular control logic routines. Such
routines include also the class of so-called model-based control methods. Model-based
control methods aim at the behavioral description of the reaction of controlled entities to
various positions of pertinent control devices given a set of contextual conditions (i.e.
climate, occupancy).

Table 3. Application of Rules 2 and 3 toward the generation of the control representation
of Figure 9.

Application of Rule 2: DC-EL1, DC-BL1 (affected sensor E1) -> MC-BL_EL_1;
DC-EL1, DC-Lo1 -> MC-EL_Lo_1; DC-BL1, DC-Lo1 -> MC-BL_Lo_1.
Application of Rule 3: MC-BL_EL_1, MC-EL_Lo_1 (affected device DC-EL1) ->
MC-BL_EL_Lo_1; MC-BL_EL_1, MC-BL_Lo_1 (affected device DC-BL1) ->
MC-BL_EL_Lo_2; MC-EL_Lo_1, MC-BL_Lo_1 (affected device DC-Lo1) ->
MC-BL_EL_Lo_3.

Both rules and simulations can be applied to capture (and predict) system behavior. In
rule-based control, a simple statement describes the control function used to make
decisions. As an example, a rule used within a DC node can define the relationship
between the state of a device and its corresponding impact on the state of the sensor
monitoring the controlled entity. Rules can be developed through a variety of techniques.
For example, rules can rely on the knowledge and experience of the facility managers, the
measured data in the space to be controlled, or logical reasoning.
Simulation-based control (Mahdavi 2001b) can be used for building control within the
proposed representational framework through the following procedure: (i) multiple
potential control device states are considered in the pertinent DC node; (ii) the resulting
device state space is analyzed via real-time simulation runs; (iii) simulation results are
evaluated and ranked according to applicable objective functions; (iv) the device
controller implements the most desirable device state.
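
The four-step procedure can be sketched as a generic ranking loop. Here simulate and objective stand in for a real-time simulation run and an applicable objective function; the fictitious thermal model below is purely illustrative:

```python
# Sketch of the four-step simulation-based control procedure.

def simulation_based_control(candidate_states, simulate, objective):
    # (i) consider multiple potential control device states
    # (ii) analyze the resulting device state space via simulation runs
    predictions = {state: simulate(state) for state in candidate_states}
    # (iii) evaluate and rank results by the objective function (lower is better)
    ranked = sorted(predictions, key=lambda s: objective(predictions[s]))
    # (iv) the device controller implements the most desirable state
    return ranked[0]

# Toy example: pick the louver position whose simulated room temperature
# is closest to a 21 degC set-point.
louver_positions = [0.0, 0.5, 1.0]          # fraction open
simulate = lambda pos: 26.0 - 6.0 * pos     # fictitious thermal model
objective = lambda temp: abs(temp - 21.0)
best = simulation_based_control(louver_positions, simulate, objective)
print(best)  # 1.0
```

In a real deployment, step (ii) would invoke a building simulation with current boundary conditions rather than a closed-form stand-in.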

Rule-based and simulation-based control algorithms and methods can animate the
control system representation, resulting in a versatile systems control environment. Such
an environment has the potential to simultaneously address: (i) the effects of control
actions on multiple zonal performance indicators (control parameters); (ii) the influences
of devices in multiple domains (heating, cooling, lighting, etc.); (iii) conflicts amongst
devices used to control multiple zones; (iv) cooperation between local and central
systems; (v) multiple control objective functions.
This paper presented an approach to establish an adequate framework for the combined
representation of building product and process aspects. In this representation, target
entities of control actions (controlled entities or zones) act as the coupling links between
parallel hierarchies of building product classes and building controller classes.
The building control hierarchy can be generated automatically via a set of four rules.
The starting point for this generation process is the definition of the controlled entities.
This is achieved by establishing explicit associations between devices and sensors. The
generated control representation includes various layers of multiple nodes that embody
primary device controllers and higher-level controllers (meta-controllers). Multiple
expressions of distributed control semantics can be accommodated in such nodes.
Promising algorithmic candidates for such distributed control logic implementations
include model-based (rule-based and simulation-based) control routines.
To improve the applicability of the proposed approach, work is under way to address
issues of building model maintenance. Specifically, technology candidates for scalable
and pervasive monitoring of model entities are being examined so as to bestow upon
the typically complex building models the capability to update and reconfigure
themselves autonomously.
The work presented in this paper is supported in part by the Austrian Science Foundation
(FWF), grant number P15998-N07.
Augenbroe, G. 1995. COMBINE 2, Final Report [online]. Commission of the European
Communities, Brussels, Belgium. Available from:
IAI 2004. International Alliance for Interoperability [online]. Website. Available from:
Mahdavi, A. 2003. Modell-basierte Steuerungsstrategien für selbstbewusste Gebäude.
Gesundheits-Ingenieur. Oldenbourg Industrieverlag, München. Heft 5. ISSN 0932-6200.
Mahdavi, A. 2001a. Aspects of self-aware buildings. International Journal of Design Sciences and
Technology. Europia: Paris, France. Volume 9, Number 1. ISSN 1630-7267. pp. 35–52.
Mahdavi, A. 2001b. Simulation-based control of building systems operation. Building and
Environment. Volume 36, Issue 6. ISSN 0360-1323. pp. 789–796.
Mahdavi, A. 1999. A comprehensive computational environment for performance based reasoning
in building design and evaluation. Automation in Construction 8 (1999). pp. 427–435.
Mahdavi, A., Ilal, M.E., Mathew, P., Ries, R., Suter, G. & Brahme, R. 1999a. The architecture of
S2. Proceedings of Building Simulation '99. Sixth International IBPSA Conference. Kyoto,
Japan. Vol. III. ISBN 4-931416-03-9. pp. 1219–1226.
Mahdavi, A., Cho, D., Ries, R., Chang, S., Pal, V., Ilal, E., Lee, S. & Boonyakiat, J. 1999b. A
building performance signature for the intelligent workplace: some preliminary results.
Proceedings of the CIB Working Commission W098 International Conference: Intelligent and
Responsive Buildings. Brugge. ISBN 90-76019-09-6. pp. 233–240.
Mertz, K. & Mahdavi, A. 2003. A representational framework for building systems control.
Proceedings of the Eighth International IBPSA Conference. Eindhoven, Netherlands. Vol. 2.
ISBN 90-386-1566-3. pp. 871–878.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

FIDE: XML-based data model for the Spanish AEC sector

J.M.Molina & M.Martinez
AIDICO, Valencia, Spain
ABSTRACT: The construction sector is peculiar: it has features that make
it special and different from other industrial sectors. Firstly, it is a highly
fragmented sector, where 96% of the companies are SMEs with fewer than
20 workers and a very small market share. Secondly, it is a sector of
one-of-a-kind projects. And finally, each project brings together several
actors from different areas. These features create the need for a common
language to facilitate data exchange amongst the different stakeholders.
Some interesting initiatives along this line are being carried out at the
international level. In this paper we present the FIDE initiative, in which
the Spanish Administration has promoted the work to develop a PDM for
the Spanish construction sector.

The exchange and integration of information among the different stakeholders who participate in the processes of designing and constructing public and private buildings and infrastructures (architects, engineers, constructors, manufacturers, laboratories, etc.) are critical points that the construction industry must address in order to benefit from the enormous productivity potential that Information & Communications Technology provides. To date there exists no reference framework in Spain for the integration of information in the construction process, nor for the compatibility of the software applications available on the market; neither has a model or standard system for common data exchange within the sector been established in the Business-to-Administration arena.
The usage of a common data model will allow a drastic reduction in the use of resources by all the agents involved. This applies especially to the Public Administrations in their task of managing and controlling the information, documents and files generated during the construction process. The Administration must therefore play a fundamental role in the promotion and use of new technologies in the construction sector, being one of the main beneficiaries of their use, since it may enormously optimise administrative processes that currently consume significant resources as a result of the inefficient management of information exchange.
The FIDE model focuses on the scope of the Spanish National Technical Building
Code. The Technical Building Code (TBC) is the normative framework that establishes
the safety and habitability requirements of buildings set out in the Building Act (LOE).


To promote innovation and technological development, the TBC has adopted the most modern international approach to building norms: Performance-Based Codes. The use of these new performance-based regulations calls for the configuration of a more flexible environment, easily updated in accordance with the development of techniques and the demands of society, and based on the experience of traditional norms.
Its development is coordinated at national level through a stable, supra-regional Working Group that establishes the bases for its development, coordinates and controls extensions of the model, and establishes mechanisms for the approval and integration of partial developments made by third parties within the general structure of FIDE, thereby guaranteeing the scalability of the model and the interoperability between the different scopes of the Technical Code.
The FIDE model is public and open. In addition, its foundations have been established taking into account possible relations with the most internationally widespread standards and data models, such as the IFC from the IAI [2]. This will facilitate the interrelation between FIDE and these international standards, thus fostering its compatibility and usability. In this way FIDE will benefit from the tools and developments already existing in the market for the IFC data model, which should help FIDE to spread not only at national level but also at international level, as an example of good practice in the sector.
The work done so far in FIDE has generated not only technical results but, more importantly, a proposed Law project setting up the legal framework and procedures for the implementation of the initiative and for ensuring its general impact on the sector. The FIDE project has developed medium- and long-term plans to ensure the continuity of the initiative and to create the foundations for the sustainable development of the model, as well as to increase the sector's awareness of the use of data models.
The paper is structured as follows: Section 2 describes the objectives of the development of the model. Section 3 presents the methodology used in the development, including management and organizational issues as well as technical issues. Section 4 gives a flavour of the technical work done on the model development. Finally, Section 5 summarizes the conclusions and lessons learnt from the work, as well as the next future steps.

2 OBJECTIVES
The main objective of the FIDE initiative is the development of a common product data model for the Spanish construction sector. A major concern, though, is to keep compatibility with the existing initiatives at international level. The objective of FIDE can be split into two main issues: firstly, the development of the model itself, and secondly, the establishment of the necessary procedures for its maintenance and quality assurance.
The main trigger for such a development is the need for a common framework for the
different stakeholders in the sector to exchange information. One of the most benefited
actors, and in fact the main promoter, is the Administration. The Administration is at the very centre of the data exchange. They have to receive plenty of documentation related to
the construction process: building licence queries, quality control documents, health and
safety assurance reports, etc. Each document has to be processed according to an established, repetitive procedure. At the moment this is done manually. The use of a known standard digital format will allow them to process this documentation automatically.
Furthermore, the Administration has the power to make it compulsory for the rest of the
stakeholders to use this data model for the delivery of the required documents.
On the other hand, there are also some important benefits for the industry side. If they use this common data model, information reusability increases, thus improving the efficiency of the construction process. The different actors collaborating within the framework of a given construction project can also take advantage of file sharing, reducing the repeated introduction of the same data into different applications and improving, again, the efficiency of the process. This translates directly into economic benefits for the sector stakeholders.
Finally, the existence of a broadly accepted and widespread product data model will be key to the development of software applications for the sector. These applications will take advantage of this common language to facilitate the re-use of information, the automatic processing of data, etc. So one of the intentions of the consortium is to lay the basis for the implementation of such applications by third parties as soon as the industry demands them.
To sum up, the objectives of the FIDE initiative can be enumerated as follows:
– To foster the development of the construction sector by improving the efficiency of the current ways of working.
– To improve the current communication channel between the sector stakeholders and the Administration.
– To provide the sector, especially the Administration, with a base for the later development of tools built upon the model.
– To offer international interoperability by following the main international standards.
– To facilitate the interoperability between the sector stakeholders: promoters, designers, constructors, material providers, software vendors, etc., including the Public Administrations, independently of the computer applications used for planning, designing, estimating, administrative management, authorization procedures, and other purposes.

3 METHODOLOGY
3.1 Introduction
This section describes the methodology that has been followed for the development of the
FIDE model. The section has been structured in three parts: firstly the management
structure that was created is described. Then the model maintenance and extension
procedures that were established are described. In both cases the main functions and aims
for each component of the structures are defined. Finally the technical issues are
discussed, explaining and justifying the main technological decisions taken in the model
implementation and the tools used.

3.2 Management structure

In order to keep control of the model evolution and quality assurance, a management
structure has been defined.

Figure 1. General FIDE structure.

Figure 1 shows how the overall FIDE structure integrates within the Spanish
Administration hierarchy. In fact it is only a proposal, but according to it, FIDE would be
a subcommittee within the Technical Commission for the Quality in Construction.
The inner structure of the FIDE management bodies is composed of three layers. At
the upper layer is the Steering Committee which is in charge of the strategic direction.
Under this group is the Technical Committee which deals with all the technical issues
related to the model development. Finally, at the bottom level, there are the working
groups which are made up ad-hoc for the development of concrete projects.
The Steering Committee is in charge of setting the main strategic lines of the FIDE initiative. Amongst its responsibilities is approving the inclusion of new extension projects under the FIDE denomination; for these decisions it has the support of a group of technical experts, the members of the Technical Committee. It is also responsible for decisions about the funding of activities. Furthermore, the Steering Committee represents the FIDE activities within the framework of the Spanish Administration.
The Technical Committee is responsible for the quality management of the FIDE data
model. Its members must take care of the different implementations and coordinate the
different work groups which are developing areas of the model. To this end, they must
watch and assess the different implementations and extensions of the model, and provide the developers with their advice and support. This Committee defines the technical framework for the adequate development of activities under the FIDE initiative, thus providing a controlled environment for the developers, which is the first step towards the quality assurance of the model. These activities include the specification of issues such as the development methodology and tools, as well as infrastructure and common information resources.

Figure 2. Work groups structure.

The Technical Committee is also responsible for the representation and defence of the FIDE initiative before other organizations or technical work groups. This includes official standardization organizations (AENOR, CEN, ISO) and national or international consortia (IAI, OASIS, …).
Work Groups can be set up ad hoc for solving specific objectives within the model. They must agree with the Technical Committee on the suitability of their objectives and their work plan within the long-term objectives of the FIDE model. The work groups structure is shown in Figure 2.
As soon as the Technical Committee has approved an extension proposal, the work group is allowed to use common information resources from within FIDE (templates, technology, etc.) and provides its results to be integrated into the general FIDE model.

3.3 Maintenance and extension procedures

A major concern in the development of the FIDE data model is quality assurance. The need for a model like this one is obvious in the sector and, as justified above, its use will increase the efficiency of the construction processes as well as the quality of the obtained results. However, in spite of this positive breeding ground, there exists a high risk of developing a useless model. This may happen if the quality issue is not given adequate consideration. For this reason, model quality assurance will be one of the major issues in the development and the assessment of contributions.
In this line, one very important issue for the model quality assurance is its usability. This is obtained by assuring the following features: the model must be understandable for the developers in charge of its implementation. This applies mainly to software application developers, but also to work groups dedicated to extending the FIDE model in some specific area. To this end, some basic norms will have to be followed: on the one hand, norms related to the naming and structure of the model; on the other hand, complete and well-written documentation of the model is essential. By following these guidelines, the result will be a more intelligible and reusable model as well as a higher level of quality.
Concerning extensions of the FIDE model, developers must follow a defined methodology. This methodology describes the process for the preparation and presentation of extension proposals. A set of steps has been defined in order to facilitate the task for the proposing groups and the Technical Committee. To sum up, a proposal should include the following elements:
– Description of the utility of the extension.
– Integration of the extension within the general FIDE model and other standard reference models.
– Project development plan.
This way, the potential developers can communicate their idea in a homogeneous way,
and the Technical Committee has some objective parameters to assess the different
proposals and evaluate their feasibility.
The methodology also includes a description of the steps to be followed in the development of new models, or sub-models. The process must be carried out in four steps:
– Identification of the sub-model field. The developer group must identify the field of study where they are going to work, which sets the framework into which the concepts to be modelled are fitted.
– Process model definition. A process model must be defined, preferably using IDEF0 representation, which includes the processes under study in the modelling exercise. The process model must include the exchanged documents as well as the participating agents. Alternatively, the developers may identify the processes on a reference standard process model instead of developing a new one.
– Integration within the global model. A study must be performed in order to identify which parts of the global model are going to be affected by the sub-model to be developed.
– Model development. Finally, after the Technical Committee approval, the work group will proceed to the actual model development. To that end, they will develop the parts not included in the global FIDE model. Subsequently, they will proceed to the integration of the sub-model within the global FIDE model.
This development process will be performed following the technical indications and
advice from the FIDE Technical Committee.
3.4 Technical issues
As it has already been mentioned, one of the main objectives of the FIDE project, and
thus of the consequent FIDE data model, is the compatibility with similar international
initiatives. More concretely, the actual reference for the development of the model has
been the IFC model.
The IFC standard (Industry Foundation Classes) is a standard promoted from within the construction industry by means of the IAI (International Alliance for Interoperability). The IAI is an association of organizations involving engineers, architects, constructors, national administrations, etc. It arose in 1995 with the aim of fostering a product data model for data exchange amongst different applications within the construction sector. After some years, ISO approved its latest release, IFC2X, as an ISO Publicly Available Specification (ISO/PAS 16739), thus making IFC an ISO standard.
One of the main decisions for the consortium was the choice of the method to represent the model. There were several possibilities, namely UML diagrams, EXPRESS schemas and XML schemas. A deep study was performed evaluating the pros and cons of each of them. Eventually the decision was to use XML schemas [3,4]. This decision was carefully considered, as the choice has both advantages and disadvantages; after the detailed analysis, however, there was a clear decision to use this meta-language for the model representation. The reasons for this decision are stated below:
– The XML meta-language is the de facto standard for data exchange on the network. One of the main objectives in the development of the FIDE model is to facilitate B2A (Business to Administration) interactions, improving the relation between the Administration and the industry by providing the tools for procedures such as the electronic delivery of documentation, eTendering, etc. By using XML in the development of the model, we move towards the facilitation of these B2A procedures.
– There exists a very high level of use of XML at the international level, which will facilitate the spread of the model.

– At the moment there are many software tools available to work with XML. On the one hand there are plenty of tools for the manipulation (editing, visualization, creation) of XML files; on the other hand there are also many development tools, such as programming libraries. The proliferation of such tools makes the adoption of XML structures in the industry easier.
– The most widespread international standards in this sector, ISO STEP and the IAI IFC, are evolving towards or already support XML [5].
– The existence of programming libraries and SDKs (Software Development Kits) facilitates the developers' task. This is a key issue, because it promotes the faster development of applications and, as a consequence, a wider expansion of the model.

The consortium suggests a set of tools in order to facilitate the developers' task. Namely, these tools and methods are IDEF0 for the process models, UML for the conceptual data models, and XML for the physical data model.

4 MODEL DEVELOPMENT

4.1 Model framework
As introduced above, the main aim of the current project was not the development of a complete model. More precisely, the intention was to establish the mechanisms for the self-maintenance and development of the model. These mechanisms have been described in the previous section. In this section a sample of the developed model is shown, just to give a flavour of the model under development.
This sample sub-model has been developed in the area of quality management in the construction sector. This is a very important issue in the sector and has gained a lot of attention in recent years. The main aim is to achieve better quality in the final product by controlling the whole life-cycle of the construction process.
In the Spanish situation, the quality control procedures are mainly driven by the Administration. Some of the mechanisms they use have been studied in depth within the FIDE framework, namely the control book, the quality profile, the building book and the material documentation. Most of the information making up the model has been extracted from the analysis of this documentation. Furthermore, the overlapping amongst these documents has been accurately analysed.
4.2 Reference model: IFC
The FIDE consortium has decided to take the IFC model from the IAI as the reference model, specifically the latest XML version of the release IFC2x 2nd Edition: ifcXML2. The FIDE data model does not include the whole ifcXML2 model; it only plans to pick the needed entities from IFC and then to complete them according to the specific needs. In some cases these entities must be extended to fulfil the needs. One of the main concerns is to keep compatibility with the standard initiative in order to obtain an open and usable data model.

Figure 3. FIDE general structure.

4.3 FIDE model sample

The approach selected for the FIDE model architecture is a layered, modular solution. This structure is defined as follows:
– Each entity represents a concept from the real world. These elements are represented through XML schemas, composed of an identification, a set of attributes and some relations with other elements.
– Related elements are put together to make up conceptual groups. These groups are called clusters.
– The whole structure is layered: each layer contains the entities that are used by several entities from the upper layers. This way the model avoids the duplication of re-used elements.
– The lowest level is called the kernel. It contains the most basic and most re-used elements.
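The layered organisation described above can be sketched in a few lines of Python. This is an illustrative stand-in only: the entity and cluster names are invented, and the real FIDE model is expressed as XML schemas, not as classes.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Entity:
    """A modelled real-world concept: identification, attributes, relations."""
    name: str
    attributes: tuple = ()
    relations: tuple = ()   # names of other entities this one refers to

@dataclass
class Cluster:
    """A conceptual group of related entities, placed on one layer."""
    name: str
    layer: int              # 0 = kernel (most basic, most re-used elements)
    entities: dict = field(default_factory=dict)

    def add(self, entity):
        self.entities[entity.name] = entity

# Hypothetical kernel cluster: basic, widely re-used elements.
kernel = Cluster("kernel", layer=0)
kernel.add(Entity("Address", ("street", "city", "postalCode")))
kernel.add(Entity("Agent", ("name", "role"), relations=("Address",)))

# Hypothetical upper-layer cluster re-using kernel entities instead of
# duplicating them, which is the point of the layered structure.
quality = Cluster("qualityManagement", layer=1)
quality.add(Entity("BuildingDescriptor",
                   ("cadastralRef", "licenceNumber"),
                   relations=("Address", "Agent")))

def resolve(name, clusters):
    """Find an entity by name, searching lower layers first."""
    for c in sorted(clusters, key=lambda c: c.layer):
        if name in c.entities:
            return c.entities[name]
    raise KeyError(name)

# The upper-layer descriptor refers to "Address" without redefining it.
print(resolve("Address", [quality, kernel]).attributes)
```

The design choice mirrors the text: shared data sets sink to the lower layers, so upper-layer elements reference them rather than duplicating them.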
In the sample of the model shown here, the modelling target has been a descriptor for the building, focused on the administrative identifiers and the general structure. The features that describe the model and the work done are stated below.
– The modelled element has been a descriptor for the building. It is a common element in the documentation related to quality management in construction, appearing in the design, construction and facility management phases.
– The building descriptor contains the set of data that describes a building from the administrative and formal point of view.
– In order to make use of the defined methodology, the element was studied from different points of view and several sub-models were obtained.

Figure 4. Building descriptor schema.

Finally, these sub-models were integrated to obtain a single common model for the building descriptor.
Figure 4 shows a sample of the XML schema that has been developed within FIDE. Specifically, it shows part of the model for the building descriptor.
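To give an idea of how such a schema is used in practice, the following Python sketch builds and queries a hypothetical building-descriptor instance with the standard library's ElementTree module. All element names, attribute names, the namespace and the data values are invented for illustration; the actual names are those defined in the FIDE schema (Figure 4).

```python
import xml.etree.ElementTree as ET

# A hypothetical FIDE-style building-descriptor instance (invented names).
doc = """\
<fide:BuildingDescriptor xmlns:fide="urn:example:fide">
  <fide:AdministrativeData>
    <fide:CadastralReference>REF-PLACEHOLDER-001</fide:CadastralReference>
    <fide:BuildingLicence number="2004-0001"/>
  </fide:AdministrativeData>
  <fide:GeneralStructure>
    <fide:Storey level="0" use="commercial"/>
    <fide:Storey level="1" use="residential"/>
  </fide:GeneralStructure>
</fide:BuildingDescriptor>
"""

ns = {"fide": "urn:example:fide"}
root = ET.fromstring(doc)

# Any XML-aware application can now process the document automatically,
# e.g. extract the data the Administration needs for a licence procedure.
cadastral = root.findtext(
    "fide:AdministrativeData/fide:CadastralReference", namespaces=ns)
storeys = [s.get("use") for s in root.findall(".//fide:Storey", ns)]
print(cadastral, storeys)
```

This kind of automatic extraction is exactly what the manually processed paper documents described in Section 2 do not allow.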
5 CONCLUSIONS

The most important conclusions are the following:
– It has been observed that the model tends to organise itself following the layered approach presented above. This happens in a natural way as new elements are analysed and included in the model: new elements share data sets with existing elements, and these data have to be included in the lower layers for reuse.
– The development of different parts of the model separately, for their later integration, has shown itself to be a right way of working. This demonstrates the viability of the defined strategy for the long term.
– The IFC model is very flexible and complete. However, this makes it very complex and, to some extent, ambiguous. Reducing this ambiguity is one of the main concerns of the FIDE model, thus avoiding problems between different software vendors' implementations of the same model.

– We have confirmed that the main modelling problems to be solved had already been faced in the IFC model. This validates the modelling strategy defined within the FIDE project.
– Despite the big scope of the IFC model, the absence of some elements and attributes that are important for the Spanish construction sector has been detected. These elements are essential for the utility of the model at the Spanish national level, which confirms the need to extend some of its features.
– The available XML tools have facilitated the creation and understanding of the model as well as the generation of documentation, thanks to the clarity of the representation and the large number of different tools.
The nearest further work consists of continuing the development of the model: firstly completing the quality management area and then addressing new areas of interest in the construction sector. The Administration has shown its determination and deep interest in the FIDE project by assuring the development for several years to come, so the model will keep evolving over the coming years. Apart from this, and just as important or even more so, the very next target will be the promotion and dissemination of the results obtained so far. Furthermore, emphasis will be placed on the development of software applications in the Spanish sector.
REFERENCES
[1] http://www.codigotecnico.org/
[2] http://www.iai-international.org/
[3] XML Schema Part 1: Structures, Second Edition, W3C Proposed Edited Recommendation, March 2004.
[4] XML Schema Part 2: Datatypes, W3C Recommendation, May 2001.
[5] Liebich, T. Options for the IAI regarding XML.

eWork and eBusiness in Architecture, Engineering and Construction – Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

A framework for concurrent structure analysis in building industry

A. Niggl, R. Romberg & E. Rank
Lehrstuhl für Bauinformatik, Technische Universität München, Germany

R.-P. Mundani & H.-J. Bungartz
IPVS, Abteilung Simulation großer Systeme, Universität Stuttgart, Germany
ABSTRACT: In this paper, a software framework is presented which helps to support the concurrent work of multiple planners in the construction industry. The basis of this work is a strictly volume-oriented building model. This model is stored in a central database, which supports cooperative work by using object-based check-in, check-out and locking mechanisms. Furthermore, a decomposition algorithm is presented which automatically derives a hexahedral mesh for a finite element computation from this central building model.

The efficient and accurate exchange of data is an important basis for successful cooperative work in the field of computer-supported planning and design in the building industry. This holds not only for the exchange inside a homogeneous group of planners; it is also important for the data transfer between planning processes in different working domains.
In this paper, a software framework is presented in which a central volume-oriented geometric model is considered as a basis to support cooperative work and the integration of different planning processes. The central data set is given by an explicit geometric B-Rep (Boundary Representation) model associated with semantic product data attributes, which is originally derived from an IFC product model.
Using a classical client-server structure, the building model is provided centrally and can be accessed by different planners. The concurrent access to the database is organized similarly to classical software management systems, where single entities can be checked out, locally modified and checked in by the user. To ensure geometric consistency of the common data model, an octree-based algorithm, developed in this project, is applied.
The geometric building model is the starting point for various subsequent tasks in the
planning process. For the structural analysis, we present an automatic generation of a
volume-oriented finite element mesh, consisting of solid hexahedral elements. In addition
to the finite element analysis, an indoor air flow simulation was also connected to the framework in another project (v. Treeck et al. 2004). Figure 1 shows a schematic view of the framework.
The outline of this paper is as follows: in the next section, the software structure and the techniques used in this work are presented. Then, in Section 3, the automatic derivation of the hexahedral finite element mesh from the architectural model is described in detail. Finally, in Section 4, the cooperative work of two planners is demonstrated in an example.

Figure 1. Schematic view of the framework: an IFC model is converted into a geometric B-Rep model, which is saved in a central database; different clients can access the central data in parallel; a finite element model can be derived and analyzed automatically.


2.1 From IFC to an explicit geometric model
In this project, a commonly shared geometric model is used as a basis for various tasks in
the planning and simulation process. The geometric data of a building model given by the
IFC-standard (IAI 2003) is usually not directly adequate for a numerical simulation, as it
only describes the topology and the mutual connections of different structural
components. For example, the geometry of a wall is described by a 2D profile together
with an extrusion direction or a window is given by its relative position to an anchor
point. But for the automatic derivation of subsequent simulation models, for example the
generation of a finite element mesh, we need an explicit description of the geometry.
Thus, we use the geometric modeller ACIS (Spatial 2004) to create a geometric model
from the IFC-product model, where each construction unit is described by a single B-Rep
object. The IFC data is accessed by using the Eurostep IFC-Toolbox (Eurostep 2000),
which is an object oriented C++ implementation of the IFC scheme representation and
which provides interface functionalities to access and manage instances of the product
model. The semantic data contained in the IFC object model is added as attributes to the B-Rep entities and is saved in parallel to the geometric data in an additional database.
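The difference between the two representations can be illustrated for a simple prismatic wall: the IFC description stores a 2D profile plus an extrusion direction, while the explicit model needs actual vertices. The sketch below, with invented dimensions, only illustrates this idea; in the project the real conversion is performed by ACIS and the Eurostep IFC toolbox.

```python
def extrude(profile, direction):
    """Explicit solid vertices for an extruded 2D profile (a prism).

    `profile` is a list of (x, y) footprint points, `direction` the IFC-style
    extrusion vector (dx, dy, dz). Returns bottom vertices then top vertices.
    """
    dx, dy, dz = direction
    bottom = [(x, y, 0.0) for x, y in profile]
    top = [(x + dx, y + dy, dz) for x, y in profile]
    return bottom + top

# Hypothetical wall: 4 m long, 0.3 m thick, extruded 2.5 m upwards.
wall = extrude([(0, 0), (4, 0), (4, 0.3), (0, 0.3)], (0, 0, 2.5))
print(len(wall), wall[4])  # 8 vertices; wall[4] is the first top vertex
```

A real B-Rep additionally stores faces, edges and their adjacency; the point here is only that the implicit profile-plus-direction description has to be expanded into explicit geometry before any mesh can be generated from it.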
2.2 Organization of the concurrent access
The technical basis of our cooperative workspace is a classical client-server architecture.
In order to ensure consistency of the central data model, the concurrent access by the
clients is being controlled by an intermediary management layer. Similar to well known
software-management systems, like CVS (concurrent version system), the server
provides methods for downloading, managing and uploading data. The smallest
organizational entity in the exchange is one single B-Rep object. Using an octree-based algorithm, the management layer ensures geometric consistency of the internal data model.
In order to share information among the concurrently working planners, notification services were developed in this project. The user can activate a locally running software module, which connects to the server and informs him about modifications in the central data model caused by other planners. In a configuration menu, the user can choose among different notification levels. He also has the possibility of restricting the objects he wants to be informed about to a subset of the complete model.
During runtime, and depending on the type of user access, a single object on the server can be in one of three different states: clean, shared and locked. An object is clean if no client has accessed it for modification. It is in state shared if at least one user has accessed the object in read and/or write mode, and it is in state locked if one client has claimed exclusive write permission for it.
The operational procedure in the user-side workspace will be demonstrated in detail in an example in the last section. One thing is always the same: in order to upload (check in) modified or new objects to the central database, the user has to check out the selected objects first.
Based on the internal states mentioned before, the user can choose among three possible modes of access for each object:
– Read-only: in this mode, modification by the user is not allowed. However, the user is registered on the server and can activate the notification service for the selected object.
– Read-write: the user gets read and write access to the data, which allows him to check modified objects in to the server. Modification by others is also possible.
– Exclusive-write (lock): the user gets exclusive write access to the object. Modification by others is not allowed.
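The states and access modes above can be summarised in a small Python sketch of the server-side bookkeeping. This is a simplified illustration, not the project's implementation; all class and method names are assumed.

```python
class AccessError(Exception):
    pass

class ManagedObject:
    """One B-Rep object on the server with clean/shared/locked states."""

    def __init__(self, oid):
        self.oid = oid
        self.state = "clean"
        self.readers = set()      # clients registered for notifications
        self.writers = set()      # clients with read-write access
        self.lock_owner = None    # client holding exclusive-write, if any

    def open(self, client, mode):
        if self.lock_owner is not None and self.lock_owner != client:
            raise AccessError(f"{self.oid} is locked by {self.lock_owner}")
        if mode == "read-only":
            self.readers.add(client)
        elif mode == "read-write":
            self.writers.add(client)
        elif mode == "exclusive-write":
            if self.writers - {client}:
                raise AccessError(f"{self.oid} is shared, cannot lock")
            self.lock_owner = client
            self.writers.add(client)
        else:
            raise ValueError(mode)
        self.state = "locked" if self.lock_owner else "shared"

    def check_in(self, client):
        """Upload a modified object; rejected without write access."""
        if client not in self.writers:
            raise AccessError(f"{client} has no write access to {self.oid}")
        # ...the geometric consistency check of Section 2.3 would run here...
        return True

wall = ManagedObject("wall-17")
wall.open("designer", "exclusive-write")
try:
    wall.open("engineer", "read-write")   # rejected: object is locked
except AccessError as e:
    print(e)
```

Note how the sketch reproduces the rule from the text: a lock is refused while other writers share the object, and a locked object rejects access by anyone but the lock owner.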
Only in read-write mode is concurrent access to objects possible. In this case, each upload by a user will overwrite the current version in the central database. To avoid confusion, the notification service may help, so that the user can stay informed about the current state and react accordingly. This makes sense especially in cases where one user (e.g. a designer) changes the geometry of an object while another user (e.g. a structural engineer) only wants to modify some attributes like loads or material. In such cases, an exclusive lock by one user would only hamper the work of the other. In cases where one wants to prohibit concurrent access by others completely, e.g. when substantial modifications must be applied, exclusive-write access should be used instead.
After the modifications in the local workspace are finished, the user checks his object in to the central database. Each upload of a modified or new building object will thereby initiate a consistency check on the server, according to the method explained in the next subsection. In the case of geometric collisions or insufficient access rights, the upload is rejected and the user is informed about the problem.
2.3 Consistency check
Before any consistency check can be performed, volume-oriented models have to be
derived from the respective surface-oriented ones. In (Mundani et al. 2003) we presented
an algorithm for the generation of octrees by intersecting half-spaces, allowing us a fast
and efficient derivation both in real time and on-the-fly.
Applying the Boolean intersection operator to two octrees, collisions of type
overlap and gap can easily be detected. Whenever two voxels (volume elements) of
two arbitrary octrees claim the same space, an overlap has occurred and the algorithm can
stop at this point. Depending on the maximum depth of recursion dmax, overlaps down to a
size of h = 1/2^dmax on the finest resolution level can be found. In this case, the check-in of
the modified parts is rejected by the server.
When no overlap could be detected, the two parts (or the two volume-oriented models,
respectively) either lie perfectly together side by side or are disjoint. In the latter
case, a gap between these two parts exists; only gaps of certain sizes are of further
interest. Therefore, the algorithm has to determine the maximum depth deff actually
reached during the intersection calculation, not to be confused with the maximum depth of
recursion dmax. By specifying the maximum admissible gap size dgap, a gap is detected
only in the case dgap ≤ deff < dmax.

A framework for concurrent structure

The corresponding part is allowed to be checked in, but further user feedback is
necessary, because most gaps occur unintentionally due to design or round-off errors.
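The overlap test on two octrees can be sketched with a toy recursive intersection. The representation below is invented for illustration (a leaf cell is either fully occupied or empty); the depth bookkeeping for deff and the gap test are omitted:

```python
# A node is "full", "empty", or a list of eight children (an interior node).
FULL, EMPTY = "full", "empty"

def overlaps(a, b):
    """Recursive intersection of two octrees: True iff some voxel is
    claimed by both trees (a collision of type 'overlap')."""
    if a == EMPTY or b == EMPTY:
        return False
    if a == FULL and b == FULL:
        return True
    # Descend: a leaf marked FULL covers all eight octants of its cell.
    ac = a if isinstance(a, list) else [a] * 8
    bc = b if isinstance(b, list) else [b] * 8
    return any(overlaps(x, y) for x, y in zip(ac, bc))
```

The recursion can stop at the first octant pair that is occupied in both trees, which is what allows the algorithm to terminate early once an overlap is found.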
3.1 A volume-oriented finite element approach
In contrast to the classical way in finite element analysis of using dimensionally reduced
models (e.g. 2D plates, shells or beams), in this project the structural analysis is
performed in a fully volume-oriented approach. The complete structure is discretized
consistently with solid hexahedral elements and the computation is carried out using
higher-order elements of the so-called p-version of the finite element method (Szabó et al.
1991, Szabó et al. 2003, Düster et al. 2003). This approach has some important advantages:
- The automatic derivation of a finite element model from the original (product) model is
simpler, if this transition can be done consistently in the same volume-oriented way.
- The possibility of such an automatic model derivation releases the engineer from a
manual reconstruction of various numerical systems.
- Using solid finite elements, possible three-dimensional stress conditions can be
captured directly.
- There is no need for coupling different dimensionally reduced mechanical models.
3.2 Automatic derivation of the finite element mesh
An important basis of our consistent volume-oriented approach is the automatic
derivation of the finite element model (Romberg et al. 2004). The basic idea of the
underlying algorithm is to decompose the given geometric building model into a set of
simpler geometric objects and decompose each of these objects again into hexahedral
elements. To ensure compatibility (i.e. no hanging nodes) of the final mesh, the whole
procedure is carried out in a set of steps, which are mainly used to find a common
discretization at the interface of different entities. It should be mentioned that this
approach does not aim at meshing general spatial volume-structures but is capable of
decomposing a typical building model, which usually consists of objects like plates,
beams, columns and slabs.
3.2.1 Connection model decomposition
As a first step in the process of creating a hexahedral mesh, the given geometric building
model is decomposed into a so-called connection model using boolean operations.
Figures 2-4 illustrate the basic idea.
Starting from the set of building models Mb, these elements are partitioned into a set
of coupling objects Mk and a set of difference objects Md. The set of coupling objects Mk
is then again recursively decomposed into coupling objects of different levels (Mk1, Mk2).

eWork and eBusiness in architecture, engineering and construction


Each coupling and difference object is itself a closed B-Rep body being described by
nodes, edges and faces. After decomposition, the intersection between difference objects
or between coupling objects of the same level consists of points and edges only, whereas
adjacent difference and coupling objects intersect in faces, edges and nodes.
After applying this decomposition algorithm, the resulting elements have some
important characteristics with respect to the following steps in finite element

Figure 2. Initial configuration Mb with three objects.

Figure 3. Creation of connection objects (Mk1, Mk2) using boolean operations.



Figure 4. Decomposed connection model Mc.
mesh generation: each coupling object Mk possesses a hexahedral structure and can thus
easily be partitioned into smaller elements, whereas each difference object Md is plate-shaped
and can be assumed to be obtained from sweeping a two-dimensional polygonal face.
3.2.2 Generation of hexahedral finite elements
The connection model shown in the previous section is the starting point for the
automatic generation of hexahedral elements in the next step. Thereby we use either
elementary three-dimensional meshing macros mainly applied to the coupling elements
or, in case of the plate-like difference objects, hexahedral elements are obtained by
creating a 2D quadrilateral mesh on the polynomial mid-face and sweeping this mesh to
the third direction. Most crucial in meshing is yet the question of generation of
compatible elements. For this, we apply a two-step approach. In a first run, a reasonably
refined mesh for each (separate) difference object is defined. According to the different
discretization on the boundaries of adjacent elements a compatible discretization must be
determined. When this common discretization is found on the boundary,


Figure 5. Original 3D volume model given by the CAD system.

Figure 6. Decomposed connection model.





a new mesh is created on the difference objects in a second run, which is then compatible
with its neighbours, i.e. the resulting mesh has no hanging nodes.
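The sweeping step for the plate-like difference objects can be sketched as follows; the function name and data layout are hypothetical, not taken from the paper:

```python
def sweep_quads_to_hexes(nodes2d, quads, thickness, n_layers):
    """Sweep a 2D quadrilateral mesh of a plate's mid-face into hexahedral
    elements by extruding it n_layers times along the third direction.

    nodes2d : list of (x, y) mid-face node coordinates
    quads   : list of 4-tuples of node indices
    Returns (nodes3d, hexes), where each hex is an 8-tuple of node indices.
    """
    n = len(nodes2d)
    dz = thickness / n_layers
    # Replicate the 2D nodes once per layer, offset in the sweep direction.
    nodes3d = [(x, y, k * dz) for k in range(n_layers + 1) for x, y in nodes2d]
    hexes = []
    for k in range(n_layers):
        lo, hi = k * n, (k + 1) * n   # node-index offsets of two adjacent layers
        for (a, b, c, d) in quads:
            hexes.append((lo + a, lo + b, lo + c, lo + d,
                          hi + a, hi + b, hi + c, hi + d))
    return nodes3d, hexes
```

Because every layer reuses the same quadrilateral connectivity, the extruded hexahedra are automatically compatible with each other within one plate; the compatibility problem the text describes arises only at the boundaries between adjacent objects.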
3.3 A complex example
In this section, the process from the original geometric building model to the finite
element results is demonstrated in an example of a realistic office building. The building
is constructed of reinforced concrete and consists of two massive inlying building cores,
six floor plates and supporting columns. It has dimensions of about 40 × 30 meters in the
ground view. Figure 5 shows the geometric model.
In Figure 6, the decomposed connection model can be seen, which was derived from
the original model according to Section 3.2.1. One can easily see the connection elements
on the top floor plate, created at the intersection of the inlying building cores and the
floor plate.

Figure 7. Finite element mesh.



Figure 8. Finite element results; plot of vertical displacements.
Figure 7 shows the finite element mesh, derived from the connection model according
to Subsection 3.2.2. It consists of 8313 hexahedral elements. In Figures 8 and 9 the finite
element results can be seen. First, in Figure 8, the displacement plot is depicted. Figure 9
shows the mean stresses (v. Mises stress) in a zoomed detail. For the computation, vertical
loads on the floor slabs and horizontal wind loads were considered. The results were
computed using the p-version of the finite element method with a polynomial degree of 3.
This resulted in a computation with 269,043 degrees of freedom, which took about 2 hours
on a Pentium IV with 1.7 GHz.



Figure 9. Zoomed detail of stress plot (v. Mises stresses).

Figure 10. Complete model in workspace A.




In this section, the concurrent work of a group of multiple planners will be demonstrated
in an example. The office building shown in the previous section is now modeled and
stored in the central database, which was described in detail in Sections 2.2 and 2.3. In
this context the database is also referred to as the server.
Let us assume that planner A (e.g. an architect) decides to carry out some modifications
to the model. In a first step, he may request an update in order to get the newest model
version from the server (Fig. 10). Then he checks out some objects with exclusive-write
access in order to apply some modifications, e.g. to move a column to another place
(Figs 11, 12).
In the meantime, planner B (e.g. a structural engineer) has also checked out some other
objects and changes load attributes (Fig. 13).

Figure 11. Check-out of selected items by planner A.



Figure 12. Change of column by planner A.

Figure 13. Change of load attributes by planner B.



Planner A has now finished his work. So he selects the modified objects and tries to
check them in to the server (Fig. 14). Unless another user has locked one of the objects
exclusively or the consistency check detects an intersection, the check-in is successful
and the objects are stored in the database, overwriting their current versions there.

Figure 14. Check-in of modified column by planner A.

Figure 15. Local database agent informs planner B about the modified objects.



This check-in of objects is now reported automatically to planner B, because he has
activated his local notification agent with the order to listen for geometric
modifications (Fig. 15).
Technically, every check-in is posted to a message queue on the server, where each
message contains information about the type of modification (geometric, attributes only,
type of attribute), the user, etc. This message queue is read out by the remote,
client-side notification agent, which prefilters the messages according to the user's
settings and sends a signal to the user.
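A toy version of the client-side notification agent described above; the message fields and class names are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Notification:
    obj_id: str
    kind: str      # e.g. "geometry" or "attribute"
    user: str      # who checked the object in

class NotificationAgent:
    """Client-side agent: reads the server's message queue and keeps
    only the messages matching the user's subscription settings."""

    def __init__(self, listen_for):
        self.listen_for = listen_for   # e.g. {"geometry"}
        self.inbox = []                # filtered messages for the user

    def poll(self, queue):
        # Prefilter: discard messages the user did not subscribe to.
        for msg in queue:
            if msg.kind in self.listen_for:
                self.inbox.append(msg)
```

In the scenario above, planner B subscribes only to geometric modifications, so planner A's attribute-only check-ins would be filtered out before any signal reaches him.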
Back to planner B, the structural engineer: after updating his local workspace, he
initiates a new finite element computation on the basis of the modified model.

Figure 16. Resulting finite element mesh of the modified model.
Again, the derivation of the finite element mesh and the computation are performed
completely automatically and need no further manual interaction by the user. Figure 16
shows the finite element mesh of the modified model, where a complete line of columns
was removed.
We have presented an approach which may help to support the co-operation between
different planners in building construction. The concurrent work is organized using a



central database with its access-management layer, in combination with a consistency
check and notification services.
The automatic generation of a finite element mesh, based on a strictly volume-oriented
model, releases the engineer from manually transferring design models to the numerical
simulation model. This helps to speed up the design process, especially in cases when
modifications and different design variants must be investigated.
This research has been supported by the Deutsche Forschungsgemeinschaft (Priority
Program 1103, Vernetzt-kooperative Planungsprozesse im Konstruktiven Ingenieurbau),
to which the authors are grateful.
Düster, A., Bröker, H. & Rank, E. 2001. The p-version of the finite element method for
three-dimensional thin-walled structures. International Journal for Numerical Methods in
Engineering 52: 673-703.
IAI, International Alliance for Interoperability 2003. IFC Release 2.x. Internet:
http://www.iai-international.org/.
Eurostep 2000. The IFC STEP Toolbox. Eurostep Group, Stockholm, Sweden.
Mundani, R.-P., Bungartz, H.-J., Rank, E., Romberg, R. & Niggl, A. 2003. Efficient Algorithms for
Octree-Based Geometric Modelling. Proceedings of the Ninth International Conference on Civil
and Structural Engineering Computing, Topping, B. (ed.). Civil-Comp Press.
Romberg, R., Niggl, A., v. Treeck, C. & Rank, E. 2004. Structural Analysis based on the Product
Model Standard IFC. Proceedings of the 10th International Conference on Computing in Civil
and Building Engineering (ICCCBE). Weimar, Germany.
Spatial Corp. 2004. ACIS 3D Geometry-Modeller. Westminster, Colorado, USA.
Szabó, B.A. & Babuška, I. 1991. Finite element analysis. John Wiley & Sons, New York.
Szabó, B.A., Düster, A. & Rank, E. 2003. The p-version of the finite element method. Accepted for
publication in Stein, E., de Borst, R. & Hughes, T.J.R. (eds), Encyclopedia of Computational
Mechanics, John Wiley & Sons, New York.
v. Treeck, C. & Rank, E. 2004. Analysis of Building Structure and Topology Based on Graph
Theory. Proceedings of the 10th International Conference on Computing in Civil and Building
Engineering (ICCCBE). Weimar, Germany.

eWork and eBusiness in Architecture, Engineering and ConstructionDikba & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

IFC supported distributed, dynamic & extensible construction products information
M.Nour & K.Beucke
Bauhaus University, Weimar, Germany
ABSTRACT: This paper reports ongoing research work on a new
approach of using electronic product libraries based on the concept of
Object Information Packs (OIPs). The approach builds on top of the IFC
model and the GTIN (Global Trade Item Number) concept. The paper presents a
complete specification of the OIP concept and a perspective of some of its
implementation scenarios. The main objective is establishing a link
between Objects in the IFC model from one side and their technical and
commercial attributes at the manufacturer and supplier from the other
side. This should enable multidisciplinary cross-industrial lifecycle
information to be captured by any IFC compatible application. The paper
discusses several scenarios for the implementation of the OIP concept
together with a simple IFC example.

1.1 Building information models
The construction industry is working very hard to bridge the gap between its islands of
automation. The efforts have taken many forms e.g. neutral file formats, APIs
(Application Programming Interfaces) and BIMs (Building Information Models). The
latter is expected to provide a means for sharing and exchanging information. The first
steps towards a Building Information Model were taken in the mid-1970s by independent
efforts to develop a number of integrated systems based on a single model that supports
various applications, e.g. the OXSYS CAD, CEDAR and HARNESS hospital design
systems in England (Eastman 1999). Although this is now more than thirty years ago, the
BIM approach is still not common practice in the AEC industry.
One major problem facing BIMs is the absence of responsibility for modeling the
construction product and material at a multidisciplinary, cross-industrial level of
abstraction. It is obvious that architects working on a project that contains thousands of
elements would not have the resources to create a model for each element. They would
most probably be paid for the production of printed drawings and documents rather than
models. It is not only the architect; we can imagine the problem for all the involved
disciplines. The problem is even worse when we consider that these models could be
project-specific and could not be used in a product library for similar projects at other
organizations. Most probably this approach would go beyond any return on investment.



1.2 Product catalogues

The majority of manufacturers, product information brokers and portal websites are
offering electronic versions of their old paper catalogues online, using convenient
presentation formats such as PDF and HTML (Augenbroe 1998). Some Internet portal
websites have developed their services to offer online product models. Although this is
considered a big step towards the automation of building information models, it still
has shortcomings and problems that need to be resolved.
One of the problems is that this approach does not seem to have an obvious influence on
the traditional paper-based catalogue selection process (off-the-shelf products).
Moreover, there is little evidence of efforts related to studying how designers search
for products, evaluate them and make their selection decisions (Shailesh et al. 2003).
On the other side of the value chain, it is not clear how the required multidisciplinary,
cross-industrial information from the manufacturer and supplier can be captured online,
together with tools that assist queries and decision-making processes rather than
decision taking.
A major problem with BIMs is that they become of little use when the information
provided by the model is insufficient or obsolete (e.g. the price or availability of a
product) or when a certain application requires a piece of information that cannot be
supplied by the model. More often than not, the need arises in the AEC industry for more
or updated information about a product. For example, it is argued by researchers like
(Laitinen 1998) that a cost estimate for a project is repeated seven times on average.
Furthermore, Value Engineering activities and cost optimization necessitate carrying
out different changes and substitutions of products and materials in the design to reach
a satisfying decision. In addition, the need for multidisciplinary lifecycle information
makes up-to-date information about a certain product inevitable; e.g. the FM (Facilities
Management) discipline is more often than not in need of maintenance information about
a certain product when the rest of the AEC disciplines have already left the project.
1.3 The proposed solution
In the coming section, the paper explains a new approach that is envisaged to help
automate the development of a building information model. The author suggests that the
information model should be produced as part of the construction product itself. This
information model grows with the development of the product, e.g. the manufacturer
would be responsible for the technical properties and, later, the supplier or wholesaler
would be responsible for the dynamic commercial aspects when the product is on sale in
the market. The aggregation of this type of multidisciplinary, cross-industrial
information is represented and made available online through OIPs (Object Information
Packs).



2.1 Definition
OIP stands for Object Information Packs. An OIP is a multidisciplinary, cross-industrial,
continually updated pack of information about a construction product or service, from
which there is a need to retrieve predefined information at any point in the value chain.
This information pack acts as a base unit of information supply to BIMs (Building
Information Models) throughout the building's overall lifecycle.
2.2 Format
An OIP is produced in a software-independent neutral format like ISO 10303-21 (STEP) or
XML, i.e. it enables the transfer of structured data by agreed message standards from one
party (computer) to another by electronic means with minimum human intervention, i.e. in
a machine-to-machine language.
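As an illustration only, a small XML serialization of an OIP along these lines could be generated as follows; no OIP schema is standardized here, and all element names are invented:

```python
import xml.etree.ElementTree as ET

def oip_to_xml(oip_id, technical, commercial):
    """Serialize an OIP as a small XML message with its two parts
    (element names are hypothetical, for illustration only)."""
    root = ET.Element("OIP", id=oip_id)
    for part, props in (("Technical", technical), ("Commercial", commercial)):
        section = ET.SubElement(root, part)
        for name, value in props.items():
            ET.SubElement(section, "Property", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

The point of such a neutral encoding is that any IFC-compatible application could parse the message without knowing which software produced it.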
2.3 Producer
The OIP has to represent both technical and commercial information of the construction
product. This information is usually produced jointly between the manufacturer from one
side and the supplier, retailer, importer or wholesaler from the other side. This means that
the OIP is finally determined at the point of aggregation of both types of information.
This aggregation or double composition ensures the uniqueness of the OIP as an
identifier of the construction product and enables its dynamic properties, i.e. commercial
properties can be continually updated and the technical properties can be extended. The
final OIP identifier is issued by the organization that owns the brand name of the
product, regardless of where and by whom it has been manufactured.
2.4 Genesis
The OIP is designed to be built on top of EAN (Einheitliche Artikel Nummer), which is a
type of a GTIN system (Global Trade Item Number). It inherits its well-established
norms for global trading and adds further restrictions and capabilities to suit the
characteristics of the construction industry. The OIP can be mapped (converted) to EAN,
whenever needed.
2.4.1 OIP versus EAN
OIP is a construction oriented global lifecycle identifier that links cross industrial
multidisciplinary information. It is mainly designed to suit the characteristics of the
various procurement systems of the construction industry. On the other hand, the EAN is
considered to be a product item reference plus a check digit. The EAN is a pointer to a
database (at the local EAN organisation). However, this database does not include more
information than the producer's name and contact details.
Both OIP and EAN have a check digit to validate the identifier. The OIP goes beyond
this by adding an extra validity check that tests whether the product fits into the
design or not, e.g. if the OIP of a 1.2 m door is linked to an opening of 0.9 m width in
the Building Information Model, this conflict should be detected and the OIP rejected.
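The EAN-13 check digit follows the standard GTIN weighting scheme (alternating weights 1 and 3 from the left); the fit check below is a hypothetical sketch of the extra OIP validity rule, not a specified algorithm:

```python
def ean13_check_digit(first12):
    """EAN-13 check digit: weight the twelve leading digits 1,3,1,3,...
    from the left and round the sum up to the next multiple of ten."""
    digits = [int(c) for c in first12]
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return (10 - total % 10) % 10

def oip_fits_opening(product_width, opening_width, tolerance=0.0):
    """Hypothetical extra OIP validity check: reject the OIP if the
    product does not fit the opening it is linked to in the model."""
    return product_width <= opening_width + tolerance
```

The first check guards against mistyped identifiers; the second guards against semantically valid identifiers attached to geometrically impossible places in the building model.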
OIP provides dynamic lifecycle information. This means that information can be
continually updated. Updates and versioning are two faces of the same coin. Therefore,
OIP allows for versioning. This enables dynamic properties of the product like price and
availability to be continually updated.
At the same time, there are different priorities and objectives behind the EAN and the
OIP. Some properties are of greater relevance to commercial trade uses than to
construction uses and vice versa, e.g. tracking and tracing of logistical units and
returnable assets is of great value for trading. This could still be used in construction
for determining things like the percentage of completion of works, the delivery of
products to construction sites and so forth. Although it seems of no use to stick a
barcode on a beam on a construction site, where it will most probably be lost, it can
still be useful to read (scan) the barcode from printed product catalogues and link the
OIPs to the BIM in the design phase. Moreover, the use of new technologies like
programmable mobile phones with scanners and cameras may in future enhance the effect
of an OIP barcode for on-site use.
2.5 Degree of granularity
One of the problems facing OIPs is the degree of nesting of elements. In other words, to
what extent should an OIP reference the OIPs of its constituent components? In some
cases, as in electromechanical equipment, we cannot determine at which level the OIP
referencing should stop: at the screw, or at the material of the screw? To put an end to
this problem, the OIP is designed to reference other construction products and materials
only down to a maximum detailing level, e.g. a concrete brick may reference cement,
sand and gravel as leaf elements. Other sophisticated construction elements, like
electromechanical equipment, do not reference their components any further. However,
there is another ISO standard (ISO 13584, PLib), a STEP/EXPRESS-based standard designed
specially for this purpose, which can technically be referenced from OIPs whenever
needed.
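The granularity rule can be sketched as a bounded expansion of constituent references; the registry layout and function name are hypothetical illustrations:

```python
def constituents(oip_id, registry, level=0, max_level=1):
    """Expand an OIP's constituent references, stopping at the chosen
    maximum detailing level so that leaf materials are not themselves
    decomposed any further."""
    refs = registry.get(oip_id, [])
    if not refs or level >= max_level:
        return []
    out = []
    for r in refs:
        out.append(r)
        out.extend(constituents(r, registry, level + 1, max_level))
    return out
```

With `max_level=1`, the brick example from the text expands to its materials (cement, sand, gravel), while any deeper decomposition of those materials is cut off.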
2.6 Limitations
OIPs are not aimed, by any means, at solving the taxonomy problems of construction
product properties. Thus, the product properties are limited to the attributes and
published property sets of the IFC 2x model (ISO/PAS 16739, IAI 2003). This enables
the exchange of multidisciplinary, cross-industrial technical properties between
parties beyond national borders without any misunderstanding due to differences in
languages, classification systems or organisational cultures. However, the commercial
properties will remain subject to international trading conventions and standards.



2.7 An OIP organization

An important task contributing to the success of OIPs is the management of the OIPs
themselves. Things such as numbering and keeping

Figure 2-1. The structure of the OIP.

records of technical properties of products have to be managed by an international
non-aligned organization. Therefore, the main mission of the OIP organisation is the
allocation of the OIP technical section and the keeping of records of technical
information about products in an online database, where they can be accessed at any time
by any user. Furthermore, it may optionally, in certain cases, give a reference to the
brand-name holder, where the dynamic commercial properties of the product reside.
2.8 The design of the OIP
Figure (2-1) shows that the OIP consists of two main parts: the technical part and the
commercial part. Together they compose the OIP and formulate it as a globally unique
identifier for a construction product, service or material. A product under different
brand names can have one or more OIPs with the same common technical part but with
various commercial parts. This enables the OIP to represent commercial properties
supplied by the brand-name holder in addition to the technical properties provided by
the manufacturer. The technical part is relatively static, as the technical properties
do not change but can be extended. On the contrary, commercial aspects like price,
availability and discounts can change dynamically.
2.9 The formulation of the OIP
Figure (2-2) shows how an OIP is formulated and how it can reference the OIPs of its
constituents. It also shows the combination of the technical and commercial parts of the
OIP. Furthermore, it shows the entire relation between the end-user, supplier,
manufacturer and the OIP organisation. It is an example of a simple brick that consists
of cement, sand and gravel. The brick itself has technical properties provided by its
manufacturer. The OIP of the brick further references the technical parts of the OIPs of
its constituents. As a general rule, the OIP is only complete when both of its components
(the technical and commercial parts) are present. The manufacturers or the suppliers have



to register the technical properties according to the IFC model and its published
property sets at the OIP standard organisation. At the time of conducting this research
work, the EAN kept in its database only some basic information about the brand-name
holder, like contact details. However, it does not offer any commercial properties
belonging to any product. Such properties, like availability, price and discounts, are
best managed by the commercial organisation itself. Therefore, the OIP organisation can
exceptionally include a link to the brand-name holder or commercial organisation to
overcome this shortcoming of the EAN system.
2.10 The OIP identifier
As it is earlier mentioned, the OIP is built on top of EAN. Hence, the commercial aspects
are encoded in the EAN. However, the EAN item number will be extended to include a
versioning system that enables the dynamic management of commercial information.
The technical part of the OIP is a reference to a pack of information residing at the
OIP organization. This pack of information contains information about the product
according to the attributes and published property sets of the IFC model. It may also

Figure 2-2. The formulation of the OIP.




Figure 2-3. EAN and OIP structures.

include optional references to other OIP(s) of leaf elements that form the main product
by their aggregation, e.g. cement for the brick or reinforced concrete.
This part of the paper describes the implementation of the OIP concept in real-life
scenarios, together with a simple example using a limited number of product attributes,
to enable the reader to follow the logic behind the OIP idea. Before the example is
presented, the abstract concept of the OIP is clarified.
3.1 The basic concept
The whole idea can be simplified as a mapping and merging from two source models (S1
and S2) to a target model (T1). In Figure (3-1), S1 represents the OIP organization,
where all multidisciplinary technical information resides. S2 represents the supplier or
brand-name holder of the product, where all the commercial properties reside. T1
represents the client or user, envisaged to be a group of AEC applications built on top
of the IFC model.
There are many different scenarios in which the OIP concept could be implemented.
However, this paper focuses on two main scenarios. The first is the traditional way of
using paper-based catalogues or CD-ROMs. The OIP reference (identifier) can be
instantiated by the CAD package or by ad-hoc software. Lifecycle information can then be
retrieved using the OIP identifier, and the required data can be mapped and merged into
the IFC model at the client's side (through a distributed network application).
The second scenario is a more complex one. It depends on the capture of the required
object parameters from the CAD/IFC model. These parameters are



Figure 3-1. The OIP implementation.

used to carry out a parametric (attribute-based) search, as opposed to the descriptive
search in the first scenario. This can be achieved by using SQL, bcXML (Tolman et al.)
or EXPRESS-X. The result of this search is a list of products that satisfy the search
parameters. This list can be sorted according to the value of any selection attribute,
e.g. price, sound absorption coefficient, fire resistance and so forth.
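The parametric search could be sketched as follows; the catalogue records and attribute names are invented examples, not part of any OIP specification:

```python
products = [
    {"oip": "OIP-001", "price": 410.0, "fire_rating": "EI30", "sound_db": 32},
    {"oip": "OIP-002", "price": 385.0, "fire_rating": "EI60", "sound_db": 38},
    {"oip": "OIP-003", "price": 520.0, "fire_rating": "EI60", "sound_db": 41},
]

def parametric_search(catalogue, sort_key, **required):
    """Attribute-based search: keep products matching all required
    attribute values, then sort by the chosen selection attribute."""
    hits = [p for p in catalogue
            if all(p.get(k) == v for k, v in required.items())]
    return sorted(hits, key=lambda p: p[sort_key])
```

The same parameters captured from the CAD/IFC model would drive the query; only the transport (SQL, bcXML or EXPRESS-X) would differ in practice.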
A step forward in this approach would be the selection and appraisal process, i.e.
decision making versus decision taking. This can be done by conducting a virtual
experiment under simulated real conditions in which the product will be performing
(i.e. providing full context conditions). The experiment is repeated several times on
the short-listed products. Each time, a product from the candidates is substituted,
tested and ranked according to its performance in the virtual experiment.
By using this approach, the user can determine a set of weighted performance indicators
that represent the full context in which the product would be used. This coincides to a
great extent with the principles of TQM (Total Quality Management), i.e. quality of
performance rather than quality of specifications (Nelson 1996). Any need for extra or
up-to-date information should be served through the OIP unique identifier.
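The ranking by weighted performance indicators might look like this minimal sketch; the indicator names and weights are invented for illustration:

```python
def rank_candidates(candidates, weights):
    """Rank short-listed products by a weighted sum of normalized
    performance indicators (higher score means a better fit to the
    full-context conditions chosen by the user)."""
    def score(product):
        return sum(w * product[indicator] for indicator, w in weights.items())
    return sorted(candidates, key=score, reverse=True)
```

After each virtual experiment, the measured indicator values for a candidate would be fed into such a scoring step to produce the ranking described above.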
3.1.1 Example
Looking at a simple example of a door (IfcDoor): a door is selected according to the
first scenario from an electronic catalogue on a portal website (Figure 3-2). It is
transferred through a drag-and-drop environment to the CAD application, where the OIP is
instantiated in the door's Tag attribute in the IFC model (Figure 3-3). The door in the IFC



model consists of two main parts: the lining and the panel (Figure 3-4). All the
attributes of the door and its property sets can

Figure 3-2. Selection of a door from a portal website.

be reached through the OIP. The IFC published property sets of the door include things
like the operation direction, overall size, operation properties (swing), material, panel and
lining detailed properties, door common properties like: Infiltration, Thermal
Transmittance, Fire, Security and Acoustic Ratings and so forth. These properties might
be needed in later design or facility management stages by different AEC disciplines.
Access to this information should be enabled through the OIP. If at any time the need
for more information arises in any discipline, the product's OIP can be accessed, and the
property selected and merged into the IFC model on the client side. If the product needs
to be changed for any reason, the same parameters can be used to conduct a new
parametric search. This can also be done as a result of a commercial property change, e.g.
price or availability updates. If the product needs to be substituted with another product,
a new OIP unique identifier simply substitutes the old one.
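The lookup, merge and substitution mechanism described above can be sketched as follows. This is a hypothetical illustration only: the `OIPRegistry`, `merge_properties` and `substitute` names are invented for the sketch, with an in-memory dictionary standing in for the manufacturer's side; they are not part of any real IFC toolkit.

```python
# Hypothetical sketch of the OIP mechanism described above; class and function
# names are invented for illustration, not part of any real IFC toolkit.

class Product:
    """Client-side product proxy identified by an OIP (EAN-based) identifier."""
    def __init__(self, oip_id):
        self.oip_id = oip_id
        self.properties = {}

class OIPRegistry:
    """Stands in for the manufacturer/supplier side serving up-to-date data."""
    def __init__(self):
        self._catalogue = {}

    def publish(self, oip_id, properties):
        self._catalogue[oip_id] = dict(properties)

    def fetch(self, oip_id, names):
        # Return only the requested property sets for the given identifier.
        source = self._catalogue[oip_id]
        return {n: source[n] for n in names if n in source}

def merge_properties(product, registry, names):
    """Pull selected properties from the supplier and merge them client-side."""
    product.properties.update(registry.fetch(product.oip_id, names))

def substitute(product, new_oip_id):
    """Substitute the product: the new OIP identifier replaces the old one."""
    product.oip_id = new_oip_id
    product.properties = {}  # stale data is dropped; fetch anew when needed

registry = OIPRegistry()
registry.publish("ean:5410000123457",
                 {"ThermalTransmittance": 1.8, "FireRating": "EI30"})

door = Product("ean:5410000123457")
merge_properties(door, registry, ["FireRating"])
print(door.properties)  # {'FireRating': 'EI30'}
```

The point of the sketch is the division of responsibility: the property values always reside on the supplier side, and the client merges only what a discipline currently needs.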


Figure 3-3. The instantiation of the


Figure 3-4. IfcDoor panel and lining.




This paper has introduced a new concept for automating the process of establishing
a Building Information Model. This concept is called the OIP. It depends on the IFC platform
specification (ISO/PAS 16739) and EAN for the transfer, merging and mapping of
technical and commercial data of construction products. This mechanism enables
continuous, up-to-date, distributed communication between product models and their
attributes, which reside with the manufacturer or supplier. This approach is envisaged to
satisfy the AEC disciplines' need for information during the product's overall life cycle.
Such strong standardization concepts do have their impact on the creativity of the design
process. Researchers such as Howard (2001) argue that such standardization concepts limit
the freedom of design as the alphabet limits literary expression. On the other hand, every
design is a redesign, and by applying this concept we can foresee the window of
opportunities that such a new concept can open.
However, it may also have dramatic impacts on the procurement of construction products
and on the automated generation of documents, i.e. it may enhance all the
benefits of the Building Information Model and simulation applications.
Finally, it should be mentioned that this research work did not try to tackle the
taxonomy, languages, or cross-organizational and cultural differences of construction
product attributes and properties. It adhered to the attributes and published property sets
defined and published by the IAI in the IFC2x documentation. In the meantime,
queries can also be conducted using SQL, EXPRESS-X, bcXML and so forth.
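As an illustration of the kind of parametric query just mentioned, the sketch below expresses a product search in SQL via Python's standard `sqlite3` module. The table, column names and sample data are invented for the example; they do not come from the paper or from any real product catalogue.

```python
# Illustrative only: a parametric product search expressed in SQL, as the text
# notes such queries could be. Table, columns and data are invented examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE doors
                (oip_id TEXT PRIMARY KEY, fire_rating TEXT,
                 width_mm INTEGER, price REAL)""")
conn.executemany("INSERT INTO doors VALUES (?, ?, ?, ?)",
                 [("ean:001", "EI30", 900, 420.0),
                  ("ean:002", "EI60", 900, 610.0),
                  ("ean:003", "EI30", 800, 395.0)])

# The same parameters could later be reused to search for a substitute product.
rows = conn.execute(
    "SELECT oip_id, price FROM doors "
    "WHERE fire_rating = ? AND width_mm >= ? ORDER BY price",
    ("EI30", 850)).fetchall()
print(rows)  # [('ean:001', 420.0)]
```

An EXPRESS-X or bcXML formulation would express the same constraints over the schema of the product model rather than over a relational table.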
REFERENCES
Augenbroe, G. 1998. Building Product Information Technology, White Paper. Atlanta:
Georgia Institute of Technology. Available from http://www.arch.gatech.edu/crc/ProductInfo/
EAN 2004. The Global Language of Business, EAN International. Available from http://www.eanint.org/
Eastman, C.M. 1999. Building Product Models: Computer Environments Supporting Design and
Construction, CRC Press LLC, 2000 N.W. Corporate Blvd., Boca Raton, Florida 33431, USA.
ISBN 0-8493-0295-5.
Howard, R. 2001. Classification of Building Information - European & IT systems. In Construction
Information Technology, International Conference: IT in Construction in Africa, pp. 9-1 to
9-14. CSIR, Building and Construction Technology.
IAIntern 2003. IAI - Industrie Allianz für Interoperabilität, Nr. 1/03, Januar 2003, pp. 7.
ISO 10303-11 EXPRESS 1994. Industrial automation systems and integration - Product data
representation and exchange - Part 11: Description methods: The EXPRESS language reference manual.
ISO 10303-21 STEP 2002. Industrial automation systems and integration - Product data
representation and exchange - Part 21: Implementation methods: Clear text encoding of the
exchange structure.
Laitinen, J. 1998. Model Based Construction Process Management. PhD Thesis, Royal Institute of
Technology, Stockholm, Sweden.
Nelson, Charles 1996. TQM and ISO 9000 for Architects and Designers, McGraw-Hill.



Shailesh, J. & Augenbroe, G. 2003. A methodology for supporting product selection from e-catalogues. Journal of Information Technology in Construction, Vol. 8 (2003), pp. 383.
Tolman, F., Rees, R. & Böhms, M. 2002. Building and Construction Mark-up Language (bcXML):
The C2B/B2C Scenario. Delft University of Technology, Netherlands.
Workman, B. 2003. BIM (Building Information Model): Does the Building Industry Really
Need to Start Over? A response from Bentley to Autodesk's BIM proposal for the future.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Product definition in collaborative building design and manufacturing environment
H.Oumeziane & J.C.Bocquet
Ecole Centrale Paris, Châtenay-Malabry, France
Ecole Centrale Lille, Lille, France
ABSTRACT: The aim of this work is to bring the actors of the building
design and manufacturing process closer together around a standard and common
building product definition. In the current context a real problem of
building product definition exists: each actor has a specific point of view,
not necessarily the same as that of the other actors. As a solution to this problem
the paper proposes a conceptual model of the building as a product, using a
systemic approach and UML as a formal modelling language. It is a
continuation of other work presented at the ITC@EDU workshop,
entitled "Systemic approach for building design modelling".

The paper presents a conceptual model of the building product in a collaborative design and
manufacturing environment, based on a systemic approach and formalised using the UML
language. It is organised in three principal parts: the first part describes the building design
and manufacturing context with respect to actors' interoperability. The second focuses on
conceptual modelling as a tool for interoperability, and the last consists of the proposal
of a formal conceptual model.
In spite of the several actors concerned with building design and manufacturing, the
building sector remains one of the rare fields lacking tools and methods dedicated to
collaborative work. Nevertheless, it strongly depends on the legal framework specific
to each country.
2.1 Legal context
In the current European context of building design and manufacturing, buildings are
subject to a particular partitioning of the life cycle, regulated by law. This legal framework
constitutes a privileged instrument for managing the activities related to the building



sector (Ameziane 2001). The MOP law (Maîtrise d'Oeuvre Publique), which is specific to
France, codifies the missions of each actor intervening in the building design and
manufacturing process (DA 2000); similar laws exist in each country.
In this partitioning, a construction project is born from an intention that expresses a
need. This need is formalised by a programme that expresses the requirements of the
customer. Based on this programme, a draft including conceptual solutions is then developed
by the design actors.
From this stage, a Preparatory Project Summary (PPS) is created and integrated into
an administrative file for a building authorization. The development of the PPS leads to
the Detailed Preparatory Project, which includes the technical solutions evaluated by
the design partners (offices for structure studies, electrical studies, etc.). This work leads
to the production of the Tender Documents to the Companies, composed of the Project
Execution Plans and the detailed Technical Specifications.
The reception of the project is the last stage in the life cycle partitioning; it marks
the completion of the project. The building then enters the exploitation phase for
which it was intended (Sahnouni 1999).
2.2 Towards a collaborative design
The legal framework presented above constitutes a kind of method which influences the
production of the building. Nowadays, other design methods initially conceived for
the industrial sector also influence the building sector. They are not always adapted to
it (design for manufacturing, systemic production, etc.) and only a few of them can be
integrated into it, in particular concurrent engineering (Bignon et al. 1998).
The tendency today is towards this new method, which gives building companies a margin
of flexibility. For their designs, these companies are directed towards cooperation
and the exchange of information, around a cooperative building production process.
This cooperation is organized on the basis of information systems. It allows different
know-how to be brought together on the same problem, in order to produce a solution. This solution
is then a compromise among the various points of view (of the architect, the
engineers, the contractor and, of course, the customer). This method seems to be
adapted to the requirements of the building sector.
Concurrent engineering is the natural evolution towards which the building sector should
move. The remainder of this paper aims to address this need and
deals only with tools related to this method.
2.3 The conceptual modelling as a background of interoperability
A great number of works on conceptual models have been initiated in France (BOX, GSD,
MOB, JUICE, TECTON, etc.), but also internationally (ATLAS, COMBINES,
MISSED, etc.), with the principal objective of describing and
developing building product data models and building production process models
(Ameziane 1998).
These works were variously carried out by research laboratories from the academic world,
with the assistance of institutional and industrial partners involved in the manufacture of
hardware and the publishing of software.



The most important international action today for addressing the problem of interoperability
with a conceptual model is the project of the International Alliance for
Interoperability (IAI). It consists of the IFC model (Industry Foundation Classes). The
IFC model differs from all the preceding models in that it proposes an
extremely detailed structure of the building product (Billon 1999). This level of detail is
justified by the fact that the IFC are a set of resources, conceived as a support for
building software publishers.
The IFC propose a modelling of the building life cycle, structured according to four levels:
level 1: four general phases are identified: feasibility, design, construction, and
exploitation of the building.
level 2: each preceding phase breaks down into a set of secondary phases, organized
in chronological order (the design phase, for example, breaks down into:
programming, schematic design, detailed design, execution documents, tender documents).
level 3: each of these secondary phases is declined into a series of chained processes,
which correspond to the various actions of the designers during the project. They
establish continuity in the design cycle.
level 4: finally, each process breaks down into a set of activities. Each of these is
associated with a diagram describing the tasks to carry out, also indicated in
the model by design methods.
Note that these works have relatively close ambitions. The question is one of facilitating
the communication of information relating to the building product among the actors by means of
a conceptual model. The principal idea in this tool is to model the building product as a
set of objects evolving in a production process. It is important to know that this
vision is very restrictive of reality.
Going beyond the current partitioning of the building design and manufacturing process (through a
conceptual model) means, first of all, reconstituting an informational continuity in
the building life cycle. We propose to intervene on this cycle according to a systemic
approach, regarding the world of the building not as a multitude of distinct elements
(as in current models), but as a single system integrating a set of interacting
components (Le Moigne 1977).
The building system is the set of human, material and immaterial components
intervening in the activities related to the life cycle. The system limits are the terminals
characterizing the beginning and end of the life cycle. The system inputs are data characterizing
these components and the outputs are the system's levels of production (Oumeziane 2004).
Our paper entitled "A systemic approach for building design modelling", presented at
the ITC@EDU workshop, introduces in four pages a conceptual model of the building
system proposed instead of the current traditional life cycle. Based on these results, the
present paper proposes a building product definition within the building system.



3.1 From the building system to the building product

Setting up a systemic framework of building design and manufacturing includes
the definition of the building system, but also the definition of the building product
produced by this system. Is it a set of different objects? Is it a set of spaces? Is it a service
offered in a defined space? These questions show that there is a basic problem in
building production related to the product definition.
The current conceptual models introduced in this paper consider, in most cases,
the building as an assembly of objects (with the problem of delimiting objects) and a
set of rules defining the relations between these objects.
3.2 Relativity of the building product definition
In practice, it is very common to attach several different definitions to a single
building product. It is possible to distinguish two categories of views able to define the
building product: a category of actor-product views and another of stage-product views.
In the first category, the product is related to an actor's view. In general, the point of
view of the economist participating in a project constitutes a compromise among the actors.
For the economist, the building is seen as a set of objects and batches of objects. The
other actors have their own points of view. The architect, for example, sees the building
according to his personal convictions and his artistic tendencies. He
considers the building as a set of solids and voids, of serving spaces and
served spaces. A structural engineer has a view considering the building as a
mechanical model, and so on (AFITEP 2000).
In the second category, the product is related to a stage of the life cycle. It is an
infrastructure, a superstructure, an envelope, and so on, until it becomes a finished
construction assigned to a service.
In continuity with the approach undertaken (the systemic approach), all the points of view
of the actors contributing to the production must be taken into account, as well as all the stages of
the life cycle. The definition given to the building product is in this case variable,
evolving over a set of actors and stages.
From this fact a first conclusion can be drawn: the definition of the building product is
relative, respectively, to the actors' points of view and to the stages of the building life
cycle. It cannot be restricted to only one actor's point of view or only one stage of
production. The building product then becomes definable over a set of different views and
stages; it is variable.
3.3 The building product variable in the building system
The building product is the variable component over the set of stages of the design state
(the design state is the state of the building system including all the stages of:
feasibility, programme, APP, APD, etc.). In this state of the building system, the output
consists of a portfolio of documents (plans, etc.) representing a first form of the building
product definition (Oumeziane 2004).



Figure 1. The semantics of the building product definition.
The design in this state of the system is organised according to a representation of
three referentials: methodological, conceptual and normative (Oumeziane 2004). Defining
the building on this basis means giving a semantic definition taking into account the actors'
points of view represented in the referentials (Fig. 1).
We thus go from an ineffective, traditional representation of modelling built
around building objects and their processes of realization to an effective and better
adapted representation built around referentials.
The structure of referential frames which was set up must make it possible to define the building
product against a semantics built around several actors' points of view (Martin 2001).
It makes it possible to identify the product by particular characteristics established in the course of
production in the first state of the system (the design state). However, the semantic
definition obtained should be completed by a syntactic one. It is important to remember
that in a systemic approach two types of equality exist between components:
a syntactic equality and a semantic one (Giambiasi 2001).
In a semantic equality it is possible to define two elements that are different in form but
similar in function as equivalents. For example, a chair used to break a window and a
hammer are equivalent in this case: they are used for the same function.
Our first definition of the building product in the preceding figure must fully integrate
this aspect of semantic equality. A syntactic equality considers only similar
elements as equivalent. The hammer-chair equality becomes false from a syntactic point of view.
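The distinction between the two equality types can be sketched in code. This toy illustration is ours, not the paper's: syntactic equality compares the elements themselves, while semantic equality compares the function they fulfil.

```python
# Toy illustration (not from the paper) of syntactic vs. semantic equality.

class Tool:
    def __init__(self, name, functions):
        self.name = name
        self.functions = set(functions)

def syntactically_equal(a, b):
    # Only identical elements are equivalent.
    return a.name == b.name

def semantically_equal(a, b, purpose):
    # Different elements are equivalent if both fulfil the same function.
    return purpose in a.functions and purpose in b.functions

hammer = Tool("hammer", {"drive nail", "break window"})
chair = Tool("chair", {"sit", "break window"})

print(semantically_equal(hammer, chair, "break window"))  # True
print(syntactically_equal(hammer, chair))                 # False
```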
In order to define a syntactic framework for the semantic definition proposed above,
we turn to formal systems. A formal system is a system with only
syntactic equality and rigorous reasoning (Johnson 1970). (In the semantic definition of
our product, the reasoning is not rigorous, because the actors reason by analogy,
comparison, induction, etc. In rigorous reasoning there is only one rule to reason:



Figure 2. Syntactic structure of the building product definition.
Following this approach, the building product can be compared to a number n
belonging to a mathematical set. For example, an unspecified construction can be
assimilated to a global set of products (general consumer products), to a typological
set (products of the same family), to a die set (products of the same constructive die) and
to a components set, from a purely formal point of view.



Compared with the mathematical construction of sets, a number n in a formal definition
can be seen as belonging to a general set R including all the real numbers, to intermediate sets K and J,
and to a set N including the natural numbers. Four sets of definition are then obtained,
nested from the smallest to the largest, constituting a syntactic framework in which
the variable component, the building product, is defined (Fig. 2).
Accordingly, the building product cannot have just any possible semantics in this purely
formal construction. The idea of presenting the building as an element of socialization, a
founding element of a culture, etc., is thus not taken into account in this definition.
An undivided component of the building (beam, column, window, furniture, etc.) can
be defined on the fourth set, an element of structure on all the sets except the
components set, a particular kind of building on the first two sets only, and a whole of
buildings on the first set only.
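The nesting just described can be made concrete in a small sketch. This is our reading of the analogy, not the authors' formalism; the entity names are invented examples of the four kinds of element listed above.

```python
# Sketch of the four nested definition sets described in the text:
# components ⊆ die ⊆ typological ⊆ global, mirroring n ∈ N ⊆ J ⊆ K ⊆ R.
# Entity names are invented examples, not taken from the paper.

SETS = ["global", "typological", "die", "components"]  # largest to smallest

# Each entity is defined on the smallest set it belongs to; by nesting, it
# then belongs to every larger set as well.
SMALLEST = {
    "window":         "components",   # undivided component: all four sets
    "timber frame":   "die",          # element of structure: all but components
    "house":          "typological",  # particular kind of building: first two sets
    "housing estate": "global",       # whole of buildings: first set only
}

def definition_sets(entity):
    """Return every set on which the entity is defined, largest first."""
    smallest = SMALLEST[entity]
    return SETS[:SETS.index(smallest) + 1]

print(definition_sets("window"))         # ['global', 'typological', 'die', 'components']
print(definition_sets("housing estate")) # ['global']
```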
3.4 A conceptual model of the building product
By associating the syntactic framework with the semantic one, it becomes possible to
characterize perfectly the production of the building system: a building product.
According to the conceptual model obtained, the building system will be able to
produce, with the same means and referential tools, a whole of buildings at the global
level, specific buildings at the typological level, structures of buildings at the die level,
and even components of buildings at the last level, with the same objectivity and the same
architectural vision defined in our referential framework. Each level will
indeed use a methodological, conceptual and normative referential, as shown in Figure 3.
A house will be defined in this model as a set of products developed in different
trades (electrical components, air-conditioning, floor covering, etc.) taking on an architectural
semantics conceived according to a referential framework (methodological, conceptual
and normative). It will also be defined on a larger set relative to a constructive die:
wood, metal, etc., but always with a semantics specific to this level defined by the
referential framework. At the third level the house is defined as a building distinguished
by its function, extremely different from other types of building. Finally, it is defined in
a set which contains all the preceding ones and allows a distinction to be made with respect to
the physical and social environment of the house.
It is also possible to define in this diagram a simple component of a building, initially
as a unified product, then as belonging to a constructive die, then to a type of building,
and finally as a product influencing the external environment of the building product.
The informal model obtained allows a building, or just a component of a
building, to be produced according to a semantics built around the referential framework and a syntax
structured by sets of appurtenance. The next paragraph presents a formal representation of
this model adapted to data-processing implementation.
3.5 Formal model of building product and UML capacities
UML (Unified Modeling Language) is a means of expressing object models while
disregarding their implementation; this means that a model provided in UML is valid
for any programming language. UML is a language relying on a meta-model, a
higher-level model which defines the elements of UML (usable concepts) and their semantics
(their significance and modes of use) (Fowler 2002).
As a first result, Figure 4 is a synthesis of the paper presented at the ITC@EDU
workshop and this paper. It consists of a model representing the building product classes
and the building system, composed of the actor classes, the task classes assigned to the
actors and the tool classes used for them. It also includes the representation of the
building system states, its output and the referentials used in the design state
(Oumeziane 2004).
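The class structure just described can be transcribed roughly as follows. This is our reading of the description, not the authors' actual UML diagram; the attribute names and the sample instance are invented for illustration.

```python
# Rough Python transcription (our reading, not the authors' UML) of the class
# structure described: actors with assigned tasks and tools inside a building
# system whose design state uses three referentials and produces the product.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Tool:
    name: str

@dataclass
class Task:
    name: str
    tools: List[Tool] = field(default_factory=list)

@dataclass
class Actor:
    role: str
    tasks: List[Task] = field(default_factory=list)

@dataclass
class Referential:
    kind: str  # "methodological", "conceptual" or "normative"

@dataclass
class BuildingProduct:
    level: str  # "global", "typological", "die" or "components"

@dataclass
class BuildingSystem:
    actors: List[Actor]
    referentials: List[Referential]
    output: BuildingProduct

system = BuildingSystem(
    actors=[Actor("architect", [Task("schematic design", [Tool("CAD")])])],
    referentials=[Referential(k) for k in
                  ("methodological", "conceptual", "normative")],
    output=BuildingProduct("typological"),
)
print(system.output.level)  # typological
```

Because UML abstracts from implementation, the same diagram could equally be realized in any object-oriented language.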

Figure 3. A building product


Figure 4. A formal model of the building product.



Setting up a framework of interoperability in the building sector should initially pass through
a conceptual level of modelling. The various current models which serve as a basis for
collaborative design deal with only a part of the reality of the building product. These
models reduce the building product to an assembly of physical objects. This paper has
proposed a more general and more complete product definition, based on a
systemic approach and formalized in a multi-referential language, UML.
This work constitutes the beginning of a more complex model of the building system,
including the roles of actors, the design methods, etc. We plan to present in
our future publications the details of the referential framework proposed (conceptual,
normative and methodological) in our building product definition.
REFERENCES
AFITEP, 2000. Le management de projet, principes et pratique. Paris: AFNOR.
Ameziane, 1998. Structuration et représentation d'information dans un contexte coopératif de
production du bâtiment. PhD thesis, Ecole d'Architecture de Luminy.
Ameziane, 2001. Building Production Management systems in a cooperative work. IEPM 2001,
Quebec City, Canada.
Bignon et al., 1998. Evolution de la maîtrise d'oeuvre, pratiques coopératives et informatique
répartie. Conférence Mieux Produire Ensemble. Plan Construction et Architecture. Nancy.
Billon, 1999. Comprendre les concepts des IFC. Décrire son projet en vue des échanges. Paris:
cahiers du CSTB.
dA, 2000. La loi MOP mode d'emploi. D'architecture. Paris: SEA éditions.
Fowler, 2002. UML. Paris: Campus Press Poche.
Giambiasi, 2001. Dynamique des systèmes. Course of Master in MCAO. Marseille.
Johnson, 1970. Théorie, conception et gestion des systèmes. Paris: Dunod.
Le Moigne, 1977. Théorie du système général. Paris: PUF.
Martin, 2001. Process design modelling through methodological integration. PhD thesis. Paris:
Ecole Centrale Paris.
Oumeziane, 2004. An adapted software environment for building design and manufacturing.
IDMME 2004. Bath.
Oumeziane, 2004. Systemic approach for building design. ITC@EDU conference. Istanbul.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Implementation of the ICT in the Slovenian AEC sector
T.Pazlar, M.Dolenc & J.Duhovnik
University of Ljubljana, Ljubljana, Slovenia
ABSTRACT: The AEC (Architecture, Engineering, and Construction)
sector is proverbially marked as traditional and not progressively oriented
towards the implementation of new technology. In order to determine the
accuracy of this prejudice and evaluate the sector's ICT
(Information and Communication Technologies) implementation, a
unified European state-of-the-art analysis has been made. A deliverable of
prodAEC, a two-year pan-European programme, has been used
for gathering the relevant information. This paper presents a
comprehensive ICT implementation analysis of the Slovenian AEC sector,
with the results structured into different sub-areas.

People involved in construction informatics share a common objective, irrespective of
the terms used to describe their work: supporting all participants in the
construction industry in their individual tasks and, even more, in their synergistic
collaboration, striving towards optimal design, construction, operation,
maintenance and removal in less time and at less cost. The knowledge contributed is
presented in the form of product and process models, tools and representations serving
all participants (teams, individuals) in the construction process. All general estimations
regarding the debated issues emphasize the huge difference between the
academic/research sphere and practice, quite matching the mentioned general public prejudice.
Before making any accuracy estimation or prediction, a state-of-the-art analysis is
required. Several investigations referring to ICT implementation in the AEC sector
have been carried out (IT Barometer, SIENE Network and others), but most of them were limited in
time and geographically. Consequently, the idea arose of overcoming the described
restraints with a new international joint research effort.
2.1 Program objectives
The Fifth European Framework Programme sets out priorities for the technical development
and implementation of user-friendly information technologies. These priorities have
been brought together in the IST programme (Information Society Technologies) on the basis of a
set of common criteria reflecting the major concerns of increasing industrial
competitiveness and the quality of life for European citizens in a global information
society. The main objective of prodAEC, a two-year pan-European IST programme, is to set up and
sustain a thematic network for product and project data exchange, e-work and e-business in
architecture, engineering and construction.
The programme's main motivations and goals are:
to become the primary source of information for standards on data exchange, e-work
and e-business in the AEC/FM (Architecture Engineering Construction/Facilities
Management) sector for Europe;

Table 1. ProdAEC founding members (Taylor Woodrow among them; * project coordinator).

to increase SME (Small and Medium Enterprises) competitiveness through the adoption
and implementation of standards, thus positioning SMEs at the same level of
opportunity as large companies;
to encourage progressive harmonization of overlapping standards;
to support and bring together national, local and industrial initiatives promoting the
development and use of standards in the AEC/FM sector;
to provide an extensive process-based overview of project modeling standards;
to stimulate technology and know-how transfer from knowledge centers to the industry;
to improve both working methods and people's qualifications by using standards.



2.2 Program products and services

The range of potential users of the comprehensive prodAEC network includes industrial
associations, software vendors, the research community, standardization bodies, public
administration and the AEC industry in general.
Project deliverables are gathered into four products and services:
1. Web AEC IT Project Database
A national and European database of IT-related projects in the AEC/FM sector,
incorporating a sensible search and filter query engine. Details on scope,
scheduling, financing, description, objectives, results, areas and partnership
are available for each project.
2. Standard-to-Process Matrix
The service incorporates a reference framework for the whole construction process, with
the identification of actors and their associated roles. ProdAEC extended the
process matrix to incorporate the concepts of modelling, data exchange and e-business standards.
3. Informative on AEC e-business
An online service providing e-business information in terms of e-marketplaces,
software tools and public e-services, and presenting general figures about the
debated topics.
4. Benchmarking Service
The need for an online benchmarking service, designed to measure the ICT exploitation and
awareness level in the European AEC sector, has been clearly identified in the
industrial requirements initiatives of previous projects.


3.1 Benchmarking concepts
Benchmarking in general is a process of identifying, learning from and adapting
outstanding practices and processes from any organization anywhere in the world, in order
to help an organization improve its overall performance. The aims of the discussed service are
more specific:
it concerns people active in or linked to the European AEC sector (industry and
government agencies, sector associations, research and consulting organizations and
software vendors);
to compare one's relative position in ICT use and awareness with companies of the same
profile/segment;
to probe the views of different respondent groups;
to ease the definition of an appropriate investment in knowledge acquisition and technology
evolution;
to identify gaps in the ICT awareness level and implementation at the industrial level;
to keep track of future requirements and sector evolution.



The Benchmarking Service is available online, completely free of charge. All categories of
enterprises in the AEC sector are covered by the unified questionnaire, and everyone
involved in the sector is welcome to participate in the enquiry. Its dynamic nature makes the
service unique. After filling in the basic identity questionnaire, users receive a
username and password via email. Immediate automatic feedback is generated in the
form of personalized reports after filling in the questionnaire. To ensure report
coherency, only companies with a similar profile are used in the report generation. The
dynamic system allows participants to revisit and check the results anytime and
anywhere. Each participant has to fill in the questionnaire periodically, ensuring
automatic database updates. Therefore, the company's evolution can also be monitored. The
benefits of the benchmarking service are mutual: answering the questionnaire is
rewarded with personalized reports in which competitors' data are offered without
any special effort to collect them.
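The peer-group report generation described above can be sketched roughly as follows. This is a hypothetical illustration of the principle, not prodAEC's actual implementation; the field names and figures are invented.

```python
# Hypothetical sketch of the personalized-report principle described above:
# only companies with a similar profile contribute to a participant's report.
# Field names and figures are invented; this is not prodAEC's implementation.
from statistics import mean

responses = [
    {"company": "A", "profile": "contractor-SME", "edms_use": 0.40},
    {"company": "B", "profile": "contractor-SME", "edms_use": 0.70},
    {"company": "C", "profile": "architect-SME",  "edms_use": 0.90},
]

def personalized_report(company, data):
    """Compare a participant's answer with the average of its peer group."""
    me = next(r for r in data if r["company"] == company)
    peers = [r for r in data if r["profile"] == me["profile"]]
    return {"you": me["edms_use"],
            "peer_average": mean(r["edms_use"] for r in peers),
            "peers_counted": len(peers)}

print(personalized_report("A", responses))
```

Company C is excluded from A's report because its profile differs, which is the coherency rule the text describes.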
Industry participants can use the reports to define an appropriate investment in
knowledge acquisition and technology evolution within the company. Government
agencies, sector associations, and research and consulting organizations can obtain even more:
identify gaps in the ICT awareness level and implementation in industry;
get updated knowledge about the actual awareness level and status of ICT in the AEC sector;
identify the training needs;
collect data for designing future public innovation policies;
get valuable data influencing software product development strategies per country
and/or per sector.
The service requires no special maintenance, since its operation is completely automated.
Maintenance with additional costs would appear only when modifying or preparing new
questionnaires. If the users find the benchmarking service useful, they will check the
results regularly and therefore provide the requisite answers for the database. Although
prodAEC is a two-year program, the zero operation costs could keep the network and the
developed tools running in the future.
3.2 The questionnaire
An inquiry-based methodology covering ICT, drafting, modeling and e-commerce is
used in the benchmarking. Semantically, the questionnaire is divided into four parts:
A: Company information (company profile, size and turnover, participant's individual
data, involvement, etc.).
B: Use of the computerization, modeling, e-commerce, EDMS (electronic data
management system).
C: Classification, exchange standards, reference libraries.
D: Training, organizational and human issues.
The questionnaire is available in English, Dutch, French, Portuguese, Czech, Slovenian,
Spanish, Italian and German. The enquiry generally takes 20 minutes to complete.

eWork and eBusiness in architecture, engineering and construction


3.3 The benchmarking service promotion

In order to assure an appropriate start-up enquiry database, intensive project/service
demonstration has been required. The University of Ljubljana, for example, put a lot of
effort into the benchmarking service presentation. Various approaches were used:
a scientific paper about the prodAEC project was published in the Slovenian Civil
Engineering Journal;
personal email invitations were sent to all members of the Civil Engineering
Chamber and to all employed in academia and research institutions;
prodAEC was presented at the Slovenian Civil Engineering Cluster workshop.
The intensive benchmarking promotion helped us to establish the most comprehensive,
though not quite satisfying, database, as presented in Chapter 5.
More than 26 million workers are involved in the European AEC sector: 28.1% of
industrial employment is dispersed among 2.4 million enterprises. Over 97% of them are
SMEs (small or medium enterprises) with fewer than 20 employees, and 93% have fewer
than 10 workers. Short-term (generally just single-project) collaboration increases the
fragmentation of the AEC sector. Consequently, even the biggest European construction
enterprises cannot take over the market leadership role, since their share does not exceed 5%.

Figure 1. Persons in paid employment in the European Union (construction sector only).



Figure 2. Persons in paid employment in Slovenia (construction sector only).
The Slovenian AEC sector does not differ much from the European one. The enterprise
percentage is a bit lower (25.8%), but the sector still remains the biggest industrial
employer, with more than 140,000 people involved. The sector is even more fragmented
than the European one: 95% of SMEs employ fewer than 10 people and 98.4% of them
fewer than 25. The average enterprise has only 3.4 workers employed (including
the owners).
Further, Table 2 and Table 3 present some relevant statistical data referring to the
Slovenian AEC sector.
The number of persons in paid employment increased by only 22% between 1993 and
2000, but the value of construction put in place increased enormously (a 4.2 factor). The
Table 2. Review of the Slovenian construction industry development. (Table values not
recoverable from the source; the columns covered the number of enterprises and other
organizations* and the value of construction put in place** in million euros.)
* Number of enterprises and other organizations.
** Value of construction put in place.



Table 3. Nominal indices of the value of construction put in place, March 2004. (Table
values not recoverable from the source; the indices compared III 2004 with II 2004,
III 2003 and III 2000.)

National Motorway Construction Program, which started at the beginning of the nineties,
had an important influence on this increase. The indices of the value of construction put
in place show a promising prognosis.
5.1 General information
The inquiry results are based on the 43 responses collected by the benchmarking
service until April 2004. Although the University of Ljubljana put a lot of effort into the
Benchmarking service promotion, the response fell below all expectations. Since we do
not know the exact number of people familiar with the inquiry, it is impossible to
estimate the percentage of responses. Generally, the response in comparable surveys,
when potential participants are informed by e-mail, is very low: 10-15% in IT Barometer
(Samuelson, 2002). Evidently, it is hard to persuade the target e-mail recipients of the
mutual benefits of such a service. All observations and conclusions presented in this
paper as results of the prodAEC Benchmarking service are therefore based on a limited
population.
An equal participant role is presumed in the results analysis; no answer weighting is
necessary. The larger number of participants foreseen from the big companies will
compensate for the large number of those employed in the small enterprises.
5.2 Part A: Company and user profile
The first part of the questionnaire contains the questions essential for creating an image
of the participants and, furthermore, for filtering the inquiry results.
As expected, most of the answers (54%) came from design offices. Presumably
they were the best informed about the Benchmarking service. The low ICT
implementation in the AEC sector usually refers to construction; the participant
percentage (10%) probably confirms this estimation. The share marked as "other"
presents the answers from research/academic institutions. No feedback (not a single
answer) from the public administration was a considerable disappointment.
Taking into consideration the fragmentation of the AEC sector, we can expect
the high percentage (almost 50%) of enquiry participants employed in small
enterprises (<=10 employed). The 25% share of medium-big companies (51-250
people employed) justified our assumption that answer weighting is unnecessary.
Since there are only two construction companies in Slovenia with more than 500 people
employed, their participation share (3%) is also as anticipated.



The answers about the annual turnover can be linked to the answers about
enterprise size. More than 50% of the enterprises have a turnover of less than 0.5 mio
euros. If we set the limit at 5 mio euros, more than 90% of the survey participants are
captured.
Figure 4 presents the enquiry participants' roles in the enterprise. Since participants
from the smaller enterprises are commonly involved in more than one area of interest,
they have picked more than one answer and automatically weighted the area
importance. Generally, the collected responses represent mostly the architects and
engineers involved in design, construction and project development, and partly in other
areas.
The first-paragraph assumption about the inquiry participants' role in the AEC sector
is confirmed by the answers about work location: more than 80% of participants usually
undertake their work in the main or area office rather than on-site.
5.3 Part B: Technological infrastructure
The first step in evaluating the use of computerization, modeling, e-commerce and
EDMS is the implementation of the technological infrastructure.
Internet and e-mail are the most accepted modern ICT in the AEC sector. Every
participant has access to both technologies, but only two thirds use them. Similar
results are valid for CAD system usage too.
It is difficult to understand why 32% of the people who own digital photography
equipment do not use it. We are certain that the usage of this technology will increase
rapidly in the next year or two.
Telephone and video conferences, with results of 15% and 12%, are not so widely
disseminated. The usage percentage is much lower than the ownership in both cases,
indicating that users don't find these two technologies very useful. Presumably,
the main cause can be found in the technical difficulties usually present in conference
usage (like low bandwidth, etc.).

Figure 3. Company profile.



Figure 4. Areas of involvement.

Further, we investigated the area and level of computerization. Three answers were
offered referring to the usage: low (<20%), partly (20-60%) and high (>60%).
Bookkeeping and invoicing present the areas with the highest estimated use of
computerization: more than 45% indicated their usage as high. Bills of quantities, costs,
budgeting and project management closely follow with 30%. The computerization level
in building elements, civil engineering elements, building services (HVAC) and
construction reaches up to 25%. Consequently, regarding their answers about
computerization usage, the enquiry participants can be divided into two groups: users
with a high level of usage (one quarter) and users with a low level (three quarters).
Other areas (building elements, environmental impact, estimating, facilities management,
health & safety, project development) are ranked very low (below 10% high usage). The
partial usage of computerization (20-60%) in all areas except planning & scheduling,
product catalogues & details and project management is also below 10%. A simple
conclusion can be drawn: if the users have a technology, they use it.
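The three-band scale used in the analysis above can be sketched as a small classification function. The thresholds (low <20%, partly 20-60%, high >60%) come from the text; the treatment of exactly 60% and all names are our own assumptions.

```python
# Sketch of the three-band usage classification used in the enquiry.
# Values of exactly 60% are placed in "partly" here; the source text
# does not say how the boundary was handled.

def usage_band(percent):
    if percent < 20:
        return "low"
    if percent <= 60:
        return "partly"
    return "high"

# Illustrative (invented) per-area answers from one respondent:
answers = {"bookkeeping": 75, "project management": 45, "estimating": 8}
bands = {area: usage_band(p) for area, p in answers.items()}
```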
The most common software tools used in all areas are office software. Special
software tools are used mostly in bookkeeping, invoicing, planning & scheduling,
project management, costs, budgeting and bills of quantities. 2D and 3D CAD applications
are common only in building elements & services, civil engineering elements and in
construction. A low data exchange ratio can be concluded. 2D CAD systems are still the
preferred choice for almost two thirds of actual users.
One of the enquiry intentions is also an estimation of ICT developments in the near
future. The most concerning fact obtained from Figure 5 is the lack of specific plans
referring to further e-business application usage. Market pressure should be
established in order to ensure the implementation of new technologies; the current
state-of-the-art showed no such pressure (35%). Only the customers (30%) are forcing
participants to adopt e-business solutions. Surprisingly, almost no pressure comes from
the public authorities.
The opposite proposal, to move from traditional to e-business methods, can put in
danger 10% of established partnerships.
One third of the participants identified lower costs, time and errors (the most important
benefit), process simplicity and fewer errors as the main benefits of the e-technologies
in the account/finance area.



Figure 5. Technological infrastructure.

Figure 6. Plans for the ICT


Figure 7. Level of awareness.

E-technologies in procurement, commerce and enterprise resource planning do not
currently present such benefits. This estimation can be questioned, since the percentage
of application use is low and therefore cannot reflect the real benefits. The
participants also do not consider e-technologies an important factor in gaining
new customers and in increasing loyalty.
The level of awareness of modern technologies in the AEC sector is insufficient.
None of the answers



Figure 8. Awareness and usage of drafting and data modeling standards.

Figure 9. Awareness and usage of digital data capture, logistics and automation technologies.

Figure 10. Training methods.

(EDI and private networks, XML, electronic commerce, mobile devices and remote
connections to software tools, electronic marketplaces, specific standards for AEC,
electronic services of private companies for AEC, electronic signature and digital
certificates) reaches more than 50%. The awareness percentage of the AEC standards is
extremely low. The use of the described technologies is also poor (below 10%). The only
exceptions are digital certificates and electronic signatures, with 20% usage. These two
technologies will assure the necessary security for other applications (electronic
commerce, etc.). Solving the security



Figure 11. ICT effect on enterprise operations and job roles.
problem will increase not just the level of awareness but also the level of usage in the
discussed areas.
The cost of software solutions presents the main obstacle (22%) to adopting
e-business in the enterprises. The second group of barriers includes dependence on
proprietary solutions, expensive external know-how, lack of available know-how inside
the company and poor adequacy of solutions to the industry needs, all ranked at
10% each. The lack of trust in the available technology and the workers'/partners'
resistance (max. 5% each) do not present important obstacles.
The overwhelming share of participants (more than 80%) does not have any plans
regarding EDMS (Electronic Data Management Systems). Where used, systems are
usually accessed via LAN or ADSL.
5.4 Part C: Standards
Most of the classification systems stated in the questionnaire are completely unknown to
the Slovenian AEC sector. No one uses them and no one plans to use them. The
participants are acquainted only with the existence of Building 90, the Landscape Filling
Index, CAWS, the European Waste Catalogue and the National Green Specification. The
awareness in each case does not exceed 5%.
The reference libraries implementation analysis presents similar results. Only a few of
the stated libraries (AEC/Bricsnet, Architects Standard Catalogue, BertelsmannSpringer,
COIB, Emap, Fraunhofer Informationszentrum Raum und Bau) are known to the enquiry
participants, but no one uses them.
The awareness and usage of the drafting and data modeling standards (Figure 8)
reflects the state-of-the-art in CAD systems dissemination. AutoDesk software is the
most widely used, and the figures for the .dwg and .dxf formats were expected. The
GDL and IGES standards are also known. The surprisingly low awareness of the IFC and
Cimsteel standards is probably connected with the predominant 2D modeling.
A general lack of awareness can be perceived in the electronic trading technologies,
including digital data capture, logistics and automation technologies. Only
electronic data interchange (EDI), hand-held digital data capture and XML are used.
Barcodes, transducers and the already stated technologies present the area of interest.



5.5 Part D: Social, educational and organizational aspects

Regardless of the various educational methods offered by modern ICT (video,
internet), traditional compulsory and formal courses are still the most common way of
education for all employees.
The knowledge transfer inside companies can also be comprehended from the answers,
since the enterprises stress compulsory education for the managers and professionals.
According to the participants' estimation, advanced ICT will have a significant
effect on jobs and skills, contractual relationships and the way companies work.
A surprisingly big percentage of participants is convinced that ICT cannot reduce the
number of employees. This percentage will probably be lowered in the near future with
the implementation of data exchange standards.
Although the estimations gathered in Figure 11 do not lean on statistical data,
they still present a valuable projection of the AEC sector in the future.
Unfortunately, the other prodAEC partners had not started with the Benchmarking
service presentation in their countries by mid May 2004. No data from the debated
service was available and therefore no comparison could be made.
The statistics provided by the prodAEC Benchmarking service can be marked as credible
only if an adequate number of people involved in the AEC sector fill in the questionnaire.
If you see yourself as a potential Benchmarking service user, please visit the prodAEC
web page (http://www.prodaec.com/) and participate in the enquiry.
The prodAEC Benchmarking service presents a pioneering attempt to set a unified method
of measuring ICT implementation across the whole European AEC sector. Although the
enquiry concept and realization can be marked as exemplary, more intensive promotion
should be made. Its concept of automatically collecting answers on a mutual-benefits
basis alone is presumably too contemporary for a prosperous enquiry accomplishment.
Minor changes should be applied to the questionnaire, according to the participants'
comments about too long and too complex questions.
The enquiry revealed a lack of awareness of advanced ICT in the Slovenian AEC
sector. The current AEC sector state-of-the-art analysis showed the need for immediate
improvements, but we cannot mark the described situation as an obstacle to creativity
in the sector concerned.
The answers indicate market pressure as an efficient inducement for implementing
new technologies in practice. One of the most concerning analysis results is the lack



of specific plans for the enterprises' future ICT investments. New technologies should
focus on improving productivity (less time, fewer costs and errors, and process simplicity)
and be implemented through user-friendly information and training.
Furthermore, the academic and research institutions must stay in contact with current
achievements in ICT and also with the practice requirements in order to ensure the sector
evolution.
Fenves, S.J. 1996. Information technologies in construction: A personal journey. In Žiga Turk
(ed.), Construction on the information highway. CIB proceedings publication 198, Bled, 10-12
June 1996. Ljubljana: University of Ljubljana, Faculty of Civil and Geodetic Engineering.
Rivard, H. 2000. A survey on the impact of information technology in the Canadian architecture,
engineering and construction industry. Electronic Journal of Information Technologies in
Construction 5: 37-65.
Samuelson, O. 2002. IT barometer 2000: The use of IT in the Nordic construction industry.
Electronic Journal of Information Technologies in Construction 7: 1-26.
URL: http://www.prodaec.com/ [1.6.2004].

eWork and eBusiness in Architecture, Engineering and ConstructionDikba & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Adding sense to building modelling for code

certification and advanced simulation
I.A.Santos & F.Farinha
Algarve University, Portugal
F.Hernandez-Rodriguez & G.Bravo-Aranda
Seville University, Spain
ABSTRACT: IFC-based models seem very promising for automated code
checking because they are expected to include the heuristics needed by
several applications during the building lifecycle, thus including the
capability to support the definition and checking of a large set of
requirements together with those contained in building codes. However,
the theoretical and practical aspects of IFC-based modelling, as well as
the kind of code checking assistance desired, still result in the
necessity of substantial support from cognition-based techniques in order to
evaluate the presence of sense in each building description, especially
when the certification of a building plan is the goal. This paper discusses
these issues following the formerly developed concept of a Normative
Product Model. Three main topics are discussed: (1) trends and
limitations of IFC with respect to building simulation; (2) a contribution to
the essential notion of building sense, including the identification of its
admissible sources; and (3) the relevance of building sense for achieving
the self-completeness of a particular IFC-based building description
starting from an initial low-level user input.

As computers increasingly stimulate automation and greater efforts are dedicated to
developing building models with enhanced representation capabilities, the question of
automated code certification is certainly a challenge. To achieve this goal, the building
model must include much more data than it traditionally does, and it shall no longer be
dependent on human interpretation. In fact, the building model must include knowledge
and become a true building simulation, with complete, accurate and integrated references
to all aspects that may be relevant for code checking. Since this is certainly beyond
the usual approach of sensitive visual modelling, it may be referred to as advanced
simulation.
Besides, for the technology that can make this possible, each clause from a code that
can be automatically checked becomes essentially a design requirement, for which the
building model provides the necessary information resources and operating instructions.
So, the same building simulation technology can be used for the checking of other sets of


design requirements, whether they originate from a particular building program or relate
to some specific analysis.
This paper follows the concept of the Normative Product Model (Santos et al. 2002,
Santos 2003) described in Figure 1, accepting that the IFC standard (IAI 2003) is a basic
component of such an advanced building simulation, but it focuses on the necessity of
developing formal definitions of building sense as an ultimate component, not only to
allow for the automated checking of design requirements but also to enhance the
completeness and accuracy of a building model, and further to support the design process.
Besides, this paper highlights the role of building codes as the main source for
building sense, among others.
The contribution of IFC to advanced simulation is discussed first, followed by the
characterization of distinct levels and contexts related to the automated checking of
building design requirements. Then the question of building sense is introduced, its
relevance for code certification and model completeness is described and its foundations
are discussed. Finally, some conclusions are emphasized.
In the attempts to enhance building modelling, the main goal is not exactly about
simulation but simply to

Figure 1. Fundaments of the Normative Product Model concept.



allow everyone in the world to share a single and common view of a building, at
least to a certain extent and quality that may be considered convincing by the industry's
judgment. This view is expected to: (1) become a standard; (2) cover the entire building
lifecycle; (3) be independent of particular requirements; and (4) bring no limitations to
the design solutions, even at the early phases of the design process.
However, enhancing building modelling starting from today's practice sooner or later
means facing the real world, where things that are put together usually interrelate and
even interact. So, an enhanced building model technology must also support a great
number of interrelations and dependencies between building elements, which may
prove a difficult challenge, not only for software implementation, because of database
management problems, but also for the users, who become responsible for the introduction
and assessment of building model data much more than at present (Bazjanac 2002).
The evolution of the IAI design for the IFC standard seems more and more
successful regarding the referred expectations, but it shows some conceptual limits in
relation to building simulation, which concern the quality of the modelling more than
its extension capabilities (probably as a result of lessons taken from the difficulties
of previous attempts at standardization or implementation issues).
The most important of the IFC conceptual limits from this particular perspective is its
conception as a building descriptive language, but the flexibility of interrelation
declarations and the possibility of partial implementations in software applications
also become relevant.
2.1 Conceived as a language
The IFC standard is strictly conceived as a language to describe a building during its
lifecycle: a building metamodel, as referred to by Santos (2003), meaning that it defines the
abstractions considered necessary to develop specific building models.
It includes a lexicon of generic designations, which starts from the most abstract level,
like "object" or "relation", goes through a medium abstraction level, like "building element" or
"space", and ends with the least abstract ones, like "window", "opening area" or "wall".
It also includes a grammar that identifies the allowed interrelations between the
concepts of those designations. For example, it determines that a "window" has "opening
area" as an attribute and can be associated to a "wall", meaning that the wall has
openings (by the fills/voids interrelation definition).
However, IFC does not include any definitions of sense outside the metamodel
context itself. It does not control, for example, whether the "opening area" of a
"window" is related, by a certain formula, to the "space area" of an associated "interior
space" (assuming that "space area" and "interior space" are, respectively, an attribute and
a type of "space").
So, it is possible to use IFC modelling resources to make an internally consistent
representation of a nonsense building, almost the same way as one can use the English
language with completely correct syntax while not making sense, as in: "a dog without
legs runs backwards".
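The distinction between schema-level consistency and sense can be sketched as follows. This is a loose illustration of the paper's argument, not IFC code: the attribute names and the daylight-style formula are invented, standing in for the window/space example given above.

```python
# Sketch: a model fragment can pass a schema (language-level) check and
# still fail a separate sense rule. The schema below only validates
# attribute names and types, like a metamodel; the sense rule is a
# domain-level constraint the metamodel itself does not enforce.

WINDOW_SCHEMA = {"opening_area": float}

def schema_valid(window):
    """Language-level check: required attributes present, right types."""
    return all(isinstance(window.get(k), t) for k, t in WINDOW_SCHEMA.items())

def sense_rule_daylight(window, space):
    """Invented sense rule: opening area at least 10% of the associated
    interior space area (a stand-in for a real code formula)."""
    return window["opening_area"] >= 0.10 * space["space_area"]

window = {"opening_area": 0.01}   # a 100 cm2 "window": syntactically fine
space = {"space_area": 30.0}      # a 30 m2 interior space

ok_schema = schema_valid(window)               # passes the language check
ok_sense = sense_rule_daylight(window, space)  # fails the sense check
```

The "dog without legs" of this sketch is the tiny window: the schema accepts it, and only the separate sense layer can reject it.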
Of course, since IFC is particularly dedicated to the building industry, the possibilities
of incorporating some sense already at the grammar level are bigger than with a common-use
language, mainly by means of the semantics embodied in attribute and specific
interrelation definitions. But, especially from an enhanced building simulation
perspective, modelling just at the language level appears as an initial restriction to
knowledge representation, for which something must be done.
2.2 Interrelation capabilities
IFC defines interrelations between building elements (other than attributes) as model
objects that mediate between other objects, and this is one of its most important
features, because it not only solves the implementation problem of many-to-many
relations, but also becomes the basis for structuring and pre-defining the semantics of
interrelations in a way that is meaningful for building modelling.
However, the use of interrelation capabilities within IFC is fully optional, so one may
use the standard simply for a listing of individual building elements (whether physical or
conceptual), without explicitly acknowledging the multiple relations between
those elements.
For example, it is possible to have walls surrounding a space without declaring them
as providing the boundary for the space (nor the space as bounded by the walls).
Also, a window can be placed into a wall without explicitly declaring it as voiding the wall.
Certainly these issues can be largely solved by a good IFC implementation, where
intelligent software tools can greatly improve its usage. But, even then, it may be difficult
to keep the model data accurately updated concerning the interrelations when successive
changes are made. Ultimately, the result can be a tendency towards a graphical-only
representation of building elements, with quite low simulation power.
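One way an intelligent tool could mitigate the optional-interrelation problem is to flag elements whose placement implies a relation that was never declared. The sketch below only loosely mimics IFC's objectified relations (in the spirit of IfcRelFillsElement/IfcRelVoidsElement); the data layout and names are our own assumptions, not the IFC schema.

```python
# Sketch: flag windows hosted in a wall for which no fills/voids
# relationship object was declared. Relations are stored as separate
# objects (tuples), loosely echoing IFC's objectified interrelations.

elements = [
    {"id": "wall-1", "type": "wall"},
    {"id": "win-1", "type": "window", "host": "wall-1"},
    {"id": "win-2", "type": "window", "host": "wall-1"},  # placed, never related
]
# Objectified interrelations: (kind, relating element, related element)
relations = [("fills", "win-1", "wall-1")]

def undeclared_fills(elements, relations):
    """Windows whose host wall has no explicit 'fills' relation."""
    declared = {(relating, related) for kind, relating, related in relations}
    return [e["id"] for e in elements
            if e["type"] == "window"
            and (e["id"], e["host"]) not in declared]

missing = undeclared_fills(elements, relations)
```

A checker of this kind would run after each model change, which addresses exactly the update problem described above: the explicit relation objects, not the graphics, are what carry simulation power.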
2.3 Partial implementations
The IFC certification process, as determined by the IAI, allows for partial implementations
under the notion of model views (which are similar to conformance classes within STEP).
It is expected that these model views do not reduce interoperability, since they
originate from applications within the same specific domain that share a common set of
information requirements, far from the extensiveness of the entire model. However, these
model views can also be used to hide the limitations of present software applications
and even generalize a low-level usage of the model, which clearly turns out to be negative
regarding software interoperability and enhanced simulation.
Once again, the seriousness of the problem depends on the quality of an
implementation, at least in keeping untouched the part of the building model data that is
not compatible with a specific application. But this will not prevent either new data
produced by a certain user from ignoring potentially important interrelations with hidden
data, or the results from some relation-based analysis from being inconsistent.




Many of the requirements of building design come from applicable building and technical
codes. These represent the standardization of sets of requirements that are defined prior to
the existence of a specific building. Actually, the possibility of defining codes relies on
the fact that buildings tend to a certain constancy in their spatial aspects, construction
technologies or included technical systems. Based on this constancy of buildings, some
classification notions arise, and the codes that are applicable to certain categories become
generic requirements for the respective set of buildings, while particular requirements
result from the building program, local conditions and some other sources.
With the support of several modelling techniques, both sets of requirements can be
checked prior to construction, at least if they fit into the representation capabilities of the
building models. The regularity of checking procedures suggests that they can
progressively turn from human-driven to automatic.
The notion of automation here means the possibility of a process that demands some
control actions to be executed by a computer, by means of previously formalized
information and on-line collected data. For a requirements-checking automation system,
this means putting codes and other sources of building requirements into some sort of
computer-operable representation, probably after a conversion from original paper-based
documents or eventually as a former specification.
Considering the kind and extent of such a computer-operable representation, three
distinct levels of automation can be identified in the case of codes: code referencing,
code checking assistance and code checking certification. They are discussed next,
together with the question of particular requirements.
3.1 Code referencing
Code referencing occurs when the selection of a single or complex building element from
a building model, made by a user, automatically leads to the referencing, with possible
presentation in text or graphics, of all related clauses from a set of treated codes, so that
the user can easily get just the information he/she needs at that particular time. It is a
human-driven approach, because only the user knows the meaning of the information
contained in the building model as well as the meaning of the code documents.
To achieve a code referencing system, instead of any enhanced building simulation,
basically only a simple list of cross-references between the lexical content of the
building code and the lexical content of the building model is needed, which is perfectly
achievable even for geometry-based modelling through commonly used tags. However,
this kind of lexical definition still presents a few significant problems, related to multiple
classification, tree-depth evaluation, etc., which are similar to those of the classification
systems used by the construction industry for materials and services acquisition (like
CI/SfB or MasterFormat).



Once these cross-references are defined, an interactive system, possibly hypertext-based,
can be developed and distributed as a program directly integrated with design
tools, or as an Internet-based service.
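The cross-reference list at the heart of such a system can be sketched as a plain lookup table from model lexicon to code clauses. This is only an illustration of the idea: the clause identifiers and element terms below are invented, not drawn from any treated code.

```python
# Sketch of code referencing: lexical terms of the building model
# mapped to clauses of treated codes. The user selects an element; the
# system returns every related clause for presentation. Clause names
# are invented for illustration.

CROSS_REFS = {
    "window": ["Clause 2.1 (ventilation openings)", "Clause 4.3 (glazing)"],
    "stair": ["Clause 1.1 (rise and going)"],
}

def clauses_for(element_type):
    """Human-driven lookup: no checking, only referencing."""
    return CROSS_REFS.get(element_type, [])

hits = clauses_for("window")
none_found = clauses_for("door")
```

Note that this level needs no semantics beyond tags: the function never reads the element's properties, which is exactly why code referencing remains a human-driven approach.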
3.2 Code checking assistance
Code checking assistance occurs when the selection of a building element from a
building model automatically leads to the checking of its relevant properties, or those of
other associated elements, against the requirements of a set of applicable clauses
taken from the treated codes, with the objective of detecting non-conformity, and possibly
of presenting alternatives.
To achieve code checking assistance it is necessary to go much further in the
computer-operable representation of the code than for code referencing. The building
model needs to be meaningful, so that the checking system has enough knowledge about
the building model, the building code and the logical correspondence between both, and
can take consistent action.
However, since the building model is essentially descriptive, while the code content is
intended to be normative/declarative as well as descriptive (any code embodies a certain
view of a building), this correspondence is expected to cover only the information
requirements that concern the description of a building, i.e., it relies upon a possible
common building language, leaving the normative content to be represented by some
distinct rule-based modelling.
The Normative Product Model concept states that, for each clause of a treated code,
the information requirements are considered at a base layer as a data structure that
includes both lexical definitions and interrelations, while the normative content is
represented by upper layer logical rules. Then, the data structure becomes the source for
the correspondence with the metamodel that supports the building model, either by a
mapping or by integration.
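To make this layering concrete, a minimal sketch may help. Everything in it is invented for illustration and taken from no real code or standard: the Space class stands in for the base-layer data structure (lexical definitions and interrelations), and the rule function stands in for the upper-layer normative content of one clause.

```python
from dataclasses import dataclass

# Base layer: lexical definitions and interrelations (the data structure
# determined by a clause's information requirements).
@dataclass
class Space:
    name: str
    space_type: str   # e.g. "bedroom", "kitchen"
    area_m2: float

# Upper layer: the clause's normative content expressed as a logical rule.
# The 9.0 m2 threshold is a made-up example value, not a real code requirement.
def meets_minimum_area(space: Space, minimum_m2: float = 9.0) -> bool:
    return space.area_m2 >= minimum_m2

print(meets_minimum_area(Space("room 1", "bedroom", 10.5)))  # True
print(meets_minimum_area(Space("room 2", "bedroom", 6.0)))   # False
```

The point of the separation is that the base layer can be mapped to, or integrated into, the building metamodel, while the rules remain a distinct, replaceable layer on top.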
Typical problems of this kind of code checking systems are: (1) ad-hoc solutions for
the building model, when a true standardized modelling does not exist or does not
provide the appropriate semantics; (2) poor correspondence between the building model
and the content of the code, usually as a result of poor semantic capacity of the building
model (most frequent with geometry-based modelling); (3) non-homogeneous treatment
of the code content, when some of its information requirements cannot be satisfied by the
building model.
Yet, this is the level of automation reached by the majority of systems developed to
date. For example, the SEED project (SEED 1997), one of the most serious attempts
based on geometry-based building modelling, suffers from the mapping limitations
referred to above, while the ongoing CORENET project (Liebich et al. 2002) looks much
more ambitious in its exploration of the new possibilities of IFC for building
modelling. Some software applications for specific technical domains also include a few
capabilities of code checking assistance when the code information requirements are
close to their own database specifications, but these are rather limited and not suited as a
general approach.

eWork and eBusiness in architecture, engineering and construction


3.3 Code checking certification

Code certification is a kind of code checking that is performed at crucial moments of a
building's lifecycle, possibly within a formal process of certification carried out by some
authority or control agent. When it occurs during the design process, it is particularly
relevant as part of a strategy to prevent risk in building construction.
Code certification is the maximum extension of code checking and it places severe
demands on the accuracy and completeness of a building model, because the
entire content of the model is checked against all the applicable clauses of a
certain code.
Even when certification is carried out essentially by human intervention, the
building model is expected to be a clear representation of the intended building, meaning one
that shows all the relevant aspects of the building after a proper interpretation.
Depending on the internal structure of a code, it is still possible to consider only a set
of clauses, but only if the set is clearly identified and generally recognized as a
detachable part.
For a building model to be automatically checked for certification, it is expected to be
totally meaningful and definitive in relation to the code. Final decisions about all relevant
design solutions must have been made, must be clearly described in the model and must
be kept unchanged for the future (if the certification is not to be repeated).
In order to reach this level it is critical to ensure that the building model satisfies every
information requirement of a treated code. As this can easily conflict with standardization
trends, the building metamodel that supports the modelling must include flexible
mechanisms for the representation of information requirements that cannot be directly
supported by preset definitions: for example, specific space type identifications or
specific property sets concerning a single element, required by a certain code (as is
precisely the case of object type and external property definitions within IFC).
The Normative Product Model concept also states that, for code checking
certification, it is convenient to join the several distinct codes that apply together within
a specific time and territory, for a specific building category, beginning with those that
contribute the more general and structuring knowledge. Both layers of a Normative
Product Model (metamodel integration and rule-based declarations) must be documented
by some sort of official specification controlled by the same authorities that are
responsible for building codes, in order to have unique definitions within the same
modelling support technology.
Since an excellent correspondence is required, special attention must be given to the
relations between the data structure determined by the code and the global data structure
of the building metamodel that is to be used. Model-based integration is better than
simple linear mapping because of the complexity of the interdependencies among
individual information elements.
Code checking certification is, for now, still a dream, but it points out a main goal for
future developments, though it demands global solutions and will have an impact on the
very way that building codes and other requirements are specified, as well as on the
administrative processes of building design approval by official authorities.



Building modelling for code checking certification greatly concerns building

3.4 Particular requirements checking
By definition, particular building requirements cannot be generally defined, so it is
difficult to assure good modelling support for them. However, an enhanced building
simulation technology is certainly more likely to provide better support than merely
graphical modelling.
Among the many sources of particular building requirements are: (1) the
building design program; (2) the building design constraints; (3) technical analyses not
directly related to building codes; (4) building management; and (5) building
maintenance. Also, when some change in building configuration occurs, particular
requirements can be identified for an appropriate evaluation of the impact of that change.
Nowadays, when an authority receives a building plan and begins the process of checking
its requirements for certification, the very first notion that is really checked is whether it
looks like a building or not. Although, most probably, no code explicitly says that the
specifications of a building plan shall in fact become a building, or even includes an
operable definition of what a building is supposed to be, the truth is that thinking of code
certification in the case of a nonsensical building just sounds ludicrous.
Now the authority probably looks first for some signals that may show the evidence of
a building at a glance, and afterwards he/she uses further analysis to confirm or deny that
first impression.
There are several conceivable circumstances where a building specification plan
cannot readily be accepted as corresponding to a building, as when:
- It is clearly incomplete;
- It is not a single building (as expected);
- It does not refer to a foundation and site;
- It is out of scale with human proportions;
- There is no entrance;
- Floors have too steep an angle;
- There are undefined holes in outer walls;
- There is no roof;
- There are no fixed elements;
- Axes are changed;
- Everything looks mixed up;
- There are non-identifiable elements.
Furthermore, it is possible to think of certain severe non-conformities with defined
requirements as resulting in a nonsensical building too. For example: (1) the absence of the
water distribution system; (2) the existence of individual spaces without an entrance; or (3)
the existence of too many apartments that do not conform to space requirements.
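A few of the sanity checks above can be expressed as simple predicates over a building model. The sketch below is purely illustrative and not from the paper: the BuildingModel attributes and the violation messages are hypothetical simplifications of what a real (e.g. IFC-based) model would carry.

```python
from dataclasses import dataclass

# Toy stand-in for a building model, reduced to a few "global sense" facts.
@dataclass
class BuildingModel:
    has_site: bool = False
    has_roof: bool = False
    entrance_count: int = 0
    unidentified_elements: int = 0

def sense_violations(model: BuildingModel) -> list[str]:
    """Collect reasons why the model cannot pass for a realistic building."""
    violations = []
    if not model.has_site:
        violations.append("no foundation/site reference")
    if not model.has_roof:
        violations.append("no roof")
    if model.entrance_count == 0:
        violations.append("no entrance")
    if model.unidentified_elements > 0:
        violations.append("non-identifiable elements present")
    return violations

plan = BuildingModel(has_site=True, has_roof=True, entrance_count=0)
print(sense_violations(plan))  # ['no entrance']
```

Such checks would run before any clause-by-clause certification, precisely to filter out plans that make no global sense.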
For human-interpretation-based checking, this kind of consideration may sound
exotic, but for computer-based automated checking it is a matter of major importance if
the building model is expected to be realistic, not least because the entire checking procedure
becomes much less solemn and errors can arise more easily. So, there is an absolute need for
some formal definition of how a building must be, not only through a positive approach,
saying what a building is expected to be, but also through negative statements that can prevent
unexpected occurrences.
The previous considerations lead to the notion of the global sense of a building, from
which results a verdict about whether a building model can in fact refer to a realistic building
or not. This global sense notion can take some building classification into consideration,
resulting in selective references, like: "it is a building of type X, except for this and that";
or "it is a nonsense building in regard to type X".
The question of sense is always associated with code checking because, by definition,
each clause of a code determines how something must be, or what makes sense about the
building in relation to the code to which it belongs. However, global sense is somehow different
because it first concerns the realism of a building simulation, which is an important
quality of a building model, especially when automatic code checking certification
takes place.
An automated code checking certification must be based on a reliable and complete
building model, or otherwise its results can easily be false. This means that every
building element shall be properly described and classified, and all its relevant relations
with other elements shall be explicitly declared.
Firstly, this depends on the capacity of IFC, as a building metamodel, to satisfy all the
information requirements of the treated codes without exception, at least by the proper
use of its extension mechanisms.
Let us consider an example from a Portuguese building code: in a house, the kitchen
can never have a direct communication with a bedroom. In order to check this clause, the
checking system can perhaps start by detecting a kitchen, then look for all the
walls that surround it and, for each one of them, try to find an opening (possibly a door or
a window). If successful, it then identifies the space to which the wall connects and sees if
it is of bedroom type. But the result can be false if the space identifiers do not exist or are
not compatible with the code requirements (not allowing the distinction between kitchen
and bedroom types), or if the relation between a space and the surrounding walls is not
explicitly declared.
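The traversal just described can be sketched as follows. This is a toy illustration, not an IFC implementation: the Space and Wall classes, and the way walls carry openings, are hypothetical simplifications chosen only to show the logic of the clause check.

```python
from dataclasses import dataclass

# Toy model: spaces separated by walls that may carry an opening.
@dataclass
class Space:
    space_type: str                       # e.g. "kitchen", "bedroom", "hall"

@dataclass
class Wall:
    sides: tuple                          # the two spaces the wall separates
    has_opening: bool = False             # a door or window in this wall

def violates_kitchen_bedroom_clause(kitchen: Space, walls: list) -> bool:
    """True if any opening connects the kitchen directly to a bedroom."""
    for wall in walls:
        if kitchen in wall.sides and wall.has_opening:
            other = wall.sides[0] if wall.sides[1] is kitchen else wall.sides[1]
            if other.space_type == "bedroom":
                return True
    return False

kitchen, bedroom, hall = Space("kitchen"), Space("bedroom"), Space("hall")
walls = [Wall((kitchen, bedroom), has_opening=True),
         Wall((kitchen, hall), has_opening=True)]
print(violates_kitchen_bedroom_clause(kitchen, walls))  # True
```

Note how the check silently fails if space types are missing or the wall-space relations are not declared: exactly the reliability problem discussed above.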
So the demand for completeness really means exploring the simulation power of IFC
modelling to the maximum, avoiding the negative tendencies that result from optional
interrelations between identified elements of a building, as well as from partial
implementations in software tools. Besides, as a consequence of an intensive and
extensive exploration of IFC resources, the operating environment must be capable of
processing a dense network of data, especially for queries, both at high level (conceptual
data) and low level (for example, when doors and windows are only geometrically related
to walls). In fact, some problems can arise in relation to this increased amount of data and
its quality:
- The possibility of mistaken interrelations being declared;
- The possibility of users forgetting relevant interrelations;
- The corresponding machine processing cost;
- The lack of specialized assistance.
For the first two of these problems it is possible to develop software tools that go deeper
into the analysis of a building model and help to prevent such errors by detecting
important signals of their possible occurrence. The kind of building cognition needed by these
tools is also related to the notion of global sense, but here the emphasis is on a
probabilistic (not mandatory) approach.
So, besides the capacity of IFC, the reliability and completeness of a building model
can be greatly improved by using sets of generic rules based on building sense, which can
be customized by users as model types. The following examples illustrate this kind of rule:
- An interior space type is always expected to have an entrance;
- A bedroom space type is always expected to have a window opening to the exterior;
- Every space that is totally surrounded by walls is probably an interior space;
- Every interior space where a dishwasher exists is probably a kitchen;
- Every interior space to which more than two doors are related is probably a circulation
space type;
- If a combustion equipment exists in an interior space, it needs a corresponding
connection to an air exhaustion system.
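The mandatory/probabilistic distinction in these rules can be captured explicitly. The sketch below is an invented illustration, not from the paper: the SenseRule class, the rule texts and the toy space record are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# A building-sense rule with a flag distinguishing mandatory expectations
# ("always ...") from probabilistic hints ("probably ...").
@dataclass
class SenseRule:
    description: str
    mandatory: bool                      # False => probabilistic hint only
    check: Callable[[dict], bool]        # predicate over a (toy) space record

rules = [
    SenseRule("interior space has an entrance", True,
              lambda s: not s["interior"] or s["doors"] >= 1),
    SenseRule("more than two doors suggests a circulation space", False,
              lambda s: s["doors"] <= 2 or s["space_type"] == "circulation"),
]

space = {"interior": True, "doors": 3, "space_type": "storage"}
for rule in rules:
    if not rule.check(space):
        level = "violation" if rule.mandatory else "warning"
        print(f"{level}: {rule.description}")
```

A customizable rule set of this kind acts as the filter over model data described below, rather than as part of the metamodel itself.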
The implementation of this kind of building cognition must consider the existence of
multiple starting points and directions to explore in the network of data of the building
model, where the goal is to find the proper hierarchical structure and sequence of
operations in order to obtain the quickest and most effective results.
Once formalized in accordance with the IFC capabilities, software tools that use this kind
of building cognition can be integrated with user interfaces to IFC models, and become
an important contribution to facilitating their use.
The main reason why a standard building metamodel like IFC cannot include rules of
building sense, such as those described above, is that otherwise it would
become restrictive to the design process in a way unacceptable for a building
language level. The kind of knowledge associated with building sense is best intended as a
filter for building model data, to be used at special moments of the design
process (and eventually later on during the entire lifecycle), and it will inevitably
depend upon particular building concepts.



Besides, where does the legitimacy to determine what does and what does not make
sense in regard to a building come from?
Building codes are the prime admissible sources to determine building sense, as
they are just intended for that, not only by their content but also by their quality of being a
recognized and general imposition over the building design. Moreover, for the same reason
that one shall not expect to find inconsistencies between the distinct codes that apply to
the same building, the search for building sense gains much more effectiveness if a larger
and properly structured set of applicable codes is considered, because of the improved
global consistency. Usually, architectural building codes provide more general definitions
based upon space functionality, while more specific codes depend on these general
definitions for their detailed views. That is why the Normative Product Model concept
also states that more general codes shall be considered first, as a foundation for the formal
definition of sense in a building. And it points to distinct manifestations of the concept
for each set of building codes sharing the same conditions of applicability (time period,
territory, building categories, etc.).
Nevertheless, even the most general and structuring building codes can become
insufficient in regard to the kind of global sense that has been described. So, a second
source must be considered, which is precisely the one known as "common sense", and
this shall be used only to complement building codes concerning the realism, reliability
and completeness of a building model.
Building codes and common sense can be considered the only admissible sources for
the definition of building sense, at least within a normative perspective towards an
enhanced building simulation. However, all other sources containing building
requirements can also be considered for special objectives. Beyond the previously
mentioned sources of particular building requirements, the so-called codes of good
practice deserve reference, as they often represent the anticipation of important
requirements that can only be properly checked much later in the design process.
If building sense relies on building codes and other particular sources, then there are
multiple building sense definitions, some of them corresponding to a particular set of
codes that embodies a specific Normative Product Model, while others simply become
tools to support the design process.
It has been shown that building sense is greatly important for enhancing a building model
towards an enhanced building simulation, through realism, reliability and completeness.
This achievement becomes especially relevant to stimulate the simulation capacity of
IFC-based modelling.
However, because building sense is not suited to become standardized, the Normative
Product Model concept points to a kind of building modelling that includes two layers
that can be referred as: building language and building sense. The first layer consists of
an integrated metamodel that results from the integration of the structural information
requirements of a building code into a standard metamodel like IFC, and its main content
consists of object definitions. The second layer becomes a structured set of rules that
represent the normative content of building codes, for which it uses the building language
defined by the first layer, together with a standard formal language like EXPRESS.
It has been suggested that this approach is also useful for any other situation of
automated building requirements checking, once an integrated building metamodel can
satisfy the particular information requirements of the respective source.
The incorporation of building sense and other requirements checking systems into
design tools, using IFC as a standard base component, can greatly improve the design
process, allowing for better simulation models at a lower cost and in shorter time.
REFERENCES

Bazjanac, V.: 2002, Early Lessons From Deployment of IFC Compatible Software, in Z. Turk and R.J. Scherer (eds.), E-Work and E-Business in Architecture, Engineering and Construction, Proc. of 4th European Conference on Product and Process Modelling, Portoroz, Balkema, Rotterdam.
IAI: 2003, [http://www.iai-international.org/iai_international].
Liebich, T., Wix, J., Forester, J. and Qi, Z.: 2002, Speeding-up the building plan approval: the Singapore e-plan checking project offers automatic plan checking based on IFC, in Z. Turk and R.J. Scherer (eds.), E-Work and E-Business in Architecture, Engineering and Construction, Proc. of 4th European Conference on Product and Process Modelling, Portoroz, Balkema, Rotterdam.
Santos, I.A., Hernandez-Rodriguez, F. and Bravo-Aranda, G.: 2002, A Normative Product Model for Integrated Conformance Checking of Design Standards in the Building Industry, in Z. Turk and R.J. Scherer (eds.), E-Work and E-Business in Architecture, Engineering and Construction, Proc. of 4th European Conference on Product and Process Modelling, Portoroz, Balkema, Rotterdam.
Santos, I.A.: 2003, Modelizacion de Edificios de Viviendas para la Verificacion Automatica de Requisitos Formales Asociados a las Normas Generales de Construccion: Un Desarrollo Basado en los Estandares IFC y UML, PhD thesis, Dpto. de Ingenieria del Diseno, ESI, University of Seville.
SEED: 1997, SEED: A Software Environment to Support Early Phases in Building Design, Carnegie Mellon University, USA, and University of Adelaide, Australia.
eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Towards engineering on the grid

Ž.Turk, M.Dolenc, J.Nabrzyski, P.Katranuschkov, E.Balaton, R.Balder &
All: The inteliGrid Consortium, c/o University of Ljubljana, Slovenia
ABSTRACT: Grids are generally known as infrastructure for high
performance computing. However, the original idea behind grid
computing was to support collaborative problem solving in virtual
organizations (VO). A challenge for collaboration infrastructures is to
support dynamic VOs that collaborate on the design, production and
maintenance of products that are described in complex structured product
model databases. Such VOs are typical for industries with long supply
chains like automotive, shipbuilding and aerospace. Perhaps the most
complex, dynamically changing VOs are in architecture, engineering
and construction (AEC). Semantic interoperability of software and
information systems belonging to members of the VO is essential for
efficient collaboration within the VO. We believe that the current state of
the art, the Web Services paradigm, is too fragile and tangled for
efficient collaboration in AEC. Grids provide the robustness but need to
be made aware of the business concepts that the VO is addressing. The
grid itself needs to commit to the product and process ontology,
thereby evolving into an ontology-committed semantic grid. To do so
there is a need for the generic business-object-aware extensions to grid
middleware, implemented in a way that would allow grids to commit to an
arbitrary ontology. These extensions are propagated to toolkits that allow
hardware and software to be integrated into the grid. This is expected to
be done in a European Project called inteliGrid. This paper presents its
baseline, hypothesis and expected results. The project's impact is
expected to be wide; it will create knowledge, infrastructure and toolkits
that will allow for a broad transition of the industry towards semantic,
model based, ontology committed collaboration using the grid, rather than
the Web, as the infrastructure, thus enabling the grid to become a
mainstream collaboration paradigm.

The integration of the AEC industry and the interoperability of the hundreds of software
applications supporting the design and construction of the built environment have been
providing one of the most challenging environments for the application of information
and communication technologies. The "islands of automation" problem (Hannus & Silen, 1987)
was identified by the AEC community in the late 1980s, and several national
and EU projects have been tackling it since.
It is interesting to read Foster's definition of the grid (Foster, 2002) in the context of the
collaboration requirements of the construction industry. Foster defines grid computing as
"coordinated resource sharing and problem solving in dynamic, multi-institutional
virtual organizations ... not primarily file exchange but rather direct access to computers,
software, data, and other resources, as is required by a range of collaborative
problem-solving ... in industry. This sharing is ... highly controlled, with resource providers and
consumers defining clearly and carefully just what is shared, who is allowed to share,
and the conditions under which sharing occurs." This statement captures the essential
requirements of collaboration in the AEC.
An EU project was proposed to verify this hypothesis. It is expected to start in
September 2004 and will run until February 2007 with a total funding of 3.13 million
EUR. The partners in this project include the University of Ljubljana (Slovenia, coordinator),
TU Dresden, TUD (Germany), the Polish Center for Supercomputing Applications, PSNC
(Poland), VTT (Finland), EPM Technology AS, EPM (Norway), Conject (Germany),
Sofistik (Greece), OBERMEYER Planen+Beraten GmbH, OPB (Germany) and ESoCE
NET (Italy). The home page of the project is at http://www.inteligrid.com/.
This section presents the state of the art in grid computing and AEC interoperability. We
believe that there is a clear convergence between the two.
2.1 Grids and semantic grids
A grid is a type of parallel and distributed system that enables the sharing, selection, and
aggregation of geographically distributed autonomous resources dynamically at
runtime, depending on their availability, capability, performance, cost, and users'
quality-of-service requirements.
Grid computing is an innovative approach that leverages existing IT infrastructure to
optimize computing resources and manage data and computing workloads. According to
Gartner (Price WaterHouse Coopers, 2002), a grid is a collection of resources owned by
multiple organizations that is coordinated to allow them to solve a common problem.
Gartner further defines three commonly recognized forms of grid:
- Computing grid: multiple computers to solve one application problem
- Data grid: multiple storage systems to host one very large data set
- Collaboration grid: multiple collaboration systems for collaborating on a common problem
Grid computing has its origins in solving computationally intensive problems. Recent
developments and trends in grid computing go beyond solving data-intensive (petabytes) or
computationally intensive (teraflops) problems for scientists and engineers, towards making grids a
suitable business infrastructure for virtual organizations. Grids are increasingly viewed as
services (Foster et al., 2002) aware of the business semantics. Semantic grids should
provide to grids what the semantic Web provides to the Web: communication
based on high-level, meaningful entities.
There are numerous related research projects in the EU, the US and beyond: The Grid
Enabled Optimization and Design Search for Engineering (GEODISE)
(http://www.geodise.org/) project was one of the first to explore the possibilities of the
semantic grid; however, the semantics was attached to files as metadata. True
semantic richness, which would address the meaning of the information inside the files, is not
addressed. The myGrid (http://www.mygrid.org.uk/) project defined its architecture using
the OGSA architecture but does not seem to be based on a common ontology, which is
what inteliGrid is aiming for. The Commodity Grid Kit (COG)
(http://www-unix.globus.org/cog/) project has similar goals to inteliGrid, but is addressing a different
business sector and seems to be primarily concerned with heterogeneous data formats
and not the heterogeneous information schemas addressed in inteliGrid. Based on the review of
the state of the art, semantic information as we understand it in engineering has to date
not been addressed in a grid environment. This is a key contribution of this project. Also
relevant to the inteliGrid project are the Collaborative Advanced Knowledge
Technologies in the Grid (CoAKTinG) (www.aktors.org/coakting) and Grid-Enabled
Desktop Environments (GRENADE) (http://mrccs.man.ac.uk/research/grenade) projects
that address the interactive collaboration using grids. This is not targeted in inteliGrid but
their open source results could be re-used in the inteliGrid demonstrations. The only grid
project related to the AEC sector that we are aware of is the National Science Foundation
(NSF) funded Information Infrastructure for Earthquake Research (SCEC/IT)
(http://www.isi.edu/ikcap/scec-it/). The rather broad and practical goal is to provide
information technology infrastructure for earthquake research, including knowledge
representation and reasoning, Grid technologies, digital libraries, and interactive
knowledge acquisition.
World-leading software companies such as Oracle, IBM and Microsoft, as well as several
software SMEs, are also developing grid middleware and grid extensions to their existing
software. The following are some software vendors which could potentially make use of the
results (semantic extensions) of the inteliGrid project:
- Oracle delivers database products and application servers. Their Oracle 10g version is
an enterprise grid version, using server consolidation and cluster computing.
- Avaki Corporation is a supplier of commercial grid software solutions that provide
wide-area access to data, compute, and application resources in a single, uniform
operating environment.
- Metapa supports business intelligence applications on open source, commodity
technology, including the use of Lintel platforms and Metapa's database clustering.
- GridSystems develops and markets the InnerGrid multiplatform product that allows the
application of grid technology to the current corporate environment. It speeds the
key processes of a business, converting underused resources in a virtual



2.2 AEC interoperability

inteliGrid addresses the interoperability needs of the automotive, aerospace, shipbuilding,
furniture and AEC industries. Perhaps the most demanding environment is the AEC
industry. Characteristic of the AEC industry are the uniqueness of the products and
processes and the dynamic and quite improvised VOs involved in the process. The items
to be integrated are seldom predefined and the integrated solution is unlikely to be
repeated. One stable element in this framework is the conceptual model of a building
product. While buildings are different from each other, the language (and the data
structures) required to describe them is believed to be stable. Both the International
Organization for Standardization (ISO) and the International Alliance for Interoperability
(IAI) have made a considerable investment into the building classification and building
product model standards that defined the data structures required to describe any building
product. Several European projects have demonstrated that by using these standards data
created in one application may be used in another. The Consortium has been actively
involved (coordinating, partnering) in several of these projects and has also learned from
the problems that they faced. The IAI and ISO efforts, however, are not limited to the
AEC industry but are targeting the whole spectrum of industries that are designing and
building three dimensional material products.
In spite of the extensive research in building information models, however, the
industry still communicates using line drawings, files and perhaps project webs. We
believe that one reason for that is that the generic IT infrastructures today are well suited
for semantically poor data formats and file or document level information exchange.
Semantic Web and web services technologies, built around XML, have been demonstrated
in research projects, such as eCONSTRUCT (http://www.econstruct.org/) and ISTforCE
(http://www.istforce.com/), but their scalability in large complex industrial environments
has not been tested.
The building product model is defined, and IFC version 2.x is supported by key 3D
modelling suites like ArchiCAD, Architectural Desktop and Microstation. They can
produce a building information model (BIM) that will be used by hundreds of other
applications that support the design, planning and maintenance of building products.
These applications will need to read and write the building information model. Today,
they do so in a variety of ways. The prevailing way is by reading and writing files in a
format and schema conforming to the standard. Where to write data to and where to read
it from needs to be managed by the human user each time a program is started
(File...Open) or closed (File...Save as). These files may be uploaded/downloaded to/from
project webs. Again, humans are to a large extent responsible for locating the right
information at the right time.
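As a minimal illustration of this file-based exchange, the sketch below parses the header of an ISO 10303-21 (STEP physical file) export to find out which schema the receiving application must support. The file content is a hypothetical, abbreviated example, not taken from any real project:

```python
import re

# Hypothetical, abbreviated ISO 10303-21 file as a CAD tool might export it;
# the DATA section entities are omitted for brevity.
SPF = """ISO-10303-21;
HEADER;
FILE_DESCRIPTION((''),'2;1');
FILE_NAME('office.ifc','2004-06-02',('architect'),('firm'),'','','');
FILE_SCHEMA(('IFC2X2'));
ENDSEC;
DATA;
ENDSEC;
END-ISO-10303-21;
"""

def file_schema(text: str) -> str:
    """Extract the schema name the receiving application must support."""
    match = re.search(r"FILE_SCHEMA\(\('([^']+)'\)\)", text)
    if match is None:
        raise ValueError("not a valid Part 21 file: FILE_SCHEMA missing")
    return match.group(1)

assert file_schema(SPF) == "IFC2X2"
```

The point of the sketch is that even this first step (which schema is this file in?) is something every receiving application has to repeat for every file a user opens.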
Building information model databases are being developed that will replace files as
the containers of BIM data. Several such databases are under development or even already
entering the market, for example WebStep by Eurostep, EXPRESS Data Manager by
EPM Technology, IFC Model Server by SECOM, and BSPro Server by Granlund.
For software to work with these databases, specialised interfaces (APIs) or dedicated
clients binding a program to the location of a particular database are needed. Again, in a
multi-project environment, with multiple programs being used on demand, this may be so
hard that it effectively discourages the use of building information models. These
problems are partially tackled by the generic services developed at TU Dresden (Weise et
al., 2004). The suggested approach mitigates the hard demands associated with product data
sharing, thereby allowing incremental improvement of the application of BIM in practice.
Product model standards and ontologies have been developed in parallel in other
industries as well. While they share a common schema for geometric information,
product structure and configuration management, they specialise when it comes to
information about distinct product components. The STEP standard ensures that
interoperability between domain models (known as Application Protocols, APs) is
possible through the integrated information resource layer, but in order to further reduce
the cost of developing future APs, ISO TC184/SC4 has introduced the Application
Modules (AM) layer, which defines self-contained Units of Functionality (UoF) that can
be reused between the different models (http://step-mod.sourceforge.net/).
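The reuse idea behind the module layer can be sketched in a few lines. The class and attribute names below are hypothetical illustrations, not the actual STEP module definitions: one self-contained unit of functionality is defined once and then referenced by two different application-protocol-flavoured models:

```python
from dataclasses import dataclass

# Hypothetical "unit of functionality": defined once, reused by any
# application protocol that needs product identification.
@dataclass
class ProductIdentification:
    product_id: str
    name: str

# Two AP-flavoured models (invented names) reusing the same UoF.
@dataclass
class MaintenanceTask:          # PLCS-like, service/maintenance view
    subject: ProductIdentification
    interval_hours: int

@dataclass
class DesignVersion:            # automotive-design-like view
    subject: ProductIdentification
    revision: str

pump = ProductIdentification("P-100", "coolant pump")
task = MaintenanceTask(pump, interval_hours=500)
design = DesignVersion(pump, revision="B")
# One shared definition serves both domain models:
assert task.subject is design.subject
```

The design choice this mimics is exactly the cost reduction the text describes: a new AP composes existing modules instead of redefining common concepts.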
A major initiative that has resulted in the publication of a new application protocol is
PLCS (ISO 10303-239). It contains many of the modules found in the PDM schema (itself a
subset of AP203, AP212, AP214 and AP232), with additions to support service and
maintenance concepts. The PLCS data model was designed to support product data
throughout the product lifecycle. It supports the automotive, aerospace, shipbuilding,
AEC and other industries.
3 THE inteliGrid PROJECT
This section proceeds from visions, placing them in context, stating high-level goals and
refining them into measurable, scheduled objectives.
3.1 Vision
The vision of the project is to provide the industries with challenging integration and
interoperability needs with flexible, secure, robust, ambient-accessible, interoperable,
pay-per-demand access to (1) information, (2) communication and (3) processing
infrastructure. The idea of supporting virtual organizations has been central to the
development of grid computing protocols (Foster et al., 2001); however, most practical
results to date relate to fast distributed computation and storage. The hypothesis of this
project is that grid technology has the potential to provide such an infrastructure.
3.2 Context: integration and interoperability in complex industries
The integration of the AEC industry and the interoperability of the hundreds of software
applications supporting the design and construction of the built environment have
provided one of the most challenging environments for the application of information
and communication technologies. The "islands of automation" problem was identified
by the AEC community in the late 1980s, and several national and EU projects have been
tackling it since. Grids are expected to be the solution to this "islands of computation"
problem. Figure 1 shows what has been known since the late 1980s as the "islands of
automation" problem.

Figure 1. Islands of automation (Hannus & Silén 1987). The sea and the various
transports across it are to be replaced or incorporated by a grid.
The islands in the figure are various areas which were automated at a certain time. It
should be imagined that the islands are rising slowly from the water. So, for example, in
the 1960s only a few selected tasks were automated. By the 1990s large areas of
engineering design, architectural design and management were automated; however, the
gaps between them (as well as within various peaks on the same island) still had to be
bridged by various integration and interoperability technologies, such as the Drawing
Interchange file format (DXF) "ferry" (symbolizing the exchange of drawings) or the more
modern Industry Foundation Classes (IFC) "gate". It is the vision of this project to replace
the several different technologies used to travel between the islands by "freezing the
sea", i.e. by replacing the sea with the grid.
Grids could provide the interoperability and collaboration platform, provided that they
include the key ingredient required for a complex engineering virtual organization:
support for shared semantics (Sowa 1984, Guarino et al. 1997). It is in this area
where we believe innovation and extension of the current grid architectures is required.
To the end user as well as the engineering software developer this will bring two major
improvements. The grid infrastructure eliminates the need to know the exact locations of
information and services: they are "on the grid", not at some Uniform Resource
Identifier (URI) or Internet Protocol (IP) address. The shared semantics powered by an
ontology reduce the need to know the exact structure and access paths of the data in
product model databases. The result is a semantic or cognitive grid. It is generic: it gets
its business semantics from an ontology that can be an arbitrary one.
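The first of these two improvements can be sketched as a lookup by concept rather than by address. Everything here is hypothetical (the registry, the concept names, the endpoints); the point is only that the client names a business concept and the grid resolves the physical location internally:

```python
# Toy semantic lookup: clients never handle URIs or IP addresses directly.
# Both the ontology concept names and the endpoints are invented examples.
REGISTRY = {
    "aec:StructuralAnalysisService": "https://node17.example.org/fem",
    "aec:BuildingModel/hospital-A":  "https://node03.example.org/bim/42",
}

def resolve(concept: str) -> str:
    """Map a shared-ontology concept to wherever the grid currently hosts it."""
    try:
        return REGISTRY[concept]
    except KeyError:
        raise LookupError(f"no grid resource registered for {concept!r}")

endpoint = resolve("aec:StructuralAnalysisService")
assert endpoint.startswith("https://")
```

If the service migrates to another node, only the registry changes; every client keeps asking for the same concept, which is the location transparency the text describes.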
While the AEC industry is providing the testing environment for the project, all
technologies developed will be generic and applicable in any kind of virtual organization
environment and are not limited in any way to AEC. Business sectors that involve long
and complex supply chains share the same interoperability problems as the AEC industry.
They will be represented in inteliGrid, providing requirements and evaluating the interim
as well as the end results, through the Industry Advisory Board (IAB) and ESoCE NET.
Moreover, many of the software applications to be integrated into the inteliGrid platform
are of general applicability to computationally intensive problems coming from other
sectors such as biomechanics, aerospace, shipbuilding and the automotive industries.
3.3 Project goals
The long-term practical goal of the project is to provide complex industries such as
construction, automotive and aerospace with stable, co-allocated, reliable, unified,
adaptive, remote, ambient-accessible, interoperable, pay-per-demand access to (1)
information, (2) communication and (3) processing infrastructure, and thereby provide
the integration and interoperability infrastructure.
This goal cannot be achieved by the project alone. But the project can prepare the
enablers (the true project targets) for the paradigm shift from the internet and web
services to the grid. The enablers are researchers, standardization bodies and the key
software developers. They need a reference grid, which will be in a position to provide
the strategic steering of their future developments. The current state of the art addressing
these needs is based on the (semantic) Web services approach. This approach is viable in
businesses with stable, long-term virtual organization relations, where the investment in
finding a service with Universal Description, Discovery and Integration (UDDI),
learning its interface with the Web Services Description Language (WSDL) and finally
creating the links with SOAP pays off through long-term use of the services. In contexts
where the involvement of a partner in a VO is temporary and short-term, but needs to be
set up quickly, a grid-based approach seems more appropriate. Figure 2 compares the two
approaches.
The key scientific question of this project is how grid technology can be
used to address the interoperability of software and services working with complex and
semantically rich information.

Figure 2. Collaboration in a tangled Web services environment with multiple private
ontologies, the current state of the art (above), compared to the VO based on a semantic
grid platform committed to one ontology.
In addition, the software and services need distributed processing power to crunch this
information. This needs to be done in an environment characterised by some standard
data structures that are undergoing a dynamic evolution.
The key technological goal is to make the grid infrastructure available to the mostly
small and medium enterprise (SME) companies that provide engineering software. The
core competencies of these companies are topics like structural mechanics or 3D solid
modelling, not the latest trends in middleware technology. The results of the technical
work will demonstrate how typical server-side applications (or components of
applications) can be made grid-computing compatible and how the mostly client-side
Computer Aided Design (CAD) applications can interface with the grid. This will attract
new SMEs to enhance their applications with grid-computing capabilities, since the
project will provide the necessary libraries, toolkits and guidelines.
The results are shown schematically in section 5.
3.4 Innovation
inteliGrid goes beyond simply grid-enabling present-day applications on present-day
grids. It creates an underlying fabric in the form of abstracted toolkits and tools that can
be used to grid-enable old applications and, more importantly, to build innovative new
generations of grid applications that use semantics and ontologies. It also focuses on
applied research on application scenarios in many areas, such as engineering,
construction, aerospace and fluid dynamics, that take unique and unprecedented
advantage of emerging semantic grid technologies.

Figure 3. Globus grid reference architecture. The white elements are part of the existing
architecture. The grey ones are generic extensions to be developed and validated by the
inteliGrid project.
The main innovation activities focus on extending the grid architecture with
semantics and ontologies beyond current work on metadata and heterogeneous data
formats. This is then verified by making vertical applications use these services.
Activities include state-of-the-art studies, requirements analysis, design and prototyping
of the software.
As discussed earlier, recent developments in the grid community have extended the
architecture towards the services paradigm, which is a prerequisite for the semantic grid.
Figure 3 shows this architecture. We believe that if the grid is to become an integrative
element for a VO, the notion of the business concepts of this VO should be an integral part
of the grid. We plan to achieve this by adding an ontology layer to the grid that would
allow any grid service to know what business relevance some data or process has. The
layer would be made available to other functionality through an ontology server.
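A toy sketch of such an ontology layer follows. The identifiers, concept names and lookup interface are all hypothetical; the sketch only shows the kind of question any grid service could ask the ontology server about a piece of data or a running job:

```python
# Hypothetical ontology layer: grid-level identifiers annotated with
# business concepts from a shared (invented) AEC ontology.
ONTOLOGY = {
    "dataset-7731": {"concept": "aec:LoadBearingWall",   "phase": "design"},
    "job-0042":     {"concept": "aec:StructuralAnalysis", "phase": "design"},
}

def business_role(grid_id: str) -> str:
    """What business relevance does this grid resource have?
    E.g. a scheduler could prioritise jobs by business context."""
    entry = ONTOLOGY.get(grid_id)
    return entry["concept"] if entry else "unknown"

assert business_role("job-0042") == "aec:StructuralAnalysis"
assert business_role("tmp-file-1") == "unknown"
```

In the paper's terms, this is the service that would let otherwise business-blind grid components reason about what they are storing or scheduling.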
This service is particularly important to the database services of the grid; however,
several functions of the grid, such as the Monitoring and Discovery Service (MDS,
http://www.globus.org/mds) and the Globus Resource Allocation Manager (GRAM,
http://www.unix.globus.org/developer/resource-management.html), could work more
intelligently if they were aware of the business context.
Interoperability in AEC today at best relies on the management of IFC files. Attempts
to use true IFC databases are mostly academic. An exception is the IFC data managed by
EPM for the Singapore Building Authority, a multi-user environment for validating
building plans against national building regulations (EPM Technology AS, 2001).
EPM is also managing a Product Life Cycle Support (PLCS) implementation for the
Norwegian navy, which will handle all lifecycle data related to the frigates in their fleet.
The implementation uses the concept of DEX (Data Exchange Set) developed in PLCS to
allow applications to access subsets of the PLCS data model. In effect these are similar to
STEP conformance classes and allow different applications to access the data they are
interested in. Merging and partial extraction of the data to/from the PLCS data model, as
well as access control, is managed by EPM. The PLCS implementation
(http://www.posccaesar.org/) also makes use of reference data that complies with ISO
15926 in order to further constrain the data population and to reduce the problem of data
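The DEX mechanism described above can be sketched as a simple type-based slice of a model. The entity types and instances are invented for illustration and are in no way the actual PLCS schema; the sketch only shows the idea of each application receiving just the exchange set it is interested in:

```python
# Hypothetical flat model with invented entity types.
MODEL = [
    {"type": "task",     "id": "T1", "name": "replace seal"},
    {"type": "resource", "id": "R1", "name": "dock 2"},
    {"type": "task",     "id": "T2", "name": "inspect hull"},
]

def extract_subset(model, wanted_types):
    """Return only the entities a given DEX-like exchange set covers."""
    return [e for e in model if e["type"] in wanted_types]

# A maintenance application asks only for tasks:
maintenance_dex = extract_subset(MODEL, {"task"})
assert [e["id"] for e in maintenance_dex] == ["T1", "T2"]
```

Real DEXs are of course defined against the PLCS data model itself rather than against flat records, but the access pattern, a named subset per application, is the same.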
File-based environments are fragile and depend on a single server that provides such
crucial functions as information and process management for a complex project.
inteliGrid proposes to use the grid as a robust, scalable, safe infrastructure for the
industry. It would allow seamless integration of software committing to any product data
standard and let the developers focus on the functionality rather than on data exchange or
interfacing with this or that information server. The grid is the place for the semantically
rich data.
Currently there are a few companies providing ASP services and project webs to
engineering communities that allow collaboration and information sharing. The grid will
extend this concept towards true resource sharing and on-demand resource renting. This
will allow for new business models to be developed as well as the rethinking of the
information technology infrastructures in the industry. Grids could provide the necessary
robustness as well as security (through the X.509 mechanism; Welch et al. 2003) that
would make outsourcing the IT infrastructure a more realistic option than today, when
companies have to rely on a multitude of chaotically interwoven services.
This project introduces two major generic improvements visible to the end user as
well as the application developer:
– The grid infrastructure eliminates the need to know the exact locations of semantically
rich data and complex problem-solving services. They are "on the grid".
– The ontology reduces the need to know the exact structure and access paths of the data
in product model databases. They can be accessed by using an engineering ontology as
opposed to object names and record keys.




4.1 Grid standards
Grid (Services) Computing is based on an open set of standards and protocols (i.e.,
OGSA) that enable communication across heterogeneous, geographically dispersed IT
environments. The current trend is to produce a broader set of standards that cover all
aspects of Grid technologies (computational, data storage, networking and web services).
This effort is articulated through the Global Grid Forum (GGF).
The focus of this project's standardization contribution to the global grid
movement will be the proposal of semantic extensions to the OGSA specification.
Currently OGSA's ontology is technical: it speaks of services, protocols, processes,
computers etc. We propose to build semantics deep into the core of the grid standards
so that any grid-related service or protocol can have a meaningful business role. Our
current idea is to allow an arbitrary ontology, specified in one of the well-established
ontology languages, to become part of the very fabric of the grid.
A more specific plan of the inteliGrid partners is to participate actively in two of the
working/research groups of the GGF:
– the Grid Scheduling Ontology Working Group (proposed), and
– the Semantic Grid Research Group (SEM-GRD) (http://www.semanticgrid.org/GGF/).
The roles of these groups are to produce an ontology of the Grid accompanied by a set of
documents describing the ontology and the tools/libraries used to create it and to make
use of it later. The ontology created will provide the machine-processable meaning of
scheduling terms and conditions that is needed to negotiate service level agreements
between usually heterogeneous systems operated at different independent sites. The
working group will define the usage and hierarchy of terms from the Grid Scheduling
Dictionary, thus helping to understand these terms and enabling tool builders to
incorporate the ontology into their tools. The ontology will overcome the shortcomings
of a dictionary, allowing for example the classification of schedulers, reasoning about
schedulers or mapping the semantics of different scheduling systems. Using the ontology
generated by the working group when designing and implementing the next generation of
GRAM and the corresponding Grid services may further lead to ontology-driven systems.
The goal of the SEM-GRD is to realise the added value of Semantic Web technologies
for Grid users and developers. It provides a forum to track Semantic Web community
activities and advise the Grid community on the application of Semantic Web
technologies in Grid applications and infrastructure, to identify case studies and share
good practice.
The partners will propose topic-oriented chapters of the GGF. The GGF is now organised
either by geographic location or by grid-related technical topic, but not according to a
potential branch of industry having specific requirements for the grid. The proposed
aec.gridforum.org would focus on AEC virtual organizations, semantics and ontologies
for grids.



4.2 Interoperability standards and ontologies

AEC interoperability standards, most notably the IAI-IFC (ISO PAS 16739,
http://www.econstruct.org/), ISO 10303 STEP and the related STEP/SDAI (ISO 1994)
protocols, are at an early stage of transition from research environments towards industrial
use. An adequate infrastructure on which software supporting these standards
would run comfortably and smoothly does not exist. We intend to provide it in the
inteliGrid project, which will in this way make a significant contribution to the introduction
of the most important family of standards that the AEC industry has developed over the
last 20 years.
At a recent workshop at the Comité Européen de Normalisation (CEN), five areas of
specifications (CWAs, CEN Workshop Agreements) that may lead to CEN
standardisation were identified (resulting, in part, from the ICCI project involving three
inteliGrid partners):
1 European eConstruction Framework. This framework will model on a high abstraction
level the world of eConstruction with all relevant dimensions.
2 European eConstruction Architecture. A common, logical architecture fulfilling the EeF
and incorporating vital components like schemas, taxonomies, APIs and software.
3 European eConstruction Meta-Schema (EeM).
4 European eConstruction Ontology (EeO).
5 European eConstruction Software Toolset (EeS).
inteliGrid's reference grid architecture directly addresses the needs of #1 and #2.
Through the development of a construction ontology in WP3 it directly addresses the
needs of #4.
inteliGrid also has the ambition to evolve the dated, client-server, STEP physical file
based ISO 10303-22 SDAI into a modern, grid-enabled information access interface to
product model data.
Partners of inteliGrid will continue to be involved in IAI-IFC development: TUD,
VTT, OPB and EPM are actively involved in the IFC development process.
inteliGrid will contribute to the harmonisation of competing ontologies currently
available in the construction sector, particularly in ironing out the differences between
the implicit ontologies of the IAI-IFC and those developed under the ISO 12006-2 and
ISO 12006-3 framework standards. It is these three standards that could provide the
baselines for a common ontology for AEC.
inteliGrid will also contribute to the development of an explicit ontology of the AEC
components of the IAI-IFC.

In Figure 4, the users are using the applications. These applications need information and
functionality from outside the user's workstation, i.e. from the grid.
The applications therefore have a workstation component and a grid-based
component. The communication medium between the workstation and the grid is the
Internet, and the workstation applications, their grid-based counterparts, as well as the
grid-only services are connected to the grid through a series of interfaces. At the very
bottom there are numerous computers on which these grid-side services run. Workstations
would typically not know on which machine a service is running. Computationally
intensive services would run on several machines in parallel, and big databases would be
spread across several machines. Simple services would have redundant backups in case of
computer or network failures. Any prototyping should therefore develop these components
that together form a semantic collaboration grid:
– grid-enabled workstation applications (the first three from the left) that connect to a
grid through a
– specialised semantic grid adapter for each application. This adapter talks to the
– workstation-side semantic grid client common to all applications on a workstation.
Over the internet, this client connects to a
– server-side semantic grid server. It would run on machines providing grid-enabled
– semantic grid adapter to be used by services that are to be made grid-enabled
– specialised core servers, such as the product database and an ontology server. This
software may not have a workstation component other than some administration

Figure 4. Draft architecture of inteliGrid. The grid, enhanced with generic product data
and ontology services, provides the collaboration and interoperability platform for
problem solving.



In spite of successful pilots, the AEC industry lacks a robust collaboration infrastructure.
Grids are the latest hyped technology that promises a solution to this decades-old
problem, allowing both researchers and industry to capitalize on the standards
developments of the past decades. A grid is a natural transition path for project webs and
application service providers. In this paper we have not mentioned a very clear potential
that the grid has for the providers of complex numerical and modelling software that is
truly hungry for processing power and gigaflop computers; the usefulness of those in
solving complex engineering problems is obvious.
Research in the field of grids in AEC is just starting. An EU project has been
proposed, and AEC partners have been involved in the preparation of a Grid integrated
project. There is at least one national grid-related project (in Slovenia,
http://www.gridforum.si/) that is seriously focusing on the AEC aspects of the grid.
However, grid research in AEC is still rather new. The vision shared by the authors of
this paper is that the AEC community should work towards a single AEC grid into which
various services and software could be plugged, and not repeat the mistakes of the
various integration projects that developed their own collaboration infrastructures from
scratch. This paper therefore proposes the establishment of aec.gridforum, to coordinate
and harmonize grid efforts in AEC as well as to show the general grid community that, to
support virtual organizations of a particular domain with grid technology, domain-specific
solutions, particularly those related to domain ontologies, should be built into the
fabric of the grid.
ACKNOWLEDGEMENTS
This paper is based on the inteliGrid Project Proposal and its subsequent Description
of Work. Through their contribution to the definition of the project, these colleagues
also contributed to this paper (in alphabetical order): T. Cerovšek, U. Forgber, A. Gehre,
J. Hyvarinen, J. Mitchell, B. Protopsaltis, R. Santoro, R. Scherer, V. Stankovski,
K. Tonn. As co-authors, only those are listed that made a significant contribution to the
delta between this paper and its baseline documents.
REFERENCES
EPM Technology A.S., 2001. Singapore leads the World in Automated Building Plan Approval.
Foster, I., Kesselman, C., Nick, J. & Tuecke, S., 2002. The Physiology of the Grid: An Open Grid
Services Architecture for Distributed Systems Integration.
Foster, I., Kesselman, C. & Tuecke, S., 2001. The Anatomy of the Grid: Enabling Scalable Virtual
Organizations. International Journal of High Performance Computing Applications, 15(3): 200-222.
Foster, I. 2002. What is the Grid? A Three Point Checklist.
http://www-fp.mcs.anl.gov/~foster/articles/WhatIsTheGrid.pdf



Guarino, N., Borgo, S. & Masolo, C. 1997. Logical modeling of product knowledge: towards a
well-founded semantics for STEP. In Proceedings of the European Conference on Product Data
Technology, Sophia Antipolis, France.
Hannus, M. & Silén, P., 1987 (updated by Hannus, M. 2002). Islands of Automation.
ISO 1994. DIS 10303-22. Part 22: Standard Data Access Interface.
Price WaterHouse Coopers, 2002. Powerful technology trends continue despite downturn. PWC
Global Technology Center, Menlo Park, CA.
Sowa, J.F., 1984. Conceptual Structures: Information Processing in Mind and Machine.
Addison-Wesley, Reading, MA.
Weise, M., Katranuschkov, P. & Scherer, R., 2004. Generic Services for the Support of Evolving
Building Model Data. In Proceedings of the Xth International Conference on Computing in
Civil and Building Engineering, Weimar, Germany, June 02-04, 2004.
Welch, V., Siebenlist, F., Foster, I., Bresnahan, J., Czajkowski, K., Gawor, J., Kesselman, C.,
Meder, S., Pearlman, L. & Tuecke, S., 2003. Security for Grid Services. In Proceedings of the
12th International Symposium on High Performance Distributed Computing (HPDC-12), IEEE
Press, June 2003.

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Managing long transactions in model server based collaboration
M.Weise, P.Katranuschkov & R.J.Scherer
Institute of Construction Informatics, TU Dresden, Germany
ABSTRACT: We propose a novel approach for project collaboration
based on product data technology, the C/S paradigm and the concept of
long transactions. It provides an open server-based solution that can tackle
problems caused by the heterogeneity of ICT environments in AEC and
neither presumes nor requires tightly integrated application systems.
The essence of the suggested approach is the coherent use of a set of
generic services supporting a sequence of well-identified data
modification processes. These services comprise partial model extraction,
model mapping, model matching (including model comparison and model
reintegration) and model merging. In this paper we focus specifically on
problems related to the reintegration of changed model subsets. We show
on a theoretical level why generic model comparison cannot fully guarantee
error-free results for the reintegration of changed model data and discuss
the use of model subsets and their impact on project data sharing. At the
end we describe the envisaged possibilities to apply the developed concepts
to the IFC project model and give a critical discussion of its application in
AEC practice.

Today it is widely accepted that efficient project collaboration can best be accomplished
using product data technology as a basis (Eastman 1999). However, there are still a number
of problems to be solved. The existing heterogeneity of tools and systems in construction
IT, and especially the variety of data models used in the different stages of the design
process, strongly limit the successful application of PDT in practice. The standardisation
efforts undertaken, whilst in principle successful, cannot fully overcome this problem.
It seems that additional model mappings will always be required (Turk 2001).
Integrated environments that have been demonstrated by a number of research projects
in the last decade have as yet found little acceptance because they typically require
well-harmonised applications, only capable of processing agreed model data to a fine
granularity level. Scaling up such environments to the full set of practical use cases in
computer-integrated construction is unlikely to be achieved due to the extreme increase in
complexity with regard to modelling representations, model mappings and consistency
(Amor & Faraj 2001). Obviously, solutions should be sought in other ways.



We propose a novel approach to the realisation of a promising collaboration scenario
that may be practically achieved in the short term: check-out/check-in of partial model data
provided by a central product data repository maintained by a model server.
In AEC/FM, and in fact in most other engineering domains as well, check-out/check-in
of model data is a typical long transaction, comprising a set of well-identified data
modification processes. Ideally, such long transactions should happen concurrently, on
(disjoint) subsets of the shared model data. Consequently, check-in of concurrently
changed model subsets leads to model matching and model merging problems that have
to be adequately tackled. An important aspect here is the capability to adequately deal
with model subsets (aka partial models), which has been addressed in several research
efforts (Lockley & Augenbroe 2000, Adachi 2002). However, in all known approaches
the final check-in is either poorly supported (the reintegration of the data into the shared
model is left almost entirely to the application tools, which typically results in a tedious,
interactive process), or its complexity is ignored (assuming an ideally harmonised
environment where no data loss happens due to mappings between semantically and
structurally different representations).
In practice, data mapping and all subsequent user modifications most often happen
within a black-box application (CAD, analysis/simulation tool), producing changes that
are unknown to the overall management system. As there are many different
black-box applications users might need to use, an objective of utmost importance is the
development of generalised data management methods that (1) are applicable to different
product models, and (2) can be flexibly assembled and tailored to support specific
process sequences and preferences derived from the general check-out/check-in
procedure outlined above.
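As a toy illustration of the check-out/check-in cycle of such a long transaction, the sketch below uses invented object identifiers and attributes and deliberately omits conflict handling; it shows only the off-line modification of a checked-out subset and its later reintegration:

```python
import copy

# Hypothetical server-side model: object id -> attributes.
server_model = {"wall-1": {"height": 3.0}, "slab-1": {"thickness": 0.2}}

def check_out(model, ids):
    """Deep-copy the requested subset; the server keeps the original."""
    return {i: copy.deepcopy(model[i]) for i in ids}

def check_in(model, changed):
    """Reintegrate the changed subset (no merging/conflict handling here)."""
    model.update(copy.deepcopy(changed))

working = check_out(server_model, ["wall-1"])
working["wall-1"]["height"] = 3.5               # off-line modification
assert server_model["wall-1"]["height"] == 3.0  # server untouched during the transaction
check_in(server_model, working)
assert server_model["wall-1"]["height"] == 3.5
```

The problems the paper addresses begin exactly where this toy stops: when two such transactions overlap, or when the checked-out data was mapped into a different representation before being changed.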
In this paper we describe how the problems related to the use of such black-box
applications can be successfully tackled. The suggested approach combines partial
model check-out based on GMSD, the Generalised Model Subset Definition schema
developed by the authors and described in (Weise et al. 2003), with a generic method for
comparing changed model data solely on the basis of the underlying model schema (Weise
et al. 2004).
We show how the information defined by the GMSD schema can be used for
comparing changed data and how the identified separate processes can be inter-linked to
improve the result. Special attention is given to the reintegration of the changed partial
model data, which is, besides model merging and consistency control, one of the most
challenging tasks towards the achievement of model-based collaboration. The developed
services are based on the broadly accepted EXPRESS modelling language (ISO
10303-11 1994). Therefore they can be verified and used with the IFC project model and
all legacy applications supporting IFC-based data exchange.
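The core of comparing changed model data can be sketched as a diff over stable object identifiers; the identifiers and attributes below are invented, and real comparison must additionally cope with mapped or re-created objects whose identifiers do not survive the round trip:

```python
def compare(old, new):
    """Classify objects by stable identifier: added, removed, modified."""
    added = sorted(new.keys() - old.keys())
    removed = sorted(old.keys() - new.keys())
    modified = sorted(k for k in old.keys() & new.keys() if old[k] != new[k])
    return added, removed, modified

old = {"wall-1": {"height": 3.0}, "door-1": {"width": 0.9}}
new = {"wall-1": {"height": 3.5}, "win-1": {"width": 1.2}}
assert compare(old, new) == (["win-1"], ["door-1"], ["wall-1"])
```

This also hints at why generic comparison cannot be fully error-free: if a black-box application drops and re-creates an object under a new identifier, the diff reports a removal plus an addition instead of a modification.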
Whilst the need for open, scalable and standardised ICT environments for AEC
collaboration is generally recognised, such environments are not easy to achieve in the
highly fragmented landscape of the construction industry. In order to apply product data
technology in practical work we have to acknowledge that:



1. The achievement of data integration requires sophisticated model mappings that in turn
may produce additional data conflicts or even data loss.
2. Engineering work requires long transactions, and these transactions must happen
concurrently. Hence, the goal should be to provide pragmatic methods which may not
guarantee full consistency at all times but which support the users in regaining
consistent model states.
3. Standards enabling collaborative work have not yet fully penetrated design and
construction practice. File-based data exchange of (partially) standardised model data
is the current, quite insufficient common denominator with regard to data sharing.
Therefore, in the development of collaboration approaches problems of imperfect
data exchange scenarios need to be tackled as well.
4. The road towards comprehensive life cycle data sharing will include a number of
incremental steps seeking to find the optimal balance between fully automated
consistent solutions for limited subsets of the design data and adequate interactive
functions to fill in the gaps.
Consequently, an approach that alleviates the hard demands associated with the
problems of data sharing is required.
Our approach is based on the conviction that full integration and consistency of the
evolving design data are not needed continuously but only at specific coordination points,
reasonably selected by the design team. We do not try to create a closed ideal world but
provide an open solution which reduces data loss, improves data sharing quality
and reaches a practically adequate degree of consistency. The developed concept does not
promise a perfect environment with faultless data integrity. Instead, the strategy is to
relax the requirements on the involved engineering applications, reduce the data loss
caused by data mapping and other data conflicts, and at the same time take into account
practical deficiencies in current data models and their software implementations.
The envisaged principal application scenario is based on the concept of long
transactions allowing off-line modifications and, in order to support parallel work,
involving versioning and merging of concurrently changed data. This is achieved with the
help of four key generic services:
- extraction of model subsets that are of interest for a specific design task,
- mapping of the model data to support different modelling representations,
- matching of two successive model versions to help properly recognise the latest
modified data, and
- merging of concurrently made data changes.
We assume a common modelling paradigm and a commonly agreed (standardised) data
model to represent the data to be shared. Taking into account current practices and trends,
the developed services are based on the broadly acknowledged EXPRESS modelling
language and can therefore be used with many existing data models, such as the IFC
project model in any of its versions.
Whilst there are many different use cases where these services can be applied, they
can all be derived from the principal scenario shown in Fig. 1 below. It starts at time
point ti with the consistent shared model version Mi based on the product data model M
(defining the data that have to be shared), and proceeds until the next coordination point
ti+c is reached.

eWork and eBusiness in architecture, engineering and construction


The data processing sequence for a single designer comprises the following six steps:
1. Model subset definition and subsequent extraction of the needed model data from Mi to
a model subset Msi, which can be expressed as Msi=createSubset (Mi, subsetDef (Mi)).
2. Mapping of the model subset Msi to the domain model Si representing an instantiation
of the domain data model S, i.e. Si=map (Msi, mappingDef (M, S)).

Figure 1. Schematic presentation of the principal application scenario for model server based collaboration.
3. Modification of Si to Si+1 by the user via some legacy application, which can be
expressed abstractly as Si+1=userModify (Si, useApplication (A, Si)).
4. Backward mapping of Si+1 to Msi+1, i.e. Msi+1=map (Si+1, mappingDef (S, M)).
5. Model matching of Msi and Msi+1, including object identification, comparison and
evaluation, resulting in the found differences Msi+1,i, i.e. Msi+1,i=match (Msi, Msi+1).
6. Reintegration of the found changes Msi+1,i into Mi resulting in Mi+1, i.e. Mi+1=reintegrate (Mi, Msi+1,i).
The final consistent model Mi+1 can then be merged with the divergent design data of the
other designers (modified in parallel using the same procedure) at the coordination time
point ti+c to obtain a new stable model state Mi+c. This can be expressed abstractly by
Mi+c=merge (Mi+1, Mi+2, …, Mi+k), with k being the number of concurrently changed checked-out
models.
If a standardised data model that can be processed by the involved engineering
applications is used, as e.g. IFC, then the model mapping shown in Fig. 1 will not be
necessary. Similarly, if all common data from Mi can be processed by the used
application(s), subset creation can be skipped.
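The principal scenario above can be sketched compactly in code. The following is a minimal, illustrative Python sketch, not the authors' implementation: models are plain dictionaries keyed by object id, the two mapping steps are omitted (as the text allows when a standardised model such as IFC is used throughout), and all function and variable names merely mirror the abstract expressions in the text.

```python
# Minimal sketch of the check-out / modify / check-in cycle of Fig. 1.
# Models are dictionaries {object_id: attribute_dict}; this data layout
# is an illustrative assumption, not part of GMSD or the IFC model.

def create_subset(model, selected_ids):
    # Step 1: extract only the objects a specific design task needs.
    return {oid: obj for oid, obj in model.items() if oid in selected_ids}

def match(old, new):
    # Step 5 (simplified): classify objects as added, deleted or changed.
    added = {oid: new[oid] for oid in new.keys() - old.keys()}
    deleted = set(old) - set(new)
    changed = {oid: new[oid] for oid in old.keys() & new.keys()
               if old[oid] != new[oid]}
    return added, deleted, changed

def reintegrate(full_model, diff):
    # Step 6: apply the found differences back to the complete model.
    # The checked-out subset is what makes deletions identifiable.
    added, deleted, changed = diff
    result = {oid: obj for oid, obj in full_model.items()
              if oid not in deleted}
    result.update(added)
    result.update(changed)
    return result

# Example run: a wall is modified off-line and a door is deleted.
M_i = {"wall": {"height": 2.8}, "door": {"width": 0.9},
       "slab": {"depth": 0.2}}
Ms_i = create_subset(M_i, {"wall", "door"})      # check-out
Ms_i1 = {"wall": {"height": 3.0}}                # off-line modification
M_i1 = reintegrate(M_i, match(Ms_i, Ms_i1))      # check-in
assert M_i1 == {"wall": {"height": 3.0}, "slab": {"depth": 0.2}}
```

A merge of several such M_i+1 models at the coordination point would follow the same pattern, conflict handling aside.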
The reintegration of changed data, the specific focus of this paper, is necessary in
all application scenarios dealing with model subsets. It depends directly on
the result of the previous steps, especially on the quality of the model comparison performed
by the matching operation. Additionally, the used model subset definition is needed in order
to identify deleted model data. In order to evaluate the process of reintegration we will
briefly characterise the model subset extraction and model matching methods.




The use of model subsets in the data modification processes has several advantages.
Besides reducing the quantity of exchanged data, the most notable one is the ability to specify a
model subset which can be properly managed by the requesting application.

Figure 2. Applying model subset extraction to remove unchanged data before matching.

In the case of the IFC model, which covers several design domains, an application for architectural
design can hardly be expected to manage all information related to HVAC design, albeit
contained in the IFC model. This problem is well known to the Implementation Support
Group (ISG) of the IAI and is semantically tackled by model subset agreements, the
so-called view definitions. However, a sufficiently expressive, server-manageable model
subset definition is still missing. To be used in the envisaged scenario, a model subset should be (1)
easily definable with as few as possible statements, (2) completely described within a
single service request and (3) usable for reintegration of changed data by allowing
elimination of mismanaged or unmodified data sets. To serve these requirements at an
adequate scale we have developed the Generalised Model Subset Definition schema
(GMSD), which can be used as schematically shown in Fig. 2. The first createSubset()
operation generates the model subset Msi which is then modified by some design
application to Msi+1. However, in the case of using a STEP physical file, mandatory
attributes have to be written to Msi even if not defined by the model subset request. This
results in superfluous and often mismanaged information. Removing such attributes can
be achieved by applying the same createSubset() operation to Msi+1 too. The model
subset can then be advantageously reduced further by removing unmodified data. For that
purpose, a new subset extraction has to be specified and applied to both Msi and Msi+1,
so that the resulting model subsets are processible by the subsequent
model matching service. More details on GMSD are provided in Weise et al. (2003).
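The double application of the subset definition described above can be illustrated with a toy sketch. This is a deliberate simplification under stated assumptions: a GMSD request is reduced to a plain attribute filter, and both the attribute names and the exported mandatory attribute are invented for illustration.

```python
# Sketch of the Fig. 2 idea: the same subset definition is applied to the
# returned data as well, so mandatory attributes that were only written
# to satisfy the exchange format do not show up as false changes.

SUBSET_ATTRS = {"height", "width"}   # attributes the requesting app manages

def filter_attrs(model, keep):
    # Toy stand-in for applying a GMSD-based createSubset() operation.
    return {oid: {k: v for k, v in obj.items() if k in keep}
            for oid, obj in model.items()}

# The exported file carries a mandatory attribute the app never managed
# and regenerated on export ("owner_history" is an invented example):
ms_i  = {"wall": {"height": 2.8, "owner_history": "#41"}}
ms_i1 = {"wall": {"height": 2.8, "owner_history": "#97"}}

# Without re-filtering the wall looks changed; with it, it does not.
assert ms_i != ms_i1
assert filter_attrs(ms_i, SUBSET_ATTRS) == filter_attrs(ms_i1, SUBSET_ATTRS)
```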
Model matching can be divided into the comparison of (externally) modified design data
and, if a model subset is used, its reintegration into the complete set of design data. The
principle of a highly reusable generic comparison of object-oriented models is based on
the premise that there are corresponding data object instances of the same model schema
which can be compared on attribute level. However, in real practice, as opposed to a
purely academic setting, we have to consider that (1) not all objects can be uniquely
identified by some kind of key value, (2) a structural difference in the data does not always
imply a change in its semantic meaning, (3) possible data loss caused by model
mapping may emerge on object and attribute level, and (4) errors in the used
application may result in the replacement of valid data (such as wrongly set IDs,
replacements by default values etc.).
As indicated in the previous section, the problems (3) and (4) can be tackled by the
model subset definition. Problem (2) can be tackled by using a normalising model
mapping which can help reduce the variations of the data structures representing one and
the same semantic meaning. Hence, our proposed model comparison method has to deal
basically with the problem of object identification. Unfortunately, this problem is
theoretically not unambiguously solvable under the abovementioned real world
conditions. Therefore we suggest an algorithm that provides a simple scalable way for
finding corresponding data objects. Its essence is in the iterative generation of object
pairs by evaluation of equivalent references of already validated object pairs. The result
of the suggested algorithm is affected by the underlying data structure (particularly the
percentage of available object identifiers), the amount and kind of data changes and the
occurring variations of the data structure representing the same semantic meaning. Thus,
it cannot fully guarantee that all corresponding objects will be found or properly
established for any arbitrary practical situation. The results may also include multiple
corresponding objects in some of the cases where a single match would be the proper
solution. Finally, since the algorithm is based on a pure examination of the data structure
(without involvement of any engineering semantics), the model comparison will always
report as data changes any allowed variations in the representation of the same
semantics (e.g. a different geometry description of the same physical position in IFC).
However, in spite of these deficiencies, the suggested algorithm provides a 95% correct
solution in most practical situations, showing a very satisfactory performance, adequate
for online processing. It overcomes the complexity involved in the treatment of the most
general matching case and, if supplemented with a suitable interactive procedure to fill
in the gaps and adjust the results, can provide an error-free model comparison that fulfils
the set requirements and verifies the rationale of the approach.
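The iterative pair-generation idea can be sketched as follows. This is a hypothetical, heavily simplified rendition of the algorithm: objects are dictionaries, references of a validated pair are followed position by position, and the handling of ambiguous or multiple candidate pairs discussed above is left out entirely.

```python
# Sketch of pair propagation: starting from objects matched by an
# identifier, candidate pairs are generated by following equivalent
# references of already validated pairs. Data layout is illustrative.

def match_pairs(old, new):
    # Seed with objects carrying a unique id (e.g. an IFC GlobalId).
    pairs = {o: n for o in old for n in new
             if old[o].get("guid") and old[o]["guid"] == new[n].get("guid")}
    frontier = list(pairs.items())
    while frontier:
        o, n = frontier.pop()
        # Follow the references of a validated pair position by position.
        for ref_o, ref_n in zip(old[o].get("refs", []),
                                new[n].get("refs", [])):
            if ref_o not in pairs:
                pairs[ref_o] = ref_n
                frontier.append((ref_o, ref_n))
    return pairs

old = {"w1": {"guid": "G1", "refs": ["p1"]}, "p1": {"xy": (0, 0)}}
new = {"a7": {"guid": "G1", "refs": ["q9"]}, "q9": {"xy": (0, 1)}}
# The unidentifiable point is matched via the wall that references it.
assert match_pairs(old, new) == {"w1": "a7", "p1": "q9"}
```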




If model subsets are used, as shown in the discussed principal application scenario, the
reintegration of changed data becomes necessary. As discussed in section 3 above a
model subset is created by removing data objects, cutting or reducing references, filtering
attributes etc. Reintegration means inverting the process of model subset extraction, i.e.
adding removed data objects, restoring cut or reduced references and re-creating the attributes
that have been filtered out. In our approach, reintegration is heavily based on the model subset
definition achieved via GMSD and on the results of the model comparison.
5.1 The reintegration process
The principal alignment of the reintegration process in the overall approach is illustrated
in Fig. 3. At first, by applying a GMSD-based createSubset() operation to a given model
version, some objects and attributes are removed. For object O1 this results in a new
version OS1 in which the simple reference "a" is removed and the aggregated reference "b"
is downsized by one element. This object is then modified externally by some application
to OS2, which differs from OS1 in the aggregated reference "b", downsized by another
element, and in the simple references "c" and "d". The reintegration (which is the second part
of the shown match() operation) adds all objects and attributes from O1 that have been
removed according to the model subset definition. In this particular case, this will
recreate the cut/downsized references "a" and "b". Additionally, all references from
unselected to selected data objects must be recreated too, as shown for the black object
in Fig. 3, using an arrow to denote its reference to O1.
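The reintegration step for a single object, as in Fig. 3, can be sketched like this. The data layout, the parameter names and the way removed data is recorded are all illustrative assumptions, not GMSD constructs.

```python
# Sketch of 5.1: reintegration re-creates exactly those attributes and
# aggregate elements that the subset definition removed at check-out
# time, on top of the externally modified version of the object.

def reintegrate_object(original, modified, removed_attrs, removed_elems):
    result = dict(modified)
    for attr in removed_attrs:
        # Re-create references that were cut out entirely (like "a").
        result[attr] = original[attr]
    for attr, elems in removed_elems.items():
        # Restore elements removed from downsized aggregates (like "b").
        result[attr] = result.get(attr, []) + elems
    return result

O1 = {"a": "#12", "b": ["#1", "#2", "#3"], "c": 1}
# Check-out removed "a" entirely and "#3" from "b"; the application then
# legitimately dropped "#2" from "b" and changed "c".
OS2 = {"b": ["#1"], "c": 9}
O2 = reintegrate_object(O1, OS2, ["a"], {"b": ["#3"]})
assert O2 == {"a": "#12", "b": ["#1", "#3"], "c": 9}
```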
Whilst this procedure is fairly straightforward and is more or less the same in all
scenarios, there are various detailed problems that need to be dealt with. They are briefly
outlined in the following subsections.
5.2 Model comparison problems
Most critical for the formal reintegration of changed data are wrongly established object
pairs leading to an

Figure 3. Reintegration of model




incorrect recreation of attributes. If OS2 is not based on O1, the cut and downsized
references of OS1 will be assigned to a wrong successor and hence violate the originally
intended meaning. It can be argued that shared significant, high-level data objects, such
as instances of IfcBuildingElement from the IFC model, can be uniquely identified and
that less important data objects, such as IfcPoint, are mostly used with the complete set of
attributes. However, on the theoretical level we cannot assume such implicit knowledge of
the semantics of the model. Therefore, the reintegration of changed model subsets has to
be supervised by the user to compensate for missing object identifiers. For the formal step of
reintegration this can be done by evaluating established object pairs and found data
changes, which can be limited to data objects where cut references or removed attributes
shall be re-established or added. Generally, this step represents additional user
interactions which are caused by insufficiencies of the used data exchange scenario and
the data handling in the participating applications. The amount of this additional work
depends on several criteria such as the percentage of uniquely identifiable data objects,
the quality of the data produced by the used application, and the selected model subsets
for design modifications, i.e. the number of objects which have to be reintegrated.
5.3 Structural problems
The described reintegration scenario is limited to object pairs in 1:1 relationships which
cannot be guaranteed by the suggested comparison algorithm. If needed, it can be
enforced by applying an appropriate post-processing of assumed object pairs, applying
additional criteria to reduce the cardinality. However, for tracking the design history it is
important to allow n:m relationships or changes of the object type as well. Such cases can be
tackled by applying some additional user interactions, mostly needed to resolve possible conflicts.
1:m version relationships
This case occurs if an old object is associated with several new objects. The reintegration
itself can be performed as shown in Fig. 3 by duplicating all removed attributes for all
new (corresponding) objects. Consequently, in the discussed example the attributes "a"
and "b" will be re-established for both new objects, as shown in Fig. 4.



Figure 4. Problems in the case of 1:m version relationships.
For simple attributes like INTEGER, FLOAT or STRING this will work without
structural problems if not further constrained by some uniqueness rule (which is normally
required for attributes representing designated object identifiers). References are
principally treated in the same way, but they are often constrained by inverse
relationships, leading to a violation of the allowed cardinality. This is also the case if a
reference to the changed object has to be recreated, because for m new objects there are m
options to set the reference. This problem is shown in Fig. 4 for the dark object, which
can point to only one of the new objects but not to both. In such cases an additional decision by the
user is necessary to resolve the structural conflict. Alternatively, to reduce such user
interactions, further strategies for automatic suggestions can be envisaged, for
instance based on least object changes.
If a duplication of references can be done without structural conflicts, it leads to
shared data objects, i.e. an object will be referenced by more than one object, and
consequently destroys the tree structure of an existing reference tree. However,
reference trees are beneficial in the treatment of model subsets: including a reference tree
in a model subset leads to a smaller number of cut references. Moreover, data changes are
locally limited and usually do not implicitly affect other parts of the model.
Thus, whilst the sharing of object references is a powerful modelling concept, it
should be used predominantly for uniquely identifiable objects. Otherwise, the data will
be much more sensitive to design changes.
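A toy sketch of the 1:m case above: duplicating removed attributes onto every new successor is unambiguous, whereas retargeting an incoming reference is not and must be surfaced as a decision rather than taken silently. All object and attribute names are illustrative.

```python
# Sketch of the 1:m version relationship: one old object, m successors.

def reintegrate_1_to_m(original, successors, removed_attrs):
    # Duplicating removed attributes for all successors is unambiguous.
    for s in successors:
        for attr in removed_attrs:
            s[attr] = original[attr]
    return successors

def retarget_reference(successors):
    # An incoming reference to the old object has m possible targets:
    # a structural conflict that needs a user decision (or a heuristic).
    if len(successors) > 1:
        return None          # undecidable without user input
    return successors[0]

OS2a, OS2b = {"c": 1}, {"c": 2}
reintegrate_1_to_m({"a": "#12"}, [OS2a, OS2b], ["a"])
assert OS2a["a"] == OS2b["a"] == "#12"   # duplication worked
assert retarget_reference([OS2a, OS2b]) is None   # user must choose
```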



n:1 version relationships

This case occurs if several old objects are associated with one new object. Here the
reintegration as shown in Fig. 3 cannot be performed, since n options can be found for
each attribute which has to be recreated. This case requires additional user interaction to
decide between the possible options: either to reduce the n:1 relationship to 1:1, or to
recreate single attributes. In the example given in Fig. 5 the user has to decide about
the attributes "a" and "b", i.e. whether they shall be recreated from OS1, from KS1 or from
a mixture of both.
References to the objects are less problematic. Formally, all removed references to OS1
and KS1 can be moved to MS2, the single new object representing the successor (new
object version) of both. This can be done without structural conflicts if not restricted by
constraints of inverse relationships, which consequently will be defined for MS2.
Change of the object type in a version relationship
This case occurs when corresponding objects are of different types. The reintegration as
shown in Fig. 3 can be performed for all commonly used attributes on the basis of the
first common object definition in the inheritance hierarchy of both object types. This is
shown in Fig. 6, where both objects use the attributes "a", "b" and "c". However, the
attribute "d" of OS1 cannot be represented using the object type of OS2. Consequently, the
data stored by "d" will be lost. Additionally, a problem may occur when trying to recreate
references originally pointing to O1, because such references may be restricted to types of
OS1 and would therefore not be allowed for the more abstract type of OS2. In such cases
the reference cannot be recreated. Moreover, this may lead to further problems, e.g. when
the reference is mandatory and therefore may not be removed. Again, additional user
interaction is required to decide between the possible options: either to change the object
type, or to adjust the troubling reference.
Using a more abstract object type for domain-specific design changes is explicitly
supported by the GMSD schema, but it looks slightly different from what is shown in Fig. 6. In
principle, an object can be changed to a type defined higher in its inheritance hierarchy in
order to manage more abstract and less complex data objects, if this is sufficient for its
further use. Consequently, to avoid data loss for the richer object definition, a
transformation to the original type will be necessary to reintegrate design changes of this kind.



Figure 5. Problems in the case of n:1 version relationships.

Figure 6. Problems in the case of changing the object type.
However, in that case the casting of object types occurs between O1 and OS1, and not
between OS1 and OS2 as shown in Fig. 6. Thus it can be restored without problems when
creating O2.
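The common-supertype rule for type changes can be sketched with a tiny, invented class hierarchy. The helper `transferable_attrs` and the attribute sets are purely illustrative assumptions, not part of the approach's services.

```python
# Sketch of the type-change case: only the attributes of the first common
# supertype can be carried over; anything specific to the old type is lost.

class Element:       attrs = {"a", "b", "c"}       # common supertype
class Wall(Element): attrs = Element.attrs | {"d"} # richer old type
class Proxy(Element): attrs = Element.attrs        # more abstract new type

def transferable_attrs(old_type, new_type):
    common = old_type.attrs & new_type.attrs   # reintegrated as in Fig. 3
    lost = old_type.attrs - new_type.attrs     # cannot be represented
    return common, lost

common, lost = transferable_attrs(Wall, Proxy)
assert common == {"a", "b", "c"}
assert lost == {"d"}   # the data stored by "d" is lost
```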
As can be imagined, a combination of these structural problems is also possible. Their
superposition definitely requires further user interaction, but this can additionally be
utilised for a refined strategy of automatic suggestions for conflict resolution.



5.4 Semantic problems

Even in the restricted reintegration scenario shown in Fig. 3, full consistency of the data
cannot be guaranteed solely by generic services because, in addition to the outlined
structural problems, we have to deal with semantic conflicts requiring domain knowledge
for their detection and resolution.
Basically, a semantic conflict occurs when changes made to a model subset require a
change of the remaining part of the model data in order to achieve a consistent model
state. Semantic conflicts are a typical (unwanted) result of working with model subsets;
they have to be tackled by the involved designers and normally require further reviews,
additional design decisions, recalculations and so on. From the viewpoint of our approach,
most critical and time consuming are rearrangements of the object structure without
changing the semantic meaning, such as changed units and coordinate systems. If such
changes implicitly affect the remaining part of the model data, an adjustment of a large
number of data objects may be necessary without requiring additional design decisions of
any designer. For example, if a globally used length measure is changed from metre to
millimetre, a change of all attributes using a length measure is necessary. Here, only a
multiplier has to be applied, but such automatic updates are not always so simple and
clear. In general, additional constraint definitions will be needed to help detect such
semantic conflicts. However, in current practical data model specifications such
definitions are mostly limited to simple rules (like restrictions of cardinality or
uniqueness of values). Therefore data consistency must finally be evaluated by the
involved designers during the matching and especially the merging processes, which is
not further discussed in this paper.
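The unit-change example can be sketched as a simple global rescaling pass. Note that the explicit set of length-carrying attributes below is an invented stand-in for exactly the constraint knowledge the text says is mostly missing from current data model specifications.

```python
# Sketch of the metre-to-millimetre example: every attribute carrying a
# length must be scaled, although no design decision is involved.

LENGTH_ATTRS = {"height", "width", "depth"}   # assumed constraint knowledge

def rescale_lengths(model, factor):
    # Apply the multiplier to length-valued attributes only.
    return {oid: {k: (v * factor if k in LENGTH_ATTRS else v)
                  for k, v in obj.items()}
            for oid, obj in model.items()}

model_m = {"wall": {"height": 2.5, "name": "W1"}, "slab": {"depth": 0.25}}
model_mm = rescale_lengths(model_m, 1000)
assert model_mm["wall"] == {"height": 2500.0, "name": "W1"}
assert model_mm["slab"] == {"depth": 250.0}
```

Without the attribute-level knowledge encoded in LENGTH_ATTRS, such a change would surface as a flood of apparent data changes in the comparison.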
The theoretical approach outlined in the previous sections is being validated for several
available product data models. The most comprehensive checking has been done for the IFC
model due to its broad acceptance, its overarching multi-domain scope (making model
subsets an important issue), and the large number of IFC-compliant applications.
As already mentioned, the result of applying the suggested services depends heavily
on the underlying model definitions. These are briefly addressed below, before the
observations drawn are discussed.
6.1 Evaluation of the IFC model from the viewpoint of model matching
Concentrating on aspects of using model subsets and their later reintegration into the
original source model, the following concepts of the IFC model definition have to be considered:
- The layer concept and the respectively applied ladder principle for references
separate commonly used data from domain-specific data. This makes it easy to define
domain-specific model subsets and finally leads to a smaller number of cut references
when working with such subsets.



- Basic modelling concepts, such as defining the object geometry, location, material
properties and so on, are reused by inheritance throughout the whole model definition.
Therefore, all participating applications will have a common understanding of these
concepts and the suggested use of more abstract objects can be applied.
- The IFC model globally defines the used measures and the coordinate system for object
placement. The concept of relative placement makes the IFC model more sensitive to
changes of model subsets with regard to model consistency.
- In many cases there exists a large variety of ways to describe the same semantic meaning.
This makes the model vulnerable to semantic conflicts.
- Not all IFC objects can be uniquely identified (via an object ID). Observations of
currently available IFC data sets have shown that the percentage of identifiable objects
on instance level is typically below 5%.
- Starting from identifiable objects, a reference tree can be created in which many of the
unidentifiable objects can be unambiguously allocated. However, there are several
examples of shared objects without identifier where this procedure cannot be applied.
This is not at all an ideal situation for the generic model comparison algorithm.
- The IFC model provides the possibility to attach individual, i.e. not standardised, data
by using property set objects. Consequently, sophisticated model subset definitions
will be needed to restrict requested data to the manageable property set objects.
A comprehensive description of the IFC model can be found in Wix and Liebich (2001).
6.2 Practical use of the IFC model
To work with practical, real-size model data we make use of available IFC
applications. Thus, we are limited to file-based data exchange according to ISO 10303-21,
which provides no adequate support for partial model exchange. However, we can use
a procedure which generates an IFC file containing the requested model subset and all
additionally required data (i.e. mandatory attributes) to formally fulfil the ISO 10303-21
specification. Consequently, the changed data has to be processed by the same GMSD request,
removing such added attributes in order to work with the model subset that was
originally intended for use. This will be necessary, for instance, for the GlobalId attribute
of IfcRelationship objects, which is mandatory but not managed by most known
applications, i.e. it is newly generated for each IFC export. Since relationship objects
can be seen as primary objects too, e.g. for the structural analysis domain, and since
they will typically contain a lot of cut references when model subsets are used, the
tracking of the data will be destroyed if the identifiers of these objects are not
managed correctly. If newly created GlobalId attributes can be ignored, corresponding
IfcRelationship objects can be found by the generic model comparison algorithm and thus
the model subset approach remains applicable.
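Ignoring regenerated GlobalId values during comparison can be sketched as follows. The per-type list of volatile attributes, the flat object layout and the example GUID strings are illustrative assumptions.

```python
# Sketch: comparison that skips attributes known to be regenerated on
# every export (here the mandatory GlobalId of relationship objects),
# so the same relationship is not falsely reported as changed.

VOLATILE = {"IfcRelAggregates": {"GlobalId"}}  # per-type attributes to skip

def equal_ignoring_volatile(a, b):
    skip = VOLATILE.get(a.get("type"), set())
    keys = (a.keys() | b.keys()) - skip - {"type"}
    return a.get("type") == b.get("type") and \
           all(a.get(k) == b.get(k) for k in keys)

# Two exports of the same relationship with regenerated GlobalId values:
rel_v1 = {"type": "IfcRelAggregates", "GlobalId": "2xKAbC01",
          "related": ["#5"]}
rel_v2 = {"type": "IfcRelAggregates", "GlobalId": "0QzXyZ99",
          "related": ["#5"]}
assert equal_ignoring_volatile(rel_v1, rel_v2)   # same relationship
```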
To cope with such problems we have applied model subset definitions expressing the
capabilities of the used architectural design application, namely Graphisoft's ArchiCAD.
Additionally, we have removed different object types, such as windows or doors, to deal
with cut references.



6.3 Evaluation of the developed concept

Successful data reintegration depends on the results of the model comparison, and
especially on the correct identification of corresponding objects. As discussed in Weise et
al. (2004), the test results for deliberately made changes were mostly very good, even
when dealing with a loss of identifiers leading to only 0.1% of uniquely identifiable
objects. However, for real changes in large models in the addressed long transactions, the
comparison algorithm may not compensate so well for the heavy loss of object identifiers,
which can significantly downgrade the result.
The performed tests showed that objects that could not be properly recognised as
changed were mostly classified as new, thereby cascading the same decision to the objects
referentially dependent on them. However, for the IFC data structure the propagation of
this effect seems to remain on a limited scale because unidentifiable objects are mostly
used within a reference tree, i.e. they are not independently shared.
Moreover, for practical use of the IFC model, the cutting of references will mainly be
necessary for objects which can be identified with high reliability. In contrast, objects
that are not identifiable with certainty will mostly be used with all their attributes, which
requires no recreation of removed data.
Thus, the probability of data loss or wrongly reintegrated data appears to be generally
tolerable. Most critical seem to be semantic conflicts caused by changes of measures and
shared coordinate systems. Even so, we can conclude that IFC provides a good basis for
true database transactions, but the respective user-friendly services are yet to be developed.
Such developments can be supported by the suggested novel approach presented in this paper.

The support of the German Research Foundation (DFG), and the involvement of FZK
(Karlsruhe) in the testing of the developed services, are herewith gratefully acknowledged.

Amor, R. & Faraj, I. 2001. Misconceptions about Integrated Project Databases. ITcon Vol. 6,
available from: http://itcon.org/2001/5/
Adachi, Y. 2002. Overview of Partial Model Query Language. VTT Building and Transport /
SECOM Co. Ltd., Intelligent Systems Lab., VTT Report VTT-TEC-ADA-12, available from:
Eastman, C.M. 1999. Building Product Models: Computer Environments Supporting Design and
Construction. CRC Press, Boca Raton, Florida.
ISO 10303-11:1994/Cor.1:1999. Industrial Automation Systems and Integration - Product Data
Representation and Exchange - Part 11: Description Methods: The EXPRESS Language
Reference Manual. International Organisation for Standardisation, ISO TC 184/SC4, Geneva.
Lockley, S. & Augenbroe, G. 2000. Data Integration with Partial Exchange, Proc. of International
Conference on Construction Information Technology, INCITE 2000, Hong Kong, pp 277-291.



Turk, Z. 2001. Phenomenological Foundations of Conceptual Product Modelling in AEC.
International Journal of AI in Engineering, Vol. 15, pages 83-92.
Weise, M., Katranuschkov, P. & Scherer, R.J. 2003. Generalised Model Subset Definition Schema.
In: Proc. of the CIB-W78 Conference 2003 - Information Technology for Construction,
Auckland, NZ.
Weise, M., Katranuschkov, P. & Scherer, R.J. 2004. Generic Services for the Support of Evolving
Building Model Data, In: Proc. of the ICCCBE-X Conference, Weimar, Germany.
Wix, J. & Liebich, T. 2001. Industry Foundation Classes IFC 2x, International Alliance for
Interoperability, http://www.iai-ev.de/spezifikation/IFC2x/index.htm.

eWork and eBusiness in Architecture, Engineering and Construction - Dikbaş & Scherer (eds.)
2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

A software generation process for user-centered dynamic building system models

G.Zimmermann & A.Metzger
University of Kaiserslautern, Kaiserslautern, Germany
ABSTRACT: The architect's view of buildings is changing drastically
because of technical progress, especially in control and facility
management. Also, the usage of many buildings is often altered during
their lifetime. Therefore, buildings have to be designed and maintained as
systems including the users and uses over time. We have developed a
formal building model that integrates the domains of building structures,
service systems, control systems, functional units, and user activities. To
intuitively model the dynamic behavior of user activities, we introduce a
graphical modeling notation that we call Message/Transition Charts and
show how these diagrams can semi-automatically be transformed into
executable code. We believe that these contributions to the modeling of
building systems and to transforming the models into simulators prove
that architects, building users, and facility managers can efficiently be
supported during the full life-cycle of buildings.

In architectural design, four major stages are distinguished: the programming, proposal,
main proposal, and the detailed design stage. During the lifetime of buildings, facility
management stages and redesign or remodeling stages alternate. We foresee that in all
stages there will be an increasing demand for models, simulations, and tools that
incorporate user requirements and activities much more explicitly than is current practice.
One of the reasons is that buildings are becoming technically much more sophisticated
because of the embedding of digital control and communication systems as well as other
technical advances. Such systems make the requirements analysis, design, use, and
maintenance of buildings much more complicated for all stakeholders and add a new
dimension of complexity.
Another reason is a growing awareness that the building users should be in the center
of attention during all stages and that a building should adapt to changing requirements of
users instead of the other way around.
In most building development, construction, and maintenance processes it is claimed
that everything is done for the best of the user. It is assumed that the average user
requirements are known and buildings that are built for these average users will fulfill the
requirements of the majority of users.

A software generation process for user-centered


However, post-occupancy evaluations show that this assumption is not true. There is
no average user and all individuals have different requirements. Satisfaction case studies
show, especially in the case of automated service systems with no user intervention,
that in many cases the majority of users is not satisfied and would like to get more
individual control. Since satisfaction with the working environment is strongly correlated
to productivity, much can be gained by improving satisfaction. Also, users often do not
react to situations as assumed. This means that we are dealing with ranges of user
requirements and non-deterministic behavior of individuals.
Post-occupancy evaluations of users in buildings come too late in the process.
Therefore, we propose the use of simulators to be able to experiment with groups of
individual users, which also can have extreme requirements and exhibit non-deterministic
reactions. We are aware that the dynamic behavior of persons is not well known and
models of it are also based on average assumptions. Our hope is that by comparing
simulations with observations of real users, the models can be refined over time.
Advances in software and hardware technology and in computer science as well as
software engineering provide us with the means to establish models, simulators and tools
to support all stakeholders in coping with the problems that were mentioned above. One
such approach for systematically handling the complexity that is introduced by
sophisticated building technology and individual user activities is presented in
this paper.
Still, it has to be clarified which questions computer simulation can answer that could
not be answered by experienced architects or by stochastic models or calculations. At this
time we can only make assumptions on the necessity of simulation. Only applications of
such simulators can show whether they provide solutions to the posed problems.
One field we see as very difficult to tackle without simulation is the dynamic
interaction of users with building control systems and the test and optimization of
appropriate user interaction strategies. Simulations with user activities in a building
system that is equipped with a control system could answer the question how little should
be automated, how much user influence should be provided, how such systems should
react to unexpected user behavior and how different control strategies influence the total
building performance. The outcome might be that no control, computer-aided control, or
fully automated control should be preferred to provide optimal user satisfaction (and
thus productivity) with little or no loss in energy efficiency.
Other questions are related to the use of resources shared by many individuals or
groups of users. Such resources are access and circulation areas and common facilities
and equipment. Instead of using stochastic models, dynamic simulations with many
individuals and situations could also provide answers about the average satisfaction or the
satisfaction in extreme or unexpected situations.
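To illustrate why dynamic simulation of individuals can reveal more than an average-based stochastic estimate, the following minimal Python sketch (our own illustration, not part of the described system) lets individual users contend for one shared resource; the request probability, service time, and the "satisfaction" measure are all hypothetical placeholders:

```python
import random

def simulate_shared_resource(n_users, service_time, horizon, seed=0):
    """Toy dynamic simulation of one shared resource (e.g., a copier).

    Each user independently decides, once per time step, whether to
    request the resource; requests that find it busy are not served
    immediately. Returns the fraction of requests served without
    waiting, a crude stand-in for satisfaction with a shared facility.
    """
    rng = random.Random(seed)
    busy_until = 0            # time step until which the resource is occupied
    served_immediately = 0
    total_requests = 0
    for t in range(horizon):
        for _ in range(n_users):
            if rng.random() < 0.05:        # a user issues a request
                total_requests += 1
                if t >= busy_until:        # resource is free right now
                    served_immediately += 1
                    busy_until = t + service_time
    return served_immediately / max(total_requests, 1)

# More users contending for the same resource lowers immediate service,
# an effect an average-load calculation would smooth away.
few = simulate_shared_resource(n_users=2, service_time=3, horizon=1000)
many = simulate_shared_resource(n_users=20, service_time=3, horizon=1000)
```

Because the run is driven step by step, extreme or unexpected situations (bursts of simultaneous requests) show up directly in the outcome instead of being averaged out.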
At this point, it should be stated clearly that we regard user activities as dynamic
activities that happen over time. This means that the models and the simulators have to
reflect these dynamics of user behavior in relation to time.

eWork and eBusiness in architecture, engineering and construction


There have been discussions about the necessity of more formally taking user activities
into account during the design and maintenance of buildings, and some progress has been
made to achieve this goal. For example, Eastman & Siabiris (1995) and Eckholm &
Fridquist (2000) have extended object structure models for building spaces by including
organizations, user activities, and activity spaces. The purpose of these models is the
formalization, communication, and storage of data, not so much the simulation. Eckholm
(2001) introduced semi-dynamic user behavior by showing situations of user activities in
a CAD-program.
Dijkstra & Timmermans (2002) extended this notion into the dynamic domain by
introducing models of space-time behavior of persons. They used agent technology to
implement such models. Experiments have been conducted in the domain of pedestrians.
The results are of great value in urban planning and also for the design and evaluation of
transportation in general and of circulation areas in buildings. There is ongoing research
to extend this work to other user activities in business processes.
We have developed structural models of buildings as complete systems (Zimmermann
2003) that include users, user activities and activity spaces. These models provide the
foundation for the automatic creation of simulators and are explained in Sect. 3. This paper
demonstrates how this model can be extended to also include the dynamics of user activities.
We have shown that physical effects that are observed in buildings (like heat or air
flow) can be simulated by mapping simple physical objects into autonomous
computational objects that compute the required physical results at run-time
(Zimmermann 2002). We can use this technology to treat all physical simulation
problems by integrating suitable computational objects into the building system model.
The communication links between these objects can directly be derived from the static
building structure for the domains regarded so far. While the topology of the
physical objects directly reflects the communication relationships in such a case (e.g.,
connected walls will exchange heat), this is not true when regarding users and their
activities. The reason for that is that the topological relations change while users move
(e.g., when going from an office to a meeting room) or change their memberships with
certain groups (e.g., when changing the role from private person to employee).
Additionally, the flow of messages and the behavior of the objects strongly depends on
their state. For example, an occupied meeting room will force a user who is looking for a
place to hold a presentation to search for a vacant room.
Therefore, we need to augment the building models with that additional information.
Unfortunately, none of the notations that are commonly available to the software engineer
seems to be suitable for these purposes.
A vast number of modeling notations, the so-called scenario notations (Amyot et al.
2003), focus solely on the external communication between objects. Examples for such
scenario notations are Message Sequence Charts (MSCs, cf. Braek et al. (1993)), UML
Sequence Diagrams (Rumbaugh et al. 1999), or Use Case Maps (Buhr 1998). The
behavior described in these scenarios usually only represents a single run of the
modeled system, thus forcing the modeler to create lots of diagrams to achieve an overall
understanding of the system. The concepts for hierarchical decomposition or repetition of
partial scenarios as suggested in some of the notations (e.g., Hierarchical MSCs)
provide only little help in our context.
At the other extreme, there are modeling techniques that only allow for the description
of the internal, state-oriented view of objects. SDL's state flow diagrams (Braek et al.
1993) or UML's State Charts (Rumbaugh et al. 1999) are examples of such notations
that are commonly accepted in industry and academia.
Only UML Activity Diagrams (Rumbaugh et al. 1999) and variants thereof seem to
support the mixed specification of (external) messages and (internal) states. However, the
visual appearance and understandability by non-experts are far from ideal when the
number of states to be regarded increases.
It is the deficits of the above notations that made us conceive a new modeling
notation, which we call Message/Transition Charts or MTCs for short (cf. Sect. 4). This
paper will illustrate the notation's elegance for the purpose of modeling user activities
and dynamic behavior between distributed objects.
Our approach is based on sound software engineering techniques that are applied to solve
the problem of modeling building systems including user activities and to implement
appropriate dynamic simulation environments in reasonable time. The most important
techniques that are applied are structuring (or separation of concerns), iterative
refinement, reuse, and model as well as code generation.
Structuring is exploited throughout several dimensions. As a main structuring concept
we have partitioned the building system model according to the domains: building
structure, service systems, control systems, functional units, and user activities. All
elements that are described by the sub-models of these five domains are further classified
as being of type space or of type matter. As an example, in the building structure domain,
the volume of a wall is considered as being of type space, where the materials that make
up the wall (like bricks) are regarded as matter.
Figure 1 shows the top level view of our building system model. All elements in this
model are derived from the generic SystemObjectType. The Matter and SpaceTypes are
found as specializations of this generic element (the more general element is depicted by
the hollow arrowhead), as are the elements of the respective domains.
All SystemObjectTypes are related by Requirements. Requirements are typically of
such a form that an element in one domain requires a service that is fulfilled by one or
more elements in other domains. A basic structuring principle is that these requirements
should only relate elements of the SpaceType if possible. In this way the spaces in the
five domains together with the requirements, form the backbone of the overall building
system model.
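The space/matter classification and the requirement relations described above can be sketched in a few lines of Python. The class names follow the generic elements of the model (SystemObjectType, SpaceType, MatterType), but the example elements (DeskWorkSpace, OfficeSpace, LightingZone) are hypothetical illustrations, not part of the published model:

```python
from dataclasses import dataclass, field

@dataclass
class SystemObjectType:
    """Generic root element of the building system model."""
    name: str
    requirements: list = field(default_factory=list)  # links to providers

class SpaceType(SystemObjectType):
    """Volumes/areas; the preferred endpoints of requirement relations."""

class MatterType(SystemObjectType):
    """Physical matter (e.g., the bricks that realize a wall volume)."""

def requires(client: SystemObjectType, provider: SystemObjectType):
    """A requirement relation: 'client' needs a service from 'provider'.
    Following the backbone principle, it should relate SpaceType
    elements wherever possible."""
    client.requirements.append(provider)

# Hypothetical example: an activity space in the user activity domain
# requires a building space and a lighting service space.
desk_space = SpaceType("DeskWorkSpace")    # user activity domain
office_space = SpaceType("OfficeSpace")    # building structure domain
light_zone = SpaceType("LightingZone")     # service system domain
requires(desk_space, office_space)
requires(desk_space, light_zone)
```

The requirement links between spaces of different domains are exactly the "backbone" relations referred to in the text; matter elements could be attached later via a realizedBy relation without disturbing them.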

Figure 1. Building system model (system level).
Additional relations are depicted by the realizedBy arcs in Fig. 1, which bind together the
matter and space elements of the respective domain. In many cases, the matter is of
secondary importance and can be neglected in the early stages of modeling.
Besides the connection of space elements through requirements, spaces can also be
related to each other by topologic or geometric relationships. This is expressed by the
spatial relation between the SpaceTypes in Fig. 1. If we begin to refine the models of the
individual domains, the introduction of aggregation or composition relations (one
element is made up of other elements) or other relations between space or between matter
elements can become important.
When refining the top level building system model to create the individual domain
models, we use an approach of iteratively (step-by-step) refining these models following
certain levels of detail. Figure 2 shows these five levels. The system level corresponds to
the model in Fig. 1. From this level, the elements of the domain level are described first
and then these elements are refined to form the application domain level. The different
application domain models extend the level of detail of the domain models for different
application domains.
The elements of all models presented so far represent a classification of types of real-life
objects (e.g., the office building domain model contains an OfficeType element, of which
SmallOffice is an instance that in turn classifies an exemplary real-life object such as
Office-32419).
Therefore, these elements form a library of so-called meta-object-types that can be reused
when creating new models. When these meta-object-types are instantiated they form the
models at the project level, which consequently contains object-types (like the
SmallOffice). Rather than linking the project and the application domain level with the
generalization arrow, we use a simple line to depict this instantiation relationship. When
the project level models are transformed into simulators (see Sect. 5) and executed, the
runtime-objects (that are instances of the object-types) reflect the real-life objects at the
run-time level of the model hierarchy.
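The instantiation chain from meta-object-types (application domain level) through object-types (project level) down to run-time objects can be sketched as follows. The MetaObjectType/ObjectType classes and the floor_area attribute are our own illustrative assumptions; only OfficeType, SmallOffice and Office-32419 come from the text:

```python
class MetaObjectType:
    """An application-domain-level element (e.g., SecretaryType or
    OfficeType) that can be instantiated into project-level types."""
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = attributes      # names an object-type must bind

    def instantiate(self, project_name, **bindings):
        missing = set(self.attributes) - set(bindings)
        if missing:
            raise ValueError(f"unbound attributes: {missing}")
        return ObjectType(project_name, self, bindings)

class ObjectType:
    """A project-level type (e.g., SmallOffice); run-time objects are
    created from it when the generated simulator executes."""
    def __init__(self, name, meta, bindings):
        self.name, self.meta, self.bindings = name, meta, bindings

    def create_runtime_object(self, ident):
        # run-time level: the object reflecting a real-life counterpart
        return {"id": ident, "type": self.name, **self.bindings}

# OfficeType (application domain) -> SmallOffice (project) -> Office-32419
office_t = MetaObjectType("OfficeType", ["floor_area"])
small_office = office_t.instantiate("SmallOffice", floor_area=12.0)
room = small_office.create_runtime_object("Office-32419")
```

Note the two different relations in play: the project level is linked to the application domain level by instantiation (the plain line in the diagrams), not by the generalization arrow used between the upper levels.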

Figure 2. Model hierarchy.

Figure 3 shows the user activity domain model as a more detailed example of a model
at the domain level. The generalization relations to elements of the building system
model are depicted by angle brackets. The types of roles that can be taken on by
individuals (IndivRolT) are composed of different individual types of activities
(IndivActT), which themselves can consist of more fine-grained activities. Accordingly,
group role types are defined.
Refinements of this model (at the application domain level) are meaningful in
different domains like office buildings, factories, or homes. For the office building
domain model (see Fig. 2) we would derive elements such as ManagerType,
SecretaryType, and VisitorType from the element IndividualType. In home applications,
other refinements would apply. The reason for this seemingly complex structure of levels
and domains is as follows: We do not aim at creating a monolithic simulation
environment for building systems that integrates all possible alternatives of such systems
as well as their usages and that can only be personalized by setting a large number of
parameters. We rather aim at a systematic and efficient method for constructing
customized simulation environments for each application of such simulators. The above
structuring and reuse concepts provide the framework for such an approach to be feasible.

Figure 3. User activity domain model.

After having modeled the structure within the five domains with sufficient detail, we
have to define the behavior of the respective objects. As it has been noted, we concentrate
on user activities in this paper, which especially includes the specification of roles and
activities that make up these roles (see Fig. 3).
It is obvious that different users and roles are active concurrently. Therefore, a well
fitting computational model for describing this behavior is a set of communicating,
concurrent objects. A first step in specifying models of such concurrent objects is a rather
abstract and well-structured description, which depicts the communication between
objects and their change of states (state transitions) as triggered by the reception of
messages. From these high-level models, more detailed models can then be created that
represent the input to the simulator generation process as described in Sect. 5.
As it has been motivated in Sect. 2, we will use our notation of Message/Transition
Charts (MTCs) as a suitable diagramming technique for such an abstract description.
MTCs consist of a few basic building blocks that can be structured in a hierarchical
fashion. At the lowest level, states of objects are identified (like the states occupied and
vacant that we have introduced in Sect. 2). These are depicted by small rounded
rectangles. Messages (painted as thick arrows) can trigger a change of states, i.e. a state
transition (depicted by thin arrows), which can imply the creation of new messages that
are sent to other objects.

Figure 4. Example of a
Message/Transition Chart.
Object boundaries are drawn as large rounded rectangles. As we are modeling user
activities with objects, such boundaries can show the boundaries of activities as well.
Figure 4 presents an example with two such activities, each having two states and two
transitions.
This figure also shows the more advanced modeling constructs that are available in the
MTC notation. Small circles depict connection points that allow the usage of parts of the
diagrams in a hierarchical fashion (the usefulness of this feature will become obvious in
Sect. 6) or the connection of message flows (as shown within act2). When the transition
from state S1 to S2 is taken, a new message m3 is created. The text included in
parentheses following the message name specifies optional parameters of this message. In
the above figure the message m3 has the parameter a. Depending on the value of a, one
of the two transitions in act2 is triggered. In the case that a is one, the message m4 is sent.
An MTC thus presents a set of possible chains of messages in one single and easily
comprehensible diagram, which neatly reflects the behavior we want to model on an
abstract level. The editing of MTC diagrams as well as the hierarchical management of
parts of such diagrams is supported by a tool that we automatically created from a formal
tool specification using the Meta-CASE environment DOME (Engstrom et al. 2000).
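One possible reading of the MTC semantics of Fig. 4 is a small message-driven state machine: a transition is selected by the current state, the received message, and an optional guard on a message parameter, and may emit a new message. The sketch below is our own illustration, not generated code; the act2 states T1 to T3 are hypothetical names, since only S1, S2, m3, m4 and the parameter a are named in the text:

```python
class Activity:
    """A minimal MTC-like activity: states plus message-triggered
    transitions, each of which may create a new outgoing message."""
    def __init__(self, name, start):
        self.name, self.state = name, start
        self.transitions = {}  # (state, message) -> [(target, emit, guard)]

    def on(self, state, message, target, emit=None, guard=None):
        self.transitions.setdefault((state, message), []).append(
            (target, emit, guard))

    def receive(self, message, param=None):
        for target, emit, guard in self.transitions.get(
                (self.state, message), []):
            if guard is None or guard(param):
                self.state = target
                return emit           # message created by this transition
        return None

# act1: taking the S1 -> S2 transition creates the message m3(a).
act1 = Activity("act1", start="S1")
act1.on("S1", "go", "S2", emit="m3")

# act2: the value of the parameter a selects one of two transitions;
# a == 1 creates the message m4 (state names T1..T3 are assumptions).
act2 = Activity("act2", start="T1")
act2.on("T1", "m3", "T2", emit="m4", guard=lambda a: a == 1)
act2.on("T1", "m3", "T3", guard=lambda a: a != 1)

msg = act1.receive("go")           # act1 moves to S2 and emits m3
out = act2.receive(msg, param=1)   # a is 1, so m4 is created
```

Chaining receive calls like this reproduces, in executable form, the "possible chains of messages" that a single MTC diagram depicts.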
As it has been motivated above, our goal is the systematic and thus efficient construction
of customized simulation environments. One potent way of gaining efficiency is the
automation of repetitive or complicated tasks that can be described by simple strategies
(or algorithms). One such task is the transformation of parts of the building system model
into executable code. Like a programming language compiler automatically creates an
executable application from its source code, we will show that the same powerful
technique can be employed for generating building simulators from our building system models.

Figure 5. Simulator creation process.

The overall process for attaining simulators is depicted in Fig. 5.

The start of the creation process is the structural model of the building system, which
is augmented by the behavioral description through MTCs. An example for such models
will be shown in Sect. 6.
From the augmented building system model, tabular documents are created by the
simulator developers. These documents, which are part of our software development
method PROBAnD (Metzger et al. 2002), contain the formal specification of the
structure, the behavior (specified by state transitions) as well as the messages that are
exchanged. Many tasks during the mapping of the MTC models to these PROBAnD
documents are straightforward and could be automated easily. However, as we first
wanted to gain experience in applying the MTC notation we have so far refrained from
implementing such automation tools.
As Fig. 5 shows, the PROBAnD models are then used to automatically generate
models in the specification and design language SDL (Braek et al. 1993). We have shown
the feasibility and the technicalities of this approach in (Metzger et al. 2003) and refer the
interested reader to this publication.
Usually, the generated SDL models are complete and can directly be used for
generating executable simulators (Zimmermann 2001, 2002). If not, extensions or
modifications can be performed by the developers. We have used the commercial code
generator Telelogic Tau for automatically generating simulators from such SDL models
(Mahdavi et al. 2002).
SDL has been the language of our choice because it allows for the specification of
independent objects (called processes) and the description of the object behavior by
(extended) state transition diagrams, thus presenting a seamless progression from the state-oriented descriptions in the MTCs. Further, interactions between processes are modeled
as message exchanges.
The above generation process has been successfully applied to the building and control
domain, where we can rely on thorough experience for behavior specification without the
need of using MTC models as an intermediate step. As we have pointed out numerous
times, such a behavior specification is more complex for the user activity domain. Here,
we will demonstrate, by using a small and simple example, how MTCs can be used for this purpose.
Let us assume that we want to model the user activities within our university complex
(called UKL) as an instantiation of the application domain model for specifying office
buildings. A UKL secretary, who is of the type SecretaryT, takes on the role of a
UKLSecretary upon entering her office and starts with the activity desk work, which might
consider the special context of working at our university. During the work hours the
person in this role might interrupt the desk work to make copies, meet with the manager,
etc. In all cases the person has to move from one place to another.
Already this very simplified example creates many requirements that have to be
fulfilled by objects of other domains. For example the role of a secretary occupies an
office place, desk work requires a desk place, move uses circulation space, and so on.
These places require building spaces and services. Typical services are sufficient light
levels, which can be provided by natural or artificial sources under automatic or manual
control. Some of the requirements are more static in their nature; e.g., building space
requirements. But the fulfilment by actual spaces can change when a desk is moved,
causing a chain reaction in the resulting requirements. Other requirements like circulation
space requirements are of a dynamic nature in relation to an individual. Also, such spaces
are shared, but limited, resources.
This example shows the occurrence of many requirement chains that are interrelated
and can form complex graphs. To be able to model the fulfillment of all requirements and
simulate a possible solution, we have to model the structural relations of all involved
objects and the behavior of the resulting system.
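The chain reactions among requirements described above can be viewed as reachability in a requirement graph. The following sketch, with a hypothetical and much simplified graph, computes which elements must be re-checked when the fulfilment of one space changes (all element names besides those mentioned in the text are illustrative):

```python
# Requirement graph: element -> elements whose services it requires.
requirements = {
    "UKLSecretaryRole": ["OfficePlace"],
    "DeskWork": ["DeskPlace"],
    "DeskPlace": ["RoomSpace"],
    "OfficePlace": ["RoomSpace"],
    "Move": ["CirculationSpace"],
}

def affected_by(changed, graph):
    """All elements whose requirement fulfilment must be re-checked
    when 'changed' is re-assigned (reverse reachability in the graph)."""
    reverse = {}
    for client, providers in graph.items():
        for p in providers:
            reverse.setdefault(p, []).append(client)
    seen, stack = set(), [changed]
    while stack:
        node = stack.pop()
        for client in reverse.get(node, []):
            if client not in seen:
                seen.add(client)
                stack.append(client)
    return seen

# A change to the room space (e.g., the desk is moved within it)
# re-opens the place requirements and, transitively, the activities
# and roles that occupy those places.
hit = affected_by("RoomSpace", requirements)
```

Even this toy graph shows how a single local change fans out along the requirement chains, which is why the fulfilment of interrelated requirements is better explored by simulation than by inspection.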

Figure 6. Excerpt of the UKL user activity project model.
The structure is modeled at the project level by object-types that are instantiations of the
meta-object-types from the application domain level; e.g., the UKLSecretary is of the type
SecretaryT (see Sect. 3). As a modeling notation we use UML Class Diagrams
(Rumbaugh et al. 1999). These diagrams have already been used for the introduction of
the building system models.
Figure 6 shows a small portion of the structural parts of the UKL user activity project
model as a further example. In contrast to the example of Fig. 3, this figure uses the colon
: to depict the fact that the element to the left of the colon has been created by
instantiating the element to the right of the colon; e.g., the object-type UKLSecretary is
an instance of the SecretaryType of the application domain model.
Each UKLSecretaryRole aggregates one instance of the individual activities of
UKLDeskWork, UKLMakeCopy, UKLMove and UKLMeet, which have to be active in
mutual exclusion. The most abstract view of this behavior of the UKL secretary role can
be specified with the MTC as it is shown in Fig. 7.
In this diagram we have reduced the number of arcs by using bidirectional arrows for
the messages if applicable (the > and < symbols show the direction of the labels).
Also, for brevity, parameters have been omitted.
At the top of the diagram, the hierarchical activity UKLSecretaryRoleCtrl represents
an object that controls the overall behavior of the UKLSecretaryRole (the folded corner is
a visual cue that a refinement of this activity exists). All other nodes within
UKLSecretaryRole represent instances of activity types, whose behavior is defined
elsewhere. It should be noted that the instance mov1 only exists once and is shown in
three shared copies to simplify the layout.
Outside of UKLSecretaryRole the person UKLSecretary is shown, which takes on the
role of secretary upon arriving at work. A first activity within this role is the secretary's
move to her desk to begin the desk work. This desk work can be interrupted (triggered by
the actionCtrl message) and either a meeting or a copy job can be performed. In each
case, the secretary has to move to the respective places.

Figure 7. Message/Transition Chart for the UKL secretary role.

eWork and eBusisness in architecture, engineering and construction


This diagram also shows the interface to objects within the functional unit domain
(connection points on the right hand side of Fig. 7). As an example, the desk work
activity needs to get a desk or the copy activity needs to enter the copy place.
Because of its abstract nature, the MTC in Fig. 7 allows for different alternatives of
the dynamic behavior and the control of the different activities. The simulation
environment could very strictly control all activities by sending simulatorCtrl messages
with exact timing and functionality requests. Typically, parameters of messages from
outside of the simulated domains would be provided by files. Therefore, the results of
such experiments would present a repeatable outcome. In contrast to that, simulation
control could be very loose by giving the objects autonomous control, similar to the
concept of independent agents. A non-repeatable behavior would result, which could be
analyzed with statistical methods. Finally, a non-deterministic behavior could be achieved
by using random generators that influence the objects' behavior. In such a case, the
simulation environment could be used to control the stochastic parameters. We believe
that this large range of behavioral alternatives can be employed to easily realize a variety
of different experiments.
There are different options for the further use of this abstract MTC: First of all it can
be employed to define all external and internal message interfaces of UKLSecretaryRole
for the subsequent stages of the software generation process. Second, it can be extended
by modeling the other domains at the same level of detail (connecting the MTC with
MTCs of the other domains). Third, it can be refined by completing all message relations
and by precisely modeling UKLSecretaryRoleCtrl in detail. Figure 8 shows one such
possible refinement.

Figure 8. Refinement of UKLSecretaryRoleCtrl.
Before the role has been taken on by the secretary, it is in the state undefined. As soon
as the response message (from the desk work activity in Fig. 7) arrives, the state of the
role changes. If the secretary has just taken on the role, she starts working. If she has just
been away for a meeting or a copy job, she resumes her work. Upon the first transition
from undefined to working, the occupOffPlace message is sent to the respective object in
the functional unit domain such that the occupancy of the person is noted (the message
parameter is true).
Whenever the simulation control environment triggers a new action, the request for
this action is propagated to the desk work activity. If the simulatorCtrl message requests
quitting the role (because the working time might have ended), the state of the role
changes to undefined and the functional unit is notified that the office place is no longer
occupied by the secretary.
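The behavior of UKLSecretaryRoleCtrl described for Fig. 8 can be approximated by the following sketch. It is a hand-written illustration of the state/transition logic, not output of the described generation process; returning outgoing messages as (name, parameter) tuples is our own simplification of the message exchange with the desk work activity and the functional unit domain:

```python
class UKLSecretaryRoleCtrl:
    """Sketch of the controller refined in Fig. 8. Outgoing messages
    are returned as (name, parameter) tuples; in the generated
    simulator they would be sent to the respective objects."""
    def __init__(self):
        self.state = "undefined"

    def response(self):
        """Arrival of the response message from the desk work activity."""
        out = []
        if self.state == "undefined":
            # first transition from undefined to working:
            # note the occupancy of the office place
            out.append(("occupOffPlace", True))
        self.state = "working"        # start or resume the desk work
        return out

    def simulatorCtrl(self, action):
        """A new action triggered by the simulation control environment."""
        if action == "quit":          # e.g., the working time has ended
            self.state = "undefined"
            return [("occupOffPlace", False)]   # office place released
        return [("actionCtrl", action)]         # propagate to desk work

ctrl = UKLSecretaryRoleCtrl()
first = ctrl.response()               # role taken on, office occupied
mid = ctrl.simulatorCtrl("makeCopy")  # interrupt desk work for a copy job
last = ctrl.simulatorCtrl("quit")     # role quit, office place released
```

Such a plain state/transition object is exactly the kind of leaf element that sits at the bottom of the composition hierarchy and maps directly onto an SDL process in the generation chain.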
The hierarchical decomposition that we have illustrated above can be used at as many
levels as seem to be suitable, and therefore allows us to handle very complex systems. At
the bottom of this composition hierarchy, simple objects (or activities) reside that are
solely specified by states and state transitions with the appropriate actions (the
UKLSecretaryRoleCtrl has been an example for that).
The secretary MTC can easily be reused for creating models for other roles like a
manager role. Such a role could make use of the same or other elements as needed. To
support such kinds of reuse, we maintain an MTC library, which is extended with every
new project. Besides these abstract behavior descriptions, the library also contains the
refinements of these descriptions in the form of PROBAnD models to speed up the
simulator generation process.
This paper has shown the feasibility of efficiently creating customized building
simulators for various application domains. The examples that were presented as a
motivation for creating such simulators might seem obvious. Nevertheless, we hope that
once architects realize the potential of such a custom-specific tool generation, they will
come up with more interesting concepts for performing experiments and evaluations of
buildings before these are erected.
We believe that this area of building simulation is a very promising field for both
building and software architects to work together productively. Our vision is that
architects will be able to create the abstract behavioral and structural models (i.e., the
project level models) from which the software architects (software engineers) can take
over and refine these models into running simulators. We hope that the small examples of
our Message/Transition Chart notation have conveyed the visual appeal and ease of
understanding of modeling at this level and will provide a basis for further discussion in
the field of modeling user activities and processes.
Amyot, D. & Eberlein, A. 2003. An Evaluation of Scenario Notations and Construction
Approaches for Telecommunication Systems Development. Telecommunication Systems. 24(1),
(2003): 61–94.
Braek, R. & Haugen, O. 1993. Engineering Real-Time Systems. An Object-oriented Methodology
Using SDL. New York, London: Prentice-Hall.
Buhr, R.J.A. 1998. Use Case Maps as Architectural Entities for Complex Systems. IEEE
Transactions on Software Engineering. Special Issue on Scenario Management. 24(12), (1998):
Dijkstra, J. & Timmermans, H. 2002. Towards a multi-agent model for visualizing simulated user
behavior to support the assessment of design performance. Automation in Construction.
11(2002): 135–145.

Eastman, C.M. & Siabiris, A. 1995. A generic building model incorporating building type
information. Automation in Construction. 3(1995): 283–304.
Eckholm, A. & Fridquist, S. 2000. A concept of space for building classification, product
modelling, and design. Automation in Construction. 9(2000): 315–328.
Eckholm, A. 2001. Activity Objects in CAD-programs for building design. Computer Aided Design
Futures, Eindhoven, Netherlands, 2001.
Engstrom, E. & Krueger, J. 2000. Building and Rapidly Evolving Domain-Specific Tools with
DOME. IEEE International Symposium on Computer-Aided Control Systems Design.
Anchorage, Alaska. (2000): 65–70.
Metzger, A. & Queins, S. 2002. Specifying Building Automation Systems with PROBAnD, a
Method Based on Prototyping, Reuse, and Object-orientation. OMERObject-Oriented
Modeling of Embedded Real-Time Systems. GI-Edition, Lecture Notes in Informatics (LNI), P-5. Bonn: Köllen Verlag (2002): 135–140.
Mahdavi, A., Metzger, A. & Zimmermann, G. 2002. Towards a Virtual Laboratory for Building
Performance and Control. Cybernetics and Systems 2002. Vol. 1. Vienna: Austrian Society for
Cybernetic Studies. (2002): 281–286.
Metzger, A. & Queins, S. (2003) Model-Based Generation of SDL Specifications for the Early
Prototyping of Reactive Systems. Telecommunications and beyond: The Broader Applicability
of SDL and MSC. Springer Lecture Notes in Computer Science, LNCS 2599. Heidelberg:
Springer-Verlag. (2003): 158–169.
Rumbaugh, J., Jacobson, I. & Booch, G. 1999. The Unified Modeling Language Reference
Manual. Reading, Harlow, Menlo Park: Addison-Wesley.
Zimmermann, G. 2001. A new approach to building simulation based on communicating objects.
Seventh International IBPSA Conference Proceedings. Vol. 2. Rio de Janeiro, Brazil (2001):
Zimmermann, G. 2002. Efficient creation of building performance simulators using automatic code
generation. Energy and Buildings. 34. (2002): 973–983.
Zimmermann, G. 2003. Modeling the building as a system. Eighth International IBPSA Conference
Proceedings. Eindhoven, Netherlands. (2003): 1483–1490.

Process modelling technology

eWork and eBusiness in Architecture, Engineering and Construction, Dikbaş & Scherer (eds.)
© 2004 Taylor & Francis Group, London, ISBN 04 1535 938 4

Embedded commissioning for building design

Ö.Akin, M.T.Turkaslan-Bulbul & I.Gursel
School of Architecture, Carnegie Mellon University, Pittsburgh, USA
J.H.Garrett Jr, B.Akinci & H.Wang
Department of Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
ABSTRACT: Building commissioning has a broad scope that extends to
all phases of building delivery. We view commissioning as a building
delivery embedded process that persistently verifies and validates design
intent throughout the building lifecycle. In the building lifecycle
approach, buildings are considered to have cradle-to-grave life spans and
are modeled through a variety of developmental phases. In this research
project, we intend to build the necessary theory and tools to support the
embedded commissioning process as a co-function of building delivery.

1 INTRODUCTION

Building commissioning is an important new area of practice and research in the industry.
It has emerged, during the last 25 years, as the central phase of building delivery that is
responsible for verifying design intent. Currently, it is rapidly becoming the performance
verification tool in HVAC design and LEED (Leadership in Energy and Environmental
Design) certification in the USA.
Building commissioning is a multi-phase process that ensures the interacting systems
in a building are properly installed and operating. In the early phases of facility design,
commissioning is concerned with whether the program and the design deliver the
owner's desired functionality. During the construction process, commissioning is
concerned with ensuring that the performance of the selected building equipment agrees
with the design specifications and delivers the intended functionality. The process of
building commissioning tends to generate large amounts of data, much of which needs to
be shared across other facility delivery phases.
We view commissioning as a building delivery embedded process that persistently
verifies and validates design intent throughout the building lifecycle. The Embedded
Commissioning Model (ECM), which is described in this paper, combines the processes
of commissioning and the building lifecycle in order to provide a framework for managing
the information exchange between them. Here, the role of commissioning is to
complement each of the lifecycle phases and their interactions through timely building
system evaluation.
The primary objective of our study is to investigate the computability of Embedded
Commissioning (EC) for HVAC systems. Our approach focuses on exploring the
representational needs of the EC process and the management of EC data. Here, we
concentrate on three questions: how does the EC process work? What kind of information
is produced? And what types of attributes can be defined? The output of this study is
used to develop proof-of-concept prototype software that supports the decision-making
process in EC.
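The paper does not specify a data schema for EC data, but the idea of persistently checking measured performance against design intent can be illustrated with a minimal sketch. Everything here is hypothetical: the `ECRecord` structure, the `Phase` enumeration, the equipment tag `AHU-1`, and the tolerance-based verification rule are illustrative assumptions, not part of the ECM described in the paper.

```python
from dataclasses import dataclass
from enum import Enum


class Phase(Enum):
    """Lifecycle phases in which a commissioning observation can occur (illustrative)."""
    DESIGN = "design"
    CONSTRUCTION = "construction"
    OPERATION = "operation"


@dataclass
class ECRecord:
    """One hypothetical embedded-commissioning observation tied to a lifecycle phase."""
    phase: Phase
    system: str            # equipment tag, e.g. "AHU-1" (made-up identifier)
    attribute: str         # measured quantity, e.g. "supply air temperature (degC)"
    design_intent: float   # target value taken from the design documents
    measured: float        # value observed during commissioning
    tolerance: float = 0.5 # acceptable absolute deviation (assumed rule)

    def verified(self) -> bool:
        """Design intent counts as verified when the measurement is within tolerance."""
        return abs(self.measured - self.design_intent) <= self.tolerance


rec = ECRecord(Phase.CONSTRUCTION, "AHU-1",
               "supply air temperature (degC)", 13.0, 13.3)
print(rec.verified())  # True: deviation of 0.3 is within the 0.5 tolerance
```

A record of this kind could accompany a building system through each delivery phase, so that verification results produced during construction remain available during operation.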
2.1 History
The term commissioning originated in naval practice. The commissioning
ceremony is a sign that a ship has been accepted as an operating unit of the navy. By breaking
the commissioning pennant, the ship is placed in the charge of the commanding
officer, who together with the ship's crew has the task of making and keeping her ready
for any service required in peace or war. Prior to commissioning, the newly launched
vessel must pass a series of tests before she is considered complete and ready to be authorized
as a commissioned ship. The new ship goes through several sea trials during which
deficiencies that need correction are uncovered. The crew and the ship must function in
total harmony for maximum effectiveness and efficiency (Reilly 1975).
The association between ships and buildings is not new, but commissioning was
introduced into the building industry only in 1977, when Public Works Canada became the first
organization to use commissioning in project delivery. Then, in 1981, Disney
Inc. instituted a comprehensive commissioning program for the design, construction and
start-up of its Epcot theme park.
In the United States of America, formal work on the commissioning process began in
1984, when the American Society of Heating, Refrigerating and Air-Conditioning
Engineers (ASHRAE) formed the Commissioning Guideline Committee. The task of the
committee was to define a process that ensures fully functioning buildings are
turned over to building owners. The motivation for the ASHRAE Commissioning
Committee was the growing number of complaints about unmanageable HVAC systems,
increasing operating expenses, decreasing comfort levels, and operations and
maintenance staff who did not understand how to maintain or operate new buildings.
After its foundation, the ASHRAE commissioning committee published two guidelines:
the original guideline was announced in 1989 and an updated version was published
in 1996 (Guideline 1-1996).
After the announcement of the ASHRAE Commissioning Guidelines, commissioning
practice started to draw attention from various quarters. The University of Wisconsin–Madison
offered commissioning courses, and the University of Michigan established a facilities
evaluation and commissioning group. In 1993 the first National Conference on Building
Commissioning (NCBC) was held, and the National Environmental Balancing Bureau
(NEBB) developed a certification program for commissioning providers. After 1993 a range
of governmental and private organizations started commissioning practices and issued
regulations or guidelines. In 1998 the US Green Building Council added commissioning to
the Leadership in Energy and Environmental Design (LEED) criteria. Finally, in 1999 the
Building Commissioning Association (BCA) was established.

2.2 Definition
ASHRAE defines commissioning as "the process of ensuring that systems are designed,
installed, functionally tested and capable of being operated and maintained to perform in
conformity with the design intent" (Guideline 1-1996). Commissioning is a systematic
approach: it starts with the programming phase and ends when the building is turned over
to the owner. Most commissioning companies also provide a one- or two-year guarantee
phase after the building is occupied. During the commissioning period the aim is to
ensure and verify, with documentation, that all building systems perform in the way
they were intended and that the operating and maintenance staff are trained according to the
owner's operational needs.
Commissioning is occasionally confused with the testing, adjusting and balancing
(TAB) process or the punch-list inspection process. The latter is a physical examination
done before a building is turned over to its owner. It is a one-day process at the end of