
Introduction to Design Research:
A Methodological Background for Scientific Work

Elena Paslaru Bontas
Semantic Web PhD Network Berlin Brandenburg
30.09.2005
Outline
• Motivation
• Types of research
• Design Research Basics
• Evaluation in Design Research
• Conclusion
Motivation
• Motivation for research:
  • pure research: enhance understanding of phenomena
  • instrumentalist research: a problem needs a solution
  • applied research: a solution needs application fields
• Motivation for research methodology:
  • (qualitatively) control the research process
  • validate research results
  • compare research approaches
  • respect the rules of good scientific practice
Research: A Definition
• Research: an activity that contributes to the understanding of a phenomenon [Kuhn, 1962; Lakatos, 1978]
  • phenomenon: a set of behaviors of some entity(ies) that is found interesting by a research community
  • understanding: knowledge that allows prediction of the behavior of some aspect of the phenomenon
• the activities considered appropriate to the production of understanding (knowledge) are the research methods and techniques of a research community
• paradigmatic vs. multi-paradigmatic communities (agreement on phenomena of interest and research methods)
Scientific Disciplines
• Types of research [Simon, 1996]:
  • natural sciences: phenomena occurring in the world (nature or society)
  • design sciences ~ sciences of the artificial:
    • all or part of the phenomena may be created artificially
    • study artificial objects or phenomena designed to meet certain goals
  • social sciences: structural-level processes of a social system and their impact on social processes and social organization
  • behavioural sciences: the decision processes and communication strategies within and between organisms in a social system
[Diagram relating phenomena and activities, locating the Semantic Web (CS) among the design sciences. [Owen, 1997]]
Design research basics
• Process model
• Artifact types: the result of the research work
• Artifact structure: the content of the research approach
• Evaluation:
  • evaluation criteria
  • evaluation approach
Process model
• a problem-solving paradigm: seeks to create innovations that define the ideas, practices, technical capabilities, and products through which the analysis, design, implementation, and use of information systems can be effectively and efficiently accomplished [Tsichritzis, 1997; Denning, 1997]
Design research process
[Diagram after Takeda, 1990: the design cycle]
• process steps: awareness of problem → suggestion → development → evaluation → conclusion
• knowledge flows: circumscription + operation and goal knowledge
• logical formalism: abduction, deduction
(a sketch of this cycle as a loop follows below)
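To make the cycle concrete, the following is a minimal Python sketch of the five steps as an iterative loop. It is an illustration of mine, not Takeda's formalization: the propose, build, and assess callables are hypothetical placeholders, and feeding the evaluation findings back into the problem stands in for circumscription.

```python
# A minimal sketch of the Takeda-style design cycle as a loop. The callables
# propose/build/assess are hypothetical placeholders, not a prescribed API.

def design_cycle(problem, propose, build, assess, max_iterations=3):
    """Iterate the five steps until the artifact passes evaluation."""
    knowledge = []                                # operation and goal knowledge
    for _ in range(max_iterations):
        tentative = propose(problem, knowledge)   # suggestion
        artifact = build(tentative)               # development
        passed, findings = assess(artifact)       # evaluation
        knowledge.append(findings)                # circumscription: record what was learned
        if passed:
            return artifact, knowledge            # conclusion
        problem = findings                        # renewed awareness of the problem
    return None, knowledge
```

The loop structure makes one point of the diagram explicit: the conclusion step is not necessarily terminal, since evaluation findings re-enter the cycle as a refined problem statement.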
Artifacts
• are not exempt from natural laws or behavioral theories
• rely on existing "kernel theories" that are applied, tested, modified, and extended through the experience, creativity, intuition, and problem-solving capabilities of the researcher [Walls et al., 1992; Markus et al., 2002]
Design research outputs [March & Smith, 1995]
• Constructs: the conceptual vocabulary of a problem/solution domain
• Methods: algorithms and practices to perform a specific task
• Models: a set of propositions or statements expressing relationships among constructs; abstractions and representations
• Instantiations: the realization of constructs, models, and methods in a working system; implemented and prototype systems
• Better theories: emerging from artifact construction
Design research outputs
[Diagram: instantiations (the artifact as situated implementation), methods (knowledge as operational principles), models, and constructs, leading up to better theories (emergent theory about the embedded phenomena), each level reached from the one below through abstraction. [Purao, 2002]]
Examples
• Open up a new area
• Provide a unifying framework
• Resolve a long-standing question
• Thoroughly explore an area
• Contradict existing knowledge
• Experimentally validate a theory
• Produce an ambitious system
• Provide empirical data
• Derive superior algorithms
• Develop new methodology
• Develop a new tool
• Produce a negative result
Artifact structure
• Structure of the artifact:
  • the information space the artifact spans
  • the basis for deducing all required information about the artifact
  • determines the configurational characteristics necessary to enable the evaluation of the artifact
Evaluation criteria
• the dimensions of the information space which are relevant for determining the utility of the artifact
• can differ depending on the purpose of the evaluation
Evaluation approach
• the procedure for practically testing an artifact
• defines all roles concerned with the assessment and the way of handling the evaluation
• the result is a decision whether or not the artifact meets the evaluation criteria, based on the available information
Evaluation approach (2)
• Quantitative evaluation:
  • originally developed in the natural sciences to study natural phenomena
  • approaches:
    • survey methods
    • laboratory experiments
    • formal methods (e.g. econometrics)
    • numerical methods (e.g. mathematical modeling)
Evaluation approach (3)
• Qualitative evaluation:
  • developed in the social sciences to enable researchers to study social and cultural phenomena
  • approaches:
    • action research
    • case study research
    • ethnography
    • grounded theory
  • qualitative data sources:
    • observation and participant observation (fieldwork)
    • interviews and questionnaires
    • documents and texts
    • the researcher’s impressions and reactions
Constructs
• Structure: meta-model of the vocabulary
• Evaluation criteria: construct deficit, construct overload, construct redundancy, construct excess
• Evaluation approach: ontological analysis (sketched below)
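To make the four criteria concrete, here is a minimal Python sketch that reads them in the usual Wand & Weber sense (an assumption, since the slides do not spell the definitions out) as set comparisons between domain concepts and the constructs representing them; the function and its example inputs are hypothetical.

```python
# A minimal sketch of ontological analysis as set comparisons, assuming the
# Wand & Weber-style readings of the four criteria. The construct-to-concept
# mapping is a hypothetical input, not a real vocabulary.

def ontological_analysis(domain_concepts, construct_map):
    """construct_map: construct name -> set of domain concepts it represents."""
    covered = set().union(*construct_map.values()) if construct_map else set()
    # deficit: domain concepts no construct represents
    deficit = domain_concepts - covered
    # excess: constructs representing no domain concept
    excess = {c for c, m in construct_map.items() if not (m & domain_concepts)}
    # overload: one construct represents several concepts
    overload = {c for c, m in construct_map.items() if len(m) > 1}
    # redundancy: one concept represented by several constructs
    redundancy = {x for x in domain_concepts
                  if sum(x in m for m in construct_map.values()) > 1}
    return deficit, excess, overload, redundancy

# Example: "Agent" is overloaded, "Deprecated" is excess, "Event" is a deficit.
print(ontological_analysis(
    {"Person", "Document", "Event"},
    {"Agent": {"Person", "Document"}, "Deprecated": {"Legacy"}}))
```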
Methods
• Structure: process-based meta-model; intended applications; conditions of applicability; products and results of the method application; reference to constructs
• Evaluation criteria: appropriateness, completeness, consistency
• Evaluation approach: laboratory research, field inquiries, surveys, case studies, action research, practice descriptions, interpretative research
Models
• Structure: domain; scope, purpose; syntax and semantics; terminology; intended application
• Evaluation criteria: correctness, completeness, clarity, flexibility, simplicity, applicability, implementability
• Evaluation approach: syntactical validation (sketched below), integrity checking, sampling using selective matching of data to actual external phenomena or a trusted surrogate, integration tests, risk and cost analysis, user surveys
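For a Semantic Web model, the first two approaches might look like the following Python sketch: parsing performs the syntactical validation, and a single illustrative rule stands in for integrity checking. The use of rdflib and the toy Turtle model are my assumptions, not something the slides prescribe.

```python
# A minimal sketch of syntactical validation plus one integrity check for a
# Semantic Web model, using rdflib; the toy class hierarchy is hypothetical.

from rdflib import Graph, RDF, RDFS

MODEL = """
@prefix ex: <http://example.org/> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
ex:Person a rdfs:Class .
ex:Reviewer rdfs:subClassOf ex:Person .
ex:Author rdfs:subClassOf ex:Contributor .
"""

graph = Graph()
graph.parse(data=MODEL, format="turtle")   # syntactical validation: raises on malformed Turtle

# Integrity rule: every superclass used in the model must itself be declared.
declared = set(graph.subjects(RDF.type, RDFS.Class))
for subclass, superclass in graph.subject_objects(RDFS.subClassOf):
    if superclass not in declared:
        print(f"integrity violation: {superclass} is used but never declared")
```

Run against the toy model, the check flags ex:Contributor, which is used as a superclass but never declared.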
Instantiations
• Structure: executable implementation in a programming language; reference to a design model; reference to a requirement specification; reference to the documentation; reference to quality management documents; reference to configuration management documents; reference to project management documents
• Evaluation criteria: functionality, usability, reliability, performance, supportability
• Evaluation approach: code inspection, testing (sketched below), code analysis, verification
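As a concrete instance of the testing approach, here is a minimal Python sketch that encodes a functionality criterion as executable unit tests; the artifact under test, a toy term normalizer, is a hypothetical stand-in for a real prototype system.

```python
# A minimal sketch of "testing" as an evaluation approach: the functionality
# criterion becomes executable unit tests against a hypothetical toy artifact.

import unittest

def normalize_term(term: str) -> str:
    """Toy artifact: canonicalize a vocabulary term."""
    return " ".join(term.lower().split())

class FunctionalityCriterion(unittest.TestCase):
    def test_case_and_whitespace_folding(self):
        self.assertEqual(normalize_term("  Design   Research "), "design research")

    def test_idempotence(self):
        once = normalize_term(" Semantic  Web ")
        self.assertEqual(normalize_term(once), once)

if __name__ == "__main__":
    unittest.main()
```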
Conclusion
• Good research results require a careful design of the research methodology and considerable evaluation efforts.
References
• "DFG Rules of Good Scientific Practice," available at www.dfg.de, last seen September 2005
• Tsichritzis, D. "The Dynamics of Innovation," in Beyond Calculation: The Next Fifty Years of Computing, Copernicus, 1997, pp. 259-265
• Denning, P.J. "A New Social Contract for Research," Communications of the ACM (40:2), February 1997, pp. 132-134
• Simon, H.A. The Sciences of the Artificial, 3rd Edition, MIT Press, Cambridge, MA, 1996
• Markus, M.L., Majchrzak, A., and Gasser, L. "A Design Theory for Systems that Support Emergent Knowledge Processes," MIS Quarterly (26:3), September 2002, pp. 179-212
• Walls, J.G., Widmeyer, G.R., and El Sawy, O.A. "Building an Information System Design Theory for Vigilant EIS," Information Systems Research (3:1), March 1992, pp. 36-59
• Kuhn, T.S. The Structure of Scientific Revolutions, 3rd Edition, University of Chicago Press, 1996
• March, S.T. and Smith, G. "Design and Natural Science Research on Information Technology," Decision Support Systems (15:4), December 1995, pp. 251-266
• Lakatos, I. The Methodology of Scientific Research Programmes, John Worrall and Gregory Currie, Eds., Cambridge University Press, Cambridge, 1978
• Wikipedia, available at www.wikipedia.org, last seen September 2005
• Purao, S. "Design Research in the Technology of Information Systems: Truth or Dare," GSU Department of CIS Working Paper, Atlanta, 2002
Thank you for your attention

Good luck with your PhD

paslaru@inf.fu-berlin.de
