


Volume 9 Number 6

Greater Delphi

Safety .NET

Security in the .NET environment is vintage
Microsoft: infinitely versatile and equally
complex. So begins Bill Todd's in-depth
exploration of .NET security, in which he
demonstrates how to ensure that only trusted
code can run in your organization, that the code is
exactly what you wrote, and much more.

Reviewed: Easy MAPI, Team Coherence, and VssConneXion

Delphi Informant


The Complete Monthly Guide to Delphi Development

June 2003 vol. 9, no. 6

Safety .NET
An Introduction to .NET Security
IntraWeb Fundamentals:
3 Basic Objects

Page 18

Extreme Testing - Delphi Style

Page 24

The .NET Compiler Does XML Docs

On the Net: The Myth of the Upgrade

IntraWeb Fundamentals
Noel Rice continues his exploration of Delphi
7's RAD Internet app creation tool, IntraWeb, by
focusing on three key objects that work together to
create complete Web solutions: ServerController,
WebApplication, and UserSession.

Page 30

Page 48

$5.99 US

$8.99 CAN

Cover Art by Arthur A. Dugoni Jr.

In Development


Extreme Testing

If you haven't heard about Extreme Programming...
well, let's just say you need to get out more.
But even if you haven't, Ralph Krause shows you
how to get started now with DUnit testing. Example
projects are included, natch!

.NET Tech


Using the .NET Compiler: Part V

Alexei Fedorov and Natalia Elmanova continue
their comprehensive look at the Delphi for .NET
Compiler Preview this month by demonstrating
how to manipulate XML docs, including reading,
writing, and querying XML documents, applying XSL
transformations, and more.

30 VssConneXion

Product Review by Andrew Ghinaudo


Team Coherence 7.1


Easy MAPI 2

Product Review by Clay Shannon

Product Review by Matthew Hess


2 Toolbox
39 File | New by Alan C. Moore, Ph.D.

ModelMaker Tools Releases ModelMaker 7.05

ModelMaker Tools announced ModelMaker 7.05, the latest upgrade to their visual modeling and refactoring tool based on UML technology. New features include: customizable views (create and save the layouts you want); IntelliReplace (automatic renaming of certain identifiers in code); new built-in refactorings, which take advantage of IntelliReplace; auto display of unit procedures in diagrams and custom compartments in all relevant symbols; MMToolsAPI extensions and modifications that enable improved Bold integration in combination with Datator's new Bold integration suite (www.datator.com); records can now be modeled just like classes and interfaces; and many other productivity features. Visit the ModelMaker Web site for a detailed overview of all the new features.

ModelMaker Tools
Price: Single-user license, US$299; 10-user license, US$1,095; site license, US$2,195. See Web site for complete upgrade options.
Contact: info@modelmakertools.com
Web Site: www.modelmakertools.com

IntraWeb 5.1 Released

Atozed Software announced the availability of IntraWeb 5.1. IntraWeb is a development environment that allows building large-scale Web applications ("weblications") using the full power of Borland's RAD tools (Delphi, JBuilder, C++Builder) without sacrificing the ease of use of these tools. IntraWeb is also available for C++ and Java, and will soon be available for Visual Studio .NET.
With version 5.1, IntraWeb introduces many new features, such as PDA support, mobile phone support, an improved WYSIWYG HTML editor, more than 40 new components, extensive new documentation, improved performance, better RAD integration, extended browser support, new demos, new session tracking options, and more configuration options. Visit the Atozed Web site for a complete list of new features.

Create Context-sensitive Help with RoboLinker

eHelp Corp. announced the availability of RoboLinker, a tool for creating context-sensitive Help without any coding or programming knowledge.
Users can add a customized Help button to any specific window or dialog box. The button can be set to one or any combination of the following functions: open a Help topic, open a Web page, or play an animated RoboDemo Flash simulation. Help topic links can connect to RoboHelp Help systems or any Help system in WebHelp, WinHelp, or Microsoft HTML Help formats.
RoboLinker expedites the Help-authoring process and reduces the potential for errors. And, the Help created with RoboLinker can be saved in a self-installing executable. When launched, this file installs the Help into the target application on the recipient's computer.
Use RoboLinker to create additional customized, context-sensitive Help for applications without accessing the application's source code. The extra context-sensitive Help can enhance applications used in the workplace, from standard tools like Microsoft Outlook to specialized proprietary systems.

eHelp Corp.
Price: US$199; US$99 for current eHelp customers. Multi-user licenses available.
Contact: robohelpinfo@ehelp.com
Web Site: www.ehelp.com

Atozed Software
Price: Visit the Web site for complete details.
Contact: info@atozedsoftware.com
Web Site: www.atozedsoftware.com

Etv Software Announces

Etv Library 3.5
Etv Software announced Etv Library 3.5,
a database component pack for developing applications in Delphi.
Updates to this version include: Delphi 6 and Delphi 7 compatibility; print
wordwrap; TEtvCloneRecord.Caption
and EtvCloneRecord.Subdatasets.Caption
components; new functions in the
EtvDBFun unit; and various bug fixes.
Etv Library 3.5 contains lookup
components, an enhanced data-aware
grid, an end user-oriented query and
filter builder, a component for changing dataset sorting criteria, a component for record searching, popup
menus for different components for
design and run time, the capability
to edit dataset data at design time,
record cloning, record-by-record
search/replacing for DBRichEdit, and
other components and functions.
For complete details visit the Etv Software Web site.
Etv Software
Price: Single license, US$129 (US$99 without
source); site license, US$389. Separate component packs also available; see Web site for details
and pricing.
Contact: info@etvsoft.com
Web Site: www.etvsoft.com


/n software Announces
Public Beta of IP*Works!
EDI for Microsoft .NET
/n software announced the first
public beta of IP*Works! EDI for
Microsoft .NET. IP*Works! EDI is
new to /n software's IP*Works!
product family of components for
Internet development. IP*Works!
EDI includes software components
that facilitate the transmission of
secure EDI transactions over the
Internet. The AS2Sender component can be used from any Web
or desktop application to connect
and securely deliver EDI data. The
AS2Receiver component allows
ASP.NET applications to securely
and reliably receive and respond to
EDI transaction requests. A digital
certificate management component
completes the package with certificate verification, creation, encoding, and signing capabilities.
The IP*Works! EDI components
are based on Applicability Statement 2 (AS2), the leading Internet
standard for EDI transmissions,
enabling new levels of security and
reliability, and at the same time
providing significant cost savings
by leveraging the Internet as a
ubiquitous network.
Applicability Statement 2 (AS2) is a
protocol that defines how to send,
receive, and validate data using
the S/MIME standard for message
security and the HTTP/S protocols
for communications.
Electronic Data Interchange (EDI) is
a set of protocols designed to streamline the exchange of highly structured
data between organizations, such as
purchase orders, medical records,
invoices, etc. EDI-INT refers to a set
of protocols related to the secure
transfer of EDI over the Internet.
The IP*Works! EDI Public Beta
for Microsoft .NET is available for
download from /n software's Web
site. IP*Works! subscription customers will receive free development licenses for IP*Works! EDI
when the final release is available.
/n software inc.
Contact: info@nsoftware.com
Web Site: www.nsoftware.com


Multilizer 5.1 for VCL Available

Multilizer, Inc. released Multilizer 5.1 for VCL, which supports all versions of Delphi and C++Builder and enables localization of software and accompanying data files.
Multilizer now flags strings that have been added to software since the last build. This helps keep track of changes that affect localization. Advanced filtering options allow users to view strings according to their status, which simplifies working with big projects.
Instead of working with Delphi/C++Builder projects, or project groups, users can directly localize the executable with Multilizer. This binary localization lets users create language versions, multilingual and localized executables, and resource DLLs. Developers will also find useful the Multilizer 5.1 for VCL functions that enable a run-time language switch in binary-localized software.
Another update to this version is that Multilizer now groups resource strings by units, which makes it easier to locate the strings from source code.
Visit the Multilizer Web site for complete details on all the latest upgrades to Multilizer 5.1 for VCL.
Multilizer, Inc.
Price: Starts at US$1,395
Contact: info@multilizer.com
Web Site: www.multilizer.com

AQtime 3.0 Available from AutomatedQA

AutomatedQA announced AQtime 3.0, a complete, top-to-bottom performance profiler and memory usage debugger for Delphi, C++Builder, Visual Basic, Visual C++, Intel C++, GCC, and Compaq Visual Fortran applications.
AQtime is a debugging and quality assurance tool designed to expose application bottlenecks, memory leaks, resource-hogging code, and untested algorithms. AQtime shows you which module, class, routine, or line of code is causing problems. AQtime provides direct integration with the IDEs of Delphi, C++Builder, and Visual Studio, as well as AutomatedQA's TestComplete and AQdevTeam.
New features include more than 20 new profilers and productivity tools (including Timing, Coverage, Sampling, HitCount, Function Trace, and Route Trace profilers; Memory & API Resource Check; Platform Compliance; and a UML diagram-generating Sequence Diagram Link); support for multiple compilers (including Delphi, C++Builder, Visual C++, Visual Basic, Intel C++, GCC, and Compaq Visual Fortran); and support for server-side profiling. It also profiles almost any type of executable, including exe, dll, ocx, bpl, cpl, NT services, ISAPI applications, COM, DCOM, and COM+ servers; uses an open and fully documented COM architecture to allow users to add custom profilers and UI elements; performs without modifying source code; and many other enhancements.
AutomatedQA purchases include a 60-day unconditional money-back guarantee.

AutomatedQA
Price: US$349.99; upgrade for AQtime 2 users, US$179.99. Discounts available for multi-user licenses and users of SleuthQA or MemSleuth from TurboPower Software Company.
Contact: sales@automatedqa.com
Web Site: www.automatedqa.com

AcctSync Technologies Launches AcctSync SDK

AcctSync Technologies, a provider of components for the integration of financial management software with other software packages, released AcctSync SDK Delphi Edition, a toolkit for QuickBooks developers and integrators. AcctSync SDK will enable QuickBooks developers to build integrated solutions in a fraction of the time without any prior knowledge of qbXML, XML, or COM APIs.
AcctSync SDK connects to QuickBooks through the QBXML SDK, Intuit's development kit that enables data integration between third-party applications and QuickBooks. AcctSync SDK has no external dependencies, other than QuickBooks; end users need not download the QuickBooks SDK, an XML parser, or any other utilities to run applications built with AcctSync SDK.
Most of the components included in the AcctSync SDK correspond directly to the most commonly used QuickBooks constructs, like Customer, Employee, Vendor, Invoice, etc., and can be used to store information and communicate with QuickBooks.
AcctSync SDK is currently available for Delphi, C++Builder, ActiveX/VB, Microsoft .NET, C++, ASP, and Java.

AcctSync Technologies, Inc.
Price: Visit the Web site for pricing and licensing options.
Contact: info@acctsync.com
Web Site: www.acctsync.com

Xapware Releases Active! Focus 1.5

Active! Focus from Xapware Technologies is a workgroup solution for application lifecycle management.
Active! Focus presents a clear, unencumbered view of software projects throughout their lifecycles. It maintains a complete inventory of important project factors, such as: requirements management, change request management, defect management, risk management, issues management, and team discussions.
Active! Focus accomplishes this with a simple user interface, complete with user-customizable displays and multi-project support. You can share teams across projects, take advantage of configurable navigation systems, and employ advanced editing tools. Active! Focus supports file attachments for full documentation capabilities. Detail views of items also support rich-text editing directly in the program, enabling better documentation of project success factors.
Active! Focus is available as both a single-user management tool and as a complete workgroup solution.

Xapware Technologies
Price: Active! Focus Workgroup, US$499; Active! Focus Server, US$250/user; Active! Focus Personal Edition, US$49.
Contact: info@xapware
Web Site: www.xapware.com

JEDI-VCL Library version 2.10 Released

Project JEDI (Joint Endeavor of Delphi Innovators) released version 2.10 of its JEDI-VCL Library (JVCL). This open source library contains close to 400 VCL components donated by the Delphi community.
Because the JVCL is released under the MPL (Mozilla Public License), you can freely use it in personal, open source, or commercial projects without restrictions, as long as proper credit is given to Project JEDI.
Among the many components you'll find in JVCL are sets of Labels, Windows dialogs, calendars, and several improved versions of original Delphi components. You can download the JVCL from http://jvcl.sourceforge.net.
To install JVCL you also need the JEDI Code Library (JCL), a comprehensive set of thoroughly tested and fully documented utility functions and non-visual classes grouped into several categories such as Strings, Files and I/O, Security, Math, and many more. It is included in the full download, but it can also be downloaded from http://sourceforge.net/
For more information visit the Project JEDI Web site.

Project JEDI
Web Site: www.delphi-jedi.org

Mastering Delphi 7
Marco Cantù
ISBN: 0-7821-4201-X
Cover Price: US$59.99
(1,011 pages)

Essential XML Quick Reference
Aaron Skonnard and Martin Gudgin
ISBN: 0-201-74095-8
Cover Price: US$24.99
(402 pages)

Building Web Applications with ADO.NET and XML Web Services
Richard Hundhausen, Steven Borg, Cole Francis, and Kenneth Wilcox
Wiley Publishing
ISBN: 0-471-20186-3
Cover Price: US$40.00
(359 pages)

Successful IT Project Delivery
David Yardley
ISBN: 0-201-75606-4
Cover Price: US$29.99
(346 pages)






By Bill Todd

Safety .NET
Part I: An Introduction to .NET Security

Security in the .NET environment is vintage
Microsoft: infinitely versatile and equally
complex. Regardless of the language you
use, if you intend to develop for .NET, or use
.NET applications in your organization, you must
understand the security features of the .NET environment and plan how you will use them before
you start developing applications.

.NET provides two types of security. Code access security lets you control which resources an assembly can access. Role-based security lets you control what an application does based on the user's identity. By the way, if you're new to .NET, an assembly is a file that contains compiled code, usually either an EXE or a DLL.
To implement .NET security, you must secure each assembly, and you must set the security policy on each PC on which .NET applications will run. This article is language neutral; whether you develop .NET applications in Delphi for .NET, C#, or some other language, everything in this article applies. In the second installment of this two-part series we'll look at role-based security, requesting permissions from within your application, and other things that must be done from within your application and are not, therefore, language neutral.

Figure 1: Assigning the private key password.
Protecting Applications with Strong Names
Every .NET assembly includes a hash of the assembly. This hash is something like a checksum: it's a value that can be used to determine if the assembly has been changed since it was compiled. However, it is possible, at least theoretically, for someone to modify the assembly and recompute the hash so the modification won't be detectable. To prevent this, add a strong name to each of your assemblies.
Strong names use public key cryptography. Public key cryptography uses two keys, a public key and a private key. When you add a strong name to your assembly, the private key is used to sign the hash. It's possible to verify the digital signature using the public key. This provides a secure way to ensure that the hash has not been changed.

Figure 2: The second request confirming the private key password.
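The tamper-evidence role of the hash can be sketched in Python. This is an illustrative sketch only: the byte strings are hypothetical, and SHA-256 stands in for whatever hash algorithm the runtime actually uses.

```python
import hashlib

def assembly_hash(data: bytes) -> str:
    # Illustrative stand-in for the hash embedded in an assembly.
    return hashlib.sha256(data).hexdigest()

original = b"compiled assembly bytes"
shipped_hash = assembly_hash(original)

# An unmodified assembly reproduces the embedded hash...
assert assembly_hash(b"compiled assembly bytes") == shipped_hash

# ...while any change to the bytes produces a different hash. A strong
# name signs this hash with the private key, so an attacker cannot
# simply recompute the hash after tampering.
assert assembly_hash(b"tampered assembly bytes") != shipped_hash
```

The signature is what closes the loophole described above: without the private key, a modified assembly cannot be given a matching, validly signed hash.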




Figure 3: Selecting the file to sign.

Figure 5: Choosing the certificate file.

Figure 4: Choose custom signing.

Figure 6: Selecting the private key file.

The Common Language Runtime (CLR) knows that the file can be safely executed if the digital signature verifies that the hash has not been changed, and the hash verifies that the file has not been changed.
To add a strong name to your assembly, start by using the Strong Name utility, sn.exe, to create a file that contains a public and private key pair. Open a command prompt window and enter the following command:

sn -k mykey.snk

You can give the file any name, and the .snk extension is optional. You can use any file extension you choose, or no extension at all. The Strong Name utility creates the key file in the current directory. Next, add the key file to your assembly by adding the AssemblyKeyFile attribute to your source code:

[assembly: AssemblyKeyFile('c:\keyfiles\mykey.snk')]

Code Signing Applications
Code signing adds an Authenticode digital signature to an assembly. Authenticode is a method for verifying the identity of a software publisher. To use code signing you will need to purchase a digital certificate from a recognized authority, such as VeriSign (www.verisign.com) or Thawte.
The .NET SDK includes all the tools you need to work with code signing, including a Certificate Creation Tool that lets you create test certificates. This lets you try the entire code signing process before purchasing a certificate. You can create a test certificate and sign an assembly using the following steps.
Use the Certificate Creation Tool, makecert.exe, to create a certificate and private key using this command:

makecert -n "CN=Your Company Name" -sv test.pvk test.cer

The dialog box shown in Figure 1 will appear, asking you to enter a password for the private key. After you enter a password and confirm it, click the OK button. When you do, a second dialog box appears asking you to enter the private key password again (see Figure 2). When you re-enter the password and click the OK button, the dialog box will close and the word Succeeded will appear beneath the makecert command in your command prompt window. You'll now have two files in



the current directory: test.pvk, which contains the private key, and test.cer, which contains the certificate.

Figure 7: The Security Warning dialog box.

Permission Class                Grants Access To
DirectoryServicesPermission     Active Directory
DnsPermission                   The Domain Name System
EnvironmentPermission           Environment variables
EventLogPermission              The Windows event log
FileDialogPermission            Access to files through a file dialog box, such as the open file dialog box
FileIOPermission                Files and directories
IsolatedStorageFilePermission   Isolated storage (private virtual file systems)
MessageQueuePermission          Message queuing using MSMQ
OleDbPermission                 Data using classes in the System.Data.OleDb namespace
PerformanceCounterPermission    Performance counters
ReflectionPermission            .NET reflection
RegistryPermission              The Windows registry
SecurityPermission              Unmanaged code
ServiceControllerPermission     Controlling Windows services
SocketPermission                Windows sockets (Winsock)
SqlClientPermission             Data using classes in the System.Data.SqlClient namespace
UIPermission                    The user interface
WebPermission                   Web connections

Figure 8: Code-access permission classes.

The makecert program creates an X.509 certificate, but to sign code you need a software publisher's certificate. The Software Publisher Certificate Test Tool, cert2spc.exe, will perform the conversion using this command:

cert2spc test.cer test.spc

When you run this command, cert2spc will display Succeeded and create a file named test.spc in the current directory.
Use this file and the File Signing Tool, signcode.exe, to sign
an assembly. Type in signcode.exe at the command prompt
to start the Digital Signature Wizard. Click Next to move to
the second page of the wizard, shown in Figure 3, and enter
the path to the assembly you want to sign. Move to the next
page, shown in Figure 4, and select the Custom radio button.
You can only specify a certificate file if you select Custom.
On the next page, click the Select from File button, and choose
your certificate file (in this case, test.spc), as shown in
Figure 5. Click Next, then enter the path to your private key
file as shown in Figure 6. Click Next, enter the private key
password, then click OK. Click Next, select a hash algorithm
and click Next again. Click Next to move through the remaining pages of the wizard without changing any of the default
options. When you get to the last page, click Finish and enter
the private key password when prompted.
Now that your assembly has been digitally signed, you can
check the signature using the Certificate Verification Tool,
chktrust.exe. When you type the command:
chktrust.exe assemblyfilename.exe

at the command prompt, you'll get the warning shown in
Figure 7. In this case, the dialog box contains a security
warning that the contents cannot be trusted, because the
assembly was signed with a test certificate. If you use
chktrust.exe on an assembly that was signed with a certificate from a certificate authority, the dialog box will contain
a message such as, "Publisher authenticity verified by VeriSign Commercial Software Publishers CA."


Figure 8: Code-access permission classes.

Understanding the Runtime

Security Model
The security policy features of .NET let you control what
code will be allowed to run on a particular machine for a
particular user. You must understand three concepts to know
how code access security works: permissions, permission
sets, and code groups.
Permissions are implemented as classes. Each permission
class, shown in Figure 8, grants access to some capability of
the CLR. A permission set is a named group that contains
some or all of the permissions shown in Figure 8.
A code group is really a named set of criteria. The criteria
are also called the membership condition. Any assembly that
satisfies the membership condition is automatically a member of the code group. When you create a code group you
specify the criteria an assembly must satisfy to be a member
of the group, and you specify a permission set that is granted
to any assembly that is a member of the code group. The
result is that the code group to which an assembly belongs
determines which permission sets the assembly gets.
You control permission sets and code groups with the Microsoft .NET Framework Configuration snap-in for the Microsoft Management Console (MMC). To start the .NET Framework Configuration snap-in, double-click Administrative Tools in the Control Panel, then double-click Microsoft .NET Framework Configuration. This should open MMC with the .NET Framework Configuration snap-in loaded, as shown in Figure 9. If it doesn't, you have a corrupt mscorcfg.msc file and you'll need to follow the instructions outlined in the accompanying sidebar, "Fixing MSCORCFG.MSC."

Figure 9: The .NET Framework Configuration snap-in.

Expand the Runtime Security Policy node in the tree, and you'll see that security can be set at the enterprise, machine, or user level. The enterprise policy level affects every computer and user on the network, and can be controlled only by enterprise or domain administrators. The machine policy level affects all users on the machine. The user level affects the current user.
If you select Runtime Security Policy in the tree, you'll see a link named Create Deployment Package in the right pane. Click this link to launch the Deployment Package wizard. This wizard lets you convert the enterprise, machine, or user security policy on your computer to a Windows Installer MSI file that you can distribute throughout your organization using Group Policy or Microsoft Systems Management Server.
With three levels of security policy that might conflict with each other, and the possibility that code can belong to multiple code groups, each with a different permission set, the CLR must go through a complex process to determine which permissions to grant. Here's what happens. The process begins at the enterprise level. First, the CLR evaluates the evidence presented by the assembly. Evidence is a term used to describe all the characteristics of the assembly that can be used to determine to which code groups it belongs. Evidence includes such things as the publisher, strong name, hash, and location from which the code was loaded. Using the evidence, the CLR examines all the code groups at the enterprise level, and determines to which groups the code belongs.
Code groups are organized hierarchically, starting from a root node named All_Code. Any code group can have one or more child code groups, and code groups can be nested to any level. If an assembly is not a member of a code group, it cannot be a member of any child of that code group.
Any code group can have its Exclusive attribute set to True. If the assembly belongs to an exclusive code group, the checking stops and the code belongs only to the exclusive group and to no others at the enterprise level. Next, the CLR determines the union of all the permission sets for the code groups to which the code belongs. This means that if a permission is granted by any one code group, the code will have that permission.
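The union rule within one policy level can be sketched in Python. This is an illustrative model only: the group names, membership conditions, and permission names below are hypothetical, not real CLR types.

```python
# Hypothetical model of one policy level: each code group pairs a
# membership condition (tested against the assembly's evidence)
# with a permission set.
code_groups = [
    ("All_Code",     lambda ev: True,                          {"Execution"}),
    ("My_Computer",  lambda ev: ev.get("zone") == "MyComputer", {"FullTrust"}),
    ("Signed_By_Us", lambda ev: ev.get("publisher") == "Acme",  {"FileIO"}),
]

def level_permissions(evidence):
    # Union: a permission granted by ANY matching code group is granted
    # at this level.
    granted = set()
    for name, condition, perms in code_groups:
        if condition(evidence):
            granted |= perms
    return granted

# An assembly loaded locally and signed by "Acme" matches all three groups.
print(sorted(level_permissions({"zone": "MyComputer", "publisher": "Acme"})))
# ['Execution', 'FileIO', 'FullTrust']
```

An assembly whose evidence matches only All_Code would receive just that group's permissions, which mirrors how the default zone-based groups narrow what downloaded code may do.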
After the CLR has determined the permissions for the enterprise level, it saves them and moves down to the machine
level. It repeats the process just described for the machine
level, then moves down to the user level and repeats the
process again. The CLR now has three sets of permissions:
one for the enterprise level, one for the machine level, and
one for the user level.
To get the final set of permissions, the CLR takes the intersection of the permissions at all three levels. Taking the intersection means that a permission must be granted at every level for the assembly to get that permission. This means you cannot grant a permission at the machine level that the code doesn't have at the enterprise level, and you cannot grant a permission at the user level that the code doesn't have at both the enterprise and machine levels.
There is one more wrinkle in determining the permissions that an assembly gets. Code groups have a property
named LevelFinal. If the CLR discovers that an assembly
belongs to a LevelFinal code group at any level, the code
groups at lower levels are not checked.
When the final permission set for the assembly has been determined, the CLR walks up the stack's call chain, from the code being evaluated up to the original application. At every level in the call chain, the CLR takes the intersection of the permissions of the code being evaluated and the permissions of the calling routine. This means that code cannot have permissions that have not been granted to the code that calls it.
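The two intersection rules can be sketched with Python sets. The permission names here are hypothetical placeholders; each per-level set stands for the union already computed for that policy level.

```python
# Hypothetical per-level grants (each is the union of that level's
# matching code groups).
enterprise = {"FileIO", "Registry", "UI", "Sockets"}
machine    = {"FileIO", "UI", "Sockets"}
user       = {"FileIO", "UI"}

# Across levels the CLR intersects: a permission survives only if it
# was granted at EVERY level.
assembly_perms = enterprise & machine & user
print(sorted(assembly_perms))  # ['FileIO', 'UI']

# Walking up the call chain intersects again with each caller, so code
# never ends up exercising a permission its caller lacks.
caller_perms = {"UI"}
effective = assembly_perms & caller_perms
print(sorted(effective))  # ['UI']
```

Note how Registry drops out at the machine level and Sockets at the user level, and FileIO finally drops out because the caller does not have it, which is exactly the IBManager situation described in the next section.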



As an example, suppose you write a class named IBManager that allows you to work with an InterBase server. This class has methods that let you start and stop the InterBase server, and a method that determines if InterBase is installed on any machine you specify. Because you want to use this class from many applications, you put it in a DLL assembly and reference it from many applications. You give the assembly a strong name, and ensure that it belongs to a code group that grants it ServiceControllerPermission. Next, you build a new application and reference the DLL. However, the new application belongs to a code group that doesn't grant ServiceControllerPermission. When you try to run your new application you'll get a security exception when your application tries to use any of the methods of the IBManager class, because IBManager cannot have ServiceControllerPermission if the application that calls it doesn't have that permission.
If you return to the .NET Framework Configuration tool and expand the Enterprise and Machine nodes in the tree, you should see something like the screen shot shown in Figure 10. The enterprise level has a single code group named All_Code, and all code is a member of this group. The permission set for All_Code is FullTrust, which grants all permissions, so, by default, all code has all permissions at the enterprise level.

Figure 10: The default code groups and permission sets.

The machine level is where all default run-time security is applied. Here you'll find code groups named My_Computer_Zone, LocalIntranet_Zone, Internet_Zone, Trusted_Zone, and Restricted_Zone. As you can probably guess from their names, the membership conditions for these code groups are based on where the assembly was loaded from.
Like the enterprise level, the machine level has a default set of permission sets named FullTrust, SkipVerification, Execution, Nothing, LocalIntranet, Internet, and Everything. Right-click the LocalIntranet_Zone code group and choose Properties from the context menu, then click the Permission Set tab. Notice that, by default, assemblies loaded from another computer on your network don't have File I/O or several of the other permissions shown in Figure 8. The user level, like the enterprise level, has a single code group named All_Code that grants the FullTrust permission set to all code.

Figure 11: Adding permissions to the permission set.

Working with Permission Sets
Assume you need a new permission set to grant all permissions, except the security permission, to assemblies with a particular strong name. To create a new permission set, right-click Permission Sets in the tree and choose New. This starts the Create Permission Set wizard. Enter Intranet Managed Code for the name and enter an optional description. Click Next to move to the second page, shown in Figure 11. You must select the permissions you want to grant one at a time and click the Add button. Each time you click the Add button a dialog box will open showing properties you can set for the permission. Set the properties and click OK to finish adding the permission. You can change the properties of any permission in the Assigned Permissions list box by selecting the permission and clicking the Properties button. After you've selected the permissions you want to assign, click the Finish button to create the new permission set.



Membership Condition    Applies To
All Code                All code belongs to this group
Application Directory   Code loaded from the same directory as the running application
Hash                    Code with a specified hash
Publisher               All code from the specified publisher
Site                    All code from the specified site
Strong Name             All code with the specified strong name
URL                     All code from the specified URL
Zone                    All code from the specified zone

Figure 12: Code group membership conditions.

Fixing MSCORCFG.MSC
Follow these simple steps to open the .NET Framework Configuration snap-in from Administrative Tools:
1. Open a command prompt window.
2. Type mmc.exe and press Enter.
3. Select Console | Add/Remove Snap-in from the menu to open the Add/Remove Snap-in dialog box.
4. Click the Add button to open the Add Standalone Snap-in dialog box.
5. Select .NET Framework Configuration from the list and click the Add button.
6. Click OK to close the Add/Remove Snap-in dialog box.
7. Choose Console | Save As from the menu. Navigate to the C:\WINNT\Microsoft.NET\Framework\v1.0.3705 directory and save the file as mscorcfg.msc.
You should now be able to open the .NET Framework Configuration snap-in from Administrative Tools.
To make changes to a permission set, select the permission set in the tree, then choose one of the links in the pane on the right side of the window. The View Permissions link lets you view the permissions in the permission set and change their properties. The Change Permissions link lets you add or delete permissions and change any permission's properties. The Rename Permission Set link lets you change the name of the permission set.

Working with Code Groups
To create a new code group, right-click the code group that is to be the parent of the new code group, and choose New from the menu. For this example, use the All_Code code group. This will start the Create Code Group wizard. Enter In_House_Named_Code for the name. Enter anything you want for the description, then click the Next button.
Choose Strong Name from the drop-down list of conditions. The available conditions are listed in Figure 12. After you choose the condition, the wizard page changes, as shown in Figure 13. Click the Import button and select any assembly that has the strong name you want to use. Click Next and choose Intranet Managed Code from the Permission Set drop-down list to assign the new permission set to this code group. Click Next, then Finish to create the code group.

Figure 13: Using the Strong Name condition for a code group.

To change the properties of a code group, select the group in the tree and click the Edit Code Group Properties link in the right-hand pane. Figure 14 shows the property dialog box. Note the two check boxes at the bottom of the General tab. The top check box sets the code group's Exclusive attribute, and the second check box sets the LevelFinal attribute. These attributes were described earlier and cannot be set from the Create Code Group wizard. Use the Membership Condition and Permission Set tabs to change the code group's membership condition and permission set.


Do You Have It Right?

An obvious question arises if you create a complex security
policy for your .NET run-time environment: How do you test
assemblies to see if they have the permissions they need? To
find out, select Runtime Security Policy in the tree, then click the
Evaluate Assembly link in the right pane to display the wizard
shown in Figure 15. Click the Browse button and select the
assembly to evaluate. Use the radio buttons in step 2 to choose
whether you want to see the permissions the assembly has, or
the code groups to which it belongs. The drop-down list in step
3 lets you choose to evaluate the assembly at all levels or at
just the enterprise, machine, or user level. Click Next to see the
result, then click Finish to close the wizard when you are done.
Requesting Permissions
If an application tries to perform an action that it doesn't
have permission to perform, the CLR will raise a security
exception. You can prevent this by adding attributes to
your application that request all required permissions. If
you do, the CLR will make sure the application has those
permissions when it loads it. The program will not run if
the permissions cannot be granted.
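Part II will show the real Delphi code; as a rough, hypothetical sketch only, an assembly-level request might look like the following. The attribute classes come from the .NET System.Security.Permissions namespace; the Delphi for .NET attribute syntax and the C:\AppData path are illustrative assumptions, not the article's code.

```pascal
// Hypothetical sketch: request permissions when the assembly loads.
// If the RequestMinimum set cannot be granted, the program will
// not run, as described above.
[assembly: SecurityPermission(SecurityAction.RequestMinimum,
  Execution = True)]
[assembly: FileIOPermission(SecurityAction.RequestOptional,
  Read = 'C:\AppData')]  // path is a made-up example
```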
Basic role-based security works much the same way. You
use attributes in your application to specify the role of
which the user must be a member in order to run the
application. The .NET run-time environment verifies that
the user is a member of the required role before it lets the



application execute. We'll look at both types of attributes in more detail in Part II of this series.

Figure 14: Changing a code group's properties.

Figure 15: The Evaluate Assembly wizard.
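As a hedged preview of the declarative role-based style: PrincipalPermission is the actual .NET attribute class, but the Delphi syntax, the method name, and the 'Managers' role shown here are illustrative assumptions.

```pascal
// Sketch: demand that the current user belong to a role before
// the method may run. 'Managers' is a made-up role name.
[PrincipalPermission(SecurityAction.Demand, Role = 'Managers')]
procedure ApproveSalaryChange;
```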
The .NET run-time environment provides all the tools you
need to develop and run both local and distributed applications safely. By controlling the permissions granted to
code based on who signed the assembly, the strong name, the
hash, or where the code originated, you can ensure that only
code you trust can execute on any machine in your organization. From the developer's perspective, strong names and code
signing let you give the organizations using your code an easy
way to grant the permissions it needs, with confidence that the
code they are running is exactly what you wrote.
Although a bit complex, the .NET security system is very flexible. In addition to the techniques presented here, there are classes you can use to implement both code access security and role-based security in a more dynamic way. Whatever security model your organization needs, the .NET security system will let you implement it.
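For example, the same checks can be performed imperatively at run time with the permission classes themselves. FileIOPermission and its Demand method are real .NET framework members; the Delphi for .NET calling syntax and the path are assumptions for illustration.

```pascal
// Sketch: dynamic (imperative) code access security. Demand
// raises a SecurityException if the calling code has not been
// granted the permission by policy.
var
  Perm: FileIOPermission;
begin
  Perm := FileIOPermission.Create(
    FileIOPermissionAccess.Read, 'C:\AppData');  // path assumed
  Perm.Demand;
end;
```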
By now, you may be asking: But what about Delphi? As
stated earlier, this article doesn't have anything to do with a
particular programming language. That all changes next
month, when the second and final part of this series looks
at implementing these security features using Delphi.

Bill Todd is president of The Database Group, Inc., a database consulting

and development firm based near Phoenix. He is co-author of four database
programming books, author of more than 100 articles, a contributing editor
to Delphi Informant Magazine, and a member of Team B, which provides
technical support on the Borland Internet newsgroups. Bill is an internationally
known trainer and is a frequent speaker at Borland Developer Conferences in
the United States and Europe. Readers may reach him at bill@dbginc.com.





By Noel Rice

IntraWeb Fundamentals
The ServerController, WebApplication, and UserSession Objects

Last month, I introduced you to the elegance and simplicity of Web development using IntraWeb, and demonstrated how it delivers on its RAD promise. This month, I'm going to drill down a bit and present the fundamental building blocks that IntraWeb uses to construct Web sites.

The following three objects work together to bind an IntraWeb

application into a working whole:
The ServerController object controls behavior and data
at the server level, including authentication, response
to exceptions, and file locations. The ServerController
module typically contains non-visual components and
server-wide code.
The WebApplication object is the ultimate owner of all
forms in the project. Every user that initiates a session
automatically gets a new WebApplication object.
The UserSession object is created inside the ServerController
unit. It is here that you add global variables that need to
be referenced throughout a session.
These objects are typically used in combination. For
example, if you include a UserSession object when creating
a project, an empty TUserSession object is declared in
the ServerController unit. The instance of TUserSession is
created during the ServerController OnNewSession event,
and stored in a property of WebApplication. WebApplication
has a Data property of type TObject that represents the
TUserSession instance. The IntraWeb Application Wizard
produces all of the shell code for you, as shown in Figure 1.
Notice how the public UserSession function mimics a static
object instance? Now you can add the ServerController unit to
your uses clause, and simply reference UserSession.MyProperty

type
  TUserSession = class(TComponent)
    { Your properties go here. }
  end;

function UserSession: TUserSession;

function UserSession: TUserSession;
begin
  Result := TUserSession(WebApplication.Data);
end;

{ OnNewSession event handler }
procedure TIWServerController.IWServerControllerBaseNewSession(
  ASession: TIWApplication; var VMainForm: TIWBaseForm);
begin
  ASession.Data := TUserSession.Create(nil);
end;

Figure 1: The IntraWeb Application Wizard produces all of the shell code for you.



Figure 2: Creating an IntraWeb StandAlone application with the IntraWeb Application Wizard.




TUserSession = class(TComponent)
private
  FEmpNo: Integer;
  FFirstName: string;
  FHireDate: TDateTime;
  FLastName: string;
  FPhoneExt: string;
  FSalary: Double;
public
  DataModule1: TDataModule1;
  property EmpNo: Integer read FEmpNo;
  property FirstName: string read FFirstName;
  property HireDate: TDateTime read FHireDate;
  property LastName: string read FLastName;
  property PhoneExt: string read FPhoneExt;
  property Salary: Double read FSalary;
  constructor Create(AOwner: TComponent); override;
end;

Figure 3: Adding properties that match the columns in the Employee table.

without having to cast WebApplication.Data. (Note: The

examples for this article were built with IntraWeb 5.1.20.)
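For instance, once ServerController is in the uses clause, a form can read session data directly. The MyProperty member and the lblGreeting label below are hypothetical names standing in for whatever you add to TUserSession and to the form.

```pascal
// Sketch: any unit that uses ServerController can call the
// public UserSession function; no cast of WebApplication.Data
// is needed at the call site.
uses
  ServerController;

procedure TIWForm1.IWAppFormRender(Sender: TObject);
begin
  // MyProperty and lblGreeting are placeholder names.
  lblGreeting.Caption := UserSession.MyProperty;
end;
```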
Login Authentication
Why use UserSession? It contains state information that
isn't already tracked by IntraWeb. This next example
demonstrates a login to a database table, where data retrieved
for the user is kept in UserSession. Start a new IntraWeb
StandAlone application, checking the Create DataModule and
Create User Session options (see Figure 2).
Add a standard BDE TTable to the DataModule, set the
DatabaseName property to DBDEMOS and the table name
to Employee.db. There's one caveat: For production sites
running on IIS you need a Session component to manage
threading for the BDE, and the AutoSessionName property
for Session must be set to True.
Employee last name is used as a login name, and employee
number is the password. The remaining data columns (first
name, salary, start date, and phone extension) are loaded
into new properties added to the UserSession object. Add
properties in the ServerController unit that match the columns in the Employee table (see Figure 3).

One route to authentication is to hard-code the AuthList property with user names and passwords. You can experiment by editing AuthList, where user names and passwords are listed as name/value pairs (see Figure 4).

Figure 5: The login dialog box.

The login dialog box shown in Figure 5 appears when you

run the application. If the user fails to log in correctly
after three tries, or cancels the dialog box, a Web page
appears with the message Authentication Failed.
Be aware that authentication may fail for ISAPI DLLs based on
IIS settings. If Domain Authorization is enabled, the User name entry
for the dialog box expects both a user name and a domain. This
is named Integrated Windows Authentication in Windows XP Pro. To
disable it in IIS configuration, right-click the directory where
the ISAPI DLL will live and select Properties | Directory Security | Edit
| Integrated Windows Authentication. Uncheck the option, and click
OK to exit the dialog box.
You get the same effect with more control by creating an
event handler for the OnAuthRequest event, which passes
in parameters for the user name and password entered in
the login dialog box. In the example shown in Figure 6, the
AValid parameter is set to True if AUserName can be located
as a last name in the Employee table, and if APassword
matches the employee number for that table row. If the login
succeeds, UserSession properties are populated from the
Employee table. There are edit controls for each Employee
table column on the main form. The form's OnRender event
(called each time the page is generated for display in the
browser) is in charge of moving UserSession properties to the
controls on the form (see Figures 7 and 8).
You might be wondering: Can you use both AuthList and
OnAuthRequest? And if so, in what order? It's possible to
use both, but I don't recommend it, because it virtually
guarantees a maintenance problem. When the same name
is used in both locations, IntraWeb expects to get the
password from AuthList and will ignore a second password coded in OnAuthRequest.
Session Management
In traditional HTML- or ASP-based Web sites, session
management was entirely manual. To recognize users
from one page to the next required code to pass cookies,
query strings, or posts between pages. You could use a
database table to store user information, but you still had
to pass a record ID.

Figure 4: Adding user names and passwords to the AuthList property.



IntraWeb session management is automatic, but configurable
through ServerController properties and events. The
SessionTrackingMethod property decides how session information




procedure TIWServerController.IWServerControllerBaseAuthRequest(
  const AUserName, APassword: string; var AValid: Boolean);
begin
  AValid := False;
  with DataModule1.tblEmployee do begin
    if Locate('LastName', AUserName, []) then begin
      AValid := FieldByName('EmpNo').AsString = APassword;
      UserSession.FEmpNo := FieldByName('EmpNo').AsInteger;
      if AValid then begin
        UserSession.FFirstName := FieldByName('FirstName').AsString;
        UserSession.FLastName := FieldByName('LastName').AsString;
        UserSession.FSalary := FieldByName('Salary').AsFloat;
        UserSession.FHireDate := FieldByName('HireDate').AsDateTime;
        UserSession.FPhoneExt := FieldByName('PhoneExt').AsString;
      end;
    end;
  end;
end;
Figure 6: Verifying the last name and password.

is passed, based on one of three possible values: tmURL,
tmHidden, and tmCookie. Most of the IntraWeb session-handling mechanism is hidden from view. What follows is a
comparison of the visible results of using the three session-tracking methods in a simple one-button, "hello world" project.

procedure TIWForm1.IWAppFormRender(Sender: TObject);
const
  STitle = 'Welcome <b>%s %s</b>';
begin
  with UserSession do begin
    lblTitle.Caption :=
      Format(STitle, [FirstName, LastName]);
    edtID.Text := IntToStr(EmpNo);
    edtFirst.Text := FirstName;
    edtLast.Text := LastName;
    edtExt.Text := PhoneExt;
    edtSalary.Text := FloatToStr(Salary);
    edtHireDate.Text := DateTimeToStr(HireDate);
  end;
end;

Figure 7: If the login succeeds, UserSession properties are populated from the
Employee table.

Figure 8: UserSession data from Employee.db displayed on the welcome page.

tmURL. This is the default setting, which causes the session ID to be passed as part of the URL. The action defined in the generated HTML causes a post back to the same page:

<form onsubmit="return FormDefaultSubmit();"

tmHidden. This option stores the session ID in HTML hidden fields. The generated HTML also causes a post back, but the session ID is in a hidden field, not in the URL:

<form onsubmit="return FormDefaultSubmit();"
name="SubmitForm" method="POST">
<input type="HIDDEN" name="IW_SessionID_"
tmCookie. This option creates a cookie named <user name>@<IP>.txt. The cookie is handled automatically. No special cookie-handling code appears in the JavaScript or the generated HTML:

<form onsubmit="return FormDefaultSubmit();"
name="SubmitForm" action="/EXEC"
When Bad Things Happen to Good Applications
IntraWeb handles unexpected circumstances in two ways. The robust, structured exception handling in Delphi is fully supported in IntraWeb. So how do you raise an exception in an IntraWeb application? In the same manner that Delphi does:

raise MyException.Create(
  'Something unfortunate has occurred.');

How do you respond to an exception in an IntraWeb application? That depends on the amount of intervention you want. In a Delphi application you can leave the exception unhandled, and let the application object catch it and display a message dialog box. In IntraWeb, the ServerController object provides display options through the ExceptionDisplayMode property. Figure 9 shows examples of each mode.

Figure 9: ExceptionDisplayMode settings and the resulting output.

Use the ServerController OnException event handler if you need a more flexible way to handle exceptions. Note: Perhaps you don't want to show any kind of dialog box when an exception occurs, but instead wish to log the exception. Check out the TIWClientDebugOut component from TMS (www.tmssoftware.com); it supports calls to the OutputDebugString Windows API from inside the application.
A quick demo of OnException shows how to vary the exception message output on the fly, based on a TUserSession property:

TUserSession = class(TComponent)
private
  FShowMessageType: TIWShowMessageType;
public
  property ShowMessageType: TIWShowMessageType
    read FShowMessageType write FShowMessageType;
end;

TIWShowMessageType is the underlying type for ExceptionDisplayMode. When an exception occurs, the OnException event handler of the ServerController object passes the Application and Exception objects as parameters. ShowMessage displays the exception using the display mode defined in UserSession:

procedure TIWServerController.IWServerControllerBaseException(
  AApplication: TIWApplication; AException: Exception);
begin
  AApplication.ShowMessage(AException.Message,
    UserSession.ShowMessageType);
end;

How is the display mode stored in UserSession? An IWRadioGroup lists the possible message type names in the main form. The radio group items can be populated manually or with code. The button click to raise the exception gets the current radio group selection, casts the ItemIndex to be a show message type, and then stores the show message type in the UserSession ShowMessageType property:

procedure TIWForm1.btnAppExceptionClick(Sender: TObject);
begin
  { Raise an exception using the display type indicated
    in the radio group. }
  UserSession.ShowMessageType :=
    TIWShowMessageType(rgMessageType.ItemIndex);  { radio group name assumed }
  raise Exception.Create(edtAppException.Text);
end;

A second, more limited way that IntraWeb handles unexpected conditions is through redirection to special-purpose HTML pages defined in ServerController properties. If your site has pages defined for the cases listed here, simply point to them using the properties illustrated in Figure 10:

Figure 10: ServerController paths to pages for predefined error conditions.

TimeOutURL. The session object is destroyed after the number of minutes set in SessionTimeout. By default the session timeout occurs after 10 minutes. Does this mean that after 10 minutes a message pops up on the browser? No. Only when the user attempts to use the page again is the browser redirected to the Web page described in the TimeOutURL property of ServerController.
InvalidCommandURL. The command used internally to execute an action is defined in the ExecCommand property, and is EXEC by default. The browser is redirected to InvalidCommandURL if the command isn't recognized.




NoCookieSupport. This is a TIWFileReference that
includes both Filename and URL subproperties. If cookies are turned off in the browser, the browser is redirected to either Filename or URL.
NoJavaScriptSupport. IntraWeb relies on JavaScript support being available. If it's not available, the browser is
redirected to either Filename or URL.
UnknownBrowser. Unsupported or unknown browsers
attempting to access the server are redirected to either
Filename or URL.
The Forms List
Starting with the FormCount and Forms array properties and
drilling down through the Controls array, the WebApplication
object lets you traverse every form, and every IW control
on every form, in your project. An IWTreeView control can
represent our trip through the forms list as a hierarchy.
By clicking a treeview node, the ClassName of the item
is displayed in the label above the treeview, and in the
Hint property for the node. Note: IntraWeb 5.0 used a
SubItems.Add method to tack on child items. There is no
SubItems property in version 5.1. Instead, items are created,
and the ParentItem property for the item is assigned.
The sample shown in Figure 11 has three forms, two of which have only a label and a button each. The main form contains a label and an IWTreeView control. The other forms are created in the main form's OnCreate event handler. Then FormCount drives a for loop where each iteration creates a tree view item. Each tree view item caption is set to the form name, then the form components are traversed. Check out the key bit of code shown in the second for loop (see Figure 12) where the child tree view item is created, and ParentItem is assigned.

Figure 11: The tree view displaying all forms and components in the project.
When the user clicks a tree view node, how can we
determine if the node represents a form, or a component
on a form? The tree view OnClick event handler passes
ATreeViewItem. If the selected nodes ParentItem property
is nil, we have a top-level item representing a form.
FindComponent is used to return a form or component
object, based on the tree view item caption (see Figure 13).
procedure TIWForm1.IWAppFormCreate(Sender: TObject);
var
  FrmCount, CmpCount: Integer;
  TopLevelItem, ChildItem: TIWTreeViewItem;
begin
  { Create a couple of extra forms to demonstrate the
    WebApplication FormCount and Forms array. }
  { Iterate the forms array and the components within each
    form. Add the information to an IntraWeb treeview
    control. }
  for FrmCount := 0 to WebApplication.FormCount - 1 do
    with WebApplication.Forms[FrmCount] do begin
      TopLevelItem := trvwForms.Items.Add;
      TopLevelItem.Caption := Name;
      TopLevelItem.Hint := ClassName;
      for CmpCount := 0 to ComponentCount - 1 do begin
        { The essential two lines of code for
          assigning child items. }
        ChildItem := trvwForms.Items.Add;
        ChildItem.ParentItem := TopLevelItem;
        ChildItem.Caption := Components[CmpCount].Name;
        ChildItem.Hint := Components[CmpCount].ClassName;
        ChildItem.Expanded := True;
      end;
    end;
end;

Figure 12: Assigning child items.

procedure TIWForm1.trvwFormsClick(Sender: TObject;
  ATreeViewItem: TIWTreeViewItem);
var
  SelectedForm: TIWAppForm;
  SFormName: string;
begin
  { In the title above the treeview, display the ClassName
    of the selected component. }
  if ATreeViewItem.ParentItem = nil then begin
    // Clicked on the form.
    SFormName := ATreeViewItem.Caption;
    SelectedForm := WebApplication.FindComponent(
      SFormName) as TIWAppForm;
    lblFormsList.Caption := '<b>Forms List:</b> ' +
      SelectedForm.ClassName;
  end
  else begin
    // Clicked on a component under the form.
    SFormName := ATreeViewItem.ParentItem.Caption;
    SelectedForm := WebApplication.FindComponent(
      SFormName) as TIWAppForm;
    lblFormsList.Caption := '<b>Forms List:</b> ' +
      SelectedForm.FindComponent(
        ATreeViewItem.Caption).ClassName;
  end;
end;

Figure 13: Determining if the tree view node represents a form or a component.

Working with COM
What if your IntraWeb application fails with the exception "CoInitialize has not been called," but you aren't using COM in your application? If your project has ADO datasets, you are using COM. You can code OnNewSession to fix the problem:

procedure TIWServerController.IWServerControllerBaseNewSession(
  ASession: TIWApplication; var VMainForm: TIWBaseForm);
begin
  ASession.Data := TUserSession.Create(nil);
  CoInitializeEx(nil, 0);
end;




However, ServerController provides an easier way. Set the

ComInitialization property of ServerController at design time to
one of the following:
ciNone: The default. COM is not initialized.
ciNormal: COM is initialized. This is the setting you want
when using ADO or other COM servers.
ciMultiThreaded: ISAPI is multithreaded, so ISAPI DLLs
with COM require this setting.
The fundamental IntraWeb objects (ServerController,
WebApplication, and UserSession) propel us past the simple
creation of Web pages into the realm of building entire
IntraWeb-based sites. You can build IntraWeb applications
without touching these components, but knowing how to use
these powerful tools is key to constructing real-world sites that
incorporate authentication, application-defined properties for
each session, and robust exception handling.
The four example projects that support this article are available
for download on the Delphi Informant Magazine Complete
Works CD located in INFORM\2003\JUN\DI200306NR.
Noel Rice is senior architect at Kazoo Software where he has been
instrumental in architecting several Fortune 500-based projects and just
completed the authorized IntraWeb courseware written in collaboration
with Lino Tadros (www.kazoosoft.com/IW5Course.aspx). Also visit
www.kazoosoft.com to see our new downloadable IntraWeb videos to help you
master this great technology. He can be reached at nrice@kazoosoft.com.









By Ralph Krause

Extreme Testing
Introducing DUnit

It's difficult to be a programmer today without having heard of Extreme Programming (or XP). XP appeared around 1996 and is a set of programming practices that stress simplicity, communication, and feedback. Projects developed using XP are characterized by frequent releases based upon incremental functionality changes and constant code refactoring.

Code refactoring is the process of rewriting code to make it
simpler and cleaner. Although programmers typically shy
away from refactoring for fear of breaking existing code, this
isn't the case for XP programmers. Under XP, code is always
tested before release, which ensures that the program still
works with the changes. The tests are automated and written
before the code, making it easy to run them every time a
change is made.
This is in contrast to most accepted programming
methodology, where tests are usually determined after
the code is written, and the testing is then performed
by someone else. The problems with the accepted
methodology are summed up as follows in an article
by Erich Gamma and Kent Beck (see end of article
for reference): "Testing is not closely integrated with
development. This prevents you from measuring the
progress of development; you can't tell when something
starts working or when something stops working."

constructor Create; overload;
constructor Create(LoanAmount: Currency;
  InterestRate: Real; Months: Integer); overload;
property InterestRate: Real
  read FInterestRate write SetInterestRate;
property LoanAmount: Currency
  read FLoanAmount write FLoanAmount;
property MonthlyPayment: Currency read GetMonthlyPayment;
property Months: Integer read FMonths write FMonths;
procedure Reset;
property TotalInterestPaid: Currency
  read GetTotalInterestPaid;
property TotalLoanAmount: Currency
  read GetTotalLoanAmount;

Figure 1: Public declaration for TMonthlyLoan.
This article will introduce the use of DUnit to implement
automatic code testing in Delphi and Kylix. DUnit is a
framework of classes based on JUnit for Java that's designed
to support the Extreme approach to testing. It allows unit
tests to be run and monitored through a GUI or the command
line. DUnit is released as open source and was at version
7.0.3 at the time of this writing. It can be downloaded from
SourceForge at http://dunit.sourceforge.net.
DUnit's installation is straightforward: Download the
DUnit archive, extract the contents to your computer,
then tell Delphi where to find it. You can set the path
to DUnit's src directory in either the Library entry under
Delphi's Environment Options, or the Search Path under Project
Options in individual projects.
DUnit uses a program called a test runner to run the tests
that have been defined by the programmer, and provide
feedback as to which tests were successful and which were
not. A series of test cases is called a test suite, and the
test runner allows the programmer to run or skip any or
all of the tests in each test suite. A test suite can consist
of individual tests or other test suites, which allows test
hierarchies of varying complexity to be created.
There are several ways to incorporate testing with DUnit
into your projects. One way is to add the DUnit code directly
into your source code files; this is the method used in
this article. For production code, the DUnit calls would be
wrapped in {$IFDEF} blocks so they aren't compiled into the
released application. Sample projects and program files that
illustrate some of the other ways to use DUnit are contained
in the examples directory included in the DUnit archive.
Using DUnit for Testing
A Delphi class that calculates monthly payments for a loan
will be the vehicle for introducing DUnit in this article. This
class exposes properties for setting the loan amount, interest
rate, and number of payments along with a method to
reset these values. After setting the loan values, the user can
query the class for the monthly payment, total loan amount
including interest, and total interest charged. See Figure 1
for a complete list of class properties.
For the LoanCalculator project a new application was created
in Delphi, and the default form was removed. An empty unit
named MonthlyLoan was added and the project was saved as
LoanCalculator. Under the interface section in the MonthlyLoan
unit a uses clause was added, and TestFramework was listed
under it to provide access to DUnit.
The tests for the MonthlyLoan unit are contained in a class
named TMonthlyLoanTests, which descends from TTestCase.
DUnit tests are contained in classes that descend from
TTestCase; each of these classes can perform any number of
different tests. DUnit uses runtime type information (RTTI)
to determine what tests have been defined for test cases, so
the tests must be declared in the published section of the
test class. Tests are performed in the order they are defined
(which could cause problems if, for some reason, a test
depends upon the state left by another test).
The first test we'll examine is named CreationTest; it ensures
that instances of TMonthlyLoan are created properly. Instances
of TMonthlyLoan can be created with or without starting
values, so this test will create instances of the class using both
methods, and then query the LoanAmount, InterestRate, and
Months properties to ensure they were set properly.
Tests are performed in DUnit by calling its Check method
with two parameters: a test condition which should evaluate
to True or False, and a message to show if the test failed.
This test is shown in the listing in Figure 2. In addition to
Check, DUnit provides several other methods for performing
tests, including CheckEquals, CheckNotEquals, CheckNull,
CheckNotNull, and CheckSame, which are all used in a
similar manner to Check. One advantage to using these
specialized methods instead of Check is that the expected
and actual test results are shown if the test fails.
In this simple example, the tests in TMonthlyLoanTests will be
used as the test suite that is run by the test runner. TTestCase
classes can be used as test suites, or test suites can be created
by adding several test cases together. They can also be built
through code by creating an instance of TTestSuite and adding
specific test cases to it, such as TMonthlyLoanTests.
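A sketch of building a suite in code, assuming the DUnit 7.x TTestSuite/ITestSuite API and an arbitrary suite name:

```pascal
// Sketch: compose a named suite from one or more test cases
// and register it with the framework.
var
  Suite: ITestSuite;
begin
  Suite := TTestSuite.Create('Loan calculator tests');
  Suite.AddTest(TMonthlyLoanTests.Suite);
  RegisterTest(Suite);
end;
```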
Test suites must be registered with DUnit before they can
be run. This is accomplished in this program by calling
RegisterTest(TMonthlyLoanTests.Suite) in the MonthlyLoan
unit's initialization section. Some modifications must
also be made to the LoanCalculator project file to perform


type
  TMonthlyLoanTests = class(TTestCase)
  private
    { Private declarations }
  protected
    { Protected declarations }
  published
    { Published declarations }
    procedure CreationTest;
  end;

procedure TMonthlyLoanTests.CreationTest;
var
  MyLoan: TMonthlyLoan;
begin
  MyLoan := TMonthlyLoan.Create;
  Check((MyLoan.LoanAmount = 0) and
    (MyLoan.InterestRate = 0) and
    (MyLoan.Months = 0),
    'Create without starting values failed.');
  MyLoan.Free;
  MyLoan := TMonthlyLoan.Create(10000, 6, 60);
  Check((MyLoan.LoanAmount = 10000) and
    (MyLoan.InterestRate = 6) and
    (MyLoan.Months = 60),
    'Create with starting values failed.');
  MyLoan.Free;
end;

Figure 2: Example of Check.

program LoanCalculator;

uses
  Forms, TestFrameWork, GUITestRunner,
  MonthlyLoan in 'MonthlyLoan.pas';

{$R *.res}

begin
  // Application.Run;
  GUITestRunner.RunRegisteredTests;
end.

Figure 3: Project file modifications for DUnit.

the tests that have been registered. First, TestFrameWork
and GUITestRunner have to be added to the project's uses
clause. Then Application.Run must be replaced by
GUITestRunner.RunRegisteredTests, as shown in Figure 3.
When the project is run, the GUI Test Runner window
should appear as shown in Figure 4. The tests are run
by pressing F9, clicking the green arrow, or selecting
Actions | Run from the menu. Tests that pass are
indicated by green squares, failures are indicated by
magenta, and tests that generate exceptions are indicated
by a red square.
Set Up and Clean Up
Now we'll examine properties and tests that allow
us to read and write the individual loan values. The
properties are named LoanAmount, InterestRate, and
Months, and the tests are named SetLoanAmountTest,
SetInterestRateTest, and SetMonthsTest.
Although code could be written in each of these tests
to create an instance of TMonthlyLoan, and then free it,
DUnit provides a way to do this with less code duplication.




Figure 4: The GUI Test Runner window.

type
  TMonthlyLoanTests = class(TTestCase)
  private
    FTestLoan: TMonthlyLoan;
  protected
    procedure SetUp; override;
    procedure TearDown; override;
  published
    procedure CreationTest;
    procedure SetLoanAmountTest;
    procedure SetInterestRateTest;
    procedure SetMonthsTest;
  end;

procedure TMonthlyLoanTests.SetLoanAmountTest;
begin
  FTestLoan.LoanAmount := 25000;
  Check(FTestLoan.LoanAmount = 25000,
    'Setting LoanAmount failed.');
end;

procedure TMonthlyLoanTests.SetInterestRateTest;
begin
  FTestLoan.InterestRate := 3.75;
  Check(FTestLoan.InterestRate = 3.75,
    'Setting InterestRate failed.');
end;

Figure 5: Tests that use a shared TMonthlyLoan instance created in SetUp.

TTestCase provides two virtual methods, SetUp and TearDown, which are run before and after each test. Overriding these procedures in the test class provides common initialization and clean-up code for the tests. These routines are declared in the protected section of TMonthlyLoanTests, so they don't appear in the test runner. The test class creates a private instance of TMonthlyLoan in SetUp before each test, and frees it in TearDown after each test (see Figure 5).
Programs often raise exceptions to indicate invalid values or
other problems. There are two ways to write tests for this
using DUnit. The first is to wrap the test code in a try..except
block, calling Check with a value of False right below the line
that is supposed to cause an exception. If no exception was
raised, this line will execute, causing the test to fail.
The other way is to see if a specific exception was raised
by using CheckException. CheckException accepts three parameters: a routine that should cause an exception to be raised, the type of exception expected, and the message to display in the test runner if the correct exception isn't raised. CheckException will fail if an exception other than the one expected is raised, or if no exception was raised.
The TMonthlyLoan class will raise an EInvalidInterestRate
exception if a negative interest rate is supplied. Two
tests were created to show the different ways of testing
for this: SetNegativeRateTest1 uses a try..except block
to conduct the test, while SetNegativeRateTest2 uses
CheckException. A routine named CauseInterestRateError
is called by both of these tests to cause TMonthlyLoan to
raise the EInvalidInterestRate exception (see Figure 6).
Most of the remaining test cases are written the same
way as those mentioned previously, so they wont be
explained here. However, the test for a loan with no


procedure TMonthlyLoanTests.SetMonthsTest;
begin
  FTestLoan.Months := 36;
  Check(FTestLoan.Months = 36, 'Setting Months failed.');
end;

procedure TMonthlyLoanTests.SetUp;
begin
  FTestLoan := TMonthlyLoan.Create;
end;

procedure TMonthlyLoanTests.TearDown;
begin
  FTestLoan.Free;
end;

Figure 5: SetUp and TearDown.

interest caused DUnit to exhibit unexpected behavior, so we should take a look at it. Because monthly payments for this type of loan are equal to the loan amount divided by the number of months in the loan, the first version of the Check statement to test this was:

Check(FTestLoan.MonthlyPayment =
  FTestLoan.LoanAmount / FTestLoan.Months, 'Error');

This check kept failing, even though the numbers appeared to be equal in the IDE. The problem was solved by performing the calculation in the test routine, and assigning the result to a variable. That variable was then compared to FTestLoan.MonthlyPayment in the Check call, as shown in Figure 7.
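An alternative worth noting (present in DUnit's TestFramework, though not used in the article) is the CheckEquals overload that accepts an allowed delta, which sidesteps exact floating-point comparison; the tolerance below is an arbitrary choice:

```pascal
// Hedged sketch: accept any difference smaller than half a cent
// instead of requiring exact equality of the two values.
CheckEquals(ExpectedResult, FTestLoan.MonthlyPayment,
  0.005, 'Zero percent interest rate failed.');
```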
Special Test Extensions
DUnit provides several ways to extend tests in a unit named TestExtensions, two of which will be mentioned here. TTestSetup and TRepeatedTest are classes that use the Decorator pattern to add functionality to TTestCase. With TTestSetup, the SetUp procedure is called once before the first test, and TearDown is called once after the last test (in contrast to TTestCase's behavior, where SetUp and TearDown are called for each test).




procedure TMonthlyLoanTests.CauseInterestRateError;
begin
  FTestLoan.InterestRate := -2;
end;

procedure TMonthlyLoanTests.SetNegativeRateTest1;
begin
  try
    CauseInterestRateError;
    Check(False, 'We should never reach this.');
  except
    on EInvalidInterestRate do
      Check(True, 'Test Passed - Correct error raised.');
    else Check(False,
      'Test failed - EInvalidInterestRate not raised.');
  end;
end;

procedure TMonthlyLoanTests.SetNegativeRateTest2;
begin
  CheckException(CauseInterestRateError, EInvalidInterestRate,
    'EInvalidInterestRate not raised.');
end;

Figure 6: Testing for exceptions.

unit SingleSetUpMonthlyLoan;

interface

uses
  SysUtils, TestFrameWork, TestExtensions;

type
  TMonthlyLoanTests = class(TTestCase)
  public
    procedure CauseInterestRateError;
  published
    procedure CreationTest;
    procedure SetLoanAmountTest;
    procedure SetInterestRateTest;
    procedure SetMonthsTest;
    procedure SetNegativeRateTest1;
    procedure SetNegativeRateTest2;
  end;

type
  TSingleSetUpTests = class(TTestSetup)
  protected
    procedure SetUp; override;
    procedure TearDown; override;
  end;
uses Math;

procedure TMonthlyLoanTests.ZeroInterestRateTest;
var
  ExpectedResult: Currency;
begin
  ExpectedResult :=
    FTestLoan.LoanAmount / FTestLoan.Months;
  ExpectedResult := SimpleRoundTo(ExpectedResult, -2);
  FTestLoan.InterestRate := 0;
  Check(FTestLoan.MonthlyPayment = ExpectedResult,
    'Zero percent interest rate failed.');
end;

Figure 7: Testing against a calculated value.

This is especially beneficial if these routines connect to a database, or perform some other time-consuming task. TRepeatedTest provides a mechanism for running tests a specific number of times, without repeatedly registering them with TestFramework. An example of using the TTestSetup class is shown in Figure 8. Examples of using TRepeatedTest can be found in the documentation that accompanies DUnit.
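As a rough sketch of the TRepeatedTest pattern (the suite variable and the iteration count here are illustrative, not from the article's code):

```pascal
// Register the Monthly Loan suite so that every test in it
// runs ten times, without registering the tests ten times.
TestFramework.RegisterTest(
  TRepeatedTest.Create(MonthlyLoanTests, 10));
```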
To create tests based on TTestSetup, first create a class that
descends from it. Then add the necessary set up and tear down
code by overriding the SetUp and TearDown methods from the
base class. An instance of the TTestSetup-based class is then
created, and a test suite containing the tests to run with the
new SetUp and TearDown routines is passed to its constructor.
In the sample code shown in Figure 8, the new SetUp
and TearDown routines are declared in the definition for
TSingleSetUpTests. The variable SingleSetUpTests is declared
as TTestSetup, and holds an instance of TSingleSetUpTests.
MonthlyLoanTests is declared as TTestSuite, and serves as
a container for the tests defined in TMonthlyLoanTests. An instance of MonthlyLoanTests is created in the unit's initialization section, and tests in TMonthlyLoanTests are added to it using the AddTests method. An instance of SingleSetUpTests is then created, and MonthlyLoanTests is passed to its constructor. Finally, SingleSetUpTests is


var
  MonthlyLoanTests: TTestSuite;
  SingleSetUpTests: TTestSetup;
  FTestLoan: TMonthlyLoan;

initialization
  MonthlyLoanTests :=
    TTestSuite.Create('Monthly Loan Tests');
  MonthlyLoanTests.AddTests(TMonthlyLoanTests);
  SingleSetUpTests := TSingleSetUpTests.Create(
    MonthlyLoanTests, 'Single Set Up Tests');
  TestFramework.RegisterTest(SingleSetUpTests);
end.

Figure 8: An example of using the TTestSetup class.

registered with the TestFramework so that its tests will be run by the test runner.
Console Mode Testing
Although the GUI test runner provides a visual way to control and monitor testing, it doesn't work well for automated builds and testing. This is why DUnit includes a console test runner for use in these cases.
To run tests using the console, the project containing the test code must either be a console application, or have Generate console application checked on the Linker tab of its Project Options dialog box. Replace GUITestRunner in the project's uses clause with TextTestRunner, and replace GUITestRunner.RunRegisteredTests with TextTestRunner.RunRegisteredTests. When this project is run, the text test runner displays a dot for each test that passed, an F for each failed test, and an E for each test that raised an exception. Pass rxbPause to the RunRegisteredTests call to keep the console window open after the tests have run.
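Putting these changes together, a minimal console-mode project file might look like the following (assuming the project and unit names used earlier in the article):

```pascal
program LoanCalculator;

{$APPTYPE CONSOLE}

uses
  TestFrameWork, TextTestRunner,
  MonthlyLoan in 'MonthlyLoan.pas';

begin
  // Run all registered tests; rxbPause keeps the console
  // window open after the run completes.
  TextTestRunner.RunRegisteredTests(rxbPause);
end.
```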
Extreme Programming is a method for writing programs that
differs from traditional coding practices in many ways. One of
these ways is in the area of code testing, where XP practices




dictate that tests be created before code is written, and that

the code is tested every time a change is made. As mentioned in the ReadMe file that accompanies DUnit, this makes applications self-testing, and gives the programmer confidence that refactoring hasn't broken existing code. DUnit provides the means to write these tests, keep them up-to-date with the code, and run them automatically in Delphi and Kylix.
Further Information
JUnit: "Test Infected: Programmers Love Writing Tests" by Erich Gamma and Kent Beck, http://junit.sourceforge.net/
ReadMe.html for DUnit
The examples referenced in this article are available for download on the Delphi Informant Magazine Complete Works CD located in INFORM\2003\JUN\DI200306RK.

Ralph Krause has been programming professionally since 1995 and has been
using Delphi exclusively for the past three years. He lives in Michigan and
works for Genesys Systems, Inc. writing software for automobile dealerships.



.NET Tech



By Alexei Fedorov and Natalia Elmanova

Using the .NET Compiler

Part V: Working with XML Documents

In our series introducing the Delphi for .NET Compiler Preview, we have covered some of the basics of ADO.NET, discussed how to implement connected and disconnected scenarios in console applications, demonstrated how to create console and Windows Forms applications, and showed how to use ADO.NET to access databases in .NET applications. Most recently, we demonstrated how to create Windows ADO.NET applications and how to use data-aware controls in them. It's important to note that ADO.NET uses XML as a universal data transmission format. This means that to provide data exchange with another application, we simply need an XML processing engine for this application, regardless of the platform it works on. In other words, any application, on any platform, can share ADO.NET data if it uses the same XML schema as the format for those data.
In this installment, we'll cover some basics of XML support in Microsoft .NET: reading, writing, and querying XML documents, performing navigation, and applying XSL transformations.


During the last several years, XML has become a universal data exchange format. There are many possible sources of XML data: XML documents themselves, XML data generated by XML Web Services, XML data extracted from Microsoft SQL Server databases (for more information on this, see Alexei Fedorov's five-part series on Delphi and SQL Server XML, starting in the August, 2001 issue of Delphi Informant), data from some other DBMS, ADO and ADO.NET datasets, and many others.
.NET XML Namespaces
The .NET Framework class library contains an integrated set of classes for working with XML documents and data. XML support in .NET is partitioned over several namespaces. The System.Xml namespace contains all classes for parsing XML documents, including the Document Object Model (DOM) parser, and the XmlReader and XmlWriter classes. The System.Xml.XPath namespace provides classes that are used to navigate a document with XPath. The System.Xml.Xsl namespace contains classes for applying XSL transformations. In the System.Xml.Schema namespace, we'll find classes that can be used to programmatically create or edit XSD schemas. Finally, the System.Xml.Serialization namespace contains classes that are used to serialize objects into XML-format documents or streams. All of these namespaces are packaged in the System.Xml.dll assembly.

Figure 1: An element-centric XML document.

Reading XML Documents
The System.Xml namespace contains an abstract class named XmlReader. This class provides a fast, forward-only, read-only cursor for processing an XML document stream. The streaming model this class implies requires no in-memory cache, and works in a fashion similar to SAX, i.e.



it retrieves the elements in the order they appear in a document, from top to bottom. XmlReader is a base class for several classes used in processing XML documents:
XmlNodeReader is used to read data from an XmlNode object.
XmlTextReader is a forward-only reader that has methods that return data read from the XML document as a text-based stream. This reader doesn't support data validation, but does check that the document is well formed.
XmlValidatingReader is a reader that provides a fully compliant validating XML parser, with Document Type Definition (DTD), XSD schema, or XDR schema support.

Figure 2: An attribute-centric XML document.

The XmlReader class can read XML data when it is entirely available, find the depth of the XML element stack, check if an element is empty, read and navigate attributes, as well as skip over elements and their content.
The XmlReader class supports the pull model. This means that an application pulls the data from the reader using the Read method. The typical code for doing so would look something like this:

Figure 3: Reading an element-centric XML document.

while Reader.Read do begin
  // Process an XML document.
end;

Before discussing how to use the XmlReader class, we need to mention that there are different ways for XML to represent the same data. For example, if we have a relational table that stores data about employees, we can represent data about each employee (first name, last name, title, etc.) as separate elements that are subnodes of the <employee> node, and create an element-centric document (see Figure 1).
Alternatively, we can present the same data as attributes of the <employee> node, and create an attribute-centric document (see Figure 2). Both documents are included in the accompanying code samples; see the end of the article for download details. Moreover, we can mix these types of data representation in an XML document to create a document that better suits our needs in each particular case.
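For instance, a mixed representation might store the record key as an attribute and the remaining fields as child elements; this fragment is illustrative only (the field names are assumptions, not from the article's samples):

```xml
<employee EmployeeID="1">
  <FirstName>Alexei</FirstName>
  <LastName>Fedorov</LastName>
  <Title>CTO</Title>
</employee>
```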
Let's look at how we use the XmlTextReader class to read the element-centric XML document shown in Figure 1. To read this document, we must create an instance of the XmlTextReader class, and specify the document to read:

var
  Reader : XmlTextReader;
begin
  Reader := XmlTextReader.Create('CustomersElements.xml');



Then, use the Read method to pull the data from the document, and create a string that contains the type of the element and its value:

while Reader.Read do begin
  FormatOneLine(Reader.Name + ' ' +
    Reader.NodeType.ToString + ' ' +
    Reader.Value, Reader.Depth);
end;

In our example, the FormatOneLine routine provides console output of the string we just created, indented according to the element depth:

procedure FormatOneLine(Line: string; Depth: Integer);
var
  I : Byte;
begin
  for I := 0 to Depth do
    Console.Write(' ');
  Console.WriteLine(Line);
end;

The result of executing this application is shown in Figure 3. The full source code for this application is shown in Listing One (on page 28).
Our next example will show how to use the XmlTextReader class to read the attribute-centric XML document shown in Figure 2. The code for this looks similar to the previous example. However, when reading an element, we also need to move through all of its attributes using the MoveToFirstAttribute and MoveToNextAttribute methods, and create appropriate output:

while Reader.Read do begin
  FormatOneLine(Reader.Name + ' ' +
    Reader.NodeType.ToString + ' ' +
    Reader.Value, Reader.Depth);
  if Reader.HasAttributes then
    while Reader.MoveToNextAttribute do
      FormatOneLine(Reader.Name + ' ' +
        Reader.NodeType.ToString + ' ' +
        Reader.Value, Reader.Depth);
end;

Figure 4: Reading an attribute-centric XML document.

The result of executing this application is shown in Figure 4. The full source code for this application is shown in Listing Two (on page 28).

Writing XML Documents
Along with the XmlReader class and its descendants, the System.Xml namespace provides an abstract class called XmlWriter. This class provides a non-cached, forward-only way of producing XML document streams that conform to the XML 1.0 recommendation of the W3C (World Wide Web Consortium), and the namespaces defined in this recommendation.
XmlWriter is a base class for the XmlTextWriter class, which defines an interface for writing XML and supports generating text-based XML streams. With this class we can specify whether to support namespaces, write well-formed XML streams and files, manage the output and determine its progress, write several documents to the same output stream, flush or close the output, and so on.
We can use the XmlWriter class to write an element-centric XML file. First, create an instance of the XmlTextWriter class, set the name of the XML file we will create, and define its formatting settings (such as indentation):

var
  Writer : XmlTextWriter;
begin
  Writer := XmlTextWriter.Create(
    'employees.xml', nil);
  Writer.Formatting := Formatting.Indented;
  Writer.Indentation := 2;

Then create a start element of the document:

  Writer.WriteStartDocument;

After this, create the root node:

  Writer.WriteStartElement('', 'Northwind', '');

Then create a child subnode:

  Writer.WriteStartElement('', 'Employees', '');

and write its subnodes:

  Writer.WriteElementString('FirstName', 'Alexei');
  Writer.WriteElementString('LastName', 'Fedorov');
  Writer.WriteElementString('Title', 'CTO');
  Writer.WriteElementString('HireDate', '15-01-2002');

After data generation, close all the open nodes:

  Writer.WriteEndElement;  // End the Employees node
  Writer.WriteEndElement;  // End the root node

To clean up, close the document, flush the stream buffer, and close XmlTextWriter:

  Writer.WriteEndDocument;
  Writer.Flush;
  Writer.Close;

Figure 5: Writing an element-centric XML document.

The document resulting from this application is shown in Figure 5. The full source code for this application is shown in Listing Three (on page 28).




The Document Object Model (DOM) is an in-memory tree representation of an XML document. This model enables the navigation and modification of a document, such as adding, updating, or deleting the content of elements. The DOM specification of the W3C provides a standardized way to manipulate XML data.
The XML DOM is supported in the XmlDocument class; it allows us to read an XML document and parse its contents into a set of nodes. These nodes represent the structure and content of the document, allowing applications to read and manipulate the information in the document using random access to the document's content. In other words, after parsing a document, its nodes can be explored in any direction. It's necessary to use the XmlDocument class when you need to modify the document structure, or apply an XSL transformation to the document.
The XmlDocument class allows some operations on the document as a whole, such as loading or saving an XML file. In addition, this class allows manipulating the nodes of the entire XML document: in particular, accessing and modifying attribute, element, and entity reference nodes, as well as retrieving entire nodes and the information contained in them.
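As a hedged sketch of the modification side (the element, attribute, and file names below are assumptions, not taken from the article's samples):

```pascal
var
  XmlDoc  : XmlDocument;
  NewNode : XmlElement;
begin
  XmlDoc := XmlDocument.Create;
  XmlDoc.Load('CustomersAttributes.xml');   // assumed file name
  // Create a new element, give it an attribute, and attach
  // it to the document root.
  NewNode := XmlDoc.CreateElement('customers');
  NewNode.SetAttribute('CompanyName', 'Sample Company');
  XmlDoc.DocumentElement.AppendChild(NewNode);
  // Persist the modified tree to a new file.
  XmlDoc.Save('CustomersUpdated.xml');      // assumed file name
end;
```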
Let's look at how to use the XmlDocument class. In our example, we'll parse the XML document. To do this, create an instance of the XmlDocument class to represent a document, as well as instances of the XmlNodeList and XmlNode classes to represent the list of the document's child nodes, and a single node:

var
  XmlDoc   : XmlDocument;
  RootNode : XmlNode;
  NodeList : XmlNodeList;
begin
  XmlDoc := XmlDocument.Create;
  XmlDoc.Load('CustomersElements.xml');  // file name assumed
  RootNode := XmlDoc.DocumentElement;
  NodeList := RootNode.ChildNodes;

Having the document in memory allows us to process it in any way. For example, we can create output showing the text contained in its nodes:

for I := 0 to NodeList.Count - 1 do
  Console.WriteLine(NodeList.Item(I).InnerText);

The full source code for this application is shown in Listing Four (on page 29).
Querying XML Documents
If we need to programmatically select and manipulate nodes of an XML document, we can use the XML Path Language (XPath) defined by the W3C. This language allows us to navigate the nodes of a document tree, and select individual nodes or groups of nodes in an XML document. The XPath data model maps an XML document to a tree of standard node types, such as root, element, attribute, text, comment, processing-instruction, and namespace. XPath expressions can identify these nodes in the XML document based on their types, names, and values, as well as their relationships to other nodes in the document. An XPath expression can return a set of nodes, a single node, a Boolean value, a floating-point number, or a string.
Our next example will show how to use an XPath expression to select a single node. First, create an instance of the XmlDocument class, as well as two instances of the XmlNode class, to represent the root node of the document and the node we want to select:

var
  XmlDoc : XmlDocument;
  Root   : XmlNode;
  Node   : XmlNode;
begin
  XmlDoc := XmlDocument.Create;
  XmlDoc.Load('CustomersAttributes.xml');  // file name assumed
  Root := XmlDoc.DocumentElement;

Then, select the necessary node using the SelectSingleNode method of the XmlNode class. This method accepts an XPath expression as a parameter:

  // XPath expression assumed for illustration.
  Node := Root.SelectSingleNode('customers[@Country=''USA'']');

Finally, create output that shows the attributes of the selected node:

for I := 0 to Node.Attributes.Count - 1 do
  Console.WriteLine(Node.Attributes.Item(I).Name + '=' +
    Node.Attributes.Item(I).Value);

The result of selecting a single node by querying the XML document is shown in Figure 6. The full source code for this application is shown in Listing Five (on page 29).
Now let's look at how we can select a set of nodes using XPath expressions. In this case, we need to create an instance of the XmlNodeList class, and use the SelectNodes method of the XmlNode class, with the XPath expression as its parameter:

Nodes := Root.SelectNodes('customers[@Country=''USA'']');

The result of selecting a set of nodes by querying the XML document is shown in Figure 7. The full source code for this example is shown in Listing Six (on page 29).
Performing XSL Transformations
If we need to transform an XML document to another XML document, or to a document of another type (e.g. HTML or text), we should use the XML-based language called Extensible Stylesheet Language Transformations (XSLT). The purpose of XSLT is to transform the content of a source XML document into another document that differs from the source document


in format or structure. XSLT allows us to exchange data with different business systems, as well as with different devices, such as PCs, WML-enabled mobile phones, and pocket computers.
There is a specification of XSLT defined by the W3C. The functionality of this specification is implemented in the XslTransform class, which can be found in the System.Xml.Xsl namespace. This class can transform input from an XmlDocument, XmlDataDocument, or XPathDocument (an optimized XSLT data store). The result of a transformation can be saved to a stream, to a TextWriter, or to an XmlWriter. It can also be provided as an XmlReader.

Figure 6: Selecting a single node from an XML document using an XPath expression.
Let's explore how to apply an XSL transformation to an XML document. In our last example, we'll convert the customers data in an XML file into a comma-separated list that contains the following information: company name, contact name, and phone. The appropriate XSL stylesheet is shown here:

<?xml version="1.0" ?>
<xsl:stylesheet xmlns:xsl=
  "http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:template match="/">
    <xsl:for-each select="NWCustomers/customers">
      <xsl:value-of select="@CompanyName"/>,
      <xsl:value-of select="@ContactName"/>,
      <xsl:value-of select="@Phone"/>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>

Figure 7: Selecting a set of nodes from an XML document using an XPath expression.

To apply the transformation, create an instance of the XslTransform class, and load the XSL stylesheet using the Load method. Then call the Transform method:

XmlDoc := XmlDocument.Create;
XslDoc := XslTransform.Create;
XmlDoc.Load('CustomersAttributes.xml');  // file names assumed
XslDoc.Load('customers.xsl');
Writer := XmlTextWriter.Create(Console.Out);
XslDoc.Transform(XmlDoc, nil, Writer);

Figure 8: Applying an XSL transformation to an XML document.

The result of applying the transformation is shown in Figure 8. The full source code for this example is shown in Listing Seven (on page 29).
We've shown how to use some of the XML features in .NET to manipulate XML data. We discussed how to read, write, and navigate XML documents, how to query XML documents using XPath expressions, and how to apply XSL transformations.
The demonstration projects referenced in this article are available for download on the Delphi Informant Magazine Complete Works CD located in INFORM\2003\JUN\DI200306AF.



Alexei Fedorov is a developer and consultant based in Moscow, Russia.

During his 20 years of experience he has worked as a chief technology
officer for a Swiss IT company, has provided technical support for Borland
languages, has participated in development and software localization,
and has created many Internet and intranet sites. Alexei also contributes
articles for asp.netPRO magazine, and has co-authored many books, including Professional ASP 2.0, ASP 2.0 Programmer's Reference, and Advanced Delphi Developer's Guide to ADO. Alexei's recent book, A Programmer's Guide to .NET, was published by Addison-Wesley in 2002.
Natalia Elmanova, Ph.D. is a developer and trainer based in Moscow,
Russia, as well as an executive editor for ComputerPress magazine
(www.compress.ru). During the last 15 years she has trained several hundred
Delphi and Visual Basic developers and has been a team leader in several
software projects for various commercial companies, research institutes, and
governmental organizations. Natalia was a contributing author to the 10th,
11th, 12th, and 13th Annual Borland Conferences and has co-authored many
books, including Advanced Delphi Developer's Guide to ADO.

Begin Listing One: Read an element-centric XML document

// Reading XML file (element-centric).
program XMLNetDemo;

uses
  System.XML;

var
  Reader : XmlTextReader;

// Increase indentation of a line
// depending on the node depth.
procedure FormatOneLine(Line: string; Depth: Integer);
var
  I : Byte;
begin
  for I := 0 to Depth do
    Console.Write(' ');
  Console.WriteLine(Line);
end;

begin
  // Open an XML document.
  Reader := XmlTextReader.Create('CustomersElements.xml');
  // Pull data from it.
  while Reader.Read do
    FormatOneLine(Reader.Name + ' ' +
      Reader.NodeType.ToString + ' ' +
      Reader.Value, Reader.Depth);
end.

End Listing One

Begin Listing Two: Read an attribute-centric XML document

// Reading XML file (attribute-centric).
program XMLNetDemo;

uses
  System.XML;

var
  Reader : XmlTextReader;

// Increase indentation of a line
// depending on the node depth.
procedure FormatOneLine(Line: string; Depth: Integer);
var
  I : Byte;
begin
  for I := 0 to Depth do
    Console.Write(' ');
  Console.WriteLine(Line);
end;

begin
  // Open an XML document (file name assumed).
  Reader := XmlTextReader.Create('CustomersAttributes.xml');
  // Pull data from it.
  while Reader.Read do begin
    FormatOneLine(Reader.Name + ' ' +
      Reader.NodeType.ToString + ' ' +
      Reader.Value, Reader.Depth);
    if Reader.HasAttributes then
      while Reader.MoveToNextAttribute do
        FormatOneLine(Reader.Name + ' ' +
          Reader.NodeType.ToString + ' ' +
          Reader.Value, Reader.Depth);
  end;
end.

End Listing Two

Begin Listing Three: Write an element-centric XML document

// Writing XML file (element-centric).
program XMLNetDemo;

uses
  System.XML, System.IO;

var
  Writer : XmlTextWriter;

begin
  Writer := XmlTextWriter.Create('employees.xml', nil);
  Writer.Formatting := Formatting.Indented;
  Writer.Indentation := 2;
  // Start the document.
  Writer.WriteStartDocument;
  // Place a comment.
  Writer.WriteComment('Delphi .NET XmlWriter Demo');
  // Start the root node.
  Writer.WriteStartElement('', 'Northwind', '');
  // Start the Employees node.
  Writer.WriteStartElement('', 'Employees', '');
  // Write subnodes.
  Writer.WriteElementString('FirstName', 'Alexei');
  Writer.WriteElementString('LastName', 'Fedorov');
  Writer.WriteElementString('Title', 'CTO');
  Writer.WriteElementString('HireDate', '15-01-2002');
  // End the Employees node.
  Writer.WriteEndElement;
  // End the root node.
  Writer.WriteEndElement;
  // End the document.
  Writer.WriteEndDocument;
  // Flush buffer to stream and close writer.
  Writer.Flush;
  Writer.Close;
end.

End Listing Three



Begin Listing Four: DOM-style XML processing

// Reading XML file.
program XMLNetDemo;

uses
  System.XML;

var
  XmlDoc   : XmlDocument;
  RootNode : XmlNode;
  NodeList : XmlNodeList;
  I        : Integer;

begin
  XmlDoc := XmlDocument.Create;
  // File name assumed from the earlier examples.
  XmlDoc.Load('CustomersElements.xml');
  RootNode := XmlDoc.DocumentElement;
  NodeList := RootNode.ChildNodes;
  for I := 0 to NodeList.Count - 1 do
    Console.WriteLine(NodeList.Item(I).InnerText);
end.

End Listing Four

Begin Listing Five: Select a single node using an XPath expression

// Querying XML file.
program XMLNetDemo;

uses
  System.XML;

var
  XmlDoc : XmlDocument;
  Root   : XmlNode;
  Node   : XmlNode;
  I      : Integer;

begin
  XmlDoc := XmlDocument.Create;
  // File name and XPath expression assumed.
  XmlDoc.Load('CustomersAttributes.xml');
  Root := XmlDoc.DocumentElement;
  Node := Root.SelectSingleNode('customers[@Country=''USA'']');
  for I := 0 to Node.Attributes.Count - 1 do
    Console.WriteLine(Node.Attributes.Item(I).Name +
      '=' + Node.Attributes.Item(I).Value);
end.

End Listing Five

Begin Listing Six: Select a set of nodes using an XPath expression

// Querying XML file.
program XMLNetDemo;

uses
  System.XML;

var
  XmlDoc : XmlDocument;
  Root   : XmlNode;
  Nodes  : XmlNodeList;
  I      : Integer;

begin
  XmlDoc := XmlDocument.Create;
  // File name assumed from the earlier examples.
  XmlDoc.Load('CustomersAttributes.xml');
  Root := XmlDoc.DocumentElement;
  Nodes := Root.SelectNodes('customers[@Country=''USA'']');
  Console.WriteLine(
    'Found ' + Nodes.Count.ToString + ' items:');
  // Display each selected node.
  for I := 0 to Nodes.Count - 1 do
    Console.WriteLine(Nodes.Item(I).OuterXml);
end.

End Listing Six

Begin Listing Seven: Apply an XSL transformation to the XML document

// Using XSL Transformations.
program XMLNetDemo;

uses
  System.XML, System.XML.XSL;

var
  XmlDoc : XmlDocument;
  XslDoc : XslTransform;
  Writer : XmlTextWriter;

begin
  XmlDoc := XmlDocument.Create;
  XslDoc := XslTransform.Create;
  // File names assumed; the stylesheet is the one
  // shown in the article.
  XmlDoc.Load('CustomersAttributes.xml');
  XslDoc.Load('customers.xsl');
  Writer := XmlTextWriter.Create(Console.Out);
  XslDoc.Transform(XmlDoc, nil, Writer);
end.

End Listing Seven




By Andrew Ghinaudo

Visual SourceSafe Access Made Simple

I've been using Microsoft Visual SourceSafe (VSS) since Delphi 3 was introduced. Since that time, I've taken a look at a few free and not-so-free software tools that integrate VSS directly into the Delphi IDE. VssConneXion is one of the few products I would purchase for this task.

For me, the key to any software tool is simplicity. VssConneXion was easy to install and easy to use. I began accessing my VSS database immediately after installing the product, via the new SourceSafe menu options in the Delphi IDE.
The first thing I noticed was a very familiar interface provided by VssConneXion called Project Explorer (see Figure 1). This is basically a lightweight replica of the VSS interface itself, and I found it very useful. It displays the SourceSafe projects in the same manner as VSS itself, with the same basic project options and file listings. The only function that I use in VSS that I didn't find in Project Explorer was the Add Files option, but after I ventured further into the new SourceSafe submenu items, I found a shortcut that did the trick.

Figure 1: The Project Explorer provides a UI similar to VSS itself.

Figure 2: VssConneXion provides an integrated menu for the Delphi IDE with the most-used VSS tasks.



It was good to see that I could get started with VssConneXion with a familiar tool like Project Explorer, but the value of a VSS integration tool obviously isn't in a reproduction of the VSS UI itself. The value really lies with the product's other integrated menu options that provide one-click access to the primary VSS functions. All of the most-used features in VSS are available from VssConneXion's SourceSafe menu. In addition, the menu options are tailored to your current Delphi project. So instead of opening VSS, traversing to a subproject, then selecting a file or files and performing a VSS action such as Get Latest Version or Check Out, you can simply perform a Get <Unit.pas> or a Check Out <Unit.pas>, which is already prepopulated in the menu.
The menu also includes quick access for such tasks as Show Difference,





Just the Facts

VssConneXion provides integrated menu options that allow
one-click access to the primary Visual SourceSafe (VSS)
functions. All of the most-used features in VSS are available,
and the menu options are tailored to your current Delphi
project. There are many Delphi IDE integration features in
VssConneXion that make using your VSS database a lot
quicker and simpler. If you use Microsoft Visual SourceSafe, it's worth your time to check out VssConneXion.
EPocalipse Software

Figure 3: The Compare To option makes looking at changes easy and quick.

Contact: info@epocalipse.com
Web Site: www.epocalipse.com
Price: Delphi or C++Builder versions available: 1-5 users,
US$59.95; 6-10 users, US$49.95; 11+ users, US$39.95;
site license, US$800.

Compare To, Files (Get Latest Version, Check In, Check Out, Undo Check Out), Get Updated Files, Check In Modified Files, Synchronize, Show History, and more (see Figure 2).
Some of the integrated extensions that are very helpful are the Compare To and Synchronize options. The Compare To option displayed in Figure 3 makes looking at code changes quick and easy, with a list of the last several checked-in versions. You don't have to locate the unit in a treelist, or go into a History dialog box to find what you're looking to compare; simply click which version you want to compare to, and a Difference Viewer pops up.
The Synchronize option is another great feature that provides some useful options for synchronizing your project (see Figure 4). With this option you can make sure that you always have the latest source code, and that other developers are updated with your changes.

Figure 4: Synchronizing options.


There are many other Delphi IDE integration features in

VssConneXion that make using your VSS database a lot
quicker and simpler, but you can discover those for yourself
if you choose to purchase the product. Im already sold.
For $59.95 you can purchase VssConneXion from the
EPocalipse Web site. They also have some attractive volume
discounts, a reasonable site-license option, and a free trial
version. If you use Microsoft Visual SourceSafe, its worth
your time to check out VssConneXion for yourself.
Andrew Ghinaudo is Manager of Enhanced Services Development at
Premiere Conferencing in Colorado Springs, CO. He also maintains his own
development/consulting business, MediaMicronics, and is the Technical Editor
for Delphi Informant Magazine. For information concerning MediaMicronics
consulting services send e-mail to info@mediamicronics.com.




By Clay Shannon

Team Coherence 7.1

Software Configuration Management Tool Brings Teams Together

As its name indicates, Team Coherence is an SCM (Software Configuration Management) tool that facilitates teams working together cohesively. Team Coherence supports collaboration not only of teams working in the same geographical area connected by a LAN or an intranet, but also of those who are dispersed across the world and connected via a WAN or the Internet. I used Team Coherence on a project where one team member was in a neighboring U.S. state and the other was in Germany.

The core piece of Team Coherence is its Version Control module, which comes in all four editions: Personal, Entry Level, Professional, and Enterprise. Quality Software Components Ltd. (QSC), the makers of Team Coherence, also make available two add-in modules: TC Tracker, a defect/issue tracking system; and TC Builder, a build automation utility. These modules integrate seamlessly into the core product. In fact, they integrate so well that I was unaware that TC Tracker and TC Builder were add-ins until I investigated the specifics of the product offering more thoroughly in preparation for writing this review (we used the Enterprise edition).

The Personal and Entry Level editions are single-user editions, while Professional and Enterprise are multi-user. The Entry Level edition has all of the same features as the Enterprise edition. In other words, the Entry Level edition includes the TC Tracker and TC Builder modules, which the Professional and Personal editions do not. The Entry Level edition, though, is a single-user version; you cannot connect to a remote server or allow other computers to attach to your server (which resides by definition on your machine in this case). You can think of the Entry Level edition as a full-featured Personal edition, or as a learner's edition of the Enterprise version.

Figure 1: Integrate Team Coherence into your compilers.




Installing Team Coherence is similar to installing most commercial software. If you're going to use one of the multi-user editions, install the server on the machine to which all users will have access, and the client on the machines that you want to access the server.

You can use Team Coherence as a completely separate environment, or you can integrate Team Coherence into Delphi's IDE. In fact, the documentation states that Team Coherence, although useful for a variety of other tools such as Visual Studio and ModelMaker, was created primarily for Delphi and C++Builder. This isn't surprising, because Team Coherence itself was developed with Delphi 5.

To integrate Team Coherence into the Delphi IDE, select Start | Programs | qsc | Client | IDE Integration on the client machines. Team Coherence will determine which compilers that it supports are installed on your system, and allow you to select the one(s) into which you want it to integrate (see Figure 1).

If you integrate Team Coherence into the Delphi IDE, you can then access it by selecting Workgroups | Connect to TC (or via the Connect to Team Coherence speed button that's added above the Component palette). The Workgroups menu will then be populated with several of the more common actions you will normally perform, such as getting files, checking out (locking), and checking in files. Or you can select Workgroups | Run Team Coherence to access the full TC GUI.

Whether or not you decide to integrate Team Coherence into the Delphi IDE, you can always access the Team Coherence Repository on the server via the Windows Start menu (Start | Programs | qsc | Client | Team Coherence).

Team Coherence is highly configurable, especially (as is to be expected) on the server side. The administrator adds users and passwords, sets users' properties (such as their e-mail address, so e-mail can be sent to them from within Team Coherence), determines whether files can be checked in without being prompted for comments, sets promotion levels (Testing, QA, Production), and much more.

On the client side, you set up the server to which you will connect, via the Connection Manager (available through either Start | Programs | qsc | Client | Connection Manager, or from Delphi's Workgroups menu, if you've integrated Team Coherence into the Delphi IDE) by entering the Host (server name) and Port.

Figure 2: Team Coherence in action.

You also must set up your Working Path so Team Coherence will know where to place files you get and check out (and from where to upload them when you check them in). For example, you may be retrieving the files from a drive on the server that has been mapped to a letter that you don't have mapped on your machine, such as R.

One minor complaint I had is that there is no Close Connection speed button or menu item within the Delphi IDE. You can connect to the remote Team Coherence Repository, but to disconnect you must select Workgroups | Run Team Coherence, then disconnect from there. I was told this would be addressed in the future.
Team Coherence Overview
You can begin using Team Coherence once it's installed and configured. If you're familiar with using other VCS tools (such as StarTeam, Visual SourceSafe, PVCS, and FreeVCS, to name a few), you should have no problem getting up to speed with Team Coherence. It supports all the actions you're accustomed to performing on files: get, check out, check in, lock, unlock, file comparison, versioning, labeling/baselining, etc. Additionally, you can freeze and thaw files, i.e. mark them as frozen to prevent further modifications, and thaw them if it's determined you froze them prematurely.

Besides being feature-rich, Team Coherence is intuitive to use. To give you a better idea of this, see Figure 2 for an example of what Team Coherence looks like when you run it. As you can see, Team Coherence has an attractive and easy-to-understand UI. You navigate in the repository to the project, directory, and file(s) you want to work with, then perform the necessary action by right-clicking the file, clicking a speed button, or selecting a menu item. The Outlook-style vertical bar on the left side of the screen indicates which modules you have installed (in our case, all three: Version Control, TC Tracker, and TC Builder).

Note that, unlike some VCS systems, you can add any type of file, not just source code files. Because of this, you can manage your project documentation, screen shots, or any other type of file you want to archive.




Team Coherence can be configured to automatically add comments to your files so you can ascertain the history of a file at a glance. Things like when it was last checked in, what changes were made at the time, and who made the changes can be very helpful, and setting up Team Coherence to add these types of comments automatically can be very useful.

Parallel Model vs. Serial Model
Another area where Team Coherence differs from some VCS tools is that it supports the Parallel Development Model (in addition to the more common Serial Development Model). In the Serial Model, only one user can work on a specific file at any given time, locking all others out from changing that file until the user is through with it and checks it back in (and unlocks it).

The problem with the Serial Model is that it can create serious bottlenecks in the development process while other members of the team wait to modify a locked file. The Parallel Model eliminates this problem by allowing multiple users to work on the same file simultaneously. When all the developers working on the one file are finished, they merge their changes (which Team Coherence obviously supports). Although this can be beneficial in certain circumstances when there is good communication between developers about who is working on what (you certainly wouldn't want multiple developers working on the same method at the same time), it also presents a potential problem: one developer's changes may not work when combined with those of another.

I'm not a fan of the Parallel Model; it seems too fraught with peril for my taste. The preferred way of working would be either to limit the scope of the files so that rarely would more than one developer want to work on the same file at the same time, or to use the Extreme Programming practice, at least in an instance like this, of both coders working on the same instance of the file together (at the same computer). Nevertheless, the ability to use the Parallel Model is there, if you want it. Of course, Team Coherence doesn't force you to work that way.

Some Server Specifics
The connection to the repository on the server is made via TCP/IP. The server could, if necessary, reside on the other side of the planet and be accessed using a standard Internet connection. Indeed, that was the case with the project I worked on: I was in the United States and the server was in Germany. Even so, and with a dial-up connection, transferring an entire large project's worth of source files took only a few seconds. The Team Coherence server can be run as a service under Windows NT and 2000, or as a standalone application under Windows 9x.

Figure 3: The TC Tracker add-in.
The TC Tracker Add-in
If you have the TC Tracker add-in installed, there will be
another page displayed to allow you to associate issues with
the files you are checking out. You can get a good idea of how
TC Tracker works, and the information it provides, by studying
the screen shot shown in Figure 3. The Version Control and TC
Tracker modules, although not dependent on one another, are
integrated in the sense that specific revisions of files in Version
Control can be associated with specific issues in TC Tracker.
Not surprisingly, TC Tracker is highly configurable. Among other things, you can set it up to show only the issues assigned to you. The administrator can also set up shared views that are available to all users.
The TC Builder Add-in
In smaller projects, you may not have a complicated build process. On large projects, though, one developer is usually assigned the responsibility of managing the build process. In fact, on one very large project on which I worked, the builds were the sole responsibility of one team member. For these types of projects, the build automation utility, TC Builder, can come in handy.

TC Builder is a powerful utility for managing and semi-automating the build process. Rather than creating build scripts at the command line, adding arcane switches and parameters, TC Builder allows you to visually create semi-automated builds that can be repeated at any future point. Workflow objects allow you to visually define the order of a build, any dependencies there may be, and what to do in the event of an error.

Like TC Tracker, TC Builder integrates tightly with the Version Control module to provide access to any revision of the files in the system, allowing you to rebuild any version of your system at the click of a mouse.




Just the Facts

Created primarily for Delphi and C++Builder, Team Coherence is a Software Configuration Management tool that
facilitates teams working together cohesively. You can use
Team Coherence as a completely separate environment or
integrate it into Delphis IDE. Team Coherence is highly configurable, especially on the server side. It is easy to use, and
is accompanied by excellent documentation.
Quality Software Components Ltd.
6 Suttie Way
Bridge of Allan
E-Mail: info@qsc.co.uk, support@qsc.co.uk
Web Site: www.qsc.co.uk
Price: TC Personal, US$149; TC Entry Level, US$249;
TC Professional, US$399 (discounts available for more than
5 licenses and upgrades); TC Enterprise, US$599 (discounts
available for more than 5 licenses and upgrades). TC Tracker
and TC Builder are also available separately; see QSCs Web
site for pricing details.
TC Builder targets Delphi and C++Builder developers. Nevertheless, it supports most types of compilers through some generic interfaces. As far as Delphi goes, it supports both packages (*.dpk) and projects (*.dpr) in Delphi 3-7.

TC.exe, the Command-line Utility
To further enhance your productivity, you can automate some of your oft-repeated steps with TC.exe, the Team Coherence command-line utility. This command-line tool allows you to execute several Version Control and maintenance tasks using batch files and macros.
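To illustrate the kind of batch automation this enables, here is a sketch of a nightly job. The tc command names and switches below are purely hypothetical stand-ins (consult the TC.exe help file for the real syntax); the DRYRUN prefix lets the script print the commands instead of running them.

```shell
#!/bin/sh
# Hypothetical nightly build job. "tc" is an illustrative stand-in for
# the Team Coherence command-line client; the subcommands and switches
# below are invented for this sketch, not TC.exe's real syntax.
# DRYRUN defaults to "echo", so the script prints the commands rather
# than requiring a TC client to be installed.
DRYRUN="${DRYRUN:-echo}"
STAMP=$(date +%Y%m%d)

# Fetch the latest revision of every file in the project...
$DRYRUN tc get -project MyApp -recursive
# ...then label the repository so this exact build can be rebuilt later.
$DRYRUN tc label -project MyApp -name "nightly-$STAMP"
```

A scheduled job along these lines can pull sources and stamp a label with no one at the keyboard, which is exactly the niche the command-line utility fills.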
Often sparse and terse, poorly written, or, worst of all, misleading or downright erroneous, documentation can be the Achilles' heel of software products. Even in this oft-neglected area, though, Team Coherence excels. Except for a few typos, the documentation is plentiful, clear, and well written.



The documentation includes help files for each module (Version Control, TC Tracker, and TC Builder), as well as separate help files dedicated to the APIs, which you can call if you want to create an add-in of your own. There is also a help file for TC.exe, the command-line utility. The documentation is so comprehensive that it even includes a list of the constants that Team Coherence defines, including a list of error codes.

If you want to learn exactly what Team Coherence can do before you order it, the documentation is available as a separate download from www.qsc.co.uk/doc_tc.htm.
Et Cetera
You'll need a license for each person using Team Coherence. There are numerous editions of Team Coherence, and there are multi-license discounts available, as well as discounts for upgrading; for details visit www.qsc.co.uk/prices.htm.

By the time you read this, QSC should have released a Linux version of the command-line tool and the API (scheduled for release in April 2003). Kylix 3 is being used to develop this tool. Check their Web site for the latest details.
The Bottom Line
Team Coherence is exceptionally easy to use. The impressive part is that QSC has achieved this ease of use without skimping on features. On the contrary, Team Coherence has everything you could want in an SCM tool: it's extensible, integrates well into Delphi, and is highly configurable. For the complete rundown of which features are available in which editions, see the feature matrix at www.qsc.co.uk/features_tc.htm. To try before you buy, download an evaluation copy from the QSC Web site.

Clay Shannon is a Borland and PDA-certified Delphi 5 developer and author of The Tomes of Delphi: Developer's Guide to Troubleshooting (Wordware, 2001) as well as the novel he claims is the strangest one ever written, The Wacky Misadventures of Warble McGorkle (see www.winsite.com/bin/Info?12500000036639 for more information about this and other novels he has written). You can find out more about Clay at http://hometown.aol.com/bclayshannon/myhomepage/index.html. To check out Clay's shareware and determine his current availability visit http://hometown.aol.com/bclayshannon/myhomepage/business.html. You can contact Clay at BClayShannon@aol.com.




By Matthew Hess

Easy MAPI 2
MAPI Power without the Complexity

The Messaging Application Program Interface (MAPI) is an extensive library of functions and services that developers can use to create mail-enabled applications. There are many reasons why you might want to work with MAPI. Perhaps you need to write a custom messaging application, or integrate an application with several messaging clients, each with its own API. Perhaps you want to send mail from a Windows NT service. Or maybe you just want to make those entertaining Microsoft Outlook XP security warnings go away.

Whatever your goal, if you've ever tried to deal with MAPI directly, you know that MAPI, especially Extended MAPI, is no walk in the park. Fortunately, Easy MAPI from RAPWare delivers just what the name promises: a clean, object-oriented, easy-to-use set of components and objects that hide the complexity of MAPI, yet allow you to leverage its significant power.
Solid Engineering, Excellent Documentation, Many Examples
One look at the Easy MAPI source code tells you that this is a solid piece of engineering. The code is written in a consistent style with logical object encapsulation, and is extensively commented (see Figure 1).
property LastModifiedOn: TDateTime read GetLastModifiedOn;
// The LastModifiedOn property contains the date and time
// the object or subobject was last modified.
// This returned value is in UTC format.
// Also note that CreatedOn and LastModifiedOn datetime of
// MSG messages refer to the date and time when the MSG
// file was created and last updated, instead of the
// create/modify dates of the message itself.
// See Also: RwUTCToLocal

Figure 1: An example of helpful comments by the RAPWare developers (reformatted for fit).



The Help file is thorough and current (they seem to update it with every build), and Easy MAPI comes with 20 sample projects illustrating basic and advanced techniques, from browsing message stores to managing folders to responding to message store events (see Figure 2).

Let's take a look at some of the functionality provided by Easy MAPI.

Five Core Components
RAPWare installs five core components on the Component Palette: RwMapiSession, RwMapiStore, RwMapiAddressBook, RwMapiFolderDialog, and RwWAB, which is new in version 2.0.1 (see Figure 3).

The RwMapiSession and RwMapiStore components provide the backbone of MAPI connectivity. After dropping these components on a form, connecting a MAPI session to the default store is just this easy:

MyStore.Session := MySession;
with MySession do begin
  LogonInfo.Shared := True;
  LogonInfo.ProfileName := RetrieveDefaultProfile;
  LogonInfo.Password := '';
  LoggedOn := True;
end;
MyStore.Active := True;

The RwMapiAddressBook component provides access to address books, and allows you to manage a recipients collection for a message. For example, after you've connected an RwMapiAddressBook to a logged-on session, this single code statement:

MyAddressBook.DisplayAddressDialog(
  'RAPWare MAPI Address Dialog',
  MyAddressBook.RecipientList, True, True, False);

creates the dialog box shown in Figure 4, which allows the user to pick recipients for a message.

Then, whether recipients have been added via a dialog box or directly in code, resolving them, removing unresolved entries, and attaching the list to a message is as easy as this:




with MyAddressBook do begin
  if ResolveNames('', False) <> S_OK then
    RemoveUnresolved;
  if RecipientList.Count > 0 then
    { attach RecipientList to the message };
end;

Figure 2: One of the Easy MAPI 2.0.1 sample projects; this one shows how to create and send e-mail using text, HTML, and RTF message bodies.

Figure 3: The Easy MAPI Component Palette.

Figure 4: Using AddressBook.DisplayAddressDialog to prompt the user to select recipients for a message.

A Clean, Object-oriented Approach
Supporting these five components are a host of classes, such as TRwMapiMessage, TRwMapiRecipientList, TRwMapiRecipient, and TRwMapiProp, which encapsulate the complex MAPI interfaces in coherent, usable objects. For example, many MAPI items, such as messages and recipients, are, deep down, implementations of the IMapiProp interface. Through TRwMapiProp, Easy MAPI lets you easily access data at this level. You can set MAPI properties on these items directly, like this:

with MyMessage do begin
  SetStringProp(PR_CONVERSATION_TOPIC, sString);
  SetLongwordProp(PR_IMPORTANCE, 1);
  SetTimeProp(PR_END_DATE, dDate);
end;

You don't have to worry about allocating stack space for the strings, or converting TDateTime values to SystemTime. Easy MAPI does it for you.

Besides providing a nice wrapper to the MAPI interfaces, these lower-level components also provide some useful enhancements and conveniences, such as the RemoveUnresolved method shown above. Here's another one: TRwMapiProp has a DisplayProperties method that displays a dialog box for inspecting the MAPI properties on an item (see Figure 5).


Figure 5: MAPI properties on a message item as shown by the TRwMapiProp.DisplayProperties function.

For convenience, these objects also frequently include several overloaded versions of a function. For example, TRwMapiMessage has three overloads for CreateAttachment.
A Live Product with Developers Who Respond
I recently used RAPWare's Easy MAPI quite extensively in a major project, and I am pleased to say that Peter Wolters (the P in RAP) responded almost immediately to all of my questions and implemented several enhancement requests (and one bug fix) within a week of my asking. And this was before he knew that I might write a review for Delphi Informant!

It's also comforting to know that Easy MAPI is a dynamic product that is constantly being updated and improved. Version 2.0.1 was released in March of 2003, but an updated version is already in the works and contains several exciting



enhancements, such as named property and filter support. I've had a chance to look at recent betas, and things look promising. Perhaps a new version will be available to the public by the time this review reaches your mailbox.

Just the Facts

Easy MAPI 2.0.1 from RAPWare is a component toolkit encapsulating the Simple MAPI, Extended MAPI, and Windows Address Book programming interfaces. Although Easy MAPI is indeed easy, it's neither simple nor shallow. Easy MAPI is offered in two versions: Lite and Professional (Professional includes full source).

E-Mail: info@rapware.com
Web Site: www.rapware.nl
Price: There are two pricing levels, single license and site license: Easy MAPI Lite Single, US$49; Easy MAPI Lite Site, US$129; Easy MAPI Professional Single, US$175; Easy MAPI Professional Site, US$499. RAPWare also offers free trial versions for Delphi 5, 6, and 7; see Web site for download details.



RAPWare's Easy MAPI is easy to install and easy to use. Developers wishing to get up and running quickly with MAPI support won't go wrong. Others wanting to do more sophisticated things will also not be disappointed. To quote from the RAPWare Web page, Easy MAPI lets you "do MAPI the Delphi way." For example, instead of calling:

function MAPILogonEx(ulUIParam: ULONG;
  lpszProfileName: PChar; lpszPassword: PChar;
  ulFlags: ULONG; out lppSession: IMAPISession):
  HResult; stdcall;

you would simply call:

MAPISession.LoggedOn := True;

Spend a little time delving into Easy MAPI and you too will discover many interesting and powerful possibilities.

Matthew Hess is a Delphi developer working for ProLaw Software, the legal industry's leading practice management provider. At ProLaw, Matthew specializes in COM programming and cross-product integrations. Readers may contact him at matthew@mlhess.com.



The Myth of the Upgrade

By Alan C. Moore, Ph.D.

There are certain things many of us take for granted. I suspect that one of those assumptions for people involved in technology is the notion that upgrades are always desirable. In this column I'll attempt to debunk that myth, sharing some personal experiences in which I adopt the role of the user and not the developer. Then, moving from general application issues to Delphi-specific ones, I'll share some developers' views on upgrades to our favorite development tool.
Personal horror stories. Let me begin by taking you back to the morning I started writing this column. I spent the morning, the entire morning, attempting to install a new version of a popular set of utilities on which I have depended for many years. By mid-afternoon I finally succeeded, but not before getting very angry and deciding enough was enough: it was time to start writing. The experience I describe herein is something we should never put our users through, so be warned!

The first problem I encountered was a message that the Windows installation application was out of date and needed to be updated. I attempted to do this many times, but kept getting the same error message. I finally went online to the vendor's Web site and learned what I needed to do: go into the Windows/System folder and rename a DLL. This is a rather extreme solution, but it's not the first time I've seen the Windows system do strange and unexpected things.

At least I was then able to launch the install program. But during that process I was forced to upgrade Internet Explorer (IE). I couldn't. I suspect it had to do with some of the problems created by the process I had endured up to this point. No matter how many times I rebooted (and I admit I lost count), I got the same message that another upgrade was pending and had to be completed first. It was time for more desperate measures!

Now I really started to go into hacking mode. I did some editing of the registry, but that didn't help. Then I edited a Windows initialization file (backing it up first, of course) that just didn't look right. I deleted most of it, rebooted again, and was then able to install IE, followed by the set of utilities. There were a few remaining problems, but they were minor and easily solved.

Was it worth it? I think so, but this frustrating experience will be with me for a while.
Unfortunately, I've had other, more recent, frustrations. The computer I have in my office at the university is a 33 megahertz Win95 machine. It runs my word processing application just fine (what computer doesn't?), but is limited if I attempt anything more demanding. When I returned to my teaching duties last August, I discovered the university had effected yet another upgrade to our e-mail client. I should have installed the earlier version on my hard drive (at least then I would've been able to continue to use a version that worked), but chose to link to the version on the network.

Again, I gave it my best effort, freeing up space on the small hard drive, removing resident programs I didn't need, and re-installing the new client multiple times, all to no avail. Several weeks ago the hard drive on the computer crashed, with multiple bad sectors rendering Windows inoperable. Much to my surprise, I found myself celebrating this occurrence! Rest in peace, old Win95 dinosaur.
The world of Delphi. As Delphi developers, we must be concerned with upgrading issues that include helping our users avoid the scenarios I've chronicled here. We must also consider questions of how to handle bug fixes, how to schedule new feature additions, how often to issue major upgrades, and a host of related business decisions. Should we provide free upgrades, including both bug fixes and new or enhanced features? As we discuss these issues, we'll consider how some Delphi developers feel about issues related to upgrades.

A while back there was a very interesting thread in the Delphi Talk list titled "The Need to Bitch." The originator of the thread posed this question: "Is anyone else getting tired of this particular side of the industry's seemingly unconscionable attitude regarding their responsibility to us when it comes to bug fixing?" He went on to point out that vendors don't always live up to their responsibility to ensure "... that their products work as advertised, include all the features, abilities, etc., advertised, and will be fixed in a timely and free manner whenever bugs are discovered." Here is one Delphi developer with high expectations of those from whom he buys software. But is he really asking that much?
What is the relationship between an upgrade strategy and the initial release of a product? Another contributor to this thread, agreeing with these statements, raised questions about a particular "upgrade" [his quotes] that simply adds 2 new features and fixes 6 old bugs, but still charges customers. Putting it even more strongly, he opined that, "Charging for updates when it's apparent the product hasn't reached maturity is highway robbery. When the developers are still excited about all the neat features to add, it's not time for release!" In such a situation he suggested giving the component away, promising that the developer would end up getting its weight in gold back in feedback and in-the-field experience. After the developer has produced a mature, stable product, "... then you're ready to offer it for sale." I know of at least one small component vendor that successfully used such an approach, inviting people to take part in a public beta, then providing them with a free version of the released product.
Best practices. What is the best approach to managing upgrades? I suggest you start by considering the myth of the upgrade, i.e. the assumptions (not always realistic) that users have: 1) that a new version of a product will work as well as its predecessor; 2) that upgrades can always be easily installed; and 3) that, with all of their productivity-enhancing features, upgrades are indispensable for the user. Then consider and implement the steps needed to ensure that your upgrade will work under those versions of Windows you're supporting, is easily installable, and is relatively bug-free and feature-rich.

Managing upgrades can be a balancing act. If you release too soon, your users may be disappointed because of too many bugs or too few new features. If you release too late, you may suffer lost profits or give your competition an advantage by coming out with an attractive product sooner. Always find a means to get continual feedback from your users, and use that feedback as a basis for your upgrading schedule.

The issues we've addressed this month apply to many types of development, from application development to component creation. They also relate to the various types of development in which we might be engaged, Delphi for Windows or Kylix for Linux. And speaking of Kylix (something not covered in this column for some time), next month's column will examine many of the Kylix books currently available. Until then...

Alan Moore is a professor at Kentucky State University, where he teaches music theory and humanities. He was named Distinguished Professor for 2001-2002. He has been named the Project JEDI Director for 2002-2003. He has developed education-related applications with the Borland languages for more than 15 years. He's the author of The Tomes of Delphi: Win32 Multimedia API (Wordware Publishing, 2000) and co-author (with John C. Penman) of The Tomes of Delphi: Basic 32-Bit Communications Programming (Wordware Publishing, 2003). He also has published a number of articles in various technical journals. Using Delphi, he specializes in writing custom components and implementing multimedia capabilities in applications, particularly sound and music. You can reach Alan on the Internet at acmdoc@aol.com.