Data Transformation Libraries Guide
Table of Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica Customer Portal. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica Web Site. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica How-To Library. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
Informatica Knowledge Base. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Informatica Support YouTube Channel. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Informatica Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Informatica Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Informatica Global Customer Support. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Message Splitter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
SWIFT MT Parsers and Serializers. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
MX Validators. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
MT/MX Translators. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Telekurs VDF Library. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Telekurs VDF Message Structure. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
XML of a Telekurs VDF Message. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Thomson Reuters Library. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Thomson Reuters Report Structure. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
XML Version of a Thomson Reuters Report. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Creating a Thomson Reuters Service. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Preface
The Data Transformation Libraries Guide is written for developers and analysts who need to transform data
to or from industry standard messaging formats. This guide explains how to use the library transformation
components with the industry standards and how to customize the library components for specialized needs.
This guide assumes that you understand the standards you are using and that you know how to deploy and
run Data Transformation services.
Informatica Resources
Informatica Customer Portal
As an Informatica customer, you can access the Informatica Customer Portal site at
http://mysupport.informatica.com. The site contains product information, user group information, newsletters,
access to the Informatica customer support case management system (ATLAS), the Informatica How-To
Library, the Informatica Knowledge Base, the Informatica Multimedia Knowledge Base, Informatica Product
Documentation, and access to the Informatica user community.
Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you
have questions, comments, or ideas about this documentation, contact the Informatica Documentation team
through email at infa_documentation@informatica.com. We will use your feedback to improve our
documentation. Let us know if we can contact you regarding your comments.
The Documentation team updates documentation as needed. To get the latest documentation for your
product, navigate to Product Documentation from http://mysupport.informatica.com.
Informatica How-To Library
The Informatica How-To Library includes articles and interactive demonstrations that provide solutions to common problems, compare features and behaviors, and guide you through performing specific real-world tasks.
Informatica Marketplace
The Informatica Marketplace is a forum where developers and partners can share solutions that augment,
extend, or enhance data integration implementations. By leveraging any of the hundreds of solutions
available on the Marketplace, you can improve your productivity and speed up time to implementation on
your projects. You can access Informatica Marketplace at http://www.informaticamarketplace.com.
Informatica Velocity
You can access Informatica Velocity at http://mysupport.informatica.com. Developed from the real-world
experience of hundreds of data management projects, Informatica Velocity represents the collective
knowledge of our consultants who have worked with organizations from around the world to plan, develop,
deploy, and maintain successful data management solutions. If you have questions, comments, or ideas
about Informatica Velocity, contact Informatica Professional Services at ips@informatica.com.
Use the following telephone numbers to contact Informatica Global Customer Support:
North America / South America
Asia / Australia
Toll Free
Toll Free
Toll Free
CHAPTER 1
Purpose of Libraries, 1
Obtaining Libraries, 2
Installing Libraries, 3
Purpose of Libraries
Data Transformation libraries contain predefined transformation components for use with the following
industry messaging standards:
ACORD AL3
ACORD-MDM Mappers
ASN.1
BAI
CREST
DTCC-NSCC
EDIFACT
FIX
FpML
HIPAA
HL7
IATA PADIS
IDC-IDSI
MDM JavaBeans
NACHA
NCPDP
SEPA
SWIFT
Telekurs VDF
Each library contains a large number of components, such as parsers, serializers, and XML schemas,
designed for use with industry standard and specific application messages. You can use these components
to transform documents from industry standards to XML and from XML to other formats.
Library components offer the following advantages:
Saving time. You can use predefined library components, rather than designing and implementing the
components yourself.
Robust construction. Supports all options and syntactic intricacies unique to each industry standard.
Library Structure
A library is a directory that contains a family of fully configured Data Transformation components. All libraries
contain parsers, serializers, and XML schemas. Some libraries contain additional components for purposes
such as message validation, acknowledgments, and diagnostic displays.
When you create a project, you can import a library component instead of building your own from scratch.
You can use a library component without modification, or you can edit it to meet your needs.
Each library is designed to work with a particular industry standard. For example, the HL7 library contains
project components for each of the message types or data structures available in the HL7 messaging
standard for medical information systems.
The subdirectories of the library contain Data Transformation projects that define the parsers, serializers, and
schemas. The projects are organized by category. For example, the HL7 library contains subdirectories such
as:
Admission Discharge and Transfer Messages
Ancillary Data Reporting
Control Section
The Admission Discharge and Transfer Messages directory, in turn, contains directories for specific types
of HL7 messages such as:
ADT_A01_Admit_a_Patient
ADT_A02_Transfer_a_Patient
The directories at the lowest level of nesting typically contain two Data Transformation CMW project files, for
the parser and the serializer, respectively. In addition, these directories contain TGP script files and XSD
schema files that define the transformations.
Obtaining Libraries
The Data Transformation libraries are updated and released at frequent intervals. To obtain a library, contact
Informatica. Be sure to specify:
Your Data Transformation version. With certain limitations, a single library release can be used with
multiple Data Transformation versions.
Any special requirements, such as compatibility with existing projects created using a particular library
version.
Library Documentation
In addition to reading this book, be sure to read the documentation supplied with each library, such as a
ReadMe file, Release Notes, or other explanatory material.
Installing Libraries
Install the libraries on a Microsoft Windows computer where you run Data Transformation Studio. In the
Studio, you can use the library components to create transformation projects. After you configure a project,
you can deploy it as a service to a production environment that runs Data Transformation Engine.
2.
3. In the left panel, select Data Transformation Libraries, and then click Add/Update in the right panel.
The Add Data Transformation Library dialog box appears.
4.
5. If you want to view the Release Notes for the library, click Release Notes.
6. Click Next.
7.
Uninstalling a Library
1.
2.
Data Transformation Studio enables you to import components from older library versions. As needed, the Studio upgrades the internal code of
the components for compatibility with the current Data Transformation version, while preserving the
transformation logic of the older version. For more information about upgrading projects, see the Data
Transformation Administration Guide.
RosettaNet
Standard for B2B commerce and collaborative processes. For more information, see
http://www.rosettanet.org.
SEPA
Single Euro Payments Area electronic payments standard. For more information, see SEPA Library on
page 65.
SWIFT MX
XML version of the SWIFT standard for the financial industry. For more information, see SWIFT
Library on page 65.
TWIST
Transaction Workflow Innovation Standards Team for electronic commerce. For more information, see
http://www.twiststandards.org.
UNIFI (ISO 20022)
UNIversal Financial Industry messaging standard. For more information, see http://www.iso20022.org.
CHAPTER 2
Using Libraries
This chapter includes the following topics:
2. Under the Data Transformation node, select Library Project and click Next.
3.
4. On the Select Message page, expand the tree to display the library name, the library version, and the specific message or data structure that you want to process.
For example, if you want to process HL7 version 2.3 ADT_A01 messages, expand the tree to the following directory:
HL7\2_3\Admission Discharge and Transfer Messages\ADT_A01_Admit_a_Patient
5. Select the specific parser or serializer component that you want to use in the project.
For example, select the following parser:
ADT_A01_Admit_a_Patient_Parser
6. Click Finish.
Data Transformation Studio imports the required library components, such as the CMW, TGP, and XSD
files, to a new project. The Data Transformation Explorer view displays the new project.
2. In the Data Transformation Explorer, expand the existing project to which you want to add the library component.
3. To add a parser or serializer, right-click the Scripts node and click Add File.
Browse to the new library project, select the TGP script file, and click Open. The Studio copies the file to the existing project.
You might need to add multiple TGP files. For example, a TGP file that defines a main parser might reference other TGP files that define secondary parsers. If you are not sure exactly which TGP files are required, it is safest to add all of them. Unused TGP files have no effect on the project.
4. To add an XML Schema, right-click the XSD node and click Add File.
Browse to the new library project, select the XSD file, and click Open. The Studio copies the file to the existing project.
Be sure to add all the XSD files that the library component requires. If you are not sure which XSD files are required, add all of them.
5.
You can delete the new library project if you do not need it for other purposes.
Testing a Parser
To test how a parser processes a source document, run it on an example source document and examine the
output. Alternatively, you can view its operation graphically.
1. Open the TGP script file of the project in the script panel of the IntelliScript editor.
2.
3.
4. Browse to the source document that is nested within the LocalFile component.
The example pane of the IntelliScript editor displays the test document.
5. Click IntelliScript > Mark Example to highlight the data that the parser finds in the document.
6.
7. In the Events view, examine the log and confirm that the parser ran without error.
8. To view the parser output, double-click the Results/Output.xml file in the Data Transformation Explorer.
Open the TGP script file of the project in the script panel of the IntelliScript editor.
2.
3.
Testing a Serializer
To test a library serializer:
1. Prepare an XML test input document conforming to the schema that is defined in the project.
An easy way to generate the XML document is to create a library parser project, and run it on a source document. The output of the parser is an XML document that you can use as the test input of the serializer.
2.
3. Optionally, set the example_source property of the serializer to the XML test input document.
If you omit this step, Data Transformation Studio prompts you for the test document when you run the project.
4.
5. In the Events view, confirm that the serializer ran without error.
6. To view the serializer output, double-click the output file in the Results folder of the Data Transformation Explorer.
Troubleshooting Graphically
1.
2. Display an example source document in the example pane of the IntelliScript editor, and click IntelliScript > Mark Example.
3.
4. Right-click one of the anchors in the IntelliScript, and click View Marking.
Data Transformation Studio highlights the anchor in the test document.
2.
3. In the Events view, expand the Execution node of the event log.
4.
5.
Data Transformation Studio highlights the anchor or component that caused the event, in both the IntelliScript pane and the example pane.
You can change the designation of the corresponding anchor from optional to mandatory. If the anchor is
mandatory and an input lacks the anchor, Data Transformation reports a failure. This does not necessarily
stop the transformation of the rest of input. For more information about failure handling, see the Data
Transformation Studio User Guide.
Conversely, a standard might consider a field to be mandatory, but your applications might treat it as
optional. In the library components, you can change the corresponding anchor from mandatory to optional. If
an input lacks the anchor, Data Transformation does not report a failure.
1. Open the library component in the script panel of the IntelliScript editor.
2.
3. Click the >> icon next to the anchor name to display its advanced properties.
4.
Adding a Segment
Some messaging standards, such as HL7 or EDI, group data fields into units such as segments. You can
modify a library component to support a custom segment.
One way to do this is to identify a portion of the IntelliScript that processes a similar segment. Copy and
paste the IntelliScript components, and edit them as required. You might need to add a secondary parser and
serializer to process the new segment. If necessary, edit the XML schema to support the new segment
definition.
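As background for the segment editing described above, HL7-style segments are pipe-delimited lines whose first field is the segment ID; custom segments conventionally use IDs that begin with Z. The Python sketch below is illustrative only (the library does this work in IntelliScript, not Python), and the ZPI segment is hypothetical:

```python
def parse_segment(line: str):
    """Split an HL7-style segment line into its segment ID and fields."""
    fields = line.rstrip("\r").split("|")
    return fields[0], fields[1:]

# A hypothetical custom Z-segment with three data fields:
seg_id, fields = parse_segment("ZPI|12345|GOLD|2009")
print(seg_id, fields)  # ZPI ['12345', 'GOLD', '2009']
```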
Documenting Customizations
Document the editing operations that you perform on library components. The documentation can be very
helpful if you later update the library component and need to reproduce the customizations.
CHAPTER 3
ASN.1 Library, 15
BAI Library, 20
Bloomberg Library, 22
CREST Library, 29
DTCC-NSCC Library, 30
EDIFACT Library, 32
EDI-X12 Library, 34
FIX Library, 39
FpML Library, 40
HIPAA Library, 40
HIX Library, 54
HL7 Library, 56
IDC-IDSI Library, 58
NACHA Library, 63
NCPDP Library, 64
SEPA Library, 65
SWIFT Library, 65
<Error_Code_14 />
<Network_Reference_Code_15 />
<Network_Reserved_For_Future_Use_16 />
<Message_Transmission_Date_Time_17>200406111100051</Message_Transmission_Date_Time_17>
</Message_Header_Group>
Description
ACORD_LNA_to_MDM_<version>.xml
MDM_to_ACORD_LNA_<version>.xml
The following table describes the folders in the ACORD MDM Mappings directory of the ACORD-MDM
Mappers library:
- ACORD_LNA_to_MDM. Folder with mapper files that translate from an ACORD LNA file to an output MDM JavaBeans XML file with Data Transformation Studio. You cannot modify the mapper scripts after you import the files.
- MDM_to_ACORD_LNA. Folder with mapper files that translate from an output MDM JavaBeans XML file to an ACORD LNA file with Data Transformation Studio. You cannot modify the mapper scripts after you import the files.
<MDM:com.siperian.inservice.bo.ProductSpecification>
<shortname>123456</shortname>
</MDM:com.siperian.inservice.bo.ProductSpecification>
</productSpecification>
</MDM:com.siperian.inservice.bo.Agreemt>
</agreemts>
<controlData>
<MDM:com.infa.b2b.ControlData>
<transRefGUID>01231212121234567890</transRefGUID>
<transType>Values Inquiry</transType>
</MDM:com.infa.b2b.ControlData>
</controlData>
2. Select Informatica > Import Object Metadata File (Advanced) and click Next.
The Import File screen appears.
3. Click Browse and select the mapper XML file that you want to import.
The Select Objects to Import screen appears.
4. In the Source pane, select the XMap and the schemas for the mapper and click Add to Target.
5. Click Next.
The Summary screen appears.
6. Click Finish.
The Developer tool imports the mapper and creates a Data Processor transformation with XMap objects.
For more information about XMap and the Developer tool, see the Informatica Developer User Guide.
ASN.1 Library
The ASN.1 library enables you to process data that complies with the Abstract Syntax Notation One (ASN.1)
standard.
The ASN.1 standard, maintained by the ISO and ITU-T organizations, is widely used in the
telecommunication industry. The standard provides a protocol specification syntax that enables the
representation of any data in compact structures. The standard allows the data to be encoded in several
binary or text formats. For more information about the standard, see
http://en.wikipedia.org/wiki/Abstract_Syntax_Notation_One.
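The BER encoding that the library processes wraps every value in a tag-length-value (TLV) triple. The following Python sketch is illustrative only, not part of the library or its APIs; it decodes single-byte tags and definite lengths, just to show the layout:

```python
def read_tlv(data: bytes, pos: int = 0):
    """Read one BER tag-length-value triple starting at pos.

    Returns (tag, value, next_pos). Handles single-byte tags and
    short/long definite-length forms only; a full decoder must also
    handle multi-byte tags and indefinite lengths.
    """
    tag = data[pos]
    pos += 1
    length = data[pos]
    pos += 1
    if length & 0x80:  # long form: low 7 bits give the count of length bytes
        n = length & 0x7F
        length = int.from_bytes(data[pos:pos + n], "big")
        pos += n
    value = data[pos:pos + length]
    return tag, value, pos + length

# A constructed SEQUENCE (tag 0x30) wrapping one INTEGER (tag 0x02), value 5:
encoded = bytes([0x30, 0x03, 0x02, 0x01, 0x05])
tag, value, _ = read_tlv(encoded)
inner_tag, inner_value, _ = read_tlv(value)
print(hex(tag), hex(inner_tag), inner_value[0])  # 0x30 0x2 5
```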
The library provides a Data Transformation service that parses ASN.1 protocol specifications. The input of
the service is a protocol specification (*.asn) file. The output is a Data Transformation project that contains
the following features:
A parser, serializer, and mapper that contain an AsnToXml document processor. The processor input is
encoded data conforming to the protocol specification. The output is an XML representation of the same
data.
The installation and configuration procedures for the ASN.1 library differ from those of other Data
Transformation libraries. The ASN.1 library does not provide a set of predefined transformations for specific
message types. Rather, it provides tools that create the transformations on demand. By using these tools,
you can configure transformations that process any ASN.1 source data.
The current ASN.1 library release enables processing input in the ASN.1 BER encoding. For information
about processing other ASN.1 encodings, contact Informatica.
The ASN.1 library transformations do not validate their input. If the input is invalid, the transformations might
fail or they might generate unexpected output.
The library includes a copy of the OSS Nokalva ASN-1Step utility that you can use to edit ASN.1 files.
You can access the ASN.1 library services through the C/C++ API or the .NET API.
    basicServiceUsedList       BasicServiceUsedList       OPTIONAL,
    supplServiceUsedList       SupplServiceUsedList       OPTIONAL,
    camelServiceUsed           CamelServiceUsed           OPTIONAL,
    valueAddedServiceUsedList  ValueAddedServiceUsedList  OPTIONAL,
    dualServiceRequested       DualServiceCode            OPTIONAL,
    operatorSpecInformation    OperatorSpecInfoList       OPTIONAL
...
ChargeableSubscriber ::= CHOICE
{
simChargeableSubscriber SimChargeableSubscriber,
minChargeableSubscriber MinChargeableSubscriber
}
...
SimChargeableSubscriber ::= [APPLICATION 199] SEQUENCE
{
    imsi    Imsi    OPTIONAL,
    msisdn  Msisdn  OPTIONAL
}
The entire structure is called a protocol data unit (PDU). A single protocol specification can define multiple
PDUs, in other words, multiple kinds of data structures having different top-level elements.
<imsi>45404200</imsi>
<msisdn>85290469</msisdn>
</simChargeableSubscriber>
</chargeableSubscriber>
b.
2.
Copy the line that contains the host name and code.
Send an email to Informatica Global Customer Support. Include the following information:
Informatica Global Customer Support sends you the ossinfo license files.
3.
4.
The step you take to install the ASN.1 Library depends on the environment:
Environment
Actions
Development
Production
For each machine, save the associated ossinfo license file in the Data Transformation installation
directory.
For example, if you installed Data Transformation in the default location on a Windows platform, save the
ossinfo file in the following directory:
c:\Informatica\9.5.0\DataTransformation
Uninstalling
On a Windows platform, uninstall the ASN.1 library before you uninstall Data Transformation. On a UNIX-type
platform, the order of uninstallation is not important.
If you uninstall Data Transformation before the ASN.1 library on Windows, the uninstallation does not
completely remove the library. To complete the removal:
1. Restart Windows.
2.
3.
Upgrading
Before you install a new version of the ASN.1 library, you must uninstall the previous version.
2. Under the Data Transformation category, select an Import Project, and then click Next.
3.
4.
5.
6. Click Finish.
The project appears in the Data Transformation Explorer view.
7. Double-click the parser, serializer, or mapper file to open it in the script panel of the IntelliScript editor.
The transformation contains only the AsnToXml document processor. It does not contain an example source, anchors, or actions.
8.
9.
The example source must contain one or more messages that conform to the protocol specification. In the current release, the example must be in the BER encoding (a *.ber file).
10.
11. Under the contains line, add components such as anchors and actions that process the XML and generate output.
12.
ASN-1Step Utility
As a utility for editing and validating ASN.1 files, the library includes a copy of the ASN-1Step development
environment. You can install ASN-1Step on a Windows computer. You can use ASN-1Step to edit ASN.1
protocol specifications and source data.
1.
2.
3. Browse to the Data Transformation installation directory where you stored the ossinfo license file.
4.
Description
asn_file
header
Defines a header to exclude from the XML. The header property has the following
options:
- NewlineSearch. The header is a newline.
- OffsetSearch. The header is defined by the number of characters from the beginning of
the file.
- PatternSearch. The header is defined by a regular expression.
- TextSearch. The header is defined by an explicit string or a string that you retrieve
dynamically from the source document.
no_constraints
Determines whether the ASN file is processed with constraints. The no_constraints
property has the following options:
- true. The ASN file is processed without constraints.
- false. The ASN file is processed with constraints.
Default is false.
pdu_type
process_first_message
Default is false.
separator
Defines text to ignore between records. The separator property has the following
options:
- NewlineSearch. The separator is a newline.
- OffsetSearch. The separator is defined by the number of characters from the end of the
previous record.
- PatternSearch. The separator is defined by a regular expression.
- TextSearch. The separator is defined by an explicit string or a string that you retrieve
dynamically from the source document.
Example
In an ASN.1 BER file, there is a header before the first message and separators between the subsequent
messages. The header and the separators are random sequences of hexadecimal 00 and FF bytes, of any
length. For example, a header or separator might have the following form:
00 00 FF 00 FF FF FF 00 FF 00
The following configuration enables the AsnToXml processor to find the header and separators. The header
and separator properties use regular expressions to define the pattern.
pre_processor = AsnToXml
asn_file = SGSN-CDR-def-v2009A.asn
header = PatternSearch
pattern = "[\x00\xFF]+"
separator = PatternSearch
pattern = "[\x00\xFF]+"
The processor ignores the headers and separators, and it processes the messages between them.
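Conceptually, the header and separator options tell the processor to discard runs of filler bytes and process what lies between them. The Python sketch below is a rough illustration of that behavior, not the AsnToXml implementation, and it is naive: it assumes 00/FF bytes never occur inside a message, which real BER data does not guarantee:

```python
import re

# Runs of 0x00 / 0xFF bytes act as the header and the record separators.
FILLER = re.compile(rb"[\x00\xFF]+")

def split_messages(stream: bytes) -> list:
    """Drop leading filler and separators, keeping the messages between them."""
    return [chunk for chunk in FILLER.split(stream) if chunk]

raw = (b"\x00\xFF\x00"             # header
       + b"\x30\x03\x02\x01\x05"   # first BER message
       + b"\xFF\xFF"               # separator
       + b"\x30\x03\x02\x01\x07")  # second BER message
print(len(split_messages(raw)))  # 2
```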
BAI Library
The BAI library implements the following specifications developed by the BAI financial service industry
organization:
The BAI library transformations do not validate their input. If the input is invalid, the transformations might fail
or they might generate unexpected output.
For more information about the specifications, see http://www.bai.org/operations/reportingcodes.asp.
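For orientation, BAI format files consist of comma-delimited records whose first field is a two-digit record type code. The Python sketch below only classifies records by that code; it is illustrative, not the library's parser, and the code list is a common subset from the BAI specification:

```python
# A subset of BAI record type codes; see the BAI specification for the full list.
RECORD_TYPES = {
    "01": "File Header",
    "02": "Group Header",
    "03": "Account Identifier",
    "16": "Transaction Detail",
    "49": "Account Trailer",
    "88": "Continuation",
    "98": "Group Trailer",
    "99": "File Trailer",
}

def record_type(line: str) -> str:
    """Classify a BAI record by its leading record type code."""
    return RECORD_TYPES.get(line.split(",", 1)[0], "Unknown")

print(record_type("16,165,1500000,S,,,/"))  # Transaction Detail
```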
Bloomberg Library
The Bloomberg library validates Bloomberg response messages and converts them to XML.
The Bloomberg library performs the following types of validation:
The Bloomberg field type definitions change on a daily basis. The Data Transformation Bloomberg library can
create a service from a sample request or response message. The service contains a parser for small input
files and a streamer for large input files.
The following table describes the files that the service produces:
File
Description
output.xml
errors.xml
errorsFound.txt
Boolean flag that indicates whether the service detected errors in the input.
Note: When you use a streamer to process large files, some comments do not appear in the output.
|10/02/2009|100.0000
|10/02/2009|100.0000
|11/01/2007|103.203500
<Flag name="REPLYFILENAME">
<Value>DLMRT_comma_bulklist.out</Value>
</Flag>
<Flag name="FIRMNAME">
<Value>dl927</Value>
</Flag>
<Flag name="PROGRAMFLAG">
<Value>oneshot</Value>
</Flag>
<Flag name="COLUMNHEADER">
<Value>yes</Value>
</Flag>
<Flag name="SECMASTER">
<Value>yes</Value>
</Flag>
<Flag name="CREDITRISK">
<Value>yes</Value>
</Flag>
<Flag name="DELIMITER">
<Value>,</Value>
</Flag>
<Flag name="OUTPUTFORMAT">
<Value>bulklist</Value>
</Flag>
</Header>
<Fields>
<Field name="CALL_SCHEDULE" />
</Fields>
<Record object_id="US4581401001 US">
<CALL_SCHEDULE />
</Record>
<Record object_id="US4581401001 US Equity">
<CALL_SCHEDULE />
</Record>
<Record object_id="431022KW1 Equity">
<CALL_SCHEDULE />
</Record>
<Record object_id="431022KW1 Corp">
<CALL_SCHEDULE>
<Row>
<BC_DT>10/02/2009</BC_DT>
<BC_CALL_SCHEDULE_PX>100.0000</BC_CALL_SCHEDULE_PX>
</Row>
</CALL_SCHEDULE>
</Record>
<Record object_id="431022KW1 Govt">
<CALL_SCHEDULE>
<Row>
<BC_DT>10/02/2009</BC_DT>
<BC_CALL_SCHEDULE_PX>100.0000</BC_CALL_SCHEDULE_PX>
</Row>
</CALL_SCHEDULE>
</Record>
<Record object_id="US694032BD48 Muni">
<CALL_SCHEDULE>
<Row>
<BC_DT>11/01/2007</BC_DT>
<BC_CALL_SCHEDULE_PX>103.203500</BC_CALL_SCHEDULE_PX>
</Row>
</CALL_SCHEDULE>
</Record>
</BBG:BloombergDataLicense>
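For illustration, the pipe-delimited CALL_SCHEDULE rows in the input shown earlier correspond to the Row elements above. The following Python sketch shows only that one mapping; it is not the library's parser, which also handles the header flags, field definitions, and validation:

```python
import xml.etree.ElementTree as ET

def row_to_xml(line: str) -> str:
    """Map one pipe-delimited call-schedule row to its XML Row element."""
    _, bc_dt, px = line.split("|")
    row = ET.Element("Row")
    ET.SubElement(row, "BC_DT").text = bc_dt
    ET.SubElement(row, "BC_CALL_SCHEDULE_PX").text = px
    return ET.tostring(row, encoding="unicode")

print(row_to_xml("|10/02/2009|100.0000"))
# <Row><BC_DT>10/02/2009</BC_DT><BC_CALL_SCHEDULE_PX>100.0000</BC_CALL_SCHEDULE_PX></Row>
```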
Description
lookup.out
Defines valid values for fields and the structure of bulk format fields.
fields.csv
Defines fields for Per Security, Back Office, and Extended Back Office.
crisk_fields.csv
backoffice_fields.xls
backoffice_extended_fields.xls
2. Select Data Transformation > Import Project, and then click Next.
The Import Data Transformation Project page appears.
3.
4.
5. Click Next.
6.
7.
8.
9. If you want to run a streamer service, open the TGP script for the streamer, and then set the streamer as the startup component.
10.
A parser that transforms input data from the COBOL EBCDIC data definition to XML.
You can deploy the parser and serializer as Data Transformation services that process COBOL data, and you
can incorporate them into other projects.
2.
3. On the next wizard page, enter a name for the project, and select the storage location.
The default location is the Data Transformation Studio workspace folder.
4. On the next page, browse to the COBOL data definition file, in other words, to a COBOL copybook.
5. Click Finish.
Data Transformation Studio generates a project containing an XSD schema file, a parser, and a serializer for the COBOL data.
The first line must be a remark, with a * in column 7, or it must start with a level number.
INDEXED BY clauses
Test Procedures
This section briefly describes procedures for testing and working with COBOL transformations. For more
information, see the Data Transformation Getting Started Guide and the Data Transformation Studio User
Guide.
1. In the Data Transformation Explorer view, double-click the TGP script file of the parser.
The parser appears in the script panel of the IntelliScript editor.
2. Right-click the parser name, and then click Set as Startup Component.
3. Expand the IntelliScript tree, and then edit the example_source property of the parser. Change its value from Text to LocalFile.
The auto-generated COBOL parser is configured in a way that does not require an example source document. When you finish testing, you can remove the example source if you wish. Leaving the example source has no effect on the transformation at runtime.
4. Nested within the LocalFile component, assign the file_name property. Browse to the input file, which contains the sample COBOL data.
5.
The document appears in the example pane of the IntelliScript editor. If the example pane is not visible, click IntelliScript > Both. If the document is not displayed, right-click the parser name in the IntelliScript, and click Open Example Source.
6. Optionally, click IntelliScript > Mark Example to highlight the data that the parser finds in the document.
If the data correctly conforms to the data definition that you imported, the parser should find all the data in the document. Therefore, the Mark Example command should highlight the entire document.
For more information about the color coding, see the Data Transformation Studio Editing Guide.
7.
8. Examine the execution log in the Events view. Confirm that the parser ran without error.
For more information about interpreting the log, see the Data Transformation Studio User Guide.
9. To view the parser output, double-click the Results\output.xml file in the Data Transformation Explorer.
1. In the Data Transformation Explorer, double-click the TGP script file of the serializer.
The serializer appears in the script panel of the IntelliScript editor.
2. Right-click the serializer name, and then click Set as Startup Component.
3. Click Run > Run to activate the serializer. At the prompt, browse to the Results\output.xml file, which you generated when you ran the parser.
4. Examine the execution log in the Events view. Confirm that the serializer ran without error.
5. To view the serializer output, double-click the Results\output.xml file in the Data Transformation Explorer.
The display should be the same as the original input on which you ran the parser.
To store COBOL data in a database:
1. Use the auto-generated COBOL parser to convert the native COBOL data to the XML representation.
2. Use a mapper to transform the XML representation to the corresponding database fields.
3. Within the mapper, use an ODBCAction component to store the result in the database.
To retrieve the data from the database:
1. Use a second mapper, containing an ODBCAction component, to retrieve the data from the database.
2. The mapper transforms the database fields to the XML representation of the COBOL data.
3. Use the auto-generated COBOL serializer to convert the XML to the native COBOL representation.
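As an illustration of the parsing step above, converting native COBOL data to XML amounts to slicing fixed-width fields according to the copybook definition. The layout below is hypothetical, and the sketch uses plain ASCII; the auto-generated components also handle EBCDIC, COMP fields, OCCURS, and REDEFINES:

```python
import xml.etree.ElementTree as ET

# Hypothetical layout, standing in for a copybook such as:
#   01 CUSTOMER.  05 CUST-ID PIC 9(5).  05 CUST-NAME PIC X(10).
LAYOUT = [("CUST_ID", 0, 5), ("CUST_NAME", 5, 15)]

def record_to_xml(record: str) -> str:
    """Slice one fixed-width record into an XML representation."""
    root = ET.Element("CUSTOMER")
    for name, start, end in LAYOUT:
        ET.SubElement(root, name).text = record[start:end].strip()
    return ET.tostring(root, encoding="unicode")

print(record_to_xml("00042SMITH     "))
# <CUSTOMER><CUST_ID>00042</CUST_ID><CUST_NAME>SMITH</CUST_NAME></CUSTOMER>
```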
Implementation
You can chain the auto-generated COBOL transformations together with other transformation components in
a single Data Transformation project. For example, you can use the RunParser, RunMapper, and
RunSerializer actions to activate the various components described in the above example. For more
information about these actions, see the Data Transformation Studio User Guide.
Alternatively, you can deploy the components in separate Data Transformation services, which you activate
in sequence. For more information about activating the services, see the Data Transformation Engine
Developer Guide.
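The chaining described above can be sketched as a simple pipeline in application code. The following Python sketch is illustrative only; the lambda stand-ins take the place of calls to deployed Data Transformation services, whose invocation API is not fixed by this guide.

```python
def run_pipeline(payload, *services):
    """Feed each service's output to the next service in the chain,
    mirroring the RunParser, RunMapper, RunSerializer ordering."""
    for service in services:
        payload = service(payload)
    return payload

# Hypothetical stand-ins; real code would activate deployed
# Data Transformation services instead.
parse = lambda cobol: "<xml>" + cobol + "</xml>"   # COBOL parser stand-in
store = lambda xml: {"field": xml}                 # ODBC mapper stand-in
result = run_pipeline("COBOL-DATA", parse, store)
assert result == {"field": "<xml>COBOL-DATA</xml>"}
```

The same runner works whether the stages are in-process components or separately deployed services activated in sequence.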
CREST Library
The CREST library processes the messages produced by an electronic settlement system used for stock and
bond transactions in the United Kingdom. The CREST standard is maintained by Euroclear.
The CREST library transformations do not validate their input. If the input is invalid, the transformations might
fail or they might generate unexpected output.
For more information about the standard, see https://www.euroclear.com/site/public/EUI.
DTCC-NSCC Library
The Depository Trust and Clearing Corporation (DTCC) provides broker-to-broker securities transaction
processing services, such as clearance, settlement, and information services for stocks, bonds, and mutual
funds.
The National Securities Clearing Corporation (NSCC), a DTCC subsidiary, provides clearing, settlement, and
risk management services for US broker-to-broker trades involving equities, corporate and municipal debt,
and unit investment trusts. The NSCC messaging protocol is the proprietary system by which DTCC
customers exchange financial information and transactions. For more information about DTCC and NSCC,
see http://www.dtcc.com.
The Data Transformation DTCC-NSCC library provides components that transform NSCC messages.
You can use the Library Customization Tool to edit the DTCC_NSCC library components. For more
information, see Chapter 4, Library Customization Tool on page 75.
10230079204
0000000093DTCCCDT
2. In Data Transformation Studio, create a project containing a library parser for the appropriate message type.
For example, create a project containing a POV parser. Optionally, you can customize the parser.
3. In the Data Transformation Explorer view, right-click the Scripts folder and click Add File. Add the streamer TGP file for the message type.
4.
5.
6.
7. On the XML Generation page, clear the option to Add XML Processing Instruction.
8. Test the streamer in the Studio, and then deploy it as a Data Transformation service.
Validation
The DTCC-NSCC library parsers validate the input messages. The parsers implement DTCC validation types
such as mandatory field checks, field-level validation, order of messages, and rule validation. For more
information about these validation types, see the DTCC documentation.
By default, the library enables the validation. In cases where the input has already been validated, you can
disable the validation. To do this, pass a service parameter called strictMode to the parser, and set its value
to false. Alternatively, edit the IntelliScript and initialize the strictMode variable to a value of false.
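Passing strictMode might look like the following Python sketch. The run_parser wrapper and its signature are hypothetical stand-ins for whatever invocation mechanism your application uses; only the parameter name and its values come from this guide.

```python
def run_parser(service_name, input_data, service_params=None):
    """Hypothetical stand-in for invoking a deployed DTCC-NSCC parser
    service; it only shows how the strictMode parameter is passed."""
    params = {"strictMode": "true"}   # library default: validation enabled
    params.update(service_params or {})
    # ... activate the service with input_data and params here ...
    return params

# The input was already validated upstream, so disable the validation.
effective = run_parser("POV_Parser", b"...", {"strictMode": "false"})
assert effective["strictMode"] == "false"
```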
To selectively disable certain types of validations, you can edit the message schemas in the Library
Customization Tool. For example, you can change a mandatory field to optional.
The library projects supply serializers in two versions. Serializers having names ending in
_Serializer_Restricted perform strict validation according to the DTCC standard. Serializers having
names ending in _Serializer perform a lesser degree of validation.
Except in extreme cases, validation errors do not cause the library parsers or serializers to fail.
Validation Reports
In addition to the standard parser or serializer output, the DTCC-NSCC transformations write validation
messages to the following output ports:
- errors
- errorsFound. A boolean value: true if the transformation found validation errors, or false if it found no errors.
An application can pass the port locations to the transformation service. For more information about output
ports, see the Data Transformation Studio User Guide.
EDIFACT Library
The EDIFACT library implements the EDIFACT international electronic data interchange standard maintained
by the United Nations. For more information, see http://www.unece.org/trade/untdid/welcome.htm.
You can use the Library Customization Tool to edit the EDIFACT library components. For more information,
see Chapter 4, Library Customization Tool on page 75.
Validation categories:
- Input type
- Enumerations
- Mandatory fields
Error codes and descriptions:
- 12. Invalid value
- 13. Missing
- 18. Unspecified error
- 35
- 36
- 37
- 39
- 40
EDIFACT Acknowledgments
If you use the with_validations components of the EDIFACT library, you can configure a transformation to
generate EDIFACT CONTRL acknowledgments automatically. To do this, the transformation can run a
predefined parser on the XML output of the library component. The output of this parser is the
acknowledgment, including a summary of the validation errors.
To generate EDIFACT CONTRL acknowledgments:
1. Run the transformation defined in the EDIFACT_XML_to_CONTRL project of the EDIFACT library.
2. As the input of the transformation, pass the XML output of a library parser.
3.
- inInterchangeReferenceNumber. Required.
- inExpectedInterchangeReferenceNumber. Required.
- inCorrectRecipientIdentification. Optional.
- inSupportedSyntaxIdentifiers. Optional.
- inSupportedSyntaxVersions. Optional.
- inTestIndicator. Optional.
The transformation generates the CONTRL acknowledgment, including a summary of the validation
errors.
EDI-X12 Library
The Data Transformation EDI-X12 library contains projects that convert messages between X12 and XML,
and generate validation error reports and acknowledgements.
X12 is a standard for electronic data interchange (EDI) between trading partners over data networks. X12 is
developed and maintained by the Accredited Standards Committee (ASC). For more information, see
http://www.x12.org.
You can use the Library Customization Tool or Data Transformation Studio to edit the EDI-X12 library
projects.
- Interchange. The outer layer that wraps the entire X12 message.
- Functional group. The middle layer that wraps one or more transaction sets.
Each transaction set contains a header, one or more data segments, and a trailer. A transaction set might
contain data such as a purchase order, an invoice, or a statement of account.
The following example illustrates an X12 Vessel Content Details transaction set. In this example, segments
are delimited by tildes (~) and data elements are delimited by asterisks (*).
ST*109*0001~B4*15*0*2*19960515*1424*Statu*Equi*Equipment*D*XXXX*
Location Identifier*A*3~N9*01*Reference Identification*
Free-form Description*19960515*1424*01*01^Reference Identification^01^
Reference Identification^01^Reference Identification~Q2*X*XX*19960515*
19960515*19960515*10045*21392*A*Flight/Voy*01*Reference Identification*
B*Vessel Name*10691*B*E~V9*AAD*Event*19960515*1424*City Name*XX*XX*001*
XXXXXX*25946*Tr*Free-Form Message*01*25878*XXXXXX*717*437*272*2457*
12935~R4*1*A*Location Identifier*Port Name*XX*Terminal Name*Pier*XX~DTM*
001*19960515*1424*01*CC*Date Time Period~V9*AAD*Event*19960515*1424*
City Name*XX*XX*001*XXXXXX*4685*Tr*Free-Form Message*01*13647*XXXXXX*813*
605*52*20035*12104~N9*01*Reference Identification*Free-form Description*
19960515*1424*01*01^Reference Identification^01^Reference Identification^
01^Reference Identification~SG*2*001*XX*19960515*1424*01~SE*11*0001~
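Given this delimiter scheme, a transaction set can be tokenized with a few lines of code. A minimal Python sketch, using an abbreviated form of the example above:

```python
def parse_x12(message, segment_terminator="~", element_separator="*"):
    """Split an X12 transaction set into a list of segments, where each
    segment is a list of its data elements."""
    segments = []
    for raw in message.split(segment_terminator):
        if raw.strip():                      # skip text after the final ~
            segments.append(raw.split(element_separator))
    return segments

# Abbreviated from the Vessel Content Details example above.
sample = "ST*109*0001~B4*15*0*2*19960515*1424~SE*11*0001~"
segments = parse_x12(sample)
assert segments[0][0] == "ST"                # transaction set header
assert segments[-1] == ["SE", "11", "0001"]  # transaction set trailer
```

A real parser must also read the delimiters from the ISA segment rather than assume tilde and asterisk, since X12 lets the sender choose them.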
Each Data Transformation X12 service provides a parser and two serializers. The parser performs strict
validation. The following table describes the serializers by validation type:
- Loose validation. Service name ends with _Serializer. If the service finds validation errors, it generates XML output and an error report.
- Strict validation. Use the Library Customization Tool to disable or modify certain types of validations.
Both serializer types generate SE, GE, and IEA trailer segments. You can omit the SE, GE, and IEA data
from the XML input of the serializers.
The services have the following output ports:
- errors. An XML listing of validation errors that the service found, including the standard X12 error codes.
- errorsFound. Indicates whether the service found validation errors. The errorsFound output port can have the following values:
  - true. The X12 message contained validation errors.
  - false. The X12 message did not contain validation errors.
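An application reading these ports can use errorsFound as a gate before examining the errors listing. The following Python sketch assumes the port contents arrive as text; the Error element and its code attribute are hypothetical examples, since the exact errors schema is not reproduced in this guide.

```python
import xml.etree.ElementTree as ET

def validation_summary(errors_found_text, errors_xml):
    """Return (found, error codes). The <Error code="..."> shape is an
    assumption for illustration, not the documented errors schema."""
    found = errors_found_text.strip().lower() == "true"
    codes = []
    if found:
        root = ET.fromstring(errors_xml)
        codes = [e.get("code") for e in root.iter("Error")]
    return found, codes

# Hypothetical port contents, for illustration only.
found, codes = validation_summary(
    "true", "<Errors><Error code='12'/><Error code='13'/></Errors>")
assert found and codes == ["12", "13"]
```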
TA1 acknowledgement. Confirms the receipt and validity of the interchange envelope, such as the sender
and receiver data.
X12_997
X12_999
Value
Input document
errors output
inOutgoingInterchangeControlNumber
2.
- Input document. Required.
- inOutgoingInterchangeControlNumber. Required.
- inInterchangeReceiverId. Optional. Receiver data.
- inInterchangeReceiverIdQualifier. Optional. Receiver data.
- inExpectedInterchangeControlNumbers. Optional.
- inSupportedFunctionalGroups. Optional.
- inSupportedFunctionalGroupVersions. Optional.
- inValidInterchangeVersionIDs. Optional.
- inSupportedInterchangeVersionIDs. Optional.
- inValidInterchangeControlStandardsIdentifiers. Optional.
- inSupportedInterchangeControlStandardsIdentifiers. Optional.
- inValidSegmentTerminators. Optional.
- inValidDataElementSeparators. Optional.
- inValidComponentElementSeparators. Optional.
- inValidResponsibleAgencyCodes. Optional.
- inValidAuthorizationInformationQualifiers. Optional.
- inAddNewlinesToOutgoingMessage. Optional. Possible values: NONE, CR, LF, CRLF.
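The four inAddNewlinesToOutgoingMessage values correspond to the usual line terminators. A Python sketch of that mapping; only the parameter values come from this guide, and the join helper is illustrative, since the service applies the setting internally.

```python
# Line terminators for the documented parameter values.
NEWLINE_STYLES = {"NONE": "", "CR": "\r", "LF": "\n", "CRLF": "\r\n"}

def join_segments(segments, style="NONE", segment_terminator="~"):
    """Serialize segments, appending the chosen newline after each
    segment terminator, as the parameter values suggest."""
    sep = segment_terminator + NEWLINE_STYLES[style]
    return sep.join(segments) + segment_terminator

out = join_segments(["ST*109*0001", "SE*2*0001"], style="LF")
assert out == "ST*109*0001~\nSE*2*0001~"
```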
EDI-UCS_and_WINS library
Library implementing the Uniform Communication Standard (UCS) and Warehouse Information Network
Standard (WINS), used in the grocery and warehouse industries, respectively. For more information
about the standards, see http://www.uc-council.org/ean_ucc_system/stnds_and_tech/ucs.html.
EDI-VICS library
Library implementing the Voluntary Interindustry Commerce Solutions standard, used in the retail
industry. For more information about the standard, see http://www.vics.org/home.
These library transformations do not validate their input. If the input is invalid, the transformations might fail
or they might generate unexpected output.
FIX Library
The FIX library supports the Financial Information eXchange protocol for real-time electronic exchange of
securities transactions. The protocol is maintained by FIX Protocol, Ltd.
The FIX library transformations do not validate their input. If the input is invalid, the transformations might fail
or they might generate unexpected output.
For more information about the protocol, see http://www.fixprotocol.org.
</Trailer>
</IndicationOfInterest>
FpML Library
The FpML library implements the industry-standard protocol for complex financial products.
The parsers have the following output ports:
ErrorsFound. If the transformation encounters a validation error, returns true. If the transformation does
not find an error, returns false.
HIPAA Library
HIPAA is an industry standard for administrative and financial health-care transactions. For more information,
see the following websites:
http://aspe.hhs.gov/admnsimp/faqtx.htm
http://www.hipaa.org
HIPAA is based on the X12 standard. For more information about X12, see EDI-X12 Library on page 34.
In addition to the regular parser, serializer, and schema components provided in every Data Transformation
library, there is a special HIPAA library component that validates input messages and generates
acknowledgments. The special component implements the complete set of validation and acknowledgment
types defined in the HIPAA standard up through version 5010A1.
ST*834*26608~
BGN*00*26608*20021226*1200*ES*2~
N1*P5*UES CORP*FI*111222333~
N1*IN*UNITED HEALTHCARE*FI*777888999~
INS*N*01*021*28*A***FT~
REF*0F*948047150~
REF*1L*199301~
HIPAA messages consist of multiple segments. Each segment begins with an identifying code. The following
table describes a few examples of the identifying codes contained in the sample message:
- ISA
- GS
- ST. Header segment
- BGN. Beginning segment
- N1. Name
- SE. Trailer segment
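Because every segment starts with its identifying code, the codes are easy to extract programmatically. A Python sketch, run against segments abbreviated from the sample message above:

```python
def identifying_codes(message):
    """Return the identifying code (the first element) of each segment."""
    return [seg.split("*")[0]
            for seg in message.split("~") if seg.strip()]

# Segments abbreviated from the sample 834 message above.
sample = ("ST*834*26608~BGN*00*26608*20021226*1200*ES*2~"
          "N1*P5*UES CORP*FI*111222333~REF*0F*948047150~")
assert identifying_codes(sample) == ["ST", "BGN", "N1", "REF"]
```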
The HIPAA version contains parsers and other components that do not validate incoming messages and
that do not generate acknowledgments.
The HIPAA Validation version contains the same parsers and other components as the HIPAA version. In
addition, the HIPAA Validation version contains a special project that implements the full set of validation
and acknowledgment types defined in the HIPAA standard.
Windows 7
AIX 6.1
HP-UX 11.31
Component
Download File
Contents
HIPAAXEvv.zip
HIPAAXESvv.zip
Component
Download File
Contents
HIPAAStandardsvv.zip
Windows: XEngine_6.7.1_hotfix_16_<os>_<platform>_yyyymmdd_simple.zip
Linux or UNIX: XEngine_6.7.1_hotfix_16_<os>_<platform>_yyyymmdd_simple.tar.gz
Expand the following ZIP files to the same, empty directory:
Note: If you plan to customize HIPAA SNIP 1-6 or to add SNIP 7 validation rules, obtain the HIPAA Validation
Specbuilder in addition to the other components.
Where to Install
You can install the HIPAA Validation add-on on the same computer as Data Transformation or on an
independent server. The latter option enables multiple computers that run Data Transformation to access a
single copy of the add-on.
From the directory where you expanded the download ZIP files, run the following file:
XEvv\Windows32\XE_Multisetup\XEngine_Multisetup.exe
where vv is the version number.
The installer prompts you to select options.
2.
The installer displays the modules that will be installed. Verify that the following modules are listed:
- XEngine
- XEngine Server
- HIPAA
- External Codelists
4. At the subsequent prompts, accept the default options to complete the XEngine and XEngine Server installations.
Ignore the prompt to restart the computer. Otherwise, you will have to run XEngine_vv_Multisetup.exe again.
5. Accept the default options to install the Standards Database and the External Codelists.
6.
7.
8. Accept the default options to install the XEngine Server Web Service.
9. Unzip the following file into the /XEngine subdirectory of the Products Root Folder:
XEngine_6.7.1_hotfix_16_<os>_<platform>_yyyymmdd_simple.zip
10. Open a command window and verify that XEngine 6.7.1 is installed and operating correctly by entering the following command:
%XERoot%\exec\run.bat -d %XERoot%\samples\X12\data\101.dat -s 6 -e X -sf 1
If the test is successful, the following output files appear in the current folder:
_xeout_tid_1_0.xml
_xeout_tid_1_report_0.xml
11. Test the HIPAA Validation Engine by creating, deploying, and running a HIPAA validation project.
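The success check can be scripted: after running the test command, confirm that both output files exist. A Python sketch using the file names listed above:

```python
import os
import tempfile

def xengine_test_passed(directory):
    """True if both expected XEngine test output files exist
    (file names as listed in the step above)."""
    expected = ("_xeout_tid_1_0.xml", "_xeout_tid_1_report_0.xml")
    return all(os.path.exists(os.path.join(directory, name))
               for name in expected)

# Self-contained demonstration against a temporary directory.
with tempfile.TemporaryDirectory() as d:
    assert not xengine_test_passed(d)
    for name in ("_xeout_tid_1_0.xml", "_xeout_tid_1_report_0.xml"):
        open(os.path.join(d, name), "w").close()
    assert xengine_test_passed(d)
```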
2. Create an installation parent directory, set an environment variable called ECRootPath to the path, and then change to the new directory.
For example, run the following commands:
mkdir <INSTALL_DIR>/hipaaValidation
setenv ECRootPath <INSTALL_DIR>/hipaaValidation
cd $ECRootPath
where <INSTALL_DIR> is the Data Transformation installation directory.
3. In the directory where you expanded the download ZIP files, find the XEngine directory for the appropriate operating system:
XEVV/<Operating_System>
For example, if you are installing on a Linux x64 system, the directory is XEVV/Linux_64.
4. From the directory you located in step 3, extract the XEngine installer to the $ECRootPath directory:
tar xzvf XEngine_VV_linux_ix86_64_VV.tar.gz
5.
6.
7.
From the Standards-vv/HIPAA/HIPAA_Perl_Unix directory, where vv is the version number, extract the
HIPAA Standards Database to the $ECRootPath directory, and run the setup script:
source $ECRootPath/XEngine/exec/setEnv.csh
From the Standards-vv/ECL2/ECL2_Perl_Unix directory, where vv is the version number, extract the
External Codelists to the $XERoot directory, and run the setup script:
cd $XERoot
tar xzvf HIPAAEcl2Inst.tar.gz
perl HIPAAECL2Installer.pl
9. From the XESvv/Unix directory, where vv is the version number, extract XEngine Server to the $ECRootPath directory, run the setup script, and set an environment variable called XESRoot to the path:
cd $ECRootPath
tar xzvf XEServer_VV.tar.gz
perl XEServerInstaller.pl
setenv XESRoot ${ECRootPath}/XEServer
10. In the XEngine Server Components subdirectory, extract the XEngine Server Web Service, run the setup script, and set an environment variable called XESWSRoot to the path:
cd ${XESRoot}/Components
tar xzvf XEServerWS_VV.tar.gz
perl XEServerWSInstall.pl
setenv XESWSRoot ${XESRoot}/webservice
When the installer prompts you, accept the default setup options.
11.
xe_hotfix6.7.1.tar
XEngineHotfixInstaller.pl
12.
13.
Open a command window and verify that XEngine 6.7.1 is installed and operating correctly by entering
the following command:
perl XEngineHotfixInstaller.pl
14.
_xeout_tid_1_0.xml
_xeout_tid_1_report_0.xml
Test the HIPAA Validation Engine by creating, deploying, and running a HIPAA validation project.
Note: If you use Data Transformation with Informatica PowerCenter, confirm that the PowerCenter
Administration Console does not use port 8077 or 8078.
2. If you have installed on a 64-bit platform, edit the file $XESRoot/config/rtenvironment.xml. After the <jvm_opt> element, insert the following code:
<item>-d64</item>
3. Between </TaskDescriptor> and <ProcessEnvironment>, insert:
<BDF-internal><FileSize>5M</FileSize><SwapLocation>${XESRoot}/bdftmp</SwapLocation></BDF-internal>
4.
5.
6.
7. Edit the file $XESRoot/config/XEWEnvironment.properties. Set the ResultsDir property to the path of a directory for temporary files.
The default path is $XESRoot/samples/outbound/_JMS_results/. You can enter another path, for example:
/opt/Informatica/HIPAAValidation/_JMS_Results
Confirm that both the add-on and Data Transformation have read-write access to the directory.
8. Identify a free TCP port that you can use for the HIPAA Validation add-on.
The add-on is preconfigured to use port 61616. Do not use this port because it conflicts with certain other Informatica applications. For example, select port 61620 if it is free.
On UNIX-type platforms, you can use the netstat -a command to identify free ports.
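As an alternative to reading netstat output, you can probe ports programmatically. A Python sketch; note that port_is_free only checks that nothing currently accepts TCP connections on the port, which is a reasonable but not airtight test.

```python
import socket

def port_is_free(port, host="127.0.0.1"):
    """True if nothing currently accepts TCP connections on the port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) != 0

def pick_free_port():
    """Ask the operating system for any currently free TCP port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))
        return s.getsockname()[1]

candidate = pick_free_port()   # a free port to use instead of 61616
assert port_is_free(candidate)
```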
9. Open each of the following files in a text editor, search for localhost:61616, and replace it with localhost:<port_number>.
$XESRoot/exec/ActiveMQ/ActiveMQEnv.properties
$XESRoot/jmsserver/conf/activemq.xml
$XESRoot/config/Connectors/JMS_WS_In_N2X.xml
$XESRoot/config/Connectors/JMS_WS_In_X2N.xml
$XESRoot/config/Connectors/JMS_WS_Out.xml
$XESRoot/config/XESWSBatchReader.xml
For example, change localhost:61616 to localhost:61620. There is a single instance of the string in
each file.
Note: If you enable any additional services in the HIPAA Validation configuration files, confirm that they
are configured with available port numbers, and edit the port numbers if required.
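Because the change is a single literal replacement in each of six files, it can be scripted. A hedged Python sketch; the relative paths are copied from the list above, and the call is left commented out so you can review it before rewriting files in place.

```python
import os

def replace_port(paths, new_port, old="localhost:61616"):
    """Rewrite each file in place, replacing the single documented
    occurrence of localhost:61616 with the chosen port."""
    new = "localhost:%d" % new_port
    for path in paths:
        with open(path) as f:
            text = f.read()
        if old in text:
            with open(path, "w") as f:
                f.write(text.replace(old, new))

# File list copied from the step above, relative to $XESRoot.
xes_root = os.environ.get("XESRoot", ".")
config_files = [os.path.join(xes_root, p) for p in (
    "exec/ActiveMQ/ActiveMQEnv.properties",
    "jmsserver/conf/activemq.xml",
    "config/Connectors/JMS_WS_In_N2X.xml",
    "config/Connectors/JMS_WS_In_X2N.xml",
    "config/Connectors/JMS_WS_Out.xml",
    "config/XESWSBatchReader.xml",
)]
# replace_port(config_files, 61620)  # uncomment after choosing a free port
```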
10. Confirm that the JAVA_HOME environment variable is set to JRE version 1.5 or later.
11. Use the following commands to start and stop XEngine Server and Tomcat:
- Start XEngine Server. Windows: $XESRoot\exec\_startXES.bat. UNIX: $XESRoot/exec/_startXES.sh
- Stop XEngine Server. Windows: $XESRoot\exec\RemoteConsole\shutdownXEServer.bat. UNIX: $XESRoot/exec/RemoteConsole/shutdownXEServer.sh
- Start Tomcat. Windows: $XESRoot\webservice\exec\Tomcat.bat. UNIX: $XESRoot/webservice/exec/Tomcat.sh
- Stop Tomcat. Windows: $XESRoot\webservice\bin\apache-tomcat-5.5.20\bin\shutdown.bat. UNIX: $XESRoot/webservice/bin/apache-tomcat-5.5.20/bin/shutdown.sh
2.
3.
4.
2.
In this file, find the interchange group that has an interchange control version of 00501.
The group is located near the top of the file, in the following location:
<Item name="HIPAA" id="...">
...
<Group name="Interchange" id="...">
<lookup linkName="InterchangeControlVersion">00501</lookup>
3.
4.
The all value means to report all errors in 999 acknowledgments. To report only X12 errors, enter
X12 instead of all. To report only SNIP 3-to-7 errors, enter nonX12.
The always value means always to generate 999 acknowledgments. Instead of always, you can enter
never, meaning not to generate 999 acknowledgments, or error, meaning to generate only in case of
an error.
In the same group, find the nested functional group that has a group version number of 005010X231:
<Group name="FunctionalGroup" id="b73de2d8-0fb0-4842-9e10-25c398bbe847">
<lookup linkName="GroupVersionNumber">005010X231</lookup>
5.
2.
- Url. The IP address or host name of the computer where the add-on is installed, and the HTTP port number that you configured in $XESRoot/webservice/bin/apache-tomcat-5.5.20/conf/server.xml.
- RemotePath.
- LocalPath. The path that Data Transformation uses to access the temporary directory.
LocalPath and RemotePath point to the same directory. In the above example,
LocalPath is a path defined on a Windows server where Data Transformation is installed.
RemotePath is a path on a UNIX-type server where the HIPAA Validation add-on is
installed.
If you installed the add-on on the Data Transformation computer and you use the default temporary
directory, you do not have to configure LocalPath and RemotePath.
2.
3.
- Windows: startRemoteConsole.bat
- Linux or UNIX: startRemoteConsole.sh
1. In Data Transformation Studio, create and deploy one or more regular library projects, using parsers from the HIPAA Validation library.
2. Create and deploy an additional library project, based on the special HIPAA_Validation project of the HIPAA Validation library.
3. When an application receives a HIPAA message, it can pass it to the HIPAA_Validation service for validation and generation of acknowledgments.
4.
5. If the input passed the validation tests, the application passes the input to the regular parser service, generating an XML representation of the HIPAA message.
Optionally, you can configure transformations that combine these steps. For example, a single transformation
might pass the same input to HIPAA_Validation and to a regular parser, in sequence.
2. Under the Data Transformation project type, select a Library project, and click Next.
3.
4.
5. Click Finish.
2.
3.
The Studio displays a window where you can select validation and acknowledgment options:
- If selected, the service copies valid and invalid input messages to different output ports. If not selected, the service copies all messages to the valid port.
- Validate transactions. Select this option to enable the validation. If not selected, the service can still generate acknowledgments.
- Generate acknowledgments. Select the acknowledgment types to generate. You can select types TA1, 997, 999, 824, and 277.
- Validation types. You can select HIPAA validation types 1 to 7.
- You can use XML for data processing applications such as further processing in Data Transformation. You can use HTML for display.
Review the deployment options such as the service name and location, and click Finish.
Validation Output
The HIPAA_Validation service writes validation messages to the following output ports:
- Acknowledgments
- ErrorsCount
- HtmlReport
- XmlReport
- InvalidTransactions
- ValidTransactions
An application can pass the port locations to the HIPAA_Validation service. For more information about
output ports, see the Data Transformation Studio User Guide.
The mappers operate on an XML representation of the HIPAA messages. The representation is identical to
the one used in the Data Transformation HIPAA library. You can run the mappers on XML documents
generated by the library parsers. You can deploy and run the mappers as-is, or you can use them as a
reference for editing and customization.
The mappers are supplied as Microsoft Excel files that you can open in Data Transformation Accelerator. You
can use Accelerator to test, edit, and deploy the mappers.
The HIPAA Crosswalk package contains mappers in two formats:
Except for the file format, the Accelerator 9.0 and Accelerator 9.0.1.1 mappers are identical.
You can install the HIPAA Crosswalk package on a Microsoft Windows computer where the following
software is installed:
2.
If you use Accelerator 9.0, copy the XLS files from the Office2003 subdirectory.
If you use Accelerator 9.0.1.1, copy the XLSX files from the Office2007 subdirectory.
2.
Run and test the mapper. You can edit the mapper if required.
3.
Deploy the mapper as a Data Transformation service, or export it to Data Transformation Studio.
If a target field length is shorter than the source field length, the mappers truncate the data.
If a required target field does not have a corresponding source field, the mappers attempt to copy the data
from other relevant source fields, or they assign default values to the target fields.
If there is no target field that corresponds precisely to a source field, the mappers attempt to store the
source data in another appropriate target field. If there is no appropriate target field, the source data is
lost.
The mappers transform a HIPAA 4010 dependent that has a unique ID to a HIPAA 5010 subscriber. In
such cases, the HIPAA 4010 subscriber is not mapped, except for data that does not exist in the
dependent.
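The truncation behavior above can be stated as a one-line rule. A Python sketch; the field values and lengths here are hypothetical, not taken from the HIPAA schemas.

```python
def crosswalk_copy(value, target_length):
    """Copy a source field into a target field, truncating when the
    target is shorter, as the mappers do."""
    return value[:target_length]

# Hypothetical field lengths, for illustration only.
assert crosswalk_copy("UNITED HEALTHCARE", 10) == "UNITED HEA"
assert crosswalk_copy("UES", 10) == "UES"   # shorter values pass through
```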
You can edit and customize the mappers in Data Transformation Accelerator. For example, you can add or
delete source and target fields, you can revise the mappings, and you can insert default target field values.
The Remarks column of the Accelerator display explains some of the crosswalk behaviors.
To validate the source and target messages, you can use the message validation feature of the HIPAA
library.
HIX Library
The HIX library uses the Health Insurance Exchange protocol to exchange information for government-regulated and standardized health care plans in the United States.
For more information about the protocol, see HIX Library and ObamaCare.
The HIX protocol is based on the HIPAA standard. For more information about HIPAA, see HIPAA
Library on page 40.
Note: Do not use HIPAA or EDI-X12 validation components with the HIX Library. The HIX Library contains
separate validation components. These components process supported ASCX12 messages.
RMR*ZZ*2**450~
SE*27*0001~
GE*1*1~
IEA*1*000000905~
Message Acknowledgements
Message acknowledgements indicate whether a message has been received and whether it is valid.
The HIX library contains services that generate the following types of EDI acknowledgments:
997 acknowledgements
Confirms the receipt and validity of a transaction.
999 acknowledgements
Confirms the receipt and validity of a transaction.
TA1 acknowledgement
Confirms the receipt and validity of the interchange envelope, such as the sender and receiver data.
HIX_997_Generator
HIX_999_Generator
HL7 Library
The HL7 library implements the Health Level Seven version 2.x messaging standard used in the health
services industry. The HL7 standard is used world-wide in hospital and medical information systems. For
more information about the standard, see http://www.hl7.org.
The HL7 library transformations do not validate their input. If the input is invalid, the transformations might fail
or they might generate unexpected output.
You can use the Library Customization Tool to edit the HL7 library components. For more information, see
Chapter 4, Library Customization Tool on page 75.
<ASPRES>
<UNH>
<R01>1</R01>
<R02>
<R01>ASPRES</R01>
<R02>94</R02>
<R03>2</R03>
<R04>IA</R04>
</R02>
<R03>X000000</R03>
</UNH>
<LOOP_GR1>
<MAP>
<R01>
<R01>2</R01>
</R01>
<R02>DL</R02>
<R03>
<R02>ITAREQ</R02>
<R03>2</R03>
</R03>
</MAP>
</LOOP_GR1>
<UNT>
<R01>3</R01>
<R02>1</R02>
</UNT>
</ASPRES>
</Transactions>
</Interchange>
IDC-IDSI Library
Interactive Data Corporation (IDC) and its subsidiary, Interactive Data Services, Inc. (IDSI), provide reference
data for evaluating securities.
The Data Transformation IDC-IDSI library performs the following actions:
ErrorsFound. If the transformation encounters a validation error, returns true. If the transformation does
not find an error, returns false.
<Identifyask_onlyquotes>0</Identifyask_onlyquotes>
<FactoradjustedCMOs>0</FactoradjustedCMOs>
<OptionsNBBOpricingcoverageflag>0</OptionsNBBOpricingcoverageflag>
<Priceevaluationcoverageflag>1</Priceevaluationcoverageflag>
</FileIndicators>
</Row>
</Body>
</IDC:DataFeed>
Description
MDMJavaBeans.xml
<project_name>_MDMJavaBeans.xsd
input.xml
<project_name>_<schema_root_name>.xsd
<project_name>_From_MDMJavaBeans_To_XML.tgp
<project_name>_To_MDMJavaBeans.tgp
<project_name>_From_MDMJavaBeans_UserDefined_Mapper.tgp
<project_name>_To_MDMJavaBeans_UserDefined_Mapper.tgp
After you transform the MDM JavaBeans file, you can use the MDM Mappers library to translate between the
output XML file and an ACORD LNA file. You use the Developer tool to import and run the MDM mappers.
For more information about the MDM Mappers library, see ACORD-MDM Mappers Library on page 13.
2.
3.
4.
5. Browse and select an MDM JavaBeans file, and then click Next.
The MDM JavaBeans project appears in the Studio as a transformation that contains the source files, the output files, and the mapper scripts.
6.
7.
8.
After you create the project, you can use the MDM Mappers library to translate the output XML file to and
from an ACORD LNA file. Use the Developer tool to import the MDM mappers.
NACHA Library
The NACHA library implements the Automated Clearing House standard maintained by the NACHA
Electronic Payments Association and used by financial institutions.
The NACHA library transformations do not validate their input. If the input is invalid, the transformations might
fail or they might generate unexpected output.
For more information, see http://www.nacha.org.
CHASE OF TX
<CTXEntryDetailRecord>
<TransactionCode>2</TransactionCode>
<ReceivingDfiIdentification>15254585</ReceivingDfiIdentification>
<CheckDigit>1</CheckDigit>
<DfiAccountNumber>FDEHHFDHDFHDFHFHFH</DfiAccountNumber>
NCPDP Library
The National Council for Prescription Drug Programs (NCPDP) develops standards for information
processing in the pharmacy services sector of the health care industry. Insurance companies use NCPDP
transactions to process pharmacy drug claims.
The Data Transformation NCPDP library implements two NCPDP standards for communicating pharmacy
claims:
NCPDP Batch Transaction Standard versions 1.1 and 1.2. Used to transmit multiple claims in batch
format.
NCPDP Telecommunication Standard Format versions 5.1 and D.0. Used to transmit individual claims
from the point-of-sale to the insurance carrier.
The NCPDP library transformations do not validate their input. If the input is invalid, the transformations might
fail or they might generate unexpected output.
For more information about NCPDP, see http://www.ncpdp.org.
<ReceiverID>LOUIPROD</ReceiverID>
</BatchHeaderRecord>
<DetailDataRecord>
<TransactionReferenceNumber>0000000019</TransactionReferenceNumber>
<Transmission TransactionType="Billing">
<Header>
<BinNumber>18003</BinNumber>
<VersionReleaseNumber>51</VersionReleaseNumber>
SEPA Library
The SEPA library implements version 5.0 of the Single Euro Payments Area messaging standard used in the
banking industry. The library validates messages according to the formatting rules of the European Payments
Council and the rules defined in ISO 20022. The library does not support SEPA usage rules.
The parsers have the following output ports:
ErrorsFound. If the transformation encounters a validation error, returns true. If the transformation does
not find an error, returns false.
The SEPA project names have the following suffixes, which correspond to the SEPA implementation guides:
- IB. Interbank.
- C2C. Customer-to-customer.
- B2C. Bank-to-customer.
http://www.iso20022.org/
http://www.europeanpaymentscouncil.eu/
SWIFT Library
SWIFT is a messaging standard for the financial industry, maintained by the Society for Worldwide Interbank
Financial Telecommunication. The standard defines messages for purposes such as electronic funds transfer
and check processing. For more information, see http://www.swift.com/. There are two versions of the
standard, SWIFT 2008 and SWIFT 2009.
The Data Transformation SWIFT library implements transformations that process MT and MX messages. The
SWIFT organization has certified the Data Transformation implementation for use with SWIFT versions 2008
and 2009. SWIFT has authorized Data Transformation to display the SWIFT-Ready Application logo.
For MT messages, the library provides parsers, serializers, schemas, and validation components. For MX
messages, the library provides validation components. In addition, the library provides translators between
MT and MX.
You can use the Library Customization Tool to edit the SWIFT library components. For example, if you have
customized a SWIFT message by adding a field, you can edit the transformations to validate and transform
the field. For more information, see Chapter 4, Library Customization Tool on page 75.
</DestinationAddress>
<MessagePriority>U</MessagePriority>
<DeliveryMonitoring>1</DeliveryMonitoring>
<ObsolescencePeriod>020</ObsolescencePeriod>
</ApplicationHeaderInputBlock>
2.
3. Select Data Transformation > Library Project, and then click Next.
4.
5.
6. Click Finish.
The library project appears in the Studio.
7.
8. Write down the service name and the location of the Data Transformation repository, displayed in the
Deploy Service window.
You will need this information in the subsequent steps.
9.
10. Click Deploy.
Open a command prompt, and then enter the following command:
CM_console <service_name> -f<name of SWIFT directory file>
For example, enter:
CM_console Create_SWIFT_Lookups -fc:\CT_20090502.TXT
An XML file appears in the current directory, for example, CountryCode_Lookup.xml.
11. Copy the XML file to the SWIFT_Lookups directory in the Data Transformation repository.
The file overwrites the sample file of the same name.
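The command line and copy destination from the procedure above can be sketched as follows. This is an illustration only: CM_console and its -f flag come from the procedure, while the repository path is a hypothetical example value.

```python
import os

def build_lookup_refresh(service_name, directory_file, repository_path):
    # Command line from the procedure: CM_console <service_name> -f<file>.
    command = ["CM_console", service_name, "-f" + directory_file]
    # The generated XML file must be copied into the repository's
    # SWIFT_Lookups directory, where it overwrites the sample file.
    destination = os.path.join(repository_path, "SWIFT_Lookups")
    return command, destination

# Example values from the procedure; the repository path is hypothetical.
command, destination = build_lookup_refresh(
    "Create_SWIFT_Lookups", r"c:\CT_20090502.TXT", r"c:\DT\ServiceDB")
```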
Message Splitter
A dedicated Script named MTnnn_splitter.tgp splits MT messages 940, 942, and 950 when they are longer
than the maximum message length. The Script is available in the restricted serializer project.
The SWIFT UHB defines the logic used to split the messages. By default, the message is not partitioned. To
split a message, set the value of the variable SplitLargeMessage to true in MTnnn_splitter.tgp.
The output is split into separate output files in the following way:
The output XML contains a list of the message parts, the file names, and the file path. By default, the Results
folder contains the output files. To change the output path, set the value of vOutputPath.
The output file names are determined by the value of the message field 20, a sequence part number, and
a flag that identifies whether the file contains the last part. For example, a message that is split into five
parts will have the following file names:
12345TraRF123456_1_N.txt
12345TraRF123456_2_N.txt
12345TraRF123456_3_N.txt
12345TraRF123456_4_N.txt
12345TraRF123456_5_Y.txt
To change the logic for file names, change the rules in the splitter group section createUniqueFileName.
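The documented naming pattern can be sketched as a small function. This is an illustration of the pattern only, not the Script's actual logic, which lives in the createUniqueFileName group:

```python
def split_part_names(field_20, total_parts):
    """Build output file names from the field 20 value, the part number,
    and a Y/N flag that marks the last part."""
    names = []
    for part in range(1, total_parts + 1):
        last_flag = "Y" if part == total_parts else "N"
        names.append("{0}_{1}_{2}.txt".format(field_20, part, last_flag))
    return names

# A message split into five parts yields the names shown above.
print(split_part_names("12345TraRF123456", 5))
```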
Field 103, the Message User Reference field, is updated by default. If the value is set, it is the same for all
message parts. To change the setting, enable the MUR logic in the group section
calculateMessageUserReference, or create proprietary logic.
Note: If the variable SplitLargeMessage is set to true, the message is split using the splitter logic defined in
this section even if the message length is within the maximum length limit.
To transform MT940, MT942, or MT950 messages in Data Transformation Studio, ensure that the
SWIFT_Lookup folder is in the Studio workspace folder. If needed, copy the folder from the ServiceDB to the
Studio workspace folder. If you run the service from outside the ServiceDB, ensure that the SWIFT_Lookup
folder is in the same path as the project.
If message MT940 generates validation error code C24 during the split procedure, the splitter fails. The
C24 error code has the following description:
If field 86 is present in any occurrence of the repetitive sequence, it must be preceded by a field 61. In
addition, if field 86 is present, it must be present on the same page or message of the statement as the
related field 61.
Parsers
The library parsers use a ValidateValue action to validate their input. The action is configured with one or
more Validation Rule Language (VRL) files.
For the following message types, SWIFT requires strict validation for Messaging Data Services (MDS)
certification. The VRL rules enforce the complete set of validations required by the SWIFT standard:
101, 102+, 102, 103+, 103, 104, 105, 110, 111, 112, 200, 201, 202, 202COV, 203, 204, 205, 205COV, 210,
300, 304, 305, 320, 321, 330, 340, 341, 350, 360, 362, 380, 381, 400, 410, 412, 420, 422, 430, 450, 456,
500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 513, 514, 515, 516, 517, 518, 519, 524, 526, 527,
530, 535, 536, 537, 538, 540, 541, 542, 543, 544, 545, 546, 547, 548, 549, 558, 559, 564, 565, 566, 567,
568, 569, 575, 576, 578, 586, 600, 601, 604, 605, 606, 607, 608, 609, 620, 670, 671, 700, 701, 705, 707,
710, 711, 720, 730, 732, 734, 740, 742, 747, 750, 752, 754, 756, 760, 767, 768, 769, 800, 801, 802, 900,
910, 920, 935, 940, 941, 942, 950, 960, 961, 962, 963, 964, 970, 971, 972, 973, n90, n91, n92, n95, n96,
n98, n99
For other message types, SWIFT does not require strict validation. The VRL rules are correspondingly less
strict.
If a parser finds a validation error, it does not fail. Instead, it writes data to the following output ports:
ErrorsFound. If the transformation encountered a validation error, returns true. If the transformation did
not find an error, returns false.
For more information about ValidateValue and VRL, see the Data Transformation Studio User Guide.
2. Double-click one or more of the VRL files that define the rules.
This opens the VRL files in an editor.
3. Clear the check box of the rules that you want to disable.
MX Validators
For each MX message type, the MX directory of the SWIFT library contains a validation project.
Each project contains a parser that processes an input MX XML message. The parser contains a
ValidateValue action and VRL rules that validate the input.
The parsers have the following output ports:
ErrorsFound. If the transformation encountered a validation error, returns true. If the transformation did
not find an error, returns false.
MT/MX Translators
For certain message types, the MT/MX Translation directory of the SWIFT library contains mappers that
translate between the Data Transformation XML representation of an MT message and the SWIFT MX
representation of the same message.
The mappers translate the following messages, in both directions:
MT Message - MX Message
MT101 - MX pain.001.001
MT103 - MX pacs.008.001
MT103STP - MX pacs.008.001
MT202 - MX pacs.009.001
MT202COV - MX pacs.009.001
MX setr.004.002
MX setr.010.002
MT509 (INST) - MX setr.016.002
MX setr.006.002
MX setr.012.002
MT900 - MX camt.054.001
MT910 - MX camt.054.001
MT940 - MX camt.053.001
MT941 - MX camt.052.001
MT942 - MX camt.052.001
MT950 - MX camt.053.001
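The correspondences above can be viewed as a lookup from MT message type to MX message identifier. The sketch below includes only the pairs stated in the table; the mapper components in the MT/MX Translation directory perform the actual translation.

```python
# MT-to-MX correspondences as stated in the table above.
MT_TO_MX = {
    "MT101": "pain.001.001",
    "MT103": "pacs.008.001",
    "MT103STP": "pacs.008.001",
    "MT202": "pacs.009.001",
    "MT202COV": "pacs.009.001",
    "MT509": "setr.016.002",
    "MT900": "camt.054.001",
    "MT910": "camt.054.001",
    "MT940": "camt.053.001",
    "MT941": "camt.052.001",
    "MT942": "camt.052.001",
    "MT950": "camt.053.001",
}

def mx_equivalent(mt_type):
    """Return the MX message identifier for an MT type, or None if the
    library does not list a translation for it."""
    return MT_TO_MX.get(mt_type)
```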
By chaining the mappers with MT parsers or serializers, you can translate MT text messages to MX, or the
reverse. For example, to translate MT to MX, you might chain the components in the following way:
1.
2.
3.
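A minimal sketch of such a chain, with stub callables standing in for the Data Transformation components. The stubs and their XML strings are illustrative only, not real component APIs:

```python
def translate_mt_to_mx(mt_text, mt_parser, mt_to_mx_mapper):
    mt_xml = mt_parser(mt_text)       # parse the MT text message to MT XML
    mx_xml = mt_to_mx_mapper(mt_xml)  # map the MT XML to the MX XML
    return mx_xml

# Stub components that only illustrate the data flow.
parser = lambda text: "<MT940>" + text + "</MT940>"
mapper = lambda xml: xml.replace("MT940", "camt.053.001")
result = translate_mt_to_mx("sample", parser, mapper)
```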
Comma
Pipe
Semicolon
Tab
Validation of Thomson Reuters Corporate Actions reports is based on the SWIFT 2011 specification.
The following table describes the files that the service produces:
File - Description
output.xml
errors.xml
errorsFound.txt - Boolean flag that indicates whether the service detected errors in the input.
<Start_Time>06.11.2011 05:00:57</Start_Time>
</Row>
<Row>
<End_Time>06.11.2011 05:00:58</End_Time>
</Row>
<Row>
<Row_Count>3</Row_Count>
</Row>
</Header>
<Body>
<Row>
<Asset_ID>0x000019000023013b</Asset_ID>
<Asset_Type>CORP</Asset_Type>
<Bridge_Symbol>US*;CPK-8.25-0314</Bridge_Symbol>
<Capitalization_Amount></Capitalization_Amount>
<Capitalization_Effective_Date></Capitalization_Effective_Date>
<Coupon_Rate>8.25</Coupon_Rate>
</Row>
<Row>
<Asset_ID>0x00038600d5d26def</Asset_ID>
<Asset_Type>OTHR</Asset_Type>
<Bridge_Symbol>GB*;WIGAN-3-PRP</Bridge_Symbol>
<Capitalization_Amount></Capitalization_Amount>
<Capitalization_Effective_Date></Capitalization_Effective_Date>
<Coupon_Rate>3</Coupon_Rate>
</Row>
<Row>
<Asset_ID>0x0003e70125312cdd</Asset_ID>
<Asset_Type>CORP</Asset_Type>
<Bridge_Symbol>CA*;ALMFC-4.35-0417</Bridge_Symbol>
<Capitalization_Amount></Capitalization_Amount>
<Capitalization_Effective_Date></Capitalization_Effective_Date>
<Coupon_Rate>4.35</Coupon_Rate>
</Row>
</Body>
<Trailer>
<Row>
<FixedValue>First trailer line</FixedValue>
</Row>
<Row>
<FixedValue>Second trailer line</FixedValue>
</Row>
</Trailer>
</TR:DataFeed>
2. Select Data Transformation > Import Project, and then click Next.
The Import Data Transformation Project page appears.
3.
4.
5. Click Browse, then browse to a report definition XML file, and then click Open.
The Import Files page appears.
6. Click Finish.
The Reuters project appears in the Studio.
7.
8. On the Output Properties page, select Disable automatic output, and then click OK.
9.
10.
CHAPTER 4
Library Customization Tool
You can use the Library Customization Tool to customize messages in the following libraries:
DTCC-NSCC
EDIFACT
EDI-X12
FpML
HIPAA
HL7
SEPA
SWIFT
Note: The user interface for the Library Customization Tool appears in English.
2. Click Window > Open Perspective > Other > Library Customization.
2. Under the Library Customization category, select the library name and click Next.
For example, to create a HIPAA customization project, select Library Customization > HIPAA Project.
Do not select a project under the Data Transformation category. This category creates Studio projects
that cannot be edited in the Library Customization editor.
3.
4. At the prompts, select the library version and the message that you want to customize.
For example, select the HIPAA_4010 version and the HIPAA 834 message.
5. Optionally, click the Browse button, and open an input file. The file is an example source file illustrating
the message structure that you want to define, such as a customized HIPAA 834 message.
Tip: For the best Library Customization Tool performance, select an input file that is no larger than 1 MB.
This advice applies only to library customization projects. In a deployed service, you can process very
large documents.
6. Click Finish.
The project directory is created. The Library Customization perspective displays the project.
Importing a Project
To import an existing library customization project:
1.
2.
3.
If the example source displays as a single long line, click the Line Wrap button at the top of the
Example Source view.
2. At the prompt, enter the segment separator character that is used in the message.
The Example Source view inserts a line break after each segment separator character. This splits the
display into multiple lines, enabling you to view the entire message.
2.
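The line-break behavior can be sketched as a one-line transformation. This is an illustration only, using ~ as an example separator character:

```python
def wrap_segments(message, separator):
    # Insert a line break after each segment separator so that a
    # single-line message displays as one segment per line.
    return message.replace(separator, separator + "\n")
```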
Library Customization editor. Displays the message schema and enables you to customize it.
Project Explorer view. Displays all projects in the workspace. To open a message in the editor, double-click its *.libxml file in the Project Explorer.
Properties view. Enables you to view and edit the properties of elements that you select in the editor.
Example Source view. If you select a sample input file, it is displayed in this view.
Output view. When you run the customization, this view displays the XML output that the customized
parser generates from the example source.
To open the schema in the editor, double-click its *.libxml file in the Project Explorer.
2. In the editor, expand the tree and select the element to edit.
The Properties view, if it is open, displays the element properties. If the Properties view is closed, right-click the element and click Show Properties.
3.
4.
b. Select the option to Insert a sibling element above the existing element, to Append a sibling after
the element, or to add a Child element.
c. In the Insert Element window, select an existing element or a new element, and click Next.
d. On the Element Properties page, you can edit the property values.
e.
5.
6.
2.
3.
4. After you correct the errors, click Run > Run Library Customization again.
Repeat until there are no validation errors.
Deploying as a Service
To deploy a library customization project as a service:
1.
2.
3.
4.
5.
Optionally, you can edit the project information, such as the service name or description.
2.
3.
4.
5. Optionally, you can edit the project information, such as the description.
6.
b. Under the Data Transformation category, select Existing Data Transformation Project into
Workspace.
c. Follow the on-screen instructions to select the project and complete the import.
2.
3.
4. Click Finish.
In the Project Explorer, right-click the name of the project, and then select Properties.
The Properties for <project_name> dialog box appears.
2.
3. In the right panel, select Show alert when editing element that uses a global type, and then click OK.
Hierarchical Structure
The Library Customization editor displays the hierarchical structure of a message. It labels each structural
element with an icon. For example, the editor displays HIPAA messages using the following icons:
Structural Element - Description
Transaction set - A message.
Loop
Segment - A sequence of data.
Composite
Data element
The HIPAA standard defines the meaning of these elements and how they are nested. For example, a
segment can contain composites and data elements. A composite contains data elements.
The icons and structure elements differ for other industry standards. In some standards, the elements have
names such as message, record, and field instead of transaction, segment, and data element. Some
standards define additional logical constructs such as options or alternatives.
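The nesting rules for a HIPAA-style message can be sketched with plain data classes. The class names mirror the editor's structural labels; they are not the tool's API:

```python
from dataclasses import dataclass, field
from typing import List

# Per the nesting rules above: a segment can contain composites and
# data elements, and a composite contains data elements.
@dataclass
class DataElement:
    value: str

@dataclass
class Composite:
    elements: List[DataElement] = field(default_factory=list)

@dataclass
class Segment:
    composites: List[Composite] = field(default_factory=list)
    elements: List[DataElement] = field(default_factory=list)

# Example values are arbitrary placeholders.
seg = Segment(
    composites=[Composite([DataElement("11"), DataElement("22")])],
    elements=[DataElement("NM1")],
)
```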
Element properties have the following setting types:
Global settings. Properties that apply to all instances of an element. If the same element is defined in
multiple contexts of the schema, editing the global settings affects all the instances.
Positional settings. Properties that apply to a single definition of the element in the schema. Editing the
positional settings affects only the schema context that you have selected in the Library Customization
editor.
In the HIPAA library, for example, a global setting might permit an element to contain any of ten enumerated
values. A positional setting might limit the same element to a subset of two values.
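The relationship between the two setting types can be sketched as follows. The function name is illustrative; the point is that a positional subset can only narrow the global enumeration:

```python
def value_is_valid(value, global_enum, positional_subset=None):
    allowed = set(global_enum)
    if positional_subset is not None:
        # The positional setting restricts, never extends, the global one.
        allowed &= set(positional_subset)
    return value in allowed

# Global setting permits ten values; one position limits them to two.
codes = [str(n) for n in range(10)]
```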
The following tables describe the DTCC-NSCC customization properties.
Message properties:
Setting Type - Property - Description
Global - Library type - DTCC-NSCC.
Global - ID - Message ID.
Global - Name - Message name.
Loop properties:
Setting Type - Property - Description
Global - ID - Loop identifier.
Positional - Repetitions - Number of times that the loop is repeated, for example, 1 or 2. For an unlimited number, enter >1.
Record properties:
Setting Type - Property - Description
Global - ID - Record identifier.
Global - Record type
Global - Sequence
Positional - Usage
Positional - Repetitions
Redefine properties:
Setting Type - Property - Description
Global - ID - Redefine identifier.
Global - Condition element
Option properties:
Setting Type - Property - Description
Global - ID - Option identifier.
Global - Condition - A string value. If the string has the same value as the condition element of the redefine, the option is included in the redefined structure.
Field properties:
Setting Type - Property - Description
Global - ID - Field identifier.
Global - Number - Field number.
Global - Enumerations
Positional - Usage
Positional - Length
Positional - Type
Positional - Reject code - A field identifier for display in error messages if the field contains invalid data.
Positional - Description - Data description.
EDIFACT customization properties:
Message properties:
Setting Type - Property - Description
Global - Library type - EDIFACT.
Global - Library version
Global - ID - Message ID.
Global - Name - Message name.
Loop properties:
Setting Type - Property - Description
Global - ID - Loop identifier.
Global - Name - Loop name.
Positional - Optional - True or false.
Positional - Repetitions - Number of times that the loop is repeated, for example, 1 or 2. For an unlimited number, enter 9999.
Segment properties:
Setting Type - Property - Description
Global - ID - Segment identifier.
Global - Name - Segment name.
Positional - Optional - True or false.
Positional - Repetitions
Composite properties:
Setting Type - Property - Description
Global - ID - Composite identifier.
Global - Name - Composite name.
Positional - Optional - True or false.
Data element properties:
Setting Type - Property - Description
Global - ID
Global - Name
Global - Type
Global - Min length
Global - Max length
Global - Enumerations
Positional - Optional - True or false.
EDI-X12 customization properties:
Message properties:
Setting Type - Property - Description
Global - Library type - EDI-X12.
Global - Library version
Global - ID - Message identifier.
Global - Name - Message name.
Loop properties:
Setting Type - Property - Description
Global - ID - Loop identifier.
Global - Name - Loop name.
Positional - Optional - True or false.
Positional - Repetitions - Number of times that the loop is repeated, for example, 1 or 2. For an unlimited number, enter 9999.
Segment properties:
Setting Type - Property - Description
Global - ID - Segment identifier.
Global - Name - Segment name.
Positional - Optional - True or false.
Positional - Repetitions
Composite properties:
Setting Type - Property - Description
Global - ID - Composite identifier.
Global - Name - Composite name.
Positional - Optional - True or false.
Data element properties:
Setting Type - Property - Description
Global - ID
Global - Name
Global - Type
Global - Min length
Global - Max length
Global - Enumerations
Positional - Optional - True or false.
HIPAA customization properties:
Message properties:
Setting Type - Property - Description
Global - Library type - HIPAA.
Global - Library version
Global - ID - Message identifier.
Global - Standard name - Message name.
Global - Internal ID
Global - Functional group ID
Loop properties:
Setting Type - Property - Description
Global - ID - Loop identifier.
Positional - Repetitions - Number of times that the loop is repeated, for example, 1 or 2. For an unlimited number, enter >1.
Segment properties:
Setting Type - Property - Description
Global - ID - Segment identifier.
Global - Standard name - Segment name.
Global - Child dependencies
Global - Segment qualifier
Positional - Usage
Positional - Repetitions
Positional - Industry name
Composite properties:
Setting Type - Property - Description
Global - ID - Composite identifier.
Global - Standard name - Composite name.
Positional - Usage
Data element properties:
Setting Type - Property - Description
Global - ID
Global - Standard name
Global - Type
Global - Min length
Global - Max length
Global - Enumerations
Positional - Usage
Positional - Enum subset - Subset of the enumerated values that is valid in this instance of the data element.
HL7 customization properties:
Message properties:
Setting Type - Property - Description
Global - ID - Message identifier.
Global - Library type - HL7.
Global - Library version
Global - Description - Description.
Global - Category
Segment group properties:
Setting Type - Property - Description
Global - ID
Positional - Requirement - Mandatory or optional.
Positional - Repetitions - Number of times that the segment group is repeated, for example, 1 or 2. For an unlimited number, enter >1.
Segment properties:
Setting Type - Property - Description
Global - ID - Segment identifier.
Global - Description - Description.
Positional - Requirement - Mandatory or optional.
Positional - Repetitions
Field properties:
Setting Type - Property - Description
Global - ID
Global - Type - Field type.
Global - Description - Description.
Positional - Requirement - Mandatory or optional.
Positional - Repetitions
Setting Type - Property - Description
Global - ID
Global - Description - Description.
Component properties:
Setting Type - Property - Description
Global - ID
Global - Type - Component type.
Global - Description - Description.
Global - Requirement - Mandatory or optional.
SWIFT customization properties:
Message properties:
Setting Type - Property - Description
Global - ID - Message identifier.
Global - Name - Message name.
Sequence properties:
Setting Type - Property - Description
Global - Name - Sequence name.
Positional - Repetitions
Positional - Requirement - Mandatory or optional.
Field properties:
Setting Type - Property - Description
Global - Code
Global - Name - Field name.
Global - Letter options - Letters such as A, B identifying the valid Option elements nested within the field.
Positional - Repetitions
Positional - Requirement - Mandatory or optional.
Option properties:
Setting Type - Property - Description
Global - Letter option
Setting Type - Property - Description
Global - Repetitions
Setting Type - Property - Description
n/a - (None)
Component properties:
Setting Type - Property - Description
Global - Name - Component name.
Global - Prefix
Global - Length
Global - Fixed - If true, the component has a fixed length defined by the Length property. If false, the component has a variable length. The Length property defines the maximum length.
Global - Char set
Global - Requirement - Mandatory or optional.
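The Fixed and Length semantics described above can be sketched as a validity check. This is an illustration of the rule, not the library's parsing code:

```python
def component_length_ok(value, length, fixed):
    # Fixed true: the component occupies exactly Length characters.
    # Fixed false: Length is only the maximum length.
    if fixed:
        return len(value) == length
    return len(value) <= length
```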
Setting Type - Property - Description
Global - Description
Global - Error code
Global - Qualifier dependency
Code properties:
Setting Type - Property - Description
Global - Value
Global - Name
Global - Description
Global - Order - Integer identifier of a mutually exclusive set of values. Two codes having the same Order cannot appear together.
Global - Requirement - Mandatory or optional.
Global - Repetitive
Format properties:
Setting Type - Property - Description
Global - Name
Global - Requirement - Mandatory or optional.
Global - Format
INDEX
A
AAR
library 38
acknowledgments
generating EDIFACT 33
generating X12 36
HIPAA 41
ACORD AL3
library 12
ACORD XML
certified industry standard 4
ACORD-MDM Mappers
ACORD LNA XML structure 14
creating project 14
definition 13
files 13
MDM JavaBeans XML structure 13
add-on
HIPAA Validation 42
AL3
ACORD library 12
ASC X12
library 34
ASN.1
AsnToXml processor 19
library 15
ASN.1Step
editor 19
AsnToXml
component 19
B
BAI
library 20
Bloomberg
creating service 25
Bloomberg field definition files
description 25
Bloomberg library
description 22
C
COBOL
deploying projects as services 28
importing data definition 26
integration applications 28
supported features 26
testing parser 27
testing serializer 27
CREST
library 29
D
DTCC_NSCC
validation 31
DTCC-NSCC
customization properties 81
library 30
E
EDI-AAR
library 38
EDI-UCS_and_WINS
library 38
EDI-VICS
library 38
EDI-X12
customization properties 84
library 34
validation 36
EDIFACT
customization properties 83
library 32
validation 32
F
field definition files, Bloomberg
description 25
FIX
library 39
FIXML
certified industry standard 4
FpML
certified industry standard 4
H
HIPAA
customization properties 86
installing validation add-on 42
library 40
using validation and acknowledgments 50
validation and acknowledgments 41
HIPAA library
HIPAA Validation version 41
non-validating HIPAA version 41
HIX
library 54
HL7
customization properties 87
library 56
HL7 Version 3
certified industry standard 4
I
IATA PADIS
library 57
IDC-IDSI
library 58
IFX
certified industry standard 4
installation
libraries 3
installing
library 3
ISO 20022
certified industry standard 4
L
libraries
adding components to projects 7
documentation 3
editing components 9
installation 3
obtaining 2
upgrading versions 7
version compatibility 3
library
installing 3
updating 3
Library Customization
perspective 76
Library Customization Tool
creating a project 76
deploying project 78
editing message schema 77
LIBXML files 77
message structures 79
overview 75
library project
creating 6
LIBXML files
Library Customization Tool 77
M
MDM JavaBeans
creating project 62
definition 60
files 60
output XML structure 62
source XML structure 61
message specification
library PDF 79
MISMO
certified industry standard 4
MT messages
SWIFT 65
MX messages
SWIFT 65
N
NACHA
library 63
NCPDP
library 64
NSCC
library 30
O
OAGi
certified industry standard 4
optional field
in library components 9
P
PADIS
IATA library 57
parser
for COBOL data 26
testing library 8
PDF specification
library message 79
perspective
Library Customization 76
projects
Library Customization Tool 76
protocol specifications
ASN.1 schema 15
R
Reuters, Thomson
creating service 73
Reuters, Thomson library
description 72
RosettaNet
certified industry standard 4
S
schema
ASN.1 protocol specification 15
editing message 77
for COBOL data 26
segments
in library components 10
SEPA
certified industry standard 4
library 40, 65
serializer
for COBOL data 26
testing library 8
service
library projects 10
standards
certified XML 4
startup component
in library projects 7
SWIFT
customization properties 90
library 65
validation 68
SWIFT MX
certified industry standard 4
T
Telekurs VDF
library 71
Thomson Reuters
creating service 73
Thomson Reuters library
description 72
TWIST
certified industry standard 4
U
UCS
library 38
UNIFI
certified industry standard 4
V
validation
DTCC-NSCC 31
EDI-X12 36
EDIFACT 32
HIPAA 41
SWIFT 68
VDF
Telekurs library 71
VICS
library 38
W
WINS
library 38
X
X12
EDI library 34
XML messaging standards
certified 4