
GE Research & Development Center
______________________________________________________________

Application of Six Sigma Quality Tools to High Throughput and Combinatorial Materials Development

J.N. Cawse and R. Wroczynski

2001CRD055, April 2001

Class 1
Technical Information Series

Copyright 2001 General Electric Company. All rights reserved.

Corporate Research and Development


Technical Report Abstract Page

Title: Application of Six Sigma Quality Tools to High Throughput and Combinatorial Materials Development

Author(s): J.N. Cawse, R. Wroczynski

Component: Characterization and Environmental Technology Laboratory

Report Number: 2001CRD055

Date: April 2001

Number of Pages: 12

Class: 1

Phone: (518) 387-6095 / 8*833-6095

Key Words: Six Sigma, combinatorial, high throughput, screening, Design for Six Sigma, quality, experimentation, chemistry, materials development

As our ability to generate large numbers of experiments has accelerated, it has become more
important to ensure that the quality of each process step and individual sample is as high as possible.
There are too many samples for each data point to be checked individually, and too many process steps
for human oversight. To achieve a reasonable level of quality in the information from a high throughput
experimental system, the principles of modern Six Sigma manufacturing must be applied. The quality of
the end product is then ensured by the quality of each individual process step and the robustness of final
quality to variation in the process steps. Design for Six Sigma tools are particularly appropriate in this
effort.

Manuscript received March 13, 2001

Application of Six Sigma Quality Tools to High Throughput and Combinatorial Materials Development

James N. Cawse and Ronald Wroczynski

Introduction
Background

Over the past ten years, the new research technology called Combinatorial Chemistry or High Throughput Screening has seen exponential growth. This technology, a set of techniques for creating a multiplicity of compounds and then testing them for activity, has been widely adopted in the pharmaceutical industry over the past few years. Virtually every major drug manufacturer is now using these techniques as the cornerstone of its research and development program. In the pharmaceutical industry, libraries of 1,000 to 1,000,000 distinct compounds are routinely created and tested for biological activity. This is now practical because of the convergence of low-cost computer systems, reliable robotic systems, sophisticated molecular modeling, statistical experimental strategies, and database software tools.

In the last four years, this technology has expanded to materials design problems outside the drug field1. Major chemical companies have entered this arena, either by themselves or in concert with a company such as Symyx2, which specializes in new technologies for combinatorial materials discovery. Initial work has focused on development of robotic sample preparation, reactors, and sensors. Some of this equipment is becoming available commercially. With the use of this equipment, we have found that astonishing increases in the throughput of experimentation are possible (Figure 1).

Figure 1. GE experience with high-throughput screening of a catalyst system. (Plot of number of reactions, 0-25,000, from Oct-97 through Jun-99, comparing a combinatorial reactor with conventional reactors.)
Need For Quality


As our ability to generate large numbers of experiments has accelerated, it has become more important to ensure that the quality of each process step and individual sample is as high as possible. There are too many samples for each data point to be checked individually, and too many process steps for human oversight. To achieve a reasonable level of quality in the information from a high throughput experimental system, the principles of modern Six Sigma manufacturing must be applied. The quality of the end product is then ensured by the quality of each individual process step and the robustness of final quality to variation in the process steps3. Key elements of Six Sigma are defined in Table 1.

Table 1. A Six Sigma glossary.

Critical to Quality (CTQ). Any product or service characteristic that satisfies a key customer requirement or process requirement.

Defect opportunities. Every operation and every subsystem component is considered to be a potential source of one or more defects.

Defects per million opportunities (DPMO). As actual or statistically probable defects are found, they are divided by the opportunities. The ratio is multiplied by 1,000,000 to accentuate the importance of even small numbers of defects.

Unit. An appropriate subdivision of the total effort is defined as a unit. In the present case, a single 96-well sample array was chosen.

Defects per Unit (DPU). To calculate the total number of potential defects in the unit, each DPMO is multiplied by the number of times it can occur in a single unit. The results are totaled to give the DPU.

Z. The standard metric for measuring process capability. ZLT is based on the overall standard deviation and average output. ZST represents the process capability with special cause variation removed.

A Six Sigma accounting of the potential for defects in a representative system with a 96-well array is given in Figure 2. Although the potential for defects in any individual process step is relatively small (0.01-1.0%), the steps are applied so many times and to so many samples that the actual number of defects per unit can be substantial. For example, in the second line of the figure, the technician is assumed to weigh 5 chemicals in making up the stock solutions for a run. An error in any weighing will affect 96 samples, however, so any weighing defect will result in 96 sample defects. Therefore this step has 96 x 5 = 480 opportunities for defects. We assume that any human operation has a 0.6% (6000 dpmo) chance of being defective4, so the total number of defects potentially arising from this process is 480 x 6000 x 10^-6 = 2.88 defects/array. Other sources add to this as shown in Figure 2, which is a highly abbreviated total from a more complex system. It is easy to see that the number of defects/array can quickly become quite large.

Process Step      Source              Defect                 dpmo   # times    DPU
Make Stocks       Chem. Supplier      offspec chemicals      1000      480    0.48
Make Stocks       Technician          weigh chemicals        6000      480    2.88
Make Samples      Robot               poor mixing            3000       96    0.29
React Samples     Heatup              Time                   1500       96    0.15
React Samples     Temp. Control       Temp.                  5000       96    0.48
React Samples     Cooldown            Time                   1500       96    0.15
Analyze Samples   Gas Chromatograph   Peak Identification     100       96    0.01
Analyze Samples   GC                  Endpoint Detection    10000       96    0.96
Analyze Samples   GC                  Peak Integration        500       96    0.05
                                                      Total Defects          5.45

Figure 2. Estimating the total number of defects in an experimental unit: a 96-well sample array. dpmo = defects per million operations; # times = the number of times that operation will be performed on that unit; DPU = defects per unit = (# times x dpmo)/1,000,000.
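This accounting is simple enough to script. The sketch below reproduces the Figure 2 arithmetic; the rows are taken from the figure, while the loop itself is our illustration rather than anything from the report. (Summing the unrounded rows gives 5.43; the figure's 5.45 totals the rounded entries.)

```python
# Sketch of the Figure 2 defect accounting: DPU = dpmo x opportunities / 1e6,
# summed over every defect source in one 96-well array (the "unit").
sources = [
    # (process step, defect, dpmo, opportunities per unit)
    ("Make Stocks",     "offspec chemicals",    1000, 480),
    ("Make Stocks",     "weigh chemicals",      6000, 480),
    ("Make Samples",    "poor mixing",          3000,  96),
    ("React Samples",   "heatup time",          1500,  96),
    ("React Samples",   "temperature control",  5000,  96),
    ("React Samples",   "cooldown time",        1500,  96),
    ("Analyze Samples", "peak identification",   100,  96),
    ("Analyze Samples", "endpoint detection",  10000,  96),
    ("Analyze Samples", "peak integration",      500,  96),
]

total_dpu = 0.0
for step, defect, dpmo, n in sources:
    dpu = dpmo * n / 1_000_000          # defects per unit from this source
    total_dpu += dpu
    print(f"{step:16s} {defect:22s} DPU = {dpu:.2f}")
print(f"Total defects per 96-well array: {total_dpu:.2f}")
```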

Overall Six Sigma Approach


Since we are developing a new process in setting up a combinatorial system, we use the Design for Six
Sigma (DFSS) strategy. The DFSS project stages are summarized as IDOV: Identify, Design, Optimize,
and Validate (Figure 3). Each of the four stages has a few critical tools which are necessary for bringing a
project to successful completion. We will discuss this approach using examples from cases of
combinatorial/high throughput development projects at the GE Corporate Research and Development
Center. These projects include homogeneous catalysis, heterogeneous catalysis, phosphors, coatings, and
polymer development.

Identify
- Identify customer CTQs
- Perform CTQ flowdown
- Analyze measurement system capability
- Generate/validate system/subsystem models
- Build variance predictions (parts, process, performance)

Design
- Roll up variance for all subsystems
- Capability flow-up
- Identify the gaps
- Low DPU of subsystems

Optimize
- Use DOE on prototypes to find the critical few X's
- Perform parameter and tolerance design
- Generate purchase and manufacturing specs

Validate
- Confirm that pilot builds match predictions
- Mistake-proof the process
- Refine models and process characterization database
- Document the effort and results

Figure 3. Outline of the DFSS ("Quality by Design") methodology used for high throughput screening and combinatorial experimentation.

IDOV Step 1: Identify


The critical tasks in the Identify step are identifying the customers (all of them), determining their Critical to Quality (CTQ) parameters, and generating a seamless connection from those parameters to controllable elements of the actual experimental process.

CTQ Determination

Step 1 in CTQ determination is enumeration of all the customers of the new process. This customer listing is organized in a structure called the Application Channel (Figure 4), progressing from the most "outside" (the customer who actually pays money for a product) to the most "inside" (the people who are working on the process). Typically there are three to five customer organizations. Within each of those organizations, the people whose roles are critical to making a good judgment of the CTQs must be determined. Once those people are found, the technique of customer needs mapping (CNM)5 can be used to determine the CTQs.

End User: GE Business Manufacturing Operation. Final customer for catalyst, not process. Customers: Manufacturing Manager, Manufacturing Engineer.

GE Business Pilot Plant Team. Interface of development with end user. Customers: Process Engineer, Pilot Plant Chemist.

GE Corporate Research and Development Catalyst Development Team. Validation of combinatorial development process and laboratory-scale test of candidates. Customers: Catalyst Chemist, Development Engineer.

GE Corporate Research and Development Combinatorial Team. Development of process for catalyst development and generation of catalyst leads for further investigation. Customers: Combinatorial Chemist, Microreactor Engineer, Data Specialist.

Figure 4. Application Channel worksheet for the catalyst development case.

Flowdown of CTQs by Quality Function Deployment


The CTQs determined by customer needs mapping are typically somewhat subjective and a bit fuzzy. Developing these into actionable process items is called CTQ flowdown and is typically done using the Quality Function Deployment (QFD) technique.6 A typical interrelationship of the houses of the QFD for a material or catalyst development project is shown in Figure 5.

The first house (Figure 6) captures the critical to quality needs of the external customer and how the Business unit expects to fulfill those customer needs. The primary customer CTQs become the rows ("Whats") of the first house of the QFD; the measurements become the columns ("Hows"). Using the typical QFD process6,7, importance levels are assigned (1 to 5, 5 is most important) to the external Customer CTQs based on the CNM. The relationship of the CTQs to the "Hows" (Business Product Deliverables) is then defined by assigning high, medium, or low (9, 3, 1) values to the cells based on the impact that the measurements have on the CTQs. The products of the cell values with the importance values of the CTQs are determined and totaled for each column. These totals thus give the overall impact that each product deliverable has on meeting the Customer CTQs.
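The column-total arithmetic is just a weighted sum, as this minimal sketch shows; the importance ratings and impact matrix here are made-up placeholders, not the report's actual QFD entries.

```python
import numpy as np

# Hypothetical first house: rows = customer CTQs (importance 1-5),
# columns = business product deliverables, cells = impact values (9/3/1, 0 = none).
importance = np.array([5, 4, 3, 1])          # e.g. quality, capacity, cost, EHS
impact = np.array([
    [9, 3, 1, 0],    # impact of each deliverable on CTQ 1
    [3, 9, 0, 1],
    [1, 3, 9, 3],
    [0, 1, 3, 9],
])

# Column totals = sum over rows of importance x impact; these rank the
# deliverables by overall influence on the customer CTQs.
totals = importance @ impact
for col, total in enumerate(totals, start=1):
    print(f"deliverable {col}: total impact {total}")
```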
The "Whats" from the first house then become the "Hows" of the second house, and importance values are similarly assigned to them. The second house (Figure 6) is then used to establish the relationship for "How" the materials, catalyst, or process can be varied to deliver the product changes required by the Business to affect the Customer CTQs.

The third house identifies the variables that can or must be probed in the combinatorial space to be able to screen the materials, catalyst, or process for desired changes.

Finally, the fourth house relates to the actual design of the high throughput system (reagent preparation, reactor conditions, product isolation and work-up, chemical analysis, and data storage and data analysis) needed in order to deliver the desired capability of the combinatorial chemistry system. It is here that the CTQs of the Combinatorial subsystems are mapped onto the "Hows," or variables, in the HTS factory (Figure 7).

The fully developed QFD assigns importance and impact values to the relationships of items in the "Whats" (rows) and "Hows" (columns) within each house. As a result it links (flows down) the Customer CTQs to items further down in the process. This ultimately determines the design of the high throughput screening (HTS) system (the combinatorial factory).

Figure 5. The structure of the Quality Function Deployment Houses used for CTQ flowdown in high throughput screening experimentation. (Each house links "Whats" (rows) to "Hows" (columns), with the Hows of one house becoming the Whats of the next: House #1, Customer Wants to Business Product Deliverables; House #2, Product Deliverables to Catalyst, Materials, and Process; House #3, Materials and Process Variables to the CombiChem System; House #4, CombiChem System to HTS Variables and HTS System Characteristics, defining the High Throughput Screening System.)

Figure 6. First and Second House QFD relating customer CTQs with measurable product and process deliverables. (External customer expectations, with importance totals: Quality Product, 285; Increase Capacity @ Lowest Cost, 156; Total Cost, 92; Environ. Health and Safety, 21. Business product deliverables include resin color, resin color stability, resin hydrostability, resin melt flow, capping level, side product content, reaction residuals, resin manufacturability, manufacturing throughput, polymer reaction measurables, and variable and investment costs.)

Figure 7. A Fourth House QFD connecting the desired materials, catalyst, or process changes to the HTS variables. (Rows: CombiChem factors such as design space choice and coverage, overall variability, accuracy, reproducibility, and scaleability. Columns: HTS variables grouped by preparation, chemical reaction, work-up, chemical analysis, data analysis, and database system, e.g. monomer ratio, monomer purity and stability, catalyst choice and concentration, solvent choice, order of addition, liquid volume and slurry delivery, temperature profile and uniformity, reaction time, N2 flow rate, solvent removal procedure, sampling rates, chromatography peak picking, sample tracking, outlier ID, PS standards for relative Mw, automation of data analysis, and long-term data storage.)

Conversely, once the HTS system is in place, its statistical capability to identify new leads can be related back (flowed up) to the ability to reliably deliver the Customer CTQs.
The next key step in utilizing the QFD is to focus on the design of the HTS system. This is an iterative
process linking the Customer CTQs with the chemical analysis system and the reactor design
specifications. Simply stated, both the analysis and the reaction system have to be sufficiently capable to
allow identification of leads or hits above the composite noise of the systems. However, the ability to
perform certain high throughput analyses is a function of the type of reaction and reactor employed and,
similarly, the reaction system can be configured to more easily allow rapid in-line or off-line analysis. This
iterative process occurs early in the HTS design process and fully utilizes the interactive matrix of
variables and responses delineated by the QFD process.
To fully understand the interaction of the analysis and reaction systems, a complete process map is
constructed for the steps involved in running the HTS factory. Figure 8 depicts a generalized process map
including the design and operation stages of an HTS project. These generalized process steps can be subdivided into more detailed actions or steps. Figure 9 highlights an example of detailed steps that correspond to "Prepare Arrays" in Figure 8. The detailed process steps that are unnecessary or unwieldy can be changed or eliminated. The remaining steps in the process map can then be linked to the variables in the HTS house of the QFD. Process steps related to those elements of the HTS house which have the greatest impact on the Customer CTQs are then the focus of significant quality evaluation and refinement to reduce sources of variability.

IDOV Step 2: Design


The key Six Sigma activities in the Design stage include setting the specifications for the system and
flowing up the variance from the subsystems to the total system.

Figure 8. A generalized process map for design and operation of a High Throughput Experimental System. (The map runs from factory design, with inputs of ideas and chemicals: decide on system to study; define key analyses and analysis precision needed; determine design required; construct prototype; evaluate and validate method; validate safety and operation. It continues through a prework process: plan and prioritize the experimental universe; plan combinatorial strategy for exploration; safety, solubility, and compatibility tests; define stock solutions and coat mixtures; purchase chemicals. It ends with factory production: prepare arrays; synthesis; setup, process, and evaluate; sample analysis; data analysis; system database and inferential engine; with candidates for scale-up as the output.)

Figure 9. Detailed process map for array preparation. (Steps: weigh reagents for stocks; dissolve reagents for stocks; weigh catalyst for stock; add solvent for stock; charge catalysts to HTS array; remove solvent; secure tips to 96-well replicator; charge reagents to array with replicator; transport array to oven; remove solvent; secure pipette tips; charge initiator to array with replicator; transport array to vacuum oven; remove solvent. These steps all fall within the "Preparation" stage of the subsystem sequence: Preparation, Chemical Reaction, Analytical Preparation, Chemical Analysis, Data Analysis, Database System.)

Specifications
Unlike a conventional factory, a high throughput experimental system has only information as its product,
so the specifications are on the quality of that information. Therefore, all specifications are expressed in
terms of variance (or, more conventionally, standard deviation). Specifications are not internally
determined; they are externally agreed upon by the customer and the process team, and should flow
logically from the measurements of the critical customer parameters (CTQs). In our catalyst
development case, the critical parameters were accurate and rapid identification of catalyst leads. In
addition, the desired catalyst activity was more than double the current level. From that, we agreed that a
25% increase in activity would represent a useful lead, and that we desired a 95% probability of correctly
detecting that lead.

CTQs:
- Accurate lead identification
- Rapid lead identification
- >100% increase in catalyst activity

Specification: 95% probability of correctly detecting a 25% increase in catalyst activity.
This specification is on the whole system: the major program of the Design effort is to flow that
specification down to the process subsystems, then flow the variances of the subsystems back up to the
whole system. In short, the variance of the whole system is a function of the variances of the subsystems
(Figure 10).

Figure 10. The variance of the whole system is a function of the variances of the subsystems. (Whole system, decomposed into: Preparation, Chemical Reaction, Analytical Preparation, Chemical Analysis, Data Analysis, Database System.)
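Before describing how the subsystem variances are measured, here is a minimal sketch of the roll-up itself, assuming independent subsystems so that the variances add. The subsystem standard deviations are invented for illustration (the report gives no numeric values), and the detection check approximates the specification above with a one-sided z-test.

```python
import math

# Hypothetical subsystem standard deviations, as fractions of the signal.
subsystem_sd = {
    "preparation": 0.06, "chemical reaction": 0.10, "analytical prep": 0.04,
    "chemical analysis": 0.05, "data analysis": 0.02,
}
# Independent subsystems: variances add, so sigmas combine in quadrature.
total_sd = math.sqrt(sum(sd ** 2 for sd in subsystem_sd.values()))

def p_detect(effect=0.25, sd=total_sd, n=4, z_crit=1.645):
    """P(a one-sided z-test at ~95% confidence flags a true 'effect' lift
    when a lead is measured with n replicate samples)."""
    z_effect = effect / (sd / math.sqrt(n))
    return 0.5 * (1 + math.erf((z_effect - z_crit) / math.sqrt(2)))

print(f"whole-system sigma = {total_sd:.3f}")
for n in (1, 2, 4, 8):
    print(f"n = {n}: P(detect 25% lead) = {p_detect(n=n):.2f}")
```

With these made-up sigmas, four replicates would be enough to meet the 95% detection specification; the same arithmetic, run with measured variances, is how the required number of samples is estimated.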

This flow-up of variance can be done in two ways: by defect accounting or experimentally, using nested DOEs. The approaches are complementary, and both were used in this case. The purpose of nested experimentation is:
- to get a measurement of the total process variance, so it can be compared to the variance-based specification to see if the experimentation has a reasonable chance of success, and if so, to estimate the required number of samples;
- to identify the most important sources of experimental variance so that they may be reduced with Six Sigma MAIC (Measure-Analyze-Improve-Control)7 projects.
Nested experiments should capture the variance resulting from normal operation of the process.

The purpose of defect accounting is to capture discrete sources of error not caught in the nested experiment. These can include missed points, erroneously entered information, contaminated or mislabeled chemicals, and so on. Defect accounting should capture the variance resulting from all abnormal events in the process. These also become targets of MAIC projects.
Nested DOEs
As a set of samples moves through the high throughput factory, it undergoes a series of processes, each
of which contributes a certain amount of variance to the total. This leads to a situation where there are
multiple experimental units,8 and the variances from each unit must be calculated from a nested statistical
analysis.
Figure 11 shows a fairly typical process in a combinatorial factory. Four subprocesses are shown: making
stocks, making an array of samples from the stocks, reacting the array, and analyzing the samples. Each
has its own sources of variance. Careful planning and analysis of the data is necessary to ensure that each
subprocess is assigned its appropriate degrees of freedom and variance.
Determination of the effect of the variances of each step of the process requires a hierarchical design.
This initial step in variance determination should be a simple hierarchical design in which a single
representative system is replicated many times, with all of the subprocesses represented in the replication
(Figure 12). That will give an estimate of the overall variance of the system and the variance of the initial
subprocesses. The statistical analysis of a balanced system like the one shown is standard and is covered
in most texts 9 and statistical software packages.10 These texts and software often do not, however,
mention that the uncertainty in the estimates of variance at the upper levels of the hierarchy can be very
large!11

Process Step    Factors
Make Stocks     Stock solution factors: amount of chemical
Make Array      Sample factors: amounts added, total volume
React Array     Reactor factors: temperature, reaction time, pressure, gas composition
Analyze Array   Analysis factors: peak identification settings, integration settings

Figure 11. A typical process in making a combinatorial or high throughput array.

Replicate Stocks
  Replicate Reaction Runs
    Replicate Samples
      Replicate Analyses

Figure 12. A hierarchical experimental design for studying a nested system.
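To make the nesting concrete, here is a minimal sketch of the classical expected-mean-squares analysis for a balanced design like Figure 12. The level counts and true sigmas are invented for illustration, and the analysis level is folded into the sample level to keep the sketch short.

```python
import numpy as np

# Simulate a balanced nested design: stocks -> runs within stocks ->
# samples within runs.
rng = np.random.default_rng(1)
n_stock, n_run, n_samp = 4, 3, 8                  # hypothetical replicate counts
sd_stock, sd_run, sd_samp = 0.05, 0.08, 0.03      # hypothetical true sigmas

y = (rng.normal(0, sd_stock, (n_stock, 1, 1))
     + rng.normal(0, sd_run, (n_stock, n_run, 1))
     + rng.normal(0, sd_samp, (n_stock, n_run, n_samp)))

# Mean squares at each level of the hierarchy
grand = y.mean()
stock_means = y.mean(axis=(1, 2))                 # shape (n_stock,)
run_means = y.mean(axis=2)                        # shape (n_stock, n_run)
ms_stock = n_run * n_samp * ((stock_means - grand) ** 2).sum() / (n_stock - 1)
ms_run = n_samp * ((run_means - stock_means[:, None]) ** 2).sum() / (n_stock * (n_run - 1))
ms_samp = ((y - run_means[..., None]) ** 2).sum() / (n_stock * n_run * (n_samp - 1))

# Solve the expected-mean-squares equations for the variance components
var_samp = ms_samp
var_run = (ms_run - ms_samp) / n_samp
var_stock = (ms_stock - ms_run) / (n_run * n_samp)
print(f"sigma^2 estimates: stock {var_stock:.4f}, run {var_run:.4f}, sample {var_samp:.4f}")
```

With only a few stocks, the stock-level estimate can even come out negative, which is exactly the large upper-level uncertainty the text warns about.11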

Defect Accounting
In defect accounting, the entire system and each subsystem is process-mapped in detail. Each step is examined by the Six Sigma Black Belt and the process expert for potential sources of defects. These can be classified by type:
- Parts. The quality level of components or raw materials (such as chemicals).
- Process. The quality level of each unit operation (e.g. weighing, fluid transfer, or reaction).
- Performance. The overall quality of subsystem performance vs. specifications when all elements are working correctly but normal variance is present.
- Software. The quality of any software specifically written for the system.
The outcome of the process is illustrated in Figure 2, which shows a small part of a high throughput screening system. Even with this small fraction of the overall opportunities for defects, we predicted 5.6 DPU.
Outcome of the Variance Flowup
The results of the nested experiments and defect accounting are combined to give an overall estimate of
the system quality. They are also used to pinpoint the largest sources of variance. Once this is done, we
can apply Six Sigma techniques to reducing that variance. Continued nested experiments and defect
accounting will allow us to track improvements in system quality until the system has reached a practical
quality level.

IDOV Step 3: Optimize


Optimization of the high variance processes identified in the previous steps is done by the standard Six
Sigma Measure-Analyze-Improve-Control (MAIC) process. The following cases illustrate the particular
problems of high throughput screening and the quality level we have been able to achieve through MAIC
methodology.
Case 1: Analytical System Gage Reliability and Reproducibility
The analytical system is the heart of any high throughput experimental operation, so it must operate at the
highest level of reliability for the operation to succeed. There are too many samples per day for the
operator or scientist to scan the raw data (spectra, chromatograms, etc.) for accuracy and consistency. In
our catalyst system, the analysis was a rapid gas chromatogram. The MAIC process improved its
capability immensely:
- Measure: One subsystem CTQ was peak identification. A gas chromatograph integration system identifies components by selecting the largest peak inside a set retention time window (Figure 13). Therefore the consistency of peak retention times is critical to the overall system capability.
- Analyze: The initial Six Sigma analysis (Figure 14) of the chromatographic relative retention time capability showed that the capability of the system was far too low. The small groups in the histogram were identified as single-day measurements. There was therefore severe day-to-day drift (ZLT), but even the individual groups indicated poor within-day capability (ZST).
- Improve: Designed experiments on the number and type of internal standards optimized the system (Figure 15). The ZST improved to 27 and the ZLT to 12, far more than six sigma! Missed peak identification from this source can be estimated to be in the parts per trillion.

Figure 13. The critical parameters in chromatographic peak identification. (Chromatogram sketch marking the peak identification window and the peak retention time variation.)
Figure 14. GC retention time capability before the MAIC project: ZST = 1.8, ZLT = 0.6. Note the groups of data points which were single-day results. This indicated moderate short-term (ZST) capability, but the day-to-day drift led to very poor long-term (ZLT) capability. LSL = lower specification limit; USL = upper specification limit.

Figure 15. GC retention time capability after the MAIC project: ZST = 27, ZLT = 12. Note that the scale has changed, although the LSL and USL remain the same as in Figure 14.

- Control: Careful choice of the internal standards allows diagnosis of potential deterioration in chromatographic behavior by tracking their relative areas using control charts.
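Because ZST and ZLT drive all of these capability statements, a minimal sketch of how the two are computed from grouped (per-day) measurements may be useful; the specification limits and simulated data below are ours, not the report's.

```python
import numpy as np

# Hypothetical relative retention time data: rows = days, columns = replicates.
rng = np.random.default_rng(2)
day_shift = rng.normal(0, 0.03, size=(5, 1))            # day-to-day drift
data = 3.975 + day_shift + rng.normal(0, 0.005, (5, 20))
lsl, usl = 3.85, 4.10                                    # illustrative spec limits

# Long-term Z uses the overall sigma (drift included); short-term Z uses the
# pooled within-day sigma (special-cause drift removed).
mean = data.mean()
sigma_lt = data.std(ddof=1)
sigma_st = np.sqrt(data.var(axis=1, ddof=1).mean())
z_lt = min(usl - mean, mean - lsl) / sigma_lt
z_st = min(usl - mean, mean - lsl) / sigma_st
print(f"Z_ST = {z_st:.1f}, Z_LT = {z_lt:.1f}")          # drift hurts only Z_LT
```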

Case 2: Robot System Mixing Capability

Robotic pipettors are a mainstay of high throughput experimentation. They were primarily developed for the medical and pharmaceutical industries, so they are optimized for operation with water. In our system, we planned to combine several different stock solutions made with a nonaqueous solvent and mix them by repetitive aspirate-and-release operations. This was specified by the manufacturer as an effective mixing technique. Using MAIC we found otherwise:
- Measure: The subsystem CTQ was mixture uniformity. The initial specification was set at 20% variation around the nominal, as measured by six consecutive chromatographic samples from a single vial.
- Analyze: The variation of the system was found to be more than 30% (Figure 16).
- Improve: DOE on the pipette control parameters showed that the variation could not be reduced to specifications at any available settings! A second experiment comparing alternative mixing methods showed that a miniature magnetic stirring bar in each vial was the only way to achieve adequate mixing (Figure 17).
- Control: Mistake-proofing the procedure for putting the stirring bars in the vials was important. New stirring bars for each experiment proved necessary to minimize cross-contamination.

Figure 16. The variation in composition (mix ratio) of six repetitive measurements taken on 58 samples after mixing by aspiration.

Figure 17. The variation in composition of the system after substitution of mini stirbars for mixing by aspiration: ZST = 7.7, ZLT = 7.5, with mix ratios falling well within the LSL and USL.

Case 3: Reactor Temperature Control

In any materials development program that involves the rate of a chemical reaction as a critical parameter, temperature control and uniformity are often underappreciated. Figure 18 shows the effect on reaction rate variance caused by a small (2 °C) temperature variability. With MAIC we:
- Measure: Since catalyst TON (turnover number) is directly related to reaction rate, temperature was identified as a CTQ. A 2 °C change in temperature was found to cause a 20% change in reaction rate in our system, so 2 °C variation was set as the specification. Some immediate low-hanging fruit was discovered when it was found that the reactor temperature was controlled with a low-quality, easily broken thermocouple (standard wire error 1.1-2.9 °C). That was replaced with a platinum resistance thermometer (deviation 0.35 °C)12.
- Analyze: Measurement of the actual sample temperatures showed relatively small sample-to-sample variation but large swings (Figure 19) caused by the crude temperature control system and the thick walls of the high pressure reactor.
- Improve: Careful examination of the data shown in Figure 19 showed that the within-reactor temperature uniformity was quite good (ZST = 4.5, as shown by the closeness of the lines in the figure). The critical problem was the temperature control system, which suffered severe lags because of the reactor wall thickness. A control system upgrade followed by optimization of its parameters reduced the temperature swings to less than 1.5 °C. We also modified the sample holder to improve gas circulation within the autoclave.
- Control: Once the optimized control system was in place, the system was set on an automatic procedure which minimized the variability. An ongoing program of SPC samples in the process has shown acceptable performance.

Figure 18. Rate variation caused by a 2 °C temperature difference across a reactor array, as a function of reaction temperature (120-300 °C) and Arrhenius activation energy (5-20 kcal). (Surface plot; percent variation runs from 0% to 16%.)

Figure 19. The actual variation in temperature within the high pressure autoclave as measured by precision thermocouples. (Traces from five thermocouple locations, roughly 86-102 °C over a run from 11:30 to 15:00, with the 2 °C specification and manual TC readings indicated.)
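The sensitivity plotted in Figure 18 follows directly from the Arrhenius equation, k proportional to exp(-Ea/RT). The sketch below evaluates it over roughly the figure's ranges; the grid points are our choice, not values from the report.

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol K)

def rate_change_pct(t_celsius, ea_kcal, dt=2.0):
    """Percent change in the Arrhenius rate constant for a dt (deg C) error."""
    t1 = t_celsius + 273.15
    t2 = t1 + dt
    # k2/k1 = exp((Ea/R) * (1/T1 - 1/T2))
    return 100.0 * (math.exp((ea_kcal / R) * (1.0 / t1 - 1.0 / t2)) - 1.0)

for ea in (5, 10, 20):
    row = ", ".join(f"{t} C: {rate_change_pct(t, ea):4.1f}%"
                    for t in (120, 180, 240, 300))
    print(f"Ea = {ea:2d} kcal -> {row}")
```

High activation energy at low reaction temperature is the worst case, which is why a 2 °C swing can move the rate by well over 10%.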

Validate
The most important validation for an HTS project is confirmation that the screening results predict outcomes at the laboratory and pilot scales. For the catalyst project, 94% of the leads identified in high throughput screening were confirmed at the laboratory (150 ml reactor) scale.

A second catalyst project had two important CTQs: high catalyst activity and low side-product formation. In traditional studies, these two responses went hand-in-hand; that is, highly active catalysts gave high amounts of side-product. Over 300 catalyst compositions were evaluated by HTS, with an aggressive 24% being identified as leads for more detailed traditional evaluation. Of these leads, 90% were confirmed as high activity catalysts with lower side-product formation in larger-scale laboratory experiments. Figure 20 shows the distribution of catalysts with various activity and side-product formation. The current catalyst performance is shown along with some of the confirmed HTS leads, highlighted by arrows.

Figure 20. Distribution of catalyst activity and side product production for an oligomerization catalyst. The small yellow arrows indicate leads which were confirmed upon scaleup. (3-D histogram of frequency vs. activity and side product, with the current catalyst marked.)

Conclusion

In two years, we have successfully used this high throughput screening technology to explore >2000 unique materials combinations in over 10,000 different ratios. Many hits were discovered and several patents filed. The overall benefits to the catalyst development program were:
- A broad, proprietary technology position has been developed for this process.
- New systems with attractive economic and environmental properties have been identified.
- Wide exploration has been made less risky. We have ended the "moth around the flame" syndrome, in which only modest variations were made on a known good system because "wild" exploration was too expensive.

From this we have identified the key learnings from application of DFSS (Design for Six Sigma) to combinatorial and high throughput experimental systems:
- Link and prioritize multiple complex steps with Quality Function Deployment.
- Pay special attention to the Gage Repeatability and Reproducibility of the analytical system.
- Focus on the variance of subsystems to reduce the overall system variance.
- Roll up subsystem variance to estimate system variance.
- Attack variance using advanced Design of Experiments tools.

References:
(1) Liu, D. R.; Schultz, P. G. Generating New Molecular Function: A Lesson From Nature. Angew. Chem. Int. Ed. 1999, 38, 36-54.
(2) Symyx Technologies. http://www.symyx.com (April 3).
(3) Harry, M. J. The Nature of Six Sigma Quality; Motorola University Press: Schaumburg, IL, 1988.
(4) The quality of routine manual operations (e.g., restaurant bills, prescription writing) tends to cluster at 4 Sigma, or 6000 DPMO: Harry, M. J. The Vision of Six Sigma: A Roadmap for Breakthrough; Sigma Publishing Company: Phoenix, AZ, 1994; p 19.26.
(5) Juran, J. M. In Juran's Quality Control Handbook, 4th ed.; Juran, J. M., Ed.; McGraw-Hill: New York, 1988; pp 6.4ff.
(6) Quality Function Deployment: 3-Day Workshop; American Supplier Institute: Dearborn, MI, 1997.
(7) Breyfogle, F. W. Implementing Six Sigma; John Wiley and Sons: New York, 1999.
(8) Milliken, G. A.; Johnson, D. E. Analysis of Messy Data; Van Nostrand Reinhold: New York, 1984.
(9) Montgomery, D. C. Design and Analysis of Experiments; John Wiley & Sons: New York, 1984.
(10) Minitab, 12th ed.; Minitab, Inc., 1999.
(11) Hendrix, C.; S. Charleston, WV, 1997.
(12) The Temperature Handbook; Omega Engineering, Inc., 1995.


Distribution List

Corporate Research and Development, Research Circle, Niskayuna, NY 12309:
M. Brennan, D. Buckley, J. Carnahan, J. Cawse, G. Chambers, B. Chisholm, K. Cueman, D. Dietrich, N. Doganaksoy, T. Early, V. Eddy-Helenek, W. Flanagan, J. Flock, J. Hallgren, C. Hansen, B. Johnson, T. Jordan, R. Kilmer, T. Leib, J. Lemmon, R. Mattheyses, R. May, C. Molaison, W. Morris, D. Olson, B. Pomeroy, R. Potyrailo, E. Pressman, T. Repoff, C. Sabourin, A. Schmidt, R. Shaffer, K. Shalyaev, O. Siclovan, J. Silva, G. Soloveichik, J. Spivack, C. Stanard, J. Teetsov, J. Webb, J. Wetzel, D. Whisenhunt, B. Williams, R. Wroczynski

GE Plastics, 5th and Avery Streets, Parkersburg, WV 26101: A. Berzinis

GE Capital, 260 Long Ridge Road, Stamford, CT 06927

GE Plastics, Highway 69 South, Mt. Vernon, IN 47620: S. Cooper, J. DeRudder, D. Whalen

GE Plastics, 1 Plastics Avenue, Pittsfield, MA 01201: N. Izadi-Khalili, S. Clarke
