1. Acceptance Quality Control - AQL stands for Acceptance Quality Limit and is defined in ISO 2859-1 as the worst quality level that is tolerable. It determines the maximum number of defective units beyond which a batch is rejected. Importers usually set different AQLs for critical, major, and minor defects, and most Asian exporters are familiar with this type of setting.
2. Acceptance Sampling - Acceptance sampling uses statistical sampling to determine whether to
accept or reject a production lot of material. It has been a common quality control technique used
in industry. It is usually done as products leave the factory, or in some cases even within the
factory.
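To make the idea concrete, here is a minimal Python sketch of a single-sampling acceptance plan. The sample size (n = 80) and acceptance number (c = 3) are illustrative assumptions, not values taken from ISO 2859-1 or from this glossary.

    import random

    def accept_lot(lot, sample_size=80, acceptance_number=3):
        """Inspect a random sample; accept the lot if the number of
        defective units found does not exceed the acceptance number."""
        sample = random.sample(lot, sample_size)
        defects_found = sum(1 for unit in sample if unit == "defective")
        return defects_found <= acceptance_number

    # Simulated lot of 2000 units with roughly 2% defectives (made-up data).
    lot = ["defective" if random.random() < 0.02 else "good" for _ in range(2000)]
    print("Lot accepted:", accept_lot(lot))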
3. Assignable Variations - A distribution of non-random results caused by a single identifiable
factor with clearly defined characteristics.
4. Chance Variation - Chance variation or chance error or random error is the inherent error
in any predictive statistical model. It is defined as the difference between the predicted value of a
variable (by the statistical model in question) and the actual value of the variable. For a fairly
large sample size, these errors tend to be distributed evenly above and below the mean
and cancel each other out, resulting in an expected value of zero.
5. Continuous Improvement - A continual improvement process, also often called
a continuous improvement process (abbreviated as CIP or CI), is an ongoing effort to
improve products, services, or processes. These efforts can seek "incremental" improvement
over time or "breakthrough" improvement all at once.Delivery (customer valued) processes
are constantly evaluated and improved in the light of their efficiency, effectiveness and
flexibility.
6. Control Charts - The control chart is a graph used to study how a process changes over
time. Data are plotted in time order. A control chart always has a central line for the average,
an upper line for the upper control limit and a lower line for the lower control limit.
7. Control Limits - Control limits, also known as natural process limits, are horizontal lines
drawn on a statistical process control chart, usually at a distance of 3 standard deviations of
the plotted statistic from the statistic's mean.
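As an illustration of how the center line and three-sigma control limits described above might be computed, here is a small Python sketch; the measurement values are invented for the example.

    import statistics

    measurements = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.0, 10.1]

    center_line = statistics.mean(measurements)
    sigma = statistics.stdev(measurements)        # sample standard deviation
    ucl = center_line + 3 * sigma                 # upper control limit
    lcl = center_line - 3 * sigma                 # lower control limit

    print(f"CL = {center_line:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

    # Flag any point that falls outside the control limits.
    out_of_control = [x for x in measurements if x > ucl or x < lcl]
    print("Out-of-control points:", out_of_control)

In practice, individuals charts often estimate sigma from the average moving range rather than the sample standard deviation; the simpler estimate is used here only to mirror the definition above.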
8. Dispersion - In statistics, dispersion (also called variability, scatter, or spread) is the
extent to which a distribution is stretched or squeezed. Common examples of measures of
statistical dispersion are the variance, standard deviation, and interquartile range.
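The measures of dispersion named above can be computed with Python's standard library (statistics.quantiles requires Python 3.8+); the data values are illustrative.

    import statistics

    data = [4, 7, 7, 8, 9, 10, 12, 15]

    variance = statistics.pvariance(data)          # population variance
    std_dev = statistics.pstdev(data)              # population standard deviation
    q1, q2, q3 = statistics.quantiles(data, n=4)   # quartile cut points
    iqr = q3 - q1                                  # interquartile range
    data_range = max(data) - min(data)             # range (see term 22)

    print(variance, std_dev, iqr, data_range)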
9. Distribution - In statistics, a distribution (or probability distribution) describes how the values of a variable are spread, that is, the relative frequency or probability with which each possible value or range of values occurs.
10. Frequency Distribution - In statistics, a frequency distribution is a table that displays the
frequency of various outcomes in a sample.[1] Each entry in the table contains
the frequency or count of the occurrences of values within a particular group or interval, and
in this way, the table summarizes the distribution of values in the sample.
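A frequency table of this kind can be built with a few lines of Python; the defect categories below are invented purely for illustration.

    from collections import Counter

    observations = ["scratch", "dent", "scratch", "crack", "dent", "scratch"]

    frequency_table = Counter(observations)
    for outcome, count in frequency_table.most_common():
        print(f"{outcome:<10} {count}")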
11. ISO Standards - ISO International Standards ensure that products and services are safe,
reliable and of good quality. For business, they are strategic tools that reduce costs by
minimizing waste and errors, and increasing productivity. They help companies to access
new markets, level the playing field for developing countries and facilitate free and fair global
trade.
12. Lot Size - Lot size refers to the quantity of an item ordered for delivery on a specific date
or manufactured in a single production run. In other words, it is the total quantity of a product ordered for manufacturing.
13. Lower Control Limit - Bottom limit in quality control for data points below the control
(average) line in a control chart.
14. Normal Distribution Curve - In probability theory, the normal (or Gaussian) distribution is
a very common continuous probability distribution. Normal distributions are important
in statistics and are often used in the natural and social sciences to represent real-
valued random variables whose distributions are not known. The normal distribution is
sometimes informally called the bell curve. However, many other distributions are bell-
shaped (such as the Cauchy, Student's t, and logistic distributions). The terms Gaussian
function and Gaussian bell curve are also ambiguous because they sometimes refer to
multiples of the normal distribution that cannot be directly interpreted in terms of
probabilities.
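For reference, the normal density can be evaluated directly from its formula; the sketch below assumes a mean of 0 and a standard deviation of 1 (the standard normal) and is only meant to show the bell shape numerically.

    import math

    def normal_pdf(x, mu=0.0, sigma=1.0):
        """Normal density: (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)^2 / (2*sigma^2))."""
        coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
        return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

    for x in (-2, -1, 0, 1, 2):
        print(f"f({x:+d}) = {normal_pdf(x):.4f}")   # symmetric about the mean, peaked at 0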
15. Population - In statistics, a population is a set of similar items or events which is of interest
for some question or experiment.
16. Probability - Probability is the measure of the likelihood that an event will occur.[1] Probability is quantified as a number between 0 and 1 (where 0 indicates impossibility[2] and 1 indicates certainty).[3][4] The higher the probability of an event, the more certain it is that the event will occur.
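A quick way to see that probabilities lie between 0 and 1 is to compare a theoretical probability with an estimate from simulation; the die-rolling example below is illustrative.

    import random

    trials = 100_000
    sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)

    theoretical = 1 / 6
    empirical = sixes / trials
    print(f"theoretical = {theoretical:.4f}, empirical = {empirical:.4f}")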
17. Process Capability - Process capability is a measurable property of a process relative to its specification, expressed as a process capability index (e.g., Cpk or Cpm) or as a process performance index (e.g., Ppk or Ppm).
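A common way to compute the Cp and Cpk indices mentioned above is sketched below; the specification limits, process mean, and standard deviation are assumed values chosen only for the example.

    def process_capability(usl, lsl, mean, sigma):
        """Return (Cp, Cpk) for the given specification limits and process statistics."""
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mean, mean - lsl) / (3 * sigma)
        return cp, cpk

    cp, cpk = process_capability(usl=10.5, lsl=9.5, mean=10.1, sigma=0.1)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # Cp = 1.67, Cpk = 1.33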
18. Quality - A quality is an attribute or a property characteristic of an object.
19. Quality Assurance - Quality assurance (QA) is a way of preventing mistakes or defects in
manufactured products and avoiding problems when delivering solutions or services to
customers; which ISO 9000 defines as "part of quality management focused on providing
confidence that quality requirements will be fulfilled".[1] This defect prevention in quality
assurance differs subtly from defect detection and rejection in quality control, and has been
referred to as a shift left as it focuses on quality earlier in the process.
20. Quality Circle - A quality circle or quality control circle is a group of workers who do the
same or similar work, who meet regularly to identify, analyze and solve work-related
problems.[1] Normally small in size, the group is usually led by a supervisor or manager and
presents its solutions to management; where possible, workers implement the solutions
themselves in order to improve the performance of the organization and motivate
employees.
21. Random sampling - In this technique, each member of the population has an equal chance
of being selected as subject. The entire process of sampling is done in a single step with
each subject selected independently of the other members of the population.
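In Python, simple random sampling without replacement can be done with random.sample; the population of unit labels here is made up for illustration.

    import random

    population = [f"unit-{i:04d}" for i in range(1, 501)]   # 500 units in the lot
    sample = random.sample(population, k=20)                # each unit equally likely
    print(sample)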
22. Range - In statistics, the range of a set of data is the difference between the largest and smallest values.
23. Reliability - Reliability in statistics and psychometrics is the overall consistency of a
measure. A measure is said to have a high reliability if it produces similar results under
consistent conditions.
24. Sample Size - The sample size is an important feature of any empirical study in which the
goal is to make inferences about a population from a sample. In practice, the sample size
used in a study is determined based on the expense of data collection, and the need to have
sufficient statistical power.
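One common rule for choosing a sample size when estimating a mean is n = (z * sigma / E)^2, where z is the confidence coefficient, sigma the assumed population standard deviation, and E the acceptable margin of error; the numbers below are purely illustrative.

    import math

    def required_sample_size(z, sigma, margin_of_error):
        """Sample size needed to estimate a mean within the given margin of error."""
        return math.ceil((z * sigma / margin_of_error) ** 2)

    # 95% confidence (z = 1.96), assumed sigma = 2.5, desired margin of error = 0.5
    print(required_sample_size(z=1.96, sigma=2.5, margin_of_error=0.5))   # 97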
25. Specification Limits - Specification limits are the boundaries set for a process or product by the customer, by market requirements, or by internal targets. In short, they define the intended result on the metric that is measured.
26. Standard Deviation - In statistics, the standard deviation (SD, also represented by the
Greek letter sigma or the Latin letter s) is a measure that is used to quantify the amount of
variation or dispersion of a set of data values.[1] A low standard deviation indicates that the
data points tend to be close to the mean (also called the expected value) of the set, while a
high standard deviation indicates that the data points are spread out over a wider range of
values.
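The definition above can be checked by computing the population standard deviation by hand and comparing it with the standard-library result; the data values are illustrative.

    import math
    import statistics

    data = [2, 4, 4, 4, 5, 5, 7, 9]

    mean = sum(data) / len(data)
    variance = sum((x - mean) ** 2 for x in data) / len(data)
    sd_by_hand = math.sqrt(variance)

    print(sd_by_hand, statistics.pstdev(data))   # both print 2.0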
27. Statistical Process Control - Statistical process control (SPC) is a method of quality
control which uses statistical methods. SPC is applied in order to monitor and control a
process. Monitoring and controlling the process ensures that it operates at its full potential.
28. Statistical Quality Control - Statistical quality control (SQC) is the term used to describe the set
of statistical tools used by quality professionals.
29. Statistics - Statistics is the study of the collection, analysis, interpretation, presentation, and
organization of data.
30. Total Quality Control - Application of quality management principles to all areas of business
from design to delivery instead of confining them only to production activities.
31. Total Quality Management - Total quality management (TQM) consists of organization-
wide efforts to install and make permanent a climate in which an organization continuously
improves its ability to deliver high-quality products and services to customers.
32. Upper Control Limit - The value that indicates the highest level of variation acceptable for a product or service. The upper control limit is used in conjunction with the lower control limit to define the range of expected variability, enabling those within the organization to maintain quality by keeping the process within the established limits.
33. Variability - Range of possible outcomes of a given situation.