
Name : Muhammad Wildan M

NIM : A017006

Differences between Calibration and Tera

Calibration and tera are similar activities in execution, but they differ in purpose. Calibration aims to provide assurance that the calibrated instrument has traceability to national or international standards. Tera ensures fairness in trade and radiation safety.
Some differences between calibration and tera are shown in the table below.

Parameter            | Tera                                       | Calibration
Rules                | UU No. 2 of 1981                           | ISO/IEC 17025:2005
Nature of the rules  | Mandatory                                  | Voluntary
Personnel            | Sworn officials                            | No specific rules yet
Aim                  | Fair trade                                 | Traceability
Equipment type       | All measuring instruments to be used       | Laboratory, production, and service instruments
Management agency    | Department of Trade                        | Calibration laboratories
Working result       | Tera marks, written statement (Srt. Ket.)  | Label, calibration certificate
Interval             | Regulated by UU No. 2 of 1981              | According to the nature of the instrument
Intermediate checks  | Not recognized                             | Performed between calibration intervals
In addition to ISO/IEC 17025:2005, other standards such as the ISO 9000 series also require the control of measuring equipment through calibration as one of the requirements of competence.
3. Some definitions
Traceability: The property of a measurement result or the value of a standard whereby it can be related to a stated reference, usually a national or international standard, through an unbroken chain of comparisons, each having a stated uncertainty.
Calibration: Determining the conventional true value of the indication of an instrument by comparing it against a measurement standard that is traceable to national or international standards.
Menera: Marking with a valid tera mark or a valid cancellation mark, or issuing written statements bearing a valid tera mark or a valid cancellation mark, performed by officials authorized to do so on the basis of tests carried out on measuring, volumetric, and weighing instruments and their accessories that have not yet been used. (UUML 1.q)
Verification: Confirmation through testing and the presentation of evidence that specified requirements have been met.
Maintenance: A series of activities to demonstrate that a calibrator and its equipment remain fit for use in calibration.
4. The standard hierarchy
The standard instruments used in calibration and tera are required to have traceability, as evidenced by calibration certificates. This means that the standard instrument has been compared against the level above it in the standard hierarchy. The standard hierarchy can be described as follows:
International Standards
International standards are defined by international treaties and are therefore also called conventional standards. The standard definitions below are taken from the 7th edition of the International System of Units (SI) brochure, 1998 (BIPM).
Standard dimensions
The standard meter was first agreed in 1889 in the form of a Pt-Ir bar. In 1960 the definition was changed to one based on the radiation of krypton-86: the meter was defined as 1,650,763.73 times the wavelength of krypton-86 radiation. In 1983 the definition was changed to the distance travelled by light in vacuum during 1/299,792,458 of a second. The first prototype meter is kept and maintained at the BIPM (Bureau International des Poids et Mesures) under the conditions agreed in 1889.
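As a rough numerical sketch of the two later definitions (the krypton-86 wavelength of about 605.78 nm used here is an assumed reference value, not given in the text):

```python
# Sketch: numerical check of the meter definitions described above.
# The krypton-86 wavelength (~605.78 nm) is an assumed reference value.

SPEED_OF_LIGHT = 299_792_458        # m/s, exact by the 1983 definition
KR86_WAVELENGTH = 605.78e-9         # m, krypton-86 orange-red line (assumed)

# 1983 definition: distance travelled by light in vacuum in 1/299 792 458 s
meter_from_light = SPEED_OF_LIGHT * (1 / 299_792_458)

# 1960 definition: 1 650 763.73 wavelengths of the krypton-86 radiation
meter_from_krypton = 1_650_763.73 * KR86_WAVELENGTH

print(f"meter from light definition:   {meter_from_light} m")
print(f"meter from krypton definition: {meter_from_krypton:.6f} m (approximately 1 m)")
```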
Standard mass
The kilogram standard was first defined as the mass of 1 dm³ of distilled water at its maximum density. In 1889 it was agreed to be the mass of a kilogram prototype made of Pt-Ir with a diameter and height of 39 mm. This prototype is still in use today and is stored at the BIPM.
Standard time
The standard second was originally defined as 1/86,400 of a mean solar day. However, because the Earth's rotation is not constant, in 1968 the definition was changed to 9,192,631,770 periods of the radiation corresponding to the transition of the cesium-133 atom, under magnetic-field-free conditions at 0 K.
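A small arithmetic sketch relating the two definitions of the second described above:

```python
# Sketch: arithmetic behind the two definitions of the second given above.

SECONDS_PER_MEAN_SOLAR_DAY = 24 * 60 * 60      # = 86 400, basis of the old definition
CESIUM_TRANSITION_FREQ = 9_192_631_770          # Hz, cesium-133 transition frequency

# Old definition: 1 s = 1/86 400 of a mean solar day
print("seconds in a mean solar day:", SECONDS_PER_MEAN_SOLAR_DAY)

# Current definition: 1 s = 9 192 631 770 periods of the cesium radiation
period = 1 / CESIUM_TRANSITION_FREQ             # duration of one cesium period, s
print("one second =", CESIUM_TRANSITION_FREQ * period, "s (by definition)")
```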
Standard electric current
The standard of electric current, the ampere, was defined in 1946 as the constant current which, if maintained in two straight parallel conductors of infinite length and negligible cross-section, placed 1 m apart in vacuum, would produce between the conductors a force of 2 × 10⁻⁷ newton per meter of length.
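This force figure follows from the standard force law for parallel currents, F/L = μ₀·I₁·I₂/(2π·d), which is assumed here rather than stated in the text; a minimal sketch:

```python
import math

# Sketch: force per unit length between two long parallel conductors,
# F/L = mu_0 * I1 * I2 / (2 * pi * d)  (standard Ampere force law, assumed here).
MU_0 = 4 * math.pi * 1e-7     # N/A^2, magnetic constant (pre-2019 exact value)

def force_per_meter(i1_amp: float, i2_amp: float, distance_m: float) -> float:
    """Force per meter of length between two parallel straight conductors."""
    return MU_0 * i1_amp * i2_amp / (2 * math.pi * distance_m)

# Two conductors carrying 1 A each, 1 m apart -> 2e-7 N per meter of length
print(force_per_meter(1.0, 1.0, 1.0))   # 2e-07
```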
Standard temperature
The unit of thermodynamic temperature, the kelvin, was defined in 1968 as 1/273.16 of the thermodynamic temperature of the triple point of water, i.e. the condition in which water exists simultaneously in the liquid, solid, and gas phases. The triple point occurs at a temperature of 0.01 °C and a pressure of about 611.7 Pa, well below 1 atmosphere. The relationship between degrees Celsius and kelvin is T(K) = t(°C) + 273.15.
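A minimal conversion sketch based on the relation just stated:

```python
# Sketch: Celsius <-> kelvin conversion from the relation T(K) = t(degC) + 273.15.

def celsius_to_kelvin(t_celsius: float) -> float:
    return t_celsius + 273.15

def kelvin_to_celsius(t_kelvin: float) -> float:
    return t_kelvin - 273.15

# Triple point of water: 0.01 degC corresponds to 273.16 K
print(celsius_to_kelvin(0.01))    # 273.16
print(kelvin_to_celsius(273.16))  # 0.01 (within floating-point rounding)
```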
Standard amount of substance
The unit of amount of substance, the mole, was defined in 1969 as the amount of substance containing as many elementary entities as there are atoms in 0.012 kg of carbon-12. When the mole is used, the elementary entities must be specified; they may be atoms, molecules, ions, electrons, other particles, or specified groups of such particles.
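A small numerical sketch of the amount-of-substance relation n = m/M (the Avogadro constant value used here is an assumed reference figure, not given in the text):

```python
# Sketch: amount of substance n = m / M and entity count N = n * N_A.
# The Avogadro constant value is an assumed reference figure.

AVOGADRO = 6.02214076e23     # entities per mole (assumed reference value)

def moles(mass_kg: float, molar_mass_kg_per_mol: float) -> float:
    """Amount of substance in moles for a given mass of material."""
    return mass_kg / molar_mass_kg_per_mol

# 0.012 kg of carbon-12 (molar mass 0.012 kg/mol) is exactly 1 mol,
# i.e. about 6.022e23 atoms -- the basis of the definition above.
n = moles(0.012, 0.012)
print(n, "mol ->", n * AVOGADRO, "atoms")
```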
Standard luminous intensity
The standard of luminous intensity, the candela, was defined in 1979 as the luminous intensity of a source emitting monochromatic radiation at a frequency of 540 × 10¹² hertz with a radiant intensity of 1/683 watt per steradian.
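A sketch of the conversion implied by this definition between radiant and luminous intensity at 540 × 10¹² Hz:

```python
# Sketch: at 540e12 Hz, luminous intensity (cd) = 683 * radiant intensity (W/sr),
# which is the candela definition above rearranged.

LUMINOUS_EFFICACY_540THZ = 683.0     # lm/W, from the 1/683 W/sr definition

def luminous_intensity_cd(radiant_intensity_w_per_sr: float) -> float:
    """Luminous intensity of monochromatic 540 THz radiation."""
    return LUMINOUS_EFFICACY_540THZ * radiant_intensity_w_per_sr

# A 540 THz source radiating 1/683 W/sr has a luminous intensity of 1 cd
print(luminous_intensity_cd(1 / 683))   # 1.0
```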
Primary standard
The primary standard is the first derivative of the international standard and is the highest standard in a country (national standard). The primary standard prototypes for each quantity are as follows:
The primary standard prototypes for mass and dimension are the same as their international standards.
The prototype for the primary time standard is an atomic clock based on the cesium atomic transition.
The primary standard prototypes for electric current are the primary standard resistor and the primary standard voltage source.
The primary standard prototype for temperature is a platinum resistance thermometer. In 1927 the IPTS (International Practical Temperature Scale) was approved for practical temperature measurement.
The primary standard prototype for luminous intensity is an optical radiation power meter based on radiometric methods.
Secondary standards
Secondary standards are derived from primary standards and are stored or maintained in industrial measurement laboratories or calibration laboratories. Secondary standards can be reproduced and used to calibrate the standard instruments below them. The secondary time standard, in the form of a device called a frequency counter, is commercially available.
Working standards
Working standards are the standards used to calibrate measuring instruments or test equipment. Working standards are often referred to as calibrators.
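A minimal sketch of the hierarchy described above, modelled as an ordered traceability chain from the international standard down to the instrument under test (the example instruments named are illustrative only):

```python
# Sketch: the standard hierarchy as an ordered traceability chain.
# Names of example instruments are illustrative, not taken from the text.

TRACEABILITY_CHAIN = [
    ("international standard", "BIPM prototype / SI definition"),
    ("primary standard",       "national standard, e.g. cesium atomic clock"),
    ("secondary standard",     "e.g. frequency counter in a calibration lab"),
    ("working standard",       "calibrator used on the bench"),
    ("instrument under test",  "the tool being calibrated"),
]

def print_chain(chain):
    """Each level is calibrated against the level directly above it."""
    for level, (name, example) in enumerate(chain):
        print("  " * level + f"{name}: {example}")

print_chain(TRACEABILITY_CHAIN)
```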
5. Some calibration parameters
Calibration is part of the science of metrology; consequently, much calibration work involves measurement activities. Widely used measurement terms that need to be well understood include the following:
Accuracy
The closeness of an instrument's reading to the true value.
Precision
The ability of the measuring instrument to produce similar readings in repeated measurements.
Resolution
The smallest change in the measured quantity that an instrument or measuring device can respond to.
Sensitivity
The ratio of the response of the measuring instrument to the change in the measured input variable.
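A small sketch showing one way accuracy and precision could be quantified from repeated readings against a known reference value (the readings and reference value are invented for illustration):

```python
import statistics

# Sketch: quantifying accuracy and precision from repeated readings.
# The readings and reference value below are invented for illustration.

readings = [10.02, 10.01, 10.03, 10.02, 10.00]   # repeated indications of the instrument
reference = 10.00                                 # conventionally true value from a standard

mean_reading = statistics.mean(readings)

accuracy_error = mean_reading - reference         # closeness to the true value (bias)
precision = statistics.stdev(readings)            # spread of repeated measurements

print(f"mean reading:        {mean_reading:.4f}")
print(f"error (accuracy):    {accuracy_error:+.4f}")
print(f"std dev (precision): {precision:.4f}")
```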
6. Units
The unit system used in calibration is the SI system (Système International d'Unités). The SI system has 7 base units: the meter (m), kilogram (kg), second (s), ampere (A), kelvin (K), mole (mol), and candela (cd).
In addition to the units above, there are two supplementary units: the unit of plane angle (radian) and the unit of solid angle (steradian). From the base and supplementary units, various derived units can be formed, such as units of area, speed, pressure, torque, and so on.
Writing units requires care to avoid misinterpretation. Calibration results written in a calibration certificate must follow the SI rules for writing units. Some of the unit prefixes used in metrology are shown in the table below:
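As a complement to the prefix table, a sketch listing a few common SI prefixes as a lookup table (the selection shown is illustrative, not the document's own table):

```python
# Sketch: a few common SI prefixes as a lookup table and a simple converter.
# The selection of prefixes shown here is illustrative.

SI_PREFIXES = {
    "G": 1e9,   "M": 1e6,   "k": 1e3,
    "m": 1e-3,  "u": 1e-6,  "n": 1e-9,  "p": 1e-12,
}

def apply_prefix(value: float, prefix: str) -> float:
    """Convert a prefixed value to the base unit, e.g. 5 kPa -> 5000 Pa."""
    return value * SI_PREFIXES[prefix]

print(apply_prefix(5, "k"))    # 5000.0  (5 kPa expressed in Pa)
print(apply_prefix(250, "m"))  # 0.25    (250 mV expressed in V)
```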
7. Calibration intervals
A question that often arises in a calibration program concerns the frequency of calibration. Instruments that are used frequently tend to be calibrated more often than instruments that are rarely used. However, this does not apply to electronics-based instruments, since electronic instruments that are rarely used actually tend to deteriorate; such instruments must therefore be warmed up every day for a certain time.
In general, the calibration interval is determined by the following factors:
• Stability of the measuring instrument / measurement material
• Manufacturer's recommendations
• Trends in previous calibration records
• Maintenance and repair records
• Extent and load of usage
• Tendency toward wear and drift
• Cross-checking of results against other measuring equipment
• Environmental conditions
• Desired measurement accuracy
• When the equipment is not working properly
The calibration interval can be expressed as calendar time (for example, once a year), as operating time (for example, 1,000 hours of use), as an amount of usage (for example, 1,000 measurements), or as a combination of these, whichever is reached first.
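A minimal scheduling sketch of this "whichever is reached first" rule (the threshold values are illustrative):

```python
from datetime import date, timedelta

# Sketch: decide whether an instrument is due for calibration under a
# "whichever is reached first" rule. Threshold values are illustrative.

CALENDAR_LIMIT_DAYS = 365      # e.g. once a year
OPERATING_HOURS_LIMIT = 1000   # e.g. 1000 hours of use
USAGE_COUNT_LIMIT = 1000       # e.g. 1000 measurements

def calibration_due(last_cal: date, today: date,
                    hours_used: float, times_used: int) -> bool:
    """True if any of the interval criteria has been reached."""
    return ((today - last_cal).days >= CALENDAR_LIMIT_DAYS
            or hours_used >= OPERATING_HOURS_LIMIT
            or times_used >= USAGE_COUNT_LIMIT)

# Example: last calibrated 200 days ago, 1050 operating hours, used 400 times
print(calibration_due(date.today() - timedelta(days=200), date.today(),
                      hours_used=1050, times_used=400))   # True (hours limit reached)
```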
