
INTRODUCTION
Augmented Reality (AR) is a growing area in virtual reality research. The world environment around us provides a wealth of information that is difficult to duplicate in a computer. This is evidenced by the worlds used in virtual environments: either these worlds are very simplistic, such as the environments created for immersive entertainment and games, or the systems that can create a more realistic environment carry a million-dollar price tag, such as flight simulators. An augmented reality system generates a composite view for the user: the real scene as seen by the user, augmented with virtual information generated by the computer. The augmented reality presented to the user enhances that person's performance in and perception of the world. The ultimate goal is to create a system such that the user cannot tell the difference between the real world and the virtual augmentation of it. To the user of this ultimate system it would appear that he is looking at a single real scene.



VIRTUAL REALITY AND AUGMENTED REALITY


Virtual reality is a technology that encompasses a broad spectrum of


ideas. The term is defined as "a computer generated, interactive, three-dimensional
environment in which a person is immersed." There are three key points in this
definition. First, this virtual environment is a computer generated three-
dimensional scene, which requires high performance computer graphics to provide
an adequate level of realism. The second point is that the virtual world is
interactive. A user requires real-time response from the system to be able to
interact with it in an effective manner. The last point is that the user is immersed in
this virtual environment. One of the identifying marks of a virtual reality system is
the head mounted display worn by users. These displays block out all the external
world and present to the wearer a view that is under the complete control of the
computer. The user is completely immersed in an artificial world and becomes
divorced from the real environment. For this immersion to appear realistic the
virtual reality system must accurately sense how the user is moving and determine
what effect that will have on the scene being rendered in the head mounted display.

The discussion above highlights the similarities and differences between


virtual reality and augmented reality systems. A very visible difference between
these two types of systems is the immersiveness of the system. Virtual reality
strives for a totally immersive environment. In contrast, an augmented reality
system is augmenting the real world scene necessitating that the user maintains a
sense of presence in that world. The virtual images are merged with the real view
to create the augmented display. There must be a mechanism to combine the real
and virtual that is not present in other virtual reality work. The computer generated


virtual objects must be accurately registered with the real world in all dimensions.
Errors in this registration will prevent the user from seeing the real and virtual
images as fused. The correct registration must also be maintained while the user
moves about within the real environment. Discrepancies or changes in the apparent
registration will range from distracting, which makes working with the augmented
view more difficult, to physically disturbing for the user, making the system
completely unusable. An immersive virtual reality system must maintain
registration so that changes in the rendered scene match with the perceptions of the
user. Milgram defines the Reality-Virtuality continuum shown as Figure 1.

Fig 1: Milgram's Reality-Virtuality Continuum

The real world and a totally virtual environment are at the two ends of this
continuum with the middle region called Mixed Reality. Augmented reality lies
near the real-world end of the continuum, with the predominant perception being the real
world augmented by computer-generated data. Augmented Virtuality is a term
created by Milgram to identify systems which are mostly synthetic, with some real-world
imagery added, such as texture-mapping video onto virtual objects. This is a
distinction that will fade as the technology improves and the virtual elements in the
scene become less distinguishable from the real ones.



REGISTRATION

The task in the augmented reality system is to register the virtual frame of
reference with what the user is seeing. Registration is more critical in an
augmented reality system because we are more sensitive to visual misalignments
than to the type of vision-kinesthetic errors that might result in a standard virtual
reality system. Figure 2 shows the multiple reference frames that must be related in
an augmented reality system.

Fig 2: Reference frames in an augmented reality system


The scene is viewed by an imaging device, which in this case is depicted as


a video camera. The camera performs a perspective projection of the 3D world
onto a 2D image plane. The generation of the virtual image is done with a standard
computer graphics system. The virtual objects are modeled in an object reference
frame. The graphics system requires information about the imaging of the real
scene so that it can correctly render these objects. This data will control the
synthetic camera that is used to generate the image of the virtual objects. This
image is then merged with the image of the real scene to form the augmented
reality image.
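To make the role of the synthetic camera concrete, the sketch below (not taken from the report) projects a single virtual 3D point with a simple pinhole camera model and draws it onto a video frame. The intrinsic matrix, pose, frame size and the point itself are made-up illustrative values; a real system would take the pose from its tracker and render whole objects with a graphics library.

```python
# Minimal sketch (not from the report): project one virtual 3D point with a
# pinhole camera model and draw it onto a video frame.  The intrinsics, pose,
# frame size and the point itself are made-up illustrative values.
import numpy as np

K = np.array([[800.0,   0.0, 320.0],      # assumed focal lengths / principal point
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                             # camera orientation (from the tracker)
t = np.array([0.0, 0.0, 0.0])             # camera position (from the tracker)

def project(point_world):
    """Perspective projection of a 3D world point to pixel coordinates."""
    p_cam = R @ point_world + t           # world frame -> camera frame
    p_img = K @ p_cam                     # camera frame -> image plane
    return p_img[:2] / p_img[2]           # homogeneous divide

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a captured frame
u, v = project(np.array([0.1, 0.0, 2.0]))         # a virtual point 2 m ahead
if 0 <= int(v) < frame.shape[0] and 0 <= int(u) < frame.shape[1]:
    frame[int(v), int(u)] = (0, 255, 0)           # overlay the virtual point
```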


COMPONENTS OF AN AUGMENTED REALITY SYSTEM

The main components of an augmented reality system are:

1. Head Mounted Display

2. Tracking System (GPS)

3. Mobile Computing Power

Head Mounted Displays

They enable us to view graphics and text created by the augmented reality
system. There are two basic types of head mounted displays being used.

Video See-Through HMD

The "see-through" designation comes from the need for the user to be able to
see the real-world view that is immediately in front of him even when wearing the
HMD. This system blocks out the wearer's surrounding environment, using small
cameras attached to the outside of the goggles to capture images. On the inside of
the display, the video image is played in real time and the graphics are
superimposed on the video. One problem with the use of video cameras is that
there is more lag, meaning that there is a delay in image adjustment when the
viewer moves his or her head.


Monitor Based Display

The head position obtained through the video camera, by the process explained
above, is the input to the graphics system. The graphics system produces the
virtual objects, aligned to the real objects; the virtual objects are then
merged with the real image captured by the video camera and sent to the monitor,
where the combined view is displayed to the user.

Optical See-Through HMD

The optical see-through HMD eliminates the video channel that is looking at
the real scene. Instead, the merging of the real world and the virtual augmentation
is done optically in front of the user.

Comparison of the Displays

There are advantages and disadvantages to each of these types of displays.
With both of the displays that use a video camera to view the real world there is
a forced delay of up to one frame time to perform the video merging operation.
At a standard frame rate of 30 Hz that is potentially a 33.33 millisecond delay in the
view seen by the user. Since everything the user sees is under system control,
compensation for this delay could be made by correctly timing the other paths
in the system; alternatively, if the other paths are slower, the video of the
real scene could be delayed. With an optical see-through display the view of the
real world is instantaneous, so it is not possible to compensate for system delays
in other areas. On the other hand, with monitor based and video see-through
displays a video camera is viewing the real scene. An advantage of this is that
the image generated by the camera is available to the system to provide tracking
information. The optical see-through display does not have this additional
information; the only position information available is what position sensors
mounted on the head mounted display itself can provide. The major
advantage of optical see-through displays is that they could be made very small;
however, the biggest constraint in using this technology is the prohibitive cost.
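One simple way to realize the delay compensation described above is to hold the live video back by a whole number of frames so that it lines up with the slower rendering path. The sketch below is only illustrative; the 30 Hz rate follows from the 33.33 ms figure, while the graphics latency and the blend step are assumptions.

```python
# Illustrative sketch: delay the live video by whole frames so it lines up with
# the slower rendering path.  The graphics latency and blend step are assumed.
from collections import deque

FRAME_TIME_MS = 1000.0 / 30.0            # one frame at a 30 Hz video rate
graphics_latency_ms = 66.0               # assumed tracking + rendering latency

delay_frames = round(graphics_latency_ms / FRAME_TIME_MS)
video_buffer = deque(maxlen=delay_frames + 1)

def blend(real, virtual):
    # placeholder merge step: a real system would key or z-buffer the images
    return virtual if virtual is not None else real

def composite(video_frame, virtual_frame):
    """Hold the live video back so it matches the rendered overlay in time."""
    video_buffer.append(video_frame)
    delayed = video_buffer[0]            # oldest frame once the buffer fills
    return blend(delayed, virtual_frame)
```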

The main components of our system are a backpack computer (with 3D


graphics acceleration), a differential GPS system, a head-worn display
interface (with orientation tracker), and a spread spectrum radio
communication link, all attached to the backpack.


Fig 3: Block diagram of the AR system

The figure above is the block diagram of the AR system. It consists of a backpack
PC which receives two inputs: one from the GPS receiver and the other from the head
mounted display. The signal from the GPS receiver gives the co-ordinates of
the person and the orientation tracker gives the orientation of the head. These two
inputs are transferred through the satellite link to the database server. Based on the
information received, this server sends the relevant data back over the link, which is
then received by the backpack PC. The graphics card generates the virtual objects,
which are then merged with the real environment by the head-worn display interface
and displayed to the user.

Tracking System

The tracking system is usually a GPS receiver, which is discussed in detail in the
following section.

Mobile Computing Power

This is the component of the AR system that generates all the virtual
objects and merges them with the real environment. It also acts as the
communication link between the AR system and the database server.


GLOBAL POSITIONING SYSTEM (GPS)

Where am I? The question seems simple; the answer, historically, has


proved not to be. For centuries, navigators and explorers have searched the
heavens for a system that would enable them to locate their position on the globe
with the accuracy necessary to avoid tragedy and to reach their intended
destinations. On June 26, 1993, however, the answer became as simple as the
question. On that date, the U.S. Air Force launched the 24th Navstar satellite into
orbit, completing a network of 24 satellites known as the Global Positioning
System, or GPS. With a GPS receiver that costs less than a few hundred dollars
you can instantly learn your location on the planet--your latitude, longitude, and
even altitude--to within a few hundred feet.
This incredible new technology was made possible by a combination of scientific
and engineering advances, particularly development of the world's most accurate
timepieces: atomic clocks that are precise to within a billionth of a second. The
clocks were created by physicists seeking answers to questions about the nature of
the universe, with no conception that their technology would some day lead to a
global system of navigation. Today, GPS is saving lives, helping society in
countless other ways, and generating 100,000 jobs in a multi-billion-dollar
industry. It provides a dramatic example of how science works and how basic
research leads to technologies that were virtually unimaginable at the time the
research was done.

GPS Segments

The GPS consists of three major segments: Space, Control, and User.

Space Segment

The space segment consists of 24 operational satellites in six orbital
planes (four satellites in each plane). The satellites operate in circular 20,200 km
orbits at an inclination angle of 55 degrees and with a 12-hour period. The position
is therefore the same at the same sidereal time each day, i.e. the satellites appear 4
minutes earlier each day.

Control Segment

The control segment consists of five monitor stations (Hawaii,
Kwajalein, Ascension Island, Diego Garcia, Colorado Springs), three ground
antennas (Ascension Island, Diego Garcia, Kwajalein), and a Master Control
Station (MCS) located at Schriever AFB in Colorado. The monitor stations
passively track all satellites in view, accumulating ranging data. This information
is processed at the MCS to determine satellite orbits and to update each satellite's
navigation message. Updated information is transmitted to each satellite via the
ground antennas.

User Segment

The user segment consists of antennas and receiver-processors that provide
positioning information to the user.

How GPS Works

1. The basis of GPS is "triangulation" from satellites.
2. To "triangulate," a GPS receiver measures distance using the travel time of
   radio signals.
3. To measure travel time, GPS needs very accurate timing, which it achieves
   with some tricks.
4. Along with distance, we need to know exactly where the satellites are in
   space. High orbits and careful monitoring are the secret.
5. Finally, we must correct for any delays the signal experiences as it travels
   through the atmosphere.

Triangulation from Satellites

Suppose we measure our distance from a satellite and find it to be 11,000


miles. Knowing that we're 11,000 miles from a particular satellite narrows down
all the possible locations we could be in the whole universe to the surface of a
sphere that is centered on this satellite and has a radius of 11,000 miles. Next, say
we measure our distance to a second satellite and find out that it's 12,000 miles
away. That tells us that we're not only on the first sphere but we're also on a sphere
that's 12,000 miles from the second satellite. Or in other words, we're somewhere
on the circle where these two spheres intersect. If we then make a measurement
from a third satellite and find that we're 13,000 miles from that one, that narrows
our position down even further, to the two points where the 13,000 mile sphere
cuts through the circle that's the intersection of the first two spheres. So by ranging
from three satellites we can narrow our position to just two points in space.


To decide which one is our true location we could make a fourth


measurement. But usually one of the two points is a ridiculous answer (either too
far from Earth or moving at an impossible velocity) and can be rejected without a
measurement.
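The sphere-intersection argument can be written down directly. The following sketch (illustrative only, with made-up satellite coordinates and ideal ranges) implements the standard three-sphere trilateration formula and returns the two candidate positions described above; one of them is then rejected as unreasonable.

```python
# Illustrative sketch of three-sphere trilateration (made-up satellite data).
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the two points where three range spheres intersect."""
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey = ey / np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    base = p1 + x * ex + y * ey
    return base + z * ez, base - z * ez          # the two candidate positions

# hypothetical satellite positions (km) and ranges measured by a user near Earth
sats = [np.array([15600.0,  7540.0, 20140.0]),
        np.array([18760.0,  2750.0, 18610.0]),
        np.array([17610.0, 14630.0, 13480.0])]
user = np.array([0.0, 0.0, 6370.0])
ranges = [np.linalg.norm(s - user) for s in sats]
candidate_a, candidate_b = trilaterate(*sats, *ranges)   # one is near `user`
```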

Fig 6

Measuring Distance from a Satellite

The basic problem is in finding the distance of the user from the four
satellites. This can be done if we know the time taken by the signal from the
satellite to reach the receiver. The distance is then given by

    Distance (satellite to receiver) = Travel time of the signal x Speed of light

The time taken by the signal to travel from the satellite to the receiver can be
found by calculating the phase shift of the Pseudo Random Code.
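A quick worked example of this relation, with an assumed travel time of 70 ms (roughly what a signal from a satellite about 20,000 km away would take):

```python
# Worked example of the distance relation; the 70 ms travel time is illustrative.
SPEED_OF_LIGHT = 299_792_458.0           # metres per second
travel_time_s = 0.07
distance_m = travel_time_s * SPEED_OF_LIGHT
print(f"{distance_m / 1000:.0f} km")     # about 20,985 km

# The same relation shows why timing matters: a 1 ms clock error corresponds
# to 0.001 * SPEED_OF_LIGHT, i.e. roughly 300 km (almost 200 miles) of error.
```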

Pseudo Random Code


The Pseudo Random Code is a fundamental part of GPS. Physically it's just a
very complicated digital code, or in other words, a complicated sequence of "on"
and "off" pulses. The signal is so complicated that it almost looks like random
electrical noise. Hence the name "Pseudo-Random." There are several good
reasons for that complexity: First, the complex pattern helps make sure that the
receiver doesn't accidentally sync up to some other signal. The patterns are so
complex that it's highly unlikely that a stray signal will have exactly the same
shape. Since each satellite has its own unique Pseudo-Random Code this
complexity also guarantees that the receiver won't accidentally pick up another
satellite's signal. So all the satellites can use the same frequency without jamming
each other. And it makes it more difficult for a hostile force to jam the system. We
assume that both the satellite and the receiver start generating their codes at exactly
the same time. Distance to a satellite is determined by measuring how long a radio
signal takes to reach us from that satellite.

To make the measurement we assume that both the satellite and our receiver
are generating the same pseudo-random codes at exactly the same time.

1. By comparing how late the satellite's pseudo-random code appears compared to
   our receiver's code, we determine how long it took to reach us.
2. Multiply that travel time by the speed of light and you've got distance.
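The comparison in step 1 is essentially a correlation: the receiver slides its own copy of the code against the received copy and looks for the shift at which they match best. The toy sketch below uses a random ±1 sequence and an arbitrary delay rather than a real C/A code, and ignores the carrier, noise, and the fact that the code only resolves delay within one code period.

```python
# Toy sketch of code-phase measurement by correlation (not a real C/A code).
import numpy as np

CHIP_RATE = 1_023_000.0                        # C/A code chips per second
rng = np.random.default_rng(0)
code = rng.integers(0, 2, 1023) * 2 - 1        # stand-in +/-1 pseudo-random code

true_delay_chips = 347                         # unknown to the receiver
received = np.roll(code, true_delay_chips)     # delayed copy from the satellite

# Slide our local copy against the received signal; the best match is the delay.
correlations = [np.dot(received, np.roll(code, k)) for k in range(len(code))]
estimated_delay_chips = int(np.argmax(correlations))
travel_time_s = estimated_delay_chips / CHIP_RATE
```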

But how do we make sure everybody is perfectly synchronized?


If measuring the travel time of a radio signal is the key to GPS, then our stop
watches had better be darn good, because if their timing is off by just a thousandth
of a second, at the speed of light, that translates into almost 200 miles of error!

On the satellite side, timing is almost perfect because they have incredibly
precise atomic clocks on board.

Atomic clocks don't run on atomic energy. They get the name because they
use the oscillations of a particular atom as their "metronome." This form of timing
is the most stable and accurate reference man has ever developed.

Remember that both the satellite and the receiver need to be able to precisely
synchronize their pseudo-random codes to make the system work. If our receivers
needed atomic clocks (which cost a great deal), nobody could afford them. The secret to
perfect timing is to make an extra satellite measurement. If our receiver's clock
were perfect, then all our satellite ranges would intersect at a single point (which is
our position). But with imperfect clocks, a fourth measurement, done as a
crosscheck, will NOT intersect with the first three. Since any offset from universal
time will affect all of our measurements, the receiver looks for a single correction
factor that it can subtract from all its timing measurements that would cause them
all to intersect at a single point. That correction brings the receiver's clock back
into sync with universal time. Once it has that correction it applies it to all the rest of
its measurements, and now we have a precise position with an accuracy of about 3-6 meters.
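A common way to express this "extra measurement" idea is a small least-squares solve for the three position coordinates plus the receiver clock bias. The sketch below is a generic linearized solver, not the algorithm of any particular receiver; the satellite positions and pseudoranges would come from the navigation message and the code-phase measurements, and the function and variable names are my own.

```python
# Generic sketch: linearized least-squares for position plus clock bias,
# given four or more pseudoranges (assumed inputs, illustrative only).
import numpy as np

C = 299_792_458.0                              # speed of light, m/s

def solve_position(sat_positions, pseudoranges, iterations=10):
    x = np.zeros(3)                            # initial position guess
    b = 0.0                                    # clock bias expressed in metres
    for _ in range(iterations):
        ranges = np.linalg.norm(sat_positions - x, axis=1)
        residuals = pseudoranges - (ranges + b)
        # Jacobian: unit vectors from receiver to satellites, plus a clock column
        H = np.hstack([-(sat_positions - x) / ranges[:, None],
                       np.ones((len(ranges), 1))])
        dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        x += dx[:3]
        b += dx[3]
    return x, b / C                            # position (m), clock bias (s)
```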

Precise Positioning System (PPS)

• Authorized users with cryptographic equipment and keys and specially
  equipped receivers use the Precise Positioning System. U.S. and Allied
  military, certain U.S. Government agencies, and selected civil users
  specifically approved by the U.S. Government can use the PPS.
• PPS Predictable Accuracy:
  o 22 meter horizontal accuracy
  o 27.7 meter vertical accuracy

Standard Positioning System (SPS)

• Civil users worldwide use the SPS without charge or restrictions. Most
  receivers are capable of receiving and using the SPS signal. The SPS
  accuracy is intentionally degraded by the DOD by the use of Selective
  Availability.
• SPS Predictable Accuracy:
  o 100 meter horizontal accuracy
  o 156 meter vertical accuracy

GPS Satellite Signals

• The SVs transmit two microwave carrier signals. The L1 frequency (1575.42
  MHz) carries the navigation message and the SPS code signals. The L2
  frequency (1227.60 MHz) is used to measure the ionospheric delay by PPS
  equipped receivers.
• Three binary codes shift the L1 and/or L2 carrier phase.
  o The C/A Code (Coarse Acquisition) modulates the L1 carrier phase.
    The C/A code is a repeating 1 MHz Pseudo Random Noise (PRN)
    code. This noise-like code modulates the L1 carrier signal,
    "spreading" the spectrum over a 1 MHz bandwidth. The C/A code
    repeats every 1023 bits (one millisecond). There is a different C/A
    code PRN for each SV. GPS satellites are often identified by their
    PRN number, the unique identifier for each pseudo-random-noise
    code. The C/A code that modulates the L1 carrier is the basis for the
    civil SPS.
  o The P-Code (Precise) modulates both the L1 and L2 carrier phases.
    The P-Code is a very long (seven days) 10 MHz PRN code. The P-Code is
    encrypted into the Y-Code. The encrypted Y-Code requires a
    classified AS Module for each receiver channel and is for use only by
    authorized users with cryptographic keys. The P(Y)-Code is the basis
    for the PPS.
  o The Navigation Message also modulates the L1 C/A code signal. The
    Navigation Message is a 50 Hz signal consisting of data bits that
    describe the GPS satellite orbits, clock corrections, and other system
    parameters.

Navigation Message

• The GPS Navigation Message consists of time-tagged data bits marking the
  time of transmission of each sub-frame at the time they are transmitted by
  the SV. A data bit frame consists of 1500 bits divided into five 300-bit
  sub-frames. A data frame is transmitted every thirty seconds. Three six-second
  sub-frames contain orbital and clock data. SV clock corrections are sent in
  sub-frame one and precise SV orbital data sets (ephemeris data parameters)
  for the transmitting SV are sent in sub-frames two and three. Sub-frames four
  and five are used to transmit different pages of system data. An entire set of
  twenty-five frames (125 sub-frames) makes up the complete Navigation
  Message that is sent over a 12.5 minute period.


• Data frames (1500 bits) are sent every thirty seconds. Each frame consists of
  five sub-frames.
• Data bit sub-frames (300 bits transmitted over six seconds) contain parity
  bits that allow for data checking and limited error correction.
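The timing figures quoted above are mutually consistent, as this small check shows (pure arithmetic, no assumptions beyond the numbers already given):

```python
# Quick consistency check of the navigation-message timing quoted above.
bits_per_subframe = 300
subframes_per_frame = 5
frame_bits = bits_per_subframe * subframes_per_frame    # 1500 bits per frame
frame_seconds = 30.0
bit_rate_hz = frame_bits / frame_seconds                 # 50 bits per second
full_message_minutes = 25 * frame_seconds / 60.0         # 12.5 minutes
```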

Factors Affecting GPS Accuracy:

1. Ionosphere and troposphere delays: The signal from the satellite slows down
while passing through the ionosphere and troposphere before reaching the receiver.
The GPS system uses a built-in model that calculates the average amount of delay.

2. Signal multipath: On its way from the satellite, the signal may get reflected
off high buildings or other objects before it reaches the receiver, causing an
extra timing error.

3. Orbital errors: These errors are also known as ephemeris errors. They are
inaccuracies in the satellites' reported positions while orbiting the earth.

4. Receiver clock errors: Since an ordinary quartz clock is used in the GPS receiver,
it introduces certain timing errors.

Satellite Geometry and Geometric Dilution of Precision (GDOP)

There are usually more satellites available than a receiver needs to fix a
position, so the receiver picks a few and ignores the rest. These picks should be as
far from each other in space as possible, because for maximum accuracy the spheres
should intersect at almost right angles.


The accuracy achieved by this GPS system is 3-6 meters, but for augmented
reality an accuracy of centimeters is required. For that purpose we use Differential
GPS systems.

Differential GPS (DGPS):

It is a way to correct various inaccuracies. It involves the co-operation of
two receivers, one of which is stationary while the other one roves with the user. The
stationary receiver ties all the satellite measurements into a solid local reference.
The reference receiver is placed at a point that has been very accurately surveyed. It
receives the GPS signal and works in reverse order: instead of using the timing
signal to calculate its position, it uses its known position to calculate what time the
signal should have taken, and compares that with the time the signal actually took
to travel from the satellite to the receiver. This error information is then transferred
to the roving receiver. At a particular moment the stationary receiver doesn't know
which satellites are being used by the roving receiver, so it calculates the timing
error for all 24 satellites and sends this information to the roving receiver. The
roving receiver then uses the corrections it needs to compute its correction factor
and ignores the rest of the information.
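In outline, the reference station turns its surveyed position into per-satellite range corrections, and the rover subtracts the corrections for the satellites it is actually tracking. The sketch below is a simplified illustration of that bookkeeping; real DGPS messages carry additional fields such as correction rates and time stamps, and the names here are my own.

```python
# Simplified sketch of DGPS range corrections (illustrative, names assumed).
import numpy as np

def reference_corrections(surveyed_pos, sat_positions, measured_ranges):
    """At the fixed reference receiver: the true range to each satellite is
    known from the surveyed position, so the difference from the measured
    range is that satellite's error."""
    true_ranges = np.linalg.norm(sat_positions - surveyed_pos, axis=1)
    return measured_ranges - true_ranges       # one correction per satellite

def apply_corrections(rover_ranges, corrections, used_sat_ids, all_sat_ids):
    """At the roving receiver: subtract the broadcast correction for each
    satellite actually being used and ignore the rest."""
    idx = [all_sat_ids.index(s) for s in used_sat_ids]
    return rover_ranges - corrections[idx]
```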


APPLICATIONS OF AUGMENTED REALITY

Entertainment

A simple form of augmented reality has been in use in the entertainment and
news business for quite some time. Whenever you are watching the evening
weather report the weather reporter is shown standing in front of changing weather
maps. In the studio the reporter is actually standing in front of a blue or green
screen. This real image is augmented with computer-generated maps using a
technique called chroma keying.
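Chroma keying itself is a simple per-pixel operation: pixels that are "green enough" in the studio image are replaced by the corresponding pixels of the computer-generated map. The sketch below is a minimal numpy version with an arbitrary threshold; broadcast systems use far more careful matting and edge blending.

```python
# Minimal numpy sketch of chroma keying; the threshold value is arbitrary.
import numpy as np

def chroma_key(foreground, background, threshold=60):
    """Replace strongly green pixels of the studio shot with the virtual map.
    Both images are HxWx3 uint8 arrays of the same size."""
    fg = foreground.astype(np.int16)
    greenness = fg[..., 1] - np.maximum(fg[..., 0], fg[..., 2])
    mask = greenness > threshold               # True where the green screen shows
    out = foreground.copy()
    out[mask] = background[mask]
    return out
```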

Military Training

The military has been using displays in cockpits that present information to
the pilot on the windshield of the cockpit or the visor of their flight helmet. This is
a form of augmented reality display. SIMNET, a distributed war games simulation
system, is also embracing augmented reality technology. By equipping military
personnel with helmet mounted visor displays or a special purpose rangefinder the
activities of other units participating in the exercise can be imaged. While looking
at the horizon, for example, the display-equipped soldier could see a helicopter
rising above the tree line. This helicopter could be flown in simulation by
another participant. In wartime, the display of the real battlefield scene could be
augmented with annotation information or highlighting to emphasize hidden
enemy units.

Engineering Design

Imagine that a group of designers are working on the model of a complex


device for their clients. The designers and clients want to do a joint design review

even though they are physically separated. If each of them had a conference room
that was equipped with an augmented reality display this could be accomplished.
The physical prototype that the designers have mocked up is imaged and displayed
in the client's conference room in 3D. The clients can walk around the display
looking at different aspects of it. To hold discussions the client can point at the
prototype to highlight sections and this will be reflected on the real model in the
augmented display that the designers are using. Or perhaps in an earlier stage of
the design, before a prototype is built, the view in each conference room is
augmented with a computer-generated image of the current design built from the
CAD files describing it. This would allow real time interaction with elements of
the design so that either side can make adjustments and changes that are reflected
in the view seen by both groups.

Maintenance and Repair of Equipment

When the maintenance technician approaches a new or unfamiliar piece of
equipment, instead of opening several repair manuals they could put on an
augmented reality display. In this display the image of the equipment would be
augmented with annotations and information pertinent to the repair. For example,
the location of fasteners and attachment hardware that must be removed would be
highlighted, and an inside view of the machine would highlight the boards that
need to be replaced. A wireless system worn by the personnel, attached to an
optical see-through display, allows the technician to access repair manuals and
images of the equipment. Future versions might register those images on the live
scene and provide animation to show the procedures that must be performed.


Consumer Design

Virtual reality systems are already used for consumer design. Using perhaps
more of a graphics system than virtual reality, when you go to the typical home
store wanting to add a new deck to your house, they will show you a graphical
picture of what the deck will look like. It is conceivable that a future system would
allow you to bring a video tape of your house shot from various viewpoints in your
backyard and in real time it would augment that view to show the new deck in its
finished form attached to your house. Or bring in a tape of your current kitchen and
the augmented reality processor would replace your current kitchen cabinetry with
virtual images of the new kitchen that you are designing.

Applications in the fashion and beauty industry that would benefit from an
augmented reality system can also be imagined. If the dress store does not have a
particular style dress in your size an appropriate sized dress could be used to
augment the image of you. As you looked in the three-sided mirror you would see
the image of the new dress on your body. Changes in hem length, shoulder styles
or other particulars of the design could be viewed on you before you place the
order. When you head into some high-tech beauty shops today you can see what a
new hairstyle would look like on a digitized image of yourself. But with an
advanced augmented reality system you would be able to see the view as you
moved. If the dynamics of hair were included in the description of the virtual
object you would also see the motion of your hair as your head moved.


Education and Sightseeing

Tourists and students could use these systems to learn more about a certain
historical event. Imagine walking onto a Civil War battlefield and seeing a re-
creation of historical events on a head-mounted, augmented-reality display. It
would immerse you in the event, and the view would be panoramic.

Gaming

How cool would it be to take video games outside? The game could be
projected onto the real world around you, and you could, literally, be in it as one of
the characters. When one uses this system, the game surrounds him as he walks
across campus.

There are hundreds of potential applications for such a technology, gaming


and entertainment being the most obvious ones. Any system that gives people
instant information, requiring no research on their part, is bound to be valuable to
anyone in pretty much any field. Augmented-reality systems will instantly
recognize what someone is looking at, and retrieve and display the data related to
that view.


PERFORMANCE REQUIREMENTS OF AUGMENTED REALITY SYSTEMS



Augmented reality systems are expected to run in real-time so that a user will
be able to move about freely within the scene and see a properly rendered
augmented image. This places two performance criteria on the system. They are:

• Update rate for generating the augmenting image,
• Accuracy of the registration of the real and virtual image.

Visually the real-time constraint is manifested in the user viewing an


augmented image in which the virtual parts are rendered without any visible jumps.
To appear without any jumps, a standard rule of thumb is that the graphics system
must be able to render the virtual scene at least 10 times per second. This is well
within the capabilities of current graphics systems for simple to moderate graphics
scenes. For the virtual objects to realistically appear part of the scene more
photorealistic graphics rendering is required. The current graphics technology does
not support fully lit, shaded and ray-traced images of complex scenes. Fortunately,
there are many applications for augmented reality in which the virtual part is either
not very complex or will not require a high level of photorealism.

Failures in the second performance criterion have two possible causes. One
is a misregistration of the real and virtual scene because of noise in the system. The
position and pose of the camera with respect to the real scene must be sensed. Any
noise in this measurement has the potential to be exhibited as errors in the
registration of the virtual image with the image of the real scene. Fluctuations of
values while the system is running will cause jittering in the viewed image. As
mentioned previously, our visual system is very sensitive to visual errors, which in
this case would be the perception that the virtual object is not stationary in the real

scene or is incorrectly positioned. Misregistrations of even a pixel can be detected


under the right conditions. The second cause of misregistration is time delays in
the system. As mentioned in the previous paragraph, a minimum cycle time of 0.1
seconds is needed for acceptable real-time performance. If there are delays in
calculating the camera position or the correct alignment of the graphics camera
then the augmented objects will tend to lag behind motions in the real scene. The
system design should minimize the delays to keep overall system delay within the
requirements for real-time performance.
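A back-of-the-envelope calculation shows why such delays matter. With an assumed head rotation rate, display field of view and resolution (all illustrative values), a 0.1 second delay already moves a virtual object by tens of pixels relative to the real scene:

```python
# Back-of-the-envelope lag estimate; head rate, field of view and resolution
# are assumed illustrative values.
head_rate_deg_s = 50.0                   # moderate head rotation speed
system_delay_s = 0.1                     # the 0.1 s cycle time mentioned above
fov_deg, width_px = 40.0, 640.0          # assumed display field of view / width

angular_error_deg = head_rate_deg_s * system_delay_s     # 5 degrees of lag
pixel_error = angular_error_deg * (width_px / fov_deg)   # about 80 pixels
```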


CONCLUSION

Though Augmented Reality is in a nascent stage and is still not in mass use, we
feel that with the growing research in AR and the shrinking size and complexity of
AR systems, the time is not far away when everybody will own his own AR system.


REFERENCES

1. R. Azuma, "A Survey of Augmented Reality."
2. P. Milgram and F. Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Transactions on Information and Systems.
3. "Mediated Reality with Implementations for Everyday Life," Presence Connect, the online companion to the MIT Press journal Presence, 6 August 2002.
4. "Experiences and Observations in Applying Augmented Reality to Live Training."
5. R. Raskar, G. Welch, and H. Fuchs, "Spatially Augmented Reality," First International Workshop on Augmented Reality, September 1998.
6. D. Drascic (University of Toronto), ARGOS: A Display System for Augmenting Reality.
