Tensors are of great importance in physics. They have wide-ranging utility in general relativity, mechanics, the elastic properties of matter and, especially, electrodynamics. Tensors generalise two very common entities in physics, vectors and scalars: a scalar is a tensor of rank 0, and a vector is a tensor of rank 1. A tensor of rank $n$ in $m$-dimensional space can be considered as an entity with the properties given below:
1. The entity possesses components labelled by $n$ indices, with each index taking values from 1 to $m$.
2. As a whole, it has a total of $m^n$ components.
3. Coordinate transformation: the components transform in a specified manner.
4. The behaviour under coordinate transformation must confirm that physical observables do not depend on the choice of coordinate frame.
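These properties can be spot-checked numerically. The sketch below (a minimal illustration using NumPy; the rotation angle and vector components are arbitrary choices, not from the notes) counts the $m^n$ components of a rank-2 tensor in 3-D and verifies that a rotation changes a vector's components but not its length, which is a physical observable:

```python
import numpy as np

# A rank-n tensor in m-D space has m**n components; here m = 3, n = 2.
m, n = 3, 2
T = np.arange(m**n, dtype=float).reshape(m, m)
assert T.size == m**n  # 9 components

# A rotation about the z-axis (angle chosen arbitrarily for illustration).
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

v = np.array([1.0, 2.0, 3.0])
v_rot = R @ v
# The components change, but the length (an observable) does not.
assert np.isclose(np.linalg.norm(v), np.linalg.norm(v_rot))
```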
Let us consider a vector $\vec{A}$,
$$\vec{A} = A_1 \hat{e}_1 + A_2 \hat{e}_2 + A_3 \hat{e}_3$$
A rotational transformation of the coordinate system $x_j$ into a coordinate system $x'_i$, with the same vector $\vec{A}$, can be represented as
$$\vec{A} = A'_1 \hat{e}'_1 + A'_2 \hat{e}'_2 + A'_3 \hat{e}'_3$$
The relation between the components is
$$A'_i = (\vec{A} \cdot \hat{e}'_i)$$
Applying the chain rule, and taking into consideration the linear relations between the coordinate systems,
$$A'_i = \sum_j \frac{\partial x'_i}{\partial x_j} A_j \qquad (1)$$
Now, if the gradient of a scalar $\varphi$ in the coordinate system has the components
$$(\nabla \varphi)_j = \frac{\partial \varphi}{\partial x_j}$$
then for the rotated system,
$$(\nabla \varphi)'_i = \frac{\partial \varphi}{\partial x'_i} = \sum_j \frac{\partial x_j}{\partial x'_i} \frac{\partial \varphi}{\partial x_j} \qquad (2)$$
The quantities transforming according to Equation (1) are called contravariant vectors, and those transforming as per Equation (2) are called covariant vectors.
We can summarise the notations as
$$A^i \text{ is contravariant:} \qquad A'^i = \sum_j \frac{\partial x'_i}{\partial x_j} A^j$$
$$A_i \text{ is covariant:} \qquad A'_i = \sum_j \frac{\partial x_j}{\partial x'_i} A_j$$
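For a pure rotation the two transformation rules happen to coincide, because the inverse of a rotation matrix is its transpose. The sketch below (a minimal NumPy illustration; the angle and components are arbitrary assumptions) applies both rules to the same components:

```python
import numpy as np

# Jacobian dx'_i/dx_j for a rotation about z (angle chosen arbitrarily).
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

A = np.array([2.0, -1.0, 0.5])

# Contravariant rule (Equation (1)): transform with the Jacobian.
A_contra = R @ A
# Covariant rule (Equation (2)): transform with the inverse Jacobian, transposed.
A_co = np.linalg.inv(R).T @ A
# For orthogonal rotations inv(R).T == R, so the two rules agree.
assert np.allclose(A_contra, A_co)
```

For non-orthogonal coordinate changes (e.g. a stretch of one axis) the two results differ, which is why the distinction matters.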
Rank 2 Tensor:
The rank of a tensor is given by the number of direction cosines or partial differentials in its defining transformation expression. For a rank 2 contravariant tensor, this can be interpreted as
$$A'^{ij} = \sum_{k,l} \frac{\partial x'_i}{\partial x_k} \frac{\partial x'_j}{\partial x_l} A^{kl}$$
The most convenient way to represent a second-rank tensor is to write the components as a square matrix of order 3×3:
$$T = \begin{bmatrix} T_{11} & T_{12} & T_{13} \\ T_{21} & T_{22} & T_{23} \\ T_{31} & T_{32} & T_{33} \end{bmatrix}$$
To summarise, we define a tensor by:
1. Organisation of a system of components by one or more indices.
2. Transformation as per defined rules.
3. A rank given by the number of indices.
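The rank-2 transformation rule, one Jacobian factor per index, can be sketched numerically as below (the rotation angle and tensor entries are arbitrary assumptions for illustration):

```python
import numpy as np

theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# A rank-2 tensor built as the outer product of two vectors.
T = np.outer([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])

# Each index transforms with one factor of the Jacobian:
# T'_{ij} = sum_{k,l} R_{ik} R_{jl} T_{kl}
T_rot = np.einsum('ik,jl,kl->ij', R, R, T)
assert np.allclose(T_rot, R @ T @ R.T)

# The trace is a rank-0 quantity (a scalar) and is invariant under rotation.
assert np.isclose(np.trace(T_rot), np.trace(T))
```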
Operations on Tensors:
Tensors add and subtract similarly to vectors. Let
$$A + B = C$$
where $A$, $B$ and $C$ are contravariant tensors of rank 2. The addition is characterised componentwise by
$$A^{ij} + B^{ij} = C^{ij}$$
Similarly, for subtraction,
$$A - B = D$$
The componentwise expression is
$$A^{ij} - B^{ij} = D^{ij}$$
The conditions for performing the addition and subtraction operations are:
1. The operands must be tensors of the same rank.
2. They must be in the same space.
1. Maxwell's Equations:
Gauss's Law: $\nabla \cdot \vec{E} = \dfrac{\rho}{\varepsilon_0}$
Gauss's Law for Magnetism: $\nabla \cdot \vec{B} = 0$
Faraday's Law: $\nabla \times \vec{E} = -\dfrac{\partial \vec{B}}{\partial t}$
Ampère's Law (corrected by Maxwell): $\nabla \times \vec{B} = \mu_0 \vec{J} + \mu_0 \varepsilon_0 \dfrac{\partial \vec{E}}{\partial t}$
2. Lorentz Law:
$$\vec{F} = q(\vec{E} + \vec{v} \times \vec{B})$$
Now, from the Lorentz law,
$$\vec{f} = \rho \vec{E} + \vec{J} \times \vec{B}$$
Here, $\vec{f}$ is the force per unit volume.
We can use Maxwell's equations as written above to replace $\rho$ and $\vec{J}$:
$$\vec{f} = \varepsilon_0 (\nabla \cdot \vec{E}) \vec{E} + \left( \frac{1}{\mu_0} \nabla \times \vec{B} - \varepsilon_0 \frac{\partial \vec{E}}{\partial t} \right) \times \vec{B}$$
Now,
$$\frac{\partial}{\partial t}(\vec{E} \times \vec{B}) = \frac{\partial \vec{E}}{\partial t} \times \vec{B} + \vec{E} \times \frac{\partial \vec{B}}{\partial t}$$
We simplify the time-derivative cross-product term. Using Faraday's equation from the table above,
$$\frac{\partial \vec{B}}{\partial t} = -\nabla \times \vec{E},$$
and putting this in the time-derivative equation,
$$\frac{\partial \vec{E}}{\partial t} \times \vec{B} = \frac{\partial}{\partial t}(\vec{E} \times \vec{B}) + \vec{E} \times (\nabla \times \vec{E})$$
Therefore, replacing this in the core expression,
$$\vec{f} = \varepsilon_0 \left[ (\nabla \cdot \vec{E}) \vec{E} - \vec{E} \times (\nabla \times \vec{E}) \right] - \frac{1}{\mu_0} \left[ \vec{B} \times (\nabla \times \vec{B}) \right] - \varepsilon_0 \frac{\partial}{\partial t}(\vec{E} \times \vec{B})$$
We know,
$$\vec{E} \times (\nabla \times \vec{E}) = \frac{1}{2} \nabla (E^2) - (\vec{E} \cdot \nabla) \vec{E}$$
$$\vec{B} \times (\nabla \times \vec{B}) = \frac{1}{2} \nabla (B^2) - (\vec{B} \cdot \nabla) \vec{B}$$
These results follow from the product rule in vector analysis.
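The product-rule identity above can be spot-checked numerically with finite differences. The test field below is a hypothetical smooth field chosen only for illustration; the evaluation point is likewise arbitrary:

```python
import numpy as np

def E(r):
    # A hypothetical smooth test field (an assumption, not from the notes).
    x, y, z = r
    return np.array([x * y, y * z + x**2, z * x])

def jacobian(F, r, h=1e-6):
    # J[i, j] = dF_i/dx_j via central differences.
    J = np.zeros((3, 3))
    for j in range(3):
        dr = np.zeros(3); dr[j] = h
        J[:, j] = (F(r + dr) - F(r - dr)) / (2 * h)
    return J

def grad(f, r, h=1e-6):
    # Numerical gradient of a scalar function f.
    return np.array([(f(r + dr) - f(r - dr)) / (2 * h)
                     for dr in np.eye(3) * h])

r = np.array([0.3, -0.7, 1.2])
J = jacobian(E, r)
curl_E = np.array([J[2, 1] - J[1, 2], J[0, 2] - J[2, 0], J[1, 0] - J[0, 1]])

lhs = np.cross(E(r), curl_E)                           # E x (curl E)
rhs = grad(lambda p: 0.5 * E(p) @ E(p), r) - J @ E(r)  # (1/2) grad(E^2) - (E.grad)E
assert np.allclose(lhs, rhs, atol=1e-5)
```

Note that $(\vec{E} \cdot \nabla)\vec{E}$ is computed as the Jacobian of the field acting on the field itself, $J \vec{E}$, since $[(\vec{E} \cdot \nabla)\vec{E}]_i = E_j \, \partial E_i / \partial x_j$.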
Now, adding the term $(\nabla \cdot \vec{B}) \vec{B}$, which is zero by Gauss's law for magnetism, to make the expression symmetric in $\vec{E}$ and $\vec{B}$,
$$\vec{f} = \varepsilon_0 \left[ (\nabla \cdot \vec{E}) \vec{E} + (\vec{E} \cdot \nabla) \vec{E} \right] + \frac{1}{\mu_0} \left[ (\nabla \cdot \vec{B}) \vec{B} + (\vec{B} \cdot \nabla) \vec{B} \right] - \frac{1}{2} \nabla \left( \varepsilon_0 E^2 + \frac{1}{\mu_0} B^2 \right) - \varepsilon_0 \frac{\partial}{\partial t}(\vec{E} \times \vec{B})$$
We have now arrived at an expression for what may be called the force density; however, this equation by no means looks tidy, nor is it easy to use. We can simplify this cumbersome equation using the Maxwell stress tensor,
$$T_{ij} = \varepsilon_0 \left( E_i E_j - \frac{1}{2} \delta_{ij} E^2 \right) + \frac{1}{\mu_0} \left( B_i B_j - \frac{1}{2} \delta_{ij} B^2 \right)$$
Now, we interpret it for the Cartesian system.
For the Cartesian system:
1. The indices $i$ and $j$ run over the coordinates of the Cartesian system, viz. x, y and z.
2. The tensor has a total of nine components.
3. The term $\delta_{ij}$, known as the Kronecker delta, is such that
   a. If the indices are the same, $\delta_{xx} = \delta_{yy} = \delta_{zz} = 1$, i.e. $\delta_{ij} = 1$ for $i = j$.
   b. If the indices are different, $\delta_{ij} = 0$ for $i \neq j$.
Therefore,
$$T_{xx} = \frac{\varepsilon_0}{2} \left( E_x^2 - E_y^2 - E_z^2 \right) + \frac{1}{2\mu_0} \left( B_x^2 - B_y^2 - B_z^2 \right)$$
And,
$$T_{xy} = \varepsilon_0 (E_x E_y) + \frac{1}{\mu_0} (B_x B_y)$$
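The component formulas above can be checked against the general definition of $T_{ij}$. The sketch below builds the full tensor with outer products (the field values at the point are arbitrary illustrative assumptions):

```python
import numpy as np

eps0 = 8.854e-12          # vacuum permittivity (SI)
mu0 = 4e-7 * np.pi        # vacuum permeability (SI)

# Hypothetical field values at a point (assumptions for illustration).
E = np.array([3.0, -1.0, 2.0])
B = np.array([0.5, 0.2, -0.4])

delta = np.eye(3)  # the Kronecker delta as the 3x3 identity
T = (eps0 * (np.outer(E, E) - 0.5 * delta * (E @ E))
     + (np.outer(B, B) - 0.5 * delta * (B @ B)) / mu0)

# Diagonal component matches the expanded expression for T_xx:
Txx = (eps0 / 2 * (E[0]**2 - E[1]**2 - E[2]**2)
       + (B[0]**2 - B[1]**2 - B[2]**2) / (2 * mu0))
assert np.isclose(T[0, 0], Txx)

# Off-diagonal component matches T_xy:
Txy = eps0 * E[0] * E[1] + B[0] * B[1] / mu0
assert np.isclose(T[0, 1], Txy)

# The Maxwell stress tensor is symmetric.
assert np.allclose(T, T.T)
```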
Representation of a Tensor:
A tensor is represented by a double arrow over the symbol, just as a vector is represented by a single right-facing arrow. A rank-2 tensor carries two indices, unlike a vector, which carries only one index.
This can be summarised as:
Entity    Number of Indices    Notation
Scalar    0                    $a$
Vector    1                    $\vec{A}$
Tensor    2                    $\overleftrightarrow{T}$