
Tensor:

Tensors are of great importance in physics. They have wide-ranging utility in general relativity, mechanics, the elastic properties of matter and, especially, electrodynamics. Tensors generalise two very common entities in physics: vectors and scalars. A scalar is basically a tensor of rank 0, and a vector is a tensor of rank 1. A tensor of rank $n$ in $m$-dimensional space can be considered as an entity with the properties given below:
1. The entity should possess components labelled by n indices, with each index having
values ranging from 1 to m.
2. As a whole, it should have a total of $m^n$ components.
3. Coordinate transformation: The components will transform in a specified manner.
4. The behaviour under coordinate transformation should confirm that physical
observables must not depend on the choice of coordinate frames.
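The component count in point 2 can be sketched numerically. The snippet below is an illustrative check (the helper name `component_labels` is ours, not from the text): it enumerates every index tuple of a rank-$n$ tensor in $m$-dimensional space and confirms there are $m^n$ of them.

```python
# Illustrative sketch: enumerating the components of a rank-n tensor
# in m-dimensional space. Each component is labelled by n indices,
# each index running from 1 to m, giving m**n components in total.
from itertools import product

def component_labels(rank, dim):
    """All index tuples (i1, ..., in), each index from 1 to dim."""
    return list(product(range(1, dim + 1), repeat=rank))

# A rank 2 tensor in 3-D space has 3**2 = 9 components:
labels = component_labels(rank=2, dim=3)
print(len(labels))    # 9
print(labels[:3])     # [(1, 1), (1, 2), (1, 3)]
```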
Let us consider a vector $\vec{A}$,
$$\vec{A} = A_1 \hat{e}_1 + A_2 \hat{e}_2 + A_3 \hat{e}_3$$
A rotational transformation of the coordinate system $x_i$ into a coordinate system $x'_i$, with the same vector $\vec{A}$, can be represented as
$$\vec{A} = A'_1 \hat{e}'_1 + A'_2 \hat{e}'_2 + A'_3 \hat{e}'_3$$
The relation between the components is
$$A'_i = \sum_j \left(\hat{e}'_i \cdot \hat{e}_j\right) A_j$$

Applying the chain rule and taking into consideration the linear relations between the coordinate systems,
$$A'^i = \sum_j \frac{\partial x'^i}{\partial x^j}\, A^j \qquad (1)$$

Now, if the gradient of a scalar $\varphi$ in the coordinate system $x^i$ has the components
$$(\nabla\varphi)_i = \frac{\partial \varphi}{\partial x^i}$$
then for the rotated system,
$$(\nabla\varphi)'_i = \frac{\partial \varphi}{\partial x'^i} = \sum_j \frac{\partial x^j}{\partial x'^i}\,\frac{\partial \varphi}{\partial x^j} \qquad (2)$$

The quantities transforming according to Equation (1) are called contravariant vectors, and
those transforming as per the Equation (2) are called covariant vectors.
We can summarise the notations as
$$A'^i = \sum_j \frac{\partial x'^i}{\partial x^j}\, A^j \quad \text{(contravariant)}$$
$$A'_i = \sum_j \frac{\partial x^j}{\partial x'^i}\, A_j \quad \text{(covariant)}$$
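The transformation rule can be sketched numerically for the simplest case, a rotation of axes. In the sketch below (function names are ours, purely illustrative), the Jacobian $\partial x'^i / \partial x^j$ is just the rotation matrix, and the invariance of the vector's length under the transformation illustrates point 4 of the definition: physical observables do not depend on the choice of frame.

```python
# Illustrative sketch: the contravariant rule (1) for a rotation of
# coordinate axes about the z-axis by angle theta.
import math

def rotate_z(theta):
    """Rotation matrix R with R[i][j] playing the role of dx'_i/dx_j."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c,   s,   0.0],
            [-s,  c,   0.0],
            [0.0, 0.0, 1.0]]

def transform(R, A):
    """Contravariant rule (1): A'_i = sum_j (dx'_i/dx_j) A_j."""
    return [sum(R[i][j] * A[j] for j in range(3)) for i in range(3)]

A = [1.0, 0.0, 0.0]                       # vector along x
Ap = transform(rotate_z(math.pi / 2), A)  # components in rotated frame
# Components change, but the length (a physical observable) does not:
length = math.hypot(Ap[0], Ap[1], Ap[2])
```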

Rank 2 Tensor:
The rank of a tensor depends upon the number of direction cosines or partial derivatives in its defining expression. It can be interpreted as

Entity No. of Partial Derivatives

Scalar 0
Vector 1
Rank 2 Tensor 2

The most convenient way to represent a rank 2 tensor is as a square matrix of order 3×3 holding its components:
$$T = \begin{bmatrix} T_{11} & T_{12} & T_{13} \\ T_{21} & T_{22} & T_{23} \\ T_{31} & T_{32} & T_{33} \end{bmatrix}$$
To summarise, we define a tensor by:
1. Organisation of a system of components by one or more indices.
2. Transformation according to defined rules.
3. A rank given by the number of indices.

Operations on Tensors:
Tensors add and subtract like vectors. Let
$$\mathbf{A} + \mathbf{B} = \mathbf{C}$$
where $\mathbf{A}$, $\mathbf{B}$ and $\mathbf{C}$ are contravariant tensors of rank 2. The addition is characterised componentwise by
$$A^{ij} + B^{ij} = C^{ij}$$
Similarly, for subtraction,
$$\mathbf{A} - \mathbf{B} = \mathbf{D}$$
the componentwise expression is
$$A^{ij} - B^{ij} = D^{ij}$$
The conditions for performing the addition and subtraction operations are:
1. The operands must be tensors of the same rank.
2. They must be in the same space.
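Since addition and subtraction act componentwise, they can be sketched directly on the 3×3 matrix representation (helper names below are illustrative, not from the text):

```python
# Illustrative sketch: componentwise addition and subtraction of rank 2
# tensors, valid only when both operands have the same rank and live in
# the same space (here both are 3x3 matrices of components).
def tensor_add(A, B):
    return [[a + b for a, b in zip(rA, rB)] for rA, rB in zip(A, B)]

def tensor_sub(A, B):
    return [[a - b for a, b in zip(rA, rB)] for rA, rB in zip(A, B)]

A = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
B = [[9, 8, 7], [6, 5, 4], [3, 2, 1]]
C = tensor_add(A, B)   # C[i][j] = A[i][j] + B[i][j]; every entry is 10
D = tensor_sub(A, B)   # D[i][j] = A[i][j] - B[i][j]
```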

Symmetric and Antisymmetric Tensors:

Just as for matrices, the component $A^{ij}$ of a tensor is in general independent of $A^{ji}$. However, in some cases we get dependences; two such cases are symmetric and antisymmetric tensors.
For a symmetric tensor,
$$A^{ij} = A^{ji}$$
For an antisymmetric tensor,
$$A^{ij} = -A^{ji}$$
From here we arrive at an identity,
$$A^{ij} = \frac{1}{2}\left(A^{ij} + A^{ji}\right) + \frac{1}{2}\left(A^{ij} - A^{ji}\right)$$
Here we can clearly see that every rank 2 tensor can be resolved into two parts: one symmetric and the other antisymmetric.
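The decomposition can be sketched in a few lines (function names are ours): the first bracket of the identity gives the symmetric part, the second the antisymmetric part, and their sum reproduces the original tensor exactly.

```python
# Illustrative sketch: resolving a rank 2 tensor into its symmetric and
# antisymmetric parts via A[i][j] = (A[i][j]+A[j][i])/2 + (A[i][j]-A[j][i])/2.
def sym_part(A):
    n = len(A)
    return [[0.5 * (A[i][j] + A[j][i]) for j in range(n)] for i in range(n)]

def antisym_part(A):
    n = len(A)
    return [[0.5 * (A[i][j] - A[j][i]) for j in range(n)] for i in range(n)]

A = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]]
S = sym_part(A)       # S[i][j] ==  S[j][i]
K = antisym_part(A)   # K[i][j] == -K[j][i], zeros on the diagonal
# S[i][j] + K[i][j] recovers A[i][j] for every i, j.
```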

Maxwell Stress Tensor:

The Maxwell stress tensor gives a mathematical relationship between electromagnetic forces and momentum. It is a rank 2 tensor. Its basic use arises when the interactions of charges in magnetic fields become complicated and applying the Lorentz force law directly becomes cumbersome. A tensor operation in these cases turns out to be simpler than solving equations spanning several lines.
Before we start simplifying the cumbersome expressions, let us recall all the equations we will encounter in the derivation.
1. Maxwell's Equations:
Here we will state the equations in differential form, since we are going to use them to derive an expression for the stress tensor.

Gauss's Law: $\nabla \cdot \vec{E} = \dfrac{\rho}{\epsilon_0}$

Gauss's Law for Magnetism: $\nabla \cdot \vec{B} = 0$

Faraday's Law: $\nabla \times \vec{E} = -\dfrac{\partial \vec{B}}{\partial t}$

Ampere's Law (corrected by Maxwell): $\nabla \times \vec{B} = \mu_0 \vec{J} + \mu_0 \epsilon_0 \dfrac{\partial \vec{E}}{\partial t}$

2. Lorentz Force Law:
$$\vec{F} = q\left(\vec{E} + \vec{v}\times\vec{B}\right)$$
Now, from the Lorentz law,
$$\vec{f} = \rho\vec{E} + \vec{J}\times\vec{B}$$
Here, $\vec{f}$ is the force per unit volume.
We can use Maxwell's equations as written above to replace $\rho$ and $\vec{J}$:
$$\vec{f} = \epsilon_0\left(\nabla\cdot\vec{E}\right)\vec{E} + \left(\frac{1}{\mu_0}\,\nabla\times\vec{B} - \epsilon_0\frac{\partial\vec{E}}{\partial t}\right)\times\vec{B}$$
Now,
$$\frac{\partial}{\partial t}\left(\vec{E}\times\vec{B}\right) = \frac{\partial\vec{E}}{\partial t}\times\vec{B} + \vec{E}\times\frac{\partial\vec{B}}{\partial t}$$
We simplify the time-derivative cross-product term using Faraday's equation from the table above,
$$\frac{\partial\vec{B}}{\partial t} = -\nabla\times\vec{E},$$
and putting this in the time-derivative equation,
$$\frac{\partial\vec{E}}{\partial t}\times\vec{B} = \frac{\partial}{\partial t}\left(\vec{E}\times\vec{B}\right) + \vec{E}\times\left(\nabla\times\vec{E}\right)$$

Therefore, replacing this in the core expression,
$$\vec{f} = \epsilon_0\left[\left(\nabla\cdot\vec{E}\right)\vec{E} - \vec{E}\times\left(\nabla\times\vec{E}\right)\right] - \frac{1}{\mu_0}\left[\vec{B}\times\left(\nabla\times\vec{B}\right)\right] - \epsilon_0\frac{\partial}{\partial t}\left(\vec{E}\times\vec{B}\right)$$
We know,
$$\vec{E}\times\left(\nabla\times\vec{E}\right) = \frac{1}{2}\nabla\left(E^2\right) - \left(\vec{E}\cdot\nabla\right)\vec{E}$$
$$\vec{B}\times\left(\nabla\times\vec{B}\right) = \frac{1}{2}\nabla\left(B^2\right) - \left(\vec{B}\cdot\nabla\right)\vec{B}$$
These results follow from the product rule in vector analysis.
Now, adding the term $\left(\nabla\cdot\vec{B}\right)\vec{B}$ (which is zero by Gauss's law for magnetism) to make the expression symmetric in $\vec{E}$ and $\vec{B}$,
$$\vec{f} = \epsilon_0\left[\left(\nabla\cdot\vec{E}\right)\vec{E} + \left(\vec{E}\cdot\nabla\right)\vec{E}\right] + \frac{1}{\mu_0}\left[\left(\nabla\cdot\vec{B}\right)\vec{B} + \left(\vec{B}\cdot\nabla\right)\vec{B}\right] - \frac{1}{2}\nabla\left(\epsilon_0 E^2 + \frac{1}{\mu_0}B^2\right) - \epsilon_0\frac{\partial}{\partial t}\left(\vec{E}\times\vec{B}\right)$$
Now we arrive at an expression for what may be called the force density; however, this equation by no means looks tidy, nor is it easy to use.
We can simplify this cumbersome equation using the Maxwell stress tensor,
$$T_{ij} = \epsilon_0\left(E_i E_j - \frac{1}{2}\delta_{ij}E^2\right) + \frac{1}{\mu_0}\left(B_i B_j - \frac{1}{2}\delta_{ij}B^2\right)$$
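The definition of $T_{ij}$ translates directly into a short numerical sketch. The field values below are arbitrary illustrative numbers, and the function name is ours; the constants are the standard vacuum permittivity and permeability.

```python
# Minimal sketch: evaluating the Maxwell stress tensor
#   T_ij = eps0*(Ei*Ej - 0.5*dij*E^2) + (1/mu0)*(Bi*Bj - 0.5*dij*B^2)
# at a single point in space.
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
MU0  = 1.25663706212e-6   # vacuum permeability, N/A^2

def stress_tensor(E, B):
    E2 = sum(e * e for e in E)
    B2 = sum(b * b for b in B)
    T = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            delta = 1.0 if i == j else 0.0   # Kronecker delta
            T[i][j] = (EPS0 * (E[i] * E[j] - 0.5 * delta * E2)
                       + (B[i] * B[j] - 0.5 * delta * B2) / MU0)
    return T

E = [1.0e3, 0.0, 0.0]    # V/m, field along x (arbitrary values)
B = [0.0, 0.0, 1.0e-3]   # T, field along z (arbitrary values)
T = stress_tensor(E, B)
# T is symmetric: T[i][j] == T[j][i] for all nine components.
```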
Now, we interpret it for the Cartesian system.
For the Cartesian System:
1. The indices $i$ and $j$ run over the coordinates of the Cartesian system, viz. x, y and z.
2. The tensor $T_{ij}$ has a total of nine components.
3. The term $\delta_{ij}$, known as the Kronecker delta, is such that
a. If the indices are the same,
$$\delta_{xx} = \delta_{yy} = \delta_{zz} = 1, \quad \text{i.e. } \delta_{ij} = 1 \text{ for } i = j$$
b. If the indices are different,
$$\delta_{ij} = 0 \text{ for } i \neq j$$
Therefore,
$$T_{xx} = \frac{\epsilon_0}{2}\left(E_x^2 - E_y^2 - E_z^2\right) + \frac{1}{2\mu_0}\left(B_x^2 - B_y^2 - B_z^2\right)$$
And,
$$T_{xy} = \epsilon_0\left(E_x E_y\right) + \frac{1}{\mu_0}\left(B_x B_y\right)$$
0

Representation of Tensor:
A tensor is represented by a double arrow over the symbol, $\overleftrightarrow{T}$, just as a vector is represented by a single right-facing arrow. A tensor carries two indices, unlike a vector, which carries only one index.
This can be summarised as:

Entity Number of Indices Notation
Scalar 0 $\varphi$
Vector 1 $\vec{A}$
Tensor 2 $\overleftrightarrow{T}$

Product of a Tensor with a Vector:

The product of a vector with a tensor, or specifically the dot product of a tensor with a vector, can be taken in two ways.
The first one is
$$\left(\vec{a}\cdot\overleftrightarrow{T}\right)_j = \sum_{i=x,y,z} a_i T_{ij}$$
Similarly, the second one is
$$\left(\overleftrightarrow{T}\cdot\vec{a}\right)_i = \sum_{j=x,y,z} T_{ij} a_j$$
The resultant is a vector, since it has only one index.
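The two forms of the dot product can be sketched as index sums (function names are illustrative); for a symmetric tensor such as the Maxwell stress tensor, the two forms give the same vector.

```python
# Illustrative sketch: the two dot products of a vector with a rank 2
# tensor, each summing over one index and leaving a single free index.
def dot_vec_tensor(a, T):
    """(a . T)_j = sum_i a_i T_ij  -- the first form."""
    return [sum(a[i] * T[i][j] for i in range(3)) for j in range(3)]

def dot_tensor_vec(T, a):
    """(T . a)_i = sum_j T_ij a_j  -- the second form."""
    return [sum(T[i][j] * a[j] for j in range(3)) for i in range(3)]

T = [[1.0, 2.0, 0.0],
     [2.0, 3.0, 0.0],
     [0.0, 0.0, 4.0]]          # a symmetric example tensor
a = [1.0, 1.0, 1.0]
left  = dot_vec_tensor(a, T)   # [3.0, 5.0, 4.0]
right = dot_tensor_vec(T, a)   # equals `left` because T is symmetric
```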

Divergence of the Tensor:

Since the product of a vector and a tensor gives a vector as the resultant, we can attempt to find the divergence of the tensor. This will further simplify the expression for the force density.