
Introduction to vectors and tensors

Vector space
A vector space (or linear space) is a set (say, V) equipped with
addition (a map from V × V to V) and scalar multiplication (a map from ℝ × V to V)
satisfying the following conditions:
1. Commutativity: u + v = v + u, ∀ u, v ∈ V.
2. Associativity: (u + v) + w = u + (v + w), ∀ u, v, w ∈ V.
3. Existence of a zero element: u + 0 = u, where 0 ∈ V.
4. Existence of negative elements: if u ∈ V then −u ∈ V, and u + (−u) = 0.
5. Distributivity w.r.t. addition of vectors: α(u + v) = αu + αv,
   ∀ α ∈ ℝ and u, v ∈ V.
6. Distributivity w.r.t. scalar addition: (α + β)u = αu + βu,
   ∀ α, β ∈ ℝ and u ∈ V.
7. Associativity w.r.t. scalar multiplication: α(βu) = (αβ)u,
   ∀ α, β ∈ ℝ and u ∈ V.
8. Identity in scalar multiplication: 1u = u, ∀ u ∈ V.
Vectors in ℝⁿ
Vectors in the n-dimensional vector space ℝⁿ can be represented as n
ordered real numbers:
u := {(u_1, u_2, ..., u_n) | u_i ∈ ℝ}.
Addition:
u + v := (u_1 + v_1, u_2 + v_2, ..., u_n + v_n).
Scalar multiplication:
αu := (αu_1, αu_2, ..., αu_n), α ∈ ℝ.
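These componentwise operations can be sketched numerically. Numpy is an assumption here for illustration; the notes themselves are library-independent:

```python
import numpy as np

# Vectors in R^3 as ordered triples of real numbers.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
alpha = 2.0

w = u + v        # (u_1 + v_1, u_2 + v_2, u_3 + v_3)
s = alpha * u    # (alpha u_1, alpha u_2, alpha u_3)
```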
A subset {u_1, u_2, ..., u_m} of a vector space V is linearly dependent iff
there exist scalars α_1, α_2, ..., α_m (not all zero) such that
α_1 u_1 + α_2 u_2 + ... + α_m u_m = 0.
A subset {u_1, u_2, ..., u_m} of a vector space V is linearly independent
iff
α_1 u_1 + α_2 u_2 + ... + α_m u_m = 0
implies α_1 = 0, α_2 = 0, ..., α_m = 0.
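A practical test of linear independence, sketched with numpy (an assumption, not part of the notes): the set is independent exactly when the matrix with the u_i as columns has full column rank m.

```python
import numpy as np

def is_independent(vectors):
    # alpha_1 u_1 + ... + alpha_m u_m = 0 has only the trivial solution
    # exactly when the column matrix of the u_i has rank m.
    A = np.column_stack(vectors)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

ind = is_independent([np.array([1.0, 0.0, 0.0]),
                      np.array([0.0, 1.0, 0.0])])
dep = is_independent([np.array([1.0, 2.0, 3.0]),
                      np.array([2.0, 4.0, 6.0])])  # second = 2 * first
```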
A subset {e_1, e_2, ..., e_n} of a vector space V forms a basis iff
{e_1, e_2, ..., e_n} is linearly independent.
Then we can write
u = u_1 e_1 + u_2 e_2 + ... + u_n e_n,
where the scalars u_1, u_2, ..., u_n are the components of u ∈ V.
In practice we always take the natural basis, i.e.,
e_1 = (1, 0, 0, ..., 0),
e_2 = (0, 1, 0, ..., 0),
...
e_n = (0, 0, 0, ..., 1).
Einstein's summation convention
Let u ∈ ℝ³. Then
u = u_1 e_1 + u_2 e_2 + u_3 e_3 = Σ_{i=1}^{3} u_i e_i = u_i e_i,
where i = 1 ... 3.
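The summation convention can be made executable with np.einsum (numpy assumed for illustration): a repeated index in the subscript string is summed over.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
e = np.eye(3)   # row i is the natural basis vector e_i

# u = u_i e_i: the repeated index i is summed over.
recovered = np.einsum('i,ij->j', u, e)
```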
Vectors in ℝ³
Let V denote the 3-D Euclidean space ℝ³. Let {e_1, e_2, e_3} be the
fixed Cartesian basis.
We write
e_i · e_j = δ_ij,
where δ_ij is called the Kronecker delta and is defined by
δ_ij = 0 when i ≠ j,
     = 1 when i = j,
where i = 1 ... 3 and j = 1 ... 3.
The Kronecker delta is known as the substitution operator, since from
the definition we see
x_i = δ_ij x_j, and A_ij = A_ik δ_kj.
Please note that δ_ij = δ_ji (i.e., symmetric), and
δ_ii = δ_11 + δ_22 + δ_33 = 3.
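In components the Kronecker delta is just the 3×3 identity matrix, so the substitution and contraction properties above can be checked directly (numpy assumed for illustration):

```python
import numpy as np

delta = np.eye(3)                              # delta_ij
x = np.array([4.0, 5.0, 6.0])

substituted = np.einsum('ij,j->i', delta, x)   # delta_ij x_j = x_i
trace = np.einsum('ii->', delta)               # delta_ii = 3
```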
We know that u = u_i e_i = (u · e_i)e_i, and v = v_j e_j. The inner product
(u, v) = u · v = u_i e_i · v_j e_j = u_i v_j (e_i · e_j)
       = u_i v_j δ_ij   (since e_i · e_j = δ_ij)
       = u_i v_i        (since δ_ij is a substitution operator)
       = u_1 v_1 + u_2 v_2 + u_3 v_3.
Note that indices must not be repeated more than twice.
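The component form u_i v_i agrees with the usual dot product; a one-line check (numpy assumed):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# u_i v_i = u_1 v_1 + u_2 v_2 + u_3 v_3
dot_index = np.einsum('i,i->', u, v)
```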
Vector cross-product
e_j × e_k := ε_ijk e_i, where ε_ijk is called the alternating tensor (or
permutation tensor) and takes the following values:
ε_123 = ε_231 = ε_312 = 1,
ε_132 = ε_213 = ε_321 = −1,
ε_ijk = 0 otherwise.
Note that
e_i · (e_j × e_k) = e_i · (ε_mjk e_m) = ε_mjk (e_i · e_m) = ε_mjk δ_im = ε_ijk.
So
c = a × b = a_j e_j × b_k e_k = a_j b_k (e_j × e_k) = a_j b_k ε_ijk e_i,
i.e.,
c_i = ε_ijk a_j b_k.
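Building ε_ijk explicitly and contracting it against a_j b_k reproduces the cross product (numpy assumed; np.cross is used only as an independent cross-check):

```python
import numpy as np

# Permutation tensor: +1 on even permutations, -1 on odd, 0 otherwise.
eps = np.zeros((3, 3, 3))
for (i, j, k) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

a = np.array([1.0, 0.0, 0.0])            # e_1
b = np.array([0.0, 1.0, 0.0])            # e_2
c = np.einsum('ijk,j,k->i', eps, a, b)   # c_i = eps_ijk a_j b_k
```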
Scalar triple product
Definition: [u, v, w] := u · (v × w) = u_i e_i · ε_mjk v_j w_k e_m
= ε_mjk u_i v_j w_k (e_i · e_m) = ε_mjk u_i v_j w_k δ_im = ε_ijk u_i v_j w_k.
[u, v, w] = ε_ijk u_i v_j w_k =
| u_1  u_2  u_3 |
| v_1  v_2  v_3 |
| w_1  w_2  w_3 |
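The determinant form of the triple product can be verified numerically (numpy assumed):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0])
w = np.array([0.0, 0.0, 3.0])

triple = np.dot(u, np.cross(v, w))              # u . (v x w)
det_form = np.linalg.det(np.array([u, v, w]))   # determinant with rows u, v, w
```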
ε_ijk = [e_i, e_j, e_k] =
| e_i·e_1  e_i·e_2  e_i·e_3 |     | δ_i1  δ_i2  δ_i3 |
| e_j·e_1  e_j·e_2  e_j·e_3 |  =  | δ_j1  δ_j2  δ_j3 |
| e_k·e_1  e_k·e_2  e_k·e_3 |     | δ_k1  δ_k2  δ_k3 |
ε_ijk ε_pqr =
| δ_i1  δ_i2  δ_i3 |   | δ_p1  δ_p2  δ_p3 |
| δ_j1  δ_j2  δ_j3 | · | δ_q1  δ_q2  δ_q3 |
| δ_k1  δ_k2  δ_k3 |   | δ_r1  δ_r2  δ_r3 |

=
| δ_i1  δ_i2  δ_i3 |   | δ_p1  δ_q1  δ_r1 |
| δ_j1  δ_j2  δ_j3 | · | δ_p2  δ_q2  δ_r2 |   (since det T = det Tᵗ)
| δ_k1  δ_k2  δ_k3 |   | δ_p3  δ_q3  δ_r3 |

Multiplying the two matrices (since det A det B = det(AB)):
=
| δ_im δ_mp   δ_im δ_mq   δ_im δ_mr |     | δ_ip  δ_iq  δ_ir |
| δ_jm δ_mp   δ_jm δ_mq   δ_jm δ_mr |  =  | δ_jp  δ_jq  δ_jr |
| δ_km δ_mp   δ_km δ_mq   δ_km δ_mr |     | δ_kp  δ_kq  δ_kr |
We can show that
ε_ijk ε_iqr = δ_jq δ_kr − δ_jr δ_kq,
ε_ijk ε_ijr = 2δ_kr,
ε_ijk ε_ijk = 6.
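These contraction identities are easy to verify by brute force over all index values (numpy assumed for illustration):

```python
import numpy as np

eps = np.zeros((3, 3, 3))
for (i, j, k) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0
delta = np.eye(3)

# eps_ijk eps_iqr = delta_jq delta_kr - delta_jr delta_kq
lhs = np.einsum('ijk,iqr->jkqr', eps, eps)
rhs = (np.einsum('jq,kr->jkqr', delta, delta)
       - np.einsum('jr,kq->jkqr', delta, delta))
ok_eps_delta = np.allclose(lhs, rhs)

ok_double = np.allclose(np.einsum('ijk,ijr->kr', eps, eps), 2.0 * delta)
total = np.einsum('ijk,ijk->', eps, eps)   # full contraction
```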
Second order tensor
A second order tensor is a linear transformation that transforms a
vector into another vector, i.e.,
v = Tu,
where T ∈ Lin (the set of all second order tensors) is a second order
tensor, and u, v ∈ V.
Note that u and v each have three independent components, while T has nine.
Difference between a matrix and a tensor: the existence of a tensor is
independent of the coordinate system.
The components of a tensor form a matrix (note that the matrix will change
with the coordinate system).
Properties of a second order tensor
Linear transformation: T(αu + βv) = αTu + βTv, ∀ T ∈ Lin,
u, v ∈ V, and α, β ∈ ℝ.
T0 = 0.
(T + R)u = Tu + Ru, ∀ T, R ∈ Lin, u ∈ V.
Equality: T, R ∈ Lin are equal if Tu = Ru ∀ u ∈ V.
Zero tensor: if Zu = 0 ∀ u ∈ V, then Z ∈ Lin is called the zero
tensor and satisfies
(T + Z)u = Tu ∀ T ∈ Lin and u ∈ V.
Identity tensor (denoted by 1 or I): 1u := u ∀ u ∈ V.
Components of a second order tensor
We had v = Tu. Choosing u = e_1, e_2 and e_3 we write (all are column
vectors)
Te_1 = α_1 e_1 + α_2 e_2 + α_3 e_3,
Te_2 = α_4 e_1 + α_5 e_2 + α_6 e_3,
Te_3 = α_7 e_1 + α_8 e_2 + α_9 e_3.
Now choosing α_1 = T_11, α_2 = T_21, ..., α_9 = T_33 (e.g.,
Te_1 = T_11 e_1 + T_21 e_2 + T_31 e_3) and writing them in index notation we
get
Te_j = T_ij e_i.
Note that e_i · Te_j = e_i · T_kj e_k = T_kj δ_ik = T_ij,
i.e.,
T_ij = e_i · Te_j.
1_ij = e_i · 1e_j = e_i · e_j = δ_ij.
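The formula T_ij = e_i · Te_j picks a single component out of the matrix of T (numpy assumed for illustration):

```python
import numpy as np

T = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
e = np.eye(3)   # row i is e_i

T12 = e[0] @ (T @ e[1])   # e_1 . (T e_2) = T_12
```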
Transpose: defined as Tᵗu · v := u · Tv ∀ u, v ∈ V.
(Tᵗ)_ij = e_i · Tᵗe_j = T_ji.
(Tᵗ)ᵗ = T,
(αT)ᵗ = αTᵗ,
(T + R)ᵗ = Tᵗ + Rᵗ.
If T is symmetric then Tᵗ = T, i.e., T_ij = T_ji.
If T is skew-symmetric (or anti-symmetric) then
Tᵗ = −T, i.e., T_ij = −T_ji.
Any second order tensor can be decomposed into a symmetric and a
skew-symmetric tensor, i.e., T = T_s + T_ss, where
T_s = (1/2)(T + Tᵗ), and T_ss = (1/2)(T − Tᵗ).
Tensor product: Let A = BC. Then
(A)_ij = A_ij = e_i · (BC)e_j = e_i · B(Ce_j)
       = e_i · B(C_kj e_k)
       = C_kj e_i · Be_k
       = C_kj B_ik
       = B_ik C_kj (similar to matrix multiplication).
The index k is a repeated (or dummy) index, while i and j are free indices.
No index should be repeated more than twice.
Please check using indicial notation that (BC)ᵗ = CᵗBᵗ.
Tensor product (or dyadic product)
The dyadic product of two vectors a and b is defined as
(a ⊗ b)c := (b · c)a ∀ c ∈ V,
or
(ab)c := (b · c)a ∀ c ∈ V.
Please note that (a ⊗ b) maps a vector to another vector.
Properties of the dyadic product
Linearity of the dyadic product:
(a ⊗ b)(αu + βv) = [b · (αu + βv)]a
= [α(b · u) + β(b · v)]a
= α(b · u)a + β(b · v)a
= α[(a ⊗ b)u] + β[(a ⊗ b)v].
Since (a ⊗ b) is linear and transforms a vector to another vector, it
represents a second order tensor.
Now we prove that T = T_ij (e_i ⊗ e_j).
Tu = (Tu)_i e_i = [e_i · (Tu)]e_i
   = {e_i · [T(u_j e_j)]}e_i
   = u_j {e_i · (Te_j)}e_i
   = (u · e_j){e_i · (Te_j)}e_i
   = [e_i · (Te_j)][(u · e_j)e_i]
   = T_ij [(e_i ⊗ e_j)u]
   = (T_ij e_i ⊗ e_j)u.
Since it is true for all u, T = T_ij (e_i ⊗ e_j).
Identity tensor: 1 = δ_ij e_i ⊗ e_j = e_i ⊗ e_i.
So,
(a ⊗ b)_ij = e_i · (a ⊗ b)e_j = e_i · (b · e_j)a = a_i b_j.
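The defining property (a ⊗ b)c = (b · c)a and the component formula (a ⊗ b)_ij = a_i b_j can both be checked with an outer product (numpy assumed for illustration):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = np.array([1.0, 0.0, 1.0])

ab = np.outer(a, b)     # (a ⊗ b)_ij = a_i b_j
lhs = ab @ c            # (a ⊗ b) c
rhs = (b @ c) * a       # (b . c) a
```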
Inner product or scalar product between two tensors
(denoted by : or ·)
(T, R) = T : R = T_ij R_ij = T_11 R_11 + T_12 R_12 + T_13 R_13
+ T_21 R_21 + T_22 R_22 + T_23 R_23
+ T_31 R_31 + T_32 R_32 + T_33 R_33.
It can be shown that
T : R = tr(Tᵗ R) = tr(T Rᵗ) = tr(R Tᵗ) = tr(Rᵗ T).
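A numerical check of T : R = T_ij R_ij = tr(Tᵗ R) (numpy assumed for illustration):

```python
import numpy as np

T = np.arange(1.0, 10.0).reshape(3, 3)
R = np.arange(2.0, 11.0).reshape(3, 3)

double_dot = np.einsum('ij,ij->', T, R)   # T_ij R_ij
trace_form = np.trace(T.T @ R)            # tr(T^t R)
```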
Eigenvalues and eigenvectors of tensors
A vector n is an eigenvector of a second order tensor T if there exists a λ
such that
Tn = λn.
For non-trivial n,
det(T − λ1) = 0 (characteristic equation).
Expanding the above equation:
λ³ − I_1 λ² + I_2 λ − I_3 = 0.
Cayley-Hamilton theorem: T satisfies its characteristic equation:
T³ − I_1 T² + I_2 T − I_3 1 = 0 ∀ T.
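The Cayley-Hamilton theorem can be verified for any particular T, using I_2 = (1/2)[(tr T)² − tr T²] and I_3 = det T (numpy assumed for illustration):

```python
import numpy as np

T = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

I1 = np.trace(T)
I2 = 0.5 * (np.trace(T)**2 - np.trace(T @ T))
I3 = np.linalg.det(T)

# T^3 - I1 T^2 + I2 T - I3 1 should vanish.
residual = T @ T @ T - I1 * (T @ T) + I2 * T - I3 * np.eye(3)
```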
Principal invariants of T
First invariant:
I_1 = tr T = T_11 + T_22 + T_33 = λ_1 + λ_2 + λ_3
= e_1 · Te_1 + e_2 · Te_2 + e_3 · Te_3
= T_ii = δ_ij T_ij = 1 : T.
Some properties of I_1:
tr(T + R) = tr T + tr R ∀ T, R.
tr Tᵗ = tr T.
tr(TR) = tr(RT).
tr(a ⊗ b) = a · b.
Cofactor of T (Cof T, or T*)
Definition: T*(u × v) := Tu × Tv ∀ u, v ∈ V.
We choose u = ε_jpq e_p and v = e_q.
So u × v = ε_jpq ε_mpq e_m = 2e_j. Substituting this and simplifying we get
(T*)_ij = (1/2) ε_imn ε_jpq T_mp T_nq.
In matrix form
[T*] =
| T_22 T_33 − T_23 T_32   T_23 T_31 − T_21 T_33   T_21 T_32 − T_22 T_31 |
| T_32 T_13 − T_33 T_12   T_33 T_11 − T_31 T_13   T_31 T_12 − T_32 T_11 |
| T_12 T_23 − T_13 T_22   T_13 T_21 − T_11 T_23   T_11 T_22 − T_12 T_21 |
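The index formula for the cofactor can be cross-checked against the identity T* = det(T) T⁻ᵗ, valid for invertible T (numpy assumed for illustration):

```python
import numpy as np

eps = np.zeros((3, 3, 3))
for (i, j, k) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

T = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 2.0]])   # det T = 13, so T is invertible

# (T*)_ij = (1/2) eps_imn eps_jpq T_mp T_nq
cof = 0.5 * np.einsum('imn,jpq,mp,nq->ij', eps, eps, T, T)
check = np.linalg.det(T) * np.linalg.inv(T).T
```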
Some properties:
(T*)ᵗ = (Tᵗ)*, (TR)* = T* R*.
Second invariant:
I_2 = tr(T*) = (1/2)[(tr T)² − tr T²]
    = λ_1 λ_2 + λ_2 λ_3 + λ_3 λ_1
    = | T_11  T_12 | + | T_22  T_23 | + | T_11  T_13 |
      | T_21  T_22 |   | T_32  T_33 |   | T_31  T_33 |.
Third invariant:
I_3 = det(T) = (1/6) ε_ijk ε_pqr T_ip T_jq T_kr = λ_1 λ_2 λ_3.
Some properties:
det(Tᵗ) = det(T), det(TR) = det(T) det(R).
Inverse of T (denoted by T⁻¹)
If T is invertible (det T ≠ 0), it satisfies
T T⁻¹ = T⁻¹ T = 1.
The inverse is given by (as we have seen in matrix algebra)
T⁻¹ = (T*)ᵗ / det T.
Properties of the inverse:
(Tᵗ)⁻¹ = (T⁻¹)ᵗ, (TR)⁻¹ = R⁻¹ T⁻¹.
Orthogonal tensors (Q)
Q is orthogonal if Q⁻¹ = Qᵗ, or equivalently
Qᵗ Q = Q Qᵗ = 1.
Q satisfies the following properties:
Qu · Qv = u · v ∀ u, v ∈ V,
|Qu| = |u| ∀ u ∈ V,
|Qu × Qv| = |u × v| ∀ u, v ∈ V.
det(Q Qᵗ) = det(1) ⇒ (det Q)² = 1 ⇒ det Q = ±1.
If det Q = +1, Q is called a proper orthogonal tensor, or a rotation.
If det Q = −1, Q is called an improper orthogonal tensor, or a reflection.
Example of a rotation (2D):
| x' |   | cos θ  −sin θ | | x |
| y' | = | sin θ   cos θ | | y |,
where [Q] = | cos θ  −sin θ |
            | sin θ   cos θ |.
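A quick check that the 2D rotation matrix is proper orthogonal and preserves lengths (numpy assumed for illustration):

```python
import numpy as np

theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

orthogonal = np.allclose(Q.T @ Q, np.eye(2))   # Q^t Q = 1
det_Q = np.linalg.det(Q)                       # +1 for a rotation
x = np.array([3.0, 4.0])
same_length = np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```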
Few examples
Problem 1: Show that
(a ⊗ b)ᵗ = b ⊗ a.
Solution:
[(a ⊗ b)ᵗ]_ij = (a ⊗ b)_ji   (since (Tᵗ)_ij = T_ji)
             = a_j b_i = b_i a_j = (b ⊗ a)_ij
⇒ (a ⊗ b)ᵗ = b ⊗ a.
Problem 2: Show that
T(a ⊗ b) = (Ta) ⊗ b.
Solution:
[T(a ⊗ b)]_ij = T_ik (a ⊗ b)_kj   [since (AB)_ij = A_ik B_kj]
             = T_ik a_k b_j = (T_ik a_k) b_j = (Ta)_i b_j = (Ta ⊗ b)_ij
⇒ T(a ⊗ b) = (Ta) ⊗ b.
Problem 3: Show that
(a ⊗ b) : (u ⊗ v) = (a · u)(b · v).
Solution:
(a ⊗ b) : (u ⊗ v) = (a ⊗ b)_ij (u ⊗ v)_ij   [since A : B = A_ij B_ij]
                  = a_i b_j u_i v_j = (a_i u_i)(b_j v_j)
                  = (a · u)(b · v)
⇒ (a ⊗ b) : (u ⊗ v) = (a · u)(b · v).
Problem 4: If S is a symmetric tensor and W is a skew-symmetric
tensor, then show that
S : W = 0.
Solution:
S : W = S_ij W_ij = S_ji W_ji   [since i and j are dummy indices]
      = S_ij (−W_ij)            [since S is symmetric, and W is skew]
      = −S_ij W_ij,
so S : W equals its own negative.
⇒ S : W = S_ij W_ij = 0.
Problem 6: Show that det(RS) = det R det S.
Solution: [u, v, w] = volume of the parallelepiped. Choosing u = e_1,
v = e_2, and w = e_3 we get [u, v, w] = 1 (i.e., the volume of a unit cube).
Similarly,
[Te_1, Te_2, Te_3] = det T.
det(TR) = [TRe_1, TRe_2, TRe_3] = det T [Re_1, Re_2, Re_3]
        = det T det R.