
INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN FLUIDS, VOL. 24, 1371-1389 (1997)

PARALLEL FINITE ELEMENT METHODS FOR LARGE-SCALE COMPUTATION OF STORM SURGES AND TIDAL FLOWS
KAZUO KASHIYAMA¹*, KATSUYA SAITOH¹, MAREK BEHR² AND TAYFUN E. TEZDUYAR²
¹Department of Civil Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112, Japan
²AEM/AHPCRC, University of Minnesota, 1100 Washington Avenue South, Minneapolis, MN 55415, U.S.A.

SUMMARY

Massively parallel finite element methods for large-scale computation of storm surges and tidal flows are discussed here. The finite element computations, carried out using unstructured grids, are based on a three-step explicit formulation and on an implicit space-time formulation. Parallel implementations of these unstructured-grid-based formulations are carried out on the Fujitsu Highly Parallel Computer AP1000 and on the Thinking Machines CM-5. Simulations of the storm surge accompanying the Ise-Bay typhoon in 1959 and of the tidal flow in Tokyo Bay serve as numerical examples. The impact of parallelization on this type of simulation is also investigated. The present methods are shown to be useful and powerful tools for the analysis of storm surges and tidal flows. © 1997 by John Wiley & Sons, Ltd. Int. J. Numer. Meth. Fluids, 24: 1371-1389, 1997. No. of Figures: 22. No. of Tables: 0. No. of References: 26.

KEY WORDS: parallel finite element method; three-step explicit formulation; implicit space-time formulation; storm surge; tidal flow

1. INTRODUCTION

A storm surge is a phenomenon in which the sea level in a near-shore region rises significantly because of the passage of a typhoon or of low atmospheric pressure. This can cause enormous damage in major bays and harbours. Tidal flows in ocean bays are less violent, yet their understanding is also important to the design of shoreline and offshore structures. For the study of storm surges, computations were carried out in the past by some of the present authors and also by other researchers.¹,² Tidal flow simulations were previously reported in References 3 and 4. The finite element method is a powerful tool in such simulations, since it is applicable to complicated water and land configurations and is able to represent such configurations accurately. In practical computations, especially in the case of storm surge analysis, the computational domain is large and the computations need to be carried out over long time periods. Therefore this type of problem becomes quite large-scale and it is essential to use methods which are as efficient and fast as the available hardware allows.

In recent years, massively parallel finite element computations have been successfully applied to several large-scale flow problems.³,⁵ These computations demonstrated the availability of a new level of finite element capability to solve practical flow problems. With the need for a high-performance computing environment to carry out simulations for practical problems in storm surge analysis, in this
*Correspondence to: K. Kashiyama, Department of Civil Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112, Japan

CCC 0271-2091/97/121371-19 $17.50 © 1997 by John Wiley & Sons, Ltd.

Received 15 May 1996; revised 12 July 1996


paper we present and employ a parallel explicit finite element method for computations based on unstructured grids. The finite element computations are based on a three-step explicit formulation of the governing equations. In these computations we use the selective lumping technique for numerical stabilization. Parallel implementation of this unstructured-grid-based formulation is carried out on the Fujitsu Highly Parallel Computer AP1000. As a test problem, we carry out a simulation of the storm surge accompanying the Ise-Bay typhoon in 1959. The computed results are compared with the observed results. The effect of parallelization on the efficiency of the computations is also examined.

The computation of the second class of problems, involving tidal flows, is accomplished here with a stabilized implicit finite element method based on the conservation variables. This stabilization method is based on the streamline-upwind/Petrov-Galerkin (SUPG) formulation for compressible flows, which was originally introduced in Reference 6 for incompressible flows and in Reference 7 for the Euler equations of compressible flows. This methodology was later supplemented with a discontinuity-capturing term in References 8 and 9 and then extended in References 10 and 11 to the Navier-Stokes equations of compressible flows. The time-dependent governing equations are discretized using a space-time formulation developed for fixed domains in References 12 and 13 and for deforming domains in Reference 14. The present data-parallel implementation makes no assumptions about the structure of the computational grid and is written for the Thinking Machines CM-5 supercomputer. As a test problem, a simulation of the tidal flow in Tokyo Bay is carried out.

2. GOVERNING EQUATIONS

The storm surge phenomena can be modelled using the shallow water equations, which are obtained from the conservation of momentum and mass, vertically integrated, assuming a hydrostatic pressure distribution:

$$\dot{u}_i + u_j u_{i,j} + g(\zeta - \zeta_0)_{,i} - \frac{(\tau_\mathrm{s})_i}{\rho(h+\zeta)} + \frac{(\tau_\mathrm{b})_i}{\rho(h+\zeta)} - \nu(u_{i,j} + u_{j,i})_{,j} = 0, \qquad (1)$$

$$\dot{\zeta} + [(h+\zeta)u_i]_{,i} = 0, \qquad (2)$$

where $u_i$ is the mean horizontal velocity, $\zeta$ is the water elevation, $h$ is the water depth, $g$ is the gravitational acceleration, $\zeta_0$ is the increase in water elevation corresponding to the atmospheric pressure drop, $(\tau_\mathrm{s})_i$ is the surface shear stress, $(\tau_\mathrm{b})_i$ is the bottom shear stress and $\nu$ is the eddy viscosity. The increase $\zeta_0$ can be given by Fujita's formula¹⁵ as

$$\zeta_0 = \frac{\Delta p}{\rho g}\,\frac{1}{\sqrt{1 + (r/r_0)^2}}, \qquad (3)$$

where $\Delta p$ is the pressure drop at the centre of the typhoon, $\rho$ is the density of the fluid, $r$ is the distance from the centre of the typhoon and $r_0$ is the radius of the typhoon. The surface shear stress can be given as

$$(\tau_\mathrm{s})_i = \rho_\mathrm{a}\, \gamma\, w_i \sqrt{w_k w_k}, \qquad (4)$$

where $\rho_\mathrm{a}$ is the density of air, $\gamma$ is the drag coefficient and $w_i$ is the wind velocity 10 m above the water surface. The wind velocity can be evaluated using the expressions

$$w_1 = -\frac{C_1 V_\mathrm{g}}{r}\{\sin\theta\,[x_1 - (x_1)_\mathrm{c}] + \cos\theta\,[x_2 - (x_2)_\mathrm{c}]\} + C_2 V_1 \mathrm{e}^{-\pi r/R}, \qquad (5)$$

$$w_2 = -\frac{C_1 V_\mathrm{g}}{r}\{-\cos\theta\,[x_1 - (x_1)_\mathrm{c}] + \sin\theta\,[x_2 - (x_2)_\mathrm{c}]\} + C_2 V_2 \mathrm{e}^{-\pi r/R}, \qquad (6)$$



where $V_\mathrm{g}$ is the gradient wind velocity, $V_1$ and $V_2$ denote the velocity of the typhoon, $(x_1)_\mathrm{c}$ and $(x_2)_\mathrm{c}$ denote the position of the typhoon, $\theta$ is the gradient wind angle and $R$, $C_1$ and $C_2$ are constants. The gradient wind velocity is defined as

$$V_\mathrm{g} = \frac{fr}{2}\left\{-1 + \sqrt{1 + \frac{4\Delta p}{\rho f^2 r_0^2}\left[1 + \left(\frac{r}{r_0}\right)^2\right]^{-3/2}}\,\right\}, \qquad (7)$$

where $f$ is the Coriolis coefficient. The bottom shear stress can be given as

$$(\tau_\mathrm{b})_i = \frac{n^2 g}{(h+\zeta)^{1/3}}\, \rho\, u_i \sqrt{u_k u_k}, \qquad (8)$$

where $n$ is the Manning coefficient.
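As a concrete illustration of equations (3), (4) and (8), the following Python sketch evaluates the typhoon forcing terms pointwise. This is our own illustration, not code from the paper; the function names and the assumed values of the physical constants (water and air density, drag coefficient) are ours.

```python
import numpy as np

# Pointwise evaluation of the forcing terms of Section 2; a minimal sketch,
# not the paper's code. Constants below are assumed representative values.
RHO_W = 1025.0   # density of sea water [kg m^-3] (assumed)
RHO_A = 1.22     # density of air [kg m^-3] (assumed)
G = 9.81         # gravitational acceleration [m s^-2]

def pressure_setup(dp, r, r0):
    """Fujita's formula (3): elevation increase zeta_0 for a central pressure
    drop dp [Pa] at distance r from the typhoon centre of radius r0."""
    return dp / (RHO_W * G) / np.sqrt(1.0 + (r / r0)**2)

def surface_stress(w1, w2, gamma=1.3e-3):
    """Surface shear stress (4): (tau_s)_i = rho_a * gamma * w_i * |w|.
    gamma is the drag coefficient (assumed value)."""
    speed = np.hypot(w1, w2)
    return RHO_A * gamma * w1 * speed, RHO_A * gamma * w2 * speed

def bottom_stress(u1, u2, h, zeta, n=0.03):
    """Bottom shear stress (8) with Manning coefficient n."""
    speed = np.hypot(u1, u2)
    coef = RHO_W * n**2 * G / (h + zeta)**(1.0 / 3.0)
    return coef * u1 * speed, coef * u2 * speed

# Example: elevation increase 60 km from the eye for a 70 hPa pressure drop
print(pressure_setup(dp=7000.0, r=60e3, r0=60e3))   # about 0.49 m
```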

3. VARIATIONAL FORMULATIONS

We present two finite element formulations of the shallow water equations which have been implemented on parallel architectures. The first method is a three-step explicit method for fixed domains. The second method is an implicit stabilized space-time formulation. Although the examples presented in this paper involve fixed domains only, the latter (space-time) formulation is seen as a step towards solving an important class of problems which involve deforming domains. With the two formulations included in this section addressing different classes of problems, a cost/accuracy comparison is not performed; however, it is expected that the explicit method for a given time step size will be more economical than the space-time formulation if the domain is fixed. The implicit space-time formulation, on the other hand, does not involve as much time step size restriction (due to numerical stability) as the explicit method.

3.1. Three-Step Explicit Finite Element Method

For the finite element spatial discretization of the governing equations the standard Galerkin method is used. The weak form of the governing equations can then be written as

$$\int_\Omega u_i^*\left[\dot{u}_i + u_j u_{i,j} + g(\zeta - \zeta_0)_{,i} - \frac{(\tau_\mathrm{s})_i}{\rho(h+\zeta)} + \frac{(\tau_\mathrm{b})_i}{\rho(h+\zeta)}\right]\mathrm{d}\Omega + \int_\Omega u_{i,j}^*\,\nu(u_{i,j} + u_{j,i})\,\mathrm{d}\Omega - \int_\Gamma u_i^* t_i\,\mathrm{d}\Gamma = 0, \qquad (9)$$

$$\int_\Omega \zeta^*\{\dot{\zeta} + [(h+\zeta)u_i]_{,i}\}\,\mathrm{d}\Omega = 0, \qquad (10)$$

where $u_i^*$ and $\zeta^*$ denote the weighting functions and $t_i$ represents the boundary terms.

Using the three-node linear triangular elements for the spatial discretization, the following finite element equations can be obtained:

$$M_{\alpha\beta}\dot{u}_{\beta i} + K_{\alpha\beta\gamma j} u_{\beta j} u_{\gamma i} + H_{\alpha\beta i}(\zeta_\beta - \zeta_{0\beta}) - T_{\alpha\beta}\left(\frac{(\tau_\mathrm{s})_{\beta i}}{\rho(h+\zeta)_\beta}\right) + T_{\alpha\beta}\left(\frac{(\tau_\mathrm{b})_{\beta i}}{\rho(h+\zeta)_\beta}\right) + S_{\alpha\beta ij} u_{\beta j} = 0, \qquad (11)$$

$$M_{\alpha\beta}\dot{\zeta}_\beta + B_{\alpha\beta\gamma i} u_{\beta i}(h_\gamma + \zeta_\gamma) + C_{\alpha\beta\gamma i} u_{\beta i}(h_\gamma + \zeta_\gamma) = 0. \qquad (12)$$




The coefficient matrices can be expressed as

$$M_{\alpha\beta} = \int_\Omega \Phi_\alpha \Phi_\beta\,\mathrm{d}\Omega, \qquad K_{\alpha\beta\gamma j} = \int_\Omega \Phi_\alpha \Phi_\beta \Phi_{\gamma,j}\,\mathrm{d}\Omega,$$

$$H_{\alpha\beta i} = g\int_\Omega \Phi_\alpha \Phi_{\beta,i}\,\mathrm{d}\Omega, \qquad T_{\alpha\beta} = \int_\Omega \Phi_\alpha \Phi_\beta\,\mathrm{d}\Omega,$$

$$S_{\alpha\beta ij} = \nu\int_\Omega \Phi_{\alpha,k}\Phi_{\beta,k}\,\delta_{ij}\,\mathrm{d}\Omega + \nu\int_\Omega \Phi_{\alpha,j}\Phi_{\beta,i}\,\mathrm{d}\Omega,$$

$$B_{\alpha\beta\gamma i} = \int_\Omega \Phi_\alpha \Phi_{\beta,i}\Phi_\gamma\,\mathrm{d}\Omega, \qquad C_{\alpha\beta\gamma i} = \int_\Omega \Phi_\alpha \Phi_\beta \Phi_{\gamma,i}\,\mathrm{d}\Omega,$$

where $\Phi$ denotes the shape function. The bottom stress term is linearized and the water depth is interpolated using linear interpolation.
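For the three-node triangles used here, the element mass matrices can be written in closed form. The sketch below is our own illustration (names and structure are ours): it assembles the consistent element mass matrix, its row-sum lumped counterpart and the selectively lumped combination used in the time stepping that follows.

```python
import numpy as np

def triangle_mass_matrices(xy, e=0.9):
    """xy: (3, 2) array of vertex coordinates of a linear triangle.
    Returns the consistent mass matrix M_ab = integral(Phi_a * Phi_b) dOmega,
    its row-sum lumped form M^L and the selective combination
    M_hat = e*M^L + (1 - e)*M (cf. equation (21) below)."""
    x, y = xy[:, 0], xy[:, 1]
    # Twice the signed area of the triangle
    two_a = (x[1] - x[0]) * (y[2] - y[0]) - (x[2] - x[0]) * (y[1] - y[0])
    area = 0.5 * abs(two_a)
    # Exact consistent mass matrix for linear shape functions:
    # M = (A/12) * [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
    m_cons = area / 12.0 * (np.ones((3, 3)) + np.eye(3))
    m_lump = np.diag(m_cons.sum(axis=1))   # A/3 on the diagonal
    m_sel = e * m_lump + (1.0 - e) * m_cons
    return m_cons, m_lump, m_sel

Mc, Ml, Ms = triangle_mass_matrices(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]))
```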

For the discretization in time, the three-step explicit time integration scheme is employed, based on the Taylor series expansion

$$F(t + \Delta t) = F(t) + \Delta t\,\frac{\partial F(t)}{\partial t} + \frac{\Delta t^2}{2}\,\frac{\partial^2 F(t)}{\partial t^2} + \frac{\Delta t^3}{6}\,\frac{\partial^3 F(t)}{\partial t^3} + O(\Delta t^4), \qquad (13)$$

where $F$ is an arbitrary function and $\Delta t$ is the time increment. Using the approximation up to third-order accuracy, the following three-step scheme can be obtained:¹⁶

$$F\left(t + \frac{\Delta t}{3}\right) = F(t) + \frac{\Delta t}{3}\,\frac{\partial F(t)}{\partial t},$$
$$F\left(t + \frac{\Delta t}{2}\right) = F(t) + \frac{\Delta t}{2}\,\frac{\partial F(t + \Delta t/3)}{\partial t}, \qquad (14)$$
$$F(t + \Delta t) = F(t) + \Delta t\,\frac{\partial F(t + \Delta t/2)}{\partial t},$$

Equation (14) is equivalent to equation (13) and the method is referred to as the three-step Taylor-Galerkin method. The stability limit of this method is 1.5 times larger than that of the conventional two-step scheme.⁴,¹⁷ Details of the method are given in Reference 3. Applying this scheme to the finite element equations, the following discretized equations in time can be obtained.

Step 1

$$M^\mathrm{L}_{\alpha\beta} u^{n+1/3}_{\beta i} = M^\mathrm{L}_{\alpha\beta} u^n_{\beta i} - \frac{\Delta t}{3}\left[K_{\alpha\beta\gamma j} u^n_{\beta j} u^n_{\gamma i} + H_{\alpha\beta i}(\zeta^n_\beta - \zeta^n_{0\beta}) - T_{\alpha\beta}\left(\frac{(\tau_\mathrm{s})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n} + T_{\alpha\beta}\left(\frac{(\tau_\mathrm{b})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n} + S_{\alpha\beta ij} u^n_{\beta j}\right], \qquad (15)$$

$$M^\mathrm{L}_{\alpha\beta} \zeta^{n+1/3}_\beta = M^\mathrm{L}_{\alpha\beta} \zeta^n_\beta - \frac{\Delta t}{3}\left[B_{\alpha\beta\gamma i} u^n_{\beta i}(h_\gamma + \zeta^n_\gamma) + C_{\alpha\beta\gamma i} u^n_{\beta i}(h_\gamma + \zeta^n_\gamma)\right], \qquad (16)$$


Step 2

$$M^\mathrm{L}_{\alpha\beta} u^{n+1/2}_{\beta i} = M^\mathrm{L}_{\alpha\beta} u^n_{\beta i} - \frac{\Delta t}{2}\left[K_{\alpha\beta\gamma j} u^{n+1/3}_{\beta j} u^{n+1/3}_{\gamma i} + H_{\alpha\beta i}(\zeta^{n+1/3}_\beta - \zeta^{n+1/3}_{0\beta}) - T_{\alpha\beta}\left(\frac{(\tau_\mathrm{s})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n+1/3} + T_{\alpha\beta}\left(\frac{(\tau_\mathrm{b})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n+1/3} + S_{\alpha\beta ij} u^{n+1/3}_{\beta j}\right], \qquad (17)$$

$$M^\mathrm{L}_{\alpha\beta} \zeta^{n+1/2}_\beta = M^\mathrm{L}_{\alpha\beta} \zeta^n_\beta - \frac{\Delta t}{2}\left[B_{\alpha\beta\gamma i} u^{n+1/3}_{\beta i}(h_\gamma + \zeta^{n+1/3}_\gamma) + C_{\alpha\beta\gamma i} u^{n+1/3}_{\beta i}(h_\gamma + \zeta^{n+1/3}_\gamma)\right]. \qquad (18)$$

Step 3

$$M^\mathrm{L}_{\alpha\beta} u^{n+1}_{\beta i} = \hat{M}_{\alpha\beta} u^n_{\beta i} - \Delta t\left[K_{\alpha\beta\gamma j} u^{n+1/2}_{\beta j} u^{n+1/2}_{\gamma i} + H_{\alpha\beta i}(\zeta^{n+1/2}_\beta - \zeta^{n+1/2}_{0\beta}) - T_{\alpha\beta}\left(\frac{(\tau_\mathrm{s})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n+1/2} + T_{\alpha\beta}\left(\frac{(\tau_\mathrm{b})_{\beta i}}{\rho(h+\zeta)_\beta}\right)^{\!n+1/2} + S_{\alpha\beta ij} u^{n+1/2}_{\beta j}\right], \qquad (19)$$

$$M^\mathrm{L}_{\alpha\beta} \zeta^{n+1}_\beta = \hat{M}_{\alpha\beta} \zeta^n_\beta - \Delta t\left[B_{\alpha\beta\gamma i} u^{n+1/2}_{\beta i}(h_\gamma + \zeta^{n+1/2}_\gamma) + C_{\alpha\beta\gamma i} u^{n+1/2}_{\beta i}(h_\gamma + \zeta^{n+1/2}_\gamma)\right], \qquad (20)$$
where the superscript $n$ denotes the value computed at the $n$th time point and $\Delta t$ is the time increment between the $n$th and the $(n+1)$th step. The coefficient $M^\mathrm{L}_{\alpha\beta}$ is the lumped coefficient and $\hat{M}_{\alpha\beta}$ is the selective lumping coefficient given by

$$\hat{M}_{\alpha\beta} = e M^\mathrm{L}_{\alpha\beta} + (1 - e) M_{\alpha\beta}, \qquad (21)$$

where $e$ is the selective lumping parameter.
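Schematically, one time step of (15)-(20) can be sketched as below for a generic semi-discrete system $M\,\mathrm{d}F/\mathrm{d}t = N(F)$. This is our own condensed illustration, not the paper's code: rhs() stands for the assembled right-hand-side terms (advection, elevation gradient, stresses), m_lump for the diagonal of the lumped mass matrix and m_sel for the selectively lumped matrix of equation (21).

```python
import numpy as np

def three_step(f, dt, rhs, m_lump, m_sel):
    """One step of the three-step scheme (15)-(20); a sketch, not the paper's
    code. f: nodal vector, rhs(f): assembled right-hand side, m_lump: lumped
    mass diagonal, m_sel: selectively lumped matrix (equation (21))."""
    f13 = f + (dt / 3.0) * rhs(f) / m_lump        # step 1: t + dt/3
    f12 = f + (dt / 2.0) * rhs(f13) / m_lump      # step 2: t + dt/2
    return (m_sel @ f + dt * rhs(f12)) / m_lump   # step 3: t + dt

# Toy usage on a linear model problem dF/dt = A F with unit lumped mass
n = 4
A = -0.5 * np.eye(n)
m_lump = np.ones(n)
m_sel = np.eye(n)        # placeholder; in practice e*M^L + (1 - e)*M
f = np.ones(n)
for _ in range(10):
    f = three_step(f, 0.1, lambda v: A @ v, m_lump, m_sel)
```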

3.2. Space-Time Implicit Finite Element Method

In the implicit implementation a stabilized space-time finite element method is used. Using the conservative variables defined as

$$\mathbf{U} = \begin{pmatrix} H \\ U_1 \\ U_2 \end{pmatrix} = \begin{pmatrix} H \\ Hu_1 \\ Hu_2 \end{pmatrix},$$

where $H = h + \zeta$, the variational formulation of (1) and (2) is written as

$$\int_{Q_n} \mathbf{U}^{*\mathrm{T}}\left(\frac{\partial \mathbf{U}}{\partial t} + \mathbf{A}_i\frac{\partial \mathbf{U}}{\partial x_i}\right)\mathrm{d}Q + \int_{Q_n}\frac{\partial \mathbf{U}^{*\mathrm{T}}}{\partial x_i}\,\mathbf{K}_{ij}\,\frac{\partial \mathbf{U}}{\partial x_j}\,\mathrm{d}Q + \int_{\Omega_n}(\mathbf{U}^*)^{+\mathrm{T}}_n\left[(\mathbf{U})^+_n - (\mathbf{U})^-_n\right]\mathrm{d}\Omega$$
$$+\;\sum_e \int_{Q_n^e} \tau\left(\mathbf{A}_k^{\mathrm{T}}\frac{\partial \mathbf{U}^*}{\partial x_k}\right)^{\mathrm{T}}\left[\frac{\partial \mathbf{U}}{\partial t} + \mathbf{A}_i\frac{\partial \mathbf{U}}{\partial x_i} - \frac{\partial}{\partial x_i}\left(\mathbf{K}_{ij}\frac{\partial \mathbf{U}}{\partial x_j}\right) - \mathbf{R}\right]\mathrm{d}Q + \sum_e \int_{Q_n^e}\delta\,\frac{\partial \mathbf{U}^{*\mathrm{T}}}{\partial x_i}\,\frac{\partial \mathbf{U}}{\partial x_i}\,\mathrm{d}Q$$
$$=\;\int_{Q_n}\mathbf{U}^{*\mathrm{T}}\mathbf{R}\,\mathrm{d}Q + \int_{(P_n)_h}\mathbf{U}^{*\mathrm{T}}\mathbf{H}\,\mathrm{d}P. \qquad (22)$$

(See the note following equation (24) regarding the printed forms of equations (22)-(24).)


Here $\mathbf{U}^*$ denotes the weighting function and the integration takes place over the space-time domain (or its subset referred to as a slab) $Q_n$, its lateral boundary $P_n$ and its lower spatial boundary $\Omega_n$. The space-time terminology is explained in more detail in Reference 18. $\mathbf{A}_i$ and $\mathbf{K}_{ij}$ are the coefficient matrices of the advective-diffusive system, defined as

$$\mathbf{A}_1 = \begin{pmatrix} 0 & 1 & 0 \\ gH - u_1^2 & 2u_1 & 0 \\ -u_1 u_2 & u_2 & u_1 \end{pmatrix}, \qquad \mathbf{A}_2 = \begin{pmatrix} 0 & 0 & 1 \\ -u_1 u_2 & u_2 & u_1 \\ gH - u_2^2 & 0 & 2u_2 \end{pmatrix},$$

$$\mathbf{K}_{11} = \nu\begin{pmatrix} 0 & 0 & 0 \\ -2u_1 & 2 & 0 \\ -u_2 & 0 & 1 \end{pmatrix}, \qquad \mathbf{K}_{12} = \nu\begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ -u_1 & 1 & 0 \end{pmatrix},$$

$$\mathbf{K}_{21} = \nu\begin{pmatrix} 0 & 0 & 0 \\ -u_2 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}, \qquad \mathbf{K}_{22} = \nu\begin{pmatrix} 0 & 0 & 0 \\ -u_1 & 1 & 0 \\ -2u_2 & 0 & 2 \end{pmatrix},$$

with $u_i = U_i/H$.

$\mathbf{R}$ denotes the right-hand-side vector

$$\mathbf{R} = \begin{pmatrix} 0 \\ -gH\,\partial(h - \zeta_0)/\partial x_1 + (\tau_\mathrm{s})_1/\rho - (\tau_\mathrm{b})_1/\rho \\ -gH\,\partial(h - \zeta_0)/\partial x_2 + (\tau_\mathrm{s})_2/\rho - (\tau_\mathrm{b})_2/\rho \end{pmatrix} \qquad (24)$$

and $\mathbf{H}$ is the natural boundary condition term defined on the subset $(P_n)_h$ of the lateral boundary $P_n$.

[Note in this copy: the printed forms of the six matrices above and of equations (22) and (24) contain errors; for the correct forms see T. E. Tezduyar, 'Finite Element Methods for Flow Problems with Moving Boundaries and Interfaces', Archives of Computational Methods in Engineering, 8, 83-130 (2001).]
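For reference, the sketch below builds the conservation variables and the advective Jacobians $\mathbf{A}_i = \partial\mathbf{F}_i/\partial\mathbf{U}$ of the standard conservation form of the shallow water equations. Given the note above, these textbook forms are our own reconstruction and should be checked against the cited Tezduyar (2001) reference; all names are ours.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m s^-2]

def conservation_variables(zeta, u1, u2, h):
    """U = (H, H*u1, H*u2) with H = h + zeta."""
    H = h + zeta
    return np.array([H, H * u1, H * u2])

def advective_jacobians(U):
    """Textbook shallow water Jacobians A_i = dF_i/dU (our reconstruction;
    see the note above for the forms used in the paper)."""
    H, U1, U2 = U
    u1, u2 = U1 / H, U2 / H
    A1 = np.array([[0.0,            1.0,      0.0],
                   [G * H - u1**2,  2.0 * u1, 0.0],
                   [-u1 * u2,       u2,       u1 ]])
    A2 = np.array([[0.0,            0.0, 1.0     ],
                   [-u1 * u2,       u2,  u1      ],
                   [G * H - u2**2,  0.0, 2.0 * u2]])
    return A1, A2

A1, A2 = advective_jacobians(conservation_variables(0.2, 0.5, -0.3, 10.0))
```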

The notation $(\ldots)^+_n$ and $(\ldots)^-_n$ indicates the values of a discontinuous variable as the time $t$ approaches the temporal slab boundary $t_n$ from above and below respectively. The first two left-hand-side terms and the entire right-hand side of equation (22) constitute the Galerkin form of the shallow water equations (1) and (2). The third term weakly enforces the continuity of the solution across the time levels $t_n$. The fourth and fifth terms are the SUPG stabilization and discontinuity-capturing terms respectively. For the derivation of the stabilization coefficients $\tau$ and $\delta$ for multidimensional advective-diffusive systems see e.g. Reference 11. The stabilization terms are integrated over the interiors of the space-time elements $Q_n^e$. The variables and weighting functions are discretized using piecewise linear (in both space and time) interpolation function spaces for all fields. The resulting non-linear equation system is solved using the Newton-Raphson algorithm, where at each Newton-Raphson step a coupled linear equation system is solved iteratively using the GMRES update technique.

4. PARALLEL IMPLEMENTATION

For the explicit algorithm a data-parallel implementation is performed on the Fujitsu AP1000, which is a distributed-memory, highly parallel computer that supports a message-passing communication mechanism. Figure 1 shows the configuration of the AP1000 system. The AP1000 consists of 1024 processing elements which are called cells, a Sun workstation which is called the host and three independent networks which are called the T-net, B-net and S-net. Each cell possesses a memory of 16 MB. Using 1024 cells, the peak computational speed reaches 8.53 Gflops. The cells perform parallel computation, synchronizing with each other and transferring boundary node data to neighbouring cells. The host performs the initialization of the cells' environment, the creation of tasks, the transfer of data and the observation


Figure 1. AP1000 system

of the cells' condition. All cells are connected by the T-net (torus network) for one-to-one communication between cells. The host and cells are connected by the B-net (broadcasting network) for broadcast communication, distribution and collection of data and by the S-net (synchronization network) for barrier synchronization. The communication and synchronization mentioned above can be realized using the vendor-supplied parallel library.¹⁹

To minimize the amount of interprocessor communication, the automatic mesh decomposer presented by Farhat²⁰ is employed. For each subdomain the processor associated with that subdomain carries out computations independently, exchanging only the subdomain boundary data with the other processors. The finite element equation can be expressed as

$$MX = F, \qquad (25)$$

where $M$ is the lumped mass matrix, $X$ is the unknown vector and $F$ is the known vector. Figure 2 shows an example mesh, with the broken line denoting the boundary of a subdomain. Elements (1)-(4) belong to subdomain 1 (processor 1) and elements (5) and (6) belong to subdomain 2 (processor 2). The unknown values $X$ are solved for as

$$X = F/M. \qquad (26)$$

No interprocessor communication is needed to compute the unknown value of a node which is located in the subdomain interior, such as node A. However, in the case of node B, which is located on the boundary of the subdomains, interprocessor communication is needed and the following procedure is applied. First the following values are computed in each processor:

$$M_{B1} = M_{B(1)} + M_{B(4)}, \qquad F_{B1} = F_{B(1)} + F_{B(4)} \quad \text{(processor 1)}, \qquad (27)$$
$$M_{B2} = M_{B(5)} + M_{B(6)}, \qquad F_{B2} = F_{B(5)} + F_{B(6)} \quad \text{(processor 2)}. \qquad (28)$$

Next these values are gathered using the communication library; then the unknown value of node B can be obtained as

$$X_B = (F_{B1} + F_{B2})/(M_{B1} + M_{B2}). \qquad (29)$$
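In code, the accumulation (27)-(29) for a node shared between two subdomains might look as follows. This is a toy two-processor sketch with invented contribution values, not the AP1000 library calls.

```python
def shared_node_value(m_parts, f_parts):
    """Equation (29): X_B = (F_B1 + F_B2) / (M_B1 + M_B2), where each entry
    is one processor's partial sum over its own elements, as in (27)-(28)."""
    return sum(f_parts) / sum(m_parts)

# Processor 1 sums elements (1) and (4); processor 2 sums elements (5) and (6).
# Numerical values are stand-ins for illustration only.
M_B1, F_B1 = 0.4 + 0.3, 1.0 + 0.8   # processor 1, equation (27)
M_B2, F_B2 = 0.2 + 0.1, 0.5 + 0.2   # processor 2, equation (28)
X_B = shared_node_value([M_B1, M_B2], [F_B1, F_B2])   # equation (29)
```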

Data transfer is performed at every time step (see Figure 2). As the lumped mass matrix $M$ remains constant throughout all time steps, the data transfer for that matrix is required only once.

The implicit algorithm is implemented on the Connection Machine CM-5. Similarly to the Fujitsu AP1000, the CM-5 is also a distributed-memory parallel machine, with a single partition size of up to


Figure 2. Parallel implementation

512 processing elements (PEs) and a Sun multiprocessor host machine. The PEs are interconnected through fat-tree data, control and diagnostic networks. Each PE manages 32 MB of memory and has a peak processing speed of 128 Mflops, for a total peak of over 65 Gflops. As on the AP1000, highly optimized communication utilities are available, grouped in the Connection Machine Scientific Software Library (CMSSL). The implementation of the implicit algorithm described in Section 3 follows closely the finite element implementation of the Navier-Stokes equations which has been described in References 21 and 22.

5. NUMERICAL EXAMPLES

As an application of the three-step explicit algorithm, a simulation of the storm surge in Ise-Bay, Japan accompanying the Ise-Bay typhoon in 1959 is carried out. This typhoon occurred on 22 September 1959 and was the greatest disaster ever to hit the Ise-Bay district; over 5000 people were killed because of this storm surge. Figure 3 shows the configuration of the domain and the path of the typhoon. Figure 4 shows the finite element discretization used. The total numbers of elements and nodes are 206,977 and 106,577 respectively. This mesh is designed to keep the element Courant number constant in the entire domain.²³,²⁴ Figure 5 shows the water depth diagram. From Figures 4

Figure 3. Computational domain and path of Ise-Bay typhoon



Figure 4. Finite element discretization

Figure 5. Water depth diagram (contours are evenly spaced at 500 m intervals)

Figure 6. Finite element discretization around Ise-Bay

and 5 it can be seen that an appropriate mesh in accordance with the variation in water depth is realized. Figures 6 and 7 show the finite element discretization and the water depth diagram around Ise-Bay respectively. A fine mesh which represents the geometry accurately is employed. Figure 8 shows the mesh partitioning for 512 processors. The typhoon data such as its position, speed and power are given at 1 h intervals. Using these data, the wind velocity can be computed at every time step; linear interpolation is used for the data interpolation. For the boundary conditions, the no-slip condition is applied at the coastline and the open-boundary condition is applied at the open boundary. For the numerical conditions the following data are used: n = 0.03, ν = 10 m² s⁻¹, C₁ = C₂ = 0.6, R = 500 km, r₀ = 60 km. The selective lumping parameter and the time increment are taken as 0.9 and 6 s respectively. Figure 9 shows the path of the typhoon; the numerals denote the time and position of the typhoon. Figure 10 shows the computed water elevation

Figure 7. Water depth diagram around Ise-Bay (contours are evenly spaced at 10 m intervals)


Figure 8. Mesh partitioning for 512 processors

at times 17:00 and 24:00. Figure 11 shows the computed water elevation at 1 h intervals. It can be seen that the water elevation varies according to the movement of the typhoon. Figure 12 shows the computed current velocity at time 22:00 and illustrates the complicated flow pattern. Figure 13 shows the comparison between the computed and observed²⁵ water elevations at Nagoya. It can be seen that the computed results are in good agreement with the observed results.

Figure 9. Path of typhoon (labels along the path denote the time of day in hours)


Figure 10. Computed water elevation (contours are evenly spaced at 0.1 m intervals)

In order to check the performance of the parallelization, three finite element meshes are employed: mesh L with 206,977 elements and 106,577 nodes, mesh M with 133,546 elements and 69,295 nodes and mesh S with 76,497 elements and 40,197 nodes. Figures 14 and 15 show the relation between the number of processors and, respectively, the speed-up ratio and the efficiency of the parallelization. In these figures the speed-up ratio and efficiency are defined as

$$\text{speed-up ratio} = \frac{\text{computational time for one PE}}{\text{computational time for } N \text{ PEs}}, \qquad (30)$$

$$\text{efficiency} = \frac{\text{speed-up ratio}}{N}, \qquad (31)$$
where $N$ denotes the total number of processors. From these figures it can be seen that the performance improves in accordance with an increase in the number of degrees of freedom and that the efficiency decreases in accordance with an increase in the number of processors.
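As a quick numerical check of definitions (30) and (31), the following sketch uses hypothetical timings chosen to reproduce the mesh L figures quoted below; the timing values are invented placeholders, not measured data.

```python
def speedup(t_one_pe, t_n_pe):
    return t_one_pe / t_n_pe          # equation (30)

def efficiency(s, n_pe):
    return s / n_pe                   # equation (31)

# Hypothetical timings: a run that is 400 times faster on 512 PEs
s = speedup(t_one_pe=4.0e5, t_n_pe=1.0e3)
print(s, efficiency(s, 512))          # 400.0 and about 0.78, i.e. ~80%
```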


Figure 11. Computed water elevation at 1 h intervals (contours are evenly spaced at 0.1 m intervals)

In the case of the computation using mesh L and 512 processors, it can be seen that the speed-up ratio and efficiency reach approximately 400 and 80% respectively.

As an application of the stabilized space-time formulation, the tidal flow in Tokyo Bay has been simulated. This problem was analysed earlier using the three-step explicit scheme described in Section 3.²⁶ Here we carry out the simulation using the implicit formulation introduced in Section 3. The mesh used in the computation consists of 56,893 elements and 60,210 space-time nodes, as shown in Figure 16. The mesh has been decomposed into 256 subdomains (which are assigned to the individual CM-5 vector units) using a recursive spectral bisection algorithm, as shown in Figure 17. The mesh refinement is related to the water depth, shown magnified 100-fold in Figure 18. In this simulation a time step size of 60 s is chosen and the total duration is 1600 time steps, approximating

Figure 12. Computed current velocity at time 22:00 (velocity scale: → = 10 m s⁻¹)

Figure 13. Comparison between computed (—) and observed (----) water elevation at Nagoya

Figure 14. Comparison of speed-up ratios for meshes L, M and S (horizontal axis: number of processors)


Figure 15. Comparison of efficiencies (horizontal axis: number of processors)

Figure 16. Finite element discretization of Tokyo Bay

Figure 17. Mesh partitioning for 256 processors


Figure 18. Water depth view of Tokyo Bay

Figure 19. Computed water elevation at t = 15:00 h

Figure 20. Computed water elevation at t = 18:00 h


Figure 21. Computed water elevation at t = 21:00 h

Figure 22. Computed water elevation at t = 24:00 h

one 24 h period. At the ocean boundary a diurnal tidal wave is imposed with an amplitude of 0.5 m and a period of 12 h. The following parameters are used: n = 0.03, ν = 5 m² s⁻¹, C₁ = C₂ = 0. The storm surge term (3) is ignored in this problem. The resulting elevation is shown in Figures 19-22, magnified 50,000 times with respect to the horizontal dimensions, at times t = 15:00, 18:00, 21:00 and 24:00 h into the simulation respectively. The simulation was performed on a 64-node CM-5 with 256 vector units and took 8.5 h of computer time to complete.
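The imposed open-boundary forcing can be sketched as below. The sinusoidal form is our assumption, since the text states only the amplitude (0.5 m) and period (12 h); names are ours.

```python
import numpy as np

AMPLITUDE = 0.5          # m, from the text
PERIOD = 12.0 * 3600.0   # s, from the text

def boundary_elevation(t):
    """Assumed sinusoidal tidal elevation at the open ocean boundary, t in s."""
    return AMPLITUDE * np.sin(2.0 * np.pi * t / PERIOD)

# 1600 time steps of 60 s, as in the Tokyo Bay simulation
t = np.arange(1600) * 60.0
zeta_bc = boundary_elevation(t)
```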

6. CONCLUDING REMARKS

A three-step explicit finite element solver and an implicit stabilized space-time formulation of the shallow water equations, applicable to unstructured mesh computations of storm surges and tidal flows, have been successfully implemented on the massively parallel supercomputers AP1000 and CM-5 respectively. The explicit method has been applied to the analysis of the storm surge accompanying the Ise-Bay typhoon in 1959. The efficiency of the parallelization has been investigated and the computed results have been compared with the observed results. The performance and efficiency were observed to improve linearly in accordance with an increase in the number of degrees of freedom. The implicit method has been used to compute the tidal flow in Tokyo

Bay. From the results obtained in this paper, it can be concluded that the presented methods can be successfully applied to large-scale computations of storm surges and tidal flows.

ACKNOWLEDGEMENTS

We are grateful to the Parallel Computing Research Center of Fujitsu Laboratory for providing us with access to the AP1000 resources. The last two authors were supported by ARPA and by the Army High Performance Computing Research Center under the auspices of the Department of the Army, Army Research Laboratory co-operative agreement number DAAH04-95-2-0003/contract number DAAH04-95-C-0008. The content does not necessarily reflect the position or the policy of the U.S. Government, and no official endorsement should be inferred. The CRAY C90 time was provided by the Minnesota Supercomputer Institute.

REFERENCES
1. J. J. Westerink, R. A. Luettich, A. M. Baptista, N. W. Scheffner and P. Farrar, 'Tide and storm surge predictions using finite element model', J. Hydraul. Eng., 118, 1373-1390 (1992).
2. K. Kashiyama, K. Saito and S. Yoshikawa, 'Massively parallel finite element method for large-scale computation of storm surge', Proc. 11th Int. Conf. on Computational Methods in Water Resources, Cancun, 1996.
3. K. Kashiyama, H. Ito, M. Behr and T. Tezduyar, 'Three-step explicit finite element computation of shallow water flows on a massively parallel computer', Int. j. numer. methods fluids, 21, 885-900 (1995).
4. M. Kawahara, H. Hirano, K. Tsubota and K. Inagaki, 'Selective lumping finite element method for shallow water flow', Int. j. numer. methods fluids, 2, 89-112 (1982).
5. T. E. Tezduyar, M. Behr, S. Mittal and A. A. Johnson, 'Computation of unsteady incompressible flows with the finite element methods-space-time formulations, iterative strategies and massively parallel implementations', in P. Smolinski, W. K. Liu, G. Hulbert and K. Tamma (eds), New Methods in Transient Analysis, AMD Vol. 143, ASME, New York, 1992, pp. 7-24.
6. T. J. R. Hughes and A. N. Brooks, 'A multi-dimensional upwind scheme with no crosswind diffusion', in T. J. R. Hughes (ed.), Finite Element Methods for Convection Dominated Flows, AMD Vol. 34, ASME, New York, 1979, pp. 19-35.
7. T. E. Tezduyar and T. J. R. Hughes, 'Finite element formulations for convection dominated flows with particular emphasis on the compressible Euler equations', AIAA Paper 83-0125, 1983.
8. G. J. Le Beau and T. E. Tezduyar, 'Finite element computation of compressible flows with the SUPG formulation', in M. N. Dhaubhadel, M. S. Engelman and J. N. Reddy (eds), Advances in Finite Element Analysis in Fluid Dynamics, FED Vol. 123, ASME, New York, 1991, pp. 21-27.
9. G. J. Le Beau, S. E. Ray, S. K. Aliabadi and T. E. Tezduyar, 'SUPG finite element computation of compressible flows with the entropy and conservation variables formulations', Comput. Methods Appl. Mech. Eng., 104, 397-422 (1993).
10. S. Aliabadi, S. E. Ray and T. E. Tezduyar, 'SUPG finite element computation of compressible flows with the entropy and conservation variables formulations', Comput. Mech., 11, 300-312 (1993).
11. S. Aliabadi and T. E. Tezduyar, 'Space-time finite element computation of compressible flows involving moving boundaries and interfaces', Comput. Methods Appl. Mech. Eng., 107, 209-224 (1993).
12. C. Johnson, U. Nävert and J. Pitkäranta, 'Finite element methods for linear hyperbolic problems', Comput. Methods Appl. Mech. Eng., 45, 285-312 (1984).
13. T. J. R. Hughes and G. M. Hulbert, 'Space-time finite element methods for elastodynamics: formulations and error estimates', Comput. Methods Appl. Mech. Eng., 66, 339-363 (1988).
14. T. E. Tezduyar, M. Behr and J. Liou, 'A new strategy for finite element computations involving moving boundaries and interfaces-the deforming-spatial-domain/space-time procedure: I. The concept and the preliminary tests', Comput. Methods Appl. Mech. Eng., 94, 339-351 (1992).
15. M. Miyazaki, T. Ueno and S. Unoki, 'Theoretical investigations of typhoon surges along the Japanese coast (II)', Oceanogr. Mag., 13, 103-117 (1962).
16. C. B. Jiang, M. Kawahara, K. Hatanaka and K. Kashiyama, 'A three-step finite element method for convection-dominated incompressible flows', Comput. Fluid Dyn. J., 1, 443-462 (1993).
17. M. Kawahara and K. Kashiyama, 'Selective lumping finite element method for nearshore current', Int. j. numer. methods fluids, 4, 71-97 (1984).
18. M. Behr and T. E. Tezduyar, 'Finite element solution strategies for large-scale flow simulations', Comput. Methods Appl. Mech. Eng., 112, 3-24 (1994).
19. AP1000 Library Manual, Fujitsu Laboratories, Tokyo, 1993.
20. C. Farhat, 'A simple and efficient automatic FEM domain decomposer', Comput. Struct., 28, 579-602 (1988).



21. M. Behr, A. Johnson, J. Kennedy, S. Mittal and T. E. Tezduyar, 'Computation of incompressible flows with implicit finite element implementation on the Connection Machine', Comput. Methods Appl. Mech. Eng., 108, 99-118 (1993).
22. J. G. Kennedy, M. Behr, V. Kalro and T. E. Tezduyar, 'Implementation of implicit finite element methods for incompressible flows on the CM-5', Comput. Methods Appl. Mech. Eng., 119, 95-111 (1994).
23. K. Kashiyama and T. Okada, 'Automatic mesh generation method for shallow water flow analysis', Int. j. numer. methods fluids, 15, 1037-1057 (1992).
24. K. Kashiyama and M. Sakuraba, 'Adaptive boundary-type finite element method for wave diffraction-refraction in harbors', Comput. Methods Appl. Mech. Eng., 112, 185-197 (1994).
25. 'Reports on Ise-Bay typhoon', Tech. Rep. 7, Meteorological Agency, 1961 (in Japanese).
26. K. Kashiyama, H. Ito, M. Behr and T. E. Tezduyar, 'Massively parallel finite element strategies for large-scale computation of shallow water flows and contaminant transport', Extended Abstr. Second Japan-U.S. Symp. on Finite Element Methods in Large-Scale Computational Fluid Dynamics, Tokyo, 1994.


