
SHUFFLED FROG LEAPING ALGORITHM

OPTIMIZATION BY SFLA

1. INTRODUCTION

The difficulties associated with using mathematical optimization on large-scale engineering problems have contributed to the development of alternative solutions. Linear programming and dynamic programming techniques, for example, often fail (or reach a local optimum) in solving NP-hard problems with a large number of variables and non-linear objective functions. To overcome these problems, researchers have proposed evolutionary-based algorithms for searching for near-optimum solutions.

Evolutionary algorithms (EAs) are stochastic search methods that mimic natural biological evolution and/or the social behaviour of species. Examples include how ants find the shortest route to a source of food and how birds find their destination during migration. The behaviour of such species is guided by learning, adaptation, and evolution. To mimic the efficient behaviour of these species, various researchers have developed computational systems that seek fast and robust solutions to complex optimization problems. The first evolutionary-based technique introduced in the literature was the genetic algorithm (GA). GAs were developed based on the Darwinian principle of 'survival of the fittest' and the natural process of evolution through reproduction. Based on their demonstrated ability to reach near-optimum solutions to large problems, GAs have been used in many applications in science and engineering. Despite their benefits, GAs may require a long processing time for a near-optimum solution to evolve. Also, not all problems lend themselves well to a solution with GAs.


2. OPTIMIZATION

2.1 Definition

Optimization is the process of finding the best solution to a problem using mathematical techniques, for example to minimize the cost of production or to maximize the efficiency of production. It is a technique to find:

o The best solution
o Minimal cost (design)
o Minimal error (parameter calibration)
o Maximal profit (management)
o Maximal utility (economics)

The optimization of systems and processes is very important to the efficiency and economics of many science and engineering domains. Optimization problems are solved by using rigorous or approximate mathematical search techniques. Rigorous approaches have employed linear programming, integer programming, dynamic programming or branch-and-bound techniques to arrive at the optimum solution for moderate-size problems. However, optimizing real-life problems of the scale often encountered in engineering practice is much more challenging because of the huge and complex solution space. Finding exact solutions to these problems turns out to be NP-hard: this kind of complex problem requires an exponential amount of computing power and time as the number of decision variables increases (Lovbjerg 2002). To overcome these problems, researchers have proposed approximate evolutionary-based algorithms as a means to search for near-optimum solutions.


o Mathematical algorithms: Simplex (LP), BFGS (NLP), B&B (DP)

o Drawbacks of mathematical algorithms:
  - LP: too idealized (requires all linear functions)
  - NLP: not suitable for discrete variables or complex functions; requires a feasible initial vector; can be trapped in local optima
  - DP: exhaustive enumeration; may search in the wrong direction

o Meta-heuristic algorithms: GA, SA, TS, ACO, PSO, ...
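To make the NLP drawback concrete, here is a toy sketch (not from the report; the objective f and all names are hypothetical) of how a gradient-based search converges to whichever local optimum its starting point falls near, the failure mode that motivates meta-heuristics:

```python
# Hypothetical one-dimensional objective with a local minimum at x = 0
# (f = 1) and the global minimum near x = 2.618 (f about -10.09).
def f(x):
    return x**4 - 4*x**3 + 2*x**2 + 1

def df(x):
    return 4*x**3 - 12*x**2 + 4*x  # derivative of f

def gradient_descent(x, lr=0.01, steps=2000):
    # Plain steepest descent: always moves downhill from wherever it starts.
    for _ in range(steps):
        x -= lr * df(x)
    return x

trapped = gradient_descent(0.2)   # basin of the local minimum -> x near 0
found = gradient_descent(2.5)     # basin of the global minimum -> x near 2.618
print(trapped, found, f(trapped) - f(found))
```

Started inside the wrong basin, the method settles on the inferior optimum; a metaheuristic keeps a population and random moves precisely to escape such basins.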


3. SHUFFLED FROG LEAPING ALGORITHM

3.1 Definition

The shuffled frog leaping algorithm (SFLA) is a memetic metaheuristic designed to seek a global optimal solution by performing a heuristic search. It is based on the evolution of memes carried by individuals and a global exchange of information among the population.

The SFLA, presented by Eusuff and Lansey (2003), is a meta-heuristic iterative method inspired by the memetic evolution of a group of frogs when seeking food. It is based on observing, imitating, and modelling the behaviour of a group of frogs searching for the location that has the maximum amount of available food.

The SFL algorithm, in essence, combines the benefits of genetic-based memetic algorithms (MAs) and the social behaviour-based PSO algorithm. In the SFL, the population consists of a set of frogs (solutions) that is partitioned into subsets referred to as memeplexes. The different memeplexes are considered as different cultures of frogs, each performing a local search. Within each memeplex, the individual frogs hold ideas that can be influenced by the ideas of other frogs, and evolve through a process of memetic evolution. After a defined number of memetic evolution steps, ideas are passed among memeplexes in a shuffling process. The local search and the shuffling processes continue until defined convergence criteria are satisfied.

Begin;
  Generate random population of P solutions (frogs);
  For each individual i ∈ P: calculate fitness(i);
  Sort the population P in descending order of fitness;
  For each memeplex:
    Determine the best and worst frogs;
    Improve the worst frog's position;
    Repeat for a specific number of iterations;
  End;
  Combine the evolved memeplexes;
  Sort the population P in descending order of fitness;
  Check if termination = true;
End;
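As a concrete, simplified illustration, the pseudocode above can be sketched in Python. The function names, the minimisation convention (lower fitness value = better frog), and the sphere test objective are assumptions for this example, not details from the report:

```python
import random

def sfla(fitness, dim, bounds, n_memeplexes=5, frogs_per_memeplex=6,
         memetic_steps=10, shuffles=20, d_max=2.0):
    lo, hi = bounds
    P = n_memeplexes * frogs_per_memeplex
    # Generate a random population of P solutions (frogs) in the feasible space.
    frogs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(P)]

    for _ in range(shuffles):
        frogs.sort(key=fitness)          # best (lowest value) first
        x_g = frogs[0]                   # frog with the global best fitness
        # Round-robin partition: frog i goes to memeplex i mod m.
        memeplexes = [frogs[i::n_memeplexes] for i in range(n_memeplexes)]
        for mem in memeplexes:
            for _ in range(memetic_steps):
                mem.sort(key=fitness)
                x_b, x_w = mem[0], mem[-1]   # best and worst frogs
                for target in (x_b, x_g):    # jump toward Xb first, then Xg
                    r = random.random()
                    step = [max(-d_max, min(d_max, r * (t - w)))
                            for t, w in zip(target, x_w)]
                    cand = [min(hi, max(lo, w + s)) for w, s in zip(x_w, step)]
                    if fitness(cand) < fitness(x_w):
                        mem[-1] = cand       # improved worst frog
                        break
                else:
                    # No improvement: replace the worst frog with a random one.
                    mem[-1] = [random.uniform(lo, hi) for _ in range(dim)]
        # Shuffle: recombine the evolved memeplexes into one population.
        frogs = [frog for mem in memeplexes for frog in mem]

    return min(frogs, key=fitness)

# Usage: minimise the sphere function; the optimum is at the origin.
best = sfla(lambda x: sum(v * v for v in x), dim=2, bounds=(-5.0, 5.0))
print(best)
```

Each jump step is clamped to the maximum step size d_max, and candidates are kept inside the feasible bounds; both are common practical choices rather than requirements stated in the pseudocode.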

The SFLA was originally developed as a population-based metaheuristic to perform an informed heuristic search, using mathematical functions, to find a solution of a combinatorial optimization problem. It combines the benefits of both the genetic-based memetic algorithm (MA) and the social behaviour-based particle swarm optimization (PSO) algorithm. In SFLA, there is a population of possible solutions defined by a set of frogs that is divided into subgroups called memeplexes, each performing a local search. After a defined number of memetic evolution steps, ideas are passed among memeplexes in a shuffling process. At first, an initial population of P frogs is created randomly within the feasible space. For an S-variable problem, the ith frog is represented as

Xi = (xi1, xi2, ..., xiS).

Then, the frogs are sorted in descending order according to their fitness, and the whole population (P) is separated into m memeplexes, each containing n frogs. In this procedure, the first frog moves to the first memeplex, the second frog moves to the second memeplex, frog m moves to the m-th memeplex, frog m+1 goes back to the first memeplex, and so on.

Within each memeplex, the positions of the frogs with the best and worst fitness are identified as Xb and Xw, respectively. The position of the frog with the global best fitness is identified as Xg. Then, in each memeplex, a process is applied in each cycle to improve only the frog with the worst fitness (not all frogs), as follows:

Di = rand() × (Xb − Xw)    ......(1)
XwNEW = Xw + Di            ......(2)


where rand() is a random number between 0 and 1. If this process generates a better solution, the worst frog is replaced. Otherwise, the calculations in (1) and (2) are repeated with Xb replaced by Xg. If no improvement becomes possible in this case, then a new solution is randomly generated within the feasible space to replace the worst frog.
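This improve-or-replace rule (jump toward the memeplex best, then the global best, then fall back to a random frog) can be sketched with scalar positions for clarity; the values and the objective f below are illustrative, and the report's frogs are vectors:

```python
import random

def improve_worst(x_b, x_w, x_g, fitness, lower, upper):
    for target in (x_b, x_g):                 # try the memeplex best, then the global best
        d = random.random() * (target - x_w)  # Di = rand() * (target - Xw)
        cand = x_w + d                        # Xw_new = Xw + Di
        if fitness(cand) < fitness(x_w):      # better solution -> replace the worst frog
            return cand
    # Neither jump improved the frog: generate a random feasible solution instead.
    return random.uniform(lower, upper)

f = lambda x: (x - 3.0) ** 2                  # hypothetical objective, minimum at x = 3
new_w = improve_worst(x_b=2.5, x_w=-4.0, x_g=2.9, fitness=f, lower=-5.0, upper=5.0)
print(new_w)
```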

In this section, the classic SFLA presented above is improved. As equations (1)-(2) show, the worst frog tries to find a better place by jumping toward the best frog. But as the random operator lies between 0 and 1, only the space between the two frogs is searched. However, as the space beyond the best frog (the other side of the best frog) might be a suitable location for a solution, there should be a possibility for the frog to search that area as well. So the random operator is adjusted to rand(1, 1.75) in order to provide a chance to search the other side of the best frog. The value 1.75 is chosen according to the authors' experience with different problems.

The other difference between ISFLA and the conventional algorithm is that in conventional SFLA, all the elements of Xb − Xw in (1) are multiplied by the same random value. In ISFLA, for each element of Xb − Xw, a separate random value is generated and multiplied in (1). It should be noted that ISFLA is referred to as SFLA in the results of this paper.
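The per-element difference can be shown in two lines. The vectors below are illustrative, and the draws use rand() on (0, 1) for simplicity; the widened range discussed above would only change the draw's interval:

```python
import random

x_b = [1.0, 4.0, -2.0]   # best frog (illustrative)
x_w = [3.0, 0.5, 1.0]    # worst frog (illustrative)

r = random.random()       # classic SFLA: ONE random number scales every element
d_classic = [r * (b - w) for b, w in zip(x_b, x_w)]

d_isfla = [random.random() * (b - w)   # ISFLA: a fresh random number per element
           for b, w in zip(x_b, x_w)]

print(d_classic, d_isfla)
```

In the classic step every coordinate moves the same fraction of the way toward the best frog; the per-element variant lets each coordinate move a different fraction, which diversifies the search direction.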

In the SFL algorithm, each memeplex is allowed to evolve independently, to locally search different regions of the solution space. In addition, shuffling all the memeplexes and re-dividing them into a new set of memeplexes results in a global search through exchanging information between memeplexes. As such, the SFL algorithm attempts to balance between a wide search of the solution space and a deep search of promising locations that are close to a local optimum. As expressed by equations (1)-(2), each individual frog (solution) in a memeplex tries to change its position towards the best frog within the memeplex or the overall best frog. As shown in these equations, when the difference in position between the worst frog Xw (i.e. the frog under evolution) and the best frogs (Xb or Xg) becomes small, the change in frog Xw's position will be very small, and thus it might stagnate at a local optimum and lead to premature convergence.

To overcome such an occurrence, this study proposes that the right-hand side of equation (1) be multiplied by a factor C, called the search acceleration factor, as follows:

Di = C × rand() × (Xb − Xw)

Assigning a large value to the factor C at the beginning of the evolution process will accelerate the global search by allowing a bigger change in a frog's position, and accordingly will widen the global search area. Then, as the evolution process continues and a promising location is identified, the search acceleration factor C will focus the process on a deeper local search, as it will only allow the frogs to make smaller changes in their positions.
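A sketch of this idea with a linear schedule; the start and end values and the schedule shape are chosen purely for illustration, since the report does not give a specific formula for C:

```python
def acceleration(iteration, total, c_start=2.0, c_end=0.4):
    # Large C early -> bigger jumps and a wider global search;
    # small C late -> smaller jumps and a deeper local search.
    return c_start - (c_start - c_end) * iteration / total

# The factor then scales the frog's change of position, e.g.
#   Di = acceleration(i, total) * rand() * (Xb - Xw)
print([round(acceleration(i, 100), 2) for i in (0, 50, 100)])  # → [2.0, 1.2, 0.4]
```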


4. PROCESS - SFLA

The SFLA is inspired by natural memetics. The algorithm contains elements of local search and global information exchange. The SFLA consists of a set of interacting virtual populations of frogs partitioned into different memeplexes. The virtual frogs act as hosts or carriers of memes, where a meme is a unit of cultural evolution. The algorithm simultaneously performs an independent local search in each memeplex. The local search is completed using a particle swarm optimization-like method adapted for discrete problems but emphasizing a local search. To ensure global exploration, the virtual frogs are periodically shuffled and reorganized into new memeplexes in a technique similar to that used in the shuffled complex evolution algorithm. In addition, to provide the opportunity for random generation of improved information, random virtual frogs are generated and substituted into the population.

1) The population consists of a set of frogs.
2) The frogs are partitioned into subsets referred to as memeplexes.
3) Each memeplex performs a local search for food.
4) After a defined number of evolution steps, ideas are passed among memeplexes in a shuffling process.
5) The local search and the shuffling processes continue until defined convergence criteria are satisfied.
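Steps 2 and 4, the round-robin partition and the shuffle, can be sketched directly. The fitness values below are illustrative, listed best-first as the partition step assumes:

```python
frogs = [9, 8, 7, 6, 5, 4, 3, 2, 1]   # fitness values, already sorted best-first
m = 3                                  # number of memeplexes

# Step 2: frog 1 -> memeplex 1, frog 2 -> memeplex 2, ..., frog m+1 -> memeplex 1.
memeplexes = [frogs[i::m] for i in range(m)]
print(memeplexes)    # → [[9, 6, 3], [8, 5, 2], [7, 4, 1]]

# Step 4: after local evolution, shuffling recombines the memeplexes so that
# information spreads across the whole population before the next partition.
pool = sorted((f for mem in memeplexes for f in mem), reverse=True)
print(pool)          # → [9, 8, 7, 6, 5, 4, 3, 2, 1]
```

The round-robin assignment guarantees each memeplex receives a comparable spread of good and bad frogs, rather than one memeplex hoarding all the best solutions.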


For a problem with S decision variables, a frog i is represented as Xi = (xi1, xi2, ..., xiS). Afterwards, the frogs are sorted in descending order according to their fitness. Then, the entire population is divided into m memeplexes, each containing n frogs (i.e. P = m × n). In this process, the first frog goes to the first memeplex, the second frog goes to the second memeplex, frog m goes to the mth memeplex, and frog m+1 goes back to the first memeplex, and so on.

Within each memeplex, the frogs with the best and the worst fitness are identified as Xb and Xw, respectively. Also, the frog with the global best fitness is identified as Xg. Then, a process similar to PSO is applied to improve only the frog with the worst fitness (not all frogs) in each cycle. Accordingly, the position of the frog with the worst fitness is adjusted as follows:

Di = rand() × (Xb − Xw)
New position: XwNEW = Xw + Di,   with −Dmax ≤ Di ≤ Dmax

where rand() is a random number between 0 and 1, and Dmax is the maximum allowed change in a frog's position. If this process produces a better solution, it replaces the worst frog. Otherwise, the calculations in (1) and (2) are repeated but with respect to the global best frog (i.e. Xg replaces Xb). If no improvement becomes possible in this case, then a new solution is randomly generated to replace that frog. The calculations then continue for a specific number of iterations. Accordingly, the main parameters of SFL are: the number of frogs P; the number of memeplexes; the number of generations for each memeplex before shuffling; the number of shuffling iterations; and the maximum step size Dmax.


5. COMPARISON AMONG DIFFERENT EVOLUTIONARY ALGORITHMS

SFLA has been used as an appropriate tool to obtain the best solutions with the least total time and cost by evaluating a vast number of possible options. One of the problems of previous research is that its assumptions make it unrealistic in comparison with actual construction projects. On the other hand, delay events during the execution of activities have an important impact on the total time and cost of projects. Therefore, the authors attempt to make the model better approximate real projects by considering splitting during the execution of activities.

The non-dominated solutions of SFLA with splitting applied are compared with previous works using GA and NSGA-II. Results in both the TCO and TCRO models demonstrate improvement in the solutions, the convergence ratio, and the processing time to reach the optimum solution. This confirms that SFLA improves results, by comparison with the results before applying it. Since in this case there is no limit on resources, the impact of splitting on the concepts of time-cost trade-off and resource allocation has been investigated. The values of improvement demonstrate that splitting has a significant impact on the final results.

Benchmark comparisons among the algorithms are presented for both continuous and discrete optimization problems, in terms of processing time, convergence speed, and quality of the results. Based on this comparative analysis, the performance of EAs is discussed, along with some guidelines for determining the best operators for each algorithm. The study presents sophisticated ideas in a simplified form that should be beneficial to both practitioners and researchers involved in solving optimization problems.

SFLA has been used as an appropriate tool to obtain the best solutions with the least total time and cost by evaluating a vast number of possible options. The values of improvement demonstrate that the shuffled frog leaping algorithm has a significant impact on the efficiency of various aspects.


6. CONCLUSIONS

In this paper an optimization method known as the shuffled frog leaping algorithm has been described. It belongs to the class of evolutionary algorithms, that is, algorithms devised by observing natural biological evolution and the behaviour of natural species such as ants, birds, bees, and frogs.

A brief description of the shuffled frog leaping algorithm has been presented, along with pseudocode to facilitate its implementation. Visual Basic programs were written to implement each algorithm. Also presented were the comparative results found when a discrete optimization test problem was solved using all five algorithms. The PSO method was generally found to perform better than the other algorithms in terms of success rate and solution quality, while being second best in terms of processing time.

Evolutionary algorithms (EAs) are stochastic search methods that mimic natural biological evolution and/or the social behaviour of species. Such algorithms have been developed to arrive at near-optimum solutions to large-scale optimization problems, for which traditional mathematical techniques may fail. The main points are:

o Evolutionary algorithms are implemented in various fields because of their reliability and simple implementation.
o A rather recent optimization algorithm known as the shuffled frog leaping algorithm has been introduced and improved.
o A comparison was performed to show the superiority of SFLA in competition with four versions of evolutionary algorithms; SFLA's high-quality performance was also demonstrated.


7. REFERENCES

[1] Al-Tabtabai, H. and Alex, P.A., Using genetic algorithms to solve optimization problems in construction. Eng Constr Archit Manage, 1999, 6, 121–132.

[2] Duan, Q.Y., Gupta, V.K. and Sorooshian, S., Shuffled complex evolution approach for effective and efficient global minimization. J. Optimization Theory Appns, 1993, 76, 501–521.

[3] Elbeltagi, E., Hegazy, T. and Grierson, D., Comparison among five evolutionary-based optimization algorithms. J. Adv. Engng. Informatics, 2005, 19, 43–53.

[4] Eusuff, M.M. and Lansey, K.E., Optimization of water distribution network design using the shuffled frog leaping algorithm. J. Water Resour. Planning Mgmt, 2003, 129, 210–225.

[5] Feng, C., Liu, L. and Burns, S., Using genetic algorithms to solve construction time-cost trade-off problems. J. Comput. Civil Engng, 1997, 11, 184–189.

[6] Hegazy, T., Optimization of construction time-cost trade-off analysis using genetic algorithms. Can J Civil Eng, 1999, 26, 685–697.

[7] Hegazy, T., Elbeltagi, E. and Elbehairy, H., Bridge deck management system with integrated life cycle cost optimization. Transportation.

[8] Holland, J., Adaptation in Natural and Artificial Systems, 1975 (University of Michigan Press: Ann Arbor, MI).

[9] Kennedy, J. and Eberhart, R., Particle swarm optimization, in Proceedings IEEE International Conference on Neural Networks, IEEE Service Center, Piscataway, NJ, pp. 1942–1948, 1995.

