
NAME (UK NUMBER)

ABD AZIZ ARRASHID B ABD RAJAK (UK 22715)
DEVENDRAN A/L INDIRAN (UK 21995)
FASIHAH BINTI MOHD JASLAN (UK 22641)
SHARIFAH MUNIRAH BINTI SYED HASHIM (UK 22523)
NURUL ASMAA BT ABD RAZAK (UK 22524)

Introduction to STOCHASTIC PROCESSES.

Definition of stochastic. Definition of stochastic process. Stochastic processes and their importance.

Stochastic modeling is an interesting and challenging area of probability and statistics: a branch of applied mathematics concerned with the collection and interpretation of quantitative data and the use of probability theory to estimate population parameters. The aims of this introductory section are to explain what a stochastic process is and what is meant by the Markov property, and then to give examples and discuss some of the objectives one might study in stochastic processes. Other expressions used as synonyms for stochastic process in the literature are chance process and random process. There are a number of aspects of a stochastic process that we can examine, among them:
1. dependencies between variables in the sequence;
2. various kinds of long-term averages;
3. the frequency at which boundary events occur.

Stochastic just means random; often, it refers to a random sequence of events.

A stochastic process is a mathematical model that evolves over time in a probabilistic manner. In simple words, it is a process involving the operation of chance. More generally, a stochastic process refers to a family of random variables indexed by some other variable or set of variables. It is one of the most general objects of study in probability, and some basic types of stochastic processes include Markov processes, also known as Markov chains.

1. A Brand Switching Model for Consumer Behavior. Before introducing a new brand of coffee, a manufacturer wants to study consumer behavior relative to the brands already available in the market. There is also a strong possibility that when a superior brand is introduced, some of the old brands will be left with only a few customers. Sample surveys are used to gauge consumer behavior. As we are interested in the number of people who buy a certain brand of coffee, that number can be represented as a stochastic process.
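The brand-switching idea above can be sketched as a transition matrix: each entry gives the probability that a customer of one brand buys another brand on the next purchase. The brands, probabilities, and initial shares below are purely illustrative assumptions, not survey data.

```python
# Hypothetical brand-switching sketch: three brands A, B, C.
# Row i gives the probability a brand-i customer buys brand j next;
# every row sums to 1 (it is a probability distribution).
P = [
    [0.7, 0.2, 0.1],  # current brand A customers
    [0.3, 0.5, 0.2],  # current brand B customers
    [0.2, 0.3, 0.5],  # current brand C customers
]

def next_share(share, P):
    """Propagate the market shares one purchase cycle forward."""
    n = len(P)
    return [sum(share[i] * P[i][j] for i in range(n)) for j in range(n)]

share = [0.5, 0.3, 0.2]   # assumed initial market shares
for _ in range(20):       # iterate toward the long-run shares
    share = next_share(share, P)
print([round(s, 3) for s in share])
```

After enough iterations the shares stop changing, which is the long-run market split the manufacturer would care about.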

2. Queuing Problem. A bus taking students back and forth between the dormitory complex and the student union arrives at the student union several times during the day. The university administrators would like to know, for each time period during the day, how many students are left behind when the bus departs. The number waiting at the bus stop is therefore a stochastic process dependent on the arrival process.

3. Recovery, Relapse and Death Due to a Disease.

The process of recovery, relapse and death in the case of some major diseases such as cancer is governed by several random causes, and therefore stochastic process models have been found useful in the study of hospital data related to such cases. For instance, four different states of the patients can be identified:
1. the initial state of being under treatment;
2. the state of being dead immediately following treatment;
3. the state of recovery;
4. the state of being lost after recovery (not being able to trace the patient).

From a model like this, problems related to the effectiveness of treatment can

be studied.

4.Population Growth Problems.


It is more realistic to consider the growth of a population as a stochastic rather than a deterministic process. The external factors that may influence the growth of an animal population, such as weather conditions, disease, and availability of food, are too varied and uncertain for deterministic models. When these factors are identified and accounted for, the population size at any time can be considered a stochastic process. In problems of this nature we are interested not only in the behavior of the process but also in using such information to control the growth or decline of the population. (Lack of such action could result in the species becoming extinct.)
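A minimal sketch of stochastic population growth is a birth-death simulation: each individual independently gives birth or dies with some probability per time step, so the population size follows a random path rather than a fixed curve. The rates below are illustrative assumptions.

```python
import random

# Birth-death sketch (rates b and d are illustrative assumptions).
# Each individual gives birth with probability b and dies with
# probability d in each step, so population size is a stochastic
# process: two runs with different seeds give different paths.
def simulate(pop, b=0.10, d=0.08, steps=50, seed=42):
    rng = random.Random(seed)
    sizes = [pop]
    for _ in range(steps):
        births = sum(rng.random() < b for _ in range(pop))
        deaths = sum(rng.random() < d for _ in range(pop))
        pop = max(pop + births - deaths, 0)   # extinction is absorbing
        sizes.append(pop)
    return sizes

trajectory = simulate(100)
print(trajectory[0], "->", trajectory[-1])
```

Because b > d here the population tends to grow on average, but any single trajectory can still dip, which is exactly the uncertainty a deterministic model hides.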

INTRODUCTION TO MARKOV CHAIN. DEFINITION OF MARKOV CHAIN. PROPERTIES OF MARKOV CHAIN. THE INVENTOR OF MARKOV CHAIN. THE APPLICATION OF MARKOV CHAIN.

A stochastic process, defined via a separate argument, may be shown (mathematically) to have the Markov property, and as a consequence to have the properties that can be deduced from this for all Markov processes.

Of more practical importance is the use of the assumption that the Markov property holds for a certain random process in order to construct a stochastic model for that process. In modeling terms, assuming that the Markov property holds is one of a limited number of simple ways of introducing statistical dependence into a model for a stochastic process, in such a way that the strength of dependence at different lags declines as the lag increases.

Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state-space.

A Markov chain is actually a sequence of repeated trials of an experiment in which the outcome of the experiment at any step in the sequence depends at most on the outcome of the preceding step and not on any earlier outcome.

In general, a Markov chain is a sequence of n experiments in which each experiment has m possible outcomes E1, E2, ..., Em, and the probability that a particular outcome occurs depends only on the preceding experiment. In other words, put simply, the next state of the system depends only on the present state and not on the preceding states.

A Markov process is a stochastic process with the property that the probability of a transition from a given state to a future state depends only on the present state and not on the manner in which the current state was reached. This is also called the Markovian property. A Markov process meets the following conditions:
1. The system can be described by a finite set of states, and the system can be in one and only one state at a given time.
2. The transition probability, the probability of transition from state i to state j, is given for every possible combination of i and j (including i = j), and the transition probabilities are assumed to be stationary (unchanging) over the time period of interest and independent of how state i was reached.
3. The initial state of the system, or the probability distribution of the initial state, is known.
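The three conditions above can be made concrete with a small sketch. The two-state system and its transition probabilities below are purely illustrative assumptions.

```python
import random

# Hypothetical two-state Markov process (states 0 and 1).
# The transition probabilities p_ij are stationary (condition 2).
P = {0: {0: 0.9, 1: 0.1},
     1: {0: 0.5, 1: 0.5}}

# Condition 2 check: every row is a probability distribution.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in P.values())

def step(state, rng):
    """Sample the next state: it depends only on the current state
    (the Markovian property), not on how we got here."""
    u = rng.random()
    for j, p in P[state].items():
        u -= p
        if u < 0:
            return j
    return j  # guard against floating-point rounding

rng = random.Random(0)
state = 0                 # known initial state (condition 3)
path = [state]
for _ in range(10):
    state = step(state, rng)
    path.append(state)
print(path)
```

Note that `step` receives only the current state; the rest of `path` is never consulted, which is the Markov property expressed in code.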

Andrei Andreevich Markov. Born on June 14, 1856 in Ryazan, Russia. Died on July 20, 1922 at the age of 66. Residence: Russia.

A Russian mathematician best known for his work on developing the modern theory of stochastic processes, later known as Markov chains.

Institutions: he began his studies at the physico-mathematical faculty of St Petersburg University. He was also a member of the Soviet Academy of Sciences.

He had been studying dependent trials since 1907, and in 1913 he applied his findings and published his statistical work for the first time, applying it to the first 20,000 letters of Pushkin's Eugene Onegin.

1. Board game played with dice.


A game of snakes and ladders, or any other game whose moves are determined entirely by dice, is a Markov chain. This is in contrast to card games such as blackjack, where the cards represent a 'memory' of past moves. To see the difference, consider the probability of a certain event in the game. In the above-mentioned dice games, the only thing that matters is the current state of the board. The next state of the board depends on the current state and the next roll of the dice; it doesn't depend on how things got to their current state. In a game such as blackjack, a player can gain an advantage by remembering which cards have already been shown (and hence which cards are no longer in the deck), so the next state (or hand) of the game is not independent of the past states.
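The memorylessness of dice-driven board games can be sketched in a few lines. The 20-square board and the snake and ladder positions below are invented for illustration, not taken from any real game.

```python
import random

# Toy snakes-and-ladders sketch on a hypothetical 20-square board.
# The next square depends only on the current square and the die
# roll, so the game is a Markov chain: no history is kept anywhere.
JUMPS = {3: 11, 6: 17, 9: 2, 18: 5}   # ladders (up) and snakes (down)
GOAL = 20

def play(rng):
    square, turns = 0, 0
    while square < GOAL:
        square = min(square + rng.randint(1, 6), GOAL)
        square = JUMPS.get(square, square)   # memoryless jump
        turns += 1
    return turns

rng = random.Random(1)
games = [play(rng) for _ in range(1000)]
print(sum(games) / len(games))   # average number of turns to finish
```

Notice the loop carries only `square`: the entire "state of the game" is the current position, exactly as the text describes.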

2. Baseball. Markov chain models have been used in advanced baseball analysis since 1960, although their use is still rare. Each half-inning of a baseball game fits a Markov chain model when the number of runners and outs are considered: during any at-bat, there are 24 possible combinations of the number of outs and the positions of the runners.

3. Predicting the weather. Weather states, each with initial probability 1/3: Sunny = 1/3, Cloudy = 1/3, Raining = 1/3.

4. Predicting an individual's emotional state. Emotional states, each with initial probability 1/4: Sad = 1/4, Happy = 1/4, Angry = 1/4, Comprehensive = 1/4.
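The weather example can be turned into a one-day-ahead forecast by combining the uniform 1/3 starting distribution above with a transition matrix. The transition probabilities below are illustrative assumptions, not meteorological data.

```python
# Weather Markov chain sketch: the equal 1/3 weights are used as
# today's distribution; the transition probabilities are invented.
STATES = ["Sunny", "Cloudy", "Raining"]
P = {
    "Sunny":   {"Sunny": 0.6, "Cloudy": 0.3, "Raining": 0.1},
    "Cloudy":  {"Sunny": 0.3, "Cloudy": 0.4, "Raining": 0.3},
    "Raining": {"Sunny": 0.2, "Cloudy": 0.4, "Raining": 0.4},
}

def forecast(dist, P):
    """One-day-ahead distribution from today's distribution."""
    return {j: sum(dist[i] * P[i][j] for i in dist) for j in STATES}

today = {s: 1 / 3 for s in STATES}   # the uniform start shown above
tomorrow = forecast(today, P)
print({s: round(p, 3) for s, p in tomorrow.items()})
```

The same pattern applies unchanged to the emotional-state example: only the state list and the transition table differ.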

Queuing theory is the mathematical study of waiting lines,

or queues.
The theory enables mathematical analysis of several related

processes, including arriving at the queue, waiting in the queue, and being served at the front of the queue.
Queuing theory has applications in diverse fields including telecommunications, traffic engineering, computing, and the design of factories, shops, offices, and hospitals.

Agner Krarup Erlang,

a Danish engineer who worked for the Copenhagen Telephone Exchange, published the first paper on queuing theory in 1909.

David G. Kendall introduced the A/B/C queuing notation in 1953. Important work on queuing theory used in modern packet switching networks was performed in the early 1960s by Leonard Kleinrock.

The word queue comes, via French, from the Latin cauda, meaning tail. Queuing theory is considered a branch of operations research because its results are often used when making business decisions about providing services.

There are 4 main components in Queuing Theory:

1. Input process. Describes the number of customers arriving during a specific time period, or the period between two consecutive arrivals. Represented by the distribution of a random variable; for some simple models it will be a Poisson process.

2. Service mechanism. Describes the number of servers, the number of customers being served at any time, the duration of service, etc. The duration of service is represented by a random variable (often assumed to be exponentially distributed).

3. Queue discipline. The method by which customers are selected from the queue for processing. An example of a discipline is First Come, First Served (FCFS).

4. Number of queues. Can be one or more. The amount of time a customer waits in the queue is called the queuing time. The number of customers who arrive from the calling population and join the queue in a given period of time is modeled by a statistical distribution.
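The four components above can be combined into a minimal single-server sketch: Poisson arrivals, exponentially distributed service, FCFS discipline, and one queue (the classic M/M/1 setup). The rates below are illustrative assumptions.

```python
import random

# Minimal M/M/1 sketch: Poisson arrivals at rate lam (input process),
# exponential service at rate mu (service mechanism), FCFS discipline,
# and a single queue. Rates are illustrative assumptions.
def mm1_waits(lam=0.8, mu=1.0, n=50_000, seed=7):
    rng = random.Random(seed)
    t = 0.0          # arrival clock
    free_at = 0.0    # time the single server next becomes free
    waits = []
    for _ in range(n):
        t += rng.expovariate(lam)              # next Poisson arrival
        start = max(t, free_at)                # FCFS: wait if busy
        waits.append(start - t)                # queuing time
        free_at = start + rng.expovariate(mu)  # exponential service
    return waits

waits = mm1_waits()
print(sum(waits) / len(waits))   # theory predicts lam/(mu*(mu-lam)) = 4
```

The simulated average queuing time can be checked against the known M/M/1 formula, which is the kind of mathematical analysis the theory makes possible.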

Applications are frequently encountered in customer service situations as well as in transport and telecommunications.


Queuing theory is directly applicable to intelligent transportation systems, call centres, PABXs, networks, telecommunications, server queuing, mainframe computer queuing of telecommunications terminals, advanced telecommunications systems, and traffic flow.

Queues are employed extensively in many aspects of computer science. CPU task scheduling, network load balancing, and printer job scheduling have all been improved and effectively realized by the use of queuing theory. The main benefit of studying computer science through queuing theory is that real-world results and algorithms can be mathematically analyzed. With advances in computer hardware, more efficient queuing algorithms can make an even larger mark in computer science. By improving the speed and applicability of queue processes, computer scientists can maximize the potential of these new systems.

1947: a simple game invented, played by adjusting several knobs. 1980s: rise of computer gaming consoles by Nintendo.

1990s onward: release of computer games such as The Sims and World of Warcraft.

DEFINITION OF SIMULATION

Simulation is the imitation of some real thing, state of affairs, or process. The act of simulating something generally entails representing certain key characteristics or behaviours of a selected physical or abstract system.

The process of imitating a real phenomenon with a set of mathematical formulas.

The use of a mathematical model to recreate a situation, often repeated so that the likelihood of various outcomes can be more accurately estimated.
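"Often repeated so that the likelihood of various outcomes can be more accurately estimated" is the Monte Carlo idea, and it can be sketched in a few lines. The dice question below is an invented example.

```python
import random

# Monte Carlo sketch: estimate the probability of rolling a total
# of at least 10 with two dice by repeating the simulation many
# times and counting the fraction of successes.
def estimate(trials=100_000, seed=3):
    rng = random.Random(seed)
    hits = sum(rng.randint(1, 6) + rng.randint(1, 6) >= 10
               for _ in range(trials))
    return hits / trials

p = estimate()
print(p)   # exact value is 6/36, about 0.1667
```

More trials shrink the estimation error, which is what "repeated so that the likelihood can be more accurately estimated" means in practice.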

Flight simulator: used to train pilots on the ground.

Games: Sega, Nintendo.

Finance: stock market simulations help to lower the risk of investment.

Education: classroom simulations occur when an instructor explores instructional strategies that can lead to increased learning.
