In this post, I will give an introduction to Markov models and Hidden Markov models as mathematical abstractions, with some examples.


In probability theory, a Markov model is a stochastic model that assumes the Markov property. A stochastic model models a process where the state depends on previous states in a non-deterministic way. A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only on the present state, not on the sequence of events that preceded it.
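In symbols, writing X1, X2, ... for the sequence of states, the Markov property says that conditioning on the whole history is the same as conditioning on the most recent state alone:

```latex
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_1 = x_1)
  = P(X_{n+1} = x \mid X_n = x_n)
```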

 


Table 1: The most common Markov models

                       | System state is fully observable | System state is partially observable
System is autonomous   | Markov chain                     | Hidden Markov model
System is controlled   | Markov decision process          | Partially observable Markov decision process

 

Markov chain

A Markov chain, named after Andrey Markov, is a mathematical system that represents transitions from one state to another on a state space. The state is directly visible to the observer. It is a random process usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property.

 

Let's talk about the weather. We have three types of weather: sunny, rainy and cloudy.

Let's assume for the moment that the weather lasts all day, i.e. it does not change from rainy to sunny in the middle of the day.

Weather prediction is trying to guess what the weather will be like tomorrow based on a history of observations of the weather.

In a simplified model of weather prediction, we will collect statistics on what the weather was like today based on what the weather was like yesterday, the day before that, and so forth. We want to collect the following probabilities:

 

P(wn | wn-1, wn-2, ..., w1)

Using the above expression, we can give probabilities of types of weather for tomorrow and the next day using n days of history.

image

 

The larger n is, the more of a problem we have: the more statistics we must collect. Suppose that n = 5; then we must collect statistics for 3^5 = 243 past histories. Therefore, we will make a simplifying assumption called the "Markov Assumption".

P(wn | wn-1, wn-2, ..., w1) ≈ P(wn | wn-1)

This is called a "first-order Markov assumption", since we say that the probability of an observation at time n depends only on the observation at time n-1. A second-order Markov assumption would have the observation at time n depend on the observations at times n-1 and n-2. We can then express the joint probability using the "Markov assumption".

P(w1, w2, ..., wn) = P(w1 | w0) * P(w2 | w1) * ... * P(wn | wn-1)

This now has a profound effect on the number of histories that we have to find statistics for: we now only need 3^2 = 9 numbers to characterize the probabilities of all of the sequences. (This assumption may or may not be valid, depending on the situation.)
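To make the factorization concrete, here is a minimal Python sketch (the names and the two-state numbers are mine, chosen purely for illustration) that scores a sequence with a first-order model:

```python
def markov_sequence_probability(sequence, start, transition):
    """P(w1, ..., wn) = P(w1) * P(w2 | w1) * ... * P(wn | wn-1) -- the first-order factorization."""
    prob = start[sequence[0]]
    for previous, current in zip(sequence, sequence[1:]):
        prob *= transition[previous][current]
    return prob

# Toy two-state example with made-up numbers, just to show the shape of the table:
start = {"sunny": 0.5, "rainy": 0.5}            # P(w1)
transition = {                                   # P(w_today | w_yesterday); each row sums to 1
    "sunny": {"sunny": 0.7, "rainy": 0.3},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}
print(markov_sequence_probability(["sunny", "sunny", "rainy"], start, transition))
# 0.5 * 0.7 * 0.3 ≈ 0.105
```

With two states the table has only 2 x 2 entries; with the three weather types it has 3 x 3 = 9, which is exactly the saving described above.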

Let's arbitrarily pick some numbers for P(wtomorrow | wtoday):

 

image

Table 2: Probabilities of Tomorrow's Weather Based on Today's Weather

 

You might ask, "What is w0?" In general, one can think of w0 as a START state, so P(w1 | w0) is the probability that w1 starts the sequence.

For first-order Markov models we can use these probabilities to draw a probabilistic finite state automaton.

image

For example:

1. Today is sunny. What is the probability that tomorrow is sunny and the day after is rainy?

First, this translates into P(w2=sunny, w3=rainy | w1=sunny). Then:

P(w2=sunny, w3=rainy | w1=sunny)
  = P(w2=sunny | w1=sunny) * P(w3=rainy | w2=sunny, w1=sunny)
  = P(w2=sunny | w1=sunny) * P(w3=rainy | w2=sunny)
  = 0.8 * 0.05
  = 0.04
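The same calculation as a quick Python check, using only the two transition values quoted above (the rest of Table 2 is in the image and is not needed for this query):

```python
# Only two entries of Table 2 are quoted in the text; the others are not needed here.
p_sunny_given_sunny = 0.8   # P(w2 = sunny | w1 = sunny)
p_rainy_given_sunny = 0.05  # P(w3 = rainy | w2 = sunny)

# First-order Markov assumption:
# P(w3 = rainy | w2 = sunny, w1 = sunny) = P(w3 = rainy | w2 = sunny)
p = p_sunny_given_sunny * p_rainy_given_sunny
print(p)  # ≈ 0.04 (up to floating-point rounding)
```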

 

Hidden Markov model

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states. An HMM can be considered the simplest dynamic Bayesian network. Hidden Markov models are especially known for their applications in temporal pattern recognition, such as speech, handwriting and gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics.

Example

Suppose you were locked in a room for several days, and you were asked about the weather outside. The only piece of evidence you have is whether the person who comes into the room with your daily meal is carrying an umbrella or not.

image

Table 3: Probabilities of Seeing an Umbrella

The equation for the weather Markov process before you were locked in the room was:

P(w1, w2, ..., wn) = P(w1 | w0) * P(w2 | w1) * ... * P(wn | wn-1)

 

Now we have to factor in the fact that the actual weather is hidden from you. We do that by using Bayes' Rule:

P(w1,...,wn | u1,...,un) = P(u1,...,un | w1,...,wn) * P(w1,...,wn) / P(u1,...,un)

where ui is true if your caretaker brought an umbrella on day i and false if the caretaker did not. The probability P(w1,...,wn) is the same as in the Markov model from the last section, and the probability P(u1,...,un) is the prior probability of seeing a particular sequence of umbrella events.

The probability P(w1,...,wn | u1,...,un) can then be estimated as:

P(w1,...,wn | u1,...,un) ≈ P(u1 | w1) * P(u2 | w2) * ... * P(un | wn) * P(w1,...,wn) / P(u1,...,un)

Here we assume that, for each i, given wi, the observation ui is independent of all other uj and wj (j ≠ i); this is what lets the likelihood factor into the product of the P(ui | wi) terms above.
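Putting the pieces together, here is a rough Python sketch that computes this posterior by brute force: it scores every candidate weather sequence with the product of the P(ui | wi) and P(wi | wi-1) terms and then normalizes, the normalizer being P(u1,...,un). The sunny row of the transition table reuses the 0.8 and 0.05 quoted in the Markov chain example (with 0.15 filling the remainder); every other number, including the umbrella probabilities, is a made-up stand-in for the values in Tables 2 and 3, and the function names are mine. A real implementation would use the forward algorithm instead of enumerating every sequence.

```python
from itertools import product

STATES = ["sunny", "rainy", "cloudy"]

# Stand-in probabilities (not the post's table values, except the quoted 0.8 and 0.05).
START = {"sunny": 1 / 3, "rainy": 1 / 3, "cloudy": 1 / 3}   # assumed uniform P(w1)
TRANSITION = {                                              # P(w_i | w_{i-1})
    "sunny":  {"sunny": 0.8, "rainy": 0.05, "cloudy": 0.15},
    "rainy":  {"sunny": 0.2, "rainy": 0.6,  "cloudy": 0.2},
    "cloudy": {"sunny": 0.2, "rainy": 0.3,  "cloudy": 0.5},
}
P_UMBRELLA = {"sunny": 0.1, "rainy": 0.8, "cloudy": 0.3}    # assumed P(u_i = True | w_i)

def unnormalised_score(weather, umbrellas):
    """P(w1) * prod P(wi | wi-1) * prod P(ui | wi) -- the numerator of Bayes' rule."""
    p = START[weather[0]]
    for previous, current in zip(weather, weather[1:]):
        p *= TRANSITION[previous][current]
    for w, u in zip(weather, umbrellas):
        p *= P_UMBRELLA[w] if u else 1.0 - P_UMBRELLA[w]
    return p

def posterior(umbrellas):
    """Brute-force P(w1,...,wn | u1,...,un) over every possible weather sequence."""
    scores = {seq: unnormalised_score(seq, umbrellas)
              for seq in product(STATES, repeat=len(umbrellas))}
    evidence = sum(scores.values())                         # P(u1,...,un)
    return {seq: s / evidence for seq, s in scores.items()}

# Example: umbrella on days 1 and 2, no umbrella on day 3.
post = posterior([True, True, False])
best = max(post, key=post.get)
print(best, round(post[best], 3))
```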

The next post will explain the "Markov decision process" and the "Partially observable Markov decision process".
