In this post, I will give an introduction to Markov models and hidden Markov models as mathematical abstractions, with some examples.


In probability theory, a Markov model is a stochastic model that assumes the Markov property. A stochastic model describes a process whose state depends on previous states in a non-deterministic way. A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only on the present state, not on the sequence of events that preceded it.

 


|                      | System state is fully observable | System state is partially observable         |
| System is autonomous | Markov chain                     | Hidden Markov model                          |
| System is controlled | Markov decision process          | Partially observable Markov decision process |

Table 1: Types of Markov models

 

Markov chain

A Markov chain, named after Andrey Markov, is a mathematical system that represents transitions from one state to another on a state space. The state is directly visible to the observer. It is a random process usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property.

 

Let's talk about the weather. We have three types of weather: sunny, rainy, and cloudy.

Let's assume for the moment that the weather lasts all day, i.e., it does not change from rainy to sunny in the middle of the day.

Weather prediction is about trying to guess what the weather will be like tomorrow based on a history of weather observations.

In a simplified model of weather prediction, we will collect statistics on what the weather was like today based on what the weather was like yesterday, the day before, and so forth. We want to collect the following probabilities:

 

P(wn | wn-1, wn-2, ..., w1)
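As a quick illustration of what "collecting these probabilities" could look like, here is a minimal Python sketch that estimates such a conditional probability by counting patterns in an observed weather history; the history and the helper function are made up for this example.

```python
def estimate_conditional(history, today, previous):
    """Estimate P(w_n = today | the previous days) by counting how often
    the pattern `previous + [today]` occurs in the history, relative to
    how often the context `previous` occurs."""
    k = len(previous)
    context_count = 0
    pattern_count = 0
    for i in range(k, len(history)):
        if history[i - k:i] == previous:
            context_count += 1
            if history[i] == today:
                pattern_count += 1
    return pattern_count / context_count if context_count else 0.0

# A made-up sequence of daily weather observations.
history = ["sunny", "sunny", "rainy", "cloudy", "sunny", "sunny",
           "cloudy", "rainy", "rainy", "sunny", "sunny", "sunny"]

# Rough estimate of P(w_n = sunny | w_{n-1} = sunny) from this history.
print(estimate_conditional(history, "sunny", ["sunny"]))
```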

Using the above expression, we can give the probabilities of each type of weather for tomorrow and the day after using n days of history. For example, if the weather for the past three days was {sunny, sunny, cloudy} in chronological order, the probability that tomorrow would be rainy is:

P(w4 = rainy | w3 = cloudy, w2 = sunny, w1 = sunny)

 

Here is the problem: the larger n is, the more statistics we must collect. Suppose that n = 5; then we must collect statistics for 3^5 = 243 past histories. Therefore, we will make a simplifying assumption, called the "Markov assumption":

P(wn | wn-1, wn-2, ..., w1) ≈ P(wn | wn-1)

This is called a "first-order Markov assumption", since we say that the probability of an observation at time n depends only on the observation at time n-1. A second-order Markov assumption would have the observation at time n depend on the observations at times n-1 and n-2. We can then express the joint probability using the Markov assumption:

P(w1, w2, ..., wn) = P(w1 | w0) * P(w2 | w1) * P(w3 | w2) * ... * P(wn | wn-1)

This has a profound effect on the number of histories that we have to find statistics for: we now only need 3^2 = 9 numbers to characterize the probabilities of all of the sequences. (This assumption may or may not be valid, depending on the situation.)
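To see the factorization in action, here is a small Python sketch. The initial distribution and the transition values are assumed for illustration; only the 0.8 (sunny to sunny) and 0.05 (sunny to rainy) entries are the ones used in the worked example later in this post.

```python
# P(w1, ..., wn) ~= P(w1) * P(w2 | w1) * ... * P(wn | w(n-1))
# The 0.8 and 0.05 entries match the worked example later in the post;
# every other number here is an assumed placeholder.
initial = {"sunny": 0.5, "rainy": 0.25, "cloudy": 0.25}   # assumed P(w1)
transition = {                                            # assumed P(w_i | w_{i-1})
    "sunny":  {"sunny": 0.80, "rainy": 0.05, "cloudy": 0.15},
    "rainy":  {"sunny": 0.20, "rainy": 0.60, "cloudy": 0.20},
    "cloudy": {"sunny": 0.20, "rainy": 0.30, "cloudy": 0.50},
}

def sequence_probability(sequence):
    """Joint probability of a weather sequence under the first-order
    Markov assumption."""
    prob = initial[sequence[0]]
    for prev, curr in zip(sequence, sequence[1:]):
        prob *= transition[prev][curr]
    return prob

print(sequence_probability(["sunny", "sunny", "rainy"]))  # 0.5 * 0.8 * 0.05 = 0.02
```

Here the initial table plays the role of P(w1 | w0), i.e. the START state discussed below.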

Arbitrarily pick some numbers for P(wtomorrow | wtoday):

 


Table 2: Probabilities of tomorrow's weather based on today's weather

 

"What is w0?" In general, one can think of w0 as the START state, so P(w1 | w0) is the probability that w1 can start the sequence.

For first-order Markov models we can use these probabilities to draw a probabilistic finite state automaton.

[figure: probabilistic finite state automaton for the three weather states, with transition probabilities from Table 2]
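Running such an automaton simply means sampling tomorrow's state from today's, over and over. A rough sketch, again with assumed transition probabilities (only the 0.8 and 0.05 entries in the sunny row are quoted in the worked example below):

```python
import random

# Assumed transition probabilities; only sunny->sunny (0.8) and
# sunny->rainy (0.05) are confirmed by the worked example below.
transition = {
    "sunny":  {"sunny": 0.80, "rainy": 0.05, "cloudy": 0.15},
    "rainy":  {"sunny": 0.20, "rainy": 0.60, "cloudy": 0.20},
    "cloudy": {"sunny": 0.20, "rainy": 0.30, "cloudy": 0.50},
}

def next_state(today):
    """Sample tomorrow's weather given today's, following the automaton."""
    states, probs = zip(*transition[today].items())
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, days):
    """Walk the probabilistic finite state automaton for `days` steps."""
    sequence = [start]
    for _ in range(days):
        sequence.append(next_state(sequence[-1]))
    return sequence

print(simulate("sunny", 7))
```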

E.g.:

1. Today is sunny. What's the probability that tomorrow is sunny and the day after is rainy?

First we translate this into:

P(w2 = sunny, w3 = rainy | w1 = sunny)
  = P(w2 = sunny | w1 = sunny) * P(w3 = rainy | w2 = sunny, w1 = sunny)
  = P(w2 = sunny | w1 = sunny) * P(w3 = rainy | w2 = sunny)
  = 0.8 * 0.05
  = 0.04
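The same arithmetic in Python, using only the two transition probabilities quoted above:

```python
p_sunny_given_sunny = 0.80  # P(w2 = sunny | w1 = sunny), from the example
p_rainy_given_sunny = 0.05  # P(w3 = rainy | w2 = sunny), from the example

# P(w2 = sunny, w3 = rainy | w1 = sunny) under the first-order Markov assumption
print(p_sunny_given_sunny * p_rainy_given_sunny)  # ~0.04
```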

 

Hidden Markov model

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states. An HMM can be considered the simplest dynamic Bayesian network. Hidden Markov models are especially known for their applications in temporal pattern recognition, such as speech, handwriting, and gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics.

Example

Suppose you were locked in a room for several days and you were asked about the weather outside. The only piece of evidence you have is whether the person who comes into the room with your daily meal is carrying an umbrella or not.


Table 3: Probabilities of Seeing an Umbrella

Before you were locked in the room, the weather Markov process was described by the equation:

P(w1, w2, ..., wn) = P(w1 | w0) * P(w2 | w1) * ... * P(wn | wn-1)

 

Now we have to factor in the fact that the actual weather is hidden from you. We do that by using Bayes' rule:

P(w1, ..., wn | u1, ..., un) = P(u1, ..., un | w1, ..., wn) * P(w1, ..., wn) / P(u1, ..., un)

Here ui is true if your caretaker brought an umbrella on day i and false if the caretaker did not. The probability P(w1, ..., wn) is the same as in the Markov model from the last section, and the probability P(u1, ..., un) is the prior probability of seeing a particular sequence of umbrella events.

The likelihood P(u1, ..., un | w1, ..., wn) can be estimated as:

P(u1, ..., un | w1, ..., wn) = P(u1 | w1) * P(u2 | w2) * ... * P(un | wn)

This assumes that, for all i, given wi, ui is independent of all uj and wj, for all j ≠ i.
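Putting the pieces together, here is a brute-force Python sketch that applies the Bayes rule above by enumerating every possible weather sequence. The umbrella probabilities and most of the transition values are assumed for illustration; a practical HMM implementation would use the forward or Viterbi algorithm rather than enumeration.

```python
from itertools import product

weather_types = ["sunny", "rainy", "cloudy"]

# Assumed numbers for illustration: only the 0.8 and 0.05 sunny-row
# transitions are quoted in the post; everything else here, including
# P(umbrella | weather), is made up.
initial = {"sunny": 1 / 3, "rainy": 1 / 3, "cloudy": 1 / 3}
transition = {
    "sunny":  {"sunny": 0.80, "rainy": 0.05, "cloudy": 0.15},
    "rainy":  {"sunny": 0.20, "rainy": 0.60, "cloudy": 0.20},
    "cloudy": {"sunny": 0.20, "rainy": 0.30, "cloudy": 0.50},
}
p_umbrella = {"sunny": 0.1, "rainy": 0.8, "cloudy": 0.3}  # assumed P(ui = true | wi)

def prior(weather_seq):
    """P(w1, ..., wn) under the first-order Markov assumption."""
    p = initial[weather_seq[0]]
    for prev, curr in zip(weather_seq, weather_seq[1:]):
        p *= transition[prev][curr]
    return p

def likelihood(umbrella_seq, weather_seq):
    """P(u1, ..., un | w1, ..., wn) = P(u1 | w1) * ... * P(un | wn)."""
    p = 1.0
    for u, w in zip(umbrella_seq, weather_seq):
        p *= p_umbrella[w] if u else 1.0 - p_umbrella[w]
    return p

def posterior(umbrella_seq):
    """P(w1, ..., wn | u1, ..., un) for every weather sequence, via Bayes rule."""
    joint = {w_seq: likelihood(umbrella_seq, w_seq) * prior(w_seq)
             for w_seq in product(weather_types, repeat=len(umbrella_seq))}
    evidence = sum(joint.values())  # P(u1, ..., un)
    return {w_seq: p / evidence for w_seq, p in joint.items()}

# Umbrella on day 1 and 2, no umbrella on day 3.
post = posterior([True, True, False])
best = max(post, key=post.get)
print(best, round(post[best], 3))
```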

The next post will explain the "Markov decision process" and the "Partially observable Markov decision process".
