Conditional independence of observations

The HMM assumes each observation is independent of all the others, given the state that emitted it.

So we've covered this very simple idea from probability: if two things are statistically independent, then knowing the value of one doesn't tell us anything about the value of the other one.
We can compute the joint probability of these two things happening by just multiplying their individual probabilities.
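In symbols (a standard statement of independence; the notation is ours, not from the video): for independent events $A$ and $B$,

$$P(A, B) = P(A)\,P(B).$$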
We talked about my socks and the weather for quite a while.
We decided the weather is not informative about my socks, because I get dressed in the dark: I just pull on whatever socks come to hand.
And of course, the weather doesn't depend on what socks I'm wearing; it depends on lots of other things.
That's a nice property of probability: for two independent events, we can compute the probability of both of them happening by just multiplying their individual probabilities.
We've already applied that to the hidden Markov model.
The hidden Markov model is going to make that assumption about consecutive observations: consecutive MFCC vectors.
The MFCC vector at this time is independent of the one at the previous time.
We're going to develop that idea a little bit more: actually, they're conditionally independent, given the model that's generating them.
And so the probability of a whole sequence of observations coming out of a model is just the probability of each of the individual observations multiplied together.
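Written out (in our notation, not the video's, assuming an observation sequence $O = (o_1, \ldots, o_T)$ and a model $M$):

$$P(O \mid M) = \prod_{t=1}^{T} p(o_t \mid M).$$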
That's beautifully simple math, and we can see that it's going to lead to really nice computational algorithms.
So it's going to make it easy to learn the model from data.
Suppose I give you some models and I tell you their parameters, so the means and variances (here, just one-dimensional Gaussians), and I give you an observation sequence.
Then we can work out the probability that this model generated the observation sequence, and the probability that that model generated the observation sequence, compare the two numbers, and announce which model was more likely to have generated it.
What we're actually computing there is a probability.
Let's give this some notation: the probability of the observation sequence O, given one of the models we're going to look at today.
The way that notation works is with this bar, followed by the model: the bar means the word "given".
Given the model, we can compute the observation sequence's probability, and that probability will depend on which of the models we choose to generate it.
So this observation sequence's probability depends on the model that we use to generate it.
One model will generate it with higher probability than the other.
Okay, so this bar, this notation, is just the word "given". And remember that this other notation, the comma notation, just means "and". It's all just notation that turns into nice English words that mean things.
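To make the model comparison concrete, here is a minimal sketch (not from the video; the models, parameters, and observation values are all hypothetical) that scores one observation sequence under two single-Gaussian models and reports which one was more likely to have generated it. It works in the log domain, so the product of per-observation probabilities becomes a sum:

```python
import math

def log_gaussian(x, mean, var):
    """Log density of a one-dimensional Gaussian at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def log_likelihood(observations, mean, var):
    """log P(O | M): under the independence assumption, the log probability
    of the whole sequence is the sum of per-observation log probabilities."""
    return sum(log_gaussian(x, mean, var) for x in observations)

# Hypothetical models (known means and variances) and observation sequence.
model_a = {"mean": 0.0, "var": 1.0}
model_b = {"mean": 2.0, "var": 1.5}
O = [0.3, -0.4, 0.1, 0.8, -0.2]

ll_a = log_likelihood(O, **model_a)
ll_b = log_likelihood(O, **model_b)
print(f"log P(O | A) = {ll_a:.3f}")
print(f"log P(O | B) = {ll_b:.3f}")
print("More likely to have generated O:", "model A" if ll_a > ll_b else "model B")
```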
