Conditional probability

The probability of observations from an HMM obviously depends on which model generated them: to express this, we need to know about conditional probability.

I'm going to look at the idea of conditional probability. And that is that knowing one thing is informative about some other thing.
There's some conditional dependency between correlated things, and specifically, in this example,
knowing which HMM we're using does change the probability of the observation sequence.
That's kind of the point.
Okay, that then gets developed into a very simple probability rule called Bayes' rule, which is critical.
We're going to use that to do all of the work of speech recognition, so we'll name the terms in that rule.
And then we're going to look at how we do recognition with HMMs: which terms in this Bayes' rule equation the HMM computes. And then we realise there is another term that the HMMs don't compute, and we go to another model to do that; that's going to be called the language model.
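Written out, that is just the standard statement of Bayes' rule for a word w and an observation sequence O (the labels in the comments are mine; the division of labour between the HMM and the language model is as described above):

```latex
P(W = w \mid O) \;=\; \frac{P(O \mid W = w)\, P(W = w)}{P(O)}
% P(O | W = w) : computed by the HMM for word w
% P(W = w)     : computed by the language model
```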
So: conditional independence.
If things are independent, everything's great: just compute everything separately and take the product.
Multiply them together.
That's all fine.
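In standard notation (a reminder, not something spelled out in the video), independence means the joint probability factorises into a plain product, while dependence brings in a conditional term:

```latex
P(A, B) = P(A)\, P(B)           % if A and B are independent
P(A, B) = P(A \mid B)\, P(B)    % in general: the conditional case
```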
Okay, what if things are not independent? So let's think about two things depending on each other: the heights of the people in this room. We could draw a distribution of that.
Or we could say that somebody is waiting outside, and we're going to guess their height before they come in: play a guessing game.
How accurately are we going to guess their height? Well, if we don't know anything about them, we're just going to guess the mean of all the people in this room. Here's our training data.
Maybe the mean height is 1.7 metres.
So whoever comes into the room, guess 1.7 metres.
That's the guess with the minimum error: that guess, on average, will be closest.
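(One step the video glosses over: "minimum error" here can be read as minimum expected squared error, and the mean is precisely the constant guess that minimises it.)

```latex
\operatorname*{arg\,min}_{c} \, \mathbb{E}\!\left[(H - c)^2\right] = \mathbb{E}[H]
% H is the height of a randomly chosen person
```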
And now let's play another game.
I'm going to tell you something about the person who is about to come into the room.
I'm going to say it's a man.
My guess is now going to change: we're going to guess 1.8 metres.
If it's a woman, we're going to guess maybe 1.6 metres.
So knowing something about them is informative about the other thing.
These things are not conditionally independent.
Knowing somebody's gender, male or female, will help you guess their height on average. Not always, for individual cases:
we're going to find short men and tall women.
But on average, we're going to do better if we know one thing.
It's going to help us guess the other thing.
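Here is a minimal sketch of that guessing game in Python. The numbers are invented for illustration (the video only mentions rough means of 1.7, 1.8 and 1.6 metres; the spreads are assumptions); it just shows that guessing each group's conditional mean gives a lower average squared error than guessing one overall mean.

```python
import numpy as np

# Hypothetical training data: heights in metres, made up for illustration.
rng = np.random.default_rng(0)
men = rng.normal(1.80, 0.07, 500)
women = rng.normal(1.60, 0.06, 500)
heights = np.concatenate([men, women])

# Best single guess, knowing nothing: the overall mean (about 1.7 m).
overall = heights.mean()
mse_uncond = ((heights - overall) ** 2).mean()

# Knowing the group, guess that group's conditional mean instead.
errors = np.concatenate([men - men.mean(), women - women.mean()])
mse_cond = (errors ** 2).mean()

print(f"MSE guessing the overall mean:      {mse_uncond:.4f}")
print(f"MSE guessing the conditional mean:  {mse_cond:.4f}")
```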
So we've got a notation for this; we've just seen it.
And the same thing applies to hidden Markov models.
It turns out, then, that our hidden Markov models compute the probability of an observation sequence given the identity of the model.
Given which model it is: is it the model of word one, or word two, or word three?
We're going to use a notation for that: W. Big capital W means it's the random variable for "word".
It could take the value one, or two, or three. So we could compute things like the probability that the observation sequence equals the particular one that we've been given, the one we're trying to recognise,
given that the word equals two. So we can compute things like that with hidden Markov models.
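As a concrete sketch of that last point: below, each toy word model is a small discrete-output HMM, and the forward algorithm computes log P(O | W = w) for each one. The models and the observation sequence are invented for illustration; real speech recognisers use output distributions over continuous acoustic features, not two discrete symbols.

```python
import numpy as np

def forward_log_prob(obs, init, trans, emit):
    """Log of P(O | model) for a discrete-output HMM, via the forward
    algorithm with per-step rescaling to avoid numerical underflow."""
    alpha = init * emit[:, obs[0]]             # forward probabilities at t = 0
    log_prob = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # propagate, then emit
        c = alpha.sum()                        # P(o_t | o_1 .. o_{t-1})
        log_prob += np.log(c)
        alpha /= c
    return log_prob

# Two hypothetical word models, W = 1 and W = 2: two states each,
# emitting one of two discrete symbols. All numbers are made up.
models = {
    1: (np.array([1.0, 0.0]),                  # initial state probabilities
        np.array([[0.7, 0.3], [0.0, 1.0]]),    # transition probabilities
        np.array([[0.9, 0.1], [0.2, 0.8]])),   # emission probabilities
    2: (np.array([1.0, 0.0]),
        np.array([[0.4, 0.6], [0.0, 1.0]]),
        np.array([[0.3, 0.7], [0.6, 0.4]])),
}

O = [0, 0, 1, 1]  # the observation sequence we are given

# The same O gets a different probability from each model: that is the
# conditional dependence the video is talking about.
for w, (init, trans, emit) in models.items():
    print(f"log P(O | W = {w}) = {forward_log_prob(O, init, trans, emit):.3f}")
```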
