17. Markov Chains II
- uploaded 15. 08. 2024
- MIT 6.041 Probabilistic Systems Analysis and Applied Probability, Fall 2010
View the complete course: ocw.mit.edu/6-0...
Instructor: John Tsitsiklis
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu
"The memory of where you started washes out." Restatements of mathematical equations like this make these videos a gem. Thank you, MIT!
MIT is really the hope of humanity!
Spread education all over the world!
Intuitive ideas are actually helpful for understanding the concept. How simply he has explained it!
Thank you MIT opencourseware, it helps a lot!
Thanks for your good work, sir; this is a technique every civilian should keep in their tool bag.
Outstanding lecture from Prof. Tsitsiklis
Thanks, Sir! I'm so glad I've had the chance to listen to your lecture; it's so clear, and it helped me advance in a short time.
One awesome thing we can do is skip certain parts if we already know / don't care about them. I wish I could do this in real life with real professors (lol)
Great lecture. One of my favorite professors.
Excellent courses by Professor Tsitsiklis. Thank you very much, Mr. Tsitsiklis!
46:47 "It's a very popular model for modeling people who are drunk" lol
Very helpful video thank you MIT!!!
Absolutely beautiful stuff ...
Excellent teaching! It makes complex things easier and breaks the points down, better than other online classes!
Brilliant lecturer! Thanks a mill
Very clear!
The most intuitive lecture for Markov Chain that I've ever seen.
My exam is the day after tomorrow. Thanks MIT
I really like his lectures!
Intro => Most of the lecture, we're going to concentrate on their steady-state behavior.
meaning, we're going to look at what does a Markov chain do, if it has run for a long time.
What can we say about the probabilities of the different states?
So what I would like to repeat is a statement I made last time: Markov chains are a very, very useful class of models. Pretty much anything in the real world can be approximately modeled by a Markov chain, provided that you set up your states in the proper way.
Respect to MIT.
48:50 1/(1+rho) ~ 1-rho
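The approximation in the comment above is just the first-order Taylor expansion 1/(1 + rho) = 1 - rho + rho^2 - ..., so it is accurate when rho is small. A quick numerical sketch (the rho values are illustrative, not from the lecture):

```python
# Check the first-order approximation 1/(1 + rho) ≈ 1 - rho.
# The error is rho^2 / (1 + rho), i.e. O(rho^2) for small rho.
for rho in (0.01, 0.05, 0.1):
    exact = 1 / (1 + rho)
    approx = 1 - rho
    print(rho, exact, approx, abs(exact - approx))
```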
R K What?
Thanks for uploading!
very helpful videos...
Thanks much, this was really helpful
wonderful
@24:38 When the professor mentions that the system is singular, why does he say it is because (0, 0, ..., 0) is a solution of the system? Isn't it a non-trivial nullspace of the matrix A that makes the system Ax = 0 singular?
Because of the two assumptions above:
1.) we have a single recurrent class, or none,
2.) no periods in the recurrent class.
This ensures (as n -> infinity):
Case 1: if there are 0 recurrent classes, staying in the same state forever is not possible.
Case 2: if there is 1 recurrent class with no periods, the chain is eventually pulled out of the transient states into it.
So in both cases the pi of a transient state is 0 (the chain does not stay in the same state after infinitely many transitions), and plugging this into the recurrence, pi_0 through pi_n can all be zero.
Hence (0, 0, ..., 0) is a solution of the system, which is why the matrix is singular.
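A quick numerical sketch of this point: the balance equations are homogeneous, so the all-zeros vector always solves them and the coefficient matrix is singular; the usual fix is to replace one redundant equation with the normalization constraint. The 3-state transition matrix below is a made-up example, not one from the lecture.

```python
import numpy as np

# A made-up 3-state transition matrix (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# Balance equations: pi P = pi, i.e. (P^T - I) pi = 0 for a column vector pi.
A = P.T - np.eye(3)

# A is singular: pi = (0, 0, 0) always satisfies A pi = 0, and the true
# steady-state vector spans a non-trivial nullspace.
print(np.linalg.det(A))  # essentially 0

# Replace one (redundant) balance equation with sum(pi) = 1
# to pin down the unique steady-state distribution.
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)  # a probability vector satisfying pi P = pi
```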
Can someone explain the relevance of the last plot to the Markov process? I see it as the number of people in the queue as time tends to infinity, but how can we relate that kind of definition to a Markov process, where we just need the probability of each state?
How did he calculate E[X] in the birth-death process? E[X] = rho/(1-rho)?
Prakash Besra: by using the expectation of a geometric random variable, minus 1.
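To spell out the reply: when the birth-death chain has constant rates with rho = p/q < 1, the steady-state probabilities are pi_i = (1 - rho) * rho^i for i = 0, 1, 2, ..., which is the PMF of a Geometric(1 - rho) variable shifted to start at 0. Its mean is therefore 1/(1 - rho) - 1 = rho/(1 - rho). A quick numerical check (rho = 0.8 is an arbitrary example value):

```python
# Mean of the shifted geometric steady-state distribution
# pi_i = (1 - rho) * rho**i, truncated far enough out that the
# remaining tail is negligible.
rho = 0.8
mean = sum(i * (1 - rho) * rho ** i for i in range(10_000))
print(mean)              # close to rho / (1 - rho)
print(rho / (1 - rho))
```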
39:58 I don't follow this logic. Why does the number of upward transitions need to match the number of downward transitions? Won't it depend on the values of p_i and q_(i+1)?
This time-stamped section clarified my confusion: czcams.com/video/XKYpKYspe1w/video.html
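The intuition behind the matching counts: crossings of the "cut" between states i and i+1 must alternate, because after going up across the cut the chain has to come back down before it can go up again. So over any sample path the two counts differ by at most 1, and their long-run frequencies are equal regardless of p_i and q_(i+1). A small simulation sketch (the state space, p, and q are made-up illustrative values):

```python
import random

random.seed(0)

# Birth-death chain on {0, ..., 9}: up with prob p, down with prob q,
# stay put otherwise (clipped at the boundaries).
p, q = 0.3, 0.4
state = 0
up_crossings = down_crossings = 0   # crossings of the cut between 4 and 5

for _ in range(100_000):
    u = random.random()
    if u < p and state < 9:
        if state == 4:
            up_crossings += 1
        state += 1
    elif u < p + q and state > 0:
        if state == 5:
            down_crossings += 1
        state -= 1

# The two counts differ by at most 1, whatever p and q are.
print(up_crossings, down_crossings)
```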
NO account of twins in Birth-Death