14. Poisson Process I

  • Added 8 Nov 2012
  • MIT 6.041 Probabilistic Systems Analysis and Applied Probability, Fall 2010
    View the complete course: ocw.mit.edu/6-041F10
    Instructor: John Tsitsiklis
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

Comments • 51

  • @matsiv5707
    @matsiv5707 4 months ago +4

    If you're wondering, a simple way to solve the limit is to take the log of the expression, compute the limit of that logarithm, and then exponentiate the result. You can use the fact that log(n!) grows like n log n. The rest of the computation is trivial.
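
    A sketch of that route (my rendering, with p = λτ/n from the lecture's slot construction and k held fixed as n → ∞):

    $$\log\left[\binom{n}{k}p^k(1-p)^{n-k}\right] = \underbrace{\log n! - \log (n-k)!}_{\approx\, k\log n} - \log k! + k\log\frac{\lambda\tau}{n} + (n-k)\log\left(1-\frac{\lambda\tau}{n}\right).$$

    The k log n piece cancels the -k log n hidden in k log(λτ/n), and (n - k) log(1 - λτ/n) → -λτ; exponentiating gives e^{-λτ}(λτ)^k / k!.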

  • @MayankGoel447
    @MayankGoel447 8 months ago +3

    This is a fantastic lecture on the Poisson process and where it comes from, probably the best.

  • @dania_884
    @dania_884 2 years ago +10

    He teaches excellently! He makes complicated things easy to understand; that's the magic!

  • @nadekang8198
    @nadekang8198 5 years ago +26

    Even Professor Tsitsiklis can get stuck in the middle of the proof of the Poisson PMF... but the proof is written out perfectly in his textbook, which is by far the best I've ever read at the intro level in probability and statistics. It took me a long time to fully understand the Poisson process... thanks to MIT OCW and Professor Tsitsiklis.

    • @elmrabti
      @elmrabti 3 years ago

      Hi, where can I find his textbook, please?

    • @nadekang8198
      @nadekang8198 3 years ago +2

      @@elmrabti Amazon

    • @elmrabti
      @elmrabti 3 years ago

      @@nadekang8198 Do you have it?

  • @elenag.224
    @elenag.224 3 years ago +4

    A big thank-you, from Greek students!

  • @eneserdogan34
    @eneserdogan34 2 years ago +10

    To derive the formula at 26:07 (Poisson distribution) from 25:06 (binomial distribution), you need to write the combination in terms of factorials, then use the Stirling approximation for the factorials and the limit definition of e (see the numerical check after this thread).

    • @maxdickens9280
      @maxdickens9280 a year ago +2

      Well said! Actually, though, Stirling isn't strictly necessary: as n -> infinity, we have (n k) = (n * (n-1) * ... * (n - k + 1)) / k! -> n^k / k!

    • @oyaoya2468
      @oyaoya2468 11 months ago +1

      You don't need the Stirling approximation for that.

    • @Longpan898
      @Longpan898 7 months ago

      Thanks 🙏🏻
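
    A quick numerical check of the limit discussed above (a sketch of mine, not from the thread; the values of lam, t, and k are illustrative):

    ```python
    # Binomial(n, lam*t/n) PMF -> Poisson(lam*t) PMF as the number of slots n grows.
    from scipy.stats import binom, poisson

    lam, t, k = 3.0, 2.0, 4             # arrival rate, interval length, arrival count
    for n in (10, 100, 1_000, 10_000):  # number of slots [0, t] is split into
        p = lam * t / n                 # per-slot arrival probability
        print(n, binom.pmf(k, n, p))
    print("Poisson limit:", poisson.pmf(k, lam * t))
    ```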

  • @Naz-pk4ll
    @Naz-pk4ll 5 years ago +7

    You are a star, dear lecturer.

  • @xozhang
    @xozhang 7 years ago +7

    Excellent explanation!

  • @achillesarmstrong9639
    @achillesarmstrong9639 6 years ago +2

    A masterpiece of a video.

  • @subhashjain2294
    @subhashjain2294 6 years ago +4

    Thank you so much, sir; you made learning very easy.

  • @zihaocheng6438
    @zihaocheng6438 8 years ago +5

    wonderful lecture

  • @hrodrick2890
    @hrodrick2890 10 years ago +20

    sacredsoma The lecturer doesn't explain it, but in that part he is working with one of several definitions of the Poisson process, in which the process is defined as a Lévy process. Deep knowledge of mathematical analysis, specifically measure theory, is required to understand these infinitesimal properties of Lévy processes; that's why he doesn't say much about those second-order terms.

    • @JaviOrman
      @JaviOrman 3 years ago +1

      Thanks for that clarification. I wrote in my notes that this part needed some more explanation.

  • @grantguo9399
    @grantguo9399 2 months ago

    I enjoyed watching this maths lecture as much as watching a movie.

  • @adityasahu96
    @adityasahu96 3 years ago +3

    Too much information to grasp in one go... whoa, nice lecture!

  • @KyleGoryl
    @KyleGoryl 6 years ago +16

    The Poisson part begins at 9:20.

  • @rikenm
    @rikenm 7 years ago +26

    My probability professor skipped the Poisson distribution, saying it is not so important.

  • @anexocelisia9377
    @anexocelisia9377 2 years ago

    The process has the memorylessness property (provided the person entering the room can't see the future).
    It is the continuous-time variation of the Bernoulli process.
    Definition of the Poisson process: we use the concept of memorylessness here.
    Time homogeneity, as time is continuous.
    For a very small interval of length tau, the probability of one arrival is approximately lambda*tau.
    Different time slots are independent of each other: the NUMBERS of arrivals in disjoint time intervals are independent of each other.
    Delta is very small.
    The PMF is the binomial PMF: we take a big interval and split it into many infinitesimally small intervals, and each small interval behaves like a Bernoulli trial.
    For the Poisson limit we let the slot length delta tend to 0.
    Merging of Poisson processes (see the sketch below).
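
    A minimal simulation sketch of that last point (assumptions mine: the rates lam1, lam2 and horizon T are arbitrary illustrative choices). Merging two independent Poisson processes with rates λ1 and λ2 yields a Poisson process with rate λ1 + λ2, so the merged count on [0, T] should have mean and variance (λ1 + λ2)T:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    lam1, lam2, T, trials = 2.0, 3.0, 10.0, 100_000

    n1 = rng.poisson(lam1 * T, size=trials)  # arrivals of process 1 on [0, T]
    n2 = rng.poisson(lam2 * T, size=trials)  # arrivals of process 2 on [0, T]
    merged = n1 + n2                         # counts of the merged process

    print(merged.mean(), (lam1 + lam2) * T)  # both close to 50
    print(merged.var(), (lam1 + lam2) * T)   # Poisson: variance equals mean
    ```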

  • @richardellard
    @richardellard 5 years ago +13

    Not even a chuckle at the "B movie" joke?

  • @luigixu3251
    @luigixu3251 3 years ago +1

    51:46 he probably meant "this plus that" instead of "this plus that".

  • @aniketsaha7455
    @aniketsaha7455 6 years ago +2

    Although it is intuitively true that T2, T3, ... are all independent exponential random variables, the PDF of the k-th arrival time for k = 2, 3, 4, ... is not exponential... how can one argue that? Anyone?
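
    One way to see this empirically (a sketch under my own assumptions; lam and k are illustrative): the interarrival times T1, T2, ... are i.i.d. Exponential(λ), while the k-th arrival time Yk = T1 + ... + Tk is Erlang (a Gamma with integer shape k), not exponential:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    lam, k, trials = 2.0, 3, 100_000

    T = rng.exponential(1 / lam, size=(trials, k))  # i.i.d. interarrival times
    Yk = T.sum(axis=1)                              # time of the k-th arrival

    print(T[:, 1].mean(), 1 / lam)  # T2 alone: exponential with mean 1/lam
    print(Yk.mean(), k / lam)       # Erlang mean k/lam
    print(Yk.var(), k / lam**2)     # Erlang variance k/lam^2, not (1/lam)^2
    ```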

  • @animagous1
    @animagous1 7 years ago +1

    Why is the probability of k > 1 arrivals in an interval of length delta (approximately) zero?

  • @WayWaterWay
    @WayWaterWay 4 years ago

    This is amazing

  • @blogginbuggin
    @blogginbuggin 4 years ago +2

    Prof. John's lecture on the Poisson PMF is available at: czcams.com/video/eTLa9GDID8A/video.html

  • @BryanSteeleSounds
    @BryanSteeleSounds 3 months ago

    I'm wondering, at 14:15, when the Prof. is discussing arrival independence (second bullet point): how can the arrivals be truly "independent" if this is being modeled by a PMF that must sum to 1? In other words, don't we create a type of dependency exactly when an event has occurred?

  • @suyashmuley9096
    @suyashmuley9096 6 years ago

    What is the reason for considering second-order terms in delta, since it is just an interval?

  • @nguyentin6294
    @nguyentin6294 3 years ago

    I am having a hard time making sense of the notation at 11:22. I believe the notation should be the conditional probability P(k|t) rather than P(k,t). I interpreted the latter as a joint probability, and if that were the case, the summation over all k of P(k,t) for a fixed t could not equal 1. Anyone, please help knock some sense into my head!

    • @schobihh2703
      @schobihh2703 2 years ago +1

      t is of a different nature; it is not a random variable.

    • @nguyentin6294
      @nguyentin6294 2 years ago

      @@schobihh2703 could you clarify it a bit more?

    • @schobihh2703
      @schobihh2703 2 years ago +1

      @@nguyentin6294 As mentioned in the video, k is a random variable, i.e. the summation over all k is 1. "t" just specifies the sample space, the set of all possible k's; it is only a parameter, not an event. Or, the other way around: if t were an event, what would P(t) be? It simply doesn't make sense to ask this without specifying how many arrivals we are talking about. (A short normalization check follows this thread.)

    • @nguyentin6294
      @nguyentin6294 2 years ago

      @@schobihh2703 thank you for the insight :)
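
    For reference, the normalization check in the board's notation (the sum manipulation is mine): for each fixed τ, P(k, τ) is a PMF in k alone,

    $$P(k,\tau) = e^{-\lambda\tau}\,\frac{(\lambda\tau)^k}{k!}, \qquad \sum_{k=0}^{\infty} P(k,\tau) = e^{-\lambda\tau}\sum_{k=0}^{\infty}\frac{(\lambda\tau)^k}{k!} = e^{-\lambda\tau}\,e^{\lambda\tau} = 1,$$

    so no joint distribution over (k, τ) is intended; τ only parametrizes the PMF.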

  • @sacredsoma
    @sacredsoma 11 years ago

    I don't get the bit at 15:50 about small-interval probabilities being a limiting case. Why are there any second-order terms?

    • @aryanparikh9085
      @aryanparikh9085 3 years ago +5

      I doubt that my comment will be useful to you at all since it's been 7 years, but I'll comment anyway for people who are going to view this lecture in the future.
      The second-order terms exist because P(k,d) isn't 0 when k >= 2 (d here stands for delta).
      Regardless of how small an interval you pick, it is possible for more than one success to occur in the interval. That probability, however, converges to 0 much "faster" than d, so we can ignore those probabilities as negligible.
      This is an important assumption for using the binomial PMF, since it only allows two outcomes per slot: success and failure. If a slot could hold multiple successes, the binomial model could not be used.
      It follows that any arrival process in which the probability of multiple successes in an interval does not vanish "faster" than the interval length itself cannot be treated as a Poisson process. In our course notes at the University of Waterloo, this property is referred to as "individuality". (The small-interval probabilities are spelled out after this thread.)

    • @priyamdey3298
      @priyamdey3298 3 years ago

      @@aryanparikh9085 If possible, can you share the notes on this particular characteristic for qualifying a distribution as Poisson?
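
    Spelled out as in the lecture's small-interval description (o(δ) stands for any term with o(δ)/δ → 0 as δ → 0):

    $$P(k,\delta) = \begin{cases} 1-\lambda\delta + o(\delta), & k = 0,\\ \lambda\delta + o(\delta), & k = 1,\\ o(\delta), & k \ge 2, \end{cases}$$

    so the "second-order terms" are exactly the o(δ) pieces: nonzero for any finite δ, but negligible relative to δ in the limit.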

  • @bosepukur
    @bosepukur 8 years ago +1

    Good explanation, though a little more rigour in the derivation would have helped a lot... it is intended as an overview.

    • @Isaac668
      @Isaac668 7 years ago +2

      The course is applied probability; the point is not rigorous derivation.

    • @JaviOrman
      @JaviOrman 3 years ago

      I would look into a textbook for complete rigorous derivations.

  • @huasongyin3711
    @huasongyin3711 8 years ago +3

    I really enjoyed this series of lectures, just to refresh my memory. Thanks. But I am also surprised that an MIT professor of statistics can get stumped on such a simple task as deriving the Poisson PMF.

    • @niko97219
      @niko97219 7 years ago +26

      Anyone can get stumped on something once in a while.

    • @freeeagle6074
      @freeeagle6074 a year ago +1

      He wrote the textbook, which contains the detailed derivation. He would probably have figured out how to derive the final Poisson formula if he had spent a few more minutes, but he decided to leave it for the students to read in the textbook themselves. By the way, some people do have the ability to memorize all the details after one look, but that is a painful disease.

  • @mpakojohnk
    @mpakojohnk 3 years ago

    Proud to see another Greek teaching here.

    • @lekokotonteso
      @lekokotonteso 3 years ago

      How do you know he is Greek? Please ignore any mistakes; I am using translation software.

    • @mpakojohnk
      @mpakojohnk 3 years ago

      @@lekokotonteso Tsitsiklis is a Greek surname. Also, notice his accent.

  • @user-ym8rz6mw5r
    @user-ym8rz6mw5r 8 years ago +1

    Hahaha, this guy doesn't prepare for lectures, haha. So funny...