What are Autoregressive (AR) Models

  • Added 10 Sep 2024
  • Time to start talking about some of the most popular models in time series - ARIMA models. First things first, let's look at the AR piece - autoregressive models!

Comments • 79

  • @enock_elk · 4 years ago · +3

    Came here after being confused by my lecturer.
    Thank you very much for simplifying this!

  • @pettirto · a year ago · +1

    Thanks Mr. LaBarr, I'm studying for my exam in time series and your videos are very helpful. Greetings from Italy!!!

    • @AricLaBarr · a year ago

      Grazie! Glad to hear it was helpful!
      Ciao!

  • @hugoagudo4282 · 3 years ago · +5

    Great video. I’ve had a textbook about time series that’s been gathering dust because I was afraid of all the symbols. This helps a lot.

  • @oren2234 · 3 years ago · +1

    My statistics is very basic and I just needed a forecasting algorithm; this video explained it sooo well.

  • @arnonym5995 · 6 months ago

    I like the way you convey the intuition behind AR and MA models. One thing that might be confusing, however, is the terminology, in particular with regard to short and long memory, which differs from the common literature. There, AR, MA, and ARMA models are all considered short-memory models because their autocovariances are summable. Even an AR model, whose autocovariance function (ACVF) decays quite quickly towards zero for increasing lags yet never fully reaches zero, has summable autocovariances. In contrast, long-memory behavior is indicated by a hyperbolically decaying ACVF, whose elements are no longer summable. A popular example is the fractionally integrated ARMA model, often denoted FARIMA or ARFIMA, which can still have ACVF values of notable magnitude at large lags.
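
    In symbols (a standard textbook contrast, not from the video): the autocorrelation of a stationary AR(1) decays geometrically and is summable, while a long-memory ARFIMA autocorrelation decays hyperbolically and is not:

        \rho_{\mathrm{AR}(1)}(h) = \phi^{h},\; |\phi| < 1 \;\Rightarrow\; \sum_{h \ge 0} |\rho(h)| = \tfrac{1}{1-|\phi|} < \infty \quad \text{(short memory)}

        \rho_{\mathrm{ARFIMA}}(h) \sim C\, h^{2d-1},\; 0 < d < \tfrac{1}{2} \;\Rightarrow\; \sum_{h \ge 0} |\rho(h)| = \infty \quad \text{(long memory)}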

  • @clickbaitpolice9792 · 2 years ago · +1

    Just become my lecturer lol. I love the enthusiasm you put in; makes learning more fun lol

  • @williamgomez6226 · 2 years ago

    Thank you, I had seen this equation when I was studying reinforcement learning; it's like the value function weighted by a discount factor... Great explanation!!!

  • @oq88 · 3 years ago · +1

    One of the best teachers I’ve ever seen!
    Thank you

  • @ahsanshabbir16 · 2 years ago

    Hi Dr. Aric LaBarr, your work is amazing, please keep this going.
    The under-5-minute concept is great.

  • @economicsfriendly7425 · 3 years ago · +1

    Wow, your teaching style is really amazing!! Please make more videos on time series analysis. We really need your help!!

  • @felipedaraujo_ · 3 years ago · +1

    Excellent teaching! Thanks for your good work Aric!

  • @vadimkorontsevich1066 · 2 years ago

    God bless you for your efforts to explain!

  • @rossijuan9548 · 3 years ago

    Excellent contribution, thank you very much

  • @bend0596 · a year ago

    super clearly explained, thanks!

  • @josealeman5008 · 2 years ago

    simple and beautifully explained! thanks!

  • @elisesauvary8174 · 3 years ago · +1

    You are a godsend!!

  • @Atrix256 · 7 days ago

    A lot of overlap here with an infinite impulse response filter from DSP. I'm about to watch the moving average model video, but am wondering if that is the finite impulse response equivalent :)

  • @robin5453 · a year ago

    Best ever, thank you!!

  • @valdompinga · a year ago

    Man, you are incredible!
    I'm learning ARIMA like I'm building Legos!

  • @mirroring_2035 · a year ago

    Okay, you're a genius, thanks

  • @magtazeum4071 · 8 months ago

    At 3:31, in the 2nd term on the right-hand side of the last equation, shouldn't the power of phi be (t-1) instead of t (and so on)?

    • @AricLaBarr · 7 months ago

      Completely correct! In all honesty, I should have had the left hand side be Y_(t+1) to make the math work better.
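
      Writing out the recursive substitution with the starting value indexed at Y_0 (a sketch consistent with the correction above):

          Y_t = \omega + \phi Y_{t-1} + e_t
              = \omega\,(1 + \phi + \cdots + \phi^{t-1}) + \phi^{t} Y_0 + \sum_{j=0}^{t-1} \phi^{j} e_{t-j}
              = \omega\,\frac{1-\phi^{t}}{1-\phi} + \phi^{t} Y_0 + \sum_{j=0}^{t-1} \phi^{j} e_{t-j},

      and for |\phi| < 1 the first term approaches \omega/(1-\phi) as t grows, which is the intercept shown in the video.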

  • @kafuu1 · 2 months ago

    nice video!

  • @vaishnavikhiste7841 · 10 months ago

    WELL EXPLAINED

  • @roym1444 · 4 years ago · +1

    Is there any online resource you know of that would demonstrate how to code some of the concepts you've spoken about?

  • @sidharthmohanty6434 · 2 years ago

    Thanks

  • @dipenmodi1807 · 4 years ago · +1

    Can you explain the difference between Static, Dynamic and Autoregressive Probit models?

  • @Rundtj45 · 3 years ago

    Excellent explanation, thanks

  • @user-tp3rq3wk3j · 2 years ago

    I could not understand how you calculate φ, because I've seen a lot of correlation types and I do not know which one to use. Thank you for your time.

    • @AricLaBarr · 2 years ago · +1

      It actually isn't a correlation directly (unless it is an AR(1) model and then it is the Pearson correlation if the variables are standardized). The best way to think about it is that it is a weight in a regression model. The model chooses the weight that maximizes the likelihood (MLE) of the model and predictions. Hope this helps!

    • @user-tp3rq3wk3j · 2 years ago

      @@AricLaBarr It helped a lot, thank you
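
      A minimal sketch of that estimation in code, assuming Python with numpy and statsmodels available (toy simulated data, not from the video), showing that φ comes out as a fitted regression weight rather than a correlation:

          # Estimate phi for an AR(1) as a fitted coefficient on the lagged series.
          import numpy as np
          from statsmodels.tsa.ar_model import AutoReg

          rng = np.random.default_rng(42)
          phi_true, omega = 0.7, 2.0            # known values used only to simulate toy data
          y = np.zeros(500)
          for t in range(1, 500):
              y[t] = omega + phi_true * y[t - 1] + rng.normal()

          res = AutoReg(y, lags=1, trend="c").fit()   # AR(1) with an intercept
          print(res.params)                           # [intercept, phi]; phi should be near 0.7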

  • @michalkiwanuka938 · 3 months ago

    The underlying assumption is that we know the data up to time t-1, and we use the observed data to estimate the parameters (ϕ_1, ϕ_2, …, ϕ_p and e_t), right?

  • @josephgan1262 · 2 years ago

    If I am using an AR(1) model and I have the data for Y_{t-1}, do I need to recurse back all the way to the starting point to predict Y_t, or can I just use the formula shown at 1:17?

    • @AricLaBarr · 2 years ago

      You just use the formula! The recursive piece is to just show what is happening in concept if you keep plugging in what each lag truly represents. All you need for an AR(1) is just the lagged values (for each time point) to build the model!
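
      In symbols (matching the reply above), the one-step AR(1) prediction needs only the most recent observed value and the fitted parameters:

          \hat{Y}_t = \hat{\omega} + \hat{\phi}\, Y_{t-1}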

  • @Rundtj45 · 3 years ago

    What is the difference between the long run and the short run?
    Do you have any class about that?

  • @MrSk8L8 · 4 years ago · +1

    Great explanation

  • @NishaSingh-qf2it · 2 years ago · +1

    Hi Aric!
    This was such a splendidly explained video. I have a doubt, though, about NARX. Does it function the same way as the one explained in the video, since NARX is also an autoregressive model? If not, could you please explain NARX as well?

  • @razzlfraz · 4 years ago · +1

    Does anyone know where the line between autoregression and regression is? E.g., lowess and loess functions are called local regression, yet "local regression" looks like a form of autoregression from a 10,000 ft view. My guess at the moment is that local regression does not add stochastic noise, making it just barely miss the definition, but I am only guessing here. It could also be that local regression is a form of autoregression but everyone is too lazy to write it all out. Whatever it is, I would like to know!

    • @PhilosophySoldier · 3 years ago

      Good question - I'm also wondering the answer. @Aric LaBarr can you help?

  • @Pewpewforyou0 · 3 years ago

    This was very helpful

  • @kumaratuliitd · 3 years ago · +1

    Hi Aric, thanks for the explanatory video. Can it be said that AR(1) is equivalent to the Single Exponential Smoothing algorithm, because it too depends on the previous forecast and error?

    • @AricLaBarr · 3 years ago · +1

      Actually, a single exponential smoothing model is equivalent to a moving average of order 1 after taking a single time difference (more formally called an ARIMA (0,1,1) model or sometimes an IMA(1,1))! This is because of the structure of the single exponential smoothing model. It is a combination of past and prediction, but the prediction is more past, etc. Hope this helps!
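
      For anyone curious why that equivalence holds, a short sketch using standard ARIMA algebra (with the convention Y_t = Y_{t-1} + e_t - \theta e_{t-1} for the ARIMA(0,1,1)):

          one-step forecast:  \hat{Y}_{t+1} = Y_t - \theta e_t = (1-\theta)\,Y_t + \theta\,\hat{Y}_t \quad \text{(using } e_t = Y_t - \hat{Y}_t\text{)}

          single exponential smoothing:  \hat{Y}_{t+1} = \alpha Y_t + (1-\alpha)\,\hat{Y}_t \;\Rightarrow\; \alpha = 1 - \theta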

  • @amirhoseinbodaghi9527 · 3 years ago

    Thank You Dear

  • @eengpriyasingh706 · 2 years ago

    For 3:51, the manipulation that was done should be explained a little. Since I am not from this background, it is difficult for me to follow what is happening and how.

    • @ArunKumar-yb2jn · 2 years ago

      Maybe you should make some effort by gathering a little background before asking that question?

    • @eengpriyasingh706 · 2 years ago

      @@ArunKumar-yb2jn You are so smart, that's why I am asking... if he had given some references or shown a bit of the manipulation... If I already had some background, then I definitely would not be here.

    • @ArunKumar-yb2jn · 2 years ago

      @@eengpriyasingh706 Maybe you should not act so entitled.

  • @ValentinLeLay · 8 months ago

    Hi! At 3:33 you wrote Y_t = ω/(1-φ) + φ^t·Y_1 + ..., but shouldn't it be Y_t = ω/(1-φ) + φ^t·Y_0 + ..., since it's basically φ^t·Y_{t-t} = φ^t·Y_0?

    • @AricLaBarr · 7 months ago

      You are correct! That should be Y_0 or phi^(t-1). I should have had the left-hand side equal Y_{t+1} and then my math would work better :-)

  • @mengsupeng6541 · 3 years ago

    Thank you. Already subscribed.

  • @anupamagarwal3976 · a year ago

    A perfect 5 minutes to understand any topic

  • @andresgonzalez-nl8or · 13 days ago

    Shouldn't it be if Φ > 1, and not Φ < 1?

  • @pjy1006 · 2 years ago

    Love your videos! I am on a quest to find out why we need stationarity for the ARIMA model (there are many explanations online, but I cannot say I have a very clear understanding). Is stationarity necessary for Simple Exponential Smoothing?

    • @AricLaBarr · 2 years ago

      We need stationarity because the structure of ARIMA models is such that they revert to the average of the series if you predict out far enough. That wouldn't work very well at all if we had trending or seasonal data!
      Simple ESMs don't need stationarity, but they do require no trend or seasonality to work best. Stationarity is more mathematically rigorous than just no trend or seasonality.
      Hope this helps!
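
      The mean reversion can be seen directly from the h-step-ahead AR(1) forecast (a standard result, assuming |\phi| < 1):

          \hat{Y}_{t+h} = \omega\,\frac{1-\phi^{h}}{1-\phi} + \phi^{h} Y_t \;\longrightarrow\; \frac{\omega}{1-\phi} \quad \text{as } h \to \infty,

      i.e. far-horizon forecasts collapse to the series mean, which is why trending or seasonal data need to be made stationary first.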

  • @insideonionyt · 4 years ago · +1

    It's damn awesome!!!!!

  • @statisticianclub · 3 years ago

    Really beneficial

  • @Tomahawk1999 · 4 years ago

    Dear Aric, can an AR model have other predictors? And if yes, what class of models is that?

    • @AricLaBarr · 4 years ago · +2

      Yes they can!
      AR models are long memory models, but there are also short memory models (think quick shocks that don't last long in time) called Moving Average (MA) models. That is the next video about to come out!
      If you are talking about normal predictors (think X's in linear regression) then this class of model is called an ARIMAX model. I'll have a video on these coming soon!

    • @Tomahawk1999 · 4 years ago

      @@AricLaBarr Thanks for the quick reply! I had to review a paper last week which used predictors (like X's) to examine stock prices in a time series model. I really had no clue, so if and when you make a video, please do include how to run and evaluate these models. Thanks a lot. Stay safe.

  • @dineafkir5184 · 4 years ago

    Nice video. Will you be making something about the ARCH/GARCH models? :-)

  • @waimyokhing · 4 years ago

    What is an exponential autoregressive model???

    • @razzlfraz · 4 years ago · +1

      Like this? en.wikipedia.org/wiki/Exponential_smoothing

  • @user-cl2jb7by8e · 4 years ago · +1

    I hope there is a video about the MA model!!!!!

  • @abderrahimba7390 · 2 years ago

    Wooow

  • @zubairkhan-hz1vz · 4 years ago · +1

    Please do the ARIMA model

  • @GameinTheSkin · 3 years ago

    You are a more level-headed StatQuest; I wouldn't mind the singalongs though

  • @makting009 · 4 years ago

    Sir, one video about the moving average, please

    • @AricLaBarr · 4 years ago

      Definitely! Be on the lookout this week!

  • @andreneves6064 · 4 years ago

    Slides please

  • @HardKore5250 · 4 years ago

    GPT-3

  • @batolhashimi6863 · 2 years ago · +1

    I wish you were my professor instead of him.