MLBBQ: “Are Transformers Effective for Time Series Forecasting?” by Joanne Wardell

  • Published 28 Aug 2024

Comments • 6

  • @mmattb
    @mmattb 4 months ago

    Great discussion and a sharp group! Around 32:27 one of the commenters mentioned a paper analyzing noisy multivariate dynamical systems, arguing that under the CLT they may appear linear. Could you link me to that result? I'd love to see it.

    • @SergeyPlis
      @SergeyPlis 3 months ago +1

      Here's an empirical look at the brain: arxiv.org/abs/2012.12351
      And the theoretical papers that I believe inspired that investigation:
      Spatial averaging: ieeexplore.ieee.org/document/9993260
      Temporal averaging: ieeexplore.ieee.org/abstract/document/10155808
      HTH

    • @mmattb
      @mmattb 3 months ago +1

      @SergeyPlis Those look like good fun! Thanks a bunch, Sergey.

  • @DunlopJ
    @DunlopJ 3 months ago

    No mention of the PatchTST Transformer from "A Time Series Is Worth 64 Words: Long-term Forecasting with Transformers"?

  • @pranavishnoi20
    @pranavishnoi20 5 months ago

    So we can say this paper is a hoax?