165 - An introduction to RNN and LSTM

  • added 4 Sep 2024

Comments • 85

  • @elisavasta2684
    @elisavasta2684 2 years ago +3

    One of the best explanations ever on LSTM! Greetings from Politecnico di Milano!

  • @sonhdang
    @sonhdang 3 years ago +15

    I've watched dozens of videos on LSTM and this is the best one so far. Thank you so much, sir. Greetings from UCLA!

  • @christophbrand9015
    @christophbrand9015 3 years ago +2

    The first YouTube tutorial I saw which explains an LSTM in detail, e.g. why a sigmoid or why a tanh is used within the cell. Great!

  • @pfever
    @pfever 2 years ago +14

    Best LSTM explanation I have watched! All your videos are superb! I want to watch them all from beginning to end! Thank you for such detailed and intuitive explanations! :D

  • @Balakrish-cl9kq
    @Balakrish-cl9kq 2 years ago

    I feel very gifted that YouTube suggested the right video to me....

  • @edwardbowora
    @edwardbowora a year ago +4

    Best teacher ever.

  • @user-fd9tv9fq8z
    @user-fd9tv9fq8z 5 months ago

    I gained valuable understanding. I really appreciate the way you explain things.

  • @AshishBamania95
    @AshishBamania95 2 years ago

    Can't believe that this is free. Thanks a lot. You are building a community of future researchers and innovators here!

  • @mehdisdikiene8752
    @mehdisdikiene8752 3 years ago

    I've watched many videos and read a lot about LSTM but this is the first time I really understand how LSTM works. Thumbs up, thank you!

  • @learn2know79
    @learn2know79 2 years ago +1

    I was struggling to understand the basic concept of LSTM and watched dozens of videos, and finally found the best one so far. Thank you so much for helping us understand. Greetings from GIST!

  • @dizhang947
    @dizhang947 a year ago +1

    amazing work, thank you so much!

  • @saaddahmani1870
    @saaddahmani1870 2 years ago +1

    Good, thanks a lot.

  • @lh2738
    @lh2738 2 years ago +1

    Nice video, so well explained and not too long, along with a full tutorial. Probably one of the best ones about LSTM. Thanks and please keep up the good work! Greetings from France!

  • @AhmedFazals
    @AhmedFazals 4 months ago

    Awesome! Thanks sir.

  • @aristideirakoze8098
    @aristideirakoze8098 2 years ago

    We are infinitely grateful

  • @jahanzaibasgher1275
    @jahanzaibasgher1275 2 years ago

    Thank you so much :)
    Subscribed after watching your first video.

  • @yangfarhana3660
    @yangfarhana3660 3 years ago

    I've viewed several vids on LSTM but this breakdown is the best!!

  • @davidomarparedesparedes8718
    @davidomarparedesparedes8718 3 months ago

    Great explanation! Thank you so much!! : )

  • @gadisaadamuofficial2946

    Really, thank you for the extra clarification!

  • @omniscienceisdead8837
    @omniscienceisdead8837 2 years ago

    Best explanation out there, i understood, what is happening both conceptually and mathematically

  • @claybowlproductions
    @claybowlproductions a year ago

    Sir you are a gem!

  • @RAHUDAS
    @RAHUDAS 2 years ago +1

    At 19:31 he mentions how many units of LSTM there are. The units parameter is not the number of LSTM cells in a layer; it is the dimension of the hidden state.
    How many times the LSTM is unrolled depends on the input shape (the number of timesteps).

    • @Droiduxx
      @Droiduxx a year ago

      So if I understand correctly: if we consider the input to be a sequence of x elements, each "LSTM" unit contains x states and returns a list of x vectors passed to the LSTM units of the next hidden layer. Am I right?

    • @RAHUDAS
      @RAHUDAS a year ago

      @Droiduxx Yes, but also consider the return_sequences and return_state arguments; their default values are False. To see the full picture, turn on return_sequences.
      Example:
      x = tf.range(30, dtype=tf.float32)
      x = tf.reshape(x, (5, 3, 2))
      # shape: (batch, time, num_features)
      lstm = tf.keras.layers.LSTM(7, return_sequences=True)
      output = lstm(x)
      print(output.shape)
      # answer: (5, 3, 7)
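
The point in the thread above, that units sets the hidden-state size while the number of steps comes from the time axis, can be checked with a minimal NumPy sketch of a single LSTM layer (random weights, purely illustrative; not code from the video):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM timestep. W: (num_features, 4*units), U: (units, 4*units)."""
    z = x_t @ W + h_prev @ U + b                 # pre-activations for all four gates
    i, f, o, g = np.split(z, 4)                  # input, forget, output, candidate
    c_t = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # updated cell state
    h_t = sigmoid(o) * np.tanh(c_t)              # updated hidden state
    return h_t, c_t

units, num_features, timesteps = 7, 2, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(num_features, 4 * units))
U = rng.normal(size=(units, 4 * units))
b = np.zeros(4 * units)

x = rng.normal(size=(timesteps, num_features))
h, c = np.zeros(units), np.zeros(units)
outputs = []
for t in range(timesteps):                       # the layer unrolls along the time axis
    h, c = lstm_step(x[t], h, c, W, U, b)
    outputs.append(h)

print(np.stack(outputs).shape)  # (3, 7): one hidden vector of size `units` per timestep
```

Note that `units` appears only in the width of the weight matrices and state vectors; the loop length (3 here) is fixed entirely by the input's time dimension.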

  • @alteshaus3149
    @alteshaus3149 3 years ago +1

    Thank you very much for this video sir!

  • @Toss3geek
    @Toss3geek 9 months ago

    Thank you, teacher!

  • @zhiyili6707
    @zhiyili6707 2 years ago

    Thank you very much! It is well explained!

  • @indranisen5877
    @indranisen5877 a year ago

    Thank you Sir, Nice explanations.

  • @kukuhiksanmusyahada7615

    Great presentation sir! thank you so much!

  • @rolandosantos7755
    @rolandosantos7755 3 years ago

    I love your videos... I am just starting to learn machine learning and it's very useful.

  • @-mle566
    @-mle566 2 years ago

    thank you, nice video for LSTM new learners :)

  • @aminasgharisooreh9243
    @aminasgharisooreh9243 3 years ago +1

    Thank you, it is really helpful

  • @alex-beamslightchanal8743

    Nice tutorial! Thank you!

  • @karamjeetsinghmakkar3323

    Dear Dr. S. Sreeni,
    Thank you for your informative videos regarding CNNs.
    Kindly make one on using LSTM for image classification tasks.
    Thank you.

    • @DigitalSreeni
      @DigitalSreeni  a year ago

      LSTM is primarily used for processing sequential data. While it is possible to use LSTM for image classification tasks, it is generally not the best choice as it is designed to model sequential dependencies in data, whereas images are inherently spatial and do not have an obvious sequential structure. Images are typically processed using CNNs, which are specifically designed to handle spatial data and can effectively extract features from images using convolutions.
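
The reply's point that convolutions extract spatial features can be illustrated with a tiny hand-rolled 2-D convolution in NumPy (the kernel is a hypothetical edge detector, not taken from the video):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (really cross-correlation, as in CNN libraries)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# a vertical edge: zeros on the left half, ones on the right half
image = np.concatenate([np.zeros((4, 2)), np.ones((4, 2))], axis=1)
kernel = np.array([[1.0, -1.0]])   # responds only where horizontal neighbours differ
response = conv2d(image, kernel)
print(response)  # nonzero (-1) exactly along the edge column
```

The kernel slides over the whole image and reacts to a local spatial pattern wherever it occurs, which is the inductive bias that makes CNNs the natural fit for images.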

  • @jolittevillaruz5234
    @jolittevillaruz5234 3 years ago

    Very intuitive video!

  • @gakhappy
    @gakhappy 3 years ago

    Great work, sir. Keep on doing a great job.

  • @rahuliron1635
    @rahuliron1635 3 years ago

    Awesome explanation, thank you very much.

  • @nisa_ssa
    @nisa_ssa 3 years ago

    Thank you so much for this video...

  • @s.e.7268
    @s.e.7268 3 years ago +1

    I am so happy to discover this channel! :)

  • @dantec.dagandanan3732
    @dantec.dagandanan3732 2 years ago

    Thanks!

    • @dantec.dagandanan3732
      @dantec.dagandanan3732 2 years ago

      I know this little amount of money is not enough to say thank you. Keep up the good work, sir. 🥰

    • @DigitalSreeni
      @DigitalSreeni  2 years ago +1

      Thank you very much. No amount of money is little. Every penny counts :)
      Bulk of the money goes to charities that help with cancer research and eye surgeries for poor people. So the society benefits from any amount that is contributed. Thanks again.

  • @VCodes
    @VCodes 3 years ago +1

    great. thx a lot

  • @sadafmehdi2991
    @sadafmehdi2991 3 years ago

    Nice Explanation Sir!

  • @cryptodude5359
    @cryptodude5359 2 years ago +1

    Amazing tutorial! I got a question:
    At 14:59 you explain the forget gate.
    In the lower-left corner, the cell gets ht-1 (last timestep) as input. Is it possible to have a sequence of past days as input?
    For example ht-1 & ht-2 & ht-3 ... etc. to spot potential trends in the data. Maybe with multiple variables. Giving every single timestep an additional weight.
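
For readers wondering about this question: an LSTM already consumes the whole input window step by step, so h(t-1) carries information from h(t-2), h(t-3), and earlier steps through the recurrence. How many past days the model sees is usually controlled by the length of the sliding input window; a minimal NumPy sketch of such windowing (function and variable names are illustrative):

```python
import numpy as np

def make_windows(series, window):
    """Turn a 1-D series into (samples, window, 1) inputs plus next-step targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]            # predict the value right after each window
    return X[..., None], y         # add a feature axis, as an LSTM layer expects

series = np.arange(10, dtype=float)        # toy daily series: 0, 1, ..., 9
X, y = make_windows(series, window=3)
print(X.shape, y.shape)    # (7, 3, 1) (7,)
print(X[0].ravel(), y[0])  # [0. 1. 2.] 3.0
```

Each sample then contains three consecutive past days, and the cell's gates process them one timestep at a time; multivariate inputs just widen the last (feature) axis.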

  • @manideepgupta2433
    @manideepgupta2433 3 years ago

    Amazing Sir.

  • @aomo5293
    @aomo5293 a year ago

    Thank you; honestly it is very clear.
    Please, I am looking for a tutorial on image classification using a local image dataset.
    Have you made one before?
    Thank you again.

  • @kanui3618
    @kanui3618 3 years ago

    nice explanation!

  • @awesome-ai1714
    @awesome-ai1714 a year ago

    11:40 What is going on with the arrows? The signal from the previous cell merges with the current Xt, but there is no operator. The signal from the left and the signal Xt from the bottom both go to the 3 gates?
    Edit: OK, I see, it's explained later.

  • @nicolamenga8943
    @nicolamenga8943 a year ago

    Thank you for the video.
    I have a question.
    Is the number of units (50) the number of so-called "hidden units", also known as the "hidden size"?

  • @tchintchie
    @tchintchie 3 years ago +2

    I can't help but find this channel incredibly undersubscribed!!!

    • @DigitalSreeni
      @DigitalSreeni  3 years ago

      I’m glad you like the content. I rely on you guys to spread the word :)

  • @ziyuelu1734
    @ziyuelu1734 2 years ago

    thanks!

  • @vzinko
    @vzinko a year ago

    Why is there a dropout after the final LSTM layer?

  • @sherrlynrasdas8387
    @sherrlynrasdas8387 a year ago

    Can you teach us how to use LSTM and ARIMA in ensemble learning in forecasting time series data?

  • @hudaankara5616
    @hudaankara5616 3 years ago +1

    Hi sir, thank you so much for all your videos. Could you provide us with a tutorial on implementing LSTM & RNN in Python, please?

  • @AveRegina_
    @AveRegina_ 2 years ago

    I'm using RNN for my PG thesis work. I have a query: do we have to run a stationarity test on our time series data before feeding it into the neural network model, or is this step only required in traditional time series models like ARIMA?

    • @DigitalSreeni
      @DigitalSreeni  2 years ago

      RNNs are capable of learning nonlinearities (compared to ARIMA) and therefore should be able to learn from the input data without doing any stationarity pre-processing. This is especially true if you use LSTMs. Also, please note that you need lot more training data for RNNs compared to ARIMA. You may find this blog useful to understand the effectiveness of RNNs: karpathy.github.io/2015/05/21/rnn-effectiveness/

  • @bobaktadjalli6516
    @bobaktadjalli6516 a year ago

    Hi, well explained! Could I have your slides?

  • @stevenzhou7358
    @stevenzhou7358 3 years ago

    Thanks for your videos! They're really helpful. I have a small question: could you explain a little more about the meaning of units? Does it mean the number of hidden layers or the number of neurons in a layer?

    • @DigitalSreeni
      @DigitalSreeni  3 years ago

      May be this helps... stats.stackexchange.com/questions/241985/understanding-lstm-units-vs-cells

    • @stevenzhou7358
      @stevenzhou7358 3 years ago +1

      @@DigitalSreeni Thanks a lot! It's very helpful.

  • @JJGhostHunters
    @JJGhostHunters a year ago

    Hi DigitalSreeni...I am a PhD candidate investigating applications of MLPs, CNNs and LSTMs. I see that you have amazing graphics for these model types in your videos.
    Would you be willing to share these graphics for the model architectures with me so that I may use them in my dissertation and defense presentation? I certainly would give you credit for them.
    Thank you for your time!

  • @ramchandracheke
    @ramchandracheke 3 years ago

    First like the video, then watch it!

    • @DigitalSreeni
      @DigitalSreeni  3 years ago

      Thanks for your blind confidence in the video, I hope your opinion doesn’t change after watching the video :)

  • @aminasgharisooreh9243
    @aminasgharisooreh9243 3 years ago

    please make a video about attention in images

  • @XX-vu5jo
    @XX-vu5jo 3 years ago

    Lol ever heard of transformers???

    • @DigitalSreeni
      @DigitalSreeni  3 years ago

      Not sure what you meant by your comment; was that a question?

  • @pattiknuth4822
    @pattiknuth4822 3 years ago

    His continuing use of "ok?" "ok?" "ok?" "ok?" is incredibly annoying.

    • @adhoc3018
      @adhoc3018 3 years ago +2

      And you are not annoying at all.

    • @DigitalSreeni
      @DigitalSreeni  3 years ago +6

      Poor choice to comment on personal trait rather than content of the tutorial, ok?