Deep Dive: Apache Spark Memory Management

  • Added on 26. 08. 2024

Comments • 13

  • @oldoctopus393 • 1 year ago

    The answers to almost all my questions on the topic are given in this 26-minute video. Awesome!

  • @sankarramanathan6051 • 5 years ago +4

    This was a really good presentation. Very unassuming and to the point, with lucid slides.

  • @laeeqahmed1980 • 7 years ago +8

    It would be nice to have this kind of execution and storage memory visualization in the Spark UI.

    • @harshitsaini15 • 4 years ago

      It's available in the Executors and Storage tabs, served on port 4040 by default (sketch below).
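
      A minimal sketch of where those tabs get their data, assuming a local session; the app name and the cached dataset below are purely illustrative:

      ```scala
      import org.apache.spark.sql.SparkSession

      // Minimal sketch: start a session, cache something, and open the UI.
      // The Executors and Storage tabs are served by the driver's web UI on
      // spark.ui.port (4040 by default; Spark tries the next port if it is taken).
      val spark = SparkSession.builder()
        .appName("ui-demo")        // illustrative name
        .master("local[*]")
        .getOrCreate()

      val df = spark.range(0, 1000000L).cache()
      df.count()                   // materializes the cache so the Storage tab has an entry

      // Now open http://<driver-host>:4040 and check the Executors / Storage tabs.
      ```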

  • @ankireddyambati7638 • 4 years ago

    Great Talk

  • @anfield6321 • 3 years ago

    kpii left TNC for this? NotLikeThis

  • @joo02 • 1 year ago

    my head = 1 min :D

  • @kk-si6fy • 7 years ago

    Since memory can always spill to disk, when do we ever run out of memory?

    • @verma.chitral • 7 years ago +4

      kk8866 You run out of memory when an action like take() accumulates results larger than the memory available on the driver. It also happens when you are making large objects (see the sketch below).
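
      A small sketch of that failure mode, assuming a modest driver heap; the dataset size is illustrative and the collect() call is commented out on purpose:

      ```scala
      import org.apache.spark.sql.SparkSession

      // Driver heap size is normally set at launch, e.g. spark-submit --driver-memory 1g.
      val spark = SparkSession.builder()
        .appName("driver-oom-sketch")   // illustrative name
        .master("local[*]")
        .getOrCreate()

      // Kept distributed across executors this is fine: each partition is
      // processed independently and execution memory can spill to disk.
      val big = spark.range(0, 500000000L)   // illustrative size

      // Pulling everything back materializes every row on the driver at once.
      // Spilling does not help here: collect() or a huge take(n) builds the
      // whole result in driver memory, which is where the OutOfMemoryError hits.
      // val rows = big.collect()

      // Safer: aggregate first, or write the data out instead of collecting it.
      val total = big.count()                // returns a single number to the driver
      ```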

    • @vikashpareek8374 • 4 years ago

      @verma.chitral This is the case when the driver runs out of memory. But why do executors run out of memory?

    • @vikashpareek8374 • 4 years ago

      Did you get the answer to why executors run out of memory when memory can be spilled to disk?

    • @verma.chitral • 4 years ago +2

      Easy: it happens due to GC overheads and OOMs, when you're creating objects at a greater rate than they can be GC'd (sketch below).
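
      A sketch of one way that shows up on executors; groupByKey on a skewed key stands in for "making large objects", and the sizes are illustrative:

      ```scala
      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .appName("executor-oom-sketch")  // illustrative name
        .master("local[*]")
        .getOrCreate()
      val sc = spark.sparkContext

      // Heavily skewed pairing: all records land on just three keys.
      val pairs = sc.parallelize(0 until 10000000)
        .map(i => (i % 3, new Array[Byte](64)))

      // Spark can spill its own execution/storage memory to disk, but the values
      // for a single groupByKey key are handed to user code as one in-memory
      // collection on the executor. A skewed key, or user code allocating objects
      // faster than the JVM can collect them, still ends in GC-overhead errors or
      // an executor OutOfMemoryError.
      // val grouped = pairs.groupByKey().mapValues(_.size).collect()

      // Aggregating per key keeps per-record state tiny and avoids the blow-up.
      val sizes = pairs.mapValues(_.length.toLong).reduceByKey(_ + _).collect()
      ```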