10. Understanding Program Efficiency, Part 1

  • Published 18. 06. 2024
  • MIT 6.0001 Introduction to Computer Science and Programming in Python, Fall 2016
    View the complete course: ocw.mit.edu/6-0001F16
    Instructor: Prof. Eric Grimson
    In this lecture, Prof. Grimson introduces algorithmic complexity, a rough measure of the efficiency of a program. He then discusses Big "Oh" notation and different complexity classes.
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

Comments • 176

  • @Danny1905
    @Danny1905 4 years ago +96

    Wow, I'm watching a class from MIT in the comfort of my room... What a time to be alive :')

  • @spacewad8745
    @spacewad8745 2 years ago +51

    This is what I love about foreign universities. So much passion for teaching. Thanks, OCW, for allowing someone like me in a third-world country to experience such excellent teaching.

  • @leixun
    @leixun 4 years ago +127

    *My takeaways:*
    1. Timing a program and its problems 10:30
    2. Counting operations and its problems 12:35
    3. The amount of time that the code takes depends on its input 20:50
    4. Big O notation 26:05
    5. Focus on the term that grows most rapidly 30:05
    6. Types of orders of growth 32:35
    7. Analyzing programs and their complexity 33:30
    8. Complexity classes 36:28
    9. Complexity growth 37:25

  • @MrSrijanb
    @MrSrijanb 6 years ago +83

    OCW, for me, is one of the best initiatives in the world. Thanks, MIT!

  • @ludwigwittgenstein1193
    @ludwigwittgenstein1193 7 years ago +262

    Your jokes are not bad, professor!

  • @user-fl7vs4ed6l
    @user-fl7vs4ed6l 3 years ago +8

    0:02:49 Want to understand efficiency of programs
    0:03:25 Want to understand efficiency of programs
    0:03:57 Want to understand efficiency of programs 0:06:00 0:08:00
    0:10:07 How to evaluate efficiency of programs 0:11:52
    0:14:00 Counting operations 0:16:00 0:17:42
    0:21:16 Different inputs change how the program runs
    0:23:17 Best case, average case, worst case 0:26:57 0:28:00
    0:30:13 Simplification examples
    0:32:40 Types of orders of growth
    0:35:53 Analyzing programs and their complexity
    0:36:54 Complexity classes 0:38:00
    0:39:36 Linear search on unsorted list
    0:42:35 Constant time list access
    0:44:02 Linear search on sorted list
    0:45:30 Linear complexity
    0:46:46 Quadratic complexity 0:50:00

  • @gfang3694
    @gfang3694 7 years ago +62

    Prof. Eric Grimson, I miss you! You introduced me to the whole field of computer science!

    • @aayushgogia7210
      @aayushgogia7210 7 years ago +6

      I miss him too. I took 6.00.1 on edX and it was so amazing.

    • @geekyprogrammer4831
      @geekyprogrammer4831 5 years ago

      @@aayushgogia7210 did he retire? :/

    • @TabrezKhan-vo8ko
      @TabrezKhan-vo8ko 4 years ago +1

      @@geekyprogrammer4831 he dead

    • @readingRoom100
      @readingRoom100 4 years ago

      @hairbandfan1967 lol I just queried his name, and there's no mention of his supposed death

    • @Artaxerxes.
      @Artaxerxes. 4 years ago

      @hairbandfan1967 he isn't. He's serving as Chancellor for Academic Advancement and reports directly to the MIT president

  • @moyakubu
    @moyakubu 6 years ago +16

    I love this lecture, got some of the things I was confused about clarified. Thank you!

  • @massumpcao9964
    @massumpcao9964 5 years ago +5

    Thank you, professor, for these great lectures. I just had a job test and did great thanks to you.

  • @SGspecial84
    @SGspecial84 6 years ago +5

    The man gives great effort. Admirable!

  • @muhammadmubashirullahdurra8394

    Old is gold.
    I'm glad we have seniors!

  • @andrearivera7617
    @andrearivera7617 6 years ago +13

    I have been trying to understand how to identify the number of operations in an algorithm and the order of growth for the 4 weeks since my course started. Your explanation is so detailed that it is all clear. Thank you so much, Professor!
    I will keep following your videos.

    • @jellyjams7217
      @jellyjams7217 5 years ago +1

      Andrea Rivera, I'm new to programming and I don't understand how to count operations correctly yet. How do we differentiate which operations are just a number (constant) and which operations we multiply by n?
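
      For anyone counting along at home: here is a minimal sketch of the kind of summation loop the lecture counts (the exact slide code may differ slightly), with one way of applying the counting convention written in the comments. Operations outside the loop are constants; operations inside the loop get multiplied by the number of passes.

      def mysum(x):
          total = 0               # 1 op (one assignment), done once
          for i in range(x + 1):  # the body runs x+1 times; binding i costs ~1 op per pass
              total += i          # 1 addition + 1 assignment per pass
          return total            # 1 op, done once

      # One way to total this up: 1 + 3(x+1) + 1 operations, i.e. about 3x plus some
      # constants. The constants don't grow with x; the 3x term does, so mysum is O(x).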

  • @zlmsailor
    @zlmsailor 4 years ago +6

    There must be years of accumulated experience and knowledge to make a lecture smooth and easy like this. Salute to Dr. Grimson.

  • @xer_t3661
    @xer_t3661 2 years ago +5

    When you're watching this in 2022 and the lecturer says "WE USE OMICRON! GOD KNOWS WHY! SOUNDS LIKE SOMETHING FROM FUTURAMA!"
    I really feel that 😐

  • @dandk_
    @dandk_ 6 years ago +7

    This cleared up almost all issues I was having, thank you very, very much!

  • @ketkiambekar3789
    @ketkiambekar3789 3 years ago +3

    The best resource out there for understanding Big O notation. Period.

  • @PankajKumar-ji1ig
    @PankajKumar-ji1ig 2 years ago +1

    Thank You Professor, Hope you are well.

  • @brainstormingsharing1309
    @brainstormingsharing1309 3 years ago +1

    Absolutely well done and definitely keep it up!!! 👍👍👍👍👍

  • @luannsapucaia9878
    @luannsapucaia9878 5 years ago +2

    What a wonderful class. Great teacher with great jokes! LOVED!

  • @leonzheng574
    @leonzheng574 a year ago

    top professors, top students, top university, thanks MIT OCW!

  • @jalalkiswani
    @jalalkiswani 5 years ago

    Very nice lecture! extremely recommended.

  • @akbarrauf2741
    @akbarrauf2741 7 years ago +1

    thank you, MIT

  • @aymensekhri2133
    @aymensekhri2133 4 years ago

    Thanks a lot Prof. Grimson

  • @vietducnguyen4027
    @vietducnguyen4027 a year ago

    Amazing Lesson, Thanks Prof

  • @OmigosTV
    @OmigosTV 3 years ago

    Thanks a lot. Very interesting and lively lectures for self-education.

  • @rajendere855
    @rajendere855 6 years ago

    Really Awesome lecture.

  • @sourPollo
    @sourPollo 3 years ago +2

    Thank you MIT!!

  • @alan713812
    @alan713812 2 years ago +1

    this guy is so good!!

  • @GThomas748
    @GThomas748 5 years ago +2

    A not-so-simple topic explained in a simple way. That is the mark of a good teacher.

  • @jongpark70
    @jongpark70 5 years ago +4

    35:18 is hilarious!
    I wish I went to MIT just for this professor!!!

    • @wg3771
      @wg3771 4 years ago

      You already are, just by watching these whole lectures!

  • @spartacusche
    @spartacusche 5 years ago +6

    In the introduction, the professor said that students saw greedy algorithms, quicksort, and divide and conquer, but in what course? Maybe he's talking about 6.0002?

  • @sodapopinski9922
    @sodapopinski9922 4 years ago +2

    At 41:26, how is it 1+4n+1? Shouldn't it be 1+3n+1, or is the len function in the range loop adding another constant to make it 4n?
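
    For readers without the slides: the linear search being counted is presumably something like the sketch below (reconstructed from memory, so it may not match the slide line for line). Whether you get 3n or 4n depends on what you choose to charge per pass (binding the loop variable, indexing L[i], the comparison, and possibly the assignment to found); the order of growth is O(n) either way.

    def linear_search(L, e):
        found = False               # 1 op, done once
        for i in range(len(L)):     # the body runs n = len(L) times
            if e == L[i]:           # per pass: bind i, index into L, compare
                found = True        # plus an assignment on a match
        return found                # 1 op, done once

    # Counting ~4 ops per pass gives 1 + 4n + 1; counting 3 gives 1 + 3n + 1.
    # Big O throws the constants away, so both accountings land on O(n).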

  • @ysazak
    @ysazak 5 years ago

    Great lecture

  • @deeprajgame
    @deeprajgame 6 years ago +2

    Greatest lecture I've ever watched on performance. Thank you

  • @mandakasayam11
    @mandakasayam11 6 years ago

    Thank You Prof

  • @ankiewicz
    @ankiewicz 6 years ago

    awesome... Self learning guided

  • @shikharupadhyay7435
    @shikharupadhyay7435 a year ago +1

    Experienced teachers are at a different level!!!!

  • @clapathy
    @clapathy 7 years ago +1

    Thank you

  • @gnatinator
    @gnatinator 5 years ago

    Thank you MIT

  • @supermariosunshine64
    @supermariosunshine64 7 years ago +134

    Damn, I feel bad for Dr. Grimson. Are all MIT students automatons or what? He's pretty funny--really, no one even smirked?

    • @YannisSP
      @YannisSP 5 years ago +17

      Noah Iboa I guess he is since he holds a Ph.D. (en.wikipedia.org/wiki/Eric_Grimson)

    • @jamesgoulding9941
      @jamesgoulding9941 5 years ago +4

      I do like his jokes but the one about the Fonz was pretty outdated.

    • @Grokfyr
      @Grokfyr 5 years ago +1

      He made me smile a lot, actually.

    • @robertsun5529
      @robertsun5529 5 years ago +13

      I think the audience was just not picked up by the recording device.

    • @hw4058
      @hw4058 5 years ago +3

      Yes, it's true. If you watched 6.001, you'll know that only when the students are interacting with the lecturer will their voices be recorded.

  • @georgealex19
    @georgealex19 6 years ago

    Really nice lecture by Prof. Eric Grimson. Nice jokes here and there, exactly the right amount.

  • @crocopie
    @crocopie 6 years ago

    35:39. The Fonz is that cool character in Happy Days.

  • @bullyellis1
    @bullyellis1 2 years ago

    Anyone know where to get the quizzes / tests / answers related to this lecture ?

  • @charlottewang1264
    @charlottewang1264 4 years ago +1

    for i in range(x+1): i runs from 0 to x, so doesn't this do the loop's ops (x+1) times?

  • @diabelli4life
    @diabelli4life 5 years ago +3

    He's by far the most amazing professor I've ever seen on the entire internet. This is how the educational level of a country rises like a rocket: by having people like this in its classrooms.

  • @WaleedBinKhalid
    @WaleedBinKhalid 5 years ago

    Why is the professor using base 10 for O(log n)? We can get the exact number of iterations with base 2. Am I missing or confusing something?
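
    One detail that may answer this: logarithms in different bases differ only by a constant factor (log2 n = log10 n / log10 2 ≈ 3.32 × log10 n), and Big O ignores constant factors, so O(log n) is the same class whether you use base 2, base 10, or base e. A quick check:

    import math

    for n in (10, 1_000, 1_000_000):
        # The ratio is the same constant (~3.3219) for every n.
        print(n, math.log2(n) / math.log10(n))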

  • @nup_pun
    @nup_pun 7 years ago +9

    On slide 7 "t" is not defined in the program.

  • @nizarch22
    @nizarch22 4 years ago +2

    I'm pretty sure the fact_iter(n) function (28:47) has an (n-1)-iteration while loop. It doesn't make sense that he said it was an (n)-iteration while loop.

  • @ThuyNguyen-bu9ge
    @ThuyNguyen-bu9ge 5 years ago +4

    You made me laugh, thanks for a great lecture!

  • @scorpio19771111
    @scorpio19771111 2 years ago

    Prof Grimson - a great pillar of edification at MIT. Homage!

  • @travishenagan7135
    @travishenagan7135 4 years ago

    This dude is very good

  • @UcupBet88
    @UcupBet88 2 years ago

    THIS PAGE IS OUR LECTURE TASK

  • @tombrady7390
    @tombrady7390 4 years ago

    MR grimson is dope

  • @darthglowball
    @darthglowball 6 years ago +2

    I don't get one thing. At 50:12 it says: "I'm only looping over tmp, which is at most going to be len(L1) long", but that would be the best case. The worst case is when the lengths of L1 and L2 are the same and all of the numbers inside both are the same (let's say all 1's); then tmp will be of length len(L1)^2 because the if statement in the first nested loop will always be true. Only then can I see O(n^2) being applicable to the second loop (in that worst case the implicit "e in res" loop has a constant run time because the res array doesn't get bigger than one item). If I'm wrong, I cannot see how O(n^2) is applicable to the second loop.

    • @garrysohi4540
      @garrysohi4540 5 years ago

      There are two loops in the second part: first for each 'e' in tmp, and then for each such 'e' it is checked whether 'e' is already in res. So with two nested loops, the second part is also quadratic.

    • @darthglowball
      @darthglowball 5 years ago

      top coder, I know that there are two loops in the second half of the code, in which "e in res" is an implicit loop. But when the teacher says "I'm only looping over tmp, which is at most going to be len(L1) long", that is false. To prove my point, imagine L1 and L2 to be filled with all 1's (L1 = [1,1,1,1,1], L2 = [1,1,1,1,1]). In that case, the line if e1 == e2 will always be true and execute on every iteration of its inner loop, thus making tmp the size of len(L1)*len(L2) (and if len(L1)==len(L2) then tmp will be len(L1)^2 long). I agree though that in such a case O(n^2) is still applicable to the lower half of the code as well as to all of the code.
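
      For anyone reading this thread without the slides, the function being debated builds tmp from every matching pair and then deduplicates it into res, roughly like the sketch below (reconstructed from the thread, so it may not match the slide exactly). With L1 = L2 = [1,1,1,1,1] the if test in the nested loop is always true, so tmp really does grow to len(L1)*len(L2) entries, which is the scenario the two comments above are arguing about.

      def intersect(L1, L2):
          # First part: collect every matching pair, duplicates included.
          tmp = []
          for e1 in L1:
              for e2 in L2:
                  if e1 == e2:
                      tmp.append(e1)
          # Second part: keep only unique elements.
          res = []
          for e in tmp:
              if not (e in res):
                  res.append(e)
          return res

      # With the all-1s example from the comment above:
      print(intersect([1, 1, 1, 1, 1], [1, 1, 1, 1, 1]))  # [1], after building a 25-element tmp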

  • @davidjames1684
    @davidjames1684 3 years ago

    Since when is linearly searching half of an unsorted list the average case? That depends highly on the data. For example, if you had 1000 random numbers from 1 to 100 in the list and you wanted to search for 75, you would likely find it WAY before searching half of the list.

  • @Tintak_hatpin
    @Tintak_hatpin 4 years ago

    For that unsorted list, I didn't understand how the complexity is (1+4n+1)?

  • @monkarden2692
    @monkarden2692 3 years ago

    Wouldn't multiplying two very large numbers be more computationally expensive than multiplying one very large number with, let's say, 5? If yes, how come the factorial algorithm is considered to have the same level of complexity as the simple iteration with 5 steps inside?
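
    That is a fair question: the lecture's counting model treats each multiplication as one unit-cost operation, which is a reasonable approximation for machine-sized numbers but not for arbitrarily large Python integers, whose multiplication cost grows with the number of digits. A quick, unscientific demonstration:

    import time

    for digits in (10_000, 100_000, 1_000_000):
        a = 10 ** digits                    # an integer with roughly `digits` decimal digits
        t0 = time.perf_counter()
        _ = a * a                           # big-integer multiplication is not constant time
        print(digits, "digits:", round(time.perf_counter() - t0, 4), "s")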

  • @kaos092
    @kaos092 3 years ago +1

    Where does he get 1+6n+1 when nothing else changed in the program to make it 1+5n+1?
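
    I don't know which slide that count comes from, but one plausible source of extra per-pass operations is the sorted-list search discussed around 44:02, which does a second index-and-compare on each pass so it can stop early. A sketch of that style of search (not necessarily the slide's exact code):

    def search_sorted(L, e):
        # Assumes L is sorted in increasing order.
        for i in range(len(L)):
            if L[i] == e:        # index + compare
                return True
            if L[i] > e:         # a second index + compare each pass
                return False     # we are past where e could be
        return False

    # More operations per pass than the unsorted search, and it can stop early,
    # but the worst case is still linear: O(len(L)).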

  • @eceakdeniz4049
    @eceakdeniz4049 3 years ago +5

    48:42 Why would the worst-case scenario be when the lists are the same length and none of the elements in L1 are in L2? In that case, the program wouldn't go through all the elements in L1, since it breaks out of the outer loop, returning False, after going through L2 once and finding no matching elements. Shouldn't the worst-case scenario be when the program goes through all the elements in both the inner and outer loops? For instance, that case could be when L1 is [1, 2] and L2 is [1, 3, 4, ...]. Am I miscalculating something? I'd appreciate it if someone could help me.

    • @jackofnotrades15
      @jackofnotrades15 3 years ago +1

      No, you are right; he said it wrong. This comment needs to be at the top.

    • @mdrfckrr
      @mdrfckrr 3 years ago

      Nope.
      It takes each element in L1 and checks for it in L2. If it doesn't find it, it breaks out of the inner loop and takes another element of L1 to go through L2 again, and again, and again... until it reaches the end of L1.
      So he is right that for len(L1) = len(L2) AND no elements in common between L1 and L2, this is quadratic.
      That's my reasoning, but maybe I miscalculated something ;-)

    • @hieuvuminh1653
      @hieuvuminh1653 3 years ago +2

      @Ece Akdeniz @mdrfckrr The worst case is probably when both lists are the same length and all of the elements in L1 and L2 are the same. Let's say both of them have n elements. On the first iteration you only need to compare one time, on the second iteration you need to compare 2 times, on the 3rd iteration you need to compare 3 times. The sum over the iterations is 1 + 2 + 3 + ... + n = n(n+1)/2, which is O(n^2)

    • @aidenigelson9826
      @aidenigelson9826 2 years ago

      @@hieuvuminh1653 shouldn't it be n² plus n? N² for the nested loop and n for the second loop, since it's at worst iterating over the length of the list, so n. And due to the addition principle it'd be O(n+n²) = O(n²)

    • @aidenigelson9826
      @aidenigelson9826 2 years ago

      Oh he's also counting the check again for the if condition? I don't get the second nested loop
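
      For anyone following this thread without the slides, the function being analyzed is roughly the sketch below (reconstructed from the discussion, so details may differ from the slide). Note that with this shape, if the very first element of L1 is missing from L2, the function scans L2 once and returns False immediately, which is the crux of the disagreement above; the genuinely expensive inputs are the ones where every inner scan has to run to (or near) the end of L2, giving on the order of len(L1)*len(L2) comparisons.

      def is_subset(L1, L2):
          # True if every element of L1 also appears somewhere in L2.
          for e1 in L1:
              matched = False
              for e2 in L2:
                  if e1 == e2:
                      matched = True
                      break          # found e1, stop scanning L2
              if not matched:
                  return False       # e1 is not in L2: give up right away
          return True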

  • @davinmercier2895
    @davinmercier2895 2 years ago +2

    "Omicron" sounds like something from Futurama lol 26:20

  • @Ben-bg2lp
    @Ben-bg2lp 2 years ago

    The lady professor's lecture doesn't even compare to professor Grimson's. He is the absolute master at his craft. With his silky voice, his graceful gestures and constant tonal changes, I am watching a brilliant play while learning.

  • @yookoT
    @yookoT 2 years ago +1

    Python 3.8 and higher no longer support time.clock(). Use time.perf_counter() to replace time.clock().
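
    Right: time.clock() was deprecated in Python 3.3 and removed in 3.8, so the timing experiment from the lecture becomes something like this minimal sketch (the function being timed here is just an arbitrary example, not the slide's code):

    import time

    def c_to_f(c):
        return c * 9 / 5 + 32

    t0 = time.perf_counter()            # replaces the old time.clock()
    c_to_f(100000)
    elapsed = time.perf_counter() - t0
    print("c_to_f(100000) took", elapsed, "seconds")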

  • @henrikmanukyan3152
    @henrikmanukyan3152 7 months ago

    25:16 is about what we are trying to achieve in this lecture

  • @nirvana7420
    @nirvana7420 6 years ago +1

    This is cool. But what I'm after is: how can I improve my complexity? I can determine the complexity, but how can I overcome this?

  • @bananastuff2840
    @bananastuff2840 2 years ago

    At 46:51, what does `if not matched:` mean? If what is not equal to matched?

    • @peasant7214
      @peasant7214 2 years ago

      matched is a boolean variable:
      if it's true, the if statement body always runs;
      if it's false, it never does.
      `if BOOLEAN_VARIABLE:` executes the body when the boolean variable is true, and not when it is false; so `if not matched:` runs the body when matched is false.

  • @volleysmackz5960
    @volleysmackz5960 a year ago

    Summary of 50 mins:
    Always use the worst-case scenario to indicate the time taken by an algorithm; Big O notation is basically that. Remove additive and multiplicative constants in the operation-count expression computed for the algorithm. Focus on the dominant terms.
    Always try to make it faster, that is: O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n)
    Thanks!

  • @prateekmishra6245
    @prateekmishra6245 4 years ago +2

    BIG O at 26:14

  • @programmer1010
    @programmer1010 a year ago

    So you say that the count varies across implementations because a while loop has two basic operations, assignment and testing, but a for loop only does assignment. So you say that while and for actually take the same time, but when we want to count operations, the counts differ.
    ANYWAY, LET'S JUST COUNT 2 OPERATIONS WHEN COUNTING OPERATIONS IN A FOR LOOP, IF FOR AND WHILE IN FACT TAKE THE SAME TIME
    16:20 Doesn't the for loop test whether i is less than x+1? You say it only assigns i values from range, so it's one operation.
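
    To make the comparison concrete, here are the two loop shapes being contrasted (a sketch, not the slide's code). Under the lecture's convention the explicit while version carries an extra test and an extra increment per pass, even though both compute the same thing in the same O(x) time; only the constant factor differs, and Big O ignores it.

    def mysum_for(x):
        total = 0
        for i in range(x + 1):     # counted as: bind i          (~1 op per pass)
            total += i             # add + assign                 (2 ops per pass)
        return total

    def mysum_while(x):
        total = 0
        i = 0
        while i <= x:              # counted as: test i <= x      (1 op per pass)
            total += i             # add + assign                 (2 ops per pass)
            i += 1                 # add + assign                 (2 ops per pass)
        return total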

  • @nataliarowczenio4190
    @nataliarowczenio4190 4 years ago +4

    My heart breaks at these soulless students

    • @kunalrustagi7901
      @kunalrustagi7901 4 years ago +1

      Yes. It sure does. But you gotta admit the efforts these students have taken to get into MIT.

  • @feliperi4350
    @feliperi4350 7 years ago +8

    I miss when Eric Grimson gave candy to the students

  • @kyawn5115
    @kyawn5115 4 years ago

    41:26
    shouldn't it be 1+3n+1, instead of 1+4n+1? How come there is 4n?

    • @anonlegion9096
      @anonlegion9096 3 years ago

      perhaps because indexing a list is also O(1), so you have two assignments for " i " and "found", 1 comparison in the conditional and another 1 in the conditional for indexing i in L ( L[i] )

  • @programmer1010
    @programmer1010 a year ago

    48:45 is false.
    The worst case is when the lists contain the same elements, so matched is always true, the loop goes to the end, and it returns True.

  • @programmer1010
    @programmer1010 a year ago +1

    14:38 You are going through that loop x+1 times

  • @stevmah
    @stevmah 5 years ago +1

    @31:50 How come 3**n is the dominant term? Excuse my ignorance; it would be helpful if someone could clarify.

    • @AlexTheGreatish
      @AlexTheGreatish 5 years ago +1

      When you compare n^30 with something like 3^n, it is clear that for large values of n, 3^n will be much greater. You can test this yourself with a calculator. Type in something like n^5 (replace n with any digit), then increment n and see how the number grows. Then type in 3^n (replace n with any digit), then increment n and see how the value changes. After a certain number of increments it will become clear that 3^n grows much faster than n^30 the more you increment it. You can also use the charts at 32:40 to compare the increase between the quadratic and exponential growths. With smaller numbers the difference might not seem like much, but you'll find that the more you increment n, the bigger the difference between quadratic and exponential, hence why 3^n is the dominant term. Hope this clears it up a bit for you.
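
      A quick numerical check of the same point (an exponential eventually dominates any fixed power, although the crossover for n^30 vs 3^n does not happen until n is around 135):

      for n in (10, 100, 135, 200):
          print(n, n**30 > 3**n)
      # 10  True   (the power is still far ahead)
      # 100 True
      # 135 False  (3^n has overtaken n^30)
      # 200 False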

  • @rolandheinze7182
    @rolandheinze7182 5 years ago

    I'm the Fonz... 'a','a','a',...! Pretty good lol

  • @jeeperscreeperson8480
    @jeeperscreeperson8480 2 years ago

    How is "total += 1" is 2 ops, but "return a * b/c + d" is just 3?

    • @Ari-jm6xx
      @Ari-jm6xx 2 years ago

      Maybe because of the operations: multiplication, division, and addition? Although the 'return' makes me think four operations. I don't know, I'm very new to this.

    • @jeeperscreeperson8480
      @jeeperscreeperson8480 2 years ago

      @@Ari-jm6xx Return is also an operation. It's like an assignment. So it should be 4 ops. But honestly, this sort of analysis doesn't even make sense in python since python is an interpreted language. It's impossible to know what's really happening when this line of code is executed.

  • @mikemihay
    @mikemihay 4 years ago

    Big O notation: (actually minute 26)
    czcams.com/video/o9nW0uBqvEo/video.html

  • @anonviewerciv
    @anonviewerciv 3 years ago

    Efficiency tricks. (16:30)

  • @JacobKrajewski
    @JacobKrajewski 5 years ago +1

    Who is editing these? SHOW THE SLIDES (please)

    • @mitocw
      @mitocw 5 years ago +2

      The slides (and code snippets, readings, assignments) are available on MIT OpenCourseWare at: ocw.mit.edu/6-0001F16. Best wishes on your studies!

  • @crocopie
    @crocopie 6 years ago +1

    November 8?

  • @obli8984
    @obli8984 2 years ago

    I like it when he says "dumb"... anyway, superb lesson

  • @aaryan3461
    @aaryan3461 5 years ago +1

    @32.13 n^30 grows faster than 3^n. Then why is Prof taking it to be O(3^n)?

    • @mohammadsultan935
      @mohammadsultan935 5 years ago +1

      It doesn't. Try relatively large numbers on your calculator like n=150 and you'll see 3^n grows much faster.

    • @aaryan3461
      @aaryan3461 5 years ago

      @@mohammadsultan935 You're absolutely right! Thanks.

  • @Cartoon5784
    @Cartoon5784 2 years ago

    If you get anything out of this lecture, let it be this: order of growth

  • @pew5379
    @pew5379 4 years ago

    i almost did it

  • @marco.nascimento
    @marco.nascimento 5 years ago

    Extraordinary lecture, I love his joking, haha. Shame on the boring MIT students

  • @NoName-ef2gv
    @NoName-ef2gv 3 years ago +3

    48:23
    The worst case is not right: "worst case when L1 and L2 same length, none of elements of L1 in L2"
    If none of the elements of L1 are in L2, then during the first run of the loop the function will return False and exit.
    The worst case should be, in my opinion, every item in L1 is at the last possible position of L2.
    For example, first item of L1 is at last position of L2, second item of L1 is at the second to last position of L2, third item is at the third to last position of L2, etc.
    When L1 and L2 have the same length, the order of growth would be len(L2)+(len(L2)-1)+(len(L2)-2)+(len(L2)-3)+...+1

  • @ngDetecter
    @ngDetecter 2 years ago

    Dan Aykroyd

  • @SourabhBhat
    @SourabhBhat 7 years ago +4

    The lecture is good, but the video is very bad. The slides are not visible while the professor is discussing their content. It is very difficult to follow along.

    • @mitocw
      @mitocw 7 years ago +12

      The lecture slides, code snippets, and more are available at MIT OpenCourseWare at: ocw.mit.edu/6-0001F16.

    • @SourabhBhat
      @SourabhBhat 7 years ago

      Thank you ! That helped a lot.

    • @JoeWong81
      @JoeWong81 7 years ago +3

      I often think the same thing, but you can just hit the spacebar to instantly pause and study the slides.

  • @josephhydiga8331
    @josephhydiga8331 4 years ago +1

    The seventeen people who disliked this... Why?

  • @prajwalgowda2628
    @prajwalgowda2628 a year ago

    Nov 8th

  • @mindmantra-digital
    @mindmantra-digital 2 years ago +1

    Honestly, I hate that those boards are not being used. PPT is the worst medium for teaching in a classroom setting. It is even more heart-wrenching that Prof. Grimson can use the boards pretty well (just look at the lectures from Fall '08).

  • @choonghuh
    @choonghuh 4 years ago

    November 8th! My birthday!

  • @tiankonghewo
    @tiankonghewo 7 years ago +3

    Google's stuff is still better after all.

  • @viaprenestina3894
    @viaprenestina3894 3 years ago

    The man is continuously pointing to a board that, I assume, is NOT visible in this video

  • @prajwalgowda2628
    @prajwalgowda2628 a year ago +1

    Modi ji, you're here 😮🙏

  • @michaeldunlavey6015
    @michaeldunlavey6015 a year ago

    Very well done. But...
    Big-O, and little snippets of code are fine, but don't you guys ever work with *_real_* software? Million-liners? The kind that can get orders of magnitude speed improvement, not by messing with searching/sorting algorithms, but by finding out what it's doing that doesn't NEED to be done?
    Suppose you have some monster code that takes 100 seconds to run. Suppose it has routine A calling B in a jillion places, and in 60 of those seconds A is calling B for a lousy reason.** If you knew that you could fix it and get it down to 40 seconds, for a 2.5 times speedup. Wouldn't you want to fix it?
    Then suppose routine C calls D for a lousy reason for 30 of those seconds. After you fix the A-B problem, the C-D problem still takes 30 seconds, but now it's 30 seconds out of 40, not 100. So maybe before it wasn't so much worth fixing, but now it really is! If you fix it you go down to 10 seconds. That's a 10-times speedup.
    Then what about E-F, which was 8 seconds, and so on? That's how you get orders of magnitude constant factors.
    I bet you doubt that real software has such problems. That doubt means they are not looked for, and that's why they are not found.
    Do this. Run the program, in a debug build so it has symbols, under any debugger like GDB. Start it, and stop it at random by hitting Ctrl-C. Then do BT to see the stack trace. Do this several times. Chances are very good that you will see the A-B problem on multiple stack traces. Profiler programs will not see it for a whole variety of reasons I could go into. And notice - you found the problem without measuring. You don't know how big it is; all you know is that it's big. Its bigness is what made it visible! If you're really into "measure, measure", just take more samples. I never take more than 20, usually 10 or less. The bigger the problem is, the sooner you see it.
    Then maybe you guys will see beyond big-O.
    ** Typical speed bugs:
    - Calling *_new_* or *_delete_* when a prior object could just be re-used.
    - Calling *_pushback_* which deletes, reallocates, and copies an array just to make it 1 element bigger.
    - I/O time formatting or writing lines to a log file that nobody reads.
    - I/O time reading a DLL file to get a string translation resource, when the string doesn't need to be translated.
    - Calling an array-indexing function to index an array, to make sure the index is within range, when you know the index cannot be out of range.
    - Calling *_==_* between strings to check program state, when integers could be used.
    ... there is no limit to the ways time can be wasted.
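
    The comment above describes random-pause sampling with GDB on native code. For what it's worth, here is a toy Python analogue of the same idea (assumes a Unix-like system; the workload functions are made up purely for illustration): interrupt the running program every so often, print the stack, and notice which call keeps showing up.

    import signal
    import time
    import traceback

    def dump_stack(signum, frame):
        # Each "pause" prints the current call stack; the lines that keep
        # appearing across samples are where the time is going.
        print("---- sample ----")
        traceback.print_stack(frame)

    def slow_helper():
        time.sleep(0.01)        # stand-in for work done "for a lousy reason"

    def workload():
        for _ in range(500):
            slow_helper()

    signal.signal(signal.SIGALRM, dump_stack)
    signal.setitimer(signal.ITIMER_REAL, 1.0, 1.0)   # take a sample roughly once a second
    workload()
    signal.setitimer(signal.ITIMER_REAL, 0)          # stop sampling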

  • @thangible
    @thangible 5 years ago

    35:13 bleh bleh bleh

  • @senatorpoopypants7182
    @senatorpoopypants7182 2 years ago

    Unpopular opinion : I thought he was sufficiently funny.

  • @SaravanaKumar-ci1zk
    @SaravanaKumar-ci1zk 2 years ago

    Age is not equal to knowledge...

  • @bengbeng2005
    @bengbeng2005 5 years ago +2

    Oooh, I missed you, teacher, and I miss your jokes too

    • @rj-nj3uk
      @rj-nj3uk 5 years ago +1

      Did he die or what? Why do you people miss him?