Why Moore’s Law Matters

  • added Mar 25, 2023
  • I finished this video before the passing of Gordon Moore. Moving it up now. RIP to a legend
    Links:
    - The Asianometry Newsletter: asianometry.com
    - Patreon: / asianometry
    - Twitter: / asianometry

Comments • 442

  • @Asianometry
    @Asianometry  a year ago +1355

    RIP, Gordon Moore.

  • @AlanTheBeast100
    @AlanTheBeast100  a year ago +646

    "...a lucky guess that got a lot more publicity than it deserved."
    - Gordon Moore.

    • @matsv201
      @matsv201  a year ago +9

      Well. With a span between 1 year and 2 year it was a pretty wide guess. Also they did throttle there development during periods of the 80s and 90s. Basically cheating.

    • @AlanTheBeast100
      @AlanTheBeast100  a year ago +24

      @@matsv201 It was a very good guess IMO.

    • @Connection-Lost
      @Connection-Lost  a year ago +2

      @@matsv201 Their*

    • @Connection-Lost
      @Connection-Lost  a year ago

      @@AlanTheBeast100 You not correcting him means you must be low IQ as well

    • @AlanTheBeast100
      @AlanTheBeast100  a year ago +3

      @@Connection-Lost No, it means that I know that the predictions based on the very small data set that Moore had at the beginning have stood the test of time well. It's never going to be perfect. Indeed, look at the graph in the video ( @21:30 ): a pretty straight line on a log graph? Hmm? You do _know_ what that means, right? I mean you passed grade 9 math? Right?
      So, a little algebra
      1) 1971: about 2500 transistors
      2) 2020: about 35B transistors
      Based on that, if the number of transistors goes up 1.4x every year (on average), then you get close to the 2020 count.
      Of course you can adjust the limits as you will and get slightly different numbers. For example if I make the end number 50B, then the factor would be: 1.41 every year.
      So before accusing people of a low IQ, maybe you should do some basic high school math first and see if your own IQ is up to a level from which to throw cheap shots.
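The arithmetic in the comment above is easy to check with a few lines of Python (the 1971 and 2020 transistor counts are the commenter's round figures, not authoritative data):

```python
# Commenter's round figures: ~2,500 transistors (1971) vs. ~35B (2020).
t0, t1 = 2500, 35e9
years = 2020 - 1971  # 49 years

# Implied constant annual growth factor r, from t1 = t0 * r**years
r = (t1 / t0) ** (1 / years)
print(round(r, 2))    # ~1.4x per year

# Using 50B as the 2020 endpoint instead, as the comment suggests:
r50 = (50e9 / t0) ** (1 / years)
print(round(r50, 2))  # ~1.41
```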

  • @tomhalla426
    @tomhalla426  a year ago +130

    Another issue is that Moore’s Law only applies to microchips. Some politicians act as if similar advances apply to solar cells or batteries.

    • @PainterVierax
      @PainterVierax  a year ago +18

      yet, it barely applies to integrated circuits, as most of the improvements over the last decades have come from the design of the chips/systems and from smarter, finer-tuned algorithms. And the raw computational power gain has mainly been used to ease software programming and portability. (e.g. cache levels, multicore, SMT, interposers, ASICs, DSPs, FPGAs, GPGPU, higher-level compiled languages, HALs, APIs, a metric ton of interpreted languages and cloud/server/Web-based apps)

    • @julioguardado
      @julioguardado  a year ago +3

      One of my pet peeves. There will be some spillover of 300mm wafers to solar cells, which run mostly on 200mm because the equipment is cheaper and often second-hand. The switchover to 300mm claimed a cost advantage based on wafer size ratio. The same should apply to solar, but I am not a solar manufacturing guy. The same could apply to LEDs, perhaps...

    • @tomhalla426
      @tomhalla426  a year ago +5

      @@julioguardado My understanding is that solar cells are already at a high percentage of theoretical performance (as are windmills), so only more efficient manufacture is possible.

    • @julioguardado
      @julioguardado  a year ago +1

      @@tomhalla426 Same here. Polysilicon is king and its efficiency hasn't changed from around 20% iirc. They're still looking for that high efficiency material that can be manufactured cheaply. Don't see any breakthroughs there.

  • @Palmit_
    @Palmit_  a year ago +225

    a forecast of a decade, based on a few years' data, just to fit sales targets... is now more than legendary. Mythological, as it were. Amazing. Thanks Jon. Really interesting stuff.

  • @antman7673
    @antman7673  a year ago +64

    Hearing the story of Moore's Law, it is a self-fulfilling prophecy:
    We wouldn't have the current level of tech without the ambitions of Moore's Law.

  • @chengong388
    @chengong388  a year ago +463

    Camera lenses got better because designers could simply let the software run random variations of the lens over and over again to find a better design; the more compute power, the more variations you can try and the more likely it is to find a better design. Modern smartphone camera lenses are so ridiculously complex; each element basically has a radial wavy surface that makes no sense but somehow focuses the light just the right way at the end with minimal aberrations.

    • @WaterZer0
      @WaterZer0  a year ago +103

      Ah the ole brute force approach.

    • @kylinblue
      @kylinblue  a year ago +13

      Could you show us an example?

    • @seventhtenth
      @seventhtenth  a year ago +79

      not entirely true, a lot is material science and a lot is micro fabrication costs

    • @Laundry_Hamper
      @Laundry_Hamper  a year ago +57

      Also important is no longer needing to produce a geometrically correct image at the focal plane. Digital image corrections allow you to trade distortion for sharpness

    • @musaran2
      @musaran2  a year ago

      @@kylinblue Search "polynomial optics" and get ready for a headache.
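The brute-force design search described in this thread can be sketched in a few lines of Python. The "merit function" and its target coefficients below are invented stand-ins for illustration, not real lens physics:

```python
import random

# Toy stand-in for a lens merit function: pretend each design is a list of
# surface coefficients and "aberration" is squared distance from some optimum.
# Real lens design would use ray-traced spot sizes; this target is invented.
TARGET = [0.3, -1.2, 0.7, 0.05]

def merit(design):
    return sum((d - t) ** 2 for d, t in zip(design, TARGET))

def random_search(trials, seed=0):
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(trials):
        design = [rng.uniform(-2, 2) for _ in TARGET]
        score = merit(design)
        if score < best_score:
            best, best_score = design, score
    return best, best_score

# More compute -> more trials -> better (or equal) designs.
_, few = random_search(1_000)
_, many = random_search(100_000)
assert many <= few
```

With the same seed, the longer run replays the shorter run's candidates first, so its best score can only improve; that is the "more compute, better lens" dynamic in miniature.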

  • @sambojinbojin-sam6550
    @sambojinbojin-sam6550  a year ago +19

    Thanks for telling us Moore's Lore, not just the over-quoted "Law".

  • @jordanwalsh1691
    @jordanwalsh1691  a year ago +137

    Really interesting video. One small quibble, 2:19 - 2:49, "That 'roughly' is doing some serious heavy lifting". Not really, in my opinion. As you point out, 50*2^10 is only 51,200, but the exponent is so large that you only need to increase the base by 2.5%, to 2.05, to make up the difference: 50*2.05^10 = 65,540.
    Personally I think 2.05 falls comfortably within the neighbourhood of "roughly" 2.
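The commenter's figures check out:

```python
# 50 transistors doubling 10 times:
a = 50 * 2 ** 10
print(a)  # 51200

# Raising the base just 2.5%, from 2 to 2.05, covers the gap to ~65,000:
b = round(50 * 2.05 ** 10)
print(b)  # 65540
```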

    • @woofcaptain8212
      @woofcaptain8212  a year ago +2

      That was my thought

    • @spodule6000
      @spodule6000  a year ago +1

      I came here to make that comment. Thanks for saving me the trouble!

  • @hushedupmakiki
    @hushedupmakiki  a year ago +63

    21:00 - when mythology is so ingrained you create a global industry of adherents. Gordon Moore's passing really struck something in us, even in very adjacent semiconductor research/industries.

  • @covert0overt_810
    @covert0overt_810  a year ago +14

    We need Moore transistors….

  • @afterSt0rm
    @afterSt0rm  a year ago +85

    Well, I'll take the opportunity that creators generally see the initial comments to thank you a lot for the amazing content you've been putting out. I wish you only the best! Hugs from (just) another one of your Brazilian viewers ❤

    • @gordonfreeman9965
      @gordonfreeman9965  a year ago +1

      Nice to see that I'm not the only Brazilian that knows this amazing channel kkkkkkkkkk

    • @OgbondSandvol
      @OgbondSandvol  a year ago +1

      @@gordonfreeman9965 Now there are three of us.

    • @DanielLavedoniodeLima_DLL
      @DanielLavedoniodeLima_DLL  a year ago +3

      It was nice to see as well that a Brazilian was involved in the last paper that he presented in the video

    • @zerotwo7319
      @zerotwo7319  a year ago

      GG izi shrink brazil to a micro size so we can be more efficient also. Fit more brazils inside brazil

  • @mrrolandlawrence
    @mrrolandlawrence  a year ago +58

    rip GM. a real visionary & titan of semiconductors!

    • @chrisbova9686
      @chrisbova9686  a year ago +1

      Humanity would be immeasurably better off without tech, or those who would enrich themselves from the death of humanity.

    • @Vysair
      @Vysair  a year ago

      @@chrisbova9686 you mean extinction? Without tech, you are back to middle age/caveman

    • @---------c5741
      @---------c5741  a year ago +10

      ​@@chrisbova9686 ironic u need to spread this wisdom using technology 😅

    • @chrisbova9686
      @chrisbova9686  a year ago

      @@---------c5741 indeed. Smoke signals aren't dependable, but won't ruin the entire life experience.

    • @wyw201
      @wyw201  a year ago

      @@chrisbova9686 Wouldn't you say the transistor is one of mankind's greatest inventions?

  • @qweewqqweewq31313131
    @qweewqqweewq31313131  a year ago +20

    Good info.
    However I just want to point out that "DRAM" should be pronounced as "DEE-RAM"
    Like SRAM is "S - RAM"
    They should be the same logic when you pronounce the word

    • @matthiaskamm
      @matthiaskamm  a year ago +7

      yes please, I cringe every time I hear "dram" instead of "Dee-ram" :-) Great video.

    • @shanent5793
      @shanent5793  a year ago

      Whose prescription was that?

    • @seanwieland9763
      @seanwieland9763  a year ago +1

      Same with people who say “oh-led” instead of “O-L-E-D”.

    • @juhotuho10
      @juhotuho10  a year ago +2

      not the first time he does it; most likely he is intentionally saying it like this.
      Don't know why, though.

  • @MillionMileDrive
    @MillionMileDrive  a year ago +21

    Regardless of whether Moore's Law holds true or not, it's always good to have a target. It sets a goal for everyone to innovate and reach.

    • @chrisbova9686
      @chrisbova9686  a year ago +1

      Not if it hurts humanity. Tech is garbage, life is good! Carbon is good!

    • @jb5631
      @jb5631  a year ago +6

      ​@@chrisbova9686 LOL, go offline

  • @niosanfrancisco
    @niosanfrancisco  a year ago +28

    Excellent presentation. RIP Dr. Moore.

  • @johnhorner5711
    @johnhorner5711  a year ago +25

    Thank you for yet another educational video. The timing of its release is uncanny. RIP Gordon Moore indeed. A parallel technology trend which doesn't get as much attention is magnetic storage (disk drives). This has been on a similar trajectory as integrated circuits have been, and has been every bit as important to the development of technology. Data storage is now so plentiful and cheap that we don't even think about it. YouTube allows anyone to upload unlimited video content for the world to watch. That is thanks to the magnetic storage revolution. Maybe you could do a video on that topic at some point.

    • @InvictraX
      @InvictraX  a year ago

      I did notice a big gap in the development of data storage. Capacity growth has slowed dramatically.

  • @samueladams2340
    @samueladams2340  a year ago +2

    Always look forward to your videos. Well researched and in depth.

  • @bgop346
    @bgop346  a year ago +5

    good video on history, and I think this is also one of my favourite quotes, from "The Intel Trinity":
    "Gordon more than anyone else understood that it wasn't really a law in the sense that its fulfillment over the years was inevitable, but rather that it was a unique cultural contract made between the semiconductor industry and the rest of the world to double chip performance every couple of years and thus usher in an era of continuous, rapid technological innovation and the life-changing products that innovation produced.
    That much was understood pretty quickly by everyone in the electronics industry, and it wasn't long before most tech companies were designing their future products in anticipation of the future chip generations promised by Moore's Law. But what Gordon Moore understood before and better than anyone was that his law was also an incredibly powerful business strategy. As long as Intel made the law the heart of its business model, as long as it made the predictions of the law its polestar, and as long as it never, ever let itself fall behind the pace of the law, the company would be unstoppable. As Gordon would have put it, Moore's Law was like the speed of light. It was an upper boundary. If you tried to exceed its pace, as Gene Amdahl did at Trilogy, your wings would fall off. Conversely, if you fell off the law's pace, you quickly drew a swarm of competitors. But if you could stay in the groove, as Intel did for forty years, you were uncatchable."

  • @bloqk16
    @bloqk16  a year ago +1

    I recall back in the 1990s [US], when it came to PCs, Moore's Law was getting into the lexicon of engineering professionals who used PCs in their work, as the rapidly growing power of succeeding Pentium chips was rendering 18-month-old PCs obsolete. It was an amazing era, with PC processing power increasing on an annual basis.
    The engineering company I worked at, as a means to unload the _obsolete_ Pentium PCs they had [less than two years old], was selling them to employees for around $100 [US]. Yet those Pentium PCs had been purchased at around $2.5K each, when new, two years prior.

  • @AlokRayadurga
    @AlokRayadurga  a year ago +4

    Over the past weekend, I've been thinking of Moore, the traitorous 8, the last AMD video that you had out recently, and the impact these men made on the industry. Thanks for this video (or tribute?)

  • @RoyAntaw
    @RoyAntaw  a year ago +15

    Gordon Moore a true pioneer RIP.

  • @stephanhart9941
    @stephanhart9941  a year ago +5

    The Broken Silicon episode you were on was 🔥!!!

  • @ColeL88
    @ColeL88  a year ago +4

    Was a nice surprise to see a picture of mine used as the thumbnail and another used part way through the video!
    Also thanks for listing the source :D

  • @Freak80MC
    @Freak80MC  a year ago +16

    Listening to this, almost makes me wonder if Moore's Law was more of a self-fulfilling prophecy. Something that motivated people to push harder for technological advancement, which ended up making it come true.

    • @m_sedziwoj
      @m_sedziwoj  a year ago

      Look at Ray Kurzweil's predictions; they are more interesting than the limited Moore's Law.

  • @StevieFQ
    @StevieFQ  a year ago +7

    I would never argue that we don't need improvements in compute performance, but you can make the related statement that improvements in computing power (along with a seemingly never-ending thirst for more SW developers) have led to less efficient SW that doesn't take adequate advantage of compute performance.

    • @PainterVierax
      @PainterVierax  a year ago +5

      True. But this lack of code efficiency also comes with many advantages, like ease of writing, prototyping, debugging, reviewing, correcting, improving, porting or even installing programs. All of the heavy lifting is done by a few software bricks now (compilers, interpreters, OSes, HALs, APIs, game engines, Web browsers).
      Even in embedded, it has become way more practical to restrict ASM or RTOS usage to only when it's imperatively required.

    • @son_guhun
      @son_guhun  11 months ago

      This claim is sort of absurd on its face. If it were more profitable to produce more efficient software, then that's what companies would make. However, the increasing complexity of business domains and infrastructure, and the sheer number of different platforms a piece of code must be compatible with, make it extremely inefficient (in terms of development costs) to attempt to squeeze every last bit of performance from a chip by writing code in low-level languages.
      Simply put, there's nothing stopping you from putting out highly optimized software today. But you would simply get out-competed unless you were working on a very specific domain or platform. So it's not that powerful hardware leads to less efficient software, but that less efficient software is usually more competitively produced and priced. Hardware performance simply dictates the minimum point at which software is simply not usable due to its inefficiency. For most applications, "adequate" advantage of compute performance IS the ability to produce less efficient software that is still usable, because this means you can produce MORE software or tackle problems that are more complex.
      If a browser already opens a webpage in less than 2 seconds, nobody realistically needs it to be faster. It (the browser or the webpage) just needs better features, bugfixes or increased stability. Or maybe just less costly maintenance.

    • @PainterVierax
      @PainterVierax  11 months ago +2

      @@son_guhun it still really depends on the application.
      Sure, Linux got rid of its old ASM pieces to be plain C (and now Rust), but in the meantime Android mostly got rid of the Java runtime to compile software during install like BSDs do, despite the vast improvement of ARM SoCs.
      Similarly, developing production software for microcontrollers is not done with extremely inefficient/portable code like MicroPython or Arduino. And sometimes ASM is still used for critical timing functions.
      Same thing with desktop applications: developing small tools and games with high-level languages and APIs can be done without taxing too much of the increasing computing resources (even on laptops and embedded), but that's never how AAA games or production applications are developed, since those want to use every resource available to speed up execution.

  • @ciCCapROSTi
    @ciCCapROSTi  a year ago +5

    I fucking love how this guy does not separate jokes and information. You have to actually pay attention to distinguish.

  • @supremebeme
    @supremebeme  a year ago

    This is probably one of my top 10 channels. Keep it going!

  • @rollinwithunclepete824
    @rollinwithunclepete824  a year ago +15

    Excellent video, Jon! Among other things it explains why I was never sure what Moore's Law predicted. Was it a doubling every year, every 2 years, every 18 months? The thing itself, Moore's Law, has been redefined to fit the data: the doubling over a span of time.

  • @stachowi
    @stachowi  a year ago +23

    How do you pump out so much great content?

  • @lucidmoses
    @lucidmoses  a year ago +3

    And here I thought this was going to be some nonsense about how accurate it's been and how it will never end. Nice that you actually looked up the info first. Nicely done.

  • @acolyte1951
    @acolyte1951  a year ago +2

    I appreciate your take on what you called technological nihilism, that improving the speed (among other things) of electronics is a good and necessary thing. Even if the average consumer doesn't see it, the track record of these developments has indeed transformed the lives of many humans. Seemingly, for the better.

  • @evinoshima9923
    @evinoshima9923  a year ago +2

    I remember in the late 70s in Manila we were making a K&S 478 add-on that turned that manual wire bonder into one controlled by a microprocessor. Zilog, AMD, Intel were our customers... what an amazing time... our computer had no monitor, used a trackpad with a billiard ball in it, and had 12 KB of RAM!

  • @tonyv8925
    @tonyv8925  a year ago +9

    Wow, incredible history. I remember my first computer, a VIC 20, then upgrading to the C-64, then the 8086. The first computer I programmed was with Hollerith cards on a Univac that used magnetic ring memory and large drum magnetic tape. It was a simple employee hours/wages program. So many things have changed since then. My cell phone has more computing power than the biggest computer that our local college had at that time. Amazing!

    • @milantrcka121
      @milantrcka121  a year ago

      Which Univac -1108? Yes those were the days of dropped card stacks...

    • @Agent-ie3uv
      @Agent-ie3uv  a year ago +1

      Obviously a grandma, but where on earth did the notion come from that boomers can't use computers? 🤔🧐

  • @nickbensema3045
    @nickbensema3045  a year ago +4

    A few years ago I saw a Philosophy Tube video in which, for the first time, I heard Moore's Law referred to as a marketing term, as opposed to a venerable guideline for technological progress. This video illustrates that that label wasn't just left-wing cynicism but kind of accurate: it demonstrably instilled a confidence in progress that drove sales.

  • @matthewvenn
    @matthewvenn  a year ago +3

    Another great video Jon! The last few minutes were especially interesting for me - I'll read that paper. I'd like to know what other industries rely on increasing compute density. I think that mobile phones are an example, the industry plans around people needing to buy new phones to keep up with the latest software. But if phones stop getting more powerful, then that industry needs to rethink its financial plans. I'm also guessing that AI (especially the training side) will want more and more compute going forwards.

    • @m_sedziwoj
      @m_sedziwoj  a year ago

      Many people buy clothes each season, so I think the phone industry will find a way ;)

  • @mrlucasftw42
    @mrlucasftw42  a year ago +3

    Modern SSDs have really revolutionized computing for the base user: boot in sub-2 minutes, and even after Windows updates it can still often be sub-5 minutes.

    • @WaterZer0
      @WaterZer0  a year ago +8

      TWO MINUTES?!
      JESUS CHRIST
      What are you doing?!
      There's no way it should take more than 30 seconds *maximum* to load the OS.

  • @RobBCactive
    @RobBCactive  5 months ago

    There ought to have been discussion of Dennard scaling, which was a key driver of rapid processor improvement: node shrinks gave not only smaller and cheaper but also faster transistors at a constant energy cost, meaning there weren't the heat and power wall issues that later halted frequency scaling.
    I was a bit disappointed that this channel failed to mention that, as most people focus on the number of transistors, when multi-core is a response to physical limitations on uniprocessor performance.
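The classic Dennard scaling relations mentioned above can be written out as a small check. This is the idealized textbook picture (dynamic power ~ C·V²·f), with an arbitrary scaling factor k and unit starting values, not measured data:

```python
# Ideal Dennard scaling: linear dimensions, voltage, and capacitance shrink
# by 1/k while frequency rises by k, so power *density* stays constant.
def dennard_step(C, V, f, area, k=1.4):
    return C / k, V / k, f * k, area / k**2

C, V, f, area = 1.0, 1.0, 1.0, 1.0
power_density0 = (C * V**2 * f) / area  # dynamic power per unit area

C, V, f, area = dennard_step(C, V, f, area)
power_density1 = (C * V**2 * f) / area

# C/k * (V/k)^2 * f*k = C*V^2*f / k^2, and area also shrinks by k^2,
# so the density is unchanged:
assert abs(power_density0 - power_density1) < 1e-9
```

When voltage stopped scaling (around the mid-2000s), this identity broke, which is exactly the heat/power wall the comment refers to.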

  • @deem1819
    @deem1819  a year ago +4

    Never thought I'd hear an overlap between all the Dead Space lore channels I follow and my semiconductor manufacturing interests

  • @douginorlando6260
    @douginorlando6260  a year ago +2

    Shrinking transistor technology also led to lower power consumption solutions, particularly valuable for battery powered devices

    • @ttb1513
      @ttb1513  a year ago +2

      Yes. The shrinking of a transistor's area by 50%:
      1) Allows twice as many transistors on a chip with the same area.
      2) The same area implies basically the same cost, for twice as many transistors (far from costing twice as much).
      3) A transistor with 1/2 the area consumes 1/2 the power. With twice as many at 1/2 the power, the power consumption stays basically unchanged, for a chip with twice as many transistors.
      Neither the power nor the cost doubles when the number of transistors doubles in the same unit of area. This scaling phenomenon is the really important thing.
      If transistors stopped shrinking, compute that needs twice as many transistors would start costing twice as much and using twice the power. Do this for a few generations and you're at 8x the cost, 8x the power, and 8x the area... that will make you appreciate the transistor shrinkage advances we've had in the past.
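The 8x figure above is just three generations of doubling with no shrink, which a toy loop makes explicit (illustrative arithmetic, not measured data):

```python
# Without shrinking, cost and power track transistor count directly:
# doubling transistors each generation doubles both.
cost = power = 1.0
for _ in range(3):  # three generations of 2x transistor count
    cost *= 2
    power *= 2
print(cost, power)  # 8.0 8.0 -> the comment's "8x" after three generations
```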

  • @samueltan9279
    @samueltan9279  a year ago +4

    So basically Moore's Law was a self-fulfilling prophecy. The industry followed it not because of some universal physical law but because everyone in the industry tried their best to follow it, for various reasons.

  • @awesommee333
    @awesommee333  a year ago +5

    While certainly in the weather example the additional compute power has helped, it seems that these massive investments in compute power only improve the prediction by so much, and, given the increasing cost of transistors, you almost have to wonder whether the next step would be worth it.

  • @danielsuguwa746
    @danielsuguwa746  a year ago

    Interesting video, and thanks for the content! I only learned today that Dr. Moore passed away on Friday... RIP legend...

  • @paulmuaddib451
    @paulmuaddib451  a year ago +12

    I love the little bits of humor, wit and charm you add to each video, "...technically correct, the best kind of correct (from Futurama)", and that "Whomp Whomp".
    We see you, @Asianometry we see you. 😘

  • @1998awest
    @1998awest  a year ago +15

    Awesome video, Jon. I worked on Intel's 65nm node - you got most of the products of the time (Cedarmill, Yonah, Tukwila). Great summary on the main drivers keeping Moore's Law alive (advancements in design and lithography).
    From 2000 - 2010, Intel's biggest worry was being categorized a monopoly and broken up. Since then, Intel has lost its once-massive competitive advantage and has been surpassed by TSMC and Samsung. The company was a juggernaut with Moore, but after his retirement, hubris crept into the company culture, and mismanagement became the norm, IMO.

    • @razorbackroar
      @razorbackroar  a year ago +4

      Sick dude, that's awesome, we're on 4nm now

    • @m_sedziwoj
      @m_sedziwoj  a year ago

      @@razorbackroar Is anybody naming their node 4nm? Intel 4, TSMC N4... maybe only Samsung, but they're lying anyway, so who cares.

    • @user-up7nb6id1f
      @user-up7nb6id1f  a year ago

      @@m_sedziwoj tsmc on 2nm soon bruh

    • @rkan2
      @rkan2  a year ago

      ​@@user-up7nb6id1fEven TSMC doesn't call it with "nm"...

  • @Nor-tc8vz
    @Nor-tc8vz  a year ago +5

    Moore's law is dead, long live Gordon Moore.

  • @tengkualiff
    @tengkualiff  a year ago +24

    Moore's Law will never truly die. :(

    • @benc3825
      @benc3825  a year ago +10

      Moore’s law, or Moore’s observation, is dead, simple as that. We don’t get to rename and redefine it so it’s still correct.

    • @RonnieMcNutt666
      @RonnieMcNutt666  a year ago +1

      @@benc3825 AMD and intel future server cpus want to know your location, also gpu density

    • @benc3825
      @benc3825  a year ago +1

      ​@@RonnieMcNutt666Which one specifically? The Turnin family, Venice, Emerald Rapids, Granite Rapids, Diamond Rapids, Sierra Forest and/or Clearwater Forest. :-P

    • @KekusMagnus
      @KekusMagnus  a year ago +1

      It's been dead for a while now

    • @RonnieMcNutt666
      @RonnieMcNutt666  a year ago +1

      @@benc3825 3090 to 4090 having nearly 3x the transistors in a smaller die

  • @jaymacpherson8167
    @jaymacpherson8167  a year ago +8

    FYI…NOAA is typically called “no-ah.”
    Thank you for great documentation that Moore’s Law is passé.

    • @EndOfLineTech
      @EndOfLineTech  a year ago

      FOR YOUR INFORMATION you don’t need to be a dick

  • @glennmcgurrin8397
    @glennmcgurrin8397  a year ago +2

    If you take a technology that's developing that rapidly and is that early in its lifecycle, give me a ten-year roadmap into the future, and at year 10 you are actually at what you said would be year 9, that's amazingly accurate. It wasn't perfectly accurate, but it's incredibly rare to see anything come close to that, as far as I can see.

  • @gljames24
    @gljames24  a year ago +1

    Moore's law, like every exponential in nature, is bounded by physical optimization limits and more accurately follows a sigmoidal curve: as the technology hits an inflection point it only gives diminishing returns and the graph goes logarithmic. The only way to advance is to switch to a new technology. Adding cores, gate geometry, and 3D stacking have helped in many areas, but we'll likely have to switch from silicon MOSFETs to gallium nitride or other chemistries to get any sort of frequency improvement at this point.
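The exponential-then-saturating shape described above is the logistic (sigmoid) curve; a minimal sketch with arbitrary parameter values:

```python
import math

def logistic(t, L=1.0, k=1.0, t0=0.0):
    # Looks exponential for t well below t0, then saturates toward ceiling L.
    return L / (1 + math.exp(-k * (t - t0)))

# Growth per unit time, early vs. late on the curve:
early = logistic(-3) / logistic(-4)
late = logistic(5) / logistic(4)
print(round(early, 2))  # ~2.64: near-exponential growth
print(round(late, 2))   # ~1.01: diminishing returns
```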

  • @atanumaulik7093
    @atanumaulik7093  a year ago

    Brilliant, as always. The world needs more compute. Long live Moore's Law!

  • @truefan1367
    @truefan1367  a year ago

    That truck backing up really sells this video.

  • @thepenguin11
    @thepenguin11  a year ago +5

    I disagree that regular people do not really care about it anymore. Software expects systems to advance. The web alone: try using the web with a 10-year-old device and you will rage like crazy at how slow everything is. Productivity goes up as well with more powerful units.

    • @WaterZer0
      @WaterZer0  a year ago +1

      That's because programmers are less and less efficient.

    • @thepenguin11
      @thepenguin11  a year ago +5

      @@WaterZer0 Clearly you don't know how programming works then. The same software these days is way more efficient than old software doing the same task on the same machines. Software, like cars, advances and needs more supporting functions to achieve more efficient and more advanced processes. The issue with badly optimized things these days is, in the majority of cases, not the fault of programmers but of leadership, who push unrealistic time frames and cost constraints.

    • @WaterZer0
      @WaterZer0  a year ago +1

      @@thepenguin11 capitalism bad? loud and clear

  • @steveinmidtown
    @steveinmidtown  a year ago

    really interesting... quick question: I'd like to watch the video about Wang Labs referred to at 6:05 but am not finding it?

  • @ruperterskin2117
    @ruperterskin2117  a year ago

    Right on. Thanks for sharing.

  • @billfargo9616
    @billfargo9616  a year ago +1

    Since "Moore's law is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years," all that is required to keep it valid forever is to make bigger ICs.

  • @suryaps124
    @suryaps124  a year ago

    I have been looking into Graphene FETs for computational use and other alternatives to the existing Si. One of the main challenges facing any Si alternative is the repeated "extending" of Moore's law. Would you cover post Silicon technologies in a video ? I find your videos quite fascinating and they help keep me informed while I look for roles in the semiconductor industry. Thank you.

    • @m_sedziwoj
      @m_sedziwoj  a year ago

      The industry can't move to a larger wafer because of cost, so moving to another technology... I would bet more on using carbon in different forms as an addition to the silicon base, not a change of everything. At least I haven't heard anything suggesting it is viable; I think it requires a lot more research to make it to market. Researchers are overselling their inventions ;)

  • @colebevans8939
    @colebevans8939  a year ago +2

    With GPT-4 this week and how much programmers are already saying it's helping them code, I can't help but feel we are just at the start of another massive burst of exponential growth. Soon we will be able to code 10x more things, 10x faster, for a fraction of the cost. That alone will be a massive boost to efficiency. As AI models grow, it's only going to get better. Add to that how quickly quantum computing has grown these last few years. Sure, it's an entirely different field, but we have no idea what quantum computers could be capable of 20 years from now because we haven't had the tools to start playing with them until yesterday.

  • @Venkat2811
    @Venkat2811  a year ago +1

    As your narration and explanation are unbeatable, it would be great if you could make a video on the evolution of technology from 1400 AD (the printing press) till date and how humans were worried about jobs being replaced. This would be very relevant to the current GPT / AI revolution.

  • @Schroinx
    @Schroinx  a year ago

    Great video. If you are running out of topics, then the Wintel and x86 story could be one.

  • @JoshuaC923
    @JoshuaC923  a year ago

    Did not expect to see Dead Space in an Asianometry video😂😂👍🏻👍🏻

  • @BobHank2
    @BobHank2  a year ago +1

    Amazing analytical history of transistors and chip design.

  • @ericcarabetta1161
    @ericcarabetta1161  a year ago

    Moore really got to see quite the transformation in computing power over his life.

  • @julioguardado
    @julioguardado  a year ago

    There's another wave coming in semiconductor manufacturing - the maturing of the industry. Optical scaling has a physical limit and wafer size is not going to go beyond 300mm. What we'll see is all chips becoming much cheaper, particularly complex ones as industry laggards catch up. I think the best is yet to come.

  • @d00dEEE
    @d00dEEE Před rokem +2

    Moore's Law should be rewritten as an Old English epic poem, say in the style of "Beowulf".

  • @pierQRzt180
    @pierQRzt180 Před rokem

    I am simple man, I see an interesting article cited, and I upvote.

  • @jrherita
    @jrherita Před rokem

    Appreciate the deep respect for Moore!
    My only comment is that the node cadence didn't really end in 2006 for Intel; they executed pretty well until about 2014-2015.

  • @Jalae
    @Jalae Před rokem +1

    A trillion-fold increase in compute for 2 degrees of accuracy seems like a gross misuse of energy.

  • @AdityaMehendale
    @AdityaMehendale Před rokem +1

    The actors at 11:05 haven't actually soldered a goddamn thing in their entire lives.

  • @matthagy86
    @matthagy86 Před rokem

    Thanks!

  • @traybern
    @traybern Před rokem +1

    It’s NOT a law. It’s an OBSERVATION.

  • @hsharma3933
    @hsharma3933 Před rokem

    I’m so glad you addressed the elephant in the room right at the beginning.

  • @m_sedziwoj
    @m_sedziwoj Před rokem +1

    Personally I would look at "Kurzweil's law", i.e. the cost of computing as calculations/s per $1000, because it extends back to the era of punch-card machines and ignores technical specifics, which is interesting as a story of how it changed over time.
    About the ending, I would give a different argument: "you don't need more computing for today's software, but only today," because enabling something new sometimes takes a few more orders of magnitude before it becomes available (real-time RT in games, really smart AI, etc.). There are limits to human perception in resolution, refresh rate, or interaction, but quality and complexity have a long way to go.
    The same goes for autonomous cars, robots (humanoid or not, but ones that can deliver stuff to your door) and many, many more. The AI revolution is only at its beginning, and the Von Neumann architecture will not be the best fit for it.

  • @ChanChan-pg4wu
    @ChanChan-pg4wu Před rokem

    insanely insightful! Thank you, Asiannometry.

  • @piethein4355
    @piethein4355 Před rokem +1

    I still need more compute for my gaming rig. My 380 can only just barely drive current-gen VR titles; I will need several orders of magnitude more compute before even being able to drive something currently high-end like an XR3 at native resolution. If I also want high refresh rates (at least 144 Hz, but preferably in the 200s) without reprojection, plus ray casting for improved lighting, then we are still many, many orders of magnitude away from what is needed.

  • @jamesmorton7881
    @jamesmorton7881 Před rokem +1

    1978: the Motorola 6800. The applications exploded. The rocket was just launched.
    I loved all that was CMOS, the RCA 4-bit; now we are at around 2000 MIPS. The IBM System 360 was about 16.6 MIPS in 1970.
    Self-heating now increases with operating temperature due to the higher leakage currents of smaller geometries, below around 90 nm.

  • @Aarkwrite
    @Aarkwrite Před 7 měsíci

    I was not expecting a Dead Space reference bravo

  • @n45a_
    @n45a_ Před rokem +3

    i was going to sleep but i guess not

  • @HellishPestilence
    @HellishPestilence Před rokem

    There are of course business applications for more computing power. But for the first time in history, consumer products today are not limited by computing power but by things like network speed. The market for more advanced chips is a lot smaller if you're developing for a few HPC clusters at companies or universities rather than smartphones, which perform just fine with a 7nm chip for the vast majority of people

  • @alpaykasal2902
    @alpaykasal2902 Před rokem +3

    RIP Doctor Moore. Your fingerprints are all over everything, forever.

  • @sunroad7228
    @sunroad7228 Před rokem

    "In any system of energy, Control is what consumes energy the most.
    Time taken in stocking energy to build an energy system, adding to it the time taken in building the system will always be longer than the entire useful lifetime of the system.
    No energy store holds enough energy to extract an amount of energy equal to the total energy it stores.
    No system of energy can deliver sum useful energy in excess of the total energy put into constructing it.
    This universal truth applies to all systems.
    Energy, like time, flows from past to future".

  • @AgentSmith911
    @AgentSmith911 Před rokem +1

    Moore's law today is ~15% yield improvement per year
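    Taken at face value, a steady per-year improvement rate implies a much slower doubling than the classic two-year cadence. A minimal sketch of the compounding arithmetic, assuming the comment's ~15% figure (the commenter's claim, not a measured industry number):

    ```python
    import math

    # At a steady annual growth rate r, one doubling takes log(2)/log(1+r) years.
    def years_to_double(annual_rate: float) -> float:
        """Return the number of years needed to double at a fixed annual rate."""
        return math.log(2) / math.log(1 + annual_rate)

    print(round(years_to_double(0.15), 1))  # 15%/year -> a doubling roughly every 5 years
    ```

    By contrast, a 41% annual rate reproduces the familiar two-year doubling.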

  • @user-jp1qt8ut3s
    @user-jp1qt8ut3s Před rokem +8

    I realize now how lucky I am to have met this guy, and many other great inventors and scientists. I think being a scientist is one of the most beautiful lives one could have. What's your job?

  • @marco21274
    @marco21274 Před rokem

    Could you make a video about Nanoimprint lithography?

  • @ps3301
    @ps3301 Před rokem

    We need 4K games running at a 120 Hz refresh rate. That is another 6 years away

  • @ku0n349
    @ku0n349 Před rokem

    I absolutely adore your conclusion. I hate when people say things that ignore everything technology could be, in favour of what we already have.
    Recently at a party I heard someone who is an engineer at Bosch say that most things have already been invented and that there is very little innovation yet to come. When I heard that from an actual engineer, I felt disgusted tbh

    • @ceeb830
      @ceeb830 Před rokem

      I don’t understand how someone could say that today

  • @michaelmoorrees3585
    @michaelmoorrees3585 Před rokem

    From a chip user's standpoint, I remember the market-wide, semi-level transition to new FET designs in the 1st half of the 1980s. These improved MOSFETs made faster CMOS chips, so big chips moved from NMOS to lower-power CMOS. At the discrete level, the creation of power MOSFETs.
    I thought Intel's big win was IBM choosing its uP for the IBM PC, and the clones having to stick with it to stay compatible, even moving to the 386 when IBM balked at using it, fearing it would step on its mainframe market.

  • @Edward135i
    @Edward135i Před rokem +1

    It's hard to think of a single person who affected more human lives than Gordon Moore. RIP Gordon, thank you. I hope Pat doesn't run your company into the ground completely.

    • @shorerocks
      @shorerocks Před rokem

      Yeah. Now think about the inventor, or one of the godfathers, of AI: Geoffrey Hinton. There is a CBS morning interview that is super interesting. And... I am not able to predict how much life will change in the next 10, 20 years.

    • @Noqtis
      @Noqtis Před rokem

      When you fart, you change the life of more bacteria than there are humans on the planet. Never forget: your asshole is a planet.

  • @luke144
    @luke144 Před rokem +1

    We need to adapt and change the way we compute itself. We need to quit looking for unicorn farts with dark matter detectors and tackle problems like P vs. NP. We need to make our computing more effective and efficient. Right now we run A LOT of flawed, bloated programs. I think things like gallium arsenide will surely help, but we need to go back to the basics. We need to change the geometry and architecture of processors. The advent of the ARM processor is a perfect example.

    • @buzzsaw838
      @buzzsaw838 Před rokem

      Go back to the basics? So basically we've reached the "local maximum" for the current technological paradigms in place for compute. Sounds like some fundamental revolution is needed at the underlying architectural level to continue anything like Moore's law level growth.

    • @luke144
      @luke144 Před rokem

      @@buzzsaw838 there are many ways the current model of computing can change.

  • @EricJorgensen
    @EricJorgensen Před rokem +1

    The less popular corollary to Moore's law is the one about how Moore's law gets redefined every 3-4 years in order to argue that it's not really meaningless.

  • @accutronitisthe2nd95
    @accutronitisthe2nd95 Před rokem

    It can't go on forever...

  • @JimCareyMulligan
    @JimCareyMulligan Před rokem +1

    It’s just a trend line. You can make your own right now. Moore himself (or probably some analyst in the company) revisited the data. And if you're famous enough you can call it the Gelsinger-Moore law. Or the RayJacket-Moore law. RIP

  • @VedranCro
    @VedranCro Před rokem

    Moore's Law reminds me of Hubble's Law, which describes the expansion of the Universe. Edwin Hubble used only a few data points and boldly drew a line connecting them to prove that galaxies farther from us are receding faster. The values on Hubble's chart were inaccurate, as were his predictions based on it (that the Universe was 2 billion years old). Nevertheless, Hubble's Law inspired others to replicate his work and refine their measurements, which ultimately led to plausible theories and predictions.

    • @VedranCro
      @VedranCro Před rokem

      And I love egg fried rice :)
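    The few-point line fit described above can be sketched as an ordinary least-squares slope through the origin. The distance/velocity pairs below are made-up illustrative numbers, not Hubble's actual measurements:

    ```python
    # Hypothetical (distance in Mpc, recession velocity in km/s) pairs -- illustrative only.
    points = [(0.5, 300), (1.0, 550), (2.0, 1100), (4.0, 2300)]

    # For a line through the origin v = H * d, the least-squares slope is
    # H = sum(d*v) / sum(d*d).
    H = sum(d * v for d, v in points) / sum(d * d for d, _ in points)
    print(round(H, 1))  # slope (km/s per Mpc) for these made-up points
    ```

    More data points refine the slope without changing the method, which is how later measurements kept correcting Hubble's original value.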

  • @tonifakerman9639
    @tonifakerman9639 Před rokem

    Next time you come across the acronym NOAA, you can pronounce it like the name Noah and everyone will still understand you. As always, great video man

    • @ttb1513
      @ttb1513 Před rokem

      Oh "dram", you’re right. He pronounces DRAM as ‘draah-m’ instead of ‘dee-ram’. Little quirks in such great content.

  • @lJUSTwanaCOMMENT
    @lJUSTwanaCOMMENT Před 9 měsíci

    The reason you need ever more powerful hardware is software updates.

  • @timdere
    @timdere Před rokem +2

    "Moore's Law didn't get tossed into the trash like that two-day-old white rice in the fridge. Instead, the industry modified it to make delicious fried rice." Nice touch on the commentary! 😆

  • @TeodorLojewski
    @TeodorLojewski Před rokem +1

    RIP G. Moore

  • @matsv201
    @matsv201 Před rokem

    Lithography did break pretty badly in the early 00s, but at the same time die sizes and power went up to meet demand (and there was plenty of demand back then).
    Back in '99, CPUs pretty much still worked like TTL logic. Sure, voltage was down, but the same basic method was still used.
    During the early 00s this changed considerably to save power, at the same time as the available power budget increased.
    Die area also increased significantly.
    So while the 1965 idea was about component count, not speed, by the time the 90s came around, performance was the main metric.
    Moore's law was sort of adapted to what the market needed.
    Of course the lack of new lithography in the 00s totally broke the original idea of Moore's law, yet in this period performance still increased at a doubling every 18 months. That only calmed down in the early 10s, when power constraints became more important. That is why we've seen more moderate improvements in the 10s, but on the other hand better products.
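    The doubling periods mentioned above can be converted into per-year growth factors. A minimal sketch, assuming steady exponential growth:

    ```python
    # A doubling every m months implies a growth factor of 2^(12/m) per year.
    def annual_factor(doubling_months: float) -> float:
        """Per-year growth factor implied by a given doubling period."""
        return 2 ** (12 / doubling_months)

    # Moore's 1975 revision: doubling every 24 months -> ~1.41x per year.
    # The oft-quoted performance doubling every 18 months -> ~1.59x per year.
    print(round(annual_factor(24), 2))  # 1.41
    print(round(annual_factor(18), 2))  # 1.59
    ```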

  • @BracaPhoto
    @BracaPhoto Před rokem

    Moore's Historical Trend is a more accurate description

  • @saricubra2867
    @saricubra2867 Před rokem +1

    He died with his law: after 2013, stagnation.

  • @Quickshot0
    @Quickshot0 Před rokem

    As an idea, it has basically ended up redefining how our civilization looks, I guess. It's hard to imagine we'd have put nearly so much effort into R&D in certain fields, especially electronics, if they hadn't started really trying to hit this target over and over again. And given how much electronics influences, including helps to develop, it's hard to overstate how much it has changed everything.
    And to think humanity getting fixated on it like they did was just a matter of chance... one wonders what would have happened if they hadn't, and how things would have developed then.