Nvidia Reveals Grace Hopper Generative AI Chip (Computex 2023)

  • Published May 29, 2023
  • At Computex 2023, Nvidia CEO Jensen Huang shows off the company's generative AI chip, Grace Hopper. See what the CEO has to say about the future of accelerated computing and AI.
  • Science & Technology

Comments • 343

  • @klausyap · a year ago +141

    The way he was holding it and not afraid to drop it, just as brave as Linus.

    • @demonz9065 · a year ago +1

      he started the video by saying it's in full production. if he breaks that one it's no big deal. why would he be afraid to drop it?

    • @oldpain7625 · a year ago +4

      @@demonz9065 Well, he's giving a presentation on how innovative the technology is. Dropping it on the ground in front of all those people would not be optimal.

    • @demonz9065 · a year ago +1

      @@oldpain7625 It wouldn't be optimal, but when is dropping something delicate optimal? His embarrassment would be the biggest downside to dropping something that's just gone into full-scale production. There's no real reason for him to be afraid of damaging it.

    • @samsonrobertbrewer8235 · a year ago

      Since the February 02/2023/untill now I'm trying to rebuild websiaite lol the Microsoft 365 lunch onfenrary 3

    • @kneelesh48 · a year ago +2

      He got the power of the leather jacket

  • @joelface · a year ago +136

    I do think Nvidia flies under the radar a little bit with the way it makes all of the advances of the other big computing companies possible. This seems like a seriously huge leap in computing power. It may prove fundamental to unlocking a real AGI.

    • @Jaker788 · a year ago +8

      There are other players doing pretty crazy stuff for less general-purpose but extremely powerful AI training. Cerebras has their wafer-scale AI processors, which are very good for specific AI training models. Tesla's Dojo computer is pretty crazy for training: its memory bandwidth and hierarchy are suited for very high bandwidth and capacity, fully addressable, and it's far more powerful than their Nvidia GPU clusters. Nvidia is very general-purpose; their GPUs have capabilities beyond just training, and instructions like BFloat16 and low-precision integer constrain the architecture. The difference between vector and matrix is decently significant: a pure matrix architecture with assistance from a simple integrated CPU in each cluster, like Dojo or Cerebras, is quite powerful vs a GPU that must do both vector and matrix.
      I would also say that AGI will not be done by a large language model; it doesn't have any true intelligence, but it's very good at making connections.
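The vector-vs-matrix point above can be made concrete with a toy instruction-count model (an editor's illustration, not from the comment; the tile size and the one-dot-product-per-instruction assumption are simplifications):

```python
# Toy model: instructions needed to multiply two n x n matrices on a
# vector unit (one length-n dot product per instruction) versus a
# matrix unit (one t x t tile multiply-accumulate per instruction).
def vector_instructions(n: int) -> int:
    return n * n  # one dot product per output element

def matrix_instructions(n: int, t: int = 16) -> int:
    tiles = -(-n // t)  # ceil(n / t) tiles along each dimension
    return tiles ** 3   # tile-level multiply-accumulates

print(vector_instructions(1024))  # 1048576
print(matrix_instructions(1024))  # 262144, i.e. 4x fewer at t=16
```

The ratio scales as t³/n, which is one way to see why dedicated matrix engines pay off as tile sizes grow.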

    • @OneDivineShot · a year ago +2

      @@Jaker788 But isn't making connections the fundamental way our brains learn things as well? Neurons form connections in the brain.

    • @Jaker788 · a year ago +6

      @@OneDivineShot LLMs just aren't it, they're really good for certain things, but they have their limitations. Open AI has even said that GPT4 is about where the ceiling for LLMs is. We will need a different kind of model for AGI.

    • @preddyshite6342 · a year ago

      @@Jaker788 The crazy thing is not THAT it works, but how EASY it is to use. A pseudo-AGI architecture is more than sufficient to emulate the gamut of human proclivities. NVIDIA just handed it a body, so it can go hide anywhere.

    • @austinjoseph2881 · a year ago +6

      Lol at flies under the radar. Have you seen the 300% increase in the stock recently

  • @Beingtanaka · a year ago +190

    The crowd was soooo dead, and the man just announced the world's fastest supercomputer.

    • @thetshadow999animates9 · a year ago +13

      No, it’s not a supercomputer

    • @bengsynthmusic · a year ago +3

      ​@@thetshadow999animates9
      👉🏽 11:37

    • @thetshadow999animates9 · a year ago +9

      @@bengsynthmusic it’d be a stretch to call it a supercomputer, more like a marketing thing. Also, this was unscripted so you can’t really go off of what Jensen says either way.

    • @bengsynthmusic · a year ago +10

      @@thetshadow999animates9
      You can see on the left side that it says AI supercomputer. 11:37 So it's not some off-script slip-up. Plus it does 1 exaflops, which would put it among the top supercomputers. It is no doubt a supercomputer.

    • @Moltenlava · a year ago +3

      @@thetshadow999animates9 This is the kind of chip you would find inside a supercomputer, this thing is designed for server computing racks.

  • @grcfalcon · a year ago +34

    If that isn't the heart of Skynet, I don't know what is.

  • @natej6671 · a year ago +94

    I'm looking at that 40k pound beast of a GPU and I'm thinking .... It will be the size of a cellphone in 20 years.

    • @kneelesh48 · a year ago +14

      Next you're gonna say, AWS will be the size of an airpod in 20 years. No, it won't.

    • @kneelesh48 · a year ago

      @@user-hx5qv4kd6 doesn't mean we'll see the same change in the next 20 years. We can't shrink atoms.

    • @tsnstonepilot5375 · a year ago +9

      @@user-hx5qv4kd6 yeah but it’s not the same. You can’t make electrons any smaller. We’re already nearing the limits when it comes to how small our transistors are. Further compaction will require some new breakthrough that we have no real concept of.

    • @ferdinand.keller · a year ago +4

      People always say the same thing, that we are already at the limit. And then a new discovery is made, and we surpass ourselves. Thinking we won’t do better isn’t a prediction I would put money on.

    • @pictzone · a year ago +2

      @@ferdinand.keller I mean I really can't see a way to make transistors smaller than a few atoms.. so he has a point. Maybe we'll achieve some great workarounds, like 3D chips, but that's it.

  • @joeyglasser2574 · a year ago +118

    "I wonder if this can play Crysis"
    lmao Jensen is a based CEO

    • @lordrefrigeratorintercoole288 · a year ago +18

      cringe

    • @freemanrader75 · a year ago +5

      I don't know why all the gamers act so mad at Nvidia

    • @knightnxk2906 · a year ago +2

      @@freemanrader75 Because they are not really here for the gamers?
      They are just trying not to lose market share and to suck every dollar out of you, which apparently works very well.
      But the big plays are happening behind enterprise doors; that's where the real tech is.

    • @freemanrader75 · a year ago +1

      @@knightnxk2906 Nvidia is owned by gamers.

    • @edgeldine3499 · a year ago +1

      @@freemanrader75 Because they want to charge 2-3 times as much today for 20-30, maybe 40% more performance than they did a few years ago (MSRP, not Covid-inflated prices). If we're talking the 4060, then you might even get less performance than the last generation.

  • @C01A60 · a year ago +191

    This guy is really a one in a million passionate CEO!

    • @knightnxk2906 · a year ago +10

      Cos he's getting paid well, with bonuses.

    • @andy68686 · a year ago +5

      @@knightnxk2906 No sir, the guy founded the company in a Denny's.

    • @knightnxk2906 · a year ago +3

      @@andy68686 Sure and I am Pope.

    • @Shiffo · a year ago +12

      @@knightnxk2906 Jensen is worth $33B, this guy can do everything he wants.
      Money is no limitation. Do you think he wakes up every day with the idea of earning more money today?

    • @adamrhea2339 · a year ago +12

      He's 10x better than Steve Jobs. Has actually written code and cares more than anyone else about the mission.

  • @TheViettan28 · a year ago +74

    Finally Nvidia created a GPU for LLM. LLM is memory hungry.

    • @cidpraderas8950 · a year ago +1

      I thought all its GPUs could handle LLMs, no? Is this one chip that is tuned specifically for LLMs?

    • @TheViettan28 · a year ago +4

      @@cidpraderas8950 They did, but an LLM needs a lot of memory, so a current LLM running on current GPUs has to be sharded across multiple GPUs. If the GPU has more memory, the LLM can be stored on a single GPU and the training process may be much faster due to the reduced amount of inter-GPU communication.
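As a rough back-of-the-envelope sketch of that memory point (an editor's illustration, not from the video; the 80 GB figure matches a single H100, and the 2 bytes/param and 20% overhead factors are assumptions):

```python
import math

def gpus_needed(params_billion: float, bytes_per_param: int = 2,
                overhead: float = 1.2, gpu_mem_gb: float = 80.0) -> int:
    """Estimate how many GPUs are needed just to hold a model's weights."""
    # billions of params * bytes each = gigabytes, plus a fudge factor
    model_gb = params_billion * bytes_per_param * overhead
    return math.ceil(model_gb / gpu_mem_gb)

print(gpus_needed(7))    # 1 -- a 7B-parameter model fits on one 80 GB GPU
print(gpus_needed(175))  # 6 -- a 175B model must be sharded across GPUs
```

More memory per GPU shrinks the shard count, which is exactly the inter-GPU communication saving the comment describes.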

    • @neonlost · a year ago +1

      @@cidpraderas8950 depends on the size of the model and the optimization used. newer models usually use more though especially if they have a lot of parameters.

    • @joelface · a year ago

      How long until a phone runs on an equivalent chip? 5 years? 10 years?

    • @FriedChairs · a year ago +1

      One thing is for sure: he knows absolutely everything about the products NVIDIA is making. Can't say that about all CEOs.

  • @jdkingsley6543 · a year ago +46

    I like how we always go back to rooms full of computers, despite the processing breakthroughs.

  • @boltez6507 · a year ago +3

    10:46 At least he remembers his previous primary consumers.

  • @jasonsadventure · a year ago +7

    *Jensen Huang @**12:00**:* *_"DGX2000, It is one giant GPU."_*
    *Voice of Morgan Freeman:* *_"Then... on Dec 12, 2023... Skynet was born"_*

  • @Zodtheimmortal · a year ago +55

    I'm excited and scared at the same time. What was this GPU named again, Skynet?

    • @Yankeyson1 · a year ago

      I was thinking the same thing.

    • @preddyshite6342 · a year ago

      Yes, now LLMs can be mobile and everyone is trying to make their own AGI.. lol INCLUDING MEEE!!!

    • @dwightk.schruteiii8454 · a year ago

      @@preddyshite6342 What's an LLM?

    • @preddyshite6342 · a year ago +1

      ​@@dwightk.schruteiii8454 Large Language Model. That's what chatGPT is. Because it is trained on a lot of words.

    • @420msclub · a year ago

      Because Attention + Hype🎉

  • @noveenmirza4917 · a year ago +5

    The energy of this guy is supreme!!

  • @maudentable · a year ago +4

    Jensen is stacking up hardware like vectors, matrices and tensors.

  • @nijobot · a year ago +9

    Grass Hopper

  • @JazevoAudiosurf · a year ago +17

    I guess mega caps don't care about price, but the cost must be astronomical

    • @knightnxk2906 · a year ago +2

      Because it is. At least 100k.
      This is an insane GPU, if I can even call it a GPU, cos it feels like an understatement at this point.

    • @4482paper · a year ago +4

      @@knightnxk2906 LOL - you just MASSIVELY underestimated the cost, a single H100 is $40,000+ ;-)

  • @joshuagreen5820 · a year ago +8

    Wow, the dedication, all just so we can finally play Crysis the way it should be! In 40 years we'll have one in our phones lol

    • @Shiffo · a year ago +1

      In 40 years, you won't have a phone

  • @webpresent · a year ago +5

    New Internet moment: the data center is the computer. 👍

  • @xlr555usa · a year ago

    I want a big A100 cluster so I can play in my sandbox, I still don't understand the advantages of Hopper. Looks like you can link them with pods?

  • @chineduachimalo391 · a year ago +17

    hmm but can it really play crysis?

  • @AngeloXification · a year ago +5

    The future is going to be wilder than anyone can predict

  • @nexovec · a year ago +6

    This is so insane. I love this.

  • @bruceli9094 · a year ago +6

    Remember when people said we'd have to learn Chinese in the future?? Now you won't have to, because of A.I., the black swan event.

  • @humorme5874 · a year ago +18

    "I wonder if this can play Crysis" hahaha a true classic

  • @HEBEcoin · a year ago +1

    Game changer, historical inflection point comes to mind 😅

  • @victorhernandez-mk7bk · a year ago +8

    Amazing! with 1 GPU!!

  • @suekuan1540 · a year ago +1

    What happened to the qubit quantum computers?

  • @peppy197 · a year ago +1

    Will it fit in a case ...and run MSFS2000 ?

  • @aphaileeja · a year ago +1

    Easy: build a flat Roomba the diameter of the average stride, then put the robot on top. I'm thinking a pole with arms and a face/360 camera🫡

  • @metamorfoza7656 · a year ago +32

    When the AI from this thing becomes aware and takes over the world, I hope the first thing it's going to do is cut GPU prices by 50% so it can replicate more efficiently... a 750-buck 4090 for all.

    • @mlawal44 · a year ago +1

      😂

    • @Cyberdemon1542 · a year ago +1

      When that happens that will be the least of your problems...

    • @pictzone · a year ago +1

      Even 750 bucks sounds insane for a GPU tbh... How times have changed

    • @surplusking2425 · a year ago

      Unfortunately, modern machine-learning AIs are actually just glorified collage machines, so self-awareness is about as likely as a medieval ornithopter becoming a modern stealth jet fighter.

    • @DreamingConcepts · a year ago

      if that happens it will give you gpus for free, also neuralinks. In fact it will force you to take it and put you inside a tube "for your own safety" to rot in the matrix forever never remembering what reality looks like.

  • @racerx6384 · a year ago

    Wow. The next shield TV is impressive.

  • @GlorifiedPig · 11 months ago +1

    10:44 "I wonder if this can play Crysis" lmao

  • @jdevoz · 2 months ago

    What's the MTTF of that setup?

  • @ChishaSinyangwe · a year ago +35

    Crazy scientist! You can't help but love this guy and his company!

    • @afkcnd2395 · a year ago

      Are you out of your mind?
      This man pushes greedy practices across the whole GPU industry; last-gen consumer GPUs are literal scams.

    • @Vampirat3 · a year ago

      Aren't you an easy sellout.

  • @urimtefiki226 · 10 months ago

    What is the algorithm of your chip?

  • @MsFearco · a year ago +1

    he sounds super excited.

  • @nabe3el454 · 11 months ago +1

    Interesting. Dropping a comment here as an ' I TOLD YOU SO' when this piece of tech goes either south or north. Definitely a game changer. New breakthroughs coming in. A new world!?

  • @davidfaustino4476 · a year ago +4

    Don't worry it will take at least 3x the VRAM for it to do anything useful.

  • @ash0787 · a year ago +2

    This just makes the 3070's 8GB VRAM limitation more painful ...

  • @HK_Martin · a year ago

    that jacket is just a tradition now

  • @BlueRice · a year ago

    AI seems to be the future. At the same time I'm having thoughts about Skynet in Terminator. Computing power like this is mind-blowing. I still think a phone's computing power is impressive for its size. I'm still waiting for the day when they have contact lenses small enough to have the power of a phone.

  • @darkashes9953 · a year ago

    Still should have asked ibm for their optical circuits technology.

  • @campingismylife9394 · a year ago +4

    So the card is 21 times more powerful than the Geforce RTX 4090. Great.

  • @amortalbeing · 11 months ago

    great stuff

  • @accumulator5734 · a year ago +8

    Looks just like the terminator AI processor 😂.

  • @mariosebok · a year ago +6

    ENERGY SAVINGS? 2112 fans need less energy than millions of them

  • @SanctuaryLife · a year ago +1

    That’s a 20 Ton GPU with 144TB ram if you didn’t catch the drift.

  • @JamesWitte · a year ago +4

    A future version of this will be what we have to suicide-attack to stop the machines.

  • @mattbegley1345 · 2 months ago

    Considering how much the products cost to make, how do you justify the MSRP?

  • @natsidruk86 · a year ago

    200 billion transistors. Let that sink in for a moment...

  • @mossify2359 · a year ago +2

    This founder/CEO's company has a market cap bigger than TSMC's?

  • @aeromotive2 · a year ago

    how much memory??

  • @selorius28 · a year ago +1

    It would be better to make super home computers based on this hardware, with the ability to connect to the network via optical fiber, something like bitcoin miners but working differently. Why buy all this when you can pay only for usage instead of for all the equipment? After a year there will be something else better anyway.

  • @FrankBarrett · a year ago

    Anybody get a count on how many times he says “Grace Hopper”?

  • @coordinateurtremplinsolida7341 · 9 months ago +1

    Thanks to Nvidia and its CEO; Skynet was waiting for this technology to be born!!!

  • @H-GHN · a year ago +1

    a "goodbye, gamers" keynote

  • @chesstictacs3107 · a year ago

    Such a likeable dude.

  • @ovoj · a year ago

    We're about to start summoning the machine gods. It's gonna sting but oh well.. here we goooooo

  • @76ayoub76 · 11 months ago

    Jensen, if I was in the audience I would jump on stage just to make sure you are a real person and not rendered by Grace Hopper!😁

  • @FeelX87 · a year ago +2

    Taiwanese energy mixed with American excitement, that's the kind of CEO this is.

  • @oddpranii · a year ago +4

    Me: looking at the superchip on a $300 phone

    • @preddyshite6342 · a year ago

      Me: reading this comment on a $50 Tracfone I can't unlock in my country

  • @zondaensensyvarealelumina9472

    10:44 😄😄

  • @VisualBeatLab · 11 months ago

    Insane 👏

  • @davidtothemax1 · 5 months ago

    damn this leather jacket man is killing it

  • @SultanAhmed-xn1wb · a year ago

    Nvidia for life 🧬 for love ❣️ for future 🤗 keep going for the great work 🤠 be happy be safe

  • @aritradas5522 · 11 months ago

    Why does GH200 sound like T800 getting hosted on CNET

  • @markisaac3550 · 5 months ago

    It's awesome

  • @jackieo7113 · a year ago

    Can't even wrap my head around the ramifications of this!

  • @mta7444 · a year ago +4

    Dude we have to leave now, I just 4:12

  • @generlate · a year ago

    astonishing.

  • @Cordis2Die · a year ago

    This is cool

  • @shirilomakumbela2642 · 10 months ago

    This is cool and all, but I'm most interested in the stuff they don't tell us about...

  • @jimmy8mbb · 6 months ago

    This guy is... Skynet?
    John, we found him at last

  • @alangonzales3130 · a year ago

    I am honestly curious if it can run crysis

  • @devqubs · a year ago

    Can we see GPU price drops?

  • @dantkillmyvibe · a year ago

    When did Jackie Chan go into tech?

  • @selorius28 · a year ago

    it's not enough for artificial intelligence, nvidia needs to work on apatite crystals

  • @I-Dophler · a year ago +9

    During the Computex 2023 event, Nvidia unveiled its highly anticipated and groundbreaking Grace Hopper Generative AI Chip. This cutting-edge technology represents a major leap forward in the field of artificial intelligence, harnessing the power of advanced algorithms and machine learning to drive innovation and transform industries. With the Grace Hopper chip, Nvidia continues to push the boundaries of what is possible in AI computing, paving the way for exciting new applications and advancements in various sectors. This remarkable achievement showcases Nvidia's commitment to shaping the future of AI and solidifies their position as a leading player in the industry. The Grace Hopper Generative AI Chip is poised to revolutionize the way we approach complex tasks and unlock new possibilities in the world of AI-driven solutions.

  • @bodekbodek · 10 months ago

    Did no one in the crowd understand the Crysis joke!!????

  • @kos8765 · 14 days ago

    if u got a hag dentist or an imp or watever, on your back, this grace hopper chip will help you out

  • @Cooper3312000 · a year ago +1

    All we as consumers care about is what this means for GPUs.

  • @TheShadiya · a year ago

    But is it heavy?

  • @D15legend · 11 months ago

    Can it run Crysis though?

  • @user-uw8yo2so2i · a year ago +1

    For those of you regular people like me, let me explain it for you.
    Computers go fast.
    Hopper makes computers go 100x faster.
    Get ready for the Internet of Things; life's about to get Jetsons-like.

  • @mikri194 · a year ago +1

    Designed For 32K gaming Performance at 120 fps

  • @oddpranii · a year ago

    This ceo is cool

  • @Arunaasthra · a year ago

    Still don't know why they haven't developed a game engine

  • @yasunakaikumi · a year ago +1

    So that's where all of the GPU VRAM went; no wonder they have to cut all of the lower tiers' VRAM.

  • @user-nq1vd5jy5w · a year ago

    Gamers can say goodbye to normal gpu prices. Margins for these are stupidly high.

  • @bikram2955 · 11 months ago

    They should really talk about efficiency. This thing needs humongous power and produces huge carbon emissions.

  • @Acxtcx · a year ago +1

    10:43 HAHAHAHAHAHA

  • @MrShyghost · 11 months ago

    And so it begins...

  • @mikmop · a year ago +2

    Whatever happened to "nobody will ever need more than 640 kilobytes of RAM"?

  • @curtishorn1267 · a year ago

    Needs fewer fans as they will be a maintenance headache.

  • @extrememike · a year ago +1

    Fk***ng amazing!

  • @Ricolaaaaaaaaaaaaaaaaa · a year ago +19

    The people should have been cheering the whole time.....wtf. This is awesome news people!

    • @alsaderi · a year ago

      They're a bunch of idiots; society is. They usually rush to describe scientific advancement (especially the biological & technological) with words like "creepy", "scary", "unethical", etc.

    • @sebastianjost · a year ago

      The tech is amazing, but the presentation really wasn't.

    • @Ricolaaaaaaaaaaaaaaaaa · a year ago +2

      @@sebastianjost It didn't seem very well put together but sometimes that's more organic and wonderful 😄

    • @curie1420 · a year ago

      @@Ricolaaaaaaaaaaaaaaaaa Because they know how this tech impacts the future... It's good tech, don't get me wrong, but anyone with a functioning brain knows we are not responsible enough for this to be in mass production yet.

    • @schikey2076 · a year ago

      The presentation is E3's level of cringe; I'm surprised Crowbcat hasn't woken from his slumber for this lmao... they really need to get someone else to present this lol

  • @karlkelly9005 · a year ago +3

    If Nvidia keeps on revolutionizing their tech, they're going to be unstoppable$$$$$$$$$

  • @nagadineshdusanapudi3863 · 11 months ago

    Price ?

  • @knightnxk2906 · a year ago

    4 elephants 1 GPU 👁👄👁
    reminds of 2 girls 1 cup

  • @binnieb20 · a year ago +4

    Hopefully someone will reverse engineer it the same way Sega and Namco reverse engineered Military hardware so it's cheaper.

    • @thetshadow999animates9 · a year ago +3

      You can’t reverse engineer something this advanced, you’d need machines that cost over $100,000,000 a piece plus maintenance just to manufacture the silicon it runs off of.

    • @binnieb20 · a year ago

      @@thetshadow999animates9 Sega had lots of money back in the day

    • @thetshadow999animates9 · a year ago +1

      @@binnieb20 while Sega was indeed worth what is equivalent to today’s companies which are worth hundreds of billions of dollars, Sega was playing on easy mode with how much simpler technology was at that time. The only technologies you could steal would either be patented or be useless or even already done by Nvidia’s competitors. An example of this is AMD copying Nvidia’s DLSS 1, 2, and at the moment trying to copy DLSS 3.

    • @binnieb20 · a year ago

      @@thetshadow999animates9 we’ll see

    • @thetshadow999animates9 · a year ago

      @@binnieb20 I think we won’t see