NVidia are SCARED!

  • Published 26 Jul 2024
  • Urcdkeys super sale
    25% off code: C25
    Win10 pro key($15):biitt.ly/pP7RN
    Win10 home key($14):biitt.ly/nOmyP
    Win11 pro key($22):biitt.ly/f3ojw
    office2021 pro key($65):biitt.ly/DToFr
    SQL Server 2019 Standard 2 Core CD Key Global($91):biitt.ly/oUjiR
    Affiliate links (I get a commission):
    BUY an LG C1 OLED 48'': amzn.to/3DGI33I
    BUY an LG C1 OLED 55'': amzn.to/3TTpQp9
    Support me on Patreon: / coreteks
    Buy a mug: teespring.com/stores/coreteks
    My channel on Odysee: odysee.com/@coreteks
    I now stream at:​​
    / coreteks_youtube
    Follow me on Twitter: / coreteks
    And Instagram: / hellocoreteks
    Footage from various sources, including the official YouTube channels of AMD, Intel, Nvidia, Samsung, and others, as well as other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me
  • Science & Technology

Comments • 266

  • @Coreteks
    @Coreteks  1 year ago +67

    Sorry I mispronounced Tenstorrent, my bad

    • @pf100andahalf
      @pf100andahalf 1 year ago +4

      Lol! I was wondering what was going on. It's okay man, I made a mistake once.

    • @jstro-hobbytech
      @jstro-hobbytech 1 year ago

      I was just kidding with my comment. Great content. Some of your predictions from a few years ago are close to fruition, I suspect.

    • @olternaut
      @olternaut 1 year ago +1

      Is there going to be a part 2 video? Because you only touched on Tenstorrent. Will the consumer eventually benefit from RISC-V or what??

    • @dawidvanstraaten
      @dawidvanstraaten 1 year ago

      At least we now know Tenstorrent processors are chips on Tren

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      It’s not a mispronunciation; you said a completely different word. Which means you haven’t read the company name in articles/docs more than a handful of times. Which means you’re talking with a hell of a lot of self-assumed authority for someone who basically knows nothing

  • @esra_erimez
    @esra_erimez 1 year ago +141

    Jim Keller is a living legend

    • @jstro-hobbytech
      @jstro-hobbytech 1 year ago +4

      Who has had his company renamed hahaha, I kid, but I'm sure the second letter in Keller's company isn't an 'R'.
      I use RISC-V microcontrollers a lot and it'll be good to see better FP performance trickle down, hopefully. Drawing simple graphics to a tiny screen with a dual-core ESP32 IC is painful if you want to make menus. It's why I use a Teensy 4.1 for my silly builds.
      I've been excited about Tenstorrent for a while.

    • @iwishilivedinafreecountry5749
      @iwishilivedinafreecountry5749 1 year ago

      And Raja is legendarily fucking useless.

    • @MrYevelnad
      @MrYevelnad 1 year ago +9

      Without Keller, AMD would likely have gone bankrupt by now.

    • @ctsd623
      @ctsd623 1 year ago +5

      Makes no sense. Keller stopped working at AMD years ago... he does short contract jobs. He comes in, designs/organizes a masterpiece, gets bored and moves on to the next company/project. That's what he does...

    • @jstro-hobbytech
      @jstro-hobbytech 1 year ago

      @@ctsd623 He does it out of passion and works quickly, but results are not seen for at least 3 to 4 years when it comes to silicon development. Plus he gets shares, and I bet a small percentage at the other end like some actors do. At this point, though, I suspect he'd do it for free if he believes in it. In fact I bet at TT he hasn't made any ROI yet. Yet haha. He deserves it though. He looks at chips like I do at guitar, but a better analogy would be a player like Paul Masvidal or Roger Patterson haha

  • @genblob
    @genblob 1 year ago +94

    I don't care about AI stuff, but hearing about a company creating an open alternative using RISC-V to overthrow Nvidia's monopoly is great news. I hope the same happens in the DIY market, because it has become incredibly stale and boring. Everyone using RISC-V CPUs sounds like a pipe dream, but I think it will happen one day.

    • @Slavolko
      @Slavolko 1 year ago +5

      It'll be a pipe dream so long as the OS and software don't support it. Apple has transitioned to custom ARM chips because they've put in the work to support them in software, but we don't quite see that on the PC side.

    • @Einygmar
      @Einygmar 1 year ago +4

      @@Slavolko As far as I understand, as long as all vendors implement the same standardized instruction set, this should work fine. Like with graphics APIs, where each vendor implements DirectX or Vulkan support for its hardware so software developers can use the same code for all the supported GPUs (see the sketch after this thread).

    • @Slavolko
      @Slavolko 1 year ago +2

      @@Einygmar I know, but my point is that you'll need to convince OS and software developers to support your new hardware standard. That's all.

    • @AndersHass
      @AndersHass 1 year ago +1

      @@Einygmar The issue is most software isn’t written for RISC-V, so a lot of work is needed to make it run on RISC-V (like having a translation layer, as Apple has done from x86 to ARM, to ease usage before software is properly ported).

    • @Real_MisterSir
      @Real_MisterSir 1 year ago +6

      The thing is, their approach also requires that the customers are capable of building their own custom models that are better than the box solution Nvidia ships - and this is the major caveat here. The highlighted companies like Google, Amazon, Tesla, etc. are all big tech companies with massive resource pools and existing highly skilled software teams that can take on such jobs. But that isn't the case for the majority of the industry's needs as a whole. Most companies that actually benefit from these products don't have fully stacked teams of Google/Microsoft-level technicians and developers at their disposal, and they don't have the time nor the expertise to scout the market for people to perform those tasks through third-party services.
      With Nvidia you get something that works out of the gate, has reliable support, and a major community of other players using the exact same systems - troubleshooting the exact same problems, and reaching solutions and knowledge that are very hard to replicate with bespoke in-house software.
      It's also one of the reasons Apple's OS and Windows are so popular - only here at a far lower economic scale for the end user, but the concept is the same. Nvidia isn't looking just at the big established corpos - they're looking at all of the up-and-coming enterprises that want to build their systems off of modern AI solutions and computing systems. These companies will gobble up Nvidia's product stack like hot bread.
      Said another way, it's far easier to build a product stack that allows for full end-user customization than it is to create a full system lineup of hardware and software that works out of the box and has worldwide use and experience behind it. Nvidia can always make the switch when they need to - but RISC-V-based system manufacturers can't do the opposite. They're already locked in - similar to how AMD will never reach the software suite integration and support that Nvidia has in enterprise. It would require decades of effort just to get on the same playing field, and even then you still have to convince customers to stop using what they've been using for years. I'd much rather be in Nvidia's shoes here; they have more options - which is a major benefit in a rapidly shifting industry.
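
    A toy sketch of the API-abstraction point @Einygmar makes a few replies up: application code targets one shared interface, and each vendor supplies its own implementation underneath. All class and method names below are hypothetical stand-ins, not any real graphics or AI API.

```python
from abc import ABC, abstractmethod

# Application code is written once against a common interface; each "vendor"
# plugs in its own implementation, analogous to how GPU vendors implement
# DirectX/Vulkan (or a compute API) for their own hardware.

class ComputeBackend(ABC):
    @abstractmethod
    def matmul(self, a, b):
        """Multiply two matrices using whatever hardware this vendor provides."""

class VendorABackend(ComputeBackend):
    def matmul(self, a, b):
        # A real vendor would dispatch to its own driver/ISA here;
        # this sketch just does the arithmetic in plain Python.
        return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

class VendorBBackend(ComputeBackend):
    def matmul(self, a, b):
        # A second vendor fulfils the same contract via a different internal path.
        return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def run_model(backend: ComputeBackend):
    # The "application" never needs to care which vendor's hardware executes this.
    return backend.matmul([[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]])

if __name__ == "__main__":
    for backend in (VendorABackend(), VendorBBackend()):
        print(type(backend).__name__, run_model(backend))
```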

  • @esra_erimez
    @esra_erimez 1 year ago +28

    Ian Cutress (TechTechPotato) has some great interviews with Jim Keller regarding Tenstorrent

  • @Merrinen
    @Merrinen 1 year ago +17

    The question Jensen would ask when talking about this topic with Keller is "but does it run Crysis?"

  • @pfcokelly
    @pfcokelly 1 year ago +9

    I can hear EVGA yelling fuck yea from here.

  • @TechDunk
    @TechDunk 1 year ago +21

    Can we get a yay for competition and open source?

    • @offline__
      @offline__ 1 year ago +2

      Sadly Nvidia didn't make my GPU's driver open source for some reason (GTX 770) 😭

    • @jacksonsneed7689
      @jacksonsneed7689 1 year ago +2

      @@offline__ The open source Nvidia Linux driver has come a long way in a short time. It's still rough, but it'll get there. I'm with ya though, I think it was a dick move to open source ONLY the newer architectures. The 700 series seems like the perfect target for open source optimizations, because regardless of how old they are, a lot of people still use 'em, along with the 900 & DEFINITELY the 1000 series.

    • @jacksonsneed7689
      @jacksonsneed7689 1 year ago +1

      Correct me if I'm wrong, but I'm pretty sure that Pascal is still the most popular Nvidia architecture at the moment, when it comes to the number of active users (1060 & 1070 are the most popular I think)

    • @jacksonsneed7689
      @jacksonsneed7689 1 year ago +1

      YAY!

    • @offline__
      @offline__ 1 year ago +1

      @@jacksonsneed7689 couldn't have said it better

  • @lucasLSD
    @lucasLSD 1 year ago +36

    This is so amazing, years ago open hardware sounded like a joke but now it's a threat.

    • @ireallyreallyreallylikethisimg
      @ireallyreallyreallylikethisimg 1 year ago +2

      Good.

    • @estring123
      @estring123 1 year ago +5

      Open source AI models are also a threat. A leaked Google document says Google and OpenAI have no moat, and open source might crush them both.

  • @XMomaR
    @XMomaR 1 year ago +40

    The more you buy, the more you save! Jensen's AI stranglehold may well fade, Raja-Keller Tenstorrent chips maybe gonna be killer sellers, so sayeth Coreteks: the futurist tech soothsayer!

  • @blackmennewstyle
    @blackmennewstyle 1 year ago +15

    What a time to be alive, open source hardware might become a reality 🔥🚀
    I still remain skeptical though, especially since Raja is involved, but I truly want him to do well in his collaboration with Jim...

  • @dr.saltyballsack69
    @dr.saltyballsack69 1 year ago +10

    Greed is so fun to watch being destroyed

  • @fredsorre6605
    @fredsorre6605 1 year ago +28

    I really hope Jim Keller can put Nvidia in its place and offer a competing product with the same performance but at a third of the price. AI is still new enough that Nvidia's dominance in it can still be challenged.

    • @Moshe_Dayan44
      @Moshe_Dayan44 1 year ago +4

      I said this years ago, when everybody thought Intel had completely crushed AMD, and I first heard that Jim Keller had come back to AMD to help them design their Ryzen CPU. He's the Luke Skywalker of chip design. He can do it. Darth Huang and the nGreedia empire can and will be defeated by Luke Keller. :)

    • @tringuyen7519
      @tringuyen7519 1 year ago +3

      Nvidia’s CUDA is why it’s succeeding now & why it will lose in the future. CUDA is GPU specific. It doesn’t work with a hybrid APU architecture.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      @@Moshe_Dayan44 And there are probably tens if not hundreds of people like Jim Keller inside Nvidia.

    • @socratic-programmer
      @socratic-programmer 1 year ago

      @@tringuyen7519 The issue is that making a hybrid APU arch framework is really, really hard. OpenCL was meant to be an example, SYCL is a more recent one (and that also has limitations). We need a bigger idea, I think.

    • @Moshe_Dayan44
      @Moshe_Dayan44 1 year ago +2

      @@arenzricodexd4409 This is exactly what people said about Intel in 2016: "There are literally hundreds of engineers like Jim Keller at Intel! How can one guy beat hundreds of people! It's impossible!" Yet critical aspects of the Ryzen core, as well as the chiplet concept and the interconnect design, were all Keller's. One man DID undo Intel's iron monopoly grip on x86 big iron CPUs. If you think Keller can't do it again, you're betting against a man who has never designed a dud chip. He was also the main architect of the DEC 21164 Alpha CPU and AMD's Athlon, the author of AMD's x86-64 instruction set, as well as Apple's A4 and A5 chips. Do you really want to bet against him?

  • @EnochGitongaKimathi
    @EnochGitongaKimathi 1 year ago +5

    It is very exciting to see the progress of RISC-V. People think the biggest advantage of RISC-V is that it is open source, but I think being open source only facilitates the biggest advantage of RISC-V: it is modular. You can put in what you need and leave out what you don't in the design and have an extremely custom chip. A chip doesn't need to be good at everything (scalar, vector, matrix and spatial instructions) if there is no need.
    The biggest obstacle for RISC-V remains manufacturing. Access to the most advanced process nodes from TSMC, Intel and Samsung can only be afforded by a few. Advanced process nodes offer the performance efficiency we need to make RISC-V something everyone wants. This is why the main target of high performance RISC-V is the server market. Perhaps a company like Samsung can come to the rescue of RISC-V and develop Android smartphones using RISC-V. They could switch from using ARM for their Exynos mobile processors.
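
    One concrete face of that modularity, as a sketch: the ISA string handed to a RISC-V toolchain spells out exactly which extensions a custom core implements, and anything omitted is simply never assumed by the generated code. The extension selection below is only an example, and actually compiling with the printed flag assumes a GNU RISC-V cross toolchain is installed.

```python
# Sketch: composing a RISC-V ISA string from the extensions a custom core implements.
BASE = "rv64i"                 # 64-bit base integer ISA
EXTENSIONS = ["m", "a", "c"]   # multiply/divide, atomics, compressed instructions
# Omit "f"/"d" (floating point) or "v" (vector) entirely if the core leaves them out.

march = BASE + "".join(EXTENSIONS)   # -> "rv64imac"

# With a GNU cross toolchain this is the flag you would hand to the compiler, e.g.
#   riscv64-unknown-elf-gcc -march=rv64imac -mabi=lp64 main.c
print(f"-march={march} -mabi=lp64")
```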

  • @prolamer7
    @prolamer7 1 year ago +6

    Well-put facts, but you are forgetting that the big players learned from the mistakes of the '80s and '90s. For example, Facebook or Google should have been superseded already, but instead they just buy their competitors. I think the same will happen with new competitors; Nvidia or AMD will just buy them...

  • @trapexit
    @trapexit 1 year ago +6

    Until we get standards bodies and standards around the system as a whole, rather than just the ISA of the CPU, RISC-V will continue to be bespoke in use. Look at the issues Rene Rebe ran into with the early-standard version of SIMD on the RISC-V SBC he has.

  • @YourSkyliner
    @YourSkyliner 1 year ago +10

    I will never again trust in a project that Raja Koduri is a part of

    • @Th3_Gael
      @Th3_Gael 1 year ago

      Keller kinda offsets Koduri imo

    • @donnydarko7624
      @donnydarko7624 1 year ago

      @@Unicorn-Coin Pat Gelsinger does. Look at the 180 with Arc since Raja was removed from overseeing it.

  • @juniorjunior8494
    @juniorjunior8494 1 year ago +3

    One of your best videos, well portrayed. I am designing a RISC-V accelerator, and I can say firsthand that the pace of software development and maturity is impressively fast. An area where RISC-V lagged was verification tooling, and this has been maturing well and is becoming more robust. Also, almost all the cool innovation in hardware is happening in RISC-V. The biggest advantage is this velocity and freedom. We're already reaching the limits of silicon, which means that specialization is now the way to go to extract the next orders of magnitude of compute performance.

  • @ageofdoge
    @ageofdoge 1 year ago +13

    It's great to see RISC-V becoming a major factor.
    What do you think of Dojo? Tesla claims they will use it to go from something like 5-6 exaflops to 100 in the next 14 months. Who knows when they might make any of that available to the outside world, but I imagine they will at some point, and it seems like they could cut into the market pretty well when they do.

    • @ireallyreallyreallylikethisimg
      @ireallyreallyreallylikethisimg 1 year ago

      100 exaflops? In 14 months?? Ain't no way.

    • @YolandaPlayne
      @YolandaPlayne 1 year ago +1

      @@ireallyreallyreallylikethisimg Agreed. I don't believe it. Just another marketing promise by Elon. "Self driving"? A trademark, not a feature.

    • @ageofdoge
      @ageofdoge 1 year ago +3

      @@YolandaPlayne Self driving is a thing no one has accomplished before and is inherently unpredictable.
      If they are planning to build this in the next fourteen months, it means they've already lined up the production with TSMC. It's not the kind of thing you can place a last-minute order for. Since there's already Tesla-designed silicon in the cars, I think it's safe to assume they know how the process works.

    • @daniel_960_
      @daniel_960_ 1 year ago

      @@YolandaPlayne Seems believable. All they have to do is order fab capacity from TSMC. It's also 7nm, which should have plenty of affordable capacity.
      Their solution seems incredibly scalable too, with 25 chips in one tile.

    • @YolandaPlayne
      @YolandaPlayne 1 year ago

      @@daniel_960_ Capacity is tight; this is known.

  • @LaMouche99
    @LaMouche99 1 year ago +4

    "Tenstorrent" not "Trenstorrent" Celso

  • @adi6293
    @adi6293 1 year ago +8

    So when is nVidia going to buy this company?

  • @WildEngineering
    @WildEngineering 1 year ago +3

    TENS TOR ENT - THERE IS NO R

  • @christiansrensen3810
    @christiansrensen3810 1 year ago +2

    Before Nvidia... the king 3dfx ruled gaming.
    Before Apple... Nokia ruled the phone market.
    Philips... was once dominant in television.
    Kodak... ruled the photo industry.
    Blockbuster... was the market maker and king.
    All these companies (except Philips) are almost extinct.

  • @bobbavet
    @bobbavet 1 year ago +1

    Thanks for another quality video, Coreteks. Any chance of investigating AI hardware and the benchmarking of it? How can we measure the performance of AI? Is anything readily available to measure the use of AI, GPU vs GPU, CPU vs CPU, GPU vs CPU?
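
    On the benchmarking question above: one common starting point is to time the same kernel (for example a large matrix multiply) on each device and compare. Below is a minimal sketch of that idea; it assumes PyTorch is installed, with CUDA support for the GPU path. Established suites such as MLPerf measure full training and inference workloads far more rigorously.

```python
# Rough sketch: time an identical matrix multiply on the CPU and (if present) a CUDA GPU.
import time

import torch

def bench_matmul(device: str, n: int = 4096, repeats: int = 10) -> float:
    a = torch.rand(n, n, device=device)
    b = torch.rand(n, n, device=device)
    torch.matmul(a, b)                 # warm-up run
    if device == "cuda":
        torch.cuda.synchronize()       # GPU kernels are asynchronous; wait before timing
    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / repeats

if __name__ == "__main__":
    print(f"cpu : {bench_matmul('cpu'):.4f} s per matmul")
    if torch.cuda.is_available():
        print(f"cuda: {bench_matmul('cuda'):.4f} s per matmul")
```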

  • @SARS1PP1US
    @SARS1PP1US 1 year ago

    Where can I find information on the images at 8:10 of the video?

  • @alpha007org
    @alpha007org 1 year ago

    Unrelated question: when you buy an Office key, where the F do you get the download link?

  • @Michael_Brock
    @Michael_Brock 1 year ago +2

    Both WD and Seagate have made the successful transition from hard disks to SSDs; I am sure other companies have made the change too.

  • @1Aquadon
    @1Aquadon 1 year ago

    Thanks CT.. Always Stellar delivery!!

  • @Pushing_Pixels
    @Pushing_Pixels 1 year ago +1

    It's interesting seeing how RISC-V is evolving. Something I think we will see become widespread in the near future is companies creating custom, hybrid solutions that combine RISC-V platforms as a base with highly customized, or even unique, proprietary IP blocks embedded alongside or within them. Every company will be able to have their own custom hardware designs, done either in-house or through specialist design companies, that they have IP rights over, working side by side with the open standard instructions, and potentially other IP licensed from third parties. It will be the kind of heterogeneous computing AMD have been pushing, but far more customized, and instead of customers being limited to AMD IP (or whoever's) they will design their own, or take different ones from different designers and combine them with the open standard. Any company that can facilitate this process, from design co-ordination and IP integration all the way to arranging fabrication and packaging, in a way that is more rapid and nimble than the big incumbents, will take off and become huge.

  • @skywalker1991
    @skywalker1991 1 year ago +7

    Jim Keller has been with Tenstorrent a few years now; Raja joining might ruin his project. Raja's track record is not very good.

  • @AlexSeesing
    @AlexSeesing 1 year ago

    Going smaller doesn't always mean smaller nodes; sometimes it just means making things much simpler in order to scale through parallelization. How else would those compute units become chiplets?

  • @aaronmcquaid
    @aaronmcquaid 1 year ago +2

    How do we invest in Tenstorrent?

  • @chillidog8726
    @chillidog8726 1 year ago +1

    I'm not sure where you got the Tenstorrent LG information, but I have been reading up on it for some time.
    Tenstorrent seems to only license their RISC-V CPU cores, not including any AI hardware.
    And from Ian Cutress's interview with Jim Keller:
    Jim Keller said something like "surprisingly, people want to license just the CPU cores, so we are probably gonna do that."
    So I think LG is just looking to move away from ARM for their Smart-TV embedded chip and use Tenstorrent CPU IP to produce its own Smart TV chip, which would run a Linux-based operating system. Likely nothing to do with AI, that part, or at least no hint of it having anything to do with it.

  • @Crihnoss
    @Crihnoss 1 year ago +1

    I'm curious how RISC-V performance stacks up against the other architectures. That will no doubt affect its adoption.

  • @bobmnz6914
    @bobmnz6914 1 year ago

    Let me know when it arrives. I'll be interested in how it handles graphics. But I am also interested in if/how Unreal 5 is going to change things on the graphics front. Will we be able to run high-quality graphics without a high-end card?

  • @ronnyspanneveld8110
    @ronnyspanneveld8110 1 year ago +2

    Not much to say about it, agree. (Ex-DEC Alpha employee here :P)

  • @sersou
    @sersou 1 year ago +5

    Tenstorrent, not T"R"enstorrent I believe.

    • @Coreteks
      @Coreteks  1 year ago +2

      @sersou my bad

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      Unbelievable that we hear a voice and assume they’re an expert, and meanwhile they don’t even know the company name. It’s so shameful. How could I ever think “oh hey, that guy over there said so and so, I’ll take that opinion on board” after this. Hilarious shit man

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      No it’s Transtorrent, the opposite of Cistorrent

  • @anthonyblacker8471
    @anthonyblacker8471 1 year ago

    I remember the Sun Microsystems computer in the engineering lab at NJIT in Newark, NJ had 1 GB of RAM. That was 1996. It was unheard of. : )

  • @aalhard
    @aalhard 1 year ago

    This is indeed a crazy time to watch the industry

  • @El.Duder-ino
    @El.Duder-ino 8 months ago

    Which company did Intel acquire and use the IP from for Thread Director?

  • @biotechisgodzilla
    @biotechisgodzilla 1 year ago

    2:00 that picture reminds me of my old Voodoo2 card 😍

  • @mikebruzzone9570
    @mikebruzzone9570 1 year ago

    Good report, Celso. I copied your report link into my SA distribution, which you can find at my Seeking Alpha comment line, and as always I have added my own observations. mb

  • @pf100andahalf
    @pf100andahalf 1 year ago +3

    Until Nvidia (and AMD) decide that the crypto mining boom is over and stop fleecing their normal (average) GPU customers for everything they can squeeze out of them, I will continue to hope that something happens that gives them some humility. Will that ever happen? Probably not. But I can hope.

  • @Voidkitty_
    @Voidkitty_ 1 year ago +1

    Yay another episode of hopium for the tech industry with the most soothing voice ever

  • @Dmwntkp99
    @Dmwntkp99 1 year ago +7

    Make GPUs upgradable like motherboards.

    • @gregandark8571
      @gregandark8571 1 year ago +2

      Make GPU chips swappable like you do with your CPU when it's time to change it.

    • @Dmwntkp99
      @Dmwntkp99 1 year ago +3

      @@gregandark8571 Would certainly improve the longevity of card ownership.

    • @n0madfernan257
      @n0madfernan257 1 year ago +1

      You could pitch the idea to investors; go with it.

  • @pilsen8920
    @pilsen8920 1 year ago

    Jim's interview with TechTechPotato was the first time I figured out what Jim was building. I think he has a winning model.

  • @gecko2000405
    @gecko2000405 1 year ago +1

    Why Intel even allowed these guys to leave is beyond me. They're the greatest minds for GPUs and CPUs.

  • @Kaptime
    @Kaptime 11 months ago

    Thinking more on this, you can see why Xilinx was such a key pickup for AMD. Being able to sell a chip with its own FPGA you can customize for a specific AI workload is a great idea. Then selling a GPU or dedicated chip that can perform that task is a killer sell. If ROCm were as easy to use as CUDA, they would be able to have the whole stack under control.

  • @DanT10
    @DanT10 1 year ago +1

    This video highlights the issues with capitalism in general. Infinite growth in a finite world means death.

  • @MrValgard
    @MrValgard 1 year ago

    Not all of us were skeptics :P I was telling people that ARM could grow on the back of big tech and dethrone some at the top.

  • @orlof507
    @orlof507 1 year ago

    Great video. I've been going through a huge pessimistic wave lately, but this video gave me a bit of hope for the future.

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      Guy who doesn’t know the company’s real name impresses you, eh? Nice.

    • @orlof507
      @orlof507 1 year ago

      @@HikarusVibrator yes

  • @ChittyBang66
    @ChittyBang66 1 year ago

    Hope a company uses this open hardware to make an affordable 4K 240Hz nano-LED gaming monitor.

  • @user-wm1xm5gm2k
    @user-wm1xm5gm2k 1 year ago

    Brilliant analysis, Coreteks.

  • @kyounokuma
    @kyounokuma 1 year ago

    Holy crap! Your commentary in this video is on another level. Good stuff.

  • @domm6812
    @domm6812 1 year ago +1

    Yep... Intel is a prime example of how you can suddenly fall down when you're at the top: ignoring innovation, blatantly milking consumers and prioritising shareholder demands and margins above all else. Intel is lucky they may have caught the fall in time to climb back up... we'll see.

  • @SweatLaserXP
    @SweatLaserXP 1 year ago

    Discrete graphics ain't going anywhere for a while, and nVidia can transition to a market share approach where they'll make some long-term commitments at a discount with TSMC and/or Samsung and/or whichever chip fab. The thing is, with nVidia, they have top-notch driver maturity, so they will play well with games, editing, simulation, etc. In other words, they're reliable for hardware compatibility. So there is some incentive for them to shrink some of their last-gen chips down and go the budget route.

  • @platin2148
    @platin2148 1 year ago +4

    Pff most of the AI stuff isn’t really great for the end consumer. Most use cases can’t even generate money in any reasonable way.

    • @questmarq7901
      @questmarq7901 1 year ago +1

      Gene editing, healthcare prediction, individualized drug creation (protein folding), robotics symbiosis, robotics in general... there is no market that will not get disrupted by AI.

  • @tech6294
    @tech6294 1 year ago

    Great video as always! ;)

  • @CupOfAwesome
    @CupOfAwesome 1 year ago

    Solid and interesting analysis, as always.

  • @anndroid2161
    @anndroid2161 1 year ago +5

    LG aren't just licensing generic RISC-V chip designs from Tenstorrent though, are they? I suspect they're licensing their AI add-on modules as well, otherwise they'd just use an open-source RISC-V core and save themselves the licence fee.
    If that's the case then the story is just "LG switches chip vendors", as they'll be locked in to Tenstorrent's RISC-V extensions just as if they'd gone with ARM or Nvidia.

  • @pierrebroccoli.9396
    @pierrebroccoli.9396 1 year ago

    A very astute video into the nature of corporate structures as much as it is about technology.
    The irony is that everything is being forced into the corporate structure of operation nowadays, including public services and government. Talking to a DB developer at a startup whose company was going for a public listing, I mentioned to him it would be a great time to get out and look at something new. As companies' structures become too large, they fall in on their own footprint.
    As one advocating for open AI systems and keeping software and hardware open and accessible - this is good news.

  • @GegoXaren
    @GegoXaren 1 year ago

    Cheeses... What is with that echo? It's way worse than normal.
    Put a duvet behind you and a couple of pillows in front of you when recording, and get a carpet for your floor.

  • @WTFBI5
    @WTFBI5 1 year ago

    If I had drunk a shot whenever you said "disrupt" I would be dead

  • @gamingtemplar9893
    @gamingtemplar9893 1 year ago +4

    Wonderful, informative and entertaining work as always. I think governments still have too much influence, but things can change and this seems a very interesting opportunity. I prefer open solutions, in fact I would say "radically" open solutions, but we will see, and I hope for whatever is better for us instead of the corporations.
    Oh, and don't mind those guys, Celso, you were always right about AMD.

  • @jstro-hobbytech
    @jstro-hobbytech 1 year ago

    Cool video!

  • @armandaneshjoo
    @armandaneshjoo 1 year ago +1

    Exactly.

  • @maxwellsmart3156
    @maxwellsmart3156 1 year ago +2

    Why do you think AMD's goal is to disrupt Nvidia? Most likely it's what you wanted to happen, and then you say the RX 7900 is the 'worst' because it wasn't cheap enough. I guess you never factored in that prices would fall. The RX 7900 XT is cheaper than an RTX 4070 Ti; consumers will still buy the 4070 Ti because of RT and DLSS 3. The market is a multi-variable equation. How cheap would the RX 7900 series need to be to make it the 'best'? AMD's goal is to make a profit with good margins because they need to justify their actions to the board and stakeholders.

  • @MozartificeR
    @MozartificeR 1 year ago

    Thank you, Coreteks team...

  • @__--JY-Moe--__
    @__--JY-Moe--__ 1 year ago

    This would be interesting for open source! Good luck to all... some super talents coming together
    to achieve a historic purpose? 😎👍 This will probably be a DPU!! Or CPU+DPU+GPU!! Wow! I get excited!!

  • @El.Duder-ino
    @El.Duder-ino 1 year ago

    AMD and Intel should really watch this! Jim Keller, as an industry veteran, completely understands the "hole in the market", not just from the technological but also the business-model perspective! Tenstorrent's concept and model are excellent and their approach is future proof. I just hope it's not way too advanced and forward looking, as the industry might not be ready for something like this and would rather let Nvidia milk it even more... Nevertheless, something new and innovative, including open source, is much needed as Nvidia becomes something like a monopoly on the AI market, getting more powerful and wealthier by the hour... just imagine what would happen if they merged with Arm...

  • @saemranian
    @saemranian 10 months ago

    Perfect,
    Thanks for sharing.

  • @slopedarmor
    @slopedarmor 1 year ago

    Great video

  • @niamat5129
    @niamat5129 1 year ago

    Damn!!!! This dude's voice is mad deep.

  • @phoenixsub7072
    @phoenixsub7072 1 year ago

    x86 licence fee? I've never encountered it? ? ? ? ?

  • @nowonmetube
    @nowonmetube 1 year ago

    A closed system where you can do nothing, or open source where you have to do many things yourself? I think the thing that's easiest to use will win. And I'm pretty sure that would be some user-friendly but effective AI.

  • @HTV-2_Hypersonic_Glide_Vehicle

    NVidia is*, not "are". NVidia is not multiple entities.

  • @rightwingsafetysquad9872

    Maybe in a few years Raja will leave so Tenstorrent can make some good products.

    • @donnydarko7624
      @donnydarko7624 1 year ago +1

      Yeah, I swear his track record seems to be constantly ignored.

  • @antemasq6520
    @antemasq6520 1 year ago

    Hey, you should cover the feud between George Hotz and Jensen; he talked about it in a recent podcast with Lex Fridman.

  • @daniel_960_
    @daniel_960_ 1 year ago

    I think Tesla setting an example with Dojo will have a huge impact on the industry. Scaling to 100 exaflops in 1.5 years.

  • @BurningDrake39
    @BurningDrake39 1 year ago

    Here! The more you buy the more you save! 😂

  • @johnnyutah9939
    @johnnyutah9939 1 year ago

    Greed, for lack of a better word, is good. Greed is right. Greed works. - Jensen Huang

  • @attackxxx
    @attackxxx 1 year ago

    Can we disrupt the flimsy pin design?

  • @sinephase
    @sinephase 1 year ago +6

    Are there any examples of "open source" anything dethroning a major closed-source company (other than Linux for servers)? Even Vulkan is hardly getting used anymore over DX12 for some reason.

    • @phoenixrising4995
      @phoenixrising4995 1 year ago +4

      Valve uses it pretty heavily for the Steam Deck.

    • @shinkiro69420
      @shinkiro69420 1 year ago

      Blender in 3D spaces

    • @scislife2398
      @scislife2398 1 year ago +1

      Star Citizen is moving over to Vulkan as well. When that gets going it will pull the rest of the games industry with it, when they outsource their code to build huge interconnected worlds.

  • @M3ganwillslay
    @M3ganwillslay 1 year ago

    Mr. Raja is the Nvidia killer... King Shark

  • @e2rqey
    @e2rqey 1 year ago +1

    It's TENStorrent not Trenstorrent

  • @gabfid3
    @gabfid3 1 year ago

    Nvidia has long sold GPUs, and now also CPUs and more integrated systems, with their stated goal being to create accelerators of different kinds, starting with GPUs. The thing is that Nvidia has not created a chip purely for machine learning; rather, it has molded its GPUs to fit.

  • @jeffmofo5013
    @jeffmofo5013 1 year ago

    Wow, way too much tech jargon.
    It would be great if the industry could create a modular compute architecture at the hardware level. FPGAs are the expensive approach and ASICs are the cheap approach. CPUs and GPUs are the "everything is a nail, so let's use a hammer" approach.

  • @tomtomkowski7653
    @tomtomkowski7653 1 year ago +15

    AMD had the opportunity of the decade because Nvidia had so much Ampere stock that they raised prices like crazy and left the doors wide open.
    AMD went for chiplets to cut costs and what have we got?
    The 7900 XTX, as fast as the 4080, for $1000 just because Nvidia raised the price of the 4080 to $1200.
    The 7900 XT, as fast as the 4070 Ti, for $800 just because Nvidia raised the price of the 4070 Ti to $800.
    AMD had a chance and they blew it big time because RDNA 3 didn't deliver and we got nothing from the chiplet savings. I'm afraid that with the AI market, it will be a similar story.
    AMD was able to fight Intel because that was a big, lazy, self-confident corporation used to the monopoly it had. Nvidia is a different kind of player, ruled purely by greed.
    As for RISC-V - sooner or later Nvidia (or Intel, or AMD) will buy it and that will be it.

    • @puriko89
      @puriko89 1 year ago +7

      It looks like AMD can't see past the next quarter. They want to imitate Nvidia in any way possible.

    • @kneppernicus
      @kneppernicus 1 year ago +7

      RISC-V cannot be bought out by anyone, so that's not even an issue.

    • @skywalker1991
      @skywalker1991 1 year ago +3

      Chiplets saved AMD money, but then the GPUs are not selling and AMD is losing money after all, lol

    • @tdreamgmail
      @tdreamgmail 1 year ago +1

      AMD will eventually create superior hardware only to lose again with inferior software. No one is replacing CUDA at this stage.

    • @macronomicus
      @macronomicus 1 year ago

      Both AMD & NVIDIA had record-breaking unsold last-gen backstock to sell through; that & all the unfounded fears of recession are why prices were so high. It worked, both companies have vastly cleared the backlog, & both are bringing down prices. I suspect they won't bring prices too much lower though, because the silicon is more profitable as AI chips.

  • @dimagass7801
    @dimagass7801 1 year ago

    So TSMC calls or puts😭😭😭

  • @cerviche101
    @cerviche101 10 months ago

    Truths in this presentation; thank you for being open and honest with your findings. You are of a dying breed.

  • @NeoShameMan
    @NeoShameMan 1 year ago

    Midjourney is closed though

  • @phoenixrising4995
    @phoenixrising4995 1 year ago

    I don't know if Raja will dethrone Nvidia. Didn't Intel give him the boot recently?

    • @nimrodery
      @nimrodery 1 year ago

      Not according to Intel. I am unaware of any internal dialog at Intel indicating displeasure with actually getting products to market. He did leave.

  • @MozartificeR
    @MozartificeR 1 year ago

    I like AMD going into chiplets. And I will not judge chiplets on video cards until the technology is more mature.

  • @MikkelGrumBovin
    @MikkelGrumBovin 1 year ago

    Keller is so goddamn cool 😉☝️

  • @Ojref1
    @Ojref1 1 year ago +7

    Raja is a doom upon all he touches. Putting him on board your ship is like ordering the crew to punch a 10-foot hole in the side of your boat.

    • @pilsen8920
      @pilsen8920 1 year ago

      Thanks, I couldn't have said it better.

  • @dimagass7801
    @dimagass7801 1 year ago

    I gave up on investing and have AI do it for me, but it keeps buying more Nvidia 😭

  • @ajaypuri130
    @ajaypuri130 1 year ago

    I basically watch your videos because I love the way you speak, and your knowledge is amazing. Nvidia reminds me of the Intel of old under Andy Grove, who wrote the headline-making book ONLY THE PARANOID SURVIVE. When he was the head of Intel, that company could not be overtaken by any other company because he was always scared, and he made his company work so hard to make sure no other company took any lead. The current CEO of Nvidia is the same: he is scared, always scared. He will never give an inch. He will push boundary after boundary to be the leader of the industry.
    I am now 61 years old and started using computers when they were called word processors, way back in 1980. I always keep the latest machine with the greatest hardware. My GPU has always been Nvidia. I tried ATI two times and both cards failed. No Nvidia card has ever failed for me. I am currently on an AMD 7950X. I wanted to upgrade to the 7950X3D, but when I saw virtually no performance increase, I decided to wait. In the meantime, I bought a gaming laptop with the 13980HX, and I am amazed by the speed. For my desktop, when I had no choice but to upgrade to a new DDR5 motherboard, I had the choice of going for Raptor Lake, but I said to myself: AMD is giving an upgrade route while Intel is end-of-life for this socket, and the very high power draw scared me. So, I thought I was safe in AMD land. Surprise, surprise. Intel is now releasing an updated Raptor Lake which should beat any AMD processor, even their 3D ones. So, I think even when Intel is so down with its antiquated chip-making facilities, it is not letting AMD really take over in any way. Once it catches up it will again push AMD towards the grave. Nvidia is the same.
    I am an Indian and Raja Koduri seems like a nice guy, but he has constantly failed, as he does not have the craziness of the current Nvidia CEO. That guy can do anything, make any gamble, but his gambles are very calculated and mostly pay off. I am now just a little scared of the upcoming 5 series products, as I think the GPU will be the complete computer where you install a CPU and SSD inside! I just hope I am not tempted to upgrade, as my current machine is already using a 1200W power supply.

  • @whosonedphone
    @whosonedphone 1 year ago

    Wow!

  • @denvera1g1
    @denvera1g1 1 year ago +2

    I can't wait for Tenstorrent to take off.
    Alternatively, I can't wait for AMD or someone else to make a machine learning ASIC that is an FPGA.
    Imagine killing off ML/AI usage of desktop GPUs like ASICs did for schitt-coin, but instead of only being viable for a few months, even days, before the algorithm changes and a new ASIC needs to be designed and manufactured, the ASIC could be re-designed on the fly for new algorithms, offering ASIC performance with CPU levels of relevance/lifespan.

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      It’s TRANCETORRENT not tensOrent didn’t you listen?

    • @denvera1g1
      @denvera1g1 1 year ago

      @@HikarusVibrator That's my problem, I didn't see it spelled

    • @HikarusVibrator
      @HikarusVibrator 1 year ago +1

      @@denvera1g1 I was joking with you because the guy who made the video got it so horribly wrong

    • @denvera1g1
      @denvera1g1 1 year ago

      @@HikarusVibrator Hey man, I got it wrong too:
      Tenstorrent.
      And I'm not going to remember that after hearing "Tensorrent" so many times.
      (Edit: come to think of it, I don't think I can recall hearing it ever pronounced correctly except for maybe a Dr. Ian Cutress video)

    • @HikarusVibrator
      @HikarusVibrator 1 year ago

      @@denvera1g1 This guy is talking about the intricacies of GPUs and giving very weighty opinions on companies/products in an industry full of very, very smart people. He acts as an expert and convinces the fools who watch his channel that "TRANStorrent" are going to beat up Nvidia. Come oooon dude, this is like the video game "journalist" that couldn't get past the Cuphead demo. People who know tech never do this.

  • @Humanaut.
    @Humanaut. 1 year ago

    AMD isn't necessarily "following Intel in CPU land" - from what I hear AMD is actually ahead of Intel's CPUs.

  • @pawegraczyk6050
    @pawegraczyk6050 1 year ago

    Nice