How NVIDIA missed the AMD KRAKEN

  • Uploaded 9 Aug 2022
    Support me on Patreon: / coreteks
    Buy a mug: teespring.com/stores/coreteks
    Bookmark: coreteks.tech
    My channel on Odysee: odysee.com/@coreteks
    I now stream at: / coreteks_youtube
    Follow me on Twitter: / coreteks
    And Instagram: / hellocoreteks
    Footage from various sources, including official YouTube channels from AMD, Intel, NVIDIA, Samsung, etc., as well as from other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me
    #nvidia #rtx4090 #amd
  • Science & Technology

Comments • 609

  • @olaole8315
    @olaole8315 Před rokem +31

    Finally!
    A youtuber not bought by GPU makers/vendors.
    Prices are crashing down.
    All we have to do is sit and wait.

    • @James2t3
      @James2t3 Před rokem

      🕐🕑🕒🕚🕓🕔🕕🕖🕗🕘🕙

  • @umeng2002
    @umeng2002 Před rokem +79

    nVidia 2021: "Gamers? What gamers?"
    nVidia 2022: "Gamers, please buy our GPUs."

    • @samgoff5289
      @samgoff5289 Před rokem +3

      Nvidia doesn't care who buys their gpus....

    • @trapstoner
      @trapstoner Před rokem +20

      @@samgoff5289 damn the joke flew right over your head

    • @Azurefanger
      @Azurefanger Před rokem

      @@samgoff5289 yeah dude, they don't care, that's the reason the prices go down every week

    • @monsterboomer8051
      @monsterboomer8051 Před rokem

      in a nutshell

  • @OptLab
    @OptLab Před rokem +86

    To be honest, even if a 3090 drops 80% in price, I don't want my home to be heated (and noisy) whenever I use my computer. FPS per watt matters as much as quietness.
    That is why the new APUs will be interesting to follow

    • @JustIn-sr1xe
      @JustIn-sr1xe Před rokem +9

      That's why I'm looking to AMD for my next GPU and CPU upgrades. Nvidia has power hogs for their 3000 series. AMD's stuff uses far less at default.

    • @OptLab
      @OptLab Před rokem +6

      @@Winnetou17 the RTX 3090 is one of the least efficient in fps per watt. If you plan Undervolting and underwatting by 30%, do it with the more efficient cards already, like a 3060 or 6600XT

    • @nastyyoda5673
      @nastyyoda5673 Před rokem +1

      Undervolting is key

    • @OptLab
      @OptLab Před rokem

      ​@@Winnetou17 Thanks for your comment, and I agree. Though, some cards are just the overclocked version of the others, in order to simply charge more. Do you actually have data or website about the best gpus to undervolt?

    • @YuhuanXie
      @YuhuanXie Před rokem +2

      Nvidia set the defaults on their 30-series cards too high; supposedly they were somewhat paranoid about RDNA2. It takes like 5 minutes to watch a YouTube video about undervolting. My 3080, which runs around 320 W at stock, only loses about 10% performance running at 220 W, and at 140 W (it seems to crash while gaming if it goes any lower) it's still more than enough for esports. 140 W was also about the minimum my 1080 Ti could run at without crashing.
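
The undervolting trade-off described in this thread can be sketched numerically. A minimal Python sketch using the wattage and performance figures quoted in the comment above (the function name is illustrative, and real scaling varies per card and workload):

```python
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance (1.0 = stock) divided by board power."""
    return relative_perf / watts

# Figures quoted above for an RTX 3080: ~320 W at stock,
# roughly 10% slower when limited to ~220 W.
stock = perf_per_watt(1.00, 320)
undervolted = perf_per_watt(0.90, 220)

gain = undervolted / stock  # efficiency improvement factor
print(f"efficiency gain: {gain:.2f}x")  # ~1.31x
```

Giving up ~10% performance for a ~31% efficiency gain is why undervolting is so popular on this generation of cards.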

  • @joeallan3706
    @joeallan3706 Před rokem +30

    4:00 "cough" jayztwocents "cough"

    • @somebonehead
      @somebonehead Před rokem +5

      Okay it wasn't just me who thought that.

    • @grtitann7425
      @grtitann7425 Před rokem +3

      Simple, he is deep into Nvidia pockets, like LTT and Tim from hardware unboxed.
      Can't trust those sellouts.

    • @50shadesofbeige88
      @50shadesofbeige88 Před rokem +16

      @@grtitann7425 lol Hardware unboxed? Really? 😄

    • @grtitann7425
      @grtitann7425 Před rokem +4

      @@50shadesofbeige88 yes, Tim, from HU. Not Steve.
      Tim will always twist his videos to be as pro.nvidia as he can without blowing his cover.

    • @gravesddd
      @gravesddd Před rokem

      There's no way Jay was that naive to believe prices were never going lower.

  • @DennisBLee
    @DennisBLee Před rokem +25

    Nvidia is paying for their greed now. It's karma for choosing to crank out high-end GPUs for miners at a time when the actual demand was in the low-to-mid range. Good luck selling anything when you'll be able to pick up a 3090 for $600 in December.

    • @samlebon9884
      @samlebon9884 Před rokem +1

      Yep, empires come with the seeds of their destruction.

    • @adnan4688
      @adnan4688 Před rokem +1

      @@samlebon9884 Honestly, right now I don't see a difference between nVidia and AMD. They both went greedy.

    • @samlebon9884
      @samlebon9884 Před rokem +2

      @@adnan4688
      When it comes to greed, nobody beats Nvidia and Intel. AMD is somewhere in the middle.
      Remember, if there was no AMD, the market would be a duopoly shared by green and blue.

    • @adnan4688
      @adnan4688 Před rokem +2

      @@samlebon9884 With the last gen, AMD proved they are the same. The moment they get a chance, they will set the same premium prices on their products. I have been an AMD supporter for 3 decades, but those are facts, my friend.

  • @brucethen
    @brucethen Před rokem +8

    The last time AMD partnered with a game maker and the game was delayed, it was Half-Life 2. As I recall, that worked massively in AMD's favour, building up hype and selling more cards

  • @ThomasSerruques
    @ThomasSerruques Před rokem +69

    I think Kraken is hardware compression/decompression of assets on the PS5 SoC, so there is already an AMD Kraken ;-)

    • @samlebon9884
      @samlebon9884 Před rokem +12

      Yep, Nvidia is getting compressed without reversal.

    • @nextlifeonearth
      @nextlifeonearth Před rokem +1

      Kraken was already known in the industry before the PS5 came out; the PS5 is simply the first standard hardware with an ASIC for it.
      Ideally DirectStorage will do something similar, though I don't really know which compression technique they chose. If it's similar to the Xbox's, then it's another technique optimised more for textures than for general data, which makes sense, because the GPU's main workloads during loading are compiling shaders (which faster storage won't help much) and loading textures. Geometry is usually less data, or with things like tessellation it is derived from yet another texture.
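
The appeal of hardware decompression that this comment describes comes down to simple arithmetic: compressed assets make the drive look faster. A back-of-the-envelope sketch, assuming a ~7 GB/s PCIe 4.0 drive and a hypothetical 2:1 compression ratio (both numbers are illustrative, not Kraken or DirectStorage spec values):

```python
def effective_read_bandwidth(raw_gbps: float, compression_ratio: float) -> float:
    """Asset-delivery rate when decompression happens after the read.

    raw_gbps: sustained SSD read speed in GB/s (illustrative).
    compression_ratio: on-disk shrink factor, e.g. 2.0 means assets
    are stored at half their in-memory size (hypothetical value).
    """
    return raw_gbps * compression_ratio

# A ~7 GB/s PCIe 4.0 x4 drive with 2:1 compression delivers
# assets as if the drive could read 14 GB/s of raw data.
print(effective_read_bandwidth(7.0, 2.0))  # 14.0
```

The catch is that decompression has to happen somewhere, which is why a dedicated ASIC (PS5) or GPU-side decompression matters: doing it on CPU cores eats the gain.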

    • @Sam-cq9bj
      @Sam-cq9bj Před rokem

      Amd fanboi channel.

  • @Ruperth316
    @Ruperth316 Před rokem +21

    4:02 you mean Jayztwocents? lol

  • @mmadevgame
    @mmadevgame Před rokem +5

    I am so thankful to AMD. If not for AMD, we would still have 4-core i9 CPUs.

  • @hemzaonline
    @hemzaonline Před rokem +12

    4:01 i believe it's JayzTwoCents

  • @theFutureSoundWaves
    @theFutureSoundWaves Před rokem +72

    Soon to upgrade my main machine, I'll go all-AMD this time with Zen 4 and RDNA 3.
    Go AMD! 👍👍👍

    • @pissmyasslynch5325
      @pissmyasslynch5325 Před rokem +5

      Me 2😗

    • @DamnDarryn
      @DamnDarryn Před rokem

      do you own a motherboard that supports AMD?

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem +5

      Yeah baby! ME 2! Just F Nvidia & Intel! AMD all the way!

    • @samlebon9884
      @samlebon9884 Před rokem +8

      @@DamnDarryn
      He said he's going all AMD, which means he's switching to a completely new platform.
      Many people are switching to AMD because the choice is obvious.

    • @thesethingsalsohappened6597
      @thesethingsalsohappened6597 Před rokem

      You all will regret your decision for sure. Just come back to this comment when that happens.

  • @DiabloMonk
    @DiabloMonk Před rokem +4

    4:01 By “influencers”, do you mean Jayztwocents? I don’t think the use of “idiot” shortly afterwards was a coincidence.

  • @Uncle_yandere
    @Uncle_yandere Před rokem +9

    2020's $500 RTX 2080 Ti was a lesson, so I doubt anyone would want to gamble on another possible chip shortage

    • @Slayer666th
      @Slayer666th Před rokem +3

      But most miners I know own 3-5 cards. With electricity bills going up and mining going down, I don't see them keeping their cards for much longer.
      There will probably be a flood of cards.

    • @tobiwonkanogy2975
      @tobiwonkanogy2975 Před rokem +1

      @Joe Campbell Yes and no. The bigger cards are on separate, larger dies, and more of the smaller chips fit on the same wafer diameter. The market stack just comes from overall silicon quality: the best chips come from nearer the middle of the wafer, less so toward the edges. Dies at the outside can have defects or underperform, and those become the other chips. We basically get 2-4 types of die every generation, and each of those wafers ends up yielding different GPUs.
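
The die-size intuition above can be made concrete with the classic first-order dies-per-wafer approximation (a textbook estimate that ignores scribe lines, defect yield, and edge exclusion; the die areas below are made-up examples, not any specific GPU):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order dies-per-wafer estimate: gross area minus edge loss."""
    d = wafer_diameter_mm
    gross = math.pi * (d / 2) ** 2 / die_area_mm2          # whole-wafer area / die area
    edge_loss = math.pi * d / math.sqrt(2 * die_area_mm2)  # partial dies at the rim
    return int(gross - edge_loss)

# A 300 mm wafer: a large ~600 mm^2 die vs. a small ~200 mm^2 die
# (hypothetical areas). Smaller dies yield far more candidates.
print(dies_per_wafer(300, 600))  # 90
print(dies_per_wafer(300, 200))  # 306
```

This is why the big flagship dies are scarce and expensive while the smaller dies fill out the rest of the product stack.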

  • @excitedbox5705
    @excitedbox5705 Před rokem +5

    Nvidia also cranked the power to the max to get gains that the architecture and Samsung's node didn't provide. They can't offer the same performance increase this time because the overclocking budget was already eaten up by the last gen. They need above-average architecture improvements to claw back the extra power consumption they relied on last gen.

  • @converse5874
    @converse5874 Před rokem +13

    I wonder if the advantages of these integrated systems (cpu + gpu + direct storage/smart access) also come into play with AMD's APUs

    • @samlebon9884
      @samlebon9884 Před rokem +4

      I think so. I think this slew of techs are coming from the consoles.

    • @TheDiner50
      @TheDiner50 Před rokem

      Honestly, if there is one thing I beg AMD to look into doing in the future, it is to create a console for the PC market. RIP PC building, but I honestly think moving away from having motherboards, CPUs, RAM and GPUs coming from different vendors would be a good thing. Or at least have the CPU, GPU and motherboard come from the same vendor. I'm so tired of motherboards having flaws where the only one who cares is the CPU vendor, since the motherboard vendors give zero thought to things being broken or unstable. I'd rather get the hardware as if it were a laptop IF there is any hope that drivers, BIOS, etc. will be as close to flawless as possible.
      Let PC building be about fans, cases and cooling! We'd still pick the computer to put in there, but without this mess. Even if the CPU or APU can't be removed from the motherboard, making the CPU vendor responsible for designing and delivering the crucial motherboard should mean less need to replace some shoddy board. There is not a single reliable motherboard vendor out there. NONE. There are claims, there is marketing, but the weak link is always the motherboard, whatever your budget is!
      Get an AMD CPU? Too bad, your network card is from Intel and nobody cares that it's broken. You got one of the first M.2-equipped motherboards? Oops, a BIOS update makes the port useless, and you can forget about it ever being fixed: "IT IS FIXED ALREADY!"
      You got a CPU produced in China? Well, the motherboard didn't like that; apparently it needs to come from the right country of origin to be compatible. That is how loose the tolerances are at the motherboard level of barely making things work. These are the same companies building motherboards to the CPU vendors' specs, yet AMD can send a replacement CPU and proof that the one provided for troubleshooting was genuine, and the motherboard maker still doesn't care that its boards refuse to run products they are sold to work with. And where do you go? There is NO vendor making good motherboards. Lenovo, maybe? And they basically only make laptops and laptop-like desktop machines.
      Make personal computers like consoles, and make them reliable. Almost like Apple Macs, but hopefully repairable and costing less while still having the good, reliable parts inside. Have one vendor to face when things don't work, not the blame game. The CPU needs the motherboard to be well made for it; there is no way around that. You can spend premium money on motherboards and still get burnt every single time. The only computers that have worked almost flawlessly for me over the years are laptops!

    • @miguelpereira9859
      @miguelpereira9859 Před rokem

      @@TheDiner50 I agree with you, AMD should just build a line of "Console-like" gaming PC's using APU's

  • @tobiwonkanogy2975
    @tobiwonkanogy2975 Před rokem +4

    This cycle was ridiculous anyway. There were no cards at list price the entire cycle, and now that we are on the cusp of a new one, no one wants to overspend. Getting a new, unused card at or near MSRP is now possible, and there are still those waiting until prices drop to MSRP before they consider the upgrade they originally planned.
    The used market will still be there even with the launch of the 40 series. It will be scalped: Ticketmaster took control of its own scalping site, so it's a lot harder to scalp concert/venue tickets now, while GPU scalping is as easy as can be.

  • @MostlyPennyCat
    @MostlyPennyCat Před rokem +52

    Breaks my heart to see Nvidia get their comeuppance.
    Oh wait, no it doesn't.
    Go AMD!
    They supported the open source world even though it's not profitable.

    • @zahell
      @zahell Před rokem +7

      This is the way

    • @beachslap7359
      @beachslap7359 Před rokem +9

      You mean they supported open source cuz they had no choice as nobody would bother with their products otherwise

    • @imadecoy.
      @imadecoy. Před rokem +10

      @@beachslap7359 The reasoning is irrelevant. Consumers need to reward that behavior.

    • @ytb3748
      @ytb3748 Před rokem +4

      @@imadecoy. I totally agree

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem +1

      Oh YES! RDNA 3 is going to beat the RTX 4000 series! And RDNA 4 will completely CRUSH Nvidia!

  • @marcux83
    @marcux83 Před rokem +5

    finally. payback time!

  • @Bassjunkie_1
    @Bassjunkie_1 Před rokem +2

    I can never unsee Nagreedia..

  • @vsuarezp
    @vsuarezp Před rokem +1

    Excellent video as always, Celso. Greetings.

  • @IbanezGuitarz87
    @IbanezGuitarz87 Před rokem +17

    Time for Nvidia to be toned down imo... Go AMD!

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem +1

      RDNA 4 (2024) will CRUSH Nvidia into the ground! RDNA 4 will be a game changer!

  • @Speak_Out_and_Remove_All_Doubt

    The game needs to be written to take advantage of DirectStorage though, doesn't it? So sadly, almost no benchmarks at release will show off any of this 'synergy'.

    • @marsovac
      @marsovac Před rokem +5

      Forspoken is not out yet, if that's what you're talking about, and it has been built around DirectStorage for a long time, even in alpha. AMD doesn't need all benchmarks to be like Forspoken; just one example of what is possible will be enough. If they can improve RT, integrate FSR 3.0 + SAM + SAS, and be on par with Nvidia's DLSS 3, it will be enough to claim a lot of the market.

    • @proxis9980
      @proxis9980 Před rokem +1

      Get enough RAM and DirectStorage is just dead. DDR5 already does 4.8 Gb/s of throughput per pin and is heading to 6.4 Gb/s per pin. Even a fully bandwidth-limited SSD only gets you up to about 14 GB/s, so you're looking at triple-digit GB/s of memory bandwidth versus 14 GB/s from storage. DirectStorage is a complete waste of time unless you are restricted by volatile memory, i.e. you have to share only 16 GB of RAM between GPU and CPU (consoles). Unless AMD pays developers an arm and a leg to optimise ONLY for this feature set, it's DOA for PC beyond some occasional level load-time reductions.
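
The bandwidth gap this comment is pointing at can be checked with nominal peak numbers (assuming DDR5-4800 on a 64-bit channel and a ~14 GB/s PCIe 5.0 x4 SSD; real sustained rates are lower than these peaks):

```python
def ddr5_channel_gbs(transfer_mts: float, bus_width_bits: int = 64) -> float:
    """Peak bandwidth of one DDR5 channel in GB/s."""
    return transfer_mts * bus_width_bits / 8 / 1000  # MT/s * bytes per transfer

dual_channel = 2 * ddr5_channel_gbs(4800)  # DDR5-4800, two channels
pcie5_ssd = 14.0  # rough peak of a PCIe 5.0 x4 NVMe drive, GB/s

print(dual_channel)              # 76.8 (GB/s)
print(dual_channel / pcie5_ssd)  # RAM is still ~5.5x faster
```

So RAM comfortably out-runs even the fastest SSDs; DirectStorage's case rests on capacity (not enough RAM to cache everything) and on cutting CPU decompression work, rather than on raw bandwidth.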

    • @Speak_Out_and_Remove_All_Doubt
      @Speak_Out_and_Remove_All_Doubt Před rokem

      @@proxis9980 You would think 64GB would be enough for most games and it's not that expensive these days.

    • @dpptd30
      @dpptd30 Před rokem

      Synergy can only really be achieved by harmonizing the software and the hardware, not the CPU and GPU. Just look at Apple: although they've been designing their own CPUs since the A4 chip, they still used PowerVR GPUs until recently, yet they achieved great “synergy” with their proprietary Metal API. AMD's issue now isn't that their CPUs and GPUs don't work together; it's their mediocre software optimization and their poor ray-tracing performance on RDNA 2, which can't even beat Turing. Anyone who says ray tracing is a scam is tech-illiterate: just look at how realistic Pixar animation or even movie CGI is, and imagine applying that to video games in real time over 60 FPS. Ray tracing literally brings photorealistic, cinematic graphics to games in real time, turning games into interactive movies instead of just “a game”. Half-assing it is a big mistake for AMD, as is completely ignoring hardware-accelerated AI, considering how much AI is revolutionizing the industry right now.

    • @Speak_Out_and_Remove_All_Doubt
      @Speak_Out_and_Remove_All_Doubt Před rokem

      @@dpptd30 to be fair to AMD I think we need to judge them in 5 years or so to see where they are as they simply haven't had the money to do anything other than the basics for a long time. But equally, 'Apple like synergy' on a Windows PC would be almost impossible simply because there are too many different players.

  • @abowden556
    @abowden556 Před rokem +23

    I think it's going to be really hard to overcome Nvidia's mind share in the short term. Even when AMD has had a better product, it hasn't helped that much in the past.

    • @RonnieMcNutt666
      @RonnieMcNutt666 Před rokem

      I think that fuels AMD. As the Roman saying goes, comfort creates weak men, and Nvidia is getting way too comfortable. The mind-share thing is terrifying: it's more that people hate AMD than even like Nvidia, and negative influence is stronger than positive.

  • @mmmuck
    @mmmuck Před rokem +8

    I'm curious if AMD will catch up with Nvidia in real time rendering in DCC applications

    • @Megalomaniakaal
      @Megalomaniakaal Před rokem +3

      They are honest to god trying but that's a huge cliff to climb.

    • @dpptd30
      @dpptd30 Před rokem

      Not going to happen any time soon, at least for the next decade; RDNA 2 can't even beat Turing in real-time rendering. Seriously, an RTX 2060 is beating AMD's 6900 XT on Blender Open Data.

    • @carrickdubya4765
      @carrickdubya4765 Před rokem

      @@dpptd30 🥴

  • @mightylubu1826
    @mightylubu1826 Před rokem

    Such a great video as always!

  • @JustIn-sr1xe
    @JustIn-sr1xe Před rokem +23

    As an RTX 2060 owner, looking to get into higher tier resolution and refresh rates. I'll probably be skipping the 3000 series, simply because their price and power draw is far too high (also lack of VRAM). I may consider an RTX 4000 series. But, I'm more interested in AMD's offerings from both current gen and the upcoming RX 7000 series.

    • @DJ.1001
      @DJ.1001 Před rokem +13

      Price and power draw will be worse next gen.

    • @Farren246
      @Farren246 Před rokem +3

      As a Plex Server owner, I look forward to cheap NVENC gen 7 devices like your RTX 2060 flooding the market in the next 6 months.

    • @tringuyen7519
      @tringuyen7519 Před rokem +5

      Although the RX 7000 series will use more power than the RX 6000 series, it will be nothing compared to the Nvidia RTX 4090 Ti's 800 W.

    • @machdaddy6451
      @machdaddy6451 Před rokem +2

      Try to hold out until Nvidia is begging you to take them!

    • @Hardcore_Remixer
      @Hardcore_Remixer Před rokem

      @@tringuyen7519 NVIDIA recently announced the actual power draw of the 4080 to be around 320 W; 800 W may be the maximum spike. Still, AMD is better on performance per price.

  • @honoredshadow1975
    @honoredshadow1975 Před rokem

    What was the Samsung OLED monitor he mentions halfway through the video? That one is new to me; I wouldn't mind getting it. I typed in G-Series and didn't find anything.

    • @Jmundala
      @Jmundala Před rokem +2

      He most likely means the Samsung Odyssey G7/G8; they are QLED, not OLED. What he says doesn't seem correct in that regard.

    • @honoredshadow1975
      @honoredshadow1975 Před rokem

      @@Jmundala Ya I thought as much. Damnit lol. Thanks for the reply.

  • @michahojwa8132
    @michahojwa8132 Před rokem

    There was an AMD demo of physics effects running on a 16-core CPU. Is it part of the API stack used in Forspoken?

    • @arenzricodexd4409
      @arenzricodexd4409 Před rokem +1

      Game developers in general don't want to spend excessive resources on physics calculation, so they make it utilize one or two CPU cores at best. And even then, it usually doesn't push those cores very hard.

    • @michahojwa8132
      @michahojwa8132 Před rokem

      @@arenzricodexd4409 Yes, but still: there were reports of a tech demo presented about a year ago, and no game has used it since. Forspoken, with many particle-physics effects, would be good software to show it off and to gain additional performance on consoles.

    • @arenzricodexd4409
      @arenzricodexd4409 Před rokem +2

      @@michahojwa8132 Heavy physics processing has been pushed since the mid-2000s, and game developers aren't interested. This kind of thing is most often a hardware-maker ploy to sell stronger hardware, while developers usually want their game to run even on potatoes so they can reach a wider audience and broaden their sales target.

  • @Bare_Essence
    @Bare_Essence Před rokem +16

    Well ... the price cuts are unprecedented only because the price hikes were unprecedented. The markets allowed one and required the other. I do think it is interesting that Nvidia's attempt to flood the market with multiple types and quantity of cards at high or no MSRP has caused their announcement of the 4000 series to backfire and force a rescheduling. I also am curious why the 1000 series seemed to have unknown performance until the announcement by Nvidia with their "performance" charts that wowed everyone. Now we all know pretty much what to expect prior to any announcement of the 4000 series. It's a new world. The company reveals seem to always be "yeah, yeah, we know that ..."

    • @joesterling4299
      @joesterling4299 Před rokem +3

      "the price cuts are unprecedented only because the price hikes were unprecedented"
      Perfectly said. Prices became insane. As a result, I got completely out of the mindset of buying into a PC upgrade at all. My Skylake/Pascal rig is still doing fine at 1080p, and I'm just not going to bother with a new PC for a while, even though now they're getting more reasonable again. It's not even that I want to wait for the next gen of GPUs. I simply have no appetite for buying anything right now. I would have, when the 30 series launched, had it been reasonable and available enough.

    • @unreconstructed9172
      @unreconstructed9172 Před rokem +1

      They were not "unprecedented", they were inevitable! These "price cuts" are a joke anyway; original MSRP 3 years later is still overpriced. I hope the market crushes them in the proverbial wallet.

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 Před rokem

      I think managed "leaks" have become a double-edged sword. While they maintain hype over a longer period, when it comes time to purchase, people have had a longer time to consider the relative value, which cuts down on impulse buying. Evidence of this is in the video, when Coreteks says something to the effect of, "buyers have become more savvy", and this savviness is due partially to potential buyers absorbing and managing this extra information. I think people are also much more aware of product cycle behavior these days and how to identify where in the product cycle we are relative to the desire to buy.

  • @jonzie24
    @jonzie24 Před rokem +2

    Ah, neural graphics. That will be a holy grail, that will be a big buzzword used in next-gen consoles. Personally, I can't wait for it, as it will push the virtual world generation to whole new levels.

  • @WealthyHomeless
    @WealthyHomeless Před rokem +14

    4:00 I'm disappointed in jayztwocents

  • @DeadphishyEP3
    @DeadphishyEP3 Před rokem +59

    Nvidia made so much money during the boom that they are not worried about this downturn. The only people freaking out are the investors who didn't understand that this growth and revenue were temporary. Nvidia would do it all again if they could: 3 birds in 1 year and none in the second is still better than 1 bird in each of the two years.

    • @ZadakLeader
      @ZadakLeader Před rokem +1

      You're a bird

    • @ItsJustVV
      @ItsJustVV Před rokem +7

      Exactly. Calling this "losses" by the entire tech press and YT is so pathetic, these are not losses, it's a return to normal market conditions of a 2 year old GPU generation that is about to be replaced in 2 months. This is normal, last 2 years were abnormal.
      It's the reason Jensen said they are raising wages at nvidia now, because they made a shit ton on $ until now.
      Tech press is so ignorant and can't or won't call it what it is, they just like click bait titles and statements.
      Coreteks is guilty of this too, he said losses instead of what it is. Even in calling out Jay for his BS "BUY NOW" shit he did 3 weeks ago, Coreteks was afraid to say his name. I don't understand this cowardice.

    • @jimiscott
      @jimiscott Před rokem +2

      @@ItsJustVV It is a loss if they need to write down the assets sitting in the warehouse: DR Inventory Write-off, CR Inventory

    • @ItsJustVV
      @ItsJustVV Před rokem +1

      @@jimiscott Technically yes, but I don't care about correct terminology when they made so much profits in the last 2 years that they can fail miserably in 2022 (which won't happen) and still be on the plus side, so no losses from my POV.

    • @nathangamble125
      @nathangamble125 Před rokem +1

      @@ItsJustVV "Calling this "losses" by the entire tech press and YT is so pathetic"
      It is literally a loss. They have spent money that they cannot make back. The reason for it doesn't change that.
      "I don't care about correct terminology"
      Then shut up. If you don't use words properly, you can't say anything useful or meaningful, so why bother saying anything at all?

  • @sermerlin1
    @sermerlin1 Před rokem +1

    The Samsung G series (if we are thinking of the curved Neo G7 and G8) are LCD panels, not OLED panels. This confuses me.

    • @ImSumGuy
      @ImSumGuy Před rokem

      I think he was mistaken; there is a planned OLED monitor from Samsung in production, but it's not yet on the market.

    • @sermerlin1
      @sermerlin1 Před rokem

      @@ImSumGuy Yes, that's right. There are OLED monitor(s?) coming from Samsung sometime in the near future, probably near-identical to Dell's AW QD-OLED monitor, but as of right now Samsung doesn't have OLED monitors, and no "G series OLED" monitors.
      I think he confused it with the VA mini-LED displays, which are G series (G7 and G8).

  • @pyrophobia133
    @pyrophobia133 Před rokem +2

    but does this mean Jensen is going to cut back on leather jackets?

    • @diaman_d
      @diaman_d Před rokem

      he'll be wearing a black turtleneck sweater from now on.

  • @partnersincraft1889
    @partnersincraft1889 Před rokem +4

    Note: FidelityFX is open source. DirectStorage might become so, if it isn't already.

    • @oxfordsparky
      @oxfordsparky Před rokem

      It's a Microsoft API.

    • @partnersincraft1889
      @partnersincraft1889 Před rokem

      @@oxfordsparky So it can be replaced and/or is already used by Nvidia.
      FidelityFX & DLSS are both used by Nvidia.

    • @SerBallister
      @SerBallister Před rokem

      @@partnersincraft1889 AMD would make a driver that is compatible with the Direct storage API, I don't think they would replace it?

    • @partnersincraft1889
      @partnersincraft1889 Před rokem

      @@SerBallister I'd say it depends on if the Linux driver gets a better version of it. The Linux driver is very good.

  • @estebanguerrero682
    @estebanguerrero682 Před rokem

    Any thoughts on the SSD market and prices in the future? Next RAM and connectivity standards such as Compute Express Link (CXL)? And even the wireless peripherals market?

  • @MaTtRoSiTy
    @MaTtRoSiTy Před rokem +9

    Honestly, the AMD cards are looking like a relative bargain at present in my region. You can get a 6900 XT for pretty much 1/3 of what I paid for my 3080 Ti (I did buy at the worst time). I would not even consider ray tracing if I were shopping now: even with a 3080 Ti I usually turn RT off, as the more intensive implementations hit my frames so hard it just doesn't feel worth it. I won't be considering ray tracing as a factor next time I upgrade, so I may end up trying AMD for my next card, as the value is much better.

    • @gamtax
      @gamtax Před rokem

      At my place, a Sapphire Nitro RX 6900 XT still costs $1500. What an expensive dust collector I'm seeing right now...

  • @elmariachi5133
    @elmariachi5133 Před rokem +42

    Prices are still 200% of what they should be... still waiting! A high-end card (like the 3090 Ti) is $600 max, and nothing beyond!

    • @JustAnotherAccount8
      @JustAnotherAccount8 Před rokem +6

      Exactly, PC gamers shouldn't pay more for a mid range PC than a current gen console. And high end should be within a few hundred of a current gen console.

    • @dazealex
      @dazealex Před rokem +1

      @@JustAnotherAccount8 Plus look at the insane amounts of wattage a GPU needs these days. I got a 2070 MaxQ in a shitty HP Omen laptop. Works fine on 1440p, keeping that until they release something that is truly 60fps with RT. Otherwise, screw it. Nothing much that I play supports RT anyway.

    • @JustAnotherAccount8
      @JustAnotherAccount8 Před rokem +4

      @@dazealex Yeah don't bother with RT, not worth paying a premium, and most games that support it look amazing without it, in some cases even better.

    • @paerbac
      @paerbac Před rokem

      Basic supply and demand: there are only so many silicon fabs out there, and they can sell GPUs for way more to server and data-center businesses anyway, so it makes perfect sense for them to raise prices like they do. I don't like it, but capitalism does deem that things are priced the way they "should" be.

    • @TECHNDJ
      @TECHNDJ Před rokem

      The 3090 Ti is a prosumer card, not for peasant gamers, so paying $1600 for it is worth it 💥🤣

  • @xidex8
    @xidex8 Před rokem +1

    Kraken is your post-processed voice in every single video you upload nowadays.

    • @Coreteks
      @Coreteks  Před rokem

      @xidex why not turn the sound off and turn captions on if it bothers you so much?

    • @xidex8
      @xidex8 Před rokem

      @@Coreteks I usually watch your videos in a miniaturized window mainly listening what you are saying and doing other stuff alongside (e.g. coding), so what you suggest would not be possible. I just cannot comprehend what led you to heavily postprocess your voice when it was completely fine in the first videos on this channel.

    • @diaman_d
      @diaman_d Před rokem

      @@xidex8 if people could accept who they are, there would not be cosmetic surgery either.😉

    • @xidex8
      @xidex8 Před rokem

      @@diaman_d true, he probably thought his voice was not manly enough or something.

    • @lilblackduc7312
      @lilblackduc7312 Před rokem

      @@Coreteks "Never argue with stupid people. They drag you down to their level, then beat you with experience". -Mark Twain 🇺🇸 😎👍☕

  • @jamesmetz5147
    @jamesmetz5147 Před rokem

    Great as always.

  • @donaciuxx
    @donaciuxx Před rokem +1

    someone please make bass boosted version of this, i miss that bass!😅😂

  • @Bassjunkie_1
    @Bassjunkie_1 Před rokem +3

    Imo greatest reduction is a farce when they cost 2k. Just my opinion tho

  • @ChadeGB
    @ChadeGB Před rokem +16

    All the "Fidelity FX" features you mention are just DX12 Ultimate features; they're readily available and just as optimised for Nvidia hardware too. Square Enix are not going to alienate the company that has, by far, the largest market share when it comes to PC graphics hardware. Intel and AMD are also not going to purposely hobble the performance and compatibility of their CPUs with Nvidia hardware; all those features mentioned with an "AMD synergistic" system will still be available with an AMD/Intel/Nvidia one.

    • @Azurefanger
      @Azurefanger Před rokem +2

      It's a good option to have these improvements open for GPUs of different brands, but I think the new gen of AMD will be a beast when you have a 7000-series CPU and an AMD GPU together.

    • @Coreteks
      @Coreteks  Před rokem +1

      @Chade Fallstar to some extent yes, but just like we've seen with ReBar the perf gains will be smaller outside of AMD's ecosystem. The feature might be available to NV users, but that doesn't mean it will be fully optimized

    • @ChadeGB
      @ChadeGB Před rokem +7

      @@Coreteks No, that's just not the case. Resizable BAR is part of the PCIe spec and AMD just had a head start by utilising it first (they called it SAM) back in 2020, but both vendors gained performance in some games while performance was actually reduced in others, and now they're pretty much neck and neck. This is a different case entirely: AMD marketing are, quite successfully by the sound of it, trying to make out like these features are in some way special to their ecosystem. They are not; they are standard DX12 Ultimate features, already being used in other games. AMD may initially have a performance advantage immediately after the game's release, but that will be eaten up by the time Nvidia release their first Game Ready driver for it.
      These aren't special or optimised for AMD in any way. This has been going on since the ATI vs Nvidia days; it's nothing new to have games initially optimised for one architecture over the other.

  • @IntenseGrid
    @IntenseGrid Před rokem

    Lol, you're the only one I know that says to clink on a link. Excellent video! I can't wait until the next AMD GPUs come out! About the 7000X series CPUs, did you see that AMD hurt their own performance by putting on such a thick heat spreader? If not, check out DerBauer's video on the subject! Maybe they should come out with a 7000Z series with a much thinner pure-silver heat spreader!

  • @jangelelcangry
    @jangelelcangry Před rokem

    Kraken as in PS5's SSD tech. Get it?

  • @TheACLP
    @TheACLP Před rokem +17

    As graphic cards performance get better than games actually need and monitor refresh rates, there will be less and less reasons to upgrade. Unless there’s an actual good reason to transition to 8k.

    • @spicynoodle7419
      @spicynoodle7419 Před rokem

      Muh 4k Ultra @ 240Hz

    • @UserName-eb9oy
      @UserName-eb9oy Před rokem +4

      I think 8k on an ordinary size monitor may be higher resolution than the human eye can see

    • @spicynoodle7419
      @spicynoodle7419 Před rokem +4

      @@UserName-eb9oy 8k 100" monitor. Yeah, get ready for lots of neck pain.

    • @doctorfresh3856
      @doctorfresh3856 Před rokem +1

      @@spicynoodle7419 They'll Probaly Just Raise 8K to like 40 inches.

    • @Powerman293
      @Powerman293 Před rokem

      The smallest LG OLEDs used as a monitor still can fill up your whole field of view. I can't imagine anything sharper would help.
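
A quick back-of-the-envelope check of the acuity claim above. The panel size and viewing distance here are my own illustrative assumptions (a 32-inch 16:9 monitor seen from 60 cm), combined with the commonly cited ~1 arcminute resolving limit for 20/20 vision:

```python
import math

# Assumed setup: 32" 16:9 panel at 8K (7680x4320), viewed from 60 cm.
diag_in = 32
width_in = diag_in * 16 / math.sqrt(16**2 + 9**2)  # ~27.9 in horizontal
ppi = 7680 / width_in                              # ~275 pixels per inch
pitch_mm = 25.4 / ppi                              # ~0.092 mm pixel pitch

view_mm = 600
# Angle subtended by one pixel, in arcminutes
arcmin = math.degrees(math.atan(pitch_mm / view_mm)) * 60
print(f"{ppi:.0f} PPI, one pixel subtends {arcmin:.2f} arcmin")
# ~0.53 arcmin per pixel, below the ~1 arcmin limit of 20/20 vision,
# which supports the claim that 8K at ordinary monitor sizes exceeds
# what the eye can resolve at typical desk distances.
```

At a larger panel size or a shorter viewing distance the pixels would subtend a larger angle, which is why the claim only holds for "ordinary size" monitors.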

  • @laaaop
    @laaaop Před rokem

    what's that demo at 10:38?

    • @Coreteks
      @Coreteks  Před rokem

      @LeChuck czcams.com/video/_R-tN9l6onQ/video.html

  • @10001000101
    @10001000101 Před rokem +3

    A $250 7600 with 3070ti performance could devastate NVIDIA.

    • @DaybreakPT
      @DaybreakPT Před rokem

      No it wouldn't, but it would force their hand to launch a 4060 that they wanted to launch as a 4060ti for $300 instead of $500, so this is what competition is good for, better products at lower prices.

    • @DaybreakPT
      @DaybreakPT Před rokem

      If what you said was realistic, NVIDIA's reputation would already be devastated by selling xx60-class GPUs for $500.
      I don't care how much you sell your flagship for, but making 1080p gaming unaffordable isn't good for anyone except their greed.

  • @Superb_Legend
    @Superb_Legend Před rokem +1

    I think its a bet on software that's gonna be extra for very few games like physx was

  • @eeka_droid
    @eeka_droid Před rokem

    loved the video thumbnail hahaha

  • @endersjehuty7721
    @endersjehuty7721 Před rokem +9

    3:57 Jayz two cents

    • @Rod_Knee
      @Rod_Knee Před rokem +1

      Yes. It seemed painfully obvious at the time.

  • @haze2427
    @haze2427 Před rokem +17

    Thank you for mentioning Jayztwocents, he really is a sell out, cant believe how shameless he is.

    • @backupplan6058
      @backupplan6058 Před rokem +2

      Never forget this is the guy who told everyone we should be thankful to them at the price increases of the 2000 series as the 2080 TI is the new Titan, then they release the Super and RTX Titan shortly before the 3000 series.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem +1

      Jay is an EVGA shill !

    • @anonymoususer3561
      @anonymoususer3561 Před rokem +1

      What did he do?

    • @oxfordsparky
      @oxfordsparky Před rokem +2

      @@anonymoususer3561 upset the fanboys.

    • @bojinglebells
      @bojinglebells Před rokem +1

      @@anonymoususer3561 claimed prices wouldn't go any lower during a sale before prices went lower and have kept going lower, among other things

  • @Ivan-pr7ku
    @Ivan-pr7ku Před rokem +26

    The biggest "competitor" Nvidia faces now is the used market of their own products. Ray tracing turned out to be a slow fizzle and FSR 2.0 is breathing down the neck of DLSS. Machine learning is the only remaining island of security for now.

    • @SnivTen
      @SnivTen Před rokem +2

      I literally bought a RTX card for the ray tracing and nothing else.

    • @PrefoX
      @PrefoX Před rokem +3

      FSR 2.0 is very bad in most games. Ray tracing is starting slowly because the console GPUs are not really able to render a lot of rays. Thanks, AMD, for nothing.

    • @Ivan-pr7ku
      @Ivan-pr7ku Před rokem +2

      @@SnivTen By the time we get fully ray-traced graphics engines (and games), the current crop of RTX hardware will be outmoded anyways. The only 100% ray-traced games that run at sufficient FPS are the old RTX flavored Quake 2 shooter and Minecraft. Not very compelling reason to dump a bag of money, for the RTX feature alone.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem

      No worries .......... RDNA-3 is going to beat the RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !
      Just F Nvidia !

    • @josh223
      @josh223 Před rokem

      @@gertjanvandermeij4265 why are you spamming

  • @TK-en2hq
    @TK-en2hq Před rokem +2

    maybe this will be the year I replace my 1070.

    • @carrickdubya4765
      @carrickdubya4765 Před rokem

      You're so lucky . . . I'm still on a 1030 2GB but hoping to get a deal on a 3080 for BF so I can finally play at max settings 🤪

    • @TK-en2hq
      @TK-en2hq Před rokem

      @@carrickdubya4765 NGL I still game on a 1080P monitor so.... graphics card is not on the top of my priority list, even in just my rig.

  • @635574
    @635574 Před rokem

    It's not about whether they make MCD GPUs, but when and how many.

  • @RepsUp100
    @RepsUp100 Před rokem +3

    "gaming revenue" = selling to miners revenue

  • @samcs06
    @samcs06 Před rokem +1

    Just bought an HP Omen 25L during their back-to-school sale. It was only 20 dollars to upgrade to a 3060 Ti. Granted, that isn't the fastest GPU, but for 1100 USD I was able to get an i5 12400, 32GB of RAM, a 512GB NVMe, a 2TB HDD and a 3060 Ti. I feel like that's a hell of a deal.

    • @ko7305
      @ko7305 Před rokem

      Not really... I bought an HP as well, a gaming laptop with a 3060 Ti, an AMD 6800U (8c/16t) and 16GB DDR5, from Micro Center for 799+tax. You overpaid.

    • @samcs06
      @samcs06 Před rokem

      @@ko7305 well damn. I needed a desktop though not a laptop.

    • @ko7305
      @ko7305 Před rokem +1

      @@samcs06 Especially since desktops are cheaper. For a desktop, it's best to build it yourself.

  • @daverhodus
    @daverhodus Před rokem +3

    If the 4090 really is 2X of a 3090, I might be tempted. 4K with ultra settings is tough.

    • @Azurefanger
      @Azurefanger Před rokem +1

      800W for that GPU is madness. The new CPU generation is almost 300W, so practically you will need a 2000W power supply; sometimes the GPU spikes above 1000W and the system can shut down, like at the 3000 series launch.

    • @daverhodus
      @daverhodus Před rokem +1

      @@Azurefanger good point. I'm curious about that too.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem

      No ...... NEVER 2x ! Maybe 60% better TOPS ! RDNA-3 is going to beat RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !

    • @daverhodus
      @daverhodus Před rokem

      @@gertjanvandermeij4265 time will tell.

    • @estring123
      @estring123 Před rokem +1

      It's 1.8x the 3090, at 2x the power consumption,
      so perf/watt has gone backwards.
      LOL. Only spoiled rich kids will buy the 4000 series.
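
Sanity-checking the arithmetic in that comment (the 1.8x performance and 2x power figures are the rumor being quoted, not confirmed specs):

```python
# Rumored figures from the comment: ~1.8x a 3090's performance at ~2x its power.
perf_ratio = 1.8
power_ratio = 2.0

perf_per_watt = perf_ratio / power_ratio
print(f"perf/watt vs 3090: {perf_per_watt:.2f}x")  # 0.90x, i.e. ~10% worse
```

So if those rumored numbers held, efficiency would indeed regress by about 10% generation over generation.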

  • @SethOmegaful
    @SethOmegaful Před rokem

    Jensen jensen, my boi... where is my fucking new gen card?

  • @Herrikias
    @Herrikias Před rokem

    All the black and leather in the world can't hide Jensen's big old belly.

  • @v3xx3r
    @v3xx3r Před rokem +4

    7950xtx with 69 traversal coprocessors.

  • @Tampsey
    @Tampsey Před rokem

    11:00 looks sooo dope

  • @Eternalduoae
    @Eternalduoae Před rokem +6

    So the coprocessor is finally seeing the light of day?!
    Interesting!

  • @kaisersolo76
    @kaisersolo76 Před rokem +6

    Apparently 5800x3d works even better with all this

  • @HolyRamanRajya
    @HolyRamanRajya Před rokem +3

    If Nvidia indeed leased (not sold) GPUs in bulk to miners and has now taken them back, they might as well expunge the paper trail and repackage used GPUs as brand new.
    Memory wear from mining is worse than the effects of processor wear from overclocking/undervolting. The latter will reduce performance a little, but the former will just brick the GPU abruptly one day.

    • @HolyRamanRajya
      @HolyRamanRajya Před rokem +2

      @@Winnetou17 Yes, miners commonly undervolt processors and overclock memory frequencies.

    • @HolyRamanRajya
      @HolyRamanRajya Před rokem +2

      @@Winnetou17 Several reports of 30x0 GPUs with Hynix memory modules failing have appeared since the crypto crash this year. The other makers are slightly more reliable, but even those are expected to fail. Such failures were not reported on this series of GPUs before this year.

  • @himanshusingh5214
    @himanshusingh5214 Před rokem +1

    Lisa, TSMC and Jensen are from Taiwan.

  • @x1000plusx
    @x1000plusx Před rokem +51

    Karma to Nvidia by focusing on miners instead of gamers

    • @Pipit123ify
      @Pipit123ify Před rokem

      Nvidia tried to lock their GPUs against mining, right? It's just that for some cryptocurrencies it was easy to bypass. But then again, maybe they did not really try that hard.

    • @iwritegoodessays9779
      @iwritegoodessays9779 Před rokem +2

      Nvidia took chances. Everyone will always be chasing opportunities. Nvidia saw profits in miners and made a deal with them, and they probably didn't expect that crypto would crash sooner than expected. Sad Nvidia; I'm pretty sure their shareholders are suffering serious emotional damage and probably plan to pull their stock out of Nvidia LMAO

    • @JAT985
      @JAT985 Před rokem +4

      @@Pipit123ify Nvidia introduced their LHR gaming GPUs, but at the same time worked with their AIBs to divert huge sections of their output directly to miners…
      It was a real kind of fuck-you to gamers. Their LHR GPUs were basically all for show.

    • @HCG
      @HCG Před rokem +1

      They’re a business, their goal is to maximize profits, not be your friend. Grow up you naive toddler.

    • @zlac
      @zlac Před rokem

      They're crying right now, and wiping their tears with money all the way to the bank!😁

  • @Valk-Kilmer
    @Valk-Kilmer Před rokem

    This. Even at 4K, my RTX 3090 is having no issues playing games at ultra settings with DLSS and FSR 1/2.0 and anyone who says 'you can't hit 4k 120hz in every game' is wrong. I'm having zero issues.

  • @Aranimda
    @Aranimda Před rokem +2

    As long as FPS per watt efficiency does not improve I have little interest in the RTX 4000 series.
    I prefer cards with good performance and reasonable power consumption. Guideline: 1x 8pin PCI-e power connector needed.
    GTX 1080 > RTX 3070 > ????
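
For context, that one-connector guideline maps to a hard number: per the PCIe spec, a single 8-pin auxiliary connector is rated for 150 W and the x16 slot supplies up to 75 W, so such a card tops out around 225 W of board power:

```python
# PCIe power budget for a card fed by one 8-pin connector (spec values)
SLOT_W = 75        # PCIe x16 slot delivers up to 75 W
EIGHT_PIN_W = 150  # one 8-pin auxiliary connector is rated for 150 W

board_power = SLOT_W + EIGHT_PIN_W
print(f"board power ceiling: {board_power} W")  # 225 W
```

Both the GTX 1080 (~180 W TDP) and the RTX 3070 (~220 W) fit under that ceiling, which is why they satisfy the commenter's rule of thumb.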

  • @timstuart7112
    @timstuart7112 Před rokem +4

    Could it be a RTX accelerator 🤔

    • @spicynoodle7419
      @spicynoodle7419 Před rokem

      RX 7000 is going to have dedicated RT cores rather than just accelerators

  • @omnivos
    @omnivos Před rokem

    I'm still extremely happy with 6800xt 1440p performance. No need to upgrade when already at 80-250 fps ultra all.

  • @jerrywatson1958
    @jerrywatson1958 Před rokem +3

    Well, I just bought a Sapphire Toxic 6900XT LE w/360 rad for $999.99 + $60 tax. As a hybrid-cooled high-end GPU it performs just like the 3090 Ti in my 1440p gaming rig. I've had a 5600X since day one; I haven't seen the need to upgrade as I was always GPU-bound with my 5700XT LC. I hope the new Fidelity FX features will still work with this gen of GPUs. I bought because, after waiting 2 years, the price for this GPU dropped 50%+. It was $2049.99 6 months ago! I hope DirectStorage works with PCIe gen 4 NVMe drives and does NOT require PCIe 5 OR Windows 11! Thanks Coreteks for all your hard work!

    • @donhearn2248
      @donhearn2248 Před rokem

      Good find. When I checked in on the Toxic with the rad it was still around 2k. I think that was at launch in May, but it might have been June.

    • @theplayerofus319
      @theplayerofus319 Před rokem +1

      I had a 5600X with my 3080 Ti at 1440p. My new 5800X3D was a huge performance increase; give it a try.

    • @ognintendo7061
      @ognintendo7061 Před rokem

      Nice, yeah I got a 6900XT Toxic air-cooled for $847. It's a beast of a card. And it undervolts like a champ; I have it running better than stock performance at 200W or less depending on the game. The fans never go above 30% speed lol, can't even hear it.

    • @ognintendo7061
      @ognintendo7061 Před rokem

      @@theplayerofus319 yeah I might upgrade/downgrade from my 5950x to the 3D once it gets a clearance discount in 2 months when zen 4 hits shelves. 250$ is a solid price for it I'd say. Hopefully stock lasts til then

  • @yancgc5098
    @yancgc5098 Před rokem

    4:03-4:18 the shots against JayzTwoCents here lol

  • @Frencho9
    @Frencho9 Před rokem +14

    Well deserved Karma Ngreedia!

  • @estring123
    @estring123 Před rokem +2

    nvidia is by far the biggest bubble in the history of the stock market

  • @russellruby1293
    @russellruby1293 Před rokem +1

    nvidia missed nothing. lisa su and jenseng wong are related by blood. they break holiday bread. a beautiful dance they do.

  • @hakanyucel1639
    @hakanyucel1639 Před rokem

    In Sweden the lowest price on a 3090 is $1578.

    • @GTTwincam
      @GTTwincam Před rokem +1

      About 160-80 less here in Norway for the most affordable cards; the Suprim X 3090 is just about 1500.

  • @evilmarc
    @evilmarc Před rokem

    good for prices

  • @mecha-sheep7674
    @mecha-sheep7674 Před rokem +2

    I hope for AMD's sake that this kraken is not of the Giuliani kind...

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem

      No worries ...... RDNA-3 is going to beat the RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !

  • @OroborOSX11
    @OroborOSX11 Před rokem +1

    Nvidia deserves to suffer for how shitty they’ve treated gamers with their pricing and lack of supply when everyone needed it.

    • @shellshock3243
      @shellshock3243 Před rokem

      And let me guess!! AMD never had a lack of supply during that entire period, the shelves were always full of RDNA2 cards priced at a reasonable MSRP, and they and their AIB partners didn't sell a single card to miners?...
      Get real, dude!!! Both were equally as bad as each other. Some people just like bashing down hard on Nvidia while giving AMD a free pass, just because they're the underdog.

  • @mylittlepwny3447
    @mylittlepwny3447 Před rokem +2

    Always love seeing this channel putting out videos! Probably my favorite leaker. You aren't afraid to stick your neck out there even if you're wrong! I appreciate it

  • @iga4184
    @iga4184 Před rokem +1

    Zen 4 8-core CPU + RDNA3 GPU!

  • @MonkeyspankO
    @MonkeyspankO Před rokem +5

    Interesting. But I see no reason to upgrade for quite some time. Anticipating future tech/prices can leave you holding the bag, no matter what your entry point is. Zero money spent is the best discount. Plus, unless you are a professional/non-gamer user, there are very few games on the horizon that warrant upgrading. The latest Bethesda entry has been pushed into next year, and most other games currently on the market don't push the boundaries in any meaningful way (especially on PC). It's not like the good ol' days of the 2000s, when each year pushed what your system could handle.

  • @astreakaito5625
    @astreakaito5625 Před rokem +1

    I hope Nvidia won't force an even higher price on the 4080, because I want one tbh.
    I actually disagree with you saying a 3090 is that good at 4K if you plan to use ray tracing too; even the 3090 needs DLSS for 4K RT.

    • @princekittipon6510
      @princekittipon6510 Před rokem

      The 4080 Ti usually comes 6 months after the 4080, right?

    • @Warsheep2k6
      @Warsheep2k6 Před rokem

      It always depends on the game. Even a 3070 or an RX 6700 XT can comfortably play most games at high settings in 4K60, no upscaling needed. It only changes if you throw in ultra modes and ray tracing, and honestly ultra vs high mostly ranges from "I barely spot a difference" to "I need a looking glass to find a difference", for a heavy impact on fps and frame times.

    • @DaybreakPT
      @DaybreakPT Před rokem

      @@Warsheep2k6 I'm surprised you can even spot a difference in high vs ultra in any game released after 2015. Let alone when you're gaming and not comparing screenshots.

  • @wwfww
    @wwfww Před rokem

    Nvidia is actually preparing the karen architecture, be prepared

  • @DJ.1001
    @DJ.1001 Před rokem +2

    Next gen will be full of gimmicks from both teams in order to justify higher MSRPS than ever before.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 Před rokem +1

      Yes indeed, just like GAY TRACING ! Not interesting for gamers, only for movie studio's !

  • @marcin_karwinski
    @marcin_karwinski Před rokem +4

    Well, nVidia can also just try to push for ARM desktops with their DC-originating architectures to disturb the Windows gamer market, or they could push for extra features that integrate with their cloud-gaming solutions... after all, nVidia still holds the crown with their workstation/compute GPGPU solutions. I mean, nVidia could very well offer a mobile/laptop GraceHopper-like platform whilst offering a cloud-based option to run software that's not natively supported on ARM, as a somewhat bundled offering, nearly like what Microsoft did with the Xbox Series S and Xbox Game Pass Ultimate... Sell hardware at a loss even, but recoup costs on a co-sold monthly streaming service etc.

  • @jjdizz1l
    @jjdizz1l Před rokem +1

    Sounds like my stock is about to go up!

    • @phillgizmo8934
      @phillgizmo8934 Před rokem +1

      And your capital consists of what: cows, goats or sheep? ))

    • @jjdizz1l
      @jjdizz1l Před rokem

      @@phillgizmo8934 😂 silly man, you literally have no idea 🤣

  • @miyagiryota9238
    @miyagiryota9238 Před rokem +2

    Good let the greed catchup to Nvidia

  • @JTMC93
    @JTMC93 Před rokem +2

    I could see the RTX 4060 or RTX 4050, or the AMD equivalent, possibly catching on akin to how the GTX 1060 did, if only because of the 5-to-7-year cycle of PC upgrades.
    Edit: I can also see it meshing with a possible switch towards pure-AMD and maybe even pure-Intel systems. Assuming Intel can ever pull their heads from their arses and clean the shite from their ears to actually get some proper communication and work done.

  • @annamaria-pv1fw
    @annamaria-pv1fw Před rokem

    great content

  • @ArafinWasHere
    @ArafinWasHere Před rokem

    1000 series , buy .
    2000 series , skip .
    3000 series , buy .
    4000 series , you know what to do .

  • @unreconstructed9172
    @unreconstructed9172 Před rokem +1

    Considering miners won't be buying a bunch of cards anymore, so AMD and Nvidia can't cater to them, I wouldn't buy anything until the price for the top end is less than $800 and the price for the lowest end is $200 or less. Prices need to normalize. Miners buying at any cost is not a variable now; it's a buyer's market. Proving that the "chip shortage" was not the reason for costs, but rather Nvidia and AMD limiting actual production to keep prices up, should provide enough reason to hold out and hit THEM in the pocket for a change.

  • @mesaber86
    @mesaber86 Před rokem +2

    If people buy used 3000 series GPUs they must be insanely dumb. These cards have been run 24/7 by miners since they bought them.

    • @oxfordsparky
      @oxfordsparky Před rokem

      you must be insanely dumb to think that all rtx30 gpu's are owned by miners.

  • @kinnay999
    @kinnay999 Před rokem

    Gives me a Mantle API vibe.

  • @PKG12341
    @PKG12341 Před rokem +6

    Always appreciate the deep dives, Thanks Coreteks !

  • @TopAnime-kn1xr
    @TopAnime-kn1xr Před rokem +1

    Poor NGreedia.. I feel sooo bad for them.
    Let's all buy their latest card, the GTX 1630 (even slower than a GTX 1050 Ti), to support NVidia.

  • @apieceofcrap8794
    @apieceofcrap8794 Před rokem +1

    4:00 calling out jayztwocents i see

  • @anon_forever
    @anon_forever Před rokem

    I wonder if Apple has some synergy oriented plans in the future, given that they also cover all components of the trifecta that you mention.

  • @moldoveanu8
    @moldoveanu8 Před rokem

    Nvidia, thank you for the high prices and fire risks. That's all I needed to make my next GPU decision much easier 😂