AMD’s Actually RELEASING THESE CPUs!

  • Published 15. 05. 2024
  • AMD is actually releasing these, RX 8900 XTX could have been amazing, Intel responds to wild CPU issues and bad news for Nvidia GPUs!
    ***Items featured in this video available here***
    ►Newegg (Affiliate): geni.us/newegg1
    ►Amazon US (Affiliate): amzn.to/3b9UjKB
    Join The Discord: / discord
    Twitter: / gamermeld
    Facebook: / gamermeld
    Gamer Meld Merch: teespring.com/stores/gamermeld
    Gamer Meld Sponsors: www.gamermeld.com/sponsors
    Support Me On Patreon: / gamermeld
    SOURCES:
    wccftech.com/nvidia-geforce-r...
    videocardz.com/newz/amd-cance...
    forums.anandtech.com/threads/...
    www.chiphell.com/thread-26014...
    www.tomshardware.com/pc-compo...
    www.igorslab.de/en/intel-rele...
    www.digitaltrends.com/computi...
    videocardz.com/newz/amd-repor...
    www.extremetech.com/gaming/am...
    TIMESTAMPS
    0:00 Bad News For Nvidia GPUs
    1:08 Intel Discusses CPU Issues
    2:57 AMD's Actually Releasing These
    4:17 RX 8900 XTX
  • Science & Technology

Comments • 279

  • @Meoknet
    @Meoknet 16 days ago +50

    Hardware Unboxed has pulled back up an interview where Dr. Ian Cutress asked Intel about the loose power limits for board makers, and Intel said it's all in spec. Now that they're blaming the board makers for being out of spec, we have the interview to prove they're lying. They've been more than happy to push power limits up so they can compete better in benchmarks, but the minute it causes issues, they backstep responsibility.

    • @treywesley958
      @treywesley958 16 days ago +9

      1000% smh. Intel knew exactly what it was doing and was completely fine with it until it blew up in their face.

    • @Lue1337
      @Lue1337 16 days ago +7

      "As long as the CPU is not dead (yet), it is in spec."

    • @jimmyjango5213
      @jimmyjango5213 15 days ago +3

      classic intel

    • @GamerMeld
      @GamerMeld 15 days ago +11

      It's why I haven't really been going after board makers. I definitely put more of the blame on Intel.

    • @georgejones5019
      @georgejones5019 15 days ago +2

      This. Intel hasn't been forthcoming with board makers on the PL1 and PL2 limit specs.

  • @Bakachu7
    @Bakachu7 16 days ago +24

    As much as I would love to see AMD release a beast GPU, I'm much more interested in $200-300 cards that provide generational uplift.

    • @codeymorganti
      @codeymorganti 16 days ago +5

      I get the feeling that's never gonna happen again, greed ruins all.

    • @warrenpuckett4203
      @warrenpuckett4203 15 days ago +1

      I retired and had to get off the gaming roller coaster. Not for lakawanna. Can't work overtime anymore. So now I just go for cool and quiet.
      Put the R9 390 on the shelf in reserve. Swapped in and just used the old HD 7790. Pretty much a 3800X APU on an ASRock Steel Legend and 32GB of 3600.
      Slowed it all down, CPU clock and memory clock. Pretty much just what a Ryzen 5 7600 will do now, for 1/3 the price. A WiFi board is still a bit of $, but not as much as 2015.
      The hard part is finding an AMD mobo with a video port.
      I also had to buy a new car in 2016, because my retirement ride (2005 7-passenger T-Blazer) got stolen. Probably ended up in the UAE or Pakistan. May want front and rear A/C there.
      That did not help either. Also my IRA took a BIG hit in 2021 to 2023. It did not recover.
      But I can play my old CDs and Bluetooth them to my speakers in the house. Oh, and the 32GB of RAM is a bunch cheaper.
      So if you just need a desktop? You can buy a monitor, moboard, RAM and APU for less than I paid for the 27-inch monitor I bought in 2012 for $700.
      Because more than a 75Hz refresh is not needed.
      I wish I had bought AMD stock in 2012 instead.
      Kicked M$ to the curb and just use Linux, because I can't afford the software and hardware upgrade cycle any more.
      I also don't need to change out the old AMD 3800X. Still has more than enough oomph to use as a Linux desktop.
      Hope for the best and plan for the worst. This current cycle is not over. Lived through 3 of them.

    • @patrikmiskovic791
      @patrikmiskovic791 13 days ago

      Same goes for 90% of people.

  • @sureberferber9101
    @sureberferber9101 16 days ago +128

    Gamer Meld: AMD’s Actually RELEASING THESE CPUs!
    Gamer Meld Tomorrow: INTEL'S Actually RELEASING THESE CPUs!
    Gamer Meld Wednesday: AMD’s Actually RE-RELEASING THESE CPUs!
    Gamer Meld Thursday: INTEL'S Actually RE-RELEASING THESE CPUs!
    Me: 🤨😐😑

    • @tazverr
      @tazverr 16 days ago +37

      I can't stand this guy; all I feel is confused. I'm just about to block his channel.
      Every. Single. Day. is clickbait.

    • @silfrido1768
      @silfrido1768 16 days ago +9

      I usually just go to the part that I'm interested in the most.
      I'm only here for the AMD CPUs :)

    • @NickyNiclas
      @NickyNiclas 16 days ago +6

      This channel's main purpose must be killing braincells. I also can't tell if the narrator is AI or not.

    • @nathanbutcher7720
      @nathanbutcher7720 16 days ago +3

      IT'S OVER!

    • @escape808
      @escape808 16 days ago +3

      @@NickyNiclas Sounds like a trained voice, so technically yes, it's AI.

  • @chriswright8074
    @chriswright8074 16 days ago +42

    X3D wasn't just for gaming; Epyc already had X3D.

    • @trixter192
      @trixter192 16 days ago +11

      Exactly this. X3D came from Epyc.

    • @gamamew
      @gamamew 16 days ago +6

      Just remember the consumer 5900X3D, 5950X3D, 7900X3D and 7950X3D; those are not very good at gaming for the price, but can benefit certain production workloads.

    • @dragonproductions236
      @dragonproductions236 16 days ago +1

      @@gamamew The 7900X3D series performs just like the 7800X3D if you set it up correctly, because it's just a 7800X3D with extra cores grafted on. The 900/950 are bad price-per-performance, not bad performance.

    • @SpoonHurler
      @SpoonHurler 16 days ago +2

      A number of scientific programs/computations love the extra cache. I think many of us forget that computers are used for stuff other than games and content creation.

    • @SidneyCritic
      @SidneyCritic 16 days ago +1

      @@gamamew I think it's all their dual-chiplet designs that are not good for games, i.e., the single-chiplet chips are the fast ones. It might be a scheduling/communication problem between chiplets, because games don't use many cores.

  • @Mark_Williams.
    @Mark_Williams. 16 days ago +19

    4:00 - Incorrect, 3D V-Cache doesn't span chiplets. The 5800X3D and 7800X3D are single-CCD processors and the V-Cache helps them well enough. It helps games most often because their runtime code is usually small enough to fit in an L4-sized cache, thus negating the need to take a latency hit when going out to RAM.

    • @GamerMeld
      @GamerMeld 15 days ago

      How am I incorrect? I never said it’s on both chips. I literally reviewed their newest X3D chips lol.

    • @Mark_Williams.
      @Mark_Williams. 15 days ago +3

      @@GamerMeld In that case the wording was poorly chosen. "Cross-chiplet" or inter-chiplet means between two or more chiplets; that's what those prefixes mean, so you implied between chiplets. What you should've said instead is "intra-chiplet", meaning within a single chiplet, or stuck to CCD terminology to prevent confusion.
      Also, it's not strictly there to boost CPU-to-CPU communication per se, as you also implied; in fact it does nothing for peak bandwidth between them. All it does is provide a bigger bucket of cache, allowing all CPUs on that CCD to benefit from not having to take the latency and bandwidth hit of reaching out to DRAM. Yes, different CPUs can store and retrieve data via the L3, skipping DRAM, but that's a side effect of its purpose: keeping highly requested stuff close to the cores. Think of it as a memory-interface-saving feature. Memory bandwidth is often the bottleneck these days, hence why keeping more data on die is so beneficial in applications that can fit most/all frequently used code in cache (like games).
      I just realised I misspoke about L4; it's seen as L3.

    • @CVE42287
      @CVE42287 15 days ago +2

      @@GamerMeld Dude, just take the feedback. Stop acting like you're immune to mistakes, because one look at a couple other videos of yours disproves that relatively quickly.

    • @RobBCactive
      @RobBCactive 5 days ago

      @@GamerMeld Wendell of Level1Techs talks about numerical and engineering design calculations that are sped up massively on Milan-X; a package run is set to use 1 CCD and can scale really well because of the cache.
      Big data sets like more cache, as both bandwidth and latency are much better than DRAM. A programmer doesn't "use" it; it's simply that processing blocks of data fitting in L1 cache, then L2 and L3, is much faster.
      Games often get low IPC because of unpredictable memory access leading to memory stalls. Predicted memory access has pre-fetch of the cache line.
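The cache points made in this thread (a bigger L3 raises the hit rate on a workload's working set, so fewer accesses fall through to DRAM) can be illustrated with a toy LRU-cache simulation. This is a sketch only: the `hit_rate` helper, cache sizes, and access trace are made up for illustration, and real L3 caches are set-associative with hardware prefetch.

```python
from collections import OrderedDict

def hit_rate(cache_lines, accesses):
    """Replay an address trace through a simple LRU cache and
    return the fraction of accesses served without going to 'DRAM'."""
    cache = OrderedDict()
    hits = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(accesses)

# A game-like loop: the same 96-line working set touched over and over.
trace = list(range(96)) * 100

small = hit_rate(32, trace)   # cache smaller than the working set
large = hit_rate(128, trace)  # cache big enough to hold it all

print(f"small cache hit rate: {small:.2f}")
print(f"large cache hit rate: {large:.2f}")
```

With the cache smaller than the working set, a cyclic trace defeats LRU entirely (every access misses after warm-up); once the cache covers the working set, nearly every access is served on-die, which is the effect the extra V-Cache provides.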

  • @nadtz
    @nadtz 16 days ago +8

    There are already X3D Epyc chips, Milan-X and Genoa-X. Gaming isn't the only workload that benefits from large L3 cache; Phoronix has done some pretty good deep dives on the server workloads that benefit from it.

  • @averagenoob2.0
    @averagenoob2.0 16 days ago +76

    No one is buying a 4060 Ti.

    • @dungeondeezdragons4242
      @dungeondeezdragons4242 16 days ago +19

      Very unfortunately, you are wrong. There are a lot of people who are uninformed or are fanboys.

    • @einstien2409
      @einstien2409 16 days ago +5

      It is literally one of the most popular GPUs in gaming systems based on the Steam hardware survey.

    • @einstien2409
      @einstien2409 16 days ago +5

      @@dungeondeezdragons4242 Or they are professionals who use CUDA, which AMD's dog shit GPUs don't support or even have a proper alternative to. (Nvidia got it running much earlier than AMD, and it will take AMD a lot of time to catch up, so I hope they do, cuz it's tiring to see Nvidia dominate this side of the market.)

    • @einstien2409
      @einstien2409 16 days ago

      @@dungeondeezdragons4242 P.S. It's not a hardware issue, it's a software issue.

    • @manitoba-op4jx
      @manitoba-op4jx 16 days ago +7

      @@einstien2409 My 1060 is the last Nvidia card I will ever own; I'm switching to Intel.

  • @GustavoNoronha
    @GustavoNoronha 16 days ago +8

    4:02 No! It is useful in gaming because it effectively increases the memory bandwidth as perceived by the game: the cache can hold more data, so fewer cycles are wasted waiting for RAM loads and stores. And there are several professional and scientific applications which also benefit from this increased effective bandwidth; that's why you also see Milan-X and Genoa-X be a big deal in HPC workloads. If you are doing any sort of physics simulation, for instance, the additional cache is a huge boost.

    • @BestFriendOfDog
      @BestFriendOfDog 16 days ago

      This. I only wish I could afford a server-grade CPU for my workloads. I am one of those users who follows this tech news because I can only afford 7950X/5950X CPUs.

    • @GamerMeld
      @GamerMeld 15 days ago

      That's exactly what I was saying lol. You realize I reviewed these chips, right? I discussed this exact thing in that video. I also discussed the release of all of those Epyc CPUs. I'm more referring to it being on a consumer system. Why do I have to go through every detail of everything every time just so I don't have 100 comments trying to "prove me wrong"?

    • @GustavoNoronha
      @GustavoNoronha 15 days ago +2

      @@GamerMeld You literally said "3D V-Cache is only really used in gaming because it makes cross-chiplet communication much faster", which is not the case and not what I said above.
      I'm not doing the "prove me wrong" dance; I'm just trying to contribute a better understanding of the feature and why it'll be useful on Epyc AM5 CPUs: applications that require more memory bandwidth, like simulation workloads, for instance.

    • @BestFriendOfDog
      @BestFriendOfDog 15 days ago +1

      @@GustavoNoronha Yeah GamerMeld, own your mistakes.

    • @itsTyrion
      @itsTyrion 13 days ago

      @@GamerMeld It might be what you MEANT, but well..

  • @newyorktechworld6492
    @newyorktechworld6492 16 days ago +18

    RX 8900 CE Clickbait Edition.

  • @bamboostick5
    @bamboostick5 16 days ago +7

    This guy is clickbait

  • @jamegumb7298
    @jamegumb7298 16 days ago +2

    EPYC 4004 is only interesting if they have a way for people to use a load more lanes; that is what is really lacking. And one step up you're buying Threadripper, with huge TDP, a massive pricey motherboard, and more cores than I want at a lower frequency than I want.
    It would mean a new chipset and/or board though, for that sweet 4/5× 4× NVMe drives, finally, maybe dual E-key on top.
    I would not mind a *short term* 200W+ boost for a 4004 in 8, 12 and 16 core versions, iGPU of course.

  • @nsday1
    @nsday1 16 days ago +5

    Intel didn't say anything before because it wasn't a problem before. But they have been having more and more difficulty reaching maximum performance in trying to keep up with AMD, and have lost the crown in that department. In the past it wasn't an issue because their CPUs could more or less handle it, but the latest generation is already pushed to the max, so maintaining that kind of pressure takes its toll on the CPUs. Also, keep in mind this isn't happening to brand-new CPUs, but to CPUs that are a few months old at least, and more than likely, being K-series chips, ones the customers have pushed further. You can buy a Porsche and it can go fast, but it's not going to last if you have it redlined all the time.
    A much better solution, while it takes a bit of effort, is to find the undervolt that lets you maintain Intel's clock speeds without thermal throttling. I have a 14900K set to 1.35V, LLC 5 (MSI), and the P-cores untouched. I can run Cinebench R23 without it being thermally throttled, and get higher scores than all those that are "overclocking" their CPUs.

  • @miguelcollado5438
    @miguelcollado5438 16 days ago +2

    3D V-Cache is excellent in servers and data transfer, btw drive bus and other items, like HBAs etc... The greater the cache capacity and PCIe lanes, the more efficient the CPU.

  • @joncroft2857
    @joncroft2857 16 days ago +11

    AMD has not come out and openly said that they are not making high-end GPUs.

    • @betag24cn
      @betag24cn 16 days ago +2

      They sort of did and didn't last year. Let's hope it is wrong, but I only expect 600, 700 and 800 class this time, no 900 class.

  • @pweddy1
    @pweddy1 16 days ago +2

    There are server chips with 3D V-Cache. It's not a surprise that they would release 16-core chips with 3D V-Cache for SERVER/WORKSTATION use.

  • @auritro3903
    @auritro3903 16 days ago +2

    This is literally your second video on the exact same topic.

  • @dawienel1142
    @dawienel1142 14 days ago +1

    Correction: 3D cache being good for gaming is a happy coincidence.
    3D V-Cache was built to accelerate specific enterprise workloads in Epyc CPUs.

  • @royboysoyboy
    @royboysoyboy 16 days ago +1

    3:59 I think the "cross chiplet communication" part might be wrong, because then why does the 7950X with 2 CCD chiplets perform worse, sometimes way worse on 0.1% lows, compared to the 7800X with 1 CCD chiplet?

    • @GamerMeld
      @GamerMeld 15 days ago +1

      It’s more about not having to cross the chiplets. I somewhat said that wrong. It helps with the issue of moving from one chiplet to the other.

    • @royboysoyboy
      @royboysoyboy 15 days ago +1

      @GamerMeld thanks for the response. Just wanted to highlight that 3d v cache is very useful even when you don't have multiple chiplets doing operations. Thanks for trying to clarify!

  • @johnscaramis2515
    @johnscaramis2515 15 days ago

    4:10 3D V-Cache does not work on Zen 4c chiplets, as part of the higher density came from removing the contacts for the 3D V-Cache.
    But they need to have cache, otherwise the performance of those 32 cores would be crippled by lack of RAM bandwidth, considering that AM5 has only two memory channels. And looking at the pinout of AM5, there's no room for an additional one or two RAM channels.
    So those chips will need cache to perform well. I assume that AMD (assuming those chips actually get released) will introduce L4 cache, either as an additional chiplet on the carrier or maybe as a chiplet on top of the IO die.

  • @WilsoYUN0DI3
    @WilsoYUN0DI3 16 days ago +3

    Eh, I built a PC a few years ago; not gonna upgrade until Zen 5 CPUs and the 50 series have been around for about a year. Don't see the point in upgrading for a 50% increase in performance when I can wait a year and get better lol.

    • @UltimateGattai
      @UltimateGattai 16 days ago +2

      I'm still using a 1st-gen Ryzen mobo. I'm waiting for AM6 boards before I upgrade; the performance increase is going to be wild.

    • @WilsoYUN0DI3
      @WilsoYUN0DI3 14 days ago

      @@UltimateGattai Is there an estimated date for this?

  • @johnscaramis2515
    @johnscaramis2515 15 days ago

    4:02 Nope, 3D V-Cache does not make cross-chiplet communication faster. Additional cache only has one purpose: reduce the amount of communication with slow(er) storage "outside". The more local RAM/cache you have, the less you have to communicate with higher cache levels or, in the worst case, RAM. Additionally you can load bigger chunks from memory, avoiding latency penalties by avoiding individual memory accesses and using burst transfers from RAM (not sure if the term still exists with modern DDR memory).

  • @gamingevolved760
    @gamingevolved760 16 days ago +2

    I have a 13900K, and what I did is lock the CPU to 5.5 GHz all-core and set 253 W in the BIOS, and no more crashes 👍😀

  • @Anonymous-sb9rr
    @Anonymous-sb9rr 15 days ago +1

    As far as I know, there's no Zen 4c with 3D V-Cache. So it's either 16 cores with 3D V-Cache, or 32 cores without.

  • @dragonl4d216
    @dragonl4d216 16 days ago +16

    Definitely able to wait things out. My RX 6800 with 16GB of VRAM will last me a long time. Hardware is no longer like the start of the millennium, when it got obsolete every 2 years. Heck, even the subpar-spec PS4 lasted a good 8 years. Calling it now: my GPU will be fine at 1440p until at least 2028.

    • @GForceIntel
      @GForceIntel 16 days ago +2

      Keep dreaming. There's no way that will last that long. Sony will definitely announce their new console in 2027, and by then your GPU will be extremely obsolete as games adopt Unreal Engine 5 heavily by the end of this gen. Right now your GPU and CPU can't even handle most of those games with the little complexity they have been able to apply using that engine.

    • @Gen0cidePTB
      @Gen0cidePTB 16 days ago +2

      ​@@GForceIntel This is correct.
      Unreal 5 is going to force the market to push the performance envelope.

    • @fartcruncher98
      @fartcruncher98 16 days ago +3

      You'll definitely be fine for a while, but not at 1440p. Might have to drop down to 1080p in a few years.

    • @UltimateGattai
      @UltimateGattai 16 days ago +3

      @@GForceIntel I'm using a 7-year-old GPU that is on par with a baseline PS5. I'm pretty sure he can get more than 2 years out of that GPU; a PS6 probably won't beat his card anyway.

    • @devjitdutta5670
      @devjitdutta5670 16 days ago +1

      @@GForceIntel Maybe for GPUs with less than 16GB VRAM. It all depends on the games you play, the resolution, the settings and the frames you want. For example, I play games at 1440p with High settings, and my 7900 XT will last me the next 5 years if I want to play at 1440p High at 60fps (for most games).

  • @systemBuilder
    @systemBuilder 16 days ago

    Actually, 3D V-Cache was developed for EPYC and then ported over to Ryzen. With so many EPYC cores, the cache in the interconnect fabric is very important to reduce main-memory bottlenecks; 8-way interleaved memory just isn't enough.

  • @rogerthomas7040
    @rogerthomas7040 15 days ago

    X3D cache does not improve inter-chiplet communications. The aim of such cache is to reduce the need for communication across chiplets and to/from memory. This is the reason it is found on top-end EPYC processors, and the same reason it would make sense on the processors you are talking about: that many cores on an AM5 platform need as much help as possible to limit the amount of communication over the limited-width memory bus. One key thing is that Zen 4c chiplets do not support X3D cache, so only chips using Zen 4 (full fat) can have the extra cache added.

  • @camiam119
    @camiam119 16 days ago +3

    Just got my 4080 Super FE. I haven't put it in yet, but I'm happy I got it when I did.

    • @alejandroz1606
      @alejandroz1606 16 days ago

      I've never been able to get an FE XD

    • @camiam119
      @camiam119 16 days ago

      @@alejandroz1606 I ordered it right from Nvidia's website. Is it normally out of stock?

    • @b3as_t
      @b3as_t 16 days ago +2

      How much did u pay for it lmao? You can get an XTX for 750 USD with Newegg's deal.

    • @camiam119
      @camiam119 10 days ago

      @@b3as_t Purposely got the 4080 over anything else. I only spent $1000, but I really wanted an Intel and Nvidia PC. Personal preference.

  • @geofrancis2001
    @geofrancis2001 16 days ago +1

    This all goes back to the MCE enhancements that came out with the 8700K.

    • @Gen0cidePTB
      @Gen0cidePTB 16 days ago

      MCE as in Machine Check Error? Got a link to share? I RMA'd a 3930K over that error, and Intel actually called me up and asked what I was doing with the chip, because they were amazed I broke it that way. But that would have been before the enhancements you mentioned.

    • @geofrancis2001
      @geofrancis2001 15 days ago +1

      @@Gen0cidePTB No, Multi-Core Enhancement; it's where motherboards would lock the CPU to its max boost speed on all cores.

  • @BestFriendOfDog
    @BestFriendOfDog 16 days ago +1

    X3D chips are not just useful for gaming. There are workloads, such as FEM, where lots of cache can be really useful, especially for small models or problems that can be solved in parallel. It speeds up the assembly of matrices in mathematical problems. AMD had some slides on their previous servers with 3D cache where this was used in simulations.

    • @GamerMeld
      @GamerMeld 15 days ago

      I understand that. I'm more referring to on a consumer board, but I get that these are made more for that.

    • @BestFriendOfDog
      @BestFriendOfDog 15 days ago

      @@GamerMeld Getting this on a consumer board would be terrific. A lot of young researchers who can't afford a Threadripper workstation would love this. FEM software is about 25K-50K USD, 5K-10K USD for an academic version; add to this a 10-50K workstation.

  • @BIGBASSSAMA_4
    @BIGBASSSAMA_4 16 days ago +1

    Percentage-wise, how much of a boost will ray tracing performance get with the 8900 XTX?

    • @meneldil7604
      @meneldil7604 16 days ago

      It's ray tracing, who cares? I'd rather have high fps.

  • @25myma
    @25myma 16 days ago +1

    Epyc on AM5 is a smart move; not everyone uses their PC for gaming, but not everyone has $5K+ to spend on a workstation system. DDR5 also allows for 256GB quite easily, so yeah, why not.

  • @edwxx20001
    @edwxx20001 16 days ago +1

    Doubling core count in GPUs does not double performance. There is a lot we don't know about the performance characteristics of MCM multi-GPU cores, but we can expect them to be less performance-efficient than current monolithic designs. For all we know, they worked on a model that doubled the cores, increased the silicon massively, and only performed 20% to 30% better. In that case it's not ready and you scrap the design.
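The sub-linear scaling the comment above describes is essentially Amdahl's law. A minimal sketch, assuming (purely for illustration, not an AMD figure) that 80% of the frame time scales with compute-unit count:

```python
def amdahl_speedup(parallel_fraction, n):
    """Amdahl's law: overall speedup from n-fold scaling when only
    parallel_fraction of the workload actually scales with the extra units."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n)

# Doubling the CUs (n=2) when only 80% of the work scales with them:
print(round(amdahl_speedup(0.8, 2), 2))  # 1.67, nowhere near 2x
```

Even with 80% of the work perfectly scalable, doubling the compute units yields only about 1.67x, and cross-die overheads on an MCM design would eat into that further.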

  • @pvalpha
    @pvalpha 16 days ago

    Yeah, increasing the CU count like that would *not* have been a linear increase in performance. And we know they had chiplet multi-die GPU issues, and this would have required multiple GPGPU dies to work together. I'm figuring their drivers just couldn't hack the chiplets together for gaming the way they can for CDNA stuff, so they're spending another cycle on that.

  • @joseavalos9988
    @joseavalos9988 16 days ago +1

    50% more CUs is not the same as 50% more performance!

  • @smokey1234
    @smokey1234 16 days ago

    You should definitely tone down your volume level; each time I open a video of yours it's always twice the volume of everything else...

  • @gavingi5875
    @gavingi5875 15 days ago

    I'm terminally ill... I would have loved my last GPU... currently putting together a frivolous last PC... I want more than my 3090... oh well, I can only buy what my budget and timing will let me buy. I cannot wait till next year etc.

  • @tendosingh5682
    @tendosingh5682 14 days ago

    I miss the days when tech news reporting was about getting straight to the point and providing quality, accurate information. Now it's just about hype. The sad part is that YT even rewards negative feedback, so it's better to incite "conflict" with the clickbait.

  • @ram64man
    @ram64man 16 days ago

    Why the supply issues? People are forgetting TSMC was forced to shut down for a week after the major earthquake for safety checks. Furthermore, shipping from the area was also affected: due to the national emergency, loading and unloading was for emergency use only, and regular shipping had to go through other ports. Logistics suffered too, as suitable ships were last in the queue or waiting to move from the other side of the island, and it took days to clear the mountain passes and restore railway connections. Give it 6 weeks and everything will be back to normal on chip supplies; unless China invades, then we're all screwed.

  • @fxandbabygirllvvs
    @fxandbabygirllvvs 15 days ago

    3D V-Cache works for AI also.

  • @ManuFortis
    @ManuFortis 15 days ago

    As far as the possible GPU releases for AMD go, I'm fine with them sticking to midrange for now. As much as I like having high-end gear, the last time I went for it I ended up having to use it for nearly 10 years before it was justifiable to buy something new. Some would say sooner perhaps, but I like to get every penny out of my GPUs if I can. Since then, I've gravitated towards a more fps/power/$ balanced approach. And this is basically where midrange lies as well, if you aim for the higher end of it.
    What I suspect will happen is that after they release these iterations, they will eventually do a refresh (as is usual), but those refreshes will have the 180 CUs instead of 144. This will probably be done at a point when their current 'midrange' is falling behind and they need another boost in performance. And probably again later with the further 200+ CUs. How far apart those releases occur probably depends on how long they figure they can slow down before they have to pick up the pace again and start targeting the higher range.
    And the thing is, if that is the case, I'm all for it. I personally believe we have been pushing a little too far and fast with GPU and CPU performance lately, not focusing on the security issues with CPUs, for instance, and apparently, with Intel, other issues. They need time to let some of these products cook a bit longer before they release them. Work out the bugs they find. Etc. But people push harder and harder for faster and better, and for what? More power usage? Unstable rigs? At a certain point there needs to be a slowdown so that things can 'catch up', so to speak, on the software side and the user side as well. For instance, the constant increase of memory bloat due to software developers having no reason to actually optimize for lesser systems. Why would they? Something better is always right around the corner. That's not necessarily a bad thing, constantly having improved hardware down the road. But perhaps we should stop going 100mph over the speed limit, ya know what I mean?

  • @BReal-10EC
    @BReal-10EC 16 days ago

    I think the 3D V-Cache is actually good for AI also.....

  • @eskimolost2012
    @eskimolost2012 15 days ago

    An 8900 XTX/9900 XTX would have been a real 5090 killer in price-to-performance.

  • @vineetkumarbharti2633
    @vineetkumarbharti2633 16 days ago

    Next video: @AllTheWatts leak - Navi 50, 144 WGP, 288 CU, 512-bit GDDR7.

  • @toastsniffer
    @toastsniffer 16 days ago

    Don't overclock the higher-priced overclockable CPUs?

  • @ToriksLV
    @ToriksLV 16 days ago

    Still not using dark mode, I see.

  • @l3v1ckUK
    @l3v1ckUK 16 days ago

    I remember back in the Athlon 64 days, AMD released one Opteron server CPU on the consumer Socket 939.
    It was an overclocking beast.

    • @GForceIntel
      @GForceIntel 16 days ago

      I've seen multiple videos showing that overclocking just increases heat production in a CPU with little to no performance gain.

    • @Gen0cidePTB
      @Gen0cidePTB 16 days ago

      @@GForceIntel I read that after a point, power leakage leads to dropped clocks (where the CPU skips a clock because power hasn't completely drained from the last clock), but there's still a performance benefit; it's just not as much as it should be.

    • @PineyJustice
      @PineyJustice 16 days ago +1

      @@GForceIntel Lol, that's absolutely incorrect. Overclocking back in the day brought huge gains; overclocking HyperTransport and RAM could give you 20% or more performance, then you could OC the CPU from 2GHz to say 2.6GHz for another 15-20% on top of that. Nowadays CPUs come out of the box fully overclocked.

    • @GForceIntel
      @GForceIntel 16 days ago

      @PineyJustice I was talking about today, and they don't come overclocked out of the box. If they did, people wouldn't be so eager to overclock an overclocked CPU.

    • @Gen0cidePTB
      @Gen0cidePTB 16 days ago +1

      @@GForceIntel Having power limits removed would be a form of OC, similar to PLL overclocks or increasing power limits on a GPU.

  • @royboysoyboy
    @royboysoyboy 16 days ago +1

    Time to buy an intel arc card

  • @herrdoctorsloth9801
    @herrdoctorsloth9801 16 days ago

    Big sad about the RX 8900; have been using AMD since my 486DX66 days.

  • @sertocd
    @sertocd 16 days ago +1

    Intel POR is good enough; you don't need Intel Baseline. (Gigabyte mobo)

  • @ThisOLmaan
    @ThisOLmaan 16 days ago

    YEAH MAN, coulda, shoulda... but AMD just wants to stay ahead on AI. But what do I know.

  • @s7r49
    @s7r49 16 days ago

    RTX cards are not selling well at any of our stores. I doubt increasing prices is the answer. They probably aren't ordering as many, since everyone who wants one already has one.

  • @sleepingvalley8340
    @sleepingvalley8340 16 days ago

    Honestly, I am waiting for the Strix Point APUs to come out. It's obvious AMD is refusing to compete at the top end at this point, since they don't have the budget to do that against Nvidia, so they are moving away from high-end gaming and into SFF, consoles and mobile devices, which, with ARM starting to become a thing, doesn't surprise me.

  • @jamescorey3516
    @jamescorey3516 15 days ago +1

    What a fancy way for Intel to say: "We overclocked the CPUs to make our performance look better."

  • @ToallpointsWest
    @ToallpointsWest 16 days ago

    AMD's GPU division: snatching defeat from the jaws of victory.

  • @LastRightsTV
    @LastRightsTV 15 days ago +1

    AMD is the king of what might have been. We have no idea why it was canceled, probably stability or latency. So it would have been a POS no one wanted, and those who bought it would have regretted it.

  • @blackmage4100
    @blackmage4100 Před 15 dny

    There was news a while back that Nvidia is reducing the supply of RTX 4000 chips in anticipation of RTX 5000 release. Presumably to prevent a repeat of the RTX 4000 launch when there were still lots of RTX 3000 GPUs clogging up inventory. So no surprise that the supply of RTX 4000 GPUs will reduce. No big loss IMHO, these cards are pretty meh in terms of value :)

  • @stonedoubt
    @stonedoubt Před 16 dny

    I have a beast Core i9 13900k machine. The issue with these is not the chips. It’s the motherboard manufacturers.

  • @kkendall99
    @kkendall99 Před 15 dny

    I'm not bummed about my video games not being more amazing. Quite the opposite, I'm very excited to see AMD show Nvidia some competition in the AI space.

  • @eggsdooley
    @eggsdooley Před 16 dny

    I’m making plans to build computers out of used parts for disadvantaged kids. I’m only an amateur builder. What would be the best GPU / CPU combo to keep the build under $1000?

    • @Martin-wj6um
      @Martin-wj6um Před 16 dny

Maybe something based on a 5000-series AMD CPU and a Radeon 6700 XT or higher.

    • @deusexmachina9743
      @deusexmachina9743 Před 16 dny

      Under 1000? That's a huge budget for disadvantaged kids

    • @eggsdooley
      @eggsdooley Před 14 dny

      @@deusexmachina9743 But….like, do they deserve a worse computer because they’re disadvantaged? The idea is to make them less disadvantaged, not to disadvantage them more. I want them to be able to play games at a level that is like other kids.

    • @eggsdooley
      @eggsdooley Před 14 dny

      @@Martin-wj6um Oh interesting, maybe the 5700x? I was thinking NVIDIA but I will get better value for money on used Radeon cards right? Maybe there is less chance they would have been used for bitcoin mining…..

    • @Martin-wj6um
      @Martin-wj6um Před 14 dny

      @@eggsdooley Yes.
Mining does not really stress the GPU all that much; it stresses the memory the most. The GDDR6X found on Nvidia cards tends to be more sensitive, HBM cards even more so.
The 6800 and 6800 XT now seem like a very good deal. And yes, they have been used a lot for mining in the past because they have 16 GB of memory. Miner cards have far fewer heat cycles, but check the fans... sometimes they run them on max all the time. I would not be running from mining cards, tbh :)

  • @amante762
    @amante762 Před 16 dny

    I'm not bummed about the 8900XTX, I would have had to hear all of you complain about how expensive it is for a year.

  • @jimdob6528
    @jimdob6528 Před 16 dny +1

    I wouldn’t be surprised if amd decides to release that massive gpu later once they see what nvidia can do to “beat them”… would be a classic fumble type crap they would do honestly. All 3 gpu brands are clowns at this point.

  • @zertex2830
    @zertex2830 Před 15 dny

    Just to point out the obvious about the supply shortage. It's more than likely that they want to clean leftover GPUs in preparation for the new 5000 series GPUs. At this point, everyone should know their disgusting practices.

  • @BigAndTattooed
    @BigAndTattooed Před 16 dny

Did AMD officially announce that they canceled the high-end GPU, or is that just a rumor?

    • @Ancient1341
      @Ancient1341 Před 16 dny

      Rumor

    • @riven4121
      @riven4121 Před 16 dny

A rumour from multiple reputable sources, with a high chance of being true because of what happened with the 7900 XTX

  • @Gielderst
    @Gielderst Před 16 dny

It's a shame that AMD gave up on a successor to the 7900 XTX. But I'm glad to know that my 7900 XTX will still remain the fastest from AMD even when the next-gen GPUs come out, so I can focus on saving up for Ryzen 9000 :D

  • @dj4aces
    @dj4aces Před 16 dny

Years ago, Intel said that what board partners were doing was within spec. It's funny to me that the same settings that were in spec yesterday are no longer in spec today. I hope board partners hold Intel's feet to the fire on this. While they're not at all blameless, they were doing what Intel said was acceptable to do.

  • @AxiomofDiscord
    @AxiomofDiscord Před 16 dny

A 32-core little-core processor would be very nice for my home fileserver. Hell, a quarter of that would be a huge boost.

  • @HedgehogY2K
    @HedgehogY2K Před 10 dny +1

AMD rules in terms of efficiency. Their RX 8000 series would likely hit near their current top end with a better architecture, lower power consumption, reliable production and hopefully quality control, and best of all, staying within 8-pin capacities on our mid-range PSUs. People have to buy stupidly high-power PSUs just to power their 12-volt, 56-amp 4090s, if 12V-2x6 is anything to go by. We should steer clear of Nvidia's power-hungry idea of every single gamer hogging 675 watts simultaneously; imagine the blackouts. It was already happening in Europe due to crypto just using 3080s. Stability is key for a successful market: no one can complain about issues that don't exist, and quality control would be easier on AMD as well.
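The figures in the comment above can be sanity-checked with basic P = V × I arithmetic. A minimal sketch, assuming the quoted 12 V / 56 A numbers from the comment itself (not from any connector spec sheet); the ~675 W total would presumably also count power drawn through the PCIe slot:

```python
# Quick sanity check of the connector math quoted above.
# 12 V and 56 A are the commenter's figures, not spec-sheet values.

def power_watts(volts: float, amps: float) -> float:
    """Electrical power: P = V * I."""
    return volts * amps

connector = power_watts(12, 56)
print(connector)  # 672.0 — close to the ~675 W total the comment cites
```

So 12 V at 56 A works out to about 672 W through the connector alone, which is in the same ballpark as the 675 W figure in the comment.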

  • @paulturner5769
    @paulturner5769 Před 16 dny

So the "graphics chip" makers happily screwed the gamers who built their companies, to make bigger profits from cryptocurrency.
Then, when that ran out of steam, they were shocked that gamers weren't taking reheated crap and they had to actually make something decent.
Clearly, now that the 'New Shiny' is AI, they are demonstrating that they didn't learn a thing.

  • @Guixu_cosmos
    @Guixu_cosmos Před 16 dny

    CPU GPU Power consumption?
    Hope not "A BIG SURPRISE".

  • @stoissdk
    @stoissdk Před 15 dny

    I just want AMD to focus on getting those mid to lower high tier GPUs out at a competitive price.
Not everyone needs, or wants, to pay for Nvidia's high-end GPUs.
    AMD is in a good position to compete with anything mid tier Nvidia puts out.

  • @razeefbakar
    @razeefbakar Před 15 dny

Low supply, increased prices = don't buy. If u have a lot of money = buy what u want. Don't really have money = buy what u need. If u are scraping the bottom of the barrel = up to you.

  • @richdelmazzio9890
    @richdelmazzio9890 Před 16 dny

    Thank you for sparing us from click bait.

  • @gstormcz
    @gstormcz Před 14 dny

GPU gaming performance doesn't scale linearly with raw compute, so 2x the raw compute power isn't that crazy. But if AMD releases a multi-chip GPU, it could be interesting.
AMD Epyc (I wanted to say Xeon, xd) won't be a gaming chip; multicore chips usually clock lower, and what game utilizes 32 cores?
Also, the price will make it just an option.
Intel's old chip technology is finally showing its age, lol.
It looks like a grandpa making his sprint run and getting a heart attack. 😅
I don't worry about saying that while running my PC with an Intel i3-10100F. 🤷🏼‍♀️

  • @AdamsWorlds
    @AdamsWorlds Před 16 dny

AMD has never really cared about the top end of stuff; it's always been happy taking the middle ground. I think it's probably worried about custom chips and ARM as well (I would be). If Sony, Microsoft and Nintendo all went ARM or some custom non-AMD chip, that would be a massive dent. Intel is probably equally worried. The server market in a decade could be very different if people go the route of custom chips. Best for AMD/Intel to scale things back and give us a drip feed of performance for as long as possible rather than open the taps. It's the same with the GPU market.

  • @moochiekk
    @moochiekk Před 16 dny

I believe the reason AMD isn't releasing the 8900 XTX is that the manufacturing capacity is going to their AI cards, which will make way more $$$, and as we know, there is a limited amount of space/time at TSMC to make any chips.

  • @sniper1251
    @sniper1251 Před 15 dny

Intel basically just promoted buying AMD to avoid any issues lol

  • @itsTyrion
    @itsTyrion Před 13 dny

    Intel when auto-overclocking the shit out of CPUs and pumping 300 watts into a desktop CPU causes issues: Surprised pikachu face

  • @Trampus10-4
    @Trampus10-4 Před 16 dny

Memory Express is full of 40-series Nvidia cards. Nobody can afford them, especially seeing how AMD performs just as well for less!

  • @gamerforever4837
    @gamerforever4837 Před 16 dny

Yes, it's known Nvidia does this to make more money; they've done it since the 10 series. Now you act like it's new and the first time ever, but they've done it for years, and I bet it will happen with the 50 series as well.

  • @magottyk
    @magottyk Před 15 dny

Stop calling Zen c cores "little" cores, as they have the full functionality of normal Zen cores.
The main difference between Zen and Zen c is frequency and L3 cache configuration.
A Zen 4 CCD is 8 cores with 32MB of L3, while a Zen 4c CCD is 16 cores with 32MB of L3 split across two 8-core CCXs.
Arguably, it's the two CCXs in Zen 4c that can have a slight impact in data-set-heavy workloads, due to 16MB of L3 per CCX vs 32 in a clock-for-clock comparison of CPUs, but not when comparing against Zen 4 "big" cores in an APU.
ARM little and Intel little (E) cores are less capable cores, not just in frequency but in functionality.
The best description of Zen c is compact, not little.
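The CCD/CCX layouts described in the comment above can be sketched out to show the per-core L3 difference. This is only an illustration; the figures (cores per CCX, L3 per CCX) are taken from the comment itself, not from AMD documentation:

```python
# Illustrative sketch of the Zen 4 vs Zen 4c CCD layouts described above.
# Numbers are the commenter's, not official AMD specs.
from dataclasses import dataclass


@dataclass
class CCD:
    name: str
    ccx_count: int
    cores_per_ccx: int
    l3_per_ccx_mb: int

    @property
    def total_cores(self) -> int:
        return self.ccx_count * self.cores_per_ccx

    @property
    def total_l3_mb(self) -> int:
        return self.ccx_count * self.l3_per_ccx_mb

    @property
    def l3_per_core_mb(self) -> float:
        # Each core shares only its own CCX's L3 slice.
        return self.l3_per_ccx_mb / self.cores_per_ccx


zen4 = CCD("Zen 4", ccx_count=1, cores_per_ccx=8, l3_per_ccx_mb=32)
zen4c = CCD("Zen 4c", ccx_count=2, cores_per_ccx=8, l3_per_ccx_mb=16)

# Same 32 MB of L3 per CCD either way, but Zen 4c splits it across
# two CCXs, so each core reaches half as much shared L3.
print(zen4.total_cores, zen4.total_l3_mb, zen4.l3_per_core_mb)    # 8 32 4.0
print(zen4c.total_cores, zen4c.total_l3_mb, zen4c.l3_per_core_mb)  # 16 32 2.0
```

This makes the comment's point concrete: both CCDs carry 32 MB of L3 total, but the 16 MB-per-CCX split halves the shared L3 visible to any one Zen 4c core.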

  • @DuXQaK
    @DuXQaK Před 15 dny

    Clickbait accusations thread disappears right after GamerMeld's denial comment in it, where he is questioning whether the accusers have actually watched the video.

  • @forog1
    @forog1 Před 16 dny

For AMD to release a GPU like that, they would need their power efficiency under control first. Their RX 7000 cards, as is, are quite power hungry, and AMD probably knows that, as they likely built an 8900 XTX in their labs. If they released something like that... no, lol. AMD is doing right by focusing on improvements to their GPU hardware architecture rather than just another rasterization GPU, of which we now have boatloads.

  • @BigDrewski1000
    @BigDrewski1000 Před 16 dny +1

    Meh. I got a 2070 super and play 1080p. Im good. Lol

  • @jimbodee4043
    @jimbodee4043 Před 16 dny

I suspect AMD's Lisa Su gets called by her second cousin, the geezer who cooks Nvidia GPUs in his kitchen oven, to not compete at the top end.

  • @grunt7684
    @grunt7684 Před 16 dny

    Yep, I would have been all over that 8900XTX

  • @mohdhazwan7706
    @mohdhazwan7706 Před 15 dny

Don't people buy Intel CPUs for their specific feature, which is a centralized home heater? 🤣

  • @toucouillefangirl7446
    @toucouillefangirl7446 Před 16 dny

Nvidia is one of the few companies actually making money off AI. In one sample, 50 percent of companies already said they see NO RETURNS in the next few years. AI is progressing, slowly giving us things like CGI and self-checkouts, but not as much as the hype. Watching the AI bubble pop will be as funny as crypto and NFTs.

  • @joshhoskins6400
    @joshhoskins6400 Před 15 dny

There's a reason Intel hasn't come out with an extreme platform in years 😂

  • @dantebzs
    @dantebzs Před 15 dny

    Intel did it on purpose to boost benchmarks on outlets and make the CPUs less garbage. Intel is to blame.

  • @dreamshooter90
    @dreamshooter90 Před 16 dny

    Am I bummed about the GPU AMD could have released? Yes. Yes I am.

  • @KimPossibleShockwave
    @KimPossibleShockwave Před 14 dny

    AMD canning their next generation, high-end graphics cards is going to bite them in the rear.
    AI is great, but it's a bubble.

  • @Dragon_Slayer_Ornstein
    @Dragon_Slayer_Ornstein Před 16 dny +2

The X3D CPUs originally were not aimed at gaming; they were aimed at productivity. You get a huge performance boost compiling code with X3D chips. Gaming was an afterthought.

  • @fuzzylumpkins6034
    @fuzzylumpkins6034 Před 16 dny +1

    I officially dropped off all my tech subs today with exception of Gamer meld. I no longer have any tech needs as my employer provides me a 32k fund yearly for my set ups but GM actually gives news that matters. Cheers.
As much as I hate, loathe, and am disgusted by Nvidia when it comes to average daily gamers, the RTX A800 is an amazing GPU. Even at only 40GB, it is close to truly amazing. Nvidia is easily capable of producing 40GB and 24GB GPUs under £2000 and £800; they simply have no interest in YOU, the average consumer.

  • @WXSTANG
    @WXSTANG Před 16 dny

    With crypto going up... people are likely to start buying up GPUs again.

  • @TheHope12322
    @TheHope12322 Před 16 dny

    RDNA4 was getting too hot.

  • @jamesmcd71
    @jamesmcd71 Před 16 dny +1

All you review channels really need to stop with the rumors and "leaks." It makes all of you look like fools or propagandists.
One day AMD's 8000 GPUs are going to be better than Nvidia's for half the price. Then AMD's 8000 GPUs are going to be crap. Then AMD is going to own the midrange market with the 8000 GPUs.
It's all just crap. I'm sick of it.

  • @MokenaRay
    @MokenaRay Před 12 dny

    What a shame. RX 8900 XTX could have actually outperformed the RTX 4090

  • @harlemshakeqcbrousseau6863

    I need a rx 8900xtx

  • @walkabout16
    @walkabout16 Před 15 dny

    AMD, oh AMD, what have you done?
    With CPUs released, but victory not won.
    The RX 8900 XTX, a promise so grand,
    Yet plagued by issues, not what was planned.
    Intel responds, to CPUs gone wild,
    But AMD's troubles, leave users beguiled.
    With promises broken, and dreams unmet,
    The gaming world sighs, with deep regret.
    And for Nvidia, bad news rings true,
    As their GPUs falter, in the review.
    AMD, oh AMD, what's the deal?
    With releases like these, you've lost your appeal.
    Critics lament, with voices raised high,
    As AMD's missteps, catch their eye.
    For in the race of CPUs, the stakes are high,
    But AMD's choices, leave us asking, why?

  • @saultube44
    @saultube44 Před 6 dny

AMD not releasing a GPU that could have outperformed the 5090 by 50% is sad indeed, but it's not the first time AMD has done this. In some cases they could have had 64 CUs instead of 60 and then matched or even surpassed Nvidia's highest xx90 GPU by 1-10%, but they didn't; they played underdog, probably for some profit advantage. It's all BS, but I don't know for sure why they do it. I think it's some arrangement between Lisa Su and Jensen Huang; I find no other possibility.
@GamerMeld You should go less for the dramatic and more for technical facts; your explanations are getting more and more confusing.