Just How GREat is the Radeon 7900 GRE ... Golden Rabbit Edition?
- Added 25 Feb 2024
- Wendell puts not one, not two, but three of the new Radeon 7900 GRE cards to the test!
**********************************
Check us out online at the following places!
linktr.ee/level1techs
IMPORTANT Any email lacking “level1techs.com” should be ignored and immediately reported to Queries@level1techs.com.
-------------------------------------------------------------------------------------------------------------
Intro and Outro Music By: Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
creativecommons.org/licenses/b... - Science & Technology
This really showcases WHY Nvidia limits the RAM on their lower-tier cards. One of the many reasons I no longer give them my $$. Instead of giving you enough RAM to get the most out of your card, they purposely limit it to hinder you at higher resolutions.
They tell you that it's not a 4K card and to get a 4080, instead.
One of the biggest laughs I've had this gen was that the 4060ti was originally marketed as a 1080p card. Sure thing, with only 8GB of VRAM. But why on earth should a $400 card in 2023 be meant for 1080p? That's the same story they told me about the 1060 years ago.
Be that as it may, AMD aren't saints, either. They've got their flawed products (or should I say flawed prices), as well. The 7700XT is useless at $449, the 7800XT should never have been called XT, the 7600 should've launched as the 7600XT at $299 and the 7900XT may be really good now at ~ $750, but it was a complete joke at $900.
Personally, I don't care that much (7900XTX), but they could've done better, much better!
They force you to spend more money on a higher tier card, since clearly you are rich enough to have a 4K monitor.
So you can go up the product stack
@@ElJewPacabrah It's the Apple approach: spend more to get more.
The ultrawide testing would be greatly appreciated. Love your content regardless, always hard to not catch your enthusiasm.
Needed
I love the long-running Danny Devito as an AI benchmark bit.
I've been on team green all my life but for the first time I'm contemplating my next upgrade to be a team switch...
Switch to intel...
IGPU 😂
Don't think too hard
Do it..... You won't regret it!!
To what, save a trivial amount of money for poor RT performance and a gimped feature set?
@@churblefurbles Less money, more VRAM, usually more raw performance. Not everything is about RT and DLSS... Both sides have their advantages.
I've already had this card for nearly a month... Performance is great and no stupid 12VHPWR connector ;7
Awesome review, thanks for all your hard work!! Really love your channel 💝
Wendell! You had the perfect opportunity to throw in a Sesame Street Count "ah ah aauuuh!" after the 3 in the intro.
I'm not brand loyal, I am just ANTI NVIDIA! 🤣
Thanks for the review
Those Sapphire cards are monsters, I own a Nitro+ XTX and a Pure White 7800 XT
That's just a 6800 XT on steroids; how is that a monster? 1080p? Definitely not over 1440p
@NeverSettleForMediocrity just my experience. I have the 7900 xtx nitro plus like I said and also the pure white 7800 xt. Both can do high fps at 1440p and both can do 4k with many games.
@NeverSettleForMediocrity ultrawide 1440p is no problem for the 7800 xt
@NeverSettleForMediocrity benchmark numbers mean nothing if you don't play those games bro :) charts and graphs cannot show you the experience
you sounded jealous in your post @@NeverSettleForMediocrity
Good review thank you
Love what you said about "I don't think at this point in time in 2024 anyone should be brand loyal"
Great review, Wendell. It'd be interesting to see different resolutions and how performance varies from the widely accepted standard that all other reviewers test at.
Having to log into their software was what kept me from buying another Nvidia card lol
newer versions don't require logging in
@@marcogenovesi8570 Newer versions of NVIDIA/Geforce Experience or whatever it is called? I know that you could download the actual drivers themselves manually without logging in, but since when did the shareholders allow this change of plan? I don't really have a Windows + Nvidia combo anymore so this is the first time I'm hearing that you no longer need to log in.
would be cool to see an ultrawide testing video at different field of view settings to see the stress impact of increasing view
I have a Powercolor Red Devil 7900XTX and the cooler is impressive, to say the least. Right now with the three 140mm fans in the bottom of my Fractal Torrent blowing cool air on it, the fans on my GPU are at 0 rpm.
PowerColor has been doing fantastic job these last few gens.
Every card with limited bus width, is worth looking at 3200x1800 performance. (when you use driver level upscaling to 4K it looks like 4K for the most part. ive never been able to pixel-peep a difference on a 65" TV)
Reason being there's often a massive performance delta between 1800p and 2160p just on account of framebuffer bottlenecking. Often to the point 1800p numbers are closer to 1440p fps than 4K fps.
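The arithmetic behind that claim is easy to sanity-check. A quick sketch, using the resolutions named above (the megapixel figures are just width times height):

```python
# Pixel counts for the 16:9 resolutions discussed above.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),  # 4K
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.1f} MP")

# 1800p renders only ~69% of the pixels of native 4K, so the
# framebuffer/bandwidth load sits much closer to 1440p than to 2160p.
ratio = pixels["1800p"] / pixels["2160p"]
print(f"1800p is {ratio:.0%} of 4K's pixel count")
```

Whether upscaled 1800p really looks indistinguishable from native 4K is subjective; the numbers only explain why the fps gap between 1800p and 2160p can be so large on bandwidth-limited cards.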
Best review I have seen yet.
It would be nice to see more content on ROCm and ZLUDA. I will be looking into whether my 6800 is of any use at all on Linux, or whether I need an upgrade (to a GRE maybe?). The AMD docs looked somewhat pessimistic last time.
Surprisingly, Sapphire used to have top-quality cards... but Steve from HUB also showed high temps on his Nitro+ 7900 GRE, so... I don't think it's just one odd example here.
Any chance of running proton benchmarks? I know all dozen of us gaming on linux would appreciate it.
Probably over on the Linux channel, like he said.
Power Color Hellhound GRE is very good. I hope the price won't be horrible.
It is, almost 800 pounds.
@@limpa756 :(
@@limpa756 Where?
It's a great card, it beats even the Sapphire Nitro+!
@@lukasschmidt9555 where did you find this info at?
Great design on the Steel Legend, got to love an all Steel Legend PC gaming build.
Would like to hear more about the physical cards themselves and why you would pick one brand over the other.
Would love to see an in-depth dive into the Adrenalin overclocking settings on these cards. I was thinking about getting a 6800 XT and selling my reference 6800, but if I do that, I would definitely consider the GRE instead. Would just love to see how close one can get it to the 7900 XT
I 2nd this! Very much interested in overclocking results with this card...
I was hoping the 4070 Super was going to have 16GB, but NVIDIA is greedy. I always wanted to get a Red Devil, and the 7900 GRE is looking good.
This feels like a 3090 Ti in regards to timing.
12:28 "settle in" as the fan's logo settles in.. nice. Also lmao nice m.2 stick.
AMD also has frame generation that needs to be built into the engine (part of FSR 3), whereas Fluid Motion Frames is driver-based and can therefore work with various games, though not as well as the built-in, in-game frame generation.
About to buy my first graphics card in years. Good timing.
I have the 7800 XT Nitro+ and I can confirm my hotspots in certain titles do go above a 25°C delta. While the card was still in my PC case, I very carefully tightened the screws on the retention plate, and there was a very, very small amount of movement in them; I would think the thermal pads need to be heated up a few times to bed in a little better. After doing this, my hotspot delta never goes over 25°C. I also experienced this on my CPU after repasting it: a very small tweak of the screws did give a little movement. Also, the Nitro+ fan curve, even on the performance BIOS, could be a little better; there is plenty of power in those fans to keep temps below 60°C with the hotspot no higher than 80°C in the most demanding games. The 7900 GRE just needs a few tweaks to get a lot more from it. Great cards from AMD, very powerful.
I’ve been waiting about a year now for a good value proposition in this price range. 7800xt was looking like the one, but perhaps it’s the 7900 GRE. This is the most compelling so far
I’m in the same boat as you. I think I’d spend a little bit more money at this point. I’m curious on the different models and how they compare now.
I wonder if now, with the benefit of AI, we could have a control panel similar to GeForce Experience on both sides, red and green, where you just input the desired fps and it automatically adjusts your graphics settings to hit that target. Not sure it will happen, though...
Best tech channel, and it's not even close
Competition is wonderful.
Would love to see more ML with this card
Now I need Level1Techs to make a Frosted Flakes knockoff cereal.
The ultrawide benchmark would be awesome! I have one and it's really hard to find anything about it
Timestamps are a thing in the 2020s I hear.
timestamps cost money
Pretty impressive result PowerColor got there with the Hellhound, beating the Nitro+ by a wide margin... great cooler.
Thanks Wendell for this review. Steve at HUB used the 4070 Super with his own comparison and his results levelled the GRE with the Super or very damn close. I will now sit through Steve at Gamers Nexus to seek out his opinion.
HU used a reference China only model and that was really sloppy. No surprise, given their history of shady testing but not excusable, especially as their obvious bias has now been proven to be so laughably inaccurate and just plain wrong.
@@ObakuZenCenter I have observed watching HUB that Tim is totally biased towards Nvidia, while Steve will vote either way. It fascinates me no end watching Tim compare FSR against DLSS using one still frame zoomed in tenfold to support his opinion that Nvidia has better tech. I find this just plain ridiculous, as games are not played that way, and any differences are unlikely to be noticed.
Powercolor coolers are incredibly impressive for how plain they look. I like the simplistic aesthetic though.
Would love to see 3440x1440 testing to see if I should get a 7900 GRE or 7900 XT.
It's basically an RDNA3 version of the 6900 XT. Same 5120 cores and even the same 256-bit bus.
Any comment about the developer who released Open Source ROCm 5 ZLUDA? Mind if you can make a Windows tutorial for it? Thanks.
Don’t forget about 3840x1600! ;)
After watching this video, I kinda want a RX 7900XTX now [weird]
On the ROCm and Linux side of things, I would be interested in videos on what you can do with them. "Stable Diffusion" tells me nothing. What about a source image and then the stable-diffused image it produces? What kind of fun things can be done with ROCm and a 6000/7000-series AMD card? I want to play with it but I don't know what is out there, what is possible, and what would interest me. I am not a photo editor or graphics designer; what could a dummy like me do? =)
Did I tell anyone that my 3080 12gb strix does 390w out of the box? Glad it’s winter is all I’m saying. 😂 prob gonna re-up in the 6000 Gen.
Just got a Nitro+, these cards are beasts
"Ahh it seems a lil sus" lol
The 7900 GRE overclocked VERY well.
You can catch up to that stock 4080 in many titles.
Yas gimme ultrawide benchmark
It's 750 euros here in Europe, so 200 more than 4070. Sux.
Here in Poland it's 2500 PLN, so about 620 euro. Not good, not bad
@@Pieteros21 Nice! I'm in Greece.
An XFX model was available here in Lithuania for 570€ a month ago.
420k subs!
I have the PowerColor RX 6600 Fighter and yes, a lower-end card that only uses around 100 W, but boy is it incredibly energy efficient when gaming. As great as the RX 580 was back in its time, if you had that card like I did, the RX 6600 is its replacement, and it has even lower power usage.
Helpful video but please add timestamps next time
Nitro+ cards tend to be more power-unlocked and faster than the rest, hence the higher temps
Better cooling allows for a higher manual OC, though. Board power limits play a big role as well; I don't know the board power limits of either card. That's another big factor.
Still waiting for your 4070 super review
According to TechPowerUp's GPU Database, the RX 7900 GRE is actually 2% faster, on average, than the RTX 4070 Super. I don't consider that significant but it shows that price-wise, the RX 7900 GRE competes with the RTX 4070 but performance-wise, it competes with the RTX 4070 Super.
This is one of the reasons that I won't use an AIO. An X3D CPU doesn't warrant one and it just gets in the way.
If AMD would put out a card with local AI/ML workloads in mind that was roughly a 7600 XT with 32GB VRAM for like $600, they wouldn't be able to manufacture them fast enough no matter how many they made, and it would spur a bunch of devs to focus on AMD/ROCM support for LLM adjacent workloads.
Wow, that 4070 12GB performs worse than a 3080 10GB. I'm thinking there's more than "not enough VRAM" happening here.
AMD software is miles ahead of what NVIDIA gives.
So now almost all of AMD's RDNA 3 cards are a flavor of Radeon 7900.
16:9 is widescreen, Wendell, I expect better from you, son
This is a marginally decent launch (better than the 7600 and 7600 XT launches), but it doesn't seem like AMD really wants to sell a lot of these, or a lot of 7800 XTs at this point. For an AMD launch to even be interesting, they have to be at least 20% better overall than the competition at a minimum. They're launching this now because they are worried about increasing sales of the 4070 and 4070 Super; slotting something in between is their best bet. But if you compare this to the 4070 Super, it's about the same raster perf at only a 7% discount. If you compare it to the 4070, it's roughly 12-15% better, but it costs 6% more than the cheapest 4070 at $520. They need to squeeze out more performance or cut the price a little more... The real saving grace of this GPU is if the overclocking uplift is really as good as the rumors suggest!
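To put rough numbers on that argument, here's a back-of-the-envelope perf-per-dollar sketch. The prices and relative raster figures are the commenter's claims, not official MSRPs or measured data:

```python
# Relative raster performance normalized to the RTX 4070 = 100,
# using the deltas claimed in the comment above (assumed figures).
cards = {
    "RTX 4070":       {"price": 520, "perf": 100},
    "RTX 4070 Super": {"price": 599, "perf": 113},
    "RX 7900 GRE":    {"price": 549, "perf": 113},  # ~12-15% over the 4070
}

for name, card in cards.items():
    perf_per_dollar = card["perf"] / card["price"]
    print(f"{name}: {perf_per_dollar * 100:.1f} perf points per $100")
```

On these assumed numbers the GRE comes out only around 8-9% ahead of the 4070 Super in raster per dollar, which lines up with the comment's point that the discount is too thin to be compelling on its own.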
3440x1440 user here
Of the 37 reviews out this morning, this is the only one I'm watching. Embargoes are so lame.
GeForce Experience was garbage, requiring logins (Facebook even, if I remember)
This will probably be my next GPU, unless I spot like a $400 7800 XT in the near future. Or I just wait for the 8000 series because the ray tracing will probably be a lot better on top of everything else...But then that'll probably be like $600-800...There's just no winning is there?
Is DLSS TECHNOLOGY proprietary to Nvidia?
If not, why doesn't AMD use it instead of FSR?
When is AMD going to realize that users want a COMBINATION of action and graphics performance?
Add timestamps
Wendell!
When we gonna see some Linux benchmarks?!
Throw the memory way up and I bet it's not half bad. Just posting before watching the video 😎
P.S. He didn't touch the clocks, but for the price it's still not bad; just too bad it's sold out everywhere in the part of Europe I'm in... P.S. The memory speed is apparently what you can adjust to get a lot more performance out of the card: undervolting the core and upping the memory speed. Hope he gets a chance to show us what kind of gains you can get.
Its GRE-ate 🤣
Cool video and all but maybe people should know that AMD might abruptly completely ignore massive issues with some *cough WoW cough* games and shove them under the rug.
With locked-down overclocking it's basically a non-starter. Just a slightly faster 7800 XT with better RT performance, that's it. And if you really want that RT perf, why not just spend 50 bucks more for a 4070 Super?
What "locked down overclocking" are you talking about? The 7900GRE has the best OC potential of all cards this gen.
@@danieloberhofer9035 They have no clue
@@danieloberhofer9035 have you ever tried to overclock an AMD 7000 card lol
He thinks we know people that can afford houses and GPUs so we can go try it.
Are people still buying GPUs?
I thought that GPU was a PS5 for a second
AMD cares about gaming? LOL, when was the last time AMD was first to market with VRR, RT, upscaling, or Reflex?
Probably hard to find a 14900k with a memory controller that can't do at least 7200. Cmon now, you of all should be able to do it.
RAM speed barely, if at all, affects the performance, so it's not really a big deal.
@@hydroponicgard haha, you only think that because you watch those xmp tech tubes mate^^
The way YouTubers are hawking this shit, AMD must have sold fuck all in China, with how hard they are pushing them on the rest of the world
Sadly, it's Dragon year now🤣
No money in china to be able to sell the things =)
$550 is still too MUCH for GRE !
Make it $500 (and the 7800XT $450ish) then the cards will fly away from the shelves... 👍
I agree. If people aren't buying the GRE at this price, they will at that one.
This is what the 7800xt should have been instead of barely matching the 6800xt. It's unfortunate we don't see decent cards until the last quarters of a generation before they're all replaced. More unfortunate that it's a _worse_ value than the existing 7800xt according to others' reviews. Somehow Wendell must have a golden sample because he's seeing far better numbers than other reviewers for the Gimped Rabbit Edition.
Milking as much out of the consumers as possible that have FOMO mindset.
Same with the CPU side: why didn't AMD have the X3D cache on all of their 5000-range CPUs?
They literally released them at the tail end, as fewer and fewer mobo manufacturers are producing AM4 boards.
@@doctorspook4414 Yeah, going over other reviews today Wendell's numbers look questionable as everyone else is seeing FAR smaller gains on the GRE over the XT. He either has a golden sample GRE or the world's worst Nvidia cards. Everyone else shows it doesn't even cover the price increase over the XT, making the GRE a _worse_ value.
Between this, the 5700 non-X, and the 5700X3D, AMD is literally selling everything they can dig out of the garbage cans.
@@doctorspook4414 Let's stay reasonable. The GRE is debatable and pricing needs to adjust after launch, I certainly give you that.
But your comments on Vermeer-X are just not accurate. When Vermeer (aka Zen3 or Ryzen 5000) launched, they didn't have 3D-vCache ready, they had just started bring-up of the first samples themselves. Also remember, that was even before Alder Lake was out. Intel's competition at the time was 10th gen and that abomination 11th gen.
Moreover, 3D-vCache was originally intended as a datacenter product for MilanX, and the later success as a gaming focused part started out as a skunkworks project, because they had some leftover chiplets around and someone thought "Why not on desktop? Let's give it a try and see what happens." Originally, they didn't even intend to launch it to consumers, but when they realized they had a golden opportunity at hand, of course they did.
And just as today (and certainly with Zen5, as well), there's no reason to equip the whole lineup with 3D-vCache, because not every customer values maximum gaming performance over everything else, and the stacked cache comes with drawbacks other than costing more to make. So there's a very good reason to differentiate and have CPUs without it.
@@zodwraith5745 Or a little of both. Maybe its not golden sample but a good sample vs a bad sample. +3% here -2% there vs their avg siblings instead of a golden sample.
@@foxs49er Still doesn't explain why his numbers are SO much higher than the 4070 compared to everyone else. He must have the world's worst 4070.
I am not going to buy something called Golden Rabbit Edition, ever. But I still watched the whole video and might even consider researching AMD to replace the 3070. I've been on Nvidia only forever. A 2 trillion dollar company with a Windows 98-looking control panel is a source of much chuckling in my group.
4070 super all day no question
Terrible mindset. You should probably ask more questions, and not just about GPUs.
@@highpraise-highcritic Big time
Still sour about the 8700G launch spec switcheroo shenanigans. AMD has to do better, not introduce useless market segmentation.
The only launch shenanigans I can find for the 8700G are about a random spec-sheet reference to ECC support that got removed, with the consensus seeming to be that it was an accidental listing rather than a deliberate spec downgrade. I agree that they need to do better, but this seems pretty mild all things considered (AMD in general has been pretty good about consumer ECC support, with their desktop APUs seeming to be a consistent exception rather than only just being excluded this generation).
Why did AMD not put out a card that can compete with the 4090???
Why should they? Almost nobody buys the 4090.
12:05 LOL
AMD cards cost more than their Nvidia counterparts in my country of India. Nobody buys AMD here, practically a non option.
Golden rat edition
Perhaps
Golden BNUUY you can't just MISCLASSIFY THEM IT IS *VERY OFFENSIVE!!*
"The most interesting thing about AMD's new GPU is NVIDIA" 😂😂💀💀💀💀
A question: why are most tech YouTubers unable to understand that GPUs aren't only a gamer's toy but mostly a tool for professionals? 🤔🤔🤔
Ah yes, the issue is totally vram being "too small" and not the fact modern games are a bloated mess.
Look how long it took you to distinguish between two resolutions just because you're using the absurd term "1440p". Just state the width and height, so it's no longer mysterious what you're talking about.
Resolutions are not some number followed by a "p". I don't know why that caught on, but it's extremely annoying. It's not even correct for 1080p, which is an HDTV signalling standard, not a resolution.
Resolutions for TVs have been a number followed by a "p" or "i" for as long as there has been any form of HD screens.
480 was standard TV resolution, then 720i, 720p showed up, followed by 1080i, then 1080p.
"i" is for interlace, and "p" is for progressive, regarding how the TV updated it's pixels.
You should learn the history of why technology has the labels it does before running your mouth like an idiot, because 1440p is, and has been, a standard resolution for over a decade.
@@Skelethin The HDTV standards includes two signal specifications that end in "p". 1080p and 720p. The signal specifications include the resolution and refresh mode. They are not themselves resolutions.
I know full well the history of these things, as I was there and paying attention when they were created. You're talking nonsense.
The habit of throwing "p" at the end of a line count to replace resolutions has never been correct, and serves only to sow confusions, which is evident whenever someone tries to refer to historically prevalent resolutions with that absurd naming scheme.
The fact that so many have taken up an incorrect terminology doesn't make it any better.
@@TrueThanny except that they are still *standardized resolutions* that are *commonly and universally accepted*.
"Throwing p at the end" is to specify that the resolution is using progressive scan for its refresh, and it has whatever the number it is as the number of vertical pixels. And with the standard being 16:9 for screen radio, knowing one of the dimensions means you *also know* the other.
Your bitching about people *properly using p to represent a screen resolution* is pedantic, stupid gatekeeping nonsense to whine about not everyone agreeing with your personal, specific preference to have *the entire specifics* of a screen size be used all the time.
If you think anyone would commonly use 1920x1080 or 2560x1440 *every single time* they referenced 1080p or 1440p - especially for something where it is so repetitively used as a review of a GPU - you are an idiot. It's needlessly repetitive, very annoying, a waste of time, and listeners would get annoyed and sick of it really quickly.
Also, if you have this much hate for the use of 720p, 1080p, 1440p, you must really hate how effectively useless descriptors like '4K' or '8K' are, as they don't give any real numbers the way 2160p (3840x2160) would instead of '4K'. Except 4K is a marketing gimmick term that is too embedded to just drop for 2160p.
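For what it's worth, the shorthand this thread is arguing about is easy to make precise: "Np" names the vertical line count, and the width only follows once you assume an aspect ratio, which is exactly where ultrawide 3440x1440 muddies the "1440p" label that earlier commenters were debating. A minimal sketch:

```python
def dims(vertical: int, aspect_w: int = 16, aspect_h: int = 9) -> tuple[int, int]:
    """Width and height for a given line count and aspect ratio (16:9 by default)."""
    return (vertical * aspect_w // aspect_h, vertical)

# The common 16:9 "p" labels resolve unambiguously:
for lines in (720, 1080, 1440, 2160):
    w, h = dims(lines)
    print(f"{lines}p -> {w}x{h}")

# The same line count at an ultrawide aspect ratio gives a different width,
# so "1440p" alone doesn't distinguish 2560x1440 from 3440x1440:
print("ultrawide 1440:", dims(1440, 43, 18))  # 43:18 is 3440x1440
```

So both sides have a point: the label is well-defined once the aspect ratio is fixed, and ambiguous once ultrawide enters the picture.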