Almost 400W under Full Load - The 14900KS is one of the Most Extreme CPUs ever Made
- Published: Jun 7, 2024
- Get Conductonaut Extreme and other TIM here:
www.thermal-grizzly.com/en/co...
---------------------------------------------------------
Support me on Patreon:
/ der8auer
---------------------------------------------------------
Music / Credits:
Outro:
Dylan Sitts feat. HDBeenDope - For The Record (Dylan Sitts Remix)
---------------------------------------------------------
Paid content in this video:
- Grizzly Spot
Samples used in this video:
- i9 14900KS
Timestamps:
0:00 Intro
0:45 Thermal Grizzly (Advertising)
1:07 Intel Core 14900KS review guide
2:16 Cinebench R23 & power consumption
4:04 Delidding & System Integrators
5:13 Clock rates in Cinebench
6:56 Gaming benchmarks
7:40 Disclaimer: RAM configurations
8:19 Gaming benchmarks & X3D issues
10:13 Test methodology changed & gaming benchmarks
11:43 Summary/Conclusion
14:35 Outro
Since there are a lot of comments about the X3D Core affinity:
I know that I can manually fix it with tools like Process Lasso, but that's not an adequate solution. Only if we point out issues like this will they eventually be fixed. Publishers, AMD and Microsoft have to update their games/Windows/BIOS/chipset drivers so it works out of the box. Keep in mind that the average gamer probably has no idea about things like that and might not be getting full performance.
I understand, but it's kind of a lazy comparison then. The KS products are a chore to review already, so I get why you wouldn't want to take the time (which you stated multiple times in the video that you didn't have for this review) to get the 7950X3D to work as intended. At that point you may as well not include it in the graphs - just say at the beginning that you couldn't get it working correctly.
@@RAZGR1Z So you don't understand then. Read again what der8auer wrote.
hopefully AMD fixes that because it's insane that it's been out for more than a year now and it's still behaving badly
400W just to thermal throttle the thing. It's useless.
@@RAZGR1Z I agree with Der8auer here, if I buy a 7950x3d so that I can game mostly and render occasionally (my real-world scenario), I don't want to have to use process lasso to control what cores the games are running on. I would if its a game that I'm playing heavily and seriously care about, but what I'd like is for the system to do this automatically. Pointing out these failures and making AMD look bad (even if they aren't technically worse) inspires more development time towards fixing these issues, and everyone benefits as a result.
You can also squeeze out more performance by minimizing background processes running and chasing lower total system latency by doing a bunch of permanent process scheduling to move core system processes and interrupts to the non-highest boosting cpu cores, but it really does approach being way too much work to be reasonably acceptable. If we pressure Microsoft to make Windows behave like this by default, it saves individual work for consumers and brings benefit to everyone at the same time. What we really need is more exposure to these kinds of issues from major content producers, like Der8auer is doing here, with community support showing that these are real issues that people care about.
For the 7950X3D you need to make sure your Xbox app and Microsoft store apps are all updated. Once you do that you can hit the shortcut Windows Key + G and it comes up with the game bar. In the settings wheel in game bar you can select this is a game and the game will now set to be ran only on the Vcache cores. I have had to do that with only new games that I guess haven't been added to the list in the driver yet, but this is the way to fix it.
Very interesting, I didn't know that :) Thank you for that comment!
The question is how many average Gamers would know about that. In the end always better if the CPU or driver does it by itself
@@der8auer-en It drove me crazy for a while until I found a Reddit post about making sure the Game Bar was updated, and that you can set specific applications as "remember this is a game" in it. Not the most elegant solution, but it does the job.
Kind of ridiculous in this day and age that consumers have to figure out something that should be so simple... or at least it should be explained to the end user.
@@Poo_Garden I'm pretty sure you are instructed, you just didn't read it
So Microsoft being Microsoft
400Watts CPU + 450Watts GPU = 1HP AC just to counter the heat. :D
+ 200w from RGB (not even an exaggeration at this point)
Just put the whole case outside the window. Big brain move
@@username8644 I hate RGB.
Luckily, in my country you need heating 98% of the time every year. And when you don't need heating... you don't want to sit at the computer anyway.
@@marcogenovesi8570 Yeah, screw the planet! I like it.
Cats everywhere will love sitting on top of pcs running this. The ultimate cat warmer.
400w is kind of a fur burner though if they fall asleep on that! With a GPU that's in the territory of many space heaters.
It's impressive how much less the 7950x3D is consuming while almost at the same performance
What's even more impressive is the 7800X3D, yet it's not in these charts for some odd reason. It does even better in games.
It's very finicky with scheduling though just like mentioned in the video. For primarily gaming, the 7800x3d is a much better choice.
@@HenrySomeone You could use something like Process Lasso to manually control what cores are being used, but this might require experimenting with what works best for each game and many people just want it to work right straight out the box. Still, if you know what you're doing and have the patience you could make the most out of the 7950X3D. There are some games that will use more than 8 cores when things get more demanding, which could make the 7950X3D better than the 7800X3D, especially on the percentile lows which is where you will more likely be CPU limited.
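What Process Lasso does for the 7950X3D boils down to pinning the game's threads to the V-cache CCD. As a rough illustration only (this uses Python's Linux-only stdlib call rather than the Windows tools discussed above, and the core-to-CCD mapping shown is an assumption you'd have to verify on your own machine):

```python
import os

def pin_to_vcache_ccd(pid: int, vcache_cpus=range(0, 16)) -> set:
    """Pin a process to the V-cache CCD (Linux, os.sched_setaffinity).

    On a 7950X3D, logical CPUs 0-15 (cores 0-7 plus SMT siblings)
    usually map to CCD0, the die carrying the 3D V-Cache -- but check
    with `lscpu -e`; the mapping is not guaranteed.
    """
    os.sched_setaffinity(pid, set(vcache_cpus))  # restrict to those CPUs
    return os.sched_getaffinity(pid)             # read back what took effect

# e.g. pin_to_vcache_ccd(game_pid) after launching the game;
# pid 0 means "this process". On Windows the same idea is exposed
# via Process Lasso, Task Manager, or PowerShell's ProcessorAffinity.
```

The design point is the same one the thread makes: the fix is trivial per game, but nobody should have to script it by hand.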
The 7950x3d is a garbage CPU that cuts its cores in half.
Why? It's not really surprising that a CPU with such an extreme factory auto-overclock applied will have bad efficiency.
This feels like 14nm+++++++++++ all over again. We're on the 4th edition of this CPU.
Intel 7 node is being wrung for everything it has here. Arrow Lake dropping all the way to 20A (double node shrink) is going to save so much compared to this. Rumors are that PL2 might be reduced from 253W to around 177W for the Ultra 9 290K (15900K).
@@DigitalJedi Although it remains to be seen what kind of clocks they manage. They had a huge problem when going from 14nm to 10nm where 10nm just didn't clock high enough to beat 14nm early on
@@DigitalJedi No. PL2 is rumored to drop to 177W. 230W isn't even mentioned anywhere. You just made it up.
@karl5010 oh shit it's down to 177? Dang. Gotta keep up to date on my leaks I guess.
7950X3D: "look what they need to mimic a fraction of my power!"
Overclocked and fine-tuned, Intel still blows the AMD ones into oblivion. My 13700K runs so smoothly with a high all-core overclock - it beats AMD any day.
Last time I was that early, CPUs didn't double as furnaces.
@@arudanel5542 By "this winter" you mean in like 9 months, when better CPUs will have come out? lmao
Still not as bad as 20 years ago with the P4 Extreme Edition, which wouldn't downclock and always ran hot.
@@puciohenzap891 CPUs were a lot lower-power back then, the coolers just sucked compared to what we have now. Put a P4 under a modern tower and it'll barely get hot at all.
1.5 years later Gamers nexus and this dude are still fighting the 7950x3d to make it run. We Intel guys slammed our processor off the table and caught it on the motherboard locked it in and haven't looked back!
@@kramnull8962 Except you have to put it in a contact frame so it actually makes proper contact with the cooler, but keep on dreaming 🙂
Gotta also select "remember this is a game" on the x3d in gamebar settings
Is there a "this is a PC" setting in bios?
KS = building your home near a river and diverting waterflow to your loop. Just make sure to install those fish barriers.
That sounds like a future LTT episode.
the 7800X3D still reigns in gaming ;-)
Yes, and here I am with my 7800X3D pulling 85 watts.
I'm still using my 5800X3D 😊
Fanboyism is dangerous. Both companies have different use cases to offer, and "choosing a side" just means that you will blindly defend your "side" even when they do bad things.
Don't choose sides, it harms your objectiveness.
Lying to oneself is a result of fanboyism, and you just don't help anyone like that.
@@Boogie_the_cat AMD OR DEATH!!!
@extracurricularkitty3605 It's just stating a fact, nothing more or less. It's dangerous to read something and apply your own wishes to it! I really hope you have a learning moment reading this comment - have a great day!
Finally, an informative and smart review. Thanks.
This cpu is a joke imho. 400W to barely beat AMD. wow. 😮
This is Intel's Bulldozer
A CPU that shouldn't exist. Period. Thanks for testing!
Why are you saying this? The 14900KS is the working golden samples of Intel's 8 P-core + 16 E-core silicon - it's not a new CPU but the result of binning. I can accept that this CPU is kind of pointless, but at the same time Intel knows two things: one, there is a market for this CPU; and two, people are willing to pay a premium for it. So why not cash in on that fact?
The CPU is perfectly fine. This is one of the dumbest comments I've seen in a while.
@@afre3398 So true. It's for hardcore overclockers.
@@blakkwaltz🤣🤣🤣
What are overclockers going to play with then? This chip is going to set new records.
Good God, Roman, your "script warmer" is gorgeous!
This power thing looks exactly like the problem last time with mainboard defaults and autos being wrong and applying too high voltage.
Did you check the actual voltages vs requirements/spec?
I'm reminded of the Preshott socket melters, Netburst again
No, Mr Bond, I don't expect you to make a video without your script!
Definitely a Bond villain cat. Good one!
Enabling delidded processors feels like a must, to keep temperatures under control.
Hey Roman, regarding the 7950X3D vs 7800X3D discussion, couldn't you have simulated them by deactivating the non-X3D cores, or just by using Process Lasso to force the program's threads onto the X3D cores? I'm sure there's a valid reason, but I'd be curious to better understand!
If you look to the left of the gaming benchmark chart, you can see the Ryzen power consumption - easy to miss among the Intel bars racing with the FPS bars.
Came watching for the CPU, kept watching for the cats
..I should watch this channel more.. maybe it has dedicated cat videos.
Louis Rossman?
It would be nice to see what this KS would look like with E-cores turned off, and maybe HT off as well. Power would certainly be more acceptable, but would performance stay the same?
I've actually run into cases where the Windows scheduler gives up trying to schedule things properly on the hybrid-core Intel chips and just starts shuffling threads around wildly, causing bad stuttering. This usually happens when a game simply has more heavy threads than the CPU has physical P-cores. It doesn't really impact the higher-core-count chips, because most games only have around 8 heavy threads currently (some exceed that, but those usually aren't FPS-critical games - more like AI-heavy turn-based strategy games during end-turns), but it is very noticeable on lower-end mobile chips with hybrid architectures that might have only 4 P-cores or fewer. Even games heavily optimized for old hardware that run just fine on pure E-core CPUs (e.g. the N100), like Warframe, can suddenly hit really bad intermittent stutters on a device with an on-paper-better CPU like a 2P+4E chip, purely because the scheduler starts panicking and throwing threads everywhere when there aren't enough P-cores - whereas if it just kept the threads on whatever cores they were already running on, it would be fine.
This CPU will make GamersNexus Steve create a whole new methodology for CPU coolers, and 100+ new charts to fully test it, including: "Can this heat my room in winter?", "Can this heat a room at the North Pole?", "Will it cause a blackout in Germany?". Stay tuned for the GamersNexus review!
"How long will this last in a power outage", hook it up to a $100 UPS Battery backup while running full load
Hahaha right on
No, Steve is preparing for another long, drawn-out battle with the next gen of Ryzen. He had this same issue when he tested the 7950X3D 1.5 years ago.
AMD learned its lesson from Bulldozer; Intel forgot the ones from the Pentium 4!
Bulldozer was not close to 200 watts - this thing is a crime
@@betag24cn It was for the FX-9590. That CPU was insane, especially at the original price. It was the first 5GHz CPU however (turbo permitting), so far from unexpected.
@@dabombinablemi6188 I forgot that CPU existed. I remember it dropped in price like 70% in a couple of months (numbers out of my ass, I don't remember exactly). It would at least be excusable if it had good performance.
I had a 6300. I OC'd it to 5ghz without changing voltage. Good times.
I'm guessing the video is recorded in 30fps? The game footage at the end, and the to-camera footage, is incredibly chunky, and the game footage looks like it has been scaled down to the recording frame rate.
I remember seeing something somewhere from someone about limiting the clock speed of the non X3D cores so that the "fastest" cores are always the ones with the 3D V-Cache
I have the 14900K; I love it even though it's hot. But this is why competition is great, and why I'm thankful AMD is making good CPUs - competition only helps us get better products... although in this situation I wouldn't call the KS better...
The price to performance consideration has led me to choose AMD CPUs for every system I have built, dating back to the Phenom II X4 965 Black Edition. I have gone back and forth between Nvidia and AMD graphics over the years, but have always run AMD processors.
12th gen intel was a beast, though, for the time. Not that big a leap from 12th gen to 14th (13+) gen, though.
Curious how well binned these are and how well they respond to voltage tuning. Just from an academic standpoint. I don’t need this many cores anyways (not on my gaming machine, do dev work on my Mac) but just would love to know.
These are the cream of the crop. The 13900 is the lowest bin with all the cores, followed by the 13900K/KF. The 13900KS and 14900K/F are the step above that, and the 14900KS is even higher binned. These are functionally golden samples, enabled by the Intel 7 node being quite mature.
I am curious as to what was used for cooling... Was it done with an AIO, and if so which one? Not many reviews showing over 300w cooling capabilities for an AIO...for 400w it almost feels like some custom cooling is going to be required.
That AMD power saving is Epic levels of awesomeness!
*Epyc
Yeah, Intel isn't for us Europeans any longer. But maybe it's a conspiracy: the US blows up the pipeline and Intel makes high-power CPUs. They want us poor.
Your cats will love the heat from this CPU
Still idles around 4W while surfing and doing simple tasks.
@@kramnull8962 Cats when he starts up a game: 🎆🎆🎆
@@kramnull8962 Bro, the heat is a worse problem than 30W of idle power... this thing literally hits 100°C just gaming.
What motherboard are you on, and cooling?
What cooling solution was used in this test? 360mm aio?
Der, I received some Kryo from Amazon and it was either a 1:1 fake or it had just dried out from sitting. I'm trying to paste a 1065G7 in a laptop - is Hydro comparable to Kryo thermally? Probably gonna try something else, but Kryo is the only paste I've used so far with significant temp gains. Anyone have suggestions? Laptops don't respond anything like a desktop with an IHS and a lot of thermal mass: stock paste and MX-2 are pegged against the 100°C throttle under significant load, while Kryo is like 15°C lower under load and somehow prevents PL2 boost from immediately exceeding TJ.
With the 5090 soon to come and new Intel CPUs, I'll surely need a contract with the city power grid, or I'll rent a warehouse next to an industrial power plant.
We will need to extrapolate Meteor Lake performance and scale it up to desktop level to see how it would perform.
Sick
Maybe you already tried this, but for the 7950X3D you sometimes have to update the Xbox app through the Microsoft Store to get newer games to run on the X3D cores, and you can always force a game onto the X3D cores by checking the "remember this is a game" box under the Xbox Game Bar general settings. I have to do this for most emulators. 🙂
Back in the day if you recall the extreme editions used to give 5 year vs. 3 year warranties. I think it's nice that for the extra money you get better warranty. That alone is worth a lot for the high end cpu stuff as the risk of them cooking themselves is higher. If it wasn't for that warranty extension I would have had a big problem when my 990X died, but with its warranty Intel just sent me a new one. Can't complain about that.
42,650pts in r23 on my 6.2ghz overclocked 14900k @ 360w. 💪
Nice.
My 13900KF gets over 42K @ 5.7 P / 4.5 E. I'm convinced these 14th gen CPUs are showing 6+ GHz but are equivalent to around 5.7/5.8 GHz on a 13th gen i9.
Intel are campaigning for 420mm Aio coolers. 15th gen are going to cause power outages.
maybe I have to start selling liquid nitrogen coolers again xD
@der8auer-en I miss doing ln2. My passion was gpu tho and evga leaving left a huge hole lol.
We need more cases to support 420mm
Apparently the thumbnail shot turns the video sponsor into a 14900KS requirement.
Roman, would that deal be for extreme laptops, where they use desktop CPUs with liquid metal?
LM gets a bit more headroom but the main limit in laptops is cooling: radiator volume and weight.
I have a 13900HX (same silicon as this KS lol) in a mid size 16" that's limited to 100W (CPU only, no GPU) without needing jet fan mode.
My old Clevo DTR brick that was twice the size (2inch thick, >6kg) was built for overclocking, it had a massive 1.2kg vapour chamber heatsink to cool 2x190W GTX1080s. It also has a VRM capable of running a 9900K at 170W but I had to add watercooling to let it happen. With some vbios mods I've seen over 600W from the wall when benchmarking
I also proved with my other completely internal watercooling mod of a P370EM, that the external radiator cooling really is needed. I was able to add 2 extra laptop sized heatsinks to a 17" chassis to get great OC (4.7ghz 3940XM) but it just couldn't cool well enough in the long term to keep loop temps stable, and prevent GPU VRM overheat.
The high end these days sucks by comparison: a Titan that brags about 270W of combined CPU+GPU cooling. It can push a 14900HX to 238W short / 176W long power. Does that make the $5K price tag worth it?
That cat is absolutely adorable ☺️
Yeah, testing CS2 in 4k with 4090 at 99% load is a VERY nice test for the CPU. Keep up the good work.
So does this mean that this CPU, in a test bench setting, was on the edge of maxing out a 360 AIO??
I know it's weird, but I'd be curious how well this undervolted as much as how it overclocks. Can you get it to use the same power as a 14900K stock with no performance loss?
This is basically Intel doing a cover of the FX-9590.
How much did that CPU use? Like 200 watts? 250 overclocked?
Time is a flat circle, or something.
@@betag24cn Stock TDP was 220W.
@@dex6316 Half of this one, OK - I'd already forgotten.
😂😂
Mostly for the spectacle's sake, I'd like to see how this new KS would perform delidded + direct-die contact frame + heavy-duty water cooling loop. Seems like the delta between a 14900K and an amped-up 14900KS would be interesting.
For shits and giggles, I'd also like to see a 7800X3D + 7800X3D delidded + direct die contact frame + water cooling loop, plus some very basic modifications in BIOS, mostly as it relates to curve optimizer built in _(ASrock has a good one)._
I'm interested in the spectacle of what _'could'_ be a typical daily but at the extreme end. Not important, but I'd find it interesting.
That's perhaps all AsRock has going for it.
Considering how good the IHS on the Intel chips must be to actually dissipate all that energy as well as it does, that's a comparison I'd expect to trash Intel even more - assuming your AMD motherboard can supply more than 100-odd watts, so you get more performance rather than just cooler running (not that cooler is a bad thing).
@@foldionepapyrus3441 I own a 7800X3D and my previous CPU was a 5900X. I'm just curious to see what this would look like.
The 7800X3D wouldn't do well with more power since it seems that it's a bit more sensitive to heat, probably due to the cache. I know it's voltage sensitive.
@@davidbetancourt4028 Get rid of AMD's rather low-efficiency IHS and I'd expect the 7800X3D to be able to crank it up a great deal more with no heating problem.
AMD hasn't even tried to make the IHS nearly as good at shedding heat as Intel's - but then they don't need to: their compute per watt is good and the CPU is perfectly happy running at the higher temperatures this creates... I'd love to see direct die on AMD more than Intel for this reason - probably much more gains to be had.
@@foldionepapyrus3441I've delidded a few processors and used the systems for a good bit. One was my HTPC. I wanted it quieter since it was in our family room. Lowered the temps by a good 20 degrees C which was nuts.
I thought about doing it to my 7800X3D but decided against it because I didn't think it'd matter that much. If I had to do it over again, I would have bought the delid tool and tried it out. I got too much going on to try that now tho, but it would be neat.
I'm not sure how much headroom it'd have. I haven't seen anyone do this though. I'd really like to see that along with the Intel for the funsies.
Which AIO was used in this test? And which fans? Pump and fan speeds?
Great to watch this with you!!!
In Brazil this CPU is a complete nightmare unless you are rich and/or have solar panels and a big AC.
Very happy with my 7800X3D with only 50-60w gaming
Can't wait for the Alienware 14900KS, delidded, with an Intel stock cooler on top 😉
I'm just waiting for it to actually catch fire.
Haha, delidded with "liquid cooling" but it's only a little 120mm aio, or whatever size those little alienware ones are...
5:06 how do we know it is delidded and not just lidless from the manufacturing?
Enjoy your videos and look forward to the next one on the KS. I'm really curious how the KS performs power-consumption-wise when power limited, because sometimes the unlocked power causes the voltage to spike. When you do the next video, hopefully you check the stock 253W power profile.
I wonder what the future of PSUs is going to be... with ever more powerful draws.
They can't really just keep increasing the wattage, due to limits in the actual plugs themselves in your walls. For the US, around 1500 - 1600 watts is about as much as you're going to be able to pull from a standard 120V outlet. We already have power supplies up to that.
Next gen intel will scale down is cpu so less power draw
@@DaveGamesVT For high-end stuff they can - just use two sockets on different circuits, or a 240V circuit like for ovens, other high-power equipment, and EVs. I mean, this is a $700 CPU; if you can afford that and the god-tier system around it, you can afford a little electrical remodeling.
@@DaveGamesVT Absolutely correct. But they probably could make a higher-wattage unit that runs on two separate 120V outlets. That seems absolutely ridiculous though.
@@username8644 But you'd need it to pull from 2 different circuits, you can't just plug it in top and bottom.
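The outlet-limit argument above is simple arithmetic: continuous load on a branch circuit is conventionally capped at 80% of the breaker rating (the usual US rule of thumb; local code varies, so treat the derating factor as an assumption). A quick sanity check:

```python
def max_continuous_watts(volts: float, breaker_amps: float,
                         derate: float = 0.8) -> float:
    """Continuous load limit for a circuit: volts * amps * derating."""
    return volts * breaker_amps * derate

# A standard US 120V / 15A circuit gives 120 * 15 * 0.8 = 1440 W,
# which is roughly why consumer PSUs top out around 1500-1600 W.
# A 240V circuit at the same amperage doubles that headroom.
```

So a 400W CPU plus a 450W GPU still fits on one outlet, but the total system is already eating more than half the circuit's continuous budget.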
Be aware that Asus has special settings that ignore Intel limitations, this can explain the high power draw. Jay did a video about this.
And Mr Bauer definitely set his own stuff in bios based on the results
@@Houstonruss You mean to impress the AMD fans. All you need to do is make AMD look good to be a content creator.
Der, you are by far the best tech YouTuber in the OC space, but can I ask a little favor?? Can you make this video again but with cores disabled? They say if you disable cores you get more FPS - I don't know if they mean E-cores or P-cores.
I'd be interested to see how much you can undervolt the 14900ks and achieve stock clocks. If you can get a -0.1 thru -0.2v offset and still achieve stock clocks, that will bring down power consumption and heat a lot.
I know with my 13600K I can achieve a -0.225V offset and still run stable at stock boost clock speeds. That isn't how I have it set up, though - I have it set up for the maximum overclock on the stock voltage of 1.375V, which lets me run 5.5GHz on all P-cores, 4.3GHz on all E-cores, and 4.9GHz ring cache.
EU tends to ban everything non-eco nowadays... this cpu would be prime example to do that in the electronics branch :D
I've seen Intel 13900K/14900K users on Frame Chasers and Reddit say they're having silicon degradation on their chips. They said that they can't hold overclocks anymore and in some cases have complete CPU failure after ~6 months. Are any of you experiencing this?
Not sure about 13th & 14th, but I've been running a 12900KS all-core 5.3GHz @ ~1.4V for over 1.5 years, still as stable as ever. Having said that, during normal use I run my PC on the Windows Power Saver plan with its max CPU frequency limited - I think it's min: 5%, max: 50% if I remember correctly. If for some rare reason Power Saver is too slow/unresponsive, I switch to Balanced, but with min: 5%, max: 99% (or was it 95%?) instead of 5%/100%, which means a max of 4.x GHz (can't remember exactly). So my CPU is almost always running at very low voltage, heat, power, and frequency. Only when I do things that benefit from max speed do I switch to a profile with a max of 100% - for example, for gaming & benchmarking I use the Ultimate plan w/ min: 100%, max: 100%, and for things like editing & rendering videos I use Balanced w/ min: 5%, max: 100%.
I don't know why it's so uncommon for people to use and edit different power plan profiles. Why would I want my CPU blasting to high voltage, frequency, power, and heat when I'm just doing easy stuff like browsing websites or watching videos?
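For anyone who'd rather script those power-plan tweaks than click through the control panel: Windows exposes the same sliders through `powercfg`, where `SUB_PROCESSOR` / `PROCTHROTTLEMIN` / `PROCTHROTTLEMAX` are the documented aliases for the minimum/maximum processor state. A rough sketch (the percentages mirror the comment above; the actual `subprocess` call is Windows-only, so this is an illustration, not a tested daily driver):

```python
import subprocess

def throttle_cmds(min_pct: int, max_pct: int) -> list:
    """Build powercfg commands capping CPU frequency on AC power.

    PROCTHROTTLEMIN / PROCTHROTTLEMAX are powercfg's aliases for the
    'minimum/maximum processor state' sliders (percent of max clock).
    """
    set_min = ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
               "SUB_PROCESSOR", "PROCTHROTTLEMIN", str(min_pct)]
    set_max = ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
               "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(max_pct)]
    apply_ = ["powercfg", "/setactive", "SCHEME_CURRENT"]
    return [set_min, set_max, apply_]

def apply_throttle(min_pct: int, max_pct: int) -> None:
    for cmd in throttle_cmds(min_pct, max_pct):  # Windows-only
        subprocess.run(cmd, check=True)

# apply_throttle(5, 50) would reproduce the "5% / 50%" browsing profile
# described above; apply_throttle(100, 100) restores full boost.
```

Two small scripts like this on the desktop make switching profiles a one-click affair, which answers the "why is this so uncommon" question: the built-in UI just buries it.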
Can it be due to these cpus constantly running at very high temps?
@@ashenwattegedera Could be degradation from how much current is being pushed through, not necessarily only heat.
The degradation is probably from the voltage the CPU's running at. At least in this video some of the core VIDs were hitting 1.48V, which is very high; my i7 14700K hits around 1.35V for the max core VID out of the box. For the i9s you'll want to undervolt them, to lower the voltage being pushed through.
It happened on my Ryzen 1700; after 3 years it needed a little more voltage for the same overclock.
Could we see a test of the 80W mode against the K or KF? Would be interesting to see the silicon quality difference.
I’d really like to see some other real world testing. How about DaVinci Resolve and SolidWorks?
14900KS is a 13900KS TI
More like Ti Super.
@@Purjo92 more like 14900 super - same perfomace more power draw more heat bigger price :D (rtx4080s) same :D
Yup just like a 2080 to a 2080 Super. 3% faster costing more money.
13900k ti super xtx
You just need to change CPU affinity in Task Manager or Process Lasso
to fix the Ryzen 9 X3D*
I know that but the average gamer probably doesn't :)
@@der8auer-en Exactly, that's why it's unrealistic to use Process Lasso. AMD, please fix.
@@der8auer-en Fair enough, but you should at least show the results after the fix, or mention that there's a fix for it (probably the easiest FPS boost there is).
It would be amazing if we came full cycle and went back to direct die contact like it was on the Athlon XP CPUs.
I came for the awesome tech stuff but stayed for the cat
Intel back to their 2-3% IPC gains... overclock them to keep up, power and heat be damned! So 400W stock - I wonder what OC will bring that up to, 600W?
This is zero IPC. It's the same architecture as 13th gen which itself is mostly just a cache uptick from 12th. KS 'gain' is all clockspeed due to binning.
What greebj said - IPC is independent of clock speed.
I’m happy with my 7950x3d. Between that and my 4090 my psu only goes up to 300 watts under load.
The 7950X3D is still horrible with it cutting its cores in half, and the Lasso method is too time-consuming as well.
Did you heavily limit your 4090? Regular 4090s consume a lot more than that. Otherwise you're being CPU limited in the games you play which is bringing down the GPU wattage
@@kendokaaa Hi there! Never thought about that, so I did some research and I was wrong about the PSU. I was playing Elden Ring tonight at 4K with ray tracing and all settings on max or high. The PSU reading was 414 watts (my bad) and GPU usage was at 85 percent. According to Task Manager, CPU usage was at 15 percent, although I don't know if I'm reading it correctly. Is this good? I'm not really well versed in the overclocking/tweaking thing yet; I just like gaming.
@@ers-tj4to Low CPU usage is normal as games don't use many cores.
@@ers-tj4to Unless your CPU is underpowered, it's never going to be used to 100% when playing games so that's fine. Ideally you'd want your GPU at or close to 100% unless you have a frame rate cap or are using vsync. Some games just cannot use all of your GPU because they're limited by the performance of individual CPU cores, ram or even CPU cache so this also results in GPU usage being lower than 100%. A lower GPU usage leads to lower power consumption, at least for the GPU. If your frame rates are to your liking, there's nothing wrong with your numbers.
I run my 7950X3D w/process lasso. It’s fully configured for everything I play, but that seems to be the only way around it trying to play some games on the wrong CCD.
Since when was 3% considered an upgrade? Is the memory controller any different?
New BULLDOZER 🤣🤣
At least it can compete with AMD in performance, unlike Bull Loser did vs. Intel back in the day. But still, I find these i9 products to have only extremely niche use cases that still make sense vs. Zen 4, even if your electricity is dirt cheap compared to European prices and you live in a cold climate lol.
The 14900KS is the fastest all-around CPU - Bulldozer could never make such a claim. Yes, Intel uses a lot of power, but this is a desktop CPU: it only draws that power while the machine is on and loaded, so you're not going to notice it on your bill. For perspective, a home Level 2 EV charger can pull around 10kW for hours at a time. Keeping the chip cool is a systems-engineering problem that can be overcome, and it's not as significant a problem as fitting a 4090 into any high-end case. With delidding, the CPU should overclock another 200 to 300MHz.
The CPU alone is using more than my 5800X and 6600XT combined.
Not when both are OC'd, of course, but damn
The power consumption is due to the 1970s cpu and Ram architecture. I can almost guarantee you the RAM controllers and address bus are eating the bulk of the power.
Your 5800X does 18K worth of useless work in R23, vs. 31K from a 13700K.
@@kramnull8962 According to your math the 13700k is 50% better than the 5800x. With twice as many cores and thus 50% more threads. So thread per thread they are equal and you belittle this guy through his cpu? Wow
@@wastanley734 Not 50% but I love your spirit and honesty. You'll learn something in 50 years.
@@kramnull8962 Nice argument that barely holds up. Lol.
I'm curious to see how efficient this chip is because of the bin, when power limited and undervolted.
Finally, a spiritual successor to the FX-9590 😅
0:38 Don't you dare cut her out like that again!! ❤
If they are going to cover delidding under warranty, they are asking for trouble. It is a delicate operation and it doesn't always go well.
They know that lol. It’s why they are ONLY offering warranty through certain prebuilt companies.
They’ll sell them bulk pre-delidded dies (what’s the point of adding an IHS just to remove it?)… for a higher price of course. They can charge more because of the exclusivity. The “warranty” you’re getting as a consumer comes at the cost of paying that exclusivity tax to the prebuilt company. Essentially it’s insurance you’re paying for, except the insurance money gets spread between Intel and the prebuilt company. So basically it’s a money grab / publicity stunt and nothing more. Intel isn’t being freaking benevolent lmao - just trying to charge people the max amount possible and hide it in a “prebuilt price”
Delidding will not be warrantied to me and you, only to certain system builders (which are probably veeeeery expensive boutiques).
The numbers will be very low anyways. Intel gets the marketing out of it, may lose a handful of chips, not a big deal.
If the binning is better it should also undervolt better right?
Perhaps Bryan from Tech Yes City can answer that if he gets a review sample; not many people do undervolting videos.
I will stick with my i7-14700KF for now. Thanks for the review
Did you watch that recent video by Dave's Garage? showed his Threadripper hitting 1000W :)
So happy to run a 7800X3D for 5 months now on my gaming setup.
From power to POWER to beat power.
I've been tuned out for 6+ months. When did the Dr. Evil thing start?
14900KS buyers are enthusiasts. No one buying a 14900KS is buying from an SI, so that delid warranty is pointless. The warranty should have been for consumers, not for businesses.
It’s probably because they’re gonna have a special deal where they sell delidded dies to them directly - for a higher cost, which will get passed onto the consumer, so essentially that “warranty” is something you’re paying for, and the SI will profit by having an exclusive deal with Intel. Basically everybody wins except the consumer. Calling it enthusiasm for OC’ers is a joke when it’s all just a publicity / money stunt
This just cements the fact that AMD is the leader in efficiency.
400 W just to be a few percent faster than the 7950X3D is crazy!
AMD really worked their magic with the Ryzen chips; going from the dreadful FX chips to this is fascinating. The roles have been reversed!
Comparing auto-overclock to in-spec does not tell you anything about efficiency.
@@mikeycrackson Agreed.
Dr. Su has been an absolute force at AMD, in turning that ship around.
@@ABaumstumpf
Stop waiting for the Asian kid to do your homework for you.
Calculate the frames per Watt.
Do the math.
(Even if you scaled the result to normalise to stock clock speeds, AMD would STILL come out ahead in terms of efficiency. Clock speed drops, power consumption drops, so you'd still be more efficient.)
@@ewenchan1239 "Stop waiting for the Asian kid to do your homework for you."
You should not think that everybody is as incapable as you are.
"Even if you scaled the result to normalise to stock clock speeds"
That is not how that works kiddo. That is not even close to how that works. But go ahead, keep on posting and showing how little you actually know.
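The frames-per-watt (or points-per-watt) comparison this thread argues about is a one-line division. The scores and package-power numbers below are hypothetical placeholders for illustration, not measurements from the video:

```python
# Points-per-watt efficiency comparison, the metric the thread debates.
# HYPOTHETICAL inputs: (benchmark score, package power in watts).

def points_per_watt(score: float, watts: float) -> float:
    """Efficiency = work done per unit of power drawn."""
    return score / watts

cpus = {
    "CPU A (high power)": (40000, 400),
    "CPU B (low power)":  (38000, 200),
}

for name, (score, watts) in cpus.items():
    print(f"{name}: {points_per_watt(score, watts):.1f} pts/W")
```

Note the caveat in the thread is real, though: comparing an auto-overclocked part against an in-spec part skews this metric, since power rises much faster than linearly with clock speed.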
Can't wait for the 15900KS :)
From what I heard, next-gen Intel is built on TSMC 3 nm, so hopefully they will be far cooler and far faster.
@@edwxx20001 3nm appears to be for mobile CPUs or the GPU tile. 20A looks like the desktop node of choice, which is set to have similar performance to N3 or N2 depending on the libraries chosen for a given section of the chip. It's also debuting backside power delivery, my PhD research topic, for x86 CPUs.
@@DigitalJedi Well, I know names are meaningless, but 2 nanometer should match up to 20 angstrom.
@@DigitalJedi Intel should have a decent advantage vs. Zen 5. We'll see if the architecture is good.
In Christopher Walken's voice: MORE! MORE POWAH!
Dude, just call it 13900KISS. Sheesh!.
If the 7900X3D had that many cores it would smoke it and still be under 400 W, I bet 😮
Why take the odd extra step of delidding when they could simply not add the heat spreader?
Makes no sense to add extra risk and work.
Why don't you use process lasso?
I'm starting to think Intel has performed necromancy on the NetBurst architecture.
Might honestly be one of the worst CPUs I've ever seen.
It is a clear sign that Intel is desperate, and their only solution is to increase power consumption to try to match the Ryzen 7800X3D in gaming, which uses a quarter of the power.
Btw, the Intel 13900K has better 1% lows compared to the AMD 7800X3D.
@@alfredgamerzzgamers3970 At what cost? It's not really an achievement to have slightly higher 1% lows while using twice the power and being nearly impossible to keep cool without a 360mm AIO. The 7800X3D beats the 13900K in more games with lower power, is easier to cool, and doesn't come on a dead platform.
@@alfredgamerzzgamers3970 And draws twice as much power when turboing.
@alfredgamerzzgamers3970 *according to Intel's first-party benchmarks
Something I never see is the performance when tuned for power efficiency; better silicon can theoretically be undervolted/underclocked etc. to get the best perf per watt.
Look at what laptops do with the same silicon. All the way down to 13900HX in smaller 15 inch models.
Unfortunately the undervolt lock means most models can't be tuned for max efficiency. I have a lower-end Helios 16 Neo that I can keep mostly to 100 W at about 4.2 GHz P + 3.0 GHz E cores to limit fan noise. Multipliers and PL1/PL2 are all that's tunable in this, barf; I definitely got what I paid for (under $1k).
I hate performance laptops these days; you have to spend 5 grand to get something from MSI or Eluktronics you can actually tweak, and I'm not so made of money that I'm willing to risk a $4K all-soldered non-warranty replacement mainboard. In the olden days a mainboard was 'only' about 700 bucks, assuming you didn't kill the socketed CPU or MXM GPU.
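On Linux, when firmware doesn't lock them, the PL1/PL2 limits mentioned above are exposed through the powercap sysfs interface. A minimal sketch, assuming the common `intel-rapl:0` package layout (paths vary per system, and writes need root; laptop firmware often rejects them):

```python
# Sketch of capping package power via the Linux intel-rapl powercap
# interface. ASSUMPTION: the standard /sys/class/powercap layout with
# the package domain at intel-rapl:0; verify on your own machine.

def watts_to_uw(watts: float) -> int:
    """powercap expresses limits in microwatts."""
    return int(watts * 1_000_000)

# constraint_0 is typically the long-term limit (PL1).
RAPL_PL1 = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def set_pl1(watts: float) -> None:
    with open(RAPL_PL1, "w") as f:  # requires root
        f.write(str(watts_to_uw(watts)))

if __name__ == "__main__":
    print(watts_to_uw(125))  # 125000000, i.e. a 125 W long-term cap
```

This is the kind of efficiency tuning the comment wishes reviewers would test: a binned chip held to a fixed power budget should clock higher than a worse bin at the same budget.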
Intel totally lost it 😂😂😂😂
Fluffy cat!!! And that CPU makes a good butt warmer for the cat :)
Can we make a 14900KS vs 14900K/KF OC to same frequency comparison?
So this is why Jay got that quad 560 water box from Singularity.
That's really poor performance. No wonder so many people buy AMD these days.