Damn, Intel! - 12th-Gen Core (Alder Lake)
- Added 28 May 2024
- Get 10% off all Jackery products with code LinusTechTips at www.jackery.com?aff=27
Learn more about CORSAIR's NEW Xeneon Gaming Monitor: geni.us/nueip4
Intel’s long-awaited Alder Lake is finally here - but can 12th-gen Core really slow AMD’s momentum, or is it just another power-hungry processor in a long line of desperation CPUs?
Buy Intel i9-12900K
Amazon: geni.us/RBZxE6
Best Buy: geni.us/8Upc
Newegg: geni.us/fjtN
Buy i7-12700K: geni.us/oTUR
Buy i5-12600K: geni.us/z2Xmy
Purchases made through some store links may provide some compensation to Linus Media Group.
Discuss on the forum: linustechtips.com/topic/13860...
► GET MERCH: lttstore.com
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
► PODCAST GEAR: lmg.gg/podcastgear
► SUPPORT US ON FLOATPLANE: www.floatplane.com/
FOLLOW US ELSEWHERE
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
1:06 Intel's special sauce
1:50 A note about Windows 11 and test setup
2:43 Gaming results
4:56 Productivity results
6:59 Did Intel cheat?
7:46 Power consumption
8:55 Thermal performance - Intel's still selling space heaters
10:27 Value and conclusion - 12600K > 12900K
A note on Ryzen on Windows 11: We did not experience the CPU-swap performance degradation bug in our testing. Our Ryzen bench was running a fresh install with the patch and chipset driver from the get-go; it was actually set up for Windows 10 in case we couldn't get the patch and driver in time for testing, and wiped for the fresh install once they became available. -AY
Why wasn't the Ryzen overclocked to the same wattage for the test?
@@penetrarthur sus
@@penetrarthur very sus indeed
@@penetrarthur it's about comparing the performance the typical person will get out of that chip.
Overclocking, power matching, tweaking, liquid nitrogen cooling, those will make them compare the limits of the silicon for sure. But it doesn't help you decide what CPU actually fits your needs and budget.
@@penetrarthur super sus
The year is 2084. I lie on my deathbed awaiting the news I’ve been looking for my entire life. The alert comes through. “Intel has just announced its 69th-gen processor.” Finally, after all these years, I can die in peace. “Nice,” I mutter to myself before drifting off.
Nice
Good to see you here congrats on 200k
Damn
Very nice.
I hope I live to be 100 yo then.
Seeing the 5950X/5900X with more "power unlocked" to match Intel's power draw would be interesting.
Robbaz in the wild!
Where are the videos Robert!
Facts bro
But I heard Ryzens don't overclock that well.
@@OKNOWIMMAD12345678 CPU super saiyan showdown
Before Ryzen, Intel's financial incentive was to improve products as little as possible to maximize profits. Now their financial incentive is to make products as good as possible to surpass AMD's performance and stop bleeding market share to them. So now we have two companies with the financial incentive for better products, which is awesome for us consumers!
But we just have two companies that do everything in x86. It's not good in the long term.
@@karanjoshi2662 There are also only two companies left who can manufacture cutting-edge chips. And one of them only can in theory, and in reality appears to be behind.
@@karanjoshi2662 yeah, x86 is getting old; ARM is the future, as Apple has proved
A few thousand intel lover bois have been irreparably damaged inside by your comment.
@@karanjoshi2662 M1 from apple is arm based, and it's insanely powerful.
Love that you guys started to introduce 3D rendering and Blender and stuff. Huge thanks for this!
132
More blender and production work loads is awesome.
Would really love it if you started including an "Unreal Engine Developer" benchmark. Just loading the free "city park environment" can take forever due to all the shaders being built. Generally more cores means faster loads. Would love to see a benchmark on the "cold" load time of a map like this, as well as one showing the full rebuild of the map!
Really wished they also did Unreal 4/5, Unity, Substance painter etc benchmarks.
second
This
Damn, as a game artist, I 100% agree.
You expect them to do productivity benchmarks that people actually use, rather than a bunch of stuff one person hiding away in a basement somewhere runs just to say "nice"?
FINALLY SOME COMPETITION AGAIN! Lets go bois! Both Intel and AMD fans are gonna win from this
Well our only enemies are scalpers and miners
I'm SO stoked
we aint winning when prices keep going up
The virgin fanboys vs the chad good product enjoyers
AMD will wash em' out again with Ryzen 6000
As an AMD Fan, I'm very excited Intel is making a push back! We're finally seeing some innovation on their end!
You can see innovation from Intel since 8th gen, cuz they've been beating your AMD shit very badly
@@tiborsejk1921 lmao please... AMD started to get dominant after 3rd gen and literally buried Intel alive with 5th gen. I'm glad Intel finally shook themselves and are being true competitors; this will be a benefit to us. But sometimes we need to accept things.
@@AlliaNCeCZcams Which dimension we talking about? An amd fanboy trying to "accept things" :-D Omg made my day. Amd was never any competition for gaming to Intel.
@@juveboy01033 I would prefer 1 or 2 fewer FPS to not having a heater inside my computer case. Just to squeeze out a little more FPS they were making their CPUs like their IHS came from magma. 12th gen is also hot, but in lighter workloads like gaming they perform so well. If I were an AMD fanboy, would I say I'm happy seeing Intel get into the leadership again? Shut up please..
AMD fanboys are always millennials. Somehow you forgot the 60 years Intel owned the market.
An Intel i5 being faster than a Ryzen 9 is something I never thought I’d see.
While drawing 200w and spitting fire
@@bigtitmaster What? Its thermals are at 60 Celsius or under and it draws 125W.
I believe the Ryzen 9 is less than 10% better than the Intel i3 which is such a small difference
@@supersuede91 WOAH WOAH WOAH WOAH
@@bigtitmaster what the fuck kind of cooler do you have? a copper block?
I'm a developer on Unix-like systems and I compile different programs every day. Thank you for adding the Firefox compiling test
Which OS are you working on?
"Unix-like" makes it sound like you're embarrassed of your target platform. Mac perhaps?
Then again, it could also mean "I develop for everything but windows", and if so I salute you.
"Unix-like"
You can say MacOS, we won't judge you.
@@nobodyofconsequence6522 If they were using a Mac, then they wouldn't be interested in an Intel or AMD processor, would they? So no point in writing the initial comment, and almost for sure they meant that they write code for everything that is Unix-like.
@@FL4SHK Linux is “Unix-like,” not Unix. It’s based on GNU (GNU’s not Unix). It functions like Unix, but technically isn’t Unix.
Just in time, Intel! My house heater broke but now I can explain to my gf that we'll have warmth and a beast gaming PC. :)
What I would like to see is a breakdown of actual task scheduling and how the efficiency cores compare to the performance cores on a granular level. Something that really showcases the difference between heterogeneous cores vs the traditional homogeneous cores.
"Competition drives innovation" - someone, sometime
Milton Friedman, probably.
Sounds like something Marx wouldn’t say
That's why technology sucks under socialism. This isn't critique, it's a fact.
Sounds like Tsun Zu
God I love capitalism
A 'power user' gets a new meaning here.
One of the best comments I've read
AMD's power efficiency.....when thinking about not too long ago (Bulldozer)....dang, they STILL impress me.
You are talking about maybe a few dollars a month difference in electricity costs. Just keep moving that goalpost...
@@Tiberious_Of_Elona well, the cool thing about efficiency isn’t just about reducing cost; it’s also the power draw in total. The less power we consume from the grid, the better.
Let alone heat lol
@@lupuradu Funny enough, we all had to hear people whine about AMD being less refined and eating up more power, but like always: when Intel does it, it's OK, negligible, just a few bucks more; when AMD does it, well, crap AMD, unable to deliver, Intel the best! Get your crap sorted out, dude.
@@lupuradu My fx-8350 feels kinda cool compared to the new CPUs. Only drawing about 180w when severely overclocked and keeping it under 70 degrees on a Noctua NH-D14
So im guessing watercooling would be the best way to go?
Gamers Nexus did testing with the Arctic Liquid Freezer II 360 and the CPU was still at 70C; sounds like you're gonna need to liquid-nitrogen it.
If you're doing that kind of workload regularly, AMD is the better chip, it seems
You're gonna need like an actively cooled watercooler, meaning something that cools the water while also pumping it around the system lol
No, if you are just gaming, with maybe editing videos here and there and Photoshop, get an LCD AIO and you're set
Noctua: hello
Damn, those thermals are rough. I'm impressed at the performance numbers, but I feel like the 90C+ temps and 230W power draw are going to make me go to a 5950X anyways.
The power draw and high temps are a no for me as well. Sure it's faster, but that doesn't make it "better" by any means. Hopefully it's a push in the right direction though
Yep, imagine this with a 3090 and you got yourself a house warmer for the winter.
I doubt 99% of PC users will even use the CPU in a way where those thermals matter tho; in gaming (and probably streaming too, since it's not too taxing either) you'll be at industry-standard temps. I'd really only worry if you work in CAD etc., but those people most likely weren't looking for an upgrade anyway
At least they can't claim that you can't OC these chips - you just won't because of temperatures and power draw. Not viably, anyway
exactly my conclusion. Not even considering the insane cost for the platform that you need to put that CPU into.
I know you wanted a best case but I’m more interested in seeing how well the CPU performs without better RAM
This.
AND using Windows 10. I mean, hell, who is even using 11 right now?
@@ImAzraa Me
@@ImAzraa I'm using W11 on my laptop. It's 'fine' I guess. It did however bluescreen my desktop (i9-7940X/Prime X299 MB), so I won't be moving to W11 for my main rig for a while.
@@ImAzraa The figure I've heard cited is that only 2% of Windows users are on 11.
@@praetorxyn I've heard 5.
it's really refreshing to see AMD, Apple, ARM and Nvidia finally wake the sleeping giant known as intel.
Pulled the trigger and went full DDR5 and 12600K, from an i5-7600. What a difference! When the new-gen GPUs, DirectStorage and Resizable BAR come around, it's going to be nuts!
Excited to see what Ryzen's response is!
It will take some time
@@SamuelLobo01230 perhaps amd will just go like snap
nothing
@@Ben_mgsp probably
Being affordable lmao
Will be interesting to see the DDR4 benchmarks. Also, now that Intel has enabled all-core boost, it would make sense to have that enabled for the AMD chips in testing too.
watch hardware unboxed video, they compare the DDR4 and DDR5
AMD all-core boost is on by default. Same goes for Intel. However, LTT previously disabled all-core boost only for their Intel benchmarks, as it was considered "out of spec"
@@pixels_per_inch I didn't see any mention that it was now enabled, since it was previously considered overclocking
AMD already does this out of the box and have been since ryzen launched as far as I'm aware
Hardware unbox 😁
“Our sponsor doesn’t help” - 10 most risky quotes in history
It's "doesn't know" instead of doesn't help
Listen carefully
Our sponsor doesn’t know
Would like to see some comparison in the audio recording field, as I would love to work with ±500-700 tracks with a lot of different effects and VST instruments
This was long overdue. Finally improvements from Intel. Thank you AMD for bringing competition to the market. I can't wait to see how both will improve their cpus in the next 2 years.
What a joke! Requires new motherboards, new Ram, new cooler, etc. Meanwhile my B450 should support AMD's 3D cache refresh. Poor Intel.
@@noleftturnunstoned You know AM5 is coming very soon, right?
(At least you can use your old cooler)
@@noleftturnunstoned So we should handicap performance on new products because you're too broke to afford new parts? You're the joke here for thinking that lmao.
@@Freestyle80 my pc has an intel cpu
@@Freestyle80 You're seeming like an Intel fanboy
I'm still impressed by AMD. Older architecture, DDR4 only, almost half as much power used and yet they still are *almost* as powerful as the Intel ones. It's really impressive and I just can't imagine what the next generation of AMD will bring.
However, I am also really impressed by the i5, it's amazing.
Older? Maybe so by date of release. But the latest Zen is built on 7nm compared to these 10nm. Also the DDR4 argument? The difference is negligible between 4 and 5. Intel was way behind. And still is, architecturally speaking.
@@niels1123 yeah amd fanboys still gotta say the same thing
@@niels1123 1) Comparing nanometers between two different manufacturers means nothing.
2) Transistor size doesn't mean everything. The architecture plays maybe an even greater role. A 7nm Pentium would still be pretty slow (but run cooler). Alder Lake is newer by date; this means AMD was already working on their next chips while Intel was working on this one.
3) Wait for the next LTT video comparing DDR4 and DDR5 before having such conclusions. There is a difference, we'll later see how much it is.
4) You seem like a frustrated AMD buyer. Why can't you just admit both Intel and AMD are good and competition is even better?
@@MK73DS 1: in this case wrong, as we all know Intel has struggled with their 10nm for years. 2: same as 1; indeed, their newer architecture makes this chip almost as fast as a 5950X with fewer performance cores. 3: has been tested already, just not by LTT in this video. 4: you invalidated your entire post with blind statements. At the moment I only use an Xbox.
@@niels1123 Please... tell me you're trolling. There should be a fixed amount of stupidity allowed in one comment.
Still don't trust Intel with their sockets changing more often than I change my underwear.
Wait... What
@@iZack187 He said he changes his underwear twice a day.
You must be new to PC gaming. Last time I trusted AMD, they gave us Bulldozer. We ask them to try again and they fucked off to make consoles.
@@5thwiseman How long ago was that? You realize that pretty much everything about the company has changed since then, right?
@@5thwiseman Dude. Bulldozer was released 10 YEARS AGO. So after 10 years you still won't trust a company that has pretty much proven themselves with 5 generations of Ryzen that have been shredding Intel since 2017? Okay, suit yourself buddy.
And just to be clear. AMD to date has never made a console. They've provided CPU's and GPU's to console manufacturers.
Great news for competition, more so if it starts to drive prices down instead of ever increasing price inflation like AMD/Nvidia. I would've liked to have seen you use DDR4 on some of the benchmarks for an apples to apples comparison. Regardless, when natty gas goes through the roof this winter I can at least heat my home with an intel chip.
Would love to see how the I5 performs with a DDR4 system!
Hardware unboxed did that already, check it out
@@foxcell21 save us some time and tell us what he said lol
it's even better than i5+DDR5, lol
yea 12th gen with ddr4 performs even better than ddr5 in some games
czcams.com/video/yZReEP8ixyk/video.html
Hot damn, that wattage and those thermals. Still though, as someone who is currently running a 5800X, I’m super happy to see Intel finally making real improvements this generation. I’ve never actually seen real competition in the tech space like this.
Yeah it will definitely give Ryzen some incentive to do better, so whenever there's actual competition everyone benefits.
"that wattage and thermals"
That is the 12900K overclocked, or infinite turbo. In the real world the power use is way lower due to the big.LITTLE design; if you overclock a 5950X (infinite turbo), the power draw and heat are the same (around 220 to 240 watts), but the 5950X has 20% lower IPC and lower clock speed.
@@saricubra2867 In the real world, the 12900K is 3% faster in gaming, and with all benches combined it just tops the 5900X, but is still 9% behind the 5950X.
And that is while it is using DDR5, that accounts for a 7% increase in games and overall.
So I'm excited that Intel made this jump, like really excited! But I'm not impressed. Where the 12900K shines in benches, it's also using way more power: almost 100% more power in fully MT loads and 35% in mixed loads. Even 8% in gaming; the only good thing is that Intel at idle consumes 7 watts less.
I might want this chip for the single-thread performance, that looks amazing, but other than that, it's just brute-forced performance with the DDR5 uplift and a major increase in power consumption over the competition.
1000% more competition mean we win more
@@saricubra2867 LMAO, the 12900K is nowhere near 20% higher IPC. Did you even watch the video? Also, if the 5950X has the same power draw after "infinite turbo" with 8 extra full-size cores, that's definitely not a good thing for the 12900K. Especially since the 5950X beat the 12900K in multiple benchmarks *without* "infinite turbo."
It's funny to see AMD with all 16 "performance" cores drawing half the power of Intel's 8+8 P&E cores. So much for "efficiency" cores.
8 efficiency cores only consume like 40-50 watts at max load. It's the big cores that are super power hungry
@@TabalugaDragon Please tell me in short, which consumes more watts, AMD or INTEL?
@@sirs4878 with equivalent number of cores at max load - Intel.
Idle power consumption is 6 times lower on Intel, and average power consumption during gaming too.
@@TabalugaDragon How many VMs can a Ryzen 5950X run?
@@sirs4878 that mostly depends on how you configure the Virtual Machine....
I would say that VMs are more RAM-dependent than CPU-dependent
I watch these videos for the creative ad placement. Great timing! Great info!
They always nail it with the i5 for some reason; good showing with an overall comeback
Not even close, they were paid to compare DDR4 to DDR5... LMFAO! And you ate it up like factual evidence... Go put an Intel rig on DDR3 and compare to DDR4 with AMD. You'd see similar results lmao.
@@crisnmaryfam7344 Nobody cares
@@crisnmaryfam7344 Relax buddy, you can see DDR4 and DDR5 tests on the internet. This is a good product and both core types are impressive; AMD is still more efficient and competitive in performance, but that doesn't make this chip lineup bad. I'm fairly impressed
8600k and 9600k were lacking even on their respective launch days. Still, they're the exception to the rule and "an i5 is good enough for gaming" has been true more often than not.
@@crisnmaryfam7344 So you are saying Intel should have their performance hindered because their competition doesn't have equal feature sets? That's absolutely idiotic
I'm honestly quite curious how the Linux kernel will handle this. It already has great scheduler support for ARM big.LITTLE. I'm not sure if that code is architecture-specific and needs to be rewritten for 12th-gen Intel though. We'll just have to see!
It's hugely architecture-specific, like comparing Latin to Chinese
Probably very well, considering Intel is the largest code contributor to the entire Linux kernel...
The kernel did not get patches by Intel yet to address this architecture and optimizing its scheduling. So it will probably take some time until we see Intel getting ahead on Linux because currently AMD is still your best option with Ryzen in most tasks and the patch/merge window for version 5.16 of the kernel seems closed already (5.16 might get ready in December).
Gaming benchs are wacky. It's hit or miss depending on what cores kernel decides on. Wendell did a video on it.
You're right, I saw benchmarks with Fedora, Arch Linux, Ubuntu, Clear Linux and Windows 11, and all the Linux distros had better performance than Windows 11
Whatever absolute maniac lined up the i9 box accents with the larger Intel box... I love and appreciate you
Thank you for including MSFS 2020!
Intel sure are using a lot of power and heat to be competitive.
You mean "to win."
And 20% higher IPC with higher clocks, and DDR5, and PCIe Gen 5.0, revolutionary architecture for x86, etc.
Power consumption is a non issue.
@@saricubra2867 If that were the case, then nobody, not even Intel, would be clamoring for Apple's M1 efficiency. These CPUs are still a burning furnace, but it is a welcome change for Intel.
That alone almost makes me want to wait to see AMD's response to this...
Same thing AMD did before Ryzen.
You gotta do what you gotta do.
I’m honestly impressed that the 5000 series is still on the heels of the 12th gen CPUs, very exciting times in the CPU Market.
What a time to be a computer enthusiast, AMD, Intel and Nvidia are bringing so much competition and it’s amazing.
Have to agree. I use my 5950x for rendering, and the figures from Intel here made me consider a change. Until I saw that the Intel was running at 90C stock!
I overclock my amd chip, so I get better results than the i9 while still running cooler and for slightly less power draw. So I'll stay where I am for now.
But alderlake version 2 next year might be a game changer. But then amd will have their next chips out then too.
Exciting times to be a nerd!
@@StudioNirin idk man i have 5950x + 420aio from arctic cooling and with pbo i get 77C under aida64 load test.
I'm just gonna wait and see what AMD's next generation of CPUs will be.
Took Intel long enough to be able to compete again after getting kicked in the dick by Ryzen. Although I do wonder how much of the performance is from the DDR5 and Windows 11 sucking off Intel's new core architecture.
I'm still gonna stick with my Ryzen 7 3800x until it breaks though, just because I need a new GPU before a new CPU.
What a time to be a scalper
Apple is the new kid on the block.
Impressive chips. I wonder how Task Manager shows the cores/threads with a mixture of performance and efficiency cores?
That will be on Microsoft's part
I'll give credit to Intel for being innovative again, like the good old days
intel: "watts go brrr"
m1: "u got a power bank?"
The M1 Max runs at 130W
@@AndrewGOP That's a combined TDP of the GPU and CPU. The CPU runs at 30-35W at full load and absolutely crushed every chip in existence in terms of performance per watt.
@@avinashanish3350 12th gen intel cpus will be 45w, we'll see about that. Also the M1 Max is 4 grand
@@AndrewGOP 12th gen mobile CPUs are not going to be 45W. 11th gen mobile CPUs are already hitting TDPs upwards of 75W under full load. Another thing to consider is that the M1 Max comes close in performance to the 12900K that runs at 250W on full load. That performance per watt ratio is insane.
@@avinashanish3350 Dude, are you seriously comparing an ARM chip to an x86 chip? Your silly little M1 can't even encode AVC. Sure, there is power, but the M1 is missing TONS AND TONS of instructions and features, making it a mobile chip. This is like comparing a stripped sports car with no features or comfort to a Tesla. You have no idea what you're talking about. Think about why respectable companies, cloud services, etc. are avoiding stuff like Apple M1 chips even though some YouTuber said it's good at playing YouTube videos.
EDIT: I should elaborate. By no means M1 is bad, but 12th gen intel and M1 are two completely different worlds and are used for different things. Good luck hosting and playing minecraft on your M1, it can barely reach 100fps while intel goes into 1000+.
LOL. "Our sposor Corsair doesn't know" That's the best segway ever.
Segue*
Sponsor*
Xeneon!?
lol*
know.*
I have an Intel Core i5-12600K(F). It needed a proper cooler from Noctua, the NH-U12A chromax. Runs great even under heavy load. No problem with heat.
Glad I waited to build my new PC. And nice to see the competition happening.
Can't wait to see what happens when AMD picks up this design style in addition to their existing Ryzen chiplet system.
Competition gon' be goooooood
I have a feeling AMD is going the same route, and Lisa has it all ready to reveal any time now. Imagine 16 performance cores and 8 little ones 6950xt.
Picks up this design style - you mean stealing it and calling it something else? That's AMD's way, always has been
@@cnhtol1586 Like Intel didn't steal this power core design from ARM? X86 in the long run will be a dead architecture anyway, the future is RISC, even for desktop.
Watch the new Moore's Law Is Dead video where he talks about how AMD wants to do big.LITTLE. That will beat Intel's designs easily
@@REPOMAN24722 But we’re a long way from that future, and the idea of performance cores didn’t come from ARM first. RISC-V is a long way from being used in desktop chips!
I'd love to know how much of this is just because of the DDR5 though. Is it really Intel doing well, or is this just from having an upgrade in the RAM department? This will be very important for seeing how easily AMD will be able to come back with their next generation. Also, seriously, those temps though
Watch Hardware Unboxed's review. It is much more thorough.
I have a 5950x and 3800C14 ram, I’m going to run a few tests to compare.
I'd have to guess DDR5's not doing a lot but I'm sure it does cost more.
There are DDR4 benchmarks available on plenty of sites. In the ones I've had time to check so far, DDR5 wins out in games and productivity, but not by massive amounts. (DDR5-6000 CL40 vs DDR4-3600 CL16 btw.)
I'm also interested in the comparison to Windows 10 Ryzen
I'd LOVE a revisit to this, now that we know 12th gen has issues with bending and the contact patch on the cooler. I'm going to get the Thermalright contact frame over the TG frame to flatten out my cooler, then I'm going to liquid-metal it. Small form factor case heat management things..
Out of all the YouTubers I watch, you are the only one who does sponsors right; you give them maybe 30 seconds instead of dedicating a quarter of your video, and I appreciate it so much
Game of Thrones :
Everyone : "Winter is coming."
Me : "Rest assured. I preordered Alder Lake. Now how hot do you want your winter?"
I got the i9 10980xe and my room stays at a perfect 90 degrees while it’s 50 F degrees outside
ADL actually uses less power and runs cooler than Zen3 during gaming.
@@Itaketoomanypics Toasty. We can combine and create our own fusion reactor.😄
Global Warming?
I'll do it myself.
You know nothing, Jon Snow
No way!! Does this mean I can finally afford 4th gen Intel now?
Nope
Not really
I think they're pretty expensive unless you're buying used
Go AMD and save some cash on the parts and your energy bill.
Buy a 2nd/3rd-gen i5 (LGA 1155) and a GTX 1050 Ti or 1060, or an i7-2600. Maybe the i7-2600 is better than a 4th-gen Intel i5.
Buy from me hehe, soon I will sell new PCs with Ryzen 5 CPUs
Lmao, I have 4th gen
Hey! Great video!!! I have a quick question, sir: how long did Intel say they would stay on this platform? And how long did AMD say they would be on their next platform? I have a 5900X now, but really want something I could put a new processor into a few years from now. Thanks for your help
0:52 nice, a picture to compare framerates, professional indeed.
I upgraded to a ryzen 5 recently, and I don't plan to change again for years to come, but im still glad we finally have competition in this space.
lol competition
Same
Finally? Intel never had competition until now; it's good that AMD has caught up to give us a fight these recent years
@@waifu_robin fanboi detected
@@Quantumfluxfield another fanboi detected
Let's see when AMD's new ones come out later this year! Great time to be alive for us all no matter if you're team red or blue!
Go to the AMD fanboy forum then hahaha
@@tomd4748 Salty Intel fanboy? He's not even shitting on Intel and you're saying he's a fanboy? Get your shit together.
I'm team price/performance
@@vondamn9943 this guy gets it
@@tomd4748 They were paid to compare DDR4 to DDR5... LMFAO! And you ate it up like factual evidence... Go put an Intel rig on DDR3 and compare to DDR4 with AMD. You'd see similar results lmao.
I have the i5-12600k and I have to say I am extremely impressed. I've always been an intel fan, but thought about switching to AMD for several years. Finally, I can worry no more.
Same! And I can't wait for their next cpu's in the coming 3-4 years!
My previous laptop had an i5-8265U + DDR3 2800 (Acer Swift 3) and now I have an i5-1240P + DDR5 4800 (MSI Summit E14 Evo). And it's just a banger: 4x more performance in my working program. I was expecting 2x max. I'm just amazed at what Intel did with 12th gen
@@timerson yeah imagine 13th - 15th generation man. Can't wait for it
I'm very excited for this CPU as I currently have, wait for it, an i5-2400 LMFAO. It's going to be a monumental difference
@@lifeofa200sx7 so glad I went with the 7600X! even better than what intel has to offer for gaming, Damn AMD is a beast!
7:53 I have an i7-12700K with a mobo that doesn't allow overclocking, and when I stress test with Prime95 I get 239 to 241W of power draw with liquid metal, way above what the image shows
Intel: we are back baby!
Ryzen: now hold on a minute
Lisa got something just for this occasion I can see it haha, AMD is gonna be like surprise mutha fukka!!!
@@REPOMAN24722 I don't think AMD is going to do a completely new architecture this time. I think it'll just be a refresh of 5000.
@@hwstar9416 Intel didn't do a new architecture, they copied ARM from 11 years ago, well "borrowed the idea", without a full schematic of the chip we won't know how close the new chips are to ARM's big.LITTLE architecture
5nm, here it comes.
@@hwstar9416 I dont know, new socket, has to be more than a little. PGA to LGA.
The temps and power draw though. So it's like a much faster Bulldozer :P
That said, it'd be interesting to see how much improvement 3D V-cache makes to AMD and what their next gens are like
That's kinda the point of having high-efficiency cores: you'd never hammer all the cores at once in any real application
Like the old Pentium 4s that used to melt sockets.
(Edit: seriously, to the people saying power doesn't matter - your individual overclocked power-user CPU doesn't mean a thing; millions of prebuilt systems coming off the shelf using that kind of power does. And the idea of this arch in data centers is nuts.)
I'm super pumped for the Zen3D chips as well.
I'm running a 3700x at the moment, so I honestly have no reason to upgrade at this point, but I'm glad I at least have the path if\when I have the money to do so.
I'm just glad there is some proper competition between these two again, up until Ryzen came out, shit was a snooze fest, and prices were NUTTY asf.
Doesn't California ban several game systems now due to their power draw, blaming computers for global warming?
Imagine what will happen with these chips and the systems they come in..
@@zakofrx California where everything good and fun is illegal, Except weed.
Now we just hope that the new E-cores do not end up having meltdown-class design mishaps.
Hi all you great folks, I was wondering where I can get my hands on one of those CPU cards with the architecture printed on it for Alder Lake. Many thanks for all the great content; I try to never miss one. Thanks, Charlie from the UK
You should have included DDR4 benchmarks too, as most people aren't too keen on spending $200 on only 16GB of RAM. And I don't think the 12600k is better than the 5600x (from a value perspective) for the reason that boards are much more expensive, you need a beefier cooler and PSU, and I don't imagine that it'd be all that great with DDR4.
The same scenario happened between DDR3 and DDR4: users hung onto DDR3 even after all the makers switched to DDR4, which gradually lowered DDR4's cost through competition while sustained demand raised DDR3's, and we ended up in a weird situation for a bit where DDR3 cost more than DDR4! 😅
Well, that depends: if there are performance gains, then yes, people would be willing to pay more for RAM.
it happens all the time
It just came out and you're making generalizations on price? I have seen ~$180 Alder Lake mobos already, so value boards should be coming.
A beefier cooler, sure, but that's negated by the price difference between the chips.
PSU? Nope, wattage is determined by the GPU you want to run.
True.
DDR5 played an important part in helping Intel beat the previous-gen DDR4 Ryzen CPUs.
Also, since it's not just a simple CPU swap, the cost of DDR5 RAM and a new mobo will make it a lot more expensive than Ryzen.
But Linus didn't focus on these points for reasons unknown. Maybe he believes he'll help competition by being a little biased.
Have to agree. To even out the comparison as much as possible, both platforms should have been on DDR4. This review left me wondering how much of Intel's performance was due to DDR5.
And then we have another question: can you even get everything on the market? I bet some areas get cleaned out pretty fast, and 90% of the people who want to upgrade can't because of unavailable parts (mobo + RAM + CPU + new cooler, because of the new screw layout). You can't upgrade if even one part is missing.
I love it when the Civ6 benchmark is in the review. That's all that counts for me. :-D
Timestamp?
@@HH-ni5hm 4:19
Linus do an overclock video for the i9 12900k. Happy new year!!
Can't find that platform overview video about chipsets/motherboards. Anyone got link?
A competitive gaming CPU for just 300$ - sweet 😍 Now I just need to break out another 1800 bucks for a GPU and I'm golden 😌
If you're paying that much for a GPU you're not golden, you're a rusty 30-year-old sheet of iron
@@tsakeboya Well I mean have you seen the 3080 prices recently?
Don't forget to add 60% extra for DDR5. That's the only reason the 12600K is faster than the 5600X :D
@@tsakeboya the gpu market is broken rn prices are ridiculously inflated
You might not need a top-tier GPU if you're just playing competitive games; they're fairly easy to run, and Intel's integrated graphics are decent now.
The i5 seems to be the best deal so far
Yes, and I'm planning to buy it!
always is
When you're doing the numbers, remember to factor everything in: motherboard, RAM, etc.
@@dralord1307 The i9 is $300 more for not much extra performance; you could significantly upgrade your GPU for that.
@@naveenbattula For gaming, that is. In tasks that benefit from more cores, the i9 might be better.
I really need to know: did Linus use the Noctua LGA 1700 bracket during testing? I heard temps can be worse on LGA 1700 when using an ASUS board with LGA 1200 compatibility, due to poor contact with the CPU. I'd like to see a review with the actual Noctua LGA 1700 bracket before I believe 90°C is what the D15 reaches.
I know it's not exactly apples to apples, but it would be cool to see some M1 Max benchmark comparisons in some of these too, especially considering Intel chips are slowly starting to head in a similar direction.
I try to avoid having computers that feel like blast furnaces these days, so if I want multicore madness I think I'm sticking with Ryzen 9 over Core i9. But man, that new i5 is looking pretty sweet. If I was going to build a new game machine, I'd be giving that one a HARD look.
My thoughts exactly. Wife has a need to render/edit, so she'll be AMD still, while I might hop to Intel in the next update. Weird times. :D
@@tuurekeranen1771 Tbf these are a generation apart. Wanna see when AMD drops their new CPUs.
Indeed! It *reeally* makes me want to see what the 12400 numbers will look like. That one could be the new value darling.
Yes though you still need to look at the entire platform cost, including RAM and MB prices. Let's wait for the next follow-up video.
@@jellorelic Or wait for Zen 3D in Q1, which should net a 15%+ increase in gaming performance, making Intel's Alder Lake pointless.
I would like to see best case versus best case: Intel on Windows 11 against AMD on Windows 10. How much would the gaming benchmarks differ in that scenario?
Intel's 12th gen gains around 8 percent on Windows 11 in Cinebench R20 multicore versus Windows 10. AMD takes a 10-ish percent hit on Windows 11 overall right now.
Hardware Unboxed was doing some good Win 10 vs Win 11 work on AMD and Intel
Well, that's not fair
Hardware Unboxed already did the test - not much, around 5 percent off at worst
Agree, even if MS claims they "fixed" Ryzen issues it's still a brand new OS that was made for Alder Lake. It's bound to be plagued with bugs using other architectures.
8:39 that hair swipe, I miss old ncix spikey hair Linus
I still don't understand why I "need" efficiency cores on a desktop...
To appease the thermal envelope gods. Seriously though, core counts keep rising, so having every core be "high power" comes at a cost in thermals and power usage. Making half of them (hopefully 3/4 in the future) the best you can with the extra thermal headroom can actually increase performance, since most common apps don't use more than 16 threads. Just my opinion, tho.
Go to 7:00. It prevents throttling of the high-power cores by handing certain workflows to the low-power cores, letting the chip be more efficient as a whole. At least that's how I understand it.
Yeah, I'm aware of the concept, which is perfect for laptops, but on desktop... I don't get it. Underclocking the power cores (when they aren't needed at high clocks) is good enough in my opinion.
There is a push toward increasing power efficiency in desktops, with California recently enforcing restrictions on desktop computers with high energy draw.
Given this is the "best case scenario", with Win11 and DDR5, I can't see it being better value than AMD's offerings, at least not while DDR5 is still priced as high as it is. Still, it's great to see competition again!
True. Though to be fair, this competition over the last years already gave us extremely good value on CPUs. It's very tough for Intel to keep up with AMD now, so I'm glad they're at least in the ballpark still. Without this release, AMD might have kept higher prices on their 50 series CPUs.
Turns out Intel did their tests on the version of Windows 11 that had issues with Ryzen CPUs.
Especially since the current-gen Ryzen processors are probably going to have their prices cut.
@@SnifferSock I wouldn't count on price cuts. AMD still has supply issues.
@@mitchjames9350 Yeah, but the new Intel chips still beat out the Ryzen chips in just about everything on Windows 10 too. And what we always see with brand-new architectures is that over the period of a year, performance gets better with OS and driver updates.
That i5 is so tempting. We’ll see if they actually sell for MSRP though.
With the current market, it’s honestly a very good question. Scalpers could have a reawakening for these cpus considering how powerful they are.
@@VanquishR But intel is having a okay supply of silcon right? So why would they sell it for so High
@@b-beluga4510 they’ll probably price it in the middle of the 5600X and the 5800X, or they could just go ham and price it below the 5600X
@@hextobyte They have MSRPs lower than the 5600X: $289 for the 12600K and $269 for the KF version.
Don't forget you might also need a new mobo, new memory, and more. So even at MSRP, the overall price will still be way higher.
Good for Intel for doing well. In some countries outside the US, Intel has better value per performance, or only a slight difference at a cheaper price. AMD is still saving Zen 4 for Ryzen in the coming months.
Sounds like all they did was pump more power into some slightly more efficient cores, then added some lower-yield ones, which in turn made the total die size bigger? I wouldn't be worried if I were AMD; look at their new EPYC technology taking advantage of the doubled throughput from DDR5. Exciting times indeed. I'm sure Intel's glory will be short-lived.
I find it a bit worrying for the "Intel advantage" (despite the disclaimers at the start) that it relies on Windows 11 AND DDR5, as the vast majority haven't made the leap to either yet... so it probably won't be representative of most users' experience.
Perhaps another video with a Windows 10 comparison?
I second this idea. That's what I was thinking throughout this whole video.
I guess that's in the follow up video they mentioned. They must have been very pressed for time in this video.
After years of misleading the market with graphs and charts... You can clearly see that nothing has changed.
Windows 11 will be the standard in a few years (not because I think it's good, but that's how Windows goes), so if they wanted this video to age well it would need Win 11. And yeah, benchmarks take a long time; they can't test on both with so little time before the embargo lifts.
Hardware Unboxed has done a bit of benchmarking with Win 10; the results aren't that far off compared to Win 11, just a bit varied from task to task. DDR5 is a genuine advantage for Intel, but Win 11 isn't exactly boosting performance in general; it just enables big.LITTLE to work properly. And since they're advancing their chips on that front themselves, I consider it still a win on merit.
3:29 sheeeeesh
The thumbnail for this video is like an old-school ad, it's great
Seems like the significantly higher power draw means this technology isn't really viable for servers and cloud computing as-is, since it's much less efficient per watt... I'm waiting for that new foundry to be built. I reckon the next two gens of Intel are gonna be hot!
The huge power draw comes from the Golden Cove cores, as they scale horribly above 4.5 GHz in terms of performance per watt.
The Gracemont cores below 4 GHz don't require much.
The other thing I've noticed after checking prices today is that it's ungodly expensive to get DDR5 memory and a compatible motherboard, so the platform cost right now is going to be more than double the cost of a current-generation AMD or Intel setup. I'd like to see how things pan out over the next month to see how the real cost of motherboard and memory adds to the platform cost of Alder Lake. I mean hell, just check Newegg right now: it would cost me $770 to pick up 32 gigs of DDR5 memory. That is absolutely insane when the memory costs almost double the CPU price. The motherboards, from what I can see so far, run around $450 to $500 at this moment in time. You're going to pay a pretty penny for that kind of performance, and as far as I'm concerned, unless I see something more reasonable, there's absolutely no reason to spend so much extra money for such a minimal performance difference overall. Definitely the early adopter tax in full swing. Be patient, guys, and let prices drop a little bit at least.
I imagine it will take longer than usual for prices to drop due to the chip shortage
32GB of 5200MHz RAM is $280. Still really expensive, but not as expensive as you're saying. Sure, there's nothing really in stock, but I wouldn't be surprised if that gets resolved somewhat soon. DRAM hasn't been hit as hard by COVID as other industries.
Not to mention the performance increases. A year down the line, I wouldn't be surprised to see RAM hitting 7000 MHz for half the price it is now. And Ryzen will likely soon have their 6000-series chips out; 5 nm at 5 GHz will be a compelling argument to stay on team red.
I’m not worried. Since I don’t have a fixed income, if the prices never drop, that’s just another reason for me to quit gaming altogether. I’m already spending enough time in front of my screen as it is. Maybe it’s the perfect time to quit.
Exactly, I game at 4k 120hz and this is why I bought an 11th gen just 2 weeks ago.
RAM, CPU (i7), and mobo cost me $1400 AUD. The same amount of RAM alone in DDR5 is $750 AUD.
The most I've seen the CPU used is 20%.
This just makes me more excited to see AMD's next chips.
AMD is exciting, Intel is hotter
AMD's wattage and TDP are a huge advantage for them; this is gonna get really spicy when they release their new-gen DDR5 CPUs.
If history is any guide, AMD will fall by the wayside like every other time Intel came back and left them in the dust.
@@Baka_Oppai Don't forget the part that involved Intel playing dirty games to bring AMD down. You want them to fight over your money, but legally, not by paying OEMs to keep AMD out.
@@Baka_Oppai History isn't reliable when you're talking about entirely different people running things.
High temperatures aren't caused by a large amount of power in a small area so much as by a large amount of power in a small mass or weight. Density is important.
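The thermal-mass point can be made concrete with a back-of-the-envelope number: with no cooling at all, a die heats up at a rate dT/dt = P/(m·c). A quick sketch with assumed, illustrative values (the die mass is my guess, not Alder Lake's real figure):

```python
# Rough estimate of how fast an uncooled die would heat up: dT/dt = P / (m * c).
# Die mass below is an illustrative assumption, not a measured value.
P = 240.0    # watts dumped into the silicon (stress-test ballpark from the video)
m = 0.5      # grams of die mass (assumed)
c = 0.705    # J/(g*K), specific heat of silicon

rate = P / (m * c)  # kelvin per second with zero heat removed
print(f"~{rate:.0f} K/s")
```

Even if those numbers are off by 2x, the takeaway holds: a bare die has almost no thermal mass, so sustained power has to leave through the cooler essentially instantly, and contact quality dominates.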
I think having CTR configured is key for Ryzen, and it's something Intel doesn't bring to its CPUs, only to the K versions, and you have to do it manually. 1usmus could do it for AMD because the CPUs were capable of it in the first place. That's key value.
I never thought the i5 would carry the whole 12th Gen lineup.
The i5 10400f was carrying 10th gen tbh.
Never count an underdog out
Tbh the i5s carried the last 2 generations. 10400f and the 11400f
Which currently makes the LGA 1700 platform already look bad for upgrading down the road if you buy the i5 now, because the 12900K doesn't look very compelling.
Here when it's still unlisted...
Hm
Hi hi
bro
How?
nice
My question is: how does virtualization work with these cores? Will it take random cores, half and half, or only one type when I select the core amounts?
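For what it's worth, a hypervisor like KVM just sees plain logical CPUs, so placement is whatever affinity you (or e.g. libvirt's `cputune` pinning) set; by default the host scheduler hands the VM whichever cores it likes. On Linux with a recent kernel, hybrid Intel parts expose the split in sysfs, so you can read it yourself. A minimal sketch, assuming the `/sys/devices/cpu_core/cpus` and `/sys/devices/cpu_atom/cpus` paths present on recent kernels for hybrid x86; the cpulist parser is the portable part:

```python
def parse_cpulist(s):
    """Parse a kernel cpulist string like '0-7,16-19' into a set of CPU ids."""
    cpus = set()
    for part in s.split(","):
        if "-" in part:
            lo, hi = part.split("-")
            cpus.update(range(int(lo), int(hi) + 1))
        elif part:
            cpus.add(int(part))
    return cpus

def hybrid_core_sets():
    """Return (p_cores, e_cores) as sets of logical CPU ids.

    Empty sets on non-hybrid or non-Linux systems. Paths are an
    assumption based on the kernel's hybrid-x86 sysfs topology.
    """
    def read(path):
        try:
            with open(path) as f:
                return parse_cpulist(f.read().strip())
        except OSError:
            return set()
    return (read("/sys/devices/cpu_core/cpus"),
            read("/sys/devices/cpu_atom/cpus"))
```

With the P-core list in hand you could then launch a VM pinned to P-cores only, e.g. via `taskset -c <list> qemu-system-x86_64 ...`, rather than letting it land on a random mix.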
I still wonder what brand those blue and yellow 'controllers' are that they use for scrolling through the script.
Would be awesome to see some thermal results in hot climates, like a 30-35°C room temp, to see how much difference it makes between an air cooler and a water cooler.
Cheers from Brazil
Buy an AC
At that heat in Brazil I'd be worried about humidity damage. Water cooling would work better in that space, however.
Move to south Brazil
@@andreas5914 ah yes, just buy stuff, it's so easy to do down here, an AC definitely doesn't cost 3x the minimum wage, nope.
Buy a fire extinguisher.
Way to go intel!
Although I would be interested to see benchmarks on windows 10 with ddr4...
Agreed. Not everyone wants to jump to Windows 11 this early.
Does it support DDR4? I don't think any of the motherboard manufacturers have released DDR4 boards yet, at least.
@@stephen9894 DDR4 has been around for years. 5 is new.
This is only an early video; they're doing a comprehensive review later.
GamersNexus did their testing with Win10 and DDR5
Windows 10 doesn't support these chips properly, and the next AMD chips will support DDR5, so it's better to use the latest tech to compare.
10:20 Have you considered undervolting and underclocking to see the best power savings while keeping OK performance?
8:40, boy were you wrong, Linus!
Exciting! Will say I wish you'd tested with DDR4 too. The price puts this in a totally weird, whacko comparison no matter the performance, and God knows there are some serious benefits to it, but yeah. Totally screws with the price/perf.
DDR4 performance is the same as DDR5 at much lower cost.
It just isn't worth investing into DDR5 right now because when you next upgrade, current DDR5 kits will be real bad, same as what happened with initial DDR4 years ago.
@@Steamrick It isn't. Go check LTT's, or anyone else's, DDR4 vs DDR5 video. Tho I mean... I guess yeah, the DDR5 kits in the future will get faster.
@@Steamrick Yep, no gains in gaming in all the benches I've seen so far.
Search for the 12600K review from Optimum PC or something like that; he tested the i5 with DDR5 and DDR4. DDR4 was the same as DDR5.
Check HWUB's review of the 12900K. DDR5 helps in benchmarks and in select games
I can't wait to have this in 5 years when I move to windows 11.
Always wait a generation for improvements and bug fixes
And 13 is a bad number
So, i5-14xxx for me!
Yup
Exactly my friend.
@WILDENZ I have to use my machine for work so I'm just spooked that updating will break everything. If I had a dedicated gaming machine I'd already have updated to fuck around with it.
Exactly what I was thinking!
@4:00 wheirdness of tests, wheirdness of theeze all wheird cores all together... Linus, You are a wheird man... continue please.
Which camera do you use?
I would love to see the same chip be tested with DDR4 and 5 to see how big of a difference it makes.
Stolen comment
This is why I love being poor: I'll always be saving up to buy a PC, and as the years go by, better hardware becomes available.
Just wait till next year when Zen 4 and RDNA 3 are out, or 18 months till Zen 5 is out, haha.
Yeah, not like they really care, but rich people literally waste their money upgrading their PC every year lol.
LOL SAME
@@Dave-dh7rt I'm a patient man
That monitor alone be costing more than my entire gaming rig
Considering the 12th-gen CPUs have more cores and more total cache, it's kinda expected that the thing draws more power when all of those P and E cores are running at max.
One thing that is good about these CPUs: the E cores handle background tasks, like Steam, Discord, and other default Windows apps, leaving the P cores unrestricted by background work.
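You don't even have to trust the scheduler for that split: on Linux, `os.sched_setaffinity` restricts a process to a chosen CPU set, which is one way to park a background app on the E-cores yourself. A sketch (Linux only; the E-core ids 16-23 mentioned in the comment are an assumption for a 12900K under one common enumeration, so check sysfs on your own box):

```python
import os

def pin_to_cpus(pid, cpus):
    """Restrict a process to the given CPU ids and return its new affinity.

    Linux-only: os.sched_setaffinity is not available on Windows/macOS.
    """
    os.sched_setaffinity(pid, cpus)
    return os.sched_getaffinity(pid)

# Demo: pin this process (pid 0 = self) to CPU 0. For a real background
# app you'd pass its pid and your E-core set, e.g. (hypothetically)
# pin_to_cpus(discord_pid, set(range(16, 24))) on a 12900K.
print(pin_to_cpus(0, {0}))
```

On Windows the equivalent knob is per-process affinity (Task Manager or `SetProcessAffinityMask`), though Windows 11's Thread Director integration is exactly what's supposed to make doing this by hand unnecessary.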