CPUs in 2024: I Babble About Them for 15 Minutes
- Published 16 Jun 2024
- CPUs are my obsession. My love. My precious. Let me talk about lots of things about them that I normally only share with Fluffykins or my pillow.
This video was made on an N100 mini PC as a personal challenge
0:00 - Boring VS Interesting
1:01 - $100 PC
2:50 - The CPU lineup
3:27 - $100 CPU
3:50 - $135 CPU - All you need
4:55 - Productivity
6:00 - A Beautiful, Crowded Budget Market
7:17 - AMD Since Ryzen
9:58 - Intel Since Ryzen
11:17 - The Future - Games
This man could make a video about fertilizer price to performance in 2024 and I'd watch for 30 minutes.
Edit: I’m glad we’re all in agreement
i would love some youtube longs
i'd definitely watch that
that might be a tad bit too exciting, how about yesterday's weather?
So a literal shitpost
@@skaltura It was very grey and compellingly boring actually
I imagine that the mini pc has some sort of magical curse that makes it slightly unoptimal for everything from 8-bit snake to fully raytraced 8k gaming
It's probably Windows. I have a laptop with an Atom CPU and it's horrible to use with Win 10. I swapped it to Linux, running just a window manager instead of a full desktop, and it's a much better experience.
The Curse of Slight Inconvenience
I'm running a telescope rig with it. It's 12 V, super low power, and I can use RDP from the comfort of my warm desk.
Sadly I did need to upgrade to an NVMe drive in order to grab raw video from my planetary cameras.
They are an amazing alternative to raspberry pi for DIY stuff
Install Tiny 11 on it. A cut down Win 11 with no bs running in the background
That 12100f one cent off deal 💰💰💶💶
Oh well now i gotta buy it.
thats like, a whole cent. do you know just how much you can do with that? yeah me neither :(
bro on the screwegg grindset 💪💪
Back in my day you could go to the movies, get a steak dinner, get your car washed, and still have change left over.
i got it
Just wanted to say that I have become increasingly appreciative of your journalistic selection and demonstration of the "budget" range of technology as I have gotten older, especially now that I have to pay for things like rent. While so much of tech journalism seems to be focused on the next latest and greatest, it's important to highlight the mid range that is all most people will ever need or use. Your demonstrations, like the video editing on the $100 PC, are great at proving why these lower-end systems are important.
As a quick example, I accidentally broke my ~$1000 phone and picked up the Samsung A03 from Best Buy for $60, and I swear to god I am never buying a more expensive phone again. Because while the camera may be shit, there isn't any app it can't run smoothly, besides maybe super graphics-heavy games, which I don't see as necessary for my phone to play anyway.
@@1637tyyeah the samsung a series phones are bloody great.
Whenever I suggest someone get one of them instead of a high-end phone, they laugh at me.
Got my a50 on ebay for $100 aud and got an a42 on facebook marketplace for $150 aud after i broke the screen on my a50
@@1637ty if it wasn't for the ease of use of iOS, I would also go for a relatively cheap Android. Because you're right, aside from the poor cameras that are typical of cheaper Androids, the phones themselves are usually fine, and relatively fast, or at least as fast as I'd need. Most people who aren't stuck to their phones all day do not need a $1000 phone, at any point.
This PC costs half as much as 8 gigs of RAM in a MacBook. Pretty impressive tbh.
Oh wow, didn't think of it in that way.
What you were experiencing on the N100 was the fact that CPU utilization can go above 100%. Tools like HWiNFO show it correctly, while Task Manager does not.
i had a relatively new laptop, around $300, with a 2 GHz boost clock and an HDD. i used to play CS:GO on it and would get like 25 fps on Dust 2, so i would play the training map and try to beat the high score, since i got 40 to 56 fps in the training ground @@2kliksphilip
What's the point of using percentages above 100%? I guess it better illustrates the "boosting" behavior, that can only be sustained for a limited time?
i had a $300 laptop with a 2-core Intel Celeron processor with a 2 GHz boost clock. i got around 15 to 25 fps in CS:GO at 640x480 resolution @@2kliksphilip
@@tomallo99 It can happen for a few different reasons. One is that monitoring software reports over 100% usage because multiple cores are being used (whether physical cores or threads). Another is that a single-threaded application is fighting for CPU time, or that a supposedly single-threaded application is actually utilizing multiple threads, increasing the throughput.
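For what it's worth, the two readings being argued about in this thread usually come down to two different normalizations of the same busy-time counters. A toy sketch (all numbers made up) of how a per-process figure can read over 100% while the system-wide graph stays capped:

```python
def process_cpu_percent(thread_busy_seconds, wall_seconds):
    # Per-process style (like Linux top): busy time summed across all of
    # the process's threads, divided by wall time. On a multicore chip
    # this can exceed 100%.
    return 100.0 * sum(thread_busy_seconds) / wall_seconds

def system_cpu_percent(thread_busy_seconds, wall_seconds, n_cores):
    # System-wide style (like Task Manager's overall graph): the same
    # busy time normalized by core count, so it tops out at 100%.
    return process_cpu_percent(thread_busy_seconds, wall_seconds) / n_cores

# Four threads each busy for 0.9 s during a 1 s window on a 4-core CPU:
busy = [0.9, 0.9, 0.9, 0.9]
print(process_cpu_percent(busy, 1.0))    # ~360, the "over 100%" reading
print(system_cpu_percent(busy, 1.0, 4))  # ~90, the capped reading
```

Which one a given tool shows is a presentation choice, not a disagreement about what the hardware did.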
Short answer:
CPU utilization is incapable of going over 100%. Operating systems and HWiNFO are only measuring the time CPU threads spend dedicated to tasks (and to drivers plus the operating system itself).
The additional performance counters just let you see how well your program utilizes cache and how often the CPU is able to guess the result of branching execution paths.
Longer answer:
Yes, a CPU has more than one ALU. CPUs execute multiple instructions in one clock cycle.
And the code you run might not be up to the task.
So yes, it is possible for certain tasks to make the CPU crunch harder.
Your code can also make the CPU hit branch misses less often (certain types of branching have lesser penalties, and you can hint to the compiler which code paths are much more likely to be taken).
Even longer answer:
The N100 can only use single-channel DDR4-3200 or DDR5-4800 memory kits.
On top of that, the N100 has a video decoding block.
Demanding tasks such as video editing let you divide the data being crunched into smaller blocks.
This means the data can fit into the CPU's L1/L2/L3 (color correction, say, can be done in blocks of 64x64, which fits into a 16 KB L1 cache).
Multiple effects can then be applied at a 64x64 px scale, reducing the number of transfers from memory to cache when processing data (yes, even the N100 is going to be bottlenecked by slow memory).
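The blocking idea described above can be sketched in plain Python (a toy illustration, not real video-editing code): processing a frame tile by tile produces the same result as processing it whole, but each tile's working set stays small enough to live in cache:

```python
def brighten_whole(frame, amount):
    # One pass over the entire frame (large working set).
    return [[min(255, px + amount) for px in row] for row in frame]

def brighten_tiled(frame, amount, tile=64):
    # Same operation in tile x tile blocks; a 64x64 block of 4-byte
    # pixels is 16 KB, small enough to sit in an L1 cache.
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            for y in range(ty, min(ty + tile, h)):
                for x in range(tx, min(tx + tile, w)):
                    out[y][x] = min(255, frame[y][x] + amount)
    return out

# A synthetic 128x128 "frame" of 0-255 pixel values:
frame = [[(x * y) % 256 for x in range(128)] for y in range(128)]
assert brighten_tiled(frame, 30) == brighten_whole(frame, 30)
```

In a real editor the win comes from applying several effects to each tile while it is still cached, instead of streaming the whole frame from memory once per effect.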
11:14 "And right now is a h...very exciting time for me"
Could've sworn there was a "hell of a" coming
That's because it's AI generated 🤫
Entire video is a Userbenchmark burn and I love it
not really, it also takes shots at the people sucked into the whole AMD fanboying, who changed their argument as AMD's CPUs got faster.
all fanboys are clowns.
@@chasethefeel true that.
@@chasethefeel imo, nobody should be a fanboy, they should be a realistic fan. You might like AMD or Intel and there is nothing wrong with it. Just don't bash the other brand if they are doing well.
Whats wrong with Userbenchmark?
@@robiulhaque70 agree, everyone can have favorites. I personally don't care, but one bit I see way too much is AMD fanboys saying Intel fanboys buy because of the brand!
But if you were after cheap mid-range CPUs, Intel has that market now, not AMD,
and that was the whole reason Ryzen was valued in the first place.
tldr buy what u want, no one should care, just don't be hypocritical.
you will babble, i will listen
"Babababaabab ybiffrwf dadadadaddaaa"
Great words from 2Kliksphilip that we'll listen to.
;*
CPUs have improved so much in such a small amount of relative time. From R7 2700X, to R9 5950x, to i7 12700K, to R7 7800x3D, there was always a noticeable improvement in game perf. And from 2700X to 5950x was night and day in multithreading heavy tasks.
In all that time though, I only ever upgraded the gfx card once. Went from GTX 1080 Ti 11gb to an RTX 4080s 16gb just last month.
Remember that era of i5-2000 to i5-7000 series when the processors didn't really change at all? AMD threw a hell of a wrench into the system, from affordable 8 cores with the Ryzen 1700 to consumer-grade 16 cores with the Ryzen 3950X. I can't wait for the 8000 or 9000 series of CPUs, I'm hoping for a performance uplift good enough to justify moving to DDR5.
@@2kliksphilip I agree, there should always be some sort of fight between tech companies to make better and better chips and technological advancements, because if there isn't one, then the rate of advance is going to slow down
Yeah, compare 2011-2018 CPUs and it's like the same performance. Now from 2018 to 2024, the HEDT CPU from 2018 is slower than a budget 2024 CPU
@@2kliksphilipThere was some innovation, just basically none of it went to the mainstream/performance desktop market, it was all about power efficiency for mobile, and core count for server. For example, xeons went from 8 cores per socket to 22 cores in 4 years (2012-2016), at basically the same tdp rating. Intel just decided to do nothing in the mainstream market because amd was licking windows.
@@lookitsrain9552 And because of that mid-late 2010s HEDT Intel stuff still is totally serviceable, just at a higher TDP than modern stuff and less features. With the way things are going, it may not last long like that, but still cool. Because that's where all their innovation went until Ryzen came around, especially Zen 2.
I'm really a fan of these CPU and GPU videos Philip makes, because he does them from a normal consumer's perspective, instead of just wanking on the latest and greatest, most expensive things like many dedicated tech channels. Actually, one of these videos was the reason I switched to AMD in 2018 and I'm really happy with that choice still.
Your passion and enthusiasm for CPUs in the current market is so endearing.
Something you didn't mention that is becoming increasingly important in MT workloads is figuring out where the application stops really scaling with more cores and which cores you should use at that core count. The main thing I do on my personal machines is software development and with the languages I use now I only really ever use 14c/28t at once.
At that thread count my two options are really the 14700k or 7950x, and while the AMD might have lower performance when all cores are saturated I'm not going to be saturating them. In a 14c MT test those full Zen 4 cores are going to outpace those E cores a lot, making AMD a better option.
For a lot of professionals I think the advice of "get whatever has the better MT score" is quickly becoming harder to agree with. Software usually doesn't scale linearly, and I really wish tech media would mention it more.
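The scaling wall this commenter describes is basically Amdahl's law; a quick sketch (the 95% parallel fraction is a made-up illustrative number) shows why cores past a certain count stop paying off:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    # Amdahl's law: the serial fraction of the work caps total speedup
    # no matter how many cores you add.
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A workload that is 95% parallelizable (illustrative number):
for n in (8, 14, 16, 24, 32):
    print(n, round(amdahl_speedup(0.95, n), 2))
# Going from 14 to 32 cores only moves the speedup from about 8.5x to
# 12.5x, even though an all-core benchmark would suggest far more.
```

So for software that stops scaling around 14 threads, the per-core speed of those 14 threads matters more than the chip's all-core score, which is exactly the commenter's point.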
straight to the point, valuable perspective. always love when you upload
just as i was about to watch this video i remembered your klik empire video, and i just wanted to say that i really enjoy your CPU videos and have for as long as i've been watching them (including bygone-era advice, cos they are fun). i hope that you enjoyed making this video, because i would love for you to continue making more like them. you are one of the big creators i watch where i am actually excited to klik and see when you have a new video. have a good day chap. -your fan
Thought the quality of this video was a little bit off (and no 4K) until you said it was made on the mini PC, and then it all made sense
I still feel like 4K (along with 5G) is one of those technologies simply made to waste everyone's money.
When I start hearing Digadig... I already know that this is going to be one of the greatest 15 minutes of babbling.
I have already watched this video three times over... Your computer part ramblings are so edutaining!
I love that you have this fascination for analyzing stuff nobody else wants to and letting us see your outcome
I absolutely love these in detail hardware overviews. Very enjoyable and informative.
I love these sorts of videos from you!
And I totally agree, it's crazy how awesome the CPU market is right now for the consumer :D
very interesting indeed, helped me reconsider a cpu upgrade atm. keep yapping 2kliks!
love that you share your hype for the modern battle of the CPU market. I am equally thrilled about the future of CPUs
I feel like I can almost predict the vibe of the next segment in the video, positive, negative, intriguing or thrilling, simply based on the queued track that's been playing alongside the videos over the years (including recent tracks). Amazing how a familiar soundtrack can do that.
I LOVE YOUR VIDEOS! Script at 03:00 is HILARIOUS :D
Another Banger Video. Thank you!
It's always a Helluvalot of a time. Thanks for the breakdown Phillip
GPUs are like houses now.
can always listen to you talk about tech
Thunderbolt 5.0 support in CPUs is coming out too, which makes external GPUs very viable now for many tasks that are decently optimized. Most tasks avoid GPU-to-CPU back-and-forth communication anyway because of the latency.
Most tasks are a throughput problem,
and Thunderbolt 5.0 has reached very viable throughput imo.
It very much shrinks the gap between laptops and desktops even more.
yeah, shrink the gap as long as you carry around a massive eGPU enclosure with you that takes up more space than the laptop itself
3:40 was actually hilarious. glad I wasn't looking away for that
When FX CPUs sucked down 120 W it was like the worst thing that ever happened in CPU history, and now Intel's 400 W consumption is a selling point
This really is like Intel's Bulldozer, but if Bulldozer was actually competitive. What you got from Bulldozer back then was:
- technically more cores (treating threads as cores iirc)
- higher clockspeed
- higher power draw and more heat
Now with intel you get:
- technically more cores (through low power efficiency cores)
- higher clockspeed
- higher power draw and more heat
Like seriously the parallels are uncanny when you think about it.
the highlighting at 3:45 made me laugh out loud.
I used to be an NVIDIA and Intel "fanboy" for many years. Last year though I got myself a new gaming PC with the Ryzen 7 5800X3D and the 7900 XT and I absolutely love it.
i tend to make it a point to watch all new vids uploaded across your channels. i am halfway through this trying so hard to gaslight myself into thinking i care enough to finish it. i can do this
My pfSense router is hosted on an SFF PC I bought from a local government for $70. Still doing its thing five years later. It's amazing the value you can get on the used market.
I have to say Phillip no matter what the video topic is across your multiple channels I always find your videos very calming and they often help me sleep. I thank you
Ever since I built my new PC I stopped looking at PC hardware for a while, but every once in a while I love to watch your videos talking about this stuff
Don't know why, but I always enjoy hearing you just chatter about hardware. Even if it isn't the most information rich, sometimes I do learn something new and see a different perspective
I love Philip's look at the tech industry and his insights.
0:46 Anyone else love that hellleeelll of a lot Phillip does?
As someone who's not that tech-savvy, and is thinking of building a new PC, this video has helped me catch up on a lot of things, especially for CPUs. This video was a godsend for me. Thanks!
fantastic babble. dont get much of this kinda content from other pc hardware youtubers so its great to see
This video is so fire thanks Phil
12:34 This is one of Intel's greatest shortcomings in their marketing.
E-cores aren't power-efficient. They are space-efficient, in the sense that Intel can fit 4 E-cores in the space of 1 P-core on the CPU die.
I haven't found any benchmarks that substantiate the "power efficiency" claims. On the contrary, the P-cores deliver more performance per watt compared to the E-cores.
Thank you! It's so nice to see people that get it, so to speak. The only power efficient part of the E-core is that their lowest usage is lower than a P-core. So if you have like a really light task you can theoretically set your E-cores at 400 MHz and they will sip power. Especially nice if you can tune the voltage for them too. But it's a rather niche case.
Depends on the task. For single-threaded tasks, of course the P-cores are power efficient, but for multicore they are terrible.
Without the E-cores, I would be stuck with the i9-10900K's 100-watt idle power draw instead of 5 watts on my 12700K, making the 10900K useless for a Digital Audio Workstation.
@saricubra2867 there's no way a 10900K has 100 W idle power draw
Like the final phrase... like adding a 20-degree angle upwards... :)
I love it when someone just shares their passion for computer tech. I started my PC with a 6700K from Intel, which was great at the time, then I held onto it for probably four years before upgrading to the 5900X from AMD; MASSIVE leap in performance for rendering at a relatively fair price. Now the market is so much more interesting for finding hardware that suits your needs, ranging from brute-force power to efficiency and punch
thank you Philip
That pause at 3:05 made me think my internet was playing up. 🤣🤣
im always more excited for your videos than hardware developments tbh
Thank you for mentioning the 3100! I got the 3100 for 80$ back in mid-2021 paired with a 1650 and let me tell you that cheap CPU was more capable than I gave it credit for. I upgraded to a 5600 I got for 150$ and the difference was noticeable immediately yet I was still deep down thinking how incredible it was to get such a competent CPU for less than 100$ back in the market then and it ran so cool with the stock Wraith cooler never failing to reach max boost clocks. I sold that PC with the 3100+1650 to a friend with a steep discount and she works and games with it, it meets all her needs and that PC will fulfill its duties for years to come without a hitch.
also a big plus for the 5xxx series is that you can use some older motherboards, such as the MSI B350 that I used for my 2600, by just updating the BIOS.
i hope they get even more competitive about it so we as consumers can have more of the benefits
Pleasantly surprised to see you don't have any affiliate links in your description at all, not that it matters but most people making these sorts of videos just make them as an excuse to pad their descriptions with a bunch of links
My man just called the ryzen 7 7800x3d "great midrange" 💀 its literally the best processor for gaming rn
Gaming is a mid-range CPU task, the high end involves rendering 3d scenes and videos and doing AI shit.
All of those are GPU tasks XD @@Jetsetlemming
@@Jetsetlemming Nah, I almost literally burned my i7-12700K with a Digital Audio Workstation.
this is the video ive been waiting for
I've never seen someone more excited for e-cores!
I bought a 5800X3D a few months ago. Despite the 7000 series / 13th and 14th gen being out, I still went Zen 3 since I still had my old B550 motherboard, and it's been great! I don't feel like I'm missing out on much with the faster and newer gens, but after building my friend a 13900K system I can tell just how beastly the new Intel CPUs are! (my last Intel product being the i7 7700K)
This market is perfect timing as well since I'm looking to upgrade my old gaming server, and the Ryzen APUs and even the i3's of this gen are pretty capable in terms of gaming performance... Now if only NVIDIA and AMD would come down from their high horse of pricing in the GPU market... Great video as always Philip!
I upgraded from a Haswell i7 to the Alder Lake 12700K. Extremely overkill CPU, totally worth it, fast shader compilation and THE Intel CPU for streaming.
Honestly this video really makes me hope that the next generation of Intel Arc ends up being their Ryzen moment, because the current GPU market absolutely sucks (as you mentioned), and with the recent AI boom, Intel has a lot of motivation to get their GPU division up to snuff.
Built a dirt-cheap Ryzen 5700G-based Linux rig last year, mostly from used components, and couldn't be happier with it as my main system.
It effortlessly lets me play Alien Isolation with just the integrated graphics and the bog-standard amdgpu driver that comes with openSUSE Leap 15.5.
It compiles my custom kernels in well under 3 minutes and gets barely over 60°C while doing it (using nothing but the standard cooler that came with the 5700G).
5:56 "Do Kliksphilips Dream of Processors?"
Love the title haha
love the video, I pretty much enjoyed my meal
Looking at those gpu charts i remember playing at 800x600 😊
That ending came out of nowhere 😂
I'm a simple man, Philip uploads, I like before I've even started the video.
i've been using the 12100f for about a year now in conjunction with a 6700 XT and can wholeheartedly recommend both. i haven't had any trouble at all running games, unless it's late-game HOI4 at max speed, but even that's perfectly playable
I still love my 5800X3D. That extra cache really makes sense when you try it in games. It has a huge effect on 1% lows, which has eliminated microstutters entirely.
When Intel starts making affordable high-performance chips with a similar amount of cache, I will consider them an option.
I have a 12700K. I bought it for the IPC, not the cache, and paired it with DDR5 as well; it demolishes your 5800X3D on PS3 emulation.
I'd usually go for the cheapest 6-core, but since there's now a not-insubstantial number of games that love insane CPUs, I spend almost as much on them as the GPU and don't really have much of a choice about it
Good video. Just wanted to make a small correction - Intel's SOC tile does not have any GPU cores on it.
2:04 that's because the "easier tasks" you expected are actually harder on the CPU than you thought, and since the CPU never wastes time sitting below 100% utilization in highly demanding tasks, it exceeds your expectations. Easy.
I was hoping you would also touch a little bit on the impacts of the apple M chip series.
Seems like those are what people were hyped about a few years ago since they were super power efficient and powerful.
i love "philip babbles about pc hardware" i keep watching the old AMD RX500/400 series videos
the one thing that has gone crazy recently is the prices of motherboards, and their features being limited to super expensive boards. you used to get onboard power/reset and 9 panel readouts; now you need to spend a fortune to get basic troubleshooting equipment
12:28 The E-cores are the latest iteration of the Intel Atom line, so the N100 is just an Atom chip with newer branding
Hello Phillip, are you interested in revisiting VR gaming as of right now? Would be really interesting to hear your take on it. The Quest 3 came out recently; it may be the wireless VR headset we were waiting for, but is it worth it? Anyway, thanks for the great work, love watching your content.
And this is why competition is good
With hyperthreading going away as well, it's gonna be interesting
The N100 is a really cool CPU, it's cheap and extremely power efficient, but the big thing for me is the fact that it has hardware video acceleration (QuickSync). This might be what is helping power through modern games easier than old ones, though I'm not sure. What I can say is that it absolutely does help immensely for video encoding/decoding, and explains why you found email browsing slower than video editing. I set up one of these mini PC's with an N100 as a media server, and to my complete surprise it can do 6+ 4K transcodes at once, I didn't even bother testing any more. Using software decoding it would be struggling with just 1. And plus, this was with extremely modern formats such as 10-bit HEVC, VP9, and AV1. Having this on a little efficient mini PC and an AMD 3D cache CPU on my gaming PC is a really awesome setup, and I didn't even have to spend that much. We've come a long way haha
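For anyone wanting to try the media-server setup described above, the kind of invocation that exercises the QuickSync block looks roughly like the helper below. The `-hwaccel qsv` flag and `h264_qsv`/`hevc_qsv` encoder names come from ffmpeg's Intel QSV support, but exact availability depends on your ffmpeg build, so treat this as a sketch rather than a guaranteed recipe:

```python
def qsv_transcode_cmd(src, dst, target_codec="h264_qsv"):
    # Assemble an ffmpeg command line that decodes and encodes on the
    # QuickSync hardware block instead of the CPU cores.
    return [
        "ffmpeg",
        "-hwaccel", "qsv",     # hardware-accelerated decode
        "-i", src,
        "-c:v", target_codec,  # hardware encoder (e.g. h264_qsv, hevc_qsv)
        "-c:a", "copy",        # pass the audio through untouched
        dst,
    ]

cmd = qsv_transcode_cmd("movie.mkv", "movie.mp4")
print(" ".join(cmd))
```

Media servers like Plex or Jellyfin do essentially this behind the scenes when hardware transcoding is enabled, which is why the N100 can juggle several 4K streams that would crush it in software.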
those n100 mini pcs run linux really well, its night and day compared to windows 11 in my testing
Windows 11 is just unoptimised, or rather, the UI is mostly web app trash
Running the Ryzen 5600 with a 2070 Super. The CPU is a beast. I upgraded from an i7 4770K, which was also a beast
I am not a semiconductor engineer, but this is my understanding: Chips made of a bunch of chiplets connected together are necessary because the new process nodes using EUV lithography have a very limited area of wafer in focus at one time, only centimeters wide, and it took some heroics to get it that large. Everybody will eventually be using them for performance desktop chips, whether they connect them by multiple exposures using larger interconnects, or mechanically.
You can get better yields per wafer with chiplets compared to a monolithic design, so it's cheaper to produce more working chips, but that saving gets offset by the interconnect. It also helps scale a product, like AMD is doing with Epyc and Threadripper and planning to do with GPUs.
Everyone will definitely switch to a chiplet design, especially as we get closer to 1nm. As nodes get smaller it takes a lot more time to increase yield rate. For example, if a monolithic chip has a defective IO block, that chip is basically scrap; with a chiplet process you can just make it work, since the IO is an independent part, same thing with cores.
I think yield was the reasoning ten or twenty years ago, but now it's just not feasible to do a single photomask for a performance-CPU-sized chip. @@xfy123
The ocular limit will be reduced from like 850mm^2 to 430mm^2 (only for Intel, the other chipmakers don't plan to buy those new tools yet). Currently on the consumer GPU front, even something as powerful as the RTX 4080 is only around 380mm^2, which means chiplets won't be necessary in that segment. On the server/data center market, Intel has recently released their newest Xeons with up to 64 cores on 2x750mm^2 chiplets, which is already impossible to make on a single die, and it's forced to compete with AMD's up to 96 Zen4, 96 Zen4 X3D, and 128 Zen4c core Epyc products.
The way it was explained to me, yields used to be a justification, but today the EUV process is just fundamentally constrained. To do a big chip you need to precisely optically align wide interconnects for chiplets across a bunch of photomasks, or mechanically connect them. @@xfy123
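The yield side of this thread can be put into numbers with the textbook Poisson yield model, Y = e^(-A*D), where A is die area and D is defect density. The defect density below is an illustrative made-up figure, not real fab data:

```python
import math

def poisson_yield(area_cm2, defects_per_cm2):
    # Classic Poisson yield model: the chance a die of this size
    # contains zero killer defects.
    return math.exp(-area_cm2 * defects_per_cm2)

D = 0.2  # defects per cm^2 (illustrative, made-up defect density)

mono_yield = poisson_yield(6.0, D)     # one 600 mm^2 monolithic die
chiplet_yield = poisson_yield(1.5, D)  # one 150 mm^2 chiplet

# A defect in the monolithic die scraps all 600 mm^2; a defect in a
# chiplet scraps only 150 mm^2, and good chiplets from anywhere on the
# wafer can be mixed and matched, so far more of the wafer is sellable.
print(f"monolithic dies good: {mono_yield:.0%}")  # ~30%
print(f"chiplets good:        {chiplet_yield:.0%}")  # ~74%
```

This is why, as the thread notes, yield alone favored chiplets even before the reticle/exposure limits of EUV made huge monolithic dies impractical.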
No matter how powerful CPUs get, my software bosses will keep saying "don't spend any time on performance, hardware is just getting better anyway", and software will keep getting slower and worse
I will be excited about GPUs again when we can get 4090 performance for ~800$. Besides that, I think APUs are pretty exciting right now. I'm looking forward to the next generation of handhelds.
The problem people have doesn't seem to be with the e cores, it's with the lack of option for avoiding them on the high end or replacing them with 2 more P cores lol. I bought a last-gen i5 for my high-perf openwrt build to avoid them, since it's a pain when dealing with low-latency applications and virtualization.
Epic ending 😁
I ended up using an R5 4600G for a buddy of mine. He's doing mostly office stuff, and the APU is overkill on both the iGPU side and the 6-core side. Funny enough, I would have gone for the R3 4300G, but the price difference to the 4600G was just 14 USD. Either the 4600G was heavily discounted, or the 4300G was badly priced.
Regardless, I felt no remorse spending the extra to get the 4600G, even though the 4 cores part would have been absolutely fine.
my autism liked this video. can you make a video with graphs and stats about
CPU and GPU price-to-performance improvement over the years or releases? I wonder which of them has developed more and gotten more efficient.
The next time I build a gaming PC, which I hope will not be for a long time, I will pick whichever CPU has the most cache (for gaming performance)
I bought a ryzen 5500 for around 100 euros, no complaints.
Does everything i ask it to do
I bought into Sapphire Rapids last year expecting the new CPUs coming out this year would be based on the 13th gen core architecture. But no, it's just a refresh with a slight core count bump and some extra cache on some of the CPUs.
No extra single-core speed, but I'm probably gonna upgrade to a higher core count model anyway. It's gonna be cheaper than switching platform to Threadripper 7000, and supposedly Emerald Rapids will bring support for Intel Optane PMem. Maybe I can have some fun with that.
8:57 true for me. I'm using Intel's i5-11600KF and am now waiting for the 7800X3D to arrive, since it's on AM5 and I know I've got at least the 8000 series to upgrade to in the future if I wanted to. If I stuck with Intel and went 14th gen, I'd have no upgrade path. The Newegg 7800X3D combo (with motherboard and 32 GB 6400 CL32 DDR5 RAM) ran me $566 after tax.
I guess with big cases you're paying for "potential" performance, in the power supply and PC case too, and they're really thermally forgiving, whereas small PCs are not. One major thing that isn't mentioned: with HBM VRAM, dedicated GPUs like in notebooks would be possible in small PCs, but I really do wish we had standardized small form factors like we have in big desktops.
I dunno man, the release of OG Ryzen was the true golden age of processors. It forced Intel to stop being as dogshit as they had been. Maybe even the release of dual-core processors, true dual core. That also was an incredible time to buy!
Surprised you didn't highlight the 7800x3d.
Being the absolutely fastest processor for gaming while pulling under 100 watts is unbelievable.
Hey @2kliksphilip what are your thoughts on the new Snapdragon X Elite ARM APU? Where it might sit in the market, and what the future of gaming in ARM might hold. Would love to hear your thoughts. Maybe even a video review when it comes out.
I'm holding off my thoughts until benchmarks
Cool video! It's really nice watching content that's positive, rather than the constant stream of negativity that goes around. Not to say that if you were negative I'd be horrified and unsub forever feeling betrayed, just my mood right now.
I just bought a 3070. But now i need a new cpu to help it out. This video came out just in time for me
I've been researching for a possible PC build for around 500 USD and honestly I was kinda shocked at how cheap (relatively speaking) a 5600 is, considering it comes with a good-enough cooler and AM4 motherboards are aplenty. Though well-priced used GPUs have been a bit of a tough find in my country, I'm pretty excited nonetheless.
Nice video.
Ryzen 5 5600 here.
I paired it with a MSI B450M PRO-VDH MAX, some random, used 16 GB 3200 RAM and a used GTX 1660 Super.
The CPU really is a blast, so glad I went with AMD here.
Runs CS2 with medium settings and 1080p at about 180-240 FPS.
For less than 500€ you can get a still very competitive gaming PC with this amazing CPU platform.