Intel's Pushing It Even Further
- Added 15 May 2024
- ► Check out today's hottest tech deals here: www.ufd.deals/
howl.me/cl4NZpfqrah
howl.me/cl4NXRTgrDP
geni.us/RjzeGZ3
0:00 - Intro
00:17 - Linux Driver Dude At Nvidia: tinyurl.com/29loe6n7
tinyurl.com/244k5zto
01:18 - Elgato Neo: tinyurl.com/2bknj8wn
02:34 - UFD Deals: www.ufd.deals/
howl.me/cl4NZpfqrah
howl.me/cl4NXRTgrDP
geni.us/RjzeGZ3
03:35 - Lenovo With Snapdragon X Elite: tinyurl.com/2bjpd4o8
04:04 - Intel’s 500W CPU: tinyurl.com/2xkdteyr
tinyurl.com/25bvat3f
tinyurl.com/2abyvkgf
07:10 - Comment Response
► Follow me on Twitch - / ufdisciple
► Join Our Discord: / discord
► Support Us on Floatplane: www.floatplane.com/channel/uf...
► Support Us on Patreon: / ufdtech
► Twitter - / ufdtech
► Facebook - / ufdtech
► Instagram - / ufd_tech
► Reddit - / ufdtech
Presenter: Brett Sticklemonster
Videographer: Brett Sticklemonster
Editor: Rikus Strauss
Thumbnail Designer: Reece Hill
Science & Technology
Is your Twitch/StreamElements down? Both just stopped working. I get: "Sorry. Unless you've got a time machine, that content is unavailable."
Well, the good news is the computer alone will keep you warm without a space heater. So you are technically saving money because the space heater and computer don't have to both be running during winter!!!!
finally, CPU-powered kitchen stoves incoming
can mine and cook ur food
@@Jo21 @kokodin5895 Right? They can pipe heat whole cities, other regions and melt the ice caps thirty times faster, suck dry all the output of a nearby nuclear fusion reactor, AND COOK YOUR FOOD! Pfft, who needs the sun? Intel solved! Sun, you're cancelled!! 😂
One day the legendary oven GPU will come out, that's for sure. Gaming till you forget your food in it.
@@aykyi2668 better, the oven was powered by cpu's to bake that one gpu
You know what else consumes 500 watts? The freaking space heater in my garage 😤
now you can set an AI rig for the same power that actually does something besides heating the air🤑
The 14900KS already pulls 500 watts in Blender, and that has a 150 W TDP 🤣🤣
Data centers using Intel chips should open a thermal power plant next door to reuse the heat generated because it creates heat like a coal-burning furnace😂😂😂
Skynet is gonna love that cpu
500W on a CPU is crazy even for a Xeon...
It's a 1500 mm² multi-chiplet CPU, so it totally makes sense for its size; in comparison the 4090 only uses 610 mm² of silicon.
Not really
Why not ?
It may have up to 288 cores @kashyapchonekar5437
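For anyone checking the die-size argument above, a quick sketch: taking the figures quoted in this thread (500 W over roughly 1500 mm² of chiplets, versus the 4090's ~610 mm² die at its published 450 W TGP), the big Xeon's power density actually comes out lower than the GPU's.

```python
# Power-density comparison using the numbers quoted in the thread above.
xeon_w, xeon_mm2 = 500, 1500   # rumored package power / total chiplet area
gpu_w, gpu_mm2 = 450, 610      # RTX 4090 TGP / approx. AD102 die area

print(round(xeon_w / xeon_mm2, 2))  # ~0.33 W/mm²
print(round(gpu_w / gpu_mm2, 2))    # ~0.74 W/mm²
```

So per square millimeter of silicon, the 500 W figure is less extreme than it sounds.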
That's what I needed: a CPU capable of using 99% of my electrical capacity. Who needs household equipment?
The cut at 6:29 was made a lot cleaner with the same arm movement 😂
14900ks can be 400W already LOL
Intel pushing for more watts on their Xeon CPUs might make them less appealing compared to AMD's Threadripper CPUs, because these chips are meant to stay on for a crazy number of hours, which affects the power bill for companies. Lol
500W is not that much if we are talking about so many cores...
All we need is micro fusion reactors at home, heh. I have a small fan heater 900W use only when it gets extremely cold in the winter. But I think my 7800x3d and 7800xt at max take 300-400 W.
2watts per core
Actually sounds amazing, really.
@myne00 you simply miscalculated something. It has 128 cores, and 128×2W is only 256W, not 500W..
I don't think the other ones will use the 500W that's available. It's 288 cores on the one that will be 500W @TamasKiss-yk4st
at that point, surely the tjmax for those chips has to be like 120 and they're designed to run comfortably at 110-115
I'd honestly like them to stop the power creep for a bit and try to optimize to get the same calculation power for less actual power..
lol I missed Kyler's Reece transitions lol.
How long until I gotta run a 220v to my computer?
I'm gonna have to make a choice between plugging in my computer, my oven, or my dryer.
Really like your un-ironed shirt. I like this trend because I follow it.
Can't wait to call an electrician to install my new CPU
I mean, we don't necessarily need budget GPUs for the foreseeable future if they make the APUs better value. We need more than 12 CUs max. There's a big gap between that and a 7600 XT
The Nvidia stuff comes after I just sold my modded Switch Lite. I had Linux running but there is a nasty bug in the Nvidia code where the frame pacing is terrible and has tearing when the screen is rotated. The Switch screen is natively a portrait screen, so always rotated. 😢. Maybe it gets fixed?
Look at it this way, i could replace the heaters in my house with that. My laptop rtx 3060 is like 80w max and does a good job heating my room lol
the thing is that AMD Zen 4c cores are just smaller Zen 4 cores
they have the same features and everything, they're just smaller
but the new Intel CPUs are only efficiency cores
and those are a different architecture than the performance cores, as far as I know
Excellent set of news!
I'm not even mad. 288 cores at 500w is only 1.5-1.7 watts /per core/ (if you take the memory controllers and PCIe controllers out, maybe even less). If they have them doing useful work that's amazing.
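The per-core math being argued in this thread works out like this (a quick sketch using the 288-core, 500 W figures from the video; the uncore/IO share is not subtracted, so the real per-core number would be a bit lower, as the comment above notes):

```python
# Per-core power for the rumored 288-core, 500 W part (ignores uncore/IO share)
package_w = 500
cores = 288
per_core_w = package_w / cores
print(round(per_core_w, 2))  # ~1.74 W per core
```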
YouTube BAN these bots
They WON'T! They'd rather shadow-ban comments and hide legit comments than protect us from these bots and imposters.
@@OctagonalSquare lol YouTube hid my comment, what a surprise... I state what's wrong with the platform and they hide my comment.
@@RAM_845 they do this to a lot of people and are seemingly oblivious. the people who run this platform are beyond incompetent.
@@MarkMichon7 I think they're politically biased I can see my comment that I posted on here in my comment history via my channel history.
lol
"suckling a lot of juice"
I spit my coffee out. Came for the info laughed away by the commentary.
Pretty soon we will need our own power plant to power our pcs
I have Xeon w7 as daily driver. After light OC it's 620 W CPU... It's crazy, but they are so much easier to cool than desktop ones, since the silicon is size of a tic tac box, instead of your pinky's fingernail.
bro I swear in the future even a 1000W power supply isn’t gonna be enough 😂 ima need a whole ass power plant 😭
I wanna hear UserBenchmark with that one 😂
I don’t want bigger cpu, I want multi cpu for consumers, let me put in 2 £100 cpu and 2 £200 gpu in one motherboard, heck, let me connect 2 pc together and double the processing speed
SLI is dead
They won't bring it back
Funny back slap: Epic makes iOS game engines $1 per install unless it is downloaded in the Epic store ...
14900KS already pulls 500+ this thing is going to be more like 800+
Yeah, Intel, why ARE you doin that?!
The 14900KS already pulls 500w's this is going to be more like 800w
Yeah, I don't think Nvidia is too happy that George Hotz found an exploit in the Linux driver to enable P2P on GeForce GPUs, a feature only available on the more expensive workstation/AI GPUs.
separate breakers for GPU AND CPU
nice
imma ask my power company to just gimme that industrial 20kv, pipe it straight into my pc dood
Intel bring the power! Boom boom fire power 💥
I am scared to think what a 2000 W CPU would do, especially given Intel's history of inadequate cooling. I think it would probably turn into an atomic bomb.
Dude!
We gotta help pump these view numbers up!
500w.... my 14900K couldn't be powered up by a 450w PSU stock! 🤣🤣
DUDE!
Brett, higher end PCs aren't a necessity. If someone wants a budget 100 dollar chip, why can't that person just save up a little longer to build a higher performing pc that would last them longer.
Then they'll release the Core Ultra 9 290KS, 355W max turbo power, lol.
Even Further BEYOOONDDD
Lets Do This. You Got Bud In Bottles?
Oh hi Marc
loving the linux content!
Hell yeah dude
Rare brew tea when?
CPU the size of a motherboard inbound.
garlic bread
With spaghetti 🗣️💯
Bread makes you fat.
Insane, it consumes as much as my 4090. The only people using those CPUs will be corporations that do 3D modeling and insane stuff; no regular person would ever need or even want those CPUs, unless they're Linus and want one just to make a video on it
Energy is cheap when you make your own. Server companies can figure out their own thing: you just get solar, or build near a power plant and get a good deal. Electricity will always be cheap, even if it isn't to you. You can get disconnected from being able to make your own power and get scammed, but oil prices have what to do with an oil well? It takes the same amount of effort to pull it up. Solar panels sit in a field; you only need enough panels. A power plant's costs come from government.
It doesn't really matter how much energy a part pulls unless you live somewhere where you are not free to use energy except at high cost.
You may think that that is everywhere, but that just means you have not been free, so you should have real things to think about instead of what program you're going to play with.
It's funny. All the analytics show Nvidia dominating the GPU field, but I always see people in your comments purchasing or considering Radeon.
intel got them beechnuts
You're running those CPUs out of spec; it's the motherboard makers. Maybe DIY isn't for overclockers anymore. I run stock and score better than most overclockers. Lol at the ignorant; what was, just isn't anymore.
Nvidia is rushing to Linux now because all of the major servers run some form of linux. So they need it now for their AI cards. Gamers are just catching the backsplash.
Duuuude!
What!!! 500W? I have an Intel Core i9-10980XE and I regret buying it because of how power-hungry it is when it's OC'd. I think this time I'm gonna go with AMD CPUs
More and more!!! More watts = more thermal throttling :) For the first 10 seconds the score will be good, but with sustained mixed use the score will get lower and lower, like Intel 13th/14th gen :)
Hardware naming has become SOOO STUPID! Intel (with that whole 100/200 series ON TOP of the K/F/S/T/H/U or anything else!), AMD with their need to XTXTXTX3D everything, and now they're just upping the CPU number based on the year instead of architecture (Zen 1 being the 1000 series, 2000 series Zen 1+, etc). Is there a difference between the 7040s and 8040s? Nvidia has multiple products with the same name that are hardware-wise very different. Even Apple is in on this with the difference between their base CPUs and adding MAX and Ultra and PRO... which one is better, and why not something that can be counted up?
I don't know how, but this needs to stop! It's already a labyrinth to decrypt these, and as time has gone on it has only gotten so much worse! I feel like a crystal ball would make more sense now!
Dude
I suspect with DLSS and low-latency tech, the proprietary Nvidia driver on Linux will be the preferred driver for gamers for a long time, but the kinds of desktop issues the open-source driver can fix will make desktop users and more casual gamers so incredibly happy. So glad Skeggs is continuing work on it from Nvidia. It makes it seem like Nvidia is backing the efforts to have a competent open-source driver for Linux.
They didn't care for so long; it's a day late and a dollar short. I wouldn't hold my breath for a competent, solid open-source driver for a while, if at all.
Day 1 of asking for Kyle to be a permanent co host
I hate Nvidia for changing the pronunciation of Ti to "tie." It bothers me, and I know that's why you say it that way
❤ dis show.
why would anyone want to buy a card BELOW a 4060!? that's the absolute basic B. model. I really hope they just move to a good better best model and stop all this dillydallying around with low end stuff... I'm hoping for 5070, 5080, and 5090. if you want a less expensive card, they still have the previous generation, and you can get previous generation cards used for a steal, particularly right around the time they release new cards.
I love tatas, duuude. 🤪
Encoding is horrible on your last batch of videos. Either the camera settings have changed or whoever is exporting the video has lowered the bitrate
Have you checked your resolution settings in YouTube? They like to change people to lower resolutions in the hope they won't notice
@@GonnerMeLeggies I have, I always manually check my settings to make sure they're in place. In 4K the bitrate is very low and you can clearly see compression artifacts.
Intel made the CPU's TDP 500 watts because AMD makes their CPUs with the same TDP too; Intel wants to compete on that point as well.
You know this is a joke right?
Rabies in the sapphire...♥
3rd :3
Nvidia never does anything good without ripping people off, I smell a rat.
pluhhhhhh
85th comment!!11!
less gooooo
Intel bulldozer?
CPU for americans 😂
Or stop buying Apple....
If you don't want a high-end GPU, buy used. It's as simple as that. I see no reason why any GPU manufacturer needs to produce any sort of low-end GPU in a current product stack.
I understand that binning means they eventually have a glut of chips that can't be used for the high end, and will eventually make lower-end chips out of them. But honestly, in my opinion, if you need a low-end GPU then buy a used GPU.
Intel really wants users not to use their CPU
Please for the love of God learn how to pronounce "breakfast."