Intel is Gunning for NVIDIA
- Published May 8, 2024
- Sponsor: Lian Li O11D Evo RGB on Amazon geni.us/B3OD
Intel announced its plans to compete with NVIDIA - namely, the H100 - in the burgeoning AI space. Its plans include a new Gaudi 3 AI accelerator card, a push for standards more open than NVLink (like Ethernet), and Xeon 6 CPUs. We cover Intel's news, what we think its strategy is, and give opinions from the consumer viewpoint.
Watch our similar coverage of NVIDIA's recent AI keynote: • NVIDIA Is On a Differe...
The best way to support our work is through our store: store.gamersnexus.net/
Like our content? Please consider becoming our Patron to support us: / gamersnexus
TIMESTAMPS
00:00 - Intel Says Intel is Boring
01:41 - AI, AI, AI, AI, AI
03:52 - Alternatives to NVIDIA for AI
06:50 - Everybody is Gunning for NVIDIA
10:57 - Fight for Second
13:47 - Intel Gaudi 3 Accelerator
15:45 - Intel is Excited
17:33 - Xeon 6, Lunar Lake, Panther Lake
20:03 - Intel AI Software Stack
21:13 - What We Think
** Please like, comment, and subscribe for more! **
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
Follow us in these locations for more gaming and hardware updates:
t: / gamersnexus
f: / gamersnexus
w: www.gamersnexus.net/
Steve Burke: Host, Writing
Jeremy Clayton: Writing
Video: Vitalii Makhnovets
AI-AI-oh! Thread for people who don't get the comment about 9 not being a prime number is below! If we rhyme with AI like Intel, can we eliminate the prime with Dell?
Grab one of our GN15 Metal Emblem pint glasses to support our work! store.gamersnexus.net/products/gn-3d-emblem-glasses
Watch our coverage of the NVIDIA keynote that preceded this: czcams.com/video/0_zScV_cVug/video.html
@@lunarvvolf9606 why do you say that?
Thanks Steve.
Xcellent dancing four this man number one king four computer. Intel gr8 and wonderful four India become number one nationality. Intel mr Pet sir more xcellent and more smartly and power then AMD Lisa su, number one disgrace four all humanitarians but she try her best her branes and four this I am so proud.
Annnals of History
@@RanjakarPatel Very beautifully said, brother
good news steve, auto captions was able to successfully transcribe "annals"
Thanks to Nvidia H100, funny
Really?! It must have changed at some point. I checked originally and it said "ANIMALS."
Instant pause and rewind with cc on for me as well lol
on the other hand it messed up "Google autot transcribes" only a few words later
@@GamersNexus HEHEH....SURE....
I’m sorry Steve, I have not done enough acid to be prepared for this video, I’ll be right back.
Hahahaha
...you've had two hours now...
How's the acid
Last time I took acid, I took 10 tabs and experienced the ineffable dissection of reality and my soul and how it relates to said reality. I have not done enough acid to be prepared for this video, I'll be right back.
Last time I took psychedelics, the code in which the universe is written was laid bare to me and I was annihilated for realizing the true nature of the simulation.
I have not done enough acid to be prepared for this video, I'll be right back.
Those 24 200Gbit Ethernet nodes would make a nice Quake server for a LAN party
Fuck yeah
Wanna slide in a couple cod2 rounds, too? Pls! I won’t play shotguns, i promise!!
0:47 is an Intel MMX reference from a TV ad back in the 90's... but you already knew that didn't you.
Regardless I'm glad he included the clip because it's funny seeing this out of touch CEO goofily dancing while holding his product.
At this point, we’re going to need a compilation of all the intros where Steve gets cut off before he can swear due to the sheer bewilderment to what was presented before him
Reality becomes more and more unhinged. How do we even perceive this change anymore, other than simply staring flabbergasted and waiting for the appropriate time to say "what the fuck is going on"? It's all so hilarious.
Nah, this is just CEOs in their natural habitat. The AI just let them loose.
@@luandoduy416 CEOs were probably always a little unhinged but mostly contained by PR/HR, but these days being memeable is an asset, while somebody like Steve Ballmer was a liability back in the day.
Hear hear! Please do that at the end of year recap.
"Prior decade was sort of boring"
Yeah, and whose fault was that?
Lugma corporation
Exactly. Intel spent it sandbagging and flogging quad cores until AMD spanked its ass with Ryzen, and then Intel panicked and had nothing in return, because it spent over a decade rinsing consumers with mediocre upgrades while twiddling its thumbs thinking "yeah, we can milk this gig for another decade" instead of researching and developing anything groundbreaking.
AMD, for not being competitive and creating a situation where everyone else was allowed to coast.
"4 cores 4 ever" was the defining thing for their decade.
Clearly it was all those pesky engineers who love doing “boring” and “meaningful” work. The C-suite is here to fix that!
(Manufacturing AI and Robotics practitioner here) Machine learning in factories is good for getting actionable insight from the enormous amount of data in a smart factory... Applications like anomaly detection (is the machine ok?), explaining phenomena (what's causing bad quality?), optimisation (energy, quality, material consumption), corrections (can I add X to save the batch?).
Factories often had the issue of drowning in data but not deriving enough benefit from it... cool, we have 100GB of historical data on that electric motor, but what's going on now? What's going to happen? What should happen?
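To make the "is the machine ok?" idea above concrete, here's a toy sketch of the simplest possible anomaly check on sensor data. This is a minimal illustration with made-up numbers and a made-up threshold, far simpler than any real factory ML model:

```python
import statistics

def zscore_anomalies(readings, threshold=2.0):
    # Flag readings more than `threshold` sample standard
    # deviations from the mean. A bare-bones anomaly check;
    # real systems use learned models, not a fixed z-score.
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return []
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Hypothetical motor-temperature samples with one spike:
temps = [60.1, 60.3, 59.9, 60.0, 95.0, 60.2, 60.1]
print(zscore_anomalies(temps))  # -> [4], the index of the spike
```

Answering "what's going to happen?" and "what should happen?" takes forecasting and optimization models on top of checks like this, which is where the real value of that 100GB of history comes in.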
Thanks for the interesting insight!
It's quite fascinating to hear constantly about miraculously advanced AI stuff in an age when mainstream IDEs - used by the tech crowd - still often fail to correctly execute a simple refactor/rename operation. Or, to hear about upcoming advanced AI in Windows while they still couldn't even develop a properly working basic search box that could find a partial word match among the few dozen installed applications.
It's not for you, it's for execs who have wet dreams about replacing their workforce with technology that doesn't exist and will not exist any time soon
jetbrains moment
The US tech industry has always made money on hype. This is the latest one.
There is a good amount of 'there' there, but it's a tiny fraction of the hype. Like, yes, Gen AI CAN do a lot of things... but it shouldn't. And after the novelty, no one really wants it to.
0:37 I thought the music was edited, but it was actually part of the reveal lmao
I thought that as well at first. I thought "this can't actually be part of the presentation" but I underestimated them...
It seems the companies truly try to out cringe each other with this stuff.
@@B_Machine well... nerds will always be awkward... it's not a trope for nothing.
@@B_Machine Brother, have you seen the Windows 95 reveal?
@@steviesavagery no, I'll check it out!
Edit: omg, that was something else!💀
They at least had genuine energy to it. It had me thinking, "hell yeah," instead of, "oh no...🤦🏻"
"Every company will be an AI company"
Every time I see stuff like this I feel like John Cusack's character in 1408, living in a Kafkaesque reality.
I fully agree with you, cultured man. It lowkey chills my spine.
Companies will say whatever they need to in order to get Wall Street to give them money.
or that it is becoming a bubble, until it pop.
And when everyone's AI... no one will be
Every restaurant is Taco Bell
12 seconds, 12 fucking seconds and I am already cringing to the point my dentist can smell the payday, what the actual fuck
Good, I don't like nvidias stranglehold on multiple sectors of the industry.
All we need now is a competitor to CUDA.
what about ROCM????
@@theaveragecactusyoure comedic
Supposedly Microsoft is working on a consolidation layer (like they did for DLSS, FSR, etc) to help bridge the gap between CUDA, RocM and Intel's oneAPI. Take this rumor with a grain of salt though.
@@uzikun People said this with Zen, but here we are. Everything can improve with time, we'll just have to wait and see.
Yes, it's called SYCL
The “I love NVIDIA” AI song will go down in history as the tipping point in losing the race to our robot overlords
15:46
Even Big Brother is losing his job to AI...
Skynets plan to wipe out humanity is to make everyone unemployed.
The robots are coming
Normally I hate Pat Gelsinger, but that chip reveal was so cringy that it swung back around to me loving it. I mean I still hate him but I liked the bit.
Probably the most under-appreciated aspect of this channel is Steve’s sense of humor 🤣🤣
Also brilliantly covered the entire topic.
- How do we stitch it together?
WIth glue, intel. Glue.
and tape
Because they use EMIB for die to die interconnects, it’s more like a sticky pad😆
dont forget the Snake Oil!
@@benc3825 Nope... soldered metal pads vs. AMD's short-term glue; their chips are gonna fall off.
and also duct tape
Google: correctly transcribes "annal"
Also Google: incorrectly transcribes "auto"
trillion dollar tech company, give em some slack
2:53 I too had to turn on my Closed Captioning and check, after he made that little comment 😉
Guess they should have used AI.
Oh wait
That Intel presentation reminds me of the old 80's "Mind of Minolta" TV Ads. They were amazing back then.
Seems like Intel might have a new vibe coming to it.
I'm thinking it might be due to real competition from elsewhere, and some folks leading the way.
I'm only 1 minute in, what the hell is going on with Intel's presentation 😂
lol
Intel has finally completely lost it LMAO
😂😂😂
Yeah, I'd say so. WTF were those clips lol. And at 0:37 I thought at first that GN had edited in the music, but I underestimated how cringe Intel would get... And I genuinely don't know WTF is going on at 1:16, other than it kind of has that watercolor look you sometimes get when running images or videos through AI upscale programs (like Topaz Video AI). The clips were bringing back memories of that amazingly cringe Qualcomm conference with the whole "Born Mobile" thing (Qualcomm at CES 2013). Or Konami E3 2010.
All the companies at least indirectly do maybe, but Intel is first and directly supports occupation and terrorists.
They are obviously most insane and should be boycotted
This is still my favorite out of context quote from good ol’ Patty G
I might even be able to beat my children with that.
@@benc3825 I feel like it's such a rich-person thing to beat your kid with a GPU
The robots will look back one day in embarrassment at how goofy they were conceived
Yes, but so did you
In the words of Ford Prefect; "Family is always embarrassing isn't it."
goofy aah
Hey, all kids have to deal with embarrassing dad jokes. Good to see our robot overlords will have just as many cringey stories. "Not now, Dad! I'm building a humanbeingatarium for all these meat people!"
*goofily
Sorry 😂
3:02 google actually was spot on accurate with the AI closed captions
Your editing team is hilarious. 😂
I remember the days when the Blue and Green teams used to team up and simply ignore the Red team, and the days when Blue openly insulted Red in its presentations, calling them an "imitator." The tables didn't just turn, they flipped upside down!
Computer hardware marketing has officially jumped the shark. I know we used to complain that these announcements/presentations were boring but this is not what we wanted them to do instead.
I also felt like I needed to yell for a translator. Half of what the presenter said was fucking acronym gibberish.
IDK I’m kind of enjoying laughing at them
Totally agree. I like my tech announcements prefaced by developers developers developers
Bruv, we're going back to the Blue Man Group Pentium days.
On one hand, I understand that they're trying to market to people with little to none of the requisite knowledge to understand what their products do or how they work: Tech investors and corporate executives.
On the other hand, why are major investment and purchasing decisions being made by people who need bright flashing lights and a live DJ to keep them entertained in presentations about cutting edge computer hardware? Why do we live in a world where this isn't a dry PowerPoint presentation going over performance data and their methodology for collecting said data?
Had no problems with "annals", but for some reason your clearly spoken "auto transcribe" turned into "autot transcribe".
Do they use AI for it now? I don't remember there being so many spelling mistakes before, but I've been seeing weird errors for a while now.
@@Avendesora That part has always been AI, what are you talking about?
@@letcreate123 Sorry, a modern LLM. Better?
Hi Steve! Just a quick one... I got the Evo RGB, and the 140mm fans are bigger than the bracket: 2 screws on the bottom and 2 on the top of the 3 fans end up screwed into thin air :D I love how in the demo they're presented as fitting on the support :))
The Intros to GN's vids are just getting better and better 🤣🤣🤣
Papas lil baby.
PAPA'S HERE *boss music*
Mr. Gelsinger I feel very uncomfortable right now.
Some sugar baby just got a new ringtone 😂
He ate Papa John's pizza.
@ts757arse I dislike AMD ☺️
Lisa Su looking through binoculars at Intel and nVidia fighting: "Let them fight..."
"let them take over the trillion dollar industry, those fools."
Yeah, Intel isn’t doing that any time soon. Losing tons of money tends to lead to having problems taking over the industry
@@benc3825 Isn't it remarkable that we live in a world where Intel is behind AMD and Nvidia in Market Cap? I never thought I'd see it.
@@smuggy8576”Intel has depleted its supply of prime numbers.” LMAO.😂
When Lisa Su became CEO, AMD was basically an unprofitable penny stock company with inferior products while Intel was a borderline monopoly. Since then, AMD share price has gone up over 50x and has overtaken Intel in market cap. Their CPUs now dominate gaming, and they're continually taking market share away from Intel when it comes to data centers as well. She knows what she's doing.
Steve, I have both your drink coaster sets, was wondering if you guys are working on another set? I'll definitely buy them!
Maybe neon blacklight sensitive?
The slide on Intel Gaudi 3 talks about inference and running, so it is not about training AI. The inference phase is what ChatGPT does to answer when you write some text (i.e., after training).
AFAIK, both NVIDIA and Intel are behind the Groq LPU in both latency (time to reply in chat) and energy efficiency (and Groq is still using 14nm)... There is a huge architecture difference: the Groq has deterministic behavior (no varying instruction latency due to caches or out-of-order execution... a dream for developers) and embeds a network switch, which as I understand it provides deterministic behavior at distributed scale.
Pat Gelsinger is slowly transmogrifying into John McAfee, and I'm not sure how I feel about that...
Bah! Is he faking heart attacks to escape the feds? Is he trolling people by tweeting about his whale banging? Is he absconding to some banana republic?
The music and dancing reminded me of Steve Ballmer’s “DEVELOPERS DEVELOPERS DEVELOPERS”
As long as he doesn't buy a house in south America we're fine
i doubt pat will sell drugs, kill his neighbours and abandon country any time soon
He will just keep destroying the company the same way he has been doing until it becomes a fab only.
I welcome this insanity
Old McDonald, ai, ai, ohhh
Old MacDonald's render farm,
ai, AIO
On this farm he had Jensen
ai, AIO
I love these coverages I encourage you guys to do more of these!
While the footnotes for the Xeon 6 2.4x and 2.7x claims said the comparison was vs. prior generation platforms, on Intel's website they were comparing those Xeon 6 chips to 2nd Gen Xeon, which used the Skylake architecture. So not exactly the prior generation, unless their thinking is that because Sierra Forest is based on E-cores, which they benchmark against Skylake, the comparison is somewhat valid?
Also, they had a slide comparing Xeon 6 Granite Rapids (P-cores only) with... 4th Gen Xeon, which uses the same architecture as Alder Lake, so not exactly the previous gen (which would be 5th Gen). But I guess their wording is enough to avoid false-marketing problems, since they didn't say "previous generation" but "prior generation platforms." To be precise, their wording is "Based on architectural projections as of Feb. 14, 2023 vs. prior generation platforms. Your result may vary." They did write which CPUs the comparison used, but in their written article, not on the slide itself. So basically a bit of snake oil, just enough to make everything shinier than it should be.
The sheer volume of cringe these companies are putting out is absolutely insane.
A reflection of many of their customers, they are trying to cash in on the whole fanboy/celebrity thing.
That intel thing is highly likely inspired by that car idiot who put up a dancing human trying to pass it off as an innovation in robotics.
@@Skobeloff... Cash in, as in people who like cringe?
The cringe doesn't matter. They're boomers. Only the raw untapped compute chips matter.
Companies thinking how humans act:
If anyone can target NVIDIA on AI, it's Intel. Intel Arc has really interesting RT and productivity performance that I didn't expect.
Absolutely right on the media side as well.
Intel has the best video codec hardware so far, yet all-rounder NVIDIA still holds the market. NVIDIA should decrease prices and do what AMD does: make some of the software open source. Then NVIDIA would wipe the floor with Intel. Otherwise, within 10 more years Intel will catch up with NVIDIA, or even sooner if their GPUs can consume far less power.
I did a lot of testing on arc in its early days. It’s easy to dunk on them for drivers, but it’s continued to impress me on the media/productivity side .
I just hope for more competition in the market
RT doesn't really mean much for AI performance. However, Arc is pretty much designed for AI. The XMX that runs XeSS is an advanced matrix engine, which is what you really want for tensor operations. In a generation or two, I wouldn't be surprised to see XMX be very competitive with Nvidia's tensor cores. I don't know if AMD has anything comparable in CDNA, but they don't for RDNA and that will hold them back.
NVIDIA, AMD, Intel. The competition is set up now. Let's see how the race will be going. Gaudi2 had a good price tag.
Why, at 18:51, does the part where "codename" is written in parentheses shift 1 pixel to the left?
19:27 wait what. 🤣 That generated image.. good to see that they have humor.
Old mcdonald had a server farm 1:46
Come on dude I thought of that first (in my mind)
The captions show annals correctly, but Gelsinger as gelnar lol
Sounds like a D&D character name!
"I am become Gelnar, destroyer of Ansys!"
07:32 I remember an ad campaign, "Switch," talking about openness of formats, compatible interfaces, and easy-to-replace parts. The same *totally family-unfriendly word* company is now one of the toughest and most harshly defended monopolies in consumer devices and services.
If you're trying to gain position, you talk about openness. If you're leading, you lock things down with copyright and proprietary solutions.
Decades change; a-hole business practices don't.
3:37 I believe you mean "I'm running two power supplies." With the way graphics cards are going, this may be the future.
NVidia seriously needs competition
If Biden gives 20 billion to NVIDIA, just see what happens. TSMC got only 6 billion and relocated all its best engineers from Taiwan to help out. TSMC can withstand 7.4 earthquakes. Take that, Intel.
Making graphics cards is really hard; that's why it's only NVIDIA, AMD, and Intel.
Yes, and this is why AMD is making the MI cards and already baking TOPS into every CPU. They already beat Intel by light-years. Before aiming at NVIDIA, Intel should try to beat AMD first, which it is not able to do right now. This is just a brainwashing show to make them look competitive in some way, but they are not.
Technically, both AMD and Intel already more than compete with NVIDIA; the problem is that people just won't see it and will by default keep looking only at NVIDIA.
That's because NVIDIA is a marketing company: they spend much of their budget on looking good and making people think of them as great.
As a result, AMD, Intel, Broadcom, and others need to provide way better value just to get an okay market share, and since they generally have lower market share, that insanely better value is harder to reach.
The main problem is essentially people being so susceptible to NVIDIA's marketing.
If people would just go with NVIDIA's competition, those products would suddenly get a lot better, because finally having okay market share means software also gets optimized for them. Right now only NVIDIA receives much optimization from third parties; if game engines and other software optimized for Intel and AMD by default instead of only for NVIDIA, their cards would perform much better than they do now.
Hardware is one thing. Drivers are one thing. First-party software is one thing. But third-party implementation and optimization has huge effects, and it is reached through market share.
For example, if an NVIDIA GPU is just designed poorly or has terrible drivers, games, game engines, and most software will generally optimize around those problems. If AMD or Intel has a driver issue, third parties only rarely optimize around it, so people think Intel and AMD are somewhat poor.
Meanwhile, as an example, Gaudi 3 currently beats the H100 by around 1.5x while also being much cheaper and more efficient.
When software is optimized for Gaudi 3 as well, its potential is many times higher, since it is many times faster in actual AI performance, but most software is written for NVIDIA.
NVIDIA also has the nasty habit of not adopting new technology, instead relying on old legacy technology, so their stuff looks good early on yet gets outdated rapidly.
As a result, big AI datacenter workloads are still designed for those legacy methods, for which NVIDIA's hardware is also optimized. Since Gaudi is optimized not for those legacy methods but for the future, it is only around 1.5x faster right now; if AI software gets more future-proofed, Gaudi 3 will be many times better than that.
Originally I thought they were referring to the entire board with 8 of those modules when they quoted the AI performance, since even then one module would still be much faster than the H100 in technical performance when optimized for. It turns out the figure referred to one module, which gives an idea of how much more powerful than the H100 it is.
It is a vicious cycle: NVIDIA has the most market share, so everything is optimized for NVIDIA; NVIDIA uses very outdated methods, so everything is optimized for those outdated methods and hardware types; as a result, everything runs far worse than it could. If software were actually optimized properly to be more efficient and fast, NVIDIA GPUs could barely run it, because they were built on legacy stuff; and since NVIDIA has the biggest market share, software keeps using those outdated methods despite the greatly increased power usage and reduced performance.
AMD and Intel, since they can't compete in the legacy field, have to innovate and improve, but they need adoption for those newer methods to be used and to show the world how great things are. AMD still tends to follow NVIDIA a lot in hardware; Intel seems to really push for the newer, much faster, more efficient methods, but again relies on adoption to get anywhere close to its potential, since if everything is written for legacy stuff, it won't run as well on modern hardware.
If many people adopted Intel and AMD (right now especially Intel), the GPU market for normal users would skyrocket: not only would the new, better methods finally be widely supported, but NVIDIA would also be forced to actually improve its performance and value. Yet most people don't see even that single point.
@@enbe3188 AMD will never be as good as NVIDIA. That's why NVIDIA is slacking.
With the number of supercuts of people repeatedly saying AI, I'm surprised the editor didn't make the keynote speakers sing Old MacDonald.
OMG Steve! Thanks for the laughs! I needed that desperately!
Hey Steve, you should start a tech based standup comedy channel 😂
This stuff is so good 😅
Can they at least use AI to make those presentations less terrible?
That's the thing!
AI makes it WORSE!
I bet the AI is referencing some crappy presentation, and the generated presentation got referenced again by future AI, and so on
Imagine a tech industry without AI
Engineers would get SO SO SOOO MUCH more work done
@JeskidoYT you're saying that as if AI isn't a recent invention. People will always find ways not to get stuff done, and the best inventions in tech history have mostly been accidents (if everything had gone to plan for IBM, there would be no PC compatibles like we have today, only proprietary IBM PCs).
Nah. Good and honest presentations come from AMD.
NVIDIA's music was really brainwashing because I couldn't get it out of my head for weeks.
That's how Skynet will get us. Drive us mad with earworms.
And that was even before Udio arrived on the scene
0:55 **Woosh** I am the sound effect
GREAT video guys! Really like your work.
Because of that opening, I had to check that the video was not uploaded on April 1st. 😂
That “papa’s little baby” 😂😂😂😂😂 and previously “thank you papa”. So much cringe 😂😂😂
Or 1998
Intel: (every time they say) “A.l.”
Ali G: “Ayye!”
A'aight (The 't' is silent)
I took a drink every time Intel said "AI." Umm.. anyone got a second kidney they're not using? Maybe a liver lobe?
These things are important, as some of what was announced will ultimately trickle down to gaming hardware.
That intro was actually trippy 😅
Thanks Steve!
Back to you Steve!
What a classic 😂😂😂😂
This lowkey has the same energy as the Gavin Belson Box3
hahahaha
Signature edition B=ox=3
I have only watched SV a year ago and it's a future documentary like Idiocracy. Judge is a genius.
lmfao :D :D
Gavin Belson: "I want the signature to be BIGGER"
That lian li Case looks insane. Actually got decent space for cables.
Great review... keep doing these! P.S. I want to know how much a Gaudi board costs and how programmable it is. We really need open standards: super-fast interconnect/Ethernet and an open API for ML and vector/matrix compute, so we can write 3D games, engineering applications, and new inventions (in my case, turning 2D photos into 3D models). It's great that Intel is engaging in this ferocious competition.
Intel is on the cutting edge of AI generated Cringe.
Yeah the video was pretty cringe. But I’m happy they are taking the competition seriously. The more competition the better
10:57 Using PyTorch as an example is not a good one. PyTorch is a deep learning library; its counterparts are TensorFlow and many others. It does not directly interface with the GPU; instead it goes through CUDA or ROCm (the AMD equivalent). Intel is trying to push its open-source "CUDA," oneAPI, which (I just checked) still does not have an officially released PyTorch build using it. NVIDIA has more than 10 years of development and experience in CUDA and GPU-accelerated computing (not limited to deep learning). Both Intel and AMD will have a very rough uphill battle to fight.
NVIDIA's dominance in AI is not even so much CUDA itself, but more the overall relentless support and polish for everything: writing tons of custom optimized kernels for libraries, etc. The "CUDA moat" is really an umbrella term for the entire ecosystem, because many of these advantages aren't technically CUDA-related. NVIDIA spent more than a decade of effort while these other companies did not.
IMHO, the ML frameworks are where NVIDIA has a huge leg up, because they were (and have been) there for a long, long time helping build them. But I'll be totally honest: if Intel makes realistically good TF/PyTorch support a reality, they could cause a huge upset by simply saying "look, we have this nice card like the A750 with 64GB, and it doesn't cost an arm and a leg, because VRAM is cheap and NVIDIA are bloodsuckers." Make them ~$600 and I'll be running out the door to buy a couple (couples) :P
Much like the prisoner's dilemma, the one that defects first on VRAM size/cost will totally wipe out the rest for a modest time, because performance isn't really a concern if you can't even fit the model in memory to start with.
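The "fit the model in memory" point can be made concrete with napkin math. A sketch (the 70B parameter count and the 1.2x overhead factor are illustrative assumptions, not any vendor's real spec):

```python
def model_vram_gb(n_params, bytes_per_param, overhead=1.2):
    # Rough VRAM needed to hold a model's weights, with a
    # hand-wavy multiplier for activations/KV cache at
    # inference time. Real usage varies a lot.
    return n_params * bytes_per_param * overhead / 1e9

# A hypothetical 70B-parameter model at different precisions:
for dtype, nbytes in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{dtype}: ~{model_vram_gb(70e9, nbytes):.0f} GB")
```

Even aggressively quantized, a model that size blows past any 10-24GB consumer card, which is exactly why a cheap high-VRAM card would be so disruptive.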
So how about it, brother, will Intel manage all of this? I want to set up AI at my house. Come on AI, massage my hands and feet. Come on AI, come on AI, make tea with ginger in it... those are the orders I'm going to give.
when the music started playing, i didnt know if it was edited or not lmao
Does that GPU orientation in the sponsor's case not affect the heat pipes? Seems like gravity might not help the thermals.
Pat forgot the leather jacket lol
a small mercy
He hasn't unlocked that trade with the villagers yet, Pat needs to farm more cows first.
He forgor 💀
He cannot afford it😂
@@TheDumbTake-xb6rr living down to your name
When the "AI" segment started I thought you were playing a clip from Ozzy's Crazy Train ("Eye Eye Eye")
TBH my biggest personal hesitance toward using any AI at all is that we effectively can't run it locally to ensure the safekeeping of confidential information. In law, that sort of means sending absolutely privileged information straight to some company, usually a big and powerful one. Just because I don't know how these companies will use the information, or how it might adversely affect my specific client, doesn't mean I can put them at unknown risk for the sake of possibly more efficient use of time.
But being able to run open-source code on local hardware could change that. If Intel can come out with AI Add-In-Cards that don't cost the kidney required to buy a 4090 (if one somehow evades China's inhalation of them), then maybe I'll consider it. They could save cost making discrete cards apart from GPUs by not having to fuss with all the other silicon devoted to things unhelpful to AI purposes. It might be nice to return to the days of having more than one Add-In-Card
You can; Stable Diffusion and LLMs can run locally on your PC. It still requires mid-to-high-end hardware, but you can already do it now, for free.
@@sadkurtable 10GB VRAM really isn't enough, and getting a new GPU means getting a new waterblock too
One of the most concrete pieces of evidence of a higher power was the fact that everyone listening to that NVIDIA AI song in person wasn't convulsing like an epileptic salmon at the first note.
Like these videos outside of games focus; thanks for everything you do guys.
Steve turning from Tech Jesus to Tech Santa.
This tech is already dead, it just doesn't know it
🎅🎅🐱
I feel like this conference from Intel was a true Steve Ballmer spec show, and I am excited to see more
It did properly pull up "Annals" in the auto captions.
There hasn't been a new Crowbcat video in awhile but this scratches some of that itch, thanks guys
I'm convinced AI is a safe word for cocaine
CocAIne
Anyone know if Gamers Nexus ever did testing on any SSD heatsinks?
Thanks for the video Steve
After watching Fallout I've learned what we really need are Super Managers so that these ai projects can continue for 200 years. 200 years of song and dance.
You gotta embrace the bore. Lean into it.
Thanks Steve!
One Love!
Always forward, never ever backward!!
☀️☀️☀️
💚💛❤️
🙏🏿🙏🙏🏼
Checking the Gaudi 3 AI white paper, the 1.5x speed-up *on average* is just due to the larger memory on each card compared to the H100 (128GB vs 80GB)... There are some improvements in interconnect speed (900GB/s vs 1200GB/s on Intel) and memory bandwidth (3.35TB/s vs 12.8TB/s), but Gaudi 3 is essentially a slower card: 1835 vs 1979 TFLOPS on BF16 and 1835 vs 3958 TFLOPS on FP8.
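Taking the comment's quoted TFLOPS figures at face value, a quick back-of-the-envelope check of the per-card compute gap:

```python
# Figures as quoted in the comment above (TFLOPS):
gaudi3_bf16, h100_bf16 = 1835, 1979
gaudi3_fp8,  h100_fp8  = 1835, 3958

# Per card, Gaudi 3 trails the H100 only slightly at BF16...
print(f"BF16: Gaudi 3 at {gaudi3_bf16 / h100_bf16:.0%} of H100")  # 93%
# ...but by roughly half at FP8, since these figures quote the same
# throughput for Gaudi 3 at both precisions while the H100 doubles at FP8.
print(f"FP8:  Gaudi 3 at {gaudi3_fp8 / h100_fp8:.0%} of H100")    # 46%
```

So the claimed average speed-up would have to come from the memory capacity and scale-out side rather than raw per-card throughput, consistent with the comment's read of the white paper.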
A.I....bro just tranquilize me at this point🙄
Sucks to see no mention of Arrow Lake anywhere in this. :(
Hey, just letting you know, I actually really appreciate the AI hardware/software coverage. I'm a gamer and an AI software engineer working on developing a Retrieval-Augmented Generation chatbot at my company. It's nice to hear Gamers Nexus cover both sides of the things I love to do.
lol unfortunately, these chips are so damn expensive, even my company can't afford to buy them for me :')
Not that they're being cheap. They'd be 100% willing to buy them for me if they could get the ROI from it to justify the purchase for our mid size company. But the hardware is too expensive to get proper ROI. Hoping for some stuff to "trickle down" to my company, and then eventually trickle down to prosumer.
One would’ve hoped that AI would have replaced mundane tasks to allow people to focus more time on their hobbies, expressing themselves with art and music.
Unfortunately, the AI-obsessed C-suites of modern tech companies would love to see it be quite the opposite; they want to replace the artists, the people with passion and vision, with AI models that hallucinate amalgamated images using training data that they didn’t earn.
As someone who has been interested in both tech and art for a long time, the shift is both disturbing and disheartening.
Money and greed corrupt everything.
I'm not sure what gave you the impression that they would be so altruistic. It's always just been about selling the product. The reason art and music are at the forefront is simply because they were the easiest. For better or worse it's coming to replace everything; it's just a matter of how hard it is to implement in each use case.
Let me tell you how the economy works.
And how your complaints about ai literally do not matter.
The cycle goes like this: new groundbreaking tech is created, that technology displaces a portion of the workforce while increasing productivity for far cheaper, new jobs are created that usually pay more than the previous jobs, the displaced workers find a new job.
I want to ask you: where do you think we would be if we had rejected advances in technology in the industrial or agricultural sector, because what about muh job as a farmhand? I'll tell you what would have happened: we would still be in a pre-industrial society with less food, fewer products, less money, dying of polio or some other bs at 30.
Implementing ai allows businesses to create things for cheaper, it also allows artists to be more productive, it also allows people to create more niche products at a higher standard of quality that would otherwise not be economical.
In other words, before you start talking about anything, you should probably at least understand the bare minimum.
Money and greed, you guys seriously think you understand anything. Let's be real here: the thing you're complaining about is "capitalism," which just shows how narrow-minded and, frankly, how much of a dimwit you are.
Also, AI is overvalued like hell right now. Sure, it's useful, but this is getting to .com levels of bubble, maybe even worse. The internet was the future; that doesn't mean every .com was useful.
@alexis1156 Brevity dude.
And redirecting our complaints to capitalism sounds like a defunct all-encompassing umbrella. There are morals in business: when the internet arrived, it created more jobs than it destroyed across all skill levels. When AI was developed, it sapped skill from lower-level workers for the benefit of a small fraction of people. "That's where capitalism comes in" - keep peddling that irrelevant retort, because AI has helped significantly less than the media has led you to believe.
Go buy shares, it's the new trend right now.
Thanks Steve, back to you Steve
Pat Gelsinger has the same energy as Dr. Lisa Su and Jensen "leather jacket" Huang, nice stuff.
What's with the kerning adjustment between the 'k' and 'e' in Lunar Lake at 18:50
Love when you cover this stuff tbh I just find it all very interesting
Thanks Steve
Wondered if there's a video talking about all this AI stuff - maybe with Wendell? - but segueing into what all this means for GPUs and gamers too.
(Do the designs of this hardware affect what home products we'll see? Are we in a silicon-capacity fight with all the AI demand now? etc.)
12:00 where there’s a will, there’s a way Steve
but Gaudi is made by TSMC, and TSMC's capacity is booked by Nvidia and Apple. When are Intel's fabs going to be able to compete, even for Intel's own business?
we NEED Intel for high-end GPUs, but not sure if that's happening anytime soon
Their battlemage is coming in the fall and that’s supposed to go band for band with nvidia
450w sku
@TEENYcharma Not at the high end. Battlemage is topping out at around 4070 Ti level.
Something to compete at the 70 Ti level should do the job, maybe 80-series, but that's too optimistic; most people buy GPUs under that mark for personal use
we need more 'ai' supercuts from these conferences :D
1:24 if that’s all rendered on Gaudi 😮
Hopefully any/all of these AI-focused companies have the best security measures imaginable.
Oh yeah, no glow boy backdoors to be found.
I'm sorry, but I totally lose it and start laughing uncontrollably when Steve goes "WTF" at the 1:37 mark XD
Back to you Steve!
Nvidia is seeing what Cisco experienced in the early 2000s where a bunch of manufacturers jumped onboard the networking equipment bandwagon. I prefer setting up Cisco at work but at home I prefer Netgear. There's a brand for everyone else in between.
Thoughts on Ubiquiti?