1:45 almost the most expensive mistake linus would ever make
They don't call him Linus Drop Tips for nothing
Since they are bare dies, I'm pretty sure, there is nothing there to break.
the way that Yvonne rushed in lol
In hospital, probably
Lol wife to the rescue 😂
Oh, man! Do I have to buy a new case again?
Maybe a new house my guy.
no you don't
@@dzello damn, it wont fit
No, a new house
Nah bro for new gpus you need to buy a new country + planet to barely fit this thing 💀💀💀
So it can heat up the planet while simulating itself heating the planet?
My mind is blown.
The amount of power these use is less than what was needed just a couple of years ago, and the next generation will use less power still, so eventually you'll have the compute power required for ChatGPT in your desktop.
@@pcmason great explanation ngl
Genius
@@pcmason you already can have it with GPT4All. It can run on your CPU, or use Vulkan, OpenCL, CUDA, or the M series of the MacBooks, and it's pretty fast if you have a GPU with 6 or more GB of VRAM.
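The "6 or more GB" figure in the comment above can be roughly sanity-checked. This is a minimal sketch with my own assumed numbers (about 0.5 bytes per weight at 4-bit quantization, plus ~20% runtime overhead for KV cache and buffers); actual requirements vary by model and runtime.

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Assumptions (mine, not the commenter's): 4-bit weights, 20% overhead.

def vram_needed_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Approximate VRAM in GB needed for a quantized model."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * bytes_per_weight * overhead

for size in (3, 7, 13):
    need = vram_needed_gb(size)
    verdict = "fits" if need <= 6 else "does not fit"
    print(f"{size}B model @ 4-bit: ~{need:.1f} GB -> {verdict} in 6 GB")
```

By this estimate a 7B model squeezes into 6 GB at 4-bit, while 13B does not, which matches the commenter's experience with typical GPT4All models.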
@@pcmason While you are definitely not wrong, we still have the problem that the use cases for these machines keep getting better as performance improves, leading to massive growth in usage and thus an increase in total energy consumption. The efficiency gains we see these days are far smaller than the overall increase in usage. Data center energy consumption is projected to double by 2030. The current trend toward "daily" AI usage, with more compute-intensive applications becoming simpler to run, heavily counteracts any efficiency improvements we've made. Using ChatGPT instead of Google, for example, uses (up to) 25 times the energy, no matter how efficient the servers are.
So TL;DR, you are both not wrong, but also not right.
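The "(up to) 25 times" claim above works out as follows. The per-query figures below are commonly cited ballpark estimates I'm assuming for illustration, not measurements from the video.

```python
# Back-of-envelope check of the energy ratio in the comment above.
# Assumed inputs (commonly cited estimates, not measurements):
GOOGLE_SEARCH_WH = 0.3   # Wh per traditional web search
CHATGPT_QUERY_WH = 7.5   # Wh per LLM query (high-end estimate)

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One LLM query ~= {ratio:.0f} web searches")

# Usage growth swamps per-query efficiency gains:
queries_per_day = 1e9  # hypothetical daily query volume
daily_kwh = queries_per_day * CHATGPT_QUERY_WH / 1000
print(f"{queries_per_day:.0e} queries/day ~= {daily_kwh / 1e6:.1f} GWh/day")
```

This is the commenter's point in numbers: even if per-query efficiency doubles, a few-fold increase in daily usage still pushes total consumption up.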
1:45
It's hilarious how the moment he says "oh no" everyone rushes their shit to support him
IT WOULD HAVE BEEN A MASTERCLASS if the cameraman had rushed too dropping his camera.
‘Sorry Linus, you pay me to film you and not to catch stuff … got a solid clip tho bro’
1:45 that Nvidia employee forgot she was working with Linus. You've got to stay on your toes... And just a few seconds later you realize it's Yvonne, the GOAT of guiding Linus. Here's to Yvonne, saving the company one small catch at a time.
Isnt that his wife
@@mpresto15 It is.
@@ACHonezGaming She* is
@@gonthyalavishalchandan1193 "is *that* John?"
"yes, *it* is"
That's a common type of exchange, most people don't wanna be referred to with *it* or *that,* but it's common to use those pronouns in this specific case, I'm sure no disrespect was meant
Love how she rushed in like she was thinking "WE CANT AFFORD THIS DROP LINUS!!" 😂
Little did we know the GPU is normal size, but Linus being so short makes it look huge
Lmaoo
Imagine the insurance company seeing Linus carrying your equipment.
It feels like there's a Linus clause for tech companies.
@@grze149 so,Top Gear/Grand Tour Hammond similarity? :)
I bet he has to take out a policy every time he does these kinda projects.
In 5 years we're gonna have entire buildings just for our GPUs, just like the 60s!
I wonder how powerful it'd be if Nvidia could actually build a single GPU out of like 1000 Blackwell superchips that would fill up multiple rooms. And I wonder what we'd be able to do with such a thing...
I mean, almost? These things are meant for datacenters, so you'd have dozens or hundreds of racks like these, all connected with extremely fast interconnects. That's almost like having a GPU the size of a building.
YES! The "Nvidia Univac"
@@SimplCup You also had to build a nuclear power plant next door, just to power that thing!
IKR. I was noticing that the boards were the size of VLSI PDP-11 or VAXen boards...
All hail the return of the vector-based supercomputer.
1:43 concerned wife runs to save her house LOL
Nvidia should embrace the memes further and just announce a case designed around a GPU.
Isn't that technically what their SFF certification is?
@@Dajova I mean, I would want a case with actual cooling: a radiator case that's just a GPU, that I could hook my water loop into, so maybe we'd have a chance to watercool the RTX 50 cards.
yes. a lot of stupid people would buy it
Sure it's fast but can it run elden ring?
@@bluflame5381 This. I dream about a powerful PC, yet I'd probably still just stream YouTube half the time and play Elden Ring/Battlefield and Cyberpunk the other half... which run fine on my 4060.
Watching Jake & Yvonne on crowd control at the start made me spit out my lunch haha
They did great! - LS
Now what is this going to mean for the 50 series..boooom
At first I thought, what's Jake doing standing over there? Now I know, haha
So that's what she was doing there. I kind of wondered why she was just standing there.
Great content well made
2:00 Somehow this is my first time hearing Yvonne speaking Chinese 🤣
ikr? isn't she korean?
@@skylinrg Singaporean iirc, but hearing that, I doubt she's had a lot of practice.
1:46
Where would Linus be without Yvonne to support him?
Linus Tech Tips brought to you by NVIDIA.
probably in debt
Linus drop tips
I mean, he's said many times that LMG wouldn't exist without her. That was just the latest example of why lol.
She looked extremely ready to react and I'm guessing that wasn't staged :)
That's no Graphics Processing Unit, that's a Giant Processing Unit!
so it processes giants... love it.
Nah; that's an AI-produced monster that will surely lead to our downfall within a few iterations in the near future..
@@51Archives And that is what we want, our downfall; after all, we all have to die eventually, and if we die as the last humans in history then we are special.
The G’s for “Grill”, right?
No all you have to do to survive is have Linus drop it on purpose or take off the cooling and turn it on
"That's not a graphics card, it's a space station!" Obi-Wan Kenobi
72 Petaflops? hah, losers, i'm already at 2 Flipflops
underrated comment !
Next is Exaflop followed by Zettaflop FYI, if you care to know. The Russian military is already a Z-Flop.
its kinda funny cuz a flip-flop is also a computer thing. but 2 flip flops is like 2 bits of storage so not very impressive
How many petaflops in a flip flop?
XD
1:20 - All I heard is Nvidia will never give us SLI back.
Key words US
We got DX12 and Vulkan......
That either no one bothers trying to use, or no dev bothers doing the "less work than back then" to enable.....
Gimme back SLI / Crossfire, dammit!
Well hopefully amd is cooking up something
It sucked and nobody used it anyway...
@@littlefrank90 TF is "nobody?"
Speak for yourself 🤣
1:45 LTT's entire wealth just flashed before Yvonne's eyes
Time to sketchily wire together 4 2000w power supplies
duct tape them together
you mean 50?
I believe in Alex
@ZaHandle Just use those industrial red power plugs. You'll need at least one of those for the rig, if not more. That puppy can supply over 300,000 watts. PSU? Who needs that crap, just plug it straight into a nuclear reactor. The rack likes raw unfiltered energy, kappa.
@ZaHandle Almost the entire world uses 240v power.
1:45 Linus, please, not again.
They were ready to hold it
@@xugro I think you mean catch it, which they did. Because again, Linus is Linus.
Catcher
wait, what happened the first time?! Lmao
@@Aonoexorcist100 he dropped a 3090 during its review... C'mon, it's not been that long.
4:26 “Reticulating Splines” that takes me back a bit 🤣
1:45 multiple NVIDIA employees' hearts skipped a beat just there.
Computers in the past: it's the size of a whole room. Computers today: also the size of a room lol
And that will not change anytime soon I think.
I mean,... you would have to probably cover the whole world with 'super' computers of old (Zuse anyone?) and wouldn't have a faction of calculation power of one of these bad boys
Computers get more compact, freeing up space that can be filled with even more computers
@@GroovingDrums You mean fraction, don't you? 0o
That one rack could probably simulate every mainframe that ever existed on a molecular level in real time.
Finally, something that can run Microsoft Flight Simulator in VR
Can't wait to buy a GPU in 2050 that's equivalent to that whole rack, and only use it to play Minecraft.
And will only hit 20fps in Minecraft....
That'll likely already be around 2035
i bet we could still crash it using tnt or redstone.
🤣🤣🤣🤣
Lol. 2050? Don't think the world is gonna be in a good enough shape for that my friend...
will it fit in my mini-ITX case?
yes it sure will
If not just return it, someone will appreciate an open box deal.
LMAOOO 😂😂😂
If you use the chip as a side panel maybe
yes you just need apple's hydraulic press
4:17 Jensen specifically called it the spine in the keynote. Did the press kit call it a spline?
1:45 The Ghost of Michael Jackson passed through Linus there
The GPU needed for GTA 6:
Minimum requirement for GTA6 720p 30 fps
@@hannes0000 seriously, that's all I'm wishing for, to play GTA 6 on a GTX 1080
With how optimization has been going the past couple of years, I think you're going to be lucky to get a solid 60 fps using a 5090 at 4K.
Just get a PS5 or Xbox Series S/X console, 'cause it won't be out for PC day one, but for console it will
@@hokahn6313 you def will buy it
With Linus' almost completely absent upper body strength, the fact that they still let him pick stuff up at all is amazing to me.
I didn't know what Jensen was cooking, but it might be all of us with all the power consumption and the carbon footprint
The cost of power per compute has been greatly improved.
I really hope it's renewable
@@Safetytrousers Sure, and if all you do is play games with the exact same framerate and graphics settings as before then you'll use less power. But people won't do that.
@@Safetytrousers that's gonna create more demand and gap in the AI training market so there's that too
Our AI overlords will apparently win through creation of ecological collapse.
11:30
That segue was so fast I didn't even notice it was a sponsor
great video as always!
1:45 Heart Rate Go BRRR! The most impressive/unbelievable part of this video is that NVIDIA allowed him to pickup/carry this tech
Me seeing Linus and GPU in the preview: So it's a normal sized GPU?
I was searching for this comment
Nvidia enters the SFF market
Finally a GPU that could run Cities Skylines 2
But can it run Crysis?
@@markflakezCG but can it run Minecraft Alpha 1.5.2?
5:11 Techquickie Linus showed up for a second.
Okay, Linus. Whole house GPU when. The only logical step after whole house watercooling
1:46 she was fast with that because she knew no one in that room could've afforded it 😂
but can it run Crysis?
Why did that heatsink in the beginning look so real? I didn't even notice it until it was gone
Because it didn't. You're either viewing this on a Nokia phone from 2004, or half blind. His hand would even disappear every time he waved it in front of it
11:34 is, I believe, the smoothest segue transition so far; did not expect to get baited that hard.
*Segway*
Two-wheeled, self-balancing personal vehicle.
A Segway is a two-wheeled, self-balancing personal transporter device invented by Dean Kamen. It is a registered trademark of Segway Inc. It was brought to market in 2001 as the Segway HT, and then subsequently as the Segway PT. HT is an initialism for "human transporter" and PT for "personal transporter."
A lesser known fact about Segway is that the company's previous owner died when he fell off a cliff while riding a Segway, kind of ironic. Those things aren't safe, let me tell you. Also, the word you're looking for is 'segue'.
Moran
@@TriggerHippie🫡
It almost sounds like an audio glitch, it was so well timed. I kept scrolling comments till I found someone who agreed, lol
I actually work for a hyperscale ODM in QC and testing, and we're renovating one of our rooms at the moment for building-scale liquid cooling. Am excite.
@6:25 I can't wait until those heatsinks start popping up on ebay in several years.
But effectively useless unless you have data centre level airflow generators
Can't wait for LTT to benchmark it in 10years time.
The issue with AI is not that the AI is going to take anyone's job; it is how greedy corporations and their investors will inevitably demand it be used to eliminate as many jobs as possible. AI can be used for good, as a companion tool to help people do their jobs more easily and faster, but corporations are not content with just the added productivity. They want to eliminate as much payroll as possible while, quite hypocritically, boosting top-level executives' compensation to orders of magnitude more than what they pay their actually productive employees.
All I am saying is that it is not AI that is the problem; it's a corporate regulation issue. The only thing that curbs infinite corporate greed is laws mandating the continued use of human employees wherever AI is used, without cutting their wages or benefits. Corporations can afford it despite their false claims; otherwise we would not have so many billionaires.
1:45 I think that was Linus's most expensive "OH NO" 😂
Yvonne literally teleports at 1:45 🤣
love how fast the nvidia rep reacted, almost like she was expecting it. 1:39
That's Linus's wife, but yes, she definitely expected that and reacted FAST.
GPU = Generative Processing Unit?
Gougable pricing utility.
This makes much more sense than it should.
Rip GPGPU.
*3 million dollar machine exists*
Linus: *Goes near machine*
Me: 😬😬😬😬😬
Jensen: 😅
AI stocks will dominate 2024. Why I prefer NVIDIA is that they are better placed to maintain long term growth potential, and provide a platform for other AI companies. I know someone who has made more than 200% from NVIDIA. I'll also take any other recommendations you make.
I believe the next major breakthrough will be in A.I. For sustained growth similar to META, it's crucial to avoid making impulsive decisions based on short-term market fluctuations. Instead, prioritize patience and maintain a long-term perspective. Consider seeking financial advisory services to make informed buying and selling decisions.
In a similar situation, I sought advice from a financial advisor. By restructuring my portfolio and diversifying with good ETFs, the S&P 500, and growth stocks, I was able to grow my portfolio from $200,000 to over $800,000 in just a few years.
This is definitely considerable! Think you could suggest any professionals/advisors I could get on the phone with? I'm in dire need of proper portfolio allocation
Rebecca Nassar Dunne is the licensed fiduciary I use. Just research the name; you'd find the necessary details to correspond and set up an appointment.
What if I don't want my pc to send a screen shot to Microsoft or nvidia every 10 seconds
He said with the last demo that it supposedly runs on device (no data is sent anywhere)
then you pick Linux and a non-AI bs GPU
Have you tried turning off your internet
@@frankhuurman3955 or better yet, pick nothingness. Maybe now you're not forced to use AI, but sooner or later you would be. At this point someone should amass a "no tech" campaign, since pretty much all tech nowadays is the same -- making you rely on it.
4:13 felt like he was gonna segue to another sponsor: "I'm gonna have to tell you... about our sponsor"
Nono he would say "I'm gonna have to tell you.. about this segue to our sponsor!"
5:26 Was the man behind Linus the Supermicro CEO?
Linus: Look at all this amazing stuff.
Comments: Yea, yea, you guys did you see that he almost dropped it!
Linus listing the specs around the 5:15 mark reminds me of the turbo encabulator meme
0:27 that hose clamp moved when linus hit it
To Nvidia: This admittedly funny marketing stunt won't make me forget that your other GPUs are still overpriced. To Linus: BUY IT AND RUN THINGS ON IT.
He doesn't give a shit about you or any home user. He's got almost a laser focus on the datacenter. Sales of gaming GPUs are just the cherry on top of the cake for him.
@@arnox4554 I mean, ngl, datacenters have more purpose than games.
You can’t be serious with a comment like this.
NVIDIA networking reminds me of when the NVIDIA firewall was installed by default with one of my GPUs back in the day and would block absolutely all traffic by default 😂
That's what a firewall is supposed to do: block everything, and from there you allow only what you actually need.
@@DJDocsVideos most people have no idea what they need to allow and would click yes on any waterfall-of-green-text app that requested it
4:50 realized Linus was in some type of luxury hotel or some fancy conference building
Can't wait until 3035 so we can have this at regular GPU size 6:09
Makes you wonder why Nvidia hasn't already released a consumer ARM CPU if they're so good.
Simple, MS Winblows used to run like ass on ARM and Nvidia is too dumb for everything else (and burned all goodwill so no one else will do it for 'em).
they make ARM chips because they don't have an x86 licence, so they don't have much choice
I mean, they have the Shield as a standalone product, along with their Jetson series to some extent. As for selling bare CPUs, there is no standard platform on the market yet, to my knowledge. So you end up with proprietary junk on the Apple side for consumer Arm, and more proprietary junk in the datacenter space from Nvidia. AMD and Intel are in no hurry to move away from x86 yet. Nvidia and Apple both love money, so I don't see how Arm PCs are gonna happen anytime soon. Kind of a shame.
They are going to, in a partnership with MediaTek; it's supposed to be a Snapdragon X Elite-like competitor, expected around the end of the year or in 2025 (don't quote me on these numbers, I don't remember them too well)
They have had ARM SoCs for well over a decade. For example, the Nintendo Switch has an NVIDIA Tegra X1 from 2015.
8:24 Being a soon-to-be pharmacist, looking at this is absolutely crazy to me. Drug development for specific proteins took years to DECADES of tests. And you are now telling me these can be done in a matter of hours? If they manage to simulate in vivo clinical trials, I'll be out of a job before I even begin my career lol.
AI can only simulate so much. Actual bodies aren't... as straightforward and will still need clinical trials
That last presentation is literally what one guy made, called Neuro-sama. And that AI has more personality than a Bethesda character
are you certain they have the same functions? I'm just checking to make sure.
@@Viewer13128 Neuro will make jokes about what's happening in game; she insults people's gaming setups within moments. She's a personal buddy to the creator, and he's just one guy. Before ChatGPT was even a thing
@@novadestry I was mainly wondering whether the AI in this presentation is using more advanced stuff or not. For example, the clip where the AI can react to moving game footage. I'm not sure what else this AI does, so I was wondering if Neuro is indeed doing everything it can do and more.
I was just wondering how an entire company with a big budget still can't make something equal to or better than what one person made. That would be quite shocking, and they'd really need to go back to school.
11:25 - that is NOT a Triceratops...too many spikey bits on the crest.
Linus ~ "This cost how much? Oh I need to pick this up!"
I feel like I just bought my 3090 last week… and it’s already two cycles old
Everyone and their mom knows the 50-series is launching this year.
If you spent hundreds of dollars on an upgrade without researching that, it’s on you.
@@trapical My mom doesnt know that
@@trapical I didn't know you were able to read people's minds through your monitor.
@@trapical Comprehension isn’t your strong suit.
You literally took my words and made them mean what you wanted them to so you could argue.
Try reading again… slower… and see if you come out with the same result.
L
lmao i bought my 4090 last week as well. its so sad...
1:02 Premiere Pro moment XD (if someone sees this, a re-export might fix it, because this happened to DankPods before)
I'm glad humanity has been given Linus in this era and not 300 years ago
4:20 is that spline reticulated?
0:02 Yvonne rockin' backpack prototype
2:07 : lololol the AI killed the Vibe
Yvonne with the save. Stop letting this maniac handle expensive shit!
🤣🤣🤣
6:20 doesn't GPU stand for General Processing Unit?
They didn't show the responsiveness from the first test... I wonder how long it takes to reply... 🤔
About 3-6 seconds. Just didn't want to break the flow that early in the intro - LS
@@LinusTechTips all of this hardware and generative AI still sucks lmao
@@LinusTechTips Well. I guess we can wait...3-6 seconds is still fast to me. I guess there would be a time when things are going **too** fast.
How many kidneys do i need to sell?
you only need to sell one of yours and 300 others from your friends
You need to sell more than half your organs; a kidney isn't enough for a single tube of that thing
Cost is about $70,000 for that CPU + 2 GPU board that plugs into the blade and you need 2 per blade. You are looking at about $3 million per server rack.
Do you see why NVIDIA's stock is so high?
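The commenter's arithmetic roughly checks out if you assume a GB200 NVL72-style layout with 18 compute blades per rack (my assumption; the comment doesn't state a blade count):

```python
# Working through the comment's numbers above.
BOARD_COST = 70_000      # one CPU + 2 GPU board, per the comment
BOARDS_PER_BLADE = 2     # per the comment
BLADES_PER_RACK = 18     # assumption: GB200 NVL72-style compute tray count

compute_cost = BOARD_COST * BOARDS_PER_BLADE * BLADES_PER_RACK
print(f"Compute boards alone: ${compute_cost:,}")
```

That lands around $2.5M for the compute boards; add NVLink switch trays, networking, power, and cooling, and the comment's "about $3 million per server rack" looks plausible.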
1:45 Linus Drop Tips. The Final Drop.
TLDR: Your AI waifu is about to look at you in a much higher resolution than ever and be even more realistic.
9:02: The "-ish" does some really heavy lifting here.
1:45 It could have been a historic moment
This entire video, including Linus, was AI generated on Nvidia GB200's! 😂
In fact the GB200 is just an AI hallucination and does not exist!
9:55 did she just look at you without you even using a camera? Is this the real new AI?
0:35 Linus holding it like that made me so nervous, given his history of dropping things
Finally, a system that can run Cities Skylines 2.
Can't wait for future generations to see this video and be like, "Oh yeah, 8 TB/s. That's roughly the transfer speed of those cheaper USB flash drives... What slow memory those guys had."
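For scale, here is the comparison the comment gestures at, in today's terms. The baseline throughputs below are my own ballpark assumptions, not figures from the video.

```python
# How big is 8 TB/s of memory bandwidth, compared to storage people know?
HBM_TBPS = 8.0  # per the video
baselines_tbps = {
    "cheap USB 2.0 stick": 0.000035,   # ~35 MB/s (assumption)
    "decent USB 3.x stick": 0.0004,    # ~400 MB/s (assumption)
    "fast NVMe SSD": 0.007,            # ~7 GB/s (assumption)
}

ratios = {name: HBM_TBPS / tbps for name, tbps in baselines_tbps.items()}
for name, ratio in ratios.items():
    print(f"8 TB/s is ~{ratio:,.0f}x a {name}")
```

So for the joke to land, future flash drives would need to be tens of thousands of times faster than today's.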
where is the real delay you said you'd show?
You know, it looks like a great deal, but the shipping is just too much for me. Guess I'll buy one next generation
Almost as big as drakes “gpu”
too bad he prefers mini itx cases
Oh my
@@ech 😂😂😂😂😂😂😂 this got me dying
@@ech nearly spit my drink out
Lol@@ech
0:19 nice to see Linus going on a trip with Yvonne.
As someone who worked in healthcare AI image recognition for live cardiology radiography, I can tell you that this GPU is way overkill if you have good engineering. You just need consumer-grade GPUs.
Game devs might need it perhaps, or they might need to grab a couple of 5090s. Gamers won't need more than the 80-class.
This GPU is for training on very large datasets. You don't *need* it, but it will help to work on larger models. - LS
So that's where all the memory Nvidia didn't put on their consumer GPUs went
11:35 Most jawdropping segue I've ever witnessed
That reaction time is uncanny xD
That’s what happens when you’re married to him.
Consumers: "Man, I hope the 50 series is good"
Jensen: "Let's see how many times I can say AI in my rambling two hour speech no one wants to hear"
Next, they'll combine Crypto and AI. They'll call it CrAI, because that's what most consumers will be doing as AI continues to be shoved down our throats.
1:41 I'm beginning to wonder if Linus does this on purpose every now and then
And it still can’t run Ark
Frrrr
Who tf plays Ark lol
I'm playing the newer ark @@tonimodahl7262
Nice, but still no good drivers for linux right?
New feature: you try to use Linux, it just shorts the gpu.
maybe we should let an Nvidia AI develop a properly working and up to date Linux driver.
Installed Debian with GNOME 40 on a random-as-fuck IBM blade, chucked in a 3060 to trial AI workloads for my company, and realized I was in a Wayland session by default; Xorg was nowhere to be found.
I think we're pretty much on a good path.
explicit sync is a thing now; apparently it's not bad
Nvidia GPUs work just fine now on Linux and have for a while. You may have to give up some ancillary features, but everything most people would care about is there, stable and ready to go.
8:48 oh good, glad we're on the same page here.
So weird to see Jake and Yvonne as NPC's
I cannot imagine how you do it 😂 just hanging around, goofing around, and filming the shots for your video, unfazed by all the other people haha