AMD & Nvidia Gotta Watch Out
- Uploaded 15 May 2024
- ► Check out today's hottest tech deals here: www.ufd.deals/
geni.us/iBwb
howl.me/cl0ZhVcNOOn
howl.me/cl0ZjgXGcHP
0:00 - Intro
00:17 - Google Vids: tinyurl.com/25a48ghc
• Become a better storyt...
tinyurl.com/294jqsc6
01:32 - Google’s ARM CPU: tinyurl.com/27yncdxg
02:21 - Google Chatbots: tinyurl.com/2cxrtt34
04:33 - UFD Deals: www.ufd.deals/
geni.us/iBwb
howl.me/cl0ZhVcNOOn
howl.me/cl0ZjgXGcHP
05:14 - Tesla Settles Autopilot Lawsuit: tinyurl.com/26ymdxyt
tinyurl.com/286bs3ks
06:33 - Intel Battlemage Launch: tinyurl.com/28lzknsz
tinyurl.com/25y39bay
08:20 - Comment Response
► Follow me on Twitch - / ufdisciple
► Join Our Discord: / discord
► Support Us on Floatplane: www.floatplane.com/channel/uf...
► Support Us on Patreon: / ufdtech
► Twitter - / ufdtech
► Facebook - / ufdtech
► Instagram - / ufd_tech
► Reddit - / ufdtech
Presenter: Brett Sticklemonster
Videographer: Brett Sticklemonster
Editor: Rikus Strauss
Thumbnail Designer: Reece Hill
Category: Science & Technology
As someone who has to deal with chatbots from time to time, it's incredibly frustrating. Most of the time the answer to my problem isn't in their responses and they just waste my time, and in the end an employee sends me an e-mail with the answer/solution to my problem. Sometimes that answer is wrong or doesn't help. Really frustrating.
I never want to beta test AI's "Killer App"
So you are saying you don't want to be the unwitting hero in a tech action movie?
nice one :)
@thatrealba I'd happily say that. Just imagine the amount of data they'd collect.
Brett: I don't peel plastic or stickers off.
I dare you not to peel the sticker off a CPU cooler. 😅
Reese's Mark Zuckerberg impression is really good. He came off a little human, but with a little practice it'll be perfect.
As an Intel Arc A770 owner I'm excited for Battlemage
That doesn't work with a ton of AI apps.
@user-yi2mo9km2s Thankfully almost no one uses AI apps at all, aside from maybe ChatGPT.
I want to thank you for your input on the Intel vs AMD & Nvidia vs AMD arguments. As someone who frequents the PC Building & PC Enthusiast Facebook groups, there’s nothing but rampant arguing on which company is better. And don’t dare ask a question because 80% of the comments will be people demonizing you for your choice in hardware.
It IS good when companies compete with their products. It ISN'T good when people act like the use cases for said products are always the same.
Also, people losing their minds because someone stress tested Intel CPUs and hit 400W made them think Intel is chugging power all the time, which is silly, because if you use the E-cores strictly, it's at best 100W usage 🤣
If they say after Thanksgiving but before Black Friday in November, it'll be between October 14th and November 29th. Canadian Thanksgiving is October 14, lol.
You know. I think I'm going to start celebrating Thanksgiving in October, so I can have two thanksgivings as well.
The 7950x3d is pretty much plug 'n play now on the latest chipset drivers/bios. Mine flawlessly executes top gaming performance and also runs a massive vm server setup for work. An absolute beast for both scenarios. The original janky issues with core parking seem largely a non-issue now.
Hi human Reece, I are from human race too. *Edited to say I liked the end rant!
If you own an ASUS Z790 motherboard, you need to set the ASUS-optimized CPU setting to DISABLED and use Intel settings, and your problems go away. Motherboard manufacturers truly overvolt Intel and AMD CPUs. Remember when AMD CPUs blew up? Yeah, the motherboards did it by overvolting.
I own a 13900K with the setting above and have no issues at all. JayzTwoCents also pointed this out weeks ago and discovered this problem with optimized BIOS settings from motherboard manufacturers causing issues and overvolting the CPU when it's not needed.
Can't wait for my future AI assistant to book me a $20,000+ vacation to China when in reality I just asked for takeout from Panda Express.
I hope Snapdragon enters the PC GPU and CPU market soon to provide tough competition to MacBooks and to Nvidia, because they are raising GPU prices unreasonably these days.
Wow Brett, was great to see you go wild at the end of the video man... That was Epic !!!
Honestly, one of the main no-nos for me about Intel is its power consumption. Even Nvidia's GPUs can be more powerful and less power hungry. AMD is just the bestie in power consumption versus quality of image.
Did you use arc gpu?
As I understand it, Alchemist had an architecture problem that pushed up the power draw, and as far as I understand, this should be fixed in Battlemage.
👏
Healthy competition has felt weirdly rare lately, but I like that we have the option to choose according to what we need. The internet is a place full of people trying to convince you what to choose based on nothing related to you...
I have a 7950X3D; what do I need to do with Game Bar? Did I miss something?
Wait, will there be an FE 7900 GRE? I saw a poster saying it comes out July 28th.
Whatever happened to that Alchemist+ refresh? Looks like they skipped it.
Absolutely agreed with your rant at the end. People should be allowed to buy what they want. Fanboying over metal and plastic has always been lame. Crazy how folks on both console and now PC do this 🤦‍♂🤦‍♂
As a Brit, that date format is more triggering than I expected.
I want a video of you peeling the protective film off every piece of equipment you have!
BTW, why do the lights say HN?
Just a thought on the 13900K: I'm running mine at 5.8 and have had no problems at all with any games, and I'm on a ROG Strix Z690.
I think you should peel the ones in your car; UV and heat might damage the film, and then it's harder to peel off.
I keep noticing over the last few episodes that the bitrate of the 4K video is very low. Either the export has a low bitrate or the videos are shot at a low bitrate; there are compression artifacts everywhere.
If I'm not mistaken, I think it's the AVX setting in the BIOS, because a lot of games don't even like to run if that stays on, and a BIOS update could help.
Reese?! *falls to knees* Nooooo?!
I thought Reece lived in South Africa? When did he move to the Uncanny Valley?
😂 uncanny joke!
AI acceleration on a GPU could become useful in gaming if developers can leverage it to have in-game NPCs adapt their actions in some way, making each playthrough unique.
That sounds like a nightmare to implement, but it would absolutely be cool
Brett if you don’t peel things off then what are your thoughts on bandaids? Do you still have a flintstones bandaid your mom put over that little boo-boo you had when you were 6?
PEEEEEEL OFF THE FILM FROM YOUR CAR INTERIOR
I know it's irrational, but OML does this bug me now that it's been pointed out.
Beta tester for the killer AI app? Where do I sign up 🥳
13:00 I think my next build would be a Threadripper if I can find 24 cores for cheap on the used market.
That last bit about enjoying life was more inspiring than it should be. Thanks!
Running a Ryzen 9 APU. Runs amazing, no noticeable difference for like 3 years, in a laptop that sees some good use.
I definitely trust that alien Reece voice to recommend a microphone.
My motherboard gives the CPU 150W if I disable C-states on a 35W CPU.
I've always wondered: what does “UFD” mean?
Just curious about your opinion: do you think we've hit a point in gaming GPUs where we won't get more performance from new cards without higher wattage demands? Since AMD seems to be staying out of the high end and Nvidia continues to demand more power, I feel like quantum tunneling is starting to be a problem and they can't just shrink the processors anymore.
That card holder joke was top tier
With the Intel issue, I can about guarantee it's motherboard power related and can be fixed via the BIOS. When I first got my 12900K + MSI Z690 Edge WiFi DDR4, it would run hot all the time, to where I had to underclock it to keep it under near-throttling levels on a 360 AIO. With the BIOS updates I've kept up on over the past year+, I no longer break 80°C, and I even have a small overclock on it.
Can I use your last video as a meme? Those were GREAT words 😁
Ocular diagnostics for self-driving car systems, as well as connecting it to my AI developer.
Prezi was awesome. I was the first to use it in many of my classes. Did it die off? That's a bummer; there was a lot of potential there.
OMG you are a monster with protective film.
😂 I agree!
3:40 Why isn't it? Well, a judge already said it was, so...
Either they check their chatbot or they don't use it, because judges are going to hold them liable.
>Google gemini chatbots
What a glorious dumpster fire those will be.
I really want to use battlemage, but I’m put off by the very real possibility of trying to do something and finding out Intel hasn’t optimized the drivers for that task yet. For now, I’m sticking with AMD for my next build.
13900K/7900 XTX on a Strix Z690-F D5, no problems here, but I do have a 253W power limit and a 0.075V all-core undervolt.
I'm surprised, but I'm actually looking forward to Battlemage.
Now that we have a video of ReeceGPT, we need one of KylerAI and Bretmini.
As a creator who's using an A750 and loving it, I'm all for that Battlemage hype train.
Peeling the film off is the last step in the build. After it is plugged in and posts...peel!! Only a monster doesn't do this.
I hope Battlemage comes out banging, with great drivers and improvements, to shake up the GPU market so they can bring prices down in a perfect launch.
I'm thinking of going AMD CPU / Intel GPU; Battlemage should be prime and ready for my 5600X. Can't wait.
Wait... before Black Friday but after Thanksgiving. So, after Nov 28th but before Nov 29th? Spoken like a politician, and I'm going to go divide by zero.
That AI Reese was banging out the deals with an air of authority the human Reese could only hope for. 😅
Reece, I'm gonna need you to go ahead and complete this captcha for me. 😬
I've been waiting for the 2nd or 3rd generation of Intel GPUs to jump onboard. I'm excited and crossing my fingers that Battlemage will be more competitive against the red and green guys!
Most people I know now use Canva instead of Google Slides, both solo and in collabs 🤣
So the Battlemage engineers have moved on to Celestial, according to Intel's GPU guy, because Battlemage is completed and the drivers are A-series compatible... That tells me the holdup all the way until November/December is purely decision making, OR hardware manufacturers not putting in the orders Intel hoped for. We were all told to wait for Q2, but Intel is dead silent on Battlemage...
Other benefits of going Intel over AMD for your CPU:
The iGPU has more function than just being a display out. Granted, it's not great for gaming, but Quick Sync is amazing. My 13700K has more options for life after being in my gaming rig, and during its tenure it can encode a stream if I decide to get back into that and my GPU's encoder kicks the bucket for whatever reason. Apparently it's really good for video editing and color grading, too. If you have a dual-purpose machine, Intel seems to be the way to go.
More cores for my dollar. This means I have more resources to experiment with virtualization.
Overclocking. I get it, there's not much room for it with these newer chips, but I've had both Intel and AMD, and Intel takes the cake when it comes to tinkering with this for me.
W AI Robot Voice Reece!
Maybe VideoCardz meant Canadian Thanksgiving October 14th?
**Linus Sebastian walks in to peel all the protective stickers**
Socks and sandals boy not allowed.
😂
A way to figure out if it's an LLM chatbot: see if it can stay focused on the task when you throw in distracting small talk and mildly derail the conversation.
Second note: multicore/multithreaded processes don't favor the 13900K over its direct comparison, the 7950X. In single-core tasks the i9-13900K has a single-digit-percent advantage over the 7950X, but in fully multicore workloads, since both have 32 threads, they trade blows, really with a single-digit-percent advantage to the 7950X. The 13900K has 8 performance cores and 16 efficiency cores (only P-cores have SMT), so despite having 24 physical cores, it runs about the same as a 16-P-core-only processor.
What it seems to boil down to is that you're familiar with Intel's quirks and don't want to deal with accounting for AMD's different set of quirks, which is fair. There just isn't a clearly 'better' CPU, only what you prefer.
To be fair, I'm happy just rolling with my 13500 and 6700 XT. I get the performance I need to get the job done and have no issues with gaming. I'll just wait until prices drop to upgrade.
Since the AMD/ASUS BIOS was brought up: hasn't Jay made a video talking about how motherboard companies push Intel's limits out of the box?
I've personally had bad luck with the few ATI/AMD Radeon cards I've owned, and I've also seen all the problems friends have had with theirs. I'm hoping Intel really does well with Battlemage, and maybe I'll try one. For me, I don't care at all about cost/performance; I like ray tracing and Nvidia's software features. Can't really explain why I don't buy AMD CPUs other than that I've been using Intel for so long it's just habit now. The last AMD CPU I bought was an Athlon XP 3200+.
Yeah we saw how well Google did with their Tensor chip. It’ll be fine.
We were all wondering when you were gonna replace Reece with AI. 🤔Maybe Kyler is AI too?
Canva is the new Prezi now.
Whoa... existential Bret! Is that a new character download for Fortnite?
Chatbots started with ELIZA. We have come a long way but still have the same problems due to limited training/vocabulary. Chatbots often don't admit that they don't know, and just keep using a “tell me more” type of response that ELIZA would use.
Remember Eviebot? It was all the rage like twelve years ago and people were in mild awe of it, but it was filled with short, pre-scripted responses that still took a while to generate, and it couldn't remember anything past the last message you sent. And the "human" face that made Evie so much more personal than Cleverbot to begin with was low-res, it animated in low framerates, while also being unresponsive and incredibly janky. What a long way we've come since then.
Intel chatbot says: "This coming Black Friday but before Thanksgiving 2025 definitely around then."
The i9-14900KF probably just needs a little bit more power; easy BIOS fix.
Intel needs to send you a Battlemage GPU to review!
Remember, AMD makes CPUs with more cores and higher clocks than the X3D line.
I have an Intel A770 card; I actually bought it at the start of this year. I live and breathe computers in both my personal life and work life, so I'm not too bothered by the nuances Intel cards require computer-troubleshooting skills for. While I'm a little bummed that I could have waited for the Battlemage lineup coming out soon, I'm honestly just happy that my dollar goes toward supporting possibly better competition in the future. Hey, maybe I'll still grab one and effectively get 2 Intel GPUs for the price of one Nvidia one :)
Alien Reese gives me the creeps ;)
I have wanted an Intel Arc GPU for the past year, but I never buy the first generation of anything. Let the problems get ironed out. When Battlemage drops, I'm going to get one.
Sorry for the delay, had to get back to Intel. Nvidia was developing for AMD, if you can't tell. A lot of those chips need a very rigorous burn-in and AI training cycle in order to pack so much into one spot; sometimes running games in 4 separate windows is enough for the GPU and CPU to allocate more hardware to the same thing at once, allowing for more performance.
That's why I got a 7950X, not the 7950X3D, when I built a new system. Better price/performance than Intel, no explody, and no dealing with core scheduling like with the 3D. Just 32 threads to do all the work things and still play games.
Can't wait for battlemage
I use an A770 in my guest/game server PC, lol.
Another reason to have an AMD CPU, especially at the high end, is to not have an industrial heater as a CPU and need a power plant to power it, with the respective chiller to cool it down. AMD is better at power efficiency and temperature than its Intel counterparts. If that's a priority for you, then AMD is usually the better choice in this regard.
I'm rooting so hard for Battlemage to be at least competitive, so that Nvidia and Radeon aren't so stingy with graphical power and greedy with high prices next time.
That was a good explanation of perspectives for the taters 🤭, not sarcasm haha
What. The. Frick!?
I am also wearing the 'Get in nerds' shirt today *and* I leave the protective film on things.
It absolutely annoys people at my office and in my friend/family circle. There's no real motivating factor other than that I just don't feel I need to remove the film.
Their reaction is funny though 😂
If I had tech friends, I'd likely adopt this stance, just because. But alas... no techy people in my day to day.
I think Brett believes companies value their shareholders' positions more than they actually do, lol.
With Google's new AI, I wonder how this may affect Nvidia's stock 😮
Just build a 7950x system Brett. Get a nice ASUS board and 64GB Corsair Vengeance DDR5 to go with it. Sell or give away the i9.
As someone who has a Pixel 8, I doubt their custom server chips are gonna be that power efficient. Granted, they aren't using Samsung's designs, so that'll probably be better
You can't say "breakfast."
14:55 bro took Philosophy 214 in college.
Maybe we need a UFD Philosophy channel, or Sociology-Tech News. Teach them simple gamers about life!
Apple was sued once when one of their “authorized repair centers” employees stole a customer’s intimate photos. This was not an Apple Store employee but an independent contractor and people said Apple was responsible. The OFFICIAL company website cannot fall back on “why did you believe a chatbot”, because it speaks for the company by design.
My issue with how people usually choose their parts is the tribal/hooligan mentality of “my team is better,” where they purchase one brand because they believe it to be the “better” one instead of looking for what they actually need. If you want a gaming-focused rig, go for an AMD CPU and either an AMD or Nvidia graphics card, depending on your budget. You also need the same rig for video editing? Maybe go for an Intel with an iGPU; Intel Quick Sync is very good for those kinds of workloads. You also need to render or model 3D stuff? The CUDA cores of an Nvidia card are, sadly, the only choice right now for most, until ROCm becomes more widespread. Unfortunately most people just go, “Go Intel for CPU, go Nvidia for GPU,” when they could have gotten a better deal if they actually did their homework.
Chatbots make me angry... They attack my job, when you're a customer support agent / over-the-phone tech support / ISP support.
Chatbots are scary and they attack my job, and they SUCK!
intel: I WILL ACCELERATE YOU
Like I said before: if you have seen the anime éX-Driver, self-driving is a no-go.