What The Hell Is Up With AI?
- Added 19 Jan 2024
- Paul from @paulshardware tells Adam all about his thoughts on AI and what kind of AI "demos" he saw at CES 2024
*This video is sponsored by SilverStone. The innovative IceMyst all-in-one cooler allows you to stack RGB fans on the water block, which easily allows direct airflow to any hot spots on your motherboard, VRM or DRAM. Pick up the IceMyst AIO today and equip your PC with this unique high-performance cooler: www.silverstonetek.com/en/pro...
Buy PCWorld merch: crowdmade.com/collections/pcw...
Follow PCWorld for all things PC!
--------------------------------
SUBSCRIBE: czcams.com/users/pcworld?sub_c...
TWITTER: / pcworld
WEBSITE: www.pcworld.com
#ai #ces2024 #interview - Science & Technology
Adam is getting so good at presenting videos. Props! :)
4 butts? That's even more intimidating!
I can't remember which channel showed it, but there was a CES segment that showed an AI board that slots into an M.2 slot. If that costs between $100-200 AND I have a use for it, I will most likely buy one.
Edit: Just checked. The video I mentioned is on this channel, PC World, and was posted a few days before this video.
Intel sold a Movidius VPU for M.2 three years ago. Google sells a TPU M.2 board via Coral. Even Qualcomm has M.2 modules of their inference cards.
And there is probably more.
Unless you are developing computer vision models, CNNs or some LSTMs like it's 2013 - what is the use case for you?
@@Veptis Your last question is the most important one. "What is your use case?" I don't have one yet and I'm sure most people don't. That is likely why I didn't know about the other ones you listed. I've never had reason to look into it before and still don't currently. But that might change in 2-3 years.
@@The_Opinion_of_Matt my use case for example is running language models of size 34B or more at fp16. And there is no m.2 or PCIe card that can do it. The only real product I could buy is two RTX 6000 Ada.
My best chance is with NeuChips or Qualcomm, should their inference cards eventually be sold for workstations.
With those old generation accelerators you can't do much. And your CPU, GPU or even NPU is probably a way better fit for the workloads.
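The 34B-at-fp16 claim above checks out on the back of an envelope. A minimal sketch of the arithmetic (the per-card VRAM figure is the advertised 48 GB of an RTX 6000 Ada; no overhead for activations or KV cache is included, so real requirements are higher):

```python
# Rough VRAM estimate for holding a 34B-parameter model's weights at fp16.
# Illustration only: ignores activation memory, KV cache, and framework overhead.
params = 34e9            # 34 billion parameters
bytes_per_param = 2      # fp16 = 2 bytes per parameter
weights_gb = params * bytes_per_param / 1e9
print(f"weights alone: {weights_gb:.0f} GB")   # 68 GB

vram_per_card_gb = 48    # RTX 6000 Ada advertised capacity
cards_needed = -(-weights_gb // vram_per_card_gb)  # ceiling division
print(f"cards needed: {cards_needed:.0f}")         # 2
```

68 GB of weights alone already rules out any single consumer card, which is why the comment lands on a pair of 48 GB workstation GPUs.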
@@Veptis Yeah, you are doing way more with AI than I ever anticipate using it for. At best I could maybe see using it to optimize home energy usage, but that likely won't be for at least another 5 years, after home automation is more up and running. I currently work with Building Management Systems, specifically the Siemens-branded system, so home automation wouldn't be a significant leap for me. I've contemplated the idea of building a system from scratch using an Arduino or Raspberry Pi before, but I haven't dug into it yet.
We've already seen progress in audio editing due to "AI".
The magic wand - and background removal in photo editing is due to "AI".
In a way it's not so new. Procedural map generation is "AI".
Heck, (pretty dumb) AI has been in videogames since the 90s.
Stick AI on it and make it lame.
It reminds me of "The Cloud" back in 2007. It was being pushed by every company but most people didn't understand what it was. Of course it was basically just online data storage at the time. I was a computer tech back then and had just completed a build for a customer and he asked if it was powerful enough to handle the cloud. I suppressed a laugh and told him as long as he's connected to the internet. That's what AI reminds me of today.
I can see progress in video editing, photo editing, post-production effects, not just 3D but also 2D animation, due to AI processing - let's call it the NPU.
Look, if a process is faster on the NPU, then it's faster than on the CPU or GPU.
So we might even see faster spreadsheeting due to the AI-components.
Or file compressing, encoding. Wherever it may be faster, it will help.
"Ah, maybe it's like a tiny problem if you like use copyrighted material and stuff, but I don't think anybody has the answer to that." (Every AI tech guy at the moment)
No? Really? So as long as it's not the copyright of the Adobe and Disney behemoths that's getting infringed we suddenly forget how it works? After years of being told, in cinemas and everywhere else, we were naughty pirates and as individuals should and will go to jail because we made big ouchie to billion dollar corporation?
And nobody has answers if it happens to individual artists, not even a clue, at all? I find that quite hard to believe, honestly.
It's weird how in your world, AI guys are the bad guys here, while in my world it's Disney / Adobe.
The answer is clearly abolish copyright.
The argument for generating images being problematic (in terms of creating the models) falls flat if the models weren't trained on copyrighted data. It is to be expected that in the very near future you can imitate everyone's style without it being part of the training data.
It is already the case with voice cloning, which can copy just about everyone well enough (and RVC-2 used the fully open Hifi-TTS dataset for its base model) and image and audio generation tasks are the same things, since audio generation works by generating 2D spectrogram images.
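The spectrogram point can be illustrated directly: a short-time Fourier transform turns a 1-D audio signal into a 2-D, image-like array, which is why image-generation techniques transfer to audio. A minimal numpy sketch (not any particular model's pipeline; frame and hop sizes are arbitrary choices for illustration):

```python
import numpy as np

def magnitude_spectrogram(signal, frame_len=256, hop=128):
    """Turn a 1-D audio signal into a 2-D (frequency x time) magnitude array."""
    frames = [signal[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    # rfft of each windowed frame; transpose so rows = frequency bins
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T

# 1 second of a 440 Hz tone at an 8 kHz sample rate
sr = 8000
t = np.arange(sr) / sr
spec = magnitude_spectrogram(np.sin(2 * np.pi * 440 * t))
print(spec.shape)  # (129, 61): a 2-D array, like a grayscale image
```

Once audio is in this 2-D form, "generate an image" and "generate a sound" become the same kind of task, which is the equivalence the comment is pointing at.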
My two takeaways from this discussion:
- People gonna people
- AI: it's full of buttholes
😁
Kidding aside, that was an excellent, high-level discussion. Agreed on various points made, including running on local hardware vs cloud.
AI is definitely in a hype and investment bubble right now, but there are so many real use cases (both good and bad) that it's going to continue growing after the bubble bursts a bit.
Excellent!
Paul's good people. 🤓
Since "AI" models do contain the training material in an encoded way, they should be trained on material with explicit permission; otherwise it seems like distributing the model is breaching copyright. OpenAI stated that large language models "wouldn't be possible" without copyrighted material, so at least it's now clear a court can decide if it's OK. NVIDIA has been successful in training their DLSS models seemingly without breaching any copyrights, so at least they've shown it should be possible. Models should in my opinion be very specific anyway, for best results and the least problems with source material.
dlss is not a language model.
The training data for dlss is copyrighted material. It's 16K renders from various games. And even just rendered 3D scenes.
Copyright law doesn't cover "encoding" copyrighted material - never mind the complete inability to demonstrate that that's actually true.
@@TonkarzOfSolSystem So it's just a coincidence that when asked, the models produce full NYT articles word for word, and Marios when asked for an Italian plumber, as if there is not a copy of either encoded in there? And if encoding is not covered, I surely can just re-encode a movie to a different format and suddenly copyright stops being an issue for distributing it to my friends?
@@Veptis Everything is copyrighted material if it's not in the public domain. The question is if one has permission to use it. Nobody has challenged NVIDIA, even though they have been doing this for years by now. It's possible of course that they are in a similar breach as anyone else and nobody thought about it.
@@kvahlman A human artist or writer can reproduce articles and art the same way and we've yet to outlaw brains.
How do you go from AI to dragon butts?
Only at CES 😂
-Adam
W Paul!
I miss the days of the 8087 ..
I remember them, but can't say I miss them. With the power I now harness I would be insane to go back to that.
Two super knowledgeable AI experts talking about AI 😂
"experts"
@@akiraic I stand corrected 😉
AI experts are boring, I'd rather hear about dragons and buttholes 😛
@@zivzulander That certainly took an unexpected turn 🤪
"Can I please have two kilos of AI?"
Manipulate/manipulative always carries a negative connotation, unless you're talking about controlling an inanimate object.
Influence is the similar term without a negative or positive meaning.
Looking forward to a new Internet without search engines. Use local information.
I wonder how many more years before androids will be walking around CES.
That's the wrong person to ask what AI is. If you ask me, he just failed the exam question on the meaning of AI.
People out there will make it unethical - yup, you hit that on the head. So forcing it into the OS of a computer is the worst idea possible in all of mankind's history. Go, go, go MS - better telemetry to help collect every possible bit of personal data for its own nefarious reasons, better adverts on the desktop, better ways to stop you disabling or enabling parts of the OS that MS doesn't want you to have unless you pay up!
I liken AI to the Gutenberg printing invention, which ignited the Renaissance and the Industrial Revolution. We are currently in the early stages of the AI Revolution, akin to the embryonic phase of a monumental transformation.
AI is currently going through YouTube to demonetize videos it doesn't like. Careful not to toe the party line.
Artificial Intelligence.
When cars were new, many people died from them, and still do, but they are widely used.
AI will do the same and stay, with the same excuse.
7:36 Almost as if client-side anti-cheat never made any real sense to begin with, since it never, ever worked. Server-side anti-cheat or bust, period.
I won't be using AI, I will do my own thinking and protect my privacy at the same time. Thanks but no.
Ok smarty pants.
@@Black_Cat_997 go ahead and give your data, give your iris, your DNA, your face, you will have nothing and won't be happy.
You are on a platform that is using various models to serve you more relevant videos and ads. You have been giving your behavior patterns to Google for years.
@@Veptis I use Linux 24/7, I don't have any social media accounts, and I try to leave as few of my behavior patterns as possible. I don't give likes, and from time to time I remove my comments on videos. I don't know if that is enough, but I try.
The best thing about AI is that it contributed to killing the PC gaming market, which saved me time and money. Other than that, I don't care very much.
Generative A.I. is immoral given that it's trained on stolen artwork. Generative A.I. shouldn't be used currently, and people should feel bad about using it. They won't, because no one cares about actual artists. They just want the artwork for nothing, all the while marveling at artists' skill and work. It's an absolute JOKE that generative A.I. is even being considered. A.I. will ultimately steal jobs and increase prices. I do think that there are valid forms of A.I. use, though. I am already tired AF of the overuse of "A.I." as a term; we don't need a Samsung fridge with A.I.
You're not "skilled" by using prompts to create artwork. I am really saddened to hear Paul is using generative A.I. Massive blow to my respect for him, or anyone that uses it. Unreal.
Paul is a good guy. No sense in that talk.
I didn't say he wasn't a good guy. I can strongly disagree with what someone is doing and still maintain a level of respect for them. @@robertlawrence9000
AI is stupid. It's all stupid. If you need a computer to do something, program it to do that. No need to make everything "AI".