7 Reasons NVIDIA 40-Series GPUs improve the workflow - What Gamers didn't mention!
- Published: 8 Jun 2024
- 7 reasons why the NVIDIA 40 Series is awesome that gamers didn't tell you.
👉 Check out Zotac 4080 AMP Extreme AIRO: bit.ly/3KjGZpP
GPUs Used in this video:
Zotac RTX 4070 trinity geni.us/ZTC4070trinity
Zotac RTX 4070 Ti AMP Extreme AIRO geni.us/zlwer or geni.us/ZTC4070ti
Zotac RTX 4080 AMP Extreme AIRO geni.us/jstvFn or geni.us/aiaV
Zotac RTX 4090 Trinity OC geni.us/Ztc4090AIRO
Zotac RTX 3090 Ti AMP Holoblack geni.us/dDdAE
👇❗️PC Builds for Creators [Latest]❗️👇
➡️ $750 BB4B (Best-Bang-For-Buck) PC geni.us/750-1300BB4B
➡️ $1500 BB4B Creator PC geni.us/1500-2300BB4B
➡️ $2500 BB4B Creator PC geni.us/2500-3500BB4B
➡️ $4000 BB4B Creator PC geni.us/4000-5kBB4B
Want me to check out some tech or interested in collaborating?
📩 email me: thetechnotice@gmail.com
FB: / thetechnotice
IG: @thetechnotice
#RTXon #nvidiastudio #Zotac #NVIDIA
Video produced by Lauri Pesur
Edited by Sam Ruddick
-------------------------------------------------------------------------------------------------------------
GEAR USED IN THIS VIDEO:
[All 'geni.us' links below redirect to Amazon (or other online stores) and may earn commissions]
AUDIO:
-The Recorder + Mic I Use: geni.us/tscmdr-10l
-Shotgun Mic I Use: geni.us/T35T
-My Main Headphones [CHEAP]: geni.us/dzv7Q
-My Main Headphones 2.0 [NOT CHEAP]: geni.us/as7q9v
VIDEO:
-A-roll Camera: geni.us/NvnoA
-Lens for A cam: geni.us/J009I
-B-Roll Camera: geni.us/wVlF8J
-Top Down Cam: geni.us/56576C
-Main Bokeh Lens 4 Sony: geni.us/k09oA
-The Wide Angle beast w OS: geni.us/9KfCwy
LIGHTS:
-Main Fill Light: geni.us/qTbE
-Hair & RGB Lights I use: geni.us/TdY1
-RGB Tube Wall Light Behind me: geni.us/96gWdz8
-The Fire effect light: geni.us/GX5KA
OTHER:
- Favourite camera Monitor w LUT: geni.us/tecUD
- Music Licences:
▶Get 2 Months for FREE when joining Artlist/Artgrid below◀
Music & SFX Licensing Site: geni.us/TNartlist
Stock Footage Site: geni.us/TNartgrid
DISCLAIMER: Some of the links on this channel are affiliate links, which means that by clicking on them and purchasing the items I get a small commission. I'm part of the Amazon Affiliate program and earn revenue from qualifying purchases. Using the provided links does NOT change the price for you.
⏲️
0:00 What Gamers DIDN'T tell you!
0:20 Sponsorship Disclosure...
0:59 4080 Killing 3090ti
1:15 ✅ No 1
1:25 What is AV1?
2:30 40-series NVDEC & NVENC explained
4:00 AV1 Streaming
5:00 AV1 vs H.264 video quality Difference
5:45 ✅ No 2
6:40 VRAM Increase
7:38 ✅ No 3
8:08 2x+ the Performance in Blender
8:59 OctaneBench Performance Increase
9:30 ✅ No 4
9:52 ✅ No 5
11:00 ✅ No 6
11:40 Premiere Pro Timeline Playback
12:00 Tips for Photographers
12:45 ✅ No 7
13:25 Peak power consumption
13:38 Average vs PEAK Power consumption
14:08 30 vs 40-series Power consumption
15:14 Performance PER WATT - WOW!
16:30 Conclusion - It's worth it!
17:26 Best PC for Creators!
The pricing is killing it for Nvidia; that, plus the lower VRAM (and narrower bus bandwidth), has many refusing to upgrade to or purchase the RTX 40-series for gaming and/or professional workflows, unless you have deep pockets and can grab the RTX 4090, which is really the only GPU worth considering.
I believe that some features make up for the narrow bus, but the fact remains that we still have high prices and low VRAM.
For architecture & 3D this gen has been a game changer; we're close to being able to render in real time in Blender. One note when comparing GPUs between gens: for a lot of us, VRAM becomes the major constraint once you've set up your workflow, so comparing a 4080 to a 3090 Ti doesn't really make sense in terms of swapping one card for another, as you'd probably need the 24GB of VRAM for your work as well, leaving only the 4090 as an option.
Exactly!
So I have a question. My cousin also needs a rig for architecture & 3D modeling for his studies, and I'm trying to find out if the 4070 or 4070 Ti with 12GB of VRAM is enough for this task, but I can't seem to find people covering this topic. What would you say?
probably fine I'd say. It's when scenes get more complicated and you run out of 12GB of VRAM, but I'm guessing here, not sure what scenes he's working with/studying :)
@@theTechNotice He is studying landscape architecture. He told me he'd have to render stuff next semester. It's going to be a birthday present for him, so I didn't get too far into what software he uses. The CPU is gonna be a 13600K, and the best GPU for my budget would be a 4070 Ti or so, but I don't know if the VRAM is gonna hold him back.
What are people's experiences with the 40 series cards and Unreal Engine 5 for creating architectural renders? I'm mainly an architectural photographer (who does a bit of video with Resolve), so I was thinking about either the 4070 or 4070 Ti, but I also want to dabble a bit in Arch Viz with UE5 and wondering if those cards would be up to it, or if i'd need to go with the 4080 or 4090? They seem really highly priced for something that isn't my main day to day bread and butter.
First thing I did when I put the 4080 in my PC was import a stupid high poly count model into Blender (13 million I believe it was) and started warping it in the sculpt tab, and the 4080 pretty much handled it like a champ. Still stuttered here and there, but compared to my old 6600xt which lagged hard on a model with less than a quarter the amount of polygons I was pretty damn impressed. My wallet on the other hand is still not impressed 😅
3090 vs 4090 looks impressive, since I have the former. Seen purely on creator workload alone, it's appealing. But man, why can't they increase the VRAM a little to make it more palatable for 3090 owners to upgrade? Also, the rest of the lineup has too little VRAM. Why in the world does a $600-800 card only have 12GB and a $1200 card only have 16GB, when 3090 prices right now are about the same? Everything other than the 4090 is not a good look.
The 3060 12gb is a welcome oddity at least. Especially for users of Resolve and Stable Diffusion. Here's hoping that there will be a 4060 12gb or even 16gb!
Yeah, if the Nvidia 4000 series were only about the 4090, it would be a controversial but good decision. But draining more money for less from people with more limited budgets, just because they still trust Nvidia, is awful; especially in a financial crisis it's not acceptable. I'm expecting only one more step from them to expose themselves: cancelling the 90-class in the future, because it's a way cheaper GPU to make than Nvidia would like. The product is also a good moneymaker, so I'm expecting that outcome.
@@nenoman3855 8GB on the 4060 R.I.P
Yeah, as much as I like the 'small increase' of VRAM compared to the 30 series, I do agree with you. I think the 4090 could've had 36GB and the 4080 24GB, then no problem.... makes me wonder if they're gonna do a 'Super' and slap some more VRAM on it? 🤔
Exactly, because they know that most creators cannot get by with 12GB these days. So it's an effective way to ladder their prices.
Oh yeah, blame the gamers. Wait until your project runs out of VRAM 😢
Wym
@@vanrensburgarchitecturebim1641 I didn't say AMD was any good. I'm just saying 12 gb of VRAM isn't enough for anyone at a high resolution, or even 1440p. It's been 7 years since the 1080ti
@@JBrinx18 Also, can you tell this to Apple too? Their base-model MacBook Air still launches with 8GB of RAM and looks the same. It's not about staying at the same VRAM; it's about game optimization. I'd recommend you compare TLOU Part 1 / Hogwarts Legacy graphics to Red Dead Redemption 2 to see for yourself that RDR2 has much better graphics and uses less VRAM. Game developers have become lousy and don't optimize properly before launching. Just don't buy games and torrent them. I personally never buy games and have always torrented, though I've spent thousands of dollars on GPUs, because I feel hardware is worth it but games are not.
@@pinakmiku4999 Yeah, do you know how long RDR2 took to make? 8 years! And everyone said it was still unoptimized and too hard to run when it first came out. They also have a massive team with a ton of funding, they can hold out for that long. Optimization takes a lot of time, and developers for years have had to spend several months to get you games that run under 8gb of ram. We've been living with 8gb GPUs as the midrange since 2015. It's time to move on, but Nvidia doesn't want to give you more VRAM. We have to demand more
@@pinakmiku4999 also, your lack of respect for developers time and effort by piracy is a blight to the industry as a whole. If everyone was like you, we would just have crappy free to play games left
The new NVENC is about 1.25x faster than the one found in the previous RTX 3000 series.
By itself, that is not a huge difference.
But since the new RTX 4000-series cards have two NVENCs in them,
it ends up being about 2.5x faster than the previous generation.
Now that is a huge difference for video editing!
How about the lower-end models like the 4060? Is it the same NVENC or downgraded?
@@storyanaksekolah2 As far as I know, only the 4080 and 4090 have dual encoders; the rest have just one.
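The encoder math in this thread can be sketched quickly. This is a rough model using the ~1.25x per-encoder speedup and the dual-encoder count quoted above; real export times also depend on decode, effects, and disk speed:

```python
# Rough estimate of relative NVENC export throughput, based on the
# figures quoted above: each 40-series NVENC is ~1.25x faster than a
# 30-series NVENC, and cards with two NVENCs can split a clip
# between encoders.
def relative_throughput(gen_speedup: float, num_encoders: int) -> float:
    """Throughput relative to a single 30-series NVENC."""
    return gen_speedup * num_encoders

rtx30 = relative_throughput(1.0, 1)     # baseline: one older encoder
rtx4070 = relative_throughput(1.25, 1)  # one newer encoder
rtx4090 = relative_throughput(1.25, 2)  # two newer encoders

print(rtx4070 / rtx30)  # 1.25x
print(rtx4090 / rtx30)  # 2.5x
```

This is where the "2.5x" figure comes from: 1.25x per encoder, times two encoders working in parallel.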
LMAO, where's the price comparison. You know, performance per $
If I did it in 3D animation it would look good for the 40 series; in gaming, probably not... but hey, I'm not a gaming channel ;)
Would love to see a comparison between AV1 and H.265, instead of H.264, which is old now, on those GPUs.
I think at least covering the price issue would have helped this video. There are ways to do sponsored videos and still be impartial. You have to point out the negatives, of which the 40 series has many: 1) price, 2) size, 3) cutting corners (memory bus, VRAM), to name just a few. This series might be good, but it's not at all consumer-friendly, and if this trend keeps up they will kill the PC gaming market. If that happens, artists and creators will be paying double for the 5080.
Exactly
Size depends on what model of the 40 series you're talking about. The 70/70 Ti cards are significantly smaller than their prior-series counterparts. Price is almost the same series-on-series if you account for inflation and the valuation of the $.
The 4070 is roughly equivalent to the 3080 and costs $100 less. Heck, the purchasing power of $699 in 2020 is roughly $815.20 in today's dollars.
I know the costs are too high. But people have forgotten that $399 in 2010 isn't $399 in today's dollars.
It just seems like most of those talking points are standard AMD fanboy complaints. In terms of #3, that's a different conversation.
All I know is that I have to use AMD as a daily driver for my job, and I'm not impressed. But that's just my experience. Your mileage may vary.
Most likely they're going to run into issues making a revolutionary architecture and they'll just release a 41xx or 45xx series instead.
I'd like to say to DaVinci editors looking for a GPU: if you're getting the 4080, just spend a couple hundred more and get the 4090. The price difference is smaller than the difference in performance and stability. Oh, and turn on ECC.
Especially since they are staying in stock now and some are even slightly under MSRP in the US at least.
The ONLY thing I don't like about the RTX4070 is the 12GB VRAM buffer. I can't justify buying an RTX4080 due to the PRICE. I've considered AMD but for me the software just isn't there (I use CUDA but there's other reasons). So where does that leave me? Enjoying my GTX1080. Still doing great. Honestly, this entire GPU years-long issue has made me re-evaluate my "need" to upgrade my PC. MOST of my games that I currently play would benefit in NO WAY from a better computer. Currently playing an old WOLVERINE game. And I play STRATEGY games like CIV 5. I have God of War but FSR2 gets me through nicely at 60FPS/1440p High(ish) settings... even Spider-Man Remastered runs great. Ray-Tracing? No. But I can't justify $600USD just for a slightly prettier game... so I'll hold off for a few YEARS then just build a completely new gaming rig.
The RT/DLSS upgrades are significant enough to make it worthwhile. The lower power draw of the 4070 compared to the 30 series is pretty incredible. I took a jump to the card everyone hates, and it is a monster. Especially compared to a 1080. It's not just the raw power, it's all the AI/software upgrades. Like the video upscaling.
@@SlavicCelery DLSS isnt an upgrade
@@ScoopDogg DLSS 3.0 is an upgrade. If you play single player games, it absolutely is an upgrade.
All YouTubers talk about graphics cards for gaming, not for editing. Your video is very helpful, thanks ❤
Quite the incendiary title, one that between it and the ngreedia sponsorship, you're guaranteed to make people pause the video at that point, leave an angry comment (like this one), then watch something else.
That being said, I agree with the hidden message in the title - the rtx series are NOT for gamers; ngreedia made a gpu for other markets then simply advertised it to gamers. Fake frames and fake resolution is setting ngreedia up to make the 'performance' of gpus based on the fake frame/resolution software. What this means is your new $3000 USD gpu won't perform much better than your old $1000 USD gpu in game that has fake frame/resolution version 4 instead of version 6.
Rtx is literally unusable without the aforementioned fake frames and fake resolution for all but $1000 USD variants and even with fake performance, you're probably better off keeping the fake stuff and disabling the ray tracing. Until the cheapest gpu or consoles can do ray tracing like the rtx 3090 at a minimum, ray tracing is NOT in the near 'future'. Which means you paid a premium for tech that is largely irrelevant.
So, ngreedia should have called these 'professional' cards, charged $5000 for them, then released a proper GPU for gamers. So this is not on gamers; ngreedia hid the fact that these cards are NOT for us and tried to get us to overpay.
Sir, when will Adobe provide dual-NVENC support for the 4080 and 4090, including AV1?
Thank you, just subscribed. I'm looking for a GPU for 8K H.265; I'm having problems with blue-screen crashes (code: VIDEO_MEMORY_MANAGEMENT_INTERNAL) and can't edit videos right now. Do you think the ASUS ROG STRIX GeForce RTX 4090 OC Edition will fix this problem? I tried everything to fix my RTX 2080 Ti OC, including a clean driver install. Tomorrow I'll try resetting the BIOS and a clean Windows install. If that doesn't work, I'll buy the card.
Try installing more system memory (RAM), at least 64GB, for 8K output. I am still on a 1080 Ti, no problem for me.
Does the new 4000 series improve the optical flow speed warp in Davinci Resolve?
I also wondered about same thing
Nice to have such details for creators when it 's talking about GPU ! thx man !
If I heard right from somewhere else:
the dual encoder is only available on the 4070 and up for desktop models, and only on the 4080 and up for the laptop/mobile models.
Hi! So..
I have a question. You once said that it doesn't matter which company a graphics card is made by; just find the cheapest offer.
But I'm interested in KFA2, a company that makes Nvidia graphics cards. I'd be interested in a review of them from your side. They're much cheaper than the others, which gives me some concerns. What do you think?
Thank you for your reply.
Good video. Two questions. Can the i9 13900k do AV1 encoding? If I'm using say a RTX 3090 will the system be smart enough to use the CPU instead of the GPU? I'm thinking in terms of Resolve. Thanks.
Not hardware encoding, no.
Yes, i9 13900k *CAN* do AV1 encoding. Pretty much any modern CPU released in the last several years can. The software you'll be using won't use your 3090 even if you wanted to since RTX 30-series can't encode AV1.
@@theTechNotice While waiting for your reply, I found a video on the Tom's Hardware channel on YT. At 38:31, he presents a table showing the performance of the i9-13900K encoding AV1 compared to GPUs. The performance doesn't look great, but at least it's doable. So yes, it does seem possible. Is the video saying what I think it's saying, or am I interpreting it wrong?
czcams.com/video/elZH8iXGTPk/video.html
@@bgtubber So to ask the obvious, Resolve will automatically switch to the CPU? The option to encode in AV1 won't be greyed out if I don't have a 40 series Nvidia GPU? I currently just bought a 3090 at a very good price and intend to upgrade to the 13900k.
Yes, but that would be software encoding, which basically means it uses the CPU cores rather than a dedicated engine, which would be much more power-efficient and faster :)
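For anyone wiring up the encoder choice discussed in this thread, here's a minimal sketch. It assumes an ffmpeg build that includes the `av1_nvenc` (40-series hardware) and `libsvtav1` (CPU/software) encoders; the bitrate and filenames are illustrative:

```python
# Sketch: pick an AV1 encoder for an ffmpeg command line depending on
# available hardware. RTX 40-series NVENC can hardware-encode AV1
# ("av1_nvenc"); on a 30-series card you fall back to a software
# encoder such as SVT-AV1 ("libsvtav1"), which runs on CPU cores and
# is slower and less power-efficient.
def av1_encode_cmd(src: str, dst: str, has_40_series: bool) -> list[str]:
    encoder = "av1_nvenc" if has_40_series else "libsvtav1"
    return ["ffmpeg", "-i", src, "-c:v", encoder, "-b:v", "8M", dst]

# RTX 3090 + 13900K rig: software AV1 encode on the CPU
print(av1_encode_cmd("in.mov", "out.mkv", has_40_series=False))
```

Resolve and Premiere make this choice for you internally, but the trade-off is the same: without a 40-series card, AV1 export falls back to the CPU.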
Until recently, I had a Zotac mini 1080ti. Very happy with it. Worked amazingly.
I'd like to see a comparison in Lightroom Denoise - it makes a huge difference between older and newer GPUs and no one has tested it yet.
A lot of photographers are looking into graphics cards now; for me, this would be a big reason to upgrade.
You're a savior for those of us who are all about design and only mention gaming as a way to relax. You deserve more.
I have the ZOTAC GeForce RTX 4090 AMP Extreme AIRO and love it for my channel. I also had the 4070 Ti and 4080 versions, but 24GB is on another level!
I bought a Zotac 4090 2 months ago; it's a really amazing card, worth every penny ❤️
For a purely photographic workflow, Photoshop and Lightroom only, is the Intel Arc A770 enough? I know you said Adobe hasn't fully supported Intel Arc yet. I do 300+ layers in Photoshop and lots of panoramas and photo stitching, with files ranging from 4GB upwards. Thanks.
Image editing tasks are heavily CPU-bound; you can get away with an iGPU as far as graphics go. I suggest focusing your budget on the CPU and system RAM for the best results. A good NVMe drive goes without saying. The i5-13500 is really the sweet spot, and according to Puget, it's merely 10% slower than the 3x more expensive 13900K.
@@zarkozi Oh wow, thanks for the advice above. Well noted, cheers.
Waiting for workstation RTX series reviews from you man.
Such a nice video! greetings from mexico !! I always follow your videos!
Anyone know how to fix low GPU usage on a Zotac AMP 4080 with an i7-13700K?
TN, can you tell us which cards have dual encoders/decoders and which don't?
Right now, the RTX 4070 Ti and above (4080, 4090) do.
but the 4070 doesn't....
No matter what, this is the actual 4060;
they named it 4070 so they can charge as much as possible..
Yep.
Hell nah! The 4000 series is a significant performance uplift, but it takes the disappointment award of the last 10 years of Nvidia.
This one wins a pin..
wasn't 2000 series worse?
You're gonna run out of VRAM by 2024. Only the 4090 will survive.
The gamers are not saying the cards are terrible; they are saying the PRICE by comparison is terrible. The number of people that buy GeForce cards at home for your 7 reasons versus those that buy them for gaming is not even close. At work, I do not care how much the GPU is; the boss will pay for it. At home... it's my money, my pocket. For now... the gamers are right.
Good point on focusing on the electricity.
Energy bills in an energy crisis are not pretty... I want everything in my PC to run on as low a wattage as possible.
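As a rough illustration of the electricity point, here's a quick running-cost sketch. All numbers below (wattages, hours, price per kWh) are assumptions for the example, not figures from the video:

```python
# Rough running-cost comparison between two GPUs doing the same work.
# The wattages and electricity price are illustrative assumptions.
def yearly_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Energy cost per year: kW * hours/day * 365 days * price/kWh."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

# e.g. a card averaging 350 W vs one averaging 220 W for the same
# render work, 6 h/day, at 0.40 EUR/kWh (energy-crisis pricing):
old = yearly_cost(350, 6, 0.40)  # ~306.6 EUR/year
new = yearly_cost(220, 6, 0.40)  # ~192.7 EUR/year
print(round(old - new, 2))       # roughly 114 EUR/year saved
```

The gap scales linearly with hours of use and electricity price, which is why performance per watt matters more for people rendering all day than for occasional gamers.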
It is a 5nm process node, not 4nm. I know it's confusing because of the TSMC '4N' name, but that stands for custom Nvidia.
Good video. I'm a photographer and I'd like to know what you think about the new AI features of Adobe LR and ACR denoise, which seem to make better use of the computational power of the tensor cores. Is there a noticeable speed jump for these specific functions, or is the advantage of the 40-series over the 30-series negligible? Thank you.
As far as I can tell, not a huge difference, and that's what I mentioned in the video. Still better than nothing, but I'd probably aim as low in the 40-series stack as possible if I were a photographer considering the 40 series :)
@@theTechNotice thank you
The 4060 Ti will fit well in your workflow. I'm looking forward to this card for myself as well.
@@MuhammadIhsan390 the 4060 could be a good choice as long as it has 12GB of VRAM.
@@domenicomancini6299 I don't need that much VRAM, so whatever...
Every one of your videos is a gem, bro. Keep it up!
3D enthusiast here, currently own a 3070ti - 4090 looks pretty sweet when I think about 50% boost in rendering performance!
It's not 50%, more like 300+%:
4090 in Blender 'Monster': 6528 samples
3070 Ti: 1799 samples
😇
@@theTechNotice That's just awesome, like well done Nvidia 😎 they sure outperformed themselves! And, thanks for catering to the pro/work audience with your content, much appreciated👍🏻
@@braddishv3146 Hey, you missed my original comment; I was the one who mentioned the 3070 Ti. I had assumed the 4090 would be 50% faster than it, but it turns out it's way faster than that, which is what he was responding to. Besides, he shared the benchmarks, so he wasn't trying to mislead anyone. I've been subscribed to this channel for quite a while; he's not the type that does that. You should check out his other videos too, they are very well made and quite informative.
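The sample counts quoted in this thread convert to a speedup like this (a quick sketch; the scores are the ones posted above, where higher samples per minute is better):

```python
# Blender Open Data-style scores are samples per minute: higher is better.
# Converting the two scores quoted in this thread into a speedup:
monster_4090 = 6528    # RTX 4090, "Monster" scene
monster_3070ti = 1799  # RTX 3070 Ti, same scene

speedup = monster_4090 / monster_3070ti
print(f"{speedup:.2f}x")        # ~3.63x the throughput
print(f"+{(speedup - 1):.0%}")  # ~+263%, well past a 50% guess
```

That's the difference between "it renders faster" and "the render finishes before you've made coffee".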
Not gonna be able to afford one until about November, I think, but I'll definitely be getting one for the AV1 access alone =]
I want to pair a 4090 with a 13400 for Blender. Is that okay? I want to cut some costs here and there (CPU and motherboard, mostly).
I don't see why not :)
13700 at least
Here comes the marketing sales pitch 😂
What is the issue in that?
when there is literally no other product to market, then what u gonna do lol. May as well cash in until AMD and Intel fix their shit and become competitive in this field
I know I'm on a small island here, and I only video edit on weekends and after work, but my Intel card does well for me. Keep in mind I do make extra money and find it harder to keep up with new business.
Wait a minute. You're supposed to only compare the current series of GPUs against the most immediate generation above the current one.
I've got no idea why this is the standard. But apparently, it's the only approach that tech reviewers utilize.
Can you take a temp gun to these reviews and record the temps too? I think that's very important.
9:19 confusing that you flipped the red/green colors. My first reaction is that the 4080 was _slower_ .
Sorry, MY bad for sure! :(
Wonder how 4000 series numbers will compare to 3000 series when used WITH Intel iGPU.
Upgraded from an i5-13500K/3060 Ti to an i9-13900K/4070 Ti, and while the CPU performs 20-30% better in video export times, the GPU side is literally the same. No difference.
I don't think these gpus are good value especially for gamers, but still I think it may be worth it for creatives. If it'll improve your workflow for your job, that cost isn't as bad to swallow, especially if you are a freelancer that can put a high end pc down as a business expense.
How about doing an Adobe Premiere Pro video like this, @TechNotice!
Is the 4060 Ti 16GB a good choice?
Is it any better than the 3080 Ti?
Their prices are about the same (around $450 and $500 used, respectively).
Sir, I got a 2927 standard overall score and a 2658 extended overall score in PugetBench for DaVinci Resolve 18.5. I'm shocked to see scores at this level, as I have an i5-12600K with 32GB of 3200MHz DDR4 RAM and a Gigabyte RTX 4080 Eagle.
I also tried 8K RED 30fps, 96fps, etc. and Canon R5 4K 120fps RAW. All running as they should, very smooth, not a single frame drop.
Interesting, something must be off with my rig. I scored 1997 on the same benchmark with an 11700K, a 4080, 64GB DDR4 3200MHz, all M.2.
@@OnceinaSixSide are yo using studio version? I have studio.
Tell me your setting under decode option so i can help you.
@@HDRPC That's generous of you! :)
Use GPU for blackmagic RAW decode - ON
Decode H.264/H.265 using hardware acceleration - ON
Nvidia - ON
Intel Quicksync - ON
Use easyDCP decoder - OFF
Automatically Refresh growing files in the media pool - OFF
Use GPU for R3D - Decompression and debayer
Also yes I am using paid Studio version
I just put my hands on my new 4090 this week... oh boy... that thing is a monster. I use it in blender, it is just mind blowing. And, in this business, time is literally money, so this is incredibly lucrative. 40 series is a no-brainer for business.
Do you get better results with professional RTX A2000 - A6000 over the cheaper GeForce range?
Not really, is the short answer; the longer answer is a bit more complicated :)
It's a shame the 4080 price is absurd here in Europe. 1200+ EUR is just not worth it unless you actually need the video encoding for professional purposes. But nice try Nvidia and Tech Notice to put lipstick on these pigs. Still no way I'm going to buy a GPU worth the price of an entire rig.
Sponsored by NVidia & Zotac lol ^^
That's how he feeds himself with his little YouTube channel; he's paid by NVIDIA to say nothing but good things about Nvidia. TECH NOTICE THE HYPOCRITE 👎👎
Perfect, getting sponsored for stuff that I've been saying for 6 months works for me ;) Are you a gamer, a creator, or a hybrid? 😇
Would love to see an updated in depth look at AMD gpus v Nvidia gpus.
What in particular? 😊
He did one not too long ago for the release of AMD’s flagship XTX (red devil)
@@theTechNoticeWell I'm mainly a photo editor but I also like casual gaming. Which would be the better option? As well as I'm thinking about getting into 3d rendering but I really don't have much to spend $1500 Cad Max. Appreciate your videos very much!
A 4070 vs 6800 would be a popular choice. Since they would be the ones a significant class of people could afford.
Thanks for this content ! really useful
My pleasure, thanks for the comment!
Awesome channel, thank you. I wonder if you've ever done, or thought of doing, Unreal benchmarks for creators? Unreal is getting more and more popular for creators, whether for environment building or, in my case, a character-presentation pipeline. I watched a bunch of your videos, and my conclusion is that if money isn't an obstacle, I wouldn't even look in AMD's direction for content creation, because Nvidia and the software are that much better optimized for one another. VRAM, though, is a big issue for us creators, and Nvidia is playing a funny game there. I wouldn't even care that much and would buy, like, a 4070 and be content, but the prices are astronomical for what you get in terms of VRAM. But I'm preaching to the choir here, hehe. Thanks again.
Are AMD GPUs good enough for video/photo editing?
Architecture is where it's at, though a GPU-enhanced spellchecker may be a close second. ;)
When you say 40 series. I feel like you keep comparing the 4080.
I'm planning to get the 4070 non ti...I can't pay for an over priced + +...
I'm only willing to get the 4070, and it will be the last Nvidia card I get. In the future I might switch to Intel Arc. Will the 4070 be just as good?
Guess what? I'm not buying. Drop a couple hundred dollars off every tier and I might be interested in that 4070 Ti; the other 3 cards, no thanks, and I'm especially not interested in the 2 cards above $1000.
Bruh are you the kid from the movie Sandlot? ..you know the catcher that says "You play ball like a girl!"
The 4070 is too expensive, my friend. They must drop the price to $500-550.
It would be great if you could do these same benchmarks with the 4060 Ti, since everyone is giving it such a negative review. Also, another metric people have been overlooking is Stable Diffusion iterations/sec for 512x512 generations. Tom's Hardware did a benchmark a while back of some of the earlier 40-series cards, but an update with the new cards, including the 4060 Ti, would give a nice complete picture.
Sponsored by nvidia and their partner so it’s meaningless
Still true compared to what AMD-sponsored YouTubers try to tell you.
@@zet7914 any sponsored video is worthless
@@ichisenzy I agree with you but still what he's saying in this video is 100% true
@@zet7914 ain’t gonna pay an extra 200$ for facts
@@ichisenzy extra 200$ ?
0:09 7900 XTX is better than a 4090 in DaVinci resolve according to PugetBench tho
I love how I am able to animate everything very smoothly, even in huge path-traced scenes. DLSS 3 in 3D apps is a real game changer.
Rendering speeds like these aren't just time-saving, they're life-saving.
My 4080 always runs completely silent and super cool, and it's also very, very efficient in all productive tasks.
We get this level of performance and efficiency in just one generation, which is insane. There are so many great things to talk about here.
AV1 encoding is really making a great change in video quality.
16:02 stop hitting the kids please:)
Lol
Doesn't Arc support AV1?
Yes.
Five reasons not to buy, and to keep your RTX 3000: it's very expensive, VRAM-limited both in size and bandwidth, raw power similar to the old generation, and DLSS 3 is nothing more than false frames that increase response time... Did I say it's very expensive? Because it is. If you want to see a leap from the old generation, give NVGreedia $2000 for a 4090.
Love the title of the video, 'What gamers didn't tell you!', as if gamers hide some info about NGreedy and its marketing strategy to milk us like cows: trying to sell us ray tracing with 8GB of VRAM for the last 3 years, and now telling us to buy a new GPU because of planned obsolescence. At least AMD graphics cards have admittedly limited ray-tracing capabilities, but they still work in RE4 Remake, Hogwarts Legacy, and other new games at 1440p.
@@braddishv3146 Clown is a little tough. Clumsy for sure, but he proves to us that NGreedy and the AIBs are desperate to sell their 4000 series by any means, even if it means sacrificing their RTX Quadro range and spitting in gamers' faces. It's like trying to sell me gaming gear as something else because no gamers are buying, and telling people 'gamers didn't mention that!'. A little bit pathetic, if you ask me.
The RTX 3000 series hasn't gone down in price, which says a lot about the RTX 4000.
16:00 …… is that your kid crying? 🤔🫣
1st the gpu selling price blows my mind nuff said!
AV1 and 4GB more VRAM are available on AMD's 16GB 7800 XT. For the mid and low range, AMD rocks. The 4070 should have a minimum of 16GB of VRAM; 12GB is not worth it.
+1 for the 4090. It's a fantastic value compared to the $6800 RTX 6000 Ada
I've always recommended, and always will recommend, your channel to my friends and my viewers (when I start uploading) who are on the content-creator side. As for myself, I'm a gamer and a content creator. I would recommend that you drop the 'gamers hid this from you' angle. Us gamers never hid anything from content creators. We know that purchasing a GPU for content creation or work is a good purchase regardless of the class of the GPU; it's a return on investment. Gamers are refusing to purchase these products because of the price compared to gaming on a console or on the previous gen. There are more gamers than content creators. It's not the gamers' problem that these GPUs are overpriced for gaming; they are not selling well, and there are not as many content creators as gamers to purchase them. With this video being sponsored by Nvidia and a title saying 'gamers hid this from you', it once again looks like another dig from Nvidia at gamers, yet it's harming your channel, judging by your like-to-dislike ratio. This company is quickly losing the GeForce brand's main customers every day due to the disrespect and poor customer service towards gamers (yes, we are customers like every other human being), yet it seems like they don't care. Is this the type of customer service you support?
What about now? 😇
Less than two minutes into the video and he is already comparing the price of the 4080 to the 'more expensive' 3090. Like anyone would actually purchase a 3090 at full retail at this point over the 40 series.
Please tell me how much does the Founder Edition of 4090 costs
Am I the only one hearing a young girl screaming in the background at 16:02? I'm guessing it's his daughter 😅
aaaand I'll be stuck with a 3070 until I can afford a 4070 or better for AV1 encoding to upload videos faster.
Thanks, Nvidia, for your absurd pricing on something that's only somewhat better than the 30 series.
What are the disadvantages of AMD compared to Nvidia?
Driver stability issues, frequent BSODs, software and game crashes, stutters, overheating, no-display problems, less software compatibility, less stable OC potential, much worse productivity performance, etc.
@@prabhakardhar1379 According to what I read and hear YouTubers say, a lot of the issues you listed aren't as bad anymore for gaming
@@alanchen7757 they are, believe it or not
@@alanchen7757 they still are. That's fine if you only game and get them for a good price, but for content creation it's a no
@@MAKIUSO well, I only do video editing, nothing else, so if it can do well in DaVinci I'm fine with it. Plus it has more VRAM, and most importantly, the cheapest 4080 is 340 dollars more expensive than the MSI 7900 XTX in my region
Note: The RTX 4070 has only ONE NVENC encoder! You will NOT get 2.5x the performance.
correct!
So the 4070 Ti is the lowest one to have two encoders?
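[Editor's note: for anyone weighing the AV1 upload workflow discussed in this thread, a minimal sketch of a hardware AV1 encode with ffmpeg might look like the following. It assumes an ffmpeg build compiled with `av1_nvenc` support and an RTX 40-series GPU; the file names and quality settings are illustrative, not a recommendation.]

```shell
# Re-encode a master export to AV1 on the GPU's NVENC block.
# -cq 30 is a constant-quality target; lower = higher quality/larger file.
# Audio is passed through untouched.
ffmpeg -i master_export.mov \
  -c:v av1_nvenc -preset p5 -cq 30 \
  -c:a copy \
  upload_av1.mp4
```

On cards with a single NVENC unit (like the 4070 mentioned above), this still works; the dual-encoder cards mainly help when running multiple encodes in parallel.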
I never use DLSS; I never seem to get the same results as reviewers. Ray tracing is alright, but I also never seem to use it. VRAM already being outdated is another reason I'll probably end up going with AMD.
I do like Nvidia's features for video creation. That's a deal-maker for me.
I needed this video, thank you! I want to build 2 PCs, one for 4K video editing and a second for making games, and nobody but you talked about this side of GPU use. You saved me a lot of money and time! Really, thank you!
Nice review. Thanks! Lower power usage equates to a lower-wattage PSU with fewer connections needed. Saving on the electric bill too. 🌟🌟🌟🌟🌟
Wrong colors on the numbers from OctaneBench
Your ADs never worked on me.
Great review. These 4000 series cards are amazing for creators. AMD has nothing to counter.
I don't know if you actually watched the video, but there are no AMD cards in it. It's 40 series vs 30 series.
@@Jwalker76 I did watch it; maybe you need to learn how to understand a comment. Like I said, AMD has nothing to counter Nvidia, as AMD cards are total trash for creators.
@@Jwalker76 I don't know if you actually read what JayzBeers had to say
I just find it odd that you would watch a video that runs tests so the YouTuber has evidence for his claims, then call it a great review when you clearly don't care about or understand the difference between having evidence for your claims and not. Guess fanboys have never needed evidence, just blind loyalty.
Gamers already know; now it's time to sell the 12GB 4070 to somebody who doesn't 🙂 Jokes aside, the 24GB 4090 is excellent. Super expensive, but if it can earn its own money back, then why not. The 4080 is like a Porsche Boxster: it's not a 911, but still a Porsche. 😀 But please don't fool yourself with DLSS 3 frame generation; it doubles frames, but not actual performance. It works when it can predict the movement of the scene; if not, then....
This video makes a lot of good points for content creators. Thank you for your effort and thank you NVIDIA for reminding me how broke I am.
Loving the unofficial dislike ratio :)
I do understand the sponsored part, but these GPUs are crazy expensive; outside the States it's a bloodbath.
Please tell Nvidia to get their act together, price the cards reasonably, and put in MORE VRAM. Otherwise what you preach will be e-waste in a couple of years.
The 4070 Ti was mostly slower than everything above the 3080 because of VRAM; it should've had 16GB. Of course the 4080 is gonna be better than the 3090, but the 4070 Ti is such a badly future-proofed card, and it's no wonder you didn't even show the 4070 😂
*power consumption..*
I remember plenty of people saying AMD would kick NVidia's ass because AMD was going to use CHIPLETS. I was very skeptical because that's just not how it works. When you add CHIPLETS you actually need MORE power, because now you've got I/O to handle between the chiplets. Desktop CPUs actually have more CACHE than laptop ones because the monolithic design is more efficient... the ONLY way that chiplets are BETTER in terms of POWER consumption is if you design them for a lower FREQUENCY. Obviously, you can do that for MONOLITHIC as well, but if the cost of MANUFACTURING ends up cheaper for chiplets, then when you crunch the numbers, adding MORE chiplets at a LOWER frequency can make a more power-efficient product. But these are early days for chiplet gaming GPUs, so I figured those types of optimizations weren't yet on the table. So the power efficiency is slightly worse than NVidia's from what I can tell (i.e., the RTX 4080 has about the same power draw as the RX 7900 XT but averages about 10% faster. Although the RX 7900 XT has 4GB more VRAM, which needs power, so in fairness the gap might be closer to 5%. Only going by TECHPOWERUP data, and I'm sure there's some variance in testing methodology, driver improvements, etc.)
I wonder how much NVIDIA paid him to say this
Did you see the sponsorship disclosure? 😇
ARCHITETURE? Spell check please.
😮😱🤦
I kind of want to start boycotting 3840x2160 video as an output format, as a protest statement against the inexorable march of technology and the harm it has done to Earth and mankind.
While most content on the 40 series is extremely critical due to price, THANK YOU for delving into the real benefits of the Ada Lovelace architectural changes that make the 40 series a huge win for the many workloads that benefit from accelerated computing!
Excellent video