The Best GPU for rendering with Blender. Using single and dual GPUs Part 1. RTX3070, 3080 and 3090.
- Published 11 Sep 2024
- #Nvidia #RTX #Blender
What is the best GPU for rendering with Blender? In this video I test single and dual RTX 3070, 3080 and 3090 GPUs for rendering scenes in Blender.
This is part 1 of a series of videos where I test GPU rendering in 3D software like Blender, Maya and 3ds Max with Redshift, V-Ray and Cycles.
Blender demo files: www.blender.or...
Blender with CyclesX: builder.blende...
SOBA Noodle Scene: www.behance.ne...
Discord: / discord
Facebook: / mediamanstudioreview
Best channel on benchmarks I've ever seen!
thanks for the kind words. Please share on your FB page.
Sweet mother of mercy, DUAL 3090's.
This machine is practically Thanos's gauntlet.
check out this video with 4 RTX3090 GPUs
czcams.com/video/H0lRWAzdGPQ/video.html
Thanks for watching
Honestly, it's so hard to find this kind of content - thank you very much!
thanks for watching Bulwak
Thanks Mike for doing this. Looking to get 2 x RTX 3090 soon so this is very useful. Looking forward to part 2! 😊
Glad it was helpful! Please share on your FB page.
It would be interesting to see this revisited, especially since the 40xx cards are out now and Blender is on 3.5. In some cases pricing makes buying a second 3080 look far more appealing, even more so if you have a 12GB version. While only the 3090 has NVLink, I have found that if you have enough system RAM and really work on optimizations you can get by pretty well with 12GB. Not only are the Blender benchmarks not representative of a real workload, they are also optimized to show the lowest possible time. In the wild you're concerned about far more than just one single highly optimized frame.
Awesome video! Very helpful! Looking forward to the rest of the videos in the series. Can't wait to try out the same benchmarks on my own systems.
Glad it was helpful! And thanks for watching, Prashant. Please share on your FB
This is really good focused benchmarking - I’ve been looking for a solid media production focused hardware channel. Thank you for producing these.
Thanks for watching William
This is gold for us blender users :)
Thanks for watching. Please share on your FB page.
Hello brother... I follow you on Insta 🤣🤣🙈🙈🙈 @TechMango.in
@@rk-pl5mk Hi brother, oh yes 😁
At last, someone is doing the exact reviews I have wanted for years!
Excellent video thx for posting this..you got a new sub
thanks Chris.
Please keep watching the channel and share any video topic ideas you may have.
When I discovered this channel my first thought was that Mr. Mediaman could be the brother that Roger Moore never had.. lol.. Jokes aside, this was an extremely helpful video, since I was asking myself whether two RTX 3070's with a rough price tag of $998 could beat one RTX 3090 or not, and voila, you just gave me the answer.. Many many thanks! And yeah, I'm aware of it, but I don't need the VRAM and my scenes are never that complex.. and even if they were, I wouldn't mind.. But this way I could save 500 precious dollars.. Awesome! Even two RTX 3080's can still save you some 100 to 200 dollars, which seems to be the safer option to avoid a failing CUDA test. But if you stick to OptiX then there are no issues with the RTX 3070. Now I wonder.. what other dual-card options could be lurking out there that could deliver similar results with an even cheaper price tag? Because that's the whole point: getting similar results at lower prices..
Glad I could help, Ata Turkoglu. Thanks for watching
Currently I am using a 1650 Ti mobile GPU in an Asus TUF A15 with a Ryzen 7 4800H and 16GB RAM. I started using Blender around 3 months ago, and it's been a great experience, but now I feel I need a true workstation, so I am doing some freelance work to get myself an RTX PC.
Hope I can achieve it quickly. I want to go into virtual production, so I started learning Unreal Engine too ✌️
thanks for sharing your experiences with the channel, Ravi Ranjan. Good to hear artists progressing in this industry. Keep up the hard work and success.
Your videos are exactly what I've been looking for - thank you!
thanks Dan
The camera angle switch is very annoying
Perfect meme material :))))
That's called the "face the consequences angle". You may know it as the "angry daddy angle" because of your stupid comment.
@@Hz4 His comment ain't stupid, what? It is indeed annoying
It's not the angle switch, it's the frequency of the switch.
9:22 Nice test.
I know this is funny and tragic at the same time: I have a Ryzen 5 5600X, 16 gigs of RAM, a B450M M4 motherboard, a 550W power supply, a 500GB SSD, a 1TB HDD... and I use a GT 710 for rendering my work 😶😔. And yes, I am trying to buy a new one; my eyes are on the 3060 Ti 12GB, which should be good for gaming, rendering, editing and streaming at 1080p... hopefully 🤞. So I'm saving up by doing some Fiverr work and making 3D models for Roblox games and so on, so I can buy the GPU one day.
Thank you for making me understand more about the GPU working principle 👍
now upgraded the setup with 3060 ti
Just found this channel whilst looking for dual gpu support in Blender. Really great content and delivery - well researched and just the right balance of tech detail providing all the key info in a well produced format. Keep it up. Subscribed.
Thanks Aidan. Please share and help grow the channel
Thanks for the most recent benchmark in blender. Keep up the good work.
Thanks, will do!
Very useful content. I'm using dual 5600 XTs at the moment. I was looking for new GPUs and found your video about dual 3070s.
Useful video! If possible, please compare the RTX A4000 and A5000 with the 3080 Ti, 3070 Ti and 3090.
I will have an A6000 next week to compare that GPU with the RTX 3090
@@MediamanStudioServices thx
Thank you for this man! I'm currently using a gtx 1080 and I'm now upgrading to a 3090. Had to buy a prebuilt to get one and was still skeptical about it as I wanted to eventually add another 3090 for the vram. There wasn't much information available about shared vram anywhere else. Thank you once again!
Glad I could help! Please share on your FB page.
These are the best comparison videos, well done and keep it up!
thanks raymond.
Glad I stumbled on this. The downside is GPUs are so hard/expensive to get right now. Have to wait a while to try this out.
Glad I could help. Thanks for watching
Nice video. Thanks for your time.
So happy I found your channel! Totally crazy that you don't have 1M+ followers and 100k+ views!! Please do a massive SEO push or run Facebook ads! You deserve them!!! Great test and spirit.
thanks for the super kind words. I am working on getting more views. Please share on your FB page.
@@MediamanStudioServices My pleasure ! Just shared about your great channel on my very first Blender video : czcams.com/video/dN4O3AI4VB4/video.html
Awesome video, had exactly the information i was looking for and was informative but also concise. Thanks!
thanks for the kind comments. Please keep watching, Blake Parbery
such an underrated channel
thank you
These benchmarks are awesome. Makes me think that getting a 3070 is more worth it if I'm only doing character renders.
thanks for watching
Is it worth pairing a 4K monitor with a 3080? For work and rendering
These videos should have more views. I recently bought an entry-level rig for Blender and CAD: a Dell Precision 3450 with an i7 CPU and a Quadro P620. Just enough to get started smoothly.
hi John, thanks for watching the channel.
12900k + 3090FE + 128GB RAM for blender, AR, PR & PS . Your videos steered me into getting the 3090
Glad I could help, Dewayne Dailey. Thanks for watching
This is one of the best videos , really appreciate it
Subscribing was the least I could do, Top work sir!
I only get to watch my dream GPU and my dream software together.
dream big sir. thanks for the sub
This was very professional, really good job 👏 👍
thanks for watching. Please share on FB to help grow the channel
Very interesting benchmark. Thanks for testing it using Blender demo files, it is something that Blenderians can relate to. 😄
One question though: were Cycles OptiX denoising and Fast GI approximation turned on when you did this test? If not, it could render much, much faster.
hi Double U Studios, no I did not turn on denoising, as I wanted the tests to be raw benchmarks.
Thanks for watching and please subscribe and share on your FB page to help grow the channel
you might like to try out the Turbo Tools addon for Blender. It can render the classroom scene in 12 seconds on a gtx1070, and will even clean the individual passes so the classroom scene's complex compositor will work (the only denoiser to be able to do that).
free addons? Where can we purchase this?
Is it worth pairing a 4K monitor with a 3080? For work and rendering
@@dimavirus9979 I have a 4k and a 1080p monitor hooked up to my gtx 1070 for rendering.
This is really helpful for a beginner like me, Probably the best video in terms of information, much thanks 🙏🏻
hi bhargav chavda, Thanks for watching, Check out the other videos on my site.
Hi Mike, Thanks for all the testing you do. Could you do some comparisons between Nvidia and AMD graphics cards specifically for Blender only renderings. Cheers
Thanks for the idea! I would love to, but I do not have access to AMD GPUs at this time. As soon as I can get my hands on some I will make a video.
I am currently using AMD 6950XT in my rig which I use for Blender, DaVinci Resolve, and Adobe After Effects.
Fully love your channel!
thank you. please share on your FB page to help grow the channel.
Once again the tile size seems very low in these tests. 16 x 16 is OK for CPU, but GPU should be at 256 x 256; it should be a lot faster.
Love these videos, real useful information. I love the A series man, I really don't want to be memory limited, I'd give up the clocks for that security
glad I could help
As requested: currently using a single 3060ti.
Also - very nice vid, Mike. Thanks so much.
I am a Blender user, thanks for this video. I have 2x 1660 Super and am planning to buy new ones. The 3090 is the best with 24 gigs, but it is impossible to buy in my country because of miners. I'm afraid to buy a 3080 with 10 gigs and no NVLink, but I guess it's the only solution.
Hi NV works. The RTX 3080s would give you the same workflow as you have now but with more power; you do not need NVLink unless you need more than 10GB. Also, you may want to wait for the Super versions of the RTX 3xxx GPUs, as these may have more VRAM.
thanks for watching
@@MediamanStudioServices any idea when these cards might appear?
@@ruslan_naguchev it is rumored to be next month
got a desktop with an i9 12900K and an RTX 3090. Upgrading from an RTX 2060 mobile, this will be a colossal upgrade; my workflow will be so much smoother now
Thank you! This is a great content you're making you have a new subscriber!
thanks ClassicyUP
This is what I was after, thanks man
Thanks for the video, it was really nice! But I have another question: I've been using Blender for quite a while now for small projects, and now I really want to do some bigger ones. For that I'm planning to buy a new computer, but I really can't decide which GPU to pick. I'm torn between the RTX 3080 10 GB, the RTX 3080 12 GB and the RTX 3080 Ti. Sure, I could just buy a 3080 Ti, but how much of a difference would it make compared to a 3080 10 GB or 12 GB? And would it be worth the extra money? Hope you can give me a little advice. ;)
hi Felix, I would get the 3080 12GB and save a few dollars over the Ti version
Thank you for you videos they are very much appreciated. Great GPU content
thanks for watching Sraztec01
Thank you! This explains so much that other videos do not. Everyone benches one GPU… and fatally shares the exact test scene with visuals. It really let me know that I do need higher-VRAM cards. The photorealistic CAD models I have to render, with 4 million polys in Redshift, just bog down on 3080 Tis. You can feel it in the ratio of smaller scenes to larger scenes; the non-linear speed decrease is very noticeable… time for a Threadripper Pro and some free-flowing PCIe 4.0 x16 lanes.
Really appreciated your video. I am using 2x 3080 with a Threadripper 1920X; I am planning to get a 3090 because of the VRAM.
good choice, I really like my 3090. it is a beast
If you use a 3080 and a 3090 together, you can only use the 3080's amount of VRAM when rendering.
@@rsunghun hi Rai, I would only render on the RTX3090 and sell the 3080 and get yourself some extra cash
I wish to have a good animation production system
One day God will give me a good system
😍
Until prices come down to where they used to be, many of us are stuck with current GPUs. I built a new PC from the ground up with an AMD 5900X in August. It would have cost twice as much for just a 3070 than it did for my motherboard (Which has 2 pcie v4 16 channel slots), the CPU, 32 gigs of ram, new NVme drive, new case and fan, and new 2nd monitor. So I'm still running my 1660 super OC I was using in the previous computer. Here's hoping that when the Intel GPUs come out, it will drive prices down.
Commenting for the algorithm.
GPU prices are going down, and I want to upgrade my PC from a GTX 1080 (non-Ti).
In my country the RTX 3060 is now at around 430€; basically it's less than half the price of an RTX 3080 and even comes with more VRAM.
So, what about render times with 2x 3060 vs a single 3080?
(I have a Threadripper 3960X & 64GB RAM)
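There are no dual-3060 numbers in the video, but a back-of-the-envelope estimate is easy if you assume render rates simply add (the video shows dual-GPU scaling is close to, but not quite, linear). The single-GPU times below are placeholder assumptions, not benchmarks:

```python
def dual_gpu_time(t_a: float, t_b: float) -> float:
    """Estimate combined render time for two GPUs that individually
    take t_a and t_b seconds, assuming their render rates simply add."""
    rate = 1.0 / t_a + 1.0 / t_b  # combined work per second
    return 1.0 / rate

# Hypothetical single-GPU classroom-scene times (assumed, not measured):
t_3060 = 30.0  # a commenter below reports ~30 s on an RTX 3060
t_3080 = 15.0  # placeholder guess for a single 3080

print(f"2x 3060: ~{dual_gpu_time(t_3060, t_3060):.1f} s")
print(f"1x 3080: ~{t_3080:.1f} s")
```

By this crude model two 3060s land in the same ballpark as one 3080, before accounting for per-frame scheduling overhead; and note the 12GB per card does not pool without NVLink.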
I also want to know this
great video! subscribed! Im running a RTX 3060 and in the classroom scene I got about 30 seconds.
Hey bro, can you suggest which one I should buy for 3D applications: 3060, 3060 Ti or 3070?
@@Ramattra-ow well, I'd say don't go for the 3070 because it's only got 8GB of VRAM, which is important for creating 3D scenes. I'd say get the 3060 because it's got 12GB of VRAM
@@FFContent okay, so a high CUDA core count doesn't matter in this? 🤔 And if I choose a 6700 XT or 6800 XT, will it be good for 3D rendering, or should I only go for Nvidia?
@@Ramattra-ow they do help with render times, but if you're planning on creating large scenes then more VRAM is needed. If you're not, just get the better card, and I'd say Nvidia is a better choice since most 3D programs are better optimised for Nvidia
Love the content, great insight and presentation. You deserve more audience for sure :)
I already run a vintage Intel i7 950 on an Asus P6T motherboard, which luckily has a PCIe x16 slot, and I wish to replace my 1050 Ti with some RTX model, or even think about getting a laptop with an RTX, but those are pricey and noisy.
Does it make any sense to put an RTX 3060 Ti in my oldie, or should I go to the banksters for a loan for a full desktop upgrade?
well, you could start with the GPU upgrade; you can always transfer it to a new system at a later date. The RTX 3060 Ti will still run great in the older system, even at Gen 3 PCIe
@@MediamanStudioServices I believe his setup would be PCIe Gen 2. My Sandy Bridge (P67 chipset) came later and also had Gen 2, from 2011.
Your channel is godsent, thanks a lot for your hard work, It helped me so much :)
Cheers!
hi უხამსი ხამსი, thanks.
Thank you from the depths of my heart for this video. First I wanted to buy one RTX 3080; after seeing this I decided to buy one RTX 3070 (because I'm low on budget for now) and I will add another 3070 after some months. It seems the best option when I look at the market (where they can deliver in my country), where 2 RTX 3070s are just 1.2 times more expensive than 1 RTX 3080. Thank you for saving people's money and lives! 🥰🥰😍😍🤩🤩🤗🤗
Glad I could help
3070 has less VRAM, so that could hurt you on complicated renders. 3080 has either 10gb or 12gb. That can make a real difference.
Good video test, Thanks for the post.
Just started to use blender, GPU 3070FE
@@ImportRace thanks for watching, Please subscribe and share on your FB to help grow the channel
@@MediamanStudioServices You're welcome and will do
Dude you are a saint. Thanks for the content
I appreciate that!
Please keep watching and share on your FB page to help grow the channel.
@@MediamanStudioServices will do
Glad I found this channel
Thanks Clownass, Please help grow the channel by sharing on your FB page.
I also have a new video with the RTX A6000 vs. the RTX3090. Please watch that video
can you stay on one camera for long enough for my eyes to adjust at least?
Great video! I have a 2080 Ti and am thinking of getting an RTX 3060 as a render GPU - would a combination of the 2000 & 3000 series be advisable? Focusing on 3D and rendering in Maya, Blender, Arnold and Octane.
Thank you! It's really helpful.
thanks for watching. Please share on your FB page.
Nice work!
I'm actually using the GTX 770 from MSI :)
very well done review and a solid presentation. As a 3D freelancer I have been keeping an eye out for the RTX- series since launch. The funny thing is that due to unavailability I switched back to a... 1080ti and it still works fantastic ^^. The only thing I am really missing out on is RTX but due to Unreal 5's real time lighting solution "Lumen" the desire for RTX cards for production has been lowered.
ps. it might be an idea to research Edit Poly modeling performance in Blender and if/how the GPU impacts the performance. In Blender modeling performance is mainly CPU based so it would be interesting to see some combination with CPU/GPU.
Hi Remon, you are still missing out on the raw processing power of the new gen GPUs as well. Yes, it's crazy getting a GPU in today's market, but the production time savings could be worth it
check out my video here.
czcams.com/video/ilcHvNOrPWg/video.html
The difference in improvement in things like Substance Painter is shockingly different - I went from 1070 to 30 series, and it was worth it. I would recommend saving up and watching out for a good opportunity.
@@aintnomeaning thank you for the comment. It is definitely a purchase on my mind for a while now. The 1080 TI just performs so well the requirement is not there just yet. This in combination with the double market prices makes me want to hold off a bit but if I can snatch a 3080 for a good price I might just bite :P
@@remon563 For sure!
aight... u got my respect, subscribed!!
thanks for watching
Great video!
Could you post all the info/spec for the parts you used in the benchmarking computer?
I am in the process of building a new desktop and in particular I am wondering about what motherboard you used (when you mentioned workstation vs regular desktop parts).
Thanks!
Very well done video!
thanks Kerry. Glad you liked it. Check out the new video on A6000 vs. the 3090
That was very helpful, thank you.
thanks Cloeren Jackson
I'm really enjoying your content, Mediaman. I was wondering about performance losses between older chipsets? Comparisons between Gen 2, 3 and 4? Thanks for a great informative video!
Again I found the exact video and information I wanted
thanks for watching ItzRen
I'm getting good 4K renders of a walkthrough animation of the Soba benchmark scene. Tile size had to increase to 4096. Actual render time is 16 seconds per frame in Cycles on Blender 3.0 with an overclocked RTX 3090.
that's some nice performance. I have not dived into Blender 3.0 too much yet. Thanks for sharing your experiences.
I love your ideas
Thank you for this!
Hi Luis, thanks for watching
Great video. Do you see any benefit of choosing a Quadro card over those cards?
Love this, but man, the frequency of camera cuts in the first minute gave me headaches.
Splendid video thank you sir!
thanks for watching. Please share on your FB
I'm using a few 3090s. I needed a PCIE extension cable to squeeze the 4th one into my Threadripper system because there wasn't enough space to fit 4 cards in there due to the slot spacing.
What kind of PSU are you using to power all the GPUs?
@@mikebrown9826 I would almost guarantee that barring this person having a really jank setup, they are using a dual PSU case with 3 of the GPUs on an independent supply or something similar.
Great content Thank you! subbed
thanks orkun sanal
very helpful info. thank you
Love your channel! I’m grabbing a new pc this month and am debating between a 5950x and a 3990x both setups with 2x rtx 3090s. I work primarily in C4D and unreal. I would love to see some render tests with these setups in octane! Keep up the great videos!
hi TJ, good luck getting the new system, sounds like a nice setup.
A 3990X is best for general 3D work and game development since it can compile shaders much faster due to its core count, in addition to having more PCIe lanes. However, if you work primarily on animations, the 5950X is your best choice due to its much faster single-thread performance, which allows faster viewport playback so you can quickly iterate on your animations, while still having a good core count. Keep in mind that you only get 24 lanes with it, so your GPUs will run at x8 each, which degrades GPU performance by about 1%, but that is still within margin of error.
Best of luck with what you get.
Also: do not get the FTW3 3090s, as they have faulty power delivery. I had one die on me while being pushed to 100% for over 15 minutes. The best choice is the Asus ROG Strix or the TUF, then the MSI Gaming X Trio after that. These are what I have experience with. Generally, EVGA cards break if I push them too hard; the same happened with an EVGA 2080 Ti a year ago.
@@nawafalrawachy394 I would have to agree, the 5950X is a faster CPU. As for GPUs in a desktop system, check out my latest video on this topic. czcams.com/video/57gJvskWvPA/video.html
@@nawafalrawachy394 Thanks Nawaf! I’m torn after hearing news about Alder Lake now. I wonder if it’s worth waiting for that.
Thank you for all this videos 👏👍🙏
thanks for watching and don't forget to share on your FB page.
There is one more important aspect to compare: VRAM. With 10-12GB you will likely be running out of memory very quickly these days. I've been experiencing it over the last 2 months with my 3x 3080 on various types of scenes: a cruise ship on an infinite ocean with some simulation, a car chase scene in a big city, or just a simple morphing particle effect. I had to switch to a 3090 lately and it helped a lot, even though it is quite a bit slower than 3 RTX 3080s, of course.
Is it worth pairing a 4K monitor with a 3080? For work and rendering
Yes, I'm using over 7GB of VRAM just modeling and texturing a scene, and I'm nowhere near done.
Can't you use NVLink? I think it merges the VRAM and is compatible with 3D applications
@@arashrahimnejad3955 only 3090 supports NVLink in 3000 series.
Why not use a bigger tile size for rendering? Such small tile sizes are better suited to CPU rendering (like 32 x 32 or 64 x 64). Render speed for GPU is much faster with a size of 256 x 256, and with recent versions I even get the best times with 2k x 2k.
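For anyone who wants to try this, the tile settings can be changed from Blender's Python console as well as the UI. A sketch, assuming Blender 2.9x property names (Blender 3.x switched Cycles to automatic tiling, leaving only a single size hint):

```python
import bpy  # available only inside Blender

scene = bpy.context.scene

# Blender 2.9x and earlier: explicit tile dimensions.
# Small tiles (16x16) suit CPU rendering; 256x256 keeps a GPU busy.
scene.render.tile_x = 256
scene.render.tile_y = 256

# Blender 3.x (Cycles X): tiles are automatic; only a size hint remains.
# scene.cycles.tile_size = 2048
```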
Great video! I'm looking to build a new PC by the end of the year, and I'm wondering if it's worth the hassle of putting my 980 in alongside a new GPU to get faster animation renders. I'm pretty sure I won't be able to afford 2 new GPUs even if prices come down.
liked button has been hit :P
Thanks for providing AAA quality content to viewers.
You're welcome
@@mikebrown9826 hello sir how are you?
@@abhirajawat4531 doing fine. thanks for asking
If you look closer at those charts you will see that OptiX uses VRAM, and the CUDA version of Cycles X has not been fully fleshed out yet. Also, the system RAM may be rather slow, so you need to speed it up and RAID it. Lastly, you can get away with VRAM demands over the 3080's limits, so it must be using system memory. These charts are telling when things don't scale as expected...
Hi Claus, I am not sure what you mean by RAIDing the system RAM, but thanks for watching
@@MediamanStudioServices Sorry, you are right, I was thinking of using RAID cards for M.2 storage, not RAM. I was wondering if that speeds up the start of rendering on some of the demo files.
@@clausbohm9807 faster drives will not increase GPU rendering performance, just the load times to get the data into RAM. From there it is transferred to the GPU, so a few seconds will be shaved off the load times.
@@MediamanStudioServices Great, that was what I suspected, thanks for confirming it. It's annoying to wait for the file to start rendering...
I have a scene I made that you can use for benchmarking. It's an interior scene with volumetrics. It only uses almost 1.5 GB of VRAM, yet it takes way longer to render than any of the Blender benchmark files. On my system, with an RTX 3060 & RTX 3070 working together, it takes 7 min 7 sec for a really clean render using Blender 3.0 (Cycles X). I will provide a link so you can download it if you want.
My system for these scenes
Ryzen 9 3900x
RTX 3060
RTX 3070
64GB 3200 MHz RAM
NVMe Gen 4
Blender 3.0 Cycles X
BMW cuda 13.7 sec - Optix 7.6 sec
Classroom cuda 24.5 sec - Optix 15 sec
Blender 2.93
BMW cuda 19.5 sec - Optix 10.7 sec
Classroom cuda 53.7 sec - Optix 34.6 sec
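The Cycles X uplift implied by those numbers is easy to put in ratio form:

```python
# Reported times in seconds from the comment above:
# (Blender 2.93, Blender 3.0 Cycles X) per scene/backend.
times = {
    "BMW CUDA":        (19.5, 13.7),
    "BMW OptiX":       (10.7, 7.6),
    "Classroom CUDA":  (53.7, 24.5),
    "Classroom OptiX": (34.6, 15.0),
}

for scene, (old, new) in times.items():
    print(f"{scene}: {old / new:.2f}x faster with Cycles X")
```

Roughly a 1.4x gain on the BMW scene and over 2x on the classroom scene.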
Hi Blender Rookie. I would love to check out your render test scene. You can PM me on the Mediaman Facebook page.
thanks for watching
@@MediamanStudioServices I don't have a FB account. I would just add the link here but YT would put me in the spam box. So, the link is in my "About" tab on my channel in the description.
Hey bro, could I have a system with an RTX 3070 (which I have now) and add a 3090 later, and would both of these GPUs still work together? I am new to using 2 GPUs, so I have no idea. Is it as simple as putting 2 graphics cards on my motherboard and they will work together at full potential? At first I wanted to have 2x 3070, but after reading your comment I am thinking of waiting some months, and instead of buying a second 3070 now, I can buy a 3090 later. I hope you see this msg! 😁😁😁
@@versatale In programs like Blender, you can mix and match GPUs and they work together just fine because programs like Blender do not need any sort of nvlink. However, when it comes to gaming, whichever GPU is running the displays is the only GPU being used, unless you are using nvlink or something similar. But nvlink only works with two of the same cards. When it comes to the 3000 series cards, only the 3090s are nvlink compatible.
But to answer your question more directly, if you have a 3070 and you add a 3090, they will work together just fine in programs like Blender.
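For reference, device selection for a mixed pair like a 3070 + 3090 is just a matter of ticking both cards in Blender's preferences; a sketch of doing the same via the Python API (Blender 3.x property names, runs only inside Blender):

```python
import bpy  # available only inside Blender

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"  # or "CUDA"
prefs.get_devices()  # refresh the detected device list

# Enable every OptiX-capable GPU; a 3070 and a 3090 then render side by side.
for dev in prefs.devices:
    dev.use = (dev.type == "OPTIX")

bpy.context.scene.cycles.device = "GPU"
```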
@@BlenderRookie Thank you very much for taking the time to explain this. I don't game, maybe once a month, and I am a full-time 3D artist, so that's super great news. I'm probably gonna wait some time and maybe buy an RTX 3090, since I can do much of my stuff with my 3070 for now. Much love and blessings, this was super helpful 🥰🥰🥰🥰
Really helpful content, thanks. Something I’ve been struggling to get a definitive answer to in online searches, hoping you might know the answer - would a dual GPU setup increase viewport render performance in Blender, or is the extra GPU only useful for final render?
Hi By The Numbers. The viewport only uses one GPU.
Thanks for watching
Great video! Thank you so much.
Looks like RTX 3080 is enough for me, and adding a RTX 4000 could be a huge performance improvement in the future.
any future upgrade should be focused on VRAM. you want to have as much as possible alongside that extra performance.
Is it worth pairing a 4K monitor with a 3080? For work and rendering
@@dimavirus9979 yes, just don't buy a small one. 32 inch or higher, otherwise look for QHD screens. also try to find something that has 120Hz or better, 300-400 nits minimum (preferably 500 or higher) and 100% sRGB coverage (preferably 96-100% DCI-P3).
@@mariuspuiu9555 what are the recommendations for QHD? How many inches, to keep your eyes comfortable?
I miss this channel 🙁
Sorry Ratzel. I have relocated to Vancouver Canada and currently trying to get equipment to do some new videos. Once I find a source for GPUs I will be doing some more content. Thanks for watching
@@mikebrown9826 thanks for the answer, I'll keep waiting then. Hope to see you soon, great content sir. 🤘😻
what 3090s are those blue ones, and where can I get blower-style 3090s besides eBay?
great video. I tested out the 4080 and 3090 but went with the 3090 for more VRAM.
just smashed like button to complete 600
thanks Kamel
Thanks for your input on these issues; they have really helped me a lot. I have a question for you: how does it behave in Unreal Engine, for 3D architecture production...
Unreal does not use more than one GPU at a time. Sad but true......
I would like to ask if the RTX 3070 + RTX 3060 can also be used as a dual-GPU setup?
yep, subscribed!
hi Erbay,
Thanks for watching, Please share on your FB page to help grow the channel.