The Best GPU for rendering with Blender. Using single and dual GPUs Part 1. RTX3070, 3080 and 3090.

  • Published 11. 09. 2024
  • #Nvidia #RTX #Blender
    What is the best GPU for rendering with Blender? In this video I test single and dual RTX 3070, 3080 and 3090 GPUs for rendering scenes in Blender.
    This is part 1 of a series of videos in which I test GPU rendering in 3D software such as Blender, Maya and 3ds Max with Redshift, V-Ray and Cycles.
    Blender demo files: www.blender.or...
    Blender with CyclesX: builder.blende...
    SOBA Noodle Scene: www.behance.ne...
    Discord: / discord
    Facebook: / mediamanstudioreview

Comments • 619

  • @henrique-3d
    @henrique-3d 3 years ago +20

    Best channel on benchmarks I've ever seen!

  • @maybefuture
    @maybefuture 3 years ago +8

    Sweet mother of mercy, DUAL 3090's.
    This machine is practically Thanos's gauntlet.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Check out this video with 4 RTX 3090 GPUs:
      czcams.com/video/H0lRWAzdGPQ/video.html
      Thanks for watching.

  • @bulwarkgaming4693
    @bulwarkgaming4693 2 years ago +3

    Honestly, it's so hard to find this kind of content - thank you very much!

  • @fadegboye
    @fadegboye 3 years ago +4

    Thanks Mike for doing this. Looking to get 2 x RTX 3090 soon so this is very useful. Looking forward to part 2! 😊

  • @hardwire666too
    @hardwire666too 1 year ago +6

    It would be interesting to see this revisited, especially now that the 40xx cards are out and Blender is on 3.5. I feel in some cases pricing makes buying a second 3080 look far more appealing, even more so if you have a 12GB version. While only the 3090 has NVLink, I have found that if you have enough system RAM and really work on optimizations you can get by pretty well with 12GB. The Blender benchmarks are not only unrepresentative of a real workload, they are also optimized to show the lowest possible time, whereas out in the wild you're concerned about far more than one single highly optimized frame.

  • @PrashantBuyyala
    @PrashantBuyyala 3 years ago +4

    Awesome video! Very helpful! Looking forward to the rest of the videos in the series. Can't wait to try out the same benchmarks on my own systems.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Glad it was helpful, and thanks for watching, Prashant. Please share on your FB.

  • @isaac-alves
    @isaac-alves 2 years ago +2

    This is really good focused benchmarking - I’ve been looking for a solid media production focused hardware channel. Thank you for producing these.

  • @xDaShaanx
    @xDaShaanx 3 years ago +4

    This is gold for us blender users :)

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Thanks for watching. Please share on your FB page.

    • @rk-pl5mk
      @rk-pl5mk 3 years ago +1

      Hello brother... I follow you on Insta 🤣🤣🙈🙈🙈 @TechMango.in

    • @xDaShaanx
      @xDaShaanx 3 years ago +1

      @@rk-pl5mk Hi brother, oh yes 😁

  • @MitchTube
    @MitchTube 2 years ago +1

    At last, someone is doing the exact reviews I have wanted for years!

  • @ChrisDallasDualped
    @ChrisDallasDualped 3 years ago +2

    Excellent video, thanks for posting this... you got a new sub.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Thanks Chris.
      Please keep watching the channel and share any video topic ideas you may have.

  • @Turkish_Vanilla
    @Turkish_Vanilla 2 years ago +2

    When I discovered this channel, my first thought was that Mr. Mediaman could be the brother that Roger Moore never had.. lol.. Jokes aside, this was an extremely helpful video, since I was asking myself whether two RTX 3070s with a rough price tag of $998 could beat one RTX 3090 or not, and voila, you just gave me the answer in this video. Many, many thanks! And yeah, I am aware of the VRAM issue, but I don't need the VRAM and my scenes are never that complex... and even if they were, I wouldn't mind. But this way I could save 500 precious dollars.. Awesome! Even two RTX 3080s can still save you 100 to 200 dollars, which seems to be the safer option to avoid a failing CUDA test. But if you stick to OptiX then there are no issues with the RTX 3070. Now I wonder... what other double-card options could be lurking out there that could deliver similar results at an even cheaper price tag? Because that's the whole point: getting similar results at lower prices.

  • @raviranjan554
    @raviranjan554 2 years ago +3

    Currently I am using a 1650 Ti mobile GPU; I have an Asus TUF A15 with a Ryzen 7 4800H and 16GB RAM. I started using Blender around 3 months ago, and it's been a great experience, but now I feel I need a true workstation, so I am doing some freelance work to get myself an RTX PC.
    Hope I can achieve it quickly. I want to go into virtual production, so I'm starting to learn Unreal Engine too ✌️

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +1

      Thanks for sharing your experiences with the channel, Ravi Ranjan. It's good to hear of artists progressing in this industry. Keep up the hard work, and success to you.

  • @JustLacksZazz350
    @JustLacksZazz350 2 years ago +1

    Your videos are exactly what I've been looking for - thank you!

  • @ferdabros4700
    @ferdabros4700 2 years ago +51

    The camera angle switch is very annoying

    • @cddvdblurayful
      @cddvdblurayful 1 year ago +1

      Perfect meme material :))))

    • @Hz4
      @Hz4 1 year ago

      That's called the "face the consequences" angle. You can also know it as the "angry daddy" angle, because of your stupid comment.

    • @earlycents
      @earlycents 1 year ago +2

      @@Hz4 His comment ain't stupid, what? It is indeed annoying.

    • @91BH8
      @91BH8 1 year ago +6

      It's not the angle switch, it's the frequency of the switch.

  • @Turgineer
    @Turgineer 11 hours ago

    9:22 Nice test.

  • @tttechtalk3056
    @tttechtalk3056 1 year ago +4

    I know this is funny and tragic at the same time: I have a Ryzen 5 5600X, 16 gigs of RAM, a B450M M4 motherboard, a 550 W power supply, a 500GB SSD and a 1TB HDD... and I use a GT 710 for rendering my work 😶😔. Yes, I am trying to buy a new one; my eyes are on the 3060 Ti 12GB, it should be good for gaming, rendering, editing and streaming at 1080p... hopefully 🤞. So I'm saving up by doing some Fiverr work and making 3D models for Roblox games so I can buy the GPU one day.
    Thank you for making me understand more about how GPUs work 👍

  • @aidanbowen9052
    @aidanbowen9052 2 years ago

    Just found this channel whilst looking for dual-GPU support in Blender. Really great content and delivery - well researched and just the right balance of tech detail, providing all the key info in a well-produced format. Keep it up. Subscribed.

  • @turquoisekamlyn9618
    @turquoisekamlyn9618 3 years ago

    Thanks for the most recent benchmark in blender. Keep up the good work.

  • @Hz4
    @Hz4 1 year ago

    Very useful content. I'm using dual 5600 XTs at the moment. I was looking for new GPUs and found your video about dual 3070s.

  • @HARIsubha1
    @HARIsubha1 3 years ago +2

    Useful video! If possible, please compare the RTX A4000 and A5000 with the 3080 Ti, 3070 Ti and 3090.

  • @slipstream363
    @slipstream363 3 years ago

    Thank you for this, man! I'm currently using a GTX 1080 and I'm now upgrading to a 3090. I had to buy a prebuilt to get one and was still skeptical about it, as I wanted to eventually add another 3090 for the VRAM. There wasn't much information available about shared VRAM anywhere else. Thank you once again!

  • @salibaray
    @salibaray 2 years ago

    These are the best comparison videos, well done and keep it up!

  • @eyeprops5422
    @eyeprops5422 2 years ago

    Glad I stumbled on this. The downside is GPUs are so hard/expensive to get right now. Have to wait a while to try this out.

  • @Badutspringer
    @Badutspringer 1 year ago +2

    Nice video. Thanks for your time.

  • @rusticagenerica
    @rusticagenerica 3 years ago

    So happy I found your channel! Totally crazy that you don't have 1M+ followers and 100k+ views!! Please do massive SEO or run Facebook ads! You deserve them!!! Great test and spirit.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Thanks for the super kind words. I am working on getting more views. Please share on your FB page.

    • @rusticagenerica
      @rusticagenerica 3 years ago

      @@MediamanStudioServices My pleasure! I just shared your great channel on my very first Blender video: czcams.com/video/dN4O3AI4VB4/video.html

  • @b_m_p_1_9
    @b_m_p_1_9 2 years ago

    Awesome video; it had exactly the information I was looking for and was informative but also concise. Thanks!

  • @rootofminusone
    @rootofminusone 2 years ago +1

    such an underrated channel

  • @Alexander_S_K
    @Alexander_S_K 2 years ago +1

    These benchmarks are awesome. Makes me think that getting a 3070 is more worth it if I'm only doing character renders.

  • @johnrobinson4445
    @johnrobinson4445 2 years ago

    These videos should have more views. I recently bought an entry-level rig for Blender and CAD: a Dell Precision 3450 with an i7 CPU and a Quadro P620. Just enough to get started smoothly.

  • @DigitalDesigns1
    @DigitalDesigns1 2 years ago

    12900K + 3090 FE + 128GB RAM for Blender, AR, PR & PS. Your videos steered me into getting the 3090.

  • @enzoni34
    @enzoni34 2 years ago

    This is one of the best videos; I really appreciate it.

  • @taham7195
    @taham7195 2 years ago

    Subscribing was the least I could do. Top work, sir!
    I only get to watch my dream GPU and my dream software together.

  • @saeeddali3145
    @saeeddali3145 3 years ago

    This was very professional, really good job 👏 👍

  • @doubleustudios2020
    @doubleustudios2020 3 years ago +1

    Very interesting benchmark. Thanks for testing it using Blender demo files; it is something that Blenderians can relate to. 😄
    One question though: were Cycles OptiX denoising and Fast GI approximation turned on when you did this test? If not, it could render much, much faster.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Double U Studios, no, I did not turn on denoising, as I wanted the tests to be raw benchmarks.
      Thanks for watching, and please subscribe and share on your FB page to help grow the channel.

  • @3d-illusions
    @3d-illusions 2 years ago +1

    You might like to try out the Turbo Tools add-on for Blender. It can render the classroom scene in 12 seconds on a GTX 1070, and will even clean the individual passes so the classroom scene's complex compositor setup will work (the only denoiser able to do that).

    • @wilsione
      @wilsione 2 years ago

      Free add-ons? Where can we purchase this?

    • @dimavirus9979
      @dimavirus9979 2 years ago

      Is it worth getting a 4K monitor for a 3080? For work, rendering.

    • @3d-illusions
      @3d-illusions 2 years ago

      @@dimavirus9979 I have a 4K and a 1080p monitor hooked up to my GTX 1070 for rendering.

  • @bhargavchavda684
    @bhargavchavda684 2 years ago

    This is really helpful for a beginner like me. Probably the best video in terms of information. Many thanks 🙏🏻

  • @Jershaun
    @Jershaun 3 years ago +1

    Hi Mike, thanks for all the testing you do. Could you do some comparisons between Nvidia and AMD graphics cards, specifically for Blender-only rendering? Cheers.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +2

      Thanks for the idea! I would love to, but I do not have access to AMD GPUs at this time. As soon as I can get my hands on some I will make a video.

  • @kszanika7782
    @kszanika7782 2 years ago +3

    I am currently using AMD 6950XT in my rig which I use for Blender, DaVinci Resolve, and Adobe After Effects.

    • @dimavirus9979
      @dimavirus9979 2 years ago

      Is it worth getting a 4K monitor for a 3080? For work, rendering.

  • @DavidLeeSirHope
    @DavidLeeSirHope 3 years ago +1

    Fully love your channel!

  • @chrisliddiard725
    @chrisliddiard725 2 years ago +5

    Once again the tile size seems very low in these tests; 16 x 16 is OK for CPU, but GPU should be on 256 x 256. It should be a lot faster.
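
    For context, those tile settings can also be applied from Python; a minimal sketch, assuming the Blender 2.9x API used in this video (Cycles X in 3.0+ dropped the per-axis tiles in favour of a single cycles.tile_size property):

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'GPU'

    # Blender 2.9x: larger tiles generally suit GPU rendering better
    scene.render.tile_x = 256
    scene.render.tile_y = 256

    # Blender 3.0+ (Cycles X) rough equivalent:
    # scene.cycles.tile_size = 2048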

  • @ItsAkile
    @ItsAkile 2 years ago

    Love these videos, really useful information. I love the A series, man; I really don't want to be memory-limited, and I'd give up the clocks for that security.

  • @thorn-
    @thorn- 1 year ago

    As requested: currently using a single 3060 Ti.
    Also - very nice vid, Mike. Thanks so much.

  • @ruslan_naguchev
    @ruslan_naguchev 3 years ago +1

    I am a Blender user, thanks for this video. I have 2x 1660 Super and I am planning to buy new cards. The 3090 is the best with 24 gigs, but it is impossible to buy in my country because of miners. I am afraid to buy a 3080 with 10 gigs and no NVLink, but I guess it's the only solution.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +2

      Hi. NVLink works, but the RTX 3080s would give you the same workflow as you have now, just with more power; you do not need NVLink unless you need more than 10GB. Also, you may want to wait for the Super versions of the RTX 3xxx GPUs, as these may have more VRAM.
      Thanks for watching.

    • @ruslan_naguchev
      @ruslan_naguchev 3 years ago

      @@MediamanStudioServices Any idea at what time of year these video cards may appear?

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      @@ruslan_naguchev It is rumored to be next month.

  • @alexandroromio5976
    @alexandroromio5976 2 months ago

    Got a desktop with an i9 12900K and an RTX 3090. Upgrading from an RTX 2060 mobile, this will be a colossal upgrade; my workflow will be so much smoother now.

  • @classicyup3306
    @classicyup3306 2 years ago

    Thank you! This is great content you're making; you have a new subscriber!

  • @emilythemartian
    @emilythemartian 2 years ago

    This is what I was after, thanks man

  • @FelixW2902
    @FelixW2902 2 years ago +3

    Thanks for the video, it was really nice! But I have another question: I've been using Blender for quite a while now for small projects. Now I really want to do some bigger ones, and for that I'm planning to buy a new computer. But I really can't decide which GPU I should pick. I'm torn between the RTX 3080 10 GB, the RTX 3080 12 GB and the RTX 3080 Ti. Sure, I could just buy a 3080 Ti, but how much of a difference would it make compared to a 3080 10 GB or 12 GB? And would it be worth the extra money? Hope you can give me a little advice. ;)

  • @sraztec01
    @sraztec01 2 years ago

    Thank you for your videos, they are very much appreciated. Great GPU content.

  • @ryanpwm
    @ryanpwm 2 years ago

    Thank you! This explains so much that other videos do not. Everyone benches one GPU… and simply shares the exact same test scene with visuals. This really let me know that I do need higher-VRAM cards. The photorealistic CAD models I have to render, with 4 million polys in Redshift, just bog down on 3080 Tis. You can feel it in the ratio of smaller scenes to larger scenes; the non-linear speed decrease is very noticeable… time for a Threadripper Pro and some free-flowing PCIe x16 4.0 lanes.

  • @roneelbali7138
    @roneelbali7138 2 years ago +1

    Really appreciated your video. I am using 2x 3080 with a Threadripper 1920X; I am planning to get a 3090 because of the VRAM.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +2

      Good choice, I really like my 3090. It is a beast.

    • @rsunghun
      @rsunghun 2 years ago +1

      If you use a 3080 and a 3090 together, you can only use the 3080's amount of VRAM when rendering.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +1

      @@rsunghun Hi Rai, I would render only on the RTX 3090, sell the 3080 and get yourself some extra cash.

  • @extragame4574
    @extragame4574 1 year ago +1

    I wish to have a good animation production system
    One day God will give me a good system
    😍

  • @swdw973
    @swdw973 2 years ago

    Until prices come down to where they used to be, many of us are stuck with our current GPUs. I built a new PC from the ground up with an AMD 5900X in August. A 3070 alone would have cost twice as much as my motherboard (which has two PCIe 4.0 x16 slots), the CPU, 32 gigs of RAM, a new NVMe drive, a new case and fan, and a new second monitor combined. So I'm still running the 1660 Super OC I was using in the previous computer. Here's hoping that when the Intel GPUs come out, they will drive prices down.

  • @Maxwell7724
    @Maxwell7724 1 year ago +2

    Commenting for the algorithm.

  • @BoomixDe
    @BoomixDe 2 years ago +4

    GPU prices are going down, and I want to upgrade my PC from a GTX 1080 (non-Ti).
    In my country the RTX 3060 is now at around 430€; basically it's less than half the price of an RTX 3080 and even comes with more RAM.
    So, what about render times with 2x 3060 vs a single 3080?
    (I have a Threadripper 3960X & 64GB RAM)

  • @FFContent
    @FFContent 2 years ago +1

    Great video! Subscribed! I'm running an RTX 3060 and in the classroom scene I got about 30 seconds.

    • @Ramattra-ow
      @Ramattra-ow 2 years ago

      Hey bro, can you suggest which one I should buy for 3D applications: the 3060, 3060 Ti or 3070?

    • @FFContent
      @FFContent 2 years ago +1

      @@Ramattra-ow Well, I'd say don't go for the 3070 because it has only 8GB of VRAM, which is important for creating 3D scenes. I'd say get the 3060 because it has 12GB of VRAM.

    • @Ramattra-ow
      @Ramattra-ow 2 years ago

      @@FFContent Okay, so a high CUDA core count doesn't matter here? 🤔 And if I choose a 6700 XT or 6800 XT, will it be good for 3D rendering, or should I only go for Nvidia?

    • @FFContent
      @FFContent 2 years ago +1

      @@Ramattra-ow They do help with render times, but if you're planning on creating large scenes then more VRAM is needed; if you're not, then just get the better card. I'd say Nvidia is the better choice, since most 3D programs are better optimised for Nvidia.

  • @marsmarv
    @marsmarv 2 years ago +1

    Love the content, great insight and presentation. You deserve a bigger audience for sure :)
    I already run a vintage Intel i7 950 on an Asus P6T motherboard, which luckily has a PCIe x16 slot, and I wish to replace my 1050 Ti with some RTX model, or I'm even thinking about getting a laptop with an RTX, but those are pricey and noisy.
    Does it make any sense to put an RTX 3060 Ti in my oldie, or should I go to the banksters for a loan for a full desktop upgrade?

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +2

      Well, you could start with the GPU upgrade; you can always transfer it to a new system at a later date. The RTX 3060 Ti will still run great in older systems, even on Gen 3 PCIe.

    • @Obie327
      @Obie327 2 years ago

      @@MediamanStudioServices I believe his setup would be PCIe Gen 2. My Sandy Bridge (P67 chipset) came later, in 2011, and also had Gen 2.

  • @Farlough1337
    @Farlough1337 2 years ago

    Your channel is a godsend; thanks a lot for your hard work, it helped me so much :)
    Cheers!

  • @versatale
    @versatale 2 years ago

    Thank you from the depths of my heart for this video. At first I wanted to buy one RTX 3080; after seeing this I decided to buy one RTX 3070 (because I'm low on budget for now) and I will add another 3070 after a few months. It seems the best option when I look at the markets (the ones that can deliver to my country), where two RTX 3070s are just 1.2 times more expensive than one RTX 3080. Thank you for saving people's money and lives! 🥰🥰😍😍🤩🤩🤗🤗

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +1

      Glad I could help

    • @DerekDavis213
      @DerekDavis213 2 years ago

      The 3070 has less VRAM, so that could hurt you on complicated renders. The 3080 has either 10GB or 12GB; that can make a real difference.

  • @ImportRace
    @ImportRace 3 years ago

    Good video test, thanks for the post.

    • @ImportRace
      @ImportRace 3 years ago

      Just started to use Blender, GPU: 3070 FE.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      @@ImportRace Thanks for watching. Please subscribe and share on your FB to help grow the channel.

    • @ImportRace
      @ImportRace 3 years ago

      @@MediamanStudioServices You're welcome and will do

  • @miguelcrespo5545
    @miguelcrespo5545 3 years ago

    Dude you are a saint. Thanks for the content

  • @Clownass24
    @Clownass24 3 years ago

    Glad I found this channel

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago

      Thanks Clownass. Please help grow the channel by sharing on your FB page.
      I also have a new video with the RTX A6000 vs. the RTX 3090. Please watch that video.

  • @ReenoMoon
    @ReenoMoon 1 year ago +8

    can you stay on one camera for long enough for my eyes to adjust at least?

  • @Xplozhun85
    @Xplozhun85 1 year ago +1

    Great video! I have a 2080 Ti and am thinking of getting an RTX 3060 as a render GPU - would a combination of the 2000 & 3000 series be advisable? I'm focusing on 3D and rendering in Maya, Blender, Arnold and Octane.

  • @howard7446
    @howard7446 3 years ago

    Thank you! It's really helpful.

  • @danlinte1187
    @danlinte1187 1 year ago

    Nice work!

  • @houssemskills9282
    @houssemskills9282 2 years ago +2

    I'm actually using the GTX 770 from MSI :)

  • @remon563
    @remon563 2 years ago

    Very well done review and a solid presentation. As a 3D freelancer I have been keeping an eye out for the RTX series since launch. The funny thing is that due to unavailability I switched back to a... 1080 Ti, and it still works fantastically ^^. The only thing I am really missing out on is RTX, but thanks to Unreal 5's real-time lighting solution "Lumen" the desire for RTX cards for production has been lowered.
    P.S. It might be an idea to research edit-poly modeling performance in Blender and if/how the GPU impacts it. In Blender, modeling performance is mainly CPU-based, so it would be interesting to see some CPU/GPU combinations.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago +1

      Hi Remon, you are still missing out on the raw processing power of the new-gen GPUs as well. Yes, it's crazy getting a GPU in today's market, but the production time savings could be worth it.
      Check out my video here:
      czcams.com/video/ilcHvNOrPWg/video.html

    • @aintnomeaning
      @aintnomeaning 2 years ago +1

      The improvement in things like Substance Painter is shockingly big - I went from a 1070 to a 30-series card, and it was worth it. I would recommend saving up and watching out for a good opportunity.

    • @remon563
      @remon563 2 years ago +1

      @@aintnomeaning Thank you for the comment. It has definitely been a purchase on my mind for a while now. The 1080 Ti just performs so well that the need is not there just yet. This, in combination with the doubled market prices, makes me want to hold off a bit, but if I can snatch a 3080 for a good price I might just bite :P

    • @aintnomeaning
      @aintnomeaning 2 years ago

      @@remon563 For sure!

  • @itsmyname_real
    @itsmyname_real 2 years ago

    aight... u got my respect, subscribed!!

  • @williamiiifarquhar4345

    Great video!
    Could you post all the info/specs for the parts you used in the benchmarking computer?
    I am in the process of building a new desktop, and in particular I am wondering what motherboard you used (when you mentioned workstation vs regular desktop parts).
    Thanks!

  • @kerrybaldino8826
    @kerrybaldino8826 3 years ago

    Very well done video!

  • @cloerenjackson3699
    @cloerenjackson3699 2 years ago

    That was very helpful, thank you.

  • @Obie327
    @Obie327 2 years ago

    I'm really enjoying your content, Mediaman. I was wondering about performance losses between older chipsets - comparisons between PCIe Gen 2, 3 and 4? Thanks for a great informative video!

  • @2B_9E
    @2B_9E 2 years ago

    Again I found the exact video and information I wanted

  • @basspig
    @basspig 2 years ago +1

    I'm getting good 4K renders of a walkthrough animation of the Soba benchmark scene. The tile size had to be increased to 4096. Actual render time: 16 seconds per frame in Cycles on Blender 3.0 with an overclocked RTX 3090.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      That's some nice performance. I have not dived into Blender 3.0 too much yet. Thanks for sharing your experiences.

  • @Midnightwalker01
    @Midnightwalker01 2 years ago

    I love your ideas

  • @salasart
    @salasart 2 years ago

    Thank you for this!

  • @LeandroJack
    @LeandroJack 1 year ago

    Great video. Do you see any benefit in choosing a Quadro card over those cards?

  • @almostwhite27
    @almostwhite27 2 years ago

    Love this, but man, the frequency of camera cuts in the first minute gave me headaches.

  • @KhataarFelMataar
    @KhataarFelMataar 3 years ago

    Splendid video thank you sir!

  • @Mildly_Amused
    @Mildly_Amused 2 years ago +2

    I'm using a few 3090s. I needed a PCIE extension cable to squeeze the 4th one into my Threadripper system because there wasn't enough space to fit 4 cards in there due to the slot spacing.

    • @mikebrown9826
      @mikebrown9826 2 years ago +1

      What kind of PSU are you using to power all the GPUs?

    • @paulianniello1740
      @paulianniello1740 2 years ago

      @@mikebrown9826 I would almost guarantee that barring this person having a really jank setup, they are using a dual PSU case with 3 of the GPUs on an independent supply or something similar.

  • @orkunsanal
    @orkunsanal 2 years ago

    Great content, thank you! Subbed.

  • @amirnaseri8793
    @amirnaseri8793 8 months ago

    Very helpful info, thank you.

  • @TJMillard
    @TJMillard 3 years ago

    Love your channel! I'm grabbing a new PC this month and am debating between a 5950X and a 3990X, both setups with 2x RTX 3090s. I work primarily in C4D and Unreal. I would love to see some render tests with these setups in Octane! Keep up the great videos!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi TJ, good luck getting the new system, sounds like a nice setup.

    • @nawafalrawachy394
      @nawafalrawachy394 3 years ago +1

      A 3990X is best for general 3D work and game development, since it can compile shaders much faster due to its core count, in addition to having more PCIe lanes. However, if you work primarily on animations, the 5950X is your best choice due to its much faster single-thread performance, which allows faster viewport playback so you can quickly iterate on your animations, while still having a good core count. Keep in mind that you only get 24 lanes with it, so your GPUs will run at x8 each, which degrades GPU performance by about 1%, but that is still within the margin of error.
      Best of luck with what you get.

    • @nawafalrawachy394
      @nawafalrawachy394 3 years ago +1

      Also, do not get the FTW3 3090s, as they have faulty power delivery. I had one die on me while being pushed to 100% for over 15 minutes. The best choice is the Asus ROG Strix or the TUF, then the MSI Gaming X Trio after that. These are what I have experience with. Generally, EVGA cards will break if I push them too hard. The same happened with an EVGA 2080 Ti a year ago.

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      @@nawafalrawachy394 I would have to agree, the 5950X is the faster CPU. As for GPUs in a desktop system, check out my latest video on this topic: czcams.com/video/57gJvskWvPA/video.html

    • @TJMillard
      @TJMillard 3 years ago

      @@nawafalrawachy394 Thanks Nawaf! I’m torn after hearing news about Alder Lake now. I wonder if it’s worth waiting for that.

  • @tuan310580
    @tuan310580 3 years ago

    Thank you for all these videos 👏👍🙏

  • @sleepingonthecouch95
    @sleepingonthecouch95 2 years ago

    There is one more important aspect to compare, and that is VRAM. With 10-12GB you will likely be running out of memory very quickly these days. I've been experiencing it over the last 2 months with my three 3080s on various types of scenes: a cruise ship on an infinite ocean with some simulation, a car chase scene in a big city, or just a simple morphing-particles effect. I had to switch to a 3090 lately and it helped a lot, even though it is quite a bit slower than three RTX 3080s, of course.

    • @dimavirus9979
      @dimavirus9979 2 years ago

      Is it worth getting a 4K monitor for a 3080? For work, rendering.

    • @humansrants1694
      @humansrants1694 1 year ago

      Yes, I'm using over 7GB of VRAM just modeling and texturing a scene, and I'm nowhere near done.

    • @arashrahimnejad3955
      @arashrahimnejad3955 1 year ago

      Can't you use NVLink? I think it merges the VRAM and it is compatible with 3D applications.

    • @sleepingonthecouch95
      @sleepingonthecouch95 1 year ago

      @@arashrahimnejad3955 Only the 3090 supports NVLink in the 3000 series.

  • @TheWayneArts
    @TheWayneArts 1 year ago +1

    Why not use a bigger tile size for rendering? Such small tile sizes are better suited to CPU rendering (like 32 x 32 or 64 x 64). Render speed on GPU is much faster with a size of 256 x 256, and with recent versions I even get the best times with 2k x 2k.

  • @DVFHAFYT
    @DVFHAFYT 2 years ago

    Great video! I'm looking to build a new PC by the end of the year, and I'm wondering if it's worth going through the hassle of putting my 980 in (alongside a new GPU) to get faster animation renders; I'm pretty sure I won't be able to afford two new GPUs even if GPU prices go down.

  • @amrasbine9252
    @amrasbine9252 1 year ago +1

    The like button has been hit :P

  • @atulbisht2989
    @atulbisht2989 1 year ago

    Thanks for providing AAA quality content to viewers.

  • @clausbohm9807
    @clausbohm9807 2 years ago +1

    If you look closer at those charts you will see that OptiX uses VRAM and that the CUDA-core version of Cycles X has not been fleshed out yet. Also, the system RAM may be rather slow, so you need to speed it up and RAID it. Lastly, you can get away with VRAM demands over the 3080's limits, so it must be using system memory. These charts are telling when things don't scale as expected...

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      Hi Claus, I am not sure what you mean by RAIDing the system RAM, but thanks for watching.

    • @clausbohm9807
      @clausbohm9807 2 years ago

      @@MediamanStudioServices Sorry, you are right, I was thinking of using RAID cards for M.2 storage, not RAM. I was wondering if that speeds up the start of rendering on some of the demo files.

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      @@clausbohm9807 Faster drives will improve GPU rendering only in the load times needed to get the data into RAM; from there it is transferred to the GPU. So a few seconds will be shaved off the load times.

    • @clausbohm9807
      @clausbohm9807 2 years ago

      @@MediamanStudioServices Great, that was what I suspected, thanks for confirming it. It's annoying to wait for the file to start rendering...

  • @BlenderRookie
    @BlenderRookie 2 years ago +3

    I have a scene I made that you can use for benchmarking. It's an interior scene with volumetrics. It only uses about 1.5 GB of VRAM and it takes way longer to render than any of the Blender benchmark files. On my system, with an RTX 3060 & RTX 3070 working together, it takes 7 min 7 sec for a really clean render using Blender 3.0 (Cycles X). I will provide a link so you can download it if you want.
    My system for these scenes:
    Ryzen 9 3900X
    RTX 3060
    RTX 3070
    64GB 3200 MHz RAM
    NVMe Gen 4
    Blender 3.0 Cycles X:
    BMW: CUDA 13.7 sec - OptiX 7.6 sec
    Classroom: CUDA 24.5 sec - OptiX 15 sec
    Blender 2.93:
    BMW: CUDA 19.5 sec - OptiX 10.7 sec
    Classroom: CUDA 53.7 sec - OptiX 34.6 sec
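
    Timings like the ones above can be collected headlessly; a minimal sketch (an illustration, not the commenter's actual script), assuming it is saved as time_render.py and run with: blender -b scene.blend -P time_render.py

    import time
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'GPU'  # CUDA or OptiX is selected under Preferences > System

    start = time.time()
    bpy.ops.render.render(write_still=False)  # render the current frame
    print(f"Frame rendered in {time.time() - start:.1f} s")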

    • @MediamanStudioServices
      @MediamanStudioServices  2 years ago

      Hi Blender Rookie, I would love to check out your render test scene. You can PM me on the Mediaman Facebook page.
      Thanks for watching.

    • @BlenderRookie
      @BlenderRookie 2 years ago

      @@MediamanStudioServices I don't have a FB account. I would just add the link here but YT would put me in the spam box. So, the link is in my "About" tab on my channel in the description.

    • @versatale
      @versatale 2 years ago

      Hey bro, could I have a system with an RTX 3070 (which I have now) and add a 3090 later, and would both of these GPUs still work together? I am new to using 2 GPUs, so I have no idea. Is it as simple as putting 2 graphics cards on my motherboard and they will work together at full potential? At first I wanted to have 2x 3070, but after reading your comment I am thinking of waiting some months and, instead of buying a second 3070 now, buying a 3090 later. I hope you see this msg! 😁😁😁

    • @BlenderRookie
      @BlenderRookie 2 years ago +1

      @@versatale In programs like Blender, you can mix and match GPUs and they work together just fine, because programs like Blender do not need any sort of NVLink. However, when it comes to gaming, whichever GPU is running the displays is the only GPU being used, unless you are using NVLink or something similar. But NVLink only works with two of the same cards, and when it comes to the 3000 series, only the 3090s are NVLink compatible.
      But to answer your question more directly: if you have a 3070 and you add a 3090, they will work together just fine in programs like Blender.
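
      For anyone setting this up, a mixed pair of GPUs like that can also be enabled for Cycles from Python; a minimal sketch, assuming the Cycles add-on preferences API (device names and types will vary per machine):

      import bpy

      cprefs = bpy.context.preferences.addons['cycles'].preferences
      cprefs.compute_device_type = 'OPTIX'   # or 'CUDA'
      cprefs.get_devices()                   # populate/refresh the device list

      for dev in cprefs.devices:
          dev.use = (dev.type != 'CPU')      # enable every GPU found, e.g. a 3070 and a 3090
          print(dev.name, dev.type, 'enabled' if dev.use else 'disabled')

      bpy.context.scene.cycles.device = 'GPU'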

    • @versatale
      @versatale 2 years ago

      @@BlenderRookie Thank you very much for taking the time to explain this. I don't game, maybe once a month, and I am a full-time 3D artist, so that's super great news. I'm probably going to wait a while and maybe buy an RTX 3090, since I can do much of my stuff with my 3070 for now. Much love and blessings, this was super helpful 🥰🥰🥰🥰

  • @btn237
    @btn237 3 years ago

    Really helpful content, thanks. Something I’ve been struggling to get a definitive answer to in online searches, hoping you might know the answer - would a dual GPU setup increase viewport render performance in Blender, or is the extra GPU only useful for final render?

  • @rainfd2037
    @rainfd2037 2 years ago

    Great video! Thank you so much.
    Looks like an RTX 3080 is enough for me, and adding an RTX 4000 could be a huge performance improvement in the future.

    • @mariuspuiu9555
      @mariuspuiu9555 2 years ago

      Any future upgrade should be focused on VRAM; you want to have as much as possible alongside that extra performance.

    • @dimavirus9979
      @dimavirus9979 2 years ago

      Is it worth getting a 4K monitor for a 3080? For work, rendering.

    • @mariuspuiu9555
      @mariuspuiu9555 2 years ago +1

      @@dimavirus9979 Yes, just don't buy a small one: 32 inch or higher, otherwise look for QHD screens. Also try to find something that has 120Hz or better, 300-400 nits minimum (preferably 500 or higher) and 100% sRGB coverage (preferably 96-100% DCI-P3).

    • @dimavirus9979
      @dimavirus9979 2 years ago

      @@mariuspuiu9555 What are the recommendations for QHD? How many inches, to keep your eyes comfortable?

  • @blackstar_1069
    @blackstar_1069 1 year ago +2

    I miss this channel 🙁

    • @mikebrown9826
      @mikebrown9826 1 year ago +6

      Sorry Ratzel. I have relocated to Vancouver, Canada, and am currently trying to get equipment to do some new videos. Once I find a source for GPUs I will be making more content. Thanks for watching.

    • @blackstar_1069
      @blackstar_1069 1 year ago

      @@mikebrown9826 Thanks for the answer, I'll keep waiting then; hope to see you soon, great content sir. 🤘😻

  • @MographChamp
    @MographChamp 1 year ago +2

    What 3090s are those blue ones, and where can I get blower-style 3090s besides eBay?

  • @sondamvula
    @sondamvula 2 months ago

    Great video. I tested out the 4080 and 3090 but went with more VRAM with the 3090.

  • @Altohamy
    @Altohamy 2 years ago

    Just smashed the like button to make it 600.

  • @stratoarquitectos8889
    @stratoarquitectos8889 2 years ago

    Thanks for your input on these issues; it has really helped me a lot. I have a question for you: how does this behave in Unreal Engine, for 3D architecture production...

  • @6gs900
    @6gs900 1 year ago +1

    I would like to ask if an RTX 3070 + RTX 3060 can also be used as a dual-GPU setup?

  • @mixtapechi
    @mixtapechi 3 years ago

    yep, subscribed!

    • @MediamanStudioServices
      @MediamanStudioServices  3 years ago +1

      Hi Erbay,
      thanks for watching. Please share on your FB page to help grow the channel.