Two GPUs in a Desktop Computer: x8 PCIe Performance

  • Published 14 Sep 2021
  • #Nvidia #RTX #Blender
    Comparing x8 vs. x16 PCIe performance. Desktop vs. workstation GPU setups.
    Redshift: www.redshift3d.com/
    Attic Scene: drive.google.com/file/d/1zqnU...
    Discord: / discord
    Facebook: / mediamanstudioreview
  • Film & Animation

Comments • 316

  • @vaibhavtailang
    @vaibhavtailang 2 years ago +16

    Just the video I wanted. Thank you very much, sir! You've helped a lot in clearing my doubts. I went with a 5900X setup instead of a Threadripper workstation purely because of price. As an independent artist, I can live with the performance loss better than spending a thousand bucks extra on a Threadripper workstation, which I couldn't afford at this point anyway. But I take the point that at some stage I will need to move to a workstation. I also realize that, beyond performance, a desktop gives me less hardware flexibility and less of an upgrade path, but for now the 5900X setup is the optimum sweet spot for me.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +2

      Hi vaibhav tailang, I understand your situation, as I myself have a 5900X for the same reasons. The system is great for most work. I edit all my videos on the 5900X and I really like the performance I get for the money.
      Good luck
      Thanks for watching

  • @tojiyev_cg
    @tojiyev_cg 1 year ago +6

    Great video. It is the ONLY video about this topic on CZcams. Thank you guys.

    • @phasepanther4423
      @phasepanther4423 1 year ago

      But why, though?
      Just because SLI/NVLink and CrossFire aren't supported anymore doesn't mean this topic should be ignored at all.

    • @tojiyev_cg
      @tojiyev_cg 1 year ago

      @@phasepanther4423 I meant that it is the only video about using dual GPUs on consumer motherboards, which usually have 16 PCIe lanes for graphics; in that case you can run two GPUs with x8 lanes each. There is no other video about this topic. I didn't say anything about SLI or CrossFire not being supported.

  • @DerelictNacho
    @DerelictNacho 1 year ago +1

    I greatly appreciate the way you produced this video. To the point. No clickbait title. Straightforward. Thank you!

  • @laszloperesztegi
    @laszloperesztegi 3 months ago +1

    Thank you for the video. I'm here to compare the two configurations for video transcoding performance.

  • @rifatkabirsamrat5237
    @rifatkabirsamrat5237 10 months ago +4

    You guys are lifesavers. Literally the only video on this topic on CZcams. Thank you so much!

  • @ChrisDallasDualped
    @ChrisDallasDualped 2 years ago +2

    Amazing info; I subbed to your channel today. You have the best rendering info on the net that I've seen. Thank you for this.

  • @clausbohm9807
    @clausbohm9807 2 years ago +3

    These are the questions few reviewers cover; thanks for the effort.

  • @itsaman96
    @itsaman96 1 year ago

    Thank you for making this video; it covers in real detail a question that always comes to my mind.

  • @Anthrocarbon
    @Anthrocarbon 17 days ago

    Thanks for investigating, but boy was the Laughing Man name drop the GITS nostalgia hit I needed today!

  • @rayornelas8459
    @rayornelas8459 9 months ago

    Really informative. It would be great to note your full specs in your videos.

  • @jinomian01
    @jinomian01 1 year ago

    Thank you for doing this. This helped me understand the impact if I go down the desktop dual-GPU route. I don't need two at the moment (and have never done that on any of my builds), but I would like the option in my next build.

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago +1

      Two GPUs do work in a desktop if you can fit the cards into the system and MOBO, and for 3D rendering the x8 PCIe speed does not affect performance that much; it adds only about 2-3% to final render time, due to the transfer of the scene to the GPU. So a desktop is a viable solution for dual-GPU setups. Thanks for watching

    • @jinomian01
      @jinomian01 1 year ago

      @@MediamanStudioServices Thank you Mike, that is great to know.

  • @andreybagrichuk5365
    @andreybagrichuk5365 2 years ago +1

    Amazing content, thank U, Mike !!!

  • @clausbohm9807
    @clausbohm9807 2 years ago +1

    Great comparisons! Great explanation. Great video! Again, please do a 3060 12GB test, since it has 4GB more than the 3070.

  • @rockyvicky3848
    @rockyvicky3848 2 years ago +2

    Sir, you are really making YouTube useful

  • @Gettutorials123
    @Gettutorials123 2 years ago +1

    Really informative!

  • @Frederic_walls
    @Frederic_walls 2 years ago

    Excellent! Thank you very much for the information; it is very helpful.

  • @vinee.mp4
    @vinee.mp4 1 year ago

    Huge thanks for this informative video!

  • @Afflictionability
    @Afflictionability 11 months ago +2

    thank you I needed this

  • @kylecook4806
    @kylecook4806 1 year ago +2

    I have a 5950X and was thinking of switching to a workstation with the next-gen stuff. This vid, though, was exactly what I was looking for: I'm getting a second GPU and staying on my current system for a long while.

    • @xMaFiaKinGz
      @xMaFiaKinGz 1 year ago +1

      You can actually go as low as x4 with PCIe 4.0 without losing performance. I have an RTX 3070 on PCIe 3.0 x8 and it works flawlessly with no performance loss.

  • @guillermocosio7362
    @guillermocosio7362 2 years ago +3

    Hey Mike! PCIe Gen4 x8 has the same bandwidth as PCIe Gen3 x16. A 3090 at full speed is only about 1% bottlenecked by PCIe 3.0 x16, so a 3070 shouldn't be bottlenecked at all with PCIe 4.0 x8. However, an older system with only PCIe Gen3 (Intel 10th gen or Ryzen 2000) and dual graphics cards may show more of a performance difference.
    Also, I believe that during rendering, if the scene fits within the GPU VRAM, the PCIe interface is only used to feed the GPU during loading. Where PCIe speed could really hurt is when the scene doesn't fit in VRAM and the GPU needs the PCIe interface to access system RAM as a cache. So in huge scenes the performance difference may actually be huge as well.
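    A quick back-of-the-envelope check of that bandwidth claim, in Python (theoretical peak rates with 128b/130b encoding overhead folded in; real-world throughput runs a bit lower):

      # Approximate usable PCIe bandwidth per lane, in GB/s,
      # after 128b/130b encoding (Gen3 and newer).
      GBPS_PER_LANE = {3: 0.985, 4: 1.969, 5: 3.938}

      def link_bandwidth(gen: int, lanes: int) -> float:
          """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
          return GBPS_PER_LANE[gen] * lanes

      print(link_bandwidth(3, 16))  # ~15.8 GB/s
      print(link_bandwidth(4, 8))   # ~15.8 GB/s -- matches Gen3 x16
      print(link_bandwidth(4, 16))  # ~31.5 GB/s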

  • @OfficialHankIII
    @OfficialHankIII 1 year ago +1

    THANK YOU FOR YOUR DIRECT-TO-THE-POINT VIDEOS!!!!!

  • @fenghanlin4284
    @fenghanlin4284 2 years ago +1

    Thanks for sharing. I had the same experience. I used a Supermicro 2024US-TRT with two A100-40GB cards for GPU-accelerated computing (simulation). I made a mistake by putting one of the cards into an x8 slot. I could see from nvidia-smi that both cards ran at 99%, until one day I noticed that one GPU was in x16 and the other in x8. I wondered whether that affected anything, so I loaded tasks individually onto the card in x16 and the one in x8, separately. I didn't see any difference at all! Then I wondered if it changed with different loads, meaning different memory usage. To confirm, I tested loads occupying from 10GB of memory up to the full 40GB. I didn't see any difference. Later I did the same tests on two RTX 3090 cards (turbo fan) on a 10980XE workstation (a Dell 5820X). Sorry, I couldn't test the A100s in the workstation because of heat issues. I put one of the RTX 3090 cards first in an x16 slot and later in an x8 slot. This time I could see about a 20% difference in simulation speed. Note that the 5820X only provides PCIe 3.0. Then I wondered if the difference was caused by the difference between GeForce and Tesla, but I couldn't put the 3090s into the AS-2024US chassis because they are too long (MSI Aero). So my rough conclusion is that PCIe 3.0 x16 or PCIe 4.0 x8 may be good enough for my applications, but it may vary with different use cases. Thanks a lot for sharing.
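    For anyone wanting to verify which link each card actually negotiated, nvidia-smi exposes this directly; a small Python wrapper as a sketch (assumes the NVIDIA driver and nvidia-smi are installed):

      import subprocess

      # Ask nvidia-smi for each GPU's current PCIe generation and lane width.
      result = subprocess.run(
          ["nvidia-smi",
           "--query-gpu=index,name,pcie.link.gen.current,pcie.link.width.current",
           "--format=csv,noheader"],
          capture_output=True, text=True, check=True,
      )
      for line in result.stdout.strip().splitlines():
          print(line)  # e.g. "0, NVIDIA GeForce RTX 3090, 4, 16"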

  • @3dpixel583
    @3dpixel583 2 years ago

    Hi! Thanks for your comparison. This is exactly what I was looking for!
    I have a question, though: do you think the CPU difference actually contributes to the time difference?

  • @M.D.design
    @M.D.design 2 years ago +1

    Thanks a lot for the review.
    I was planning to build a system with a Ryzen 9 5950X and two 3090s or 3080/80 Tis, mainly depending on what I can find on the market, because GPUs cost a lot where I live (2500€ on average for a custom 3090).
    Obviously I would lean towards the 3090s, also to try NVLink.
    I also considered a workstation platform like the ASUS WRX80 with a 3955WX (because I do 80-85% of my work on the GPU and don't need a lot of cores), but it's more expensive.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi, you could try a Lenovo P620 with the 12- or 16-core CPU. The space inside the case is limited, but I have installed two RTX 3080s in it and it worked. If you can get your hands on two RTX 3080 Ti Founders cards, they would fit no problem in the P620, and Lenovo offers the 3080 Ti if you can get a local SI to place a special order with Lenovo. Don't buy from the web site, as the prices are better from a reseller.
      But a Ryzen 5950X is a great system as well; just pick a good MOBO. The X570 Taichi is a great board with good VRMs:
      www.asrock.com/mb/amd/x570%20taichi/
      You will need a 4-slot-spacing NVLink bridge with this board, but you will want the extra space with the big RTX 3090s.
      Or better yet, a custom-loop water-cooled system is your best choice; check out my video on bending PETG tubing:
      czcams.com/video/dwfn5IQXnIE/video.html
      Thanks for watching

  • @Atsolok
    @Atsolok 2 years ago

    I was looking for this answer and you're the only one who could tell me... Subscribing!

  • @SaladFX
    @SaladFX 1 year ago +3

    Really wanted to know how two different GPUs would work for production workloads, especially rendering in XPU.

  • @lejeffe1663
    @lejeffe1663 2 years ago

    Awesome, been eyeballing a Threadripper Pro / R9 dual-system build :)

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi Jeffe, I wish you luck finding a system in this messed up market.
      All the best.
      Thanks for watching

  • @TGameplay4k
    @TGameplay4k 17 days ago

    great video thanks!

  • @lucapupulin6990
    @lucapupulin6990 2 years ago +1

    Very useful information!
    Thanks for the awesome benchmarks you do!
    It would be interesting to see how much simulation (Houdini Pyro or Vellum) speed is impacted using x16 vs. x8 PCIe.

  • @TheArtofSaul
    @TheArtofSaul 2 years ago

    Something to keep in mind, especially in RS's case, is how out-of-core works with slower PCIe bandwidth. PCIe speed isn't much of an issue when everything is loaded at once in a single go and fits in VRAM. But since out-of-core is pretty robust in RS, you can work on much, much larger scenes, and then PCIe speed can make a large difference in the total bandwidth of data moving back and forth between system RAM / scratch disk (the worst case, though not bad on an M.2) versus a single load into VRAM.
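    As a rough illustration of the stakes (hypothetical scene size, theoretical peak link rates; real transfers also pay latency and scheduling costs):

      # Estimated time to stream out-of-core data across the PCIe link.
      def transfer_seconds(gigabytes: float, gen: int, lanes: int) -> float:
          gbps_per_lane = {3: 0.985, 4: 1.969}[gen]  # usable GB/s per lane
          return gigabytes / (gbps_per_lane * lanes)

      # 200 GB of textures/geometry paged in and out during a render:
      print(f"{transfer_seconds(200, 4, 16):.1f} s at Gen4 x16")  # ~6.3 s
      print(f"{transfer_seconds(200, 4, 8):.1f} s at Gen4 x8")    # ~12.7 s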

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi Saul, I am not an expert on out-of-core VRAM usage, but if this video is any indication of how GPUs perform when they run out of memory, then I'd say more VRAM is always better. At the end of this video I render a very large scene with RS, and the RTX A6000 crushed the RTX 3090 due to the 3090's limited VRAM.
      czcams.com/video/M2I69uK9WVU/video.html
      Thanks for watching

  • @comodore3684
    @comodore3684 1 year ago +1

    Great channel. For this video I was expecting a benchmark comparison between one desktop GPU and two GPUs: is it worth adding a second GPU for rendering, is it worse, or is it the same? Maybe you have another video that deals with this question; otherwise it would be nice to do one. Keep up the good work.

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago

      Hi Comodore, I have a few dual-GPU videos on my channel. Check them out. Thanks for watching

  • @06RHDSUBIEWAGON
    @06RHDSUBIEWAGON 5 days ago

    I'd love to see a video on how to set up the two GPUs and how to run them together.

  • @DigitalDesigns1
    @DigitalDesigns1 2 years ago

    Love your videos. New subscriber.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi Dewayne Dailey, for video editing, unless you use Resolve Studio (the paid version), most video editing software does not use the GPU for much acceleration. And rendering in the viewport is a single-GPU task; only final 3D rendering can use multi-GPU setups.

  • @peterxyz3541
    @peterxyz3541 1 year ago +2

    THANKS! I'm less confused now. Question: does the MB auto-balance the workload? I'm planning to use two different Nvidia cards for AI text & image generation.

  • @eric6606
    @eric6606 2 years ago +1

    Would love to see a review of the new Mac Pro/Max vs RTX 3090/A6000. Great content btw.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Hi Eric, not that I am a Mac hater, but there is not much support for GPU rendering on an Apple system, so I have no desire to create a Mac video on 3D rendering performance.
      Thanks for watching

  • @junaidcg
    @junaidcg 2 years ago

    Hi Mike, another great video from you. Do you think that for heavy rendering there is likely to be a significant difference between a desktop processor and a workstation processor, just like the test you did for the RTX A6000 vs. RTX 3090, where the difference was negligible until you tested memory-intensive rendering?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      I don't think so, as the CPUs are close in specs, so there should not be much difference in the test results.

  • @garyc5245
    @garyc5245 4 months ago +1

    Well, even though this is two years old, it's still very helpful. My situation is using an NVLink bridge on two 3090 Tis for a client who wants the VRAM of an A6000 without the price. I was able to find an NVLink bridge, but it is difficult finding a motherboard with the 81mm spacing I need between the two 5.0 x16 slots.

    • @hareram4233
      @hareram4233 4 months ago

      Did you find one? I also need suggestions for this motherboard.

    • @garyc5245
      @garyc5245 4 months ago

      @@hareram4233 I ended up getting the TRX50 Aero D by Gigabyte; it has 3-slot spacing, and I found a 3-slot NVLink bridge online. Still in the works, but it looks like Linux can support NVLink even if Windows does not... at least I hope so.

  • @santiag0pictures
    @santiag0pictures 11 months ago +1

    Thank you so much! It's hard for many of us investing our savings in hardware, trying to get the best performance at the lowest cost. Have you ever considered power consumption? Having many machines working 24/7 affects the electricity bill. So, is it cheaper to have many GPUs in one PC vs. many PCs with single GPUs?

  • @fadegboye
    @fadegboye 2 years ago +2

    Hi Mike! Thanks for the video; it always has lots of useful information. Just one thing: at 4:03 you mention the V-Ray benchmark scene, but that is a Redshift benchmark scene. Is that correct? Cheers.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Yup, my bad, but I cannot change it now. CZcams will not allow any edits after you submit the video; I would have to delete the video and repost it.
      Thanks for pointing that out

    • @fadegboye
      @fadegboye 2 years ago

      @@MediamanStudioServices No problem. Good content.

  • @KiraSlith
    @KiraSlith 10 months ago

    It sounds like a lot of that performance reduction comes from switching-time loss. I was going to build a gaming system and just add a second GPU later, but I ended up chickening out and just throwing a pair of 2080 Tis into a T7820. $1k, and I got a rig that happily does both Blender with RT and AI production work.

  • @aceaquascapes
    @aceaquascapes 2 years ago

    @3:26 the screenshot score is different from what your audio said. Thanks for the great work.
    I finally got a workstation.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Ya, I rushed this video and there are a few mistakes. But thanks for watching.
      What workstation did you get?

  • @Slacker28
    @Slacker28 1 year ago

    I don't know if you can answer this, but I'm rendering with a high-end gaming PC using Stable Diffusion, and I want to know if x8 PCIe Gen 4 would reduce quality, because I want to put more NVMe drives into the machine and I don't want to do it if it's going to compromise my AI rendering.

  • @nawafalrawachy394
    @nawafalrawachy394 2 years ago

    Hi Mike,
    Thank you for taking the time to provide this valuable information.
    If I may ask, what is the keyboard you use in your videos? Thank you.

  • @georgigazeto4650
    @georgigazeto4650 1 year ago +2

    Are you going to make a new dual-GPU video with the 4080 or 4070 Ti?

  • @DigitalDesigns1
    @DigitalDesigns1 2 years ago

    Looking to build a machine to help my realtime Cycles rendering in the viewport. I've seen the 3090 nearly rendering certain scenes in realtime; I want this level of performance, but getting a 3090 nowadays is hard. Would two 3080s outperform one 3090 in rendering, video editing, etc.?

  • @yasherok
    @yasherok 2 years ago +1

    Hi Mike,
    Could you please say what motherboard you use for the desktop machine in this video, and what cooling system? As I see it, you use blower versions of the GPUs in this rig, right?
    Also, what do you think about the GIGABYTE X570S AORUS MASTER motherboard for use with two fan-style GPUs and a Ryzen 5950X? It looks like it has enough room for airflow between the two GPUs, and it has decent VRMs for this processor. Trying to figure out the best config for my situation.
    Thank you for your channel; there is a lot of useful and unique information here!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Hi Vlr, I am using the Gigabyte X570 Aorus Pro WiFi:
      www.gigabyte.com/Motherboard/X570-AORUS-PRO-WIFI-rev-10#kf
      For the cooling system, I did my own custom loop; check out the videos I did here:
      czcams.com/video/r_Rl30LhE2M/video.html
      czcams.com/video/dwfn5IQXnIE/video.html
      The GIGABYTE X570S AORUS MASTER is a good choice.
      Thanks for watching

  • @drengot1066
    @drengot1066 2 years ago

    really informative

  • @rudypieplenbosch6752
    @rudypieplenbosch6752 2 years ago

    Interesting information. So the bandwidth of PCIe 4.0 is hardly used at this moment, if an x16 slot and an x8 slot stay within 10% of each other. And we already have PCIe 5.0 while PCIe 4.0 is hardly fully used; maybe storage attached to the PCIe bridge makes better use of the available bandwidth.

  • @sanaksanandan
    @sanaksanandan 2 years ago +2

    Very useful test. I guess with older gen GPUs like RTX 2070 or 1660, the performance drop will be negligible.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      The performance difference should be about the same across GPU generations.
      Thanks for watching

  • @jinron24
    @jinron24 2 years ago

    Here are two questions: first, if two cards at x8 are only 5-9 percent slower, why? And second, is there a way to fit complex scenes on limited-memory cards?

  • @dan_hale
    @dan_hale 2 years ago

    Thank you very much for this video! I would really like to see this testing done again with RTX 3090s. Also, I'm curious about the responsiveness when updating a scene in the viewport; I would think the increased bandwidth at x16 would significantly reduce latency, making the viewport much more snappy.
    I'm currently running my dual 3090s with a 5800X; however, I'm planning on upgrading to TR Pro after watching this video. Thank you!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi Daniel, only one GPU is used when interacting with the viewport in 3D programs, so if you want to know the performance, just take out one of your GPUs; the remaining one will run at the full x16 on PCIe Gen 4 in your system. You can then compare against the two GPUs running at x8.
      Thanks for watching

    • @dan_hale
      @dan_hale 2 years ago

      @@MediamanStudioServices Thanks! I also wonder if the performance difference has more to do with the CPU/platform than the PCIe lanes. By the way, the custom build of Blender that I use utilizes all GPUs even in the viewport (it's quite spectacular), so I didn't even think to try this.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      @@dan_hale The Ryzen 5xxx CPUs are very fast and I do not think the CPU is a limiting factor in my tests; in fact, the 5900X is clocked faster than the 16-core TR Pro CPU I used. So I still believe the x8 PCIe is the limiting factor in this setup.
      I would also be very interested in your custom setup, as I know of no system that can use two GPUs in the viewport for interacting with the software like this. Can you supply more information on this?

    • @dan_hale
      @dan_hale 2 years ago

      @@MediamanStudioServices Download the latest Blender experimental Cycles X build. There are also various other builds of Blender that support multi-GPU in the viewport in different ways, such as K-Cycles and E-Cycles. I've switched entirely to rendering in Blender due to its unbelievable leverage of GPU power. It's seriously, seriously fast, and the new versions and custom builds are even faster.
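      For anyone who wants to check which devices their Blender build will actually render with, Cycles exposes its device list through Blender's Python API; a minimal sketch (run in Blender's scripting tab; assumes an NVIDIA setup with CUDA):

        import bpy

        prefs = bpy.context.preferences.addons["cycles"].preferences
        prefs.compute_device_type = "CUDA"  # "OPTIX" also works on RTX cards
        prefs.get_devices()                 # refresh the detected device list

        for dev in prefs.devices:
            dev.use = (dev.type != "CPU")   # enable every GPU, skip the CPU
            print(dev.name, dev.type, "enabled" if dev.use else "disabled")

        bpy.context.scene.cycles.device = "GPU"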

  • @andreasrichman5628
    @andreasrichman5628 1 year ago +1

    Does it affect southbridge temperature? Significantly, I mean.

  • @deadmikehun
    @deadmikehun 2 years ago

    Awesome video; however, I have a question. Do SATA drives use any PCIe lanes, or do they go through the dedicated x4 link that feeds the chipset? I have an ASUS B450-F Gaming motherboard that I currently use with a GTX 1070. I have ordered an RTX 3080 Ti and I want to use them simultaneously in Blender, but I'm not sure whether I should upgrade my motherboard. The motherboard manual says that in a dual-GPU configuration GPU1 would get x8, GPU2 would get x4, and the NVMe SSD would get x4. I'm also using 4 SATA HDDs for storage, but if they use any additional PCIe lanes beyond the 4 required for the chipset, I will lose a lot of performance on the GPUs. Is it even worth using a secondary 1070 with a 3080 Ti? It's still CUDA cores, though... Thank you!

    • @mikebrown9826
      @mikebrown9826 2 years ago

      SATA goes through your chipset and does not take CPU PCIe lanes. As for using the 1070: there are some advantages, but with only 4 lanes it may be better to just use the 3080 Ti at full speed with the 16 lanes.

  • @ColinCharles3D
    @ColinCharles3D 2 years ago

    Hey Mike, thank you for your videos; they help me a lot concerning my future build.
    Indeed, as a 3D artist, I am looking to upgrade the build that has lasted me through my studies since 2016.
    As I'll have a workstation at work, I am looking to build a personal desktop PC for 3D rendering (GPU-based Octane Render/C4D & Blender/Cycles), video editing (Adobe suite) and gaming (most likely 1440p, 34" ultrawide).
    My planned build would be:
    - 12th gen i7 (liquid-cooled AIO)
    - 64GB DDR4 RAM (DDR5 looks a little too expensive)
    - 2TB M.2 NVMe + 500GB SATA SSD
    - RTX 3080 10GB (+ GTX 1060 6GB from my current build)
    - 1200W PSU
    Which motherboard would you recommend that can handle 2 GPUs with at least two PCIe 4.0 or 5.0 x16 slots?
    There doesn't seem to be a lot of choice yet for the Z690 chipset that doesn't break the bank... maybe I should wait a bit?
    And do you think a 1200W PSU is enough for 2 GPUs and the overall build?
    Thanks a lot Mike, just subscribed; your content is priceless!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +3

      Hi Colin, no desktop MOBO will support 2 GPUs at x16; you would need to move up to an HEDT or Threadripper MOBO. You can still get good performance from an X570 running the GPUs at x8, and I would recommend the ASRock X570 Taichi for running two GPUs. That board has great VRMs to manage the power, and the spacing to house the two GPUs.
      Good luck with the build

  • @JohntheNX
    @JohntheNX 1 year ago +2

    Can you make a new test, but with PCIe 5.0? I'm planning to build a new rig with a 7950X and an X670 mainboard. I read in the manual that if I use the two PCIe 5.0 slots, they will each share at x8, but I'm not sure if there's any reduction, considering it's faster.

    • @jinomian01
      @jinomian01 1 year ago

      I would love to see this as well. I am considering a similar build.

  • @pascaljean2333
    @pascaljean2333 1 year ago

    So do the cards share memory? It almost sounds like dual 3060 12GB cards would be a sweet spot.

  • @dulichauch
    @dulichauch 2 years ago +1

    That's the info! Thank you so much! I want to build this for rendering and modelling in Blender, with the PSU and case chosen with upgradeability in mind. The 5950X is out of budget, so I hope the 5900X will be able to handle the mobo and two future GPUs. My list: Ryzen 5900X, X570 Taichi, 32GB Vengeance LPX, RTX 3060 plus a 2nd GPU in the future, Thermaltake Level 20 HT case, Deepcool 360EX, 2x WD SN750 NVMe, Seasonic Prime PX-1300. Thank you for any input.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +2

      Hi dulichauch, if you reduce the PSU to 1000 watts you could get some extra RAM; I would suggest 64GB. Even with two RTX 3080s at 350 watts each, that still leaves room for the 105-watt CPU.
      All the other parts are a good choice. The Taichi is a good pick for two GPUs.

    • @dulichauch
      @dulichauch 2 years ago

      @@MediamanStudioServices Oh, that's surprising! With beQuiet's PSU calculator, adding the water cooler, 2 GPUs, 3 additional fans, 2x M.2 and one SSD, I got 1000W of usage, so I bought the big PSU; I still think I'll keep it. For RAM: would it be totally absurd to order two separate LPX kits of 2x16GB for 170€, or is it strongly recommended to go with a single 2x32GB kit for 280€?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      @@dulichauch It was just a suggestion to save money and get the much-needed RAM required for creative workloads; more PSU is always better if you can get it.
      And I would get the 2x32GB and leave room for an upgrade later.
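      As a rough power-budget sanity check for a dual-GPU build like the one discussed (nominal board-power figures as assumptions; Ampere cards can spike well above them, so headroom matters):

        # Hypothetical component draws in watts -- substitute your actual parts.
        parts = {
            "RTX 3080 #1": 350,
            "RTX 3080 #2": 350,
            "Ryzen 9 5900X": 105,
            "board, RAM, SSDs, fans, pump": 100,
        }
        load = sum(parts.values())
        psu_watts = 1000
        print(f"estimated load: {load} W "
              f"({100 * load / psu_watts:.1f}% of a {psu_watts} W PSU)")
        # ~90% of capacity is tight; transient spikes are one argument
        # for keeping the larger PSU mentioned in the thread.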

  • @tay.0
    @tay.0 2 years ago +1

    I think most of the difference you are running into is caused by CPU differences between the two builds, since light cache and some global illumination still run on the CPU even during GPU rendering.

  • @hardwire666too
    @hardwire666too 2 years ago +1

    GPU prices are way down now, and the used market is starting to flood. I really want a 3090, mainly for the VRAM, but the prices on used GPUs are hard to ignore. I already have a 2070, and used ones are starting to pop up for $250-ish. A second one may not increase my VRAM or let me worry less about scene size, but if it can shave some significant time off my renders in Blender 3.x, I'd call it a win. Would be curious to see what you think.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Well, I think the 3090 would shave off quite a bit of time vs. the 2070. If you can get a 3080 or 3090 at a good price, then you should get one.
      Thanks for watching

  • @GuidoGautsch
    @GuidoGautsch 2 years ago +1

    This was just the video I'd hoped you'd make! Super interesting, thank you Mike. It seems that an X570 board would do the trick with relatively minimal sacrifice. There's a pretty significant price difference in Australia between the 3060 Ti (~$1200) and the 3070 (~$1700), so for a dual-GPU Blender rig, two 3060 Tis would be cheaper and faster than a single 3080 ($2300), 3080 Ti ($2500) or even a 3090 ($3300). Yes, GPUs are expensive down under. I was thinking of pairing them with an MSI Tomahawk X570, a Ryzen 5950X and potentially an 850W PSU?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Hi Guido, I was looking at the ASRock X570 Taichi when I built my system, but it was not available when I purchased my equipment. The Tomahawk is also a nice board.
      Good luck on the GPU purchase.
      Thanks for watching.

    • @nawafalrawachy394
      @nawafalrawachy394 2 years ago

      Hi Guido,
      Just a word of advice: if you have triple-slot (air-cooled) cards, it is recommended to have at least one empty slot between them so the top card won't be starved of air. I personally recommend the MSI X570 Godlike. Another thing: for dual 3060 Tis, a 1000-watt Gold or better PSU from a reliable maker like Seasonic or EVGA is recommended (EVGA GPUs are bad, but their PSUs are excellent). Generally, using dual GPUs in rasterization-based engines like EEVEE or Unreal Engine (by using multiple instances to speed up rendering / multi-instance rendering) will draw more power than a path-traced engine like Cycles or Octane would, thus tripping the OCP on your PSU. Speaking from experience. Hope this helps.

    • @GuidoGautsch
      @GuidoGautsch 2 years ago

      @@nawafalrawachy394 thanks, that's really good to know! That motherboard costs $1200 here 😵 more expensive than TRX40 boards; I could even get a WRX80 for that. So maybe I'll stick with a single GPU after all.

    • @GuidoGautsch
      @GuidoGautsch 2 years ago +1

      @@MediamanStudioServices thank you. The Taichi looks solid

    • @nawafalrawachy394
      @nawafalrawachy394 2 years ago

      @@GuidoGautsch It could still work with a regular-layout board if the cards are between 2.7 and 2.8 slots wide, though the top card will struggle a bit to get air, which will increase its temps. To combat this, I reduce the power limit on the top card to 95% and give it a custom fan curve to keep it from going above 80C.

  • @Andee...
    @Andee... 2 months ago +2

    I wish you included card temps

  • @pranavsuthar5864
    @pranavsuthar5864 1 year ago +1

    Hey mate! I am a bit confused about whether to buy a gaming PC or a workstation PC for CAD work. I searched custom-build PC websites and found that the Z690 chipset supports both workstation (Nvidia RTX A2000) and gaming (Nvidia RTX 3080 Ti) graphics. My question is: can I put both of these cards in the system at the same time, so that while working in CAD software it uses the workstation card (for compatibility, per the hardware certification) and while gaming it uses the gaming card?

    • @mikebrown9826
      @mikebrown9826 1 year ago +2

      Why use two GPUs? Just get the 3090 or 4090, as they also work well with CAD applications.

  • @Verlaine_FGC
    @Verlaine_FGC 11 months ago

    Does the gen of the PCIe slot factor into this as well?
    Will two GPUs that are meant to run in PCIe 4.0 x16 slots still take a performance hit if each is on PCIe 5.0 x8 lanes?
    So no matter what, the overall performance of the link is the highest MUTUAL rating between the slot and the GPU in it?
    So a GPU meant to run in a 4.0 x16 slot will be reduced to 4.0 x8 performance when mounted in a 5.0 x8 slot?

    • @omidnt2992
      @omidnt2992 7 months ago

      If your graphics card is PCIe Gen 4, then putting it in a Gen 5 PCIe slot will not increase performance, because your card only knows how to talk the "Gen 4" language. Now, if you put two PCIe Gen 4 graphics cards into two PCIe Gen 5 slots on a platform with only 20 or 24 PCIe lanes, you will have two PCIe Gen 4 x8 links.
      But for rendering or AI, that speed reduction won't impact your performance much.
      And for gaming, you will never want to use two graphics cards, because almost no game is coded to work with multiple GPUs. For gaming, maybe you could assign physics to one card and everything else to your main card, but as I said, don't do it for gaming: too much heat and electricity for little or no improvement (sometimes the heat will even reduce your gaming FPS).
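      The negotiation rule being described boils down to "each side settles on the highest generation and width both support"; a tiny sketch of that logic:

        # PCIe link training: generation and lane width each negotiate
        # down to the highest value supported by BOTH slot and card.
        def negotiated_link(slot_gen, slot_width, card_gen, card_width):
            return min(slot_gen, card_gen), min(slot_width, card_width)

        print(negotiated_link(5, 8, 4, 16))   # Gen4 x16 card, Gen5 x8 slot -> (4, 8)
        print(negotiated_link(3, 16, 4, 16))  # Gen4 x16 card, Gen3 x16 slot -> (3, 16)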

  • @crobar1
    @crobar1 2 years ago +1

    Interesting video

  • @theWanderer521
    @theWanderer521 2 years ago

    Great content as always! I don't know if this is worth making a video about, but I would love to see one on NVLinked GPUs: performance, installation, configuring the settings.
    I was creating a scene in Blender the other day and ran out of VRAM on my 3090, lol... I might have used the wrong settings, not optimized the scene well, or used higher-res textures than I should have. My future plan is to build a workstation with multiple GPUs (currently I have my eye on the H3 platform, but I might need more VRAM + GPU power to render). I'm using a 3970X with a 3090.
    Thanks!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi theWanderer521, I would love to do a video on two RTX 3090s; trouble is, I do not have two 3090s anymore. But if I can get my hands on two GPUs, I will do a video on setting up NVLink.
      Thanks for watching

    • @rakshithr4230
      @rakshithr4230 2 years ago

      @@MediamanStudioServices Waiting to see this kind of video; please make one.
      I have planned a PC build with an i9 12900K and 2x RTX 3090. I want to know what the performance would be with NVLink.

    • @rakshithr4230
      @rakshithr4230 2 years ago

      @@MediamanStudioServices In your experience, tell me: is that a good idea, or should I go with a workstation?

  • @DasunGunathilaka
    @DasunGunathilaka 2 years ago

    Hey, thanks for the video.
    I have a question:
    is there a difference in video quality when you export with hardware acceleration enabled (Nvidia, AMD, QuickSync) compared to just using the CPU for video decoding and encoding?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Hi Dasun Gunathilaka, there is always a difference when using a different encoder, be it hardware or software, but the difference should be very small and the quality should be about the same.
      Thanks for watching

  • @anyonecansayeverything
    @anyonecansayeverything 8 months ago

    good point

  • @lightning4201
    @lightning4201 1 year ago

    Would the results be comparable if it were 3090s instead of the 3070s?

  • @arasdarmawan
    @arasdarmawan 2 years ago

    Hey Mike. Is there a way to stack six RTX 3080/3090 blower GPUs on a single workstation motherboard? I've also been looking at mining motherboards (which can fit 6-12 GPUs), but they reduce the PCIe link speed to x1. Your video here showed very little difference between x16 and x8, so I'm curious whether x1 is pushing it way too far for a GPU render farm? Need your advice.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +2

      My question to you is: why would you want to do this? Just buy a GPU server for this task; Gigabyte makes a great server for just this style of workflow. Anything over 3-4 GPUs in one workstation requires a server chassis.

  • @elaymondes7205
    @elaymondes7205 6 months ago

    Hi,
    First of all, thank you very much for the great video. I'm currently planning to put together a workstation for Unreal Engine with two RTX 4090 cards on a motherboard for the i9 14900K chip.
    Do you have a good tip for which motherboard can run both slots at full speed without splitting the lanes?
    I've been searching for days, but unfortunately without success, because I'm not that deep into the subject.
    Thank you
    LG

    • @oscarh.405
      @oscarh.405 16 days ago

      Look for MSI's top-of-the-line mobos. The Godlike and some less expensive ones can do that, but they're still around $700+ USD.

  • @TheBlueVaron
    @TheBlueVaron 1 year ago +1

    Would DirectX 12 use both GPUs as one for gaming?

  • @superstite21
    @superstite21 2 years ago

    Great video! How does rendering software handle two GPUs in a non-SLI/CrossFire/NVLink configuration vs. an SLI/CrossFire/NVLink one?
    Thank you

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +4

      When I get some more GPUs I will test this out and do a video.
      Thanks for watching

  • @ahmedsouody1765
    @ahmedsouody1765 1 year ago

    Thanks for the great content. What is the power supply capacity of the desktop?

  • @user-rw8kl7xy5n
    @user-rw8kl7xy5n 4 months ago +2

    Thank you for this video; it helped explain a lot. I have a dual-GPU setup with two 4070 Supers. My current board has one PCIe x16 5.0 slot and one PCIe x16 4.0 slot. I am in the process of getting a new motherboard with two PCIe x16 5.0 slots. Would this improve performance?

    • @wtfskilz
      @wtfskilz 4 months ago

      It would increase; however, the cards are still going to run at x8.

    • @majus1334
      @majus1334 1 month ago

      No. The 4070 Super runs at PCIe 4.0; your motherboard will downgrade the lanes to match your GPU, so PCIe 5.0 would be disabled/downgraded to PCIe 4.0.

  • @ChrisDallasDualped
    @ChrisDallasDualped 2 years ago +1

    Hey Mike, I'm looking to build a PC for a workflow mainly in Premiere Pro and Filmora. Is it wiser to go for an RTX 3090 or two RTX 3070s; which would benefit me the most? I don't care about gaming; this is strictly for my workflow. Also, which boards and which RAM would you recommend? Obviously faster RAM makes a difference? What about the hard drive? I would prefer ones with at least 3K write speeds. And which water-cooling system should I use? I've also narrowed it down to either the Ryzen 5900X or 5950X, both of which I have access to. I await your reply, and thanks a million for your hard work.

    • @GuidoGautsch
      @GuidoGautsch 2 years ago +1

      My understanding is that Premiere Pro isn't able to take advantage of GPUs efficiently and is more CPU-limited, so even a single 3070 would be fine. Invest in the 5950X out of those two options. If you were to drop PrePro for Resolve, however, two 3070s would beat the single 3090 by ~30%.

    • @ChrisDallasDualped
      @ChrisDallasDualped 2 years ago +1

      @@GuidoGautsch thx for the input. I hate Resolve, btw; Adobe is so much easier to work with, and so is Filmora. Yeah, the 5950X is what I am aiming for.

    • @GuidoGautsch
      @GuidoGautsch 2 years ago +1

      @@ChrisDallasDualped yeah, I hear you. I prefer PP over Resolve as well for editing; I just wish it'd take advantage of GPU grunt as effectively as Resolve does.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      I agree on PP, as it is my choice for editing the Mediaman videos. I just wish Adobe would add better GPU support.

  • @georgelevantovskij8593
    @georgelevantovskij8593 2 years ago +1

    This is extremely valuable info! Thank you for your content!
    I have the following setup: Ryzen 5950X, ASUS Prime X570-Pro, 64GB of RAM (DDR4 Kingston 3200MHz), ASUS TUF RTX 3060 (12GB), 750W PSU.
    Do you know if it is possible to add a second GPU (I have a leftover ASUS RTX 3060 Ti), and how it would affect my workflow and render times in the following programs: Blender, After Effects, Premiere Pro and also Unreal Engine 5? I was also thinking of getting a bigger PSU (1000W).
    Thank you, sir, for your time and tips! This channel is pure gold!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +4

      Hi George, adding the second GPU will limit the usable VRAM on the 3060 (12GB) to the same amount as on the 3060 Ti (8GB), as 3D render software sends two packages, one to each GPU, and they need to be the same size. As for After Effects and Premiere Pro, you will not get much benefit, as both of these programs have very little multi-GPU support. Same with UE5.
      Thanks for watching
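      The rule of thumb being described is that every card participating in a render must hold a full copy of the scene, so the smallest card sets the budget; a tiny illustration (simplified; out-of-core renderers can relax this):

        # Usable per-card scene budget when pairing mismatched cards.
        def scene_fits(scene_gb: float, vram_gb: list[float]) -> bool:
            """Every GPU needs the whole scene, so the smallest card decides."""
            return scene_gb <= min(vram_gb)

        print(scene_fits(10.0, [12, 8]))  # 3060 (12GB) + 3060 Ti (8GB) -> False
        print(scene_fits(7.0, [12, 8]))   # fits on both cards -> True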

    • @georgelevantovskij8593
      @georgelevantovskij8593 2 years ago +1

      @@MediamanStudioServices Thank you. Yes, I decided to build a second backup rig, which will help me with Blender rendering.

    • @bobbywiederhold
      @bobbywiederhold 2 years ago

      @@MediamanStudioServices Is this the same for DaVinci Resolve? DaVinci is much more GPU-reliant.

  • @TheMetalmachine467
    @TheMetalmachine467 1 year ago +3

    Try fitting two Nvidia 4090 FE cards in, lol

  • @jeremiahMndy
    @jeremiahMndy 1 year ago +1

    I'm about to run 2x RTX 3090 FE in an NVLink configuration for a total 48GB VRAM pool. I'll let you know how it goes; it would be cool to see this on your channel before I rig it up for Blender and iClone.

  • @TheGamingREZ
    @TheGamingREZ 2 years ago +2

    Guys, could someone help?
    I have a Gigabyte Z690 Aorus Elite DDR4 motherboard with an RTX 3080 in it.
    Is it worth it to put another one in (I already have another 3080, and also a 1070 Ti)? I use the PC for gaming / video editing.

    • @mikebrown9826
      @mikebrown9826 2 years ago

      For gaming, it's not going to help you, as you cannot SLI/NVLink the GPUs. For video editing, only Resolve can use multiple GPUs.

  • @janndee6826
    @janndee6826 8 days ago

    So you can run two identical GPUs without using NVLink or SLI?

  • @ozztheforester
    @ozztheforester 2 years ago

    Thanks, this is very informative! Would it be possible for you to conduct a similar test using TB3 eGPU solutions, daisy-chained?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi ozztheforester, if I can get my hands on such equipment I will do a video. Here is a link to some reviews:
      egpu.io/best-egpu-buyers-guide/

  • @TheMrJackzilla
    @TheMrJackzilla 2 years ago

    Great videos. Can you make one talking about PCs and workstations for architectural work?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      Hi Lucas, this is a great idea for a video. Let me look into the specs required; I am thinking it will mostly be the same as for 3D and VFX computers, but I will do some research to confirm this.
      Thanks for watching.

    • @TheMrJackzilla
      @TheMrJackzilla 2 years ago +1

      @@MediamanStudioServices thanks, it will be very helpful. I really miss content for small 3D artists; you have been helping me a lot!!!

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      @@TheMrJackzilla Glad to be of assistance. Please share on your FB page to help grow the channel.

  • @antoniosomera
    @antoniosomera 2 years ago +3

    Thanks! Great information and tests! I have one doubt: is it possible to render in V-Ray with RT cores and two RTX GPUs from different generations (e.g., RTX 2080 + RTX 3090) or different models (e.g., RTX 3090 + RTX 3080), obviously without NVLink or SLI? Thanks in advance!

    • @TILK3D
      @TILK3D 1 year ago +2

      @andrew I have the same question; maybe you already got an answer?

    • @antoniosomera
      @antoniosomera 1 year ago

      @@TILK3D I haven't resolved my specific doubt regarding two RTX GPUs from different generations (e.g., RTX 2080 + RTX 3090), but I have found this: 1. You can render with several Nvidia GPUs using V-Ray, and V-Ray will manage to use the available CUDA cores in the GPUs (this should work with CUDA cores of the same GPU generation even if they are different models, e.g., RTX 3090 + RTX 3080), but I still don't know if different CUDA core generations can work together (e.g., RTX 2080 + RTX 3090); I have an RTX 3090 and a 980 Ti, so I need to run some experiments. 2. It is not necessary to use SLI or NVLink in order to use all the available CUDA cores of same-generation cards. 3. NVLink will help share the GPUs' RAM, but the improvement in rendering time is small (+/- 10%). 4. I'm deducing (I still don't have evidence) that you can render with V-Ray using the RT cores of several RTX GPUs of the same generation just like the CUDA cores (e.g., RTX 3090 + RTX 3080), also without NVLink; V-Ray will manage to use the available RT cores. 5. I don't know if you can render with V-Ray using different generations of RT cores (e.g., RTX 2080 + RTX 3090). Hope this is useful to you; let me check the sources where I found this information and I'll share them here. Please let me know if you have found something else. Best regards.

    • @TILK3D
      @TILK3D 1 year ago +1

      @@antoniosomera Thank you for the information. I will keep looking for more, and if I resolve my doubt I will likewise share it. Greetings, success!

    • @Outrider42
      @Outrider42 1 year ago +1

      You can use any combination of supported GPUs. The CUDA cores will stack; the VRAM will not. So if you use a 2070 with a 3090 and the scene exceeds the 2070's VRAM, performance will suffer. But aside from this, you can mix and match however you want, even across generations.
      This should be the case with most CUDA-based renderers.

    • @antoniosomera
      @antoniosomera 1 year ago

      @@Outrider42 Thanks!

  • @mohammedumar2348
    @mohammedumar2348 1 year ago +1

    Can you make a tutorial video on how to set up two different graphics cards (an RTX 4090 and a 4080) in one PC?

  • @eriuz
    @eriuz 2 years ago +3

    I think you don't see a big hit in performance because on the X570 you have two PCIe 4.0 x8 slots, which equal two PCIe 3.0 x16 slots. The big hit will come when you use two GPUs that are PCIe 3.0, or a board that doesn't support PCIe 4.0.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Hi Eriuz, I just assumed that most people looking to get two GPUs would be using a current MOBO.
      But you are correct; the older MOBOs use PCIe 3.0 and have lower bandwidth.
      Thanks for watching

  • @TILK3D
    @TILK3D 1 year ago

    Hello, good video. Sorry, is an SLI bridge needed to join the two graphics cards for 3D rendering, or can they be used simply by connecting both cards to the board? I have a 3090 Ti and a Zotac 3080.

    • @mikebrown9826
      @mikebrown9826 1 year ago +3

      No SLI or NVLink is required to render on both GPUs.

    • @TILK3D
      @TILK3D 1 year ago

      Thank you very much for answering and clearing up my doubt. Greetings and success!

    • @mockingbird1128
      @mockingbird1128 1 year ago

      @@TILK3D So I can't use two RTX 2060s?
      They don't support SLI.

  • @myzingonline
    @myzingonline 1 year ago

    I also want to run multiple GPUs. I have a 1660 Super + a 9600 GT + the motherboard's UHD 630 graphics, with a 650-watt supply... do you think I can run all of them?

    • @constantinosschinas4503
      @constantinosschinas4503 1 year ago +1

      I tried with a GTX 1050 Ti + GTX 970 and it was a nightmare. No matter the setup or drivers, the PC was completely unstable and would not function properly.

  • @johnjr.8824
    @johnjr.8824 2 years ago

    Thanks; I was going insane not knowing why my 3080 was running at PCIe 3.0 x8 instead of x16.
    Two M.2 SSDs and an Intel i9-10900K chip. I can't get it to PCIe 4.0, since the chip doesn't support it.

  • @karishmaansari7050
    @karishmaansari7050 2 years ago

    If we use two 3070s with 3ds Max, do we get double the viewport performance of one 3070? And would the VRAM double, 8+8?

    • @countdadcula4475
      @countdadcula4475 1 year ago +1

      No. The VRAM does not double, even in SLI, but as long as your scene can be rendered with 8GB of VRAM or less, you will see almost 2x the speed when rendering.

  • @adamfilip
    @adamfilip 1 year ago

    Would like to see what happens when one GPU is at x16 and the 2nd one is only at x4.

  • @exarkun8250
    @exarkun8250 1 year ago +3

    If you use one of the GPUs for gaming, will having 2 GPUs affect its performance?

    • @UTFapollomarine7409
      @UTFapollomarine7409 10 months ago

      In some cases yes and no. There are maybe 100 games, more or less, that actually have multi-GPU support; some are AMD, some are GeForce, and some are both. Thankfully a few of my favorite games support two GPUs from both sides: Fallout 4, Fallout NV, Fallout 3, Deus Ex: Mankind Divided. So I would like to try two GPUs and run them at 8K. I believe a few of the Metro games work too, plus Tomb Raider and also FOX Engine games like Metal Gear, so I've been wanting to aim for two GPUs.

  • @divyasjj1136
    @divyasjj1136 1 year ago +2

    Hey, I want to know: is it possible to use a 1660 Super and a 3060 Ti in the same PC? I want to use it for Blender rendering; can I render with one graphics card while working with the other one?

    • @8D2BFREE
      @8D2BFREE 1 year ago +1

      You can use two different models working together, but I never tried a 1660.

    • @swdw973
      @swdw973 11 months ago

      Blender will target the card with the least memory so it doesn't crash. My understanding is that if you have a GPU with 8GB and one with 12GB, it will treat both as 8GB GPUs when assigning rendering. The Blender forums will be able to give you a definitive answer.

    • @viniartee
      @viniartee 4 months ago

      @@swdw973 It appears that when the less powerful GPU runs out of memory and your system has ample RAM, Blender can fall back on system RAM to support that GPU, so the more powerful GPU can continue to operate at full capacity. For instance, if you have a 3090 and a 3060 Ti in a system with 64GB of RAM, Blender might use the system RAM to let the 3060 Ti keep pace with the 3090. Faster RAM enhances this, and fast storage (e.g., a well-configured RAID with decent drive speeds) would help too. This setup should allow Blender to function smoothly without constraining the more powerful GPU, correct?

    • @swdw973
      @swdw973 4 months ago

      @@viniartee System RAM will never get anywhere near the speed of VRAM. My answer was based on info from a Blender developer on one of the Blender forums.
      I actually run two different GPUs, a 3060 and a 4070, but I made sure they are both the 12GB versions, because of what the developer told me.

  • @IamJeshua
    @IamJeshua 2 years ago +6

    Is there a problem if I use different cards? I have a 3070 Ti but I want to buy a 3080 Ti, all mounted on a Z690 motherboard with an i7 12700F.

    • @studyjourneyofficial
      @studyjourneyofficial 2 years ago

      Following!

    • @KazCanning
      @KazCanning 2 years ago +1

      There shouldn't be any issue... I have an RTX 2080 and a 3080 Ti, both running at x8. I was actually shocked that it worked perfectly immediately; I literally didn't have to do anything, just plugged the card in, turned on my computer, and it just worked. Shocking.

    • @studyjourneyofficial
      @studyjourneyofficial 2 years ago

      @@KazCanning I have two 3070 Tis back to back connected to my mobo. Do you think they're going to overheat once I put them under load? If so, what should I do to cool them down better? I have a Corsair 680X case. Thank you!

    • @KazCanning
      @KazCanning 2 years ago

      @@studyjourneyofficial Hard to say. I know my 3080 Tis get pretty hot when working on renders. I also have the Founders Edition cards, so they have the pass-through heat fins, basically blowing hot air onto the top card.

    • @ShawarmaBaby
      @ShawarmaBaby 2 years ago

      @@KazCanning Which PSU do you have? I have a 3080 and a 3090 with a 1300W, and when I render, my monitors go off.

  • @frankbrancato3997
    @frankbrancato3997 1 year ago

    Can you say what motherboard you use in the desktop? Did you have any heat problems with the two boards being so close to each other? I'm running a 3080 and a 3070 Ti and they run at about 70C.

    • @powerso1380
      @powerso1380 1 year ago +1

      After going down the rabbit hole by watching two other videos from this channel, "Water cooling for Content Creators (Part 1)" (box in the background) and "How to bend PETG tube, perfect bends every time! Water cooling part 2" (closer look), I guess the motherboard is the X570 AORUS PRO WIFI. Hope this helps.

    • @vucha340
      @vucha340 1 year ago

      Can I use an RTX 2060 and a 3080 working at the same time, and would I see better fps in games? Like a small increase in fps, or a huge one?

    • @zonaeksperimen3449
      @zonaeksperimen3449 1 year ago

      @@vucha340 No, you can't get extra performance; in a non-SLI setup you get single (main) GPU performance in games. I used dual RTX 3060s.

  • @NarekAvetisyan
    @NarekAvetisyan 2 years ago

    I'm using an RTX 3070 with a Ryzen 2700X on a B450 motherboard. It's running at x8 because I have a second GPU. I've measured about a 2-3% performance drop at x8. I also have it overclocked +200MHz on the core and +1300MHz on the memory with the power limit set to 100%, and I gain a 14% performance boost on my Blender renders.

  • @FelixTheAnimator
    @FelixTheAnimator 1 year ago +1

    Does the mobo HAVE to be CrossFire/SLI-ready to run two GPU cards?

    • @mikebrown9826
      @mikebrown9826 1 year ago

      No, it just has to support x16 or x8 PCIe, preferably directly from the CPU.

  • @elysiumcore
    @elysiumcore 2 years ago

    I hope we get more lanes on AM5 boards

  • @Visokovart
    @Visokovart 2 years ago

    Hey, I have a problem connecting two 3090s through the NVLink bridge. I use Blender 3.0 stable to render my work.
    B550M Pro4 motherboard
    5900X processor
    I cannot find information on the Internet on how to correctly connect the bridge and speed up Blender. Please help.

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      Your motherboard needs to support NVLink. But have you tried to run a render with the two separate GPUs? Connecting with NVLink is not necessarily going to speed up your rendering.

    • @djuansayjones6178
      @djuansayjones6178 2 years ago

      @@MediamanStudioServices Hey, I saw that the 3080 or lower only uses about x8 of bandwidth, but the 3090 uses more and would lose a lot of performance. Have you tested this out yet? Or do you know if it's true?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago +1

      @@djuansayjones6178 I am not sure of the PCIe bandwidth usage of each card, but I'll make an educated guess here: the RTX 3080 does use more than x8 of bandwidth at full performance, so yes, that claim is untrue.

    • @djuansayjones6178
      @djuansayjones6178 2 years ago

      @@MediamanStudioServices ok thanks a lot!!!

  • @iledawn9282
    @iledawn9282 2 years ago

    Does the VRAM add up that way to 16GB, whether it's a desktop or a workstation?

    • @MediamanStudioServices
      @MediamanStudioServices 2 years ago

      You mean the VRAM on the GPU? It is physically on the GPU card itself, so it's the same no matter what computer you plug it into.

  • @SubirajSinghFitness
    @SubirajSinghFitness 2 years ago

    Can you please tell me the system requirements for a simple RTX 3060 graphics card? I have quite an old desktop, let's say a 5th gen Intel i5 with a motherboard just as old. Would this graphics card fit in it, or would I need to change my motherboard and processor as well, to a 10th-11th gen?

    • @seanspliff
      @seanspliff 2 years ago +1

      Idk if it helps, but I use a 3060 Ti with an Intel i7 and a 6-year-old motherboard.

    • @SubirajSinghFitness
      @SubirajSinghFitness 2 years ago

      @@seanspliff Is your i7 processor itself 6 years old?

    • @seanspliff
      @seanspliff 2 years ago +1

      @@SubirajSinghFitness Yup, bought it in a prebuilt with the same motherboard. My motherboard is a Gigabyte X99 UD3.

    • @SubirajSinghFitness
      @SubirajSinghFitness 2 years ago +1

      @@seanspliff thanks bro 🍻