Here's WHY You Should BUY Intel ARC GPUs ... and NOT!

  ‱ Uploaded 14. 06. 2024
  ‱ Friends from ASROCK sponsored this video about their GPUs, so I took the chance to test out their ARC GPUs and see if they're worth it in 2024/25. Well... there's GOOD NEWS! 👇
    ✅ ASROCK ARC A770 geni.us/ASROCK-A770-16G
    Other A770s: geni.us/t8UHuRA
    ✅ ASROCK ARC A750 geni.us/ASROCK-A750
    Other A750s: geni.us/LQgv
    ✅ ASROCK ARC A580 geni.us/ASROCK-A580-8G
    Other A580s: geni.us/SJIrDqN
    ✅ ASROCK ARC A380 geni.us/ASROCK-A380-Ch
    Low profile A380: geni.us/ASrock-A380-LP
    Other A380s: geni.us/CBu2
    ❗Videos to check out:
    1. Best GPU for Premiere Pro ‱ SAVE YOUR MONEY! 👉 Bes...
    2. Best GPU for Creators ‱ REVEALED: 40+ GPUs tes...
    đŸ‘‡â—ïžPC Builds for Creators [Latest]â—ïžđŸ‘‡
    âžĄïž $750 BB4B (Best-Bang-For-Buck) PC geni.us/750-1300BB4B
    âžĄïž $1500 BB4B Creator PC geni.us/1500-2300BB4B
    âžĄïž $2500 BB4B Creator PC geni.us/2500-3500BB4B
    âžĄïž $4000 BB4B Creator PC geni.us/4000-5kBB4B
    Want me to check out some tech or interested in collaborating?
    đŸ“© email me: collab@technotice.com
    ✅ LATEST MERCH HERE: www.technoticestore.com
    👉 Advice & 100% Answered on Minnect: geni.us/MinnectLP
    FB: / thetechnotice
    IG: @thetechnotice
    #ASROCK #arc #Intel #GPU
    Video produced by Lauri Pesur
    Edited by Sam Ruddick
    -------------------------------------------------------------------------------------------------------------
    GEAR USED IN THIS VIDEO:
    [All 'geni.us' links below redirect to Amazon (or other online stores) and may earn commissions]
    AUDIO:
    -The Recorder + Mic I Use: geni.us/tscmdr-10l
    -Shotgun Mic I Use: geni.us/T35T
    -My Main Headphones [CHEAP]: geni.us/dzv7Q
    -My Main Headphones 2.0 [NOT CHEAP]: geni.us/as7q9v
    VIDEO:
    -A-roll Camera: geni.us/NvnoA
    -Lens for A cam: geni.us/J009I
    -B-Roll Camera: geni.us/wVlF8J
    -Top Down Cam: geni.us/56576C
    -Main Bokeh Lens 4 Sony: geni.us/k09oA
    -The Wide Angle beast w OS: geni.us/9KfCwy
    LIGHTS:
    -Main Fill Light: geni.us/qTbE
    -Hair & RGB Lights I use: geni.us/TdY1
    -RGB Tube Wall Light Behind me: geni.us/96gWdz8
    -The Fire effect light: geni.us/GX5KA
    DISCLAIMER: Some of the links in this video description & channel are affiliate links which means by clicking on them and purchasing the item(s) I get a small commission. As an Amazon Associate I earn from qualifying purchases. Using the provided links does NOT change the price for you. Same goes for Newegg, Best Buy, B&H.
    âČ
    0:00 Intro
    0:43 Which GPUs are we testing?
    1:27 Adobe Photoshop Results
    3:07 Lightroom Classic
    3:52 Drivers VS New Software
    6:12 Premiere Pro Results
    8:05 WOW! - Long GOP (H.264/h.265) results
    10:33 After Effects Results
    12:09 DaVinci Resolve Results
    13:20 Bad News - 3D
    13:50 Blender Performance
    14:35 Gaming...
    14:55 Why AV1 encoding is AWESOME!
    15:25 Hyper Encode... 80% awesome
    16:35 Quick Sync on AMD Platform
    17:28 The BEST NEWS!
    19:17 Which GPU would I get/suggest?
    20:17 Concluding thoughts!
  ‱ Entertainment

Comments ‱ 146

  • @theTechNotice
    @theTechNotice  28 days ago +32

    Disclaimer: Sponsored by ASROCK - What does that mean?👇
    Friends from ASRock reached out to sponsor a video to talk about their ARC GPUs for gaming etc...
    I said: "I'm not talking about gaming, but I could talk about/test what they're like for creators." They said: "Go for it."
    I did my independent testing (like with all GPUs) and shared the results. ASROCK did preview the video before it was published, and the only thing they asked to change was: "Can you put the Intel ARC [TM] logo in the top left corner please."
    What I'm trying to say is they didn't tell me what to say; in fact it was the other way around, I said why don't we talk about X, Y & Z. But I did get paid to focus on ASROCK ARC GPUs and not mention or show [but still talk about] other competitor brands in the video.
    Wanted to be upfront about it, coz I know how 'sponsored videos' can come across and this was a collaboration with very 'loose' sponsorship guidelines. 😊

    • @paulboyce8537
      @paulboyce8537 26 days ago +2

      Thank you for the honest review. I find a lot of sponsored channels leave the most important things out to shift the blame or just to make their sponsored product look good. 99% seem to fall into "misinformation" when something is left out on purpose. Mind you, there are some channels that will give you the truth no matter what, but not many. One point I think you missed was Intel Application Optimization (APO), which is upcoming and should get more attention, because when Arc is paired with a high-end Intel CPU there is potential for a lot more in the near future. Also, I think a lot is still on the table for Intel pairing, which we are starting to see in the performance gap between AMD and Intel CPUs.

  • @GustavoSanchez64
    @GustavoSanchez64 26 days ago +29

    The one issue with Arc cards that no one is pointing out, which will be a problem for users with high-end monitors, is that they cannot output native 10-bit depth over DisplayPort.

    • @nicktayloriv310
      @nicktayloriv310 25 days ago +3

      Ah... I didn't know that. I have three LG 32QN650-B 32-inch QHD monitors, which are 2560 x 1440. I don't think they qualify as "high end", but you've given me something to think about.

    • @GustavoSanchez64
      @GustavoSanchez64 25 days ago +5

      @@nicktayloriv310 Since you have a 10-bit panel, you wouldn't be able to send a native 10-bit signal to your display, so you would have to rely on turning on HDR in Windows, which tends to look like crap. I built my PC mainly for photo work, so I have a 4K color-calibrated monitor, and not being able to send native 10-bit to it defeated the purpose of having such an expensive display in the first place. I sent my A770 back and ended up getting a refurbished RTX 3090 from Micro Center for $600.
      What's so dumb from Intel is that they advertise DP 2.0 ports but then don't support 10-bit, which is so backwards.

    • @survivor303
      @survivor303 25 days ago

      @@GustavoSanchez64 try with 4:2:0 :)

    • @mrbabyhugh
      @mrbabyhugh 23 days ago +1

      This is why my setup will be an A750 + 4070 Ti Super. I should check if the monitor I got can do 10-bit lol; I didn't check for that, I focused on color accuracy, PPI, QHD, 27". I had a rough time deciding between the HP Z and the ASUS ProArt (the ruler is interesting, but I don't do as much print anymore). My OCD made me choose the HP Z because my laptop is HP, though the mobo I bought is ProArt.

    • @nicktayloriv310
      @nicktayloriv310 6 days ago

      @GustavoSanchez64 My Asus 4080 Super arrived yesterday. Started to get a 4070 Ti Super but changed my mind at the last minute. 999 direct from Asus.

  • @dreadtrain2846
    @dreadtrain2846 26 days ago +10

    It's crazy that Rob Dyrdek got into YT computer vids. Guy does it all!

  • @dwanejeff196
    @dwanejeff196 26 days ago +9

    I have subscribed and followed you since my first viewing of an earlier video... the $600 Apple Wheels, which had just come out at that time. You made a video introducing yourself and what you could do. You put together a computer for the same price as the Apple Wheels; impressive.
    You lose me when you go the 'expensive' route... but you get me back when you include my price range: "budget friendly." The reality of the money situation is sometimes a rude awakening: you want the best, fastest, greatest... and in reality you cannot even save $100. So... please keep up with what you are doing! Keep both, though... your high-end money builds along with the budget ones... I enjoy both!

  • @cheekster777
    @cheekster777 26 days ago +10

    Thanks for your transparency in the first comment. đŸ‘đŸ»

  • @PuR3OwNaGe45
    @PuR3OwNaGe45 26 days ago +27

    I think the optimal budget setup would be a dual-card setup with an AMD card for gaming and an Arc for editing, if that's what you're looking for.

    • @theTechNotice
      @theTechNotice  26 days ago +10

      I tried it on the threadripper build, didn't work so well...

    • @TC-hl1ws
      @TC-hl1ws 26 days ago +4

      @@theTechNotice Did you make a YouTube video on this dual-card experiment?

    • @StroggKingu
      @StroggKingu 26 days ago +6

      How is buying two cards "on a budget", and which two were you thinking of?

    • @nicktayloriv310
      @nicktayloriv310 26 days ago

      @@TC-hl1ws Yes he did. Put Tech Notice Threadripper dual GPU in the search bar and... there it is.

    • @nicktayloriv310
      @nicktayloriv310 26 days ago +3

      "I Built the Ultimate ALL AMD HEDT PC build in 2023... | Threadripper 7980x + RX7900XTX + ARC A380" is the title of the video.

  • @AY-gf3jq
    @AY-gf3jq 26 days ago +7

    I hope DaVinci Resolve updates their app to take full advantage of these beasts so we aren't compelled to use Premiere Pro when we have these cards.

    • @survivor303
      @survivor303 25 days ago

      They likely won't do that; they even dropped some perfectly good AMD GPUs from the Linux version of the product... About Windows, I don't know :)

    • @iamakkkshay
      @iamakkkshay 24 days ago

      @@survivor303 Linux has no licenses for HEVC and many other codecs, while on Windows you can purchase or install them.
      DaVinci Resolve will take some time but will support the Intel media encoder/decoder eventually đŸ€ž

  • @MAFSTUBE7565
    @MAFSTUBE7565 24 days ago +2

    Hey, how is the Arc A770 in productivity work like 3D animation, game dev (Unreal Engine 5), video editing, and AI compared to the RTX 4070? The 4070 might be better, but which one is better value for money?

    • @LuccianoNova
      @LuccianoNova 3 days ago +1

      After my intensive study of current GPUs, the 4070 Super for $600 is the best purchase for overall performance per price, that is, if you want optimized frame gen. If you don't care about AI upscaling, go AMD.

  • @JuanflimIt
    @JuanflimIt 22 days ago +1

    I use DaVinci Resolve. Should I get the 7900 XT or the 4070 Super? I mainly edit Sony FX3, FX30, and FX6 codecs (the XT is $125 more).

  • @R3nnan
    @R3nnan 4 days ago

    The most interesting thing about my A770 phantom lately is how often I simply forget I'm running it. It works just fine in everything that I tried. Hopefully Intel will reward their early adopters with drivers and improvements for many years.

  • @outlawkings6133
    @outlawkings6133 26 days ago

    Hello. Maybe my comment will be outdated, but I really need help. Right now I'm looking for a dedicated streaming PC using AV1, and I have two options at almost the same price. The first is to build a PC with 16GB RAM, an i5-12400F, and an Arc A380. The second is a Lenovo LOQ 15IAX9I laptop with an i5-12450HX, 8GB RAM, and an Intel Arc A530M 4GB. The RAM can be upgraded in the laptop; it has 2 slots. The only reason I haven't chosen the laptop already is that I'm not sure the A530M with 4GB is enough for 1080p/60 or 1440p/60 AV1 streaming. Do you have any thoughts on this? The laptop is preferred because I'm always on the go.

  • @stephenxs8354
    @stephenxs8354 26 days ago +3

    As an encoder for Plex/Jellyfin/Emby there is no better value. Ordered my first ARC (380). Can't wait to play with it. If only the higher models weren't so high TDP.

  • @user-wp1be7ty7q
    @user-wp1be7ty7q 26 days ago

    Which is best for a trading computer running software like MT5 with multiple chart windows?

  • @nightdonutstudio
    @nightdonutstudio 26 days ago +2

    When does the next-generation card come out? If they release a new card in the $500-700 price range, that would be great for content creators.

  • @no1seeverywhere77
    @no1seeverywhere77 26 days ago +1

    To be honest, it's good to see more competition. Prices are crazy for anything decent from the big two GPU makers. Great review, thanks.

  • @MartinFransson
    @MartinFransson 1 day ago

    Sooo, for someone who only uses DaVinci Resolve Studio and Lightroom/Photoshop and has the budget for an 8 GB RTX 4060: would the Arc A580 8 GB or an A750 8 GB be a good choice? The rest of my system is an i7-8700K and 32 GB of DDR4 on an Asus Z390-A. No gaming and no 3D work.

  • @mrbabyhugh
    @mrbabyhugh 23 days ago

    1:35 In the last video you did on GPUs, YOU made me decide to build with 2 GPUs: an Arc (I think I've settled on the A750 LE) for my Photoshop & Premiere Pro workload (aka daily driver), and when it's time to hit some UE5, SD/ML, or gaming... it's the 4070 Ti Super (either ASUS ProArt or MSI Expert).

  • @EROS-SENPAI
    @EROS-SENPAI 12 days ago +1

    Thanks, sir.
    I just wanna know: can I choose the A770 LE over the 4060?
    I mean, I am learning editing and do a little motion graphics. I am so confused because it's huge money for me (as an investment).

    • @xarisathos
      @xarisathos 9 days ago

      The A770 LE 16GB; the VRAM will help a lot and give headroom for experiments. 8GB in 2024 is a no-go for CG graphics anymore.

    • @EROS-SENPAI
      @EROS-SENPAI 8 days ago

      @@xarisathos Thanks a lot!
      As I can't wait for the upcoming Battlemage GPUs, I guess for right now the Arc A770 is my best option within the budget.
      Thanks again 🌾

  • @bruh-qi7ue
    @bruh-qi7ue 26 days ago

    So if they don't upgrade the dual media engines inside the GPU, the performance in Premiere Pro would be the same on a newer GPU? I'm waiting for a newer Intel GPU, but I don't know if I should, as there are no rumours about it.

    • @welshe222
      @welshe222 26 days ago

      Computex is on the 4th, hold out (:

  • @rangersmith4652
    @rangersmith4652 26 days ago

    I'm a newb hobbyist video editor and maybe streamer, and I'm using an A770 paired with a Ryzen 5600X (which I already had) to learn. The A770 is more than good enough for both use cases, and I suspect it will keep getting better. My most recent

  • @cheekster777
    @cheekster777 26 days ago +1

    Thank you my friend.

  • @MrMikegage
    @MrMikegage 26 days ago +1

    Would this be a good option for a dual-card setup with the i9-14900K and the ProArt Z790?

    • @theTechNotice
      @theTechNotice  26 days ago +2

      Nope, I don't recommend dual Arc :)

    • @MrMikegage
      @MrMikegage 26 days ago

      Thanks for the quick reply 😁

    • @wrayr
      @wrayr 25 days ago

      Zero benefit

  • @johnbernhardtsen3008
    @johnbernhardtsen3008 26 days ago +3

    I had the A750; it didn't work in Premiere Pro because I used a 5800X CPU! I'm getting the A580 for a 12700K build for a friend of mine! Can't wait to see the results in 3 months' time, since I've only got the PSU, RAM and a case for now!

    • @TC-hl1ws
      @TC-hl1ws 26 days ago +1

      An Intel GPU doesn't work well with an AMD CPU???

    • @johnbernhardtsen3008
      @johnbernhardtsen3008 26 days ago +1

      @@TC-hl1ws It needs CUDA cores; I tried OpenCL but it would take sooo long to render! It works with AMD, just not well in games back then; I couldn't play properly with a 5800X and a 4K monitor! Some wrote the A750 worked nicely on an X570 mobo and a 59590x!

    • @magnusnilsson9792
      @magnusnilsson9792 26 days ago +2

      @@TC-hl1ws My A750 LE has been working fine for a year now with my Ryzen 5600 on a B350 MoBo.

    • @johnbernhardtsen3008
      @johnbernhardtsen3008 26 days ago

      @@magnusnilsson9792 For Premiere Pro too? Or DaVinci?

  • @svi5332
    @svi5332 26 days ago

    Watching this on my new ASRock X670E AMD system with an ASRock Intel Arc A770 16GB Challenger... love it! Set up for video editing...

  • @punkbutler88
    @punkbutler88 26 days ago

    I'd like to upgrade my RX 6600, but I don't know what I want to go with, lol. Wouldn't mind an Arc for my Plex/HDHomeRun TV viewing acceleration...

  • @mo79ch
    @mo79ch 24 days ago +1

    Thanks for the review,
    but do you recommend it regardless of the price?

    • @theTechNotice
      @theTechNotice  23 days ago

      Not sure what you mean...

    • @mo79ch
      @mo79ch 23 days ago +1

      @@theTechNotice What I mean is:
      is it recommended over high-end graphics cards for video editing?

    • @xarisathos
      @xarisathos 9 days ago

      @@mo79ch Get the A770 16GB; the VRAM goes a long way for video editing, especially in DaVinci Resolve.

    • @SuperNova-gc7tm
      @SuperNova-gc7tm 4 days ago

      @@mo79ch He made a whole video about why and when he would recommend these cards... your question should have been answered by the video.
      Did you watch it?

  • @ChairmanMeow1
    @ChairmanMeow1 13 days ago

    The Arc A770 is honestly a good card. They've come a looooong way with the drivers, and you can find one for like $325 online. The A310 and A380 are great too, just for work or for old PCs that need a little boost to be functional again. And those are only like $99!

  • @felixnoky6748
    @felixnoky6748 26 days ago

    Does Intel have a low-latency mode like Nvidia Reflex?

  • @coachvalente
    @coachvalente 8 days ago

    Does the integrated GPU in Intel CPUs add performance to an Intel dedicated GPU, or is it nullified?

    • @theTechNotice
      @theTechNotice  7 days ago +1

      Well, there is the Hyper Encode feature it would add, but that's a very specific thing. In most cases I'd say no additional performance, just an extra GPU to troubleshoot :)

    • @coachvalente
      @coachvalente 7 days ago

      @@theTechNotice thank you

  • @squatch545
    @squatch545 26 days ago +1

    I thought Quick Sync was a feature built into the CPU? You're saying it's built into the Arc card now?

    • @theTechNotice
      @theTechNotice  26 days ago +3

      It's built into the Intel media engines.

    • @ryanspencer6778
      @ryanspencer6778 24 days ago

      QuickSync is part of the media engine that's in the integrated graphics. Arc GPUs are an evolution of the integrated graphics, so they inherit the evolved media engine.

    • @squatch545
      @squatch545 13 days ago

      @@ryanspencer6778 So one could pair an AMD CPU with an Intel GPU (Arc or Battlemage) and you would get QuickSync without needing to have an Intel CPU?

    • @ryanspencer6778
      @ryanspencer6778 13 days ago

      @@squatch545 yes

    • @squatch545
      @squatch545 13 days ago

      @@ryanspencer6778 Good to know, thanks.
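
Following up on this thread: since the QuickSync media engine sits on the Arc card itself, here is a minimal sketch of how it gets used in practice, regardless of whether the host CPU is Intel or AMD. This is only an illustration, assuming an ffmpeg build with QSV/oneVPL support and Intel media drivers installed; the helper name, file names, and bitrate are placeholders, not anything from the video.

```python
# Minimal sketch: drive the Arc card's QuickSync (QSV) media engine via ffmpeg.
# Assumptions: ffmpeg is built with QSV/oneVPL support and the Intel media
# drivers are installed; "input.mp4"/"output_av1.mp4" are placeholder names.
# The media engine lives on the GPU, so the host CPU can be AMD or Intel.
import subprocess

def qsv_av1_encode(src: str, dst: str, bitrate: str = "8M") -> None:
    """Hardware-decode the source and re-encode it to AV1 on the Arc GPU."""
    subprocess.run(
        [
            "ffmpeg",
            "-hwaccel", "qsv",      # hardware decode via QuickSync
            "-i", src,
            "-c:v", "av1_qsv",      # hardware AV1 encode on the Arc media engine
            "-b:v", bitrate,
            "-c:a", "copy",         # pass audio through untouched
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    qsv_av1_encode("input.mp4", "output_av1.mp4")
```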

  • @mixinthebisquic1238
    @mixinthebisquic1238 20 days ago

    My buddy has an RTX 4060 and a Ryzen 7 7700 paired with 16 GB DDR5, while at the time I had an Arc A580. We ran some 7 Days to Die; he had significant FPS drops in the same scenes, big cities and all that, and he would bottom out at 90, sometimes even 46! Meanwhile my Arc A580 sat locked at 110 in those same environments. I don't rightly know why his got outpaced, considering his build was a month old at the time. Because of this, my next GPU is an Arc Bifrost A770; I've had nothing but good experiences with Arc.

  • @philindeblanc
    @philindeblanc 24 days ago

    Would be VALUABLE to see SolidWorks results!!

  • @le_nirnoy
    @le_nirnoy 24 days ago

    Damn the intro is đŸ”„

  • @Blazinghardware
    @Blazinghardware 26 days ago

    I own an Arc A750 8GB, and I am pleasantly surprised by its performance.
    My use for it is gaming, and I do recommend pushing the power slider to max, setting the core voltage to +25 mV, and sliding the overclock setting to +30 in the Intel Arc menu.
    Price-to-performance wise, this card is a great buy.

    • @survivor303
      @survivor303 25 days ago

      How do you control its temps?

    • @Blazinghardware
      @Blazinghardware 25 days ago

      @@survivor303 You can adjust the fan curve in the Arc Control panel.

    • @survivor303
      @survivor303 25 days ago

      @@Blazinghardware Sure, but is the stock cooler OK with those OC settings?

    • @Blazinghardware
      @Blazinghardware 25 days ago +1

      @@survivor303 Yeah, the temps stay around 65°C in Fortnite; Doom Eternal can hit 77°C, but this is within limits.
      My model is an Acer Bifrost Arc A750 OC, so temps will vary...

  • @jordanmntungwa3311
    @jordanmntungwa3311 25 days ago

    Availability is the biggest factor. If I only have $300 at one time and I won't have that money again for the rest of the year, I'll buy what's there. If the shop doesn't stock this, I can't even consider it.

  • @mctonyward
    @mctonyward 22 days ago

    Nice to see someone trying to bring back the '80s shoulder pads.

  • @nicktayloriv310
    @nicktayloriv310 26 days ago

    Even though I'm Team DaVinci Resolve, I think I'm going to buy an A770 and just sit on it. I'm holding out until the new 15th Gen Intel CPUs come out and will get one with onboard graphics to help with the editing and encoding. I wanted AMD to address the onboard-graphics issues that us editors face (the gamers, not so much) with some software updates or something, but it looks as if it's not going to happen, as they've announced they're going to pull back for a while to concentrate on other endeavors. I have a 4070 Ti Super now (got it 2 weeks ago, replacing my 3080 Ti) and a Ryzen 9 3900X that I'm going to run until things sort themselves out, plus I've been tinkering with Blender for a few weeks as I want to incorporate 3D into my platform. For editing I didn't see going up to the AM5 platform helping any, so I'll wait for Intel. Sometimes I stream from this same machine as well.

    • @survivor303
      @survivor303 25 days ago

      So, you're going to replace the 4070 Ti with an A770? You know you really don't need to do that ;)

    • @nicktayloriv310
      @nicktayloriv310 24 days ago

      @@survivor303 That's not what I said..

  • @omnymisa
    @omnymisa 26 days ago

    Intel should be able to improve a lot in the next GPU releases, and I sure hope they do, especially in the 3D department for my particular case.

  • @AsmodeusExx
    @AsmodeusExx 17 days ago

    Hoping Battlemage grows and a 15th gen Intel CPU will be a great combo 😊

  • @CyberClu
    @CyberClu 26 days ago

    I'm not sure whether to purchase now or wait for the holidays. Rumors are saying Battlemage will drop in the fall/winter, and that a B580 with 12GB of VRAM would be equivalent to an A770, but with a price of $200 USD and lower power consumption.

    • @survivor303
      @survivor303 25 days ago

      Nobody believes that $200 USD :D Perhaps it's the Intel price, but you're not going to see it with 12GB of RAM on the shelves :)

    • @CyberClu
      @CyberClu 25 days ago

      @@survivor303 Maybe, but you have to remember that Intel is new to the GPU market, and if they want to gain market share, they have to be competitive against Nvidia and AMD. Remember, nobody believed a 6-core, 12-thread CPU would be half the price of Intel's i7-6800K, but then Ryzen happened.

  • @TonyRush21
    @TonyRush21 26 days ago

    Question: will an Intel 3770K CPU support the Intel Arc A580?

    • @FilippoCarrux
      @FilippoCarrux 25 days ago

      Support depends on the motherboard. But since Arc cards rely a lot on the motherboard's Resizable BAR feature, I don't think Arc is the one for your build. If you are planning to change the build, though, you can buy it now and then do the upgrade. Just know you get at least 20-30% less performance without Resizable BAR.

    • @xarisathos
      @xarisathos 9 days ago

      No, it doesn't work. I tried an A380 & A770 to revive a system like that, but it doesn't boot at all; it requires a fairly modern CPU/motherboard.

  • @rotnbazturd7569
    @rotnbazturd7569 26 days ago +1

    So for now the main issue isn't the card, it's the software? Sucks that NVIDIA has had the market share for so long that all the software is coded to optimize for NVIDIA GPUs.

  • @Bobbylopezcreative
    @Bobbylopezcreative 25 days ago

    I would love it if you made your bar charts mobile-viewer friendly. Having so many arranged that way and at that size is impossible to see even on an iPhone Pro Max. Check out how Max Tech does it. Very mobile friendly.

    • @survivor303
      @survivor303 25 days ago

      If you use the YouTube app, just pinch to zoom :)

    • @theTechNotice
      @theTechNotice  25 days ago

      Yeah, but Max Tech has only 2-3 items in the chart, not ALL the GPUs :) I'd love any advice on how to do it with a 20-30 item chart :)

  • @cheekster777
    @cheekster777 26 days ago +1

    I almost pulled the trigger on an Intel Arc A770 this past week whilst Amazon had it for ÂŁ298, but then the price jumped all the way up to ÂŁ380! đŸ˜źđŸ€Ź
    Waiting for it to drop below ÂŁ300 again.


    • @ItsDeeno69
      @ItsDeeno69 26 days ago

      wait for battlemage bro

    • @StroggKingu
      @StroggKingu 26 days ago

      Yup, wait for Battlemage; either it will be better for a reasonable price or the A770 gets cheaper.

    • @scorpionfpv6412
      @scorpionfpv6412 2 days ago

      Currently ÂŁ220 at Currys!

    • @cheekster777
      @cheekster777 2 days ago

      @@scorpionfpv6412 - Yes, an absolute *STEAL!*

  • @ionamygdalon2263
    @ionamygdalon2263 25 days ago

    Who is the Arc A310 for, by the way? Not interested in gaming performance.

    • @nadtz
      @nadtz 24 days ago +1

      The A310 and A380 are both pretty good for media servers or boxes where you need a GPU but don't need a ton of GPU horsepower. Back before iGPUs, office desktops used to come with low-end Quadro cards, for example, and the E-2400 Xeons sometimes get used for workstations and don't have an iGPU.

    • @ionamygdalon2263
      @ionamygdalon2263 24 days ago

      @@nadtz
      Thank you for your reply! :)

  • @kokizzu
    @kokizzu 24 days ago

    the volume of this video is too loud '__') no?
    pretty sure other videos are not this loud

  • @Masteraaa-cb5uj
    @Masteraaa-cb5uj 26 days ago

    Nice

  • @nasko235679
    @nasko235679 26 days ago +3

    I've smoked benchmarks numerous times and my 1% lows increased 10/10 would recommend.

  • @abub002
    @abub002 26 days ago

    I'm building an extra machine just for fun and to play with Arc. Just for gaming; I'm not a creative.

  • @MrPtheMan
    @MrPtheMan 22 days ago

    It's weird when Intel gives more value to users than the other two tech giants. Let's face it, the PC industry is overpriced for budget-oriented people. Even the 13th/14th gen i5s are a great deal for the performance. Thanks for the review.

  • @survivor303
    @survivor303 25 days ago

    Just please turn down that intro volume... holy sh... ;D

  • @doctorkj5640
    @doctorkj5640 26 days ago +2

    One day we might have a PC with an Nvidia CPU and an Intel GPU 🙂

  • @zaptaz7437
    @zaptaz7437 14 days ago

    Too bad the price is not good where I live.

  • @sitaroartworks
    @sitaroartworks 18 days ago

    Yes, now, better support on Linux please.

  • @marks9233
    @marks9233 26 days ago

    Thanks for the info! Good effort. It is difficult for me to use ranked card performance statistics when the charts have mistakes, like the exact same card appearing twice in a list with different performance numbers (and also a few mistakes in the script, I think). Also, the info with old cards at the top of the list has to be obsolete data; a 2060 is not better than a 4080 in Blender. Trying to make a purchase decision on confusing info you list and talk about is not so easy. =/ Thanks for the good info otherwise, though. Good topic!

  • @survivor303
    @survivor303 25 days ago

    Linux is a platform where they really can shine, if only Intel gets the hang of it... They need to stop this gaming nonsense and make things for creators. Sure, they have those acl cards to give more performance for render speeds, but that thing is for movie makers, not "creators". We need working profiles for these Arc GPUs for Kdenlive.

  • @jaisonthomas4401
    @jaisonthomas4401 26 days ago

    Is it possible to use two GPUs if I have 2 PCIe slots... an Arc A380 6GB GDDR6 + an Nvidia T1000 8GB together???

    • @xarisathos
      @xarisathos 9 days ago

      Yes, of course. Put the T1000 in the first slot as the main display GPU (to get 10-bit color) and use the A380 for QuickSync HW acceleration; it does miracles. In Resolve you can set the A380 to handle all the decoding in the timeline or the AV1 encoding during export.

  • @pioneeridi193
    @pioneeridi193 26 days ago

    So a low-profile GPU beat the RTX 4090 in one test! The irony.

  • @danmar007
    @danmar007 19 days ago

    I like my A750.

  • @bassamatic
    @bassamatic 10 days ago

    Been running an A750 for 5 months; best $300 CAD I ever spent. Less than half the price of an Nvidia card, and more stable than AMD.

    • @Mr.Adib_2.0
      @Mr.Adib_2.0 4 days ago

      Have you used Premiere Pro 2024? How does it perform while editing 4K videos? Most people talk about benchmarks, but I want to know about real-world experience.

    • @bassamatic
      @bassamatic 4 days ago

      @@Mr.Adib_2.0 I do not use that software, but I can tell you I had a 1080 for years before the A750, and I am enjoying the upgrade. One thing I did notice... when I alt-tab out of a fullscreen app with the Arc card, any streaming video content is blacked out until I click on the window to re-focus it. It's a tiny hitch that could cause pain for video editing, I imagine... but are you going to edit while running MechWarrior Online?

    • @Mr.Adib_2.0
      @Mr.Adib_2.0 3 days ago

      @@bassamatic No, I am not into gaming. So, this issue may not bother me.

  • @bes12000
    @bes12000 23 days ago

    If AMD can't get their shit together and Intel does start making better GPUs, I might consider switching... lol

  • @michaelbrindley4363
    @michaelbrindley4363 26 days ago +1

    They're too slow, unless you're a budget gamer and don't mind issues with some games not running, high idle power usage, and inconsistent performance vs the competition. I've used an A770 and an A750. 1440p in newer titles and wanting a high frame rate, good luck with that. They're a 1080p, low-settings GPU for the majority of demanding recently released games. Older games, yeah, sure, go ahead.

    • @rotnbazturd7569
      @rotnbazturd7569 26 days ago

      I run 1440p at medium settings in newer games, no problem. I bought a Sparkle A770 16GB and the only issue is that the stock fan control is set to 50°C before the fans come on, but you can change that with Arc Control. But yeah, NVIDIA is the way to go if you want max FPS; that's just the way it is for now. BTW, I run the card at stock settings; you can change them in Arc Control but I've never needed to.

    • @michaelbrindley4363
      @michaelbrindley4363 26 days ago

      @@rotnbazturd7569 Yeah, on newer UE5 engine games?

    • @wrayr
      @wrayr 25 days ago +1

      Aren't you on the wrong channel?

  • @gejamugamlatsoomanam7716
    @gejamugamlatsoomanam7716 26 days ago

    The A770 has better RT performance than the RX 6650 XT.

  • @MR_CROW1
    @MR_CROW1 26 days ago

    Intel is catching up to Radeon.

  • @gejamugamlatsoomanam7716
    @gejamugamlatsoomanam7716 21 days ago

    Arc is still a better choice than Radeon. Whatever GPU you buy, don't buy AMD; even the Chinese GPUs are a better choice.

  • @venividivici1990
    @venividivici1990 25 days ago

    deus vult

  • @premjitchowdhury262
    @premjitchowdhury262 24 days ago

    No gaming fps numbers = dislike

  • @miladh1859
    @miladh1859 26 days ago

    No thanks😊😊😊😊