It's 2024 and Ray Tracing Still Sucks

  • added 1 Jul 2024
  • flappy bird ray traced would be insane. just my opinion tho

Comments • 104

  • @snugyt • 1 month ago • +5

    Appreciate the comments yall. Lots to learn hardware wise but it was good sparking some discussions. Thats what we all about out here

  • @krovaglovi9573 • 1 month ago • +18

    ray tracing doesnt suck and is the next step in graphics, anyone saying otherwise is:
    1. willfully ignorant
    2. a contrarian
    3. doesnt understand what theyre talking about
    i imagine there were similar discourses when the first generation of graphics cards supporting pixel shaders were launched, yet here we are
    what does suck is the fact that it will take several more generations for graphics cards to be capable of running raytracing at acceptable framerates without BS like upscaling or frame generation (lol)
    i doubt the creator of this video even remembers how big of a performance hit games took when you'd enable tessellation on first gen tessellation-capable hardware - today you dont even notice it's on by default in most games and cant be turned off
    rasterization was always a compromise for the lack in computing power, but we're at a point where hardware can do real-time ray-tracing, so arguing that it "sucks" is as valid as arguing that pixel shaders "sucked" back in 2000 - it's stupid and is just not the case

    • @merked- • 1 month ago • +4

      Ray Tracing is a useless feature as most games do not incorporate it correctly. Sure, enjoy your $1400 graphics card just for Cyberpunk and Minecraft to see the full potential. Other than that, the extra lighting detail isn't worth the money asked for.

    • @krovaglovi9573 • 1 month ago • +7

      @@merked- how much a piece of tech costs is not an argument, sorry

    • @vaguehenri802 • 1 month ago • +3

      @@krovaglovi9573 Yes it is, certainly when you're paying a premium. With the 40-series you're paying 2 premiums. The tech is great, yes. The consumer side of things is awful. 6 years on and you can't even use raytracing without upscaling. Being an early adopter should have ended 2020. Not ongoing in 2024. People are not corps. People don't live in the future. Right now it is terrible.

    • @krovaglovi9573 • 1 month ago • +6

      @@vaguehenri802 no, sorry, its a shit argument because the same was the case when the first several generations of pixel shader supporting graphics cards were released back in 2001
      for example, the 6600 gt (mid-range 2004 card) could barely run doom 3 (2004 game) at anything above low settings at 1024x768, despite the video card being 3rd generation in terms of pixel shading support
      the same graphics card struggled to run half-life 2 as well, a game released in 2004
      the top models cost a lot and werent that good at their intended task
      the cost of anything is never an argument, because the cost of tech does not make the tech itself bad, it just makes it a bad deal, and its cost certainly doesnt change the fact that in 5 years raytracing will be standard
      also "cant even use raytracing without upscaling" - bullshit, yes you can, at resolutions such as 1080p and 1440p. want 4k? then upscaling is needed in a lot of cases if you want 60+ fps. dont be dishonest. this was only the case with 1st gen RT cards.
      if you wouldve just said "i want to have the latest and greatest but i dont have the money for it" you wouldve been honest, but as it is its just you providing your lack of money/unwillingness to pay for it as an argument against new tech...

    • @vaguehenri802 • 1 month ago • +4

      @@krovaglovi9573 You're getting lost in your own argument.
      The tech itself has existed a long time and is good. Which is not what anyone is arguing. The selling point and implementation is. It is awful.
      The 2x premium you pay for the GPU + req for a good CPU + the horrible optimisation in 4/5 games that actually have ray tracing. And tbh the GPUs released have clearly been too weak.
      Raytracing will not be standard in 5 years. If you think so, you need to look at steam stats over most used gpus. Most can't even run raytracing properly.

  • @TheSilverSmitih • 1 month ago • +17

    Yeah, I pretty much agree. There's a reason baked lighting has been around for so long. Baked lighting was created to be a facsimile of real lighting since realistic lighting would have been too computationally hungry to run. As a result, baked lighting is much more performant while simultaneously being well optimized throughout the decades.
    I have to admit though that it's cool that accurate lighting simulations are coming to fruition. If I'm honest with myself, I would love to enjoy an experience where everything is realistically simulated. Lighting, AI interactions, physics, and fluid physics would add so much depth to a simulation if it could be accurately implemented. But, the primary purpose of games isn't to be a physics simulation, it's to give the player an enjoyable experience. If these realistic simulations are implemented at the cost of enjoyability, then they have to go.

    • @heatnup • 1 month ago • +2

      Amen

    • @jayceneal5273 • 1 month ago

      Baked lighting sucks. When everything is baked, you don't get realistic physics: if an object moves, its shadow stays where it was baked, so you either give up shadows or give up physics. Not to mention the quality will always depend on texture resolution, so if you want it to look even slightly decent you need way more hard drive space and VRAM. All you get at the end of the day are extremely static, lifeless environments. Real-time will ALWAYS beat static.
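
A minimal sketch of the trade-off this thread is arguing about, using a toy scene and made-up numbers (nothing here comes from any real engine): baked lighting is a lookup of a value frozen at bake time, so when the occluder moves the stored shadow goes stale, while a real-time approach re-casts a shadow ray against the scene as it is right now.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t > 0) intersects the sphere."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t > 1e-4  # nearest hit must be in front of the surface point

def shadow_ray_lit(point, light_pos, occluder_center, occluder_radius):
    """Real-time check: cast one shadow ray from the surface point toward the light."""
    direction = tuple(light_pos[i] - point[i] for i in range(3))
    return not ray_hits_sphere(point, direction, occluder_center, occluder_radius)

# Toy scene: a light overhead, one point on the ground, a sphere in between.
light = (0.0, 5.0, 0.0)
ground_point = (0.0, 0.0, 0.0)
sphere_radius = 0.5

# "Bake": visibility is computed once, with the occluder at its original position,
# and then only ever read back, exactly like a lightmap texel.
baked_lit = shadow_ray_lit(ground_point, light, (0.0, 2.0, 0.0), sphere_radius)

# At runtime the occluder moves out of the way; the baked value never notices,
# while a per-frame shadow ray follows the object.
moved_center = (3.0, 2.0, 0.0)
realtime_lit = shadow_ray_lit(ground_point, light, moved_center, sphere_radius)

print("baked lightmap says lit:", baked_lit)    # False: stale shadow
print("real-time ray says lit:", realtime_lit)  # True: shadow moved with the object
```

The texture-resolution point in the reply maps onto the same sketch: a lightmap stores one baked value per texel, so sharper baked shadows mean more texels and more memory, whereas the ray query costs the same no matter how sharp the result is.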

  • @arlynnfolke • 1 month ago • +4

    i gotta agree with ya tbh
    but hey, it's for people seeking better graphics over a lot of fps
    if you're playing with keyboard and mouse at 30 fps, it's obvious you'd say it sucks (I'd say that too)
    but if you go with a controller, it feels different and isn't a huge problem (unless it's a competitive game)
    and i think ray tracing works better at 2K with a powerful GPU, since 4K is just too much to handle (our tech isn't there yet for a stable 60fps, especially with ray tracing and no upscaling)
    anyway, i'm not using a 4090 or other powerful cards (you can see on my youtube channel, i'm just an ordinary guy without a huge income or a powerful pc)
    but i'm glad ray tracing exists, i just don't really like people who dismiss ray tracing and call it a scam without a proper explanation. in the future GPUs will be stronger than what we currently have (and maybe AMD and Intel, which are cheaper than Nvidia, will run ray tracing at the same fps for less money), and maybe Nvidia will make ray tracing even better and more optimized, making for an even better gaming experience, isn't it?

    • @snugyt • 1 month ago • +1

      Didn't realize that point you made with controllers. And yea, I'd love for AMD to have the upperhand on the technology and satisfying everyone with a way better price. I think people wouldn't have a problem with the 40 series if it was cheaper

  • @Salohcin_Semaj • 1 month ago • +2

    we're still in the early days for the tech - but the advantages, both from a graphical standpoint and for development, are hard to overstate. I'd say it'll be "standard" tech around the 7th gen - for now it's a fun thing to mess around with if you have a high end card

    • @Salohcin_Semaj • 1 month ago • +1

      after watching till the end.. dude, please do some research
      you have a few decent points, but your overall understanding of the tech, how it works, its effect on future game development, and how it affects performance is extremely lacking.
      this is the norm for new graphics tech - it's always reserved for the high end and seems stupid.. until it gets cheap and ubiquitous and everyone gets to enjoy it

  • @lumbardo1041 • 1 month ago • +2

    Physics in games has been good for years. It has actually regressed. Actual physics simulation seems to be replaced by animations in certain areas. I don't understand improving physics in games, it's already been figured out. Especially in the simulation genre. Sure it can be refined but no improvement in that area would be a huge leap in innovation.
    Ray Tracing, like DLSS, is a cool technology more than anything. The complaint lies more on the side of the dev implementation. In most cases it is necessary to run DLSS in parallel with ray tracing for suitable performance. This is a catch-22 because oftentimes, when the image is upscaled back to your native resolution, the lighting from ray tracing can be left behind, resulting in low-res lighting. This has been addressed with the ray reconstruction feature in Cyberpunk, and it seems to work pretty well.
    In the end it's a trade-off for the consumer, and it's their decision to make. Nvidia wants to replace hand-built denoisers with their neural network, and it seems the goal is better rendering in transient environments. Hand-built denoisers prioritize a stable image, which can lead to some artifacting in rapidly changing frames.
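
A rough sketch of the resolution mismatch described above, with invented numbers (the render scales and the one-ray-per-pixel budget are assumptions for illustration, not measurements from any game or DLSS version): if rays are only traced at the internal render resolution and the frame is then upscaled, the lighting signal has fewer traced samples per output pixel than the final image, which is the gap a ray-reconstruction style denoiser has to fill in.

```python
# Illustrative only: assumed per-axis render scales and a 1 ray/pixel budget.
OUTPUT = (3840, 2160)          # native 4K target
RAYS_PER_INTERNAL_PIXEL = 1    # real-time budgets are on the order of one ray per pixel

modes = {
    "native":      1.0,        # no upscaling
    "quality":     2 / 3,      # common per-axis scale for a "quality" upscaler mode
    "performance": 1 / 2,      # common per-axis scale for a "performance" mode
}

out_pixels = OUTPUT[0] * OUTPUT[1]
for name, scale in modes.items():
    internal_w, internal_h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    traced = internal_w * internal_h * RAYS_PER_INTERNAL_PIXEL
    # The upscaler has to spread these traced samples over the full output frame.
    print(f"{name:12s} renders {internal_w}x{internal_h}: "
          f"{traced / out_pixels:.2f} traced samples per output pixel")
```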

  • @aviatedviewssound4798 • 1 month ago • +3

    There's actually a new ray tracing technique that has been developed that doesn't need ray tracing cores, looks the same and is 3x more performant: it's called Radiance Cascades. Path of Exile 2 is using it.
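
For context on why a probe-based method like Radiance Cascades can be cheap without RT cores, here is a back-of-the-envelope sketch of the usual 2D parameterization as it is commonly described: each higher cascade halves the probe density per axis while quadrupling the directions (and interval length) per probe, so every cascade costs roughly the same number of rays. All the constants below are illustrative assumptions, not Path of Exile 2's actual settings.

```python
# Back-of-the-envelope ray count for a 2D radiance-cascade hierarchy (toy numbers).
probes0 = (256, 256)   # cascade 0: dense probe grid
dirs0 = 4              # cascade 0: a few directions per probe
num_cascades = 6

total_rays = 0
for i in range(num_cascades):
    probes_x = probes0[0] >> i        # probe count halves per axis each cascade
    probes_y = probes0[1] >> i
    dirs = dirs0 * (4 ** i)           # directions per probe quadruple each cascade
    interval = 4 ** i                 # ...and each covers a 4x longer ray interval
    rays = probes_x * probes_y * dirs
    total_rays += rays
    print(f"cascade {i}: {probes_x}x{probes_y} probes, {dirs} dirs, "
          f"interval ~{interval}, rays = {rays}")

# Every cascade traces the same number of rays, so total cost grows only linearly
# with the number of cascades while the reachable distance grows geometrically.
print("total rays:", total_rays)
```

Whether it really "looks the same" as hardware ray tracing depends on the scene; the relevant point for this thread is that the workload is fixed and screen-sized rather than tied to dedicated RT hardware.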

  • @CuttinInIdaho • 1 day ago

    Somehow this was in my feed. I think google is trying to start flame wars. I had a 5600xt, upgraded to a 2060 super, then arc a770, then 3060...now I am at a 4070 ti super...and you couldn't be more wrong. I play Ultrawide Cyberpunk and really enjoy path tracing. When the 2060 Super was relevant, so too was RT...now that PS5 will be 3x 'ing RT performance, you will have a different take soon enough. Money flows to where technology goes...and for the last 5 years+, RT is where that is going, no matter how many low sub youtube contrarians complain.

  • @PrimeStud • 1 month ago • +4

    You know ray tracing is not a problem
    But the problem is in future many games wont even give us an option to turn it off!!!!
    Which is a massive issue 2-3 years down the line

    • @PrimeStud • 1 month ago • +1

      For context over the years people are seeing new technologies take over which u can't even turn off because they are now embedded in the game that it can't work without it and the newer engines will not even give an option to do that
      That's how they're gonna charge us a lot more for something most of the consumers don't need
      Most of the consumers need fps and basic graphics whether u like it or not
      Graphics become a preference when fps is too high
      Fps are low right now so we should not upgrade our graphics
      Alan wake 2 for example used technologies that it doesn't even need
      Vulkan mod fixed the issue and it was made by modders!!!!!!!!!
      Modders did a better job optimizing the game than Remedy and Nvidia combined
      They just wanted to kill the 10 series gpus!

  • @artie1390 • 1 month ago • +10

    Yeah man you are right, don’t listen to those people who spend 2k usd on a graphics card that they don’t even play games on. Using a technology that is supposed to increase the graphics but in return you have to scale down your resolution or even graphics settings is ridiculous

    • @artie1390 • 1 month ago • +4

      People used to play games for the games; now people play games for graphics that aren't even that good, not even going to talk about how much games have downgraded in terms of story and overall fun.
      Games suck now, but hey, they have better graphics now, wow, definitely worth it.

  • @olly344 • 1 month ago • +1

    I love it every time we 'change' generations... it's always a super amazing demo that a company tries to make a game out of, and it completely fails.. simply because the game is not good at being... a game... but yeah! nice reflections dude!

  • @philosoaper • 28 days ago

    I got a 2080Ti juuust before covid and the mining boom sent prices off into another galaxy...I'll be waiting for the 50xx series... it's nice in games, but games aren't my primary use.

  • @fran2911 • 1 month ago • +1

    I mostly agree, I'd like to share some of my points (that might or not be mutually exclusive to what's presented in the video).
    DLSS has a bad marketing/naming scheme. It shouldn't be numbered 1, 2, 3 etc. because those are different technologies. Auto resolution scaling shouldn't be under the same umbrella as frame generation. It works really well at higher resolutions and for controlling frame drops.
    I don't activate ray tracing on most games because I have a high refresh rate monitor and don't want the trade off. I don't notice shadows that much other than light being more moody/realistic, most games have really good baked in lighting, it's only when light sources play a major role in the game or it also improves materials and water.
    All that being said, your graphics card isn't obsolete after a new generation, new cards still have a considerable bump in raw performance. (They should have more VRAM though). Also I'd prefer game developers focus on optimizing and not rushing games just to require DLSS, it's like a step forward and two backwards.
    I don't think the pictures shown accurately depict the differences between technologies since they're so low-res and deep-fried. You mentioned you have a 1070, which is pretty anemic compared to anything released on desktop in the past 6 years. I think you'd have different opinions had you spent some time with these titles instead of judging compressed CZcams videos; Fortnite makes pretty good use of ray tracing and Nvidia technologies, for instance.
    Camera quality doesn't add to the video but mic quality and video resolution/sources do. (Your mic is genuinely fine). Hope these considerations add towards your future videos.
    I normally don't comment but growing channels benefit from all the help

    • @snugyt • 1 month ago • +1

      Really appreciate this comment

  • @ysiadpir1423 • 1 month ago • +4

    Ray Tracing is so "GREAT" for Big Green NVIDIA, in the same way keeping up to date on your mRNA boosters is "Great" for Big Pharma.

  • @zilverman7820 • 1 month ago • +13

    This ray tracing feature is only applicable for high-end cards.

    • @PabloB888 • 1 month ago • +4

      Based on the benchmarks I saw, the RTX 2060 can run RT reflections in many games at 30fps or more (a console-like framerate), especially with the help of DLSS. The 2080 Ti can run RT in most games that offer it, so as long as you are willing to tweak the RT settings you can use it even on slower RTX cards. That's not bad considering Turing was the first generation to support HW RT.

  • @lambmaster • 1 month ago • +7

    Nah. This is how tech goes. It starts off in the hands of the rich or early adopters, and eventually trickles down to everyone else. You can see the same thing with the automotive industry: Before your time, air conditioning was literally a luxury only found in luxury cars.

    • @teekanne15 • 1 month ago • +1

      It's always been a marketing gimmick with a very bad price/performance ratio. Nvidia needed to introduce high-demand compute to push sales of high-end, high-price, high-margin GPUs.

  • @TheRealRuddRants • 1 month ago • +1

    Someone relatable agrees with my opinion that ray tracing is nothing but a graphical gimmick that could kill your computer. Who knew?

  • @jayceneal5273 • 1 month ago • +2

    Imagine if doom 3 didn't push realtime lighting back in the early 2000's because cards didn't have a depth buffer to utilize it properly. Imagine if quake never went fully 3d because the cpus at the time were too weak to do it above 15-30 fps? Tech is ALWAYS going to push forward and there always needs to be a concentrated push in the beginning to get people to upgrade to the new hardware standard to push anything forward. If anything, ray tracing is the only good thing to come out of the graphics scene since the xbox one/ps4 were released because pretty much every graphics technique any AAA game utilizes outside of raytracing are just using the same methods devised over 10 years ago. Ray tracing is not just the thing to get us out of that rut, but it's also THE thing the industry has been pushing for since the very beginning of 3d graphics.

  • @teekanne15 • 1 month ago

    I'm a sucker and went all in on the 4090 when I got my first big pay rise and wanted to treat myself. The incremental improvement is laughable. The best selling argument would be high fps for competitive games, but you play those at low res anyway and they are mostly CPU-bottlenecked, which is worse.

    • @snugyt • 1 month ago

      @@teekanne15 exactly my point!!!!!

  • @raiduh899 • 1 month ago

    It's perfectly okay to bash the price and the marketing around the tech, but you cannot deny that the tech has potential and that it's been sufficiently demonstrated.
    The video talks in detail about how the performance sucks and how the difference is barely noticeable. The performance definitely sucks, but the difference is very much noticeable. Especially in game development and AI acceleration, cards with these AI and ray tracing capabilities have been very helpful (compare Unreal Engine's Lumen to standard GI and SSR).
    The tech is currently third gen, with fourth gen cards set to release by the end of the fiscal year. Again, the potential has been demonstrated and it's really brilliant tech that you won't see many games use right now because of how taxing it is to run. But you have to consider that we couldn't even do this before by relying purely on rasterisation performance.
    Hate the price and the marketing around it, but don't bash the tech itself when it has shown its potential, growth and actual usability.

  • @tofu_golem • 1 month ago • +1

    I agree!
    I admit that I recently bought a 4080 Super. I was choosing between a 4080 Super and the RX 7900 XTX, which is maybe $50 less. For such a small difference in price, the lower power consumption of the 4080 Super was worth it to me. I also built another computer with much weaker performance requirements, and you better believe I went with AMD for that one.

  • @nelsonpiedade61 • 1 month ago • +2

    Coming from a GTX 1050 Ti to an RTX 2060 Super, I already enjoyed the fidelity much more overall. And then to an RTX 4070 and oh GOD, I swear I love ray tracing. Ambient occlusion and RTGI are awesome, and I love the Witcher series; the third one is my ultimate tool to benchmark all my hardware. And after switching to the next-gen patch with full ray tracing on my RTX 4070, it is completely immersive, another level of reality with ray tracing, so I don't understand your bad jokes about it.

  • @matjakobs8612 • 1 month ago • +1

    RT is very good if you have the money for it. To really enjoy it you need a 4090

    • @snugyt • 1 month ago

      Yea honestly if i had one of those id probably not make this vid haha

  • @-Rizecek- • 18 days ago

    We need an RTX 7090 for native 1080p at 100 FPS

  • @mranderson4469 • 1 month ago • +1

    Gameplay/Story > realistic graphics

    • @snugyt • 1 month ago

      Preach mr anderson

  • @subjxct • 1 month ago • +6

    Decent editing but I’d recommend understanding the underlying algorithms and hardware you’re talking about before making negative generalizations about the technology

    • @snugyt • 1 month ago • +3

      The underlying algorithms?? Thx for watching

    • @wizzenberry • 1 month ago • +2

      i have a master's degree in computer engineering, pick a specific point.

    • @wizzenberry • 1 month ago

      @@snugyt he is using buzzwords

    • @flowerthencrranger3854 • 1 month ago

      @@wizzenberry no you don’t

  • @Yesthuggu • 18 days ago

    Bro, I think you're comparing RTX against a cartoon-graphics game like Fortnite. Try comparing it with games like Alan Wake and Cyberpunk.

  • @jjsavior • 1 month ago • +1

    Nah. Never bought into ray tracing. Looks great in certain games. But it aint necessary.

  • @slaphead90 • 1 month ago • +2

    Thank you for stating what everybody who is sane already knows.
    Raytracing is simply too computationally intensive when compared to other graphic rendering methods the game industry has developed, and when I need a side by side image with the differences pointed out to actually see it then it's a fail.
    Don't get me wrong, raytracing is actually useful to produce very realistic static images - I used it back in the '90s to produce architectural static frames which were incredibly realistic, but these could be 24 hour plus renders to produce a 480p image, depending on complexity.

  • @PabloB888 • 1 month ago

    I still play on a GTX 1080, so I can't game with RT for obvious reasons, but based on the YT video comparisons I've seen, RT makes a big difference in lighting quality. Of course in some games like Fortnite the difference is subtle (because UE5 uses a software form of RT and turning on HW RT just refines the lighting), but in general there's a pretty big difference, especially in older games or sandbox games like Cyberpunk.
    People need to realize that games have been using RT for many years, but in a pre-baked form. Prebaked lightmaps, prebaked GI, prebaked reflections. In linear games like TLOU2 you have static time of day (TOD), so prebaked gimmicks can look very good. Unfortunately, in sandbox games, TOD is constantly changing, making it IMPOSSIBLE to get realistic results without real-time RT. That's why Cyberpunk looks so much better with RT and especially PT.
    As for The Witcher 3, this game allows me to turn on RT even on my GTX1080. Of course The game is unplayable because it runs around 7-12 fps, but at least I can see the difference RT makes in this game and I'm totally sold on this technology. After seeing how good Witcher 3 looks with RT, I've decided not to play this game again until I upgrade my GPU, because now I know what I'm missing. Without RT the lighting looks flat because indirect shadows are missing. A lot of hard shadows are missing as well (especially vegetation shadows) and there's a very visible line where higher quality shadows are drawn. Even the water looks much worse, because the reflections fade with movement, which is very distracting. I think this video comparison shows that difference very well.
    czcams.com/video/_o59iS_4SHY/video.htmlsi=rKssa3HajLB1wsi8
    Based on YT benchmarks, it looks like I need to buy at least the 4080S to get around 60fps at 1440p (85fps with DLSS) with RT in the witcher 3. The 4080S is expensive but still in my budget and given how much faster it's compared to my GTX1080 (3-4x in raster, 10x in RT) I know I will be happy with such gigantic upgrade. Whatever people may or may not like about RT, this technology is here to stay. Soon games will not even run on GPUs without HW RT.
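
A tiny numerical sketch of the time-of-day point above, with made-up vectors (real bakers store far more than a single diffuse value, so treat this purely as an illustration): a lightmap baked for a noon sun keeps returning the noon answer after the sun has moved, while evaluating the same Lambert term live follows the new sun direction, which is why dynamic time of day and prebaked lighting fight each other.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    """Basic diffuse term: max(N . L, 0)."""
    return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

up_facing_surface = (0.0, 1.0, 0.0)
sun_noon = normalize((0.0, 1.0, 0.1))      # sun nearly overhead at bake time
sun_evening = normalize((1.0, 0.15, 0.0))  # hours later: sun low on the horizon

# "Bake": the diffuse response is computed once for the noon sun and stored.
baked_diffuse = lambert(up_facing_surface, sun_noon)

# With a dynamic time of day the correct answer has changed, but the lightmap
# still returns the value frozen at bake time.
live_diffuse = lambert(up_facing_surface, sun_evening)

print(f"baked (noon sun)   : {baked_diffuse:.2f}")  # ~0.99, bright
print(f"live  (evening sun): {live_diffuse:.2f}")   # ~0.15, much dimmer
```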

    • @stevenostrowski6552 • 1 month ago

      I recently bought a new pc and I'm capable of running Cyberpunk at RT Ultra (the second highest RT preset) at 60 fps in 3440x1440. I can assure you it barely makes a difference in how the game looks while eating massive amounts of performance; I used it like twice maybe and switched it off shortly after.
      On some older games it can enhance the visuals massively tho, but games like CP already look crisp in normal lighting.

    • @PabloB888 • 1 month ago

      @@stevenostrowski6552 Based on what I saw in the YT comparisons, the difference between path tracing and raster in Cyberpunk is absolutely massive, even bigger than the already impressive RT in The Witcher 3.

    • @snugyt • 1 month ago

      @@PabloB888 when do you think we will get to the point of RT essentially being a requirement? Just curious. Appreciate the comment

    • @stevenostrowski6552 • 1 month ago

      @@PabloB888 I can only tell you what I'm seeing with my own eyes: it barely looks different in 95% of cases. Those YT comparisons also specifically use scenes in which RT affects a lot of details, but that doesn't really reflect what you will be seeing most of the time.
      For older or generally less graphically intense games like Minecraft or Quake it does improve visuals massively though; maybe future titles can also make more use of this technology, but current gen games barely improve from it as far as I can tell from the few games I've tested so far.
      To be fair, in order to play with ray tracing at playable performance on most cards you will need to use DLSS/DLAA or FSR, which also degrades the visuals, so perhaps that's why I feel like it ain't doing much to improve visuals for me. But I ain't spending a whole paycheck on a 4090 that may be capable of running RT without upscaling.

    • @PabloB888 • 1 month ago

      @snugyt Some games like Metro Exodus Enhanced Edition already require HW RT to run at all, but I think we'll have to wait for the new console generation before most new games are built with RT lighting from the start. Such a game will obviously not run without RT. I'm sure RT will be a big selling point for the next Xbox and PS6.

  • @Stirlingsays • 24 days ago

    Bright young man.

  • @akeylawhite9217 • 1 month ago • +11

    4090 owner. Worth every freaking penny. RT is glorious when implemented correctly. Just don't expect it to look great on your mid-range card. Basically you spent the video hating on new technologies that ultimately improve framerate and playability. You seem stuck in an "if the frame isn't rendered at 4K without any assisting technology, it's trash" mentality. Reminds me of people that used to complain about multicore CPUs with "clockspeed is king!". (Yeah, I'm that old)

    • @dvlax3l • 1 month ago • +2

      Worth every penny for benchmarks and 1440p 240Hz gaming; 4K, RTX, frame gen and DLSS are trash gimmicks

    • @turnip1744 • 1 month ago • +7

      $2000 ain’t worth it

    • @krovaglovi9573 • 1 month ago • +1

      @@dvlax3l what a stupid take

    • @vivekkparashar6369 • 1 month ago • +4

      Do you know that there might be like 5% of people around the world who can buy an rtx 4090? Rtx is useless

    • @dvlax3l • 1 month ago • +2

      @@krovaglovi9573 I think you misunderstood what I said: RTX and DLSS etc. are stupid gimmicks, it's an overpriced card, it's only worth it if you are a bencher or a very competitive player who has a lot of money... that's all

  • @JakeBonbrake • 1 month ago

    Lotta complaining, but they offer the best of DLSS etc. and Nvidia Broadcast. I went with AMD for a while now and am going back because of how mediocre it is. Plus, if I can get path tracing at 30fps I'll be happy

  • @50H3i1 • 1 month ago

    the more you buy, the more you save.

  • @gmjakub789 • 1 month ago

    Garbage quality but the video is well done, just improve on the video quality.
    Gave it a like

    • @snugyt • 1 month ago

      @@gmjakub789 thx boss!!!!!!

  • @benjsarchive • 1 month ago

    Using outdated still images and games with bare-minimum RT integration to prove your point is kinda wild? I see the point you're trying to make, but it's a scale. Some people prefer performance, some graphics. Those who prefer graphics see and appreciate the difference it makes.

    • @benjsarchive • 1 month ago

      also, the reason why the images at 1:29 look the same, is because they ARE the literal same image, taken from some YT thumbnail, its not an actual comparison.

  • @kqlys7845 • 1 month ago

    Really? Blud is showing low resolution SCREENSHOTS of Fortnite he found on the internet and says that ray tracing is BS 🤣🤣🤣🤣 The first gameplay (that I assume was found on YT) you show a minute later shows a big-ass difference and you still say "nah, looks the same" 💀 Hell nah, play Portal and then Portal RTX and tell me there's no difference. No low resolution screenshots from other people or 360p gameplay downloaded from YT though. The difference in performance is not even that big on the 40xx series, but the 20xx was definitely not ready, and the same goes for games from that era like BFV, which if I remember correctly didn't even have full ray tracing; the 30xx was barely ready, and we don't talk about consoles, the PS5 is on the same level as the RTX 2070 that came out in 2018. Do you even own a modern RTX card or is it just bait for clicks?

  • @whigmalwhim4760 • 1 month ago

    Nvidia is no longer a graphics company but a deep learning focused company. They've realized the hardware constraint of ray tracing and have pivoted towards their DLSS technology to solve their issue. They're no longer focused on traditional rendering and have moved to a "gen AI" approach where they render maybe a third of the actual frames and have their TPU estimate the rest. It's complete bull. Also, DLSS 3 works on the 30 series; they just locked it from the consumer.
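
For what it's worth, here is a deliberately crude sketch of the frame-generation idea being described: the GPU renders only some of the displayed frames and synthesizes the in-between ones from their rendered neighbours (in DLSS 3, one generated frame sits between each pair of rendered frames, driven by motion vectors, the optical flow accelerator and a trained network). The naive linear blend below is only a stand-in for that machinery to show where the generated frames sit in the stream.

```python
# Toy "frame generation": every other displayed frame is synthesized, not rendered.
# A real implementation warps pixels along motion vectors / optical flow through a
# trained model; the linear blend here only illustrates the frame ordering.

def render_frame(t):
    """Stand-in for the expensive renderer: a 'frame' is just a brightness value."""
    return float(t)  # pretend the scene simply gets brighter over time

def interpolate(frame_a, frame_b, alpha=0.5):
    """Synthesize an in-between frame from two rendered neighbours."""
    return (1.0 - alpha) * frame_a + alpha * frame_b

rendered = [render_frame(t) for t in range(0, 10, 2)]    # GPU renders every other frame

displayed = []
for a, b in zip(rendered, rendered[1:]):
    displayed.append(("rendered ", a))
    displayed.append(("generated", interpolate(a, b)))   # inserted between real frames
displayed.append(("rendered ", rendered[-1]))

for kind, value in displayed:
    print(kind, f"{value:.1f}")
```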

    • @snugyt • 1 month ago

      @@whigmalwhim4760 yea with the rise of ai modeling thats exactly why so many people are investing into nvidia (not financial advice)

    • @whigmalwhim4760 • 1 month ago

      @@snugyt I would not recommend it. AI has existed since the 50s with Symbolic AI and has a history of so-called "AI Winters" spanning the 80s and 90s.
      The point is that Nvidia, from talks, is advertising itself as Deep learning-focused. It's no longer "hybrid rendering" as in rasterization and ray tracing; it's now "hybrid rendering" as in generating a game from a DL model.

  • @betterthantrash111 • 1 month ago • +5

    Try swearing less. Good video, but sometimes the swearing makes you come across as slow

    • @snugyt • 1 month ago • +8

      Appreciate it and yea fuck swearing

  • @photonboy999 • 26 days ago

    *do some more research...*
    While this guy brings up legitimate points, he obviously is missing a lot of key facts. Other people have already mentioned that this is PAR FOR THE COURSE. You introduce new features that need new hardware, it takes time to change the software ecosystem and then eventually you can't run new games on old hardware. Are you still rocking an HD5870? No. But, ray-tracing has a HUGE benefit to the game developer community. It will be FAR, FAR easier to create games once you don't have to deal with all the raster hacks and light baking issues.
    Also, the idea that you turn on ray-tracing and then "lower the quality" by using DLSS is a bit misleading. DLSS at a render of 1080p (balanced or performance. I forget) for a 4K monitor tends to look nearly as good as native 4K much of the time. Sometimes better.
    So this video seems like a semi-informed rant, not a good analysis of the current landscape of RT/DLSS etc.
    (side note: you COULD still be rocking an HD5870 from 2009 and find fun games to play)
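
To put numbers on the render-resolution point above: the commonly cited per-axis scale factors for DLSS 2's modes (games can and do override them, so treat these values as assumptions) give the internal resolutions below for a 4K output, so "Performance" really does shade and trace at roughly 1080p and reconstructs the rest.

```python
# Commonly cited per-axis render scales for DLSS 2 modes (defaults; games may override).
DLSS_SCALES = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}

output_w, output_h = 3840, 2160  # 4K output target

for mode, scale in DLSS_SCALES.items():
    w, h = round(output_w * scale), round(output_h * scale)
    fraction = (w * h) / (output_w * output_h)
    print(f"{mode:17s} -> {w}x{h} internal "
          f"({fraction:.0%} of the output pixels are actually rendered)")
```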

  • @flowerthencrranger3854 • 1 month ago • +7

    Nice editing, but I’d recommend actually reading about what you’re going to speak negatively on, plus, unfunny jokes don’t make your arguments more valid.

    • @artie1390 • 1 month ago • +1

      And what is he wrong about, you soy-consuming, trillion-dollar-loving so-called person?