I Never Knew AI Could Improve TV Picture Quality in These Ways…

  • Added 7. 09. 2024

Comments • 411

  • @vicdez
    @vicdez Před 5 měsíci +67

    You'd be counting on Samsung supporting a niche product for more than a year, which Samsung does not have a track record of doing.

    • @vinsta76
      @vinsta76 Před 5 měsíci +9

      Not sure I'd call AI "niche". It's the way all tech seems to be heading. I'd defo be worried if Samsung said "it'll become available via an update later" for specific models though. Like they did with HDR10+ on my 8-year-old TV that never materialised.

  • @MsMarco6
    @MsMarco6 Před 5 měsíci +81

    AI video upscaling already exists with Nvidia GPUs, as does AI based SDR to HDR conversion.
    Personally I use my PC as a media box with my S95B for exactly that reason, though it's nice to hear I might not have to with future upgrades.
    As far as the upscaling goes it's so-so. It does a fantastic job at clearing up compression artefacts and makes stuff a bit sharper, but it doesn't hold a candle to slower professional AI upscaling software like Topaz Labs Video AI (any home theater geek with a large 1080p Blu-ray & DVD collection owes it to themselves to buy that software).
    RTX HDR however is a revelation. I honestly can't go back after being able to experience everything in HDR. It's obviously not accurate, but with stuff like CZcams videos and older SDR TV shows, who really cares. I'd love for Vincent to cover it sometime.

    • @Tential1
      @Tential1 Před 5 měsíci +7

      It's extremely frustrating to have people act like artificial intelligence is new and not something Nvidia has been doing since 2018. This is why people think Nvidia is a bubble. They think it's all new....

    • @aquaneon8012
      @aquaneon8012 Před 5 měsíci +3

      I wonder how Nvidia's AI upscaling compares to Sony and Samsung TVs.

    • @MsMarco6
      @MsMarco6 Před 5 měsíci +10

      ​@@aquaneon8012 I guess we'll have to wait and see when these new TV's release.
      Though to be honest I feel like it'll be hard to compete with Nvidia.
      The fact is that even with my 4070 TI, AI upscaling uses a decent amount of power and about 20-40% GPU usage.
      It was unusable with my old 3070 laptop as it created too much heat and sounded like a jet engine.
      So if a 3070 mobile (equivalent to a PS5 in terms of speed) was reasonably taxed by this, then it's hard to believe the cheap SoCs in TVs will have enough grunt to run a similar algorithm without it being significantly less sophisticated.
      Then again maybe they've come up with a more efficient method than Nvidia (doubtful) or are finally using a decently fast chip (even more doubtful).
      So who knows.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci

      The thing is, there are bandwidth limitations even using DSC (Display Stream Compression). So getting a high-Hz signal, for example a 240 Hz 4K 10-bit 4:4:4 (RGB) HDR signal, to the TV first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale, could be more efficient and get higher-performing results than sending a pre-baked (4K upscaled to) 8K signal from the Nvidia GPU at lower Hz. Especially with the way Nvidia has allotted bandwidth on their ports up until now (60 Hz 8K max, 7680x2160 at 120 Hz max, limits on multiple 240 Hz 4K screens), though that could change with the 5000 series potentially. Unless they started putting Nvidia upscaling hardware on gaming monitors/TVs someday, perhaps. That's from the perspective of using a gaming TV as a multi-purpose PC gaming + desktop display, with some media usage.
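
      To make the bandwidth point concrete, here is a rough back-of-the-envelope calculation in Python. It assumes uncompressed RGB and ignores blanking overhead, and it takes HDMI 2.1's usable FRL payload as roughly 42-43 Gbit/s; treat the exact figures as illustrative rather than authoritative.

          def raw_gbps(width, height, hz, bits_per_channel, channels=3):
              """Raw video payload in gigabits per second (no blanking, no DSC)."""
              return width * height * hz * bits_per_channel * channels / 1e9

          modes = {
              "4K 240 Hz 10-bit RGB": (3840, 2160, 240, 10),
              "8K  60 Hz 10-bit RGB": (7680, 4320, 60, 10),
              "4K 120 Hz 10-bit RGB": (3840, 2160, 120, 10),
          }

          for name, (w, h, hz, bpc) in modes.items():
              print(f"{name}: {raw_gbps(w, h, hz, bpc):.1f} Gbit/s raw")

          # 4K 240 Hz and 8K 60 Hz both land near 60 Gbit/s raw, well past what
          # the cable can carry uncompressed, hence DSC on the link and the
          # appeal of doing the final upscale on the TV side instead.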

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci

      I think you are comparing a CGI render to a real-time game, so to speak. You can pre-bake/upscale movies, sure; you can even buy movies that have been upscaled. From what I understand, Topaz works best with tweaked configuration per title or even per clip, and though it's pretty fast on a powerful GPU, it would take a while to render/AI-upconvert a big library of videos.
      AI upscaling/machine-learning upscaling is done in real time on varied content (plus PC gaming, apparently) - dynamic, "real-time" delivered content. So... why not have the capabilities of both?
      I agree in general though: for static movies and pre-recorded programming, why not just pre-bake them as upscaled? It's "real-time" content like live shows, live streams, news, and gaming that real-time AI upscaling would benefit most, and my viewing habits include a lot of those. Since streaming services' bitrates are lower and even fluctuate dynamically, streams would also benefit. So it's really only pre-recorded local libraries for collectors, upscaled locally or bought upscaled, that wouldn't derive a benefit.

  • @jiiiiiiiiiiiiiiiiiii
    @jiiiiiiiiiiiiiiiiiii Před 5 měsíci +81

    Hopefully there will be an Authentic Image mode.

    • @Datenschutz_Datenschutz
      @Datenschutz_Datenschutz Před 5 měsíci +17

      yeah hopefully. But Samsung TVs are trash..so...

    • @gjk2012
      @gjk2012 Před 5 měsíci +14

      There are TVs out there where you can't even turn off the soap opera motion smoothing. This is getting scary.

    • @jaro6985
      @jaro6985 Před 5 měsíci +5

      You realize that no consumer level TV is showing you the authentic image right? They all process the signal even with all settings off.

    • @MaMuSlol
      @MaMuSlol Před 5 měsíci +5

      Authentic Image mode as in... A I mode? :)

    • @damyr55
      @damyr55 Před 5 měsíci +4

      Play the same content on 3 different TVs (without AI upscaling) and you'll get 3 slightly different images. So how do you define Authentic Image?

  • @WednesdayMan
    @WednesdayMan Před 5 měsíci +6

    Honestly, AI upscaling is really interesting. If it works on 480p signals, we could have real-time upscaling on old games and have them look nicer.

  • @G1Picard
    @G1Picard Před 5 měsíci +67

    I only need 1 preset on TV. It should be called - “No added crap by Samsung offline mode”

    • @alexatkin
      @alexatkin Před 5 měsíci +5

      That would be great if we also had a setting for streaming services that was "Bluray quality".
      Unfortunately, a lot of content is highly compressed or standard definition, so needs some help to look decent.

    • @BackForwardPunch
      @BackForwardPunch Před 5 měsíci

      lmao

    • @cezarstefanseghjucan
      @cezarstefanseghjucan Před 3 měsíci

      Build your own TV.

  • @asafblasbergvideographer
    @asafblasbergvideographer Před 5 měsíci +46

    I do find AI upscaling an interesting idea. I think it should be available on the TV, but there also should be a way to turn it off in case there are errors.

    • @SilverHolland
      @SilverHolland Před 2 měsíci

      Would errors happen, though? I bet the TV makes sure the 4 or 16 pixels that replace the 1 or 4 original ones keep the same overall light and color, so you never get very far from the original?
      I'm very curious for an "AI upscaling off" button, to compare and save energy. In the store, 8K demos look so amazing, it's hard to look away :-D
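
      For the simple, non-generative scalers that intuition holds by construction: replicating or averaging pixels keeps the overall light level of each block, while a generative model is free to drift from it unless it is explicitly constrained. A tiny sketch of the idea, assuming plain 2x nearest-neighbour replication rather than any particular TV's algorithm:

          import numpy as np

          rng = np.random.default_rng(0)
          frame = rng.random((270, 480))   # toy low-res luminance frame

          # 2x nearest-neighbour upscale: each source pixel becomes a 2x2 block.
          upscaled = frame.repeat(2, axis=0).repeat(2, axis=1)

          # Replication preserves the average light level exactly...
          assert np.isclose(frame.mean(), upscaled.mean())

          # ...whereas an upscaler that invents texture has no such guarantee
          # unless it is constrained to match the source block averages.
          print(frame.mean(), upscaled.mean())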

  • @georgejones5019
    @georgejones5019 Před 5 měsíci +54

    I'm in tech and cyber security.
    AI has become a marketing term since ChatGPT went mainstream.
    Most "AI" is just machine learning or some other algorithm, but it's unknown to common consumers.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci +7

      Exactly. That term is being slapped on absolutely everything now, for the simple reason that companies believe that if they DON'T put it on everything they make, consumers will think they are outdated and behind the times.

    • @UniverseGd
      @UniverseGd Před 5 měsíci +3

      Like having data sizes in GB (gigabytes), in powers of 10, instead of powers of 2 like GiB (gibibytes). Pure marketing trash.

    • @aagfnv
      @aagfnv Před 5 měsíci +1

      Can't wait for my AI-enhanced yerba mate smoothies.

    • @georgejones5019
      @georgejones5019 Před 5 měsíci

      ​@aagfnv I saw a toaster boasting AI on here or LinkedIn somewhere. It's getting ridiculous.
      I could understand it if they stated AI was used in the design process. But everyone is skipping that; that's part of reaching singularity.

    • @thanksbetotap
      @thanksbetotap Před 5 měsíci +1

      Machine learning is one of the subfields of the AI field in computer science.

  • @trignite
    @trignite Před 5 měsíci +4

    Generating detail that doesn't exist isn't improving the quality, it's changing it.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci

      Think of it like having a huge texture library for a video game - an extremely massive library that it generates intelligently from. Your game had low-detail textures; now it has high-detail textures that are based on the original ones. The original texture and its mapped grid are still intact, but the holes/missing pixels of detail in the higher-resolution screen's finer grid are filled intelligently. It's a fill operation, but one based on a huge AI learning library.
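
      That "huge library" analogy maps closely onto classic exemplar-based super-resolution: keep a dictionary of (low-res patch, high-res patch) pairs taken from example images, then for each patch of the incoming low-res frame paste in the high-res patch whose low-res version matches best. The sketch below uses random synthetic patches purely for illustration; real TV processors use trained neural networks rather than a literal lookup, so treat this as the idea, not the implementation.

          import numpy as np

          rng = np.random.default_rng(1)

          def downscale2x(img):
              """Average 2x2 blocks -> the 'low-res' version of a patch."""
              h, w = img.shape
              return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

          # 1) The "texture library": (low-res 4x4, high-res 8x8) patch pairs.
          library_hr = [rng.random((8, 8)) for _ in range(500)]
          library_lr = np.stack([downscale2x(p).ravel() for p in library_hr])

          def upscale2x(lowres):
              """Fill a 2x-larger image patch by patch from the library."""
              H, W = lowres.shape
              out = np.zeros((H * 2, W * 2))
              for y in range(0, H, 4):
                  for x in range(0, W, 4):
                      query = lowres[y:y+4, x:x+4].ravel()
                      best = np.argmin(((library_lr - query) ** 2).sum(axis=1))
                      out[2*y:2*y+8, 2*x:2*x+8] = library_hr[best]
              return out

          small = rng.random((16, 16))
          big = upscale2x(small)     # 32x32; the extra "detail" is borrowed
          print(big.shape)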

  • @m7gh827
    @m7gh827 Před 5 měsíci +29

    The best tv tech channel on CZcams.

  • @u.2b215
    @u.2b215 Před 5 měsíci +40

    I really can't tell the difference in those 8K comparison shots. They should try showing DVD content non-AI and AI upscaling side by side.

    • @Artemis-v8i
      @Artemis-v8i Před 5 měsíci +14

      It just looks like they turned the sharpness slider up

    • @SilverHolland
      @SilverHolland Před 2 měsíci

      Zoomed in, please. I always see comparisons that make it smaller on our screen than the original. With the TV in our room, we'd be looking at details to appreciate the quality.

  • @landoishisname
    @landoishisname Před 5 měsíci +8

    I guess it was only a matter of time for TV companies to make their own form of DLSS

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci +2

      The thing is, there are bandwidth limitations even using DSC (Display Stream Compression). So getting a high-Hz signal, for example a 240 Hz 4K 10-bit 4:4:4 (RGB) HDR signal, to the TV first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale, could be more efficient and get higher-performing results than sending a pre-baked (4K upscaled to) 8K signal from the Nvidia GPU at lower Hz. Unless they started putting Nvidia upscaling hardware on gaming monitors/TVs someday, perhaps.

  • @2Burgers_1Pizza
    @2Burgers_1Pizza Před 5 měsíci +4

    I'd be curious to see how it handles high pattern noise situations. Things like finely knit clothing that can throw off digital sensors and cause aliasing where there'd be none with analogue recording.

  • @tridentd5898
    @tridentd5898 Před 5 měsíci +54

    Waiting for the lg g4 review

  • @alim998097
    @alim998097 Před 4 měsíci +1

    As the debate continues to rage over James Cameron's decision to use Park Road Post Production and their machine-learning algorithm, it dawned on me that one day this technology will trickle down into consumer products, either integrated or standalone depending on functionality, capability and price point. There are quite a few people who are fans of the final 4K physical media product for Aliens, The Abyss and True Lies. This trend may gradually become the norm for some people, and as Vincent pointed out, "Do you care if the upscaled detail was present in the original source or generated by AI?" I would argue that the typical consumer and non-enthusiast of films will not care, just like they don't care whether something is streaming or not in order to get the best audio and video quality. The enthusiast details mean very little to them; all they care about is whether it looks good or not. I still remember a time when enthusiasts were moaning that demonstration panels were set to vivid or high-frame-rate interpolation to get people's attention. AI upscaling is that, except on steroids, which sadly may become a more common technology applied to entertainment images.

  • @tobiastessmer5125
    @tobiastessmer5125 Před 5 měsíci +4

    So now Samsung reads Text from my screen to send home and build a better profile by also knowing which games I like? Great! Where’s the off switch?

  • @Cloxxki
    @Cloxxki Před 3 měsíci

    The upscaling aspect of 8K TVs seems to be the most underreported tech that's already in stores. The salesman in my local store seemed quite unaware of it, and was utterly unable to help me to a demo for it.
    So many great old films in 1080p or 720p... how do they come out? Fuzzy documentaries... if it's ANY better, I'm here for it!

  • @SlotHits777
    @SlotHits777 Před 5 měsíci +5

    How well does it upscale DVD 480 to 1080 & 4K?

    • @frankiepoindexter445
      @frankiepoindexter445 Před 5 měsíci

      That's what I really want to know. I want to watch my Scarface DVD in 8k. Or maybe I could just buy the 4k version (if it exists) and see if it looks any better in 8k AI upscaling mode. Also, I'll need to afford the TV, which is something I'm working on.

    • @ditroia2777
      @ditroia2777 Před 4 měsíci

      I’ve downloaded 1080p AI upscaled rips of tv shows that only ever came out on DVD and it’s looks pretty good.

    • @SilverHolland
      @SilverHolland Před 2 měsíci +1

      @@ditroia2777 Oh cool, that's promising. Was that real-time upscaled though, or post-processed? If I can get an app for my 8K TV or PC that takes Miami Vice, Knight Rider and Airwolf and makes them 8K, that'd be pretty neat. Let alone music videos, if it can give that depth perception we see on the demo signal in TV stores. The 8K Samsung QLED 900C is just gorgeous with that demo. I want to see what it and the 900D do with 720-1080p media...

    • @ditroia2777
      @ditroia2777 Před 2 měsíci

      @@SilverHolland it was a torrent.

  • @user-tz7jj7bl6q
    @user-tz7jj7bl6q Před 5 měsíci +1

    Yep, it all starts here. Next time you say something like this it'll be "man, I didn't know AI could blow me and cook at the same time!..." and from there it's surely all downhill, boys.

  • @gjk2012
    @gjk2012 Před 5 měsíci +14

    I would be interested in seeing DVDs upscaled to 8K. That's the only time I'll use AI upscaling, and I'll be turning off the soap opera motion smoothing.

    • @Datenschutz_Datenschutz
      @Datenschutz_Datenschutz Před 5 měsíci +7

      It's really bad - there are many, many AI upscales from the pirates.

    • @gjk2012
      @gjk2012 Před 5 měsíci +5

      @@Datenschutz_Datenschutz If it's bad, I won't even bother.

    • @Tential1
      @Tential1 Před 5 měsíci +4

      @@gjk2012 It's not. Why would you trust a person whose source is looking at AI-upscale uploads from pirates... Lol... If you have an Nvidia GPU or Nvidia Shield, it will do AI upscaling and look amazing... Nvidia is the leader in AI... Why would you think they're bad at the technology they were literally first to invent? AI upscaling is something Nvidia has been doing since 2018. It's not new....

    • @User0815-OG
      @User0815-OG Před 5 měsíci +3

      @@Tential1 Nope. You really think AI upscaling in games is the same as in real-world, realistic movies? lol. Boy, computer games don't look realistic, and DLSS sometimes even creates an oil-paint-like image. So he's definitely right and you have no clue, obviously.

  • @SlashManEXE
    @SlashManEXE Před 5 měsíci +1

    I can’t imagine many scenarios using this, but one thing that would pique my interest is how retro games would look. The community has long experimented with different upscaling algorithms to find the best look for older systems on modern screens.

    • @SilverHolland
      @SilverHolland Před 2 měsíci +1

      Exactly what I was wondering today. Some have shown it, but always a zoomed-out view (triple face palm).
      Imagine an algo that turns 640p VGA racing games into 4K with lovely details taken from a dedicated database. Cranked up to 120 Hz, of course. Now what to do with the tinny car noises? :)
      In the time the 80s games needed to load, modern AI could scrape the internet for 1440p gameplay to develop bespoke real-time algos that overlay on top of the actual 640p gameplay. No need to upscale, just pick new pixels to substitute, loads of them :)

  • @nin74
    @nin74 Před 5 měsíci +5

    Whatever happened to reproducing the original source?

    • @alexatkin
      @alexatkin Před 5 měsíci +2

      Streaming services happened.

  • @EyesOfByes
    @EyesOfByes Před 5 měsíci +6

    New Sony reference monitor video soon?

  • @ToneTheOracle
    @ToneTheOracle Před 5 měsíci +4

    I'm happy with my Panasonic MZ2000.

  • @kooijbas
    @kooijbas Před 5 měsíci +1

    Yes, I care. These are all stopgaps/hacks to overcome flaws in display technology or bad source material. The solution is not to fake it, but to fix the problem: better source material and better display tech. I wish all these companies would put their R&D effort into fixing the root of the problem.

    • @DarkLordDeimos
      @DarkLordDeimos Před 5 měsíci +2

      While I don't disagree at the theoretical/utopian level, Samsung does not produce the streaming content from Netflix etc., distribute sports, or publish movies, so I don't know what you expect them to do other than improve whatever subpar content the TV receives. In an ideal world, all movies/shows would be available in 4K HDR on 4K Blu-rays, and sports would be shot in 4K at 120 fps. But in the real world, some people still buy DVDs, most people stream from highly compressed sources, and sports is only available in 720p/60 or 1080i/60. Your complaints should go to those who produce the content in the first place.

    • @SilverHolland
      @SilverHolland Před 2 měsíci

      I'll get on the phone with Federico Fellini and Walt Disney right away! Do you have quarters?

  • @markph69
    @markph69 Před měsícem

    Imagine AI upscaling your CCTV 360p footage to 4K. Saves lots of storage!

  • @TheMasterMakarov
    @TheMasterMakarov Před 5 měsíci +14

    We went through all the effort of getting them to include Filmmaker Mode,
    only for them to "add texture" that wasn't missing.
    Sounds like the people who put their TV's sharpness on max designed this.
    Really, I want my TV to show me what the creator wanted to show me, not a robot "with an oh so large data set" hallucinating grain on walls lmao

    • @absolutium
      @absolutium Před 2 měsíci

      That's not happening... the creator didn't watch a stream of video with 25% of the colour data of the captured scene.
      We aren't even getting 4:2:2 from Blu-ray; whatever we are watching on Netflix is maybe 30% of the creator's master.

  • @skywalkerranch
    @skywalkerranch Před 5 měsíci +2

    Great stuff. Thanks a lot for making this video.

  • @kristofferabild
    @kristofferabild Před 5 měsíci +135

    First thing I do on any TV is turn off all fake AI stuff. I want an accurate image, please.

    • @guadalupe8589
      @guadalupe8589 Před 5 měsíci +5

      AI doesn't make it more, "authentic"?

    • @Datenschutz_Datenschutz
      @Datenschutz_Datenschutz Před 5 měsíci +1

      @@guadalupe8589 lol

    • @tomblade
      @tomblade Před 5 měsíci +35

      I agree but for very poor quality material it could be useful in future tvs.

    • @0M0rty
      @0M0rty Před 5 měsíci +15

      ​@@tomblade I don't think so, very low res/quality source means the AI has less to work with and has to interpret more, leading to more mistakes and weirder results.

    • @Tential1
      @Tential1 Před 5 měsíci +27

      @@0M0rty No, it doesn't. You're theorycrafting rather than looking at the actual results. This isn't new tech, dude. Nvidia's artificial-intelligence upscaling has been here, and it's been tested. With low-resolution content it's like magic. Hardware Unboxed has shown that for video games, upscaling is better than native image quality. I know, it's weird. Why do you think Nvidia is worth 2.2 trill....

  • @samray3461
    @samray3461 Před 2 měsíci

    Yes, we need more pixels to enjoy this Kubrick.

  • @ms3862
    @ms3862 Před 5 měsíci +2

    Hey Vincent where is the G4 review?

  • @theheadsn
    @theheadsn Před 5 měsíci +1

    A great example of this that regular people can see now is the new 4K release of Aliens. James Cameron used the same kind of AI to upscale the film. If you've watched that movie over the years, the amount of clarity they were able to obtain in the new release is kinda crazy. People seem to either love it or hate it. I can see the benefit for sure.

    • @pewburrito
      @pewburrito Před 5 měsíci +5

      it added extra wrinkles to Ripley in some shots that makes her look like an elderly lady

    • @theheadsn
      @theheadsn Před 5 měsíci +1

      @@pewburrito Everything but faces it seems to do a very good job on. When faces are static they look great; movement is still hit or miss. It's the best current example, I think, of the path it's heading towards. I would say in another year or so, once it gets moving objects down, it'll be almost flawless. It's very uncanny. I think it will be great until people start using it as a replacement instead of an optional enhancement. It's like 48 fps and the "soap opera" effect it creates: when it wasn't intended, it sticks out, but when used properly, it doesn't throw you off as much. We have all gotten used to 24 fps and what recordings look like on a basic level; we're getting into enhancements that people outside of the enthusiast group will have a hard time accepting easily.
      How properly people use it will dictate the acceptance level.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci +1

      @@pewburrito give her a break, she was under a lot of stress

  • @paulmcnicholas1658
    @paulmcnicholas1658 Před 5 měsíci

    If the difference is imperceptible from 'normal' viewing distance (say 2m or more), then the end justifies the means with regard to AI enhancement and should benefit the viewer. Whether so remains to be seen. Your eloquence as ever in delivering the technical detail without pause in a digestible format would garner praise from Mr Spock, so long may you rock 🖖👏

  • @Case_
    @Case_ Před měsícem

    I can appreciate the technology, but, at the same time, I'm getting increasingly annoyed by all the AI fakery that seems to be the trend (not only) in visual processing. Whatever happened to simply showing the original unaltered content? I would say that it's fine as long as it can be disabled, except it doesn't really matter, because as soon as such a feature is readily available, it will inevitably become the new "normal" anyway. We can already see the effects of AI making up stuff in other areas, and it's very clear many people are unable to even notice when an AI "enhancement" is mangling images for them under the illusion of added detail or generating horribly unnatural translations or text.

  • @budala1969
    @budala1969 Před 5 měsíci

    I love the AI upscaling in Nvidia Shield. I can't wait to see what the new Shield delivers.

  • @maumentum
    @maumentum Před 5 měsíci +1

    So will these AI features only be released on the 2024 S90D & S95D QD-OLED televisions, or on their 2024 QLEDs as well? Will getting a discounted S90C set me further back than forward compared to my current mini-LED?

  • @craigprocter1232
    @craigprocter1232 Před 5 měsíci

    Generative video upscaling has been a thing on PCs and consoles for the past few years (i.e. adding generated pixels to, say, a 1080p image to upscale it to a 1440p image), and the next iteration is totally generated 'tween' frames to 'boost' frames per second. Having this built into TVs is no great surprise.

    • @DarkLordDeimos
      @DarkLordDeimos Před 5 měsíci +1

      The difference is that, if written properly, a game can provide instructions for the consoles/computers to know what is going to happen in future frames, whereas a TV is only receiving the information after the fact. However, unlike a game, where the user's responsiveness must be taken into account, limiting the time available for the frame generation to occur, if a TV were to delay its output by several frames (i.e. lag behind the input source) by creating a sort of frame buffer, a user wouldn't notice/care. Definitely interesting to say the least.

  • @wavetrex
    @wavetrex Před 4 měsíci

    I do this quite often with Topaz, but it's a very compute heavy process, barely doing a few frames per second on a high end GPU. I can't see how a dinky TV processor can do such thing in real time...

  • @Daniel189HLL
    @Daniel189HLL Před 5 měsíci +17

    Samsung could be half price and I still wouldn't ever buy one.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci +2

      Once you have seen how they consistently over-saturate and over-boost all of their colors by default on every model they make you can't unsee it, unfortunately.

    • @trappedintimesurroundedbye5477
      @trappedintimesurroundedbye5477 Před 4 měsíci +1

      Not to mention they don't have Dolby Vision.

    • @priyankanigam3315
      @priyankanigam3315 Před měsícem

      @@michaelbeckerman7532 Samsung colours are better than both LG and Sony; Sony looks too dim and natural, and LG is just sh8t.

  • @thenobleyouthofmadness55
    @thenobleyouthofmadness55 Před 5 měsíci +3

    @HDTVTest Do you think Samsung will match or even surpass Sony in the image-processing department courtesy of generative AI, from what you've seen at that event? And how does this generative AI upscaling tech stack up against Sony's Cognitive Processor XR upscaling?

    • @DisplayNAudioTech
      @DisplayNAudioTech Před 5 měsíci

      Cognitive intelligence is another way of saying AI. Like Apple saying "Spatial Audio" instead of Dolby Atmos to make you think it's something else. Sony always thinks they are the Apple of the TV world. NOT.

    • @DisplayNAudioTech
      @DisplayNAudioTech Před 5 měsíci

      LG used to throw around words like AI on everything way before Sony. It never worked properly, but they still used it for ages.

  • @Aid3nn33
    @Aid3nn33 Před 5 měsíci +12

    no AI in my image, I want raw, pure, 1:1 native image

    • @PlayNeth
      @PlayNeth Před 5 měsíci +1

      The only true answer.

    • @c0reying
      @c0reying Před 5 měsíci +9

      So you want a 1080p image to only cover a 25% box in the centre of your TV? A SD image to only be a tiny 5% window?
      Sounds unreasonable. No one does that. It has to be upscaled to fill the screen area somehow!

    • @Chasm9
      @Chasm9 Před 5 měsíci +4

      I'd like to show you a blind test. Play you an (advanced) upscaled image without you knowing, for a few days, then switch to the "raw, pure 1:1 native image" for a few days without you knowing. The findings would be interesting.
      I personally couldn't care less. If there's something that can enhance those crappy 1080p videos with poor bitrate on CZcams, or simply old videos, that's fantastic news.

    • @simonchasnovsky1835
      @simonchasnovsky1835 Před 5 měsíci +5

      You only THINK you want it because you've never seen a raw, pure, 1:1 native image. Everything low-res you consume on a phone, computer or TV gets algorithmically upscaled to the display resolution, because displaying a raw, pure 1:1 360p or 480p signal on a 4K display is absolutely horrible.
      The only displays capable of presenting low-res signals acceptably at high resolution were CRT TVs, since there aren't really "pixels", but instead a phosphor coating on a grid that sets the maximum resolution, and an electron gun whose properties allow it to display any resolution under that maximum clearly.
      Knowing all this:
      1) You want upscaling.
      2) You want AI upscaling (since it's the best upscaling).
      3) I know that you want it because you're a customer, and customers don't know what they want because they don't know anything about the products they're using.
      4) Everything will come with AI upscaling in 5 years, you won't be able to turn it off, and even if you manage to turn it off, you will come back to it because it's awesome.

    • @guadalupe8589
      @guadalupe8589 Před 5 měsíci

      @@simonchasnovsky1835 Is it ACTUAL AI or a sophisticated algorithm?

  • @chrispomplun1511
    @chrispomplun1511 Před 4 měsíci

    Will AI upscale to 8K and higher frame rates soon?
    The HFR Gemini Man looked great.

  • @vinsta76
    @vinsta76 Před 5 měsíci

    As someone who often uses Topaz AI to upscale old movies/TV shows, especially to remove noise (which it's really good at) and make them "watchable", as well as using DLSS in games on my PC, this sounds very interesting indeed.
    I'm sure the first couple of generations of the tech will have their faults, but it'll get better in time like all of the AI upscalers on PC have done. With 90-95% of terrestrial TV channels still being less than HD, this is the kind of stuff that would make me watch those channels again, as I pretty much avoid them now.
    Shame I've just ordered a new TV (S90C), so it's going to be another 8 years or so until I upgrade again, but at least it'll be mature and cheaper by then.

  • @gowfan357
    @gowfan357 Před 5 měsíci +1

    So basically a DLSS. Sounds great

  • @rolandrohde
    @rolandrohde Před 5 měsíci

    I like upscaling of course and think AI could probably do a good job. Two things though... most of the time, still images, even compressed ones or 1080p streaming, look absolutely fine, even on a large 4K screen at a relatively close viewing distance.
    Where things fall apart is when there is movement. Since Hollywood still isn't budging on its 24p fixation, what I need most to enjoy content on my TV is good motion interpolation. This has never been Samsung's strongest area though.
    Another little "worry" is that Samsung tends to have their own "vision" of what a good image is supposed to look like, and that tends to be over-sharpened, overly colourful and overly bright.

  • @teddyholiday8038
    @teddyholiday8038 Před 5 měsíci +13

    Doesn’t the LG G4 use AI upscaling?

    • @nimblegoat
      @nimblegoat Před 5 měsíci

      Not sure it's exactly the same; I think it's also for emphasising the main subject to make it more 3D. Good news: 5 years of updates, and it will pretty surely use the same Alpha 11 next year, so it will get more AI. I think it needs quite a bit of memory and needs to download model data, so for the best results it would need to scan the movie and download models.

  • @Neopumper666
    @Neopumper666 Před 5 měsíci

    I just have to say I LOVE that, for the first time in so long, someone talking about the new buzzword actually calls it what it is: a neural network.
    I really dislike how "AI" is used so freely when no real artificial intelligence exists yet.

  • @derekspecht6323
    @derekspecht6323 Před 5 měsíci

    There is AI in TVs, but remember, there is sometimes a catch-22 for it to actually work: resolution, chroma, frame rate, and which inputs it is available on and which not, along with copyright stuff.

  • @pamo5900
    @pamo5900 Před 5 měsíci +41

    Video on Nvidia SDR-to-HDR conversion when? So much better than this.

    • @MA-jz4yc
      @MA-jz4yc Před 5 měsíci +8

      Yes, I've been commenting this for a while. Upscaling using the full power of a dedicated Nvidia RTX graphics card has to be miles better than the integrated SoCs the TVs use. Plus they have experience with DLSS already.

    • @yadspi
      @yadspi Před 5 měsíci +13

      Yes, because everyone has a PC with an Nvidia card attached to their TV, so a channel focused on TVs should definitely do a video about that first /s

    • @transporter255
      @transporter255 Před 5 měsíci +1

      😂​@@yadspi

    • @MA-jz4yc
      @MA-jz4yc Před 5 měsíci +3

      @@yadspi The RTX upscaling already exists and can be used with any brand of TV, whereas what is mentioned in this video is just a demo for high end 8K TVs most people will never buy. Also many people have a TV attached to their PC rig and can take advantage of it right now.

    • @ToxMod
      @ToxMod Před 5 měsíci

      Keep your diapers on

  • @Radders1433
    @Radders1433 Před 5 měsíci

    How much I care is relative to how close I am to the screen and how big the screen is.

  • @darkmatter7274
    @darkmatter7274 Před 5 měsíci

    I've known about this for a while because, a while ago, they were talking about video game graphics that would be generated by the GPU. This would be more efficient than trying to render the details from local libraries. I believe it works in practice, but I don't think it's in any games yet. Generally I'm right, but I could be wrong about the details I've given, so be forgiving.
    When I read about this concept, I wondered when they would apply it to TVs to take a 4K video from 24 fps to 120 fps with AI-generated frames to smooth motion, making it look like it was filmed that way. That would be its best application in a TV. Making low-res content higher-res is nice, but far less important for improving the movie experience, I think. Not seeing the judder or soft images during motion would be a massive step forward.
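
    For a sense of what AI-generated frames have to improve on, the crudest way to go from 24 fps to 120 fps is to insert linear blends of the two surrounding frames, which smooths judder but smears anything that moves; motion-compensated (or AI) interpolation exists to avoid exactly that ghosting. A minimal, naive sketch with placeholder frame data:

        import numpy as np

        def blend_interpolate(frame_a, frame_b, factor=5):
            """Naive 24 -> 120 fps: insert (factor - 1) cross-fades per frame pair.

            Real motion interpolation estimates motion vectors and warps pixels
            along them instead of cross-fading, which is what avoids ghosting
            on moving objects.
            """
            frames = [frame_a]
            for i in range(1, factor):
                t = i / factor
                frames.append((1 - t) * frame_a + t * frame_b)
            return frames          # frame_b then starts the next group

        a = np.zeros((1080, 1920), dtype=np.float32)
        b = np.ones((1080, 1920), dtype=np.float32)
        out = blend_interpolate(a, b)
        print(len(out), out[2].mean())   # 5 frames per source frame, mid blend 0.4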

    • @jonbrandjes9024
      @jonbrandjes9024 Před 4 měsíci

      NVIDIA and Intel use DLSS and XeSS for upscaling their games from a lower resolution to a higher resolution. It not only upscales, it also makes the image more stable with a temporal pass in the upscaling. DLSS also has frame generation, which should technically double your framerate but in practice doesn't quite, because there is a slight overhead to using it. And Intel will soon be adding ExtraSS, which will work a little bit differently.

  • @Deffine
    @Deffine Před 5 měsíci +5

    Basically DLSS (deep learning super sampling) for TVs?

    • @User0815-OG
      @User0815-OG Před 5 měsíci +6

      Yeah, but on low-end trash hardware: SoCs with 5 W power consumption. So there is no GPU like an RTX 4080 etc. with the horsepower for real AI. And even real AI isn't the holy grail for movie productions; it doesn't look natural.

    • @od1sseas663
      @od1sseas663 Před 5 měsíci +3

      No. DLSS upscales by using previous-frame data, not generative AI.

    • @Tential1
      @Tential1 Před 5 měsíci +2

      @@od1sseas663 lol you have no idea what you're talking about.

    • @maou5025
      @maou5025 Před 5 měsíci +2

      @@Tential1 He is correct though. Generative AI is closer to a lookup table that finds the closest matching image.

    • @od1sseas663
      @od1sseas663 Před 5 měsíci +4

      @@Tential1 I hope you’re kidding. DLSS uses temporal data and motion vectors to upscale an image. Generative AI is a completely different technology. YOU have no idea what you’re talking about

  • @S.L.S-407
    @S.L.S-407 Před 5 měsíci +1

    Samsung may be the biggest company, but big does NOT necessarily mean better.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci +1

      Sadly though, millions of people out there every day are fooled into thinking it does. So, they blindly just go out and buy Samsung TVs thinking they are getting "the best TV". But, that's the power of good marketing for you! Spend enough millions on crafting a marketing message and you can just about fool all the people all the time, unfortunately.

    • @frankreynolds9930
      @frankreynolds9930 Před 5 měsíci

      @@michaelbeckerman7532 Different priorities. Most don't care. They don't like looking at dark, bland colors even if it's super accurate.

  • @nicki9356
    @nicki9356 Před 5 měsíci +1

    Anyone know if the AI settings on the LG C9 have any effect whatsoever?

    • @DACatface
      @DACatface Před 4 měsíci

      I would like to know too

  • @matthiasverbeure5973
    @matthiasverbeure5973 Před 5 měsíci

    I don’t know if this will works always good... but i hope that we can select than an option to stay a maximum with the original picture quality.
    Today with my 15 year old plasma i’m still waiting for a huge upgrade with a correct price. I hope i can still wait until csot injekt printed 75/77 inch rgb oleds or nanoled/qdel screens are on the market for 1600/1700 euro’s and less, i hope ces 2025 will be better than ces 2024. If i’m realistic one of the ces 2026 models will be my next hardware upgrade. So blackfriday 2026 or january 2027 i buy one. Hopefully my plasma works until that time. Cauze today the price gap between 55 inch qd oled and 75 inch mini led is to much. I think real native 4K will be supported a lot more in the next 4 years. Games on consoles works still today with 1440p and not 2160p. Next generation gameconsoles 2160p with 60 fps is possible.

  • @quantumdestroyer6039
    @quantumdestroyer6039 Před 5 měsíci +2

    This should have been posted on April 1st. What a joke of a video.

  • @tnutz777
    @tnutz777 Před 5 měsíci +1

    I have to say that in PC gaming, Nvidia uses its tech to upscale from 720p to 2160p in real time, and the result is often considered superior to native, believe it or not, because of the additional anti-aliasing. Therefore, it should be no problem to take a 1080p Blu-ray and create a superior 4K+ image. I wish Nvidia would make a Blu-ray player.

  • @PcGamingForFun
    @PcGamingForFun Před 5 měsíci

    Bravo, a step in the right direction. If only the AI could keep the motion resolution intact in fast-moving scenes and when panning the camera, it would be great.

  • @Simon_Hawkshaw
    @Simon_Hawkshaw Před 5 měsíci +3

    Thank you for this very interesting and enlightening information on picture quality enhancement. Your time and efforts to share this with us all are much appreciated.

  • @josemolinaro
    @josemolinaro Před 5 měsíci +3

    Pretty extraordinary stuff. You can also see how AI will be applied across all manner of technologies.
    Thanks for the in-depth information.

  • @spiral-blades752
    @spiral-blades752 Před 5 měsíci +1

    This was a great samsung ad. Thankfully I have no interest in auto AI game detection/image manipulation.

  • @User0815-OG
    @User0815-OG Před 5 měsíci +11

    like for HDTVtest support - but not for AI Upscaling in Movies.

  • @jasonschubert6828
    @jasonschubert6828 Před 5 měsíci +2

    Isn't this image-database-based upscaling just the same as what Sony introduced with the ZD9 8 years ago? Samsung seem to have a habit of copying Sony, adding "Quantum" to the name, and then pretending they invented it. They did this with QD technology too, even though Sony's Triluminos displays were introduced 2 years earlier, and now I have seen _many_ channels saying that Samsung were first.

    • @ameserich
      @ameserich Před 5 měsíci

      Isn't Triluminos just Sony's marketing for quantum dot enhancement film?
      Seriously, do you believe everything revolves around Sony? Are you a Sony fanboy or a Samsung hater?

    • @jasonschubert6828
      @jasonschubert6828 Před 5 měsíci

      @@ameserich Actually, I think Samsung TVs are generally very good, and I have one as well. Regardless, I am more concerned about people and companies pretending this type of technology is new, rather than the incremental improvement it probably is. Credit where credit is due.

    • @doicenti9033
      @doicenti9033 Před 5 měsíci

      Sony never made a quantum dot TV. Never. All Sonys use PFS phosphor panels. Samsung's low-end/mid-range use either PFS phosphor or broad red phosphor, and on QLED they use PFS phosphor and a quantum dot filter. Sony's marketing is still strong after many years of scamming and lack of innovation. People still think Sony makes good products, but in my opinion they make the worst TVs on the planet; even TCL and Hisense are better. Sony is always lazy, every year nothing new. They never fix their trash TVs. Are people still waiting for the VRR update on the A90J? Sony doesn't have LCD TVs without massive blooming that lights up the whole screen when subtitles appear; even mid-range Samsungs are much better than any Sony LCD flagship at suppressing blooming, even if the Samsung is edge-lit and the Sony is FALD or mini-LED.
      XG95 vs Q90R 6:27 czcams.com/video/HTaeZ70mzIk/video.html
      XH95 vs Q80T 2:00 czcams.com/video/2DaZE_VGytQ/video.html
      ZH8 8K vs Q950TS 8K 4:48 czcams.com/video/aM5EdZbTurM/video.html
      Sony has massive blooming but you don't care and you never will, because you were too busy going HAHAHAHAHAHAHHAHAHAHAHA

  • @rct8884
    @rct8884 Před 5 měsíci

    Thanks for your overview of Samsung's AI tech. I am OK with the use of AI for picture enhancement if its implementation shows improved image and motion processing. I am most curious whether the AI algorithm improves starfield scenes, one of the most challenging scenes for mini-LED TVs.

  • @closeben
    @closeben Před 5 měsíci +1

    It’s interesting technology, but it’s being deployed in the wrong place. They should give this technology to the creative industries so that for example a filmmaker can use the tool to enhance their image if they want to, and so that when you watch it at home you know that they have approved it. I would not ever want to watch a movie with AI added in afterwards. Filmmakers already hate motion smoothing, I can’t imagine they are going to be happy hearing people are watchign their movies at home with AI generative upscaling.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci

      Well, remember, these companies HAVE to keep coming up with the "cool new thing" every single year to convince people that they are getting something more this year than they got last year. They HAVE to do this also to (at least try to) convince people to UPGRADE, long before their old TV actually breaks or wears out. That's the way the entire tech industry stays in business. There is a gun to these tech companies' collective heads called "year-over-year revenue growth", and they will do whatever they possibly can to make sure those shareholders of theirs stay happy. Even if what they are telling you each year is about 80% marketing fluff, hype and pure BS.

  • @R1ckmister
    @R1ckmister Před 5 měsíci

    Excited to see where this goes in 5 years' time. I'll then be retiring my G3.

  • @niallarmstrong255
    @niallarmstrong255 Před 5 měsíci

    Pioneer kind of did this with their Fuga prototype years back? seems similar in some ways

  • @snatcher989
    @snatcher989 Před 5 měsíci +7

    G4 vs c4 vs G3 vs C3 vs A95L pleaseee 😅

    • @djbpresents9584
      @djbpresents9584 Před 5 měsíci +1

      The G4 is the best TV of the year.

    • @TheAntifluencer
      @TheAntifluencer Před 5 měsíci

      G3, C3 and C4 are all old news now.
      Top 3 this year are G4, A95L and S90D

    • @Bushwacked487
      @Bushwacked487 Před 5 měsíci

      @@djbpresents9584 says the guy who has definitely not seen every flagship TV this year

    • @djbpresents9584
      @djbpresents9584 Před 5 měsíci

      @@Bushwacked487 I've watched enough reviews putting the G4 as the best so far, even over way more costly ones. The processing LG is using this year, along with all the extra features (144 Hz and a whole suite of gaming features) and the detail and brightness in the picture, means the G4 just sits at the top so far. But wait around for it.

    • @teddyholiday8038
      @teddyholiday8038 Před 5 měsíci

      Spoiler alert
      G4 and A95L are the best
      We know this already

  • @gadgetvideos
    @gadgetvideos Před 5 měsíci

    I have been using Nvidia DLSS for PC games for a couple of years now. I think the technology is really awesome. Do not worry about the algorithm generating details that were not supposed to be there in the original material. The algorithm looks at the picture on a micro level, detects the patterns that are there in the original material, and scales them up to the target resolution in the most plausible way. Upscaling is always going to generate pixels that were not there in the source material; this technology just scales the image up in the smartest and most plausible way. It is far superior to any other upscaling method. It does not come up with whole elements that are not supposed to be there. But I know TV purists will always turn off features that alter the picture.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci

      The thing is, there are bandwidth limitations even using DSC (Display Stream Compression). So getting a high-Hz signal, for example a 240 Hz 4K 10-bit 4:4:4 (RGB) HDR signal, to the TV first over the existing port and cable limitations using DSC, and THEN using hardware on the TV to upscale, could be more efficient and get higher-performing results than sending a pre-baked (4K upscaled to) 8K signal from the Nvidia GPU at lower Hz. Unless they started putting Nvidia upscaling hardware on gaming monitors/TVs someday, perhaps.
      So at least for PC gaming at high Hz, AI upscaling hardware on a gaming TV could be very useful.

  • @roary4092
    @roary4092 Před 5 měsíci +1

    Some of these features seem useful but I don't need or want most of them.

  • @michaelbeckerman7532
    @michaelbeckerman7532 Před 5 měsíci

    I'm not sure I see the big value-add here. This is essentially the exact same thing that both Sony and Panasonic have already been doing for years with their processors, with just an AI label thrown on top of it, because it is 2024 and everyone feels they need the word "AI" on everything now, of course.

  • @tnutz777
    @tnutz777 Před 5 měsíci +1

    I'm not too excited by different game modes on the Samsung. Create an ideal image and it will be ideal for all genres. Some genres have a tendency to be more demanding, and whatever benefits them will benefit others by default.
    What manufacturers need to do is focus on a quality implementation, not pointless and often substandard gimmicks. TVs do not need more settings. They need to do what they are supposed to do well.

    • @s4nder86
      @s4nder86 Před 3 měsíci

      Samsung is all about bloat and bling, it's what they do.

  • @sj460162
    @sj460162 Před 5 měsíci

    The upscaling on my Samsung QN95C is staggeringly good. I fail to see any TV cleaning it up better!!!!

  • @mrlopez7173
    @mrlopez7173 Před 5 měsíci +3

    Sounds good, but Samsung oversharpens the image on their TVs.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci +1

      And don't forget their laughable over-boosting of all their colors. 2024 now and they STILL can't let go of that. What the hell are they afraid of?

    • @frankreynolds9930
      @frankreynolds9930 Před 5 měsíci +1

      ​@@michaelbeckerman7532Maybe because their general audience prefers it. No wonder they are still market leaders.

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci

      @@frankreynolds9930 no, they don't prefer it actually...they just don't know any better. But, that's what they end up with when they just blindly follow the crowd instead of actually doing their homework before making a purchase. Sadly, millions do exactly that.

  • @brucethen
    @brucethen Před 5 měsíci

    Having seen reviews of Nvidia DLSS, it is on its 3rd generation, and while it looks very good, there are still ghosting and shimmering problems at times.

  • @AchtungEnglander
    @AchtungEnglander Před 5 měsíci +1

    When the tech can transform SD into HD, I'm in. Apart from that, native 4K is good enough; it does not need to be any sharper. Tests have shown that people prefer film grain in a film, to give it that "film" look. I would agree.

  • @earthoid
    @earthoid Před 4 měsíci

    I would not pay extra for the AI feature at this time. I already turn off all motion and image enhancing features on my Sony but I'm open to new technology making image enhancement better. I'll wait and see.

  • @SpykerSpeed
    @SpykerSpeed Před 5 měsíci

    This is going to be incredible once applied to cameras, too. AI will be able to make a phone camera picture look like a DSLR camera.

    • @alexatkin
      @alexatkin Před 5 měsíci

      High-end phones already do this, and sadly I really don't see it getting close to a DSLR.
      AI relies on there being some hints of the missing details still left in the image (subtle changes in the colours of the pixels); tiny sensors just don't pick up those details at all, which is why skin usually looks flat and lacking any texture, red hair usually looks brown, and any subtle changes in the tone of hair in general are lost.
      Smartphone photos only look good on those tiny screens where we can't see that all the detail is missing.
      Also remember that as we make tiny sensors more light-sensitive to try to compensate for their size, we can also do the same with bigger sensors. So inherently the gap between the two doesn't really close that much.

  • @tiagoj8020
    @tiagoj8020 Před 5 měsíci

    I would think there would be lag and slow response if this is on.

  • @GameSack
    @GameSack Před 5 měsíci +2

    AI can miss me. The latest James Cameron 4K Blu-rays use AI upscaling from 1080p, and they look HORRID. Also, with Topaz Video AI it's very tough to achieve believable results; it just looks bad. Just scan the film at a higher resolution to begin with. I'd rather have an interpolated low-res image than a high-res one where a computer is guessing (usually poorly) what should fill in the blanks. AI upscaling does have its uses, but really only in very small steps.

  • @georgewashington3012
    @georgewashington3012 Před 5 měsíci +2

    I’ll pass. I want true to source, not fakery. That’s also why I hate “filters” on pics.

    • @guadalupe8589
      @guadalupe8589 Před 5 měsíci

      All images on any device are a recreation of reality. In a sense, it's all "fakery". FYI, upscaling is on any device you watch video on.

    • @nathanddrews
      @nathanddrews Před 5 měsíci

      Me too, I prefer to view the raw binary file structure. 😂

  • @TheAntifluencer
    @TheAntifluencer Před 5 měsíci +5

    G4 vs A95L vs S90D is the battle that matters. The S90D is the better, sharper-looking TV over the S95D, which is ridiculous!

    • @Jeroenneman
      @Jeroenneman Před 5 měsíci

      Why is the S90D sharper?

    • @griffithdidntdonutin8942
      @griffithdidntdonutin8942 Před 5 měsíci +3

      @@Jeroenneman The S95D's anti-glare panel causes a pretty noticeable difference in contrast.

    • @stevenhocking6407
      @stevenhocking6407 Před 5 měsíci +3

      And the S90D doesn't have the One Connect box, which is also a big win over the S95D.

  • @MrHallonz
    @MrHallonz Před 5 měsíci +1

    So many things in one video I never thought I would hear from a ”purist” like Vincent. To comment on just one of them, 4:35, if I can’t notice or care about upscaling from 2m away then why bother anyways? Or to continue that logic then might as well max out the sharpening, deblocking and all other effects that the ”experts” always tell consumers to turn off? Sounds like marketing, marketing, marketing….”AI upscaling” is nothing new….

    • @michaelbeckerman7532
      @michaelbeckerman7532 Před 5 měsíci

      Exactly. Companies like Sony and Panasonic have already been doing this exact same thing in their processors for YEARS now. No different. They just put a cool new label on it. That's all.

  • @JimElford
    @JimElford Před 2 měsíci

    Call me a purist or boring, but I much prefer standard linear upscaling, without any artificial gap-filling. What if someone were to hack its data set/search algorithms? Couldn't that lead to wildly inaccurate/inappropriate results? No thanks.

  • @AQuantumCraig
    @AQuantumCraig Před 5 měsíci

    So it invents stuff/details that aren't there. An absolute nightmare for a filmmaker.

  • @TheCynicalAutist
    @TheCynicalAutist Před 5 měsíci +9

    I'll be honest, it's very interesting to see new technology being developed, but I'm too much of a purist to enjoy AI upscales. I'd rather have my old video sources just be scaled up to the display.

    • @User9681e
      @User9681e Před 5 měsíci +2

      Many 4K movies are upscaled by AI.

    • @TheCynicalAutist
      @TheCynicalAutist Před 5 měsíci +1

      @@User9681e And I hate those transfers.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci +3

      @@User9681e Most movie production is enhanced by software. TV scaling is also not "pure". Streaming uses dynamic compression rates too, which rely on tricks; compressed MKVs etc. are also not necessarily pure transfers, just "visually lossless", like DSC. Then there is the fact that HDR compression and different HDR implementations vary a lot in what they deliver across different TV models and TV technologies. There isn't some ultimate pure signal displayed on a consumer display; it's about getting the best you can by using hacks/tricks/workarounds to mask or ameliorate the limitations of different display techs.

    • @PierrickTranouez
      @PierrickTranouez Před 5 měsíci +1

      And which algorithm has your preference to « just scale up » your source ?
      en.wikipedia.org/wiki/Comparison_gallery_of_image_scaling_algorithms

    • @BeginsWithTheEnd
      @BeginsWithTheEnd Před 5 měsíci +1

      The thing is, whether it's marketed as "AI" or not, anything that isn't already 4K is being upscaled by your TV using some kind of algorithm; otherwise it would be using what is called "nearest neighbour upscaling", which just stretches the pixels to be big enough to fill your screen without any kind of filter, making it look really blocky.
      "AI" is really just a marketing term for a combination of upscaling techniques and post-processing effects, and maybe some content recognition via a pattern-analysis algorithm. It's maybe doing some extra things to try to guess what kind of content you're viewing, but that's about it.
      The goal is just so that you don't have to manually adjust image-enhancement parameters yourself, and so it can do it automatically in real time.
      But yes, if you want the most accurate image possible, you're going to stick with the basic built-in upscaling and not want all the extra post-processing that takes too many liberties with how it handles colour and detail; a simple upscaling method ensures it's just filtering the pixels enough so the image is smooth and natural-looking rather than pixelated and blocky.
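
      The difference described above is easy to see for yourself with any image library. A quick sketch using Pillow (9.1 or newer for the Resampling enum; the file name is a placeholder, any low-res frame grab will do):

          from PIL import Image

          src = Image.open("dvd_frame_720x480.png")      # placeholder path
          target = (3840, 2160)

          # Blocky: each source pixel just becomes a big rectangle.
          nearest = src.resize(target, resample=Image.Resampling.NEAREST)

          # Roughly what a basic scaler does: smooth interpolation, no invented detail.
          bicubic = src.resize(target, resample=Image.Resampling.BICUBIC)
          lanczos = src.resize(target, resample=Image.Resampling.LANCZOS)

          for name, img in [("nearest", nearest), ("bicubic", bicubic), ("lanczos", lanczos)]:
              img.save(f"upscaled_{name}.png")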

  • @redrum_2k
    @redrum_2k Před 5 měsíci

    I certainly don’t want to have imagined details in my home cinema experience. For example, I want to see the actors‘ faces like they were captured by the camera, not how an AI computed them to be “improved“. It‘s interesting from a technological pov, but I would leave such a setting turned off for the most part. James Cameron just butchered his own movies by throwing AI tools at them for the new 4K versions and unfortunately for those versions we don‘t get an “off“ button.

  • @Sekir80
    @Sekir80 Před 5 měsíci

    So this is like Topaz AI built into a TV. Interesting. Should I wait for teams to improve old shows like DS9, or should I just buy this TV? :D

  • @cedarstuff
    @cedarstuff Před 5 měsíci +9

    If the creators didn't put the information in the image, I don't want some engineer guessing at what should be in there.

    • @icanbelievemyeyez4653
      @icanbelievemyeyez4653 Před 5 měsíci

      Yup! Agreed :)

    • @ThisisCitrus
      @ThisisCitrus Před 5 měsíci +5

      It's not the engineers, it's AI. Also, the creator's intent isn't always the best-looking.

    • @junior-OG
      @junior-OG Před 5 měsíci

      fact

    • @ruben7937
      @ruben7937 Před 5 měsíci

      The problem is that the hardware (your TV) is downgraded, cheap hardware... so it will NEVER show what the director intended.
      That's why the TV producers introduce AI and software, to circumvent how extremely poor the limits of their TV hardware actually are.
      Low bit depth, too few light bulbs, too many colour banding issues, etc., etc.
      It's ridiculous how TV hardware producers have gotten away with continuously poor, low-quality hardware.

    • @frankreynolds9930
      @frankreynolds9930 Před 5 měsíci

      Well, movie makers use reference monitors that cost 40k. How are TV producers supposed to make cheap TVs then?

  • @RobHFirstReflect
    @RobHFirstReflect Před 5 měsíci

    Awesome, I can't wait for every character on The Simpsons to have 7 1/2 fingers on each hand now!

  • @twangology
    @twangology Před 5 měsíci +4

    I don't want any AI messing with my movies, is there a way to disable the AI function?

    • @guadalupe8589
      @guadalupe8589 Před 5 měsíci

      You say that, for now....

    • @twangology
      @twangology Před 5 měsíci +1

      @@guadalupe8589 I want un-adulterated image quality, as soon as AI is introduced it becomes fake.

    • @Datenschutz_Datenschutz
      @Datenschutz_Datenschutz Před 5 měsíci +2

      dont buy Samsung.

    • @twangology
      @twangology Před 5 měsíci +1

      @@Datenschutz_Datenschutz I will never buy a Samsung TV - Always LG

    • @georgejones5019
      @georgejones5019 Před 5 měsíci +2

      You can turn it off on LGs.

  • @ryankossick5791
    @ryankossick5791 Před 5 měsíci

    Unless this AI detail enhancer has nearly none of the detail-position continuity problems that I see in just about every AI video, I can't imagine it getting too popular.

  • @Untilitpases
    @Untilitpases Před 5 měsíci +1

    AI calibration would be the best thing they could offer; alas, they don't care.

  • @rem9882
    @rem9882 Před 5 měsíci

    I heard that Samsung and Qualcomm have made a deal with AMD to work together. Could Samsung be getting a bit of help from AMD by seeing how their FSR works and implementing their own version in their TVs?

  • @SickPrid3
    @SickPrid3 Před 5 měsíci

    I do not like the idea that the TV will show me something the director of a movie did not intend to be there.
    Upscaling - yes.
    Generative content - no.

  • @lonniefarmer7067
    @lonniefarmer7067 Před 4 měsíci

    Thanks!

  • @patrickknapp4588
    @patrickknapp4588 Před 5 měsíci +4

    Vincent when will the test for the G4 come?

  • @normrubio
    @normrubio Před 5 měsíci

    But if you have two tvs, playing the same image, using the AI, would the output be different?

  • @bjorn6706
    @bjorn6706 Před 5 měsíci +1

    Will this AI chip be in the new Samsung OLEDs too?

    • @babbagebrassworks4278
      @babbagebrassworks4278 Před 5 měsíci

      Depends on which CPU they use for Smart TV functions, AMD, Intel, Qualcomm, Rockchip all have new CPUs with NPU cores.

  • @AndrewJohnWard
    @AndrewJohnWard Před 5 měsíci

    Is the C4 a big downgrade without MLA? Do you think next year's C5 in 48 inches will have MLA? Is it true or false that the lack of Dolby Vision support doesn't matter on a Samsung QD-OLED?

    • @olgil
      @olgil Před 4 měsíci

      The C4 is the right choice for most people. The G4 is super bright, especially in specular highlights, but probably not worth the price difference.
      The C5 likely won't have MLA in any size unless the G5 makes a significant leap in technology, which I doubt.
      Dolby Vision maybe won't matter to a lot of people, but it's personally something I wouldn't buy a TV without. Dolby Vision content can be a bit hit or miss, but when it hits, it's stunning.

  • @Logeekistics
    @Logeekistics Před 5 měsíci

    There's too much film grain in movies nowadays, and 4K doesn't even look that sharp anyway, since it's fake 4K so as not to expose the pores on actors' faces. The trade-off is people not being able to enjoy a 4K TV because producers of certain content compromise. For example, you see a demo of what the TV can do and it's impressive, with pop and 3D picture quality (PQ), but then even regular 4K content doesn't have the same picture quality as a high-bitrate demo with no film grain.
    Also, for streaming like Netflix, they criminally force low bitrates and call it 4K, so again it doesn't even look that good. So as a fix for the crime of forcing low-bitrate content on streaming, AI helps. In fact, companies like Netflix should have their own data centers to help other companies improve their AI, as a form of penance or community service.

    • @elvnmagi9048
      @elvnmagi9048 Před 5 měsíci +1

      Yep. Also worth noting that streaming uses dynamic compression, so it varies over time and, in some cases, by server load (like Game of Thrones?). If you recorded the streamed versions at episode launch and then got the UHD set later, there would be a big quality difference throughout. Some of the best stuff is the UHD disc versions of Nat Geo and BBC nature documentaries, science/space stuff, etc.