This GPU is WIRELESS?! - Retro GTX 460 WHDI
- published May 5, 2024
- Visit www.squarespace.com/LTT and use offer code LTT for 10% off
Check out SignalRGB and sync all your RGB peripherals at bit.ly/LinusRGBTips
Why use HDMI and DisplayPort when you can go WIRELESS? This GPU technology existed over 10 years ago, but no one has used it since. Is it a hidden gem, or is there a reason it died on the vine?
Discuss on the forum: linustechtips.com/topic/14782...
Purchases made through some store links may provide some compensation to Linus Media Group.
► GET MERCH: lttstore.com
► SUPPORT US ON FLOATPLANE: www.floatplane.com/ltt
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
► PODCAST GEAR: lmg.gg/podcastgear
FOLLOW US
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
1:05 Old-school unboxing
3:51 The wireless receiver
7:04 This is what a wireless GPU looks like
9:44 Let's install it!
12:00 The moment of truth
13:55 This is why we're using an old monitor
14:55 They lied.
16:47 Time to play some games
18:06 Wireless USB too?
18:40 Competitive disadvantage
20:09 Video playback is... A thing
20:58 Conclusion
Category: Science & Technology
10 years is a long time for shipping... I wonder why it took so long?
COVID
north america is wide
Border Customs
@@crby101 true
its cuz the shipper was a dad so he had to get the milk
Props to the guy who kept that card in such good condition!
Yeah fr
And then I'm all, "well, is your GPU at least not busted, loosened solder and all?" about 6800 XT mining cards. Idk man, I've seen some real tech-pro-tier stuff on the used market, with all the original materials, even the sticker for the CPU and such, and then you get those really weird guys who literally didn't clean the Cheeto grease off their laptop and GPU. Guys, please be like this guy. Keep your materials safe for resale.
*CORRECTION* 17:30 - That jitter isn't due to the GPU/WHDI but to Left 4 Dead 2: the Source engine's physics actually move you and your AI-controlled teammates in an elevator. If you pay attention to the HUD elements in the bottom left or your weapons/items in the middle right, there is no jitter. Cool bit of outmoded tech, though! Thanks LTT for covering this!
Should pin this if they ever look at old videos comments
I was looking for this :D
@@davidwatkins9545 their ego won't let 'em
gay
@@ultrae you aint wrong
“Sry guys my gpu disconnected”
Being an engineer who worked on this and seeing it turn on in an instant when everyone is doubtful must be one of the best feelings in the world
8
@@memerified inch
@@mreli_ deep
@@terra3673😈
@@terra3673 In
the "jitter" in the elevator scene is actually an artifact of the source engine. They always jitter during elevator scenes because of how they didn't attach players to moving platforms, so they're falling and hitting the ground ~30x/second (the tickrate for L4D2)
Yup, been looking for people pointing that out in the comments. Not their obligation to know that, but still is an "unfair" call to the GPU (not that it could be considered good anyways, given the other problems it has lol).
Would explain why if you "take a break" in L4D2 while in an elevator will sometimes make you clip through, and fall to the ground underneath it.
@@El_Mouse that doesn't happen anymore, for well over a year already
@@dirtypink3197 Haven't played L4D2 in a long time, but the last time I played the 360 version it was a buggy mess*. The last update completely ruined it. The PC version still worked great the last time I played it. Never paid enough attention in the elevator to notice jittering (usually too busy healing up and prepping for the next section, especially when playing with randos or NPCs that like to disturb the Witch).
*A lot of falling through floors, the elevator just didn't open half the time or levels would otherwise be uncompletable, Rochelle or Zoey would end up in a weird slanted state or constantly falling down and getting up again, and voting didn't work: it would either pass the vote no matter what or fail it no matter what (whichever it felt like).
@@dirtypink3197 A year you say, in a 15 year old game?
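The behaviour described in this thread (characters not parented to moving platforms, so they repeatedly fall and land once per tick) can be sketched with a toy simulation. This is my own illustration; the elevator speed and gravity value are made up, and only the 30-per-second tick rate matches what the thread says about L4D2:

```python
# Toy model of un-parented platform physics: the character is not
# attached to the descending elevator, so each tick it free-falls
# after the floor and is snapped back onto it on contact.
TICKRATE = 30          # physics ticks per second
DT = 1.0 / TICKRATE
GRAVITY = 9.8          # m/s^2 (illustrative)
ELEV_SPEED = 1.0       # m/s, elevator descending (illustrative)

def simulate(ticks):
    elev_y = 0.0       # elevator floor height
    char_y = 0.0       # character height
    char_vy = 0.0      # character vertical velocity
    gaps = []          # visible gap between character and floor
    for _ in range(ticks):
        elev_y -= ELEV_SPEED * DT    # floor drops away continuously
        char_vy -= GRAVITY * DT      # character free-falls after it
        char_y += char_vy * DT
        if char_y <= elev_y:         # landed: snap back to the floor
            char_y = elev_y
            char_vy = 0.0
        gaps.append(char_y - elev_y)
    return gaps

gaps = simulate(90)
# The gap oscillates between 0 and a few centimetres every handful
# of ticks: the character visibly "bounces" during the whole ride.
```

The repeated snap-to-floor is the jitter viewers see; at render framerates above the tick rate the bounce becomes even more noticeable, as the comment at 17:35 points out.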
This definitely had its use in audio industry, especially with audio mixing studios for movies! There were strict regulations of noise levels, so all of the audio equipment including PC's were in different rooms
So.. why not run a cable from the other room?
@@xx1simon1xx It is done by cables; that's still the most reliable connection! This is just one step closer to perfect sound isolation, and a fairly safe option, since if it stops working you just replace it. The cables usually run in a network through the walls (which are acoustically treated), so if one breaks it's expensive to replace.
you can just get a long ass cable 😆
A couple things:
L4D2 is almost always like that, especially with elevators, due to how the engine works out all of the collisions for the models. Also, with the Crab Rave video, you can still see a lot of compression with a 3080, both at 1080p and 4K. The only good way to show it was CS:GO.
With that said, 5GHz Wi-Fi is used a lot more now than when this card was released; that might also be affecting the latency, and could be causing at least some of the artifacting due to the now busy/noisy band.
Only Linus would put a 12900K and GTX 460 together
And Windows 10 no less.
Should he buy a Phenom II just for benchmarking?
@@lurch1539gt 210: *you underestimate my powers*
@Lurch Well yeah.... XOC guys just need an output and often prefer compact cards that can run entirely from the PCIE slot.
I disagree
Coach jittering is caused by the engine working our collisions as the models all travel down the elevator. Happens every time
Absolutely. Was looking for someone who had commented that. L4D2 is a great game but one hell of a glitchy/buggy mess, especially in elevators lol. (Still, the GPU ain't no good.)
really? it also looks like bob deinterlacing method, where the interlaced fields just rapidly switch
@@cURLybOi Around 14:53 you can hear Jake say that he can see interlacing on the screen capture but Linus said he couldn't see it on the actual display. There seems to be something off with the display they were using for testing. They have a weird setup using a special screen that has HDMI passthrough. A normal capture setup is video output -> capture card -> passthrough to monitor. The setup in this video is video output -> monitor that has HDMI out -> passthrough to capture card. I think the monitor is causing interlacing on its output to the capture card as well as the "jelly" effect that they saw around 15:18. The jelly effect isn't visible on the captured output which means it's likely an artifact of the LCD panel itself rather than the video signal.
It would have been nice if they tested with another screen to verify but I also understand why they didn't devote any more time to troubleshooting strange video artifacts on an obsolete video card from 12 years ago.
Or maybe, the jitter is to simulate the fact that elevators aren't 100% smooth?
@@2slogan yes yes of course 😂😂 that's what the devs tell you
Linus do more videos like this, your genuine enthusiasm really sells it
I had an LG 55" wireless LCD TV that used 60GHz instead of 5GHz. It was able to produce a true uncompressed 1080p image. It had a separate box you plugged everything into, and the box beamed the image wirelessly to the TV. The TV panel itself only had a single HDMI input as a backup.
It worked great in my 1 bedroom apartment. I had the input box in my bedroom, where my PC and PS3 were hooked up, and then the 55" panel in my living room. I gamed for a couple years like that.
In the bedroom, I had an HDMI switch where I could switch between my bedroom TV and the living room TV. It was awesome how in 2009 I was able to pause a game on my ps3 in my bedroom, and then continue exactly where I left off in the living room.
I wish there were more modern solutions to allow having a single spot for all your devices, but then beaming them to any TV in the house. Fortunately now I am able to afford a PS5 and gaming PC for any room I want to play in, but it would be cool to just have a single PS5 and PC I could hook up once but be able to play them anywhere in the house.
The reason for all of this jitter and the weird 1080i vs. 1080p business is NOT that the receiver is deinterlacing. The GPU is transmitting at 1080i and the receiver is receiving at 1080i. The MONITOR is the one deinterlacing. It's using a bob deinterlacing strategy, which means it takes every field, stretches it to full height, and displays each now-resized field. So every odd field (every 2nd field, now a full-height frame) appears displaced 1px downward.
Is that not exactly what they said?
@@anunoriginaljoke5870 He's posted this after watching the video to make himself look knowledgeable and clever .....
@@anunoriginaljoke5870 no because the receiver isn't what's deinterlacing
@@anunoriginaljoke5870 No, they said it was the receiver... The monitor interlacing never even came up in the video.
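Bob deinterlacing as described in this thread can be sketched in a few lines. This is a toy illustration (not any monitor's actual firmware): each half-height field is line-doubled back to full height, so frames built from the odd field come out shifted by one source row relative to frames built from the even field, which is the 1px bounce.

```python
def split_fields(frame):
    # Interlaced transmission: even rows go in one field,
    # odd rows in the other.
    return frame[0::2], frame[1::2]

def bob(field):
    # Bob deinterlacing: stretch a half-height field back to full
    # height by repeating every line.
    out = []
    for row in field:
        out += [row, row]
    return out

frame = list(range(8))               # row numbers stand in for pixel rows
even, odd = split_fields(frame)
f_even, f_odd = bob(even), bob(odd)  # the two frames the monitor shows
# Every screen line differs by exactly one source row between the two
# displayed frames, so the whole image appears to bounce by one line.
shift = [b - a for a, b in zip(f_even, f_odd)]
```

Running this, `f_even` is `[0, 0, 2, 2, 4, 4, 6, 6]` and `f_odd` is `[1, 1, 3, 3, 5, 5, 7, 7]`: same height, uniformly offset by one row, alternating 60 times a second.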
can i connect this gpu to a quest 2?
Great. Nvidia will now release a RTX 4090 wireless for $4500
45000$
4.5 x 10⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹⁹$
4085 ti*
$4090
@@m4heshd a Dollar per Mhz sounds like something they would try to pull off and justify.
I had a simpler version of this that just transmitted video over 2.4GHz Wi-Fi, and it produced so much wireless noise that no other devices could connect to my router. You mention this one is 5GHz, so you could also check whether it impacts wireless performance for other devices.
A while back my brother owned a HDMI wireless transmitter and receiver combo. The transmitter box device is where the HDMI devices plug into and it wirelessly transmits the signal to a receiver box which connects to TV via HDMI cable. According to my brother it worked pretty good when outputting his PlayStation 4 to the TV in his living room. Though he lived in a small apartment at the time.
5GHz Wi-Fi has also become hugely more common since those days; I wonder if that could be causing the artifacting?
Read more above comments, it's the monitor and them not actually understanding what they are talking about, neither of them are experts.
@@kiloneie definitely done "by amateurs". I understand that they want to have fun while making a video, but most of us are interested in the technical details and quality testing of such a card - because this is the only opportunity to see something like this...
@@goranjosic my thoughts exactly
@@goranjosic dude they literally discontinued any work on wireless video output after that, its not worth it for consumers, its too much work and too pricey. its not that they are experts, but the stream quality for sure sucked ass.
@@goranjosic nobody watches LTT for technical details lmao
The jitter in 17:35 is because of the elevators being "real" in L4D2 and the engine struggles to keep the models on the elevator floor when its moving.
yes
Correct, I suspect that will be edited out once they realise :)
Now I wonder how this card would perform for both DVD and Blu Ray playback. Imagine if it was used in a theater setup, and the latency shown persisted throughout entire movies.
This is quite impressive. As someone who had multiple 480s, 470s at the time and keeping up with tech, I don’t ever remember hearing about this product. Kinda reminds me of my dell widi dock that couldn’t even penetrate the other room even though it was advertised as. Range test would have been interesting.
You missed out on an opportunity to use that HDMI cable tester you guys have, that is if you still have it lol
OMG You are so right. That would have been interesting.
I tought that too.
You guys should have gone into the Nvidia Control Panel to set 1080p. I think it's defaulting to 1080i as a safe resolution because 1080i monitors were a thing back then.
I agree with this. I used a 32-inch Samsung TV, and it defaulted to 1080i when I set it to 1080, but you can force it to 1080p, and that's what I did and what I'm using right now.
Agreed, there were many problems in this video; many things weren't properly looked into or dug into more deeply. They quickly threw it aside as crap without checking whether the error was on their end, which it was. Wish it got a more thorough look.
It's 1080i because they're at 60 fps. If they ran 30 fps they could probably run 1080p
Thinking about it, for the time this made a lot of sense to me, given that airplay wasn't a thing and running cables to your big screen from a desktop setup is probably something most didn't wanna do.
I love seeing these unboxings of old tech. It could have been interesting to test the 100 ft. range and test it through walls.
At 15:08 Linus says “1080p-ness” I think even HE didn’t see the pun
@linus
Bro
There you are. The second I heard him say that I went to the comment section. I knew I wasn't the only one! LOL
It's funny that at the time someone would probably say "this card is huge" but compared to the stuff today it looks tiny.
I have a pretty small graphics card and it's about the same length as this 💀
But it was really good for 60fps gaming at medium settings back in 2011
My 4080 is almost as big as my keyboard lol
Not really, the 260 was a real monster though
I have a GPU with a similar cooler, the Galaxy (影馳) overseas-edition GTX 550 Ti HQV (but my GPU only has 1 DVI and 1 DP); this tall boi is even longer than the GIGABYTE R9 270 WindForce 2X OC
Love these dives into old obscure tech that not many people got to play with, I vote you look at the Ageia Physx cards that were around for a little while with ASUS / BFG and Ghost Recon (few other games too) before Nvidia purchased the company and tech.
Really interesting review! After reading many great ideas in the comments of this video, I hope to see a Round 2 of this product's review. Thanks a lot guys! 😃👍
18:35 "I Have no idea where we would find drivers for something like that at this point"
Maybe in the CD that came with the box that has the drivers for it?
this old style LTT video is actually way better than the newer ones... it feels raw and a lot more personal... not too many things going on on the screen
agreed
Huh, I'm not the only one who thinks this.
@@rikogarza1729 eh I kinda like em
@@rikogarza1729 They are a youtube channel, they need sponsors, if that's what you are talking about.
@@rikogarza1729 you want videos or not? Merch is not enough.
This is actually a very good idea for the time imo, disregarding performance. The idea of not needing to feed super long display cables for environments like large meeting rooms and/or eliminating the need for any external cable management is a great concept. Would also be good for standing desks since setting up display cables to be able to use the height adjustment is a pain in the butt, unless you’re fine with dangling and sagging cords. Obviously today the add on device is better, but i bet this inspired it. It seems almost like a proof of concept device to me.
Fun fact: WHDI was made by Amimon, which was acquired by Vitec (now Vivendum) and folded into their subsidiary, Teradek, which still produces new wireless devices today, albeit only for professional camera departments.
This video seems very rushed. Linus didn't test the range of a WIRELESS GPU; I was sure he would. He pointed out a flaw of the engine as a flaw of the product. He couldn't be bothered to check the wireless USB and didn't dig into the card's settings at all. It sucks, because it was such an interesting product. Take your time and make good, full videos. We will wait
I even would have liked to have seen them test it on multiple (or at least one other) monitor to verify that there was no image degradation on the (presumably old) monitor they were using in this video. I agree, they could have run more tests on this thing, even if it still made for an interesting video.
I would like to see Lazy Video Game Reviewer reviewing this.
It's 11 years old. Does it really need to be thoroughly tested? Lol
@@dnevill I think the point is, if you're going to test it and bother making a video, at least test it and do so properly instead of shit talking it immediately and ending on that.
@@MidnightMarrow he still thanks you for the view.
1:40 Ironically, it's basically what in-home streaming became. Kinda cool tech, just sad it didnt have a LAN port instead of just a wireless dongle
This probably transmits more data than a single gigabit LAN port.
the bots are glitching
1:40 Ironically, it's basically what in-home streaming became. Kinda cool tech, just sad it didnt have a LAN port instead of just a wireless dongle
@@TheWunder Ironically
@@thewiirocks, it's
Wow I remember seeing a review of that when I was looking up to buy a 460 back in those days. Wild! that is cool that he kept it
The antennas on the GPU and the receiver are not phased correctly. The GPU is intended to be in a case, so all 5 antennas would be vertical, not horizontal as on a test bench - vertical = omni and in phase with the antennas in the receiver.
17:45 Isn't this just the elevator vibrating in-game?
correct, it's been like that since the game launched, and any L4D2 enthusiast will know that's exactly what happens in the elevator scene
correct, i've played thousands of hours of l4d and can confirm this is a source engine problem, nothing related to the GPU
@@pirriu exactly
Lmg editors and writers clearly don't play l4d
Guru3D has a review of this (Wireless Graphics card - KFA2 GTX 460 WHDI review, page 4). It says it's detected as 1080i by default but can be changed to 1080p in the Nvidia Control Panel. KFA2 is a European brand of Galaxy.
another review needed?
The artifacts in the Crab Rave video are actually part of the video here on YT, and therefore not a problem with the card's compression. And compressing the stream is impossible while maintaining the low latency, because compressing an FHD stream would introduce 20-30 milliseconds of latency.
This needs to be a thing now that wireless tech has come so far. An adapter for the monitor and a wireless GPU would be amazing
I think the jitter you point out at 17:40 is actually just the game. Coach is moving up and down, and you are in an elevator. The game just isn't quite updating the characters vertical position and the player camera at the same time.
17:40 that jitter my friend is glorious source engine moving elevator with characters inside :D
I have a Galaxy 9800 gtx (?) that I bought way back when for my Vista machine that I believe is still installed there. I wasn't quite sure it was the same company until I saw that beautiful, blue PCB. Man, that takes me back.
Seems like those were the days for weird experiments. I actually had a Hydra motherboard at that time, one of those that offered cross platform multi GPU support. Worked about as well as this.
I'm surprised they didn't try moving the monitor at different distance to see how far it could go
Please keep doing more of these minimalist videos. It's not to say that your modern sets are bad or anything, but this is a very pleasant change of pace. :)
17:35 That's actually because the elevator is actually moving. The jitter on Coach is the game trying to keep him (and you) on the floor of the elevator, but it only updates at the tick rate (66 or 128 per second, I don't know the default for L4D2), so if you have more than 66 fps you'll notice it
You could have done a test on a test bench of that era, with a matching OS, rather than a newer system, simply because some issues might straight up be caused by the newer hardware/firmware (e.g. the "new" drivers) used in tandem with the GPU; it wouldn't be a first by any means, so that's a missed opportunity. Trying new cables (especially the HDMI one) would help verify whether the card really is having issues or not. I would assume the signal sent through the antennas has its own completely separate dedicated circuit, as it would be fairly stupid to allocate part of the GPU's performance just for the signal, so GPU load shouldn't affect the signal unless power issues come into play at full load.
There's gotta be a way to send a progressive signal, cuz those are all deinterlacing artifacts. I use a special deinterlacer for old consoles for similar reasons. Switching the resolution, either in the GPU drivers or in Windows, has gotta help, maybe to 900p or something. Or even the wireless card itself might have a setting. The interlacing is definitely half the issue, especially on an LCD panel.
I'd guess it's detecting the Monitor to only support 1080i and therefore sending an interlaced signal.
I agree, when I backup some old dvd I have the same awful issues, until I properly deinterlace them
Idk why this isn’t way more common nowadays. Would be nice if my computer could stream to other devices as easily as a modern phone can.
i have no problem bluetoothing my pc to my tv
Because it introduces constant latency, which can get worse depending on what you're doing; it could be fine for watching movies or just browing the web but you can forget about gaming
Windows supports something similar
Wireless displays exist
The answer is its more common than you think. Its built into windows, you just need to find a display that supports casting. Which is basically all smart TVs.
Windows supports Miracast if your wifi card supports it
I remember getting my first GPU for a desktop I had. My Grandpa took me to Staples, we paid for it, and we went home. At first I was a bit disappointed that it only had 1GB of VRAM, but it turned out great for the first few years I used it. Given the era, people today would be surprised how awesome it was just to have 1024MB of VRAM, compared to something like the 760MB I often had on laptops of that period, even years later.
If this had only come out half a decade later, this would have been an awesome addition to VR headsets
Imagine if this GPU existed today - you could literally put your PC anywhere
I know right! Like given how common stuff like Chromecast and streaming is, I'm surprised this isn't more common plus imagine an RTX 3090 with this
Even on the moon?
You can probably have the smallest pc ever or even laptops,
Maybe steam deck🤨🤨🤨🤨🤨
Nah im playing with myself
@@Gamerzsociaty Just use network streaming like moonlight for anywhere in your house. That's going to be much better than something like this. They've laid out pretty clearly why nobody makes stuff like this anymore (too expensive/tied to the GPU).
@@rikogarza1729 true, but considering how expensive GPUs are nowadays, it should just come included.
The Left 4 Dead vibrating character was just the game's physics freaking out because you were riding an elevator. The character physics jump up and down because characters on moving platforms are a mathematical nightmare.
10 years ago LG had a similar setup for some TV with a connection base that had a lot of inputs, and a wireless dongle, 1080p, wireless.
This is the kind of content I used to love LTT for.
Please go back to your roots more often!
Seeing a video formatted like this feels so nostalgic. Great job calling back to the feel of LTT of 10+ years ago. Hell, watching this in 360p almost makes me feel NCIX still exists... lol
HDCP gives me bad memories of when I bought a new HD TV about a decade ago. I tried to connect it to my laptop and use the TV as the only monitor, with the laptop's lid down and its screen off while the laptop kept running. Suddenly the TV screen turned green and fluorescent. I tried changing the HDMI cable but kept getting the same problem. Turns out HDCP doesn't work that well for consumers who try to follow the law, yet it's easy for criminals to circumvent. Just like every other copy protection, in other words.
can u explain?
Story of all DRM. A hassle for honest consumers, not a problem for pirates.
Btw Linus I love your G710 keyboard you use on the daily. I have one too and I never wanna get rid of it because its such a good keyboard
If they made this 10+ years ago, I’d love to say a take on it with more modern (and likely better) tech.
With VR being reasonably popular im surprised it hasnt already. Wireless setups already exist and work "decently", dedicated hardware should be a net improvement.
as they explained, better tech already exists. Especially 60GHz wireless, but Miracast over 2.4GHz also works well enough
The tech already exists in the modern world, it's just not built into the GPU, cuz that's kinda dumb. There are a lot of wireless HDMI solutions that can just plug into any GPU.
I have so much respect for the fact that you guys still have that fps_doug Pure Pwnage poster after all these years. (14:06)
TIL AntVenom watches LTT. Nice.
@@atticusnari I think all YTers watch LTT... except themselves...
Antvenom has been watching and commenting on LTT videos for many years at this point. It's nice to still see him comment occasionally
Hi Ant
i use linux
even if it has low-quality realtime video transmission, it was still very awesome in those days. Even the first release of the Teradek Bolt, roughly $10K at launch, was in 2012.
You know, you could technically upgrade that GPU with a more modern chip for some extra power; it shouldn't need much change. It would be sick to see a modern GPU fitted with a reverse-engineered version of this.
I loved how this just worked out the box. These old tech unboxings are great. Always nice to see Linus genuinely enthusiastic about earlier tech.
I just LOVE when LTT takes a ride in their time machine and goes back to revisit old tech. I feel that young and upcoming tech pros can gain from understanding a bit more of the evolution of tech. I'm old enough to have tinkered with most of it myself, and I often meet younger generations distanced from "basic" tech. From a technical problem-solving perspective, peeling off that extra layer helps a lot with insight into what is actually going on. Well done
I used to own one of these for my HTPC, which was set up in another room, but mine was a Gainward brand.
The USB cable is for USB passthrough, for a keyboard or mouse.
It worked great for like 2 years, then it stopped working altogether.
Got mine for like $180, 1-2 years after it was released
@10:30 Fun fact, you don't need optical cables to transmit HDMI over long distances. Companies like Extron, Crestron, AMX and others have been sending HD (and now UHD) over CAT5/6 for 15+ years at distances up to 100 meters (328 feet). And, HDBase-T has been around for nearly 10 years doing the same thing.
These kinds of videos, when they're all like "omg did you see that? so baaad" and I'm like "... what? Where? ... Looks fine to me" 🤣 Would be nice if they could point out in the editing what the heck they were talking about, but I think even the editors don't know
or maybe make a comparison with the cable; it could also be an artifact of something else. Also not sure why they didn't go into the settings and set the resolution.
Right? It didn't look that bad at all, especially for 10 years old
That's cause YouTube destroys the picture as well. If you saw the picture in person you'd understand. Even 4K YouTube looks atrocious compared to 1080p offline.
Slap a DJI Transmission onto an RTX4090 for a DIY setup
@@zetsubou3704 nice pfp
I wonder if they spotted it on a back shelf in Milwaukee PC and decided to grab it to send to you.
Milwaukee PC is a local chain that's been selling computer parts for a few decades and has some very vintage stuff lying around.
I wonder if the Windows display setting defaulted to 60i instead of 60p? Because that sure sounds like it is the case. The monitor is doing the de-interlacing in this case, which might cause visual artifacts. But the video playback test leads me to believe there is more to this.
From amimon's website, creator of WHDI:
WHDI™ takes the uncompressed HD video stream and breaks it into elements of importance. The various elements are then mapped onto the wireless channel in a way that gives elements with more visual importance a greater share of the channel resources, i.e. they are transmitted in a more robust manner. Elements that have less visual importance are allocated fewer channel resources, and are therefore transmitted in a much less robust way. Allocation of channel resources can include, for example, setting power levels, spectrum allocation and coding parameters.
So it seems as though that the data might be uncompressed, but data is definitely being altered in ways. The nowadays heavy use of the 5 GHz band might also make this tech less useful today.
Yeah I have had that happen before
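The "elements of importance" idea quoted above is essentially unequal error protection. Here is a toy sketch (my own illustration, not Amimon's actual scheme): split each 8-bit sample into high and low nibbles, send the visually important high bits over the robust share of the channel, and tolerate losses on the low bits.

```python
def split_by_importance(pixel):
    """Split an 8-bit sample into its 4 most- and 4 least-significant
    bits. Corrupting the low nibble shifts the value by at most 15;
    corrupting the high nibble could shift it by up to 240."""
    return pixel >> 4, pixel & 0x0F

def reassemble(msb, lsb):
    # Recombine the two nibbles into one 8-bit sample.
    return (msb << 4) | lsb

# The MSBs would get robust modulation/coding and more transmit power;
# the LSBs get the cheap share. Losing the LSBs entirely barely
# changes the picture:
pixel = 0xB7                    # sample value 183
msb, lsb = split_by_importance(pixel)
degraded = reassemble(msb, 0)   # low nibble lost -> 0xB0 (176)
```

So even under interference the picture degrades gracefully (slight banding) instead of breaking up, which matches the "less robust way" wording in the quote; the real WHDI mapping of power, spectrum, and coding is of course far more elaborate.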
Watching your videos, it's the second time I've had the impression that you guys don't fully comprehend the difference between interlaced and progressive. It is not a difference in resolution. At the time, you would buy this to broadcast to a TV screen (most of the time interlaced), and this GPU probably came set up interlaced out of the box. I'm quite convinced you can set it to progressive video output in the settings. And if so, since you are using a progressive monitor, that would make everything fine. Dig in. :)
I kept wondering the same thing. Surely there would be a setting in there somewhere to choose interlaced or progressive if the box advertised it.
Considering they've discussed on multiple videos the difference between interlaced and progressive, including how it works on different technologies, I'm pretty sure they comprehend it better than the average comment section know-it-all.
@@jjones2582 Verified with only 33 subs, interesting
@@gamin546 - Verified before they created a limit. The weird part: why do I have 33 subs with no content. 🙂
@@jjones2582 The same reason I have 100+ with no content.
I would have loved to see how far away it actually works.
Yeah, I was waiting for that test.
and with Win 7 and the right drivers from the disk for both the GPU and USB
it would have been very interesting to see if the performance changes if all your 5ghz gear was off to free up the spectrum.
7:31 That's actually just Source physics moving at a different tick speed than the elevator moves; you see it in a lot of games
Linus: "What makes it a wireless GPU?"
No wires
I figured this would either lag a lot or be compressed to hell. Looks like it was the latter with occasional lag spikes.
@@gptcdc3 gotta get them internet points amiright?
@@gptcdc3 The beauty of chapters & 2x speed.
No way you watched the video
12:22 Plug and play, wow. And look at the condition, brand new too.
Linus' enthusiasm and excitement is so infectious. Love it
This thing deserves major cred for just working like that. The folks that create the HP Wireless printing crap software could learn some lessons here. If it could be done that seamlessly 10+ years ago, you can make it work that seamlessly now. Great vid LTT!
This just made me remember a printer that if it was moved, I had to reinstall the driver for it to work!
the craziness of HP printers
@@belther7 yeah HP printer software just sucks - at least the wireless side of it. It should just work like this thing did! Lol
I wanted that card so bad for my photography business. The idea of being able to cast my computer to another screen without tons of rewiring was just so cool. Unfortunately, the price was prohibitive at the time when compared to a regular card. Still, it was cool to see this one!
Just buy a pen PC, mini PC or such and use remote desktop or similar
@williandossantos5200 At the time, a mini PC wasn't an option. They didn't exist. A laptop was an awkward solution due to our lack of surface space. We had room for one table, and it was dedicated to our portfolio and sample albums. The solution ended up being a long HDMI cord and a wireless keyboard and mouse. This GPU would have eliminated the ugly long cord. It was super cool at the time! But it was way too expensive and unreliable to justify the purchase.
@@williandossantos5200 besides, where's the fun in that!
I was surprised you didn't use your HDMI testing hardware on the included HDMI cable.
In regard to the "dead upgrade path": how much of a performance hit would you take by using this as your display out and adding a second, higher-tier/newer card for rendering?
Heh heh 1080Penis 15:02
1080p-ness. Nice.
Haha I came to see if anyone else noticed.
LTT guys being excited kind of gets me excited. Contagious, man.
The fact it worked? Mind blowing. But also, it's over a decade old. If someone has one of these and reconfigures it, or rebuilds an entire GPU with more recent technology, this could get even better.
Galaxy does still exist and is doing well. Slightly renamed, and they have several brands now, including KFA2, which is popular in Europe. I had a Galaxy 6800 back in 2005, a KFA2 2060 Super EX in 2019, and a Galax 3070 Ti Black General now. Cheaper and better than MSI.
@@dogtrollololl Galaxy is not Palit. They sold part of the business, but not the GPU business.
If you ever want to sell the 3070 Ti Black General, I'd be interested. I'm in Europe.
I have a KFA2 3060. I agree it is a great brand!
Galaxy is a cheap brand, but not cheapo quality like zotac
The set looks very much like the ol' blue NCIX Tech Tips set. Good memories, and glad Riley made the jump.
Took me way back to the old days when I had four Galaxy GTX 460s and was using them for F@H. And yes, they used to SLI too, which I used for gaming.
My stepfather‘s old projector had this technology integrated. Was pretty cool and even 3D worked through it. Sadly it wasn’t always stable
You have to manually select 1080p in Nvidia control panel or it'll default to 1080i
Reminds me of some of the glorious insanity Asrock got into (and still gets into to some extent, see some of their Threadripper and Epyc offerings) in the motherboard space.
Would you mind telling me more about those boards?
I really love weird hardware, especially Threadrippers, but I don't have the knowledge to know about every gimmick, so if you could tell me about some quirky stuff I can google, that would be amazing! :)
@@rolux4853 Well, iirc they made a Mini ITX Threadripper board... the socket took up so much space it used SODIMM memory instead of regular DIMMs.
@@rolux4853 If I remember correctly, ASRock sells, for example, a Mini ITX Epyc board (where the CPU socket is like a quarter of the entire board) and an M.2-slot VGA graphics card.
In general, they do pretty much whatever niche they find.
@@rolux4853 Well, they made a full-featured (multiple NVMe slots, a full 4 PCIe x16 slots, quad-channel memory) mATX X399 board (I think it was a Taichi), and I think they came out with some even smaller (like Mini-DTX) Epyc boards more recently under their "ASRock Rack" professional line (Linus has a video on one of them). Bitwit did a build with the X399 board but AFAIK never did a follow-up with the benchmarks like he promised.
I love how insane ASRock was early on. Like I have their X79 Extreme11 and it has literally everything on it for the time period.
I had the non-wireless version in a Dell Precision 690 that I converted into a 7400. I ran it alongside a Sapphire FirePro W4100. The reason I did that was to have a GPU with drivers more optimized for 3D modelling, as I was learning Maya. I didn't know that Maya would use the FirePro as the render card and display the work it was doing on the 460. The moment I figured out that Maya was doing that, I realized what PCIe P2P is.
Really cool product! We take wireless transmission of content for granted these days, but this was before even Chromecast was around.
Also... is anyone else hearing the low level electrical buzzing in this video? Sounds like the wireless being in close proximity to the mic packs? 🤔
This reminds me of the ATI All-in-Wonder cards of the late '90s / early 2000s. They were basically a capture card built into a graphics card. Would be cool to see some retro reviews of these cards if you can find them.
Think I may still have one of these laying around 😂
I found one at Goodwill like a week ago, new in box and still plastic wrapped. I debated buying it but really had no reason to. I always wanted the 9800 Pro All-in-Wonder. Able to play video games and capture TV! Young me would have loved that.
@@Gatorade69 Oh Jesus, I forgot about that. You know, if I didn't have such sentimental value attached to my hardware, I'd be sorely tempted to just give away some of my shit. Idk how Salvation Army or Goodwill or whatever would even price a GTX 980 or a 5700 XT.
Now all cards have that feature built in.
@@pandemicneetbux2110 The ram I'm running in my PC I also found at Goodwill... Yeah.
Found an unused, new-in-box 32 GB kit of Corsair Vengeance DDR4 3200 RAM for $5. I don't know how it ended up in a Goodwill, but it was a perfect upgrade from my 16 GB of DDR4 3200. Still probably my best find; that day had good stuff. Also found a like-new Wii, 2 Wiimotes, and a Wii Zapper. Got all that for $30, and I got homebrew software running on the Wii.
Before that, the only tech finds besides DVD-RW drives were an AMD Athlon X2 4200+, which back in the day would have been a perfect upgrade for my system.
I am surprised that the setup works so well. With the antennas arranged that close together, the radio waves could cancel each other out.
By the bouncing image and the steps in lines, we can see that it's interlaced FHD! Even the monitor tells you: 1920x1080i, where the "i" means interlaced.
This style of LTT reminds me a lot of the style I'm trying to go for. I like it! There should definitely be more videos like this one.
You may want to test it somewhere with fewer 5 GHz Wi-Fi signals. When it was released, they probably weren't expecting the 5 GHz band to be this occupied.