First Look at Webflow, Figma & ChatGPT in Apple Vision Pro!
- Added Sep 13, 2024
- In this video, we're going to take a bit of a different approach, and walk through how the Apple Vision Pro functions with Webflow, Figma, and ChatGPT. It's early days, but I haven't been this excited about a product in a while.
Holy shit, I usually just watch these reviews knowing exactly what reviewers are going to say, but you're the first in a long while who actually had my full focus and curiosity. You've actually shown people how they can use this device for something genuinely productive and useful.
Utmost respect for you, my guy. I can only hope you blow up one day.
You mention that you can forget that you are in the AVP. Marques also described looking up into a corner of a room expecting to see a floating window, only to remember that he wasn't wearing the headset.
Equal parts incredible and terrifying.😂
Sir, you have a very good public speaking voice.
Greatest thing about AVP is that it helped me discover so many other content creators. Still have no idea what Figma is, but still subbed because he has an excellent way of articulating his thoughts.
It's such a beautiful thing about YouTube and just the general excitement around tech/software. I really appreciate the kind words!
5:15 Having just the MacBook keyboard added as pass-through to the virtual environment would be great, almost a must-have feature for me with this setup. I hope they manage to add this in an update. Great review, thanks!
In terms of spatial UI, both Figma and Webflow will have to rethink a few things. First, the current UI where everything is fixed won't work; something like detachable property panels would be nice. For example, instead of showing the radius and padding as small text inputs or drag areas, they could be big knobs and buttons that can be detached from the main UI and placed anywhere you like. Use layers often? Just grab the panel and put it on the left side; grab the font panel and put it on the right. In the middle, you just have the big canvas. Imagine something like Photoshop's detachable window panels, but in 3D space!
Need a photo in your design? Open Finder, grab a photo, and drop it on your canvas. Imagine Figma plugins in a workspace like this! Open the Unsplash plugin and you've got millions of photos right above your head; move your fingers to swipe through them, grab one, and drop it on your canvas.
If they pull this off I can finally be Tony Stark in my living room.
How do you only have 1.87k subscribers? Your videos are amazing! The quality exceeds that of YouTubers 100× your size. You've earned yourself a sub.
The dream for every device is when it feels like the device disappears and things just feel natural.
I just keep immersion at like 70-80% so I can still look down and see my keyboard when I need to. Also means I can see my dog as she cuddles up on the couch. It’s a pretty amazing way to work, watch videos, etc.
I am not a very enthusiastic traveller, but I do want to see or revisit many places without the travel hassle. This is a great way to be in Berlin for a while, then Venice, then suddenly Alaska or Vladivostok or Bali, while I still pretty much keep working and exploring tech and AI tools. The AVP can certainly do this far better than any existing device.
The best video I have watched about Vision Pro! Thanks! Looking forward to hearing your verdict in 13 days
Finally someone is making a proper review. Thank you!
Wowwww, this is completely incredible. I can imagine using Webflow and Figma while sitting in Hawaii, or in space 😅
Let me know when you're going to space - I'll join!
Really glad you did this video, McGuire, this looks great. One thing for the next video: can you let us know how comfortable it is after using it for a long time while working? E.g. any neck, face, or back pain. Just a little concerned about the weight being right at the front of the device. Cheers
Hey McGuire! Wilmer here. It was a pleasure helping you over at World Trade Center. Great insight on the AVP and its limitations!
Whoa, Wilmer! Great seeing you on here! Thanks so much for an incredible experience at the Apple Store the other day - you were wonderful!
Great commentary. I am looking forward to seeing your follow up videos! And a question: would you rather have a Vision Pro or an Apple Pro Display XDR?
Pro Display XDR hands down. 99% of my *current* workflow is based in 2D - even 3D stuff (like Cinema 4D) is locked into a 2D context on a vertical surface. 3 days in, and I'm finding myself a bit frustrated in the Apple Vision Pro when it comes to doing my work, and I'm most productive when sitting at my workstation. Part of me thinks this is because I'm used to the current way of doing things, but the other part of me thinks it's because the hardware and software for spatial computing is such a nascent and undeveloped category. I'm 100% convinced it'll be a game changer in the long run, but Apple Vision Pro V1 really reminds me of the first iPhone: really great demos, basic functionality, but nothing compared to the stuff that came in the years after.
I'm noticing that people with a certain head shape prefer the over-the-head strap, while others are happy with the single wide strap. I've seen reviewers with rounder heads say they're happy with the single strap, but others with narrower heads or more rectangular faces say they prefer the other strap.
Very cool, thanks for sharing!
Zojirushi Rice Cooker and the Background✅✅✅
Essential appliance!
Is there a way to curve the screens? Instead of just flat "monitors," you could curve the "screen." I think it would feel more natural with a lot of head turning. Just my opinion.
Interesting thought! I think that would work if you were sitting down constantly, but one of the things I’ve started to do a lot more of is getting up and moving around to look more closely at content. When the perspective is changing, the curve would have to heavily adjust, and I can imagine that being quite distracting. The other thing is that *most* surfaces in real life are rectangular, meaning the rectangular windows just seem to perfectly align in Apple Vision Pro. With curved windows, it’s parallel in the vertical dimension but askew in the horizontal dimension. Having said that, panoramic photos on Apple Vision Pro curve all the way around you, and the experience is nothing short of incredible, so you could be onto something here!
The Quest 3 has it, plus the ability to create super-tall browser windows for infinite-scroll apps. Defo preferred.
I literally caught fish in that exact location this last summer on Mt Hood
Great video Coventry Boy #1 🔥
Alfonso! AKA Coventry Boy #337. Or whatever number I would randomly invent. So good to see you here!
@@mcguirebrannon haha close enough! So good to see you thriving. Loving the work you're putting out, sir!
Thanks for the video - you did a great job 👍🏻
Apple needs to add a feature where you can expand the hand-occlusion space, so that whatever your hands are interacting with is included in the scene cut-out. It seems like that would be relatively trivial to do: they're already cutting out the hands, so they know where they are; all they need to do is expand the bubble. No additional object detection needed. I think this would greatly improve a number of experiences, like using your laptop in immersive spaces.
Another feature along the same lines, which would be easy to implement, would be the ability to manually set exception volumes, where anything inside an excepted volume is cut out and shown over the virtual environment. You could just draw a cube around your laptop and it would stay visible, regardless of where you move your hands.
Of course, what Apple is more likely to do is the harder object detection, where they just increase the number of items that are detected and automatically cut out. But that's the slower path, meaning we have to live with the blindness longer.
You don't have to work in a virtual environment, or you can dial it back to the point where your keyboard is visible. I like visiting them, but I prefer being able to reach for stuff without needing to use the "reality dial."
@@Co-opSource I realize that, but I *want* to work in a virtual environment, and I want to be able to have pieces of my real environment pop through on a case by case basis. I also don't care for the dial option, as all it does is dial in how surrounded you are. So to dial it back to the point you can see your keyboard, you're basically not immersed at all. So yeah, there's a lot of potential here, but it needs some work to really blossom into what it could be.
I think I want to demand that all Apple devices like the Magic Mouse, Magic Keyboard, trackpad, and Pencil get in-headset representations. This will likely come with time; this is still a very new platform.
Didn’t expect AVP review from McGuire, but this was quite informative! Glad that you covered using Webflow and Figma in AVP, since no one else would do it, I guess, haha. AVP is too expensive for what it offers right now, though I can see the potential. Thanks!
Glad it was helpful! Thanks for taking the time to write this - and I happen to agree with the price and the potential.
Amazing. Still waiting to get my hands on the Vision Pro in Europe, so it's great to see an actually practical review!
Oleg! Can't wait to hear what you think when you get it. Curious to know what your take is!
@@mcguirebrannon In particular, getting the drag & drop, scrub, etc. interactions right is super interesting; this might just be the future of visual development.
@@mcguirebrannon Also, testing on all kinds of device resolutions would be greatly improved, because you need to see a load of devices at a glance to speed up your process, and one screen simply isn't very practical.
So happy for McGuire, this vid actually blew up
I think anything with "Apple Vision Pro" in it is blowing up right now - I just happen to love this kind of stuff, and I'm happy if someone gets an inkling to try it or think through their workflow with this kind of thing, because it's actually really interesting!
You need more light in your real environment to improve hand tracking; that will help with the selection issues.
Jesus what a nice apartment
Apps built using native components would work fine on the Vision Pro.
One thing I've been wondering: does this have enough pixel density to have three virtual 4K screens/windows side by side?
I wonder how the voice-to-text works in the Vision Pro.
I feel that would be awesome for ChatGPT.
Also, what about copy & paste? How does that work?
Can’t you use the trackpad on your MacBook whilst using the vision pro apps?
Yes, you can - he literally said so in the review.
You know you can use your mouse and keyboard with the native apps if you have the virtual monitor from your MacBook active, right? So any web apps can run in visionOS and desktop apps can be in the virtual monitor, all using the keyboard and mouse like they're one big canvas.
The most informative review for me, thank you. Is it true that the shared display from the MacBook is limited to 2K resolution, despite the fact that the Vision Pro has 4K resolution per eye? Is it a hardware or software limitation? Will it be fixed in the future?
There seems to be some conflicting info about this out there, including suggestions that it’s more like 1440p - still learning about this! Great question 🙏
Outstanding
OMG, we have the same Nespresso machine and I can’t believe you even have a rice cooker. That is so Asian 😍
Nespresso twins! Also, a rice cooker is a non-negotiable. Best appliance by far!
@@mcguirebrannon What an honour! 😍Good that you're cooking and taking care of yourself while building your startup!
After taking notes and painting on the iPad, I have this habit of double-tapping to undo even on a physical notebook. It's scary to even think what I'd do after getting used to the AVP.
Really good video concept
It's only one screen if you use a MacBook. If it could do multiple screens, that would be a killer feature. Or change the resolution to an extreme ultrawide.
💯 The Immersed app allows multiple Mac screens on the Quest 3. It should port over to the App Store soon.
"Ok, so we're on the moon" 😂😂😂
Great vid idea!
Have you tried connecting a magic trackpad directly to the Apple Vision Pro for manipulation?
Not yet - to be completely honest, I haven't touched it in a few days since I've been heads down with other work, but I can't wait to try this. Thanks for the idea!
Try putting the window really close to you; it should act like a touchscreen then. At least that's what I've seen from other YouTube/X tech reviewers. Let me know if that works 😊
Why aren’t you using a keyboard and trackpad for actual work?
The Apple Vision Pro headset has two displays with small lenses inside, right? If they put larger lenses inside the headset, it would feel better.
I find the solo band to be more comfortable
MacBook vs Mac Studio comparison feels off-base. My personal use-case lies somewhere between my iPad and Apple TV.
Are you able to mirror your Studio? Other people have reported that mirroring only works with laptops.
cheaper Vision Pro + Mac mini + Magic Keyboard = killer combo
It’s so adorable watching him try to use safari not realizing this is Safari for iPad and not Safari for Mac 😅
I think a combo of Mac + Vision Pro is going to rock the workflow. I'm looking to buy these, but I live far away from the USA lol.
Genuine question -- what's the ergonomic experience? How's the neck feel?
Great question. So far no neck problems or eye strain. Biggest physical discomfort is upper cheek and forehead pressure from the front of the Apple Vision Pro weighing down on the face, but that’s significantly reduced since I switched to the dual band.
Decent drinking game: drink every time he says Webflow.
Just to be very clear... you can only have ONE Safari window open at once? And the same for Chrome? How about multiple VS Code panes open at once on what seems like multiple monitors? Do you think that programming for several hours inside this environment would help with focus?
You can have multiple Safari windows open in Vision Pro, each with multiple tabs. Haven’t tested anything related to Chrome yet, and I’ve only used Chrome in the display mirroring from the laptop.
Regarding programming, I suppose that would depend on the development environment. I spent the afternoon with @nickfloats who was testing the workflow in Midjourney, and even inputting basic prompts was an extremely painful chore. I think the thing to remember regarding most apps that run in the browser is that Vision Pro is essentially using a specialized version of the iPadOS Safari; not macOS, so without a native visionOS app, I worry it’ll be far below most people’s expectations.
However, the optimist in me strongly believes this’ll just work itself out, and native IDEs will be developed to make some of this stuff work better over time. Note: the file system in Vision Pro uses the Files app like on iOS and iPadOS devices, so those who use development workflows that rely on local file systems will have to be super aware of this to ensure things aren’t non-starters.
@@mcguirebrannon Great points thank you. Makes me so sad that it isn't quite ready for the ultimate dev environment! So close and yet so far!
One last question... what would it look like if you just made the MacBook Pro display movie-theatre sized and opened up a bunch of windows inside it? Do you think you could divide the single display into four windows, one in each quadrant, and have it look okay? Or does it still look strange given the aspect ratios and pixels available and such?
it's doing ...holograms ...better than any other category :P
At the 12 minute mark, you demonstrate using the Mac's trackpad to scroll in an iPad app (ChatGPT). I can't for the life of me figure out how to control anything other than the mirrored Mac screen with the Mac's keyboard and trackpad. What am I missing?
This is a great question. When mirroring your Mac inside your Vision Pro, have you tried controlling iPad apps specifically? ChatGPT is the only one I’ve tried like this so far, so I’m learning right along with you. I’d love to learn more as you figure it out!
Thanks for the response! I figured out why it wasn't working for me. If anyone else runs into this, go into Displays in your Mac's Settings, click Advanced, and make sure "Allow your pointer and keyboard to move between nearby Mac or iPad" is checked.
If you mirror your Mac on the Vision Pro, do you still use a mouse for video and typing, or can you use hand gestures directly to drag the editing section there?
You use the Mac's keyboard and trackpad/mouse.
@@Mattt5 Which means you can directly use eye tracking or hand gestures to control the Mac mirroring? Or does it have to be connected to the Mac's mouse and keyboard or trackpad?
Of course navigating around a UI made for the mouse is going to suck with the eye-hand UI…
Cool cool cool cool cool cool cool cool cool cool
But then how do you record this whole experience 🤔
Same way you can screen record your iPhone screen
Remember the things the iPad was supposed to do better than an iPhone and a laptop?
Photos: way better on the Vision Pro, especially with spatial videos; a memory feels good when it feels real.
Movies: the best 3D movies ever, at huge 4K sizes.
Web browsing, email, etc.: we'll see, but with a physical keyboard I think it's going to be better just because of the multitasking and the size of each screen.
You can't even control the MacBook cursor with your eyes while mirroring; the entire thing has basically zero unique apps. It constantly needs more light to track hands. You can't even change the aspect ratio of the mirroring - I went from a Samsung ultrawide to a headache-inducing 16:9. Was honestly super let down.
There's a lot of good points here. I'm still stuck on deciding whether I'm keeping it, especially as I decide whether the practical or entertaining aspects of the device sustain once the novelty wears off.
@@mcguirebrannon I'm still trying to convince myself to keep it, but it seems super early from a dev perspective, as I'm still locked to the actual MacBook. However, I did find the Vision Pro works incredibly well in direct Florida sunlight.
@@AustinCameron, I might just keep it around so I can take it on my next trip to Florida - my home state!
McQuiAR
I'm getting the Immersed Visor: $400 for better resolution and a better design than the Vision Pro.
The Meta Quest 3 is $500 and does all of the same things apart from eye tracking. The Vision Pro has a higher resolution but a lower field of view, so you'll see less of the screen compared to the Quest 3 anyway. So does it really matter? Put it this way: the added specs do not equate to an extra $3,000. The Quest line has been the leading headset for AR and VR apps and games for the last 4 years, with Sony in 2nd. Apple needs a Quest 3 competitor, and fast, because they're just selling the Quest 3 for Meta at this point: people will see this, get excited, then realise they can do the same things and use the same apps on the Quest 3, and they'll just buy a Quest 3 instead. And everyone knows that Meta adds functions and new features nearly every month; I don't think Apple will be able to keep up, as Apple is slow to add features in their updates - you'll be lucky to get new features within a year with Apple.
It's ridiculous that the official ChatGPT app doesn't support Vision Pro, but a 3rd-party developer has created a version that does. I'm interested in how it was accomplished and what it looks like 🤔
You can just ask ChatGPT to do it for you.
Of course Figma and Webflow on Vision Pro aren't a good experience right now - those companies haven't created apps for Vision Pro. Suggesting this is an issue with the Vision Pro is disingenuous.
Procrastination 😅
I love the tech, but it still feels like it's Apple trying to force on us a solution to a problem that doesn't exist. I can't imagine hurting my neck trying to do something that I'm already doing... painlessly.
It’s adding solutions to problems we don’t yet realize we’re having. You can’t carry 7 laptops and a 60 inch monitor into a coffee shop but you can sit there with this thing on your face and have the equivalent of those to work with.
That said, I actually think the most powerful/useful aspects to this technology have yet to be developed. Imagine calling an Uber and the car that comes to pick you up is highlighted so you don’t have to search for it
Imagine getting walking directions downtown and they’re displayed right on the street in front of you so you don’t have to stop and stare down at your phone every 30 seconds
Imagine going hiking and it recognizes things like trees, birds and tells you exactly what they are
Imagine going golfing and it displays exactly where to put your hands, rates your stance, records the speed of your swing, tells you if your body moved too much during the swing, gives you tips to better your game, tells you which club to use for which situation, and keeps track of the score
It’s not just a powerful portable workstation. With the right apps and imagination I think this tech could eventually be life changing. We just need them to make it smaller, lighter, and get the price down. Rumors are they’re working on a cheaper consumer friendly version.
Perhaps another way of viewing it is that Apple has released for them a first gen blended XR headset that incorporates elements of desktop computing, media streaming, video collaboration amongst other features. I’ve been using mine now since Friday and I’m learning that it doesn’t fit the current model - you need to assess and develop use cases that are now open to the wearer. You have to try it - few have done justice to their presentation as many are trying to force fit their notion of what it is.
This is a 1st gen product, every 1st gen product is massive, heavy and expensive.
The 1st phone was like a brick and the 1st computer filled a room.
All those people who are buying this product right now are investing in a revolutionary product which will replace mobile phones and computers.
Can't wait to see this tech embedded to glasses in the next 10 yrs
People said the same thing when AirPods launched… yet now we all love the solution to a problem that didn't exist.
@@Lucifer-fj7mg Earphones are not a solution to a problem that doesn't exist (problem: listening to music privately in an easy, effortless manner; solution: earphones). But a very heavy visor that you wear on your face, requiring a bulky external power source, to point and click at things that can already be done with very little effort? Let's not even mention the price.