iPhone 13: Talking to the Camera Engineers

  • Published 27 Aug 2024

Comments • 52

  • @thecolelaurain
    @thecolelaurain 2 years ago +30

    I’m sure this was not an easy conversation to arrange, really appreciate this. And highlighting the name of the person speaking was a really nice visual touch for an audio only podcast. Thanks Tyler!

    • @stalmanplus
      @stalmanplus 2 years ago +5

      Thanks for noticing! I stayed up late for that one 😁

    • @jedwardoo
      @jedwardoo 2 years ago +2

      @@stalmanplus well done, Tyler. I really love this episode and the sacrifice you made to make it podcast-episode perfect!

  • @whobutni
    @whobutni 2 years ago +20

    Great content. Apple should include whitepaper notes on all updates post-event.

  • @issiewizzie
    @issiewizzie 2 years ago +2

    I wish you'd asked whether they'll use the LiDAR for additional accuracy in the Pro

  • @KrissRho
    @KrissRho 2 years ago

    Many thanks for such an exclusive interview!

  • @ChrisWilkinson1
    @ChrisWilkinson1 2 years ago

    Thanks for this Tyler. I'm starting to think that iPhone could become my main vlog camera. 13 is in the mail so we'll see!

  • @dmitry926
    @dmitry926 2 years ago

    When can we get a higher resolution? Current 12MP can't even fill the 5K display!

  • @_topiece
    @_topiece 2 years ago

    Amazing job!! Thank you for your work!!

  • @TechOutAdam
    @TechOutAdam 2 years ago +5

    After hearing the top dogs from Apple talk about the cameras, it makes everyone's thoughts on the new iPhones laughable. Like they don't know what they're talking about. Well, besides Tyler, of course. 😉 Thanks for this Tyler!

  • @JustinBeausoleil
    @JustinBeausoleil 2 years ago +2

    “Interview” 😉

  • @irjo
    @irjo 2 years ago +3

    This interview is just a long Apple ad :/

    • @elck3
      @elck3 2 years ago

      When you’re Apple you can literally direct the conversation like this. I bet they pre-approved every single question.

  • @MickiMinach
    @MickiMinach 2 years ago +2

    Quick question, I know this is a podcast, but why didn't Apple agree to do this interview in video format?

    • @TechOutAdam
      @TechOutAdam 2 years ago

      They're probably in rooms with products that haven't been released yet.

  • @folkusonme
      @folkusonme 2 years ago

    Amazing!! 😍 friggin great channel, subbed

  • @Quoutub
    @Quoutub 2 years ago +10

    A lot of important questions were missing:
    1. Why does the 9mm tele camera of the iPhone 13 Pro only have an entrance pupil diameter of 3.21mm? That gathers more light per unit time from a subject than the 3.0mm pupil of the iPhone 12 Pro tele, but less than the 3.41mm pupil of the 12 Pro Max tele. The Xiaomi Mi Note 10 tele (no periscope) even has an entrance pupil diameter of 4.37mm!
    2. Why didn't you increase the megapixel count of the 7.5mm f/2.2 tele camera from the 12 Pro Max instead of choosing a 9mm focal length? If the read noise per pixel isn't higher and the lens quality is high, the only disadvantage would have been slightly less light from subjects at base ISO, but you would have captured 12.5% more light per unit time with the 7.5mm f/2.2 lens and would even have more depth data available, since you wouldn't need to crop the main camera as much.
    The 7.5mm f/2.2 tele camera of the 12 Pro Max, with more megapixels, would give you a larger field of view without a lower angular resolution. The smaller field of view of the 9mm tele camera also makes it harder to fit everything into the frame.
    3. Why is there no option to turn off machine learning? Machine learning (Deep Fusion, etc.) only guesses, so it can produce wrong details that are not even recognisable as wrong (in contrast to traditional artifacts like oversharpening or moiré), so iPhone cameras might be unusable for responsible journalists.
    4. Why haven't you chosen a Quad Bayer sensor for the main camera? With the right algorithms this could maybe improve the digital zoom performance.
    5. Why do iPhone cameras use watercolor noise reduction, which is very annoying and is even baked into the ProRAW file? Canon cameras, for example, don't use watercolor noise reduction in any lighting conditions.
    6. (I have edited my post here.) Color reproduction is important, so it is essential that smartphone cameras offer a color temperature slider in auto mode. The iPhone 13 has a "warmth" slider, but it seems it won't let you choose a constant white balance; the adjustments are relative to the auto white balance, so the slider doesn't prevent sudden auto white balance swings in the same room. Why do you restrict the user's white balance choice instead of offering a fully manual warmth slider?
    7. Why does the iPhone tell you to hold still for nearly 5 seconds, when the night mode displays a duration of only 3 seconds?
    8. Why don't you use the ToF camera for Cinematic mode?
    9. Have you thought about having two 12 megapixel wide angle cameras or two tele cameras in order to get a more accurate depth map instead of only relying on crops of the other cameras? Will Apple be able to improve the depth map in the future?
    10. Why is there still no periscope camera?
    11. ...
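    The pupil and light-gathering figures in points 1 and 2 all follow from the textbook relation D = f/N (entrance pupil diameter = focal length / f-number), with light gathered per unit time scaling with pupil area, i.e. D². A minimal sketch that reproduces the quoted numbers, assuming focal lengths and f-numbers of 6.0mm f/2.0, 7.5mm f/2.2, and 9mm f/2.8 for the three tele cameras (inferred from the pupil sizes cited above, not stated in the interview):

    ```python
    def entrance_pupil_mm(focal_length_mm: float, f_number: float) -> float:
        """Entrance pupil diameter: D = f / N."""
        return focal_length_mm / f_number

    # Assumed (focal length mm, f-number) pairs, back-derived from the comment.
    teles = {
        "iPhone 12 Pro tele":     (6.0, 2.0),  # D ~ 3.00 mm
        "iPhone 12 Pro Max tele": (7.5, 2.2),  # D ~ 3.41 mm
        "iPhone 13 Pro tele":     (9.0, 2.8),  # D ~ 3.21 mm
    }
    for name, (f, n) in teles.items():
        print(f"{name}: D = {entrance_pupil_mm(f, n):.2f} mm")

    # Light per unit time scales with pupil area (D^2), which is where the
    # "12.5% more light" claim for the 12 Pro Max tele comes from.
    gain = (entrance_pupil_mm(7.5, 2.2) / entrance_pupil_mm(9.0, 2.8)) ** 2 - 1
    print(f"12 Pro Max tele vs 13 Pro tele light advantage: {gain:.1%}")
    ```
    
    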

    • @yaqdhanhinai221
      @yaqdhanhinai221 2 years ago +6

      Are you a camera hardware engineer? These are serious questions 😮

    • @ls7vz
      @ls7vz 2 years ago +7

      Because all of these advancements, plus other advancements throughout the phone, can't be made in just one year… actually 6-8 months, since after that the final iPhone features and design need to be locked in before mass manufacturing. This video is actually crazy, because an interview like this is extremely rare.

    • @gianpakasa
      @gianpakasa 2 years ago

      The answer is: soon, in Xiaomi and Huawei.

    • @VikalpGautam
      @VikalpGautam 2 years ago

      🤣😂

    • @chrisak49
      @chrisak49 2 years ago +3

      A lot of it has to do with scale. Other phone makers can use new hardware like a periscope zoom lens because they only sell 5-15 million units worldwide. The iPhone sells upwards of 100 million per year worldwide. It's easy to source parts for 5 million units; not so much for 100 million, especially when the development and mass-production cycle is one year.

  • @Hamidu
    @Hamidu 2 years ago

    That’s value 😎 I feel like Apple used machine learning to time how long they should be in this podcast. Because they just gave a raindrop of info, and there’s so much more.

  • @Potatocat_
    @Potatocat_ 2 years ago +8

    They are reading a script lol

  • @lfcruzsierra
    @lfcruzsierra 2 years ago

    .... as soon as they started talking, I knew the 24FPS questions weren't happening. Sad...

  • @mich8261
    @mich8261 2 years ago +2

    24 fps. DUDE!!!

  • @Wiintb
    @Wiintb 2 years ago +1

    Great interaction!
    I would have loved to hear from the marketing person, who has no purpose in the conversation. But she is like a North Korean tourist minder: useless, a pain in the neck, but everything goes through them!

  • @tommytexter4054
    @tommytexter4054 2 years ago

    The thing is, people who actually know what ProRes is, care about ProRes, and will use ProRes will never shoot on a phone for a real business, other than an iPhone review or a commercial for Apple. Cinematic fake shit? The average Joe will never bother to use it. Real pros only use real optical blur. So Apple is giving people features they will never use.

    • @TechOutAdam
      @TechOutAdam 2 years ago +4

      lol so you say. I have a Sony A7C and a Canon R6. I STILL use my iPhone 13 Pro, and have at the farmers market, at restaurants, going on hikes, etc., because it just DEPENDS on the situation. That's the point. Not everyone has a DSLR or mirrorless; most just have their phone and depend solely on that to document their life.

    • @Dcrainmaker
      @Dcrainmaker 2 years ago +4

      @@TechOutAdam Agree with Adam. Plenty of us who shoot use iPhones mixed into other content, including now the computational bokeh. I've already been using the iPhone 13 Pro over the last few days in various minor b-roll scenarios for my content workflow, mixed in with other 'real' cameras. As Tyler highlighted in his main review, if you tweak the f-stop a bit, you can find the perfect middle ground to get good footage that largely doesn't look fake.

    • @TechOutAdam
      @TechOutAdam 2 years ago +1

      @@Dcrainmaker Absolutely. It's not meant to "replace" anything; it's an awesome tool we get to use, and it got better depending on how we use it.

    • @stalmanplus
      @stalmanplus 2 years ago +1

      Yeah, I think Tommy is missing the point here. Professional photography campaigns have been shot on Polaroids too; it's all about the creative demands of the project.

  • @its_argho
    @its_argho 2 years ago +1

    I am waiting for the Pixel 6 to obliterate the iPhone's camera.

    • @VikalpGautam
      @VikalpGautam 2 years ago +2

      Great. See you in a few years after you have stopped waiting 🤣

    • @chrisak49
      @chrisak49 2 years ago +6

      You’ll be waiting until you die. Pixel cameras are overrated and haven't delivered since the Pixel 2-3.

    • @VikalpGautam
      @VikalpGautam 2 years ago +2

      @@chrisak49 I so wanted to say this too. My Pixel 2 XL started out amazing, and with updates its camera went for a toss. Plus, Night Sight was the last useful feature they launched. And even back when they were slightly ahead of Apple in photography, mainly low light, their videos sucked big time.
      Even the Pixel 6 doesn't look like it will change that much. Photos might improve, but for video, keep waiting. Even OnePlus takes better video than the Pixel.

    • @its_argho
      @its_argho 2 years ago

      So people are complaining about video performance on the Pixel. It's not Google's fault that Qualcomm made shit overpriced chips.
      Photos on the Pixel (2 & 3) were the best in class. The thing is, you can process a photo after the shot is taken.
      But for video you need to do all that processing in real time, and simply put, Qualcomm chips don't have that power, nor are they optimized for the calculations Google needs.
      And the iPhone basically ripped off Google's computational photography and created a knock-off version of Google's HDR+ called Deep Fusion. But people won't talk about that.

    • @its_argho
      @its_argho 2 years ago

      @@chrisak49 That is because:
      1. they never upgraded the optics;
      2. the 4 went to shit because of its battery life, and they put all their effort into that shit Soli;
      3. the newer Pixels all use weaker Snapdragon chips that essentially can't keep up with all the computation needed.