Apple engineers discuss the camera changes in the iPhone 13

An interview with several Apple vice presidents involved in the iPhone 13 camera system engineering has been published, providing better insight into the decisions behind the improvements in the 2021 models.

Released Monday, the Stalman Podcast episode "iPhone 13: Talk to Camera Engineers" features a trio of Apple representatives. The group is led by Kaiann Drance, vice president of worldwide iPhone product marketing, along with Jon McCormack, vice president of camera software engineering, and Graham Townsend, vice president of camera hardware engineering.

For the iPhone 13, Apple brought its Sensor Shift optical image stabilization to the whole lineup, along with improvements to low-light photography, Photographic Styles, and Cinematic mode. The Pro models gain a new macro mode and support for ProRes video.

The half-hour podcast begins with Townsend discussing the benefits of Apple’s camera hardware design, including how the hardware team can work closely with its software counterparts “starting from an early design stage”. The lens, sensor and other hardware “are specially designed to complement the firmware and software processing” of the device.

“Since we own the whole stack, from photons to JPEG, if you like, we can choose the optimal place in the pipeline to deliver specific benefits,” adds Townsend. For example, Sensor Shift is responsive enough to stabilize a full second of video, which helps provide accurate raw image data on which the software team can build.

The new macro mode in the iPhone 13 Pro is enabled in part by Apple's autofocus system, Townsend confirmed, as otherwise Apple “could add a dedicated macro camera” instead. “It's just not as efficient for us as being able to use the same camera for these two separate but somehow connected purposes.”

Machine learning has advanced significantly, especially with the amount of processing power the A15 now provides, according to McCormack. “This really speaks to the amount of processing power in the iPhone, and in fact we now have so much processing power that we are able to take those same computational photography techniques and introduce them to the world of video to bring computational videography.”

“Indeed, we now apply the same machine learning magic we learned in photography to video.” McCormack says the iPhone now “segments each frame in real time, and we process the sky and skin and leaves individually, taking our already industry-leading video and improving it further with greater clarity and more detail in different parts of the image.”


Naveen Kumar
