iPhone 12 Pro's cameras got some new tricks that serious photographers will love
Apple's iPhone 12 Pro phones get new camera abilities, including a bigger image sensor, a faster main camera lens, improved image stabilization, a lidar sensor for low-light autofocus, and a longer-reach telephoto lens on the iPhone 12 Pro Max. Screenshot by Stephen Shankland/CNET

Apple's iPhone 12 and iPhone 12 Mini add significant new photography features, but the camera hardware and computational photography software on the higher-end iPhone 12 Pro models really show how hard Apple is working to attract photo and video enthusiasts.
Among the changes in the iPhone 12 Pro models are new abilities to fuse multiple frames into one superior shot and a lidar sensor for improved autofocus. The iPhone 12 Pro Max also gets a larger sensor for better low-light performance on the main camera, a telephoto camera that zooms in better on distant subjects, and better stabilization to counteract your shaky hands.
The iPhone 12, iPhone 12 Mini, iPhone 12 Pro, and iPhone 12 Pro Max debuted at Apple's iPhone 12 launch event Tuesday. The iPhone 12 (from $799, £799, AU$1,349) and 12 Mini (from $699, £699, AU$1,199) stick to last year's design, with regular, ultrawide, and selfie cameras.
The bigger photography improvements come with the 12 Pro (from $999, £999, AU$1,699) and 12 Pro Max (from $1,099, £1,099, AU$1,849), which get a larger image sensor and a fourth camera, a telephoto for zooming in on more distant subjects. The iPhone 12 Pro has the same 2x telephoto reach as earlier iPhones -- a 52mm-equivalent focal length -- while the 12 Pro Max's telephoto extends to 2.5x zoom, or a 65mm-equivalent lens.
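The zoom factors and the focal-length equivalents are two expressions of the same number: the equivalent focal length is the main camera's 26mm-equivalent focal length multiplied by the zoom factor (the 26mm baseline follows from the figures above, since 52mm divided by 2x is 26mm). A quick sketch of the arithmetic in Python:

```python
# Relationship between Apple's "zoom" factors and 35mm-equivalent focal lengths.
# The 26mm baseline is derived from the figures above (52mm / 2x).
MAIN_CAMERA_EQUIV_MM = 26

def equivalent_focal_length(zoom_factor: float) -> float:
    """Equivalent focal length = main camera's equivalent focal length * zoom factor."""
    return MAIN_CAMERA_EQUIV_MM * zoom_factor

print(equivalent_focal_length(2.0))   # 52.0 -> iPhone 12 Pro telephoto
print(equivalent_focal_length(2.5))   # 65.0 -> iPhone 12 Pro Max telephoto
```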
The iPhone 12 and iPhone 12 Mini get significant improvements, too. They benefit from Night Mode, which now works on the ultrawide and selfie cameras as well as the main camera, and from an improved HDR mode for challenging scenes that mix bright and dark elements.
Apple's efforts in this area reflect the fact that consumers consider the camera one of the most important features in a smartphone, along with the processor and network speeds. We snap photos and videos to document our lives, share them with friends and family, and express ourselves artistically.
Computational photography tricks

HDR stands for high dynamic range -- the ability to capture shadow details without turning highlights into a washed-out mess. All the new iPhones bring third-generation HDR technology, called Smart HDR 3, designed to better capture details like silhouetted faces, Apple said. The technology also uses machine learning to judge processing choices like how much to boost brightness in dim areas.
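Apple hasn't detailed how Smart HDR 3 works internally, but the core HDR problem is easy to show: a single exposure clips bright areas at the sensor's ceiling, while merging a short and a long exposure of the same scene preserves both shadows and highlights. Below is a toy sketch in Python with invented luminance values; it illustrates the general principle of HDR merging, not Apple's algorithm.

```python
# Toy illustration of the HDR idea: recover scene brightness that a single
# clipped exposure throws away. All values are invented for clarity.
scene = [0.02, 0.1, 0.5, 2.0, 8.0]          # "true" scene luminance, arbitrary units

def capture(luminance, exposure):
    """Simulate a sensor: scale by exposure time, then clip at the sensor's ceiling (1.0)."""
    return [min(value * exposure, 1.0) for value in luminance]

long_exp = capture(scene, exposure=1.0)     # shadows visible, bright areas clipped at 1.0
short_exp = capture(scene, exposure=0.1)    # highlights preserved, shadows nearly black

def merge(long_px, short_px, long_t=1.0, short_t=0.1, clip=0.99):
    """Use the long exposure unless it clipped; fall back to the short one there.
    Dividing by exposure time puts both frames back on the same radiance scale."""
    return [sp / short_t if lp >= clip else lp / long_t
            for lp, sp in zip(long_px, short_px)]

print(long_exp)                    # [0.02, 0.1, 0.5, 1.0, 1.0] -- highlights washed out
print(merge(long_exp, short_exp))  # ~[0.02, 0.1, 0.5, 2.0, 8.0] -- full range recovered
```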
The iPhone 12 Pro models get another computational photography technique that Apple calls ProRaw. iPhones and Android phones have been able to shoot raw photos for years, an unprocessed alternative to JPEG that lets photographers decide how best to edit an image. Apple ProRaw blends Apple's computational photography with a raw format, so photographers get the benefits of noise reduction and dynamic range processing along with the editing flexibility of raw images, Apple said. It's similar to Google's computational raw technology, which arrived with the Pixel 3 in 2018.
Google pioneered many of the processing tricks collectively known as computational photography, helping erase the comfortable lead in image quality that Apple's early iPhones held for years.
But with the iPhone 11, Apple employed its own versions of some Google techniques, like combining several low-exposure frames into one shot to capture shadow detail without turning skies into an overexposed whiteout. Google calls its approach HDR+, and Apple calls its version Smart HDR; a related Apple technology called Deep Fusion blends frames for better detail and texture, particularly in low light.
On the iPhone 12, Apple's Deep Fusion technology exercises all the major parts of the A14 Bionic chip, including the CPU, image signal processor, graphics processor, neural engine, and other elements. That means Apple can apply Deep Fusion to all the cameras on all the new iPhone models, Apple said. And it means the iPhone's portrait shots now work in Night Mode, matching an ability Google added with its Pixel 5.
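Neither Apple nor Google publishes the details of these burst pipelines, but the central trick behind merging many dim frames is averaging: random sensor noise shrinks roughly with the square root of the number of frames, so shadows can be brightened after the merge without drowning in noise. A minimal sketch with synthetic values (the signal level, noise level, gain, and frame count are all invented for illustration):

```python
import random

# Toy demo: brightening a dim value looks far cleaner after averaging a burst
# of frames, because random noise shrinks roughly with sqrt(number of frames).
random.seed(0)

TRUE_SHADOW = 0.05      # dim scene value a single short exposure would record
NOISE_SIGMA = 0.02      # per-frame noise
FRAMES = 9
GAIN = 8                # shadow brightening applied after the merge

def noisy_frame():
    return TRUE_SHADOW + random.gauss(0, NOISE_SIGMA)

single = noisy_frame()
burst_average = sum(noisy_frame() for _ in range(FRAMES)) / FRAMES

print(f"ideal, brightened:           {TRUE_SHADOW * GAIN:.3f}")
print(f"single frame, brightened:    {single * GAIN:.3f}")
print(f"{FRAMES}-frame merge, brightened:   {burst_average * GAIN:.3f}")
```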
Better iPhone camera hardware

The larger sensor on the iPhone 12 Pro Max -- 47% bigger than the iPhone 11's main camera sensor -- brings correspondingly larger pixels. That engineering choice increases the sensor's cost but lets it gather more light for better color, less noise, and improved low-light performance.
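Apple quotes the change as a percentage of sensor area, which translates into pixel size with a square root: assuming the pixel count stays the same (Apple's main cameras have used 12-megapixel sensors, a detail not stated above), a 47% larger area means each pixel's pitch grows by about 21% and each pixel collects roughly 47% more light. The arithmetic:

```python
import math

# Translate Apple's "47% bigger" sensor area into pixel-level terms,
# assuming the pixel count is unchanged.
area_increase = 1.47
pixel_pitch_increase = math.sqrt(area_increase)

print(f"pixel pitch:     x{pixel_pitch_increase:.2f}")   # ~1.21, i.e. ~21% wider pixels
print(f"light per pixel: x{area_increase:.2f}")          # ~47% more light gathered per pixel
```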
The 12 Pro Max also stabilizes images by shifting the sensor rather than the lens elements, which Apple said lets you take handheld shots with a surprisingly long 2-second exposure time.
All the iPhone 12 models also benefit from a wider f/1.6 aperture on the main camera for better light-gathering ability. And the ultrawide camera now gets optical image stabilization.
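The benefit of that wider aperture can be quantified: the light a lens admits scales with the inverse square of the f-number, so moving to f/1.6 from the f/1.8 lens of earlier main cameras (an assumed baseline, not stated above) admits roughly 27% more light. A quick check:

```python
# Light admitted by a lens scales with 1 / f-number squared.
OLD_F_NUMBER = 1.8   # assumed aperture of earlier iPhone main cameras (not stated in the article)
NEW_F_NUMBER = 1.6   # iPhone 12 main camera

light_gain = (OLD_F_NUMBER / NEW_F_NUMBER) ** 2
print(f"light admitted: x{light_gain:.2f}")   # ~1.27, about 27% more light
```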
The phones gain better video abilities, too, with 10-bit encoding designed to capture color and brightness more faithfully, as well as support for Dolby Vision HDR video. The iPhone 12 Pro models can shoot HDR video at 60 frames per second, while the iPhone 12 and 12 Mini top out at 30fps.
Ordinary 4K and 1080p video can be shot at up to 60fps, and slow-motion 1080p reaches 240fps. Time-lapse videos are now stabilized.
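The step from the usual 8-bit video to 10-bit encoding sounds small, but the gain in tonal precision is substantial, since bit depth sets how many brightness steps each color channel can record. A quick calculation (the 8-bit baseline is standard SDR video, not something stated above):

```python
# Tonal precision at different video bit depths.
def levels(bits: int) -> int:
    """Distinct brightness steps each color channel can record."""
    return 2 ** bits

print(levels(8))         # 256 steps per channel (typical SDR video)
print(levels(10))        # 1,024 steps per channel -- 4x finer gradations
print(levels(10) ** 3)   # ~1.07 billion representable colors vs ~16.8 million at 8 bits
```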
The Apple iPhone 12 brings Night Mode to ultrawide and selfie cameras, not just the main camera. Screenshot by Stephen Shankland/CNET

What the iPhone doesn't do

But Apple hasn't gone as far as some rivals in trying to grab photography headlines.
The iPhone 12 skips pixel binning, for example, a technique that pairs a much higher resolution sensor with flexible processing. Pixel binning pools data from groups of four or nine neighboring pixels to produce the color information for a single pixel in the final photo. When there's enough light, the phone can skip the binning and capture a much higher resolution image, which offers more detail or more room to crop in on the important part of the scene.
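Binning itself is simple to picture: the sensor's output is divided into small blocks (2x2 for groups of four, 3x3 for groups of nine) and each block is averaged into one output pixel. A minimal NumPy sketch of 2x2 binning, purely illustrative given that the iPhone 12 doesn't use the technique (real sensors bin within each color channel of the Bayer pattern, a detail omitted here):

```python
import numpy as np

def bin_pixels(raw: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average each factor x factor block of sensor values into one output pixel."""
    h, w = raw.shape
    h, w = h - h % factor, w - w % factor                 # trim edges that don't divide evenly
    blocks = raw[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)            # stand-in for a 4x4 patch of sensor data
print(bin_pixels(raw))
# [[ 2.5  4.5]
#  [10.5 12.5]]  -- each output pixel is the mean of a 2x2 block of the original
```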
Another recent trick the iPhone skipped is a telephoto camera with much higher magnification; the Huawei P40 Pro Plus offers an impressive 10x optical zoom, for example. Such designs are difficult because the laws of physics make telephoto optics long, but smartphone makers like Huawei and Samsung tackle the problem with a mirror that bends the light path sideways into the interior of the phone.
Apple could have other tricks up its sleeve, though. In 2017, Apple acquired image sensor startup InVisage, whose QuantumFilm technology held some promise for making image sensors smaller or improving image quality.
And Apple has done plenty with computational photography on its own, notably a portrait mode that simulates the blurred-background "bokeh" of high-end cameras, along with lighting effects that can be applied to those portraits.