Apple didn’t care about megapixels, until the iPhone 14 Pro arrived


Apple is used to adopting technologies that have been on the market for years. The move usually works out for the company, and proof of this is its reinterpretation of widgets in iOS 14 or the spin it has put on the Always-On Display. However, the biggest leap this year, with the iPhone 14 Pro and 14 Pro Max, is in the camera: finally, goodbye to 12 megapixels.

This generation's bet is pixel binning, the pixel-grouping technique we have known since Huawei implemented it under the name Light Fusion in the Huawei P20 Pro. Four years later, Apple is betting on the same approach. The challenge ahead is not minor: it has to show that its implementation is better than the competition's.

Turns out Pixel Binning wasn’t such a bad idea.


The megapixel war had its feverish moment with 108-megapixel sensors, then seemed to calm down, and over the last two years manufacturers have been showing that 12 megapixels are no longer enough. Google took a historic leap with its Pixel 6, giving up 12 megapixels and embracing 50.

Few manufacturers have managed to show that the jump to 50, 64 or 108 megapixels makes sense

Apple has done the same with its camera, which now groups pixels 4-to-1 to shoot at 12 megapixels despite having a 48-megapixel sensor. On paper, this technique yields brighter photographs (after all, four small pixels are combined into one larger effective pixel). In practice, few manufacturers have taken real advantage of it, either in brightness or in the raw detail the sensor recovers; see the maximum-resolution section of our camera comparison for proof.
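To make the 4-to-1 grouping concrete, here is a minimal sketch of 2×2 binning on a grayscale readout represented as a list of rows. The function name and the toy values are illustrative, not Apple's pipeline; real sensors bin same-color photosites within a quad-Bayer mosaic and may sum rather than average.

```python
def bin_2x2(sensor):
    """Average each 2x2 block of raw pixel values into one output pixel.

    Combining four photosite readings per output pixel improves the
    signal-to-noise ratio at the cost of resolution: a 48-megapixel
    readout becomes a 12-megapixel image, the 4-to-1 grouping
    described above.
    """
    binned = []
    for y in range(0, len(sensor), 2):
        row = []
        for x in range(0, len(sensor[0]), 2):
            block = (sensor[y][x] + sensor[y][x + 1] +
                     sensor[y + 1][x] + sensor[y + 1][x + 1])
            row.append(block / 4)
        binned.append(row)
    return binned

# A 4x4 "sensor" becomes a 2x2 image: each output pixel averages four inputs.
raw = [
    [10, 20, 30, 40],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [50, 60, 70, 80],
]
print(bin_2x2(raw))  # [[15.0, 35.0], [55.0, 75.0]]
```

The output image is half as wide and half as tall, but each of its pixels carries the combined signal of four photosites, which is where the low-light gain comes from.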


In this regard, Apple claims the new camera will gather triple the light in dimly lit scenes. The main camera's aperture has actually narrowed compared to the previous generation, so Apple must be fairly confident that the camera is brighter overall.


The iPhone 14 Pro will be capable of a 2X "optical" zoom by using the central portion of the 48-megapixel sensor, interpolating it and (in theory) merging it into a lower-resolution photograph with more detail than classic 2X digital zoom (the kind we get by pinching to zoom). It sounds promising on paper, but it is something Samsung already does on its high-end phones.
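The geometry behind that crop-based zoom can be sketched in a few lines. This is an illustrative toy, not Apple's actual processing: it only shows why the center of a high-resolution frame gives a 2X field of view with native pixels instead of upscaled ones.

```python
def crop_2x(frame):
    """Return the central half-width, half-height region of a frame.

    Keeping the middle 50% in each dimension halves the field of view
    (a "2X" zoom) without digital upscaling: the center quarter of a
    48-megapixel readout is still a genuine 12-megapixel image.
    """
    h, w = len(frame), len(frame[0])
    y0, x0 = h // 4, w // 4
    return [row[x0:x0 + w // 2] for row in frame[y0:y0 + h // 2]]

# An 8x8 frame yields a 4x4 center crop: a quarter of the pixels,
# covering half the angular field in each direction.
frame = [[x + 10 * y for x in range(8)] for y in range(8)]
center = crop_2x(frame)
print(len(center), len(center[0]))  # 4 4
```

The trade-off is that the cropped image has a quarter of the pixels; the detail advantage over pinch-to-zoom comes from those pixels being real sensor data rather than interpolated ones.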

There is also the promise that Apple ProRAW will improve substantially, allowing RAW shooting at 48 megapixels. The key here will be whether Apple stops smearing away noise as it did in the last generation, so that we can recover maximum detail from the sensor. Whether it works as it should or not, again, it is a feature already present on Android phones.

Apple often reinterprets functions that we already knew in Android. With Pixel Binning, they have simply jumped on the bandwagon of something that has been around for four years.

Apple has a great challenge ahead. It has to prove that jumping to 48 megapixels makes sense, that the difference in detail between the 12- and 48-megapixel modes is real (not every phone can say that) and that the wait has been worth it.

If the iPhone 14 Pro's camera does not improve substantially on the iPhone 13 Pro's, Apple will have arrived late to a trend, and practically in vain.

