The iPhone's camera has long been one of its greatest features, and it's one of the main reasons Apple fans queue up to get their hands on each new model.
But what if we told you that a new blind camera test suggests iPhone photos are getting worse over time? The finding comes from top YouTuber and tech enthusiast Marques Brownlee, who pointed out some major flaws in the device's image processing system.
Brownlee did name the iPhone 14 Pro the winner of the Best Camera System category at his high-profile Camera Awards last month, which makes this result all the more surprising.
It was actually the Google Pixel 6A that took first place, with the Pixel 7 Pro coming in second. Understandably, a lot of people are now wondering what is going on with the iPhone and its images.
The answer is that the iPhone's image post-processing has become heavy-handed. Capturing a great shot requires a sensor that can take in as much light, and as much detail, as possible. Since phone sensors are small, manufacturers instead lean on software tricks in the post-processing pipeline to enhance pictures.
Every modern smartphone uses a combination of hardware and software to enhance pictures after they are taken, compensating for the lack of the large sensors found in DSLRs.
That includes steps like reducing noise, adjusting white balance, and lifting brightness to reveal more detail in darker scenes.
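To make those three steps concrete, here is a minimal sketch of a toy post-processing pass using OpenCV and NumPy. This is purely illustrative and in no way Apple's actual pipeline; the file names and parameter values are assumptions for the example.

```python
import cv2
import numpy as np

def simple_post_process(img_bgr: np.ndarray) -> np.ndarray:
    """Toy post-processing pass: denoise, gray-world white balance,
    then a gamma lift to brighten shadows. Illustrative only."""
    # 1. Noise reduction via OpenCV's non-local means denoiser.
    denoised = cv2.fastNlMeansDenoisingColored(img_bgr, None, 10, 10, 7, 21)

    # 2. Gray-world white balance: scale each channel so its mean
    #    matches the overall mean brightness.
    f = denoised.astype(np.float32)
    means = f.reshape(-1, 3).mean(axis=0)      # per-channel means
    balanced = f * (means.mean() / means)      # rescale each channel

    # 3. Gamma < 1 brightens midtones and shadows to show more detail.
    gamma = 0.8
    lifted = 255.0 * (np.clip(balanced, 0, 255) / 255.0) ** gamma

    return np.clip(lifted, 0, 255).astype(np.uint8)

img = cv2.imread("photo.jpg")                  # hypothetical input file
cv2.imwrite("photo_processed.jpg", simple_post_process(img))
```

Each step on its own is harmless; the criticism is about how aggressively they are stacked together.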
The newest iPhones ship with Smart HDR, which merges multiple frames shot at different exposure settings into a single photo. This lets the device pick the best parts of each frame to produce a better image. But with this much processing in play, pictures can end up looking far removed from reality, and that is the core issue with the iPhone's camera.
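Apple's Smart HDR is proprietary, but the general idea of merging bracketed exposures can be sketched with OpenCV's Mertens exposure fusion, a publicly documented analog rather than Apple's method. The file names below are placeholders.

```python
import cv2
import numpy as np

# Hypothetical bracketed shots of the same scene: under-, normally,
# and over-exposed frames.
frames = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion weights each pixel by contrast, saturation,
# and how well-exposed it is, then blends the frames -- effectively
# "selecting the best parts" of every exposure.
merge = cv2.createMergeMertens()
fused = merge.process(frames)                  # float32 image in [0, 1]

cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```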
While most devices handle favorable scenarios well, such as clear skies or a plain background, mixing different hues and textures in the same scene tends to invite distortion. How well the camera adapts to those changes ultimately determines the final result.
Read next: The Future Of 6G Wireless Technology Could Use Humans As A Power Source