Every new smartphone with a camera capable of 50x or even 100x digital zoom sparks controversy over whether such modules can really take clear pictures of the Moon. At various times, different manufacturers have been accused of outright replacing the photo of the Moon during shooting, that is, of producing a complete imitation.

The latest such discussion erupted after the release of the Galaxy S23 Ultra with a 200-megapixel camera capable of 100x digital zoom. It all started when Marques Brownlee (MKBHD) recorded a YouTube Shorts clip showing how the S23 Ultra photographs the Moon, saying that unlike Huawei, which had been accused of faking such shots, Samsung's camera takes real pictures.

That claim prompted Reddit user ibreakphotos to run a small experiment. He took a photo of the Moon, scaled it down to 170×170 pixels, blurred it, displayed it on his computer screen, and then photographed the screen with a Galaxy S23 Ultra at 100x zoom. Sure enough, the smartphone's shot contained more detail than the source image, even though that detail could not possibly have been recovered from it.
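For anyone curious to reproduce the setup, here is a minimal sketch of the preparation step in Python with Pillow. The 170×170 downscale comes from the original post; the file name and blur radius are placeholders, since the exact parameters were not published:

```python
# Rough reconstruction of the test image ibreakphotos described: take a
# Moon photo, downscale it to 170x170 and blur it so the source provably
# lacks fine detail. "moon.jpg" and the blur radius are placeholders.
from PIL import Image, ImageFilter

img = Image.open("moon.jpg").convert("L")          # load and convert to grayscale
small = img.resize((170, 170), Image.LANCZOS)      # discard fine detail
blurred = small.filter(ImageFilter.GaussianBlur(radius=2))
blurred.save("moon_degraded.png")                  # show on screen, then photograph
```

Any detail that appears in the phone's 100x shot beyond what this degraded file contains must have been synthesized rather than captured.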

How a Samsung camera takes clear pictures of the Moon

This reignited the debate over whether smartphone shots of the Moon are outright fakes and what modern mobile photography really is. Samsung, it turns out, had long since published a blog post in Korean explaining exactly how its Moon shots work. Amid the discussions on Reddit and in the media, the company translated the post into English.

It reveals that the Galaxy camera uses a Super Resolution function that synthesizes more than ten frames taken at 25x zoom or higher into a single Moon photo. These shots, of course, still need noise removal, sharpening and other enhancement.

At the same time, Super Resolution technology builds the image through multi-frame composition. If the Scene Optimizer function is enabled and the Moon is recognized as an object, the camera produces a higher-quality image with the help of a neural network.
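To see why stacking frames helps, here is a toy illustration of multi-frame composition: averaging aligned frames suppresses uncorrelated sensor noise roughly with the square root of the frame count. This is a simplified stand-in, not Samsung's actual pipeline:

```python
# Toy illustration of multi-frame composition: averaging N aligned frames
# reduces uncorrelated sensor noise roughly by a factor of sqrt(N).
import numpy as np

def stack_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of aligned grayscale frames into one image."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Simulated burst: one clean scene plus independent per-frame noise.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, (170, 170)).astype(np.float32)
burst = [scene + rng.normal(0, 20, scene.shape) for _ in range(10)]
merged = stack_frames(burst)               # visibly less noisy than any single frame
```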

The Moon recognition mechanism itself was trained on the Moon's various shapes and details, from full to crescent, using images taken from Earth.

Scene Optimizer uses a deep-learning model to detect the Moon's presence and the area it occupies in the frame. Because of tidal locking, only one side of the Moon (about 60% of its surface, thanks to libration) is ever visible from Earth, which makes it a fairly easy object to train a neural network on. Moreover, once trained, the model can determine the area occupied by the Moon even in images that were not part of its training data.
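Samsung has not published its detector, so the sketch below uses a plain brightness threshold in place of the deep-learning model, purely to show what "detect the Moon and the area it occupies" means as an interface:

```python
# Toy stand-in for the detection step. The real Scene Optimizer uses a
# trained deep-learning model; a brightness threshold is used here only
# to illustrate the interface: find the bright disc and report its area.
import numpy as np

def detect_bright_disc(gray: np.ndarray, thresh: int = 200):
    """Return the (x0, y0, x1, y1) bounding box of bright pixels, or None."""
    ys, xs = np.nonzero(gray > thresh)     # the Moon far outshines the night sky
    if xs.size == 0:
        return None                        # nothing Moon-like in the frame
    return int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1
```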

When the Moon detected by the Galaxy device is at a suitable brightness level, the user can press the capture button, and the camera goes through several steps to produce the image.

First, Scene Optimizer confirms whether the AI detail-enhancement engine should be applied. The camera then takes several shots and synthesizes them into a single bright image with reduced noise.

After multi-frame processing is complete, the Galaxy camera uses the deep-learning-based Scene Optimizer engine to remove residual noise and enhance image detail.
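Putting the described steps together, the overall capture flow might look like the following sketch, where every function is a trivial placeholder for a proprietary Samsung component:

```python
# Sketch of the capture flow as the blog post describes it. Every function
# here is a trivial placeholder, not Samsung's actual implementation.
import numpy as np

def capture_burst(n: int) -> list[np.ndarray]:
    """Placeholder camera: n noisy grayscale frames."""
    rng = np.random.default_rng()
    return [rng.integers(0, 256, (170, 170)).astype(np.float32) for _ in range(n)]

def detect_moon(frame: np.ndarray) -> bool:
    """Placeholder for the deep-learning detector in Scene Optimizer."""
    return frame.mean() > 10               # toy heuristic, not the real model

def enhance_details(image: np.ndarray) -> np.ndarray:
    """Placeholder for the deep-learning detail-enhancement engine."""
    return image                           # identity stand-in

def shoot_moon() -> np.ndarray:
    preview = capture_burst(1)[0]
    if not detect_moon(preview):           # 1. decide whether AI enhancement applies
        return preview
    burst = capture_burst(10)              # 2. take several frames at high zoom
    merged = np.mean(burst, axis=0)        # 3. merge into one low-noise image
    return enhance_details(merged)         # 4. remove residual noise, add detail
```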

In essence, Samsung's software really does add the detail it needs to produce a better picture of the Moon, using a neural network trained on hundreds of Moon photos. The only thing separating this from outright replacement is that the camera does take real shots and tries to extract additional information from them. You can judge this approach in different ways, but this is simply how modern mobile photography works.

Back in 2016, with the announcement of the first Pixel smartphones, it became clear that post-processing algorithms would play an ever larger role in image creation. Google fitted its devices with quite ordinary camera modules that did not stand out from the competition, yet they surprised users with the quality of their pictures, all thanks to neural networks that literally added detail to the photos.

The same thing happens in Night Mode and often when shooting landscapes or even food. Smartphones improve photos automatically; it has become an integral part of how they work. Manufacturers are simply balancing the physical capabilities of camera modules against what algorithms can squeeze out of them. Post-processing is usually not this aggressive, though, and the Moon case is a rare, extreme use of the technology.

As always, the result comes first, and if you like your smartphone's photo of the Moon, that's fine. Fans of a more naturalistic approach can often turn algorithmic enhancements off in the camera settings; on Samsung devices, it is enough to disable the Scene Optimizer option.