The photos taken by Samsung’s Space Zoom feature on Galaxy phones have come under scrutiny again this week, after a Reddit post claimed the software process involves fabrication of additional details. Samsung has now responded to the allegations and disputed the claims.
We asked Samsung if moon photos taken with phones like the Samsung Galaxy S23 Ultra contain extra detail or texture not present in the original photos. In an official statement, Samsung said: “When a user takes a picture of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for multi-frame composition, then AI enhances the details of the image quality and colors. No image overlay is applied to the photo.”
Samsung added that this process is not mandatory, stating that “users can deactivate the AI-based Scene Optimizer, which disables automatic detail enhancements of captured photos.” However, turning it off means you won't get the kind of results Scene Optimizer delivers, because the feature goes far beyond adjusting exposure.
These comments echo what Samsung has previously said about its Space Zoom moon shots, both on a Samsung Community board and in a comment to Input magazine last year. At the time it said that “no image overlay or texture effects are applied when taking a photo”, and it stands by that explanation. So where does this leave us?
Just another amazing moon shot with the Samsung Galaxy S23 Ultra. No tripod, not even a full Space Zoom. #SamsungGalaxyS23Ultra pic.twitter.com/PFngh8vcBE (March 5, 2023)
The boring answer is that all photography sits on a sliding scale between the so-called “real” kind – photons that hit a camera sensor and are converted into an electrical signal – and the “fake” kind that Samsung is accused of again in this latest controversy.
AI-powered modes, such as Samsung’s latest Scene Optimizer, which has been producing moon shots like the one below since the Samsung Galaxy S21, are undoubtedly pushing photography towards the more artificial end of that scale. That’s because it uses multi-frame synthesis, deep learning and what Samsung calls a “detail enhancement engine” to produce the impressive end results.
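The multi-frame part of that pipeline is the least mysterious: stacking several short exposures of the same scene and averaging them suppresses random sensor noise, which is part of how extra detail can be recovered at long zoom. As a rough illustration only (not Samsung's actual algorithm, which also involves alignment and a proprietary deep-learning stage), a minimal frame-averaging sketch looks like this:

```python
import random

def multi_frame_average(frames):
    """Average several noisy exposures of the same scene.

    Averaging N independent noisy frames cuts random noise by
    roughly a factor of sqrt(N), which is one reason multi-frame
    composition can recover detail a single exposure cannot.
    """
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Simulate a tiny "true" 2x2 scene plus per-frame sensor noise.
random.seed(0)
truth = [[100, 150], [200, 50]]
frames = [[[p + random.gauss(0, 10) for p in row] for row in truth]
          for _ in range(16)]

# The stacked result lands much closer to the true values than
# any individual noisy frame does.
stacked = multi_frame_average(frames)
```

With 16 frames, the noise in the averaged image is about a quarter of that in any single frame; real pipelines add frame alignment and learned enhancement on top of this basic idea.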
We’re still not exactly sure what’s going on in that engine, and it’s fair to say that that extra moon detail was conjured up from the very limited information captured by your Galaxy’s camera. But Samsung still disputes that this detail is simply applied to or overlaid on Space Zoom moon photos.
Analysis: a debate with fuzzy edges
Samsung’s response isn’t detailed enough to settle the debate over whether its Space Zoom photos are “fake,” because that’s really a matter of opinion. But it does at least deny the suggestion that it’s simply pasting extra detail and texture over your shots wholesale.
The problem with the debate is that any digital photo — even a raw file — is some kind of fabrication. During the demosaicing process, when the red, green, and blue values for each of a sensor’s pixels are created, a process called interpolation estimates each pixel’s missing color values from those of its neighbors.
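To make that concrete: each photosite on a typical sensor sits under a single red, green, or blue filter, so two of its three color values are never measured and have to be estimated. A minimal, hypothetical sketch of one such step — filling in the green value at a red photosite by averaging its four green neighbors — might look like this (real demosaicing algorithms use edge-aware refinements of the same idea):

```python
def interpolate_green(bayer, y, x):
    """Estimate the missing green value at a red (or blue) site of a
    Bayer mosaic by averaging its four green neighbours.
    A simplified bilinear interpolation step, for illustration only."""
    neighbours = [bayer[y - 1][x], bayer[y + 1][x],
                  bayer[y][x - 1], bayer[y][x + 1]]
    return sum(neighbours) / len(neighbours)

# A 3x3 patch of raw sensor values. The centre pixel sits under a
# red filter, so its green value was never measured; the four
# pixels above, below, left, and right are green sites.
raw = [
    [40, 120, 44],    # B  G  B
    [118, 200, 122],  # G  R  G
    [42, 124, 46],    # B  G  B
]

green_at_centre = interpolate_green(raw, 1, 1)  # average of 120, 124, 118, 122
```

In other words, even before any AI gets involved, two-thirds of a photo's color data is already an educated guess.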
Once you add multi-frame processing and AI sharpening to the mix, it’s clear that every photo is highly artificial (and based on processing guesswork). But the question raised by this debate is whether Samsung’s phones have gotten to the point where some of its images — particularly the moon shots — have become completely detached from photon capture.
That’s a debate that will probably never be settled. AI algorithms fill in details based on patterns learned from a huge dataset of similar photos, but Samsung says what it doesn’t do is retrieve a previous image of the moon and superimpose it on your blurry photo. Whether that process is acceptable is for you to decide the next time you see a glorious full moon and only have your Galaxy smartphone to hand.