Samsung’s Moon Shots Force Us to Ask How Much AI Is Too Much

We like HDR-pimped pics, but this latest camera controversy shows how computational photography could send us down a dicey path.
Composite image of two Moons: one out of focus, the other in focus and oversaturated in color.
Photograph: Onfokus/Getty Images

Have you heard the moon conspiracy theory? No, not the one about the moon landings. It’s about the Samsung Galaxy S23 Ultra, and the theory that it fabricates pictures of the moon, creating images far more detailed than the camera itself can actually capture.

Is it true? The reality is a bit more complicated than a pure yes or no. And the closer you look, the more you realize that whether a photo is “real” or not is a question you could ask of most of the photos you take with a phone.

The Moon Issue Lands

The Samsung Galaxy S23 Ultra Moongate saga began when Reddit user ibreakphotos posted about their own experiments with moon photography. Their claim is that when someone shoots the moon using the phone’s super-extended hybrid zoom mode, Samsung effectively puts a lunar texture on the images.

This is something Samsung denies. We reached out to the company to get the official line: “Samsung is committed to delivering best-in-class photo experiences in any condition. When a user takes a photo of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for multi-frame composition, after which AI enhances the details of the image quality and colours. It does not apply any image overlaying to the photo. Users can deactivate the AI-based Scene Optimizer, which will disable automatic detail enhancements to any photos taken.”

Creating a single image out of multiple exposures is at the heart of computational photography. But, as ibreakphotos set out to prove, there’s more going on here, and some of the user’s testing was pretty smart. A photo of the moon was blurred and displayed on a monitor at a distance, putting a hard ceiling on how much detail it would be possible to capture, regardless of the quality of the camera’s optics.
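To see why blurring the source caps the achievable detail, here is a rough sketch of that style of test in Python. The file names, resolution, and blur radius are hypothetical, and Pillow’s Gaussian blur stands in for whatever downscaling the original test used:

```python
# Illustrative sketch of an ibreakphotos-style test: blurring a source image
# discards its fine detail, so no honest capture of it should show more.
# File names and settings are hypothetical; requires the Pillow library.
from PIL import Image, ImageFilter

source = Image.open("moon_high_res.jpg").convert("L")

# Downscale, then blur, so fine crater detail is irretrievably thrown away
degraded = source.resize((170, 170)).filter(ImageFilter.GaussianBlur(radius=3))
degraded.save("moon_test_target.png")

# Display moon_test_target.png full-screen, photograph it from across the room.
# Any crater detail in the phone's output beyond what survives in this file
# must have been synthesized by the processing pipeline, not captured.
```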

However, the Samsung Galaxy S23 Ultra’s image still gives the impression of having much more detail than the source image being photographed. The effect in their sample images is dramatic.

This test has, no surprise, been repeated elsewhere since this whole moon issue hit blast-off. Famous YouTuber Marques Brownlee tried it, for example, and found that while his results were nowhere near as dramatic as ibreakphotos’ on Reddit, the effect was there. Mobile photography content creator Shayne Mostyn’s results sat somewhere between the two.

The Moon Travels in Circles?

Something is going on here. But this is not the “gotcha” scoop that parts of the internet might have you believe, because the issue has cropped up before.

Samsung introduced its moon mode processing two years ago with the Samsung Galaxy S20 Ultra, which in turn introduced us to the company’s 10x zoom camera and 100x hybrid “space zoom.” Its successor, the S21 Ultra, with an even better zoom, was accused of faking photos, leading Input to investigate for itself; it came to largely the same conclusions we see today. The Galaxy S21 Ultra did do a bit more than your standard image processing when shooting the moon.

Samsung itself published a lengthy explainer on how it works in 2022, as part of the CamCyclopaedia found on the company's Korean website.

“The high-magnification actual image output from the sensor has a lot of noise, so it is not enough to give the best quality experience even after compositing multiple shots,” Samsung’s Seoyoung Lee wrote. “To overcome this, the Galaxy Camera applies a deep learning-based AI detail enhancement engine (Detail Enhancement technology) at the final stage to effectively remove noise and maximize the details of the moon to complete a bright and clear picture of the moon.”

Saying that a piece of tech applied machine learning or AI is barely more useful than just saying “abracadabra,” but Samsung goes a little further.


“AI models that have been trained can detect lunar regions even if other lunar images that have not been used for learning are inserted,” says Samsung. In other words, Samsung’s machine learning can recognize the moon in fresh pictures you take, not just in the images used to train the system.

What’s the Real Truth?

Combine this with Samsung’s more recent statement that the Galaxy S23 Ultra’s Moon mode “does not apply any image overlaying to the photo,” and it suggests the S23’s processing engine knows where the shapes that form the moon’s craters and “seas” are. That knowledge is the root of the mode’s ability to seemingly increase the detail in photos without simply pasting a JPEG of the moon into them.

This raises the question: What else does the Samsung Galaxy S23 perform these tricks on? Interestingly, the moon is an almost singularly suitable candidate for this sort of processing.

Synchronous tidal locking means we always see the same face of the moon. While you may think of the moon as something that changes each day of the month, it’s largely how much of its surface is obscured in shadow that changes.

And unlike, for example, the Eiffel Tower, its appearance is not going to change drastically based on lighting. Moon shooting typically only happens at night, and Samsung’s processing falls apart if the moon is partially obscured by clouds.

One of the clearest ways Samsung’s processing fiddles with the moon is in manipulating mid-tone contrast, making its topography more pronounced. However, it's clearly also capable of introducing the appearance of texture and detail not present in the raw photo.
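Samsung has not published what that contrast adjustment looks like, so here is only a minimal sketch of the general technique in Python, not Samsung’s implementation; the S-curve and strength value are invented for illustration:

```python
# A minimal sketch of a midtone contrast boost on luminance values in [0, 1].
# This is generic image processing, not Samsung's actual pipeline.
import numpy as np

def boost_midtone_contrast(luma: np.ndarray, strength: float = 0.6) -> np.ndarray:
    """Apply an S-curve that pushes values near mid-gray apart.

    Lunar "seas" darken and highlands brighten, so topography looks more
    pronounced, even though no new detail is added to the image.
    """
    s_curve = luma * luma * (3.0 - 2.0 * luma)  # smoothstep-style curve
    return (1.0 - strength) * luma + strength * s_curve

# Mid-gray stays put, while values on either side of it spread out
print(boost_midtone_contrast(np.array([0.30, 0.50, 0.70])))  # ≈ [0.25, 0.50, 0.75]
```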

Samsung does this because the Galaxy S21, S22, and S23 Ultra phones’ 100x zoom images suck. Of course they do. They involve cropping massively into a small 10-MP sensor. Periscope zooms in phones are great, but they are not magic.

Credible Theories

Huawei is the other big company accused of faking its moon photos, with the otherwise brilliant Huawei P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, effectively destroying its phones’ appeal in the West.

Android Authority claimed the phone pasted a stock image of the moon into your photos. Here’s how the company responded: “Moon Mode operates on the same principle as other master AI modes, in that it recognizes and optimizes details within an image to help individuals take better photos. It does not in any way replace the image—that would require an unrealistic amount of storage space since AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps to optimize focus and exposure to enhance the details such as shapes, colors, and highlights/lowlights.”

Familiar, right?

You won’t see these techniques used by many other brands, but not for any high-minded reason. If a phone does not have a long-throw zoom of at least 5x, a Moon mode is largely pointless.

Trying to shoot the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn't have the zoom range for it, and the phone's autoexposure will turn the moon into a searing blob of white. From a photographer's point of view, the exposure control of the S23 alone is excellent. But how “fake” are the S23’s moon images, really?

The most generous interpretation is that Samsung uses the real camera image data and just applies its machine-learning knowledge to massage the processing. This could, for example, help it trace the outlines of the Sea of Serenity and Sea of Tranquility when attempting to bring out a greater sense of detail from a blurred source.

However, that line is stretched by the way the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with seemingly uncanny accuracy when these small features are not perceptible in the source. You can infer where moon features sit from a blurry source, but this is next-level stuff.

Still, it’s easy to overestimate how much of a leg up the Samsung Galaxy S23 gets here. Its moon photos may look OK at a glance, but they are still bad. A recent Versus video featuring the S23 Ultra and Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera is capable of.

A Question of Trust

The furor over this moon issue is understandable. Samsung uses lunar imagery to hype its 100x camera mode, and the images are, to an extent, synthesized. But it has really just poked a toe outside the ever-expanding AI Overton window here, which has directed phone photography innovation for the past decade.

Each of these technical tricks, whether you call them AI or not, was designed to do what would have been impossible with the raw basics of a phone camera. One of the first of these, and arguably the most consequential, was HDR (High Dynamic Range). Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.

In its early years this was a feature you might use on occasion. Lots of early Android phones’ HDR processing made images look fake, flattened, and unnaturally coloured. You might be able to see “ghosting” too, where—for example—tree branches swaying in the wind are captured at different spots in the exposures that make up the final HDR image.

However, today almost all photos taken with a phone use multi-exposure HDR processing. It’s an invisible feature. And you could argue it became an AI process when phone image signal processors got smart enough to avoid ghosting by selecting information from each constituent exposure.
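To give a sense of what that merging involves, here is a minimal sketch of multi-exposure fusion in Python. It assumes the frames are already aligned and uses a simple well-exposedness weighting; real phone HDR pipelines are far more elaborate than this:

```python
# A minimal sketch of multi-exposure HDR fusion on aligned grayscale frames.
# Real pipelines also align frames and reject "ghosted" pixels.
import numpy as np

def fuse_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Blend several exposures of the same scene, given as floats in [0, 1].

    Each pixel is weighted by how well exposed it is (close to mid-gray),
    so highlight detail comes from the darker frames and shadow detail
    from the brighter ones.
    """
    stack = np.stack(frames)                                  # (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))  # well-exposedness
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)

# Toy example: a blown-out sky pixel is rescued by the darker exposure
dark, bright = np.array([[0.45]]), np.array([[1.0]])
print(fuse_exposures([dark, bright]))  # lands close to 0.45, not 1.0
```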

HDR images aren’t as “fake” as Samsung’s moon photos, but they are a composite. It’s wild how many overexposed skies you’ll see from the earlier-generation smartphone cameras reviewers raved about.

“Bokeh” background blur portrait modes are even faker. This started with the HTC One M8 in 2014, a phone with a pair of 4-MP cameras on the back. These used the parallax effect to tell near objects from far ones, create a depth map, and then blur out the background of your images. You know the drill.
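Here is a minimal sketch of that last step, assuming a depth map has already been estimated from parallax or machine learning; the function name, blur strength, and blend factor are invented for illustration:

```python
# A minimal sketch of depth-map-driven "portrait mode" blur. The depth map is
# assumed to exist already; estimating it is the hard part on a real phone.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image: np.ndarray, depth: np.ndarray, focus_depth: float) -> np.ndarray:
    """Blur pixels in proportion to their distance from the focus plane.

    `image` is an (h, w) grayscale array in [0, 1]; `depth` is (h, w) in [0, 1].
    """
    blurred = gaussian_filter(image, sigma=6)                # heavy background blur
    blend = np.clip(np.abs(depth - focus_depth) * 4, 0, 1)   # 0 at the subject
    return (1 - blend) * image + blend * blurred
```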

These images, which simulate the effect of a wide aperture lens of a much larger scale than the crappy little plastic elements of a phone camera, are fakes. But we tend to get less upset because it messes with the context your subject sits in, not the subject itself—at least after the early years when edge detection was particularly poor.

Hardware Versus Software

It took a little longer for “AI” to solve another kind of scene phones couldn’t handle on their own—low light. But hardware had a go first. The Nokia Lumia 920 was the first phone to use optical image stabilization, which uses a little motor to compensate for hand shake and allow longer exposures without a tripod. However, we had reason to go back through years-old phone review samples recently, and there was really nothing good to see by today’s standards before 2018’s Huawei P20 Pro.

It used what Huawei called Master AI. Taking night photos with the P20 Pro felt like capturing a long exposure, except you could do it handheld, because the camera effectively built up the image in slices based on the exposure level. And each subsequent image bunged into the algorithm could be used to lower the overall noise level.
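Stripped of frame alignment and the other real-world complications, the core of that trick is frame averaging. A minimal sketch, with the burst size and noise level invented for illustration:

```python
# A minimal sketch of handheld "night mode" stacking on already-aligned frames.
# Averaging n frames cuts random sensor noise by roughly sqrt(n).
import numpy as np

def stack_night_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of short exposures into one cleaner image."""
    return np.mean(np.stack(frames), axis=0)

# Toy example: a true pixel value of 0.2 buried in read noise
rng = np.random.default_rng(0)
burst = [0.2 + rng.normal(0.0, 0.05, size=(4, 4)) for _ in range(16)]
print(burst[0].std(), stack_night_frames(burst).std())  # noise drops roughly 4x
```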

It’s just what we use for night photography today, though it all tends to happen a lot quicker in 2023 than it did in 2018. AI and machine learning have taken over as the driving force of phone photography, but that’s the result of a hardware-versus-software battle that took place around a decade ago. Hardware kinda lost.

Examples of the hardware side apart from the Lumia 920 include the Samsung Galaxy Camera and its 21x optical zoom from 2012, the high-res, large-sensor Nokia 808 PureView (2012), and its follow-up, the Nokia Lumia 1020 (2013). And the Panasonic Lumix DMC-CM1 with its 1-inch sensor, which most people never knew existed in the first place.

It turns out you can’t make a chunky, unusual-looking phone a hit. As such, we’ve only seen real leaps in hardware in more recent years, when that hardware can fit into a normal-looking handset. The Huawei P30 Pro is one of the best examples, being the first mainstream phone to use a 5x periscope zoom with folded optics—the same hardware that helped to cause this latest software row.

Making Image AI Feel Good

Google proved you can weave something special out of simple hardware, too. Its astrophotography mode might be seen as the “zoomed out” alternative to the Samsung Galaxy S23 Ultra’s Moon mode.

No one accused Google’s Pixel astrophotography mode of being fake, but it represents things you can’t really see with your eyes. As explained in a 2019 Google AI blog post, it uses a series of 16-second exposures to bring out the stars of the night sky, and even the dust clouds of the Milky Way. The main job of AI here is to identify “hot pixels”—camera sensor pixels with dodgy read-out—which, unhelpfully, look a bit like stars, and then remove them by replacing them with the average of the surrounding pixels.
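A minimal sketch of that kind of cleanup, working on a single grayscale frame rather than the burst Google actually uses, with the detection threshold chosen purely for illustration:

```python
# A minimal sketch of hot-pixel removal on a single grayscale frame in [0, 1].
# Google's real pipeline works across a burst of exposures; this is illustrative.
import numpy as np
from scipy.ndimage import uniform_filter

def remove_hot_pixels(frame: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Replace isolated bright pixels with the average of their neighbours.

    A pixel that far outshines the mean of its eight neighbours is treated as
    a hot pixel (a stuck sensor read-out), not a star, and is patched over.
    """
    # Mean of the 8 surrounding pixels, excluding the pixel itself
    neighbour_mean = (uniform_filter(frame, size=3) * 9 - frame) / 8
    hot = (frame - neighbour_mean) > threshold
    cleaned = frame.copy()
    cleaned[hot] = neighbour_mean[hot]
    return cleaned
```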

Google has for years been a bit of a poster child for how to approach “AI” camera software. Its Night Sight mode images tend to look less excessively bright than some rivals’, and Pixel phones’ color reproduction is typically quite natural.

However, it does use generative and potentially misleading techniques just like Samsung and Huawei. The Pixel 6’s Magic Eraser function can remove people from images and fill in the space where they were using machine learning. Face Unblur combines images captured with the primary and ultrawide cameras to keep faces looking sharp even if the rest of the image is a bit blurred.

These shots are not quite as they appear, just like the Samsung Galaxy S23 Ultra’s Moon images. But Samsung and Huawei are seen to have stepped over the line, to have cheated or misrepresented their own product. This may have something to do with when the enhancement happens. No one would be upset if Samsung put a Moon Enhance button in its gallery app, but it would also be a bit naff.

We are going to have to get used to the unease of AI supplying what it thinks we want in disconcerting ways. Whether it’s chatbots serving up information you can never be quite sure is correct, or social media becoming increasingly filled with AI-generated images, AI’s growing pains may only have just begun.