
  • Written by Douglas Goodwin, Visiting Assistant Professor in Media Studies, Scripps College
Phone cameras can take in more light than the human eye – that’s why low-light events like the northern lights often look better through your phone camera

Smartphone cameras have significantly improved in recent years. Computational photography and AI allow these devices to capture stunning images that can surpass what we see with the naked eye. Photos of the northern lights[1], or aurora borealis, provide one particularly striking example.

If you saw the northern lights during the geomagnetic storms in May 2024[2], you might have noticed that your smartphone made the photos look even more vivid than reality.

Auroras, known as the northern lights (aurora borealis) or southern lights (aurora australis), occur when the solar wind disturbs[3] Earth’s magnetic field[4]. They appear as streaks of color across the sky.

The left side shows the aurora as seen with the naked eye. The right side reveals how a smartphone camera can capture brighter and more colorful lights. Douglas Goodwin[5]

What makes photos of these events even more striking than they appear to the eye? As a professor of computational photography[6], I’ve seen how the latest smartphone features overcome the limitations of human vision.

Your eyes in the dark

Human eyes are remarkable. They allow you to see footprints in a sun-soaked desert and pilot vehicles at high speeds. However, your eyes perform less impressively in low light.

Human eyes contain two types of cells that respond to light[7] – rods and cones. Rods are numerous and much more sensitive to light[8]. Cones handle color but need more light to function. As a result, at night our vision relies heavily on rods and misses color.

Rods and cones in your eyes are photoreceptors that process black and white as well as color. Blume, C., Garbazza, C. & Spitschan, M., CC BY-SA[9][10]

The result is like wearing dark sunglasses to watch a movie. At night, colors appear washed out and muted. Similarly, under a starry sky, the vibrant hues of the aurora are present but often too dim for your eyes to see clearly.

In low light, your brain prioritizes motion detection and shape recognition[11] to help you navigate. This trade-off means the ethereal colors of the aurora are often invisible to the naked eye. Technology is the only way to increase their brightness.

Taking the perfect picture

Smartphones have revolutionized how people capture the world. These compact devices use multiple cameras and advanced sensors to gather more light than the human eye can, even in low-light conditions. They achieve this through longer exposure times – how long the camera takes in light[12] – larger apertures, and higher ISO settings, which boost how strongly the sensor responds to the light[13] it receives.
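
To get a feel for how much these settings matter, here is a rough back-of-the-envelope sketch in Python. The baseline values and the simple linear scaling are illustrative assumptions for this article, not any phone’s actual metering pipeline:

```python
# Back-of-the-envelope model of how exposure settings scale the recorded signal.
# The baseline settings and linear scaling are illustrative assumptions,
# not any phone's real metering pipeline.

def relative_signal(exposure_s: float, f_number: float, iso: float) -> float:
    """Signal recorded relative to a 1/60 s, f/2.8, ISO 100 baseline.

    Light gathered grows linearly with exposure time and with aperture area
    (proportional to 1/f_number^2); ISO then amplifies whatever the sensor
    received.
    """
    baseline = (1 / 60) * (1 / 2.8**2) * 100
    return (exposure_s * (1 / f_number**2) * iso) / baseline

# A daytime snapshot vs. a night-mode-style aurora exposure:
print(relative_signal(1 / 60, 2.8, 100))   # 1.0 (the baseline)
print(relative_signal(3.0, 1.8, 3200))     # ~13,900x the baseline signal
```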

But smartphones do more than adjust these settings. They also leverage computational photography[14] to enhance your images using digital techniques and algorithms. Image stabilization[15] reduces blur from camera shake, and exposure settings optimize the amount of light the camera captures.

Multi-image processing[16] creates the perfect photo by stacking multiple images together. A setting called night mode[17] can balance colors in low light, while LiDAR capabilities[18] in some phones keep your images in precise focus.

Image stacking involves aligning and combining several noisy photos to enhance the final image’s quality. Averaging these images together suppresses random sensor noise. This results in a clearer and more detailed picture than any of the photos alone. Douglas Goodwin[19]
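
If you would like to see the averaging idea in action, here is a minimal Python sketch using NumPy. It assumes the burst of frames is already aligned and uses a made-up scene and noise level, so it illustrates the principle rather than any real phone pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "true" scene: a dim gradient standing in for a faint aurora.
truth = np.tile(np.linspace(0.05, 0.25, 256), (256, 1))

def noisy_frame(scene: np.ndarray, noise_sigma: float = 0.05) -> np.ndarray:
    """One short exposure: the scene plus random sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

# Capture a burst of frames. Real phones also align them first; here we
# assume the frames are already registered, so we can average directly.
frames = [noisy_frame(truth) for _ in range(16)]
stacked = np.mean(frames, axis=0)

# Averaging N frames cuts random noise by roughly sqrt(N).
print(np.std(frames[0] - truth))  # ~0.05: noise in a single frame
print(np.std(stacked - truth))    # ~0.0125: about 4x lower with 16 frames
```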

LiDAR stands for light detection and ranging, and phones with this setting emit laser pulses to calculate the distances to objects in the scene quickly in any kind of light. LiDAR generates a depth map of the environment to improve focus and make objects in your photos stand out.
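
The arithmetic behind that ranging is simple time-of-flight: a pulse travels out to the object and back at the speed of light, so the distance is half the round trip. A small illustrative Python sketch of the calculation, not any phone’s firmware:

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance in meters for a measured round-trip pulse time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after 20 nanoseconds came from roughly 3 meters away.
print(distance_from_round_trip(20e-9))  # ~3.0 m
```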

Smartphone cameras don’t just capture flat images – they collect depth information too. The left side shows a regular photo, while the right side illustrates the depth map, with lighter pixels closer to the camera and darker ones farther away. Normally hidden, this depth data enables smartphones to apply effects such as artificial background blur to mimic the look of the northern lights against a night sky. Douglas Goodwin[20]
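
Here is one simple way a depth map can drive that background blur, sketched in Python: blur a copy of the image, then blend it with the sharp original using the depth map as a mask. The Gaussian blur and the toy scene are illustrative stand-ins, not how any particular phone implements portrait mode:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image: np.ndarray, depth: np.ndarray, sigma: float = 8.0) -> np.ndarray:
    """Blend a sharp image with a blurred copy, using a depth map as the mask.

    `image` is an HxW grayscale array in [0, 1]; `depth` is an HxW map where
    1.0 means closest to the camera and 0.0 means farthest (the convention in
    the caption above). Near pixels stay sharp, far pixels take the blur.
    """
    blurred = gaussian_filter(image, sigma=sigma)
    return depth * image + (1.0 - depth) * blurred

# Toy example: a bright "subject" square in front of a dim background.
image = np.full((128, 128), 0.2)
image[48:80, 48:80] = 0.9          # the subject
depth = np.zeros_like(image)
depth[48:80, 48:80] = 1.0          # subject is near, everything else is far

result = portrait_blur(image, depth)
print(result.shape, result.min(), result.max())
```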

Artificial intelligence tools in your smartphone camera[21] can further enhance your photos by optimizing the settings, applying bursts of light and using super-resolution techniques[22] to get really fine detail. They can even identify faces[23] in your photos.

AI processing in your smartphone’s camera

While there’s plenty you can do with a smartphone camera, regular cameras do have larger sensors and superior optics, providing more control over the images you take. Camera manufacturers like Nikon, Sony and Canon typically avoid tampering with the image[24], instead letting the photographer take creative control.

These cameras offer photographers the flexibility of shooting in raw format[25], which allows you to keep more of each image’s data for editing and often produces higher-quality results.

Unlike dedicated cameras[26], modern smartphone cameras use AI both while and after[27] you snap a picture to enhance your photos’ quality. While you’re taking a photo, AI tools analyze the scene you’re pointing the camera at and adjust settings such as exposure, white balance and ISO, while recognizing the subject you’re shooting and stabilizing the image. These adjustments help make sure you get a great photo the moment you hit the button.
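
As one concrete example of this kind of adjustment, the classic “gray world” heuristic for white balance scales each color channel so the scene averages out to neutral gray. Phone pipelines use far more sophisticated, learned estimators, but this minimal Python sketch shows the basic idea:

```python
import numpy as np

def gray_world_white_balance(rgb: np.ndarray) -> np.ndarray:
    """Scale each channel so its mean matches the overall gray mean.

    `rgb` is an HxWx3 float array in [0, 1]. This is the classic "gray world"
    heuristic; real phones use learned estimators, but the goal of
    neutralizing a color cast is the same.
    """
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    gains = gray / channel_means
    return np.clip(rgb * gains, 0.0, 1.0)

# A scene with a warm orange cast gets pulled back toward neutral.
rng = np.random.default_rng(1)
scene = np.clip(rng.uniform(0.2, 0.8, size=(64, 64, 3)) * np.array([1.2, 1.0, 0.7]), 0, 1)
balanced = gray_world_white_balance(scene)
print(balanced.reshape(-1, 3).mean(axis=0))  # channel means now roughly equal
```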

You can often find features that use AI such as high dynamic range[28], night mode[29] and portrait mode[30], enabled by default or accessible within your camera settings.

AI algorithms further enhance your photos by refining details, reducing blur and applying effects such as color correction after you take the photo.

All these features help your camera take photos in low-light conditions, and they contributed to the stunning aurora photos you may have captured with your phone camera.

While the human eye struggles to fully appreciate the northern lights’ otherworldly hues at night, modern smartphone cameras overcome this limitation. By leveraging AI and computational photography techniques, your devices allow you to see the bold colors of solar storms in the atmosphere, boosting color and capturing otherwise invisible details that even the keenest eye will miss.

References

  1. ^ the northern lights (theconversation.com)
  2. ^ the geomagnetic storms in May 2024 (www.forbes.com)
  3. ^ solar wind disturbs (theconversation.com)
  4. ^ Earth’s magnetic field (theconversation.com)
  5. ^ Douglas Goodwin (www.evernote.com)
  6. ^ professor of computational photography (www.scrippscollege.edu)
  7. ^ cells that respond to light (www.britannica.com)
  8. ^ sensitive to light (askabiologist.asu.edu)
  9. ^ Blume, C., Garbazza, C. & Spitschan, M. (doi.org)
  10. ^ CC BY-SA (creativecommons.org)
  11. ^ motion detection and shape recognition (hyperphysics.phy-astr.gsu.edu)
  12. ^ camera takes in light (www.adobe.com)
  13. ^ amount of light (www.adobe.com)
  14. ^ computational photography (web.media.mit.edu)
  15. ^ Image stabilization (en.wikipedia.org)
  16. ^ Multi-image processing (r2.community.samsung.com)
  17. ^ night mode (support.apple.com)
  18. ^ LiDAR capabilities (9meters.com)
  19. ^ Douglas Goodwin (www.evernote.com)
  20. ^ Douglas Goodwin (www.pexels.com)
  21. ^ in your smartphone camera (www.zdnet.com)
  22. ^ using super-resolution techniques (www.androidauthority.com)
  23. ^ identify faces (arxiv.org)
  24. ^ typically avoid tampering with the image (asia.nikkei.com)
  25. ^ shooting in raw format (www.pcmag.com)
  26. ^ dedicated cameras (asia.nikkei.com)
  27. ^ use AI while and after (www.dxomark.com)
  28. ^ high dynamic range (www.zdnet.com)
  29. ^ night mode (support.apple.com)
  30. ^ portrait mode (www.androidauthority.com)

Authors: Douglas Goodwin, Visiting Assistant Professor in Media Studies, Scripps College

Read more https://theconversation.com/phone-cameras-can-take-in-more-light-than-the-human-eye-thats-why-low-light-events-like-the-northern-lights-often-look-better-through-your-phone-camera-230068

Metropolitan republishes selected articles from The Conversation USA with permission

Visit The Conversation to see more