How do smartphone cameras work?

For many people, the value of a phone depends on the capabilities of its cameras. While I don't fully agree with this, since smartphones can do a lot more than take photos, I can't argue with the importance of cameras.

"Which processor does your phone have? How much storage does it have? How much RAM?" That is how we used to compare phones, but today the more common questions are: "How good are your phone's cameras? How well does it take photos and record video at night?" When testing phones, everyone, and we in the editorial office are no exception, pays the most attention to the cameras and only then to the phone's general performance. Since there are quite a few unknowns and a lot of technical terminology around cameras (aperture, sensors, lenses, focus …), it might be time to clear up some of the fog around how they work.

Everything revolves around light

It is surprising (or not) how many parallels we can make between the camera and our eyes. In dark rooms, eyes and cameras are blind. If we want to record a picture, we need light. But the light is often scattered in all directions. Our eyes have lenses that direct light onto the retina. Even the cameras on our phones have lenses that capture and record light information.

Light can also harm a photograph, which is most obvious with analog cameras that use photographic film: (too) long exposure to light can destroy the contents of the film. The invention of the shutter solved this conundrum. In analog cameras, the shutter is a physical mechanism that opens and closes quickly, controlling the amount of light that reaches the film. Our eyelids have a similar function.

There is no physical shutter on phones, although you hear that characteristic "click" sound when you take a photo. This is just a sound effect; put the phone in silent mode and the sound disappears. Instead of a physical shutter, phone cameras use an electronic shutter that performs the same function, but does everything with algorithms rather than physical movement. Some smartphones, such as the Huawei Mate 50 Pro, do have a camera with a physical aperture that can move between preset positions.

Film has not yet faded into oblivion. It is still in use among hobbyists, professional photographers and even in the film industry. Elsewhere, it has been replaced by sensors.

Why do mobile phones have several different lenses?

Surely you have watched a professional photographer change the lenses on a camera to match the scene in front of them. Phones are technically capable of this, as Xiaomi demonstrated with the Xiaomi 12S Ultra Concept, but it is extremely impractical and raises new obstacles, such as problems with durability, water resistance, high price and the like. Manufacturers have therefore opted for several different cameras instead, each with its own specific lens, between which we can easily switch within the camera application as needed. Most phone cameras work this way today.

If you look at the back of your phone, you'll notice two, three or even four lenses, plus one on the front above the screen. Each offers a different perspective, depth and its own unique features. The main lens is understandably always present, and the ultra-wide camera is also more or less a constant on mobile phones. In the lower class we often find a macro camera, while premium phones, such as the Samsung Galaxy S23 Ultra, add a telephoto lens and a periscope telephoto lens.

What is the function of the lens?

Aperture, lens and image sensor are closely related. The aperture is the opening you can physically see on the camera lens. As mentioned, the aperture controls how much light reaches the lens and sensor. As a general rule, a larger aperture is better, because it means the camera can use more light information. However, this is not necessarily the best indicator of photo quality.

If you look at your phone's specifications, you will notice the "f" ratings on the cameras. This f-number is the ratio between the focal length and the physical diameter of the aperture: the smaller the number, the wider the aperture. For example, the vivo X90 Pro has an f/1.8 main lens with a focal length of 23mm, an f/1.6 (50mm) telephoto lens, and so on.
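
To make the ratio concrete, here is a minimal sketch using the vivo figures above (keep in mind that spec sheets quote 35mm-equivalent focal lengths, so the actual physical openings are much smaller than this naive calculation suggests):

    # f-number = focal length / aperture diameter,
    # so we can turn it around: diameter = focal length / f-number.
    def aperture_diameter_mm(focal_length_mm, f_number):
        return focal_length_mm / f_number

    print(aperture_diameter_mm(23, 1.8))  # main lens: ~12.8 (35mm-equivalent terms)
    print(aperture_diameter_mm(50, 1.6))  # telephoto: ~31.3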

Focal length, on the other hand, is a difficult basis for comparing the performance of phone cameras. It is extremely important, but for creating different aesthetics and visual effects. A shorter focal length produces a wide-angle perspective in which nearby objects appear larger, while a longer focal length creates a more proportional, neutral-looking photo.
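
A quick back-of-envelope illustration of why this is so: the horizontal field of view follows directly from the focal length, assuming the standard 35mm-equivalent convention (36 mm frame width).

    import math

    # Horizontal field of view: fov = 2 * atan(sensor_width / (2 * focal_length))
    def fov_degrees(focal_length_mm, sensor_width_mm=36.0):
        return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

    print(round(fov_degrees(23), 1))  # ~76.1 degrees: wide angle, near objects loom large
    print(round(fov_degrees(50), 1))  # ~39.6 degrees: tighter, more neutral perspective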

As light enters the camera module, the lens collects the incoming light from the scene and directs it onto the sensor. A smartphone camera lens is actually made up of a number of (usually plastic) parts called elements. Due to the nature of light, different wavelengths of light (colors) are refracted (bent) at different angles when passing through a lens. This means the colors of your scene would be projected onto the camera sensor out of alignment. Cameras need multiple elements to transfer a clear image to the sensor without such irregularities as color misalignment and other effects.

Photo: OnePlus

How does focus work on smartphone cameras?

Ironically, focusing is not something the user has to focus on, because the cameras usually handle it themselves. To a certain extent, focus can be adjusted manually (depending on the phone), but in most cases the software does the job so well that manual intervention is unnecessary. Phone cameras focus using a dedicated sensor and/or additional hardware such as a laser rangefinder.

Software autofocus uses data from the image sensor to determine whether the image is in focus and adjusts the lens to compensate. The usual passive technique, contrast-detection autofocus, measures the contrast of the image and adjusts the lens until contrast reaches its maximum. This method is entirely software-based, which makes it the cheapest option; however, it is slower and does not work as well in low light.
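
As a minimal sketch of that contrast-detection loop (the contrast metric and the capture_at callback are simplified assumptions, not any vendor's actual implementation):

    def contrast(image_rows):
        # Crude contrast metric: sum of differences between
        # horizontally neighbouring pixels. Sharp images score higher.
        return sum(abs(a - b)
                   for row in image_rows
                   for a, b in zip(row, row[1:]))

    def contrast_autofocus(capture_at, lens_positions):
        # capture_at(pos) returns an image taken at a given lens position.
        # Sweep the lens and keep the position with maximum contrast.
        return max(lens_positions, key=lambda pos: contrast(capture_at(pos)))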

Newer phones use phase-detection autofocus (PDAF), which is faster and more accurate. Open the specifications of the latest iPhone 15 Pro Max and you'll notice the PDAF label on the cameras. PDAF checks whether the same amount of light reaches pairs of closely spaced photosites on the image sensor. Classic PDAF systems rely on dedicated photosites that measure light coming from the left or the right side of the lens. If the photosites on the right record the same light intensity as those on the left, the image is in focus. If the intensities differ, the system can calculate how much it needs to compensate for a sharp image, which is much faster than systems that rely on contrast detection.

Older PDAF systems use only a few percent of all photosites, while newer ones, such as the Galaxy S23 Ultra's, use all 100 percent. In addition to left and right photosites, they also use top and bottom ones for focusing.
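
Following the simplified description above, the phase comparison might be sketched like this (the gain constant is a made-up placeholder):

    def pdaf_correction(left_pixels, right_pixels, gain=0.5):
        # Photosites looking through the left and right halves of the lens
        # record the same intensities when the subject is in focus.
        diff = sum(l - r for l, r in zip(left_pixels, right_pixels)) / len(left_pixels)
        # A non-zero difference gives both the direction and (roughly) the
        # size of the lens movement needed -- in a single measurement,
        # which is why PDAF beats contrast detection on speed.
        return -gain * diff

    print(pdaf_correction([12, 14, 13], [10, 12, 11]))  # -1.0: move the lens to cancel the phase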

Recent iPhones also include a dedicated LiDAR sensor, which improves focusing, depth perception and night shots, and comes in handy for augmented reality (AR) applications.

Photo: Huawei

What is an image sensor?

The sensor is basically just a wafer of silicon, but a great deal depends on it. The sensor receives light and converts it into electrical signals. A sensor can have many millions of pixels. How can you find that out? If you see a 100 or 200 MP camera, it means the sensor in question has 100 or 200 million pixels (megapixels). If no light from the lens reaches a photosite, the sensor records that pixel as black; if a large amount of light reaches it, the sensor records it as white. The number of shades of gray that the sensor can register is called its bit depth.

Most phones have 8-bit depth, and some even 10-bit depth. By comparison, 8-bit depth means the camera can capture 256 shades for each primary color channel used to mix the color spectrum (red, green and blue). That is 256 shades each of red, green and blue, or 16.7 million possible color shades in total. 10-bit cameras can capture over a billion shades.
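
The arithmetic behind those figures, for the record:

    # Shades per channel = 2 ** bit_depth; total colors = shades ** 3 (R, G, B).
    for bits in (8, 10):
        shades = 2 ** bits
        print(f"{bits}-bit: {shades} shades/channel, {shades ** 3:,} colors")
    # 8-bit:  256 shades/channel,  16,777,216 colors (~16.7 million)
    # 10-bit: 1024 shades/channel, 1,073,741,824 colors (~1.07 billion)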

How does a camera capture a color photo? Each photosite has a color filter that allows only certain colors to pass through. With rare exceptions, such as Huawei phones that use RYYB (yellow instead of green filters), the most commonly used is the Bayer array of color filters, which divides each 2×2 square of photosites into one red, one blue and two green filters (RGGB).
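
To make the 2×2 layout concrete, a few lines that print the standard RGGB tiling:

    def bayer_color(row, col):
        # RGGB tiling: each 2x2 block holds one red, two green and one
        # blue filter (green is doubled because our eyes are most
        # sensitive to it).
        if row % 2 == 0:
            return "R" if col % 2 == 0 else "G"
        return "G" if col % 2 == 0 else "B"

    for r in range(4):
        print(" ".join(bayer_color(r, c) for c in range(4)))
    # R G R G
    # G B G B
    # R G R G
    # G B G B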

Photo: Sony

Normally, a camera working with a Bayer filter array combines all this color data into one value per pixel, but this does not work with pixel binning. Manufacturers needed a way to collect each color separately.

For this purpose, they designed the so-called quad-Bayer array, where each 2×2 group of pixels is assigned a single color. Four of these groups are then combined together, mirroring the original Bayer pattern: two green, one blue, one red.
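
A minimal sketch of what binning does with one such 2×2 same-color group:

    def bin_2x2(block):
        # block: a 2x2 group of same-color pixel values from a quad-Bayer
        # sensor. Binning merges them into one brighter, less noisy
        # "super-pixel" while keeping the color information intact.
        (a, b), (c, d) = block
        return (a + b + c + d) / 4

    print(bin_2x2([[100, 104], [98, 102]]))  # -> 101.0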

The new array not only enables smartphone manufacturers to preserve color data in the pixel-binning process, it also enabled them to introduce other innovative features such as HDR mode.

Let's go back to the sensors. With sensors, it is necessary to pay attention to their size and to the size of the pixels themselves. Larger sensors can capture better photos because they have more photosites, which are also larger. Recently, smartphones and their cameras have entered the 1-inch world; the Xiaomi 13 Pro and vivo X90 Pro, for example, are among the first with 1-inch sensors.

Pixels are measured in micrometers (µm). Larger pixels can absorb more light, which is good for night photography. Don't worry if your phone has smaller pixels than other phones; outside of night photography, it will still be able to deliver good results. Even the best Samsung phones contend with small pixels: the Galaxy S23 Ultra's 200 MP sensor results in 0.6 µm pixels, while the iPhone 15 Pro Max's 48 MP sensor has 1.22 µm pixels. Manufacturers have therefore turned to pixel-binning technology. The Galaxy S23 Ultra combines 16 pixels into one to capture photos with a final resolution of 12 MP.
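
The arithmetic checks out as follows: binning N pixels into one divides the resolution by N and multiplies the pixel side length by √N (200/16 is strictly 12.5 MP; the 12 MP figure is the rounded output resolution).

    megapixels, bin_factor, pitch_um = 200, 16, 0.6
    print(megapixels / bin_factor)        # 12.5 -> effective megapixels after binning
    print(pitch_um * bin_factor ** 0.5)   # 2.4  -> effective pixel size in micrometers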

Optical and electronic stabilization

Stabilization, whether optical or electronic, is also important for capturing good photos and videos.

Optical image stabilization (OIS) is a hardware solution that uses a microelectromechanical system (MEMS) gyroscope to detect motion and adjust the camera system accordingly. For example: if you are holding a smartphone and your hand moves slightly to the left, the OIS system detects this and moves the camera slightly to the right. This is especially important for night photography, when the camera needs a longer time to capture light, and during that time vibrations can affect the quality of the photo.

Electronic image stabilization (EIS) relies on the phone's accelerometer to detect motion. Instead of moving parts of the camera, it shifts and aligns the individual frames of the image or video. Because the exposures are aligned based on the image content and not on the image sensor frame, the final image or video has a slightly reduced resolution.
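
A toy sketch of the EIS idea, and of why it costs resolution: each frame is shifted back against the measured shake, and the output is cropped to the area that remains valid in every frame. The offsets and the crop margin here are illustrative, not any phone's real pipeline.

    def stabilize(frames, shakes, margin):
        # frames: list of 2D pixel grids; shakes: per-frame (dy, dx) motion
        # reported by the accelerometer/gyroscope.
        out = []
        for frame, (dy, dx) in zip(frames, shakes):
            h, w = len(frame), len(frame[0])
            # Sample each output pixel from where the shake pushed it,
            # clamping at the borders, then crop the unreliable margin.
            shifted = [[frame[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                        for x in range(w)] for y in range(h)]
            out.append([row[margin:w - margin] for row in shifted[margin:h - margin]])
        return out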

What does the software do?

After the image sensor converts the light into electrical signals, it is the job of the image signal processor (ISP) to turn these numbers into an image. The data in the electrical signals is essentially a black-and-white image, so the ISP must first restore the color data based on the color filter array (Bayer or otherwise). This produces an image, but one in which each pixel is only an intensity of red, green or blue. This is followed by color reconstruction, where the ISP assigns pixels their full colors based on the colors of neighboring pixels. For example, if a certain area contains many green and red pixels and very few blue ones, the color reconstruction algorithms turn it yellow.
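
A bare-bones sketch of that neighbor-based color reconstruction (real ISPs use far more sophisticated interpolation than this simple averaging):

    def estimate_color(raw, filters, y, x, want):
        # Each pixel measured only one color (its filter letter in
        # 'filters'); estimate a missing color 'want' at (y, x) by
        # averaging the surrounding pixels that did measure it.
        vals = [raw[j][i]
                for j in range(max(y - 1, 0), min(y + 2, len(raw)))
                for i in range(max(x - 1, 0), min(x + 2, len(raw[0])))
                if filters[j][i] == want and (j, i) != (y, x)]
        return sum(vals) / len(vals) if vals else 0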

After color reconstruction, the ISP also runs noise-removal and sharpening algorithms. Each phone then applies its own specific algorithms to produce the final photo.
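
Sharpening, for instance, is typically a small convolution; a classic (generic, not phone-specific) 3×3 sharpening kernel looks like this:

    # Boost the center pixel and subtract its four direct neighbors,
    # which amplifies local contrast, i.e. sharpens edges.
    SHARPEN = [[ 0, -1,  0],
               [-1,  5, -1],
               [ 0, -1,  0]]

    def sharpen_pixel(img, y, x):
        # Assumes (y, x) is not on the image border.
        return sum(SHARPEN[j][i] * img[y - 1 + j][x - 1 + i]
                   for j in range(3) for i in range(3))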

The next time you pick up your phone, turn on the camera and take a photo, you'll know what was going on in the background during that time. Are you interested in how smart watches or their sensors work?



