The Camera. Reimagined.
Samsung's tagline for the Galaxy S9 is "The Camera. Reimagined." You can pretty much guess from it that most of the major innovations are camera-focused, and after the media preview session, I'd have to agree that all the fun stuff is centered on the camera. Samsung has finally learned that it's not so much about cramming new, gimmicky features onto a device as about implementing them in a way that's accessible, easy, and fun to use. And that's a page straight out of Apple's book.
To be clear, the innovations in the S9 camera aren't new. Apple did animated emojis first on the iPhone X, and Sony's Xperia XZ Premium already touted stacked sensors and super slow-mo last year. But I dare say Samsung has done it better. Let's get down to it.
AR Emoji

This is my favorite new camera feature. Animating yourself into a limited set of standard emojis is fun, but it wears out its welcome fast. Samsung's AR Emoji implementation instead creates your very own 3D avatar, which can then be further personalized with clothes, hairstyles, and accessories. It's somewhat similar to creating an Xbox Avatar or a Nintendo Mii, except with your actual face mapped on. According to Samsung, AR Emoji uses a data-based machine learning algorithm that analyzes a 2D image of your face and maps more than 100 facial features.
Along with the 3D AR Emoji, the phone will also automatically create a set of emoji stickers that can be used via the Messenger app. Check out how all this works in our early preview session:
The best part about AR Emoji is that the feature is designed to be cross-platform, cross-device, and cross-brand compatible. The photos and videos you create can be shared with anyone. Even the emoji stickers are created as standard animated GIFs, which means you can use them on just about any app that supports sending and viewing GIFs. You can capture your friends as well and turn them into emojis, which is far more personalized and fun in the long run.
Intelligent Scan

The Samsung Galaxy S9 and S9+ also feature an updated biometric log-in option called Intelligent Scan, which combines the facial and iris scanning technologies found in past phones. There aren't any new front-facing cameras or sensors, and when approached, Samsung's official comment was: "Intelligent Scan is based on an algorithm that analyzes your iris and facial data."
What I wanted to know was whether Intelligent Scan uses both iris and facial recognition a) simultaneously, b) one after another for added security, or c) one or the other in case your eyes or your face is blocked. When I hear back from Samsung, I'll update this part. This was the one camera feature I did not get a chance to test during the demo.
Super Slow-mo

The Samsung Galaxy S9 and S9+ support 960fps super slow-mo video recording. This is enabled by a 3-layer stacked sensor design, in which a DRAM buffer is attached directly to the image sensor. The technology isn't new, of course. Sony can be credited with developing stacked sensor technology back in 2015 with the RX100 IV compact digital camera, and last year's Sony Xperia XZ Premium was the first phone to feature a 3-layer stacked sensor and a 960fps slow-mo function, called Motion Eye.
Samsung's implementation on the S9 and S9+, however, makes it easier to use: the slow-mo function can be set to trigger automatically based on motion detection. You still have to choose Super Slow-mo in the video settings and then select the area of the screen you want to focus on, so a little setup is required. After that, the focus box turns yellow to let you know it's ready, and super slow-mo recording activates only when motion is detected within the box. Recording lasts for just 0.2 seconds and extends into a clip of about 6 seconds, but if you keep recording, it will re-trigger super slow-mo whenever movement is detected again.
After shooting, you can choose to edit the shot with three different looping modes: normal, reverse, and bounce. You can also add a soundtrack from a pre-set range of music or from your own playlist. Again, you can save these super slow-mo snippets as animated GIFs to easily share with friends. Here's our trial of this functionality:
Personally, I found that the auto function requires some finesse; it's not as automatic as it's made out to be. The preview video shows the ideal situation, with the phone steadied on a tripod. When I tried holding the phone in my hand, I kept getting the message to keep the phone still, and my own hand movements, rather than the subject's, would trigger super slow-mo, which spoiled the timing of the shot.
You can, however, set it to Manual mode, in which case you tap the super slow-mo icon yourself while recording. It will similarly capture a 0.2-second burst at 960fps, then return to normal until you tap it again.
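From what was described, the auto trigger seems to boil down to watching for pixel changes inside the chosen focus box and firing a short burst when they cross a threshold. Here's a minimal sketch of that idea in Python; the function names, threshold value, and playback frame rate are all our own assumptions for illustration, not Samsung's actual implementation:

```python
import numpy as np

def motion_detected(prev_frame, curr_frame, box, threshold=12.0):
    """True when the mean absolute pixel change inside the focus
    box crosses a threshold (the value 12.0 is a made-up example)."""
    x, y, w, h = box
    prev_roi = prev_frame[y:y+h, x:x+w].astype(np.float32)
    curr_roi = curr_frame[y:y+h, x:x+w].astype(np.float32)
    return bool(np.abs(curr_roi - prev_roi).mean() > threshold)

def super_slowmo_burst():
    """Illustrative arithmetic for the burst: 0.2 s at 960 fps is
    192 frames, which stretches out to roughly 6 seconds of
    playback at a normal frame rate."""
    burst_frames = int(0.2 * 960)          # 192 frames captured
    playback_seconds = burst_frames / 30   # ~6.4 s if played at 30 fps
    return burst_frames, playback_seconds
```

The arithmetic also explains the hand-shake problem above: any change inside the box trips the trigger, so camera motion is indistinguishable from subject motion without extra stabilization.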
Beyond introducing motion detection and an auto mode, Samsung is making a real effort to open up its headline features for easy sharing across devices and platforms. It's not so much about being able to do it as about the convenience of having the option readily built in. For example, until WhatsApp added its instant GIF function, your average user wouldn't have gone to the trouble of finding apps to edit their images or videos into a shareable GIF.
Low Light Camera
Every new phone generation has seen some improvements in low light photography, arguably the bane of smartphone camera design. In the past, we've generally seen larger aperture lenses and pixel sizes, both designed to help capture more light.
- The S6 had a 16MP camera with 1.2-micron pixels and an F1.9 lens.
- The S7 dropped down to a 12MP sensor, but increased its pixel size to 1.4 microns and brightened the lens to F1.7.
- The S8 had similar specifications, but Samsung implemented a new software processing feature called multi-frame processing to help reduce noise, especially in low light.
The Galaxy S9 will take things further with an F1.5 lens, the brightest on any device yet.
The problem, however, is that while large aperture lenses work great to compensate for low light, we're getting to a point where they become detrimental during the day. Then there's also the issue of depth of field: images shot at very wide apertures tend to be soft, with less of the surroundings in focus. That's great for close-up or portrait photography, but not ideal when you're trying to capture scenery or panoramas.
To get the best of both worlds, the S9 and S9+ will be the first smartphones to feature a dual aperture lens, which automatically switches to F1.5 in low light and F2.4 when it's bright enough. We're still trying to confirm with Samsung, but from the presentations shown, the switching point seems to be an ambient light level of about 100 lux.
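Conceptually, the switching behavior is just a brightness threshold between two fixed stops. A toy sketch, keeping in mind that the ~100 lux figure is our unconfirmed estimate and the function is entirely hypothetical:

```python
def choose_aperture(ambient_lux, switch_lux=100.0):
    """Pick between the S9's two physical aperture stops.

    The ~100 lux switch point is our estimate from Samsung's
    presentation, not a confirmed specification.
    """
    return 1.5 if ambient_lux < switch_lux else 2.4
```

In practice, a real implementation would presumably add some hysteresis around the threshold so the iris doesn't flutter between stops at borderline light levels.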
Remember how the S8 introduced multi-frame processing to reduce image noise in low light? Thanks to the stacked sensor design, the S9 improves on this feature as well, processing and combining up to 12 pictures per shot, which supposedly improves noise reduction by another 30% compared to the S8.
We couldn't really assess the low light capabilities during the NDA media preview session, as we all had our phone cameras taped over, but the demo Samsung prepared pitted a Galaxy S9+ against a Pixel 2 XL in a <1 lux environment. As expected, the S9+ performed superbly.
Bixby Vision Live Translation and Food Recognition
The last set of camera improvements are functional features for Bixby Vision: live language translation and food identification. The key change is that you no longer need to snap a photo first and send it to Bixby for processing; both features now work live. All you have to do is point your camera at foreign text or a food item and tap the Bixby icon, and you'll get an instant translation or food information overlay.
For food identification, Bixby shows a quick calorie count for the food it recognizes, and you can bring up a more detailed nutritional sheet if you choose to. Samsung couldn't provide us with any numbers yet on how large the database is, but from what we understand, it is separated by market rather than being one large global food database.
During the demo, however, there was only one muffin we could try it on, with mixed results. It identified the muffin as a popover a few times, and because of that, I now know what a popover is. Bixby does offer a selection of similar suggestions you can tap manually if you don't think it has identified the right food, so that's something, but of all the features covered so far, this is the most gimmicky. Here's our video of it in action:
At launch, Bixby's Live Translation can translate 54 input languages into 104 output languages in real time. Of these 54 input languages, Bixby can automatically identify 33, while the remaining 21 require manual selection.
Read Next (1): Exclusive First Looks: Samsung Galaxy S9 and S9+
Read Next (2): Local pricing, availability and pre-order details!