Non-innovative innovation: where the roots of Apple's technologies grow from

Contents
  • Night Shift mode: how to do it right
  • True Tone: ambient light sensor and smart software
  • Deep Fusion: catch up and overtake the Pixel!
  • Historical reference
  • Night Mode: OK, but why only for the iPhone 11?
  • Face Unlock: Face ID vs. Competitors
  • Not just the face
  • eSIM: electronic SIM cards instead of physical ones
  • Second and third cameras
  • Conclusion

Night Shift mode: how to do it right
Today, a night screen mode that gradually shifts the display's color temperature toward "warm" shades in the late hours has become commonplace. Some people use it, others prefer natural color reproduction, but the point is that the mode exists, and it exists on almost every smartphone from almost every manufacturer.
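
The mechanics, incidentally, are trivial. Below is a minimal sketch of the general idea: linearly ramp the target color temperature down after sunset. All of the specific values (sunset time, temperatures, ramp length) are illustrative assumptions, not Apple's actual parameters; the real implementations derive sunset from the device's location and let the user choose the warmth.

from datetime import datetime, time

# Illustrative values only; nothing here is hard-coded this way in iOS.
DAY_TEMP_K = 6500      # neutral daytime white point
NIGHT_TEMP_K = 3400    # "warm" night-time white point
SUNSET = time(19, 30)  # assumed local sunset
RAMP_MINUTES = 60      # length of the transition after sunset

def target_color_temperature(now: datetime) -> int:
    """Return the display color temperature (kelvin) for a given time."""
    minutes_after_sunset = (now.hour * 60 + now.minute) - (SUNSET.hour * 60 + SUNSET.minute)
    if minutes_after_sunset <= 0:
        return DAY_TEMP_K    # still daytime
    if minutes_after_sunset >= RAMP_MINUTES:
        return NIGHT_TEMP_K  # fully "warm"
    # Linear interpolation inside the transition window.
    t = minutes_after_sunset / RAMP_MINUTES
    return round(DAY_TEMP_K + (NIGHT_TEMP_K - DAY_TEMP_K) * t)

print(target_color_temperature(datetime(2019, 10, 1, 19, 45)))  # -> 5725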

For Apple users, this mode first appeared in 2016 with the release of iOS 9.3.

No, not quite. Officially, for users of Apple products this mode first appeared in March 2016 with the release of iOS 9.3. Unofficially, it existed long before that in the form of the f.lux application. How long before? f.lux appeared on iOS in 2011 and required a jailbreak to work. A jailbreak-free version was released in 2015, and users could install it on their devices themselves. However, already in November 2015 Apple's legal department forced the developers to pull the application, citing the clauses of the developer agreement that prohibit the use of private APIs. And in March of the following year the feature officially appeared in iOS 9.3. Innovation? Doubtful.

What about other platforms? Night Light mode appeared on some (far from all) Android devices with Android 7.1.1; additional settings for the mode became available in Android 8. Night Light officially required Hardware Composer HAL 2.0 support (and therefore only 64-bit processors, as with Apple). In Android beta builds, the mode appeared and disappeared even on Google's own devices. 64-bit devices such as the Nexus 9 and Google Pixel C never got Night Light, despite meeting the requirements.

Night Light support on third-party smartphones and tablets was a mess for a long time, too. Some smartphones that shipped with Android 6 or 7 on board received Night Light, and some did not. But devices that launched with Android 8 in 2017 and later almost all had the feature (with rare exceptions). Moreover, many manufacturers added the capability to their own skins regardless of the Android version.

There are plenty of apps in the Google Play store that implement a night mode of sorts. Users of rooted devices have access to f.lux and CF.lumen. Everyone else can use one of the many on-screen filters, with all their limitations: "faded" colors, whitish blacks, and an undimmed status bar.

By the way, night mode is now also available on Windows 10 computers. Unlike Apple, Microsoft did not block the f.lux application; it can be installed both from the developer's own site and from the Microsoft Store.

Conclusion: Apple did not invent the night screen mode and was not even the first to implement such a mode on iOS. However, because iOS 9.3 immediately became available for all of the company's mobile devices, including the old iPhone 5s (which shipped with iOS 7 on board), we will count this feature among the correct implementations.

True Tone: ambient light sensor and smart software
The True Tone feature is, in a sense, an evolution of the night mode. Unlike night mode, which simply shifts the hues toward the "warm" side as night falls (the most "advanced" part being to determine the device's location and compute the exact moment of sunset), True Tone requires a hardware sensor that measures the color temperature of the ambient light so the image can be adjusted to match it.
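
Apple has never published how True Tone maps the sensor reading to the picture, so the sketch below only illustrates the general principle: read the ambient color temperature and nudge the display's RGB white point toward it. The gain formula is a deliberately crude assumption of mine, not real colorimetry.

def rgb_gains_for_ambient(ambient_temp_k, neutral_temp_k=6500.0, strength=0.5):
    """Crudely map ambient color temperature to per-channel display gains.

    Warm ambient light (low kelvin) lowers the blue gain, shifting the
    white point toward warm; cool light does the opposite. strength
    limits how far the display follows the environment.
    """
    # Normalized deviation of the ambient light from neutral, clamped.
    deviation = (neutral_temp_k - ambient_temp_k) / neutral_temp_k
    deviation = max(-0.5, min(0.5, deviation)) * strength
    red_gain = 1.0 + 0.3 * deviation   # boost red slightly under warm light
    blue_gain = 1.0 - 0.6 * deviation  # cut blue more aggressively
    return (red_gain, 1.0, blue_gain)

# An incandescent-lit room (~3000 K) pulls the white point toward warm.
print(rgb_gains_for_ambient(3000.0))  # -> (1.075, 1.0, 0.85)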

The technology first appeared not in the iPhone but in the iPad Pro 9.7 tablet, released in March 2016. Today True Tone works on all Apple devices equipped with the appropriate sensor. The list includes all iPhones starting with the iPhone 8, all iPad Pro models except the very first one from 2015, and the new iPad mini 5 and iPad Air (2019).

Just like night mode, True Tone has its fans and its skeptics. I am ambivalent about it myself. On the one hand, on the iPad Pro it is implemented gently enough not to be annoying. On the other, on iPhones equipped with OLED panels, enabling True Tone instantly turns the screen yellow: tested on an iPhone X and an Xs Max.

And what about the competitors? Something similar existed back in the days of the Samsung Galaxy S8 (Adaptive Display, 2017). However, judging by user reviews, the technology looked cool only on paper. In real life, only a strongly colored external light source had any noticeable effect on the picture, and even then the effect showed up only in a few select applications (for example, the Chrome browser).

In fact, the first to implement the technology really well was Amazon, in its Amazon Fire HDX 8.9 tablet (2014). The technology was called Dynamic Light Control; you can see the demo here. Doesn't it look great? Everything here is just as Apple would do it later, only two years earlier: a hardware ambient color temperature sensor and a proper display driver that uses hardware registers to change the screen settings...

Although I (still) own an Amazon Fire HDX tablet, it was only with difficulty that I found a description of the technology, and pictures demonstrating it in action are rare in general; I found them on only one resource. Why has everyone forgotten about the Amazon tablet, while Apple's developers became innovators and pioneers with True Tone? Perhaps because Amazon's tablet was never as popular as the iPad Pro 9.7. Perhaps because Amazon's marketing department did not promote the feature or explain its benefits to users. Or because the 2014 model was the first and only one in which Amazon used Dynamic Light Control. Or maybe the reason is that the technology worked perfectly in exactly two applications: the Kindle reader and the Silk browser (Amazon's own browser in Fire OS).

I tend to think of Dynamic Light Control as a strange accident that happened to Amazon through an oversight. It looks as if Amazon fired the development team that implemented this mode (and I can guess which company later hired them). This view is indirectly confirmed by the fact that the new version of night mode in Amazon tablets (Blue Shade) was clearly implemented by a different team. I have not seen such a ridiculous and ugly implementation of night mode from any company, before or since. Admire Blue Shade in all its glory.

The most interesting part is that both modes can work simultaneously on the Amazon Fire HDX 8.9 (2014): Dynamic Light Control (a perfect picture and smooth adjustment, but working in only two applications) and Blue Shade (a picture worse than the worst horror movie, but working everywhere)... The difference is obvious.

A mode similar to True Tone is available in some LG smartphones (the G7, G8, and G8s ThinQ models), but who even knows about it? Implemented, and that was that. A more or less mass-market take on the technology is Ambient EQ, used in the Google Nest Hub smart displays and the Pixel 4 smartphone. It works absolutely brilliantly: in a home interior, devices with Ambient EQ significantly outshine their counterparts from the technology's pioneer, Amazon, which unfortunately declined to use the technology in its Echo Show and Echo Spot devices.

Takeaway: True Tone existed long before Apple, but nobody took it seriously. Apple gets the prize for a smart implementation and a consistent approach. Second place goes to Google for its excellent implementation of Ambient EQ in smart speakers with a screen. Amazon gets nothing: the technology, original as it was in 2014, has long been forgotten even by its pioneer.

Deep Fusion: catch up and overtake the Pixel!
For a long time, the camera was the iPhone's weak point. By 2017, nearly every Android competitor had overtaken the iPhone in photo quality. Even at launch, the photographic capabilities of the iPhone X looked downright pale next to the Google Pixel with its HDR+ mode.

An analogue of HDR+ appeared in Apple smartphones only with the 2018 lineup. Dubbed Smart HDR, the new mode became available only on the latest generation of devices: the iPhone Xr, Xs, and Xs Max. Alas, photo processing on the iPhone X received no improvements, despite its powerful internals, which exceed the needs of computational photography many times over.

In the 2018 models, the technology operates only in ZSL (Zero Shutter Lag) mode, has no analogue of HDR+ Enhanced, and works on whole frames rather than on individual tiles, as Google does.

The new iPhone 11 lineup of 2019 promised radical improvements. The technology received the marketing name Deep Fusion. From the user's point of view, it resembles HDR+ without ZSL (or the Pixel's night mode). The new iPhones brought object recognition and pixel-by-pixel processing instead of frame-by-frame; capturing, merging, and processing the frames takes about a second. And once again, last year's models were left out: despite their surplus of computing resources (according to Apple, the A12 processor in the iPhone Xs outperforms the Snapdragon 855, which copes with HDR+ processing more than well), the previous iPhone generation stays on the previous generation of Smart HDR. That is, at the level of "more or less decent for last year."
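
Neither Smart HDR nor Deep Fusion is publicly documented, but the burst-stacking idea underneath is simple: capture several frames, align them, and combine them per pixel so that noise averages out while outliers (movement, misalignment) are rejected. Here is a toy sketch of that principle, assuming float frames that are already aligned; it is an illustration, not Apple's or Google's pipeline.

import numpy as np

def merge_burst(frames):
    """Merge an aligned burst of shape (N, H, W), rejecting outliers.

    Averaging N frames cuts noise by roughly sqrt(N); the median test
    drops per-pixel values that changed between frames (motion).
    """
    reference = np.median(frames, axis=0)
    noise_sigma = frames.std(axis=0, keepdims=True) + 1e-6
    # Keep a frame's pixel only if it is close to the burst median.
    weights = (np.abs(frames - reference) < 2.0 * noise_sigma).astype(float)
    weights += 1e-6  # avoid division by zero where everything was rejected
    return (frames * weights).sum(axis=0) / weights.sum(axis=0)

# Synthetic test: eight noisy copies of a flat gray frame.
rng = np.random.default_rng(0)
burst = 0.5 + 0.05 * rng.standard_normal((8, 4, 4))
print(burst.std(), merge_burst(burst).std())  # the merged frame is much cleaner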

And what about the competitors?
Google was the first to loudly announce the possibilities of computational photography, back in 2016, two years before the release of the iPhone Xs and the introduction of Apple's Smart HDR mode. And while Smart HDR is a kind of "black box" whose workings we know only from Apple's marketing materials and third-party reviews, we know everything about Google's HDR+, right down to the mathematical formulas.

Almost all manufacturers of Android flagships (except, probably, Sony) use computational photography in one form or another. And if your device doesn't, you can install a modified Google Camera APK and get all, or almost all, of the Pixel's features.

There are also third-party apps for the iPhone that promise all the benefits of computational photography, such as Spectre and Halide from the same developer (incidentally, their blog has excellent articles; here, for example, is an analysis of the Smart HDR mode). The only problem is that, compared with the stock app, they work disgustingly badly, and I say that as a user who bought all three applications.

Historical reference
Interestingly, Google was not the first to use image stacking, the basis of HDR+ and Deep Fusion, in mobile devices. Remember the Lumia 950 and 950 XL smartphones released in 2015? They were developed by Microsoft and ran the mobile version of Windows 10. In HDR mode, these smartphones did something very similar to what later appeared from Google and, even later, from Apple, while the PureView mode was quite a decent analogue of night mode. The pictures were of excellent quality; moreover, the photo resolution of Microsoft's smartphones was not 12 MP, as in modern flagships, but a full 20, with neither detail nor noise level suffering. Alas, HDR stitching was not instant on the Lumias: after each shot, the smartphone would think for a few seconds. When it stopped developing its own mobile OS, Microsoft also abandoned development of the camera application. It's a pity; on modern hardware, combined with high-quality optics and a sensor, it could have become a strong competitor to Google, if only in terms of the camera.

Takeaway: Google gets the prize for its excellent implementation and consistent improvement of the Google Camera app. Extra points to Google for making new Google Camera features available on older Pixels (within understandable hardware limitations). As for Apple, we can only note that the stunning surplus power of the neural coprocessors in last year's iPhone models will remain unclaimed: the old models received no new photo capabilities, and hardware limitations are the last thing to blame for that.

Night Mode: OK, but why only for the iPhone 11?
The eleventh iPhone received a night mode. In this mode, the effect of a slow shutter speed is achieved by cleverly combining multiple shots taken at fast shutter speeds, which avoids blurring the image. More precisely, it is not whole frames that are merged but parts of them, which makes it possible to handle objects moving in the frame correctly. The approach reduces visible noise while preserving the overall sharpness of the photo.
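
Apple has not published how the Night Mode merge actually works, so here is only a toy sketch of the tile idea described above, assuming float frames in the 0..1 range that are already globally aligned: tiles that changed between shots (something moved) are left out of the average so they do not produce ghosting.

import numpy as np

TILE = 16  # tile edge in pixels (an illustrative choice)

def merge_night(frames, motion_thresh=0.05):
    """Tile-wise merge of short exposures, shaped (N, H, W), into one.

    A tile from a later frame joins the average only if it barely
    differs from the reference tile; otherwise it is assumed to
    contain motion and is skipped.
    """
    result = frames[0].copy()
    h, w = result.shape
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            ref_tile = frames[0][y:y+TILE, x:x+TILE]
            stack = [ref_tile]
            for frame in frames[1:]:
                tile = frame[y:y+TILE, x:x+TILE]
                # Mean absolute difference as a crude motion detector.
                if np.abs(tile - ref_tile).mean() < motion_thresh:
                    stack.append(tile)
            result[y:y+TILE, x:x+TILE] = np.mean(stack, axis=0)
    return result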

Night mode working on the principle described above first appeared in the Google Camera app in the fall of 2018, almost a year earlier than on the iPhone. I deliberately do not name the device whose announcement the updated app was timed to, because the Night Sight mode became available on previous smartphones in the line through an ordinary Google Camera update from the Play Store.

Why am I focusing on this? Because none of the previous Apple models received any night mode: the new camera app is available exclusively on the new iPhone 11 series. The reason? Officially, Night Mode takes advantage of the amazing capabilities of the new A13 Bionic processor. Yet just a few slides earlier, the same keynote showed charts demonstrating the overwhelming superiority of the previous A12 Bionic over the Qualcomm chips that Google uses. And if Google can implement night mode on processors of the previous generation and the one before that, what prevents Apple from adapting it to the very neural-processor power that now sits unused and idle? Nothing personal, just marketing: if you want night mode, buy a new iPhone.

Conclusion: the prize goes to Google, both as the pioneer and for making night mode available on the line's previous smartphones. A ray of surprise and bewilderment goes out to Apple over its marketing department's decision.

Face Unlock: Face ID vs. Competitors
Face ID became the star of the show at the iPhone X announcement in 2017. A new scanner system that builds a 3D model of the user's face, a new Bionic neural coprocessor used for the machine learning behind face unlock (and idle the rest of the time), and seamless integration into iOS made face unlock a complete replacement for the fingerprint sensor.
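
Apple does not disclose how Face ID compares faces, but biometric systems in general reduce a scan to a compact template and accept a match above a tuned threshold. A schematic illustration under that assumption follows; the names and the threshold are hypothetical, not Apple's.

import numpy as np

MATCH_THRESHOLD = 0.8  # illustrative; real systems tune this to a target false-accept rate

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def unlock(enrolled_template, probe_template):
    """Unlock only if the fresh scan is close enough to the enrolled one.

    The templates stand in for whatever compact representation the
    secure hardware derives from a 3D face scan; the threshold trades
    false accepts against false rejects.
    """
    return cosine_similarity(enrolled_template, probe_template) >= MATCH_THRESHOLD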

Competitors? On Android, long before Face ID, face unlock could be added through the Smart Lock subsystem. I mention it solely so that commentators do not reproach me for the omission: Smart Lock cannot be considered an analogue of Face ID because of its extremely low level of security. No wonder many corporate security policies prohibit the use of Smart Lock on Android smartphones outright.

But the competitors were on the alert. After the release of the iPhone X, many manufacturers began adding face unlock as a standard capability (rather than via Smart Lock). In the overwhelming majority of cases, the security of such solutions was below critical: a single camera was used, no three-dimensional model was built, and only two-dimensional images were compared. A step apart was the Xiaomi Mi 8 Pro, which had a pair of cameras and infrared illumination.

Samsung tried to build something similar into its smartphones: iris recognition was an optional addition to the face unlock mode. Nevertheless, face unlock never became the main and only biometric on those phones, and seamless integration (authorization inside applications) was out of the question.

With the Google Pixel 4 and 4 XL, released two years after the iPhone X, face unlock is the only way to unlock the device with biometrics. The system is based on the same principles as Face ID.

Not just the face
In addition to face and fingerprint, smartphones have offered other methods of user authentication. In the Lumia 950/950 XL, for example, an iris scan was offered as the only biometric option. Formally, an analogue of Face ID. In practice, iris unlock turned out to be slow and not very reliable: it did not work with glasses and coped poorly with overly bright lighting... Yes, the false-positive rate was extremely low; yes, security was up to par. But it was so inconvenient to use that Microsoft abandoned the approach.

And do you know who actually holds the palm here? Microsoft again, and this time I do not mean the Lumia 950 and 950 XL at all. In 2015, Microsoft created a beautiful, secure, highly polished, instant, and reliable face authentication system called Windows Hello: two infrared cameras spaced at a specified (certified) distance, infrared illumination, and, instead of a Secure Enclave, a no less secure but completely standard TPM 2.0 module.

It works like this: the user opens the laptop lid and is already authorized; just swipe the lock screen up (or press the Up key) to get to the desktop. Fast, with no delay. At the same time, the biometrics are seamlessly integrated into applications: to view passwords saved in the Chrome browser, it is likewise enough to simply be in front of the camera. Many laptops (for example, the HP Spectre x360) have the solution built in, and for a desktop you can buy any camera certified for Windows Hello. A certified camera, a driver, and TPM 2.0 are all you need to set up biometric login in Windows 10.

So what does Apple have to do with it? The Windows Hello subsystem is rather bulky. Microsoft failed to integrate it into a phone the way it works on tablets and laptops, which is why it had to create a separate iris recognition system. It was Apple that managed to squeeze reliable biometrics into a miniature form factor and, thanks to the surplus power of the neural coprocessor, make recognition instant and almost invisible to the user.

Conclusion: although such systems existed long before Face ID, Apple was the first to make face unlock really convenient, fast, and reliable on a smartphone (tablets and laptops do not count; there the palm belongs to Microsoft).

eSIM: electronic SIM cards instead of physical ones
I could talk about dual-SIM iPhones here, but the topic of electronic SIM cards seems far more interesting to me. It was with the release of the iPhone Xr/Xs/Xs Max generation in 2018 (and not with the advent of the LTE-capable Apple Watch) that mobile carriers around the world began adding eSIM support en masse. Probably no other company could have achieved this effect, and it is one of those things that benefits not only users of Apple products but many others as well.

Few people know that eSIM support first appeared in a mainstream smartphone that was not an iPhone at all. The first eSIM-enabled product was the Apple Watch Series 3 LTE, and the first eSIM-enabled smartphone was the Google Pixel 2, also released in 2017. At the time, the Pixel 2's eSIM support worked only and exclusively on Google's own Fi mobile network; there was no mass migration of users or carriers to eSIM. As for the Apple Watch 3 with LTE, at release it was not supported by every carrier even in the United States, let alone other countries.

In October 2018, less than a month after the release of the iPhone Xs, eSIM support was announced for the freshly released Google Pixel 3 line. From then on, eSIM support in Google smartphones became full-fledged and open to any carrier that supports eSIM.

What is the difference between the Apple and Google solutions? While the iPhone Xs (and onward down the list) is dual-SIM, allowing both SIM cards (physical and electronic) to operate in standby simultaneously, the Pixel 3/XL can use only one SIM at a time, physical or electronic. Interestingly, early Android beta versions let users enable simultaneous support of both SIM cards on the Pixel 3, which clearly indicates that the phone's internals are quite capable of dual-SIM operation. Alas, the dual-SIM mode never made it into any final Android release.

Conclusion: despite the almost simultaneous release of the iPhone Xs and the Pixel 3, Apple still gets the main prize. It was Apple that prepared the market for the arrival of eSIM in smartphones (recall that the Apple Watch with LTE support came out a year earlier), and it was the company's scale that made carriers take the new standard seriously.

Second and third cameras
Apple's first dual-camera smartphone was the iPhone 7 Plus, released in September 2016. In the same year, numerous competitor models with two cameras came out: the Huawei models (whose second, monochrome camera was meant to improve low-light shooting and estimate depth of field), the LG G5 (with a wide-angle second camera), and numerous Chinese devices whose second cameras could play any role, even a decorative one. And there were dual-camera models long before 2016: the LG Optimus 3D and HTC Evo 3D (both 2011), for example, whose dual cameras could shoot stereo pairs.
At the same time, Apple became one of the few companies to use the honest (optical) method of capturing 2x zoom images. Many Chinese dual-camera competitors (including early Xiaomi and OnePlus models) used a combination of a slight optical zoom, central cropping, and software interpolation, yielding pictures of "good, for a smartphone" quality. The second lens in the iPhone 7 Plus simply existed and simply worked.
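
The "fake zoom" trick is easy to demonstrate: crop the central part of the frame and upscale it back to full size, inventing the missing pixels by interpolation. A sketch follows (nearest-neighbor resampling for brevity; real devices used fancier interpolation, but no method can restore detail that was never captured):

import numpy as np

def fake_2x_zoom(image):
    """Fake a 2x zoom: crop the central half of the frame and upscale.

    No new detail is captured: every output pixel is interpolated
    from the cropped region, unlike with a true 2x telephoto lens.
    Assumes dimensions divisible by four, for brevity.
    """
    h, w = image.shape[:2]
    crop = image[h//4 : h//4 + h//2, w//4 : w//4 + w//2]
    # Nearest-neighbor upscale back to the original size (2x per axis).
    return crop.repeat(2, axis=0).repeat(2, axis=1)
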
The first iPhones with three cameras were the iPhone 11 Pro and Pro Max (2019). And if the iPhone 7 Plus looked quite fresh in 2016, the triple-camera iPhones of 2019 look more like a concession to the market. And while there are no questions about the stabilized 2x telephoto lens, the use of fixed-focus optics for the ultra-wide camera in the iPhone 11 Pro is perplexing.

Conclusion: not an innovation. Nobody gets the main prize; the idea was simply in the air, and almost all manufacturers worked on implementing it at the same time.

Conclusion
It is impossible to cover all the innovations and "innovations" of even just the last two years in one article. Is using the U1 chip for positioning an innovation or not? What about the first and second generation Apple Pencil? Tablets with a 120 Hz refresh rate? The augmented reality capabilities that Google later picked up (and abandoned)? A watch with a breakthrough ECG function but a very late Always On Display? All of this is interesting, but much of it deserves separate consideration.

So is Apple an innovator, or is it playing catch-up? As is usually the case with Apple, there is no definite answer.

Many things appear in Apple products very late: computational photography in the cameras, Always On Display in the watch, dual-SIM support, second and then third cameras, the night screen mode, and much more.

Truly disruptive innovations are few: the eSIM standard, which without Apple would have taken many more years to spread; the electrocardiogram function in the watch; and the now-tiresome "bezel-less" (or simply symmetric-bezel) screen with a notch.

In most cases, Apple picks an unwanted, half-forgotten technology up off the floor, fine-tunes it until it all "just works," and then leaves competitors to play catch-up. A striking example is True Tone, invented and used in an Amazon tablet long before Apple, then forgotten, then reintroduced in a tablet (the iPad Pro 9.7) and replicated across a significant part of the company's product line. The technology was then adopted by competitors, either quietly (LG G8, G8s) or with marketing fanfare (Google Nest Hub). Another example is Face ID, which in its current, well-tuned form has only just appeared in the Pixel 4.