Nivedit Majumdar

Smartphone Sensors v2.0: Improving Contextual Interaction

Google has launched two new Nexus devices – the 5X and the 6P – alongside a plethora of other devices and developments on the software side. And just a week prior to that, Apple launched the new variants of the iPhone – the 6S and 6S Plus. Of primary importance are the improved sensor capabilities of the new iOS and Android devices, which will undoubtedly prove beneficial for the Quantified Self movement and self-tracking in general.

And that is the crux of this article: the improvements in sensor capabilities, and the hardware changes that may well become the industry standard for the flagships to come.

OVERVIEW

If I went into the detailed specifications of all the devices (I am restricting myself mainly to the two new iPhones and the two new Nexus phones for this article), I would end up ranting on and on like the maniacal fanboy that I am at the end of the day. Nonetheless, suffice it to say that all the devices come with beefier specs than their predecessors, packing more power into slimmer bodies along with more efficient resource management.

Worthy of note are the unique additions that both Google and Apple have made to their flagships. Apple has included a unique form of interaction called 3D Touch, which quite simply adds an extra dimension to the way we interact with our devices. This might be the game changer for years to come, and developers will begin designing their applications accordingly, as we’ve discussed in the 3D Touch article.

Google, on the other hand, has embedded dedicated fingerprint sensors in the rear panels of its new Nexus phones. This means easier authentication and improved ease (and security) when making wireless payments. With the evolution of Android Pay, Apple Pay and other such payment platforms, this points to a design trend many manufacturers may well take inspiration from.

THE GOOD STUFF

For me, the most interesting proposition is the hardware within these devices. The new Apple devices come with an improved M9 chip that is more efficient and powerful, while Google has incorporated an Android Sensor Hub – a unified place to manage all the data coming in from the sensors in the phone.

ANDROID

(Image: the Android Sensor Hub)

The Sensor Hub is essentially a processor dedicated to managing sensor inputs and data without involving the main CPU. Nestled between the main mobile processor and the sensor hardware, the Sensor Hub takes readings from the accelerometer, gyroscope, fingerprint sensor and more, and passes them through Google’s custom algorithms. Power efficient yet intelligent in functionality, the Sensor Hub promises to improve data management within a device.

Another interesting aspect is how the main CPU fits into this delicate balance. The Hub intelligently interprets the relevance of various actions, gestures and sensor readings; only those it deems important are handed to the main CPU for processing. This is what takes the power efficiency game to its finest.
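To make this concrete, here is a minimal sketch in Java of the "wake the main CPU only when it matters" pattern described above, using the standard Android sensor framework (there is no app-facing Sensor Hub API as such – the Hub simply sits behind calls like these). The significant motion sensor is a one-shot trigger that fires only when the hardware decides genuine movement has occurred; until then, the app stays idle.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.hardware.TriggerEvent;
import android.hardware.TriggerEventListener;
import android.os.Bundle;
import android.util.Log;

public class SignificantMotionActivity extends Activity {

    private SensorManager sensorManager;
    private Sensor significantMotion;

    private final TriggerEventListener triggerListener = new TriggerEventListener() {
        @Override
        public void onTrigger(TriggerEvent event) {
            // Called only when the sensor hardware detects significant motion;
            // the main CPU is left alone until that moment.
            Log.d("SigMotion", "Significant motion detected at " + event.timestamp);
            // The trigger is one-shot: re-register to keep listening.
            sensorManager.requestTriggerSensor(this, significantMotion);
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        significantMotion = sensorManager.getDefaultSensor(Sensor.TYPE_SIGNIFICANT_MOTION);
        if (significantMotion != null) {
            sensorManager.requestTriggerSensor(triggerListener, significantMotion);
        }
    }
}
```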

For fitness tracking, the Sensor Hub brings a unique development in the form of hardware sensor batching, which allows sensors to hold back non-critical data from the operating system for a short span of time rather than stream it constantly as it happens. This becomes especially relevant for counting steps, since the main processor no longer has to stay awake just to tally every step the user takes.
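For illustration, a hedged sketch of what batching looks like from an app’s point of view, again in Java with the stock SensorManager API: the extra maxReportLatencyUs argument (the ten-second value here is an arbitrary choice for the example) tells the system it may buffer step-counter events in the sensor hardware before waking the app to deliver them.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class BatchedStepsActivity extends Activity implements SensorEventListener {

    // Let the hardware buffer step events for up to 10 seconds before delivery.
    private static final int MAX_REPORT_LATENCY_US = 10_000_000;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor stepCounter = sensorManager.getDefaultSensor(Sensor.TYPE_STEP_COUNTER);
        if (stepCounter != null) {
            // The fourth argument enables hardware batching where supported.
            sensorManager.registerListener(this, stepCounter,
                    SensorManager.SENSOR_DELAY_NORMAL, MAX_REPORT_LATENCY_US);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[0] holds the total step count since the last device reboot.
        Log.d("Steps", "Steps since boot: " + (long) event.values[0]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

The latency is an upper bound, not a schedule: if the hardware FIFO fills up or another app requests the data sooner, the batched events are simply delivered earlier.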

For developers, this promises new avenues for building applications that are contextually aware and can work in the background without depleting battery life. Organisations are upping the sensor management game, and this can only mean more opportunities for Contextual Applications (such as Emberify’s very own Instant!) to gain momentum like never before.

(Image: the Sensor Hub from a developer and market perspective)

All this makes sensing more ambient: the sensors can stay on constantly without draining the battery, providing meaningful contextual data whenever the user needs it. Combined with features such as Android M’s Doze mode, the Sensor Hub might just become one of the more interesting developments in Android devices to come.

APPLE

Some might say that the concept behind Android’s Sensor Hub was inspired by Apple, and I would agree. Apple has incorporated motion coprocessor technology in its phones since the M7 chip in the iPhone 5S. The underlying principle is the same for both technologies: sensor data is handled by a processor other than the CPU. This frees the CPU to handle the tasks the user throws at it, while contextual data is worked on in the background.

(Image: Apple’s A9 chip with the new integrated M9)

With the iPhone 6S and 6S Plus, the M-series chip is no longer a separate component from the A-series chip. The M9 is integrated within the A9, and this allows it to constantly monitor sensor data and run in the background without interfering with the rest of the A9 chip.

This means the device can track the user’s running patterns continuously. Moreover, it also enables Siri to stay in an always-on mode without compromising battery life. A definite win-win for both battery life and phone efficiency!

TO CONCLUDE

All these developments point to a bigger market for Contextual Applications, and ergo the Quantified Self. Looking at the bigger picture, the sensor game is better than ever for the biggest names in the mobile market, and this opens new doors for application developers as well. Sensor and resource management in phones is becoming more and more relevant, and that is quite exciting, to say the least!

(Cover Image depicts the new Nexus devices. Credits: Simone Gaita)
