Information is everywhere. It used to live on the internet or in cloud services, but since the advent of the Quantified Self movement the human body itself has become a walking, talking data silo – a plethora of data can be gathered from the human body, and when correlated with other recorded parameters it can actually provide valuable insights.
In this regard, two things stand out as paramount components of the QS flowchart I’ve been talking about in my previous articles. The first is the presence of the data silo, which in this case comes in the form of the body along with the environment of the user.
The second, and perhaps most essential, component comprises the sensors and the wide range of form factors they come in to tap effectively into the data stream. And that is the crux of this article – taking the modern-day millennial as a test subject, what kinds of sensors are present on an individual at any given time?
SENSORS IN A PHONE
Alright, easing into the basics. The fundamental device that most people – especially millennials – carry around is the quintessential smartphone. Smartphones have been evolving for a long time, and beyond the processor and memory prowess that make up the bells and whistles of any smartphone advertisement, the underlying developments often lie in the inclusion of new and specialised forms of sensors.
So, let’s begin with the more generic forms of the typical sensors, and then we’ll move on to the more specialised forms.
Spatial awareness is one of the more important features of a smartphone, and in this regard we have the usual suspects: the accelerometer, gyroscope and magnetometer all contribute to giving the phone some form of positional mapping.
In 2008, such sensors were mainly restricted to novelty usage, especially in games which used the accelerometer’s prowess. For instance, I distinctly remember a game on my old Nokia where I had to guide a tiny steel ball through a maze filled with obstacles, or burst bubbles in the path of the ball. Today, however, the accelerometer can replace a dedicated pedometer (which I’ll be talking about shortly) in a coarse yet workable manner.
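As a rough sketch of that idea, here is how accelerometer samples could be turned into a coarse step count by detecting peaks in the signal’s magnitude. The threshold value and the data format are illustrative assumptions on my part, not taken from any real pedometer app – a production counter would also filter noise and adapt to the user’s gait:

```python
import math

def count_steps(samples, threshold=11.0):
    """Coarse step counter: counts rising edges of the accelerometer
    magnitude above a threshold.

    samples: list of (ax, ay, az) readings in m/s^2 at a steady rate.
    threshold: illustrative value just above gravity (9.81 m/s^2).
    """
    steps = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax**2 + ay**2 + az**2)
        if magnitude > threshold and not above:
            steps += 1       # rising edge counts as one step
            above = True
        elif magnitude <= threshold:
            above = False    # re-arm once the spike has passed
    return steps
```

Each footfall produces a brief spike in acceleration, so counting threshold crossings is a crude but serviceable proxy for counting steps.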
Moreover, specific functionalities of other devices can be controlled by applying IF conditions to the accelerometer readings. Say, for example, you want to control a robot using a smartphone: all you need to do is code different output signals for different accelerometer readings – it really is that simple.
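A minimal sketch of that idea in Python – the tilt thresholds and command names here are purely illustrative, not tied to any particular robot’s API:

```python
def tilt_to_command(ax, ay, threshold=3.0):
    """Map phone tilt (accelerometer x/y readings in m/s^2) to a
    drive command via simple IF conditions.

    threshold is an illustrative dead-zone so small jitters around
    level don't move the robot.
    """
    if ay > threshold:
        return "FORWARD"
    elif ay < -threshold:
        return "BACKWARD"
    elif ax > threshold:
        return "RIGHT"
    elif ax < -threshold:
        return "LEFT"
    return "STOP"  # phone held roughly level
```

In a real setup you would read the live accelerometer values on the phone and transmit the resulting command to the robot over Bluetooth or Wi-Fi.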
As far as the QS movement is concerned, the accelerometer alone might not be of that much use. However, combine it with the other sensors within a phone, and you can expect a gamut of additional functionalities.
Alright, so as far as location sensing is concerned, you have the typical magnetometer and GPS chip in a smartphone. While the magnetometer detects the earth’s magnetic north, the GPS chip pinpoints the person’s position on a map.
For the Quantified Self movement, location data by itself isn’t of much use. But correlate it with the accelerometer readings, and you’ve got yourself a makeshift pedometer. Think about it – the inputs needed are: the start and end time of the activity, the location coordinates of the starting and ending points, and the user’s own input regarding his/her height (from which a stride length can be estimated). Combine these components in a simple equation, and boom: you can measure how many steps a person has taken in a single run.
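Here is a sketch of what that makeshift pedometer equation might look like. I am assuming the common rule of thumb that stride length is roughly 0.415 times a person’s height, and the straight-line (great-circle) distance between the two GPS fixes stands in for the actual path walked, so the result is deliberately coarse:

```python
import math

def estimate_steps(lat1, lon1, lat2, lon2, height_m):
    """Estimate steps taken between two GPS fixes.

    Stride length ~ 0.415 * height is a common rule of thumb;
    the haversine straight-line distance ignores turns in the
    actual route, so treat the result as a rough estimate.
    """
    r = 6371000.0  # mean Earth radius, metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance = 2 * r * math.asin(math.sqrt(a))  # metres walked (approx.)
    stride = 0.415 * height_m                   # metres per step (approx.)
    return round(distance / stride)
```

The start and end times aren’t needed for the step count itself, but dividing the distance by the elapsed time gives you average pace for free.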
Location sensing comes into more use when we consider the contextual side of things. We’ve already spoken earlier about indoor positioning systems and geofencing, and the kinds of sensors which go into building these concepts. All in all, there are a lot of use cases in the field of context, and many developers and retail store owners are actually tapping into the potential of location sensing.
CUTTING EDGE SENSORS
Move beyond the regular sensors, and we step into the world of specialised, cutting-edge sensor technology making its presence felt in a wide array of applications: fingerprint sensors are doubling up as enhanced security measures, heart rate sensors are stepping up their game, and some phones even boast the ability to measure the temperature and air humidity around the user.
To paint a picture, consider this: one of Samsung’s earlier flagships, the Samsung Galaxy S6, had a total of nine sensors, capable of measuring inputs such as fingerprints, pulse rate, stress levels and even blood oxygen levels. All the data pooled in from the various sensors is aggregated in a dedicated application, which provides the user with the key visualisations of the data collected.
And it’s not just Samsung. Apple has its motion coprocessor technology, which lets the phone’s CPU handle user-facing activities simultaneously with all the data being collected from the sensors, opening up scope for more contextual interactions. Google introduced the Android Sensor Hub with its 2015 Nexus lineup, which ensures that sensors in a smartphone can work efficiently without over-consuming power or the phone’s resources. I’ve spoken at length about both these technologies in my article on Smartphone Sensors v2.
WEARABLES STEPPING IN
Let’s face it: despite all the data that a smartphone captures, there is always the issue of the phone being kept away from the user, rendering it unable to actually measure any data. And that is where unobtrusive wearables come into the picture.
Wearables nowadays come in smaller and sleeker form factors, which makes wearing them all the more comfortable. Moreover, with technologies such as Bluetooth Low Energy and more efficient power-saving mechanisms, data can be collected by keeping the wearable on the user for longer intervals (I actually got well over a month’s usage out of a single charge when I was using my Mi Band!).
SO, WHAT ARE WE GETTING AT?
Smartphones on average have a minimum of ten sensors incorporated (once you count the light and proximity sensors too). Add a few more on the wearable (accelerometer, GPS) and you’ll have a total of about twelve sensors on a regular millennial.
Why is this relevant? To track all activities, you need to make sure that all data sources are covered – the more fishing poles you have, the better your chances of catching fish, right?
The need of the hour is to effectively integrate all the data collected through the various sensors. Add to this equation the computing and aggregating prowess of the smartphone, and you’ve got yourself a healthy mix of data sources, data-crunching algorithms and information leading to insight. All this, beginning with the sensors within our regular devices.