Let’s be clear about technology for a moment: Concepts such as Artificial Intelligence, the Internet of Things, Machine Learning and AR/VR are pretty cool, but until their use cases move from novelty to necessity, they will remain a fad. This is the foundation upon which concepts such as Lifelogging and The Quantified Self are built, and I’m seeing things go from “Oh look, I ran 5 km today” to “I ran 5 km today, but my pace fell at this particular stretch, which wore me out. Maybe if I work on my pace for that stretch, I could run 8 km tomorrow.”
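The kind of reasoning described above is simple to sketch: break a run into per-kilometre splits and flag the stretch where pace fell. The split times below are made up for illustration.

```python
# Hypothetical per-kilometre split analysis for a 5 km run.
# Each entry is the time (in seconds) taken for one kilometre.
splits_seconds = [310, 305, 355, 320, 300]

# Index of the slowest kilometre (0-based), i.e. the stretch where pace fell
slowest_km = max(range(len(splits_seconds)), key=lambda i: splits_seconds[i])

# How far that stretch lags behind the runner's average split
average = sum(splits_seconds) / len(splits_seconds)
gap = splits_seconds[slowest_km] - average

# Here the third kilometre (index 2) is 37 seconds slower than average,
# which is the stretch a runner would want to work on.
```

A real tracker would of course derive these splits from GPS and accelerometer data rather than a hand-written list, but the bookkeeping is the same.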
Surely, there’s an upper limit to the information that can be gleaned from one parameter?
Heart rate data has long been useful for understanding a person’s blood flow and heart conditions. But then, there’s the work Cardiogram is doing. Through intensive heart rate tracking, using a neural network trained on the heart rate data of millions of Apple Watch users, Cardiogram hopes to diagnose and predict a host of conditions from heart rate data alone.
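To make the idea concrete, here is a minimal sketch of classifying fixed-length heart-rate windows, using plain logistic regression as a stand-in for Cardiogram’s far larger neural network. The data is synthetic and the two classes (typical vs. persistently elevated resting heart rate) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_windows(n, mean_hr):
    # n windows of 60 one-minute heart-rate samples centred on mean_hr
    return rng.normal(mean_hr, 5.0, size=(n, 60))

# Synthetic labels: 0 = typical resting HR, 1 = persistently elevated HR
X = np.vstack([make_windows(200, 65), make_windows(200, 90)])
y = np.array([0] * 200 + [1] * 200)

# Simple per-window features: mean and standard deviation, standardised
F = np.column_stack([X.mean(axis=1), X.std(axis=1)])
F = (F - F.mean(axis=0)) / F.std(axis=0)

# Logistic regression trained by plain gradient descent
w = np.zeros(F.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid probabilities
    w -= 0.5 * (F.T @ (p - y)) / len(y)      # gradient step on weights
    b -= 0.5 * (p - y).mean()                # gradient step on bias

pred = (1.0 / (1.0 + np.exp(-(F @ w + b)))) > 0.5
accuracy = (pred == y).mean()
```

The real problem is far harder: actual conditions do not separate on window means, which is exactly why a large labelled dataset and a deep sequence model are needed.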
Tech addiction is the new point of discussion in mainstream media, with companies like Facebook discussing how they can make their users’ time more productive, even as social apps like Facebook and Snapchat deploy addictive social feedback loops to grab people’s attention. On the other hand, smartphone users are trying all sorts of timers and Quantified Self tools to improve the way they spend their time. But are these timers and tracking tools enough to help people?
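Under the hood, most of these screen-time tools do little more than tally seconds per app. A minimal sketch of that bookkeeping, with app names and session events made up for illustration:

```python
from collections import defaultdict

def total_usage(events):
    """Sum seconds of screen time per app.

    events: list of (app_name, start_seconds, end_seconds) tuples,
    as a usage tracker might log them over a day.
    """
    usage = defaultdict(float)
    for app, start, end in events:
        usage[app] += end - start
    return dict(usage)

sample = [
    ("social", 0, 300),      # 5 minutes scrolling a feed
    ("reading", 300, 1500),  # 20 minutes reading
    ("social", 1500, 1680),  # 3 more minutes of the feed
]
report = total_usage(sample)
# report["social"] -> 480.0 seconds, report["reading"] -> 1200.0 seconds
```

The tallying is trivial; the open question in the paragraph above is whether simply seeing these numbers changes behaviour.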
With the Internet of Things maturing and the next generation of low-latency, high-bandwidth 5G networks nearing mass deployment, the data available from our lives is growing exponentially. The cost of sensors has been falling thanks to the increased adoption of wearables. But with all this data out there, is it really adding value to our lives?
This week, we saw the new iPhone X’s TrueDepth front-facing camera being used for face authentication. It uses a dot projector that projects over 30,000 invisible dots to map your face structure. Front-facing cameras started showing up in smartphones back in 2003, and selfies have been their biggest use case. Snapchat (and now Facebook and Instagram too) thrives on users’ front-facing cameras with hundreds of fun filters, using computer vision to map the user’s face. Apple also showcased the improved AR face filter experience with Snapchat on the iPhone X. Beyond selfies and face filters, what else can front-facing cameras do? Can we use them to detect a person’s mood or state of mental health? Could they serve as a mobile healthcare diagnostic tool?
One of the major reasons people are wary of jumping onto the self-tracking wagon is the privacy of their data. Granted, when you trust a fitness tracker to record how far you’ve run and how your heart rate varies, you’re also trusting a slew of services working in the background. How can the balance between insight and privacy be achieved?
Here’s an open secret to start things off: Connected technologies are evolving like never before. We’ve seen the advent of wearable manufacturers and plenty of use cases for trackers, and while the market might be plateauing a little, that is opening a window to explore more varied use cases.
Wearables can be distracting at times, especially when a person is driving. We saw it with Google Glass, Snap Spectacles and even smartwatches. Apple was just granted a new patent that could place a limit on notifications on the Apple Watch in order to enhance driver safety. Its aim is to identify whether you’re sitting in the driver’s seat and then reduce distractions, which implies we may be closer to a formula for preventing all of those dreadful accidents caused by people texting or checking notifications while driving. The patent was granted for helping wearables detect controls in vehicles.