Another WWDC came to an end a couple of days back, and while iOS 12, watchOS 5 and macOS Mojave took the limelight (Apple emphatically stated that there would be no hardware announcements), it was worth noting that the company is now taking digital wellness and augmented reality very seriously, with features such as Screen Time.
I/O 2018 kicked off a couple of days back, and the most significant announcements came in the form of Google Duplex and Assistant, a smarter Gmail, and Android P. However, perhaps the most relevant headliner of I/O for me was Google’s take on Digital Well-Being, which brings with it app usage tracking and alerts reminding you to take a break.
We have all been looking for someone to blame for the smartphone addiction problem. Is it really Apple’s fault for making the iPhone an essential lifestyle product, or is it Facebook’s fault for building a really engaging app? Keeping users hooked (engaged) is one of the biggest goals, or Key Performance Indicators, for most tech companies. Every product designer is looking at how they can employ simple feedback loops that trigger hits of dopamine, making users addicted to their app.
Let’s be clear about technology for a moment: concepts such as Artificial Intelligence, the Internet of Things, Machine Learning and AR/VR are pretty cool, but until their use cases can be converted from novelty to necessity, they will remain a fad. This is the foundation upon which concepts such as Lifelogging and the Quantified Self are built, and I’m seeing things go from “Oh look, I ran 5 km today” to “I ran 5 km today, but my pace fell at this particular stretch, owing to which I exerted myself. Maybe if I work on my pace for that stretch, I could run 8 km tomorrow.”
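That shift from raw logging to insight is simple to illustrate. As a minimal sketch (the split times below are hypothetical, not from any real tracker), here is how per-kilometre splits from a logged run could be scanned for the stretch where pace fell the most:

```python
from datetime import timedelta

# Hypothetical per-kilometre split times from a logged 5 km run.
splits = [timedelta(minutes=5, seconds=30),
          timedelta(minutes=5, seconds=45),
          timedelta(minutes=7, seconds=10),   # the slow stretch
          timedelta(minutes=5, seconds=50),
          timedelta(minutes=5, seconds=40)]

def slowest_split(splits):
    """Return the (1-based) kilometre index with the worst pace."""
    worst_km = max(range(len(splits)), key=lambda i: splits[i])
    return worst_km + 1

print(f"Pace dropped most on km {slowest_split(splits)}")
```

Real fitness apps do this over GPS and heart-rate streams, but the principle is the same: the value is not in the log itself, it is in what the log tells you to change.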
Surely, there’s an upper limit to the information that can be gleaned from one parameter?
Heart rate data has long proven beneficial in assessing a person’s blood flow and heart conditions. But then there’s the work Cardiogram is doing. Through intensive heart rate tracking, backed by a neural network built around the heart rate data of millions of Apple Watch users, Cardiogram hopes to diagnose and predict a host of conditions from heart rate data alone.
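Cardiogram’s actual model is a deep neural network, which is well beyond a blog snippet. But as a toy illustration of the first step such a pipeline needs, here is a sketch (with made-up sample values) of turning a raw heart-rate series into the kind of summary features a classifier might consume:

```python
from statistics import mean, stdev

def hr_features(samples):
    """Summarise a heart-rate series (beats per minute) into simple
    features: resting rate, peak rate, average, and variability."""
    return {
        "resting": min(samples),
        "peak": max(samples),
        "mean": round(mean(samples), 1),
        "variability": round(stdev(samples), 1),
    }

# Hypothetical readings sampled across a day of wear.
day = [62, 65, 90, 120, 75, 64, 61, 58, 110, 70]
print(hr_features(day))
```

The interesting part, and the reason the Apple Watch matters here, is scale: one person’s features are noise, but millions of users’ features give a network something to learn from.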
Tech addiction is the new point of discussion in mainstream media, with companies like Facebook discussing how they can make their users’ time more productive, even as social apps like Facebook and Snapchat deploy addictive social feedback loops to grab people’s attention. On the other hand, smartphone users are trying all sorts of timers and Quantified Self tools to improve the way they spend their time. But are these timers and tracking tools enough to help people?
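At their core, most of these timer tools reduce to something very simple. As a toy sketch (the class and its limit are illustrative, not any real app’s API), a session timer that nudges you once continuous usage crosses a threshold looks like this:

```python
import time

class UsageTimer:
    """Toy session timer: flags when continuous usage exceeds a limit.
    Real digital-wellness tools hook into OS-level app lifecycle events."""
    def __init__(self, limit_seconds):
        self.limit = limit_seconds
        self.started = None

    def open_app(self, now=None):
        # Record when the session began (injectable clock for testing).
        self.started = now if now is not None else time.time()

    def should_take_break(self, now=None):
        now = now if now is not None else time.time()
        return self.started is not None and (now - self.started) >= self.limit

timer = UsageTimer(limit_seconds=20 * 60)   # 20-minute limit
timer.open_app(now=0)
print(timer.should_take_break(now=25 * 60))  # 25 minutes in: time for a break
```

Which is exactly why the question stands: a mechanism this simple can surface the data, but it cannot by itself outweigh feedback loops engineered to pull you back in.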
With the Internet of Things maturing and the next generation of low-latency, high-bandwidth 5G networks nearing mass deployment, the data available from our lives is growing exponentially. The cost of sensors has been falling, thanks to the increased adoption of wearables. With all this data out there, is it really adding value to our lives?
This week, we saw the new iPhone X’s TrueDepth front-facing camera being used for face authentication. It uses a dot projector that projects over 30,000 invisible dots to map the structure of your face. Front-facing cameras started showing up in smartphones back in 2003, with selfies being their biggest use case. Snapchat (and now Facebook and Instagram too) thrives on its users’ front-facing cameras with hundreds of fun filters, using computer vision to map the user’s face. Apple also showcased an improved AR face filter experience with Snapchat on the iPhone X. Beyond selfies and face filters, what else can front-facing cameras do? Can we use them to detect a person’s mood or state of mental health? Can they be used as mobile healthcare diagnostic tools?