Shashwat Pradhan

Google I/O 2015: Context & Machine Learning

Google is emerging as the leader in machine learning and context. Google I/O 2015 was booming with the word "context". We can finally see context going mainstream, which is really exciting. For us at Emberify, it's really inspiring to be in this space right now.

At Emberify, we look at context in three stages: Sense > Understand > Adapt. At Google, this translates to Sensors > Algorithms > User Experience. Sensors allow the smartphone to break out of the digital world into the real, physical world.

After sensing the data, Google has some really nice machine learning algorithms to make sense of the noisy, sometimes conflicting sensor data. These models can be fine-tuned and trained successfully thanks to the magnitude of the dataset Google can test them on. The algorithms take the raw sensor data and give us higher-level signals, like the user's current activity.
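As a rough illustration of that last step, here is a minimal sketch of my own (not Google's actual pipeline) that maps a window of raw accelerometer samples to a coarse activity label using the variance of the acceleration magnitude. The thresholds and labels are made-up assumptions, just to show the shape of raw-data-in, signal-out:

```python
import math

def magnitude(sample):
    """Magnitude of a 3-axis accelerometer sample (x, y, z)."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_window(window):
    """Map a window of raw samples to a coarse activity label.

    Illustrative thresholds only: a still phone shows almost no
    variance in acceleration magnitude, walking shows some, and
    running shows a lot.
    """
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    if variance < 0.05:
        return "still"
    elif variance < 2.0:
        return "walking"
    return "running"
```

A real system would use richer features (frequency content, step periodicity) and a trained model rather than hand-picked thresholds, but the flow is the same.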

These algorithms translate into user experience: apps adapt to the user's needs depending on the entire contextual situation. This can simplify app interaction in many cases.

Google built an Activity Recognition API for developers on top of these machine learning algorithms. They collected 65,000 sensor traces with the activities labeled from Google employees, so they could train the machine learning models to predict the user's activity.
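To make the training idea concrete, here is a toy sketch: given labeled traces as (feature vector, activity label) pairs, learn one centroid per activity and predict by nearest centroid. Google's actual models are far more sophisticated; this only shows how labeled traces become a predictor:

```python
def train(labeled_traces):
    """Learn a per-activity centroid from (features, label) pairs."""
    sums, counts = {}, {}
    for features, label in labeled_traces:
        if label not in sums:
            sums[label] = [0.0] * len(features)
            counts[label] = 0
        for i, v in enumerate(features):
            sums[label][i] += v
        counts[label] += 1
    return {label: [v / counts[label] for v in s]
            for label, s in sums.items()}

def predict(centroids, features):
    """Return the label whose centroid is closest to the features."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))
```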



By combining multiple sensors, we can get more accurate predictions of the user's activity, as in the on-body detection model that was showcased at Google I/O.
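One simple way to combine sensors is a weighted average of per-sensor confidences. This is a hypothetical sketch, not Google's on-body model: the sensor names, weights, and threshold are all assumptions for illustration:

```python
def fuse(readings, weights):
    """Weighted average of per-sensor confidence scores.

    readings and weights are dicts keyed by sensor name, with
    confidences in [0, 1].
    """
    total = sum(weights[s] for s in readings)
    return sum(readings[s] * weights[s] for s in readings) / total

def on_body(readings, weights, threshold=0.5):
    """Fused decision: is the phone on the user's body?"""
    return fuse(readings, weights) >= threshold
```

The benefit of fusion is that one noisy sensor is outvoted by the others instead of flipping the decision on its own.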



What I have learnt while building context-aware systems is that, in addition to the sensor data, some common-sense assumptions based on general human behaviour are needed to get more accuracy. Sensors can sometimes give us conflicting data. Besides using multiple sensors to confirm a reading, common-sense logic can be built into the algorithm, such as requiring a certain event to occur repeatedly before counting it, since a single occurrence could just be a random event.
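That "repeat before counting" idea can be sketched as a small debouncer: an event is only confirmed after it has been observed some minimum number of consecutive times, which filters out one-off noisy readings. The repeat count here is an arbitrary assumption:

```python
class EventDebouncer:
    """Confirm an event only after N consecutive observations."""

    def __init__(self, required_repeats=3):
        self.required = required_repeats
        self.last = None
        self.streak = 0

    def observe(self, event):
        """Return the event once confirmed, else None."""
        if event == self.last:
            self.streak += 1
        else:
            # A different event resets the streak, so a single
            # random reading never gets counted.
            self.last = event
            self.streak = 1
        return event if self.streak >= self.required else None
```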

Overall, from Google I/O I can see how Google is using Android, and even iOS through Google Now, as an end-point to deliver its machine learning goodness.