This week, we saw the new iPhone X’s TrueDepth front-facing camera being used for face authentication. It uses a dot projector that casts over 30,000 invisible dots to map the structure of your face. Front-facing cameras started showing up in smartphones back in 2003, and selfies remain their biggest use case. Snapchat (and now Facebook and Instagram, too) thrives on its users’ front-facing cameras, offering hundreds of fun filters that map the user’s face using computer vision. Apple also showcased an improved AR face-filter experience with Snapchat on the iPhone X. Beyond selfies and face filters, what else can front-facing cameras do? Can we use them to detect a person’s mood or state of mental health? Can they serve as a mobile healthcare diagnostic tool?