November 23, 2021
2 Minute Read
Manohar Paluri has spent the bulk of his career developing methods to make machines see. Now, as Director of Artificial Intelligence at Facebook (now Meta), he treats computer vision as one building block in the massive undertaking of egocentric perception: making sense of data collected from a first-person perspective via wearable devices.
Mano joined our podcast, How AI Happens, to discuss the current state of computer vision, the challenges inherent in developing egocentric perception, and how Facebook is weighing the issues of transparency and privacy as personal data becomes, well, more personal.
Chief among the considerations for Mano's team is the shift from third-person sensor perception (that is, holding out a smartphone at arm's length) to the first-person perspective granted to sensors in wearable tech. While it may not seem obvious, the difference in the data collected is tremendous. A first-person perspective allows for better intention prediction through gaze recognition and hand-object interaction. But it also brings its own challenges, such as users not holding their heads as steady as they would hold a smartphone.
Mano explains how his team is tackling these issues, the ethical considerations at play, and the importance of not sacrificing transparency in the pursuit of accuracy. To hear Mano discuss this and much more, stream the full episode below, or anywhere you get your podcasts.
Rob hosts and produces Sama's podcast, How AI Happens, which features experts and practitioners explaining their work at the cutting edge of artificial intelligence. Tune in to hear AI researchers, data scientists, ML engineers, and the leaders of today's most exciting AI companies explain the newest and most challenging facets of their field.