Ever feel like you need an assistant to track your daily habits? Everything from what you eat to your productivity level matters, but tends to fall by the wayside when you get caught up at work or at home. Recognizing the usefulness of a proactive lifestyle monitor, AT&T Labs researchers have prototyped Virtual Companion, a personal assistant that anticipates users’ needs. Instead of requesting information from your phone and getting a reactive response, Virtual Companion proactively offers suggestions to improve your life along many dimensions: well-being, work, productivity, social life, and more. For example, you can build a diary of what you eat, and the app will match your daily calorie intake against your activity level as measured by the device’s sensors. If Virtual Companion detects you’ve eaten too much “junk food” or haven’t exercised much, it may suggest taking a walk.
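The calorie-versus-activity matching described above can be sketched in a few lines. This is purely an illustrative assumption, not Virtual Companion's actual implementation: the food list, thresholds, and the `suggest_action` function are all hypothetical.

```python
# Hypothetical sketch of matching a food diary against sensed activity.
# All names and thresholds here are illustrative assumptions.

JUNK_FOOD = {"chips", "soda", "candy", "doughnut"}

def suggest_action(diary, steps_today, calorie_budget=2000, step_goal=8000):
    """Return a suggestion when intake outpaces measured activity, else None."""
    calories = sum(entry["calories"] for entry in diary)
    ate_junk = any(entry["food"] in JUNK_FOOD for entry in diary)
    # Suggest a walk if the user is over budget, or ate junk food
    # while the device's step count shows little exercise.
    if calories > calorie_budget or (ate_junk and steps_today < step_goal):
        return "Consider taking a short walk."
    return None

diary = [
    {"food": "salad", "calories": 350},
    {"food": "chips", "calories": 550},
]
print(suggest_action(diary, steps_today=2500))  # junk food + low steps -> suggestion
```

A real system would of course draw the step count from the phone's sensors rather than a parameter, but the decision logic follows the same shape.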
As mobile devices have become an integrated and central part of our daily routines, researcher Emiliano Miluzzo started thinking about new ways to make these devices smarter. In particular, he wanted to turn smartphones into virtual assistants that could help people boost their well-being. Miluzzo believes in a future where our devices anticipate our needs and actions, which is how the idea for Virtual Companion was born. He explains, “As devices become smarter and smarter, they should not only be able to track our activity levels, but should also be able to provide suggestions for how we can live more healthful lives.” The idea took shape as he began conversations with another AT&T Labs researcher, Suhrid Balakrishnan, and with Mashfiqui Rabbi, a summer intern at AT&T Labs and a PhD student at Cornell University. After the internship concluded, the collaboration continued, and work on Virtual Companion also drew input from Mashfiqui’s advisor at Cornell, Professor Tanzeem Choudhury.
Currently, the Virtual Companion app is moving into trials. As more devices begin to anticipate our needs without our even having to ask, we expect future developments to include:
- Visual Recognition. Users will be able to snap a photo of their food and the app will estimate the number of calories in the dish, further enhancing the ease and usefulness of Virtual Companion.
- Digital Life. Imagine how this technology could be integrated into the Digital Life environment, allowing us to be more fully aware of our own lifestyle habits and respond accordingly.
- Proactive Mobile Experience. Virtual Companion could lead to the development of more expectant actions when interacting with our mobile devices.
Researcher Emiliano Miluzzo is a senior member of the technical staff at AT&T Labs. Miluzzo is an experimental researcher who works at the intersection of mobile systems and applied machine learning in the Mobile and Pervasive Systems Research group.
Also working on this project is researcher Suhrid Balakrishnan, a principal member of the technical staff at AT&T Labs who specializes in machine learning and is particularly interested in scalable, accurate, and efficient algorithms for statistical learning. Suhrid has a Ph.D. in Computer Science from Rutgers University and a B.Tech. from I.I.T. Bombay.