When you visit a special spot it's nice to leave your mark. In today's digitally connected world we're taking it beyond graffiti and enhancing the possibilities of placing your memories and more "in the air." Always looking for reviews from your friends? Whether you're eating out, deciding which museum tour to take, or looking for more information on a particular city landmark, it's nice to know what your friends have to say. AT&T is developing a way to merge social networks with the real world, allowing you to get reviews from friends, see videos of others at an exact location and more.
How Did the Idea Hatch?
This technology allows users to place their memories "in the air" at physical locations. The idea for Air Graffiti™ first came about in 2001, when researchers had the insight that the intersection of the Internet and the physical world would be a growing space for new services. Essentially, researchers foresaw the explosion of location-based services (LBS).
The technology for Air Graffiti™ was ahead of its time and several key components were not available for its development in 2001, including:
- Ubiquitous mobile Internet access
- Inexpensive and ubiquitous GPS (and AGPS) hardware
- Rich user interfaces on powerful mobile devices
The project resurfaced in 2007, when researcher Bill Cheswick recalled the concept of Air Graffiti™. The researchers began discussing the project once again and decided to try to resurrect it using more modern technology. Researcher Dave Kormann went on to create an iOS prototype and Chris Rath built a backend for Air Graffiti™.
About the Project
Air Graffiti™ is a technology that allows users to place videos, photos and songs "in the air" at a physical location. For example, if you shot a video of a proposal or took a picture at a historical landmark you could digitally place this material "in the air" for others to see. Air Graffiti™ creates a pervasive LBS platform based on geographic coordinates, allowing mobile users to combine online and offline interactions and giving them the ability to "tag" buildings, street corners or potentially the entire world. This technology allows users to become part of the history at that particular location.
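The source doesn't describe the backend's actual design, but the core idea of a coordinate-based tagging platform can be sketched in a few lines. Everything below is illustrative: the `AirTag` record, the `TagStore` class, and the 50-meter default radius are assumptions, not details of the real prototype.

```python
import math
from dataclasses import dataclass

@dataclass
class AirTag:
    """A piece of media 'placed in the air' at a geographic coordinate (illustrative)."""
    lat: float       # latitude in degrees
    lon: float       # longitude in degrees
    media_url: str   # link to the photo, video, or song
    author: str

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class TagStore:
    """Naive in-memory store; a real backend would use a spatial index."""
    def __init__(self):
        self.tags = []

    def place(self, tag):
        """'Tag' a location by attaching media to its coordinates."""
        self.tags.append(tag)

    def nearby(self, lat, lon, radius_m=50.0):
        """Return every tag placed within radius_m meters of the given point."""
        return [t for t in self.tags
                if haversine_m(lat, lon, t.lat, t.lon) <= radius_m]
```

A user standing near a tagged spot would then query `nearby()` with their own GPS fix, e.g. `store.nearby(40.7580, -73.9855)`, and retrieve whatever others left "in the air" there.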
Real-world social networking is one application of what's possible when users have access to a pervasive LBS platform.
Other potential applications include:
- Educational applications. Tourists could take an architectural tour around a city, with a virtual architect tour guide who narrates a building's history at each stop.
- Augmented Reality gaming. Gamers could play "capture the flag" on the streets of New York.
- Social network recommendations engine. Hungry mobile users could find reviews for a new restaurant "tagged" on its front door — or suggestions for a better option nearby.
Finally, a platform like the one that makes Air Graffiti™ possible could be used as the foundation for next-generation LBS. While there are many emerging LBS applications, each pulls information from its own database and none are currently interoperable. A standardized infrastructure opens up new possibilities for LBS, enabling a new class of Web applications.
About the Researchers
Pamela Dragosh, Principal Member of Technical Staff, has been an employee of AT&T Labs Research since 1995. In her career with AT&T Labs, Pam has worked on many innovative software development projects, including speech recognition and text-to-speech, one of the first digital music download systems, network bandwidth algorithms, search tools and location-based services. Pam's expertise includes building cross-platform software for multiple form factors, including mobile and tablet devices.
Dave Kormann is a Senior Technical Staff Member at AT&T Labs. His research is in the Online Platforms Research department, where his work includes systems administration and Internet services development. He received his Master of Science degree in computer science from Northeastern University in 1996.
Chris Rath has been an employee of AT&T Labs Research since 1985. In his career with the labs, Chris has worked on many innovative software development projects, including one of the first commercially available expert systems, one of the earliest speech recognition and text-to-speech processing systems for the desktop computing environment, one of the first digital music download systems, online communities, cluster computing, mobile computing, digital video delivery systems and location-based services.
Jim Rowland, Director of Applied Data Mining Research, has nearly thirty years of experience as a Technical Manager, Distinguished Member of Technical Staff, and Member of Technical Staff in the role of project leader, architect, and developer on Advanced Development projects at AT&T Labs and Bell Labs. He has led projects across a number of technology domains, including data mining using hardware clusters, automatic speech recognition, text-to-speech synthesis, encryption, digital rights management, audio compression, e-commerce, and expert systems.
Gregg Vesonder is Executive Director of the Communication Science and Artificial Intelligence Research Department at AT&T Labs-Research. Gregg has developed and managed software systems supporting operations, e-commerce, sales support and data mining. He has been involved in software tool development for speech recognition, C++, artificial intelligence and software design and analysis. His current research interests are in sustainability, HCI and artificial life (alife). He is a Bell Labs Fellow and an AT&T Fellow, and is Adjunct Professor of Computer and Information Science at the University of Pennsylvania. Gregg received a BA in psychology from the University of Notre Dame and an MS and PhD in cognitive psychology from the University of Pittsburgh.