Report from Where 2.0
This week I am participating in “Where 2.0,” a conference that focuses on innovations in “the geospatial web” — an ever-broadening category of technologies that use location information in some way. By grounding data (often literally) in its physical location on the planet, software can monitor, visualize, analyze, and even predict a mind-boggling variety of outcomes.
While I was eager to learn about the latest crop of location-based services, such as applications for mobile phones and mapping technologies, I could not have imagined the extent to which the intersection of web technologies, GPS, and mobile phones has electrified the entrepreneurial and research communities. Here are just a few reports that represent the breadth of yesterday’s discussions.
- MIT Media Lab professor Sandy Pentland introduced us to “Reality Mining” — the pursuit of understanding how organizations work by analyzing who’s talking to whom and who’s out of the loop. One could study the flow of information through a company by tracking email volleys, but he found it much more insightful to analyze in-person meetings by monitoring employees’ location and route data. (Research made possible by RFID and other location-sensing devices.) Organizations with more formal and informal in-person interactions were more productive. What does that say about telecommuting?
- Glympse announced its mobile application with the grammarian’s nightmare of a tagline: “Share your where.” The application lets you share your location and real-time route with anyone you choose, for as long as you choose, sending updates by email or phone to the people you allow to monitor your route. See Bob Tedeschi’s review of the product in today’s New York Times.
- Two products are in the running for “coolest demo” — I leave it to you to choose. Joker Racer is a car driven remotely over WiFi and the internet, described for the geeky audience as a “drivable Linux server.” Velodyne’s Lidar is a sensor that uses 64 lasers to capture its surroundings in three dimensions in real time — used by autonomous vehicles, the U.S. military, and Radiohead.
What on earth (forgive the pun) will today’s sessions bring?