November 26, 2008

Proximity sensor control

I gave my initial impressions of the Google Mobile iPhone app last week, before Daring Fireball broke the news that Google used the undocumented proximity sensor API. Specifically, the app detects that you have raised the phone and put your face near the device, and then turns on voice recognition. All you can do with the public APIs is determine whether proximity sensing is enabled and toggle that state on or off; no actual proximity detection is possible. And of course proximity detection is a widely useful capability that could be creatively leveraged in many applications.

I’m not going to comment on whether this was a backdoor agreement between Google and Apple worked out ahead of time, or whether the publicity the Google app got in the New York Times led Apple to approve it; Ars Technica and CNET have already discussed that. What I would like to see is Apple moving to make more of the device’s native capabilities available, including the proximity sensor. Other device manufacturers will be more open, so opening up these APIs should only help keep the iPhone attractive.

We are just a platform for developers to build their own applications, and on the iPhone that means each application is compiled into one big binary. As you may know, we expose device capabilities through both tags (extensions to HTML) and Ruby calls. Should we provide such an extension for the proximity API and leave it to our app developers to decide whether to embed those calls in their applications? A rough sketch of what that could look like follows.
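
To make the question concrete, here is a minimal sketch of how an app developer might opt in from Ruby controller code. Everything in it is hypothetical: Proximity, supported?, enable, and the voice recognition helpers are illustrative names for the sake of discussion, not part of any shipped extension.

    # Hypothetical sketch only: Proximity, supported?, and enable are
    # illustrative names, not part of any shipped extension.
    class VoiceSearchController
      def start
        return unless Proximity.supported?   # skip devices without the sensor

        # The native bridge would invoke this block on each proximity change.
        Proximity.enable do |near|
          near ? start_voice_recognition : stop_voice_recognition
        end
      end

      def start_voice_recognition
        # begin recording / recognition when the face is close to the screen
      end

      def stop_voice_recognition
        # tear it down when the phone is pulled away
      end
    end

The point of a shape like this is that the platform only exposes the capability; whether (and how) to use it, and whether that use would pass App Store review, stays with the app developer.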