via vellum:

“Responsive Typography” Tracks Your Location to Automatically Resize Text

Web designer Marko Dugonjić has created a website called “Responsive Typography” that alters the size of the text based on your distance from the screen. As a simple working prototype, Responsive Typography shows us some of the untapped potential of physical interactions with software. Imagine moving away from your screen to get a drink and watching as it magnifies the text so you can read your email from afar, or a computer that goes to sleep when you leave the room and wakes when you return.
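It is not hard to imagine how a browser-side sketch of this might look. The snippet below is a minimal illustration, not Dugonjić’s actual implementation: it assumes the experimental FaceDetector interface from the Shape Detection API (available behind a flag in some Chromium builds), and the pixel widths used to map face size onto type size are invented for illustration. The apparent width of a detected face shrinks roughly in proportion to the reader’s distance, so a narrower face means larger type.

```typescript
// A minimal sketch of distance-responsive type, assuming the experimental
// FaceDetector interface (Shape Detection API); all constants are invented.
const video = document.createElement("video");

async function startCamera(): Promise<void> {
  // The prerequisite: a standing webcam stream pointed at the reader.
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();
}

async function adjustTypeSize(): Promise<void> {
  // FaceDetector ships behind a flag in some Chromium builds; bail if absent.
  const FaceDetectorCtor = (window as any).FaceDetector;
  if (!FaceDetectorCtor) return;

  const faces = await new FaceDetectorCtor().detect(video);
  if (faces.length === 0) return;

  // A face's apparent width shrinks roughly in proportion to its distance,
  // so a narrower bounding box means a farther reader and larger type.
  const faceWidth = faces[0].boundingBox.width;
  const nearWidth = 300; // px: assumed face width when seated at the screen
  const farWidth = 80;   // px: assumed face width from across the room
  const t = Math.min(1, Math.max(0, (nearWidth - faceWidth) / (nearWidth - farWidth)));
  document.documentElement.style.fontSize = `${16 + t * 32}px`; // 16px to 48px
}

startCamera().then(() => setInterval(adjustTypeSize, 500));
```

Note that the sketch’s very first requirement is getUserMedia: a persistent camera stream trained on the reader, which is exactly the trade raised below.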

It’s fascinating to think of computers becoming more responsive to our bodies as a whole, but with the increasing prevalence of facial recognition these interactions could be taken even deeper. If the computer recognized that you looked sad, it might change your music playlist to something cheerier, or send a friend a message asking them to give you a call. If it realized you were getting tired, could it tell you when to take a break? Or open the blinds to let in a little more daylight?

Unsaid in this two-paragraph excerpt is that the device will have to be always watching the reader for this to work. The text server and device combo will be reading and interpreting the proximity of the human reader. This is a potential Trojan horse for another layer of panoptic surveillance.

Which layer of abstraction will own the hardware sensor? Will it be the text being read? The browser that organizes access to the text? Will it be the local (or cloud-based) operating system? Will it be some as-yet-unimagined third-party publishing service provider?

Does this presume a continued human-computer-interface ecosystem of desktop, laptop, and hand-held devices? How will home-entertainment consoles handle multiple readers at living-room scale? How will a pervasively sensed living room (cf. Microsoft’s next-generation plans for the Kinect) read and respond to us?

Will Google Glass and other HUD AR wearable technologies need to be responsive? How so?