I drive a Honda Civic. It’s a fine vehicle overall, but it has an issue with me not wearing a seatbelt. Like most vehicles, it starts with a mildly annoying beeping sound reminding me to put on my seatbelt. Unlike most vehicles, it doesn’t stop reminding me. Ever. It beeps for 10 seconds, waits a minute, and resumes beeping. Indefinitely. The obvious lesson: George, wear your seatbelt. The *really* important lesson: technology that is designed to influence behaviour without giving the end user final control reveals how intrusive some technologies have become (perhaps even unethical – maybe we need a Constitution or Bill of Rights for what choices machines can make on our behalf and at which points end users should be able to override design – i.e. something like Asimov’s rules for robots).
I’ve argued over the last few years that programmers are today’s philosophers and ethicists. A learning management system tells me what I can and cannot do. My iPhone/iPad bounds my range of actions and options. Sometimes these limitations are helpful in making devices accessible to a broader population. My colleague at TEKRI – Jon Dron – argues for hard/soft dimensions of technology and design. Rigid structure isn’t always bad, especially when it helps the end user make effective use of the software or hardware.
However, many of us are increasingly accepting limitations, advertising intrusions, or bounded functionality in the technologies we use. The cloud is great for reducing costs and providing near-ubiquitous access to content and apps/software. But the cloud requires a trade-off of control. Like money, privacy, control, and security are transactional entities. We exchange them when we receive value in return. Or, at least, we should. When Amazon announces a fee-reduced Kindle that is supported by ads, and pundits say “it’s a membership into a special deals ‘club’”, the lesson of value generation through transactions based on shifts of control has been lost.