This is a fantastic talk and an interesting article on the personalization filtering that happens automatically on the internet. By doing this ‘invisible’ filtering, search engines are usurping control from the user. While smart algorithmic filtering is very helpful, it has to be made visible. Any time a system utilizes automation, that automation must be communicated to all parts of the system – especially the human user (see airplane crashes for the result of that lack of communication in extreme cases).
This invisible algorithmic filtering is essentially censorship. I find it very disturbing that the gateways to the internet are limiting our access to information. Perhaps the internet has moved from being a democratizing force that connects the world to one that isolates us, as the speaker says.
Once we convince everyone that the filtering needs to be made visible, and maybe inject some ethics into the algorithms that do the filtering, it will be up to interaction designers to find the best way to implement that visibility and control. How do we show people what they are missing? How do we give them control over their information? Should we encourage people to broaden their information topics? Will simply exposing the filtering and giving access to information outside the filter be enough? Where should we draw the line between the ethics of the design and engineering team and the desires of the user?
These are huge issues for interaction design. What do you think?
Next week, I’ll have a longer post. It should be interesting – getting into some UX basics.