We imagine a day, not far away, when talking to a computer in public will not sound like science fiction. We'll live in a world of muffled speech. A few of those dialogs will be about things more important than ordering pizza.
# Voice Design
Voice communication is intimate and immediate. We mumble and jest about what is right here, right now. We will not talk casually to our computers until they can keep up with our attention-depleted expression. A personal hypertext may be the only solution.
Amazon put a Kindle in a tube with a good speaker and an excellent microphone. One addresses it as Alexa.
me: Alexa, set a timer for fifteen minutes.
it: Fifteen minutes starting now.
Excellent voice design confirms recognition while exposing assumptions, so as to minimize confusion when expression is ambiguous.
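Here is a minimal sketch of that principle in code. This is a hypothetical intent handler, not Amazon's Alexa Skills Kit API: the handler restates what it recognized, including the unit it assumed, so a misheard request is exposed at once.

```python
# Hypothetical voice-intent handler (not the Alexa Skills Kit API).
# The point: confirm recognition and speak any assumptions back to the user.
from dataclasses import dataclass, field

@dataclass
class Intent:
    name: str                                   # e.g. "SetTimer"
    slots: dict = field(default_factory=dict)   # recognized values, e.g. {"duration": "fifteen"}

def handle(intent: Intent) -> str:
    if intent.name == "SetTimer":
        duration = intent.slots.get("duration")
        unit = intent.slots.get("unit", "minutes")   # assumption, exposed in the reply
        if duration is None:
            return "For how long?"                   # ask rather than guess
        # Echo the interpretation back so a misrecognition is caught immediately.
        return f"{duration.capitalize()} {unit} starting now."
    return "Sorry, I didn't catch that."

print(handle(Intent("SetTimer", {"duration": "fifteen"})))
# -> Fifteen minutes starting now.
```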
me: Alexa, what's new on wiki?
it: Mike has two new articles, David continues one.
me: What has Mike written?
it: Mike wrote about Browsing and Editing.
me: Tell me about Editing.
it: We hear repeatedly that fedwiki is confusing to beginners. One solution might be a simple entry-level editor.
me: Skim it for me.
it: People have asked me ...
it: Can't be installed ...
it: Inaccessible to vision-impaired ...
me: Go on.
it: Visually overwhelming ...
it: An ideal starter environment ...
me: What?
it: An ideal environment ... might look familiar ... only a few differences ... quickly explained.
Mike has written about entry-level browsing and editing, especially by the vision-impaired. See Minimum Specs for Entry Level Editor.
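The longer dialog above could be driven by the federation's own JSON, no new protocol required. A rough sketch, assuming the usual federated wiki endpoints, a site's /system/sitemap.json listing slug, title and date, and /slug.json returning story items with text, and treating "skim" as speaking only the first few words of each paragraph. The site name is made up.

```python
# Rough sketch behind the "what's new on wiki" and "skim it" dialog above.
# Assumes standard federated wiki JSON: /system/sitemap.json and /{slug}.json.
import json
import urllib.request

def fetch(url):
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def whats_new(site, since):
    """Titles of pages updated since a timestamp (milliseconds), newest first."""
    sitemap = fetch(f"http://{site}/system/sitemap.json")
    recent = [page for page in sitemap if page.get("date", 0) > since]
    recent.sort(key=lambda page: page.get("date", 0), reverse=True)
    return [page["title"] for page in recent]

def skim(site, slug, words=8):
    """The first few words of each paragraph, the spoken 'skim' of a page."""
    page = fetch(f"http://{site}/{slug}.json")
    for item in page["story"]:
        text = item.get("text", "")
        if text:
            yield " ".join(text.split()[:words]) + " ..."

# me: What has Mike written?   it: speaks whats_new("mike.example.com", since)
# me: Skim it for me.          it: speaks each line from skim("mike.example.com", "editing")
```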
Imagine a sighted person studying complex visualizations and then explaining the insights they might gain through the experience. Then imagine the Echo's deep learning reading those insights and highlighting the operative words that would be read to the sightless. Do they match?
Would communication happen? Is there a better word? A better sentence structure? A more important thing to say right away? Would the sightless be the only ones to benefit from this robotic proofreading?
Try condensing this transcript: Digest 2015-07-29
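One naive way to attempt that condensing, and to surface the "operative words" imagined above: score each sentence by how often its non-trivial words occur in the whole transcript and keep the top few. A toy sketch, nothing like whatever the Echo actually does.

```python
# Toy extractive condenser: keep the sentences whose words occur most often.
# A stand-in for "skim" or "digest", not a claim about how Echo works.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
             "that", "we", "be", "for", "on", "with", "as", "this", "i"}

def words(text):
    return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]

def condense(text, keep=3):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(words(text))                      # the "operative words"
    def score(sentence):
        tokens = words(sentence)
        return sum(freq[w] for w in tokens) / (len(tokens) or 1)
    top = sorted(sentences, key=score, reverse=True)[:keep]
    return [s for s in sentences if s in top]        # keep original order

# condense(open("digest-2015-07-29.txt").read())     # filename hypothetical
```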
Is authoring assistance possible? Valuable? Educational, even? Would people express themselves better if they were used to speaking and typing to the Echo? Would beauty disappear? Or would it move to a different level?
# Blind Creatives
I worked with a blind compiler writer years ago. His was challenging work and mistakes were common. I wrote tools that helped me debug. One out of four errors was in the compiler. (But then, one out of forty errors was in the chips themselves.)
Compiler errors were common because a blind compiler writer could not see all of the self-similar parts of the code. If the logic were wrong in one place, it was probably equally wrong in a dozen others. But each bug would have to be found, reported, understood and repaired individually.
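A hypothetical stand-in for the tools I wrote back then, not what we actually used: normalize each source line to its shape, collapsing identifiers and literals, then group lines that share a shape, so a fix made in one place can be checked against its dozen siblings.

```python
# Hypothetical sketch, not the original debugging tools: group source lines by
# their "shape" so a fix applied in one place can be checked everywhere else.
import re
from collections import defaultdict

def shape(line):
    line = re.sub(r"\b[A-Za-z_]\w*\b", "ID", line)   # collapse identifiers (and keywords)
    line = re.sub(r"\b\d+\b", "NUM", line)           # collapse numeric literals
    line = re.sub(r'"[^"]*"', "STR", line)           # collapse string literals
    return " ".join(line.split())

def self_similar(lines):
    """Map each repeated shape to the line numbers where it occurs."""
    groups = defaultdict(list)
    for number, line in enumerate(lines, start=1):
        if line.strip():
            groups[shape(line)].append(number)
    return {s: nums for s, nums in groups.items() if len(nums) > 1}

# for s, nums in self_similar(open("codegen.c").readlines()).items():
#     print(s, nums)                                 # filename hypothetical
```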
If the visually privileged find federation confusing, what do they think any traditionally sighted task is like for the blind? We advise programmers to identify the thing they find most difficult and do a lot of it.
Could a blind compiler writer write a better compiler if he had to speak it to Echo from across the room? What would Echo have to do to help? What would every sighted person in the federation have to do to help Echo? That is the browser/editor of the future.