The trouble with existing systems

Oleg Cherevko
Wed, 13 Jan 1999 21:30:02 +0200 (EET)

On Wed, 13 Jan 1999, Tom Novelli wrote:

> USER INTERACTION is something I haven't heard much about in TUNES... just
> the idea of "abstracting out the user interface", which sounds like it might
> be an impossible task. What sucks about GUI's? The mouse - no
> touch-feedback. Hierarchical menus. Windows get cluttered, piled up... what

The mouse also reduces your pointing ability to "one hand with only
one or two fingers", while you actually have two hands with five fingers
each. Moreover, the idea of selecting everything with a mouse is rather
unnatural. It is fine while you are selecting some object for a certain
operation, but it is very unnatural to then move the mouse pointer to some
menu item or toolbar button to make the action actually happen. The
"context menu" idea is certainly a move in the right direction, but alas,
you are again forced to use the mouse to select the action. Perhaps a
direct "thought-to-action" interface would be the most natural, but we
have yet to invent the hardware required. Any other ideas?

> use are windows when they're on top of each other? Why do so many windows
> take up 80% of the screen, when they might as well take it all? These are

While I agree with you that "100% is better than 80%" for interacting
with the currently active application, you still need a way to switch
between different windows/applications. Do you know how to do this
efficiently without reserving some screen area for a taskbar, etc., or
for those parts of the inactive windows that you can click to make them
active?

> problems we could address pretty easily, when we're ready.. allow full
> keyboard control, use popup menus instead of pulldowns, use smarter window
> management.
> I like to compare user interfaces with the way people interact with the real
> world. (keeping in mind that it varies from person to person) I look for
> metaphors... the office, kitchen, shop, desk, dashboard... filing cabinets
> (a limited hierarchy), bookshelves (a hierarchy where everything is
> visible), a messy room with stuff all over the place :)... Tool boxes, knife
> racks, kitchen drawers.. pots and pans ("temporary containers" for working
> with things)... and the old Trashcan (remember Mac's? :) My point? Current

I'm afraid this is the wrong way to look at it. In the real world you
have the _whole of your body_ to interact with the 3D environment in
which you are immersed. You have arms and legs, you have great freedom of
movement, you can lean, reach, take, touch, grab, drop, kick, throw,
etc...
Your body is the most important part of your user interface to the real
world. Most real-world metaphors are effective precisely because your
body has so many abilities. For example, we all know that the windowing
system metaphor is modelled after the real-world situation of some sheets
of paper spread over the top of your desk. In the real world, however,
those sheets are not confined to the top of your desk: you can take them
away, put them on a shelf or on the floor, etc. You can also reach for
some piece of paper with your left hand while still looking at some other
paper in front of you; you can do these actions _in parallel_, because
your body lets you. This is why it is efficient. You have none of these
abilities when you are equipped with only one mouse pointer that is used
for everything. That's why this metaphor sucks.

In short, this "real-world metaphors" approach may do if you aim at
creating fun for the average layman, but it won't as soon as you think of
the professional who needs efficiency.

I think a move in the opposite direction will be much more efficient:
instead of "bringing real-world metaphors to the computer", it would be
nice if we could "bring computer metaphors to the real world".
Imagine that all the walls and the ceiling of your room are one big
computer screen, and that you can create and move the window of your
favorite application anywhere you like: on the wall right in front of
your desk, on some other wall, or even on the ceiling directly over your
head while you are still in bed. Imagine also that you have pieces of
"digital paper" that can be used as lightweight portable screens, and
that you can stack them or spread them all over the floor like real
paper. Imagine that you have... OK, folks, I guess I know what you think
of all these ideas: impossible, or at least not in the near future.
I agree, but my point is that real efficiency comes from _rich_ _natural_
(that's why modern virtual reality gear sucks) interaction of human
beings with their digital creations (aka computers).