Volume Number: 8
Issue Number: 7
Column Tag: Interfacial Relations
Interfacial Relations
Popular Sentiment: The Emperor's New Clothes or Can You Tell a Book By Its Cover?
By Joost Romeu, San Francisco, California
Once, the Macintosh interface elicited wonder and amazement. Current sentiment is much less enthusiastic. Despite notable exceptions, multi-platform vendors seem more interested in completing interfaces that subscribe to platform standards than in constructing interfaces that complement their users' needs. But what is even more disappointing is the reaction of intelligent observers who, rather than decrying the decline of interface values, speak of them as no longer relevant.
Interfacial Relations confronts this critical backlash and the sentiment it has provoked; probes for reasons behind the sentiment; and discusses how this new conservatism might affect developers' willingness to create responsive interfaces in the future.
Apple's Core
During MacTutor's resurrection and relocation, Apple introduced impressive hardware updates (the Quadra and the PowerBook), previewed revolutionary hardware technology (pen-based computers and consumer electronics), introduced fascinating software capabilities (QuickTime), and promised a radical commitment to RISC. With all of this activity, you'd think there would be more action in the interface arena, the area Apple is best known for. However, today's interface seems less an achievement than an Apple afterthought.
Recently, Tog Tognazzini, Apple's chief interface proponent and staunchest defender of the metaphorical grail, and a number of people from his Advanced Technology group at Apple left en masse for Sun Microsystems.
What does this exodus say about the Macintosh interface? Has Apple gotten too large to be innovative? Can Apple no longer see past its laurels? Have Apple's interface litigations so indentured it to its past achievements that it can't afford to change?
The Macintosh Graphical User Interface (GUI) is not dead. Compare it to Microsoft Windows and it's clear that the legacy still works. But look at Microsoft Excel (as Interfacial Relations will do in the future) and you see that some developers are still intent on pushing the interface envelope.
Developing interfaces in the nineties is going to require a restructuring of the product development process. More trust is going to have to be placed in the vision and skill of the UI designer. More heed is going to have to be paid to the user through usability testing and a careful reconsideration of the developer/customer and the marketing/customer relationship. And we are going to have to recommit ourselves to basic interface fundamentals that we seem to have swept under the rug.
Backlash
Late last year, a flurry of articles took the GUI to task, not to criticize its approach, appearance, or implementation, but to challenge its raison d'être (its reason for being).
I suspect these criticisms came about because of frustration with a GUI that seems a bit dated. But I'm bothered by the mixed messages these criticisms leave in their wake.
What they all have in common is that they imply the foundations of the GUI are bogus. What they seem to presume is that interface design is a cut-and-dried procedure. But interface design isn't like accepting an algorithm just because it comes up with the right answer. (Code implementation isn't even such a deterministic process.) Accuracy is essential, but just as programming solutions aren't implemented only because they're accurate, interface designs take more effort than just following a style manual. Interface design can be made more efficient, but it can't be drawn and quartered.
The roots of the problem
Dedication to a cause requires belief in the principles underlying that cause. One may survive paying lip service to these foundations, but a long-term solution requires a developer committed to the underlying principles and willing to call up the extra effort required to address them.
Challenging foundations is a valid and necessary activity, but it should be done responsibly. Criticisms that depend on gratuitous assertions or faulty logic may do permanent harm, squelching some of the basic principles upon which good interface design was founded.
I contend that these examinations speak more of lazy, uncritical attitudes on the part of the critic than they speak to a particular interface's ambiguities and shortcomings. Unfortunately, because these examinations concentrate on buzzwords such as user friendly and intuitive, they tend to gain more credence than they may deserve.
And because they're in popular publications addressed to the general user, they're likely to influence the developer.
What these criticisms fail to consider is that GUI development is a developing discipline and not a plug-in procedure. To think differently is to undercut the developer and underrate the technology. It leads to software strong on similarity and stripped of personality: software's version of the emperor's new clothes.
The critics
Frank Romano says that user friendly is a computer oxymoron and implies that the prototypical rest of us is a superset of specialists; John Dvorak asserts that intuitive is nonsensical; countless others contend the paperless office is an infeasible fantasy.
How should developers respond? Should they abandon the effort required to design engaging software and concentrate on bare bones essentials or should they just provide a set of interface construction tools and let the user sort out whatever interface might come to him/her?
Unfriendly
Frank Romano's editorial, Too Many Tools, recognizes that many GUIs are still less than friendly. However, he chooses to blame this on the GUI's underlying foundation rather than on a particular program's flawed interface implementation. He begins:
The term user friendly is totally meaningless. The only thing easy to use is something that does nothing. Absolutely nothing.
He then argues that rather than simplifying tasks by integrating subtasks, computer programs that adopt a modular approach force the user to customize the software to the work environment.
The article goes on to bemoan the fact that today's tools are mind-boggling in their capability and summarizes: We have come very far in the last five years but we have a long way to go.
This editorial accurately describes the problems associated with a transition from a traditional approach to solving a problem to a technological one. It touches on issues of integration, complexity, and program responsibility. But in confusing technical achievement (the tools technology has provided) with technological advance (the ways technology might change the way we work), the article seems content to find fault with the general principle of user friendliness rather than criticize the way a particular interface has implemented a function the user needs.
Intuition
Rather than accusing the Mac GUI of being counter-intuitive, John Dvorak's diatribe in the December 1991 MacUser questions whether there is any relationship at all between intuition and computer program users.
He does this by citing various examples that are apparently meant to prove the non-intuitive nature of computers (and technology). They include:
A secretary who didn't understand why waving a mouse in the air shouldn't cause the cursor to move.
A Star Trek movie in which one of the characters mistook a mouse for a microphone.
The responses of people unfamiliar with television who, confronting a screen for the first time, wonder where the people went when they walked off its edges.
Citing these refutations says more about their author than it does about intuition. Mr. Dvorak errs when he fails to recognize that what he sees as mistakes may actually be part of the natural trial-and-error learning experience a user needs to undergo to adopt any tool. He fails to recognize that what he sees as a mistake, someone else might recognize as a valuable suggestion: a way technology might be expanded to encompass new experiences and expectations. (A gyroscopic 3D mouse incorporating sound is not a far-fetched idea.)
Glibly throwing this material out as evidence that interface design tenets are bogus fails to recognize that general principles like intuitive enable us to confront the things we encounter (like mice) with an open yet responsive mind tempered by some sense of direction.
Popular sentiment
What differentiates these editorial opinions from serious criticism is that they unfairly deprecate rather than seriously challenge. What gives them unfair advantage is that criticizing a few general principles is an easy way to affect many particular implementations of those principles as well as affect popular sentiment.
Are buzzwords ever meaningful?
User friendly and intuitive are highly charged and overused buzzwords. Nonetheless, they continue to be viable ways to compare interface designs and judge competing alternatives. The fact that their definitions are not completely airtight enhances, rather than detracts from, their relevance and viability.
Buzzwords have a bad reputation. But they're better suited to general discussions of context and intent than the strictly defined, scientifically charged terminology by which scientists attempt to evade the vagaries of context.
Take user friendly:
The draw program that displays the ongoing construction on screen is more user friendly than a package that accepts coordinate input and leaves the user guessing what the drawing is going to look like until it's been compiled and printed.
User friendly isn't meaningless. For example, though predictability and speed may be considerations when the term is used, it is more likely to be applied to a program's general responsiveness than to either its raw speed or obvious predictability.
Or intuitive:
Wielding a device that controls a pointing cursor is a more intuitive way to point to a visible object than using a device that accepts only command-line input.
An intuitive approach (like Nisus' well-designed Find dialog) may not unambiguously supply you with the right answer on the first try, but it usually holds your interest far longer than a non-intuitive yet logically correct and unambiguous approach (like grep) that requires you to understand a specialized language designed to eliminate any ambiguities that may creep into the relationship.
Description vs distinction
In a by-the-numbers bottom line world, interface descriptors like user friendly and intuitive may not be as precise as response time and mouse click count, but they are more advantageous ways of talking about interface design problems and solutions.
Different people use these terms in different ways.
Marketing refers to them to identify with a prospective buyer's inclinations and desires. Because they don't have to be numerically substantiated, almost anybody can use them as part of an argument. But having to qualify opinions with facts and figures is a relatively recent historical phenomenon that's better suited to some activities (e.g., science) than others (art, philosophy, design).
This terminology, and the principles it espouses, can be especially valuable to the developer. These words represent goals the developer can strive toward. They're the rubric through which the designer can frame the principles he/she feels are necessary for a computer program to assure the user it's a cooperative agent rather than just a dumb machine. They allow the developer to relate to the user as a human being rather than just an operator.
Developers and Buzzwords
To the developer, user friendly should be neither an obvious nor a required attribute. One designer may equate a user friendly environment with a visually attractive interface; another might consider functional engagement the user friendly prerequisite; yet another may equate animation or audio cues with user friendliness. To the person developing a game or a secured environment that attempts to foil a user, user friendly may be something to consciously avoid. And finally, to some designers, user friendly may not be a concern at all.
Like the term user friendly, the way you choose to define the term intuitive may involve stimulus-response, cause and effect, and a bit of hocus-pocus. However, it doesn't imply that a totally new intuitive experience will be instantaneously understood or immediately accommodated. It's a pliable way of gauging the ability of a program to provide enough cues and feedback to hold the interest of a user. And it's a way of talking about how, within a reasonable amount of time, your user might be able to skillfully interact with his application.
User friendly, intuitive, and a plethora of other terms represent elusive values. But they're no different from most of our moral values (compassion, honesty, etc.) and political ideals (democracy, communism). It's precisely their elusive richness that allows us as developers to use them to better relate to our audience. Terms like user friendly and intuitive, when responsibly qualified, are exemplary ways to introduce users to advanced concepts and changes.
Why are these foundations being questioned?
We've discussed what the criticisms are, why they come about in the popular media, and how they might affect developers and the industry. But why are they being levied now? What concerns are behind this foundation-bashing, and are those concerns valid?
Concern 1: is something ever nothing?
Is it true that the only [thing] easy to use is [something] that does nothing? Or is it just a glib way of saying something meaningless? (Try replacing the bracketed terms with some real thing, like tool.)
What's annoying is that there is something in the assertion that seems plausible. Like many glib statements, it conveys a first impression that's hard to shake. In a world in which we don't have the time to look at matters in depth, it's too easy to take a statement like this at face value. If you accept it, you might be willing to submit to a similarly meaningless statement such as: the best interface is no interface at all.
So what, you say, I'm bright enough to see past the patter. Well, it's more than just a matter of rhetoric. It plays into the notion that interface is an unfortunate intermediary, a necessary evil, so to speak, between a-program-and-its-data and the-user-and-his/her-task. It promulgates the misconception that the way to design software is to disregard interface. Why? Because the only thing a user is willing to admit is easy is no-thing.
Concern 2: computer alienation?
Do critics who bemoan the terms user friendly and intuitive expect a computer interface to miraculously transform a neophyte into an expert capable of turning out relevant, quality work with a modicum of effort, or are they confusing computer user with expert?
In matters of technique as well as skill, the print industry demands difficult judgment calls. So Mr. Romano is right when he contends that even with (today's) computerized assistance, generating color separations remains a complex activity.
But the complexity that affects this and most industries can never be completely addressed by a computer. Even though the technicalities might be unambiguously (or at least adequately) defined, described, and assigned to a computer, the resulting program will not be able to completely address issues of relevancy.
One difference between the technician and the expert is that the technician can rely on past experience and knowledge to arrive at a quality decision, whereas the expert must add to these considerations a skill at balancing current contextual relevancies (currently appreciated norms, customer color preferences, etc.) to assure that the decision is relevant as well as qualitatively accurate. Typically, a computer is better equipped to address technical problems.
A computer may eventually supplant the technician. It may provide the non-expert with technical skills and the opportunity to accumulate, over a protracted period, the experience necessary to become an expert. But it cannot assume the role of expert itself or miraculously transform an idiot into an expert. On the other hand, it might be argued that the expert cannot remain an expert for long without a computer. Because the world of information is moving at breakneck speed, an expert without a computer can quickly lose his edge and, therefore, his expertise.
The legal profession is a case in point. Why? Because, among other things, an attorney can be sued for supplying an inadequate defense because he didn't use information pertinent to the case. Though law is still a hardcopy-dominated profession, the lack of a computer is rapidly making it impossible to be a responsible litigator.
What's Metaphor's role?
There are many possible reasons why people are bashing the principles rather than the products. Industry frustration and vested interests may be partially behind the critical diatribes. But there are technical reasons as well. The more obvious include the continued difficulty of convincing management that interface is as important as functionality; the difficulty of gauging and dealing with the human factors that need to be addressed by a responsive interface; and the time required to adapt a unique interface solution to a multi-platform environment.
However, I feel that an inordinate dependence on metaphor largely accounts for our inability to address these user concerns.
Few people consider interface design a done deal. However, many subscribe to the school of thought that says interface design is a two-step conformity task. The first step involves standards conformance; the second, metaphor identification, description, and conformance.
Step one is to design in conformance with menu, widget, dialog box, and other standards. Presumably, if you have satisfied the standard conditions of one platform, you'll be able to plug your design into a program that will automatically translate these design decisions into equally serviceable solutions addressing the requirements of the other platforms.
The other step is to design icons, terminology, and so on according to constraints determined by the metaphor you are trying to emulate. Thus, if you're designing for an artist, you pick up an art-supply catalog, learn the lingo, copy the pictures, and paste them onto the interface.
If you believe this approach adequately defines the user interface design process, then it's easy to dismiss complaints that a program is non-intuitive or less than user friendly as irrelevant. You've done your task as an interface designer, and you can shift the blame for any user problems to the users because they don't adequately understand the platform standards or haven't adequately acquainted themselves with the program's underlying metaphor.
The Interfacial Relations series has not had kind words for metaphor. It sees design-by-metaphor (the way it has typically been understood by the computer industry) as akin to designing a book by its cover. A metaphorically designed GUI may attract neophytes, but eventually it can severely confine the developer who is trying to improve upon, rather than merely replicate, the real world.
Metaphor reconsidered
Does the professional, the person with extended experience in the industry who is at home with computers, find a metaphorical interface as engaging as did the computer neophyte Apple initially set out to attract with its desktop? Does the computer sophisticate? Why participate in an interface that mimics the real world when the feedback that interface provides can't have the richness of physicality the real world can supply, and often carries a lot of extra baggage particular to the computer environment?
And there's another reason why metaphor is inadequate. When the GUI was being invented, computer technology was trying to catch up with the real world. Now the real world is struggling to keep pace with computers. Computer control changes the workplace; it places greater demands on the professional. It is rapidly replacing the metaphor.
Metaphor is important, but it isn't the end-all. What we want out of an interface is a workflow environment that's a characterization rather than a charade.
Contemporary metaphor
Tog Tognazzini's monthly Apple Direct column, Human Interface, represented the Apple myth at its best. Tog seemed the bastion of convention, the person you would ask to deliver a speech on the merits of orthodoxy and standardization. Searching for new interface solutions but conscious that interface guidelines must be maintained, Human Interface confidently walked a fine line between the proven and the possible.
Case Study: One or More Buttons described a success story that gave the reader the impression that Apple is an endless reservoir of lucid ideas that are meticulously researched, responsibly tested, and, with an AHA! insight or two, gracefully resolved. But what impressed me most about this article was how it identified contemporary metaphor.
Case Study's problem was to design something like a cross between radio buttons and check boxes, allowing users to select as many options as they want but always keeping at least one selected. Tog stated the problem, outlined tentative solutions, prototyped, tested, and built on the results.
His development environment seemed ideal: everyone had interesting suggestions, and the project moved along at a productive clip. Within fifteen hours, a convincing solution had been realized.
But the climax was what caught my attention. The AHA! factor the project turned on was (you've guessed it!) a metaphorical point. But the metaphorical reference took a decidedly different tack than the way we commonly perceive metaphor.
Rather than attempting to configure and assign literal tool functions to computer tools so that the computer would appear to directly emulate its real-world referent, Tog employed metaphor not as an end in itself but as a learning tool, a stage in the development process.
Searching for a check mark that would travel rather than simply appear or disappear, Tog associated the action he wanted with the behavior of a bead of mercury when pressed. The mercuric metaphor was useful not because it provided a literal tool but because it enabled him to visualize, understand, and explain a problem in an alternative way.
He used metaphor to solve an abstract problem, not to emulate a real-world thing. This is an example of metaphor at its best.
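To make the underlying behavior concrete, here is a minimal sketch in C of the constraint the case study describes: a group of options that toggle like check boxes, except that the last remaining selection cannot be cleared. It is purely illustrative; the names are hypothetical and this is not the code from Tog's prototype, which concerned itself with the control's appearance as much as its logic.

#include <stdbool.h>
#include <stdio.h>

#define kOptionCount 4

/* Current state of the option group; option 0 starts out selected. */
static bool gSelected[kOptionCount] = { true, false, false, false };

/* Count how many options are currently on. */
static int CountSelected(void)
{
    int count = 0;
    for (int i = 0; i < kOptionCount; i++)
        if (gSelected[i])
            count++;
    return count;
}

/* Toggle an option, refusing to clear the last remaining selection. */
static void ToggleOption(int index)
{
    if (gSelected[index] && CountSelected() == 1)
        return;                         /* keep at least one selected */
    gSelected[index] = !gSelected[index];
}

int main(void)
{
    ToggleOption(1);    /* options 0 and 1 are now on            */
    ToggleOption(0);    /* allowed: option 1 is still on         */
    ToggleOption(1);    /* refused: option 1 is the last one on  */

    for (int i = 0; i < kOptionCount; i++)
        printf("option %d: %s\n", i, gSelected[i] ? "on" : "off");
    return 0;
}

The logic is trivial; the hard part, as the column makes clear, was finding a presentation that tells the user why the last check refuses to go away, which is where the mercury metaphor earned its keep.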
The real problem
Phil LoPiccolo's Computer Graphics World editorial, Wimpy Interfaces, succinctly states the real problem user interface design faces:
...the next generation of user interfaces...remain distant goals. The evolution of interface technology has all but stalled. Instead of concentrating on the implementation of new interface models - which may include new metaphors as well as voice input and other emerging technologies - for the most part vendors have chosen to copy the familiar desktop, or WIMP (windows, icons, mouse, and pointing) metaphor first introduced by XEROX Palo Alto Research Center (PARC) in 1973 and popularized by the Macintosh in 1984.
Why these strong words?
He continues: Ironically, despite such rallying around this standard style of computing, the penetration of computers into most professions is still minuscule. Indeed, at a recent closed-door industry roundtable...the item identified as the main obstacle to greater acceptance was that computers continue to be far too difficult to use for even the most technically sophisticated professionals.
Conclusion
Every active development effort should encourage and sustain constructive criticism. It's a sign of frustration and stagnation when critics choose to invalidate developer intentions rather than study their implementations, or suggest alternative goals and aspirations.
Our goals need to continue to inspire and challenge. The fact that user friendly, paperless office, intuitive, and WYSIWYG continue to dangle slightly beyond our reach should serve as a challenge rather than a reason to deem them unattainable or irrelevant.