Written by David Tebbutt, MacUser Oct 1988 (guess) - scanned

In October 1950, a man called Alan Turing published a paper in which he described a kind of litmus test for machine intelligence. An interrogator is seated at a teletype machine which is connected to, but separate from, the person or machine under assessment. If the interrogator cannot tell whether the answers to his or her questions are coming from a machine or a human, then the machine can be said to think.

People challenged him and said that even if the computer produced humanlike answers, that still didn't mean it was 'thinking' in the way that we know it. Turing replied that we don't really know that humans are thinking, we only have the signals we receive from them by which to judge.

"So what's all this got to do with Apple?", I hear you ask. Well, quite a lot actually. You see, you could apply a similar test to establish whether the computer you are using is a Macintosh. If you can't tell, then the question has to be asked "why pay more for the Macintosh when you can get equivalent functionality more cheaply?" After all, you will soon have all sorts of systems to choose from - Unix with Metaphor, PC-DOS with Windows, OS/2 with Presentation Manager. They're all trying to mimic the Macintosh. And does it matter that the implementation is the result of an afterthought rather than forethought?

The average end user doesn't understand, and probably cares even less, why the computer behaves the way it does. We find ourselves back with the Turing Test, which is simply not interested in the inner mechanisms, only in what is directly observable. The average user, faced with a number of machines which appear similar, is not going to worry about the underpinning philosophy. He or she will go for a suitable combination of cost, capability and confidence in the manufacturer and its supporting organisations.

I mention all this because Apple has just announced its future plans for Macintosh system software. As well as explaining the technological nuts and bolts of its System Software 7.0, Apple has also taken a great deal of trouble to explain OASIS, which stands for 'Open Architecture System Integration Strategy'. Apple describes this strategy as its way of extending Macintosh advantages into new products and multi-vendor environments. I believe that it's more than this. I think it's Apple's way of trying to persuade us that what's inside is important, and of distancing the Macintosh from its competitors.

The OASIS literature is part philosophy and part product. The philosophy talks of the consistent look and feel of all applications and of the ability to exchange textual and graphical information between them. It talks of what a user really does - he or she has to encode desires and decode results. The tighter, easier and more natural this loop, the richer the user's experience.

OASIS suggests that the Macintosh was designed to conform to five basic principles: intuitiveness, consistency, configurability, extensibility and integration. It strikes me that history has been re-written more than a little in the claims for extensibility and integration, especially with regard to the outside world. Go back to the very first Macintosh and you will find a closed machine, incapable of expansion and not designed for the sort of real world integration which is so vital today.

You could upgrade the early Macintosh to the 512K, but this involved replacing the motherboard - hardly natural extensibility. I could go on, but I think the point has been made. Today, Apple's philosophy has changed. It has no choice but to admit that it won't take over the world, but it could pick up a useful slice by being more open.

Apple tries to remove the need for abstract or structured thought from your dealings with the machine. It tries to make the interaction loop (encode your messages, decode the Mac's) consistent with the real world. And, by giving users control over the way the computer looks and works (fonts, layout of files and folders, desk accessories etc), Apple believes that they feel as if they run the computer, and not the other way round.

All this is good stuff and very seductive to the user. So much so that these are exactly the aspects of the Macintosh that the other manufacturers are trying to hijack. They have few qualms about this because they know that Apple was inspired by the work done at Xerox in the mid-seventies. The truth is that Apple and its third party developers have probably put thousands of man-years into making the Xerox dream a commercial reality.

The Macintosh is a machine that was designed for the user. The philosophical underpinnings and technical implementation should persuade doubters of this fact. Most other computer manufacturers (and I think Hewlett-Packard deserves a mention as an exception) seem to tack usability on as an afterthought. If the end result gives no perceptibly different computing experience, I return to the original thesis, which is 'why should anyone pay extra for the Apple version?' That's a tough one to answer, especially in commercial terms. But then we make similar judgements about humans. At different times we like shallow, entertaining people and, at others, we prefer the solid, more dependable type.

So it is with the Macintosh. Apple did the thinking for the machine and created a set of values for it. Thousands of developers then made millions of tiny decisions in line with this philosophy. The end result is a computing system with such depth and consistency that it could almost have a soul.

Unfortunately for Apple, I can't see many companies putting 'possesses a soul' on their list of preferred computer attributes.