We seem eager to trust in the appearance of things. All around us are machines, people, brands, companies, signs and symbols that say ‘Trust me by the way I appear’, and we attribute to them qualities derived solely from their outward appearance. What does a policeman’s uniform say about the policeman? Nothing, except that I must trust that he has authority. The banker, the chef, the lawyer and the stockbroker take great pains to identify themselves by their attire. Brands spend hundreds of millions of dollars creating and sustaining an image. We wear certain kinds of clothing to show that we belong to a particular group or community. We buy nice things for our apartments and paint our houses so that we do not appear poor or unkempt.
How does trust show up in everyday life and culture? Recall a story or experience from your life or culture that speaks to perceptions of trust.
Machines and Trust
In 1966, at the MIT Artificial Intelligence Laboratory, Joseph Weizenbaum published a paper entitled “ELIZA—A Computer Program for the Study of Natural Language Communication Between Man and Machine”.[1] ELIZA simulated conversation using a pattern-matching technique that gave users the illusion of understanding on the part of a program that had none. Weizenbaum designed ELIZA to show the superficiality of communication between man and machine. He ended up proving the opposite, and was surprised that many people attributed human-like feelings to the program and told it personal secrets.[2]
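The pattern-matching technique can be sketched in a few lines. The rules and word swaps below are invented for illustration and are not Weizenbaum's original script; the point is how little machinery is needed to produce the illusion of understanding.

```python
import re

# Illustrative ELIZA-style rules (not Weizenbaum's originals): each rule
# pairs a regular expression with a response template that echoes the
# user's own words back at them.
RULES = [
    (re.compile(r"i need (.*)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.IGNORECASE), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

# First-person words are flipped so the echo sounds like comprehension,
# e.g. "my job" becomes "your job".
SWAPS = {"my": "your", "i": "you", "me": "you", "am": "are"}

def reflect(fragment: str) -> str:
    """Swap pronouns in the captured fragment before echoing it."""
    return " ".join(SWAPS.get(word, word) for word in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a stock fallback."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT

print(respond("I need a holiday"))       # Why do you need a holiday?
print(respond("I am feeling trapped"))   # How long have you been feeling trapped?
print(respond("Hello there"))            # Please go on.
```

There is no model of meaning anywhere in this loop: the program only captures a substring, flips a few pronouns, and slots the result into a canned template, which is exactly the superficiality Weizenbaum set out to expose.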
The ELIZA effect refers to “the susceptibility of people to read far more understanding than is warranted into strings of symbols—especially words—strung together by computers”.[2] An example of the ELIZA effect, given by Douglas Hofstadter, is an ATM that displays the words “THANK YOU” at the end of a transaction. Users tend to infer that the machine is actually expressing gratitude, when it is only printing a preprogrammed string of symbols.[3]
In an October 2000 article for the Communications of the ACM, Edmund M. A. Ronald and Moshe Sipper asked a perplexing question: “What use is a Turing Chatterbox?”[4]
The intrinsic nature of a Turing Chatterbox is to play the imitation game. A game of inherent deceit, the imitation game invites the machine, be it intelligent or otherwise, to simulate human conversation so that it is indistinguishable from a human being.
Do ELIZA and the Turing test then imply that it is possible, even easy, to manufacture or simulate trust between a human and a computer? Does this simply point to the ability of a machine to beguile a human being, or does it reveal that the nature of human beings is to be so readily charmed by a creature they choose to forget runs on 1s and 0s?
That is indeed a facet of human nature – to be charmed, to be beguiled, to be put under a spell.
1. Weizenbaum, J. (1966). ELIZA—A computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1).
2. Billings, L. (2007). Rise of Roboethics. Seed.
3. Hofstadter, D. R. (1996). The Ineradicable Eliza Effect and Its Dangers. In Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought. Basic Books.
4. Ronald, E. and Sipper, M. (2000). What use is a Turing chatterbox? Communications of the ACM, 43(1).