1.0. The Goal of this Class

The main goal of this class is to enable students to understand the basic conceptual problems faced by studies of linguistics, artificial intelligence, natural intelligence, cybernetics, robotics, and animal communication without forcing anyone to become deeply involved in technical problems that are sure to become obsolete with the advance of technology.

Basic conceptual problems include: What is a human language? What is meant by English, Chinese, and so on? What does it mean to understand English? What is a grammar of English? How does a human being know that a certain sound evokes a specific meaning? What constitutes knowledge of language? How is knowledge of language acquired? How is it put to use? Is human language different in principle from animal communication systems? Why is it so difficult to program a computing machine to understand English or to translate between human languages? Why did Konrad Lorenz win a Nobel Prize for analyzing animal (duck/goose) languages given that he never indicated what the animals were saying (or talking about)?

Technical problems, most of which will be sidestepped by using computers only to illustrate conceptual issues, include: How can we represent the processes and data structures found in human language on a computer? All canned programs used in the course run in Prolog or Lisp on an IBM PC, Mac, or Unix machine and can be found in Dougherty, 1994, Natural Language Computing, or at various WWW sites.

1.1. The Focus of this Class

To maintain a sharp focus on current issues in studies of the intelligence of humans, animals, and machines, we will concentrate on Noam Chomsky's ideas about the interrelation of language and mind as presented in his book, 1968, Language and Mind, New York: Harcourt Brace, and in numerous readings located in the library. Some of the readings can be found on the WWW: http://www.mit.edu/chomsky.html

In particular, we dwell on Chomsky's idea that the study of the acquisition, knowledge, and use of a language like English, Chinese, Walpiri, etc. provides the best insight into human mental processes. We will ask: What types of (a) computational abilities, (b) memory storage capacities, and (c) pattern recognition skills underlie normal human language use? We will analyze Chomsky's idea that a human language (English, Japanese, etc.) is a genetically based, species-specific, maturationally emergent aspect of human cognition that is largely preprogrammed in a human being at birth and which acquires its form via triggering experiences provided to the child by the environment. We will discuss the differences between first language acquisition (child learning), second language acquisition (adult learning), and bilingual acquisition.

The classroom discussion and readings emphasize the basic conceptual issues underlying the theories of intelligence, language, communication, and learning of Noam Chomsky. We will use the New York University computers (IBM, Mac, and Unix) to illustrate ideas discussed in class. Students are not required to write programs, although one may write a program as a final project. There will be no questions on any exams about details of computer programming. Most of the computational work in this class, all of which is optional, consists of running already written programs that illustrate some of the more puzzling and complicated aspects of human language structures.

We will concentrate on Chomsky's idea that human intelligence is qualitatively unlike any other intelligence so far encountered in the animal world. We will have some readings that focus on the pros and cons of this view. We will discuss Descartes's analysis of animal passions, Darwin's investigation of blushing in animals, Frisch's analysis of bee languages, and Lorenz's studies of the communication systems of ducks, geese, wolves, and sundry other animals. We discuss species-specific capacities, innate knowledge and belief systems, learning theory, neoteny, and maturationally emergent behavior.

Chomsky, in works since 1957, has offered technically complicated - but conceptually very straightforward - concepts of grammar, language, learning, and intelligence. We will focus on defining the conceptual issues. We ask: What does Chomsky mean by intelligence, knowledge, and belief? What does Chomsky think we ought to do to investigate questions of intelligence? Why does he think we should investigate intelligence by examining how human beings construct, store, and manipulate symbols - in particular the symbols (sounds, words, phrases, etc.) of human language?

1.2. The Broader Context of the Course

To enable a student to see our study in perspective, we will discuss materials in Nicholas Negroponte's 1996 book: Being Digital, New York: Random House. http://www.randomhouse.com/knopf/digital.html Negroponte is the director of the MIT Media Laboratory. His book draws upon his experience and projects his ideas about the future of the information age, computers, intelligence studies, and so on. There are no readings assigned in this book. All students are assumed to have read it by the first exam.

In brief, the course title is Communication: Humans, Minds, and Machines, and the course breaks down into these three rubrics:

Humans

Chomsky offers a theoretical view of human language that claims the content and structure of a language like English, Japanese, French, and so on can be described by a geometric-logical model, which he calls universal grammar. Chomsky's notations and diagrams are complicated and can be tedious to draw.

Dougherty 1994 shows how Chomsky's models of language and grammar can be implemented on a computing machine using programs encoded in the computer language Prolog, short for programming in logic. The Prolog programs assign the grammatical notations (identify the parts of speech in a sentence) and draw the diagrams automatically.
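
To give the flavor of what such a program looks like, here is a minimal sketch of a Prolog definite clause grammar. It is not Dougherty's actual code; the rules, the toy lexicon, and the tree labels are illustrative assumptions. Given a string of words, the grammar both decides whether the string is a sentence and builds a labeled diagram (a parse tree) of its parts of speech.

    % Toy grammar rules: each rule builds the piece of tree it recognizes.
    sentence(s(NP,VP))    --> noun_phrase(NP), verb_phrase(VP).
    noun_phrase(np(D,N))  --> determiner(D), noun(N).
    verb_phrase(vp(V,NP)) --> verb(V), noun_phrase(NP).

    % Toy lexicon: words paired with their parts of speech.
    determiner(det(the)) --> [the].
    noun(n(child))       --> [child].
    noun(n(language))    --> [language].
    verb(v(acquires))    --> [acquires].

    % ?- phrase(sentence(T), [the,child,acquires,the,language]).
    % T = s(np(det(the),n(child)),vp(v(acquires),np(det(the),n(language))))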

Minds

Lorenz and Tinbergen discuss the concepts involved in animal intelligence, animal communication, and animal behavior, where animal usually means birds, wolves, and insects. Lorenz, Darwin, and Descartes discuss the evolution of mental capacities, emotions, etc.

Machines

Negroponte offers a broad perspective on the past, present, and future of intelligent machines. Negroponte, and the works he cites, indicate why one would want to construct intelligent machines that can interact in human languages, translate among them, and answer questions in sentences.

Three major areas of research, cognitive psychology, ethology, and artificial intelligence, which were pursued independently before the 1940s, have today come together in such a way as to yield fruitful interactions. We will focus on those abstractions, idealizations, and principles that over the past fifty years have brought these areas together. Let us represent these three disciplines in these terms:

Cognitive psychology was characterized by Noam Chomsky in 1968 as a 'science of psychology of a sort that still does not exist, a psychology that begins with the problems of characterizing various systems of human knowledge and belief, the concepts in terms of which they are organized and the principles that underlie them, and that only then turns to the study of how these systems might have developed through some combination of innate structure and organism-environment interaction.' (Chomsky 1968: p. 7)

Ethology was defined by Konrad Lorenz, who with Niko Tinbergen and Karl von Frisch shared the Nobel Prize in 1973 for their work on animal behavior, in these terms: "Ethology, the comparative study of behavior, is easy to define: It is the discipline which applies to the behavior of animals and humans all those questions asked and those methodologies used as a matter of course in all the other branches of biology since Charles Darwin's time." (Lorenz 1981: p. 1)

Artificial Intelligence seems to have as many definitions as there are authors in the field. We will focus on the linguistic aspects of artificial intelligence and consider various computer systems which process natural languages, e.g., English, German, Spanish, etc. We will focus on machines that attempt (1) to read and write grammatical sentences, translate from English to German, etc.; (2) to play chess, tic-tac-toe, and dominoes; and (3) to learn languages, games, and aspects of movement in space. We will read works by Alan Turing, Claude Shannon, Norbert Wiener, and Joseph Weizenbaum.

The lectures and readings are organized to suggest that there are three reasons these fields are converging.
(1) Norbert Wiener's work in cybernetics, Claude Shannon's information theory, and Alan Turing's study of codes and the Turing machine led to a formal algebraic way of discussing language, where language includes all forms of human, animal, and machine communication. As formal language theory developed, it became possible to see that, from a communication perspective, there were parallels among the three disciplines. Researchers were studying the same thing from different perspectives: belief, knowledge, wisdom, intelligence, meaning, pragmatics, etc.
(2) Autonomous syntax, a notion discussed in linguistics by Zellig Harris, Noam Chomsky, and others, claims that it is possible to study the structure and form of sentences without knowing the (precise) meaning of any sentence. Just as a jigsaw puzzle has two types of structure, a language does too. One could imagine a jigsaw puzzle with no picture on it: it could still be assembled owing to the structure given it by the shapes of its pieces. Likewise, one could imagine a language as having a structure (syntax) that is independent of meanings (semantics); see the Prolog sketch following this list. This idea becomes particularly useful in studying structured communication systems where we do not, and perhaps cannot, know the meanings involved, e.g., the communication systems of birds, bees, ducks, dogs, and other animals.
(3) The MIT Research Laboratory of Electronics, started and developed during the Second World War, brought together researchers in diverse disciplines. In particular, it placed electrical engineers (Shannon), mathematicians (Wiener), code crackers (Turing), linguists (Chomsky), and computer scientists (Weizenbaum) together under one roof, along with an assortment of biologists, geneticists, psychologists, and general hackers.
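
The jigsaw analogy in (2) can be made concrete in a few lines of Prolog. The toy grammar below is a hypothetical illustration, not taken from the readings; it accepts Chomsky's famous example Colorless green ideas sleep furiously purely on the shapes of its pieces, without consulting any meanings.

    % Syntax without semantics: the rules mention categories, never meanings.
    s   --> np, vp.
    np  --> adj, np.            % adjectives stack up before the noun
    np  --> [ideas].
    adj --> [colorless].
    adj --> [green].
    vp  --> [sleep], [furiously].

    % ?- phrase(s, [colorless,green,ideas,sleep,furiously]).
    % true: syntactically well formed, though semantically nonsensical.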

1.3. Noam Chomsky and Charles Sanders Peirce

In a nutshell, most of the problems we will discuss are of the same general form, a form discussed at great length and in numerous books by Noam Chomsky.

In the combinatorial problems, which characterize most of our human language examples, we are given:
(1) a huge number of little elements such as morphemes (pre-, post-, re-, anti-, de-, un-...), or words (nouns, verbs, adjectives, etc.), and
(2) a complicated, idiosyncratic, exception-laden set of principles of combination that tell us how the little elements can combine to make bigger elements.

For instance, how can we combine the morphemes (prefixes, suffixes, affixes, infixes, and so on) and the stems to make the larger elements called words: anti+dis+establ+ish+ment+ar+ian+ism, un+do+able, good+ness, act+ion, and so on? There are lots of incorrect possibilities: *good+ity, *act+ness, and so on. The dictionary (called a lexicon by linguists) has to label morphemes, roots, stems, and words as being derived from Latin, French, German, Teutonic, and so on. When words are combined to make compound words, sometimes there are ambiguities, as in American History Teacher, which can mean either a history teacher who is American or a teacher of American history. The compounds postman, garbage man, and gingerbread man differ considerably in interpretation: the first brings the post, the second removes the garbage, and the third is made out of gingerbread.
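
To see what it means to encode such restrictions, consider the following Prolog sketch. The predicates are hypothetical and real morphology is far messier, but the idea is that the lexicon must record which suffix each stem tolerates, so that good+ness succeeds while *good+ity fails.

    % Each stem is listed with the nominalizing suffix it accepts.
    takes_suffix(good, ness).
    takes_suffix(act,  ion).
    takes_suffix(pure, ity).

    % A word is well formed if some listed stem+suffix pair spells it out.
    word(Word) :- takes_suffix(Stem, Suffix), atom_concat(Stem, Suffix, Word).

    % ?- word(goodness).  true            ?- word(action).   true
    % ?- word(goodity).   false           ?- word(actness).  false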

At a higher level of linguistic analysis, words can combine to make sentences. The group of processes by which words combine to form phrases, clauses, and sentences is called syntax. Syntax will be a main focus of this course. We will discuss simple sentences, compound sentences, embedded sentences, questions, indicatives, imperatives, conditionals, irrealis, passives, and so on.

In any event, the majority of the examples we will discuss in this class involve combining elements found in a dictionary (lexicon) to form words, phrases, clauses, sentences, paragraphs, and discourses. In this age of computers it would be ridiculous to do this with pencil and paper, for several reasons. First, there are huge compilations of words (dictionaries, thesauruses, bilingual dictionaries, word-frequency lists, etc.) available at no cost on computers. Second, as we shall see, the number of sentences in English of 20 words or fewer is greater than the number of seconds in the history of the universe. And 20-word sentences are not rare. Using the on-line versions of the New York Times, Wall Street Journal, and so on, we can count the number of 20-word sentences that appeared in 1994 or 1995.
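
The arithmetic behind that claim is easy to check in Prolog, which handles arbitrarily large integers. The figures below are illustrative assumptions (a 10,000-word vocabulary and a universe roughly 15 billion years old), not measurements:

    % 10,000 choices at each of 20 word positions gives 10^80 candidate
    % strings; even if only a tiny fraction are grammatical, the total
    % swamps the roughly 4.7 * 10^17 seconds in 15 billion years.
    ?- Strings is 10000^20,
       Seconds is 15 * 10^9 * 365 * 24 * 3600,
       Strings > Seconds.
    % Strings is a 81-digit number, Seconds = 473040000000000000, so: true.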

Charles Sanders Peirce (1839-1914) developed a philosophy of science that will form the basis of our presentation. We will discuss his views about abduction, deduction, and induction, and show how Chomsky has incorporated many of Peirce's methodological ideas into his view of grammar. Peirce's contributions are mainly in his studies of methodology, that is, the analysis of the argument structure by which two or more theories can be compared with each other and with the data in order to choose the "best theory" among the alternatives.

1.4. Use of Computers in the Class

All examples of encoding language on a computer to be discussed in class are taken from: Dougherty, 1994, Natural Language Computing: An English Grammar in Prolog, Hillsdale, N.J.: Lawrence Erlbaum, http://www.nyu.edu/pages/linguistics/anicbk.html. More advanced students, or those with programming background, will be encouraged to visit sites containing more complex information, for instance the site: http://www.nyu.edu/pages/parsers.html. For a general introduction to the basic problems to be discussed, see http://www.nyu.edu/pages/ling.html.

Students in the class are assumed to know nothing about Chomsky's theories and nothing about Prolog or Lisp, the two computer languages in which all examples are encoded. Students will learn the basics of programming in Prolog and, to a lesser extent, Lisp. The computer programs presented are already written and can be executed and modified by students in the class in order to study the combinatorial properties of the basic elements of human language. As one will soon see, it is very difficult to discover the principles that underlie the distribution of simple elements like more, -er, and than in examples like these (assume * indicates an ungrammatical combination): This is more interesting (*interestinger) than that. The door is taller (*more tall) than the window. The door is more tall (*taller) than wide. Some words do not permit more/-er: This is (*more) superior than that.

The logical principles that underlie simple contractions are not simple either. One can say: I have not seen it, which contracts to: I've not seen it or I haven't seen it, but not to: *I'ven't seen it. We can contract He is not tall to He isn't tall, and You are not tall to You aren't tall. But we cannot contract I am not tall to *I amn't tall. This absent amn't, short for am not, is the source of ain't in English. The distribution of ain't is not a simple matter to describe. Ain't can stand for am not, are not, and is not, but also for have not and has not: She has not arrived yet, She ain't arrived yet; Haven't you read it yet? Ain't you read it yet? But ain't can only be the auxiliary have, not the have of possession: I haven't a horse. *I ain't a horse.
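
As a first pass at the more/-er facts above, one might try the rule of thumb that one-syllable adjectives take -er and longer adjectives take more. The Prolog sketch below encodes exactly that rule of thumb. It is a hypothetical first approximation: it ignores spelling adjustments (wide/wider), and, as the more tall than wide example shows, the real distribution depends on the syntactic context, not just on the adjective.

    % Rule of thumb only: one syllable takes -er, longer adjectives take
    % "more", and a few adjectives, like superior, take neither.
    syllables(tall, 1).
    syllables(interesting, 4).
    no_comparative(superior).

    comparative(Adj, _)    :- no_comparative(Adj), !, fail.
    comparative(Adj, Comp) :- syllables(Adj, 1), !, atom_concat(Adj, er, Comp).
    comparative(Adj, Comp) :- atom_concat('more ', Adj, Comp).

    % ?- comparative(tall, C).         C = taller
    % ?- comparative(interesting, C).  C = 'more interesting'
    % ?- comparative(superior, C).     false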

H(yper) T(ext) M(arkup) L(anguage) Lunchtime Discussion Group

Check out: http://www.nyu.edu/pages/linguistics/htmlclub.html

There is an informal discussion group that meets on Tuesdays from 12:15 until 1:10 for lunch. The main focus of the lunch is to bring together people who want to discuss their successes and failures, happiness and grief, in dealing with the internet, HTML code, browsers, interesting sites, and so on. All are welcome: the experts, the amateurs, the enthusiasts, the disgruntled, the confused, the bewildered, and the curious.