
The linguistics encyclopedia

The other programs of this period did little more syntactic processing, but did at least do some work on the patterns that they extracted. A reasonable example is Bobrow’s (1968) program for solving algebra problems like the following:

If the number of customers Tom gets is twice the square of 20 per cent of the number of advertisements he runs, and the number of advertisements he runs is 45, what is the number of customers Tom gets?

This appears to be in English, albeit rather stilted English. Bobrow’s program processed it by doing simple pattern matching to get it into a form which was suitable for his equation-solving program. It is hard to say whether what Bobrow was doing was really language processing, or whether his achievement was more in the field of equation solving. It is clear that his program would have made no progress whatsoever with the following problem:

If the number of customers Tom gets is twice the square of 20 per cent of the number of advertisements he runs, and he runs 45 advertisements, how many customers does he get?
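The brittleness described here can be illustrated with a small sketch. The code below is a hypothetical reconstruction in the spirit of such template-based programs, not Bobrow's actual code: a rigid regular-expression template matches the first, stilted phrasing, extracts the numbers, and hands them to arithmetic, but fails entirely on the reworded version.

```python
import re

def solve(problem: str):
    """Rigid template matcher in the spirit of early pattern-matching
    programs (hypothetical sketch). Returns the number of customers,
    or None when no template matches."""
    # Template tied to one exact phrasing: "twice the square of N per cent
    # of the number of advertisements ... the number of advertisements
    # he runs is M".
    m = re.search(
        r"twice the square of (\d+) per cent of the number of advertisements"
        r".*the number of advertisements he runs is (\d+)",
        problem, re.DOTALL)
    if m:
        pct, ads = int(m.group(1)), int(m.group(2))
        return 2 * (pct / 100 * ads) ** 2
    return None  # any rewording falls through: no template fits

stilted = ("If the number of customers Tom gets is twice the square of "
           "20 per cent of the number of advertisements he runs, and the "
           "number of advertisements he runs is 45, what is the number of "
           "customers Tom gets?")
reworded = ("If the number of customers Tom gets is twice the square of "
            "20 per cent of the number of advertisements he runs, and he "
            "runs 45 advertisements, how many customers does he get?")

print(solve(stilted))   # 162.0
print(solve(reworded))  # None: the template does not recognize the rewording
```

The two calls make the point of the passage concrete: the arithmetic is trivial once the numbers are extracted, so the real difficulty lies in recognizing that the two problems say the same thing.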

The other pattern-matching programs of this time were equally frail in the face of the real complexity of natural language. It seems fair to say that the main progress made by these programs was in inference, not in language processing. The main lesson for language processing was that pattern matching was not enough: what was needed was proper linguistic theory.

LINGUISTIC THEORY

The apparent failure of the early work made AI researchers realize that they needed a more adequate theory of language. As is far too often the case with AI work, there was already a substantial body of research on the required properties of language which had been ignored in the initial enthusiasm for writing programs. Towards the end of the 1960s, people actually went away and read the existing linguistic literature to find out what was known and what was believed, and what they might learn for their next generation of programs. Simultaneously, it was realized that NLP systems would need to draw on substantial amounts of general knowledge about the world in order to determine the meanings in context of words, phrases, and even entire discourses. Work in the late 1960s and early 1970s concentrated on finding computationally tractable versions of existing theories of grammar, and on developing schemes of meaning representation. These latter are required both to enable the integration of the specifically linguistic part of an NLP system with the sort of knowledge required for disambiguation and interpretation in context, and to link the NLP system to some other program which had information a user might want to access.

SYNTACTIC THEORY
