Parsing is the process of structuring a linear representation in accordance with a given
grammar. This definition has been kept abstract on purpose, to allow as wide an
interpretation as possible. The “linear representation” may be a sentence, a computer
program, a knitting pattern, a sequence of geological strata, a piece of music, actions in
ritual behaviour, in short any linear sequence in which the preceding elements in some
way restrict the next element. For some of the examples the grammar is well-known,
for some it is an object of research and for some our notion of a grammar is only just
beginning to take shape.
For each grammar, there are in general infinitely many linear representations
(“sentences”) that can be structured with it. That is, a finite-size grammar can supply
structure to an infinite number of sentences. This is the main strength of the grammar
paradigm and indeed the main source of the importance of grammars: they summarize
succinctly the structure of an infinite number of objects of a certain class.
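This point can be illustrated with a small sketch. The grammar encoding and function names below are hypothetical, chosen only for illustration: a context-free grammar with a single recursive rule, S → a S | a, takes one line to write down, yet describes the unbounded language a, aa, aaa, …

```python
# Toy context-free grammar (hypothetical encoding): each nonterminal maps
# to a list of alternatives; an alternative is a sequence of symbols.
GRAMMAR = {
    "S": [["a", "S"], ["a"]],   # S -> a S | a
}

def derive(symbols, depth):
    """Yield all terminal strings derivable from `symbols`
    using at most `depth` rule applications."""
    if depth < 0:
        return
    # Rewrite the first nonterminal, if any.
    for i, sym in enumerate(symbols):
        if sym in GRAMMAR:
            for alt in GRAMMAR[sym]:
                yield from derive(symbols[:i] + alt + symbols[i + 1:], depth - 1)
            return
    yield "".join(symbols)  # all symbols are terminals: a complete sentence

# A one-rule grammar yields as many sentences as we allow derivation steps:
print(sorted(set(derive(["S"], 4))))  # ['a', 'aa', 'aaa', 'aaaa']
```

Raising the depth bound produces ever longer sentences; the grammar itself never grows, which is exactly the succinctness claimed above.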
There are several reasons to perform this structuring process called parsing. One
reason derives from the fact that the obtained structure helps us to process the object
further. When we know that a certain segment of a sentence in German is the subject,
that information helps in translating the sentence. Once the structure of a document has
been brought to the surface, it can be converted more easily.
A second reason is that the grammar in a sense represents our understanding
of the observed sentences: the better the grammar we can give for the movements
of bees, the deeper our understanding of them is.
A third reason lies in the completion of missing information that parsers, and especially
error-repairing parsers, can provide. Given a reasonable grammar of the language, an
error-repairing parser can suggest possible word classes for missing or unknown words
on clay tablets.
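The idea behind such a suggestion can be sketched in a few lines. The grammar and all names below are hypothetical toys, not an actual error-repairing parser: we represent the grammar as a set of acceptable word-class sequences, and for a damaged word we simply try every word class and keep those that let the sentence parse.

```python
# Toy grammar (hypothetical): the acceptable word-class sequences
# for three-word sentences.
VALID_SEQUENCES = {
    ("Det", "Noun", "Verb"),
    ("Noun", "Verb", "Noun"),
    ("Det", "Adj", "Noun"),
}
WORD_CLASSES = {"Det", "Noun", "Verb", "Adj"}

def suggest_classes(classes):
    """`classes` lists the known word classes of a sentence, with None
    marking the unknown (e.g. damaged) word. Return every word class
    that makes the sentence acceptable to the grammar."""
    i = classes.index(None)
    return sorted(
        c for c in WORD_CLASSES
        if tuple(classes[:i] + [c] + classes[i + 1:]) in VALID_SEQUENCES
    )

# A damaged second word in a Det-?-Noun sentence:
print(suggest_classes(["Det", None, "Noun"]))  # ['Adj']
```

A real error-repairing parser would of course work with a full grammar rather than an enumerated list of sequences, but the principle is the same: the grammar restricts what the missing element can be.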