As explained in the introduction of Section 1.7, there are no appropriate finite-state automata or finite-state transducers that work “directly” on graphs. Thus, for graphs, the role of automata and transducers is taken over by monadic second-order logic: instead of being accepted by an automaton, a set of graphs is defined by a monadic second-order sentence; and instead of being computed by a transducer, a graph transduction is defined by a definition scheme consisting of monadic second-order formulas. With respect to automata, the original motivation for this approach was Theorem 1.16 (cf. Theorem 5.82): for terms and words monadic second-order logic and finite-state automata have the same expressive power.
The aim of this chapter is to show that, in a certain sense, the automata-theoretic characterization of monadic second-order logic for sets of terms and words (Theorem 1.16) can be generalized to transductions. This means, of course, that the automata should produce output, i.e., that they are transducers. We will concentrate on transductions that are partial functions, and in particular on deterministic devices, i.e., parameterless definition schemes and deterministic transducers. For these, we will show that monadic second-order transductions of words correspond to two-way finite-state transducers, and that monadic second-order transductions of terms correspond to (compositions of) tree-walking transducers. These transducers are well known from Formal Language Theory.
A two-way finite-state transducer is a finite-state automaton with a two-way read-only input tape and a one-way output tape. The input word is placed on the input tape between endmarkers, and the transducer has a reading head that, at each moment of time, is positioned over a cell of the input tape.
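To make this concrete, here is a minimal sketch (ours, not a construction from the book) of a deterministic two-way finite-state transducer, together with a transition table computing the transduction w ↦ w·reverse(w), which is a standard example of a function definable by such a device but not by a one-way transducer. The names `run_2dft`, `delta`, `fwd`, `bwd` and `halt` are illustrative choices.

```python
# A minimal simulator for a deterministic two-way finite-state transducer.
# The tape holds '<' + word + '>' with '<' and '>' as endmarkers.

def run_2dft(word, delta, start, accept):
    """Run a deterministic two-way transducer.
    delta maps (state, symbol) -> (new_state, move, output),
    where move is -1 (left), 0 (stay) or +1 (right)."""
    tape = '<' + word + '>'
    state, pos, out = start, 0, []
    while state != accept:
        state, move, output = delta[(state, tape[pos])]
        out.append(output)
        pos += move
    return ''.join(out)

# Illustrative transition table for w |-> w . reverse(w) over {a, b}:
# state 'fwd' scans right copying symbols; at '>' it turns around;
# state 'bwd' scans left copying symbols; at '<' it accepts.
delta = {}
for c in 'ab':
    delta[('fwd', c)] = ('fwd', +1, c)   # copy on the way right
    delta[('bwd', c)] = ('bwd', -1, c)   # copy on the way back
delta[('fwd', '<')] = ('fwd', +1, '')    # skip the left endmarker
delta[('fwd', '>')] = ('bwd', -1, '')    # turn around at the right endmarker
delta[('bwd', '<')] = ('halt', 0, '')    # accept at the left endmarker

print(run_2dft('abb', delta, 'fwd', 'halt'))  # -> 'abbbba'
```

The two-way head movement is exactly what allows the output to depend on a second, reversed pass over the input, something no one-way finite-state transducer can achieve for this function.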
We will define graph operations that generalize the concatenation of words. Some of these operations have been used in Section 1.1 to define the cographs and the series-parallel graphs. We will actually define two signatures of graph operations, hence two graph algebras. Both algebras will be defined in such a way that their equational sets are exactly the sets defined by certain context-free graph grammars. (The notion of an equational set has been presented informally in Section 1.1.4 and will be studied in detail in Chapter 3.)
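As a small illustration (ours, not the book's formal signature), the cograph operations mentioned above can be sketched directly: cographs are generated from single vertices by disjoint union and complete join, so a term over these two operations evaluates to a graph, much as a word is a term over concatenation. The helper names below are illustrative, and vertex names are assumed to be distinct across operands.

```python
# Graphs as (vertices, edges) pairs; edges are 2-element frozensets.
# Cograph operations: single vertex, disjoint union, complete join.

def vertex(v):
    """The graph with one vertex v and no edges."""
    return ({v}, set())

def union(g, h):
    """Disjoint union (operands assumed to have disjoint vertex sets)."""
    (vg, eg), (vh, eh) = g, h
    return (vg | vh, eg | eh)

def join(g, h):
    """Complete join: disjoint union plus all edges between the two parts."""
    (vg, eg), (vh, eh) = g, h
    cross = {frozenset((x, y)) for x in vg for y in vh}
    return (vg | vh, eg | eh | cross)

# The term join(vertex(1), union(vertex(2), vertex(3))) evaluates to
# the path 2 - 1 - 3 (a star with center 1).
v, e = join(vertex(1), union(vertex(2), vertex(3)))
print(sorted(v))          # -> [1, 2, 3]
print(len(e))             # -> 2 (edges {1,2} and {1,3})
```

In the same spirit, the terms built over the HR and VR signatures defined in this chapter are algebraic expressions that evaluate to graphs, and the equational sets are those generated by systems of equations over these operations.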
Two main types of context-free sets of graphs defined by certain graph rewriting mechanisms have emerged from the intense research conducted from around 1980 and synthesized in the handbook [*Roz] edited by Rozenberg.
We will first define the HR graph algebra that corresponds in this respect to the hyperedge replacement grammars. It turns out that the operations of this algebra yield an exact characterization (as equational sets) of the sets of graphs of tree-width bounded by fixed integers. The terms built with the operations of the HR algebra can be seen as algebraic expressions of tree-decompositions. (Tree-width and the corresponding tree-decompositions are important for the construction of efficient graph algorithms and also for the characterization of the graphs that exclude a fixed graph as a minor.)
The second algebra, called the VR algebra, is defined so that its equational sets are those generated by the (context-free) vertex replacement grammars. A new graph complexity measure called clique-width has arisen in a natural way from the definition of the VR algebra, without having (yet) any independent combinatorial characterization.
This chapter defines monadic second-order logic, and shows how it can be used to formalize the expression of graph properties. Monadic second-order formulas can also be used to express the properties of sets of vertices and/or edges in graphs. The Recognizability Theorem (Section 5.3.8) implies that a class of graphs characterized as the class of (finite) models of a monadic second-order sentence is VR-recognizable, and that it is HR-recognizable (but not necessarily VR-recognizable) if the sentence is written with edge set quantifications.
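As a concrete illustration (our example, writing edg for the binary edge relation and using set variables X, Y, ... for sets of vertices), bipartiteness, i.e., 2-colorability, is expressed by the monadic second-order sentence:

```latex
\exists X \, \forall x \, \forall y \;
  \bigl( \mathit{edg}(x,y) \;\Rightarrow\; \neg (x \in X \Leftrightarrow y \in X) \bigr)
```

The quantification over the set X of vertices (one color class) is what makes the sentence monadic second-order rather than first-order; by the Recognizability Theorem, the class of bipartite graphs is therefore VR-recognizable.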
It follows that the monadic second-order satisfiability problem for the class of graphs of tree-width or of clique-width at most k is decidable for each k. Applications to the construction of fixed-parameter tractable algorithms and other algorithmic consequences will be developed in Chapter 6.
Although our first applications concern graphs, we will give the definitions and prove the main results for the general case of relational structures, because the proofs are no more difficult and the corresponding results apply in a uniform manner to labeled graphs represented, in several ways, by binary relational structures. However, relational structures with n-ary relations for n ≥ 2 are of interest in their own right. As we have shown in Section 1.9, they are useful to formalize betweenness and cyclic ordering (and for relational databases).
In Sections 5.1 and 5.2, we review basic notions of logic, define monadic second-order logic, and give some results that delineate its expressive power.
We will leave it to the researchers of the next ten years and beyond to decide which of the results presented in this book are the most important. Here we will only indicate what are, in our opinion, the main open problems and research directions among those that are closely related to the topics of this book.
Algorithmic applications
This topic, studied in Chapter 6, is the most difficult to discuss because of the large number of new articles published every year. Hence, the following comments have a good chance of becoming rapidly obsolete.
The algorithmic consequences of the Recognizability Theorem depend crucially on algorithms that construct graph decompositions of the relevant types or, equivalently, that construct terms over the signatures FHR, FVR which evaluate to the input graphs or relational structures. The efficiency of such algorithms, whether for all graphs or, perhaps more promisingly, for particular classes of graphs, is a bottleneck for applications.
Another difficulty is due to the sizes of the automata that result from the “compilation” of monadic second-order formulas. Their “cosmological” sizes (cf. [FriGro04], [StoMey], [Wey]) make their construction intractable for general formulas. However, for certain formulas arising from concrete problems of hardware and software verification they remain manageable, as reported by Klarlund et al., who developed software for that purpose called MONA (cf. [Hen+], [BasKla]). Soguet [Sog] tried using MONA for graphs of bounded tree-width and clique-width, but even for basic graph properties such as connectivity or 3-colorability, the automata become too large to be constructed as soon as one wishes to handle graphs of clique-width greater than 3.
Monadic second-order transductions are transformations of relational structures specified by monadic second-order formulas. They can be used to represent transformations of graphs and related combinatorial structures via appropriate representations of these objects by relational structures, as shown in the examples discussed in Section 1.7.1.
These transductions are important for several reasons. First, because they are useful tools for constructing monadic second-order formulas with the help of the Backwards Translation Theorem (Theorem 7.10). Second, because by means of the Equationality Theorems (Theorems 7.36 and 7.51) they yield logical characterizations of the HR- and VR-equational sets of graphs that are independent of the signatures FHR and FVR. From these characterizations, we get short proofs that certain sets of graphs have bounded, or unbounded, tree-width or clique-width.
They also play the role of transducers in formal language theory: the image of a VR-equational set of graphs under a monadic second-order transduction is VR-equational, and a similar result holds for HR-equational sets and monadic second-order transductions that transform incidence graphs. Finally, the decidability of the monadic second-order satisfiability problem for a set of structures C implies the decidability of the same problem for its image τ(C) under a monadic second-order transduction τ. Hence, monadic second-order transductions make it possible to relate decidability and undecidability results concerning monadic second-order satisfiability problems for graphs and relational structures.
Section 7.1 presents the definitions and the fundamental properties. Section 7.2 is devoted to the Equationality Theorem for the VR algebra, one of the main results of this book.
This book contributes to several fields of Fundamental Computer Science. It extends to finite graphs several central concepts and results of Formal Language Theory, and it establishes their relationship to results about Fixed-Parameter Tractability. These developments and results have applications in Structural Graph Theory. They make essential use of logic for expressing graph problems in a formal way and for specifying graph classes and graph transformations. We will start by giving the historical background to these contributions.
Formal Language Theory
This theory has been developed with different motivations. Linguistics and compilation were among the first, around 1960. In view of the applications to these fields, different types of grammars, automata and transducers have been defined to specify formal languages, i.e., sets of words, and transformations of words called transductions, in finitary ways. The formalization of the semantics of sequential and parallel programming languages, which uses program schemes and traces respectively, the modeling of biological development, and yet other applications have motivated the study of new objects, in particular of sets of terms. These objects and their specifying devices have since been investigated from a mathematical point of view, independently of immediate applications. However, all these investigations have been guided by three main types of questions: comparison of descriptive power, closure properties (with effective constructions in case of positive answers) and decidability problems.
A context-free grammar generates words, hence specifies a formal language. However, each generated word has a derivation tree that represents its structure relative to the considered grammar.
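The relation between a generated word and its derivation tree can be sketched with a toy grammar (our example): S → aSb | ab generates the language { aⁿbⁿ : n ≥ 1 }, and each word is the yield, i.e., the left-to-right sequence of leaves, of a derivation tree over the grammar.

```python
# A derivation tree is a pair (label, children); leaves have no children.
# The yield of a tree concatenates its leaf labels left to right.

def yield_of(tree):
    """Concatenate the leaves of a derivation tree."""
    label, children = tree
    if not children:
        return label
    return ''.join(yield_of(c) for c in children)

# Derivation tree for 'aabb' in the grammar S -> a S b | a b:
# the outer S uses S -> a S b, the inner S uses S -> a b.
tree = ('S', [('a', []),
              ('S', [('a', []), ('b', [])]),
              ('b', [])])

print(yield_of(tree))  # -> 'aabb'
```

The tree, not the word, is the object that records which rules were applied and where; this structural view is the starting point for treating context-free generation algebraically, with terms in place of derivation trees.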
In the early stages of developing a programming language or paradigm, the focus is on programming-in-the-small. As the language matures, programming-in-the-large becomes important and a second modules language is often imposed on the previously existing core language. This second language must support the partitioning of code and name spaces into manageable chunks, the enforcement of encapsulation and information hiding, and the interactions between separately defined blocks of code. The addition of such modularity features typically is manifest syntactically in the form of new constructs and directives, such as local, use, import, and include, that affect parsing and compilation. Since the second language is born out of the necessity to build large programs, there may be little or no connection between the semantics of the added modular constructs and the semantics of the core language. The resulting hybrid language consequently may become complex and also may lack declarativeness, even when the core language is based on, say, logic.
In the logic programming setting, it is possible to support some of the abstractions needed for modular programming directly through logical mechanisms. For example, the composition of code can be realized naturally via the conjunction of program clauses, and suitably scoped existential quantifiers can be used to control the visibility of names across program regions. This chapter develops this observation into the design of a specific module language.
This book is about the nature and benefits of logic programming in the setting of a higher-order logic. We provide in this Introduction a perspective on the different issues that are relevant to a discussion of these topics. Logic programming is but one way in which logic has been used in recent decades to understand, specify, and effect computations. In Section I.1, we categorize the different approaches that have been employed in connecting logic with computation, and we use this context to explain the particular focus we will adopt. The emphasis in this book will be on interpreting logic programming in an expressive way. A key to doing so is to allow for the use of an enriched set of logical primitives while preserving the essential characteristics of this style of specification and programming. In Section I.2, we discuss a notion of expressivity that supports our later claims that some of the logic programming languages that we present are more expressive than others. The adjective “higher order” has been applied to logic in the past in a few different ways, one of which might even raise concern about our plan to use such a logic to perform computations. In Section I.3, we sort these uses out and make clear the kind of higher-order logic that will interest us in subsequent chapters. Section I.4 explains the style of presentation that we follow in this book: Broadly, our goal is to show how higher-order logic can influence programming without letting the discussion devolve into a formal presentation of logic or a description of a particular programming language. The last two sections discuss the prerequisites expected of the reader and the organization of the book.