Arbeitsgemeinschaft Logik und Automaten

Talks in the Winter Semester 2003/04


29.1.2004, 2 pm
Peter Rossmanith, RWTH Aachen

Abstract:

From Solvability in lambda-calculi to Meaningless Terms in TRSs

4.12.2003, 2 pm
Vince Bárány, RWTH Aachen

Abstract:

barany.ps

A Reducibility for the Dot-depth Hierarchy of Star-free Languages

11.12.2003, 2 pm
Victor Selivanov, Novosibirsk

Abstract:

Hierarchies considered in computability theory and in complexity theory are related to certain reducibilities in the sense that the levels of the hierarchies are downward closed and have complete sets. In this talk we propose a reducibility that bears a similar relationship to Brzozowski's dot-depth hierarchy of star-free languages and to some of its refinements. We prove some basic facts about the corresponding degree structure and discuss the relationship of this reducibility to complexity theory (via the leaf-language approach).
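The following definitions (standard, and our addition rather than part of the abstract) spell out the relationship referred to above, writing $\le_r$ for the reducibility in question: a class $\mathcal{C}$ of languages is downward closed under $\le_r$ if

    $L \le_r L' \,\wedge\, L' \in \mathcal{C} \;\Longrightarrow\; L \in \mathcal{C}$,

and a set $C \in \mathcal{C}$ is $\le_r$-complete for $\mathcal{C}$ if $L \le_r C$ holds for every $L \in \mathcal{C}$.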

Games on Pushdown Graphs and Extensions

18.12.2003, 2 pm, AH 1 (Oberseminar)
Thierry Cachat, RWTH Aachen

Abstract:

cachat.ps

Parity Games with Infinitely Many Priorities

27.11.2003, 1 pm
Erich Grädel, RWTH Aachen

Abstract:

graedel.ps

On the Automata Size for Presburger Arithmetic

20.11.2003, 2 pm
Felix Klaedtke, Universität Freiburg

Abstract:

Presburger arithmetic is the first-order theory of the integers with addition and the ordering. Its decidability was first proved, independently, by Presburger and Skolem using the method of quantifier elimination, and its complexity has since been investigated intensively. Due to the applicability of Presburger arithmetic in various domains (e.g., integer programming and program verification), it is important to have effective decision procedures for Presburger arithmetic or fragments of it.
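As a small worked example (ours, not from the abstract) of quantifier elimination in Presburger arithmetic: eliminating a quantifier may require introducing congruence predicates, e.g.

    $\exists x\, (y = x + x) \;\Longleftrightarrow\; y \equiv 0 \pmod{2}$,

so quantifier-free formulas are built from linear (in)equalities together with congruences $t \equiv c \pmod{m}$.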

Recently, it has become popular to use automata for deciding Presburger arithmetic, an idea that goes back at least to an article by Büchi from 1960: integers are represented as words (e.g., in two's complement representation), and automata are constructed recursively from the formulas. The automaton constructed for a formula accepts precisely the words representing the integers that make the formula true. A crude complexity analysis of such an automata-based decision procedure yields a non-elementary worst-case complexity. However, empirical results show that automata-based decision procedures are competitive with other methods.
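To make the construction concrete, here is a minimal sketch (our illustration; the function name and encoding details are our own, not taken from the talk) of the classical automaton for the atomic formula x + y = z over the naturals, written least-significant-bit first: the automaton's only state is the carry bit of schoolbook addition.

    def accepts_sum(word):
        """Run the x + y = z automaton on a word of bit triples.

        Each letter is a triple (bx, by, bz) of bits, least significant
        bit first; the single state is the carry of the addition.
        """
        carry = 0
        for bx, by, bz in word:
            s = bx + by + carry
            if s % 2 != bz:      # the output bit must equal the sum bit
                return False     # no transition: reject
            carry = s // 2       # propagate the carry
        return carry == 0        # accept only if no carry remains

    # Example: 3 + 1 = 4 with 3 bits, LSB first: x = 110, y = 100, z = 001.
    assert accepts_sum([(1, 1, 0), (1, 0, 0), (0, 0, 1)])

Automata for compound formulas are then built recursively: product for conjunction, complementation for negation, and projection followed by determinisation for existential quantification; these determinisation steps are the source of the blow-up in the crude analysis mentioned above.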

In this talk we analyze this automata-based approach. Our analysis provides a triple exponential upper bound on the size of the minimal deterministic automaton for a Presburger arithmetic formula, and it partly explains the "good" empirical results. Our analysis is based on a comparison of automata and quantifier-free formulas that are obtained by an improvement of Cooper's quantifier elimination method for Presburger arithmetic. Moreover, we give an example showing that this triple exponential upper bound on the automata size is tight (even for nondeterministic automata).
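In symbols (our rendering of the bound just stated): for a formula of length $n$, the minimal deterministic automaton has size at most

    $2^{2^{2^{O(n)}}}$,

and by the tightness example this triple exponential cannot be improved in general.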

Sigma-Definability and Computability on Continuous Data Types

6.11.2003, 2 pm
Margarita Korovina, Novosibirsk

Abstract:

It is well known that the classical theory of computation, which works with discrete structures, is not suitable for the formalisation of computations that operate on real-valued data. Most computational problems in physics and engineering are of this type, e.g. problems relevant to the foundations of dynamical and hybrid systems. Since computational processes are discrete in nature while the objects we consider are continuous, formalising the computability of such objects is in itself a challenging research problem.

In this talk we report on a logical approach to computability over the reals based on the notion of definability. In this approach, continuous objects and the computational processes involving them can be defined by finite formulas in a suitable structure. We will discuss the beneficial features of this approach, recent results, and future work.
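To give a flavour of the definability approach (an illustrative example of ours, not taken from the abstract): an open set $U \subseteq \mathbb{R}$ is determined by the rational intervals it contains, so it can be captured by an existential formula of the form

    $x \in U \;\Longleftrightarrow\; \exists p\, \exists q\, \big(\varphi(p, q) \wedge p < x \wedge x < q\big)$,

where $p$ and $q$ range over rational endpoints and $\varphi$ enumerates the intervals that make up $U$; the computability of $U$ then reduces to the definability of $\varphi$.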

Parameterized Complexity Theory

5.11.2003, 10 am, Seminarraum I 1
Jörg Flum, Universität Freiburg

Abstract:

Complexity theory makes statements about the resources, such as computation time, required to solve algorithmic problems. The complexity of a problem is usually measured as a function of the input size. This simple model has the drawback that certain finer structures of the input are not taken into account, so problems may be classified as "hard" even though only certain cases that are irrelevant in practice are hard to solve. The input of a database query, for example, consists of the database and the query, and normally the database is many times larger than the query. Parameterized complexity theory takes this into account and makes a more refined complexity analysis possible.
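A standard textbook illustration of this refined analysis (our example, not taken from the talk) is the notion of fixed-parameter tractability: an algorithm with running time $f(k) \cdot n^{O(1)}$, where $n$ is the input size and $k$ the parameter (for database queries, the size of the query), is considered tractable for small $k$. The classic example is vertex cover, decidable by a bounded search tree whose size depends only on $k$:

    def vertex_cover(edges, k):
        """Decide whether the graph given by `edges` has a vertex cover
        of size at most k.

        Bounded search tree: some endpoint of any remaining edge must be
        in the cover, so branch on the two choices. The tree has at most
        2^k leaves, giving a running time of O(2^k * m) for m edges:
        fixed-parameter tractable in the parameter k.
        """
        if not edges:
            return True              # every edge is covered
        if k == 0:
            return False             # edges remain but the budget is spent
        u, v = edges[0]
        rest_u = [e for e in edges if u not in e]   # take u into the cover
        rest_v = [e for e in edges if v not in e]   # take v into the cover
        return vertex_cover(rest_u, k - 1) or vertex_cover(rest_v, k - 1)

    # A 4-cycle has a vertex cover of size 2, but none of size 1.
    cycle = [(1, 2), (2, 3), (3, 4), (4, 1)]
    assert vertex_cover(cycle, 2) and not vertex_cover(cycle, 1)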

In this talk we give an overview of the concepts and results of parameterized complexity theory and present machine characterizations of some of its classes.