Abstracts – invited papers
Michael Dunn: The Duality Between Information and Computation
Jens Erik Fenstad: The Necessity of Meaning
Dagfinn Follesdal: Themes from Twardowski
Vincent Fella Hendricks: The Convergence of Scientific Knowledge
Joachim Lambek: Topos methods for establishing intuitionistic principles
Grzegorz Malinowski: Inferential Intensionality
Ewa Orłowska: Information logics: a modal approach to incompleteness and uncertainty of information
Krister Segerberg: FULL DDL: The Modal Logic of Belief Revision
Abstracts – contributed papers
Zofia Adamowicz: 50 years of Polish logic differently – comment to the article by R. Wójcicki
Maria Bulińska: The Pentus Theorem for Lambek Calculus with Nonlogical Axioms
Claudia Casadio: An Algebraic Approach to Quantification
Janusz Ciuciura: Labelled Tableaux for D2
Janusz Czelakowski: An Algebraic Treatment of Infinitistic Definitions
Maciej Farulewski: On the finite models of Lambek Calculus
Maciej Kandulski: Derived tree languages of categorial grammars
Siegfried Gottwald: Ways to form set theoretic universes of fuzzy sets
Paweł Kawalec: Scientific knowledge, epistemology and convergence. A comment on Vincent Hendricks' paper The Convergence of Scientific Knowledge
Aleksandra Kiślak-Malinowska: Free pregroups as a tool for parsing
Robert Kublikowski: Language, vagueness and definition
Marek Lechniak: J. Łoś's system and contemporary logics of belief
Grażyna Mirkowska and Andrzej Salwicki: Logic as a tool for specification and verification of software
Roman Murawski and Jerzy Pogonowski: Logical Investigations at the University of Poznań in 1945-1955
Anna Pietryga: Logical Values after Duhem
Bożena Staruch and Bogdan Staruch: First order theories of partial models
Andrzej Wiśniewski: Socratic Proofs
Urszula Wybraniec-Skardowska: Syntactic and Semantic Notions of Sense
Abstracts – invited papers
Michael Dunn: The Duality Between Information and Computation
We explore a “duality” between information and computation that arises naturally in a ternary frame semantics. Suppose we have a ternary relation R on a set of information states U. As is familiar, each subset P of U can be thought of, in informational terms, as a proposition (a static entity). But each state determines a binary relation, so the set P can be traded in for the set of binary relations determined by its members, i.e. viewed, in computational terms, as a set of possible actions on those states (a dynamic entity). If we postulate an information order on the states, then as a state increases in information the binary relation it determines weakens, and this generates the “duality.”
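One minimal way to write this down (the notation R_a and the hat on P below are ours, added only for illustration, and "weakens" is read as "contains fewer pairs"; none of this is the author's own notation):

```latex
% One natural reading of the construction: a state a induces a binary relation
% between the remaining two arguments of R, and a proposition P is traded in
% for the set of relations induced by its members.
\[
  R_a = \{\, (b,c) \in U \times U : Rabc \,\}, \qquad
  \widehat{P} = \{\, R_a : a \in P \,\} \quad \text{for } P \subseteq U .
\]
% The duality: relative to an information order \sqsubseteq on states,
% more informative states determine weaker (smaller) relations:
\[
  a \sqsubseteq a' \;\Longrightarrow\; R_{a'} \subseteq R_a .
\]
```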
This idea has already been exploited to represent combinatory algebras and relation algebras (the former in joint work with R. K. Meyer). In this paper we also explore connections with Pratt's dynamic logic, Hoare's logic of programs, Pratt's action logic, and Kleene algebras.
Jens Erik Fenstad: The Necessity of Meaning
In the study of the interplay between grammar and mind, much of current linguistic theory postulates two modules: a conceptual module, which encodes the meaning content and the world knowledge of the speaker/listener, and a computational module, which encodes the syntactic and morphological structure of utterances.
A major part of contemporary linguistics has focused on the investigation of the computational module. I shall particularly emphasise the work on grammar and logic founded on the pioneering studies of Ajdukiewicz and Tarski. But this is only one part of the story; the ultimate goal of linguistic theory is to give an account of the link between linguistic structure and meaning. This means that one must move from syntax alone to a study of the syntax/semantics interface. The link or "connecting sign" between structure and meaning has traditionally been either a formula or term in some logical formalism, or a symbolic representation derived from some attribute-value matrix. But whatever the nature of the connecting sign, the final representation of "meaning" has been some model structure derived from the semantics of some system of formal logic.
This may be sufficient for computational linguistics, but it is not enough from the point of view of the cognitive sciences. We have indeed learned from these sciences that an understanding of how the "symbols" of a natural language can carry "meaning" needs more than formal semantics. In this lecture, drawing on current developments in the study of "lexical signs", I shall discuss what is needed and review some attempts to bridge the gap.
Vincent Fella Hendricks: The Convergence of Scientific Knowledge
It is a well-known philosophical thesis that knowledge may be characterized by convergence to a correct hypothesis in the limit of empirical scientific inquiry. The primary aim is not to say whether convergence will or will not occur. It is rather to systematically investigate the proposal that such convergence, if it occurs, is constitutive of scientific knowledge. To investigate this convergence proposal a new formal framework called modal operator theory is introduced. Modal operator theory denotes the cocktail obtained by mixing alethic, epistemic and tense logic together with a few concepts from computational epistemology to study the strength and validity of limiting convergent knowledge.
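As a rough, purely illustrative rendering of "convergence to a correct hypothesis in the limit" (the function names and the toy example below are ours, not Hendricks' formalism): a method converges to a hypothesis on an evidence stream if, from some finite stage onward, it never changes its conjecture.

```python
def converges_by(method, stream, hypothesis, stages):
    """Check, up to a finite horizon `stages`, whether `method` has stabilized on
    `hypothesis`: after some stage it outputs `hypothesis` on every longer initial
    segment of the evidence `stream`.  True limiting convergence quantifies over
    all stages, so a finite check can only support the claim, not prove it."""
    data = [next(stream) for _ in range(stages)]
    conjectures = [method(data[:n]) for n in range(1, stages + 1)]
    last_deviation = max((i for i, c in enumerate(conjectures) if c != hypothesis),
                         default=-1)
    return last_deviation < stages - 1   # stabilized strictly before the horizon

# Toy example: evidence is a stream of 0/1 observations, the hypotheses are
# "all ones" and "not all ones", and the method conjectures "all ones" until
# a 0 is observed.  On the constant-1 stream it converges to "all ones".
def all_ones_method(finite_data):
    return "all ones" if all(x == 1 for x in finite_data) else "not all ones"

def constant_stream(value):
    while True:
        yield value

print(converges_by(all_ones_method, constant_stream(1), "all ones", stages=50))  # True
```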
Joachim Lambek: Topos methods for establishing intuitionistic principles
The so-called free topos is the initial object in the category of elementary toposes and logical functors. It may be constructed as the Tarski-Lindenbaum category of pure intuitionistic type theory. In our book "Introduction to higher order categorical logic", Phil Scott and I argue that the free topos might be accepted as the world of elementary mathematics by moderate Intuitionists. We used it to establish a number of intuitionistic principles, by glueing arguments originating with Peter Freyd. Twenty years later, we are taking another look at some of the principles we could then not handle, in particular, the countable rule of choice.
Grzegorz Malinowski: Inferential Intensionality
The paper is a study of the properties of the quasi-consequence operation, a key notion of the so-called inferential approach to the theory of sentential calculi established in [5]. The principal motivation behind the quasi-consequence, q-consequence for short, stems from mathematical practice, which treats some auxiliary assumptions as mere hypotheses rather than axioms, so that their further occurrence in place of conclusions may or may not be justified. The main semantic feature of the q-consequence reflecting this idea is that its rules lead from the non-rejected assumptions to the accepted conclusions.
First, we focus on the syntactic features of the framework and present the q-consequence as related to the notion of proof. Such a presentation uncovers the reasons for which the adjective "inferential" is used to characterize the approach and, possibly, the term "inference operation" replaces "q-consequence". It also shows that the inferential approach is a generalisation of the Tarski setting and, therefore, it may potentially absorb several concepts from the theory of sentential calculi, cf. [10]. However, as some concrete applications show, see e.g. [4], the new approach opens perspectives for further exploration.
The main part of the paper is devoted to some notions absent from the Tarski approach. We show that for a given q-consequence operation W, instead of the single W-equivalence established by the properties of W, we may consider two congruence relations. For one of them the current name is preserved, and for the other the term "W-equality" is adopted. While the two relations coincide for any W which is a consequence operation, for an arbitrary W the inferential equality and the inferential equivalence may differ. Further to this, we introduce the concepts of inferential extensionality and intensionality for q-consequence operations and connectives. Some general results obtained in Section 3 sufficiently confirm the importance of these notions. To complete the picture, in Section 4 we apply the new intensionality-extensionality distinction to inferential extensions of a version of the Łukasiewicz four-valued modal logic.
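To convey the semantic idea that rules lead from non-rejected assumptions to accepted conclusions, here is a toy sketch of q-entailment over a three-valued matrix with disjoint sets of rejected and accepted values; the particular matrix, the connectives and the code are our own illustration and should not be read as the paper's constructions.

```python
from itertools import product

# Values {0, 1/2, 1}, with 0 rejected and 1 accepted; 1/2 is neither.
# Formulas: variables (strings), Lukasiewicz negation 'N' and implication 'C'.
REJECTED, ACCEPTED = {0.0}, {1.0}
VALUES = (0.0, 0.5, 1.0)

def value(formula, v):
    if isinstance(formula, str):                      # propositional variable
        return v[formula]
    op, *args = formula
    if op == 'N':
        return 1.0 - value(args[0], v)
    if op == 'C':
        return min(1.0, 1.0 - value(args[0], v) + value(args[1], v))
    raise ValueError(op)

def variables(formulas):
    vs, stack = set(), list(formulas)
    while stack:
        g = stack.pop()
        if isinstance(g, str):
            vs.add(g)
        else:
            stack.extend(g[1:])
    return sorted(vs)

def q_entails(premises, conclusion):
    """X q-entails a iff every valuation that rejects no premise accepts a."""
    vs = variables(list(premises) + [conclusion])
    for vals in product(VALUES, repeat=len(vs)):
        v = dict(zip(vs, vals))
        if all(value(p, v) not in REJECTED for p in premises):
            if value(conclusion, v) not in ACCEPTED:
                return False
    return True

print(q_entails([], ('C', 'p', 'p')))   # True: p -> p is always accepted
print(q_entails(['p'], 'p'))            # False: p may take the middle value
```

The second call already shows how q-consequence can differ from an ordinary consequence operation: a premise that is merely non-rejected need not be accepted as a conclusion.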
Ewa Orłowska: Information logics: a modal approach to incompleteness and uncertainty of information
Modern logic has developed an important paradigm of the relationship between logical and algebraic systems. Along these lines we put forward a general perspective on modeling incomplete information. We show how information presented in an information system induces classes of information frames and classes of information algebras. We explain how these structures reflect various forms of incompleteness of information which is inherent in information systems. We present an abstract characterisation of information frames and algebras. We outline a Jónsson/Tarski-style duality for these frames and algebras. We also mention information algebras and information frames based on not necessarily distributive lattices, and an Allwein/Dunn-style duality for them.
Krister Segerberg: FULL DDL: The Modal Logic of Belief Revision
Thanks to Carlos Alchourrón, Peter Gärdenfors, and David Makinson, there exists a formal theory of belief revision of interest to philosophical logicians and theoretical computer scientists alike. It was originally formulated in algebraic terms, but it is easy to formulate a version of it in terms of modal logic. The classical theory of belief revision does not involve iterated beliefs. To extend the basic modal logic of belief revision (basic DDL) to accommodate iterated belief - a natural move in modal logic - is not entirely straightforward. In this talk I will enumerate a number of possibilities and try to defend the choice of one of them as the definitive weakest normal dynamic doxastic logic - in other words, full DDL.
Abstracts – contributed papers
Zofia Adamowicz: 50 years of Polish logic differently – comment to the article by R. Wójcicki
Our aim is to present an alternative view of Polish logic of the past 50 years. To this end, a brief overview of the main trends and ideas in logic worldwide over the past 50 years will be given. A definition of logic and of its borders will be derived. The role of Polish logicians will be considered in this general context.
Maria Bulińska: The Pentus Theorem for Lambek Calculus with Nonlogical Axioms
The Lambek calculus introduced in Lambek [4] is a strengthening of the type reduction calculus of Ajdukiewicz [1]. We study the Associative Lambek Calculus L in Gentzen-style axiomatization, enriched with a finite set of nonlogical axioms and denoted by L(). It is known that finite axiomatic extensions of the Associative Lambek Calculus generate all recursively enumerable languages (see Buszkowski [2], [3]). We then confine the nonlogical axioms to sequents of the form p ⇒ q, where p and q are primitive types. For the calculus L() we prove an interpolation lemma (modifying Roorda's proof for L [6]) and the binary reduction lemma (using the Pentus method [5]). As a consequence we obtain the weak equivalence of context-free grammars and grammars based on L().
References:
[1] Ajdukiewicz, K. (1935). "Die syntaktische Konnexität", Studia Philosophica 1, pp. 1-27.
[2] Buszkowski, W. (1982). "Some decision problems in the theory of syntactic categories", Zeitschrift für mathematische Logik und Grundlagen der Mathematik 28, pp. 539-548.
[3] Buszkowski, W. (2002). "Lambek Calculus with Nonlogical Axioms", to appear.
[4] Lambek, J. (1958). "The mathematics of sentence structure", American Mathematical Monthly 65(3), pp. 154-170.
[5] Pentus, M. (1993). "Lambek grammars are context-free", Proceedings of the 8th IEEE Symposium on Logic in Computer Science, pp. 429-433.
[6] Roorda, D. (1991). "Resource logic: proof-theoretical investigations", Ph.D. thesis, University of Amsterdam.
Claudia Casadio: An Algebraic Approach to Quantification
The paper proposes a new perspective on the analysis of quantification developed by Montague [7], applying the grammar of pregroups recently introduced by J. Lambek [3], [6] and studied in [1], [2], [4]. Montague's distinction between the "de re" reading of quantificational contexts and the basic "de dicto" reading is reconsidered, starting from a set of algebraic types defined within a pregroup. Geometrical representations of this distinction, and of the parallel contrast between "wide" and "narrow" scope [5], are produced by means of the non-commutative proof nets of classical bilinear logic (non-commutative linear logic).
References:
[1] Buszkowski, W. (2001). "Lambek grammars based on pregroups", in P. de Groote, G. Morrill and C. Retoré (eds.), Logical Aspects of Computational Linguistics, Springer-Verlag, Berlin, pp. 95-109.
[2] Casadio, C. (2002). Logic for Grammar, Bulzoni Editore, Roma.
[3] Casadio, C. and J. Lambek (2002). "A tale of four grammars", Studia Logica 71(2), Special Issue edited by W. Buszkowski.
[4] Kislak, A. (2002). "Pregroups versus English and Polish grammar", in V. M. Abrusci and C. Casadio (eds.), New Perspectives in Logic and Formal Linguistics, Bulzoni Editore, Roma.
[5] Hendriks, H. (1993). Studied Flexibility. Categories and Types in Syntax and Semantics, Ph.D. dissertation, Universiteit van Amsterdam.
[6] Lambek, J. (2001). "Type grammars as pregroups", Grammars 4, pp. 21-39.
[7] Montague, R. (1974). "The proper treatment of quantification in ordinary English", in Formal Philosophy: Selected Papers, Yale University Press, New Haven.
Janusz Ciuciura: Labelled Tableaux for D2
In the late forties, Stanisław Jaśkowski published his well-known papers on the discursive sentential calculus D2. He defined it by an interpretation in the language of Lewis's S5. However, transforming a discursive formula into its modal counterpart is a Sisyphean labour. This inconvenience motivates the search for a simple new tool for answering the question whether a given discursive formula is valid in D2 or not. We introduce a more efficient method that makes the translation procedure redundant, and present a direct semantics and labelled tableaux for D2.
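For readers unfamiliar with the translation the abstract alludes to, the following sketch shows one standard presentation of it; the encoding of formulas and the restriction to discursive implication are our own simplifications, and the treatment of the other discursive connectives varies across the literature.

```python
# Formulas are nested tuples: ('var','p'), ('not',A), ('or',A,B), ('imp',A,B),
# ('d_imp',A,B) for discursive implication, ('poss',A) for the S5 possibility operator.

def translate(f):
    """Map a discursive formula to its S5 counterpart: discursive implication
    A ->d B becomes (<>A -> B); classical connectives are translated componentwise."""
    op = f[0]
    if op == 'var':
        return f
    if op == 'not':
        return ('not', translate(f[1]))
    if op in ('or', 'and', 'imp'):
        return (op, translate(f[1]), translate(f[2]))
    if op == 'd_imp':                     # discursive implication
        return ('imp', ('poss', translate(f[1])), translate(f[2]))
    # discursive conjunction/equivalence get analogous clauses involving 'poss'
    raise ValueError(f'unknown connective: {op}')

def d2_validity_target(f):
    """A is a D2-thesis iff this formula is an S5-thesis (Jaskowski's definition)."""
    return ('poss', translate(f))

# Example: p ->d p.  Its target <>(<>p -> p) happens to be S5-valid,
# so p ->d p is a D2-thesis.
A = ('d_imp', ('var', 'p'), ('var', 'p'))
print(d2_validity_target(A))
# ('poss', ('imp', ('poss', ('var', 'p')), ('var', 'p')))
```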
Janusz Czelakowski: An Algebraic Treatment of Infinitistic Definitions
In this paper a uniform and general theory of definition, encompassing infinitistic, semantic methods of defining abstract objects, is outlined. The focus of the paper is on three basic infinitistic ways of defining objects:
- the method of fixed-points,
- the method of algebraic completions of posets,
- the induction method.
Maciej Farulewski: On the finite models of Lambek Calculus
We study an interesting class of finite models for NL, L and L* (with additive conjunction). The class of models considered allows us to prove the Finite Model Property for each of the above systems. In our approach we do not use cut elimination, which is essential in results that employ the notion of logical congruence [2]. This work extends the method of finite restrictions of canonical models used in [1].
References:
[1] Buszkowski, W. (2002). "Finite Models of Some Substructural Logics", Mathematical Logic Quarterly 48(1), pp. 63-72.
[2] Okada, M. and Terui, K. (1999). "The Finite Model Property for Various Fragments of Intuitionistic Linear Logic", Journal of Symbolic Logic 64(2), pp. 790-802.
Maciej Kandulski: Derived tree languages of categorial grammars
We propose a method of constructing trees associated with derivations in the Lambek calculus in such a way that the length of the yield of the tree always equals the length of the antecedent of the derived sequent. This property enables one to define, for every string language generated by a Lambek grammar, a tree language consisting of structures which can be directly imposed on the generated strings. We examine the defined classes of tree languages from the point of view of their relation to the classes of the tree language hierarchy.
Paweł Kawalec: Scientific knowledge, epistemology and convergence. A comment on Vincent Hendricks' paper The Convergence of Scientific Knowledge
Mainstream epistemology has been preoccupied with the analysis of the concept of knowledge and its constituents, i.e. the concepts of truth and justification. The background against which the correctness of epistemological theories is assessed is taken to be set out by our ordinary concepts and particular cases of language usage recognized as intuitively correct. The program of computational epistemology, which V. F. Hendricks subscribes to and further develops, departs from this paradigm of mainstream epistemology. And so did the general approach adopted by the Lvov-Warsaw School.
In an opening historical remark, the similarities between the program of computational epistemology and the general approach to epistemology in the Lvov-Warsaw School will be briefly indicated. I will then proceed to evaluate the validity of the objections that mainstream and Bayesian epistemologies - the main contemporary alternatives - could raise against their contender, i.e. computational epistemology as advocated by Hendricks. These objections mostly address the presuppositions that computational epistemology purportedly endorses: an idealized view of science, exclusively non-perspectival knowledge (undermining the ordinary and intuitive concept of knowledge), and a failure to recognize the divergence between substantive philosophical standpoints (esp. realism-antirealism).
I will conclude by considering whether the same set of objections could be found pertinent to the general approach to epistemology as set out in the Lvov-Warsaw School.
Aleksandra Kiślak-Malinowska: Free pregroups as a tool for parsing
The calculus of pregroups was introduced by Lambek in [5]. Pregroups have been studied e.g. in [2], [3]. The calculus of pregroups is an essential strengthening of the Lambek syntactic calculus and can be treated as an attractive alternative to it. In our paper we show that the calculus of pregroups is a very useful tool for parsing natural languages. In his paper [5] Lambek analysed some parts of English grammar. We show that his analyses can be translated from the syntactic calculus into the calculus of pregroups by means of a basic translation. The calculus of pregroups seems to be more easily treatable than the syntactic one. In our work we also concentrate on analyses of syntactic structures of the Polish language in both formalisms.
We also consider the problem of conjoinability in the calculus of pregroups. We show that two types are conjoinable in a pregroup iff they are equal in a free group. This result is analogous to Pentus' characterization of conjoinability in the Lambek calculus [6].
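To illustrate why free pregroups lend themselves to parsing, here is a small sketch of checking whether a string of lexical types reduces to the sentence type by contractions alone; the lexicon below is a hypothetical toy example, not taken from the paper.

```python
from functools import lru_cache

# A simple type is a pair (atom, k): the k-th adjoint of an atom,
# e.g. ('n', -1) is the left adjoint n^l and ('pi', 1) the right adjoint pi^r.
# Generalized contraction: (a, k) followed by (a, k+1) reduces to the empty string.

def contracts(t1, t2):
    return t1[0] == t2[0] and t2[1] == t1[1] + 1

@lru_cache(maxsize=None)
def reduces_to(types, target):
    """Can the sequence of simple types reduce to the single simple type `target`
    using contractions only?  (For a simple-type target, contractions suffice
    by Lambek's switching lemma.)  Naive search, fine for short sentences."""
    if types == (target,):
        return True
    for i in range(len(types) - 1):
        if contracts(types[i], types[i + 1]):
            if reduces_to(types[:i] + types[i + 2:], target):
                return True
    return False

# "John sees Mary" with a hypothetical lexicon: subject type pi, object type o,
# transitive verb pi^r s o^l, sentence type s.
lexicon = {
    'John': [('pi', 0)],
    'Mary': [('o', 0)],
    'sees': [('pi', 1), ('s', 0), ('o', -1)],
}
sentence = tuple(t for w in ['John', 'sees', 'Mary'] for t in lexicon[w])
print(reduces_to(sentence, ('s', 0)))   # True: the string reduces to s
```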
References:
[1] K. Ajdukiewicz, Die syntaktische Konnexität, Studia Philosophica 1 (1935)
[2] W. Buszkowski, Lambek grammars based on pregroups, in: P. de Groote, G. Morrill and C. Retoré (eds.), Logical Aspects of Computational Linguistics, Proc. LACL'01, LNAI 2099, Springer, Berlin, 2001, 95-109
[3] C. Casadio, J. Lambek, An Algebraic Analysis of Clitic Pronouns in Italian, in Logical Aspects of Computational Linguistics, Springer, 2001, 110-124
[4] J. Lambek, The mathematics of sentence structure, The American Mathematical Monthly 65 (1958), 154-170
[5] J. Lambek, Type grammars revisited, in: A. Lecomte, F. Lamarche and G. Perrier (eds.), Logical Aspects of Computational Linguistics, LNAI 1582, Springer, Berlin, 1999, 1-27.
[6] M. Pentus, The conjoinability relation in Lambek calculus and linear logic, ILLC Prepublications Series ML-93-03, Institute for Logic, Language and Computation, University of Amsterdam, 1993.
Robert Kublikowski: Language, vagueness and definition
[Key words: language, vagueness, precisification, definition]
There are relatively few texts about the relationship between vagueness and definition. Vagueness is related to definition, which is one of the possible methods of precisification. Linguistic vagueness and definition are connected with the relationship between precise and imprecise language, and with the precisification of language. The main aim of this paper is not to elaborate a theory of vagueness or a theory of definition. Rather, I would like to check whether it is possible to make vague terms more precise by means of a definition, especially a partial definition. The goal is to show that the use of a partial definition does not make a vague term totally precise.
Grażyna Mirkowska and Andrzej Salwicki: Logic as a tool for specification and verification of software
Logic is today a tool used in many fields of computer science: in software engineering, in databases, in expert systems, in robotics and almost everywhere. It is becoming a daily necessity for the computer professional.
In particular, software construction requires logical tools suited to the process of specification, analysis and implementation of software systems. It turns out that classical logic is not quite appropriate for describing the phenomena that appear in software production. Classical logic is static by nature and therefore does not reflect the dynamics of algorithmic processes. Algorithmic logic (AL) is one of the systems invented to fill this gap. The aim of AL is to collect laws and rules that are fundamental to all computer programming, regardless of the programming language employed, the hardware, etc.
In this presentation we would like to summarize applications of AL in the specification of data structures and algorithms, in the analysis of semantic properties of programs, and in the definition of the semantics of programming languages. As we intend to make algorithmic concepts accessible to readers without a special background in computer programming, all the notions and ideas will be explained with simple examples.
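As a rough flavour of the "dynamic" assertions such a logic deals with, the following is a minimal sketch of the idea of stating that a program terminates and a condition holds in the resulting state; this is our own toy semantics for illustration, not AL's actual syntax or calculus.

```python
# Programs are state transformers on dictionaries of variable values.

def assign(var, expr):
    return lambda state: {**state, var: expr(state)}

def seq(*programs):
    def run(state):
        for p in programs:
            state = p(state)
        return state
    return run

def while_loop(cond, body, fuel=10_000):
    def run(state):
        for _ in range(fuel):
            if not cond(state):
                return state
            state = body(state)
        raise RuntimeError('no termination within fuel bound')
    return run

def holds_after(program, condition, state):
    """The algorithmic-style assertion: the program terminates on `state`
    and `condition` holds in the final state."""
    try:
        return condition(program(state))
    except RuntimeError:
        return False

# Euclid's algorithm as a program K; the assertion says K computes gcd(12, 18) = 6.
K = while_loop(lambda s: s['y'] != 0,
               seq(assign('r', lambda s: s['x'] % s['y']),
                   assign('x', lambda s: s['y']),
                   assign('y', lambda s: s['r'])))
print(holds_after(K, lambda s: s['x'] == 6, {'x': 12, 'y': 18, 'r': 0}))   # True
```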
Roman Murawski and Jerzy Pogonowski: Logical Investigations at the University of Poznań in 1945-1955
The paper is a short synopsis of the logical investigations conducted at the University of Poznań in the first decade after the Second World War, i.e. at the place where and at the time when Studia Logica came into existence. We discuss the achievements of four logicians working at that time in Poznań: Kazimierz Ajdukiewicz, Adam Wiegner, Seweryna Łuszczewska-Romahnowa and Roman Suszko.
The presentation is based on the published works of the above-mentioned logicians and on a few articles in which their works are discussed, as well as on materials found in the archives of the University of Poznań and the archives of the Poznań Society of Friends of Learning (Poznańskie Towarzystwo Przyjaciół Nauk). We have also interviewed some participants of Ajdukiewicz's seminar.
Anna Pietryga: Logical Values after Duhem
Pierre Duhem lived before many-valued logics appeared. His holistic view of science can, however, be seen as a basis for new logical systems which try to deal with a set of formulas as a logical whole, especially when they face contradiction. In such a case, it is not the logical values that become more numerous; rather, the formula, which has been the traditional bearer of logical values, is replaced with a set of formulas.
Bożena Staruch and Bogdan Staruch: First order theories of partial models
We propose a language describing the first order consequences of the possibility of embedding a given partial model and, more generally, a given family of partial models, treated as partial knowledge about the real world. We describe classes of completions of the given families as quasi-varieties and varieties of models. For every consistent first order theory Π we give the construction of the standard family of partial models, i.e. a family which embeds into every model of Π.
Next, we replace embeddings by homomorphisms injective on a distinguished subset of the carrier sets of the given family of partial models. We obtain an analogous theorem which, in effect, gives a unique standard partial model with a family of its distinguished subsets. Finally, some proposals for applications in non-classical logics are given.
Andrzej Wiśniewski: Socratic Proofs
Our aim is to express in exact terms the old idea of solving problems by pure questioning. We consider the problem of derivability: "Is A derivable from Ɗ by classical propositional logic?". We develop a calculus of questions E*; a proof (called a Socratic proof) is a sequence of questions ending with a question whose affirmative answer is, in a sense, evident. The calculus is sound and complete with respect to classical propositional logic. A Socratic proof in E* can be transformed into a Gentzen-style proof in some sequent calculi. Next we develop a calculus of questions E**; Socratic proofs in E** can be transformed into analytic tableaux. We show that Socratic proofs can be grounded in Inferential Erotetic Logic. After a slight modification, the analyzed systems can also be viewed as hypersequent calculi.
Urszula Wybraniec-Skardowska: Syntactic and Semantic Notions of Sense
In a logical conception of language we say that an expression has a syntactic sense if it is a well-formed expression, and we say that it has a semantic sense if it has a meaning and a denotation. The paper explicates these two different notions of sense on the basis of the author's formal theory of language syntax (1991) and its expansion by semantic (1998, 2001) and pragmatic components. In the theory, in accordance with Peirce's token-type distinction, language is formalised on two levels: first as a language of token-objects (understood as material, empirical, enduring-through-time-and-space objects) and then as a language of type-objects (understood as abstract objects, classes of tokens). The most important syntactic notion, that of a well-formed expression (a wfe), is defined separately on the token level and on the type level. The two-level token-type formalisation of the theory allows us to outline a new semantic-pragmatic theory of meaning. The basic concepts of the theory, i.e. the notions of meaning, denotation and interpretation of wfes, are formalised on the type level by utilising semantic-pragmatic primitive notions introduced on the token level, such as the use and the interpretation of wfe-tokens. The meaning, respectively the interpretation, of a wfe is defined as an equivalence class of the relation of possessing the same manner of use of types, respectively the relation of possessing the same manner of interpretation of types (cf. Ajdukiewicz 1934, Wittgenstein 1953). The concept of denotation is defined by means of the relation of referring, which holds between wfe-types and objects of the reality described by the given language. The following theorem holds in the theory: two expressions have the same denotation if they have the same meaning.
References:
Ajdukiewicz, K., Sprache und Sinn, Erkenntnis, vol. IV (1934), pp. 100-138.
Wittgenstein, L., Philosophical Investigations, Blackwell, Oxford, 1953.
Wybraniec-Skardowska, U., Theory of Language Syntax. Categorial Approach, Kluwer Academic Publishers, Dordrecht-Boston-London, 1991.
Wybraniec-Skardowska, U., Logical and Philosophical Ideas in Certain Approaches to Language, Synthese, vol. 116 (1998), pp. 231-277.
Wybraniec-Skardowska, U., On Denotation of Quantifiers, in: Logical Ideas of Roman Suszko (M. Omyła, ed.), Warsaw, 2001, pp. 89-119.