Jusletter IT

Legistic models in Lachmayer's Research Approach

  • Author: Enrico Francesconi
  • Category: Articles
  • Region: Italy
  • Field of law: Legal Informatics, Artificial Intelligence & Law
  • Citation: Enrico Francesconi, Legistic models in Lachmayer's Research Approach, in: Jusletter IT 11 September 2014
Prof. Lachmayer's research activity is a virtuous example of how legal theory and computer science can be combined to develop effective e-Government applications improving the relationship between citizens and Public Administrations. In this paper Lachmayer's specific contributions to legistics are presented and compared with similar experiences in the same field.

Table of contents

  • 1. Introduction
  • 2. Legislation and Semantics
  • 3. Advanced applications based on legistic theories
  • 4. Ex-Post vs Ex-Ante approaches to manage semantics in legislation
  • 5. Conclusions
  • 6. References

1.

Introduction ^

[1]
I’m very pleased and honoured to have the chance to contribute to this volume of essays in honour of Prof. Friedrich Lachmayer and his activity in the field of legal theory and legal informatics. Prof. Lachmayer is one of the leading personalities in legal informatics and definitely the scientist whose work in legistic theory represents an indubitable milestone and an essential reference for the implementation of systems aiming to identify the best modalities for drafting, issuing, processing and applying legal norms. With his scientific research activity in legal informatics, in particular in legistics, as well as in the actual development of RIS1 (the Austrian legal information system) since 1972, he has made an invaluable contribution to promoting the cooperation between computer scientists and legal experts for improving the quality and accessibility of legal information.
[2]
During my first period at ITTIG-CNR2 in Florence (early 2000s), I was involved in the Italian legislative XML project called NormeInRete (NIR), which gave me the opportunity to apply, in an actual e-Government project, theories of legistics that a number of senior researchers at ITTIG-CNR had been working on. In particular I worked with Carlo Biagioli, whose work on semantic models for legislation and legislative drafting gave rise to an extensive literature and a number of interesting applications in the legislative domain. On these topics I had the opportunity to witness the strong consonance of ideas between Carlo and Friedrich, as well as their friendly meetings during the series of Legislative XML Workshops, which took place between 2003 and 2006 in the European countries involved in national legislative XML projects.
[3]
In this paper I will try to identify the main points of contact between these two visions, as well as the applications developed at ITTIG-CNR that have been derived from such theories.

2.

Legislation and Semantics ^

[4]
Biagioli’s theory is based on the assumption that laws and regulations may be seen as a set of provisions, carried by speech acts [Searle, 1969]. This allows us to establish a parallelism between the semantic characteristics of a generic text and the specific ones of a legislative text.
[5]
The semantics of a generic text can be mainly perceived according to three meaningful levels:
  1. the bottom level, represented by atomic components (simple and complex terms)
  2. the middle level, represented by aggregations of such components (sentences or partitions)
  3. the top level, represented by the whole text.
[6]
In the legislative domain similar considerations apply; in fact, the semantics of legislative documents can be described according to the same three levels:
  1. the bottom level can be defined as the level of domain concepts, describing the entities regulated by the containing act
  2. the middle level, representing aggregations of such concepts, describes the relations between domain concepts in terms of rules
  3. the top level describes the matter of interest for the act.
[7]
The lack of control over the semantic levels described above causes problems of different kinds: difficulties in harmonizing and controlling the legal lexicon (unclear or confusing terminology, incomplete or inconsistent regulations, use of vague terms), uncertainty about the impact of new laws in terms of coherence preservation, difficulties for both citizens and legal experts in accessing norms, and, ultimately, the inability to obtain an analytical/systematic vision of a legal order, which creates obstacles to its knowledge and upkeep.
[8]
While semantic annotation of concepts and document indexing are effective tools for handling the semantics of legal texts at the bottom level (concepts) and the top level (subject matter), they are usually not sufficient to fully address problems of legislation accessibility and upkeep. Biagioli identified a possible reason for these problems in the fact that, while a legislative text is a normative and documentary unit, users and legal experts usually manage, access and refer to the legal order in terms of the contained norms. Therefore, a more analytical unit of reference was identified in order to provide an organic view of the legal system, interpreting the middle level of legislative text semantics in terms of legal provisions [Biagioli, 1997].
[9]
According to a typical distinction in legal theory, expressed by Rawls [Rawls, 1955], provisions consist of Rules and Rules on Rules. Rules can be distinguished into constitutive rules («empowering norms» like Definition, Establishment, Power, Liability, etc.) and regulative rules (which discipline actions, like Duty, Right, Procedure, etc., or the substantial and procedural defaults, like Sanction). On the other hand, Rules on Rules can be distinguished into content amendments (which literally modify the content of a norm, or its meaning), temporal amendments (which modify the times of a norm, namely its come-into-force and efficacy times) and extension amendments (which extend or reduce the cases on which the norm operates). In this model each provision type has specific attributes, for example the Bearer, the Action and the Counterparty of a Duty [Biagioli, 1997].
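Purely as an illustration, and not as Biagioli's formal definition, this taxonomy of provision types with typed attributes could be rendered as data structures; all class and field names below are hypothetical:

  from dataclasses import dataclass

  @dataclass
  class Provision:
      """Base class: every provision qualifies one text fragment."""
      fragment_id: str              # e.g. "art5-par1"

  @dataclass
  class Definition(Provision):      # a constitutive rule
      defined_term: str
      definiens: str

  @dataclass
  class Duty(Provision):            # a regulative rule
      bearer: str                   # who holds the duty
      action: str                   # the prescribed action
      counterparty: str             # who benefits from the duty

  @dataclass
  class ContentAmendment(Provision):  # a rule on rules
      target: str                   # identifier of the amended norm
      new_wording: str

  d = Duty("art5-par1", bearer="employer",
           action="notify the authority", counterparty="employee")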
[10]
A similar legistic theory based on the management of legislative text fragments and their semantics was proposed by Lachmayer.
[11]
According to Lachmayer, managing legislative texts in computer systems in terms of fragments has proven to meet the expectations of the user community (citizens, lawyers and public administration officers); in fact:
  1. it allows lawyers to think in terms of articles or paragraphs, considering that referencing is at the level of articles or paragraphs
  2. it allows for making efficient use of database memory
  3. it allows for efficiently generating consolidated versions of legislative texts at given points in time
  4. it allows for providing a detailed qualification of specific portions of legal texts, thus distinguishing between all three semiotic dimensions of a single fragment or speech act: syntax (e.g. letters, words), semantics (e.g. content, based on syntax) and pragmatics (e.g. legal validity).
[12]
Managing fragments, as well as semantically qualifying them in terms of legal concepts, is thus a way of maintaining and selectively accessing the legal order.
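As an illustration of the point-in-time consolidation mentioned in point 3 above, a consolidated version can be produced by selecting, for each fragment, the version in force on a given date. A minimal sketch under this assumption (all names and data are hypothetical):

  from datetime import date

  # Each fragment id maps to a list of (in_force_from, in_force_to, text) versions.
  fragments = {
      "art1-par1": [(date(2000, 1, 1), date(2005, 12, 31), "old wording"),
                    (date(2006, 1, 1), date.max, "amended wording")],
  }

  def consolidate(fragments, at):
      """Return, for each fragment, the text version in force on the given date."""
      out = {}
      for frag_id, versions in fragments.items():
          for start, end, text in versions:
              if start <= at <= end:
                  out[frag_id] = text
                  break
      return out

  print(consolidate(fragments, date(2010, 6, 1)))  # {'art1-par1': 'amended wording'}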
[13]
Nowadays the achievement of such properties is facilitated by the introduction of Semantic Web technologies, such as XML and RDF/OWL. Lachmayer’s theory can therefore be seen as a precursor of what happened later, namely the segmentation of documents by using legal XML, as well as the qualification of such fragments by using legal ontologies.
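For instance, the qualification of a fragment with a provision class can be expressed as RDF triples attached to the fragment's identifier. The sketch below uses the rdflib Python library with made-up namespaces and URIs; it is not the actual NIR or RIS annotation scheme:

  from rdflib import Graph, Literal, Namespace
  from rdflib.namespace import RDF

  PROV = Namespace("http://example.org/provision-ontology#")  # made-up ontology
  ACT = Namespace("http://example.org/act/2014/7#")           # made-up act URI

  g = Graph()
  fragment = ACT["art5-par1"]                 # identifier of the XML fragment
  g.add((fragment, RDF.type, PROV.Duty))      # middle-level qualification
  g.add((fragment, PROV.bearer, Literal("employer")))
  g.add((fragment, PROV.action, Literal("notify the authority")))

  print(g.serialize(format="turtle"))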
[14]
The set of concepts able to qualify legislative texts can be effectively organized in a specific ontology, reporting taxonomic and other differently qualified relationships. The organization into ontologies of legal categories, used as metadata to qualify legislative fragments, is indicated by Lachmayer as an essential pre-condition for a systematic approach able to improve legislation management. In this respect an essential role is played, according to Lachmayer, by the theory of law, which represents the framework within which legal experts and computer scientists can effectively cooperate.

3.

Advanced applications based on legistic theories ^

[15]
Biagioli’s and Lachmayer’s theories have led to a number of interesting applications in legislative information management.
[16]
For example, advanced applications and services based on Biagioli’s Model of Provisions can be conceived. The description of amendments using the related provision types paves the way for the development of applications for the automatic consolidation of legislative texts [Spinosa, 2009]. A corpus of laws and regulations entirely qualified according to the Provision Model makes it possible to develop advanced search and retrieval services for legislation, able to retrieve not just documents but also the contained norms [Biagioli and Turchi, 2005] [Francesconi, 2012], as well as services for legal assessment and reasoning. Moreover, the Provision Model applied to legislation can effectively support analyses concerning the coherency of the legal system. Similarly, regulatory impact analysis procedures can be implemented, able to assess the impact of new norms on the legal system on the one hand, and on public opinion on the other, making legislation more accessible and understandable, thus stimulating the democratic participation of citizens in the legislative process.
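By way of illustration, once fragments are qualified with provision types and attributes, retrieval can target norms rather than whole documents, e.g. «all Duties whose bearer is the employer». A minimal sketch over such annotations (hypothetical data and names, not the NIR query interface):

  # Provision-level annotations over legislative fragments (hypothetical data).
  annotations = [
      {"fragment": "act1/art5-par1", "type": "Duty",
       "bearer": "employer", "action": "notify the authority"},
      {"fragment": "act1/art7-par2", "type": "Right",
       "bearer": "employee", "action": "access personal data"},
      {"fragment": "act2/art1-par1", "type": "Definition",
       "defined_term": "worker"},
  ]

  def find_provisions(annotations, ptype, **attrs):
      """Return fragments carrying a provision of the given type and attributes."""
      return [a["fragment"] for a in annotations
              if a["type"] == ptype
              and all(a.get(k) == v for k, v in attrs.items())]

  print(find_provisions(annotations, "Duty", bearer="employer"))
  # ['act1/art5-par1']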
[17]
Similarly, Lachmayer’s theory has paved the way for the development of advanced search and retrieval facilities for legislation in the Austrian RIS, based on legal categories organized in ontologies. Such facilities make it possible not only to provide users with a selective modality of access to legislation, but also to visualize laws in an effective and accessible fashion. In this respect Lachmayer, in an article with Harald Hoffmann presented at the first edition of the LOAIT workshop in 2005, talks about «virtualisation» of the law [Lachmayer and Hoffmann, 2005].
[18]
According to [Lachmayer and Hoffmann, 2005], virtualisation is more than visualisation: while «visualisation» is usually a two-dimensional representation, «virtualisation» has three or more dimensions, which, in the case of a legal information system, aim to focus legislative material retrieval facilities on legal users’ information needs. Therefore, according to [Lachmayer and Hoffmann, 2005], virtualisation puts the user in a feedback loop allowing for faster and more precise responses of the information system, thus presenting to the citizen, as the result of a retrieval task, legal knowledge in the specific context of his/her personal circumstances.

4.

Ex-Post vs Ex-Ante approaches to manage semantics in legislation ^

[19]
Probably the main innovative contribution that Biagioli and Lachmayer independently made to legal informatics research in the field of legistics is their original approach to the management of semantics in legislation.
[20]
Lachmayer distinguishes two main approaches to managing legislative documents and their semantics:
  1. the «ex-post» approach, related either to a post-processing procedure which identifies and annotates («document mark-up» in the XML jargon) the existing document structure, usually by natural language processing procedures (a toy illustration follows this list), or to a post-hoc analysis of legislation provided by advanced search and retrieval facilities based on the semantic annotation of legislative text fragments in terms of metadata and metadata values (concepts) defined in an ontology;
  2. the «ex-ante» approach, related to the use of predefined templates to support legislation drafting, so that the document organization (in terms of syntactic and semantic annotation) is given during the drafting activity and as a result of an automated legislative workflow.
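To make the contrast concrete: ex post, the provision type has to be recovered from the finished text (below a deliberately naive pattern-based toy, standing in for real NLP procedures), whereas ex ante the type is already recorded by the drafting template. All patterns and names are hypothetical:

  import re

  # Toy surface patterns for guessing a provision type ex post; real systems
  # rely on much richer natural language processing procedures.
  PATTERNS = [
      (r"\bmeans\b|\bis defined as\b", "Definition"),
      (r"\bshall\b|\bmust\b", "Duty"),
      (r"\bis repealed\b|\bis replaced by\b", "ContentAmendment"),
  ]

  def classify_ex_post(fragment_text):
      """Assign a provision type by matching surface patterns on finished text."""
      for pattern, ptype in PATTERNS:
          if re.search(pattern, fragment_text):
              return ptype
      return "Unknown"

  print(classify_ex_post("The employer shall notify the authority."))  # Duty

  # Ex ante, by contrast, no guessing is needed: the drafting template already
  # records the type, e.g. {"type": "Duty", "text": "..."} supplied by the editor.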
[21]
The «ex-ante» approach presents several benefits, because it is able to improve the quality of legislative texts in terms of both structure and semantics. Moreover, taking care of document structure and semantics during the legislative drafting phase allows us to manage all the formal aspects of creating a legal text, producing the authentic version of both structural and semantic annotations. This approach has actually been implemented in RIS as a result of the Austrian legal XML project called eRecht (eLaw) [Lachmayer and Hoffmann, 2007].
[22]
These concepts were independently conceived by Biagioli, whose theory of a «model-driven» (also called «top-down» or «a priori») management of the structure and semantics of legislative texts, based on the Provision Model, has several points of contact with Lachmayer’s «ex-ante» approach. On the other hand, Biagioli’s theory focuses more on the process of advanced semantic drafting itself, elaborating an original theory which, potentially, is able to renovate the whole legislative workflow, producing higher-quality and more accessible legislative documents.
[23]
According to this «model-driven» approach, the drafter is required to manage semantic objects in terms of provision types, their attributes and domain entities (provision attribute values), to establish the necessary mutual relations, and to organize the semantics according to specific criteria of aggregation. Such criteria can either depend on provision types or derive from their typical structure (provision attributes and provision attribute values). For instance, according to a common criterion, followed at least by the Italian legislative drafter and in European directives, Definitions can be grouped in a single article at the beginning of the text. Another typical criterion is the aggregation of the provisions Duty, Procedure and Derogation related to a specific action [Biagioli et al., 2007].
[24]
At the end of this process, the formal partitions of the act will contain semantically correlated components (provisions). In this way a qualified formal skeleton of the new act can be generated, according to the assumption that a «paragraph», the basic component of the legislative document structure, usually contains one «provision»; this assumption is widely observed by the legislator, and the model-driven drafting approach tends to promote it. Finally, partition wording can be left to the user, or proposals for partition wording can be automatically generated on the basis of the semantics of the provision associated with each partition [Biagioli et al., 2007].
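A minimal sketch of this skeleton generation, under the one-provision-per-paragraph assumption and with a simplified grouping criterion (Definitions first, then the remaining provisions grouped by action); all names and data are hypothetical:

  from itertools import groupby

  # Provision instances produced during conceptual drafting (hypothetical data).
  provisions = [
      {"type": "Definition", "text": "«worker» means ..."},
      {"type": "Duty", "action": "notification", "text": "The employer shall ..."},
      {"type": "Procedure", "action": "notification", "text": "The notice is ..."},
      {"type": "Definition", "text": "«employer» means ..."},
  ]

  def build_skeleton(provisions):
      """Group provisions into articles; each provision becomes one paragraph."""
      definitions = [p for p in provisions if p["type"] == "Definition"]
      others = sorted((p for p in provisions if p["type"] != "Definition"),
                      key=lambda p: p.get("action", ""))
      articles = [("Definitions", definitions)]
      for action, group in groupby(others, key=lambda p: p.get("action", "")):
          articles.append((action.capitalize(), list(group)))
      return articles

  for n, (heading, paragraphs) in enumerate(build_skeleton(provisions), start=1):
      print(f"Article {n} ({heading})")
      for m, p in enumerate(paragraphs, start=1):
          print(f"  Paragraph {m} [{p['type']}]: {p['text']}")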
[25]
This conceptual, model-driven legislative drafting activity [Biagioli et al., 2007] is intended to enhance the quality of texts from both a structural and a linguistic point of view, since the formal structure of a legislative text is obtained as a result of the organization of its semantics. The result is a document structure that fits the defined semantic structure well.
[26]
The differences between the traditional (ex-post) and the model-driven (ex-ante) legislative drafting processes are represented in Fig. 1:
  • in the traditional (ex-post) legislative drafting approach the formal structure of a document may not be the best one to express its semantics, since a semantic annotation is usually carried out at the end of the drafting activity, when legislative text structure and wording are already defined;
  • in the model-driven (ex-ante) legislative drafting approach, the traditional drafting process is inverted: first the semantics of a legislative text is expressed in terms of provision instances, then it is organized (by grouping semantically correlated provisions), and finally the formal structure of the document is created, considering that each single provision is to be represented by a paragraph; partition wording can then be carried out (manually or automatically).

Fig. 1 Traditional vs. Semantic (model-driven) legislative drafting

[27]
As in Lachmayer’s ex-ante documentalistic approach, in Biagioli’s model-driven legislative drafting approach the semantics is defined during the drafting process itself, and it serves as a guide to create the document structure and facilitate document wording, thus standardizing the drafting process. Moreover, since metadata, in terms of provisions, are chosen by the drafter, and are therefore already present at the moment of the law’s signing and promulgation, they are «authentic» metadata, thus aligning this approach with the one promoted by Lachmayer in RIS [Lachmayer, 2005].

5.

Conclusions ^

[28]
In this paper the research activities of Prof. Lachmayer in the field of legistics have been described and compared with the activities carried out in the same field within ITTIG-CNR. This paper shows only a part of the intense research activity of Friedrich Lachmayer, but aims to underline its decisive role in combining legal theory and computer science, as well as how much it has been a source of inspiration for many researchers in the legal informatics domain.

6.

References ^

Searle, John, Speech Acts: An Essay in the Philosophy of Language. Cambridge University Press (1969).

Biagioli, Carlo, «Towards a legal rules functional micro-ontology», in Proceedings of workshop LEGONT ’97 (1997).

Rawls, John, Two concepts of rule, Philosophical Review, vol. 64, pp. 3-31 (1955).

Biagioli, Carlo, Cappelli, Amedeo, Francesconi, Enrico and Turchi, Fabrizio, «Law making environment: perspectives», in Proceedings of the V Legislative XML Workshop, pp. 267–281, European Press Academic Publishing (EPAP) (2007).

Lachmayer, Friedrich and Hoffmann, Harald, «From legal categories towards legal ontologies», in Proceedings of the International Workshop on Legal Ontologies and Artificial Intelligence Techniques, pp. 63–69 (2005).

Lachmayer, Friedrich and Hoffmann, Harald, «Provisions as legislative elements: Annotations to Biagioli’s Theory», in Proceedings of the V Legislative XML Workshop, pp. 137–148, European Press Academic Publishing (EPAP) (2007).

Biagioli, Carlo and Turchi, Fabrizio, «Model and ontology based conceptual searching in legislative XML collections», in Proceedings of the Workshop on Legal Ontologies and Artificial Intelligence Techniques, pp. 83–89 (2005).

Francesconi, Enrico, «Axioms on a Semantic Model for Legislation for Accessing and Reasoning over Normative Provisions», in AI Approaches to the Complexity of Legal Systems. Models and Ethical Challenges for Legal Systems, Legal Language and Legal Ontologies, Argumentation and Software Agents, Lecture Notes in Computer Science, Volume 7639, pp. 147-161, Springer (2012).

Schweighofer, Erich and Liebwald, Doris, «Advanced Lexical Ontologies and Hybrid Knowledge Based Systems: First Steps to a Dynamic Legal Electronic Commentary», in Proceedings of the Workshop on Legal Ontologies and Artificial Intelligence Techniques (2005).

Hoekstra, Rinke, Breuker, Joost, di Bello, Marcello and Boer, Alexander, «LKIF core: Principled ontology development for the legal domain», in Law, Ontologies and the Semantic Web, pp. 21-52 (2009).


 

Enrico Francesconi, Institute of Legal Information Theory and Techniques, National Research Council of Italy.

  1. Rechts-Informations-System.
  2. The Institute of Legal Information Theory and Techniques of the National Research Council of Italy.