Programming Methodologies Laboratory

LPM Research


LISA: Language Implementation System based on Attribute Grammars
Compiler generators that accept a formal language specification as input and automatically generate a complete compiler as output have been around for decades, with widely different capabilities and roots. A compiler can be generated automatically only when a programming language is formally specified using one of several formal language definition methods (e.g., attribute grammars, algebraic specifications, operational semantics, denotational semantics). A formal language specification should be modular, extensible, and reusable. Unfortunately, it is difficult to modularize a programming language definition because language features interact with each other in complex ways. The LISA (Language Implementation System based on Attribute grammars) system has been developed primarily to tackle this problem. LISA specifications are more modular, extensible, and reusable due to the object-oriented and aspect-oriented features incorporated into its attribute grammars. Additionally, other language-based tools (e.g., editors, analyzers, profilers, debuggers) can be synthesized from a formal language definition. Such tools can be generated automatically whenever they can be described by a generic fixed part that traverses the appropriate data structures defined by a specific variable part, which can be systematically derived from the language specification.
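To illustrate the attribute-grammar idea that LISA builds on (this is a hedged sketch in Python, not LISA's own notation), the snippet below evaluates a parse tree for a small expression grammar by computing a synthesized attribute `val` bottom-up, with one semantic rule attached to each production. The grammar, node encoding, and attribute name are assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    prod: str                          # which production built this node
    children: List["Node"] = field(default_factory=list)
    lexval: int = 0                    # token value carried by leaves

def val(n: Node) -> int:
    """Synthesized attribute 'val', computed bottom-up over the tree for
    the toy grammar  E -> E + T | T ;  T -> T * F | F ;  F -> num."""
    if n.prod == "F->num":
        return n.lexval                                   # F.val = num.lexval
    if n.prod == "E->E+T":
        return val(n.children[0]) + val(n.children[1])    # E.val = E1.val + T.val
    if n.prod == "T->T*F":
        return val(n.children[0]) * val(n.children[1])    # T.val = T1.val * F.val
    if n.prod in ("E->T", "T->F"):
        return val(n.children[0])                         # copy rule
    raise ValueError(f"unknown production: {n.prod}")

# Parse tree for  2 + 3 * 4
tree = Node("E->E+T", [
    Node("E->T", [Node("T->F", [Node("F->num", lexval=2)])]),
    Node("T->T*F", [
        Node("T->F", [Node("F->num", lexval=3)]),
        Node("F->num", lexval=4),
    ]),
])
print(val(tree))  # -> 14
```

In a system like LISA the evaluator above would be the generic fixed part, generated once, while the productions and their semantic rules form the specification-specific variable part.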

Grammar-Driven Generation of Domain-Specific Language Testing Tools
Domain-specific languages (DSLs) assist a software developer (or end-user) in writing a program using idioms that are similar to the abstractions found in a specific problem domain. Tool support for DSLs is lacking compared to the capabilities provided for general-purpose languages (GPLs) such as Java and C++. For example, support for testing and debugging a program written in a DSL is often non-existent, which limits an end-user's ability to discover and locate faults in a DSL program. This project proposed a grammar-driven technique for building a tool generation framework from existing DSL grammars.

Grammar Inference for Domain-Specific Languages
In the area of programming languages, context-free grammars (CFGs) are of special importance, since almost all programming languages employ CFGs in their design. Recent approaches to CFG induction are unable to infer context-free grammars for general-purpose programming languages (GPLs). In this project we are investigating several approaches to inferring CFGs for domain-specific languages (DSLs), using genetic programming, heuristic, and incremental approaches.
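A step common to search-based grammar inference (whether driven by genetic programming, heuristics, or incremental refinement) is scoring a candidate grammar against positive and negative sample sentences. The sketch below shows one plausible form of that evaluation, assuming candidate grammars in Chomsky normal form so that membership can be decided with a simple CYK parser; the grammar encoding and the sample DSL sentences are illustrative assumptions, not taken from the project.

```python
def cyk_accepts(grammar, start, s):
    """CYK membership test. 'grammar' maps a nonterminal to a list of
    right-hand sides: (terminal,) or (nonterminal, nonterminal)."""
    n = len(s)
    if n == 0:
        return False
    # table[i][j] = nonterminals deriving the substring s[i : i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(s):
        for A, rhss in grammar.items():
            if (ch,) in rhss:
                table[i][0].add(A)
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            for k in range(1, length):
                for A, rhss in grammar.items():
                    for rhs in rhss:
                        if (len(rhs) == 2 and rhs[0] in table[i][k - 1]
                                and rhs[1] in table[i + k][length - k - 1]):
                            table[i][length - 1].add(A)
    return start in table[0][n - 1]

def fitness(grammar, positives, negatives):
    """Fraction of samples classified correctly -- a typical objective
    for grammar-inference search."""
    ok = sum(cyk_accepts(grammar, "S", p) for p in positives)
    ok += sum(not cyk_accepts(grammar, "S", q) for q in negatives)
    return ok / (len(positives) + len(negatives))

# Candidate grammar (CNF) for the language a^n b^n:  S -> A T | A B ; T -> S B
g = {"S": [("A", "T"), ("A", "B")], "T": [("S", "B")],
     "A": [("a",)], "B": [("b",)]}
print(fitness(g, ["ab", "aabb", "aaabbb"], ["a", "ba", "abb"]))  # -> 1.0
```

A genetic-programming loop would mutate and recombine candidate grammars and use such a fitness score to guide selection toward grammars consistent with all samples.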

MARS: A Metamodel Recovery System Using Grammar Inference
The aim of the MARS system is to infer a metamodel from a collection of instance models. The motivating problem was metamodel drift, which occurs when instance models in a repository are separated from their defining metamodel. In most metamodeling environments, the instance models cannot be loaded properly into the modeling tool without the metamodel. Examples of problems that require recovering or reverse engineering a metamodel include losing a metamodel definition in a hard-drive crash, and encountering versioning conflicts when trying to load instance models (also called domain models) based on obsolete metamodels. From our experience in model-driven engineering, a metamodel often undergoes frequent evolution, which may leave previous instances orphaned from the new definition. The key contribution of this system is the application of grammar inference algorithms to the metamodel recovery problem.

PPCea: A Domain-Specific Language for Evolutionary Algorithms
Programmable Parameter Control for Evolutionary Algorithms (PPCea) is a domain-specific scripting language that addresses the problem of control parameter settings in a programmable fashion. It keeps the evolutionary algorithm itself simple and, through metaprogramming, lifts control parameter settings into a higher abstraction layer.
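The separation PPCea aims at can be sketched as follows (in Python, not PPCea syntax): the evolutionary algorithm stays a plain loop, while parameter control lives in a small user-level "script" run between generations. The OneMax problem, the truncation selection, the rates, and the decay rule are all assumptions made for this illustration.

```python
import random

def one_max(bits):
    """Toy fitness: count of 1-bits."""
    return sum(bits)

def evolve(control, generations=60, pop_size=30, length=40, seed=1):
    """A deliberately simple GA; all parameter control is delegated to
    the 'control' callback, mirroring PPCea's higher abstraction layer."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    params = {"mutation": 0.10}
    for gen in range(generations):
        params = control(gen, params)        # user-programmable parameter control
        parents = sorted(pop, key=one_max, reverse=True)[: pop_size // 2]
        pop = list(parents)
        while len(pop) < pop_size:
            child = list(rng.choice(parents))
            for i in range(length):          # bit-flip mutation at the current rate
                if rng.random() < params["mutation"]:
                    child[i] ^= 1
            pop.append(child)
    return max(one_max(ind) for ind in pop)

# "Script": start exploratory, then decay the mutation rate each generation.
def decay_schedule(gen, params):
    return {"mutation": max(0.01, params["mutation"] * 0.95)}

print(evolve(decay_schedule))
```

Swapping in a different control function (a fixed rate, a fitness-feedback rule, a self-adaptive scheme) changes the parameter-setting strategy without touching the algorithm itself, which is the design point the paragraph above describes.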

Program Comprehension for Domain Specific Languages
Application libraries are the most common implementation approach for solving problems in general-purpose languages. Their competitors are domain-specific languages, which can provide notation close to the problem domain. In this project we carried out a family of experiments investigating the program comprehension of participants solving experimental tasks in both DSL and GPL notation. Our specific aim was to compare end-users' correctness and efficiency (completion time) across three different domains.

Industrial Projects

Research groups

Bioinspired Algorithms (In Slovene)
Institute of Computer Science

International/National Collaborators

Software Composition and Modeling Laboratory, University of Alabama at Birmingham, Birmingham, USA
Language Specification and Processing Group, University of Minho, Braga, Portugal
LIFIA, University of La Plata, La Plata, Argentina
Jan Heering, CWI, Amsterdam, The Netherlands
Tony Sloane, Macquarie University, Sydney, Australia
Bogdan Filipič, Jozef Stefan Institute, Ljubljana, Slovenia