|
Giuseppe Della Penna, Benedetto Intrigila, Enrico Tronci, and Marisa Venturini Zilli. "Synchronized Regular Expressions." Electronic Notes in Theoretical Computer Science 62 (2002): 195–210. Notes: TOSCA 2001, Theory of Concurrency, Higher Order Languages and Types.
Abstract: Text manipulation is one of the most common tasks for everyone using a computer. The increasing amount of textual information in electronic format that every computer user collects every day stresses the need for more powerful tools to interact with texts. Indeed, much work has been done to provide non-programming tools that can be useful for the most common text manipulation issues. Regular Expressions (RE), introduced by Kleene, are well known in formal language theory. REs have received several extensions, depending on the application of interest. In almost all implementations of RE search algorithms (e.g. the egrep [A] UNIX command, or the Perl [17] language pattern matching constructs) we find backreferences (as defined in [1]), i.e. expressions that refer to the string matched by a previous subexpression. Generally speaking, it seems that all kinds of synchronizations between subexpressions in an RE can be very useful when interacting with texts. Therefore, we introduce Synchronized Regular Expressions (SRE) as a derivation of Regular Expressions. We use SRE to present a formal study of the already known backreferences extension and of a new extension proposed by us, which we call synchronized exponents. Moreover, since we are talking about formalisms that should have practical utility and be usable in the real world, we face the problem of how to present SRE to the final users. Therefore, in this paper we also propose a user-friendly syntax for SRE to be used in implementations of SRE-powered search algorithms.
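A minimal illustration of the backreference mechanism the abstract refers to (using Python's re module, not the SRE syntax proposed in the paper): the pattern below requires the second half of the input to repeat exactly the string captured by the first group.

```python
import re

# Generic illustration of a backreference (not the paper's SRE syntax):
# \1 must match exactly the same string captured by the first group, so the
# pattern recognizes a doubled word such as "abcabc".
doubled = re.compile(r"^(\w+)\1$")

print(bool(doubled.match("abcabc")))   # True: "abc" followed by the same "abc"
print(bool(doubled.match("abcabd")))   # False: the second half differs
```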
|
|
|
Riccardo Focardi, Roberto Gorrieri, Ruggero Lanotte, Andrea Maggiolo-Schettini, Fabio Martinelli, Simone Tini, and Enrico Tronci. "Formal Models of Timing Attacks on Web Privacy." Electronic Notes in Theoretical Computer Science 62 (2002): 229–243. Notes: TOSCA 2001, Theory of Concurrency, Higher Order Languages and Types. DOI: 10.1016/S1571-0661(04)00329-9.
Abstract: We model a timing attack on web privacy proposed by Felten and Schneider using three different approaches: HL-Timed Automata, the SMV model checker, and the tSPA process algebra. A comparative analysis of the three approaches is derived.
|
|
|
Enrico Tronci. "Automatic Synthesis of Control Software for an Industrial Automation Control System." In Proc.of: 14th IEEE International Conference on: Automated Software Engineering (ASE), 247–250. Cocoa Beach, Florida, USA, 1999. DOI: 10.1109/ASE.1999.802292.
Abstract: We present a case study on the automatic synthesis of control software from formal specifications for an industrial automation control system. Our aim is to compare the effectiveness (i.e. design effort and controller quality) of automatic controller synthesis from closed loop formal specifications with that of manual controller design followed by automatic verification. Our experimental results show that for industrial automation control systems, automatic synthesis is a viable and profitable alternative (especially as far as design effort is concerned) to manual design followed by automatic verification.
|
|
|
Enrico Tronci. "Automatic Synthesis of Controllers from Formal Specifications." In Proc of 2nd IEEE International Conference on Formal Engineering Methods (ICFEM), 134–143. Brisbane, Queensland, Australia, 1998. DOI: 10.1109/ICFEM.1998.730577.
Abstract: Many safety critical reactive systems are indeed embedded control systems. Usually a control system can be partitioned into two main subsystems: a controller and a plant. Roughly speaking, the controller observes the state of the plant and sends commands (stimuli) to the plant to achieve predefined goals. We show that when the plant can be modeled as a deterministic finite state system (FSS) it is possible to effectively use formal methods to automatically synthesize the program implementing the controller from the plant model and the given formal specifications for the closed loop system (plant + controller). This guarantees that the controller program is correct by construction. To the best of our knowledge there is no previously published effective algorithm to extract executable code for the controller from closed loop formal specifications. We show the practical usefulness of our techniques by giving experimental results on their use to synthesize C programs implementing optimal controllers (OCs) for plants with more than 10⁹ states.
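A rough, hypothetical sketch of the general idea (not the paper's algorithm or case study): given a deterministic finite state plant model and a closed-loop goal, a controller can be read off a backward fixpoint computation over the plant's transition function. The toy plant, state space, and helper names below are illustrative only.

```python
# Minimal sketch (not the paper's algorithm): synthesize a state-feedback
# controller for a deterministic finite state plant via a backward fixpoint.
# delta[(state, action)] -> next state; goal = states the closed loop must
# stay in forever (a simple safety objective).

def synthesize(states, actions, delta, goal):
    """Return {state: action} keeping the plant inside `goal` forever,
    restricted to the states from which that is possible."""
    safe = set(goal)
    while True:
        keep = {s for s in safe
                if any(delta[(s, a)] in safe for a in actions)}
        if keep == safe:
            break
        safe = keep
    # For each controllable state, pick one action that stays in the safe set.
    return {s: next(a for a in actions if delta[(s, a)] in safe) for s in safe}

# Toy plant: states 0..3, actions "u"/"d"; state 3 is forbidden.
states, actions = range(4), ("u", "d")
delta = {(s, "u"): min(s + 1, 3) for s in states}
delta.update({(s, "d"): max(s - 1, 0) for s in states})
controller = synthesize(states, actions, delta, goal={0, 1, 2})
print(controller)   # e.g. {0: 'u', 1: 'u', 2: 'd'}: state 3 is never reached
```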
|
|
|
Enrico Tronci. "Equational Programming in Lambda-Calculus via SL-Systems. Part 1." Theoretical Computer Science 160, no. 1&2 (1996): 145–184. DOI: 10.1016/0304-3975(95)00105-0.
|
|
|
Enrico Tronci. "Equational Programming in Lambda-Calculus via SL-Systems. Part 2." Theoretical Computer Science 160, no. 1&2 (1996): 185–216. DOI: 10.1016/0304-3975(95)00106-9.
|
|
|
Enrico Tronci. "Equational Programming in lambda-calculus." In Sixth Annual IEEE Symposium on Logic in Computer Science (LICS), 191–202. Amsterdam, The Netherlands: IEEE Computer Society, 1991. DOI: 10.1109/LICS.1991.151644.
|
|
|
Andrea Bobbio, Sandro Bologna, Michele Minichino, Ester Ciancamerla, Piero Incalcaterra, Corrado Kropp, and Enrico Tronci. "Advanced techniques for safety analysis applied to the gas turbine control system of the ICARO co-generative plant." In X Convegno Tecnologie e Sistemi Energetici Complessi, 339–350. Genova, Italy, 2001.
Abstract: The paper describes two complementary and integrable approaches, a probabilistic one and a deterministic one, based on classic and advanced modelling techniques for the safety analysis of complex computer based systems. The probabilistic approach is based on classical and innovative probabilistic analysis methods. The deterministic approach is based on formal verification methods. Both approaches are applied to the gas turbine control system of the ICARO co-generative plant, in operation at ENEA CR Casaccia. The main difference between the two approaches, apart from the different underlying theories, is that the probabilistic one addresses the control system by itself, as the set of sensors, processing units and actuators, while the deterministic one also includes the behaviour of the equipment under control that interacts with the control system. The final aim of the research documented in this paper is to explore an innovative method that puts the probabilistic and deterministic approaches into a strong relation, overcoming the drawbacks of their isolated, selective and fragmented use, which can lead to inconsistencies in the evaluation results.
|
|
|
Antonio Bucciarelli and Ivano Salvo. "Totality, Definability and Boolean Circuits." Lecture Notes in Computer Science 1443 (1998): 808–819. Springer. DOI: 10.1007/BFb0055104.
Abstract: In the type frame originating from the flat domain of boolean values, we single out elements which are hereditarily total. We show that these elements can be defined, up to total equivalence, by sequential programs. The elements of an equivalence class of the totality equivalence relation (totality class) can be seen as different algorithms for computing a given set-theoretic boolean function. We show that the bottom element of a totality class, which is sequential, corresponds to the most eager algorithm, and the top to the laziest one. Finally, we suggest a link between the size of totality classes and a well-known measure of complexity of boolean functions, namely their sensitivity.
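For context, sensitivity (the complexity measure mentioned at the end of the abstract) counts, for an input x, how many single-bit flips of x change f(x), and takes the maximum over all inputs. A brute-force sketch, not tied to the paper's constructions:

```python
from itertools import product

def sensitivity(f, n):
    """Sensitivity of an n-input boolean function f: the maximum, over all
    inputs x, of the number of single-bit flips of x that change f(x)."""
    best = 0
    for x in product((0, 1), repeat=n):
        flips = sum(
            f(*x) != f(*(x[:i] + (1 - x[i],) + x[i + 1:]))
            for i in range(n)
        )
        best = max(best, flips)
    return best

# XOR on 3 bits: flipping any bit always changes the output, so sensitivity is 3.
print(sensitivity(lambda a, b, c: a ^ b ^ c, 3))   # 3
```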
|
|
|
Antonio Bucciarelli, Silvia de Lorenzis, Adolfo Piperno, and Ivano Salvo. "Some Computational Properties of Intersection Types (Extended Abstract)." In Proceedings of the 14th Annual IEEE Symposium on Logic in Computer Science (LICS), 109–118. IEEE Computer Society, 1999. DOI: 10.1109/LICS.1999.782598.
Abstract: This paper presents a new method for comparing computational properties of λ-terms typeable with intersection types with respect to terms typeable with Curry types. In particular, strong normalization and λ-definability are investigated. A translation is introduced from intersection typing derivations to Curry typeable terms; the main feature of the proposed technique is that the translation is preserved by β-reduction. This makes it possible to simulate a computation starting from a term typeable in the intersection discipline by means of a computation starting from a simply typeable term. Our approach naturally leads to a proof of strong normalization in the intersection system by means of purely syntactical techniques. In addition, the presented method enables us to give a proof of a conjecture proposed by Leivant in 1990, namely that all functions uniformly definable using intersection types are already definable using Curry types.
Keywords: lambda calculus, Curry types, intersection types, lambda-definability, lambda-terms, strong normalization
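A standard textbook example (not taken from the paper) of the gap between the two disciplines the abstract compares: self-application is typeable with an intersection type but has no Curry type.

```latex
% Standard example (not from the paper): self-application is typeable with
% intersection types,
\lambda x.\, x\, x \;:\; ((\alpha \to \beta) \wedge \alpha) \to \beta
% whereas in Curry's system the single assumption on x cannot be both
% \sigma \to \tau and \sigma, so \lambda x.\, x\, x has no simple type.
```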
|
|