|
Benedetto Intrigila, Ivano Salvo, and Stefano Sorgi. "A characterization of weakly Church-Rosser abstract reduction systems that are not Church-Rosser." Information and Computation 171, no. 2 (2001): 137–155. Academic Press, Inc. ISSN: 0890-5401. DOI: 10.1006/inco.2001.2945.
Abstract: Basic properties of rewriting systems can be stated in the framework of abstract reduction systems (ARS). Properties like confluence (or Church-Rosser, CR) and weak confluence (or weak Church-Rosser, WCR) and their relationships can be studied in this setting: as a matter of fact, well-known counterexamples to the implication WCR ⇒ CR have been formulated as ARS. In this paper, starting from the observation that such counterexamples are structurally similar, we set out a graph-theoretic characterization of WCR ARS that are not CR in terms of a suitable class of reduction graphs, such that at least one element of this class can be embedded in every WCR not CR ARS. Moreover, we give a tighter characterization for a restricted class of ARS enjoying a suitable regularity condition. Finally, as a consequence of our approach, we prove some interesting results about ARS using the mathematical tools developed. In particular, we prove an extension of Newman's lemma and we find conditions that, once assumed together with the WCR property, ensure the unique normal form property. The Appendix treats two interesting examples, both generated by graph-rewriting rules, with specific combinatorial properties.
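To make the structural similarity of such counterexamples concrete, the classical WCR-but-not-CR ARS has objects {a, b, c, d} and reductions a → b, a → c, c → a, c → d, where b and d are distinct normal forms. The following Python sketch (illustrative only, with a representation of our own choosing; it is not code from the paper) checks both properties on this ARS by exhaustive search.

    # The classical WCR-but-not-CR counterexample: b <- a <-> c -> d,
    # with b and d distinct normal forms.  Representation is illustrative.
    STEP = {"a": {"b", "c"}, "b": set(), "c": {"a", "d"}, "d": set()}

    def reducts(x):
        """All objects reachable from x (reflexive-transitive closure)."""
        seen, stack = {x}, [x]
        while stack:
            y = stack.pop()
            for z in STEP[y] - seen:
                seen.add(z)
                stack.append(z)
        return seen

    def joinable(x, y):
        return bool(reducts(x) & reducts(y))

    # WCR: every one-step peak y <- x -> z is joinable.
    wcr = all(joinable(y, z) for x in STEP for y in STEP[x] for z in STEP[x])
    # CR: every many-step peak y <-* x ->* z is joinable.
    cr = all(joinable(y, z) for x in STEP for y in reducts(x) for z in reducts(x))

    print(wcr, cr)  # prints: True False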
|
|
|
Antonio Bucciarelli, Adolfo Piperno, and Ivano Salvo. "Intersection types and λ-definability." Mathematical Structures in Computer Science 13, no. 1 (2003): 15–53. Cambridge University Press. ISSN: 0960-1295. DOI: 10.1017/S0960129502003833.
Abstract: This paper presents a novel method for comparing computational properties of λ-terms that are typeable with intersection types, with respect to terms that are typeable with Curry types. We introduce a translation from intersection typing derivations to Curry typeable terms that is preserved by β-reduction: this allows the simulation of a computation starting from a term typeable in the intersection discipline by means of a computation starting from a simply typeable term. Our approach proves strong normalisation for the intersection system naturally by means of purely syntactical techniques. The paper extends the results presented in Bucciarelli et al. (1999) to the whole intersection type system of Barendregt, Coppo and Dezani, thus providing a complete proof of the conjecture, proposed in Leivant (1990), that all functions uniformly definable using intersection types are already definable using Curry types.
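A standard example of the gap between the two disciplines (ours, for illustration; it is not taken from the paper): the self-application λx.xx is not typeable with Curry types, since x would need both an arrow type and the corresponding argument type, whereas in the intersection discipline x may be assumed to have both at once, giving

    λx.xx : ((σ→τ) ∧ σ) → τ.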
|
|
|
Antonio Bucciarelli and Ivano Salvo. "Totality, Definability and Boolean Circuits." Lecture Notes in Computer Science 1443 (1998): 808–819. Springer. DOI: 10.1007/BFb0055104.
Abstract: In the type frame originating from the flat domain of boolean values, we single out elements which are hereditarily total. We show that these elements can be defined, up to total equivalence, by sequential programs. The elements of an equivalence class of the totality equivalence relation (totality class) can be seen as different algorithms for computing a given set-theoretic boolean function. We show that the bottom element of a totality class, which is sequential, corresponds to the most eager algorithm, and the top to the laziest one. Finally we suggest a link between size of totality classes and a well known measure of complexity of boolean functions, namely their sensitivity.
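For reference, the sensitivity mentioned in the last sentence is a standard measure: the sensitivity of a boolean function f at an input x is the number of coordinates whose flip changes f(x), and the sensitivity of f is the maximum over all inputs. The Python sketch below (illustrative only, not code from the paper) computes it by brute force.

    from itertools import product

    def sensitivity(f, n):
        """Brute-force sensitivity of a boolean function f on n-bit tuples."""
        best = 0
        for x in product((0, 1), repeat=n):
            flips = sum(f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:])
                        for i in range(n))
            best = max(best, flips)
        return best

    # Parity is fully sensitive: flipping any bit changes its value.
    print(sensitivity(lambda x: x[0] ^ x[1] ^ x[2], 3))  # prints: 3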
|
|
|
Antonio Bucciarelli, Silvia de Lorenzis, Adolfo Piperno, and Ivano Salvo. "Some Computational Properties of Intersection Types (Extended Abstract)." In Proceedings of the 14th Annual IEEE Symposium on Logic in Computer Science (LICS '99), 109–118. IEEE Computer Society, 1999. DOI: 10.1109/LICS.1999.782598.
Abstract: This paper presents a new method for comparing computational properties of λ-terms typeable with intersection types with respect to terms typeable with Curry types. In particular, strong normalization and λ-definability are investigated. A translation is introduced from intersection typing derivations to Curry typeable terms; the main feature of the proposed technique is that the translation is preserved by β-reduction. This makes it possible to simulate a computation starting from a term typeable in the intersection discipline by means of a computation starting from a simply typeable term. Our approach naturally leads to a proof of strong normalization in the intersection system by means of purely syntactical techniques. In addition, the presented method enables us to give a proof of a conjecture proposed by Leivant in 1990, namely that all functions uniformly definable using intersection types are already definable using Curry types.
Keywords: lambda calculus, Curry types, intersection types, lambda-definability, lambda-terms, strong normalization
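A standard example of what the strong normalization result has to cover (illustrative, not taken from the paper): the term (λx.xx)(λy.y) is not typeable with Curry types, since its subterm λx.xx is not, yet it is typeable with intersection types, e.g.

    (λx.xx)(λy.y) : ρ→ρ,   assigning λy.y the type ((ρ→ρ)→(ρ→ρ)) ∧ (ρ→ρ),

and it is strongly normalizing, β-reducing in two steps to λy.y.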
|
|
|
Viviana Bono and Ivano Salvo. "A CuCh Interpretation of an Object-Oriented Language." Electronic Notes in Theoretical Computer Science 50, no. 2 (2001): 159–177. Elsevier. Notes: BOTH 2001, Böhm's theorem: applications to Computer Science Theory (Satellite Workshop of ICALP 2001). DOI: 10.1016/S1571-0661(04)00171-9.
Abstract: The CuCh machine extends pure lambda-calculus with algebraic data types and provides the possibility of defining functions over the disjoint sum of algebras. We exploit such a natural form of overloading to define a functional interpretation of a simple but significant fragment of a typical object-oriented language.
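As a rough illustration of how overloading over a disjoint sum supports an object-oriented reading (a Python sketch of our own with invented class names, not the CuCh interpretation itself): a single operation can be defined by cases over two constructor algebras, playing the role of a method shared by two classes.

    from dataclasses import dataclass

    @dataclass
    class Point:            # one summand of the disjoint sum
        x: float
        y: float

    @dataclass
    class ColoredPoint:     # another summand
        x: float
        y: float
        colour: str

    def show(p):
        """One 'method', defined by cases over the disjoint sum."""
        if isinstance(p, ColoredPoint):
            return f"({p.x}, {p.y}) in {p.colour}"
        if isinstance(p, Point):
            return f"({p.x}, {p.y})"
        raise TypeError("value outside the sum")

    print(show(Point(0.0, 1.0)), show(ColoredPoint(2.0, 3.0, "red")))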
|
|
|
Franco Barbanera, Mariangiola Dezani-Ciancaglini, Ivano Salvo, and Vladimiro Sassone. "A Type Inference Algorithm for Secure Ambients." Electronic Notes in Theoretical Computer Science 62 (2002): 83–101. Elsevier. Notes: TOSCA 2001, Theory of Concurrency, Higher Order Languages and Types. DOI: 10.1016/S1571-0661(04)00321-4.
Abstract: We consider a type discipline for the Ambient Calculus that associates ambients with security levels and constrains them to be traversed by or opened in ambients of higher security clearance only. We present a bottom-up algorithm that, given an untyped process P, computes a minimal set of constraints on security levels such that all actions during runs of P are performed without violating the security level priorities. Such an algorithm appears to be a prerequisite for using type systems to ensure security properties in the web scenario.
|
|
|
Mario Coppo, Mariangiola Dezani-Ciancaglini, Elio Giovannetti, and Ivano Salvo. "Mobility Types for Mobile Processes in Mobile Ambients." Electronic Notes in Theoretical Computer Science 78 (2003). Elsevier. DOI: 10.1016/S1571-0661(04)81011-9.
Abstract: We present an ambient-like calculus in which the open capability is dropped, and a new form of “lightweight” process mobility is introduced. The calculus comes equipped with a type system that allows the kind of values exchanged in communications and the access and mobility properties of processes to be controlled. A type inference procedure determines the “minimal” requirements to accept a system or a component as well typed. This gives a kind of principal typing. As an expressiveness test, we show that some well known calculi of concurrency and mobility can be encoded in our calculus in a natural way.
|
|
|
Benedetto Intrigila, Daniele Magazzeni, Igor Melatti, and Enrico Tronci. "A Model Checking Technique for the Verification of Fuzzy Control Systems." In CIMCA '05: Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce Vol-1 (CIMCA-IAWTIC'06), 536–542. Washington, DC, USA: IEEE Computer Society, 2005. ISBN: 0-7695-2504-0. DOI: 10.1109/CIMCA.2005.1631319.
Abstract: Fuzzy control is well known as a powerful technique for designing and realizing control systems. However, statistical evidence for the correct behavior of such systems may not be enough, even when it is based on a large number of samplings. In order to provide a more systematic verification process, the cell-to-cell mapping technology has been used in a number of cases as a verification tool for fuzzy control systems and, more recently, to assess their optimality and robustness. However, cell-to-cell mapping is typically limited in the number of cells it can explore. To overcome this limitation, in this paper we show how model checking techniques may instead be used to verify the correct behavior of a fuzzy control system. To this end, we use a modified version of the Murphi verifier, which eases the modeling phase by allowing the use of finite-precision real numbers and external C functions. In this way, already existing simulators may also be used in the verification phase. With respect to the cell mapping technique, our approach appears to be complementary; indeed, it explores a much larger number of states, at the cost of being less informative about the global dynamics of the system.
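Independently of the specific tool, the verification step amounts to an explicit-state reachability analysis of a finite discretization of the closed-loop system. The Python sketch below is a toy illustration of that idea with an invented two-variable system and safety condition; it is not the modified Murphi verifier used in the paper.

    from collections import deque

    def step(s):                          # one control cycle of a toy closed loop
        x, v = s
        u = -1 if x > 0 else 1            # an invented bang-bang "controller"
        return [(x + v, max(-3, min(3, v + u)))]

    def safe(s):                          # the safety condition to verify
        return -10 <= s[0] <= 10

    def check(initial_states):
        """BFS over reachable states; returns an unsafe state or None."""
        seen = set(initial_states)
        frontier = deque(initial_states)
        while frontier:
            s = frontier.popleft()
            if not safe(s):
                return s
            for t in step(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return None

    print(check([(0, 0), (2, -1)]))       # None: no reachable unsafe state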
|
|
|
Francesco Brizzolari, Igor Melatti, Enrico Tronci, and Giuseppe Della Penna. "Disk Based Software Verification via Bounded Model Checking." In APSEC '07: Proceedings of the 14th Asia-Pacific Software Engineering Conference, 358–365. Washington, DC, USA: IEEE Computer Society, 2007. ISBN: 0-7695-3057-5. DOI: 10.1109/APSEC.2007.43.
Abstract: One of the most successful approaches to automatic software verification is SAT-based bounded model checking (BMC). One of the main factors limiting the size of programs that can be automatically verified via BMC is the huge number of clauses that the backend SAT solver has to process. In fact, because of this, the SAT solver may easily run out of RAM. We present two disk-based algorithms that can considerably decrease the number of clauses that a BMC backend SAT solver has to process in RAM. Our experimental results show that using our disk-based algorithms we can automatically verify programs that are out of reach for RAM-based BMC.
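For context (a toy illustration of why BMC clause sets grow, not a sketch of the paper's disk-based algorithms): bounded model checking unrolls the transition relation up to a bound k, so the CNF handed to the SAT solver grows with k; for realistic programs each step contributes very many clauses, which is what exhausts RAM.

    def unroll(k):
        """Toy BMC instance: one bit complemented at each step, initially 0;
        the negated property asserts the bit is 1 at step k.  Illustrative only."""
        var = lambda t: t + 1                         # DIMACS variable for step t
        clauses = [[-var(0)]]                         # initial condition
        for t in range(k):                            # x_{t+1} = not x_t
            clauses += [[var(t), var(t + 1)], [-var(t), -var(t + 1)]]
        clauses.append([var(k)])                      # negated safety property
        return clauses

    def dimacs(clauses, nvars):
        head = f"p cnf {nvars} {len(clauses)}"
        return "\n".join([head] + [" ".join(map(str, c)) + " 0" for c in clauses])

    print(dimacs(unroll(1), 2))                       # a complete small instance
    for k in (10, 100, 1000):
        print(k, "steps ->", len(unroll(k)), "clauses")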
|
|
|
Giuseppe Della Penna, Antinisca Di Marco, Benedetto Intrigila, Igor Melatti, and Alfonso Pierantonio. "Interoperability mapping from XML schemas to ER diagrams." Data & Knowledge Engineering 59, no. 1 (2006): 166–188. Elsevier Science Publishers B.V. ISSN: 0169-023X. DOI: 10.1016/j.datak.2005.08.002.
Abstract: The eXtensible Markup Language (XML) is a de facto standard on the Internet and is now being used to exchange a variety of data structures. This leads to the problem of efficiently storing, querying and retrieving the large amounts of data contained in XML documents. Unfortunately, XML data often need to coexist with historical data. At present, the best solution for storing XML into pre-existing data structures is to extract the information from the XML documents and adapt it to the data structures’ logical model (e.g., the relational model of a DBMS). In this paper, we introduce a technique called Xere (XML entity–relationship exchange) to assist the integration of XML data with other data sources. To this aim, we present an algorithm that maps XML schemas into entity–relationship diagrams, discuss its soundness and completeness and show its implementation in XSLT.
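A much-simplified picture of such a mapping (our illustrative Python sketch with an invented toy schema; the actual Xere mapping is defined and implemented in XSLT in the paper): elements with complex content become entities, their XML attributes become ER attributes, and element nesting becomes a relationship.

    import xml.etree.ElementTree as ET

    XS = "{http://www.w3.org/2001/XMLSchema}"

    SCHEMA = """
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="order">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="item" maxOccurs="unbounded">
              <xs:complexType>
                <xs:attribute name="sku" type="xs:string"/>
              </xs:complexType>
            </xs:element>
          </xs:sequence>
          <xs:attribute name="id" type="xs:string"/>
        </xs:complexType>
      </xs:element>
    </xs:schema>
    """

    def to_er(element, parent, entities, relations):
        """Map one xs:element (assumed to have complex content) to an entity."""
        name = element.get("name")
        ctype = element.find(XS + "complexType")
        entities[name] = [a.get("name") for a in ctype.findall(XS + "attribute")]
        if parent is not None:
            relations.append((parent, "contains", name))
        for child in ctype.findall(f"{XS}sequence/{XS}element"):
            to_er(child, name, entities, relations)

    entities, relations = {}, []
    for top in ET.fromstring(SCHEMA).findall(XS + "element"):
        to_er(top, None, entities, relations)
    print(entities)    # {'order': ['id'], 'item': ['sku']}
    print(relations)   # [('order', 'contains', 'item')]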
|
|