ICT for EU-India Cross Cultural Dissemination Project
MEC

WWV'05

1st Int'l Workshop on Automated Specification and Verification of Web Sites
Valencia, Spain, March 14-15, 2005

LIST OF ACCEPTED PAPERS

Regular Contribution

Gianluca Amato, Massimo Coppola, Stefania Gnesi, Francesca Scozzari and Laura Semini
Modeling Web Applications by the Multiple Levels of Integrity Policy
We propose a formal method to validate the reliability of a web application by modeling the interactions among its constituent objects. Modeling exploits the recent "Multiple Levels of Integrity" mechanism, which allows objects with dynamically changing reliability to cooperate within the application. The novelty of the method is the ability to describe systems where objects can modify their own integrity level and react to such changes in other objects. The model is formalized with a process algebra; properties are expressed using the ACTL temporal logic and can be verified by means of a model checker. Any instance of the above model inherits both the established properties and the proof techniques. To substantiate our proposal we consider several case studies of web applications, showing how to express specific useful properties and their validation schemata. Examples range from on-line travel agencies and inverted Turing tests to detect malicious web-bots, to content cross-validation in peer-to-peer systems.

Gregorio Díaz Descalzo, Juan José Pardo Mateo, M. Emilia Cambronero Piqueras, Valentín Valero Ruiz and Fernando Cuartero Gómez.
Verification of Web Services with Timed Automata
In this paper we show how formal methods can be used to describe and analyze the behavior of Web Services, and more specifically those including time restrictions. Our starting point is Web Service descriptions written in WSCI - WSCDL (XML-based description languages). These descriptions are translated into timed automata, and we then use a well-known tool that supports this formalism (UPPAAL) to simulate and analyze the system behavior. As an illustration we consider a particular case study, a travel reservation system.

Bo Hu, Florian Lauck and Jan Scheffczyk.
How Recent is a Web Document?
One of the most important aspects of a Web document is its up-to-dateness, or recency. Up-to-dateness is particularly relevant to Web documents because they usually contain content originating from different sources and refreshed at different dates. Whether a Web document is relevant for a reader depends on the history of its contents and on so-called external factors, i.e., the up-to-dateness of semantically related documents. In this paper, we approach the automatic management of the up-to-dateness of Web documents that are managed by an XML-centric Web content management system. First, the freshness of a single document is computed, taking into account its change history; a document metric estimates the distance between different versions of a document. Second, the up-to-dateness of a document is determined based on its own history and the historical evolution of semantically related documents.

Javier Jesús Gutiérrez Rodríguez, María José Escalona Cuaresma, Manuel Mejía Risoto and Jesús Torres Valderrama.
Testing web applications in practice
The software testing process is gaining importance as the size and complexity of software grow. The specific characteristics of web applications, such as client-server architecture, heterogeneous languages and technologies, and massive concurrent access, make it hard to adapt classic software testing practices and strategies to web applications. This work presents an overview of testing processes and techniques applied to web development. It also describes a practical example of testing a simple web application using only open-source tools.

Martin Karusseit and Tiziana Margaria.
Feature-based Modelling of a Complex, Online-Reconfigurable Decision Support Service
In this paper, we show how the concepts of components, features and services are used today in the Online Conference System (OCS) in order to marry the modelling of functionally complex, online-reconfigurable internet services at the application level with the needs of a model-driven development amenable to analysis and verification of the models. Characteristic of the approach is the coarse-grained modelling and design of features and services, which guarantees the scalability needed to capture large, complex systems. The interplay of the different features and components is realized via a coordination-based approach, an easily understandable modelling paradigm for system-wide business processes, and thus adequate for the needs of industrial application developers.

Claude Kirchner, Helene Kirchner and Anderson Santana.
Anchoring modularity in HTML
Modularity is a key feature at design, programming, proving, testing, and maintenance time, as well as a must for reusability. Most languages and systems provide built-in facilities for encapsulation, importation or parameterization. Nevertheless, there also exist languages, like HTML, with poor support for modularization. A natural idea is therefore to provide generic modularization primitives.
To extend an existing language with additional and possibly formal capabilities, the notions of anchorage and Formal Island have been introduced recently. TOM, for example, provides generic matching, rewriting and strategy extensions to JAVA and C.
In this paper, we show, using HTML as an example, how to add modular features by anchoring modularization primitives in HTML. This allows one to write modular HTML descriptions, thereby facilitating their design, reusability, and maintenance, as well as providing an important step towards HTML validity checking.

Luigi Liquori, Furio Honsell and Rekha Redamalla.
A Language for Verification and Manipulation of Web Documents
In this paper we develop the language theory underpinning the logical framework PLF. This language features lambda abstraction with patterns and application via pattern matching. Reductions are allowed in patterns. The framework is particularly suited as a metalanguage for encoding rewriting logics and logical systems where rules require proof terms to have special syntactic constraints, e.g. the call-by-value lambda-calculus, or term rewriting systems. PLF is a conservative extension of the well-known Edinburgh Logical Framework LF. Because of its sophisticated pattern-matching facilities, PLF is suitable for the verification and manipulation of XML documents.

Vicente Luque-Centeno, Carlos Delgado-Kloos, Jesús Arias-Fisteus and Luis Álvarez-Álvarez.
Web Accessibility Evaluation Tools: a survey and some improvements
The Web Content Accessibility Guidelines (WCAG) from the W3C consist of a set of 65 checkpoints or specifications that Web pages should satisfy in order to be accessible to people with disabilities or those using alternative browsers. Many of these 65 checkpoints can only be checked by a human operator, implying a very high cost for full evaluation. However, some checkpoints can be evaluated automatically, thus spotting accessibility barriers in a very effective manner. Well-known tools like Bobby, Tawdis or WebXACT evaluate Web accessibility by using a mixture of automated, manual and semi-automated evaluations. However, the degree of automation of these Web evaluations is not the same for each of these tools. Since the WCAG are not written in a formalized manner, these evaluators may have different "interpretations" of what the rules mean. As a result, different evaluation results might be obtained for a single page depending on which evaluation tool is being used.
Taking into account that fully automated Web accessibility evaluation is not feasible because of the nature of the WCAG, this paper evaluates the WCAG coverage of some well-known Web accessibility evaluation tools, spotting their weaknesses and providing a formalized specification for those checkpoints which have had the least automation coverage in today's tools.

Frederic Rioux and Patrice Chalin.
Improving the Quality of Web-based Enterprise Applications with Extended Static Checking: A Case Study
ESC/Java2 is a tool that statically detects errors in Java programs and that uses the Java Modeling Language (JML) as its annotation language. ESC/Java2 can modularly reason about the code of a Java Web-based Enterprise Application (WEA) and uncover potential errors. In this paper, we assess the effectiveness of ESC/Java2 at helping developers increase WEA quality by detecting design and implementation issues.

Roger G Stone
Validating Scripted Web-Pages
The validation of XML documents against a DTD is well understood, and tools exist to accomplish this task. But the problem considered here is the validation of a generator of XML documents. The desired outcome is to establish, for a particular generator, that it is incapable of producing invalid output. Many (X)HTML web pages are generated from a document containing embedded scripts written in languages such as PHP. Existing tools can validate any particular instance of the XHTML generated from the document. However, there is no tool for validating the document itself, guaranteeing that all instances that might be generated are valid.
A prototype validation tool for scripted documents has been developed which uses a notation designed to capture the generalised output of the document, together with a systematically augmented DTD.

Position paper / Overview of more extensive work

Shadi Abou-Zahra.
Automated Web Site Accessibility Evaluation
The Evaluation and Report Language (EARL) is an RDF Schema which allows evaluation tools to express test results in a machine-readable, platform-independent, and vendor-neutral format. Describing test results (such as results from automated Web accessibility checks) using technologies provided by the Semantic Web framework facilitates the aggregation of test results with readily available parsers, libraries, and resources. Furthermore, Semantic Web technologies allow authoring tools (such as editors, content management systems, or save-as utilities), Web browsers, search engines, or other types of applications to elegantly process these test results and provide more advanced features to end users.

Temur Kutsia
Context Sequence Matching for XML
Context and sequence variables allow matching to explore term-trees both in depth and in breadth. This makes context sequence matching a suitable computational mechanism for a rule-based language to query and transform XML, or to specify and verify web sites. Such a language would have the advantages of both path-based and pattern-based languages. We develop a context sequence matching algorithm and its extension for regular expression matching, and prove their soundness, termination and completeness properties.

System demonstration / Report of practical experiences / Work in progress

David Crocker and John H. Warren.
Generating commercial web applications from precise requirements and formal specifications
We present a new model-based approach that we are using to build commercial web-based applications. The user requirements together with a data model are formally specified in a graphical notation using the CREATIV toolset. The specification may be checked by animation before being automatically translated to Perfect notation. The Perfect Developer toolset uses automated reasoning to generate formal proofs of correctness. It then generates C++ or Java code which, in conjunction with an application framework also written in Perfect, forms the complete application including the HTML user interface. The whole process provides a rapid turnaround from new requirements to a formally-verified application.

Vicent Estruch, Cèsar Ferri, Jose Hernández-Orallo and M. José Ramírez-Quintana.
Web Categorisation Using Distance-Based Decision Trees
In Web classification, web pages are assigned to pre-defined categories, mainly according to their content (content mining). However, the structure of the web site might provide extra information about their category (structure mining). Traditionally, the two approaches have been applied separately, or are dealt with using techniques that do not generate a model, such as Bayesian techniques. Unfortunately, in some classification contexts, a comprehensible model becomes crucial. Thus, it would be interesting to apply rule-based techniques (rule learning, decision tree learning) to the web categorisation task. In this paper we outline how our general-purpose learning algorithm, the so-called distance-based decision tree (DBDT) learning algorithm, could be used in web categorisation scenarios. This algorithm differs from traditional ones in that the splitting criterion is defined by means of metric conditions ("is nearer than"). This change allows decision trees to handle structured attributes (lists, graphs, sets, etc.) along with the well-known nominal and numerical attributes. Generally speaking, these structured attributes are employed to represent the content and the structure of the web site.

Demis Ballis and Javier García-Vivó.
A Rule-based System for Web site Verification
In this paper, we describe a system, written in Haskell, for the automated verification of Web sites. It can be used to specify (partial) correctness and completeness properties of a given Web site, and then to automatically check whether these properties are actually fulfilled. It provides a rule-based, formal specification language which allows us to define syntactic/semantic conditions for the Web site by means of a user-friendly graphical interface, as well as a verification facility for recognizing forbidden/incorrect patterns and incomplete/missing Web pages.

Salvador Lucas
Rewriting-based navigation of Web sites
In this paper we sketch the use of term rewriting techniques for modeling the dynamic behavior of Web sites.

Josep Silva.
Slicing XML Documents
Program slicing is a well-known technique to extract the program statements that (potentially) affect the values computed at some point of interest. In this work, we introduce a novel slicing method for XML documents. Essentially, given an XML document (which is valid w.r.t. some DTD), we produce a new XML document (a slice) that contains the relevant information in the original XML document according to some criterion. Furthermore, we also output a new DTD such that the computed slice is valid w.r.t. this DTD. A prototype implementation of the XML slicer has been undertaken.

Marco Winckler, Eric Barboni, Christelle Farenc and Philippe Palanque.
What Kind of Verification of Formal Navigation Modelling for Reliable and Usable Web Applications?
In this paper we briefly introduce a notation dedicated to modelling the navigation of Web applications, and some strategies that we plan to employ to assess models built with such a notation. Our aim with this kind of evaluation is to ensure (prior to implementation) that important user tasks can (or cannot) be performed with the system under construction.


Santiago Escobar
Last modified: Tue Mar 8 11:46:08 CET 2005