TITLE

Model Checking Timed and Stochastic Properties with CSLTA

AUTHOR(S)
Donatelli, Susanna; Haddad, Serge; Sproston, Jeremy
PUB. DATE
March 2009
SOURCE
IEEE Transactions on Software Engineering;Mar/Apr2009, Vol. 35 Issue 2, p224
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
Markov chains are a well-known class of stochastic processes that offers a balance between adequately modeling a system's behavior and keeping the cost of the model solution affordable. Systems can be modeled directly as Markov chains, or with a higher-level formalism for which Markov chains represent the underlying semantics. Markov chains are widely used to study the performance of computer and telecommunication systems. The definition of stochastic temporal logics such as Continuous Stochastic Logic (CSL) and its variant asCSL, together with their model-checking algorithms, allows a unified approach to the verification of systems, mixing performance evaluation and probabilistic verification. In this paper, we present the stochastic logic CSLTA, which is more expressive than CSL and asCSL, and in which properties are specified using automata (more precisely, timed automata with a single clock). The gain in expressiveness allows the specification of properties referring to the probability of a finite sequence of timed events. A typical example is the responsiveness property "with probability at least 0.75, a message sent at time 0 by system A will be received before time 5 by system B, and the acknowledgment will be back at A before time 7," a property that cannot be expressed in either CSL or asCSL. Furthermore, the choice of automata rather than the classical temporal operators Next and Until should help make model checking accessible to a wider audience. We also present a model-checking algorithm for CSLTA.
ACCESSION #
38598822
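The responsiveness property quoted in the abstract is specified in CSLTA via a timed automaton with a single clock. The following is a minimal illustrative sketch (not the paper's formal semantics, which attaches probabilities over CTMC paths): a one-clock deterministic automaton whose guards read the clock, accepting timed words matching "send at time 0, receive before time 5, acknowledge before time 7". State names, event labels, and the guard bounds are assumptions taken from the example property.

```python
# Sketch of a single-clock timed automaton for the responsiveness
# pattern in the abstract; structure and names are illustrative.

def make_dta():
    # state -> {event: (guard, next_state)}; each guard is a
    # predicate on the clock value x (never reset here, so x is
    # the absolute elapsed time).
    return {
        "q0": {"send": (lambda x: x == 0, "q1")},
        "q1": {"recv": (lambda x: x < 5, "q2")},
        "q2": {"ack":  (lambda x: x < 7, "accept")},
    }

def accepts(timed_word):
    """Run the automaton over a timed word [(event, timestamp), ...]
    and report whether it reaches the accepting state."""
    dta, state = make_dta(), "q0"
    for event, t in timed_word:
        guard, nxt = dta.get(state, {}).get(event, (None, None))
        if guard is None or not guard(t):
            return False  # no matching edge, or guard violated
        state = nxt
    return state == "accept"
```

In CSLTA proper, model checking computes the probability that the random timed paths of the Markov chain are accepted by such an automaton and compares it against the bound (here, 0.75); this sketch only shows acceptance of a single timed word, e.g. `accepts([("send", 0), ("recv", 3.2), ("ack", 6.5)])` holds, while a receive at time 5.5 is rejected.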


Related Articles

  • An EM Algorithm for Batch Markovian Arrival Processes and its Comparison to a Simpler Estimation Procedure. Breuer, Lothar // Annals of Operations Research;Apr2002, Vol. 112 Issue 1-4, p123 

    Although the concept of Batch Markovian Arrival Processes (BMAPs) has gained widespread use in stochastic modelling of communication systems and other application areas, there are few statistical methods of parameter estimation proposed yet. However, in order to practically use BMAPs for...

  • The critical discount factor for finite Markovian decision processes with an absorbing set. Hinderer, K.; Waldmann, K.-H. // Mathematical Methods of Operations Research;2003, Vol. 57 Issue 1, p1 

    This paper deals with a Markovian decision process with an absorbing set J₀. We are interested in the largest number β* ≥ 1, called the critical discount factor, such that for all discount factors β smaller than β* the limit V of the N-stage value function V...

  • Asymptotic expansions in limit theorems for stochastic processes. I. Wentzell, A. D. // Probability Theory & Related Fields;1996, Vol. 106 Issue 3, p331 

    Summary. For some families of locally infinitely divisible Markov processes η^ε(t), 0 ≤ t ≤ T, with frequent small jumps, limit theorems for expectations of functionals F(η^ε[0,T]) are proved of the form |E^ε F(η^ε[0,T]) − E^0 F(η...

  • Multiplicative stochastic processes involving the time derivative of a Markov process. Arnoldus, Henk F.; George, Thomas F. // Journal of Mathematical Physics;Feb87, Vol. 28 Issue 2, p340 

    The characteristic functional of the derivative φ̇(t) of a Markov process φ(t) and the related multiplicative process σ(t), which obeys the stochastic differential equation iσ̇(t) = (A + φ(t)B)σ(t), have been studied. Exact...

  • The fundamental matrix of singularly perturbed Markov chains. Avrachenkov, Konstantine E.; Lasserre, Jean B. // Advances in Applied Probability;Sep99, Vol. 31 Issue 3, p679 

    Provides a complete characterization of the fundamental matrix of a singularly perturbed Markov chain. Presentation of a simpler formula for the mathematical procedure; Comparison of the technique with previous methodologies used; Examples of the application of the procedure.

  • Optimal Control of a Truncated General Immigration Process Through Total Catastrophes. Kyriakidis, E.G. // Journal of Applied Probability;Jun99, Vol. 36 Issue 2, p461 

    Focuses on a study which considered a Markov decision model for the control of a truncated general immigration process. Optimality of control-limit policies; Computation of the optimal policy; Control through binomial catastrophes.

  • OUTPUT ANALYSIS OF A SINGLE-BUFFER MULTICLASS QUEUE: FCFS SERVICE. Kulkarni, V.G.; Glazebrook, K. D. // Journal of Applied Probability;Jun2002, Vol. 39 Issue 2, p341 

    Presents a study which considered an infinite capacity buffer where the incoming fluid traffic belongs to different types modulated by independent Markovian on-off processes. Output analysis; Expected reward in a fluid model; Computations.

  • LOCALLY CONTRACTING ITERATED FUNCTIONS AND STABILITY OF MARKOV CHAINS. Jarner, S. F.; Tweedie, R. L. // Journal of Applied Probability;Jun2001, Vol. 38 Issue 2, p494 

    Presents a study which considered the Markov chains in the context of iterated random functions and showed the existence and uniqueness of an invariant distribution under a local contraction condition. Results; Discussion on the proof of existence and uniqueness of an invariant distribution for...

  • WAITING TIMES FOR PATTERNS IN A SEQUENCE OF MULTISTATE TRIALS. Antzoulakos, Demetrios L. // Journal of Applied Probability;Jun2001, Vol. 38 Issue 2, p508 

    Presents a study which developed a variation of the finite Markov chain embedding method for the derivation of waiting times for patterns in a sequence of multistate trials. Discussion on the time-homogeneous Markov chain defined on a finite state space Ω; Details on the concept of a Markov...
