Reversibility and beyond
Ivan Lanese
Computer Science Department
Focus research group
University of Bologna/INRIA
Bologna, Italy
Summarizing work of INRIA teams Sardes and Focus
Roadmap

Directions

Rhopi

Roll-pi

Crop

What next?
Reversibility and beyond
We summarize here the thread of research done on reversibility by the INRIA teams Sardes and Focus
1) Rhopi: making HOpi reversible
– Extending reversibility techniques from CCS to more expressive calculi
– Preserving structural congruence
2) Roll-pi: introducing an operator for controlling reversibility
– To model checkpointing and rollback techniques
– Exploiting and constraining reversibility
3) Crop: adding compensations
– Only preliminary ideas
– Keeping trace of past failures
– Avoiding repeating the same errors
– Going towards long-running transactions
The small-step approach
We start from a basic calculus: Rhopi
We add to it the smallest mechanism we can think of going in the desired direction
– Rollback primitive
– Compensating messages
We study how far we can go with these mechanisms
Roadmap

Directions

Rhopi

Roll-pi

Crop

What next?
HOpi fundamentals
P, Q ::= a⟨P⟩        message
       | a(X) ▷ P    trigger
       | (P | Q)     parallel composition
       | νa. P       new name
       | X           variable
       | 0           null process

Reduction: a⟨Q⟩ | (a(X) ▷ P)  →  P{Q/X}
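As an illustration (not part of the original slides), the syntax and the single reduction rule above can be rendered as a minimal OCaml sketch; channel and variable names are plain strings, and structural congruence and capture avoidance are deliberately ignored.

```ocaml
(* A minimal sketch of HOpi, assuming names and variables are strings. *)
type proc =
  | Msg of string * proc            (* a<P> : message carrying a process *)
  | Trig of string * string * proc  (* a(X) |> P : trigger binding X     *)
  | Par of proc * proc              (* P | Q                             *)
  | Nu of string * proc             (* nu a. P : new name                *)
  | Var of string                   (* process variable X                *)
  | Nil                             (* 0                                 *)

(* Replace every free occurrence of variable x by the process q
   (capture avoidance is ignored in this sketch). *)
let rec subst x q = function
  | Msg (a, p) -> Msg (a, subst x q p)
  | Trig (a, y, p) when y <> x -> Trig (a, y, subst x q p)
  | Trig _ as t -> t                (* x is shadowed by the binder *)
  | Par (p1, p2) -> Par (subst x q p1, subst x q p2)
  | Nu (a, p) -> Nu (a, subst x q p)
  | Var y -> if y = x then q else Var y
  | Nil -> Nil

(* One communication step at the top of a parallel composition:
   a<Q> | (a(X) |> P)  ->  P{Q/X}. *)
let step = function
  | Par (Msg (a, q), Trig (b, x, p)) when a = b -> Some (subst x q p)
  | Par (Trig (b, x, p), Msg (a, q)) when a = b -> Some (subst x q p)
  | _ -> None
```

For instance, step (Par (Msg ("a", Nil), Trig ("a", "X", Par (Var "X", Nil)))) evaluates to Some (Par (Nil, Nil)).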
Rhopi syntax
P, Q ::= a⟨P⟩ | (a(X) ▷ P) | (P | Q) | νa. P | X | 0

Configurations:
M, N ::= κ : P       thread
       | [m; k]      memory
       | (M | N)     parallel
       | νu. M       restriction
       | 0           null configuration

κ ::= k | ⟨h, h̃⟩·k                          tags
m ::= (κ1 : a⟨P⟩) | (κ2 : a(X) ▷ Q)         action record
Rhopi semantics
Forward:
  m = (κ1 : a⟨P⟩) | (κ2 : a(X) ▷ Q)
  (κ1 : a⟨P⟩) | (κ2 : a(X) ▷ Q)  ↠  νk. (k : Q{P/X}) | [m; k]

Backward:
  (k : P) | [m; k]  ⇝  m

A forward rule similar to HOpi
– Creating a fresh tag for the continuation
– Creating a memory storing the consumed message and trigger
A backward rule undoing a step (up to garbage collection)
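Again as an illustration only, the two rules can be sketched in OCaml on top of the HOpi datatype above; complex tags, restriction and structural congruence are elided, and keys are integers drawn from a counter.

```ocaml
(* The process syntax and substitution, repeated from the HOpi sketch so that
   this file compiles on its own. *)
type proc =
  | Msg of string * proc | Trig of string * string * proc
  | Par of proc * proc | Nu of string * proc | Var of string | Nil

let rec subst x q = function
  | Msg (a, p) -> Msg (a, subst x q p)
  | Trig (a, y, p) when y <> x -> Trig (a, y, subst x q p)
  | Trig _ as t -> t
  | Par (p1, p2) -> Par (subst x q p1, subst x q p2)
  | Nu (a, p) -> Nu (a, subst x q p)
  | Var y -> if y = x then q else Var y
  | Nil -> Nil

(* Rhopi configurations: complex tags <h, h~>.k are elided, so a tag is a key. *)
type key = int
type config =
  | Thread of key * proc                          (* kappa : P                  *)
  | Memory of (key * proc) * (key * proc) * key   (* [k1:a<P> | k2:a(X)|>Q ; k] *)
  | CPar of config * config

let fresh = let c = ref 0 in fun () -> incr c; !c

(* Forward rule: consume a message and a trigger on the same channel, tag the
   continuation with a fresh key k, and store both consumed threads in [m; k]. *)
let forward = function
  | CPar (Thread (t1, (Msg (a, p) as msg)), Thread (t2, (Trig (b, x, q) as trg)))
    when a = b ->
      let k = fresh () in
      Some (CPar (Thread (k, subst x p q), Memory ((t1, msg), (t2, trg), k)))
  | _ -> None

(* Backward rule: the thread tagged k plus the memory keyed k are replaced by
   the two threads the memory stored (garbage collection of k is implicit). *)
let backward = function
  | CPar (Thread (k, _), Memory ((t1, p1), (t2, p2), k')) when k = k' ->
      Some (CPar (Thread (t1, p1), Thread (t2, p2)))
  | _ -> None
```

Applying backward right after forward restores the two consumed threads, which is the intuition behind the Loop lemma recalled in the Rhopi properties slide.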
Rhopi example
Initial configuration:
  k1 : a⟨P⟩  |  k2 : a(X) ▷ b⟨d⟨0⟩⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)

Forward step on a (with M = a⟨P⟩ and N = a(X) ▷ b⟨d⟨0⟩⟩):
  [k1 : M | k2 : N; k]  |  k : b⟨d⟨0⟩⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)

Forward step on b (with N1 = b(X) ▷ (c⟨0⟩ | X)):
  [k1 : M | k2 : N; k]  |  [k : b⟨d⟨0⟩⟩ | k3 : N1; k4]  |  k4 : (c⟨0⟩ | d⟨0⟩)

Backward step on k4, back to the previous configuration:
  [k1 : M | k2 : N; k]  |  k : b⟨d⟨0⟩⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)

Backward step on k, back to the initial configuration:
  k1 : a⟨P⟩  |  k2 : a(X) ▷ b⟨d⟨0⟩⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)
Rhopi structural congruence
(E.TagP)   k : ∏(i=1..n) τi  ≡  ν h̃. ∏(i=1..n) (⟨hi, h̃⟩·k : τi)     with h̃ = {h1, …, hn}

Used for ensuring each thread has a unique tag
– Needed if the continuation of a trigger is a parallel composition
Here τ is either a message or a trigger
Other structural congruence rules are standard
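One possible operational reading of (E.TagP), as an illustrative OCaml sketch with τ kept abstract; the fresh sub-tag names generated below are a stand-in for the calculus' restriction ν h̃, not its actual mechanism.

```ocaml
(* An illustrative reading of (E.TagP), with each tau kept abstract as 'p.
   A thread  k : tau_1 | ... | tau_n  is split into n threads
   <h_i, h~> . k : tau_i  for fresh names h~ = {h_1, ..., h_n}, so that every
   basic thread carries a unique tag. *)
type key = string

type 'p thread =
  | Plain of key * 'p list            (* k : tau_1 | ... | tau_n *)
  | Sub of key * key list * key * 'p  (* <h_i, h~> . k : tau_i   *)

(* Fresh names are faked by suffixing the key with an index. *)
let e_tagp (k : key) (taus : 'p list) : 'p thread list =
  let hs = List.mapi (fun i _ -> Printf.sprintf "%s_h%d" k (i + 1)) taus in
  List.mapi (fun i tau -> Sub (List.nth hs i, hs, k, tau)) taus

let split = function
  | Plain (k, taus) -> e_tagp k taus
  | Sub _ as t -> [ t ]

(* Example: a thread k : tau1 | tau2 becomes two threads tagged
   <k_h1, {k_h1, k_h2}>.k and <k_h2, {k_h1, k_h2}>.k. *)
let example = split (Plain ("k", [ "tau1"; "tau2" ]))
```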
Rhopi properties
Rhopi is an extension of HOpi
Rhopi satisfies the Loop lemma
– If P ↠ Q then Q ⇝ P, and vice versa
All states reachable from P are weakly barbed equivalent to P itself
Rhopi satisfies causal consistency
Essentially the same properties as RCCS
Roadmap

Directions

Rhopi

Roll-pi

Crop

What next?
Limits of Rhopi

In Rhopi reversibility is wild
– Forward and backward actions always enabled
– Each result can always be annulled
– Impossible to make a result final

We want a mechanism to control reversibility
Roll-pi approach
Normal computation is forward
Backward computation is on demand, to backtrack in case of errors
We use an explicit roll operator to this end
Question: how far back should we go?
– In a concurrent scenario the number of steps is not meaningful
Answer: we go back to undo a communication stored in a specified memory
– Memory keys allow us to identify the target memory
Roll-pi syntax
P; Q ::= 0 j X j ºa: P j (P j Q) j ahP i j a(X) .° P j roll °
M; N ::= 0 j ºu: M j (M j N) j · : P j [¹; k]





Extends Rhopi syntax
Adds the primitive roll γ for triggering rollback
Adds a γ label to triggers
The idea: roll γ takes the system back to the state before
the trigger labelled by γ has been consumed
More precisely: undoes all the steps caused by the
interaction involving the trigger labelled by γ
Giving semantics: naïve try
(Com)    m = (κ1 : a⟨P⟩) | (κ2 : a(X) ▷γ Q)
         (κ1 : a⟨P⟩) | (κ2 : a(X) ▷γ Q)  ↠  νk. (k : Q{k,P/γ,X}) | [m; k]

(Naive)  N ► k    complete(N | [m; k] | (κ : roll k))
         N | [m; k] | (κ : roll k)  ⇝  m | ⌊N⌋k

The forward rule uses the key k to replace the placeholder γ
A rule for roll:
– N ► k verifies that all the elements in N are related to k
– complete checks that the term is closed under the causal relation
– ⌊N⌋k contains the elements in N not related to k
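To make the side conditions concrete, here is an illustrative sketch (not the paper's definitions) of how the keys causally dependent on k can be computed from the memories; this is the information that N ► k and complete(-) inspect.

```ocaml
(* A minimal sketch of causal dependence, assuming the configuration has been
   flattened into a list of elements; keys are plain integers. *)
type key = int

type elem =
  | Thread of key              (* a thread tagged with key k               *)
  | Memory of key list * key   (* [m; k] : keys consumed by m, fresh key k *)

(* Keys causally dependent on k0: k0 itself plus, transitively, every key
   created by a memory that consumed a dependent key. *)
let dependents (conf : elem list) (k0 : key) : key list =
  let rec grow acc =
    let extend acc = function
      | Memory (consumed, created)
        when List.exists (fun k -> List.mem k acc) consumed
             && not (List.mem created acc) ->
          created :: acc
      | _ -> acc
    in
    let acc' = List.fold_left extend acc conf in
    if List.length acc' = List.length acc then acc else grow acc'
  in
  grow [ k0 ]

(* The elements a roll k0 must undo: every thread or memory whose key depends
   on k0; complete(-) amounts to checking that they are all present. *)
let to_undo (conf : elem list) (k0 : key) : elem list =
  let deps = dependents conf k0 in
  List.filter
    (function Thread k -> List.mem k deps | Memory (_, k) -> List.mem k deps)
    conf

(* Example: keys 1 and 2 were consumed to create key 3, then 3 and 4 created 5.
   Rolling key 3 must undo both memories and the thread tagged 5, but not the
   thread tagged 6. *)
let example = [ Memory ([ 1; 2 ], 3); Memory ([ 3; 4 ], 5); Thread 5; Thread 6 ]
let _ = to_undo example 3
```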
Naïve semantics example
Initial configuration:
  k1 : a⟨0⟩  |  k2 : a(X) ▷γ b⟨roll γ⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)

Forward step on a (with M = a⟨0⟩ and N = a(X) ▷γ b⟨roll γ⟩):
  [k1 : M | k2 : N; k]  |  k : b⟨roll k⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)

Forward step on b (with M1 = b⟨roll k⟩ and N1 = b(X) ▷ (c⟨0⟩ | X)):
  [k1 : M | k2 : N; k]  |  [k : M1 | k3 : N1; k4]  |  ⟨h1, h̃⟩·k4 : c⟨0⟩  |  ⟨h2, h̃⟩·k4 : roll k

Executing roll k undoes both communications, back to the initial configuration:
  k1 : a⟨0⟩  |  k2 : a(X) ▷γ b⟨roll γ⟩  |  k3 : b(X) ▷ (c⟨0⟩ | X)
The concurrency anomaly
[Figure, four builds: a configuration containing two memories, keyed k and k1, and two pending requests roll k and roll k1; with the naïve rule, after executing either of the two rolls the starting state can no longer be reached]
The concurrency anomaly
Intuitively, I have rolls for undoing every action…
…but I am not able to go back to the starting state
I miss the possibility of performing rollbacks concurrently
– Forcing this sequential behavior in a distributed implementation would not be easy
Can I write a semantics capturing concurrent rollbacks?
Giving semantics: taming concurrency
(Com)    m = (κ1 : a⟨P⟩) | (κ2 : a(X) ▷γ Q)
         (κ1 : a⟨P⟩) | (κ2 : a(X) ▷γ Q)  ↠  νk. (k : Q{k,P/γ,X}) | [m; k]

(Start)  (κ1 : roll k) | [m; k]  ⇝  (κ1 : roll k) | [m; k]•

(Roll)   N ► k    complete(N | [m; k])
         N | [m; k]•  ⇝  m | ⌊N⌋k

The rollback has been split into two steps
– Tagging the memory
– Executing the rollback of a tagged memory
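The gain over the naïve rule can be pictured with a small illustrative sketch (representation and names are ours, not the paper's): marking a memory is a purely local update, so several rollbacks can be prepared independently and then executed in any order.

```ocaml
(* A minimal sketch of the two-phase rollback, assuming memories carry a mark
   bit; the field names below are illustrative, not the paper's notation. *)
type key = int
type memory = { k : key; marked : bool }

(* (Start): a roll k only marks its target memory and removes nothing yet, so
   several roll requests can mark their memories independently, in any order. *)
let start (mems : memory list) (target : key) : memory list =
  List.map (fun m -> if m.k = target then { m with marked = true } else m) mems

(* (Roll): each marked memory can later be rolled back on its own, restoring
   the threads it stored and discarding its causal dependants (both elided). *)
let pending_rolls (mems : memory list) : memory list =
  List.filter (fun m -> m.marked) mems

(* Example: two independent rolls mark their memories one after the other, and
   both memories end up pending, ready to be rolled back in either order. *)
let example =
  pending_rolls
    (start (start [ { k = 1; marked = false }; { k = 2; marked = false } ] 1) 2)
```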
Concurrent rollback
[Figure, several builds: the configuration from the concurrency anomaly revisited; with the two-step semantics both memories can first be tagged and then rolled back, so both rollbacks are executed]
Properties of concurrent semantics
Correct
– If I go backward from M, I reach a state able to go forward to M:
  if M ⇝ M′ then M′ ↠* M, with M and M′ unmarked
Complete
– I can execute any number of concurrent rollbacks
Good as an abstract specification
Going towards an implementation
The concurrent semantics is very high-level
It includes atomic steps involving an unbounded number of participants
– Concurrently executing
– Possibly distributed
Can we refine the semantics to a more distributed one?
– Giving the same final result
Yes!
But the technicalities are quite complex…
Low level semantics
[Figure, several builds: a rollback of the memory keyed k unfolding step by step, driven by the roll k request; the request is first propagated downwards to the causal descendants of k, which are then undone]
Low level semantics
Based on local checks and asynchronous notifications
In two phases
– Top-down notification of the rollback request
– Bottom-up rollback
Equivalent to the concurrent one
– Weak barbed congruent
– Fully abstract
A good starting point for a concurrent and distributed implementation
Writing a low level semantics equivalent to the naïve one would be more difficult
Roadmap

Directions

Rhopi

Roll-pi

Crop

What next?
Limits of roll-pi
If I go back, I may redo the same steps infinitely many times
– No trace of previous attempts
– No possibility to learn from errors
– Perfect rollback
– Every program that may roll back may diverge
We want a compensation mechanism
Crop idea
Some messages include compensations
When rolled back, they are replaced by different messages
– Disabling past computations
– Enabling new computations
Is it enough for implementing compensation policies?
Crop syntax
P, Q ::= 0 | X | νa. P | (P | Q) | a⟨P⟩÷C | a(X) ▷γ P | roll γ
M, N ::= 0 | νu. M | (M | N) | κ : P | [μ; k]
C ::= a⟨P⟩÷C | ⊥

Extends roll-pi syntax
Adds compensating messages a⟨P⟩÷C
C is the message left after rollback in place of a⟨P⟩
a⟨P⟩÷⊥ is a message whose compensation is itself
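As an illustration of the intended mechanism (a stripped-down sketch, not crop's full grammar or semantics): when the communication that consumed a message a⟨P⟩÷C is undone, the message is not restored as it was, and what reappears is dictated by C.

```ocaml
(* A minimal sketch of crop's compensating messages, with most process forms
   elided.  a<P> ÷ C is a message that, once its communication is rolled back,
   is replaced by its compensation C; the special compensation ⊥ (Self below)
   stands for the message itself. *)
type proc =
  | Msg of string * proc * comp   (* a<P> ÷ C : message with compensation *)
  | Nil                           (* other process forms elided           *)
and comp =
  | Comp of string * proc * comp  (* the compensation is itself a message *)
  | Self                          (* ⊥ : the message compensates to itself *)

(* What a rolled-back message leaves behind in place of itself. *)
let undo_message = function
  | Msg (a, p, Self) -> Msg (a, p, Self)         (* ⊥ : restored unchanged *)
  | Msg (_, _, Comp (b, q, c)) -> Msg (b, q, c)  (* replaced by C          *)
  | Nil -> Nil

(* Example: a<0> ÷ (b<0> ÷ ⊥) leaves b<0> ÷ ⊥ behind when rolled back, so the
   computation cannot take the a branch again but can try the b one. *)
let example = undo_message (Msg ("a", Nil, Comp ("b", Nil, Self)))
```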
Idea of crop semantics
Very similar to the semantics of roll-pi
Messages are transformed during rollback into their compensations
The compensation mechanism has no effect on the keys
One can write semantics corresponding to the naïve, concurrent and low level roll-pi semantics
Comments on crop
Work in progress, just started
A small difference in syntax and semantics w.r.t. roll-pi
We hope for a large difference in expressive power
Roadmap

Directions

Rhopi

Roll-pi

Crop

What next?
Summary


A reversible HOpi
A basic operator for controlling reversibility
– Related to checkpointing

A basic mechanism for compensations

All on topic for WP2
Future work
Testing the expressive power of crop w.r.t. existing techniques for programming dependable systems
– Speculative parallelism
– Hennessy's communicating transactions
– Transactional memories
What about other mechanisms for controlling reversibility?
– Relations with RCCS irreversible actions
What about more efficient ways of storing memories?
… and much more
Finally