Programming Languages
Concurrency
CSCI-GA.2110-003 Fall 2020
Concurrent programming
■ refers to the handling of multiple independent activities
◆ contrast with parallelism: the simultaneous execution of independent
activities.
■ a task (Ada) or thread (Java, C++, C#) is an independent execution of the same static code, with its own stack, program counter and local environment, but sharing data with other tasks/threads.
■ Ada tasks communicate through
◆ rendezvous (think “meeting someone for a date”)
◆ shared variables
◆ protected objects
■ Java, C++, C# threads communicate through shared objects.
■ C++ also supports promises and futures (C++11).
■ C# employs asynchronous programming on a single thread (it also
supports multi-threading).
2 / 29
Task Declarations (Ada)
A task type is a limited type
task type Worker;   -- declaration; public interface
type Worker_Id is access Worker;

task body Worker is
begin
   -- actions performed in lifetime
   loop
      compute;
      -- runs forever; will be shut down from the outside
   end loop;
end Worker;
3 / 29
More Task Declarations
■ a task type can be a component of a composite
■ number of tasks in a program is not fixed at compile-time.
W1, W2: Worker;   -- two individual tasks

type Crew is array (Integer range <>) of Worker;
First_Shift: Crew (1 .. 10);   -- group of tasks

type Monitored is record
   Counter: Integer;
   Agent: Worker;
end record;
4 / 29
Task Activation
When does a task start running?
• if statically allocated ⇒ at the next begin
• if dynamically allocated ⇒ at the point of allocation

declare
   W1, W2: Worker;
   Joe: Worker_Id := new Worker;   -- starts working now
   Third_Shift: Crew (1 .. N);     -- N tasks
begin
   -- activate W1, W2, and the Third_Shift
   ...
end;   -- wait for them to complete; Joe will keep running
5 / 29
Task Services
■ a task can perform some actions on request from another task
■ the interface (declaration) of the task specifies the available actions
(entries)
■ a task can also execute some actions on its own behalf, without external
requests or communication
task type Device is
   entry Read (X: out Integer);
   entry Write (X: Integer);
end Device;
6 / 29
Synchronization: The Rendezvous
■ caller makes explicit request: entry call
■ callee (server) states its availability: accept statement
■ if server is not available, caller blocks and queues up on the entry for later
service
■ if both present and ready, parameters are transmitted to server
■ server performs action
■ out parameters are transmitted to caller
■ caller and server continue execution independently
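An illustration (not from the slides): Java has no rendezvous primitive, but java.util.concurrent.SynchronousQueue, a zero-capacity channel on which each transfer blocks until both parties are present, gives a rough sketch of one. All names below are illustrative.

```java
import java.util.concurrent.SynchronousQueue;

public class RendezvousDemo {
    // SynchronousQueue has no buffer: put() blocks until a taker arrives,
    // so every transfer is a rendezvous between caller and server.
    public static int run() {
        try {
            SynchronousQueue<Integer> channel = new SynchronousQueue<>();
            Thread server = new Thread(() -> {
                try {
                    int x = channel.take();   // "accept": blocks until a caller is present
                    channel.put(x * 2);       // transmit the "out parameter" back
                } catch (InterruptedException e) { }
            });
            server.start();
            channel.put(21);                  // "entry call": blocks until accepted
            int result = channel.take();      // receive the out parameter
            server.join();
            return result;
        } catch (InterruptedException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(run());  // 42
    }
}
```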
7 / 29
Example: semaphore
Simple mechanism to prevent simultaneous access to a critical section: code that cannot be executed by more than one task at a time
task type semaphore is
   entry P;   -- Dijkstra's terminology, from the Dutch:
   entry V;   -- "proberen te verlagen" (try to decrease: wait) [P]
              -- "verhogen" (increase: post when done) [V]
end semaphore;

task body semaphore is
begin
   loop
      accept P;
      -- won't accept another P
      -- until a caller asks for V
      accept V;
   end loop;
end semaphore;
8 / 29
Using a semaphore
■ A task that needs exclusive access to the critical section executes:
Sema : semaphore;
…
Sema.P;
-- critical section code
Sema.V;
■ If in the meantime another task calls Sema.P, it blocks, because the semaphore does not accept a call to P until after the next call to V: the other task is blocked until the current one releases by making an entry call to V.
■ programming hazards:
• someone else may call V ⇒ race condition
• no one calls V ⇒ other callers block forever (deadlock)
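For comparison (not in the slides), Java's java.util.concurrent.Semaphore provides the same P/V discipline as acquire/release; a try/finally block guards against the "no one calls V" hazard. The counter and thread counts are illustrative.

```java
import java.util.concurrent.Semaphore;

public class CriticalSection {
    static final Semaphore sema = new Semaphore(1);  // binary semaphore: one permit
    static int counter = 0;

    static void increment() throws InterruptedException {
        sema.acquire();            // P: take the permit or block
        try {
            counter++;             // critical section: one thread at a time
        } finally {
            sema.release();        // V: always runs, even if the body throws
        }
    }

    public static int run() {
        counter = 0;
        Thread[] ts = new Thread[4];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> {
                for (int j = 0; j < 1000; j++) {
                    try { increment(); } catch (InterruptedException e) { return; }
                }
            });
            ts[i].start();
        }
        try {
            for (Thread t : ts) t.join();
        } catch (InterruptedException e) {
            return -1;
        }
        return counter;            // no lost updates: 4 * 1000
    }

    public static void main(String[] args) {
        System.out.println(run());  // 4000
    }
}
```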
9 / 29
Delays and Time
■ A delay statement can be executed anywhere at any time, to make the current task quiescent for a stated interval:

delay 0.2;   -- type is Duration, unit is seconds

■ We can also specify that the task stop until a certain specified time:

delay until Noon;   -- Noon defined elsewhere
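As an aside (not in the slides), the closest Java analogue of the delay statement is Thread.sleep, which takes milliseconds rather than seconds:

```java
public class DelayDemo {
    public static long run() {
        try {
            long start = System.nanoTime();
            Thread.sleep(200);   // analogue of "delay 0.2;" (units: milliseconds)
            return (System.nanoTime() - start) / 1_000_000;  // elapsed ms
        } catch (InterruptedException e) {
            return -1;           // sleep can be cut short by interruption
        }
    }

    public static void main(String[] args) {
        System.out.println(run() >= 150);  // slept for roughly the stated interval
    }
}
```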
10 / 29
Conditional Communication
■ need to protect against excessive delays, deadlock, starvation, caused by missing or malfunctioning tasks
■ timed entry call: the caller waits for a rendezvous for at most a stated amount of time:
select
   Disk.Write (Value => 12, Track => 123);   -- Disk is a task
or
   delay 0.2;
end select;
■ if Disk does not accept within 0.2 seconds, go do something else
11 / 29
Conditional Communication (ii)
■ conditional entry call: caller ready for rendezvous only if no one else is queued, and rendezvous can begin at once:
select
   Disk.Write (Value => 12, Track => 123);
else
   Put_Line ("device busy");
end select;
■ print message if call cannot be accepted immediately
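A Java sketch of both forms (not from the slides): a BlockingQueue stands in for the busy Disk task, offer with a timeout mirrors the timed entry call, and plain offer mirrors the conditional one.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;

public class TimedOffer {
    public static String run() {
        // capacity-1 queue standing in for the server; fill it so the "server" is busy
        ArrayBlockingQueue<Integer> disk = new ArrayBlockingQueue<>(1);
        disk.offer(1);
        try {
            // timed call: wait at most 0.2 s for the transfer to be accepted
            boolean timed = disk.offer(2, 200, TimeUnit.MILLISECONDS);
            // conditional call: succeed only if it can be accepted at once
            boolean immediate = disk.offer(3);
            return timed + "," + immediate;   // both fail: the queue stays full
        } catch (InterruptedException e) {
            return "interrupted";
        }
    }

    public static void main(String[] args) {
        System.out.println(run());  // false,false
    }
}
```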
12 / 29
Conditional communication (iii)
■ the server may accept a call only if the internal state of the task is appropriate:
select
   when not Full =>
      accept Write (Val: Integer) do … end;
or
   when not Empty =>
      accept Read (Var: out Integer) do … end;
or
   delay 0.2;   -- maybe something will happen
end select;
■ if several guards are open and callers are present, any one of the calls may be accepted – non-determinism
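For intuition (not in the slides): Ada's guarded accepts for a bounded buffer correspond to the blocking behavior that java.util.concurrent.ArrayBlockingQueue implements internally; put blocks while the "not Full" guard is closed, take while "not Empty" is closed. The capacity and values are illustrative.

```java
import java.util.concurrent.ArrayBlockingQueue;

public class BufferDemo {
    public static int run() {
        try {
            ArrayBlockingQueue<Integer> buf = new ArrayBlockingQueue<>(2);
            buf.put(1);   // accepted: buffer is not Full
            buf.put(2);   // accepted: buffer is not Full
            // a further put() blocks until a take() re-opens the "not Full" guard
            Thread consumer = new Thread(() -> {
                try { buf.take(); } catch (InterruptedException e) { }
            });
            consumer.start();
            buf.put(3);   // proceeds once the consumer has taken an element
            consumer.join();
            return buf.size();
        } catch (InterruptedException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(run());  // 2
    }
}
```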
13 / 29
Concurrency in Java
■ Two notions
◆ class Thread
◆ interface Runnable
■ An object of class Thread is mapped into an operating system primitive
interface Runnable {
    public void run ();
}
■ Any class can become a thread of control by supplying a run method
class R implements Runnable { … }

Thread t = new Thread(new R(…));
t.start();
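A minimal complete version of the pattern above (the class name R and its body are illustrative):

```java
public class RunnableDemo {
    static class R implements Runnable {       // any class can be a thread body
        volatile boolean ran = false;
        public void run() { ran = true; }      // executed on the new thread
    }

    public static boolean run() {
        R r = new R();
        Thread t = new Thread(r);              // wrap the Runnable in a Thread
        t.start();                             // start(), not run(): spawns the thread
        try {
            t.join();                          // wait for it to finish
        } catch (InterruptedException e) {
            return false;
        }
        return r.ran;
    }

    public static void main(String[] args) {
        System.out.println(run());  // true
    }
}
```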
14 / 29
Threads at work
class PingPong extends Thread {
    private String word;
    private int delay;

    PingPong (String whatToSay, int delayTime) {
        word = whatToSay;
        delay = delayTime;
    }

    public void run () {
        try {
            for (;;) {                         // infinite loop
                System.out.print(word + " ");
                sleep(delay);                  // yield processor
            }
        } catch (InterruptedException e) {
            return;                            // terminate thread
        }
    }
}
15 / 29
Activation and execution
public static void main (String[] args) {
    new PingPong("ping", 33).start();    // activate
    new PingPong("pong", 100).start();   // activate
}
■ call to start activates thread, which executes run method
■ Do not call run directly: it will execute like an ordinary method with no
new thread.
■ threads can communicate through shared objects
■ classes can have synchronized methods to enforce critical sections
16 / 29
Volatile variables
Consider the following fragments executing in 2 threads:

■ Thread 1 (processing loop):

while (keepgoing) {
    …
}

■ Thread 2:

if (condition)
    keepgoing = false;
…

■ Concurrency + optimization complicates things:
◆ One thread may maintain a variable in a hardware register while the other reads/writes from memory.
◆ Can result in dirty reads and data corruption
◆ E.g., the loop above may never terminate
■ Keyword volatile prevents the compiler from optimizing such accesses:
◆ private volatile boolean keepgoing;
■ Works with primitives and non-primitives (unlike synchronized).
■ Still need to synchronize: volatile does not provide atomic locking.
■ Seen in many languages [Java, C, C++, C#].
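A runnable sketch of the stop-flag pattern above (field and thread names are illustrative). Because keepGoing is volatile, the write in the main thread is guaranteed to become visible to the worker, so the loop terminates:

```java
public class StopFlag {
    // volatile: the write below must become visible to the worker's reads
    private static volatile boolean keepGoing = true;
    private static long iterations = 0;

    public static long run() {
        try {
            Thread worker = new Thread(() -> {
                while (keepGoing) {   // without volatile, this read could be
                    iterations++;     // cached and the loop might never exit
                }
            });
            worker.start();
            Thread.sleep(50);
            keepGoing = false;        // visible to the worker: loop exits
            worker.join();            // join() also makes iterations safe to read
            return iterations;
        } catch (InterruptedException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(run() >= 0);  // terminates, rather than spinning forever
    }
}
```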
17 / 29
Mutual exclusion
■ Declare a method as synchronized
◆ Locks the object whose synchronized method is running.
◆ All other synchronized regions for “this” object will block.
public synchronized void foo(…) {
// entire method protected
}
■ Declare a defined region as synchronized
◆ Similar to above, but programmer specifies the exact region to protect.
◆ Programmer specifies the lock object.
synchronized (obj) {
// protected region here
}
■ Fairness is not guaranteed with synchronized. Use ReentrantLock.
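A sketch tying the last point together (class and counts are illustrative): ReentrantLock constructed with fairness enabled hands the lock to waiting threads in FIFO order, and the try/finally mirrors what a synchronized block does implicitly.

```java
import java.util.concurrent.locks.ReentrantLock;

public class FairCounter {
    private static final ReentrantLock lock = new ReentrantLock(true);  // fair: FIFO hand-off
    private static int count = 0;

    static void increment() {
        lock.lock();
        try {
            count++;          // critical section
        } finally {
            lock.unlock();    // always release, even if the body throws
        }
    }

    public static int run() {
        count = 0;
        Thread[] ts = new Thread[3];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> { for (int j = 0; j < 1000; j++) increment(); });
            ts[i].start();
        }
        try {
            for (Thread t : ts) t.join();
        } catch (InterruptedException e) {
            return -1;
        }
        return count;         // no lost updates: 3 * 1000
    }

    public static void main(String[] args) {
        System.out.println(run());  // 3000
    }
}
```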
18 / 29
Java notify/wait pattern
Used to achieve synchronization on an object across threads.
■ wait releases the lock of a specified object and blocks until a notify event from another thread. Then reacquires the lock.
■ notify wakes up an arbitrary waiting thread. No fairness.
■ notifyAll wakes up all waiting threads.
Typical notify pattern:
synchronized (obj) {
    // set the condition
    readyToConsume = true;
    obj.notify();
}
19 / 29
More Java notify/wait
Typical wait pattern:
synchronized (obj) {
    while (!readyToConsume) {
        obj.wait();
    }
    // perform sync action here
}
■ synchronized prevents race conditions (all threads must agree on the state of the predicate).
■ The while loop re-checks the condition to guard against spurious wake-ups.
■ Condition variables should be volatile.
■ Easy to introduce bugs by not following the proper pattern.
■ Pattern seen in many languages [Java,C++,Scala,PHP]
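Both halves of the pattern assembled into one runnable sketch (class and field names are illustrative): a producer sets the condition and notifies, a consumer waits in a loop.

```java
public class Handoff {
    private final Object obj = new Object();
    private boolean readyToConsume = false;
    private int value;

    public int consume() throws InterruptedException {
        synchronized (obj) {
            while (!readyToConsume) {   // guard against spurious wake-ups
                obj.wait();             // releases obj's lock while blocked
            }
            return value;
        }
    }

    public void produce(int v) {
        synchronized (obj) {
            value = v;
            readyToConsume = true;      // set the condition first...
            obj.notify();               // ...then wake a waiting thread
        }
    }

    public static int run() {
        try {
            Handoff h = new Handoff();
            Thread producer = new Thread(() -> h.produce(42));
            producer.start();
            int got = h.consume();      // blocks until produce() has run
            producer.join();
            return got;
        } catch (InterruptedException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(run());  // 42
    }
}
```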
20 / 29
Threads in C++11
■ C++ didn’t have native thread support until C++11.
■ Previously one had to use external libraries such as pthreads, Boost,
OpenThreads, etc.
■ Full state-of-the-art thread support now included in C++.
■ One-to-one mapping to operating system threads.
■ Based on the Boost thread library.
21 / 29
Example Thread Class
class Runnable {
    std::thread mthread;
    Runnable (Runnable const&) = delete;
    Runnable& operator= (Runnable const&) = delete;
public:
    Runnable () = default;   // needed: the deleted copy ctor suppresses the implicit default ctor
    virtual ~Runnable () {
        try { stop(); }
        catch (...) { /* clean up */ }
    }
    virtual void run () = 0;
    void stop () { mthread.join(); }
    void start () { mthread = std::thread(&Runnable::run, this); }
};
22 / 29
Use of Thread Class
class myThread : public Runnable {
protected:
    void run () { /* do something */ }
};

Mutual exclusion can be achieved as follows:

static std::mutex pmm;

void mySynchronizedFunction () {
    std::lock_guard<std::mutex> lock(pmm);
    // unlocked automatically on return
}
23 / 29
Automatic Threads & Futures

int main() {
    std::future<…> …
}

24 / 29
Asynchronous programming in C#

private async static Task<int> GetTotalAsync()
{
    int total = GetStartValue();
    await Task.Delay(3000); // replaced
    int increment = GetIncr();
    return total + increment;
}

■ Special async methods like Task.Delay return a future (called a “Task”).
■ GetTotalAsync is also async and also returns a future.
■ Executing await:
◆ Causes a continuation to be captured and the method immediately exits.
◆ Continuation is executed when the future eventually produces a value.
27 / 29
Asynchronous programming in C#
Now let’s call GetTotalAsync:
private async static Task<int> GetGrandTotalAsync()   // caller's name lost; hypothetical
{
    int startVal = GetStartValue();
    int increment = await GetTotalAsync();
    return startVal + increment;
}

This is (roughly) shorthand for:

private static Task<int> GetGrandTotalAsync()   // caller's name lost; hypothetical
{
    int startVal = GetStartValue();
    return GetTotalAsync().ContinueWith((incrementTask) => {
        int increment = incrementTask.Result;
        return startVal + increment;
    });
}
28 / 29
Asynchronous programming in C#
■ Summary: maximize CPU utilization: rather than blocking a thread, overlap I/O-bound and CPU-bound activities.
■ async keyword must be used in method signature when await is called in the body.
■ Presence of async creates a state machine for special handling.
■ When await is reached, control returns to caller immediately.
◆ Rest of the method body is wrapped in a continuation and deferred.
■ Use of async/await does not require multi-threading.
■ Contrast to multi-threading:
◆ Increases CPU utilization, like threads. But…
◆ No need to create and destroy threads.
◆ No chance of deadlocking.
◆ No prioritization headaches.
■ Similar async/await concept available in other languages [JavaScript,Python,Hack, Dart].
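A rough Java sketch of the same continuation idea (not from the slides; Java has no await keyword): CompletableFuture plays the role of the C# Task, and thenApply explicitly attaches the continuation that await generates implicitly. The helper names mirror the slides' hypothetical GetStartValue/GetIncr.

```java
import java.util.concurrent.CompletableFuture;

public class AsyncTotal {
    static int getStartValue() { return 10; }   // hypothetical helpers from the slides
    static int getIncr() { return 5; }

    // Analogue of GetTotalAsync: the awaited step becomes an async stage,
    // and the rest of the method body becomes an explicit continuation.
    static CompletableFuture<Integer> getTotalAsync() {
        int total = getStartValue();
        return CompletableFuture
                .supplyAsync(AsyncTotal::getIncr)       // runs on a pool thread
                .thenApply(increment -> total + increment);  // the continuation
    }

    public static void main(String[] args) {
        // join() blocks for the result here only to print it; a caller could
        // instead chain further continuations and never block.
        System.out.println(getTotalAsync().join());  // 15
    }
}
```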
29 / 29