Oz: Concurrency Doesn't Have to be Hard

I'm reading a book called Concepts, Techniques, and Models of Computer Programming which is being hailed as following in the tradition of Structure and Interpretation of Computer Programs.

One of the things that I found interesting about the book is that it says that concurrency doesn't have to be hard:
Concurrency in Java is complex to use and expensive in computational resources. Because of these difficulties, Java-taught programmers conclude that concurrency is a fundamentally complex and expensive concept. Program specifications are designed around the difficulties, often in a contorted way. But these difficulties are not fundamental at all. There are forms of concurrency that are quite useful and yet as easy to program with as sequential programs (for example, stream programming as exemplified by Unix pipes). Furthermore, it is possible to implement threads, the basic unit of concurrency, almost as cheaply as procedure calls. If the programmer were taught about concurrency in the correct way, then he or she would be able to specify for and program in systems without concurrency restrictions (including improved versions of Java). [p. 30]
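The stream programming they mention can be made concrete. Here's a rough sketch of my own (in Oz, the language the book uses, which I'll say more about in a second), not code from the book: a producer thread builds a list element by element while a consumer thread sums it, and the unbound tail of the list is all the synchronization that's needed. The names Ints and Sum are just mine.
declare Ints Sum Xs S in
fun {Ints N Max}
   if N > Max then nil
   else
      {Delay 500}          % pretend each element takes a little time to produce
      N|{Ints N+1 Max}
   end
end
fun {Sum L Acc}
   case L
   of nil then Acc
   [] X|Xr then {Sum Xr Acc+X}
   end
end
thread Xs={Ints 1 10} end    % producer: builds the list incrementally
thread S={Sum Xs 0} end      % consumer: sums elements as soon as they appear
{Browse S}                   % shows _ at first, then 55 once the stream ends
The consumer's case statement suspends whenever it reaches an unbound tail, much like a process blocking on an empty pipe.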
The main programming language taught in the book is the multi-paradigm programming language Oz, which is related to the programming language Alice. I've been a fan of Erlang-style concurrency for years, but I was amazed to see that concurrency based on "promises" is pretty nifty as well. Here's an example I wrote:
declare X Y in
thread
   {Delay 10000}   % wait 10 seconds
   X=10            % then bind X
end
thread
   {Delay 5000}    % wait 5 seconds
   Y=2             % then bind Y
end
{Browse starting}  % shows the atom 'starting' right away
{Browse X*Y}       % suspends until both X and Y are bound, then shows 20
In this code, X and Y are dataflow variables that get bound on separate threads. The main thread wants to show the result of X*Y, but the multiplication automatically suspends until both variables have been bound. So it outputs "starting" right away and then waits about 10 seconds, the longer of the two delays, before showing 20. Notice that the synchronization is completely implicit.
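To make the dataflow behavior a little more visible, here's a variation (again my own sketch, not from the book) that does the multiplication in its own thread and browses the result right away. The Browser displays the still-unbound result as _ and then updates it to 20 once both X and Y have been bound, roughly 10 seconds later:
declare X Y Z in
thread {Delay 10000} X=10 end
thread {Delay 5000} Y=2 end
thread Z=X*Y end   % suspends in this thread, not the main one, until X and Y are bound
{Browse Z}         % shows _ immediately, then 20 after about 10 seconds
Nothing here asks for a lock, a signal, or a join; binding the variables is the signal.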
