It's everything the Web was never meant to do...and so much more!
While I was doing some consulting with Warren DeLano of PyMOL, we envisioned a REST-RPC hybrid for use in distributed computing.
Imagine an RPC system built on top of REST. (I know that REST enthusiasts tend to really dislike the RPC style, but bear with me.) That means two things: you're calling remote functions, but you're also taking advantage of HTTP as much as possible, making full use of URLs, proxies, links, the various HTTP verbs, and so on.
Imagine every RPC function takes one JSON value as input and returns one JSON value as output. Such a system could be used with relative ease from a wide range of programming languages, even C. I'm thinking of a P2P system with many servers that talk to each other using RESTful Web APIs, but do so in an XML-RPC sort of way. However, instead of using XML, they use JSON.
Here's what I think URLs should look like: http://server:port/path/to/object/;method?arg1=foo&arg2=bar. The arguments would be converted to a JSON list. There could be an alternate syntax for the arguments, such as ?__argument__=some_encoded_json, for cases where the JSON structure is too deep to flatten into query parameters.
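Here's a sketch of how a server might pick such a URL apart. The helper name and the exact rules (semicolon separates object path from method, `__argument__` overrides positional arguments) are my own invention for illustration, not a spec:

```python
import json
from urllib.parse import urlsplit, parse_qsl

def parse_rpc_url(url):
    """Split a hypothetical RPC-style URL into (object path, method, args).

    Positional query parameters become a JSON-ready list; a single
    __argument__=<URL-encoded JSON> parameter takes over when the
    structure is too deep for a flat query string.
    """
    parts = urlsplit(url)
    path, _, method = parts.path.rpartition(';')
    pairs = parse_qsl(parts.query)  # decodes %-escapes, preserves order
    params = dict(pairs)
    if '__argument__' in params:
        return path, method, json.loads(params['__argument__'])
    return path, method, [value for _, value in pairs]
```

So `parse_rpc_url("http://server:8080/path/to/object/;method?arg1=foo&arg2=bar")` would yield the object path `/path/to/object/`, the method name `method`, and the argument list `["foo", "bar"]`.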
The method may choose to support GET or POST in the usual RESTful way. GET requests should not have side effects.
To support asynchronous operation, you make an RPC and pass it a callback URL which the remote server can call later, also via RPC. That requires a way of registering a temporary callback URL. You might even pass a list of callbacks in order to create a pipeline. This is like a stack of continuations.
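A minimal sketch of the temporary-callback idea, assuming an in-memory registry on the client side (the class, URL shape, and one-shot semantics are all assumptions of mine; a real server would expose these URLs over HTTP and expire them):

```python
import json
import uuid

class CallbackRegistry:
    """Hand out temporary callback URLs for asynchronous RPCs."""

    def __init__(self, base_url):
        self.base_url = base_url
        self.handlers = {}

    def register(self, handler):
        """Register a handler and return a temporary URL for it."""
        token = uuid.uuid4().hex
        self.handlers[token] = handler
        return "%s/callbacks/;%s" % (self.base_url, token)

    def dispatch(self, url, json_result):
        """Invoked when the remote server calls back with one JSON value."""
        token = url.rpartition(';')[2]
        handler = self.handlers.pop(token)  # one-shot: unregister on use
        return handler(json.loads(json_result))
```

You'd pass the URL from `register()` as an argument in the outgoing RPC; when the remote server finishes, it calls that URL with its JSON result, and `dispatch()` routes the value to the waiting handler.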
Standard HTTP mechanisms like HTTP auth may be used. HTTP status codes should be used in ways that make sense, and bindings for languages like Python should internally translate error status codes into exceptions. If a server returns an error status code, returning a traceback object encoded as JSON makes sense.
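The translation a Python binding might do could look something like this (the exception class and the shape of the JSON traceback object are assumptions for the sake of illustration):

```python
import json

class RemoteError(Exception):
    """Raised by a hypothetical client binding for non-2xx responses."""

    def __init__(self, status, traceback_obj):
        self.status = status
        self.remote_traceback = traceback_obj  # the server's JSON traceback
        super().__init__("remote call failed with HTTP %d" % status)

def unwrap_response(status, body):
    """Return the JSON result on success; raise RemoteError on failure.

    Assumes the server returns its result as JSON on success and a JSON
    traceback object (e.g. {"error": ..., "frames": [...]}) on error.
    """
    if 200 <= status < 300:
        return json.loads(body)
    raise RemoteError(status, json.loads(body))
```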
It makes sense to do something useful for actual UNIX pipes. For instance, if an app is started in a pipeline, it can output URLs to STDOUT so that two processes connected by a UNIX pipe can find each other and talk using the protocol.
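That handshake could be as simple as this pair of helpers (the one-URL-per-line convention is my assumption):

```python
import sys

def announce_server(url, stream=None):
    """Write this server's base URL to STDOUT, one URL per line, so the
    next process in a UNIX pipeline can pick it up and connect."""
    stream = stream or sys.stdout
    stream.write(url + "\n")
    stream.flush()  # don't let buffering stall the downstream process

def discover_server(stream=None):
    """Read the upstream server's URL from STDIN."""
    stream = stream or sys.stdin
    return stream.readline().strip()
```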
This platform would allow you to play to the strengths of any platform, such as C, Python, IronPython, MPI in C, etc. There might be a server written in C that can use SIMD instructions or the GPU. There might be another server that can output stuff to the screen within an OpenGL context.
It needs some way to stream data. Perhaps the protocol is used to do all the setup, and then it gives you a URL where you can stream stuff in any format. After all, it is a web server. You could probably even stream JSON objects, one per line.
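The one-JSON-object-per-line idea (what's now usually called newline-delimited JSON) could be sketched like so; the helper names are mine:

```python
import json

def stream_json_lines(objects, write):
    """Serialize JSON values one per line, so the reader can decode each
    value as soon as its line arrives instead of waiting for the end."""
    for obj in objects:
        write(json.dumps(obj) + "\n")

def read_json_lines(lines):
    """The inverse: lazily decode one JSON value per non-blank line."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)
```

Because each line is a complete JSON value, both sides can stream indefinitely over a single connection without framing headaches.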
I'd like to thank Warren DeLano for giving me an opportunity to blog about our conversations.