Saturday, September 01, 2007

Core Join Calculus - too hardcore?

After looking closely at core JC, and the proposed encoding of full JC in it, I found it unsatisfactory for my taste.
One problem is that the encoding results in "busy-wait" (stuttering non-termination) even when the original term was "normalizing", which makes the chosen notion of equivalence suspicious. Also, the encoding is complicated enough that one may ask whether analysis of the resulting terms will be harder than analysis of the original ones (I recognize that analysis of core JC as a language is easier than that of full JC, though - that's the usual trade-off between the complexity of a language and that of its terms, sometimes exemplified by English vs. Chinese, or binary vs. hex).

So maybe I want to take a step back, and pick some variant of JC that is not as spartan as core JC. At the moment I see 3 differences between full and core JCs:
1. Polyadic vs. monadic messages (tuples of channels vs. single channels as payload).
2. N joins per definition vs. 1 join per definition.
3. M patterns per join vs. 2 patterns per join.
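To make the "2 patterns per join" case concrete, here is a toy sketch (hypothetical, not from any JC implementation) of a single definition with a single join of two monadic patterns - the body fires only once a message is pending on both channels:

```python
from collections import deque

class Join2:
    """Models: def a(x) & b(y) |> body(x, y) -- one definition,
    one join, two patterns, monadic payloads."""
    def __init__(self, body):
        self.body = body
        self.a_queue = deque()
        self.b_queue = deque()

    def send_a(self, x):
        self.a_queue.append(x)
        self._try_fire()

    def send_b(self, y):
        self.b_queue.append(y)
        self._try_fire()

    def _try_fire(self):
        # Consume one message from each channel whenever both are non-empty.
        while self.a_queue and self.b_queue:
            self.body(self.a_queue.popleft(), self.b_queue.popleft())

results = []
j = Join2(lambda x, y: results.append((x, y)))
j.send_a(1)   # only one pattern matched - nothing fires yet
j.send_b(2)   # both patterns matched - body runs with (1, 2)
```

In the N-M variants, a definition would hold several such joins, each guarding an arbitrary number of patterns, and a polyadic payload would be a tuple rather than a single value.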

I am considering looking closer at the "monadic N M" variant, though in theory one could analyse all combinations and how they can be encoded in each other (that could make a PhD thesis :) ). If one adds to these 3 dimensions another one for allowing or disallowing free names in definitions, this yields 2^4 = 16 variants. I suspect, though, that one does not need to define an encoding for each pair of combinations separately - they could all be factored through a few "encoding patterns" (e.g., encoding of N-M JC in 1-2 JC, of polyadic JC in monadic JC, and of "higher order/closured" JC in "first order/closureless" JC).
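The count of variants is just the product of the binary choices; a throwaway enumeration (dimension labels are mine, matching the four axes above) confirms it:

```python
from itertools import product

# Four binary dimensions of JC variants, as discussed above.
dims = [
    ("polyadic", "monadic"),          # message payload arity
    ("N joins", "1 join"),            # joins per definition
    ("M patterns", "2 patterns"),     # patterns per join
    ("free names", "no free names"),  # free names in definitions
]

variants = list(product(*dims))
print(len(variants))  # 2**4 = 16
```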
