Monday, September 29, 2008

Esterel: A Unique Language

Esterel was developed in France in the early '80s, in parallel with two other languages, Lustre and Signal. These, together with Statecharts, are considered the first synchronous languages.
Esterel found its niche in control-intensive real-time applications and has evolved into a standard, with several implementations.

Among both the early and the more recent synchronous languages, Esterel is the only one to provide an imperative style of programming. This is somewhat surprising to me, considering the high popularity of imperative languages.
Why have there been no attempts to create new languages based on Esterel's foundations?

I won't argue functional versus imperative programming here, but I do feel more comfortable with the latter (as most people do), and imperative languages will be around for years to come.
I like the Lua approach, which favors an imperative style while concisely supporting most functional features without bloat.
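
Just to illustrate what I mean, here is a small fragment in plain Lua: the code reads as an ordinary imperative loop, yet functions are first-class values and closures come for free (the names below are my own, not from any library):
-- imperative loop, but the operation is passed in as a value
local function apply_each(list, f)
    for i = 1, #list do
        list[i] = f(list[i])
    end
end

local laps = {50, 60, 55}
apply_each(laps, function (t) return t - 5 end)  -- anonymous closure as argument
print(laps[1], laps[2], laps[3])                 --> 45  55  50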

The example below, written in Esterel, implements a basic training routine for an athlete:
module Runner:
    input  Second, Morning, Step, Meter, Lap;
    output ...; % not used
    every Morning do                              % restart the whole routine every morning
        abort
            loop
                abort RunSlowly when 15 Second;   % run slowly for 15 seconds
                abort
                    every Step do
                        Jump || Breathe           % at each step, jump and breathe in parallel
                    end every
                when 100 Meter;                   % ...until 100 meters have passed
                FullSpeed                         % then run at full speed
            each Lap                              % restart this sequence at every lap
        when 2 Lap                                % stop after two laps
    end every
end module
This imperative style reads almost like a software specification given as a recipe in natural English.

Communication in Esterel happens through broadcast input and output signals.
The synchronous preemption structures (every ... do, loop ... each, abort ... when, etc.) are the heart of the language.
LuaGravity provides constructs based on them, which I consider even more useful than the functional facilities.
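
To give an idea of how such preemption could look in Lua, here is a minimal sketch built only on coroutines; it is my own illustration under invented names (await, emit, abort_when), not LuaGravity's actual API:
local awaiting = {}   -- signal name -> set of coroutines waiting for it

-- suspend the running coroutine until 'sig' is emitted
local function await(sig)
    awaiting[sig] = awaiting[sig] or {}
    awaiting[sig][coroutine.running()] = true
    coroutine.yield()
end

-- broadcast 'sig': resume every coroutine waiting for it
local function emit(sig)
    local set = awaiting[sig] or {}
    awaiting[sig] = {}
    for co in pairs(set) do
        if coroutine.status(co) == "suspended" then
            coroutine.resume(co)
        end
    end
end

-- "abort body when sig": once 'sig' occurs, the body is forgotten
-- and never resumed again (a poor man's Esterel preemption)
local function abort_when(sig, body)
    local co = coroutine.create(body)
    coroutine.resume(co)              -- run the body until it awaits something
    awaiting[sig] = awaiting[sig] or {}
    awaiting[sig][coroutine.create(function ()
        for _, set in pairs(awaiting) do set[co] = nil end
    end)] = true
end

-- the inner part of the athlete example: jump/breathe at each Step,
-- aborted when Meter (standing for "100 Meter") is emitted
abort_when("Meter", function ()
    while true do
        await("Step")
        print("jump & breathe")
    end
end)

emit("Step")    --> jump & breathe
emit("Step")    --> jump & breathe
emit("Meter")   -- preempts the body
emit("Step")    -- no output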

Tuesday, September 9, 2008

Multimedia (Digital TV) Languages

I'm currently working at the Telemídia Laboratory [1], which is responsible for the development of the Ginga middleware [2] for the Brazilian Digital TV Standard (SBTVD).

A DTV middleware is a common layer above specific platforms, responsible for harmonizing them and providing a unified way to create interactive applications. For instance, the middleware defines the languages in which interactive applications must be authored.
Most early digital TV standards around the world followed Web standards and chose HTML + JavaScript + Java as the authoring languages for broadcast interactive applications.



SMIL (Synchronized Multimedia Integration Language) [3] and NCL (Nested Context Language) [4] are XML-based languages that support multimedia synchronization.

Here's an application written in SMIL:
<smil>
  <body>
    <par>
      <video src="goals.mpg"/>
      <img src="pele.jpg" begin="10s" end="15s"/>
      <img src="zico.jpg" begin="30s" end="40s"/>
    </par>
  </body>
</smil>

Now, the same application in NCL:
<ncl>
  <body>
    <media id="videoGoals" src="goals.mpg">
      <area id="aPele" begin="10s" end="15s"/>
      <area id="aZico" begin="30s" end="40s"/>
    </media>
    <media id="imgPele" src="pele.jpg"/>
    <media id="imgZico" src="zico.jpg"/>
    <link xconnector="onBeginStart">
      <bind role="onBegin" component="videoGoals" interface="aPele"/>
      <bind role="start" component="imgPele"/>
    </link>
    <link xconnector="onEndStop">
      <bind role="onEnd" component="videoGoals" interface="aPele"/>
      <bind role="stop" component="imgPele"/>
    </link>
    <link xconnector="onBeginStart">
      <bind role="onBegin" component="videoGoals" interface="aZico"/>
      <bind role="start" component="imgZico"/>
    </link>
    <link xconnector="onEndStop">
      <bind role="onEnd" component="videoGoals" interface="aZico"/>
      <bind role="stop" component="imgZico"/>
    </link>
  </body>
</ncl>

Note that NCL exhibits reactive behavior through its link primitive (which LuaGravity borrows).
When the area "aPele" begins, the image "imgPele" is started, and so on.
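
To make that reactive reading concrete, here is a hypothetical sketch in plain Lua of what a link-like primitive boils down to; the names link and notify are mine, not NCL's or LuaGravity's real machinery:
local links = {}   -- condition -> list of actions to run when it occurs

local function link(condition, action)
    links[condition] = links[condition] or {}
    table.insert(links[condition], action)
end

local function notify(condition)
    for _, action in ipairs(links[condition] or {}) do
        action()
    end
end

-- mirroring two of the links in the NCL document above
link("onBegin:aPele", function () print("start imgPele") end)
link("onEnd:aPele",   function () print("stop imgPele")  end)

notify("onBegin:aPele")   --> start imgPele
notify("onEnd:aPele")     --> stop imgPele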



There is much more to say about these languages and about how they differ;
see the references below if you are interested.

Both are designed to synchronize timed media such as audio and video.
HTML, on the other hand, was specified with text and images in mind and does not seem a wise choice for TV applications.

Probably the biggest difference between SMIL and NCL, as seen in the examples above, is that SMIL does not separate the definition of media content from the definition of synchronization, while NCL does.

Very simple applications (like the one above) are usually more straightforward to write in SMIL.
On the other hand, since reuse is one of NCL's main concerns, growing applications scale better in NCL than their SMIL counterparts do.

Both SMIL and NCL are XML-based, keeping with that tradition in the world of multimedia application authoring.

SMIL is the W3C standard for describing multimedia presentations; the Ginga middleware chose NCL, with Lua as its scripting language.

[1] http://www.telemidia.puc-rio.br/
[2] http://www.ginga.org.br/
[3] http://www.w3.org/AudioVideo/
[4] http://www.ncl.org.br/