64k-intro: Walking on Four

July 1, 2006 by nicolas, tagged in 64k, filed under works

The following is a web edition of my original report, written on 2006-07-01.



In September 2005, even before the release of Clothing Like A T-Shirt Saying I Wish, I had already set my mind on releasing short intros for the next events of the year: namely, the Saturne Party 6 in Paris, and the bcnparty101[9] in Barcelona. It turned out the cancellation of the former was actually beneficial to the overall project, since going back to work on an intro immediately after another one proved a bit too optimistic.

After some hectic crunch-time programming for the streamMega[10] party, I needed, or thought I needed, a bit of rest. The code base had also suffered through the ordeal, and some housekeeping was needed to clean up and package independent parts for later works.

So I cleaned up the mess, branched out the code, and started working again on it for the next effort.

In the end, after roughly 67 hours of work, a great deal of it spent debugging, Walking On Four was released in the 64kb intro competition at bcnparty101[9], on 2005.11.05.

Maybe all I needed was a reason to visit the intoxicating city of Barcelona, and its illustrious inhabitants.


Starting with a fast-moving red loading bar, the intro then displays the very same effect that ended Clothing Like A T-Shirt Saying I Wish: arrows going up and down diagonally across the screen.

But we notice a slight ghosting of the whole picture.

The soundtrack is distinctively electronic. The pad section plays a seemingly aimless minor harmony, and the rhythmic section, consisting of just a bass drum and a click, is animated along a long rhythm that is difficult to apprehend at first.

As the synthetic pads slowly fade in, stripes, almost calligraphic drawings, start appearing and drawing themselves below the frame. The grey-on-dark drawings do not appear to mean anything, and appear one after another, like pieces of entangled ribbons.

Very quickly the original arrows disappear, and after a short display of the ribbons alone, a distorted pink plane of particles jumps in and out of the screen, locally swelling.

Around the 45 second mark, the display spreads and saturates into almost homogeneous whiteness, but slowly ripples back in to reveal another, similar scene: transparent ribbons like in the first half, on top of which a rotating block of pink lights pulsates with the underlying rhythm of the music.

With time, ripples form and emerge from the block of light: saturated, harsh-looking ripples travelling from the center of the screen towards its sides, like a suggested tunnel.

In the end, around the 90 second mark, the title “walking on four -” appears on a black display, followed, in an unusual bottom-to-top order, by “!=” and then “BCN 7005”, as each new beat of the now almost solitary rhythm resonates in.

Stay with us as we cover its design and implementation.

Design and Methodology

Let’s break it to you immediately: there is no concept behind the creation of Walking on Four. It became what it became through a collage of ideas and accidents. It first started as an effort to compensate for the dryness of my previous intro, especially in the visual area, focusing on what demos are made of: effects.

I wanted to come back to the basics, to accept seeing things from the floor like a child does before trying to walk. Mundane things then become tall and impressive. But more than a regression or a process of unlearning, it is about coming back to a wild state of just obeying one’s instinct. Creation as the exposure of one’s desires[6].

That said, one of the ideas that served my purpose was the reuse of a common core for almost all of the visual effects. This made it possible to implement the intro quickly, giving me enough time to sort out all the technical issues I encountered.

This common core, created as a simulation of a simplistic physical system, is reinterpreted through a progression of visual effects: at first, the piece shows a representation of the inner workings of this simulation (the moving ribbons), then a static result of its operation (the swelling plane), to finish with a more elaborate use, showing it from the outside (the radiating lights).

Inside, outside.

This use of fake physics as base for visual effects is one of my favourite themes. As far as I remember, I first started playing with it in an unreleased demo for Assembly2000[11], initially inspired as I was by Golan Levin’s Floo[1].

To underline the continuity of work between my two efforts, the very last effect of Clothing Like A T-Shirt Saying I Wish was turned into the very first effect of Walking on Four.

I won’t repeat here what I wrote earlier[5], but again, time was a critical resource to meet the BCN Party’s deadline, with around 33 days (274 hours) of work available.

I committed 67 hours of work in the end, for a ratio of 0.24 down from my previous ratio of 0.47. Life is life.



Implementation work consisted of two big parts. First, I tried enhancing some of the representations I used previously for music and rhythms: representations of how one can express temporal relationships in the timeline of a piece. In parallel, I built a few classic-looking visual effects.

In an intro, it is always better if we can do more with less. This is precisely why, under the hood, I separated my visual effects in two:

The first part, the simulation, is responsible for creating complexity. The second part interprets or otherwise puts into form the result of the first part.

I liked the idea of making a link between this and the previous intro. Here, the reuse of the last effect and presentation from Clothing Like A T-Shirt Saying I Wish as the initial scene serves to create an illusion of continuity.

Or how I sought increasingly complex rhythms from simple rules.

First, what is the simplest rhythm? A regular, repetitive beat. Like one’s heart beating. Or somebody clapping.

Clapping in a regular fashion is something perfectly natural to us, and it does not take much effort to do it.

So at the core of our system is the Metronome: a regular beat with a recognisable frequency. It doesn’t have to be very precise or very regular, but computers being what they are, the simplest implementation is very strict.

Rhythm is important in music: it brings about a mental image of movements, of physical actions being performed. And rhythms, as they appear in nature, are rarely fully regular. The song of a bird, the beating of wind in a tree. Water as it breaks on the rocks of a river’s bed.

So let’s now add more twists to our simplistic rhythm.

If instead of clapping we used our fingers to count down as each regular step comes, clapping only when no finger is left in our counting hand, we gain a flexible way of marking time.

Slowing down our regular beat: 2 1 clap 2 1 clap.

We will write this rhythm down as [3 3]: A clap, then 3 steps later, another clap.

If we were to write an irregular rhythm: 2 1 clap 1 clap, we would write it down as [2 3].

The notation represents a rhythmic pattern of $ n $ different beats as $ [a_1 \cdots a_n] $. Each $ a_i $ represents the number of ticks (steps) the beat lasts. The number of ticks per second is controlled by a metronome, and defines the overall scale of the rhythm.

We also define the period of a rhythmic pattern, in ticks, as:

$ \mathrm{period}([a_1 \cdots a_n]) = \sum_{i=1}^{n} a_i $

When a rhythmic pattern forms a full rhythm, the notation also implies that:

$ [a_1 \cdots a_n] = [a_1 \cdots a_n\;a_1 \cdots a_n] = [a_1 \cdots a_n\;a_1 \cdots a_n \cdots] $

Which enables us to simplify [3 3] into [3].

In this notation, rhythms are defined as a series of beats of varying length. This presumes an underlying Metronome, that gives the basic speed of the final rhythm.

We will now define an agglutination operation, $ + $, sticking two proto-rhythms together:

$ r + [] = r $

$ [a_1 \cdots a_n] + [b_1 \cdots b_n] = [a_1 \cdots a_n\;b_1 \cdots b_n] $

The special rhythmic pattern $ [] $ is an utterly useless, beatless rhythm (hardly a rhythm at all).

Some elementary patterns

A number of proto-rhythms are used in Walking on Four:

p = [1]        pp = [2]       e5 = [5]
r7 = [2 7]     lr = [2 14]    t3 = [3 2 2]
j4 = [4 1 2]   j9 = [9 1 2]   m4 = [4 2 4]
a5 = [5 5 2 2]

To form the following rhythms:

pe5 = [period(e5)] = [5]
pj4 = [period(j4)] = [7]
...
pa5j9 = [period(a5) + period(j9)] = [26]

t3ppp = t3 + p + pp = [3 2 2 1 2]
...
m4j4e5 = pm4 + pj4 + pe5 = [10 7 5]

Each of those rhythms may be used in different parts, visual or musical ones, and with a different underlying metronome.

We will nevertheless notice that what is missing from this system is a way to emphasise each beat differently: there is no notion of accents, and it must be emulated by hand by layering multiple rhythms together, hence the presence of M4J4E5 together with m4j4e5, or A5j9A5r7 together with a5j9a5r7.
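The original source is not shown here, but the notation above translates almost directly into code. A minimal Python sketch of patterns, period and agglutination (the function names are my own):

```python
# Proto-rhythms as lists of beat lengths in ticks, e.g. [3] for "2 1 clap".
def period(pattern):
    """Period of a rhythmic pattern, in ticks: the sum of its beat lengths."""
    return sum(pattern)

def agglutinate(*patterns):
    """Stick proto-rhythms together; [] is the neutral, beatless rhythm."""
    result = []
    for pattern in patterns:
        result = result + pattern
    return result

# Elementary patterns from the text.
p, pp, e5 = [1], [2], [5]
t3, j4, m4 = [3, 2, 2], [4, 1, 2], [4, 2, 4]

# Composed rhythms: t3ppp = t3 + p + pp, m4j4e5 = pm4 + pj4 + pe5.
t3ppp = agglutinate(t3, p, pp)
m4j4e5 = agglutinate([period(m4)], [period(j4)], [period(e5)])
```

Played against a metronome, `t3ppp` yields the [3 2 2 1 2] click pattern and `m4j4e5` the [10 7 5] pattern used for the punching cube.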



Generating notes at random, like I did previously, produces feelings of irresolution and aimlessness, but when done completely at random it fails to have any sort of harmonic character. No atmosphere, except that of a sterile, if somewhat dissonant, mechanism.

To obtain a degree of control over this, I wanted to reduce the set of notes selected by the generation algorithms, to work in a given mode. The relative, average interval between notes enables or disables certain consonances or dissonances.

The intro would still feature a random generator as a base for note generation, but this time only selecting among notes from certain sets, while also being tweaked in favour of certain preferred sequences.

Modes are specified as a set of notes indexed by their semitone interval from a given root.

An example would be $ (c, \{ 0\;1\;3 \}) $, which would represent the following set of notes: $ c $, $ c\sharp $ and $ d\sharp $. The octave is not taken into account here.

For reference: $ c $, $ c\sharp $, $ d $, $ d\sharp $, $ e $, $ f $, $ f\sharp $, $ g $, $ g\sharp $, $ a $, $ a\sharp $, $ b $.


In Walking on Four I picked

$ (c, \{ 0, 1, 3, 5, 7, 8, 10 \}) \leftrightarrow (c, c\sharp, d\sharp, f, g, g\sharp, a\sharp) $

which would otherwise be known as C Phrygian[7]. It comprises minor chords, giving a slightly dissonant atmosphere.

Note generator

The note generator was specified in terms of probabilities of notes, but it also evolves as time does, going up or down the scale, following the $ a5j9a5r7 $ rhythm (at frequency $ \frac{120}{16} = 7.5 $ ticks/second):

$ i = 7 + p1 + ((\sigma \cdot step + p3) \bmod r) $

$ midi\_note = c\;phrygian(i) $

Where $ p1 $, $ p3 $, $ r $, $ step $, $ \sigma $, $ exprand $ and $ prob $ are defined as:

$ \sigma $ alternates between $ -1 $ and $ +1 $ along a $ [2\;14] $ ($ lr $) rhythm.

$ p1 = prob \left\langle \begin{array}{l r} 43.55\%: & -1 \\ 33\%: & 7 \\ 23.45\%: & 0 \end{array} \right\rangle $

$ p3 = prob \left\langle \begin{array}{l r} 62\%: & \sigma \cdot 5 \\ 38\%: & 21 \cdot exprand \end{array} \right\rangle $

$ r = prob \left\langle \begin{array}{l r} 51\%: & 7 \\ 49\%: & 21 \end{array} \right\rangle $

$ step \in [0\;1 \cdots +\infty] $ increases with each beat of the underlying rhythm.

$ exprand $ produces an exponential distribution of random numbers.

$ prob \langle percentage: value \cdots \rangle $ returns a value with a given probability.

The idea is to generate a pattern of notes with a globally rising (but sometimes descending) motif, with notes wrapped around an alternating register $ r $ of 7 to 21 semitones.
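A sketch of how such a generator might look in Python, following the formulas above. The helper names (`prob`, `exprand`, `phrygian_note`) and the MIDI root of 60 are my assumptions for illustration, not the intro's actual code:

```python
import random

PHRYGIAN = [0, 1, 3, 5, 7, 8, 10]  # C Phrygian semitone offsets from the root

def prob(choices):
    """prob<percentage: value ...>: return a value with a given probability.
    choices is a list of (percentage, thunk) pairs summing to 100."""
    roll = random.uniform(0.0, 100.0)
    acc = 0.0
    for percentage, thunk in choices:
        acc += percentage
        if roll <= acc:
            return thunk()
    return choices[-1][1]()

def exprand():
    """Exponentially distributed random number."""
    return random.expovariate(1.0)

def phrygian_note(i, root=60):
    """Map a scale index i to a MIDI note in C Phrygian (root 60 = middle C)."""
    octave, degree = divmod(i, len(PHRYGIAN))
    return root + 12 * octave + PHRYGIAN[degree]

def next_index(step, sigma):
    """i = 7 + p1 + ((sigma * step + p3) mod r), per the formulas above."""
    p1 = prob([(43.55, lambda: -1), (33.0, lambda: 7), (23.45, lambda: 0)])
    p3 = prob([(62.0, lambda: sigma * 5), (38.0, lambda: 21 * exprand())])
    r = prob([(51.0, lambda: 7), (49.0, lambda: 21)])
    return 7 + p1 + (int(sigma * step + p3) % r)
```

Driven by the rhythm, `step` increases with each beat and `sigma` flips along the $ lr $ pattern, so successive calls to `next_index` wander up and down the mode.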

The atonal instruments, like the click and bass drum, are controlled directly by the following rhythms:

clicks: t3ppp
bassdrum: a5j9a5r7, accented via A5j9A5r7

Video filters

I’ve always loved feedback-based effects, and the patterns that emerge from their often chaotic behaviour. I tried to capture this by implementing an IIR filter in video.

This is a transposition to video of the simple audio low-pass filter I use in the synthesiser, but with one problem: the sampling rate of video is quite low (50 to 60 Hz) and non-homogeneous as well. Both factors make the results of the filter quite difficult to control. Nevertheless, we get interesting results.
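The principle is the standard one-pole IIR low-pass, applied per pixel across successive frames. A minimal Python sketch of the idea (in the intro itself this runs on the GPU via blending, not per pixel on the CPU):

```python
def lowpass_frames(frames, alpha):
    """One-pole IIR low-pass applied per pixel across video frames:
    out[n] = alpha * in[n] + (1 - alpha) * out[n - 1].
    frames: list of frames, each a flat list of pixel intensities."""
    out = []
    prev = [0.0] * len(frames[0])  # filter state starts from a black frame
    for frame in frames:
        cur = [alpha * x + (1.0 - alpha) * y for x, y in zip(frame, prev)]
        out.append(cur)
        prev = cur
    return out
```

A constant white input converges towards white with a lag controlled by `alpha`, which is exactly the ghosting visible at the start of the intro.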

Frames are saved and rendered into six different textures: three capture the input frames, while the other three capture the output frames. In turn, if one texture stands for the current frame at one point in time, during the next frame that texture will represent the past. When the texture is too old, it again becomes the present-time frame.

Using OpenGL it comes down to using glCopyTexSubImage2D
to copy the currently rendered picture to a texture.

Figure: video low-pass filter. $ z^{-1} $ delays a video frame by one frame.

Then multiplication and addition of frames is achieved through blending, using the filter’s parameters as alpha channels for each frame, with:

glBlendFunc (GL_SRC_ALPHA, GL_ONE);

For the output stage, an OpenGL extension, EXT_blend_subtract, is used to apply negative alpha channels:

glBlendEquationEXT (GL_FUNC_REVERSE_SUBTRACT);

The filter is also combined with a rotating and zooming stage to give
it a small twist: the most dramatic effect is achieved towards the end
of the intro, when the filter produces patterns resembling a tunnel.

A Simulation

I am always following many different things at a time, and I especially like to look around to see what others are doing. So, watching out for pieces made with the processing[3] software package, I stumbled on Quasimondo’s portfolio[4], and one example especially attracted my eye: his caustics simulator, Raycoaster[2]. Caustics are the subtle light effects one can experience in a swimming pool, when the refracted light rays accumulate in certain areas and thus produce almost geometrical patterns.

The original idea was kept: a simulation where light is represented by particles, grains projected inside a chamber with varying properties, and whose resulting trajectories form a final image. This was to provide the source material for the actual visual effects.

My simulation is formed of two subparts: the chamber, where the simulation is performed, which ends with a projection screen; and the particle emitter, which defines how grains of light are injected into the chamber, and their initial conditions.


The chamber was designed as a stack of different media, each with its own properties. Each medium is composed of elements that pull, push or otherwise affect the trajectory of grains inside it, until the grain exits to the next medium.

Let $ x $, $ y $, $ z $ be three axes. $ x $, $ y $ form planes parallel to the final projection screen. The corresponding coordinates of a grain of light are $ x $, $ y $ and $ z $.

One medium is defined by:

  • an elevation ($ h \in ]0; +\infty[ $): this tells us where in the stack it will be, and consequently, how large it will be (its coordinate on the $ z $ axis).

  • a “speed of light” ($ c \in ]-\infty; +\infty[ $): this controls the $ z $ coordinate of each grain of light: $ z(t) = z_0 + c \cdot \vert \langle x(t), y(t) \rangle \vert $

Note that the $ z $ coordinate is controlled by the length of the path of the particle in the plane.

  • a number of force fields in the $ x $, $ y $ plane. Each force contributes to the instantaneous acceleration of the grain depending on its position, following Newton’s formula: $ \frac{d^2 \langle x, y \rangle}{dt^2} = \frac{1}{m} \sum f_i (x, y) $, with $ m = 1 $.

  • The force fields are of two main kinds. Let $ \langle x', y' \rangle $ be the position in the field’s referential; the force field is here centered around $ \langle 0, 0 \rangle $, with $ f \in ]-\infty; +\infty[ $.

    • One of them is the attraction force:

      $ f(x', y') = f \cdot \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array} \right] \cdot \frac{\langle x', y' \rangle}{\vert \langle x', y' \rangle \vert^2} $

    • The other type is the rotation force:

      $ f(x', y') = f \cdot \left[ \begin{array}{cc} 0 & -1 \\ 1 & 0 \end{array} \right] \cdot \frac{\langle x', y' \rangle}{\vert \langle x', y' \rangle \vert^2} $

      which tends to rotate the grain around the field’s center.

    When $ f $ is below zero the forces appear to be attractive; when above zero, repulsive.

By now, if you have any elementary knowledge of physics, you will have realized this is not a sound physical simulation of a known, real phenomenon. Indeed, very rightly so: modelling photons as if they were balls on a pool table would not be acceptable. But we are not interested here in modelling reality; we want to produce material of a certain degree of complexity while keeping it easy to manipulate, and staying continuous.

It is very easy to create different force fields, and even to animate them, by moving the sources of rotational and attractive forces or changing their magnitudes.
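A minimal Python sketch of the two force fields and one grain integration step, following the formulas above; the Euler integrator and the function names are my own simplifications, not the intro's code:

```python
def attraction(f, cx, cy):
    """Attraction force field centered on (cx, cy): f * <x',y'> / |<x',y'>|^2.
    Per the convention in the text, f < 0 attracts and f > 0 repels."""
    def force(x, y):
        dx, dy = x - cx, y - cy
        d2 = dx * dx + dy * dy or 1e-9  # avoid the singularity at the center
        return (f * dx / d2, f * dy / d2)
    return force

def rotation(f, cx, cy):
    """Rotation force field: f * [[0, -1], [1, 0]] * <x',y'> / |<x',y'>|^2,
    which tends to rotate the grain around the field's center."""
    def force(x, y):
        dx, dy = x - cx, y - cy
        d2 = dx * dx + dy * dy or 1e-9
        return (f * -dy / d2, f * dx / d2)
    return force

def step_grain(pos, vel, fields, dt):
    """One Euler step of d2<x,y>/dt2 = sum f_i(x,y), with m = 1."""
    x, y = pos
    ax = ay = 0.0
    for field in fields:
        fx, fy = field(x, y)
        ax += fx
        ay += fy
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (x + vx * dt, y + vy * dt), (vx, vy)
```

Animating a field then just means moving `(cx, cy)` or varying `f` between steps.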

The ribbon scene

Contrary to the other two scenes, this one displays the inside of the simulation. Instead of taking its result, we follow the trajectory of grains sent inside the chamber. Here, their inertia / speed is tweaked so that each force field affects their trajectory in a much stronger way than in the actual simulation, but the principles and parameters for each force field are the same.

Splines aren’t even used! We get second order continuity for free, by virtue of integrating the Newtonian equations, with continuously varying forces.

Particles are injected in sequence from starting points around the screen via a simple sine-based equation.

Every effect starts with a sine, after all.

As they get modified by the simulation, the grains record their own trajectory: the history of their positions, speeds and accelerations.

To display them, we sample the path of each grain. At each step in the trajectory, the path is drawn by going through this history, connecting each point via a quadrilateral, slightly slanted from the horizontal to give the drawing a calligraphic look. The width of each quad of light is also mapped to the velocity of the particle at that position: the faster it was, the thinner the ribbon appears around that part.

Generating animated textures with OpenGL

Throwing all my grains of light through a weird simulation of medium/light interaction was initially devised as a way to generate an animated texture, a texture which would later be used in other effects throughout the intro.

At the origin of any grain, we have a particle emitter. It generates grains over a whole area, spaced out regularly. But covering the screen in a natural way is not so straightforward: I first used a regularly spaced grid, but then the grid would show up in the resulting texture as persistently rectangular features. To fix it, a Fibonacci lattice[8] was used instead.

A Fibonacci lattice of rank 1 is a two-dimensional sequence defined from the Fibonacci sequence $ F $ as:

$ fl1_i = \left\langle \frac{i}{f_k}, \frac{i \cdot f_{k-1}}{f_k} \right\rangle \bmod \langle 1, 1 \rangle $

With $ F $:

$ f_i = \left\{ \begin{array}{l l} 0 & i = 0 \\ 1 & i = 1 \\ f_{i-1} + f_{i-2} & i > 1 \end{array} \right. $

Choosing $ k $ here means choosing the number of points we can generate inside the plane. I chose $ k = 9 $, for $ 55 + 1 $ points.

Figure: Fibonacci lattice, k = 9.
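A Python sketch of the construction as I read it; the formula above is partly reconstructed, so take the exact indexing as my assumption (with the `fib` below, the 55 points mentioned in the text correspond to `fib(10) = 55`):

```python
def fib(n):
    """Fibonacci numbers: f0 = 0, f1 = 1, fi = f(i-1) + f(i-2)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fibonacci_lattice(k):
    """Rank-1 Fibonacci lattice: fib(k) points in the unit square,
    fl1_i = <i / fk, (i * f(k-1) / fk) mod 1>."""
    n, g = fib(k), fib(k - 1)
    return [(i / n, (i * g / n) % 1.0) for i in range(n)]
```

Unlike a regular grid, these points cover the square evenly without aligning into rows and columns, which is why the rectangular features disappear from the resulting texture.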

Now, each grain is injected into the chamber, and its movements are calculated as it goes through it. As grains encounter medium changes, or the final screen, they release a new particle, called a ParticleImpact, which shows up as an additive Gaussian grain of red/pink light.

And as we change the simulation parameters with time (for example by moving force fields around), we can produce an animation.

Now we have to capture the resulting frames into standard OpenGL textures.

An extension enables precisely that: the EXT_framebuffer_object extension, also abbreviated as the FBO extension. It was introduced in 2005 to replace the platform-specific pbuffer extensions. Pbuffers have a crufty, slightly different interface for each operating system.

This extension appeared rather late in OpenGL drivers, first for NVIDIA cards, then later in the year for ATI cards. Support for older cards from other manufacturers is unfortunately absent.

So, running the simulation multiple times with slightly varying forces, then capturing the result in a texture, resulted in an animation of 6 textures that I would later use in the visual effects.

Of course, a great deal of time was lost tracking down a bug that led me to defining non-power-of-two textures, which were silently accepted by one driver while bringing the other drivers crawling to a halt. I also experimented a bit with texture compression, but it would reliably crash as the set of frames in the animated texture grew. Scaling down the animation to a reasonable number of frames was in the end considered more sensible.


The swelling plane

This is one of the big classics of demo effects. It is the basis of tunnel effects, and of most of the 2D transformations seen in demos of the mid-1990s.

The design involves only one component, but remains very flexible:

A grid of cells is laid out over the screen. Cells are fixed on the screen, and composed of four vertices. Each vertex holds a set of input coordinates. A typical meaning for input coordinates is to represent coordinates in an input texture. Thus, each quadrilateral maps a certain area of the input texture over the screen. The whole grid enables us, by varying coordinates in space and time, to deform an input texture.

This grid forms a sampling of a more general, pre-calculated function.

This texture can be animated, as I did here: the input texture comes from a number of frames rendered from the simulation.

Animating the grid itself is a useful tool. The method I adopted is a very simple cycle:

1. one optional step of deformation.

2. one step of relaxation.

The deformation is customised, but here corresponds to an expansion of the grid’s input coordinates around a continuously moving center. To avoid too great a change, this deformation is only applied at certain points in time, here with a frequency of $ \frac{120}{8} = 15 $ ticks/second. One can compose deformations rather freely, and another step was added, zooming the input texture in and out in alternation, at a frequency of $ \frac{120}{96} = 1.25 $ ticks/second.

The relaxation gradually brings the grid to its resting state: a plain grid mapping each point of the screen to its corresponding point in the texture.
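A minimal Python sketch of one deformation / relaxation cycle on the grid's input coordinates; the falloff function and the rate are my own choices for illustration, not the intro's actual parameters:

```python
def expand(coords, center, amount):
    """Deformation step: push each vertex's input coordinates away from a
    center, swelling the texture locally; the push falls off with distance."""
    cx, cy = center
    out = []
    for u, v in coords:
        du, dv = u - cx, v - cy
        s = amount / (1.0 + du * du + dv * dv)  # strongest near the center
        out.append((u + s * du, v + s * dv))
    return out

def relax(coords, rest, rate=0.1):
    """Relaxation step: move each vertex's input coordinates a fraction of
    the way back toward the resting (identity) mapping."""
    return [(u + rate * (ru - u), v + rate * (rv - v))
            for (u, v), (ru, rv) in zip(coords, rest)]
```

Applying `expand` only on certain ticks while running `relax` every frame gives the bulges that swell and then smoothly settle back.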

All of this results in two turgescences that appear to rotate over the screen, as we apply two expansion deformations around two different, rotating centers. The deformations rotate in opposite directions around the center, about $ 2/3 $ of the way from the screen’s center. The full rotations each take about 12 seconds to go around the screen.

Radiating lights

This effect aims to imitate moving lights as they would appear through a dense, fog-like atmosphere. It uses the animation created during the simulation in a different way, here in three dimensions.

At the center of the screen, a small mapped cube is drawn. This cube moves on its own, pushed regularly by forces in opposite directions, as if somebody were punching it from side to side. The rhythm was already mentioned: $ m4j4e5 $, with a frequency of $ \frac{120}{64} = 1.875 $ ticks/second. Another rhythm is used to give a greater impact to the “punch” each time the accenting rhythm $ M4J4E5 $ produces its beats.

The cube is mapped via OpenGL’s spherical mapping, which is normally used for environment mapping. It is then recopied many times at greater and greater scales, transparently.

The bigger and bigger cubes, accumulated further on the screen, and combined with the low-pass filter, give the illusion of rays of light coming from the origin of the initial cube towards the extremities of the screen.

The transparent clones of the base cube are themselves expanded and contracted as the accenting rhythm $ M4J4E5 $ kicks in.


Delivering the intro in Barcelona was difficult, but a great pleasure, and I learned a bit while creating it. Physical simulations remain a very interesting model for creating complex content.

The color palette seems to have irked quite a few people, although that might not be surprising, as I intended it to be slightly aggressive. Nevertheless, I’d like to experiment a bit more with representing colors and their relations, a fascinating topic all in itself.

Connections between Walking on Four and the old 90s intros also didn’t go unnoticed, which was nice to read.

I took a few months off coding to concentrate a bit on music making. In contrast with my work on representing timelines algorithmically, I also became interested in taking the counterpoint: interactive control of a predefined set of effects[12].


References

[1] Floo (1999), Golan Levin. In this interactive visual instrument, Golan Levin displays entangled tendrils of light. The interactions are produced by a pseudo-Newtonian simulation combining repulsive and rotative force fields.

[2] Raycoaster (2005), Quasimondo. A simple Java applet made in the Processing multimedia coding environment. It tries to display caustics: patterns of light as they go through a series of materials. A mundane example: the shimmering lights seen at the bottom of a swimming pool.

[3] Processing, a Java-based language for audiovisual creation.

[5] Clothing Like A T-Shirt Saying I Wish (2006).

[6] Melancholia (1996) by Ryu Murakami. His character, Yazaki, rants:

“Why? Why do I hate talking about myself? Sincerely, I hate it. It’s like conceptualising one’s desires. Don’t you find one must be authentically pretentious to employ those words! Conceptualising one’s desires! I mean that I hate all those guys who pretend to be authors. A sequence of images, a bit of music, and there, here we are. Nothing’s cruder than this fashion of using and showing one’s very own desires all by oneself. And I’m not talking to you about masturbation. The most incredible thing is, people have the utmost respect for these guys.”

[8] To trace or not to trace - that is the question. Seminar at BP2005 by Prof. Dr. rer. nat. Alexander Keller, Universität Ulm.

[12] Lazy Sunday Radio: video premiere, held on 2006.03.19. Performing as neq together with sushibrother. TPOLM lazy sunday radio was born in the year 7000, as live music sessions broadcast through the internet by electronic musicians in Helsinki, Finland. Over the years the lazy sunday sessions have evolved into an internet festival, in which musicians from several countries and continents stream music to a global audience from their homes. This session, on 19 March, was the first time visual artists also streamed live video to accompany the music.