Forth is a public art installation commissioned by the University of Central Florida for the Institute for Simulation and Training.
Forth employs computer simulation techniques, interactivity and real time audio synthesis to generate a dynamically changing ocean scene. The seascape projected on a large curved screen is subject to the local weather conditions: the wave height and the atmospheric conditions reflect the weather sampled in Orlando and at the Florida coast. Small row boats containing groups of people advance across the expanse of open water, and narratives emerge within each vessel as the rowers try to stay afloat and on course in turbulent waters. A second screen in the lobby displays an underwater view of the scene, offering an unexpected perspective. Immersive sound heightens the richness of the experience, responding to the weather, and to the flow of visitors through the lobby.
The viewers in the lobby are tracked by a camera and are represented in the projection as paper boats floating on the ocean. The rowers may pick up these paper boats — bridging again the real and virtual worlds.
Software development for Forth was led by James George. Sound design was created by Michael McCrea.
Here are a few details of the production and ideas behind this work.
Interactivity & dynamic data input
Forth’s very simple interaction both compels one to invest personally in the content of the artwork and connects one to the other viewers. By walking around in front of the projection, viewers steer the paper boats generated for them. This instantly relates them to each other spatially, and also narratively: imagine, for example, two people standing in front of an image of their two small paper boats bobbing on a vast empty ocean. The play naturally becomes a play with the other viewers: the minimalism of the interaction foregrounds this relationship. Viewers will inevitably be ‘in the way’ of the virtual row boats, and they will often instinctively move aside to let a boat by.
Occasionally one of the people in the boats will pick up the viewer’s paper boat and carry it with them off screen. And conversely: sometimes a person might leave behind a paper boat. In this way the viewer can add to the virtual world in a more lasting way: the traces of the physical world circulate through the ocean-scape, handed off between different boats, an attempt at communication and exchange.
The live data input links the piece to its geographical location, the ‘here and now’ of the real world, making it alive. The people in the boats are constantly trying to maintain their balance within their changing environment, just as we adjust to our own circumstances.
We used a Kinect camera for video tracking, and a modified version of TSPS (an open source computer vision application built with openFrameworks) for analyzing the footage and sending the messages to the application running our game.
Below is a view from our Kinect during installation (hence the ladder) as seen in TSPS — perfect blob tracking imagery:
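TSPS reports each tracked person’s position to the application over the network. A minimal sketch of the idea, in Python with made-up world-space ranges (the production code runs in Unity and is more involved): the blob centroid, normalized across the camera frame, is mapped to a position on the ocean for that viewer’s paper boat.

```python
def blob_to_boat_x(centroid_x, ocean_left=-50.0, ocean_right=50.0):
    """Map a tracked blob centroid (normalized 0..1 across the camera frame)
    to a world-space x position for the viewer's paper boat on the ocean."""
    centroid_x = max(0.0, min(1.0, centroid_x))  # clamp noisy tracking values
    return ocean_left + centroid_x * (ocean_right - ocean_left)
```

Clamping matters in practice: tracking data near the edge of the Kinect’s view can jitter outside the expected range.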
Format and visual language
Procedural media lets us create stories that unfold over unlimited time-spans. No longer restricted by the length of tape or film, we can program things to happen over months or even years by building a system whose elements dynamically balance themselves out.
The visual language of Forth speaks to the format it’s been created in. It adapts the visual language that a game engine offers to reference the idea of a “procedural representation”, or a model representing the larger logic structure that the represented elements are bound by. The people resemble a default human model, stripped of any texture or color, achieving a kind of statuesque look.
The details of the boat arrangements are perceived secondarily, but the tableaux eventually resolve into situations where each person is an individual.
The Ocean and the Rowers
The ocean surface can be thought of as a visual manifestation of the underlying currents and forces that result in the shape we see. To us it might appear as a kind of abstraction, an encoding of something deeper, more inherent and more essential.
The people in boats interact with this surface, adjusting to its changing shape. Their prescribed actions are altered by what they encounter. The rowing animations are created as perfect loops that could endlessly and flawlessly repeat: the abstract notions of the action. These animations are imported into the dynamic ocean scene, and once in the procedural environment they become augmented by the world pushing back on them: the position of the oars is the meeting point of the intention and its constraints.
As the boats rock on the waves, the people adjust their posture to sit upright. To do this we rotate their spines to gradually lean in the desired direction. As the boat tilts, the rotation of the oars also has to be adjusted so they reach the water surface. The people’s arms are then controlled with inverse kinematics to account for those two corrections, so their hands remain on the oars.
When the waves get bigger, the rower’s strokes become more chaotic and out of sync, further augmenting the original animation imported into the engine.
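The two corrections can be sketched roughly as follows (a Python illustration with invented gains, limits, and stroke periods, not the engine code): the spine leans against the boat’s tilt, and wave height adds per-rower jitter to the looping stroke phase.

```python
import math
import random

def spine_lean(boat_tilt_deg, max_lean_deg=25.0, gain=0.8):
    """Lean the rower's spine opposite to the boat tilt so the torso stays
    near upright; clamp so the pose never becomes unnatural."""
    lean = -gain * boat_tilt_deg
    return max(-max_lean_deg, min(max_lean_deg, lean))

def stroke_phase(t, period=2.0, wave_height=0.0, seed=0):
    """Normalized phase (0..1) of the looping row animation; bigger waves add
    per-rower jitter so the strokes drift out of sync with each other."""
    rng = random.Random(seed)
    jitter = wave_height * 0.25 * rng.uniform(-1.0, 1.0)
    return ((t + jitter) % period) / period
```

On a calm sea every rower shares the same phase; as `wave_height` grows, each rower’s seed produces a different offset and the synchrony visibly breaks down.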
Constructing the ocean
We’re indebted to the hard work of the Unity 3d community who collectively developed the open source ocean simulation from which we began to build our own ocean. The Ocean Shader project is in turn based on the research of Jerry Tessendorf, who wrote the original white paper on wave simulation.
The first step is to generate an ever changing set of interconnected triangles in a mesh that describe the surface of the ocean. The mesh changes slowly as time progresses, updating in every new frame of our simulation to create a smooth animation.
This mesh also has the special property of “tiling”, which means that if two copies of the same mesh are placed adjacent to one another there won’t be a visible seam between them. This lets us compute the math to generate the mesh once per frame, but apply that computation like a cookie cutter to hundreds of tiles to create an expansive ocean.
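One way to picture the tiling property is a simplified sum-of-sines sketch in Python (not the FFT-based Tessendorf method the shader actually uses, and the wave parameters here are invented): restricting each wave to a whole number of cycles per tile makes the height function periodic, so adjacent tiles line up exactly.

```python
import math

TILE = 64.0  # tile edge length in world units (arbitrary for this sketch)

# Each wave: (integer cycles across the tile in x, in z, amplitude, speed)
WAVES = [(1, 0, 0.8, 0.6), (2, 1, 0.3, 1.1), (0, 3, 0.15, 1.7)]

def height(x, z, t):
    """Ocean height at (x, z) at time t. Because every wave completes a whole
    number of cycles per tile, h(x, z, t) == h(x + TILE, z, t): no seam."""
    k = 2.0 * math.pi / TILE
    h = 0.0
    for nx, nz, amp, speed in WAVES:
        h += amp * math.sin(k * (nx * x + nz * z) + speed * t)
    return h
```

The per-frame cost is then independent of how far the ocean extends: one tile’s worth of math, stamped out to the horizon.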
Rendering and texturing the mesh are the next big steps.
Finally we fine-tuned our expression of ‘storminess’. The shape of the ocean changes according to the wave-height data. We have three levels of detail in our waves: the big swells, the ‘regular’ waves, and the small wrinkles on the surface. We set the parameters for all three to arrive at the look we wanted to express.
Boat and Ocean Interaction
One crucial aspect of the project is creating ways for the boats to be affected by the movement of the ocean, and allowing the rowers to adapt as the ocean churns beneath them.
The most basic way for boats to interact is through buoyancy. We want the boats to tilt and sway with the movement of the sea. To float the boats, we sample five specific points underneath each boat: four at the edges and one in the center. The average position of the sampled points tells us how high the boat should float in the water, while the relative differences between the points let us infer a rotation, creating the effect of swaying and turning.
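The five-point sampling can be sketched like this (a Python illustration; the real implementation lives in Unity, and the sample layout and axis conventions here are assumptions):

```python
import math

def float_boat(sample, cx, cz, half_len, half_wid):
    """Estimate a boat's height, pitch, and roll from five ocean samples:
    bow, stern, port, starboard, and center. `sample(x, z)` returns the
    water height at that point."""
    bow    = sample(cx, cz + half_len)
    stern  = sample(cx, cz - half_len)
    port   = sample(cx - half_wid, cz)
    star   = sample(cx + half_wid, cz)
    center = sample(cx, cz)
    height = (bow + stern + port + star + center) / 5.0
    pitch = math.atan2(bow - stern, 2.0 * half_len)   # nose up/down
    roll  = math.atan2(port - star, 2.0 * half_wid)   # side-to-side sway
    return height, pitch, roll
```

On flat water all five samples agree, so pitch and roll vanish; a swell passing under the bow tips the boat back as the front samples rise.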
The angle of the waves and movement of the water also affects the speed of the boats. Big waves coming through will push boats along their way, or the rowers will have to struggle against them. Being sideways on a wave is dangerous, so the rowers will divert their course when fighting big currents to steer their vessels in the direction of oncoming swells.
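A rough sketch of that steering rule (Python, with a hypothetical wave-height threshold and turn rate): below the threshold the rowers hold their course; above it, they gradually turn to face into the oncoming swell.

```python
import math

def adjust_heading(heading, swell_dir, wave_height,
                   threshold=2.0, turn_rate=0.1):
    """Steer toward oncoming swells when waves exceed a threshold height
    (angles in radians); in calm water the rowers hold their course."""
    if wave_height < threshold:
        return heading
    target = (swell_dir + math.pi) % (2 * math.pi)  # face into the swell
    # shortest signed angular difference, wrapped to (-pi, pi]
    diff = (target - heading + math.pi) % (2 * math.pi) - math.pi
    return heading + turn_rate * diff
```

Turning a fraction of the remaining difference each frame gives the slow, deliberate course change of a rowed boat rather than a snap to the new heading.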
While no one process is especially noticeable on its own, combined they result in a generative animation that aims to appear natural, dynamic, and responsive to the way our ocean moves.
We are collating data from multiple sources to get a very detailed picture of weather conditions. We have a highly configurable system that enables us to switch weather providers and merge their results, which we’re using for redundancy, but also to get a combination of meteorological and oceanographic data.
Much of this information comes from our own government. The National Oceanic and Atmospheric Administration provides a tremendous wealth of information on our environment and climate, and makes it freely available as government works are in the public domain. Among the multiple sources we use from NOAA are weather observations from nearby Orlando Executive airport. These include things like temperature, wind, humidity, pressure, as well as a textual description of the weather that encodes information not quantified elsewhere, such as “Heavy Thunderstorm Rain Hail”.
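The provider-merging idea can be sketched minimally (Python, with invented field names; the actual system is considerably more configurable): providers are queried in priority order, and later ones only fill in fields the earlier ones left missing.

```python
def merge_weather(providers):
    """Merge observations from several providers in priority order: the first
    provider to report a field wins, later ones fill the gaps. This gives both
    redundancy and a blend of meteorological and oceanographic data."""
    merged = {}
    for report in providers:
        for field, value in report.items():
            if field not in merged and value is not None:
                merged[field] = value
    return merged
```

So an airport station can supply wind and temperature while an offshore buoy contributes the wave height neither would report alone.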
The image on the left shows the GUI with some of the weather data we’re using. The three modes from which the weather data can be drawn are live, historical, and mock. Most of the testing and designing was done in mock mode, which allowed us to define the weather and mix and match conditions to see the effect produced in our scene. The shape of the ocean is controlled by the wave height and wind speed, while atmospheric conditions such as cloud cover, haze, and precipitation also change based on the weather data.
The weather is only one of the elements that is constantly changing and affecting the expression of our world. Others include the time of day and the color palette. We have designed several sets of color palettes that include day and night states. The palettes change once a week, transitioning between night and day palettes at the beginning and end of each day.
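The weekly rotation and the day/night crossfade could be sketched like so (Python; the sunrise, sunset, and fade times are placeholders, since the real transitions follow the actual day):

```python
import datetime

def palette_index(date, num_palettes):
    """Pick a palette by ISO week number so it changes once a week."""
    return date.isocalendar()[1] % num_palettes

def day_night_blend(hour, sunrise=6.5, sunset=19.0, fade=1.0):
    """0.0 = full night palette, 1.0 = full day palette, with a linear
    crossfade lasting 2 * `fade` hours around sunrise and sunset."""
    if hour < sunrise - fade or hour > sunset + fade:
        return 0.0
    if sunrise + fade <= hour <= sunset - fade:
        return 1.0
    if hour < sunrise + fade:                         # dawn ramp
        return (hour - (sunrise - fade)) / (2 * fade)
    return ((sunset + fade) - hour) / (2 * fade)      # dusk ramp
```

Every timestamp thus resolves to one palette pair and a blend factor between its night and day states, so the scene’s color drifts smoothly rather than snapping.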
In the above map, the red pin is the location of the installation. Immediately to the west is the airport. The second icon, 20 nautical miles east of Cape Canaveral, is another source of data: a buoy.
Not just any buoy, though. This is a 20-foot-long observation platform which measures conditions of the ocean, of swell waves and wind waves, and of the air around it as well. With all this information, we could recreate in detail the conditions of the ocean near UCF.
Here are a few examples of the different states of the content and features of the project (which are listed below).
- the weather in the scene changes according to the weather conditions in Orlando
- the wave-height of the ocean changes according to the weather conditions at the Florida coast and in Orlando
- color palettes change every week, with a small variation every day, transitioning between different color treatments for daytime and nighttime
- the direction of the boats changes slowly over the course of the year
- the distribution of the boats varies all the time and is designed to create interesting compositions
- people in boats lean to stay upright, compensating for the tilt of the boat
- people’s rowing is also adjusted to the weather conditions
- people walking in front of the projection in the lobby are represented as paper boats floating on the ocean in front of the virtual camera
- the paper boats left on the ocean by the people present in the lobby can be picked up by the people in boats
- two of the ‘directorial gestures’ are shown in the demo: the ‘no rowing’ gesture, in which the people in the boats stop rowing and look at the viewer, and the ‘stray boat’ gesture, in which one of the boats turns and starts going in a different direction than all the other boats
- the underwater scene is shown on an LCD monitor opposite the projection
The audio aspect of our ocean scene emerged from noise generators, additive sine tones, control signals, and field recordings. The auditory world of Forth is composed of a handful of core “elements”: wind, waves, boats, bells, resonant bodies, and spatializers. Each element is highly parameterized to articulate a large dynamic range of activity, texture, and gesture.
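As an illustration of what “highly parameterized” means, here is a toy wind element (a Python sketch with invented mappings; the real synthesis runs in a dedicated audio environment): a noise generator whose loudness and brightness both follow the wind speed.

```python
import random

def wind_element(wind_speed, n=1000, seed=1):
    """One 'element' of the auditory scene: white noise shaped by wind speed.
    Faster wind raises the amplitude and widens the lowpass, so strong wind
    is both louder and brighter than a calm breeze."""
    rng = random.Random(seed)
    amp = min(1.0, wind_speed / 30.0)             # louder with stronger wind
    smooth = max(0.05, 1.0 - wind_speed / 40.0)   # calm wind = duller noise
    out, y = [], 0.0
    for _ in range(n):
        y += smooth * (rng.uniform(-1.0, 1.0) - y)  # one-pole lowpass filter
        out.append(amp * y)
    return out
```

Each element exposes a handful of such controls, so the same generator can render anything from a faint rustle to a gale as the weather data shifts.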
In anticipation of a complex set of interactive elements, both local to the atrium as well as environmental input from meteorological data, the system driving the auditory scene is capable of responding to a flux of data from moment to moment as well as month to month.
Prototyping interface for modifying “elements”
The sound at times serves to envelop visitors in the scene of our rowers, in which case diegetic audio cues function in concert with visual cues. Rowers cross the frame, for example, then recede into the distance behind the vantage point of “our” boat. The sonic environment, however, is not bound by a line of sight. Events may unfold across a full 360 degrees of the listening plane surrounding the audience. This sets the stage for a departure from a diegetic relationship with the projected image. The distant ringing of a bell, for example, may be heard off screen, calling across the space to visitors and other rowers from an unseen rower adrift.
In this auditory scene, you’ll hear a few of these elements working together, creating the gestalt of a fairly stormy day. This is a stereo rendering of what is otherwise a 5-channel surround sound scene.
A 13-min video documenting the installation: