Forth is a public art installation commissioned by the University of Central Florida for the Institute for Simulation and Training.
Forth is a seascape subject to the local weather conditions: the wave height and the atmospheric conditions reflect the weather sampled in Orlando and at the Florida coast. Small row boats advance across the water. A second screen in the lobby displays an underwater view of the scene. Immersive sound responds to the weather and to the flow of visitors through the lobby.
The viewers in the lobby are tracked by a camera and are represented in the projection as paper boats floating on the ocean. The rowers may pick up these paper boats — bridging again the real and virtual worlds.
Software development for Forth was led by James George. Sound design was created by Michael McCrea.
The Ocean and the Rowers
The people in boats interact with the ocean surface, adjusting to its changing shape. Their prescribed actions are altered by what they encounter. The rowing animations are created as perfect loops that could repeat endlessly and flawlessly: abstract notions of the action. These animations are imported into the dynamic ocean scene, and once in the procedural environment they are augmented by the world pushing back on them: the position of the oars is the meeting point of the intention and the world’s constraints.
As the boats rock on the waves, the people adjust their posture to sit upright. To do this we rotate their spines to gradually lean in the desired direction. As the boat tilts, the rotation of the oars also has to be adjusted so they reach the water surface. The people’s arms are controlled with inverse kinematics to account for these two corrections, so their hands remain on the oars.
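The two corrections can be sketched in a few lines. This is an illustrative model, not the installation’s actual code: the function and constant names are assumptions, and in the real scene the engine’s inverse kinematics then places the hands on the corrected oars.

```python
import math

def upright_lean(boat_tilt_deg, current_lean_deg, blend=0.2):
    """Gradually rotate the spine opposite the boat's tilt so the rower
    appears to keep their balance (exponential smoothing toward upright)."""
    target = -boat_tilt_deg                                  # lean against the tilt
    return current_lean_deg + (target - current_lean_deg) * blend

def oar_pitch(boat_tilt_deg, rest_pitch_deg=20.0):
    """Adjust the oar's pitch so the blade still reaches the water plane
    when the hull is tilted."""
    return rest_pitch_deg + boat_tilt_deg

# A boat rolling 10 degrees: over successive frames the rower's
# spine settles at a counter-lean of about -10 degrees.
lean = 0.0
for _ in range(50):
    lean = upright_lean(10.0, lean)
print(round(lean, 2))   # close to -10.0
```

The smoothing factor controls how lazily the rower reacts; a small value keeps the correction from looking robotic.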
When the waves get taller, the rowers’ strokes become more chaotic and out of sync, further augmenting the original animation imported into the engine.
Constructing the Ocean
We’re indebted to the hard work of the Unity 3D community, which collectively developed the open-source ocean simulation from which we began building our own ocean. The Ocean Shader project is in turn based on the research of Jerry Tessendorf, who wrote the original white paper on ocean wave simulation.
The first step is to generate an ever changing set of interconnected triangles in a mesh that describe the surface of the ocean. The mesh changes slowly as time progresses, updating in every new frame of our simulation to create a smooth animation.
This mesh also has the special property of “tiling”: if two copies of the same mesh are placed adjacent to one another, there is no visible seam between them. This lets us compute the math to generate the mesh once per frame, then apply that computation like a cookie cutter to hundreds of tiles to create an expansive ocean.
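A minimal sketch of the tiling idea, assuming a purely sinusoidal heightfield rather than the Tessendorf-style spectrum the project actually uses: because every wavelength divides the tile length exactly, the field repeats with no seam, so one tile’s heights can be computed per frame and stamped across the whole ocean.

```python
import math

TILE = 8          # vertices per tile edge
LENGTH = 10.0     # world-space size of one tile

def height(x, z, t):
    """Periodic height function: the wavelengths divide LENGTH exactly,
    so the surface repeats seamlessly every LENGTH units."""
    return (0.5 * math.sin(2 * math.pi * (x / LENGTH) + t)
          + 0.2 * math.sin(2 * math.pi * (2 * z / LENGTH) + 1.7 * t))

def tile_heights(t):
    """Compute the mesh heights once per frame for a single tile;
    every other tile reuses this same grid."""
    return [[height(i * LENGTH / TILE, j * LENGTH / TILE, t)
             for i in range(TILE)] for j in range(TILE)]

# The right edge of one tile matches the left edge of the next:
h = tile_heights(t=0.3)
assert all(abs(height(LENGTH, j * LENGTH / TILE, 0.3) - h[j][0]) < 1e-9
           for j in range(TILE))
```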
Rendering and texturing the mesh are the next big steps.
Finally we fine-tuned our expression of ‘storminess’. The shape of the ocean changes according to the wave-height data. We have three levels of detail in our waves: the big swells, the ‘regular’ waves, and the small wrinkles on the surface. We set the parameters for all three to arrive at the look we wanted to express.
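The three levels of detail can be pictured as three superposed wave bands whose amplitudes respond differently to a storminess control. The frequencies and scaling factors below are invented for illustration, not the values used in the installation.

```python
import math

def ocean_height(x, t, storminess):
    """Superpose three bands of waves: big swells, 'regular' waves, and
    small surface wrinkles. storminess in [0, 1] scales each band, with
    the swells responding to rough weather much more than the wrinkles."""
    swell    = (1.0 + 3.0 * storminess) * math.sin(0.05 * x + 0.4 * t)
    regular  = (0.4 + 1.0 * storminess) * math.sin(0.30 * x + 1.1 * t)
    wrinkles = (0.05 + 0.1 * storminess) * math.sin(2.5 * x + 4.0 * t)
    return swell + regular + wrinkles

# Rough weather raises the surface far more than calm weather does:
calm  = max(abs(ocean_height(x * 0.5, 0.0, 0.0)) for x in range(400))
storm = max(abs(ocean_height(x * 0.5, 0.0, 1.0)) for x in range(400))
```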
Boat and Ocean Interaction
One crucial aspect of the project is creating ways for the boats to be affected by the movement of the ocean, and allowing the rowers to adapt as the ocean churns beneath them.
The most basic way for boats to interact is through buoyancy. We want the boats to tilt and sway with the movement of the sea. To float the boats, we sample five specific points underneath each boat: four at the edges and one in the center. The average position of the sampled points tells us how high the boat should float in the water, while the relative differences between the points let us infer a rotation, creating the effect of swaying and turning.
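The five-point sampling can be sketched as follows. The function and parameter names are hypothetical, but the logic follows the description above: the average gives the float height, and the differences between opposite points give an approximate pitch and roll.

```python
import math

def buoyancy(sample_height, cx, cz, half=1.0):
    """Sample the wave height at four edge points and the center of the
    hull, then derive the boat's float height and rotation."""
    front  = sample_height(cx, cz + half)
    back   = sample_height(cx, cz - half)
    left   = sample_height(cx - half, cz)
    right  = sample_height(cx + half, cz)
    center = sample_height(cx, cz)
    y = (front + back + left + right + center) / 5.0   # how high the boat floats
    pitch = math.atan2(front - back, 2 * half)          # bow rising or dipping
    roll  = math.atan2(right - left, 2 * half)          # side-to-side sway
    return y, pitch, roll

# On a water surface that slopes up along x, the boat rolls but does not pitch:
y, pitch, roll = buoyancy(lambda x, z: 0.5 * x, cx=2.0, cz=0.0)
```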
The angle of the waves and movement of the water also affects the speed of the boats. Big waves coming through will push boats along their way, or the rowers will have to struggle against them. Being sideways on a wave is dangerous, so the rowers will divert their course when fighting big currents to steer their vessels in the direction of oncoming swells.
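The course-diverting behavior amounts to steering the bow toward the oncoming swell once the waves pass a danger threshold. This is a sketch under assumed names and constants, not the project’s actual steering code.

```python
import math

def steer(heading, wave_dir, wave_height, threshold=1.5, turn_rate=0.02):
    """When waves get big, turn the bow toward the oncoming swell so the
    boat is not caught sideways; in calm water, hold course.
    Angles are in radians; turn_rate limits the turn per frame."""
    if wave_height < threshold:
        return heading
    # shortest signed angle from the current heading to the swell direction
    diff = (wave_dir - heading + math.pi) % (2 * math.pi) - math.pi
    return heading + max(-turn_rate, min(turn_rate, diff))
```

Clamping the per-frame turn keeps the correction gradual, so the divert reads as the rowers struggling rather than the boat snapping to a new course.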
While no single process is especially noticeable, combined they result in a generative animation that aims to appear natural, dynamic, and responsive to the way our ocean moves.
We are collating data from multiple sources to get a very detailed picture of weather conditions. We have a highly configurable system that enables us to switch weather providers and merge their results, which we’re using for redundancy, but also to get a combination of meteorological and oceanographic data.
Much of this information comes from our own government. The National Oceanic and Atmospheric Administration provides a tremendous wealth of information on our environment and climate, and makes it freely available, as government works are in the public domain. Among the multiple sources we use from NOAA are weather observations from nearby Orlando Executive Airport. These include temperature, wind, humidity, and pressure, as well as a textual description of the weather that encodes information not quantified elsewhere, such as “Heavy Thunderstorm Rain Hail”.
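One way to sketch the provider merge described above, with hypothetical field names: providers are ordered by priority, and each field is taken from the first provider that reports it, so later sources act as redundancy while still contributing data the others lack (such as the buoy’s wave height).

```python
def merge_weather(reports):
    """Merge observations from several providers, ordered by priority:
    for each field, keep the first value seen, so later providers act as
    fallbacks and can contribute fields the earlier ones lack."""
    merged = {}
    for report in reports:
        for key, value in report.items():
            merged.setdefault(key, value)
    return merged

# Illustrative reports: a METAR-style land observation plus a buoy.
airport = {"temperature_c": 29.0, "wind_kts": 12,
           "description": "Thunderstorm Rain"}
buoy = {"wave_height_m": 1.8, "wind_kts": 14}
merged = merge_weather([airport, buoy])
```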
The image on the left shows the GUI with some of the weather data we’re using. The three modes from which the weather data can be drawn are live, historical, and mock. Most of the testing and designing was done in mock mode, allowing us to define the weather and mix and match conditions to see the effect produced in our scene. The shape of the ocean is controlled by the wave height and wind speed, and the atmospheric conditions such as cloud cover, haze, and precipitation also change based on the weather data.
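The three modes can be sketched as a single source object that either polls the providers, replays logged data, or returns hand-set values. The class and parameter names here are assumptions for illustration.

```python
class WeatherSource:
    """Three modes for the weather feed: 'live' polls the providers,
    'historical' replays logged observations, and 'mock' returns values
    fixed by hand so conditions can be mixed and matched while designing."""
    def __init__(self, mode="live", mock_values=None, log=None):
        self.mode = mode
        self.mock_values = mock_values or {}
        self.log = log or []

    def current(self, fetch_live):
        if self.mode == "mock":
            return self.mock_values
        if self.mode == "historical":
            return self.log.pop(0) if self.log else {}
        return fetch_live()

# Designing in mock mode: dial in a storm regardless of the real weather.
src = WeatherSource(mode="mock",
                    mock_values={"wave_height_m": 3.0, "cloud_cover": 0.9})
print(src.current(fetch_live=lambda: {}))
```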
The weather is only one of the elements that constantly changes and affects the expression of our world; others include the time of day and the color palette. We have designed several sets of color palettes, each with day and night states. The palettes change once a week, transitioning between the night and day palettes at the beginning and at the end of the day.
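The weekly rotation and the dawn/dusk transition can be sketched as a palette lookup plus a cross-fade. The ramp widths and color values below are illustrative assumptions, not the installation’s actual palettes.

```python
def blend(c1, c2, t):
    """Linear interpolation between two RGB colors, t in [0, 1]."""
    return tuple(a + (b - a) * t for a, b in zip(c1, c2))

def palette_for(week_index, day_fraction, palettes):
    """Pick this week's (night, day) palette pair, then cross-fade between
    the two near the edges of the day (here: simple quarter-day ramps)."""
    night, day = palettes[week_index % len(palettes)]
    if day_fraction < 0.25:                    # morning: fade night -> day
        return blend(night, day, day_fraction / 0.25)
    if day_fraction > 0.75:                    # evening: fade day -> night
        return blend(day, night, (day_fraction - 0.75) / 0.25)
    return day

# One example palette set: deep night blue to a pale daytime sky.
palettes = [((0.0, 0.0, 0.1), (0.6, 0.8, 1.0))]
```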
In the above map, the red pin is the location of the installation. Immediately to the west is the airport. The second icon, 20 nautical miles east of Cape Canaveral, is another source of data: a buoy.
Not just any buoy, though. This is a 20-foot-long observation platform which measures conditions of the ocean, of swell waves and wind waves, and of the air around it as well. With all this information, we could recreate in detail the conditions of the ocean near UCF.
Here are a few examples of the different states of the content and features of the project (which are listed below).
- the atmospheric conditions in the scene change according to the weather in Orlando
- the wave-height of the ocean changes according to the weather conditions at the Florida coast and in Orlando
- color palettes change every week, with a small variation every day, transitioning between different color treatments for daytime and nighttime
- the direction of the boats changes slowly over the course of the year
- the distribution of the boats varies all the time and is designed to create interesting compositions
- people in boats lean to stay upright, compensating for the tilt of the boat
- people’s rowing is also adjusted to the weather conditions
- people walking in front of the projection in the lobby are represented as paper boats floating on the ocean in front of the virtual camera
- the paper boats left on the ocean by the people present in the lobby can be picked up by the people in boats
- two of the ‘directorial gestures’ are shown in the demo: the ‘no rowing’ gesture, in which the people in the boats stop rowing and look at the viewer, and the ‘stray boat’ gesture, in which one of the boats turns and starts heading in a different direction from all the other boats
- the underwater scene is shown on an LCD monitor opposite the projection
The audio aspect of our ocean scene emerged from noise generators, additive sine tones, control signals, and field recordings. The auditory world of Forth is composed of a handful of core “elements”: wind, waves, boats, bells, resonant bodies, and spatializers. Each element is highly parameterized to articulate a large dynamic range of activity, texture, and gesture.
In anticipation of a complex set of interactive elements, both local to the atrium as well as environmental input from meteorological data, the system driving the auditory scene is capable of responding to a flux of data from moment to moment as well as month to month.
Prototyping interface for modifying “elements”
The sound at times serves to envelop visitors in the scene of our rowers, in which case diegetic audio cues function in concert with visual cues. Rowers cross the frame, for example, then recede into the distance behind the vantage point of “our” boat. The sonic environment, however, is not bound by a line of sight. Events may unfold across a full 360 degrees of the listening plane surrounding the audience. This sets the stage for a departure from a diegetic relationship with the projected image. The distant ringing of a bell, for example, may be heard off screen, calling across the space to visitors and other rowers from an unseen rower adrift.
Forth was later reconfigured as an installation called Capacity, which shows the editor view of the scene built for Forth. Revealing the edges of the illusion created for the digital camera, it includes debug graphics and artifacts of the digital construction process.
“Code hides itself in the very act of consummating its own expression. A game expends itself in the very act of its being played. And so the game retreats from its own essence. But only in such a way as to be more true than the essence could ever be,” writes Alex Galloway.
“Most fantasy game designers would regard visible signs of any technological underpinnings as unwanted anachronisms that would threaten the constitution of the immersive fantasy they are attempting to construct. The resulting by-products of this problem can be found in the designers’ introduction of metaphors that function to assimilate unwanted technological residues into the narrative diegesis,” writes Eddo Stern.