Once upon a time I was an oceanographer, and many mist-shrouded years before that I had some fragmentary dream of designing ships for a living. I thought of it as something called hydrodynamic engineering. Now I know it’s called naval architecture. Needless to say I moved on from these dreams to others, better and cooler and drier. But imagine my joy when the UAMN production unit found itself working (or was it pushed, maybe, at least in part by myself?) on both an animated film about bowhead whale migration and an exhibit about the UAF-managed research vessel Sikuliaq, then newly under construction. And then, this last summer, I had the good fortune of joining our Earth Sciences team for a week on the Yukon River in search of dinosaur tracks. AND THEN, this week it snowed and melted and there were puddles in the road.
But forget the puddles. Forget the dinosaurs; that was time on the river, with cameras, powering downstream for 500 miles — looking at water. A lot of our endeavors seem to concern water these days. THIS YEAR seems to be all about water. Let’s not even get into the soon-to-be-advertised Polar Voices project. There’s plenty of water and coastline in that one too. Check back soon for a blog very close to this one.
But this one is all about computer-generated water. We’ve looked at our rendered water critically for more than six months now and still continually find things to nit-pick (at least I do). But here is where the Yukon River comes in: looking very critically at the very real water on the river, I found flaws there too, especially in the raft’s wake and in some of the turbulence behind the outboard motors when coupled with the river chop. Sometimes it did not seem as realistic as it should have been, given it really was quite wet and quite cold. Seriously. Sometimes, you stare at something for too long.
Is there a lesson to take from this? Yes, I think we’ve achieved some darned good water for this show. About time.
A couple years ago we invested in some robust fluid simulation software for the animated rendering of tundra ponds for a museum film about the collections in the museum galleries (it’s a long story. About half an hour). This year, we updated the software to handle the projects at hand. What’s good for the pond is good for the ocean – WITH UPGRADES.
Whether our film camera is below the water, looking at an angle near the water surface, looking down from thousands of feet in the air or even hundreds of miles out from satellite, or seeing a strip of water wedged between great sheets of Arctic pack ice, OCEAN appears in more than 90% of the shots in our little film about whales. Each and every one of these perspectives requires something very different from the fluid simulator.
Not nearly as sublime as floating the Yukon, most of our simulated water begins as an infinitely thin sheet in the shape of a square. A “square” skin of water such as shown below is constructed of between 1 and 5 million triangles, is fully animated according to the laws of fluid dynamics, and can be reconfigured for any wind speed, the presence of whitecaps, and so on.
The square of water shown above is also only 100m wide, and while it is significantly larger than a bowhead whale, it is also quite inadequate when it comes to filming said whale. When filming arthropods and zooplankton, the camera depth of field and “fogginess” of the under-water ensures we never see as far as 50 meters in any direction (and usually much less), but when we film a whale moving through and on the surface of the ocean, we run out of surface width very quickly. Luckily we are far from the first production crew to run into such problems. The above ocean square is also very thoughtfully designed to be perfectly, infinitely tileable. Swim off one edge and you instantly appear on the opposite edge and never know the difference. Computers do INSTANTLY very well. Taking the subtle magic of computer processing even further, we can duplicate our already rather detailed square and extend it out to something closer to 2 kilometers square. Wash, rinse, and repeat.
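That wrap-around is simple enough to sketch in code. Here is a minimal illustration of a periodic 100-meter tile (the tile size comes from above; the function itself is my own toy, not anything from our pipeline):

```python
TILE = 100.0  # side length of the simulated square, in meters

def local_coords(x, y, tile=TILE):
    """Wrap a world position back into the single simulated tile.

    The surface is periodic: the water height at (x, y) equals the
    height at the wrapped coordinates, so swimming off one edge drops
    you seamlessly onto the opposite edge.
    """
    return x % tile, y % tile

print(local_coords(250.0, -30.0))  # (50.0, 70.0): same patch of water
```

Python’s modulo conveniently returns a non-negative result for negative positions, which is exactly the wrap-around behavior a periodic surface needs.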
We wouldn’t want the computer to actually think about all those triangular faces we’ve just asked it to think about (as many as 2.2 billion). As far as the machine is concerned, there is still only the original square at the center of the instanced array, with its modest several million triangles. Technically, the method is called INSTANCING, and it is a lot more efficient on computer RAM than the horrific alternative: in this case, 441 full copies. The one square is all the computer need worry about, even though the end result is so much more.
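The memory arithmetic behind instancing is worth a quick back-of-the-envelope check. The per-face byte count below is my own rough assumption (a raw, unshared triangle soup); only the 1-5 million faces, the 100m tile, and the roughly 2km spread come from the shots above:

```python
FACES = 5_000_000        # high end of the 1-5 million triangle range per tile
BYTES_PER_FACE = 36      # assumed: 3 vertices x 3 floats x 4 bytes, unshared
TILES = 21 * 21          # 21 tiles x 100 m = 2.1 km per side -> 441 copies

mesh = FACES * BYTES_PER_FACE      # the one real mesh: ~180 MB
transforms = TILES * 16 * 4        # one 4x4 float matrix per instance: ~28 KB
instanced = mesh + transforms      # what instancing actually keeps in RAM
duplicated = mesh * TILES          # the horrific alternative: ~79 GB

print(TILES, FACES * TILES)           # 441 instances, 2,205,000,000 faces
print(round(duplicated / instanced))  # ~441x savings
```

Each instance needs only a transform (where to draw the copy), which is why the savings factor is essentially the tile count itself.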
Of course, there are caveats. We would never want our camera frame to reveal a perspective such as this…
…where the repeated squares are obvious and artificial. For such a shot we would need to simulate a different square of greater scale and less resolution. Let’s face it, there are only 2.3 million pixels in the film’s final image. It is only moving animals and an animated camera that require more. But for a shot like this…
…with a quickly moving object and camera, variances in lighting, and an ocean surface changing over time, it is very difficult to see the instanced nature of the squares, even though the repetition is “visible” in this scene. Even in the “obvious” array render above, we can see that the offending pattern is broken up best where the sunlight hits the water.
For Arctic Currents, we have pre-simulated half a dozen water “squares” of varying scales and sea heights. The higher seas are used for open water shots and the more subtle skins for where the water exists between ice floes and in leads. We will likely simulate another half dozen for specific shots before the project is through.
A simulated ocean square requires about 30-60GB of drive space to store for later use, and about 2-3GB of ready RAM overhead to load and apply to an animated scene. Very manageable compared to the FOLLOWING, far more complicated ocean simulation.
What happens when a whale breaks the surface? This isn’t something we can simply instance across a wider ocean like we can do for wind-driven waves. A whale’s wake will not tile realistically. In this case, we need to extend the simulated ocean to a point where sleight-of-hand with cameras can hide EDGES. It’s horrible even to think of it, the ocean having EDGES, but these are the times in which we live.
More on these tiny nightmares later, but to tease: what better way to “hide” the EDGE of the ocean than to make an edge, an ice edge. Here, one of our whales is making waves in a lead between two big sheets of ice. To save drive space, memory, and calculation time, we make the ocean relatively shallow (only a few meters). Trust that the water ends abruptly just to the left and right of the frame; the full lead width is about 100 meters. The simulator is very accommodating. We can dial up the resolution as far as our tricked-out machines can handle — about 30 million calculated particles. It takes about a week to run through the various levels of watery domain, from the core fluid to the splashes, waterline details, foam, and even mist, when and where they are needed. The real sticking point is the drive space. A 30-second simulation, calculated and stored for subsequent in-scene rendering, requires just over 1,000GB. Okay, I’ll say it. It takes a TERABYTE. Hmmm. Yes, more on this later.
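That terabyte is easier to believe with a little arithmetic. The cache rate and the derived per-particle figure below are my own assumptions; only the 30 seconds, the 30 million particles, and the roughly 1,000GB come from the shot itself:

```python
SIM_SECONDS = 30
FPS = 30                       # assumed cache rate; could just as well be 24
FRAMES = SIM_SECONDS * FPS     # 900 stored frames of particle data
PARTICLES = 30_000_000
TOTAL_BYTES = 1_000 * 10**9    # "just over 1,000GB"

per_frame = TOTAL_BYTES / FRAMES       # bytes written per cached frame
per_particle = per_frame / PARTICLES   # bytes per particle per frame

print(round(per_frame / 1e9, 2))   # ~1.11 GB per frame
print(round(per_particle, 1))      # ~37.0 bytes: room for position,
                                   # velocity, and a few extra channels
```

Roughly 37 bytes per particle per frame is plausible for a float position and velocity (12 bytes each) plus ids and auxiliary channels, which is why the drive fills up long before the CPUs complain.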
It is very pretty though, and really, it would take about the same to simulate a rafting adventure on the Yukon, or even a puddle in the middle of a road — because we’d want the camera to come a lot closer, naturally, and NEED considerably better resolution — right up until the computers throw up their hands. Magically, right about THERE is where things begin to look about right, unless we train our eyes on it too long.
I’ve always liked water, even if it doesn’t always look right. Don’t let the math fool you. It can do strange stuff. It’s a character all its own.
– Roger Topp (UAMN Head of Exhibits and Digital Media Production)