Color the Temple of Dendur - Research at the MET MediaLab
Area: Interactive Projection Mapping & Illustration
Company: Metropolitan Museum of Art
Year: Fall 2013
We worked in conjunction with the Department of Egyptian Art to explore the possibilities of color projection and projection mapping onto the Temple of Dendur. We digitally illustrated and colored one of the temple's scenes and projected it onto the temple to simulate the original colors it might have had. We then developed a storytelling/educational tool for presenting the scene to an audience: a mobile interface to control what is displayed, allowing the presenter to highlight different figures, play animations, and zoom into the glyphs. We worked under the guidance of Research Fellow Erin Peters and Don Undeen, the MET MediaLab director. Erin has used the work in her presentations to describe her findings about the colors the Temple could hypothetically have been painted with.
We started the project by digitizing the image of the scene. Using a high-resolution photo, we drew vector shapes for each area. We chose vectors so we could easily tweak and test different colors. However, some parts of the scene (especially the glyphs) were eroded, so Erin helped us identify the right shapes by showing us scholarly references.
After finalizing the drawing, Erin gave us the colors she had found in her research, and we came up with a "safe" first scene. However, she had also found evidence of patterns in the paintings that cannot be seen in the carvings. She sent us images that we adjusted to fit our scene. At this point, given the granularity of the details, we realized it would simply be faster to work in Photoshop.
We made several evening visits to the Temple, even before we had the final results from the research, so we could test different equipment and the mapping tool. Colors shift slightly when projected onto sandstone rather than a white wall, so we brought the researcher in and tweaked the colors in real time. This was very important to the success of the project.
An important part was adding interactivity and exploring different ways of presenting the image to the museum audience. From the drawings, we created animations that tell the story of the scene, e.g. the Emperor Caesar Augustus, depicted as Pharaoh, arriving and offering wine to the deities Hathor and Horus. We also wanted to explain through animations how figures in Egyptian art are represented flat and, in this case, one behind the other, when in reality they stood side by side. Another feature we explored was the ability to highlight different figures and their respective texts. Zooming helped viewers perceive specific details more clearly.
Once we had tested the animations and concluded they looked good, we needed a way for the researcher to control them. We developed a system with openFrameworks and Spacebrew that displays different scenes according to input from a mobile interface; the interface itself was designed around the researcher's description of her presentation. We used the Syphon framework to bring the video output into MadMapper, the video mapping tool that allowed us to map the drawing exactly onto the carved wall. Mapping the scene was easy and took about five minutes to set up.
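The core of such a control system is simple: the mobile interface sends a command string over the network (in our stack, via Spacebrew), and the application maps it to the scene to draw on the next frame. The sketch below is a minimal, hypothetical illustration of that routing logic in plain C++, not the project's actual code; the scene and command names are invented for the example, and in the real app this would sit inside an openFrameworks message callback and draw loop.

```cpp
#include <string>
#include <unordered_map>

// Hypothetical scene states for the presentation. In the real app each
// scene corresponded to a pre-rendered After Effects animation or a
// highlight/zoom overlay.
enum class Scene { Original, FullColor, HighlightFigure, ZoomGlyphs };

class SceneController {
public:
    SceneController() : current(Scene::Original) {
        // Command strings the mobile interface might send (illustrative names).
        commands = {
            {"original", Scene::Original},
            {"color",    Scene::FullColor},
            {"figure",   Scene::HighlightFigure},
            {"glyphs",   Scene::ZoomGlyphs},
        };
    }

    // Apply a command received from the network; unknown commands are
    // ignored so a typo on the controller can't blank the projection.
    bool handleMessage(const std::string& cmd) {
        auto it = commands.find(cmd);
        if (it == commands.end()) return false;
        current = it->second;
        return true;
    }

    Scene scene() const { return current; }

private:
    Scene current;
    std::unordered_map<std::string, Scene> commands;
};
```

Keeping the mapping in one table makes it easy to add scenes as the research evolves, and ignoring unrecognized commands keeps the projection stable during a live presentation.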
Here is a summary of the whole process and the tools we used: gather images and references; draw in Illustrator and Photoshop; import the file into Adobe After Effects to animate the scenes; add these assets to an openFrameworks application connected to a web server (which also lets users access a mobile interface to control the projection); and send the visuals to be projected through Syphon to MadMapper.