

Senior Thesis in Media Studies

The Mediasphere: Intermediation with Digital Planetaria

Author: Advisor:

Ziv Epstein Dr. Kim-Trang Tran

[email protected]

Submitted to Pomona College in Partial Fulfillment of the Degree of Bachelor of Arts

December 11, 2016


Contents

1 Introduction
  1.1 A Brief History of Expanded Cinema
  1.2 Virtual Reality as Youngblood's Immersive Techno-Brainchild
  1.3 Planetaria as the Bleeding Edge of Digital Media Technology
  1.4 Related Work in Planetaria

2 Methods
  2.1 Slicing and Dome Mapping
  2.2 Mirror Lens
  2.3 Adobe AfterEffects
  2.4 Analog Video Synthesis

3 Results
  3.1 Native Assets
  3.2 3D Models
  3.3 Fulldome Mapped Video
  3.4 Music Ensemble
  3.5 The Show

4 Discussion


Abstract

An explosion of digital media technologies has allowed for new developments in visualization and immersive visual art platforms. The Pomona planetarium offers a unique opportunity to explore experimental digital media in a new setting that remains true to its conception but is realized in a highly immersive manner. I import various media assets from different mediums and creative styles to explore their potential as a new art form within the dome. Then, in the spirit of Stan VanDerBeek, Buckminster Fuller and Gene Youngblood, I build and present an exhibition that showcases the convergence of these media. This work demonstrates the planetarium’s capacity as a state-of-the-art visualization and community engagement platform.


Chapter 1

Introduction

The aesthetic revolution of digital computing has, in its wake, led to new explorations of digital media as a cutting-edge art form. As soon as computers gained the ability to visually render information on cathode ray tube displays and to read external input from a variety of data sources, multimedia projects that explored both the aesthetics of technology and the technology of aesthetics began to propagate across culture.

1.1 A Brief History of Expanded Cinema

Even during the infancy of computing, technologists and artists alike explored the potential of new digital media. One of the first examples is John Stehura's innovative 1964 film Cybernetik 5.3. Stehura's work combines "computer graphics with organic live-action photography to create a new reality that is both haunting and extraordinarily beautiful" [21]. By integrating the objectivity of real-world imagery with mathematically well-defined generative algorithms, Stehura created a visual experience that is both harmoniously patterned and evocative of the signifiers of reality [8]:

“Throughout the film, complex clusters of geometrical forms and lines whirl, spin, and fly in three-dimensional space. Showers of parallel lines come streaking out of infinity. Crescents and semicircles develop dangling appendages and then expand until they no longer exist. Whirling isometric skeletal structures permute into quadrant figures, polygons, rotating multiple-axis tetrahedrons.

These images are neon-bright in alternating blue, red, orange and green. They vibrate rapidly as they take shape and disintegrate. The staccato, spiraling, buzzing rumble of Dockstader’s sound complements the kinetic activity with its own sense of acoustical space. The storm of geometrical fantasy is superimposed over a star-spangled image of the solar system in emerald green” —Gene Youngblood [21].

Stehura showcased how the modularity and dynamism of digital media technology allow for phantasmagoric multimedia expositions by stitching together geometric algorithms1 and photo-realistic images of space with externally curated sound.

1Stehura programmed Cybernetik 5.3 entirely in Fortran, a state-of-the-art language for scientific computation at the time.

The work of Stehura and his contemporaries inspired a digital media revolution, termed "Expanded Cinema" by Gene Youngblood in his seminal book of the same name, in which he argues that these new technologies unite art and life through a new form of consciousness [21].

Youngblood expounds on the various technological breakthroughs that computation catalyzed, and on how these breakthroughs mediate modern cinema advancements and shape human interaction. He defines the Technosphere as a symbiosis between man and machine: the computer does not consign man to obsolescence, but rather frees him from specialization and augments his intelligence. As with any symbiosis, the process is co-evolutionary. Humanity not only adopts the technology of the era into the socio-cultural zeitgeist; machine vision also conforms collective human perception and consciousness to the technology of that era [2].

The interdependence of the Technosphere creates a positive feedback loop. Technology is developed with a particular ideology in mind, and is then disseminated across mainstream channels. As the technology becomes embedded within the fabric of daily life, the ideology associated with it becomes naturalized. The cycle repeats with the next round of technological updates, and a more extreme implementation of the naturalized ideology is realized.

But Expanded Cinema is more than a purely technical synthesis of man and machine. Its central tenet is expanding and cohering the subjectivities of the participants into a unified consciousness. Technological advancements catalyze this process, but do not define it. For example, consider the famed Single Wing Turquoise Bird concert group. A psychedelic kaleidoscope of rock music, projected visuals and pseudo-religious phantasmagoria2, the Single Wing Turquoise Bird has no defined program. Rather, "each presentation evolves from the interacting egos of the group working in harmony" [21]. What emerges is an ephemeral entity, collaborative and agentful. The intimacy and potency of this approach arises primarily from the group's capacity to work together and synthesize the disparate components. Indeed, after seeing it, Youngblood himself exclaimed, "I've never seen a light show in which the participants were so sensitive to each other. After the first five minutes you realize something very rare is happening. You're so intense when you work. Your hands are separate from your minds. You stare so intently at the screen. It's as though you are projecting your brain waves. I get a vision of beams flashing from your eyes like some science fiction thing." As the Single Wing Turquoise Bird indicates, Expanded Cinema is necessarily a social enterprise which uses cutting-edge technology as a bridge to bring together groups of people.

1.2 Virtual Reality as Youngblood’s Immersive Techno-Brainchild

Now in 2016, the perhaps logical conclusion of Youngblood's Technospherical world is realized by the mass adoption of Virtual Reality technologies [14, 16]. VR synthesizes programmatic assets like those of Stehura and Guy, but immersively superimposes the digital media over the observer's subjective reality through state-of-the-art dynamic head-tracking and stereoscopic 3D imaging [11]. This fusion of individual subjectivity with digitized virtual objectivity mediates the shift in collective human consciousness and perception posited by Expanded Cinema. Thus VR and the infrastructure surrounding it are in many ways the cutting edge of Expanded Cinema.

2Youngblood refers to it as “a combination of Jackson Pollock and 2001, of Hieronymus Bosch and Victor Vasarely, of Dali and Buckminster Fuller.”

Yet as a modern manifestation of Expanded Cinema, VR also has several technological and systematic limitations. Technologically, there are no widely distributed, intuitive programmatic interfaces for importing multimedia assets to a VR headset. Currently, there are only two feasible methods for rendering content in virtual reality. The first is directly importing stereoscopic recorded video using a true fisheye or ultra-wide rectilinear lens [7], but this equipment is costly and requires extensive photographic training. The second is using industrial game engines, such as Unity [5], which allow users to build video-game-like worlds. However, these methods require a pre-structured game design and media assets limited to 3D models. What results is a subjective virtual experience with a well-defined aesthetic and semiotic architecture. Aesthetically, these virtual environments are "based on photo-realistic hard-edged objects in empty space [4]." Because of the computational cost of real-time rendering, the simplistic tonalities and textures of VR worlds leave much to be desired, evoking the video games of the late 1990s. Thus, the very aesthetic fabric of the system blatantly reminds the user that they are viewing a virtual world: the user's mapping of this virtual world to the physical world is purely symbolic, and abandons many of the spatio-sensorial intuitions we use on a daily basis.

VR also has systematic limitations as a multimedia platform for Expanded Cinema. Current versions do not yet allow for a shared reality between two people. Rather, a user experiences the virtual reality alone while others look on at a first-person camera view from a desktop monitor. A key element of Expanded Cinema is sociality [12, 21]. These digital multimedia experiences are not experienced in isolation, but operate within a broader social and cultural context. As such, they form a collective consciousness that synthesizes the varied subjectivities of each individual with the communal perceptual objectivity of the experience. Virtual reality in its current form isolates the user and lacks the shared experience that Expanded Cinema leverages in its execution.

1.3 Planetaria as the Bleeding Edge of Digital Media Technology

Standing in contrast to this solipsism of VR is the planetarium theater. With participants scattered about and a human operator at the helm, other human agency is actively injected into the space. The physicality of the dome creates an augmented reality rather than a virtual one: elements of the real world can seamlessly interact with elements of the virtual world. Thus, the planetarium itself has a certain form of agency as the arbiter of the elements and people within it. Entering the planetarium is entering a completely distinct world with its own sounds, rhythms and rules. The technical systems involved, such as the projectors, computers and software, are fully hidden from the participant. The participant only sees the visuals overlaid on the dome and has no idea how they are generated. This illusion makes the space magical and otherworldly, divorced from the societal conventions and norms of the world outside.

Digital media theorists such as Malcolm Le Grice have argued that for these media technologies, the infrastructure surrounding the content can be equally if not more meaningful than the content itself [10]. Several signifiers surrounding the planetarium serve to reify it.

First, the physicality of the space, with its muted computer whirring, industrial server rack, and plush reclining seats, defines which behavior is and is not acceptable within it. It is a space for viewing media, where the user is confined to a purely observational role. Imagine, in contrast, if the seats within the planetarium were removed and the 360-degree symmetry of the dome were fully leveraged. Then the interaction between the user and the dome would take on a completely different set of meanings.

Second, the nature of the media suitable for the dome forms an "ecosystem" of intermedia [21]. Planetaria are designed to project utter darkness with faint pinpricks of light. As such, dimly-lit3 media perform well on the dome, preserving the illusion of a larger space, while brighter media over-illuminate and remind the user that she is under a dome. The nontraditional geometry of the dome further differentiates the media well suited for it. Up-close visuals of daily life that are often well suited for a theater fail to take advantage of the sphericality of the dome, while radial, symmetric or environmental visuals maintain the illusion of a larger space. Finally, from the technical perspective, the planetarium is designed to display large 4K content on the order of gigabytes. In one sense, the media that satisfy these criteria and are thus "well-suited" for the planetarium have little else in common, but they are all linked in their immersiveness and richness. As digital media technologies shift further toward immersive visual experience, this ecosystem becomes more pervasive and relevant.
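This dimness criterion can be expressed as a simple screening test. The sketch below is illustrative only: the function name, the use of NumPy, and the pass/fail logic are my assumptions, not part of any planetarium toolchain; the 30-100 range comes from the footnote.

```python
import numpy as np

# Assumed heuristic from the text: media whose mean pixel intensity
# falls roughly in the 30-100 range (out of 255) stay dim enough to
# preserve the dome's illusion of a larger space.
DIM_RANGE = (30, 100)

def suits_dome(frame: np.ndarray) -> bool:
    """Return True if the frame's mean grayscale intensity falls in
    the dimly-lit range suggested for fulldome projection."""
    gray = frame.mean(axis=-1) if frame.ndim == 3 else frame
    return DIM_RANGE[0] <= gray.mean() <= DIM_RANGE[1]

# A bright frame over-illuminates the dome; a uniformly dim one passes.
print(suits_dome(np.full((64, 64), 200, dtype=np.uint8)))  # False
print(suits_dome(np.full((64, 64), 60, dtype=np.uint8)))   # True
```

A real screening pass would average such a test over every frame of a clip rather than judging a single image.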

By their very name, planetaria are deeply related to science and the exploration of planets and other heavenly bodies. Indeed, they are primarily used to visualize astrophysical phenomena. Originally analog, planetaria consisted of a dark dome and a star projector that would project the night sky and stellar motions of a given temporal-spatial coordinate. However, advances in projector technology have caused a shift toward fully digital planetaria. This shift has created an explosion in planetarium usage as a science education platform and visualization interface for astrophysical simulations and images [6, 22, 23]. Thus, a planetarium is the confluence of art and science, or, as Youngblood puts it, an Aesthetic Machine. Through it, the artist "is in a position to deal directly with the fundamental scientific concepts of the twentieth century [21]." Indeed, due to the highly visual nature of modern science, the value of the artist is well appreciated among scientific communities [15]. Because of this rich history of collaboration between art and science, the planetarium is an organic platform for further media exploration. Its potency lies in its ability to take scientific models and phenomena that are obscured by a veil of abstraction and, with an artistic touch, visually bring them to life.

3I.e. with a mean pixel intensity in the 30-100 range

One example of bringing scientific data to life is the planetarium's ability to spatially interact with real images of planetary bodies. In the summer of 2015, the interplanetary spacecraft New Horizons arrived at Pluto after a 9-year, 4.6-billion-mile voyage. The first craft to be sent directly to Pluto, New Horizons offered us the first up-close images of Pluto ever taken in the history of mankind.4 These images were then algorithmically stitched together to form a 3D model of Pluto, which can be orbited, approached and landed upon in the planetarium. This application is scientifically and philosophically profound, as it allows us to directly interact with a planetary body so distant that we discovered it only around 100 years ago.

Digital planetaria also have unparalleled technical specifications that make them uniquely suited for the visual manifestation of digital data. Even in 1970, Youngblood recognized the importance of screen resolution in bringing these digital realities to life. "The video system consists of 480 lines of resolution, each line composed of 512 individual points. Each of these [245,760] points can be set to display at any of 64 desired intensities of the gray scale between total black and total white," Youngblood says of a then state-of-the-art JPL computer graphics subsystem. "Possible variations for one single image thus amount to 64 times 245,760," he lauds.

“If the visual subsystems exist today, it’s folly to assume that the computing hardware won’t exist tomorrow. The notion of “reality” will be utterly and finally obscured when we reach that point... We’re entering a mythical age of electronic realities that exist on a metaphysical plane.” —Gene Youngblood [21]

Indeed, this claim is fully realized by the 4K capabilities of the modern digital planetarium. With a resolution of 4096×4096 pixels and a continuous 256×256×256 RGB color spectrum,5 there are over 280 trillion possible variations for a given dome image. Thus the human eye cannot distinguish between physical reality and the projected dome image [1].
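The 280-trillion figure follows Youngblood's way of counting (pixels multiplied by the color values available per pixel) rather than a combinatorial count of distinct images; a quick arithmetic check:

```python
# Youngblood's JPL figure: 480 lines x 512 points, 64 gray levels.
jpl_points = 480 * 512
assert jpl_points == 245_760
jpl_variations = 64 * jpl_points          # his "64 times 245,760"

# The modern dome counted the same way: 4096 x 4096 pixels,
# 256^3 RGB color values per pixel.
dome_pixels = 4096 * 4096
dome_variations = dome_pixels * 256**3
print(f"{dome_variations:,}")             # 281,474,976,710,656
```

Since 4096² and 256³ are both 16,777,216, the product is 2⁴⁸ ≈ 2.8 × 10¹⁴, which is where "over 280 trillion" comes from.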

1.4 Related Work in Planetaria

Planetaria have of course been used for creative multimedia projects in the past. Henry Jacobs and Jordan Belson conducted the legendary Vortex Concerts at Morrison Planetarium in San Francisco's Golden Gate Park from 1957 to 1960. The concerts were a series of experimental and ethnic music concerts that explored a myriad of musical genres, ranging from Stockhausen and Ussachevsky to Balinese and Afro-Cuban polyrhythms. To accompany the music, Belson installed a custom suite of interference-pattern and kaleidoscope projectors with strobe lights. Jacobs and Belson integrated the visual and auditory elements into a cohesive, interactive whole. The relationship between the concert audio and the images generated from these projectors was intended as counterpoint rather than "Mickey Mouse synchronization." In particular, Vortex did not "simply project sound into space, but employed dimensionality, direction, aural perspective, and speed of movement as musical resources [21]." Each of these musical attributes was viewed as a tunable input parameter to the visual apparatuses. Belson would modulate the timing of the images, the rates of image enlargement and rotation, and the brightness to match up organically with the music.

4Other images were either distant extrapolations based on Pluto's light signature (since it is so far, it is essentially a point source) or artistic interpretations of what it might look like.

5Youngblood himself even said "the average human can perceive only 100 to 200 different color shadings."

The significance of Vortex lies in its re-appropriation of the planetarium space. Constructed as a locus for scientific exploration, the planetarium has normalized a particular set of behaviors and content. However, by providing a space predicated on cutting-edge artistry for the sake of artistry, the show de-naturalizes the occupation of the planetarium and recasts it as an artistic and communal space.

Expanding upon this idea is Stan VanDerBeek's Cine Dreams (part of his Movie Drome series), perhaps the most revolutionary Expanded Cinema use of planetaria. Cine Dreams was an 8-hour audio-visual planetarium event partitioned into segments to simulate the REM pattern of deep sleep [20]. With people strewn about in various positions with blankets and snacks, Cine Dreams was substantially different from other planetarium ventures. Throughout the project, VanDerBeek pushed the philosophical and logistical usage of the space by defining an expanded subjectivity for the participants of Cine Dreams.

“By exposing the individuals gathered together to an overwhelming information experience through submerging their bodies in incessant waves of lights, sounds and images in an effort to penetrate and elicit an emotional response, the Movie Drome produced an ‘immersive subject’ — a decidedly social subject. This mode of address can be read in terms of phenomenological bodily experience. The subject was not only immersed visually in the flow of light and images but also aurally through quadraphonic sound. These elements added to the heightened sense of tactility brought on by being enclosed in an intimate space, where participants would have to feel their way around in the darkness, while conscious of being surrounded by other bodies looking, napping, dancing and absorbing the effects together.” —Gloria Sutton [18]

This construction of the immersive subject goes a step further than Vortex and directly subverts the hegemonic principles of planetarium usage. The prevalent conception of the planetarium is as a vehicle to transport the subject to a distant, galactic world. The physicality of the dome is obscured to maintain the illusion of an alternate reality [18]. But by offering unconventional ways of interacting with the dome (with blankets, snacks and dancing), VanDerBeek's conception of the subject makes salient the architectural space of the dome. The participants are no longer passive individuals observing an objective audio-visual broadcast, but rather become a collective consciousness that cultivates a subjective narrative [18]. This project aims to mirror both the theoretical framing of how observers can interact with a planetarium and the methods used to achieve it, as instantiated by Vortex and expanded by Cine Dreams.


Chapter 2

Methods

Pomona College has, on the second floor of the Millikan Laboratory, a 25-foot digital planetarium. The Pomona planetarium consists of three projectors and eight computers: one master computer that manages all operations, one sound computer for sound files and direct access to the sound system, and six computers that manage the projectors: each of the three projectors has two computers that manage it. The planetarium is equipped with SkyVision's DarkSky2, the industry-standard software for operating planetarium shows. In addition, Pomona is beta-testing DarkMatter, SkyVision's newest software. DarkMatter has many advancements over DarkSky2, such as the ability to import external media, and as such is considered the state of the art. The Pomona planetarium is the only dome in North America equipped with this software.

The goal of this project was to import the theories of Expanded Cinema and the implementation techniques of previous planetarium shows to this planetarium. Since the Pomona dome represents the state of the art in planetarium technology, I hope the work done here will serve as a reference for contemporary creatives who wish to use any planetarium as a multimedia art platform.

In particular, I aimed to establish a series of media pipelines that will serve as channels to render content to the dome. These pipelines are:

1. Native Assets: The DarkMatter software comes with a suite of native assets, such as planets and galaxies. In addition, it has driving capabilities that allow the operator to navigate the solar system in real time. It also has canned full-length planetarium shows that can be subsetted and altered.

2. 3D Models: DarkMatter also has the capacity to import, render and manipulate external 3D models (such as .obj files).

3. Fulldome Mapped Video: If a video is mapped to a certain equirectangular file format, the planetarium system has slicer software that can render this adapted video to the full dome.

4. Interactive Media : In an attempt to make the planetarium an interactive, two-way space, we will explore the use of distributed user content on the dome. We will use social media channels such as Twitter to disseminate and collect this media.


5. Live Music : In the spirit of Vortex, we will explore the logistics and acoustics of a live band playing in the planetarium.
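The .obj format mentioned in pipeline 2 is plain text, one element per line, which makes it easy to inspect before handing a model to the dome software. A minimal sketch of pulling vertex positions out of a Wavefront .obj file (DarkMatter's actual importer is proprietary; this only illustrates the file format):

```python
def read_obj_vertices(lines):
    """Collect the 'v x y z' geometric-vertex records from the lines
    of a Wavefront .obj file, ignoring faces, normals and comments."""
    verts = []
    for line in lines:
        parts = line.split()
        if parts and parts[0] == "v":       # geometric vertex record
            verts.append(tuple(float(x) for x in parts[1:4]))
    return verts

# A unit triangle expressed as .obj text:
sample = ["# comment", "v 0 0 0", "v 1 0 0", "v 0 1 0", "f 1 2 3"]
print(read_obj_vertices(sample))
# [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
```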

For each of these pipelines, I found and adapted media for that domain, and rendered it to the dome. These adaptations served as “hello world” examples of possible functionality that could be expanded upon in the future.

In this way, a core principle of this project was Youngblood's notion of the artist as an ecologist. In his words, "ecology is defined as the totality or pattern of relations between organisms and their environment. Thus the act of creation for the new artist is not so much the invention of new objects as the revelation of previously unrecognized relationships between existing phenomena" [21].

All of these elements were synthesized into a continuous show to demonstrate both the individual representation of the media and their interaction. In the spirit of Vortex, the format will be a concert with visuals overlaid. The general thematic vision of the show will center around the notion of loneliness within a world saturated by social media. Aesthetically, I hope to draw a visual parallel between the isolation of sociality and that of the vast, cosmic emptiness of space.

Due to the scope and diversity of media involved in the Mediasphere, the project utilized many different technologies in new and interesting ways. Below are several of the more sophisticated techniques used, which should be considered by anyone wishing to replicate or expand on the work put forward by the Mediasphere.

2.1 Slicing and Dome Mapping

A critical component of planetarium visualization is adapting media formatted for a rectangular display to the spherical dome. I used SkyVision's Slicer software to perform these transformations. The input to the slicer is a directory of ordered square .jpeg files whose content lies within the largest inscribed circle. The slicer then produces a 30 FPS video by partitioning the circle into disjoint segments that are separately rendered by the six projector computers. Since the three projectors in the planetarium point to different places on the dome, the slicer figures out how to edge-blend and orient these segments to "reconstruct" the original circle; it warps a 4-sided polygon into a rectangular image each projector can render (see Figure 2.1) [3].

Figure 2.1: Segmentation and mapping process for converting a circular image to dome projections. Image courtesy of Paul Bourke [3]

Instead of a forward mapping, the algorithm performs a reverse mapping: for each output pixel P = (ux, uy), the corresponding pixel Q = (qx, qy) in the circular image is found. First, P1 = (p1,x, p1,y) and P2 = (p2,x, p2,y), points on the edges of the 4-sided polygon, are found using equations 2.1–2.4. Then Q is found by using uy to interpolate between P1 and P2 (equations 2.5 and 2.6).

p1,x = c0 + ux(c2 − c0) (2.1)

p1,y = c1 + ux(c3 − c1) (2.2)

p2,x = c6 + ux(c4 − c6) (2.3)

p2,y = c7 + ux(c5 − c7) (2.4)

qx = p1,x + uy(p2,x − p1,x) (2.5)

qy = p1,y + uy(p2,y − p1,y) (2.6)

where c0, ..., c7 are the coordinates of the polygon's four corners, flattened as (x, y) pairs.

Antialiasing and supersampling are performed to ensure a robust mapping [3]. The particular 4-sided polygons chosen are specific to the dome and projector orientation, so that they align perfectly when the three separate rectangular images are projected.
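The six equations amount to bilinear interpolation between the corners of each projector's 4-sided polygon. A sketch of the per-pixel reverse mapping (variable names follow the equations; this is an illustration of the math, not SkyVision's actual implementation):

```python
def reverse_map(ux, uy, c):
    """Map normalized output coordinates (ux, uy) in [0, 1] back to the
    source pixel Q in the circular image, per equations 2.1-2.6.
    c holds the warp polygon's four corner coordinates flattened into
    eight values, indexed as in the equations."""
    p1x = c[0] + ux * (c[2] - c[0])   # eq. 2.1: along the bottom edge
    p1y = c[1] + ux * (c[3] - c[1])   # eq. 2.2
    p2x = c[6] + ux * (c[4] - c[6])   # eq. 2.3: along the top edge
    p2y = c[7] + ux * (c[5] - c[7])   # eq. 2.4
    qx = p1x + uy * (p2x - p1x)       # eq. 2.5: interpolate P1 -> P2
    qy = p1y + uy * (p2y - p1y)       # eq. 2.6
    return qx, qy

# With corners listed bottom-left, bottom-right, top-right, top-left,
# the mapping is the identity on the unit square.
square = (0, 0, 1, 0, 1, 1, 0, 1)
print(reverse_map(0.5, 0.5, square))  # (0.5, 0.5), the centre
```

A full slicer would evaluate this for every output pixel and sample the source image at Q, with the supersampling described above smoothing the result.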

Since it must iterate over each pixel of the often-4K images, and each frame must be processed individually, this mapping process is computationally expensive. Luckily, it is also highly parallel, and all eight computers can run these processes concurrently. As a frame of reference, it takes 12-24 hours to process a 45-minute 4K film.
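Because every frame's mapping is independent of every other's, the work distributes trivially across workers. A toy in-process sketch of that structure using a thread pool (the real pipeline farms frames out across the planetarium's eight computers; slice_frame here is a placeholder for the actual warp):

```python
from concurrent.futures import ThreadPoolExecutor

def slice_frame(frame_index):
    """Stand-in for the per-frame slicing work; each frame can be
    processed without reference to any other frame."""
    return frame_index * frame_index   # placeholder computation

# A 45-minute film at 30 FPS would be 81,000 frames; here, a handful.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(slice_frame, range(8)))
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```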


Figure 2.2: Left: Mirror Lens image of the sky. Right: Mirror Lens image of a forest. Images courtesy of Clark Hollenberg.

2.2 Mirror Lens

Since the dome is a circle mapped to a sphere, methods that adapt photography to a circular format can be directly imported to the dome. I experimented with the Mirror Lens, an experimental photography augmentation tool developed by Clark Hollenberg. The Mirror Lens is a hollow cylinder with a reflective mirror-like material on its inner surface. When this cylinder is attached to a camera, the resultant image contains two nested circles. The inner circle corresponds to the opening in the cylinder, and the outer ring corresponds to the image reflected along the mirror surface (see Figure 2.2). Many of these images taken successively form a kaleidoscopic time lapse that can be directly mapped to the dome.

2.3 Adobe AfterEffects

AfterEffects is a Creative Cloud visual effects application that can be used to generate custom animations and visuals. It supports many custom plug-ins for added functionality, which allow for a wide variety of animations particularly well suited for planetaria. For example, circular mappings and fish-eye lens effects allow for visuals that can be easily mapped to the dome using the methods discussed in Section 2.1 (see Figure 3.1, left).

2.4 Analog Video Synthesis

Figure 2.3: Two analog video synthesis screenshots, courtesy of Chris Konopka

Digital signal processing (both algorithmic software and modular hardware) has allowed for astonishing advancements in music production. The audio signals that are inputs to these systems are just one form of data, so it is possible to use a different form of data, such as visual images, as input. What results is an emerging art form called analog video synthesis. I have been collaborating with Boston-based multidisciplinary artist Chris Konopka, who "researches and experiments with bleeding edge technology searching for new forms of interconnectivity." He has made tremendous advancements in this new medium and has kindly given me access to his content and the following overview of the burgeoning field:

“Analog video synthesis is a generative process that allows an individual to create organic visual material using electronic manipulations. Designs can be created with no video input by using various types of circuits such as video-based oscillators, mixers to blend signals, keyers to attenuate color shades and video pattern generators. Conversely, live video input can also be used as a way to generate non-standard designs by using the environment, people or even pointing the camera back at itself, creating video feedback.

Video synthesizers were created in the 1960s and since that time a lot of improvements have been made in terms of cost, portability and functionality. Currently, these types of synthesizers are offered as modular units by companies such as LZX Industries, brownshoesonly and Dave Jones Design. At this point users can create their own arrangement of modules, which makes the experience of creation unique to each individual. This new modular paradigm allows for circuits to be chained together in ways previously never explored. Along with this new form of synthesis, live video processing is significantly enhanced because now videos can be layered and/or altered in nearly an infinite amount of ways without a computer.

Currently analog video synthesis provides a way to spontaneously create imagery faster than using a digital architecture although it does not always have the same features as a modern digital system. Using an analog system of circuits, compared to a custom or commercial application, an individual has an outlet to create landscapes which are woven in nuance due to the random probability and stability of analog circuitry. This potential provides a tangential way to paint in real-time by using a television as the canvas.” —Chris Konopka, 2016

Analog video synthesis creates beautiful kaleidoscopic images with a nostalgic aesthetic (see Figure 2.3 and Figure 3.1, right) that works very well in the planetarium.
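Although the workflow above is fully analog, the patch Konopka describes (oscillators feeding a mixer and keyer, plus video feedback) can be caricatured digitally. The sketch below, in Python with NumPy, generates frames from two sine "oscillators", mixes and clips them, and blends a displaced copy of each frame back into the next; all frequencies and gains are illustrative choices, not settings from any real synthesizer.

```python
import numpy as np

def synth_frame(t, size=64):
    """One frame from two 'video oscillators' (sine gratings) mixed
    together, loosely mimicking an analog oscillator/mixer/keyer patch.
    All frequencies and weights are arbitrary illustrative choices."""
    yy, xx = np.mgrid[0:size, 0:size] / size
    osc_a = np.sin(2 * np.pi * (6 * xx + 0.5 * t))   # horizontal grating
    osc_b = np.sin(2 * np.pi * (4 * yy - 0.3 * t))   # vertical grating
    mixed = 0.6 * osc_a + 0.4 * osc_b                # "mixer" stage
    keyed = np.clip(mixed, -0.8, 0.8)                # crude "keyer"
    return ((keyed + 1) / 2 * 255).astype(np.uint8)  # 8-bit luma frame

def feedback(frames=10, size=64):
    """Simple stand-in for video feedback: each new frame is blended
    with a shifted copy of the previous one."""
    frame = synth_frame(0.0, size)
    for i in range(1, frames):
        prev = np.roll(frame, shift=1, axis=0)  # displaced re-scan
        frame = (0.7 * synth_frame(i / frames, size) + 0.3 * prev).astype(np.uint8)
    return frame
```

The point of the sketch is the signal-flow idea, not the aesthetic; the organic nuance Konopka describes comes precisely from analog instability that this digital caricature lacks.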


Chapter 3

Results

After experimenting with the system as a whole and with the technologies outlined above, I was able to successfully render many media sources on the dome. I then integrated them into a continuous exhibition that showcases their functionality.

3.1 Native Assets

DarkMatter contains richly detailed models of planets and local clusters that can be navigated in real time. DarkMatter comes with a Viewport, which allows the driver to orbit, distance-fly, pitch and yaw in a continuous, realistic way that simulates the movement of a ship. The software also comes with full-length planetarium shows1 that can be excerpted and adapted. I extracted footage from Google Lunar XPrize and To Space and Back. In the case of To Space and Back, I rendered the movie as a layer over other cosmological content. The ability to overlay many assets on top of each other increases the opportunity for creative exploration.

3.2 3D Models

DarkMatter’s ability to control external 3D models was tested and validated. I successfully imported and manipulated a 3D model (represented as a .obj file) of a human brain. The file was derived from fMRI sessions at the Research Imaging Institute, University of Texas Health Science Center at San Antonio, using a Siemens Magnetom Trio 3T system [9]. The model is highly detailed, which allows users to actually enter the interior of the brain and explore its internal structure.

In addition to rendering and navigating this 3D model, DarkMatter also allows for real-time model manipulation. The table below lists the properties of the model I was able to programmatically manipulate in real time, along with the corresponding DarkMatter command used for each.

1more can be downloaded at https://www.eso.org/public/usa/videos/archive/category/fulldome/


Figure 3.1: Left: Single frame of full-dome video made in AfterEffects; Right: Single frame of full-dome video made with Analog Video Synthesis.

Property           DarkMatter command
Spatial Position   Asset.XPos = 50
Spatial Stretch    Asset.XScale = 1.2
Color              Asset.Color = {r = 100, g = 120, b = 10}
Emissivity         Asset.Emmissivity = 0.4

Unfortunately, DarkMatter does not allow 3D models to be locally updated. For example, a 3D model of a person can be shrunk, stretched or spun, but cannot be “animated” to move its limbs or facial expression while other parts of the model remain stationary.
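Within that constraint, whole-model animations can still be scripted by driving the global properties from the table over time. The sketch below mocks the asset handle in Python so the logic can be tested offline; the `Asset` class here is purely a hypothetical stand-in (the real commands run inside DarkMatter), and it shows a sinusoidal "breathing" scale animation.

```python
import math

class Asset:
    """Minimal stand-in for DarkMatter's scriptable asset handle,
    so the animation logic below can be sketched outside the dome."""
    XPos = 0.0
    XScale = 1.0
    Emmissivity = 0.0  # spelling follows the command table above

def pulse(asset, t, base_scale=1.0, depth=0.2, period=4.0):
    """Whole-model 'breathing': since only global transforms are
    available (no per-limb animation), modulate XScale sinusoidally
    over time t in seconds. depth and period are illustrative."""
    asset.XScale = base_scale + depth * math.sin(2 * math.pi * t / period)
    return asset.XScale
```

Called once per rendered frame with the current clock time, this kind of loop produces smooth global motion despite the lack of local model updates.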

3.3 Fulldome Mapped Video

I successfully mapped several films to the dome. These films were initially in rectangular .mp4 or .mov format, so I had to manually convert them to the format suitable for slicing (see Section 2.1).2 I sliced and rendered movies made with photographic time lapses, AfterEffects and analog video synthesis (see Figure 3.1).
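The first conversion step can be sketched as follows: crop the rectangular frame to a centered square and black out everything outside the inscribed circle, producing a "dome master"-style frame ready for slicing. This is a simplified illustration in Python with NumPy, not the actual conversion code from the repository.

```python
import numpy as np

def rect_to_dome_master(frame):
    """Crop a rectangular frame (H x W x C uint8 array) to a centered
    square and mask it to the inscribed circle, as a first step before
    slicing for the dome. A sketch only, not the repo's pipeline."""
    h, w = frame.shape[:2]
    side = min(h, w)
    y0, x0 = (h - side) // 2, (w - side) // 2
    sq = frame[y0:y0 + side, x0:x0 + side].copy()
    # Black out every pixel whose center lies outside the circle.
    yy, xx = np.mgrid[0:side, 0:side]
    r = side / 2.0
    outside = np.hypot(xx - r + 0.5, yy - r + 0.5) > r
    sq[outside] = 0
    return sq
```

Applied frame by frame (e.g. via ffmpeg piping), this yields circular frames that the slicer can then cut into the per-projector segments.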

3.4 Music Ensemble

The music was produced by psychedelic beatmaker Nobide [13] and mixed in real time using djay by algoriddim on an iPad Pro. I also experimented with three forms of live instrumentation. The first was a four-foot-tall gong (see Figure 3.2), played by Jonathan van Harmelen and provided by the Pomona College Music Department. The second was an electric guitar, played by Adam Revello. The third was vocals, which Liam Reese sang over the planetarium PA system.

2The code I used to do so is available at https://github.com/zivepstein/media-sphere.

Figure 3.2: The Gong

The electric guitar complicated the audio pipeline and thus required a more sophisticated layout. The audio from both the guitar amp and the iPad Pro was piped into a MacBook Pro laptop, which merged the two streams and then forwarded them on to the planetarium sound system (see Figure 3.3 for a full schematic of how the audio was integrated). Each piece of music used for the show was an original composition, and all are available in the GitHub repository.

Figure 3.3: Full schematic of live instrumentation.
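The merge the laptop performed can be approximated in software as a gain-weighted sum with a soft clip, sketched below in Python with NumPy. The gain values are illustrative assumptions, and the real mix ran in the MacBook's audio software rather than through code like this.

```python
import numpy as np

def mix(guitar, deck, gain_a=0.5, gain_b=0.5):
    """Mix two mono audio buffers (float32 samples in [-1, 1]) into one
    stream, as the laptop did for the guitar-amp and iPad feeds. Gains
    are illustrative; tanh soft-clips the sum to avoid hard distortion."""
    n = max(len(guitar), len(deck))
    out = np.zeros(n, dtype=np.float32)
    out[:len(guitar)] += gain_a * guitar
    out[:len(deck)] += gain_b * deck
    return np.tanh(out)  # soft clipping keeps peaks within (-1, 1)
```

The soft clip matters because two independent live sources can sum past full scale at unpredictable moments.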


Figure 3.4: The performers. From left to right: Jonathan van Harmelen (gong ringer), Ziv Epstein (driver), Adam Revello (guitar player) and Liam Reese (singer).

3.5 The Show

The show began with a live narration to contextualize the project.

“Cogitating on the myriads of stars apparently scattered in disorderly spherical array about the heavens, individuals often remark, as may you, ‘I wonder what is outside outside?’

Right now, we inhabit the same time-space coordinate of December 8, 2016 on Spaceship Earth. This vessel allows us to explore the universe along the axes of space, along the sun’s orbit, and time, in solar years. Cloistered within this vehicle, our journey across these dimensions is cyclical and utterly predictable.

But this temporospatial coordinate is also the grounds for a more transcendental vehicle. Welcome, friends, to the Mediasphere. Here the axes of space and time become mutable as human creativity becomes the fabric of our reality. Our spheroid shuttle will travel to the furthest reaches of our cosmos, and the deepest peripheries of our own consciousness.

Since our ship has 32 seats, we will experience the Mediasphere together. Each of our subjectivities will blend into a communal experience. Say hello to the people on either side of you, as they will shape your voyage.


Our ship is fueled by raw imagination. So do not let the media wash over you, but rather engage actively with it. With the tolling of the gong, we will start our engines and begin our journey.”

The show then transitioned with a series of gong hits accompanied by a Mirror Lens time lapse of clouds. We then orbited Earth and moved to the moon. From there, we traveled outside the galaxy as Revello’s original composition “In My Mind” began playing. The 3D brain appeared, and we entered it as Reese and Revello sang and strummed, respectively. The show continued with analog video synthesis and a natural-image montage set to Nobide’s Dependence, overlaid with a lecture by Buckminster Fuller. It concluded with another Mirror Lens time lapse, this time of a forest.

The show itself, lasting approximately 20 minutes, ran three times on Thursday, Decem­ber 8th and five times on Saturday, December 10th. It sold out completely and over 200 people attended.


Figure 3.5: Left: Single frame of full-dome video made in AfterEffects; Right: Single frame of full-dome video made with Analog Video Synthesis.


Chapter 4

Discussion

The planetarium is often considered a niche technology primarily used for the scientific education of children. However, by adapting a wide array of creative content to the dome, I show how the planetarium is a more general-purpose media space in the vein of Expanded Cinema. By integrating visual and audio media sources from across mediums and domains, the Mediasphere showcases the dome’s capacity for Intermediation. As immersive, 360-degree visual technologies become more present in our lives (with the advent of virtual reality and high-performance visual processing), the Mediasphere eagerly anticipates a new ecosystem of media well suited for this domain. Within this family of immersive, 360-degree visual technologies, the planetarium engenders a communal experience that stands in sharp opposition to the solipsism of virtual reality. A driving hope of this project was to leverage developments in digital image processing while maintaining an air of collaborative creativity and shared collective experience.

The Mediasphere was not my idea, but rather emerged directly from the medium itself. One of the biggest surprises I experienced during this project was how open and receptive people were to the idea. Nearly every time I talked to a media producer (animators, artists, photographers, musicians, roboticists, etc.) about the planetarium, they immediately became excited about the prospect of importing their work to the dome. In my opinion, this is because the planetarium is an underutilized tabula rasa that could and should be used to display media in a way that is true to its original message, but in a space that is unique and highly immersive.

There are several limitations to using the planetarium as a media visualization platform that must be considered. The first and foremost is, of course, that it requires a planetarium. Because of their enormous technological costs, planetaria are relatively rare.1 Thus someone who might want to use one would most likely need an insider connection to a science museum or university. Second, because each dome has a unique, complex computer network and software infrastructure, it is almost impossible for an individual to modify, update or expand the functionality of her planetarium. As a result, there is little to no open-source community, and there are no traveling multimedia shows. Finally, planetarium software itself has many limitations (such as only rendering pre-made content and in general being fragile and non-deterministic) that software updates or careful development could resolve.

1Indeed, the planetarium cost Pomona College a seven-figure dollar amount. According to Wikipedia (which should be considered only a lower bound on the correct order of magnitude), there are only 593 planetaria in the world, and 196 in the United States.

Despite these limitations, planetaria offer wide possibilities for future work to expand both their usage and digital media as a whole. The primary goal of this project was not to craft a planetarium show, but rather to lay the groundwork for future explorations. Through attempting to create many media pipelines, I have discovered many new dome-related endeavors that could be explored in the future. First, an operator could utilize interactive media that engages the audience in real time to emphasize the communal aspect of the planetarium. This could be mediated by social media or direct conversation with the audience. Second, programming interfaces that allow for real-time media should be developed. Currently, all content displayed on the dome exists locally as static files and thus cannot be edited in real time. Rendering content to the dome that is dynamically responsive to the current environment, such as visuals or text synchronized with the music being played, Kinect motion tracking of a participant, or social media API calls, would greatly increase the palpable immediacy of the dome. Third, we should fully realize the possibility of a full-scale musical production, such as an orchestra or a symphony, within the planetarium. Finally, removing the reclining chairs from the planetarium would transform it from a theater into a completely different space with different rules and norms. An open, circular space where people can stand, walk around, dance or lie down offers possibilities along many different axes. I hope to explore all these ideas in the coming months.
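As a concrete illustration of the second direction, a music-synchronized visual hook could be as simple as mapping the RMS level of the current audio buffer to a global dome brightness value. The sketch below (Python with NumPy) is purely illustrative; the brightness floor is an arbitrary assumed parameter, and no such hook exists in the current dome software.

```python
import numpy as np

def brightness_from_audio(samples, floor=0.2):
    """Map the RMS level of an audio buffer (floats in [-1, 1]) to a
    dome brightness in [floor, 1] -- one possible hook for visuals that
    respond to the music in real time. floor is illustrative."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    return floor + (1.0 - floor) * min(rms, 1.0)
```

Per-buffer values like this, smoothed over a few frames, are the kind of dynamically responsive signal the dome's rendering pipeline would need to expose to make real-time media possible.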

Acknowledgments

I would like to thank the Pomona College Physics Department, and in particular Loredana Vetere, for granting me access to the planetarium. This project would not have been possible without their trust and support. The media content of this project was not my own, but rather was given to me by many talented and passionate producers. I would like to thank Chris Konopka, Nick Vann, Adam Revello, Liam Reese, Leo Selker, Peter Mellinger, Wilson Fanestil, Sara Scheidemantle, Clark Hollenberg, Jonathan van Harmelen and Tara Morris for providing media to be brought to life in the dome. I would also like to thank Ian Descamps for helping me document the show with his fantastic photography skills. Finally, I would like to thank Professor Kim-Trang Tran, Evan Bodell, Sara-ling Maltesen, Jaya Jivika Rajani, Charlie Watson, Remy Rossi and Matthew Eva for invaluable comments, ideas and discussion.


Bibliography

[1] Ultra-realistic sightseeing images using digital planetarium system: Application for intangible folk cultural property ‘Shimotsuki-matsuri’. Academic World of Tourism Studies, 1:101–110, 2012.

[2] Roy Ascott and Edward A. Shanken. Telematic embrace: visionary theories of art, technology, and consciousness. University of California Press, 2003.

[3] Paul Bourke. Image slicing for fulldome (and other applications). http://paulbourke.net/dome/slicer/. Accessed: 2010-09-30.

[4] Char Davies and John Harrison. Osmose: towards broadening the aesthetics of virtual reality. 1996.

[5] Unity Game Engine. Unity game engine — official site. http://unity3d.com. Accessed: October 9, 2008.

[6] Chi-Wing Fu, Wooi-Boon Goh, and Junxiang Allen Ng. Multi-touch techniques for exploring large-scale 3d astrophysical simulations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 2213–2222. ACM, 2010.

[7] Scott Highton. Choosing a lens for panoramic vr, 2016.

[8] David E James. Expanded cinema in los angeles: The single wing turquoise bird. Millennium Film Journal, (43/44):9, 2005.

[9] Arno Klein and Jason Tourville. 101 labeled brain images and a consistent human cortical labeling protocol. Frontiers in neuroscience, 6:171, 2012.

[10] Malcolm Le Grice. Experimental cinema in the digital age. British Film Institute, 2001.

[11] Palmer Luckey, Brendan Trexler Iribe, Graham England, and Jack McCauley. Virtual reality headset, March 18, 2014. US Patent D701,206.

[12] Maria Miranda et al. Uncertain spaces: artists’ exploration of new socialities in mediated public space. 2007.

[13] Nobide. Psychedelic breakbeats. https://soundcloud.com/nobide.

[14] Howard Rheingold. Virtual Reality: Exploring the Brave New Technologies. Simon & Schuster Adult Publishing Group, 1991.


[15] Vibeke Sorensen. The contribution of the artist to scientific visualization. School of Film and Video, California Institute of the Arts [Accessed 13 Oct. 2004], 1989.

[16] Jonathan Steuer. Defining virtual reality: Dimensions determining telepresence. Journal of communication, 42(4):73–93, 1992.

[17] Gloria Sutton. Stan VanderBeek’s Movie-Drome: Networking the subject. Future cinema: the cinematic imaginary after film, pages 136–43, 2003.

[18] Gloria Sutton. The Experience Machine: Stan VanderBeek’s Movie-Drome and Expanded Cinema. MIT Press, 2015.

[19] Fred Turner. The democratic surround: Multimedia and American liberalism from World War II to the psychedelic sixties. University of Chicago Press, 2013.

[20] Stan Van Der Beek. Re: Vision. The American Scholar, pages 335–340, 1966.

[21] Gene Youngblood. Expanded cinema. Introduction by R. Buckminster Fuller. Dutton, New York, 1970.

[22] Ka Chun Yu. Digital full-domes: The future of virtual astronomy education. Planetarian, 34(3):6–11, 2005.

[23] Ka Chun Yu and Kamran Sahami. Visuospatial astronomy education in immersive digital planetariums. Communicating astronomy with the public, pages 242–245, 2007.
