Museums and the Web 2009
April 15-18, 2009
Indianapolis, Indiana, USA

Hybrid Realities: Visiting the Virtual Museum

Jesse Allison and John Fillwalk, BSU Institute for Digital Intermedia Arts, USA

http://idiarts.org

Abstract

Virtual worlds provide a platform on which to construct compelling experiences that are not possible within the material and temporal constraints of the physical world. The virtual realm can also be united with and engaged by physicality, informing and transforming the audience's experience of exhibition. Over the last several years, the Institute for Digital Intermedia Arts at Ball State University has been incorporating mixed-reality approaches into museum exhibitions, musical performances, installation art, and interface design. This paper documents specific explorations of the opportunities the Second Life environment offers for mixed-reality experiences, analyzing approaches to bridging the worlds, such as media streaming, client-side interaction, and an external Web server communication hub, as well as opportunities for human/computer interaction.

Keywords: interactive exhibition, Second Life, virtual worlds, mixed reality

Betwixt and Between

Overview

One of the most engaging features of virtual worlds is their ability to represent our physicality in a three-dimensional, spatialized environment. Through the simulacrum of the avatar, we negotiate spatial environments by means of this representation of ourselves. This vicarious connection to virtualized spaces provides experiences that can transcend more typical screen-based digital phenomena. These experiences become more compelling still when the plane of the screen, the fourth wall, is expanded to incorporate physical reality in the design of time-based and spatialized hybrid mixed-reality constructions.

The participants become the mediators of these inputs, negotiating the relations between imagery, sound and interaction. The participants themselves are the conduits between the worlds: in making physical and cognitive connections relevant to the experience, they form the fundamental link across the divide. Experiences channeled vicariously through the avatar create a compelling association between the participant and virtual space. As the participant's avatar is given new avenues of influence over the physical world, the participant can reflexively affect the course of the virtual. The bridge is thus strengthened and the experience deepened, eventually creating a context of parallel reality (Damer, 2008).

To span between physical reality and a virtual world such as Second Life (Vasquez de Velasco, 2008), three primary connections need to be made: the visual, the aural, and the interactive. Ideally, these connections should be integrated bilaterally, flowing in both directions: imagery, data and sound from physical reality should inform the virtual realm, and vice versa.

Avatar-Based Experience

The conventional approach to accessing a virtual world is by means of an avatar that can negotiate virtual space. An example is the virtual collaborative space developed for the Las Americas Virtual Design Studio (LAVDS), a collaboration among architecture studios in the United States and Latin America united in a virtual world (Schroeder, 2008). Ball State University's College of Architecture and Planning and nine Latin American universities teamed up on a design studio that paralleled the design of a disaster surge center created by a real-life architectural firm. Collaboration is achieved through the forms of avatar interaction: text, data, video, audio, and voice chat, using the keyboard-and-mouse paradigm. The virtual structure is a deconstruction of the notion of a building itself: once architecture is unfettered from physical constraints and needs such as gravity, nature, and materiality, it is free to focus solely on the programming of space and the engagements of its users. The LAVDS structure is a configurable, collaborative interface that responds to its users' requirements via reactive data, media and form. In this project the participants affect the virtual world through the avatar's interactions.

Fig. 1: LAVDS Collaborative Space (2008)

The Second Life virtual realm is a potent platform for delivering this mode of interaction. It is an engaging environment with the ability to transmit live images, geometry, data and audio to and from the virtual realm. Vitally important is the expanding number of modalities for interacting with content both inside and outside of the virtual world. The Institute for Digital Intermedia Arts (IDIAA) has engaged Second Life extensively as a platform for mixed-reality experiences, exhibitions and performances.

Experiences in Audio/Video Interactions

Convincing aural and visual experiences are an important factor in transforming a virtual world into an immersive user experience. The three-dimensional nature of the Second Life environment is retained both visually and aurally, offering opportunities to spatialize traditionally one- or two-dimensional media.

Video streaming

Streaming video can be utilized to great effect in Second Life. Interesting possibilities occur when exploring the ability to spatialize traditional two-dimensional video applied as a texture on three-dimensional geometry. IDIAA has explored various avenues of presentation in virtual video installations, such as in Survey v3 (Figure 2) and Final Wisdom I v2 (Figure 3).

Fig. 2: Still from Survey v3 (2008)

Fig. 3: Still from Final Wisdom I v2 (2008)

Although traditional media streaming in Second Life is limited to one stream per parcel (a division of virtual land), developments made in collaboration with IDIAA Research Fellow Mitch McKenzie allow multiple streams to be selectively delivered from outside Second Life and applied as textures to 3D objects. When paired with proximity detection, this allows targeted media to be triggered and delivered on demand to individuals or groups, as sketched below.
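As a minimal, hypothetical illustration of the proximity-triggered pattern (not IDIAA's production code), the following LSL script points the parcel's single media slot at a particular stream whenever an avatar approaches a display object; the stream URL and the screen-texture UUID are placeholders.

    string STREAM_URL = "http://example.org/streams/gallery-one.mov"; // placeholder stream
    key SCREEN_TEXTURE = "00000000-0000-0000-0000-000000000000";      // placeholder texture UUID

    default
    {
        state_entry()
        {
            // Poll for avatars within 5 m, every 2 seconds.
            llSensorRepeat("", NULL_KEY, AGENT, 5.0, PI, 2.0);
        }

        sensor(integer detected)
        {
            // Someone is nearby: point the parcel's media at this screen's stream.
            llParcelMediaCommandList([
                PARCEL_MEDIA_COMMAND_TEXTURE, SCREEN_TEXTURE,
                PARCEL_MEDIA_COMMAND_URL, STREAM_URL,
                PARCEL_MEDIA_COMMAND_PLAY]);
        }

        no_sensor()
        {
            // Nobody in range: stop playback until the next visitor.
            llParcelMediaCommandList([PARCEL_MEDIA_COMMAND_STOP]);
        }
    }

Because the parcel still exposes only one media slot, each screen's script competes for it; the external server does the real work of deciding which stream to serve at any moment.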

Real-time video streaming is another video source rich with possibilities. In Displaced Resonance (Figure 4), Michael Pounds, John Fillwalk and Jesse Allison created a physical sonic installation based on the resonant acoustic frequencies of pipes. They later emulated, virtualized, and expanded the installation into a Second Life version that referenced and enhanced the interactive model of the physical work. When the installation was exhibited, the physical version incorporated a display of the Second Life virtualization, and the Second Life version carried a live stream of people interacting with the physical installation. Manifested and mirrored in the virtual, the installation had in essence gained its own reflected presence, an avatar of sorts. Participants on either side could view and interact with the two installations side by side, creating a unique and engaging event (Figure 5).

Fig. 4: Still from Displaced Resonance (2007)

Fig. 5: Still from a mixed-reality reception (2007)

Audio streaming

Streaming of audio is fairly similar to video streaming, with one benefit: delay is negligible enough to allow convincing synchronized interactions between the physical and virtual environments. Users can stream audio via Real Time Streaming Protocol (RTSP) to an individual parcel of land in SL and broadcast it from there to the world. Alternatively, they can stream audio directly from the client computer. The latter has the benefit of being simple to set up; however, the audio stream is then tied to the client's avatar, whereas an RTSP stream can be emitted from any object.
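On the in-world side, directing a parcel's audio to an external stream is nearly a one-line operation. A minimal sketch, assuming a placeholder stream URL and a script running in an object owned by the parcel owner:

    default
    {
        state_entry()
        {
            // The object must share ownership with the parcel for this call to succeed.
            llSetParcelMusicURL("http://example.org:8000/live"); // placeholder stream URL
        }
    }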

Spatialized sound file playback

Audio experience in Virtual Worlds can be divided into three categories: sample playback, synthesis, and spatialization (Kramer, 1995). Second Life cannot synthesize sounds itself. Sound is restricted to audio files uploaded to the Second Life server and played in a loop or triggered by stimuli such as events, collisions, and proximity. This can be used to creatively sonify the simulated world. When paired with other techniques for interaction like HTTP requests and client-side influences, it can create convincing physical-to-virtual interactions.

In Bob Box v4 (Figure 6), floating boxes use the physics engine to play composed sounds upon collision, turning the physical behavior of the objects into the score for the work (a sketch of this collision-triggered approach follows Figure 6). Flickr Gettr rapidly displays images pulled from the Web and, with each image, triggers a short audio clip, creating a cumulative sonic effect. In Displaced Resonance, looped sound files increase in intensity based on proximity, creating a gradually shifting timbre dependent on the avatar's spatial relationship to the objects.

Fig. 6: Still from Bob Box v4 (2008), showing streaming video from a live Web cam.
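A hedged sketch in the spirit of Bob Box (the clip selection and behavior are illustrative assumptions, not the installation's actual script): a physics-enabled box plays a random sound clip from its inventory on each collision.

    default
    {
        state_entry()
        {
            llSetStatus(STATUS_PHYSICS, TRUE); // let the physics engine move the box
        }

        collision_start(integer num_detected)
        {
            // Play a random clip from the sound files uploaded into this object.
            integer n = llGetInventoryNumber(INVENTORY_SOUND);
            if (n > 0)
            {
                string clip = llGetInventoryName(INVENTORY_SOUND, (integer)llFrand(n));
                llTriggerSound(clip, 1.0); // emitted, spatialized, at the object's position
            }
        }
    }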

The primary limitation of this approach is the ten-second restriction per sound file. It is an effective solution for event-based and cumulative audio effects, but rather poor for creating longer, temporally directed audio experiences.

Web texturing

The ability to host texture images outside of the Second Life grid is an important development. Web texturing is meant to provide the ability to display Web pages and images on a primitive within Second Life. At the moment the rendered imagery is static: links and other dynamic page behavior are not retained. Of more immediate application is a Web texture's ability to present text and image content that is situationally dynamic, such as external information that updates automatically.

The IDIAA is utilizing this ability to integrate the Ball State University Museum of Art's Digital Images Delivered Online (DIDO) database of 11,000 pieces, in which SL virtual museum attendees can search for artworks in the collection with direct in-world search queries that parse each image's metadata. This installation is found at the Virtual BSU Museum of Art on Ball State University SL Public Island 1. Viewers are presented with images of matching artworks and can then choose a specific image to update the Web texture and view the item, incorporating it into their own exhibit arrangement in the gallery. The effect is a three-dimensional, spatialized search engine that employs the gallery itself as the browsing portal (Figure 7; a sketch of the in-world query mechanism follows the figure). A similar effect was used in the Flickr Gettr installation to collect and display queried Flickr images.

Fig. 7: View of Virtual Museum Gallery (2008)
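The in-world half of such a search can be quite small. In this hypothetical LSL sketch (the endpoint URL, the chat channel, and the convention that the service replies with the URL of a rendered results page are all illustrative assumptions, not the actual DIDO integration), an avatar speaks a query on a chat channel and the parcel's Web texture is pointed at the results:

    string SEARCH_APP = "http://example.org/dido/search"; // assumed external endpoint

    default
    {
        state_entry()
        {
            llListen(7, "", NULL_KEY, ""); // accept queries spoken on channel 7
        }

        listen(integer channel, string name, key id, string message)
        {
            // Forward the spoken query to the external search application.
            llHTTPRequest(SEARCH_APP + "?q=" + llEscapeURL(message),
                [HTTP_METHOD, "GET"], "");
        }

        http_response(key request_id, integer status, list metadata, string body)
        {
            // Assume the service replies with the URL of a rendered results page.
            string page = llStringTrim(body, STRING_TRIM);
            llParcelMediaCommandList([
                PARCEL_MEDIA_COMMAND_URL, page,
                PARCEL_MEDIA_COMMAND_PLAY]);
        }
    }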

One limitation of this method is that the page must be created and hosted elsewhere, necessitating resources external to Second Life: Web domains, Web applications, and media resources. It may also mean repurposing or reformatting information so that it displays well in Second Life. The resolution of the incoming page is limited to 1024x1024 pixels, adequate for many textures, but fine detail in high-resolution images and text cannot be displayed without preprocessing on the Web application side and displaying only small portions of the entire image or text. A further limitation is that just one Web texture is available per parcel, and it is visible only from within the parcel where the avatar is standing, so all dynamic imagery for a parcel is confined to that single 1024x1024 texture. Ideally, users would be able to bring an unlimited number of textures in from the Web, allowing for truly dynamic image content in the virtual world.

Interaction and Influence Experiences

Connections between the virtual world and the physical world are restricted by the communication avenues available to transfer information. Ideally, these avenues would carry low-latency, flexibly routed messages that could be scripted to initiate a multitude of actions. In practice, most avenues of communication were designed with a specific task in mind, but many can be coerced into other uses.

HTTP requests

HTTP requests are typically used to request and post information, such as Web pages, to and from Web servers. With the expansion of Web 2.0 services that expose their internal information, the ways in which pertinent external information can be integrated are growing rapidly. Scripts in Second Life are able to make HTTP requests and utilize the external information in-world. As a link, this can pass complex state information into and out of the virtual world, providing a potentially unifying connection.

The authors used this technique in the performance piece Traversal to pass avatar location information out of Second Life and into a live performance. The piece used interactions with objects in a structure in Second Life to generatively perform on an actual pipe organ in Sursa Hall on the Ball State University campus, as sketched below.
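A minimal sketch of that outbound link, with an assumed collector URL (the actual Traversal mapping from interactions to organ gestures was considerably richer): the script posts the position of the first avatar it detects, once per second.

    string COLLECTOR = "http://example.org/traversal/position"; // assumed external endpoint

    default
    {
        state_entry()
        {
            // Scan for avatars within 20 m, once per second.
            llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 1.0);
        }

        sensor(integer detected)
        {
            vector pos = llDetectedPos(0); // first avatar found in this scan
            llHTTPRequest(COLLECTOR,
                [HTTP_METHOD, "POST",
                 HTTP_MIMETYPE, "application/x-www-form-urlencoded"],
                "x=" + (string)pos.x + "&y=" + (string)pos.y + "&z=" + (string)pos.z);
        }
    }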

External Web server

To make these interactions more flexible, an intermediary Web server can be employed to collate and prepare information for Second Life and to retain states that can be queried by external applications. The Web application effectively serves as an intermediary between Second Life and outside environments, providing the communications link and the logic to assimilate the information. Web 2.0 mash-ups (Web sites that integrate information gleaned from multiple Web services, such as images from Flickr, social networking from Facebook, and text-messaging services) can be accomplished easily in mature languages such as Java, Ruby, and Perl. Performing a similar task in Second Life's in-world scripting language, the Linden Scripting Language (LSL), would be difficult or impossible. Separating the task into an intermediate Web service takes the computational difficulty out of Second Life and simply passes along collated information for easy integration in-world.

In Flickr Gettr, installed at the New Media Consortium's Aho Museum in Second Life, the external Web service was used as an intermediary to query Flickr, receive images, and format them for delivery as a Second Life texture. The Web service then transmitted each image's aspect ratio in a second query so that the Second Life scripts could map the textures properly (a sketch of this two-request exchange follows Figure 8).

Fig. 8: Still from Flickr Gettr (2008)
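The exchange can be sketched in LSL as two correlated requests, matched by the key that llHTTPRequest returns. Everything here is an illustrative assumption rather than the installation's actual protocol: the endpoint paths, and the service replying first with the UUID of a texture it has prepared, then with a width-to-height ratio.

    string SERVICE = "http://example.org/flickrgettr"; // assumed intermediary Web service
    key image_req;
    key ratio_req;

    default
    {
        touch_start(integer total_number)
        {
            image_req = llHTTPRequest(SERVICE + "/image", [HTTP_METHOD, "GET"], "");
        }

        http_response(key request_id, integer status, list metadata, string body)
        {
            if (request_id == image_req)
            {
                // Assume the reply is the UUID of a texture the service prepared.
                llSetTexture(llStringTrim(body, STRING_TRIM), ALL_SIDES);
                ratio_req = llHTTPRequest(SERVICE + "/ratio", [HTTP_METHOD, "GET"], "");
            }
            else if (request_id == ratio_req)
            {
                // Assume the reply is width/height; stretch the display face to match.
                float ratio = (float)body;
                vector s = llGetScale();
                llSetScale(<s.z * ratio, s.y, s.z>); // x = width, z = height of the face
            }
        }
    }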

Blackboard

The IDIAA is currently developing a set of open source tools using HTTP requests to integrate SL with the Blackboard learning platform. IDIAA was the inaugural recipient of the Blackboard Greenhouse Grant for Virtual Worlds for the Aesthetic Camera Project, an on-line distance-education cinematography unit for virtual worlds developed by John Fillwalk and recognized by the Campus Technology Innovators Award in Virtual Learning. Blackboard is used as the course portal, hosting assessments, discussion boards, and mirrors of instructional media assets, while Second Life is engaged as a virtual "hands-on" studio and a synchronous, spatialized learning environment. This hybrid union of the two environments allows for a richer distance-learning community than can be obtained through either method alone. The Building Block will securely and seamlessly automate a number of course management processes to augment asynchronous hybrid learning environments.

XML-RPC

XML-RPC is a protocol for passing information to and from a Web service. The implementation in Second Life, although functional, has drawbacks: it imposes a three-second delay, does not keep track of queries and results, and can have only one query open at a time (http://wiki.secondlife.com/wiki/Category:LSL_XML-RPC). This means that under many circumstances lost data is possible and delayed data is inevitable. Due to these constraints, HTTP requests are more widely used.
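For completeness, a minimal sketch of the LSL side of an XML-RPC channel (the reply convention is an arbitrary assumption): the script opens a channel, reports the channel key so an external service can address it, and acknowledges incoming requests.

    default
    {
        state_entry()
        {
            llOpenRemoteDataChannel(); // asynchronous; the channel key arrives below
        }

        remote_data(integer event_type, key channel, key message_id,
            string sender, integer idata, string sdata)
        {
            if (event_type == REMOTE_DATA_CHANNEL)
            {
                // The external service needs this key to reach the script.
                llOwnerSay("XML-RPC channel opened: " + (string)channel);
            }
            else if (event_type == REMOTE_DATA_REQUEST)
            {
                // Acknowledge the request; only one can be serviced at a time.
                llRemoteDataReply(channel, message_id, "ack:" + sdata, idata);
            }
        }
    }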

Client influence

Second Life is generated by two entities: a simulator, which holds the information about every primitive and its current state, and the client, which receives that information on the local computer and renders the virtual world for the logged-in user. The two pass communications to synchronize events that are created locally, events generated by other clients, and events generated by the simulation engine on the server.

Keyboard and mouse commands

Keyboard and mouse inputs are generally used to interact with Second Life objects. Using software like Max/MSP to generate and send these commands allows us to hijack client-side control for automation purposes.

Audio cues

The Traversal organ performance required synchronized events in order for SL to play the physical organ convincingly. Because all of the previous methods of getting this information out of SL induce some amount of delay, a client-side approach was taken. Sine waves at various frequencies were loaded into SL and triggered by specific events and physical interactions. These were played locally by the client in complete synchronization with the event. This audio was filtered and analyzed by Max/MSP to track specific frequencies. When a specific sine-wave frequency occurred, the associated event was known to have been triggered, and the note, chord, or parameter change was performed on the organ (the in-world half of this scheme is sketched after Figure 9).

Fig. 9: Still from Traversal (2008)
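The in-world half of the scheme reduces to playing the right clip at the right moment. In this hedged sketch, the clip names are placeholders for the sine-wave files uploaded for Traversal; the external Max/MSP patch, not shown here, detects each frequency and drives the corresponding organ event.

    default
    {
        touch_start(integer total_number)
        {
            llPlaySound("sine_440", 1.0); // assumed clip name signalling "event A"
        }

        collision_start(integer num_detected)
        {
            llPlaySound("sine_880", 1.0); // assumed clip name signalling "event B"
        }
    }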

Future Possibilities

As the exploration and integration of virtual worlds continues to evolve and be adopted, methods for mixing reality and virtuality will expand. Here are a few avenues that appear to be plausible in the near future:

  • Adding the capacity for general messaging from external inputs to the client. This would allow user control from external software, such as innovative GUI elements and video tracking, and through it from external hardware and sensors
  • Adding the capacity for general messaging from the client to external software
  • Developing Web- or locally-hosted textures that would allow more dynamic delivery of texture assets and relieve the expense of uploading content to commercial servers
  • Using Web or locally hosted sound files
  • Allowing multiple video and audio streams per parcel
  • Resolving XML-RPC issues to make integrating with some external Web services much simpler.

References

Articles

Damer, B. (2008). “Meeting in the Ether: A brief history of virtual worlds as a medium for user-created events”. Journal of Virtual Worlds Research 1(1).

Kramer, G. (1995). “Sound and Communication in Virtual Reality”. In F. Biocca and M. Levy (Eds.) Communication in the Age of Virtual Reality. Hillsdale, NJ: Lawrence Erlbaum Associates, 293-4.

Schroeder, R. (2008). “Defining Virtual Worlds and Virtual Environments”. Journal of Virtual Worlds Research 1(1).

Category: LSL XML-RPC. Second Life Wiki. Consulted December 10, 2008. http://wiki.secondlife.com/wiki/Category:LSL_XML-RPC

Second Life Virtual Works

Allison, J., J. Fillwalk and M. Pounds (2008). Displaced Resonance. Sonic installation and Virtual installation.

Allison, J. and J. Fillwalk (2008). Bob Box v4. Virtual installation.

Allison, J. and J. Fillwalk (2008). Flickr Gettr. Virtual installation.

Allison, J. and J. Fillwalk (2008). Traversal. Virtual Installation with Physical Organ Performance.

Fillwalk, J. (2008). Survey v3. Virtual video installation.

Fillwalk, J. (2008). Final Wisdom I v2. Virtual video installation.

Vasquez de Velasco, G., A. Angulo, J. Fillwalk, B. Hoopingarner, T. Danehy, and J. Baxter (2008). LAVDS Collaborative Interface. Second Life virtual architecture studio.

Cite as:

Allison, J., and J. Fillwalk, Hybrid Realities: Visiting the Virtual Museum. In J. Trant and D. Bearman (eds). Museums and the Web 2009: Proceedings. Toronto: Archives & Museum Informatics. Published March 31, 2009. Consulted http://www.archimuse.com/mw2009/papers/allison/allison.html