    Diving Deeper into Designs

    continued

    Intention and Challenges

    Since the advent of the CAVE and similar projection-based VR systems in the 1990s, architectural design visualization has often been cited as an ideal application for the technology. Indeed, an understanding of human-scale spatial relationships within proposed designs would logically seem to benefit from the experiential navigation of virtual spaces.

    But despite the burgeoning of such facilities at universities and research labs, projection-based VR has yet to achieve widespread use in architectural design education and remains primarily a tool for specialized research. Architectural design tasks undertaken in VR are often limited to demonstration projects.

    Our observation is that, for many users, the resources and effort required to apply VR techniques to everyday design are simply too great, so VR environments are seldom used for extensive design exploration. The challenges include: the historically high cost of purchasing and maintaining VR facilities, a scarcity of VR-enabled applications to which CAD users can easily adapt, the difficulty of programming new VR applications, and the resulting cloistering of such facilities within specialized research groups.

    In developing Penn State's IEL system, we wanted to overcome these challenges and encourage the routine use of VR by architecture undergraduates. To this end, we have tried to keep costs affordable by using "commodity" components. We have maintained ease of use by adopting familiar desktop interfaces and applications, and we provide continuous and open access for the students, in a room near their more conventional studio space.

    System Configuration

    The IEL display includes two six- by eight-foot (180- by 245-centimeter) rear-projection screens. The screens are positioned at a 120-degree angle to provide a sense of peripheral surround. When used in VR mode, the perspective projection is programmed to provide a seamless 3D stereoscopic panorama for a few centrally positioned viewers.
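
    As a rough illustration of the geometry involved, the sketch below computes an asymmetric ("off-axis") view frustum for one flat screen from the viewer's eye position and the screen's corner points, the standard way to keep perspective consistent across angled screens. This is a minimal sketch in Java with illustrative names, not the IEL's actual projection code.

        // Hypothetical sketch: off-axis perspective frustum for one flat screen,
        // given the viewer's eye position and the screen corner points.
        public class OffAxisFrustum {

            // Minimal 3D vector helpers.
            static double[] sub(double[] a, double[] b) { return new double[] {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
            static double dot(double[] a, double[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
            static double[] cross(double[] a, double[] b) {
                return new double[] {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
            }
            static double[] norm(double[] a) {
                double len = Math.sqrt(dot(a, a));
                return new double[] {a[0]/len, a[1]/len, a[2]/len};
            }

            // Returns {left, right, bottom, top} at the near plane for a screen defined
            // by its lower-left (pa), lower-right (pb), and upper-left (pc) corners,
            // as seen from eye position pe.
            static double[] frustum(double[] pa, double[] pb, double[] pc, double[] pe, double near) {
                double[] vr = norm(sub(pb, pa));      // screen right axis
                double[] vu = norm(sub(pc, pa));      // screen up axis
                double[] vn = norm(cross(vr, vu));    // screen normal, toward the viewer
                double[] va = sub(pa, pe);
                double[] vb = sub(pb, pe);
                double[] vc = sub(pc, pe);
                double d = -dot(va, vn);              // eye-to-screen distance
                double l = dot(vr, va) * near / d;
                double r = dot(vr, vb) * near / d;
                double b = dot(vu, va) * near / d;
                double t = dot(vu, vc) * near / d;
                return new double[] {l, r, b, t};
            }
        }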

    Viewers wear inexpensive polarized plastic glasses to see the 3D stereo images. When used for studio review sessions, the current IEL configuration can accommodate up to about twenty participants. However, the farther off-center a viewer is located, the greater the perspective distortion, and the more apparent the central seam becomes.

    For each screen, a CYVIZ AS xpo.2 stereo converter and two Proxima Ultralight X350 XGA DLP projectors provide polarized passive stereo images. The screen images are aligned to provide a continuous-surround screen panorama of the projected scene. The xpo.2 provides stereo display compatibility for any application that supports quad-buffered, frame-sequential, OpenGL stereo, a widely supported 3D display standard.
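
    In Java3D, for instance, an application can ask for such a quad-buffered stereo visual when it constructs its rendering canvas. The following is a minimal sketch, assuming the graphics driver exposes quad-buffered OpenGL stereo; it is not the IEL's own code.

        import java.awt.GraphicsConfiguration;
        import java.awt.GraphicsEnvironment;
        import javax.media.j3d.Canvas3D;
        import javax.media.j3d.GraphicsConfigTemplate3D;

        public class StereoCanvasSketch {
            public static Canvas3D createStereoCanvas() {
                // Ask the local graphics device for a quad-buffered stereo visual.
                GraphicsConfigTemplate3D template = new GraphicsConfigTemplate3D();
                template.setStereo(GraphicsConfigTemplate3D.REQUIRED);
                template.setDoubleBuffer(GraphicsConfigTemplate3D.REQUIRED);
                GraphicsConfiguration config = GraphicsEnvironment
                        .getLocalGraphicsEnvironment()
                        .getDefaultScreenDevice()
                        .getBestConfiguration(template);

                // The canvas renders left- and right-eye frames in sequence; the stereo
                // converter and paired projectors turn them into polarized images.
                Canvas3D canvas = new Canvas3D(config);
                canvas.setStereoEnable(true);
                return canvas;
            }
        }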

    A Windows computer, with dual Xeon processors and a dual-headed 3Dlabs Wildcat III 6210 OpenGL graphics accelerator, serves as the graphics host. A joystick allows intuitive navigation for scenes displayed in VR mode. The two-screen system can be duplicated for $50,000, or less with educational discounts.

    Application and Results

    The students begin by building models in form-Z from auto-des-sys, which they already use routinely in their design studios. Then they export the form-Z models as VRML2, a standard 3D data-description and exchange format, using form-Z's existing export options.

    VRML models are viewed and navigated at human scale using NavLoader, a Java3D application developed around open-source loaders from Sun Microsystems and the Web3D Consortium, Inc. NavLoader is the only locally developed code used in the IEL.
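
    NavLoader's source is not reproduced here, but the basic pattern of such a viewer is compact. The sketch below, using the VRML97 loader utility distributed with Sun's Java3D (class and file names are illustrative, and the details differ from NavLoader itself), loads an exported .wrl file into a Java3D scene graph and displays it.

        import javax.media.j3d.BranchGroup;
        import com.sun.j3d.loaders.Scene;
        import com.sun.j3d.loaders.vrml97.VrmlLoader;
        import com.sun.j3d.utils.universe.SimpleUniverse;

        public class VrmlViewerSketch {
            public static void main(String[] args) throws Exception {
                // Parse the VRML2 file exported from form-Z into a Java3D scene graph.
                VrmlLoader loader = new VrmlLoader();
                Scene scene = loader.load(args[0]);      // e.g. "studio-model.wrl"
                BranchGroup model = scene.getSceneGroup();
                model.compile();                         // optimize the branch for rendering

                // Attach the model to a simple viewing universe and show it.
                SimpleUniverse universe = new SimpleUniverse();
                universe.getViewingPlatform().setNominalViewingTransform();
                universe.addBranchGraph(model);
            }
        }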

    The NavLoader user interface facilitates spatial comprehension by allowing students to position themselves within their designs, to view from multiple angles, and to zoom into and out of various spatial connections and details. They can move around and through their designed spaces in real time.
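
    The IEL's joystick interface is custom, but Java3D ships with analogous built-in navigation behaviors. As a rough stand-in for the joystick code, the sketch below attaches a keyboard-driven behavior to the view platform so the viewpoint can be flown through a loaded model in real time.

        import javax.media.j3d.BoundingSphere;
        import javax.media.j3d.BranchGroup;
        import javax.media.j3d.TransformGroup;
        import javax.vecmath.Point3d;
        import com.sun.j3d.utils.behaviors.keyboard.KeyNavigatorBehavior;
        import com.sun.j3d.utils.universe.SimpleUniverse;

        public class NavigationSketch {
            // Adds arrow-key "fly-through" navigation to an existing universe.
            public static void addKeyNavigation(SimpleUniverse universe) {
                TransformGroup viewTransform =
                        universe.getViewingPlatform().getViewPlatformTransform();
                KeyNavigatorBehavior navigator = new KeyNavigatorBehavior(viewTransform);
                navigator.setSchedulingBounds(
                        new BoundingSphere(new Point3d(0.0, 0.0, 0.0), 1000.0));

                BranchGroup group = new BranchGroup();
                group.addChild(navigator);
                universe.addBranchGraph(group);
            }
        }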

    The stereoscopic display provides depth cues that further enhance their immersive experience of the space. Texture mapping gives surfaces a convincingly realistic appearance, so students can explore issues of structure, materials, and space simultaneously.

    In addition to personal design explorations, the large screens of the IEL provide a venue for design presentations to the entire class. The familiar Windows desktop environment enables students to readily incorporate multimedia applications in multi-modal presentations.

    For instance, they often add PowerPoint slide shows, movies in Quicktime, AVI, or MPEG formats, interactive Flash content, or 3D stereo-pair renderings to communicate various aspects of their design.

    In practice, then, the IEL has evolved into an immersive multimedia environment, building on the VR system initially envisioned by its designers. The ease with which digital designs can be modified and re-presented offers an additional advantage over traditional modes of architectural presentation.

    By using the IEL early in a design project, students can quickly create, then investigate, and finally present a process of making architecture. Having an immersive environment adjacent to the design studio provides students with new design methods and helps them learn to create and comprehend architecture and space in three dimensions. This system also transforms the way students acquire design capabilities and critique their projects.

    Future Developments

    We are currently designing and constructing a three-screen framework to improve peripheral surround and to accommodate larger audiences. The new design uses mirrors behind the viewing screens to reduce the space required for rear-screen projection. The third screen requires the development of multiple-CPU approaches to graphics rendering and display.
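
    Purely as a sketch of the idea, and not the IEL's planned implementation, one way to drive additional outputs in Java3D is to attach one rendering canvas per physical screen to a shared view, letting separate graphics pipes render the same scene.

        import java.awt.GraphicsConfiguration;
        import java.awt.GraphicsDevice;
        import java.awt.GraphicsEnvironment;
        import javax.media.j3d.Canvas3D;
        import javax.media.j3d.View;

        public class MultiScreenSketch {
            // Attaches one Canvas3D per physical screen to a shared View, so the
            // same scene graph is rendered across all outputs.
            public static void attachAllScreens(View view) {
                GraphicsDevice[] screens = GraphicsEnvironment
                        .getLocalGraphicsEnvironment().getScreenDevices();
                for (int i = 0; i < screens.length; i++) {
                    GraphicsConfiguration config = screens[i].getDefaultConfiguration();
                    Canvas3D canvas = new Canvas3D(config);
                    view.addCanvas3D(canvas);   // each canvas still needs a visible window
                }
            }
        }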

    We're also evaluating input devices that might be more experientially engaging than the joystick we currently use. One device we're investigating continuously tracks the position and orientation of a user's hand, so an application "knows" which way the user is pointing and can respond appropriately. We also want to add responsive, audible feedback to the interactive experience.
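
    As a hypothetical illustration of how such tracked input could drive navigation, the sketch below converts a reported hand orientation (yaw and pitch angles from an imagined tracker) into a pointing direction and steps the viewpoint along it. The device interface and all names here are assumptions for the sake of the example.

        import javax.media.j3d.Transform3D;
        import javax.media.j3d.TransformGroup;
        import javax.vecmath.Vector3d;

        public class PointingNavigationSketch {
            // Hypothetical: move the view a small step in the direction the tracked
            // hand is pointing, given yaw/pitch in radians from the tracking device.
            // (The TransformGroup must allow transform reads and writes while live.)
            public static void stepTowardPointing(TransformGroup viewTransform,
                                                  double yaw, double pitch, double step) {
                // Convert the hand's yaw/pitch into a unit direction vector.
                Vector3d direction = new Vector3d(
                        Math.sin(yaw) * Math.cos(pitch),
                        Math.sin(pitch),
                        -Math.cos(yaw) * Math.cos(pitch));   // -Z is "forward" in Java3D
                direction.scale(step);

                // Translate the current view transform along that direction.
                Transform3D current = new Transform3D();
                viewTransform.getTransform(current);
                Vector3d position = new Vector3d();
                current.get(position);
                position.add(direction);
                current.setTranslation(position);
                viewTransform.setTransform(current);
            }
        }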

    To improve the transfer of complex surfaces and lighting solutions from existing CAD applications into the immersive VR system, we are working to document reliable and straightforward work flows for students to follow.

    And, finally, we are working toward a tighter integration with related engineering design applications, Web-based multi-modal presentation methods, voice and video teleconferencing, and interactive sharing of 3D graphical worlds with geographically distant collaborators.

    Katsu Muramoto, George Otto, and Loukas Kalisperis are with Penn State's Department of Architecture. Otto is also on the staff of ITS.

    Funding for the IEL has been provided by Penn State ITS, the College of Arts and Architecture, the Department of Architecture, and the Department of Landscape Architecture's IMLAB. Research partnership support has been provided by CYVIZ AS. Screen design and construction are by architecture students Jamie Heilman, Spencer Tuck, Michael Crnjaric, Ashley Philips, and Lee Cowan. Java3D programming is by Jack Gundrum and system integration by Gavin Burris.


    AW

    ArchWeek Image

    In Penn State's Immersive Environments Lab (IEL), architecture student Martin Busser combines VR and multimedia in presenting his fifth-year thesis design during the 2002 Kossman Design Award competition.
    Photo: Jamie R. Heilman

    ArchWeek Image

    Interior view from Martin Busser's thesis project displayed in immersive VR mode across both IEL screens. (The project is shown here in monoscopic projection for publication clarity.)
    Photo: Jamie R. Heilman

    ArchWeek Image

    Second-year design student Jessica Dyckes presents her work during a virtual review session.
    Photo: George Otto

    ArchWeek Image

    Second-year design student David Niemiec uses mixed multimedia tools to present his work during a virtual review.
    Photo: George Otto

    ArchWeek Image

    Audience viewing Douglas Newkirk's mixed multimedia presentation of his fifth-year thesis design during the 2002 Kossman Design Award presentations.
    Photo: Jamie R. Heilman

     
