/*
 * @(#)j3d1x1-vr 1.2 01/10/29 15:41:30
 *
 * Copyright (c) 1996-2001 Sun Microsystems, Inc. All Rights Reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * - Redistributions of source code must retain the above copyright
 *   notice, this list of conditions and the following disclaimer.
 *
 * - Redistribution in binary form must reproduce the above copyright
 *   notice, this list of conditions and the following disclaimer in
 *   the documentation and/or other materials provided with the
 *   distribution.
 *
 * Neither the name of Sun Microsystems, Inc. or the names of
 * contributors may be used to endorse or promote products derived
 * from this software without specific prior written permission.
 *
 * This software is provided "AS IS," without a warranty of any
 * kind. ALL EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND
 * WARRANTIES, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE OR NON-INFRINGEMENT, ARE HEREBY
 * EXCLUDED. SUN AND ITS LICENSORS SHALL NOT BE LIABLE FOR ANY DAMAGES
 * SUFFERED BY LICENSEE AS A RESULT OF USING, MODIFYING OR
 * DISTRIBUTING THE SOFTWARE OR ITS DERIVATIVES. IN NO EVENT WILL SUN
 * OR ITS LICENSORS BE LIABLE FOR ANY LOST REVENUE, PROFIT OR DATA, OR
 * FOR DIRECT, INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL OR
 * PUNITIVE DAMAGES, HOWEVER CAUSED AND REGARDLESS OF THE THEORY OF
 * LIABILITY, ARISING OUT OF THE USE OF OR INABILITY TO USE SOFTWARE,
 * EVEN IF SUN HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
 *
 * You acknowledge that Software is not designed, licensed or intended
 * for use in the design, construction, operation or maintenance of
 * any nuclear facility.
 */

/*
 ************************************************************************
 *
 * Java 3D configuration file for a single screen stereo desktop display
 * using a head tracker and 6DOF mouse.
 *
 ************************************************************************
 */

// Configure the head tracker. The NewDevice command binds a logical name
// (the 2nd argument) to an InputDevice implementation whose class name is
// specified in the 3rd argument. The InputDevice implementation for a head
// tracker must generate position and orientation data relative to a fixed
// frame of reference in the physical world, the "tracker base" of the Java
// 3D view model.
//
// The InputDevice is instantiated through introspection of the class name.
// Available InputDevice implementations are site-specific, so substitute
// the class name in a NewDevice command below with one that is available at
// the local site.
//
// Note that properties are used to configure an InputDevice instead of
// attributes. The details of an InputDevice implementation are not known to
// ConfiguredUniverse, so the property name is invoked as a method through
// introspection. The example properties below must be replaced with the ones
// needed, if any, by specific InputDevice implementations.
//
// All property arguments following the method name are wrapped and passed to
// the specified method as an array of Objects. The strings "true" and "false"
// get wrapped into Boolean, and number strings get wrapped into Double.
// Constructs such as (0.0 1.0 2.0) and ((0.0 1.0 2.0 0.5) (3.0 4.0 5.0 1.0)
// (6.0 7.0 8.0 0.0)) get converted to Point3d and Matrix4d respectively.
// Note that the last row of a Matrix4d doesn't need to be specified; it is
// implicitly (0.0 0.0 0.0 1.0).
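//
// For illustration, here is a sketch of the kind of method a DeviceProperty
// command reaches through introspection. The class ExampleTrackerDevice and
// its fields are hypothetical, not part of any shipped tracker driver; a
// real device class must also implement the remaining methods of the
// javax.media.j3d.InputDevice interface and provide a no-argument
// constructor so that NewDevice can instantiate it by class name.
//
//     public class ExampleTrackerDevice implements javax.media.j3d.InputDevice {
//         private String serialPort;
//         private double receiverBaseline;
//
//         public ExampleTrackerDevice() {}
//
//         // Invoked for (DeviceProperty <name> SerialPort "/dev/ttya");
//         // args[0] is the quoted string, wrapped as a java.lang.String.
//         public void SerialPort(Object[] args) {
//             serialPort = (String) args[0];
//         }
//
//         // Invoked for (DeviceProperty <name> ReceiverBaseline 0.1450);
//         // number strings arrive wrapped as java.lang.Double.
//         public void ReceiverBaseline(Object[] args) {
//             receiverBaseline = ((Double) args[0]).doubleValue();
//         }
//
//         // ... remaining InputDevice methods (initialize, pollAndProcessInput,
//         // getSensor, etc.) are omitted from this sketch ...
//     }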
//
(NewDevice glasses LogitechRedBarron)
(DeviceProperty glasses SerialPort "/dev/ttya") // Unix paths need quoting.
(DeviceProperty glasses ReceiverBaseline  0.1450)
(DeviceProperty glasses ReceiverLeftLeg   0.0875)
(DeviceProperty glasses ReceiverHeight    0.0470)
(DeviceProperty glasses ReceiverTopOffset 0.0000)

// Configure an InputDevice to use for a 6-degree-of-freedom mouse if
// required. In some implementations the same InputDevice instance can be
// used both for head tracking and multiple peripheral sensing devices.
// This example assumes an implementation that requires multiple instances,
// one for each sensor, sharing the same physical hardware used for the
// tracker base. In either case all the sensors must generate position and
// orientation relative to the same fixed tracker base frame of reference.
//
(NewDevice wand LogitechRedBarron)
(DeviceProperty wand SerialPort "/dev/ttyb")
(DeviceProperty wand ReceiverBaseline  0.0700)
(DeviceProperty wand ReceiverLeftLeg   0.0625)
(DeviceProperty wand ReceiverHeight    0.0510)
(DeviceProperty wand ReceiverTopOffset 0.0000)

// Create logical names for the available sensors in the specified input
// devices. The last argument is the sensor's index in the input device.
//
(NewSensor head glasses 0)
(NewSensor mouse6d wand 0)

// Set the 6DOF mouse sensor hotspot in the local sensor coordinate system.
// The hotspot is simply the "active" point relative to the sensor origin
// which interacts with the virtual world, such as the point used for picking
// or grabbing an object. Its interpretation is up to the sensor behavior.
//
// It is set here to 10 centimeters from the base to allow reaching into the
// screen without bumping the device into the glass, and to prevent the device
// itself from obscuring the pointer echo.
//
(SensorAttribute mouse6d Hotspot (0.0 0.0 -0.10))

// Create a new screen object and associate it with a logical name and a
// number. This number is used as an index to retrieve the AWT GraphicsDevice
// from the array that GraphicsEnvironment.getScreenDevices() returns.
//
// NOTE: The GraphicsDevice order in the array is specific to the local
// site and display system.
//
(NewScreen center 0)

// Set the actual available image area. This was measured as 0.350 meters in
// width and 0.245 meters in height for the monitor in the sample setup when
// running at stereo resolution.
//
(ScreenAttribute center PhysicalScreenWidth  0.350)
(ScreenAttribute center PhysicalScreenHeight 0.245)
(ScreenAttribute center WindowSize           NoBorderFullScreen)

// Set the TrackerBaseToImagePlate transform for this screen. This transforms
// points in tracker base coordinates to image plate coordinates.
//
// For this sample setup the tracker base is leaning at 50 degrees about its
// X-axis over the top edge of the monitor. The middle of the tracker base
// (its origin) is offset by (0.175, 0.345, 0.020) from the lower left
// corner of the screen (origin of the image plate).
//
(ScreenAttribute center TrackerBaseToImagePlate
                 (RotateTranslate (Rotate    50.000 0.000 0.000)
                                  (Translate  0.175 0.345 0.020)))

// Create a physical environment. This contains the available input devices,
// audio devices, and sensors, and defines the coexistence coordinate system.
//
(NewPhysicalEnvironment SampleSite)

// Register the input devices defined in this file.
//
(PhysicalEnvironmentAttribute SampleSite InputDevice glasses)
(PhysicalEnvironmentAttribute SampleSite InputDevice wand)

// Register the sensor which will drive head tracking.
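//
// When head tracking is enabled in a view (see TrackingEnable at the end of
// this file), Java 3D reads this sensor each frame and combines its tracker
// base relative readings with the CoexistenceToTrackerBase and
// HeadToHeadTracker transforms defined below to locate the user's eyes.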
//
(PhysicalEnvironmentAttribute SampleSite HeadTracker head)

// Define coexistence coordinates.
//
// Coexistence coordinates are defined relative to the tracker base to simplify
// calibration measurements, just as the tracker base is used as the common
// reference frame for the TrackerBaseToImagePlate calibration.
//
// Here the coexistence origin is set to the middle of the center screen, using
// the same basis vectors as its image plate. This will put the tracker base
// origin at (0.0 0.220 0.020) relative to the coexistence origin along its
// basis vectors.
//
(PhysicalEnvironmentAttribute SampleSite CoexistenceToTrackerBase
                              (TranslateRotate (Translate  0.0 -0.220 -0.020)
                                               (Rotate    -50.0  0.0    0.0)))

// Define the physical body.
//
// The head origin is halfway between the eyes, with X extending to the right,
// Y up, and positive Z extending into the skull.
//
(NewPhysicalBody SiteUser)

// Set the interpupillary distance. This sets the LeftEyePosition and
// RightEyePosition to offsets of half this distance along both directions of
// the X axis.
//
(PhysicalBodyAttribute SiteUser StereoEyeSeparation 0.066)

// Define the position and orientation of the head relative to the tracker
// mounted on the head.
//
(PhysicalBodyAttribute SiteUser HeadToHeadTracker
                       ((1.0 0.0 0.0 0.000)
                        (0.0 1.0 0.0 0.020)
                        (0.0 0.0 1.0 0.018)))

// Create a view using the defined screen, PhysicalEnvironment, and
// PhysicalBody.
//
(NewView view0)
(ViewAttribute view0 Screen              center)
(ViewAttribute view0 PhysicalEnvironment SampleSite)
(ViewAttribute view0 PhysicalBody        SiteUser)

// Enable stereo viewing. Enable head tracking to get the position of the eyes
// with respect to coexistence.
//
(ViewAttribute view0 StereoEnable   true)
(ViewAttribute view0 TrackingEnable true)
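
// For reference, a minimal sketch of how an application might load this file
// with com.sun.j3d.utils.universe.ConfiguredUniverse. The class name VrDemo,
// the scene contents, and the config file location are assumptions for
// illustration; only the ConfiguredUniverse(URL) constructor and the
// SimpleUniverse methods shown are part of the Java 3D utilities.
//
//     import java.net.URL;
//     import javax.media.j3d.BranchGroup;
//     import com.sun.j3d.utils.universe.ConfiguredUniverse;
//
//     public class VrDemo {
//         public static void main(String[] args) throws Exception {
//             // Point at this configuration file; the URL is typically
//             // supplied via the j3d.configURL system property rather than
//             // hard-coded as it is here.
//             URL config = new URL("file:j3d1x1-vr");
//             ConfiguredUniverse universe = new ConfiguredUniverse(config);
//
//             // Build and attach an (empty) scene graph.
//             BranchGroup scene = new BranchGroup();
//             scene.compile();
//             universe.addBranchGraph(scene);
//         }
//     }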