/*
* $RCSfile$
*
* Copyright (c) 2006 Sun Microsystems, Inc. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* - Redistribution of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
*
* - Redistribution in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in
* the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Sun Microsystems, Inc. or the names of
* contributors may be used to endorse or promote products derived
* from this software without specific prior written permission.
*
* This software is provided "AS IS," without a warranty of any
* kind. ALL EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS AND
* WARRANTIES, INCLUDING ANY IMPLIED WARRANTY OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE OR NON-INFRINGEMENT, ARE HEREBY
* EXCLUDED. SUN MICROSYSTEMS, INC. ("SUN") AND ITS LICENSORS SHALL
* NOT BE LIABLE FOR ANY DAMAGES SUFFERED BY LICENSEE AS A RESULT OF
* USING, MODIFYING OR DISTRIBUTING THIS SOFTWARE OR ITS
* DERIVATIVES. IN NO EVENT WILL SUN OR ITS LICENSORS BE LIABLE FOR
* ANY LOST REVENUE, PROFIT OR DATA, OR FOR DIRECT, INDIRECT, SPECIAL,
* CONSEQUENTIAL, INCIDENTAL OR PUNITIVE DAMAGES, HOWEVER CAUSED AND
* REGARDLESS OF THE THEORY OF LIABILITY, ARISING OUT OF THE USE OF OR
* INABILITY TO USE THIS SOFTWARE, EVEN IF SUN HAS BEEN ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGES.
*
* You acknowledge that this software is not designed, licensed or
* intended for use in the design, construction, operation or
* maintenance of any nuclear facility.
*
* $Revision$
* $Date$
* $State$
*/
/*
************************************************************************
*
* Java 3D configuration file for a cave environment with head tracking and
* stereo viewing. This cave consists of 3 projectors with 3 screens to the
* left, front, and right of the user, all at 90 degrees to each other.
*
* The projectors in the VirtualPortal sample site are actually turned
* on their sides to get more height. Screen 0 is rotated 90 degrees
* counter-clockwise, while screens 1 and 2 are rotated 90 degrees
* clockwise.
*
************************************************************************
*/
// Configure the head tracker. The NewDevice command binds a logical name
// (the 2nd argument) to an InputDevice implementation whose class name is
// specified in the 3rd argument. The InputDevice implementation for a head
// tracker must generate position and orientation data relative to a fixed
// frame of reference in the physical world, the "tracker base" of the Java
// 3D view model.
//
// The InputDevice is instantiated through introspection of the class name.
// Available InputDevice implementations are site-specific, so replace the
// class name in the NewDevice command below with one that is available at
// the local site.
//
// Note that properties are used to configure an InputDevice instead of
// attributes. The details of an InputDevice implementation are not known to
// ConfiguredUniverse, so the property name is invoked as a method through
// introspection. The example properties below must be replaced with the
// ones needed, if any, by the specific InputDevice implementation.
//
// All property arguments following the method name are wrapped and passed to
// the specified method as an array of Objects. The strings "true" and
// "false" are wrapped as Boolean, and number strings are wrapped as Double.
// Constructs such as (0.0 1.0 2.0) and ((0.0 1.0 2.0 0.5) (3.0 4.0 5.0 1.0)
// (6.0 7.0 8.0 0.0)) are converted to Point3d and Matrix4d respectively.
// Note that the last row of a Matrix4d doesn't need to be specified; it is
// implicitly (0.0 0.0 0.0 1.0).
//
//
(NewDevice glasses LogitechRedBarron)
(DeviceProperty glasses SerialPort "/dev/ttya") // Unix paths need quoting.
(DeviceProperty glasses TransmitterBaseline 0.4600)
(DeviceProperty glasses TransmitterLeftLeg 0.4400)
(DeviceProperty glasses TransmitterCalibrationDistance 0.4120)
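//
// As a rough sketch of the introspection described above (illustrative
// only; the exact reflective machinery inside ConfiguredUniverse may
// differ), the SerialPort command above amounts to a call along the lines
// of the following, where device is the instantiated InputDevice:
//
//     java.lang.reflect.Method m = device.getClass().getMethod(
//         "SerialPort", new Class[] {Object[].class});
//     m.invoke(device, new Object[] {new Object[] {"/dev/ttya"}});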
// Create a logical name for the head tracker sensor. The last argument is
// the sensor's index in the input device.
//
(NewSensor head glasses 0)
// Create new screen objects and associate them with logical names and numbers.
// These numbers are used as indices to retrieve the AWT GraphicsDevice from
// the array that GraphicsEnvironment.getScreenDevices() returns.
//
// NOTE: The GraphicsDevice order in the array is specific to the local
// site and display system.
//
(NewScreen left 0)
(NewScreen center 1)
(NewScreen right 2)
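//
// To discover the GraphicsDevice order at the local site, the array can be
// printed with standard AWT calls, e.g.:
//
//     GraphicsDevice[] devices = GraphicsEnvironment.
//         getLocalGraphicsEnvironment().getScreenDevices();
//     for (int i = 0; i < devices.length; i++)
//         System.out.println(i + ": " + devices[i].getIDstring());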
// Set the available image areas as well as their position and orientation
// relative to the tracker base. From the orientation of a user standing
// within this VirtualPortal site and facing the center screen, the tracker
// base is along the vertical midline of the screen, 0.248 meters down from
// the top edge, and 1.340 meters in front of it. The tracker base is
// oriented so that its +X axis points to the left, its +Y axis points toward
// the screen, and its +Z axis points toward the floor.
//
(ScreenAttribute left PhysicalScreenWidth 2.480)
(ScreenAttribute left PhysicalScreenHeight 1.705)
(ScreenAttribute left WindowSize NoBorderFullScreen)
(ScreenAttribute left TrackerBaseToImagePlate
(( 0.0 0.0 -1.0 2.230)
( 0.0 -1.0 0.0 1.340)
(-1.0 0.0 0.0 0.885)))
(ScreenAttribute center PhysicalScreenWidth 2.485)
(ScreenAttribute center PhysicalScreenHeight 1.745)
(ScreenAttribute center WindowSize NoBorderFullScreen)
(ScreenAttribute center TrackerBaseToImagePlate
(( 0.0 0.0 1.0 0.248)
(-1.0 0.0 0.0 0.885)
( 0.0 -1.0 0.0 1.340)))
(ScreenAttribute right PhysicalScreenWidth 2.480)
(ScreenAttribute right PhysicalScreenHeight 1.775)
(ScreenAttribute right WindowSize NoBorderFullScreen)
(ScreenAttribute right TrackerBaseToImagePlate
(( 0.0 0.0 1.0 0.2488)
( 0.0 -1.0 0.0 1.340)
( 1.0 0.0 0.0 0.860)))
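//
// As a worked reading of the center matrix above: TrackerBaseToImagePlate
// transforms tracker base coordinates to image plate coordinates, so the
// rotation columns give the tracker base axes in image plate terms. Tracker
// +X (left) maps to image plate -y, +Y (toward the screen) maps to -z, and
// +Z (toward the floor) maps to +x, consistent with a screen rotated 90
// degrees clockwise. The last column places the tracker base origin at
// (0.248, 0.885, 1.340) in image plate coordinates.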
// Create a physical environment. This contains the available input devices,
// audio devices, and sensors, and defines the coexistence coordinate system
// for mapping between the virtual and physical worlds.
//
(NewPhysicalEnvironment VirtualPortal)
// Register the input device defined in this file and the sensor which will
// drive head tracking.
//
(PhysicalEnvironmentAttribute VirtualPortal InputDevice glasses)
(PhysicalEnvironmentAttribute VirtualPortal HeadTracker head)
// Set the location of the center of coexistence relative to the tracker base.
// Here it is set to the center of the center screen. The default view attach
// policy of NOMINAL_SCREEN used by ConfiguredUniverse will place the origin of
// the view platform in coexistence coordinates at the center of coexistence.
//
(PhysicalEnvironmentAttribute VirtualPortal
CoexistenceToTrackerBase
((-1.0 0.0 0.0 0.000)
( 0.0 0.0 -1.0 1.340)
( 0.0 -1.0 0.0 0.994)))
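//
// As an arithmetic check (assuming the 2.485 meter image plate width runs
// vertically, since the projector is on its side): the screen center lies
// 2.485 / 2 - 0.248 = 0.9945 meters below the tracker base, which sits
// 0.248 meters below the top edge, and 1.340 meters in front of it. This
// matches the translation column above.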
// The above center of coexistence is appropriate for the sample geometry
// files available in the programs/examples directory. Often a more
// immersive point of view is required for larger virtual worlds. This can be
// achieved by placing the center of coexistence closer to the nominal position
// of the user's head, so that the view platform origin in coexistence
// coordinates will map there as well.
//
// Here we set the location of the center of coexistence 0.5 meters along the
// tracker base +Z axis, 1.737 meters from the floor (about 5 ft 8.4 inches).
//
// (PhysicalEnvironmentAttribute VirtualPortal
// CoexistenceToTrackerBase
// ((-1.0 0.0 0.0 0.0)
// ( 0.0 0.0 -1.0 0.0)
// ( 0.0 -1.0 0.0 0.5)))
// Define the physical body.
//
// The head origin is halfway between the eyes, with X extending to the right,
// Y up, and positive Z extending into the skull.
//
(NewPhysicalBody LabRat)
(PhysicalBodyAttribute LabRat StereoEyeSeparation 0.07)
// Define the position and orientation of the head relative to the tracker
// mounted on the head.
//
(PhysicalBodyAttribute LabRat HeadToHeadTracker
((-1.0 0.0 0.0 0.00)
( 0.0 0.0 -1.0 0.05)
( 0.0 -1.0 0.0 0.11)))
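//
// Reading the matrix above: the translation column places the head origin
// (midway between the eyes) at (0.0, 0.05, 0.11) in head tracker
// coordinates, and the rotation columns map head +X to tracker -x, head +Y
// to tracker -z, and head +Z to tracker -y.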
// Now define the view.
//
(NewView view0)
(ViewAttribute view0 Screen left)
(ViewAttribute view0 Screen center)
(ViewAttribute view0 Screen right)
(ViewAttribute view0 PhysicalBody LabRat)
(ViewAttribute view0 PhysicalEnvironment VirtualPortal)
// Set the screen scale. This is the scale factor from virtual to physical
// coordinates.
//
(ViewAttribute view0 ScreenScalePolicy SCALE_SCREEN_SIZE)
// Alternative for explicit scaling.
//
//(ViewAttribute view0 ScreenScalePolicy SCALE_EXPLICIT)
//(ViewAttribute view0 ScreenScale 5.00)
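//
// Under SCALE_SCREEN_SIZE the scale factor is derived from the physical
// screen size (half the physical screen width); under SCALE_EXPLICIT the
// ScreenScale value above would be applied directly instead.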
// Enable stereo viewing. Enable head tracking to get the position of the eyes
// with respect to coexistence. Boolean values may be specified as either
// true, True, false, or False.
//
(ViewAttribute view0 StereoEnable true)
(ViewAttribute view0 TrackingEnable True)
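//
// A minimal usage sketch of how an application consumes a file like this
// one (the file name "j3d1x3-vp" is hypothetical; substitute the actual
// path at the local site):
//
//     import com.sun.j3d.utils.universe.ConfiguredUniverse;
//     import java.net.URL;
//
//     URL url = ConfiguredUniverse.getConfigURL("file:j3d1x3-vp");
//     ConfiguredUniverse universe = new ConfiguredUniverse(url);
//     // build the scene graph, then:
//     universe.addBranchGraph(scene);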