Handbook of Virtual Environments: Design, Implementation, and Applications, 2nd Edition, by Kelly S. Hale and Kay M. Stanney
Product details:
ISBN 10: 1466511842
ISBN 13: 9781466511842
Author: Kelly S. Hale, Kay M. Stanney
A Complete Toolbox of Theories and Techniques

The second edition of a bestseller, Handbook of Virtual Environments: Design, Implementation, and Applications presents systematic and extensive coverage of the primary areas of research and development within VE technology. It brings together a comprehensive set of contributed articles that address…
Handbook of Virtual Environments: Design, Implementation, and Applications, 2nd Edition: Table of Contents
Section I Introduction
Chapter 1 Virtual Environments in the Twenty-First Century
1.1 Introduction
1.2 Technology
1.2.1 Human–Machine Interface
1.2.2 Computer Generation of Virtual Environments
1.2.3 Telerobotics
1.2.4 Networks
1.3 Psychological Considerations
1.4 Evaluation
1.5 Conclusion
References
Chapter 2 Virtual Environments Standards and Terminology
2.1 Introduction
2.2 Importance of Standards
2.3 Importance of Official Terminology
2.4 Basic Glossary
Websites for Additional Information on Standards
Online Glossaries of Virtual Environments
Further Reading
Section II System Requirements: Hardware
Chapter 3 Vision and Virtual Environments
3.1 Introduction
3.2 What Are the Limits of Normal Vision? (User Requirements)
3.2.1 Luminance
3.2.2 Spatial Abilities
3.2.2.1 Visual Fields
3.2.2.2 Visual Acuity
3.2.2.3 Contrast Sensitivity
3.2.2.4 Spatial Position
3.2.3 Depth
3.2.3.1 Monocular Cues
3.2.3.2 Binocular Cues
3.2.3.3 Accommodation and Vergence
3.2.4 Color Vision
3.2.5 Motion
3.2.6 Motion in Depth
3.3 What Do We Infer from What We See? (Shortcuts)
3.3.1 Figure/Ground Organization and Segmentation
3.3.2 Perceptual Constancies
3.3.3 Exocentric Motion
3.3.4 Egocentric Motion
3.3.5 Visually Induced Self-Motion
3.4 How Do We Look at and Manipulate Objects? (Movement of the Observer)
3.4.1 Eye Movements
3.4.1.1 Fixation
3.4.1.2 Optokinetic Nystagmus
3.4.1.3 Smooth Pursuit
3.4.1.4 Saccades
3.4.1.5 Vergence/Accommodation
3.4.2 Head and Body Movements
3.5 How Do We Depict the Three-Dimensional World in Two-Dimensional Displays? (Hardware Requirements)
3.5.1 Static Two-Dimensional Displays
3.5.2 Dynamic Two-Dimensional Displays
3.5.3 Electronic Displays and the User Interface
3.5.3.1 Spatial Parameters
3.5.3.2 Color
3.5.3.3 Image Motion
3.5.3.4 Stereopsis
3.6 What Are the Successes and Failures of Our Attempts to Simulate the Real World? (The State of the Art and Beyond)
3.6.1 Successes
3.6.2 Failures
3.7 New Research Directions
3.7.1 Limitations of Classical Research
3.7.1.1 Sensory and Sensory–Motor Interactions
3.7.1.2 Esoteric Devices and Generalization
3.7.2 Basic Research in Simulators
3.7.2.1 Machine or Observer Limited
3.7.2.2 Adaptation and Emerging Problems
References
Chapter 4 Virtual Auditory Displays
4.1 Introduction
4.1.1 Why Are Virtual Auditory Interfaces Important?
4.1.1.1 Environmental Realism and Ambience
4.1.1.2 Presence/Immersion and Perceived Simulation Quality
4.1.1.3 Selective Auditory Attention
4.1.1.4 Spatial Auditory Displays
4.1.1.5 Cross-Modal Interactions
4.2 Physical Acoustics
4.2.1 Properties of Sound
4.2.1.1 Frequency
4.2.1.2 Strength
4.3 Psychophysics
4.3.1 Frequency Analysis in the Auditory System
4.3.2 Intensity Perception
4.3.3 Masking Effects
4.3.4 Pitch and Timbre
4.3.5 Temporal Resolution
4.3.6 Spatial Hearing
4.3.6.1 Head-Related Transfer Functions
4.3.6.2 Binaural Cues
4.3.6.3 Spectral Cues
4.3.6.4 Anechoic Distance Cues
4.3.6.5 Reverberation
4.3.6.6 Dynamic Cues
4.3.6.7 Effects of Stimulus Characteristics on Spatial Perception
4.3.6.8 Top-Down Processes in Spatial Perception
4.3.6.9 Benefits of Binaural Hearing
4.3.6.10 Adaptation to Distorted Spatial Cues
4.3.6.11 Intersensory Integration of Spatial Information
4.3.7 Auditory Scene Analysis
4.3.8 Speech Perception
4.4 Spatial Simulation
4.4.1 Room Modeling
4.4.2 Headphone Simulation
4.4.2.1 Diotic Displays
4.4.2.2 Dichotic Displays
4.4.2.3 Spatialized Audio
4.4.2.4 Practical Limitations on Spatialized Audio
4.4.3 Simulation Using Speakers
4.4.3.1 Nonspatial Display
4.4.3.2 Stereo Display
4.4.3.3 Multichannel Loudspeaker Systems
4.4.3.4 Cross-Talk Cancellation and Transaural Simulations
4.4.3.5 Lessons from the Entertainment Industry
4.5 Design Considerations
4.5.1 Defining the Auditory Environment
4.5.2 How Much Realism Is Necessary?
References
Chapter 5 Dynamic Haptic Interaction with Video
5.1 Introduction
5.2 Computation of Force Experienced by the User
5.2.1 Static Force
5.2.2 Dynamic Force
5.2.3 Implementation of Depth Image-Based Haptic Rendering
5.2.3.1 Mapping between Image Coordinates and Haptic Coordinates
5.2.3.2 Interpolation of Depth Map
5.2.3.3 Selection of Mass for Video Object and Haptic Interface Point
5.2.3.4 Calculation of Accelerations and Forces
5.2.3.5 Temporal Interpolation of Forces
5.3 Video Processing for Haptic Rendering
5.3.1 Enhancement of Depth Images
5.3.2 Motion Estimation
5.4 Case Studies and Results
5.4.1 Simulation Results
5.4.2 Results with a Real Video Clip
5.5 Conclusion
Acknowledgment
References
Chapter 6 Olfactory Interfaces
6.1 Introduction
6.2 Human Processing of Smell
6.2.1 Odor Perception
6.2.1.1 Odor Intensity
6.2.1.2 Cross-Modality Interference
6.2.1.3 Personal Experiences/Emotion
6.2.1.4 Culture
6.2.1.5 Gender
6.2.1.6 Age
6.2.1.7 Impairments
6.2.1.8 Olfactory Adaptation
6.3 Effects of Smells on Human State
6.3.1 Psychological Effects of Odors
6.3.1.1 Motivation
6.3.1.2 Engagement
6.3.1.3 Attention
6.3.1.4 Memory
6.3.1.5 Workload
6.3.1.6 Emotion
6.3.2 Physiological Effects of Odors
6.4 Presentation Methods of Scent in Virtual Environments
6.5 Scent Presentation in Practice
6.6 Future Direction for Olfactory Presentation in Virtual Environments
References
Chapter 7 Perception of Body Motion
7.1 Introduction and Scope
7.2 Eliciting Acceleration/Motion Perceptions without Physical Acceleration of the User
7.2.1 Visually Induced Illusions of Self-Motion
7.2.2 Auditory Illusions of Self-Motion
7.2.3 Somatosensory Illusions of Self-Motion
7.2.3.1 Tactile Illusions due to Contact with a Moving Frame of Reference
7.2.3.2 Kinesthetic Illusions due to Limb Movement
7.2.3.3 Kinesthetic Illusions due to Vibration of Localized Body Regions
7.2.4 Methods for Eliciting Vestibular Illusions without Accelerating the User
7.2.4.1 Caloric Stimulation
7.2.4.2 Galvanic Vestibular Stimulation
7.2.4.3 Drug Effects
7.2.4.4 Low-Frequency Sound Effects
7.2.4.5 Other Vestibular Effects Not Requiring Body Acceleration
7.3 Eliciting Acceleration or Motion Perceptions
7.3.1 Changing Perceived Body Weight Using Real Acceleration Stimuli
7.3.2 Illusions of Self-Motion Associated with a Change of Velocity
7.3.3 Illusions of Self-Motion Associated with Multiaxis Rotation
7.3.4 Illusions Associated with Static or Dynamic Body Tilt
7.3.5 Whole-Body Vibration Effects
7.4 Specific Categories of Acceleration/Motion Perceptions Involving Acceleration Stimuli to the Vestibular Modality
7.4.1 Perception of Tilt
7.4.2 Perception of Rotation
7.4.3 Perception of Translation
7.4.4 Illusory Absence of Tilt
7.4.5 Illusory Absence of Rotation
7.4.6 Illusory Absence of Translation
7.4.7 Perception of Increased Weight or G-Forces
7.5 Role of Isomorphic Real-Motion Stimuli
7.5.1 Applying Isomorphic Real Acceleration to Enhance Aviation Mission Rehearsal
7.5.2 Applying Isomorphic Real Acceleration to Enhance Automobile Racing Rehearsal
7.5.3 Applying Isomorphic Real Acceleration via Free-Space Walking in a VE
7.5.3.1 Applying Real Locomotion through Simulated Environments as Part of Police Training
7.5.4 Advantages and Limitations of Using Isomorphic Acceleration/Motion Stimuli
7.5.5 Offering Fairly Isomorphic Acceleration Stimuli in a Smaller Space via Redirected Walking
7.6 Benefit of Combining Different Sensory Cues
7.7 Visual Consequences of Acceleration
7.8 Cognitive Influences
7.9 Summary
Disclaimer
Acknowledgments
Appendix: Recommended Readings on Vestibular Function
References
Chapter 8 Eye Tracking in Virtual Environments
8.1 Introduction
8.2 Stereotypical Eye Movement and Eye Trackers
8.2.1 Eye Movement
8.2.2 Contact and Noncontact Eye Trackers
8.2.3 Head-Mounted and Remote Eye Trackers
8.2.4 Eye Tracking Research in Animal Models
8.3 Use of Eye Trackers for HCI and Cognitive State Measurement
8.4 Integration of Eye Trackers with VE Systems
8.4.1 Commercial Solutions
8.4.2 Low-Cost Solutions
8.5 Case Study of Integration of Eye Trackers with VE Goggles
8.6 Future Directions
References
Chapter 9 Gesture Recognition
9.1 Introduction
9.2 Nature of Gesture
9.3 Representations of Posture and Gesture
9.4 Approaches for Recognizing Gesture
9.4.1 Pen-Based and Touch-Based Gesture Recognition
9.4.2 Device-Based Gesture Recognition
9.4.2.1 Instrumented Gloves
9.4.2.2 Body Suits and Motion Tracking Systems
9.4.3 Passive Vision-Based Gesture Recognition
9.4.3.1 Head and Face Gestures
9.4.3.2 Hand and Arm Gestures
9.4.3.3 Body Gestures
9.4.4 Depth Cameras
9.5 Guidelines for Gesture Recognition Systems
9.6 Conclusions and Future Directions in Gesture Recognition
References
Chapter 10 Avatar Control in Virtual Environments
10.1 Introduction
10.2 First Steps toward Virtual Locomotion
10.3 User Interface Design Principles for Avatar Control
10.4 VIRTE: Dismounted Infantry Simulation
10.4.1 Stimulus Substitution
10.4.1.1 Looking Using an HMD
10.4.1.2 Shooting Using a Rifle Prop
10.4.2 Motor Substitution
10.4.2.1 Virtual Locomotion
10.4.2.2 Virtual Contact
10.5 Problems with the Full-Body-Tracked Infantry Simulator
10.5.1 Sensory–Motor Mismatch
10.5.2 Stimulus Mismatch
10.5.2.1 Limiting the Field of View and Visual Acuity
10.5.2.2 Aiming a Rifle
10.5.3 Motor Mismatch
10.5.3.1 Reflexes Tend to Override Gestural Substitutes
10.5.3.2 Phase Mismatch between In-Place and Natural Stepping
10.5.3.3 Difficulty Executing Tactical Movements
10.5.3.4 Poorly Simulated Virtual Contact
10.6 Toward a More Abstract User Interface for Avatar Control
10.6.1 Motor Substitution: Correspondence and Abstraction
10.6.2 Pointman™: A Seated Interface Using the Head, Hands, and Feet
10.6.2.1 Slide Pedals to Step
10.6.2.2 Depress Pedals to Lower Postural Height
10.6.2.3 Thumb Sticks Direct Heading and Course
10.6.2.4 Tilt Gamepad to Vary Weapon Hold
10.6.2.5 Head Tracker Controls Upper Body
10.6.3 Application of Interface Design Principles
10.6.3.1 Positional Controls
10.6.4 Stimulus Substitution: More Abstractions
10.6.4.1 Head-Coupled View Control
10.6.4.2 Advantages of Using a Fixed Screen Display
10.6.4.3 Advantages of Being Seated
10.6.4.4 Advantages of Virtual Weapons
10.6.5 Abstraction Can Enhance Behavioral Realism
10.6.6 Training to Use a New Interface
10.6.7 Integration with Virtual Battlespace 2
10.6.8 Testing and Assessment
10.6.8.1 Military Utility Assessment
10.7 Conclusion
Acknowledgments
References
Section III System Requirements: Software
Chapter 11 Virtual Environment Models
11.1 Introduction
11.2 Geometry
11.2.1 Geometric Primitives
11.2.2 Surfaces
11.2.3 Designing for Rendering Efficiency
11.2.4 Scene Graph
11.2.5 Material Properties
11.2.6 Lighting and Shading
11.2.7 Surface Detail
11.2.8 Rendering Pipeline
11.3 Behavior
11.3.1 Time Based
11.3.2 Scripted
11.3.3 Event-Driven
11.3.4 Constraint Maintenance
11.3.5 Reintroducing Physics
11.3.6 Interactors
11.4 File Formats
11.5 Making It Real Time
11.5.1 Rendering Pipeline Efficiency
11.5.2 Face and Object Culling
11.5.3 Level of Detail Switching
11.5.4 Billboarded Geometry
11.6 Including the User
11.6.1 User Model
11.6.2 Incorporating Tracking Information
11.6.3 Viewing Model
11.7 Summary
References
Chapter 12 Principles for Designing Effective 3D Interaction Techniques
12.1 Introduction
12.1.1 Motivation
12.1.2 Universal Interaction Tasks
12.1.3 User-Experience Requirements
12.1.3.1 General Usability Requirements
12.1.3.2 Performance Requirements
12.1.3.3 Naturalism Requirements
12.1.3.4 Special Requirements
12.2 Naturalism versus Magic
12.2.1 Biomechanical Symmetry
12.2.2 Control Symmetry
12.2.3 System Appropriateness
12.3 High-Level Guidelines for 3D Interaction Techniques
12.3.1 Existing HCI Guidelines
12.3.2 Interacting in 3D Space
12.3.3 Hardware Considerations
12.3.3.1 Display Devices
12.3.3.2 Input Devices
12.4 Techniques and Guidelines for Common VE Tasks
12.4.1 Selection
12.4.1.1 Categories of Selection Techniques
12.4.1.2 General Guidelines for Designing Selection Techniques
12.4.1.3 Guidelines for High-Performance Selection Techniques
12.4.1.4 Guidelines for More-Natural Selection Techniques
12.4.1.5 Special Guidelines for Selection Techniques
12.4.2 Manipulation
12.4.2.1 Categories of Manipulation Techniques
12.4.2.2 General Guidelines for Designing Manipulation Techniques
12.4.2.3 Guidelines for High-Performance Manipulation Techniques
12.4.2.4 Guidelines for More-Natural Manipulation Techniques
12.4.2.5 Special Guidelines for Manipulation Techniques
12.4.3 Travel
12.4.3.1 Categories of Travel Techniques
12.4.3.2 General Guidelines for Designing Travel Techniques
12.4.3.3 Guidelines for High-Performance Travel Techniques
12.4.3.4 Guidelines for More-Natural Travel Techniques
12.4.3.5 Special Guidelines for Travel Techniques
12.4.4 System Control
12.4.4.1 Categories of System Control Techniques
12.4.4.2 General Guidelines for Designing System Control Techniques
12.4.4.3 Guidelines for High-Performance System Control Techniques
12.4.4.4 Guidelines for More-Natural System Control Techniques
12.4.4.5 Special Guidelines for System Control Techniques
12.5 Goal-Based Design Trade-Offs
12.5.1 Performance-Based Design
12.5.2 Naturalism-Based Design
12.5.3 Entertainment-Based Design
12.6 Evaluation and Application
12.7 Conclusions
Acknowledgments
References
Chapter 13 Technological Considerations in the Design of Multisensory Virtual Environments: How Real Does It Need to Be?
13.1 Introduction
13.2 Combat Search and Rescue Mission
13.3 Visual Environment
13.4 Auditory Environment
13.5 Olfactory Environment
13.6 Haptic Environment
13.7 Locomotion
13.8 Implementing a Multisensory Virtual Environment
13.8.1 Display Compatibility
13.8.2 Temporal Synchrony and Spatial Alignment
13.8.3 Perceptual Illusions and Modal Synergies
13.8.4 Adaptive Environments
13.9 Realism and Presence
13.9.1 Importance of Realism
13.9.2 Presence and the CSAR Task
13.10 Conclusion
References
Chapter 14 Embodied Autonomous Agents
14.1 Introduction
14.2 Appearance of a Virtual Character
14.2.1 Traditional 3D Character Representation
14.2.2 GPU-Based Shading
14.2.3 Image-Based Character Representation
14.2.4 Forward Kinematics/Inverse Kinematics
14.2.4.1 Forward Kinematics
14.2.4.2 Inverse Kinematics
14.2.5 Skinning
14.2.6 Cloth
14.3 Mental Processes
14.3.1 Perception, Attention, and Perceptual Understanding
14.3.1.1 Environment
14.3.1.2 Human Interactant
14.3.2 Reasoning and Representation
14.3.3 Emotion
14.3.4 Natural Language Dialogue
14.3.5 Nonverbal Behavior
14.4 Groups and Crowds
14.4.1 Level of Detail
14.4.2 Generating Different Appearances
14.4.3 Pathfinding and Steering
14.5 Conclusion: Making Compromises
References
Section IV Design Approaches and Implementation Strategies
Chapter 15 Structured Development of Virtual Environments
15.1 Introduction
15.1.1 Background
15.1.2 Barriers to VE Application
15.1.3 Understanding Human Performance in VEs
15.1.4 Usability of VEs
15.2 VEDS: The VE Development Structure
15.2.1 Project Definition
15.2.2 Requirements Gathering and Analysis
15.2.3 Specification
15.2.3.1 Personas and Scenarios
15.2.3.2 VE Goals and Prioritization
15.2.3.3 Concept Design and Storyboarding
15.2.3.4 Virtual Task Analysis
15.2.3.5 VR System Configuration: Hardware and Software
15.2.3.6 Validation Criteria
15.2.4 VE Building
15.2.4.1 Basic Choices
15.2.4.2 Choices over Object “Intelligence”
15.2.4.3 Programming Interactivity
15.2.4.4 Implementation of Physical Properties
15.2.4.5 Overcoming Limiting Factors in VE Development
15.2.5 Overall Design
15.2.6 Resource Acquisition
15.2.7 Detail Design
15.2.7.1 Cues and Feedback
15.2.8 VE Programming
15.2.9 Verification
15.2.10 Deployment
15.2.11 Validation
15.3 VEDS in Practice
15.4 Conclusions
Acknowledgment
References
Chapter 16 Cognitive Aspects of Virtual Environment Design
16.1 Introduction: Cognitive Issues for Virtual Environments
16.1.1 Perception
16.1.2 Attention
16.1.3 Learning and Memory
16.1.4 Problem Solving and Decision Making
16.1.5 Motor Cognition
16.2 Cognitive Issues in Virtual Environment Training
16.2.1 Knowledge Type and the Acquisition of Knowledge in VEs
16.2.1.1 Location Knowledge
16.2.1.2 Structural Knowledge
16.2.1.3 Behavioral Knowledge
16.2.1.4 Procedural Knowledge
16.2.2 Tutor Architecture
16.2.3 Task Analysis
16.3 Cognitive Tasks Appropriate for VE
16.3.1 Navigation/Locomotion in Complex Environments
16.3.2 Learning Abstract Concepts with Spatial Characteristics
16.3.3 Complex Data Analysis
16.3.4 Manipulation of Complex Objects and Devices in 3D Space
16.3.5 Decision Making
16.4 Applications of Cognitive Training Using VE
16.4.1 VESUB Harbor Navigation
16.4.2 COVE Underway Replenishment
16.4.3 Performance Measures
16.4.4 Operations Training with VE
16.4.5 VE MOUT
16.5 Conclusions
References
Chapter 17 Multimodal Interaction Modeling
17.1 Introduction
17.2 Multimodal Input from User to the Computer
17.2.1 Input Modalities
17.2.2 Multimodal Input Architecture
17.2.3 Multimodal Input Integration and Synchronization
17.2.4 Summary of Multimodal Input
17.3 Machine–Human Communication
17.3.1 Multimodal Feedback Integration
17.3.1.1 Visual–Auditory Feedback Interaction
17.3.1.2 Visual–Haptic Feedback Interaction
17.3.1.3 Haptic Channels Coupling
17.3.2 Sensorial Transposition
17.3.3 I/O Channel Coupling
17.3.4 Summary of Multimodal Feedback
17.4 Multimodal VE Design
17.4.1 Advances in VE Design Using Multimodal Input
17.4.2 Using Multiple Modalities in CVEs
17.4.2.1 Effect of Multiple Modalities on Collaboration in CVEs
17.4.2.2 Architectures for Multimodal CVEs
17.4.2.3 Multimodal Interaction in Distributed Surgical Simulators
17.4.3 Distributed Web-Based Multimodal VEs
17.5 Conclusions and Future Directions
Acknowledgments
References
Chapter 18 Illusory Self-Motion in Virtual Environments
18.1 Introduction
18.2 General Characteristics of Vection
18.2.1 Types of Vection
18.2.1.1 Circular Vection
18.2.1.2 Roll Vection
18.2.1.3 Pitch Vection
18.2.1.4 Linear Vection
18.2.2 Phenomenological and Behavioral Aspects of Vection
18.2.3 Optical Determinants of Vection
18.2.4 Neurophysiological Correlates of Vection
18.2.5 Motion Sickness and Adaptation Phenomena
18.3 Frameworks for Vection
18.3.1 Perceptual Difference Model/Depth-Order Effect
18.3.2 Rest Frame Hypothesis
18.3.3 Reference Frame Model
18.3.4 Attentional Modulation Framework
18.3.5 Object–Background Hypothesis
18.3.6 Summary
18.4 Multisensory Vection Illusions
18.4.1 Auditory Vection
18.4.1.1 Pure Auditory Vection
18.4.1.2 Auditory Cues in Multimodal Vection Illusions
18.4.2 Haptic and Tactile Cues
18.5 Relevance of Vection for VE System Design and Use
18.6 Research Issues
18.6.1 Effect of Complex Motion Patterns on Vection
18.6.2 Multisensory Patterns of Information Specifying Self-Motion
18.6.3 Adaptation, Readaptation, Transfer of Adaptation, and Virtual Environments
18.6.4 Effect of Visual Frames of Reference on Vection and Orientation
18.7 Conclusions
Acknowledgments
References
Chapter 19 Spatial Orientation, Wayfinding, and Representation
19.1 Introduction
19.1.1 Definition of Terms
19.1.2 Training Transfer or Performance Enhancement?
19.2 Background
19.2.1 Spatial Knowledge Acquisition
19.2.1.1 Direct Environmental Exposure
19.2.1.2 Map Usage
19.2.1.3 Other Techniques and Tools
19.2.2 Representations of Spatial Knowledge
19.2.3 Models of Navigation
19.3 Navigation Performance Enhancement
19.3.1 Navigation Tools and Mediators
19.3.1.1 Maps
19.3.1.2 Landmarks
19.3.1.3 Trails and Directional Aids
19.3.2 Organizational Remedies
19.3.2.1 Environmental Design
19.3.2.2 Visualization
19.4 Environmental Familiarization
19.4.1 Spatial Knowledge Transfer Studies
19.4.1.1 Basis for Comparison
19.4.1.2 Experiments
19.5 Principles for the Design of Navigable Virtual Environments
19.6 Performance Enhancement
19.6.1 Tools and Mediators
19.6.1.1 Map Usage
19.6.1.2 Landmarks
19.6.1.3 Trails or Footprints
19.6.1.4 Directional Cues
19.6.2 Organizational Remedies
19.6.2.1 Environmental Design
19.6.2.2 Visualization
19.7 Environmental Familiarization
References
Chapter 20 Technology Management and User Acceptance of Virtual Environment Technology
20.1 Introduction
20.1.1 Technology Deployment and Adoption
20.1.2 Toward an Understanding of Technology Adoption
20.2 Management Challenges
20.2.1 Integration into Corporate Culture
20.2.2 Need for Credible Productivity Gains
20.2.3 Fear of the Operations and Sustainability Tail
20.3 User Challenges
20.3.1 Killer Application
20.3.2 Physical Ergonomics
20.3.3 Cognitive Ergonomics
20.4 Domain-Specific Challenges
20.4.1 Design and Testing
20.4.2 Manufacturing
20.4.3 Operations
20.4.4 Information Management
20.4.5 Entertainment
20.4.6 Medicine/Health Care
20.4.7 Education and Training
20.4.8 Marketing and Sales
20.5 Conclusions
References
Chapter 21 Virtual Environments and Product Liability
21.1 Introduction
21.2 Aftereffects following Usage of a Product
21.3 Product Standards
21.4 Product Warnings
21.5 Product Liability Issues
21.6 Systems Safety Approach
21.7 Conclusions
References
Section V Health and Safety Issues
Chapter 22 Direct Effects of Virtual Environments on Users
22.1 Introduction
22.2 Direct Tissue Effects
22.2.1 Visual System
22.2.1.1 Visible Light
22.2.1.2 Infrared and Ultraviolet Light
22.2.1.3 Photic Seizures
22.2.1.4 Migraines
22.2.2 Auditory System
22.2.3 Skin and Tissue Effects
22.3 Trauma
22.4 Motion Base Platforms
22.5 Avoidance of Injury
22.6 Conclusion
References
Chapter 23 Motion Sickness Symptomatology and Origins
23.1 Introduction
23.1.1 Scope of This Review
23.1.2 Terminology
23.2 MS Incidence and Symptoms in Various Environments
23.2.1 Acceleration and Unusual Force Environments
23.2.2 Moving Visual Surrounds
23.2.3 Simulators
23.2.4 Simulators versus Other Synthetic Environments
23.2.5 Virtual Environments and Head-Mounted Displays
23.2.6 Virtual Environments versus Other Synthetic Environments
23.2.7 Using VE in a Moving Environment: Uncoupled Motion
23.3 Signs and Symptoms of MS
23.3.1 Consensus regarding Symptomatology
23.3.1.1 Descriptions of Signs/Symptoms
23.3.2 Importance of Subjective MS Symptomatology
23.4 Symptom Onset, Early Symptoms, and Initial Symptom Progression
23.4.1 Practical Benefits of Understanding Symptom Onset and Progression
23.4.2 Scientific Benefits of Understanding Symptom Onset and Progression
23.4.3 Findings concerning Average Symptom Onset and Progression
23.4.4 Summary and Gaps in Knowledge concerning MS Onset and Progression
23.4.5 Conclusions and Recommendations for Future Study
23.5 Other Less Obvious Side Effects of Real or Apparent Motion
23.5.1 Sopite Syndrome
23.5.2 Loss of Visual Acuity during Head or Body Motion
23.5.3 Postural Disequilibrium
23.6 Factors Related to Motion Sickness in Virtual Environments
23.6.1 Moving Visual Fields
23.6.1.1 Relation between Vection and Feelings of Discomfort
23.6.1.2 Field of View (FOV)
23.6.1.3 Frequency of Visual Scene Motion
23.6.1.4 Intervals between Exposure and Exposure Duration
23.6.1.5 Lack of Correspondence between Visual Scene and Head Movement
23.6.1.6 Control and Navigation Factors
23.7 Advantages and Challenges of Head-Mounted Displays
23.7.1 Advantages of HMDs for Displaying VEs
23.7.2 Challenges Associated with HMDs
23.8 Prediction of MS Susceptibility
23.8.1 Susceptibility of Men versus Women: An Alternative View
23.9 Theoretical Considerations
23.9.1 What Should a Complete Theory of Motion Sickness Be Able to Explain?
23.9.2 Is a Poison Hypothesis Necessary to Explain Aversive Reactions to Motion, or Is a Direct Evolutionary Hypothesis Possible without a Role for Toxins?
23.9.3 Nausea and Vomiting as Direct rather than Indirect Evolutionary Responses to Motion: A Direct Poison Hypothesis
23.9.3.1 Summary
23.9.4 Hypotheses concerning Sources of Individual Variability in MS Susceptibility
23.10 Conclusions and Recommendations
Disclaimer
Acknowledgments
References
Chapter 24 Motion Sickness Scaling
24.1 Introduction
24.2 History and Development of the Most Widely Used MS Rating Scales
24.2.1 Consensus regarding the Need for a Prevomiting Endpoint
24.2.2 Original and Most Widely Used Multiple-Symptom Checklists
24.2.3 Modifications of the Original MS Scales
24.3 Other Multisymptom Checklists
24.4 Approaches That Solicit a Single MS Answer from the Participant
24.5 Consideration of the Number of Categories Used to Obtain Ratings of MS Severity
24.6 Recommendations concerning the Use of MS Scales
24.7 Sopite Syndrome: Relevant Scales in Development
24.8 MS Scales Measuring Past Susceptibility or History of MS
24.9 Summary
Disclaimer
Acknowledgments
References
Chapter 25 Adapting to Virtual Environments
25.1 Overview
25.2 Adaptation as a Solution to VE Limitations
25.3 Sensory Rearrangements Found in Some VEs
25.3.1 Intersensory Conflicts
25.3.2 Depth and Distance Distortions
25.3.3 Shape and Size Distortions
25.3.4 Delays of Sensory Feedback
25.3.5 Sensory Disarrangement
25.4 Adapting to Sensory Rearrangements and Applicability to VE Training Procedures
25.4.1 Stable Rearrangement
25.4.2 Active Interaction
25.4.3 Error-Corrective Feedback
25.4.4 Immediate Sensory Feedback
25.4.5 Incremental Exposure
25.4.6 Distributed Practice
25.5 Optimal Adaptation-Training Procedure and How to Counteract the Drawbacks of Maximal Adaptation
25.5.1 Optimal Procedure
25.5.2 Simulations of Naturally Rearranged Sensory Environments
25.5.3 Potential Drawbacks of Maximal Adaptation
25.5.3.1 Negative Aftereffects and Negative Transfer
25.5.3.2 Eliminating the Aftereffects
25.6 Role of Individual Differences
25.7 Summary and Conclusions
References
Chapter 26 Visually Induced Motion Sickness: Causes, Characteristics, and Countermeasures
26.1 Introduction
26.2 Theories of MS
26.2.1 Contribution of Sensory Conflict to the Elicitation of MS
26.2.1.1 Role of Uncorrelated Input in Sensory Conflict Theory
26.2.1.2 Mathematical Elaboration of Sensory Conflict Theory
26.2.1.3 Contribution of Conflicts with the Perceived Vertical
26.2.1.4 Contribution of Conflicts with the Perceived Stationary Frame of Reference
26.2.1.5 Contribution of Development of Movement Control
26.2.1.6 Consideration of Sensory Conflict Theory
26.2.2 Contribution of Postural Mechanisms to the Elicitation of MS
26.2.2.1 Consideration of Postural Instability Theory
26.2.3 Attempts to Evaluate Multiple Theories
26.2.4 Contribution of Eye Movement Mechanisms to the Elicitation of MS
26.2.5 Evolutionary Explanations for the Origin of MS
26.2.6 Summary
26.3 MS Measurement
26.3.1 Motion Sickness Symptom Checklist Questionnaires
26.3.2 Rapid Self-Report Questionnaires
26.3.3 Psychophysiological Measurements
26.4 Related Concepts
26.4.1 Illusory Self-Motion (Vection)
26.4.2 Presence
26.4.3 Display Technology Effects
26.4.4 Fundamental Stimulus Characteristics
26.5 Susceptibility to VIMS
26.6 Aftereffects of VIMS
26.7 Countermeasures
26.7.1 Medication Countermeasures
26.7.2 Nondrug Agents
26.7.3 Behavioral Countermeasures
26.8 Conclusion
26.8.1 Theoretical Approaches
26.8.2 Countermeasures
26.8.3 Measurement Techniques
Disclaimer
Acknowledgments
References
Chapter 27 Social Impact of Virtual Environments
27.1 Introduction
27.2 Form and Content of Information Technologies
27.2.1 Form of Information Technologies
27.2.2 Content of Information Technologies
27.3 Theoretical Perspectives and Predictions in Relation to Virtual Environment Content
27.3.1 Social Cognitive Theory
27.3.2 Arousal Theory
27.3.3 Psychoanalytic Theory
27.3.4 Cultivation Theory
27.3.5 Cognitive Theories
27.3.6 Behaviorism
27.3.7 Perceptual Developmental Theory
27.3.8 Uses and Gratification Theory
27.3.9 Parasocial Relationships
27.4 Cognitive, Social, and Behavioral Impact of Virtual Environments
27.4.1 Aggressive Interactions
27.4.2 Identity Construction
27.4.3 Sexual Interactions
27.4.4 Social Interaction or Social Isolation?
27.4.5 Prosocial Interactions
27.5 Limitations of Virtual Environments
27.6 Social Implications for the Future of Virtual Environments
References
Section VI Evaluation
Chapter 28 Usability Engineering of Virtual Environments
28.1 Introduction
28.2 Setting the Context for VE Usability Engineering
28.3 Current Usability Engineering Methods
28.3.1 User Task Analysis
28.3.2 Expert Guidelines-Based Evaluation
28.3.3 Formative Usability Evaluation
28.3.4 Summative Evaluation
28.3.5 Successful Progression
28.4 Application of Usability Engineering Methods
28.5 Case Study 1: Dragon
28.5.1 Dragon Expert Evaluation
28.5.2 Dragon Formative Evaluation
28.5.3 Dragon Summative Evaluation
28.5.4 Dragon: Lessons Learned
28.6 Case Study 2: BARS
28.6.1 BARS Domain Analysis
28.6.2 BARS Expert Evaluation
28.6.3 BARS Formative Evaluation
28.6.4 BARS: Lessons Learned
28.7 Conclusion
Acknowledgments
References
Chapter 29 Human Performance Measurement in Virtual Environments
29.1 Introduction
29.2 Why Measure Performance in VEs?
29.3 What Criteria to Measure
29.3.1 Taxonomies of Measurement
29.3.2 Sensation/Psychomotor Measures
29.3.3 Physical Behaviors
29.3.4 Critical Incidents
29.3.5 Knowledge Tests
29.3.6 Error Documentation
29.3.7 Collective Performance
29.3.7.1 Team Outputs
29.3.7.2 When to Measure Team Performance
29.3.7.3 Team Performance Measurement Requirements
29.3.7.4 Limitations of Team Measures
29.3.8 Reactions to Nonvisual, Nonauditory Sensory Stimuli
29.4 Psychophysiological Measurement in VEs
29.4.1 Psychophysiological Approaches to Measurement
29.4.2 Types of Measures and Performance Indicators
29.4.3 Data Collection Considerations
29.4.4 Advances in Low-Cost Psychophysiological Measurement
29.5 Logistics of Measuring VE Task Performance
29.6 When to Measure Performance in VEs
29.7 Psychometric Properties of Measurements
29.7.1 Psychometric Criteria for Performance Measures
29.7.1.1 Reliability
29.7.1.2 Validity
29.7.1.3 Sensitivity
29.7.2 Issues Impacting Data Integrity in VEs
29.7.2.1 Hardware Issues
29.7.2.2 Software Issues
29.7.2.3 Task Issues
29.7.2.4 User Characteristics
29.7.2.5 Combining Performance Metrics into Global Assessment Scores
29.7.2.6 Extended Use of Avatars and NPCs
29.7.2.7 Performance Measurement with Augmented Reality
29.8 Summary
Acknowledgments
References
Chapter 30 Conducting Training Transfer Studies in Virtual Environments
30.1 Introduction
30.2 Training Transfer
30.2.1 Types of Transfer
30.3 Evaluating Training Transfer
30.3.1 Challenge One: Developing Meaningful Measures of Transfer and Selecting an Appropriate Experimental Design
30.3.2 Challenge Two: Identifying an Operational System or Suitable Substitute to Support Evaluation
30.3.3 Challenge Three: Overcoming Logistical Constraints
30.4 Conclusions
Acknowledgments
References
Chapter 31 Virtual Environment Usage Protocols
31.1 Introduction
31.2 Strength of the Virtual Environment Stimulus
31.3 Individual Capacity to Resist Adverse Effects of VE Exposure
31.4 VE Usage Protocol
31.5 Conclusions
Acknowledgments
References
Chapter 32 Measurement of Visual Aftereffects following Virtual Environment Exposure: Implications for Minimally Invasive Surgery
32.1 Introduction
32.2 Binocular Organization
32.2.1 Refraction
32.2.2 Eye Movements
32.2.3 Vergence and Accommodation
32.2.3.1 Phasic Elements
32.2.3.2 Tonic Elements
32.2.3.3 Cross-Links
32.2.4 Vergence (or Prism) Adaptation
32.2.5 Accommodative Adaptation
32.2.6 Cross-Link Adaptation
32.2.7 Strabismus
32.2.8 Heterophoria
32.2.9 Binocular Vision Anomalies
32.2.9.1 Esophoria
32.2.9.2 Exophoria
32.2.9.3 Position-Specific Heterophoria Adaptation
32.2.9.4 Summary of Some Key Points
32.3 Clinical Measures of Visual Aftereffects following VE Exposure
32.3.1 History and Symptoms
32.3.2 Vision/Visual Acuity
32.3.3 Refractive Error
32.3.4 Ocular-Motor Balance
32.3.5 Near Point of Convergence
32.3.6 Amplitude of Accommodation
32.3.7 Stereopsis
32.4 Beyond Clinical Measures
32.4.1 Study of AC/A and CA/C Adaptation
32.4.2 Results
32.5 Visual Aftereffects and MIS
32.5.1 Laparoscopic Surgery
32.5.2 Robotic-Assisted Surgery
32.5.3 Training Simulators for MIS
32.5.4 Visual Considerations within MIS
32.6 Conclusions
Acknowledgments
References
Chapter 33 Proprioceptive Adaptation and Aftereffects
33.1 Introduction
33.2 Proprioception, Motor Control, and Spatial Orientation
33.2.1 Multisensory and Motor Factors
33.2.2 Sensorimotor Calibration
33.2.3 Muscle Spindles
33.2.4 Role of Body Schema and Spatial Orientation
33.2.5 Bidirectional Interactions of Visual and Muscle Spindle Influences
33.2.6 Role of Tactile Cues in Unifying Muscle Spindle and Other Sensory Influences
33.3 Proprioceptive Adaptation of the Arm to Visual Displacement
33.3.1 Sensory Rearrangement
33.3.2 Internal Form of Adaptation
33.3.3 Conditions Necessary for Adaptation to Occur
33.3.4 Retention and Specificity of Adaptation
33.4 Adaptation to Altered Force Backgrounds
33.4.1 Motor Adaptation to Coriolis Force Perturbations in a Rotating Room
33.4.2 Motor Adaptation and Force Perception
33.4.3 Context-Specific Motor Adaptation and Aftereffects in Different Force Environments
33.4.4 Motor Adaptation to Environments versus Local Contexts
33.4.5 Role of Touch Cues in Motor and Proprioceptive Adaptation
33.5 Perception of Limb and Body Motion during Posture and Locomotion
33.5.1 Touch Stabilization of Posture
33.5.2 Touch and Locomotion
33.6 Aftereffects and Readaptation to the Real World
33.7 Conclusions
Acknowledgments
References
Chapter 34 Beyond Presence: How Holistic Experience Drives Training and Education
34.1 Introduction
34.2 Many Conceptualizations of Presence
34.3 Why Care about Presence, Anyway?
34.4 Experience
34.4.1 Experiential Design
34.4.2 Experiential Design and Virtual Environments
34.4.3 Virtual Experience Test
34.5 Examples of Experience in Virtual Environments
34.5.1 Medical: Virtual Therapy Environments
34.5.2 Military: Infantry Training Simulations
34.5.3 Education: Casual Serious Games
34.5.4 Entertainment: Video Games
34.6 Conclusion
References
Chapter 35 Augmented Cognition for Virtual Environment Evaluation
35.1 Introduction
35.2 Traditional VE Design and Evaluation Approaches
35.3 Augmented Cognition Design and Evaluation Techniques
35.4 Current Challenges/Limitations
35.5 Implications for VE System Design and Evaluation
35.6 Conclusions
References
Section VII Selected Applications of Virtual Environments
Chapter 36 Applications of Virtual Environments: An Overview
36.1 Introduction
36.2 Defense Applications
36.2.1 Close-Range 20 and 30 mm Weapons System Trainer (Royal Navy: Naval Recruitment and Training Agency)
36.2.2 SubSafe (Royal Navy’s Submarine Qualification Course, HMS Drake, Devonport)
36.2.3 Submarine Rescue (“Virtual LR5”: The Royal Navy’s Submarine Escape, Rescue, and Abandonment Systems Project Team)
36.2.4 Defence Diving Situational Awareness Trainer (UK Defence Diving School)
36.2.5 Counter-IED Urban Planning Tool (EODSim) (UK Defence Explosive Ordnance, Munitions and Search School)
36.2.6 Afghanistan Market/Village Scenario (UK Defence Explosive Ordnance Disposal, Munitions, and Search School)
36.2.7 CUTLASS Robot Vehicle and Manipulator Skills Trainer (Defence Science and Technology Laboratory)
36.2.8 Tornado F3 Avionics Maintenance Trainer (RAF Marham)
36.2.9 Helicopter Voice Marshaling (RAF Shawbury, Valley, and St. Mawgan)
36.3 Medical Applications
36.3.1 V-Xtract
36.3.2 Minimally Invasive Surgical Trainer
36.3.3 University Medical Education in the United States
36.3.4 IERAPSI Temporal Bone Intervention Simulator
36.3.5 Interactive Trauma Trainer
36.3.6 VR in Psychology
36.3.6.1 Distraction Therapy
36.3.6.2 Imaginal Exposure
36.3.6.3 Virtual Exposure
36.3.6.4 Restorative Environments
36.3.6.5 Image- and Video-Based Restorative Environments
36.3.6.6 VR-Based Restorative Environments
36.3.6.7 Virtual Restorative Environment Therapy Project
36.4 Virtual Heritage
36.4.1 Virtual Stonehenge and Virtual Lowry
36.4.2 Human Factors Challenges in Virtual Heritage
36.4.3 Project 1: The Wembury Commercial Dock and Railway Proposal of 1909
36.4.3.1 Wembury Docks AR and VR Demonstrations
36.4.4 Project 2: HMS Amethyst’s “Final Resting Place”
36.4.4.1 Augmented Reality Amethyst Demonstrator
36.4.5 Virtual Scylla
36.4.6 Virtual Plymouth Sound
36.4.7 Maria
36.5 Concluding Remarks
References
Chapter 37 Use of Virtual Worlds in the Military Services as Part of a Blended Learning Strategy
37.1 Introduction
37.2 Military Services Shift to a BLS That Includes Virtual Worlds
37.3 Suggested Benefits of a BLS
37.4 Benefits of Immersion and Presence
37.5 Different Business Model—Shifting Costs of Content Creation to the User
37.6 Additional Uses of VWs for Training and Education
37.7 Tolerance and Acceptance—Individual Differences
37.8 Examples of Military Service VW Applications
37.8.1 Virtual World Framework
37.8.2 Air Force Recruiting
37.8.3 Resiliency
37.8.4 Historical Field Trips
37.8.5 Mars Expedition Strategy
37.9 Army Research Laboratory Federal Virtual Challenge
37.10 Enhanced Dynamic GeoSocial Environment
37.11 EDGE Design Considerations
37.12 Case Study: MOSES
37.12.1 MOSES: The Military Open Simulator Enterprise Strategy
37.12.2 MOSES Deployment 1.0
37.12.3 MOSES Deployment 2.0
37.12.4 MOSES Community Development and Contributions
37.12.4.1 Estate: Virtual Harmony
37.12.4.2 Estate: The Constitution
37.12.4.3 Estate: Tulane School for Continuous Studies
37.12.4.4 Estate: Open Virtual Collaboration Environment
37.12.4.5 Estate: Air Force Research Lab
37.12.4.6 Estate: Raytheon Missile Systems
37.12.4.7 Estate: Tech Wizards
37.12.5 MOSES Future Plans—VW Research into Scalability and Flexibility
37.13 Lessons Learned
37.13.1 Costs
37.13.2 Security
37.13.3 Requirements Definition
37.14 Shared Resources and Software Reuse
37.15 Current Trends in Commercial VWs
37.15.1 Adoption and Acceptance
37.15.2 Barriers to Adoption
37.16 Conclusion
Bibliography
Chapter 38 Team Training in Virtual Environments: A Dual Approach
38.1 Overview
38.2 Whirlwind Review of the Past Decade
38.3 Aspects of Team Training
38.3.1 What Makes Up a Team (and Teamwork)?
38.3.2 What Is (and Isn’t) Team Training?
38.3.3 What Are VEs?
38.3.4 What Are VTMs?
38.4 Team Training in VE: State of the Science
38.4.1 Team Training in the Military
38.4.2 Team Training in Business
38.4.3 Team Training in Health Care
38.5 Establishing an Effective Learning Environment
38.5.1 Training Audience: For Whom Is the Training?
38.5.2 Task Requirements: What Is Being Trained?
38.5.2.1 Taskwork versus Teamwork Competencies
38.5.3 Training Environment: Under What Conditions?
38.5.4 Training Delivery: How Are They Being Trained?
38.6 Dual Approach for Team Training in VEs
38.6.1 Approach 1: Demonstration-Based Training
38.6.1.1 Passive Guidance/Support
38.6.1.2 Preparatory Activities/Tasks
38.6.1.3 Concurrent Activities
38.6.1.4 Retrospective Activities
38.6.1.5 Prospective Activities
38.6.2 Implementing DBT into VEs
38.6.3 Approach 2: EBAT
38.6.3.1 Skill Inventory/Performance Data
38.6.3.2 Learning Objectives/Competencies
38.6.3.3 Scenario Scripts/Trigger Events
38.6.3.4 Performance Measures/Standards
38.6.3.5 Performance Diagnosis
38.6.3.6 Feedback and Debrief
38.6.4 Implementing EBAT into VEs
38.7 VE and VTM Contributions for Team Training
38.8 General Guidelines for Using DBT and EBAT
38.9 Concluding Remarks
Acknowledgments
References
Chapter 39 Visual Perceptual Skills Training in Virtual Environments
39.1 Introduction
39.2 Visual Search
39.3 Visual Search Strategies
39.3.1 Exogenous Visual Search
39.3.2 Endogenous Visual Search
39.3.2.1 Endogenous Feature-Based Search
39.3.2.2 Endogenous Position-Based Search
39.3.2.3 Endogenous Goal-Based Search
39.4 Visual Search and Threat Detection
39.5 Training Strategies for Visual Search
39.5.1 Performance Feedback
39.5.2 Process Feedback
39.5.3 Attentional Weighting
39.5.4 Difficulty Variation
39.5.5 Metacognitive Strategies
39.5.6 Expert Performance Models
39.5.7 Tailored Feedback
39.6 Measuring Visual Search Performance
39.7 Diagnosing Visual Search Performance
39.7.1 Case Study 1: Screen Adapt
39.7.2 Case Study 2: ADAPT-AAR
39.8 Future Directions
References
Chapter 40 Virtual Environments as a Tool for Conceptual Learning
40.1 Introduction
40.2 VEs in Education
40.2.1 Defining VEs in Education
40.2.2 Affordances of VEs for Learning
40.2.3 Obstacles to the Effective Use of VEs in Education
40.3 Applicable Learning Theory
40.3.1 Constructivism
40.3.2 Situated Learning
40.3.3 Perceptual and Embodied Learning
40.4 Approaches to Promoting Conceptual Learning in VEs
40.4.1 Exploration
40.4.2 Invention
40.4.3 Inquiry and Hypothesis Testing
40.4.4 Perceptual Tuning and Differentiation
40.4.5 Action and Skill Modification
40.5 Future of VEs for Learning and Education
Acknowledgments
References
Chapter 41 Applications of Virtual Environments in Experiential, STEM, and Health Science Education
41.1 Introduction
41.2 Experiential Learning through Virtual Reality
41.2.1 Foundations of Experiential Learning
41.2.2 Designed Experiences in Game-Based and Virtual Learning Environments
41.3 Augmenting Experience Using Natural User Interfaces
41.3.1 Sight
41.3.2 Hearing
41.3.3 Touch
41.4 Designing Virtual Systems for Learning through Agile Development
41.5 Interactive Virtual Worlds for Experiential Learning
41.5.1 Use of Haptics in a Virtual Environment for Learning Nanotechnology
41.5.2 Virtual Learning Forest
41.5.3 Virtual Reality Theater for Teaching Classical Drama
41.6 Experiential Learning in Health Science Applications
41.6.1 Virtual Environments for Therapeutic Solutions
41.6.2 Interactive Tools for Treating Motor Disabilities
41.6.3 Virtual Brain for Microsurgical Training
41.7 Conclusions and Future Directions
References
Chapter 42 Design and Development of 3D Interactive Environments for Special Educational Needs
42.1 Introduction
42.2 VEs for Special Educational Needs
42.3 Design and Development of VEs for Learning
42.3.1 Specification Phase
42.3.1.1 Initial Contact with Schools/Organizations
42.3.1.2 VR Familiarization
42.3.1.3 Concept Generation
42.3.1.4 Selection and Prioritization of Ideas
42.3.2 VE Development Phase
42.3.3 Formative Evaluation
42.3.4 Outcomes: Summative Evaluation and Deployment
42.4 Inclusive Design Toolbox for VLE Development
42.5 Case Example: COSPATIAL
42.5.1 Initial Contact with Schools/Organizations
42.5.2 VR Familiarization
42.5.3 Concept Generation
42.5.4 Selection and Prioritization of Ideas
42.5.5 VE Design and Development
42.5.6 Formative Review and Evaluation
42.5.7 Outcomes
42.6 Involvement of End Users
42.7 Discussion
42.8 Conclusions
Acknowledgments
References
Chapter 43 Virtual Environment–Assisted Teleoperation
43.1 Introduction
43.2 Teleoperation Issues
43.3 VE-Based Teleoperation
43.3.1 Master–Slave Systems and the Importance of VEs
43.3.2 Coping with Time Delay
43.3.3 Enhancing CAT
43.3.4 Improving Bilateral and Shared Control
43.3.5 Human-Centered Architectures
43.3.6 Telepresence and Brain–Computer Interface Physical Embodiment
43.3.7 Enhancing Sensory Feedback to the Human Operator
43.3.8 Improving Safety and Human Factors
43.4 VEs as a Powerful Tool for Special Purpose Teleoperators
43.4.1 Telesurgery
43.4.2 Teleoperation at Micro- and Nanoscale
43.4.3 Mobile Robot Teleoperation
43.4.4 Web-Based Teleoperation
43.5 Prognosis and Conclusion
References
Chapter 44 Evolving Human–Robot Communication through VE-Based Research and Development
44.1 Introduction
44.2 Robots in Complex Military Domains
44.3 State of the Art in HRI
44.3.1 Challenges with Current Robot Use
44.4 Evolving HRC
44.4.1 Human–Human Interaction
44.4.2 Innovative Solutions
44.5 VE-Based Research Contributing to the Advancement of HRC
44.5.1 Designing Interfaces
44.5.2 Training
44.6 Future of HRC
References
Chapter 45 Clinical Virtual Reality
45.1 Introduction
45.2 History and Rationale for Clinical VR
45.3 VR Exposure Therapy
45.3.1 Use Case: The Virtual Iraq/Afghanistan PTSD Exposure Therapy Project
45.4 Neuropsychological VR Applications
45.5 Use Case: The Virtual Classroom Attention Assessment Project
45.6 Use Case: The Assessim Office Project
45.7 Game-Based Rehabilitation
45.7.1 Use Case: Jewel Mine Application
45.8 VH Agents
45.9 Use Cases: VH for Clinical Training and for Health-Care Information Access
45.9.1 Virtual Standardized Patients
45.9.2 SimCoach: An Online VH Health-Care Guide for Breaking Down Barriers to Care
45.10 Conclusions
References
Chapter 46 Modeling and Simulation for Cultural Training: Past, Present, and Future Challenges
46.1 Introduction: Demand Signal for Cultural Training
46.1.1 Role of Culture in Operations and Tactics
46.1.2 Cultural Markers
46.1.2.1 Perception of and Response to Cultural Differences
46.1.3 How Has Culture Been Trained?
46.1.3.1 Cross-Cultural Assimilators
46.1.3.2 Smart Cards
46.1.3.3 Online and Blended Instruction
46.2 Role of Modeling and Simulation
46.2.1 Current M&S Applications
46.2.2 Examples of Technology-Assisted Cultural Training
46.2.2.1 Fielded Training Systems
46.2.3 Shortcomings of Current M&S Cultural Training
46.3 Methods for Measuring Cultural Competence and Transfer of Training
46.3.1 Evaluating Cultural Competence
46.3.2 Challenges of Performance-Based M&S Measures for Military Applications
46.3.3 M&S Measurements: Feedback
46.3.4 Training Transfer
46.4 Culture-General Training
46.4.1 Benefits of Culture-General Training
46.4.2 Culture-General Training: Archetypal Patterns of Life
46.4.2.1 Benefits of PoL for Culture-General Training
46.4.2.2 Culture-General Training: Recognition of PoL and Anomalies
46.5 M&S Innovations Supporting Culture-General PoL Training
46.5.1 Adaptive Culture-General Training in a Virtual Environment: PercePTs
46.6 Recommendations and Future Work
References
Chapter 47 Immersive Visualization for the Geological Sciences
47.1 Introduction
47.2 Common Geology Data Types
47.2.1 Terrain Elevation Data
47.2.2 Satellite Imagery
47.2.3 2D Fields
47.2.4 3D Material Property Data
47.2.5 Ground-Penetrating Radar (GPR) Data
47.2.6 Well Log Data
47.2.7 Vector Geometry Objects with Attributes
47.2.8 Breadcrumb Trail
47.2.9 Point Cloud Data
47.3 Geology Visualization
47.3.1 Relevant Issues in Geology Visualization
47.3.2 Applying Standard Visualization Techniques to Geology
47.3.3 Visualization Techniques for Geology
47.3.4 Temporal Data
47.3.5 Benefits to Geologic Visualization from the Immersive Interface
47.4 Exemplar Applications and Lessons Learned
47.4.1 Applications for Exploration of Terrains, Associated Geophysical Data, and Well Logs at the University of Louisiana at Lafayette
47.4.1.1 Rendering and Interaction for Terrain Interpretation
47.4.1.2 Well Log Visualization
47.4.1.3 3D Lens Immersive Interface Technique
47.4.2 LidarViewer: An Immersive Point Cloud Visualization Tool
47.4.2.1 Visualization Techniques
47.4.2.2 Immersive Interface Techniques
47.4.2.3 Idaho National Laboratory LidarViewer Workflow
47.4.2.4 Outcomes and Lessons
47.4.3 Visualization of Multimodal Geophysical Survey Data
47.4.3.1 Visualization Techniques
47.4.3.2 Immersive Visualization Application
47.4.3.3 Benefits Derived
47.4.4 DRI Lancaster Sand Dune Layers
47.4.4.1 Visualization Techniques
47.4.4.2 Immersive Interface Techniques
47.4.4.3 Outcomes and Lessons
47.5 Benefits
47.6 Future and Conclusion
References
Chapter 48 Information Visualization in Virtual Environments: Trade-Offs and Guidelines
48.1 Introduction
48.1.1 Observer and Observed
48.1.2 Media and Message
48.2 Background
48.2.1 Graphical Information
48.2.1.1 Visual Markers
48.2.2 Attention
48.2.3 Scientific Visualization
48.2.4 Information Visualization
48.2.4.1 Multiple Views
48.2.5 Virtual Environments
48.2.5.1 Information-Rich Virtual Environments
48.2.5.2 Display: Sizes and Resolution
48.3 Activity Design
48.4 Information Design Guidelines
48.4.1 3D Rendering
48.4.2 Color and Lighting
48.4.3 Multiple Views
48.4.4 Information-Rich Virtual Environments
48.4.5 Platforms
48.5 Interaction Design Guidelines
48.5.1 Navigation
48.5.2 Selection
48.5.3 Manipulation
48.6 Reflections on The Next Reality
References
Chapter 49 Entertainment Applications of Virtual Environments
49.1 Introduction
49.1.1 What’s in a Name?
49.1.2 Video Games as Entertainment-Directed Virtual Environments
49.2 Behavioral Conditioning and Games
49.2.1 Classical Conditioning
49.2.2 Recent Research in Classical Conditioning
49.2.3 Operant Conditioning
49.2.4 Schedules of Reinforcement
49.2.5 Punishment and Negative Reinforcement
49.2.6 Recent Research in Operant Conditioning, Punishment, and Negative Reinforcement
49.2.7 Design Implications of Behavioral Conditioning
49.3 Immersion in Virtual Environment Games
49.3.1 Defining Immersion
49.3.2 Flow and the Gameflow Model
49.3.3 SCI Model of Immersion
49.3.4 Tactical, Strategic, and Narrative Immersion Model
49.3.5 Designing VE Games to Support Challenge-Related Immersion
49.3.6 Designing VE Games to Support Fantasy or Imaginative Immersion
49.3.7 Designing VE Games to Support Sensory or Curiosity Immersion
49.3.8 Involvement Model of Immersion
49.3.9 Design Implications of the Involvement Model
49.3.10 Emerging Input Devices
49.3.11 Guidelines for the Design of Virtual Environment Games to Maximize Immersion
49.4 Motivational Theory in Game Design
49.4.1 Game Player Types
49.4.2 Self-Determination Theory
49.4.3 Design Recommendations to Support Player Motivation
49.5 Conclusions
References
Section VIII Conclusion
Chapter 50 Virtual Environments: History and Profession
50.1 Introduction
50.2 Brief History of Virtual Environments
50.3 Major Application Areas
50.3.1 Scientific Visualization
50.3.2 Architecture and Design
50.3.3 Education and Training
50.3.4 Entertainment
50.3.5 Manufacturing
50.3.6 Medicine
50.4 Periodicals for Professionals
50.5 Major Conferences and Trade Shows
50.6 Research Laboratories
50.7 Organizations and Resources