The emergence of several trends, including the increased availability of wireless networks, the miniaturization of electronics and sensing technologies, and novel input and output devices, is creating demand for integrated, full-time displays for use across a wide range of applications, including collaborative environments. In this paper, we present and discuss emerging visualization methods we are developing, particularly as they relate to deployable displays and displays worn on the body to support mobile users, as well as optical imaging technology that may be coupled to 3D visualization in the context of medical training and guided surgery.
Modeling, Simulation and Training (MS&T) technologies have provided significant capabilities for military training and mission rehearsal. However, most of the state-of-the-art MS&T systems in use today are high-fidelity, stand-alone systems, routinely staffed by a team of support and instructional personnel. As the military becomes more reliant on these technologies to support ever-changing concepts of operations, it is asking for numerous technological advancements, including 1) automated instructional features to reduce the number of personnel required for exercises, 2) increased capability for adapting human-computer interfaces to support individual differences and embedded performance support in operational settings, and 3) a continuum of low- to high-fidelity system components to provide embedded, deployable and transportable solutions. A multi-disciplinary team of researchers in the Applied Cognition and Training in Immersive Virtual Environments (ACTIVE) Lab at the University of Central Florida's (UCF) Institute for Simulation and Training (IST), led by Dr. Denise Nicholson, is performing research and development to address these emerging requirements as part of ongoing projects for Navy, Marine Corps and Army customers. In this paper we discuss some of the challenges that confront researchers in this area and how the ACTIVE Lab hopes to respond to them.
A framework for real-time visualization of tumor-influenced lung dynamics is presented in this paper. The framework potentially allows clinical technicians to visualize in 3D the morphological changes of the lungs under different breathing conditions; consequently, it may provide a sensitive and accurate assessment tool for pre-operative and intra-operative clinical guidance. The proposed simulation method extends work previously developed for modeling and visualizing normal 3D lung dynamics. The model accounts for changes in regional lung functionality and in the global motor response due to the presence of a tumor. For real-time deformation purposes, we use a Green's function (GF), a physically based approach that allows real-time, multi-resolution modeling of lung deformations. The formulation also allows analytical estimation of the GF's deformation parameters from 4D lung datasets at different levels of detail of the lung model. Once estimated, the subject-specific GF facilitates simulation of tumor-influenced lung deformations under any breathing condition modeled by a parametric Pressure-Volume (PV) relation.
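The abstract does not give the numerical form of the GF model, but the general pattern of a Green's-function deformation is linear superposition: a precomputed response matrix maps pressure-derived nodal forces to vertex displacements. The Python sketch below is a minimal, hypothetical illustration of that pattern only; the sigmoidal pv_volume curve, the array shapes, and the placeholder GF matrix are assumptions made for illustration, not the authors' implementation.

    import numpy as np

    def pv_volume(pressure, v_max=1.0, k=0.2):
        """Parametric Pressure-Volume relation (assumed sigmoidal form)."""
        return v_max / (1.0 + np.exp(-pressure / k))

    def deform(nodes, greens_matrix, nodal_forces):
        """Apply a precomputed GF matrix to nodal forces.

        nodes:          (N, 3) rest positions of surface vertices
        greens_matrix:  (N, N) precomputed response of each vertex to a unit
                        force at every other vertex (in the paper's framework,
                        estimated per subject from 4D lung datasets)
        nodal_forces:   (N, 3) pressure-derived force at each vertex
        """
        displacements = greens_matrix @ nodal_forces   # linear superposition
        return nodes + displacements

    # Toy usage with random data standing in for a real lung surface mesh.
    rng = np.random.default_rng(0)
    N = 500
    nodes = rng.normal(size=(N, 3))
    greens_matrix = np.eye(N) * 0.01                   # placeholder GF
    pressure = 5.0                                     # illustrative value
    forces = pv_volume(pressure) * rng.normal(size=(N, 3)) * 0.1
    deformed = deform(nodes, greens_matrix, forces)

Because the deformation is a single matrix-vector product per frame, this style of model is well suited to the real-time, multi-resolution use described above: coarser meshes simply use smaller GF matrices.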
Pilot cueing is a valuable use of Head-Mounted Displays (HMDs), as it greatly helps the user visually locate electronically identified targets. It is well known that a target which is hard to spot in the sky can be easily tracked and studied after it has been visually located. Transients, including sun glint, can reveal much about distant targets as they are visually studied; this is implicit in the "Visual Rules of Engagement". The term "Virtual Beyond Visual Range" has been coined to reflect the fact that optimized HMD cueing can extend visual identification to ranges previously covered only by radar data. The visual acquisition range can drop by a factor of three, however, when HMD image correlation errors expand the uncertainty zone a pilot must visually search. We have demonstrated that system errors tolerable for off-axis missile targeting can produce this large drop in operational effectiveness. Studies using the Spectron SE1430 HMD analysis system have shown that errors of this magnitude can develop in current HMD models, and that these errors were neither identified by "ready room" tests nor correctable in the cockpit. The focus of this study was to develop affordable techniques to quantify the relationship of combat effectiveness to HMD defects for this and other advanced operating modes. When combined with field monitoring of HMD degradation, this makes economic optimization of the HMD supply/maintenance model possible while fulfilling operational mission requirements.
One issue in head-mounted display design is the tradeoff between field of view (FOV) and resolution, which can lead to reduced visual acuity (VA): for a given LCD with a fixed number of pixels, increasing the FOV decreases visual acuity. The effects of enhanced brightness on VA using two types of retro-reflective material (cubed or beaded) were tested with a 52-deg FOV projective helmet-mounted display of VGA resolution, under three lighting conditions. Based on the display size, resolution, and FOV, we estimated a maximum attainable visual acuity of 4.1 minutes of arc. In a counter-balanced, between-measures design, subjects' psychometric acuity functions were determined using a computer-generated four-alternative forced-choice (4AFC) Landolt C test presented stereoscopically, with probit analysis. The results confirmed that the maximum visual acuity possible within the setup was 4.1 arc minutes, a limit imposed by the microdisplay and not by the retro-reflective material.
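As a rough check on that display-limited figure, the per-pixel angular subtense can be estimated from the FOV and pixel count alone. The short Python sketch below is a back-of-the-envelope calculation under assumed geometry (a flat 640 x 480 VGA microdisplay whose diagonal fills the 52-degree FOV); the abstract does not specify the actual optical layout, so the numbers are illustrative rather than a reconstruction of the authors' estimate.

    import math

    # Assumed geometry: flat VGA microdisplay, diagonal spans the full FOV.
    fov_deg = 52.0
    cols, rows = 640, 480
    diag_pixels = math.hypot(cols, rows)          # 800 px along the diagonal

    # Angular width of a central pixel on a flat display (tan mapping).
    half_fov_rad = math.radians(fov_deg / 2.0)
    pixel_rad = 2.0 * math.atan(math.tan(half_fov_rad) / diag_pixels)
    pixel_arcmin = math.degrees(pixel_rad) * 60.0

    # Simpler small-angle estimate: FOV spread evenly over the pixels.
    uniform_arcmin = fov_deg * 60.0 / diag_pixels

    print(f"central pixel subtense : {pixel_arcmin:.2f} arcmin")
    print(f"uniform-spread estimate: {uniform_arcmin:.2f} arcmin")

Both estimates land near 4 arcmin per pixel, the same order as the 4.1-arcmin display-limited acuity reported above; the exact value depends on the true display geometry and optics, which the abstract does not state.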