Scalable Information Infrastructure Awards
SMART: Scalable Medical Alert and Response Technology
Brigham and Women's Hospital proposes to combine existing and new technologies to develop SMART: Scalable Medical Alert and Response Technology, a system for patient tracking and monitoring that begins at the emergency site and continues through transport, triage, stabilization, and transfer between external sites and health care facilities, as well as within a health care facility. The system is based on a scalable location-aware monitoring architecture, with remote transmission from medical sensors and display of information on personal digital assistants, detection logic for recognizing events requiring action, and logistic support for optimal response. Patients and providers, as well as critical medical equipment, will be located by SMART on demand, and remote alerting from the medical sensors can trigger responses from the nearest available providers. The emergency department at the Brigham and Women's Hospital in Boston will serve as the testbed for initial deployment, refinement, and evaluation of SMART. This project will involve a collaboration of researchers at the Brigham and Women's Hospital, Harvard Medical School, and the Massachusetts Institute of Technology.
Contact: Lucila Ohno-Machado, M.D., Ph.D.
Decision Systems Group
Brigham and Women's Hospital
75 Francis Street
Boston, MA 02115
An Adaptive, Scalable, and Secure I2-based Client-Server Architecture for Interactive Analysis and Visualization of Volumetric-Time Series Data
The Pittsburgh Supercomputing Center (PSC) and the Duke University Center for In Vivo Microscopy (CIVM) will establish an end-to-end Internet2 (I2) testbed supporting a client-server application for the visualization, analysis, storage, and manipulation of 4D datasets, i.e., time-varying 3D datasets. The proposed system exploits approaches used in previous PSC work for networked navigation of static 3D volumetric datasets, developed as part of the "University of Michigan Next Generation Internet Implementation to Serve Visible Human Datasets" (UMVH) project. The new application, referred to as P4VAS (PSC 4D Visualization, Analysis, Storage), requires fast, real-time uploads of newly captured data and secure interactive access by geographically distributed users. We will evaluate the use of specific I2 features, including the IPsec security architecture and the IPv6 protocol, to meet these needs. Abilene Network connections, already in place at both PSC and Duke, provide an end-to-end Internet2 linkage from CIVM data capture facilities to PSC servers and to users anywhere on the network. Web100 tools, developed at PSC, will let P4VAS measure and adapt to dynamically changing network conditions; measures of network performance will come from Web100 software and from instrumentation within P4VAS. The CIVM will evaluate the testbed application's ability to improve research into mouse embryo development using 4D magnetic resonance microscopy (MRM) data. The P4VAS server, residing on a high-performance, large-memory platform, will deliver compressed 4D data to a portable client program providing navigation, visualization, collaboration, and tools for 4D registration, segmentation, labeling, and analysis. Because compression is an essential part of the implementation, the project includes research into compression strategies using directional encoding methods to unify aspects of compression, segmentation, and mesh representation.
Portions of the NLM Insight Segmentation & Registration Toolkit (ITK) will be used to extend the range of data analysis tools available to users. Future telemedicine applications using MRI and CT time series for human patients share many features with the system being developed under this proposal. In particular, the need for rapid, reliable, and secure visualization and analysis of large amounts of newly acquired data from living subjects, distributed over long distances and under widely varying network conditions with broken links and heavy overloads, is common to research, future medical applications, and potential disaster recovery scenarios. These situations will certainly include data capture and loading to service distribution centers as part of the system. Since many of these new aspects are included in our testbed, we will be able to evaluate methods that have not been well studied in the biomedical context but will be important to future health care applications.
Contact: Arthur W. Wetzel
Research Systems Programmer
Pittsburgh Supercomputing Center
Carnegie Mellon University
216A Mellon Institute
Pittsburgh, PA 15213
National Multi-Protocol Ensemble for Self-Scaling Systems for Health
Children's Hospital of Boston proposes a model, large-scale implementation of a self-scaling national networked health system that can be realistically deployed in short order with existing resources and with only conservative expectations regarding developments of the Internet in this decade. The National Multi-Protocol Ensemble for Self-scaling Systems for Health (NMESH) seeks to address these problems of scalability and preparedness at multiple levels by leveraging prior work on multi-institutional "on-the-fly" data integration, regional patient-controlled medical records, self-describing peer-to-peer networks, cryptographic health identification systems, and a GIS-based biosurveillance toolset. Wireless handsets will be introduced as strong authenticators and "smart card"-like storage devices. NMESH will be deployed, demonstrated, and independently evaluated across pediatric and adult populations over multiple unrelated and competing New England healthcare delivery systems. NMESH's capabilities will be demonstrated under a variety of scenarios for both individual patient care and regional biosurveillance.
Contact: Isaac Kohane, M.D., Ph.D.
Children's Hospital Boston
300 Longwood Avenue
Boston, MA 02115
Project Sentinel Collaboratory
Under Project Sentinel Collaboratory, Georgetown University, in partnership with the Emergency Departments of MedStar Washington Hospital Center and MedStar Georgetown University Hospital, will build and deploy a data-centric collaboratory to collect and analyze data from hospitals, clinics, weather services, satellite images of vegetation, mosquito collection, veterinary clinics, and other sources in order to develop indicators and warnings (I&Ws) of emerging threats to human health. Appropriate I&Ws will then allow various authorities more time to prepare corresponding responses to potential threats. Project Sentinel Collaboratory will exploit emerging concepts in information technology, such as middleware, network weather services, electronic authorization and authentication approaches, and grid services, for managing novel disparate data streams and evaluating network-dependent applications. Appropriate use of these tools will enhance interoperability, improve protection of data security, and speed the communication of essential and useful information among authorized users. Initial users of the system will include public health authorities at the DC Department of Public Health, researchers, and physicians from participating hospitals, emergency departments, and community clinics. This project is an integral part of a prototype integrated biodefense system that Georgetown's ISIS Center is developing.
Contact: Seong K. Mun, PhD
Georgetown University Medical Center
2115 Wisconsin Ave. (Suite #603)
Washington, DC 20007
Advanced Health and Disaster Aid Network (AID-N)
The Johns Hopkins University Applied Physics Laboratory proposes an advanced, high-performance communications network to facilitate collaboration among local public health officials, a disease surveillance system, hospital emergency departments, first responders, and care personnel at an auxiliary casualty care center in responding to public health emergencies, including chemical and biological terrorist attacks and naturally occurring events. To provide health institutions with a fully secure and integrated environment for collaboration, AID-N will incorporate object-based middleware to ensure an end-to-end solution for secure remote access to distributed biosurveillance data, and a dynamic command and control center for situational awareness and emergency response management. New and emerging video teleconferencing and wireless technologies will be leveraged to provide first responders (local fire, law enforcement, and emergency medical services) with the mobility, flexibility, and rapid data capture and transfer capabilities critical to their mission. A team led by JHU/APL systems integrators and subject matter experts will be supported by a diverse team of representatives from: the Montgomery County Health Department; the Suburban Hospital Emergency Department; the Johns Hopkins Pediatric Trauma Center; Montgomery Blair High School, a designated emergency casualty care site; ECRI, an independent evaluator of health systems and devices; the Tulane University Medical School Office of Information Technology, developer of the Coherent Informatics conferencing system; and the OPTIMUS Corporation, developer of the Michaels mobile ambulatory reporting system.
Contact: David M. White, D.Sc.
Johns Hopkins University
Applied Physics Laboratory
11100 Johns Hopkins Road
Laurel, MD 20723
Advanced Network Infrastructure for Distributed Learning and Collaborative Research
This project, named HAVnet (Haptic Audio Video Network for Education Technology), builds on prior work developing visual and haptic educational applications for anatomy and surgery training. The project includes aspects of self-scaling technology, self-optimizing end-to-end network-aware real-time middleware, wireless technology, and GIS. These elements will be investigated in the context of two educational testbeds, each an extension of work previously developed as part of an NGI project. One testbed, a Clinical Anatomy suite, will present challenges of bandwidth and latency. The other, a Clinical Skills testbed concentrating on surgical training simulations involving haptic devices, will present performance challenges due to low tolerance for network jitter. Specifically, the project proposes to deliver: enhancement and integration of two existing middleware applications, Information Channels and Weather Stations, allowing correlations to be made between network metrics and actual application performance; addition of self-optimizing features to the six applications using the core middleware; development of a new application, Anatomy Window, that uses a handheld computer to map a cadaver and present corresponding images derived from the Visible Human dataset; development of a Remote Tactile Sensor capable of capturing and transmitting tactile dermatology information over a network; implementation of the anatomy teaching suite over local, national, and global networks for use in early, laboratory-based, and actual field teaching; and implementation of the clinical skills testbed, primarily in early-phase and laboratory testing.
Contact: Parvati Dev, PhD
Stanford University School of Medicine
Stanford, CA, 94305-5466
Advanced Network Infrastructure for Health and Disaster Management
The prehospital (or out-of-facility) medical emergency and public safety information environment is at the threshold of revolutionary change. The change is driven, in part, by several emerging technologies, such as secure, high-speed wireless communication in local and wide area networks (wLAN, 3G), Geographic Information Systems (GIS), the Global Positioning System (GPS), and powerful handheld computing and communication devices. Integrating these technologies into an effective infrastructure that supports routine emergency medical services, with the scalability to support large-scale medical emergencies, is challenging. We propose to establish testbeds that will enable us to iteratively develop, test, and enhance capabilities new to the EMS community. One testbed will address the need for a next-generation 9-1-1 Emergency Response and Medical Dispatch System that operates robustly even during unpredictable medical disasters; using GIS/GPS technology, we will show the location of EMS resources. We will use another testbed to enhance communication between EMS providers in the field and EMS physicians in the hospital. This testbed will be based on Internet2 technologies and the IP-based 3G infrastructure. We will integrate GIS/GPS devices into ambulances and into devices used by EMS providers to enhance their effectiveness and documentation in the field (telemedicine).
Contact: Helmuth F. Orthner, Ph.D.
Health Informatics Program
Dept. of Health Services Administration
University of Alabama at Birmingham
1530 3rd Avenue South - Webb 534
Birmingham, AL 35294-3361
Wireless Internet Information System for Medical Response in Disasters (WIISARD)
The objective of the WIISARD project is the development of an integrated software-hardware system designed to enhance the delivery of medical care at the sites of terrorist attacks and other disasters. WIISARD is designed for use by a Metropolitan Medical Strike Team (MMST) in response to a nuclear, biological, or chemical (NBC) event. The project proposes to integrate several existing hardware and software components to create a comprehensive management system for rapid deployment at a mass casualty site. WIISARD technologies will include self-optimizing wireless end-to-end networks, GPS, RF tags, GIS, and handheld and wearable computers. Using these technologies, WIISARD will monitor the positions of victims, MMST personnel, treatment assets, and NBC weapons plumes. It will also be able to alert first responders to nearby hazards and to changes in victims' vital signs, and provide an auxiliary channel for communication with the incident command center. The specific aim is to create a prototype system that will be evaluated during a San Diego Regional MMST training exercise. WIISARD is a joint project of the School of Medicine of the University of California, San Diego and the California Institute for Telecommunications and Information Technology (Cal-(IT)2). The research team includes participants from the San Diego Regional MMST, San Diego State University, Lawrence Livermore National Laboratory, Qualcomm, Verizon, SAIC, PhiloMetron, Dolphin Medical Systems, MindTel, AwarePoint, and Charmed Technologies.
Contact: Leslie Lenert, M.D., M.S.
University of California, San Diego
9500 Gilman Drive
La Jolla, CA 92093-9111
Advanced Biomedical Tele-Collaboration Testbed
Advanced Biomedical Collaboration (ABC) is a technical framework based on the Access Grid (AG), intended to overcome the inefficiencies and dangers of the place-dynamic collaborative workplace that biomedicine has become. Leveraging AG technologies and advanced networks can provide relatively inexpensive, rapid, high-quality command and control tools to enhance inter-organizational and intra-organizational teamwork and collaboration. Furthermore, expanding the use of AG technologies across different design points, such as stereo or head-mounted displays with human factors considerations, PDAs, laptops, integration with complex instrumentation, and wireless transmission with bandwidth variability, will allow biomedical specialists to remain seamlessly connected to colleagues and visual data wherever they are. The project will develop, demonstrate, and assess network-aware and wireless systems for tele-collaboration between biomedical professionals, focusing on three acute care specialties: surgery, emergency medicine, and radiology. COTS projectors and hardware and open-source software will be used. For EMS applications, wireless video hardware will be developed. By using self-scaling and self-optimizing technologies to implement new seamless application control among arbitrary network topologies, the project will converge immersive virtual reality and teleconferencing.
In addition to building an infrastructure and integrating commodity technology for collaboration and a shared situational context, there will be specific assessments of the technology in biomedical applications. These involve group-to-group interactions in educational contexts involving patient safety and medical simulation, tele-immersion and stereo display for surgical education and robotic surgery, wireless communication for prehospital tele-patient management, and 3D rendering of 2D imaging data in surgical-radiological consultation. The ultimate goal is to advance biomedical research, education, and practice by developing real-time, interactive, remotely collaborative and remotely manipulated stereo video and visualization technologies for sharing human experience in complex physical and virtually augmented environments.
Contact: Jonathan C. Silverstein, MD, MS, FACS
The University of Chicago Hospitals
Room A-105, MC 6051
5841 S. Maryland Avenue
Chicago, IL 60637-1470
A Tele-Immersive System for Surgical Consultation and Implant Modeling
The project will employ augmented VR systems being developed at the University of Illinois at Chicago for surgical consultation and cranial implant modeling using the C-Wall and Physician's Personal VR Displays. Implant designs will be created by medical professionals in tele-immersive collaborations in which medical modelers create virtual implants that precisely fit defects derived from patient CT data. Haptic devices provide a sense of touch when designing the implants, and the data are sent to a stereo-lithography rapid prototyping system that creates the physical implant model. After surgery, patients undergo scans and the results are reviewed over the tele-immersive system. The system can be generalized to other medical problems besides cranial implants and used for medical education as well as consultation. Teaching modules will be generated. Reviews of fit by surgeons and modelers will be used to judge fabrication, and network measurement tools will be used to calibrate latency, jitter, and other network quality-of-service outcomes (including artificially induced conditions affecting network performance). In addition, subjective data will be collected regarding the interface and other aspects of the system's use.
Contact: Zhuming Ai, PhD
University of Illinois - Chicago
1919 W. Taylor, SBHIS (MC 530)
Chicago, IL 60612-7249
3D Telepresence for Medical Consultation: Extending Medical Expertise Throughout, Between and Beyond Hospitals
The project will develop and test permanent, portable, and handheld 3D telepresence technologies for remote medical consultations involving an advising healthcare provider and a distant advisee. Advanced trauma life support and endotracheal intubation will be used initially in developing the system and in controlled experiments, which will compare the shared sense of presence offered by view-dependent 3D telepresence technology to that of current 2D videoconferencing technology. The project will focus on barriers to 3D telepresence, including real-time acquisition and novel view generation, network congestion and variability, and tracking and displays for producing accurate 3D depth cues and motion parallax. Once the effectiveness of the system under controlled conditions is established, future efforts would involve adapting the technology for use in a variety of clinical scenarios, such as remote hospital to tertiary center emergency consultations, portable in-transit diagnosis and stabilization systems, intraoperative consultations, and tumor boards. The system will be assessed by measuring the quality of medical diagnosis and treatment using a Human Patient Simulator, along with judgments concerning its acceptance and practicality by patients, physicians, nursing staff, technicians, and hospital administrators. The cost-effectiveness of 2D and 3D strategies will be analyzed.
Contact: Henry Fuchs, Ph.D.
Department of Computer Science
209 Sitterson Hall
University of North Carolina
Chapel Hill, NC 27599-3175