
XR Community Simulation Project

Exploring the use of XR technology to bridge the realism gap inherent in simulations of community care environments.

Our project set out to investigate how XR technologies and virtual environments can help overcome the realism gap that often exists in simulations related to community care. Specifically, we aimed to explore how these technologies could enhance the delivery of simulated scenarios and create an experience that is both portable and accessible to community staff and trainees across multiple sites within our trust.


The project’s scope focuses on whether Technology Enhanced Learning (TEL) coupled with real-time simulators and immersive experiences can effectively simulate the complexities of diverse community environments. Our objective is to determine whether incorporating the subtleties of these environments can improve patient outcomes and reinforce staff confidence, thereby enabling healthcare teams to deliver safe and effective care.


In collaboration with the Simulation team at Torbay and South Devon NHS Foundation Trust (TSDFT), we selected a small static caravan/mobile home as the proposed setting for the project. Several existing simulation scenarios, such as a falls scenario and a discharge to assess pathway, could be adapted to fit within this space. These simulations are currently conducted in our physical simulation suite.

Physical sim suite at TSDFT

We conducted research on potential hardware solutions and ultimately selected Varjo's flagship mixed-reality XR-3 headset. This advanced device features passthrough cameras on the front of the headset which, when combined with a traditional green-screen setup, enable real-time chroma keying to separate physical actors and assets in a scene. This allows us to overlay real actors from our simulation team and blend them seamlessly with the virtual environment.
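The core idea behind chroma keying can be illustrated with a minimal sketch: classify each camera pixel as "green screen" or "actor", then composite actor pixels over the virtual scene. This is a generic illustration of the technique only, not Varjo's actual passthrough pipeline; the threshold value and pixel format are assumptions.

```python
def is_green(pixel, dominance=40):
    """A pixel keys out when green clearly dominates red and blue."""
    r, g, b = pixel
    return g - max(r, b) > dominance

def composite(camera_row, virtual_row):
    """Replace green-screen pixels with the virtual environment."""
    return [virt if is_green(cam) else cam
            for cam, virt in zip(camera_row, virtual_row)]

camera = [(30, 200, 40), (180, 150, 130), (25, 210, 35)]  # green, skin tone, green
virtual = [(90, 90, 200), (90, 90, 200), (90, 90, 200)]   # virtual scene pixels
print(composite(camera, virtual))
```

A production keyer works in a perceptual colour space with soft edges and spill suppression, but the same classify-and-composite structure applies.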


This feature was particularly desirable as it preserves the authenticity of using real simulation actors, who can convey human emotion and provide ad-lib responses to participants' actions and choices in real time. We felt that this level of authenticity would not be sufficiently achievable through the use of virtual avatars.

Demonstrating the passthrough technology that allows real actors to blend with the virtual environment

Environmental storytelling is a significant aspect of the simulation process. In addition to training and practicing the clinical duties associated with each scenario, participants are encouraged to observe and assess the environment they enter and interpret any clues that may reveal more about the patient’s condition. For instance, they may identify a need for further support or other risk factors.


In reality, community care staff may encounter various circumstances and indicators during home visits that signal an individual's need for further support. Torbay, recognised as one of the most deprived areas in Devon according to the Index of Multiple Deprivation, is facing rising levels of poverty, compounded further by the current cost-of-living crisis. Indicators may include poor nutrition and hygiene, irregular or uncontrolled medication intake, and financial difficulties, to name a few.


Replicating these conditions and environmental details in a physical simulation space can often be time-consuming and costly, particularly when needing to make adjustments between each simulated scenario. Virtual assets allow the simulation team to interchange environmental details easily, enabling them to modify each scenario quickly, saving both time and resources. This also allows simulations to include real-world challenges that staff are likely to encounter, thereby enhancing the training experience.


As noted earlier, one of the project’s primary goals was to make it accessible to community staff in various rural areas. To achieve this, we decided to acquire a dedicated van and transform it into a mobile simulation experience.

Exterior of the van unit

Since the technology requires a specific chromakey setup, we wrapped the van’s interior with vinyl greenscreen lining to create a space where we could deploy our simulated environment. A 3D scan was taken of the van’s interior to provide a template for the virtual scene’s size and scale.

Vinyl wrapped van interior

Our internal team developed the initial 3D environment for the project, which took the form of a static home interior featuring a kitchen area and a lounge/living space. To create a more comprehensive sense of a lived space, virtual rooms for the bedroom and bathroom were added, though these were not designed to be navigated; virtual barriers discourage participants from attempting to enter them.


The scene was imported into Unreal Engine, a software package used to develop interactive applications, where the simulation's functionality was set up. This included a player controller to enable free movement around the space, plus interactable objects such as a working television, radio, and fridge. With this setup, we can introduce deeper levels of interaction into the virtual environment and create greater opportunities for learning and feedback. For instance, at the start of the falls scenario a participant may enter the environment to find the patient on the floor with the television on at high volume; they may choose to turn it off first, or persevere with the background noise acting as a barrier to patient communication. Similarly, they may open the fridge to discover that there is no food, or that the food inside has expired. All of these elements can be adjusted between each run-through of the experience.
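The per-run-through adjustability described above lends itself to a data-driven design, where each run is defined by a small configuration rather than hand-edited scene changes. The sketch below is a hypothetical illustration of that idea; the field names (`tv_volume`, `fridge_contents`, `medication_visible`) are our assumptions, not the project's actual schema, and the real implementation lives in Unreal Engine rather than Python.

```python
from dataclasses import dataclass

@dataclass
class ScenarioConfig:
    tv_volume: int              # 0 = off; a high value adds a communication barrier
    fridge_contents: list       # empty or expired items hint at poor nutrition
    medication_visible: bool    # clue for irregular medication intake

def environmental_clues(cfg):
    """Derive the storytelling clues a participant could pick up on."""
    clues = []
    if cfg.tv_volume > 70:
        clues.append("loud TV hampering communication")
    if not cfg.fridge_contents:
        clues.append("no food in the fridge")
    if not cfg.medication_visible:
        clues.append("no sign of prescribed medication")
    return clues

falls_run_1 = ScenarioConfig(tv_volume=90, fridge_contents=[], medication_visible=False)
print(environmental_clues(falls_run_1))
```

Swapping in a different `ScenarioConfig` between run-throughs changes the environmental storytelling without touching the scene itself, which mirrors the time and cost savings the virtual assets provide over physical redressing.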


The simulation van is equipped with physical fixtures, such as a sink area and seating, which have been seamlessly integrated with virtual assets for a cohesive experience. The integration allows for enhanced interaction and feedback with the environment, including the ability for participants to touch surfaces and place objects on them. The hope is that this serves to further the sense of immersion and engagement with the scenarios presented.

Virtual environment within Unreal Engine

Initial test sessions have generated positive feedback and interest from staff groups, prompting us to proceed with full simulation scenarios alongside clinical teams. Moving forward, we plan to explore hand-tracking technology to create more interaction points, and eye-tracking analytics to export user gaze data for more robust feedback and debriefing opportunities.
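To show how exported gaze data could support debriefing, here is a hedged sketch that turns timestamped gaze samples into dwell time per object. The record format `(timestamp_seconds, focused_object)` is an assumption for illustration; real eye-tracking exports vary by vendor and headset SDK.

```python
from collections import defaultdict

def dwell_times(samples):
    """Sum the time between consecutive samples against the object in focus."""
    totals = defaultdict(float)
    for (t0, obj), (t1, _) in zip(samples, samples[1:]):
        totals[obj] += t1 - t0
    return dict(totals)

# Hypothetical gaze trace from a falls-scenario run-through.
samples = [
    (0.0, "patient"), (2.0, "patient"), (4.0, "television"),
    (5.0, "fridge"), (7.0, "patient"), (9.0, "patient"),
]
print(dwell_times(samples))
```

A debrief could then surface, for example, how long a participant spent assessing the patient versus scanning the environment for clues.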


Furthermore, the virtual environment serves as a template for the creation of additional environments, with the ultimate goal of establishing a comprehensive catalogue of experiences and scenarios that can be deployed for staff training on a mobile basis.
