Open Access

Creating an Online Interprofessional Collaborative Team Simulation to Overcome Common Barriers of Interprofessional Education / Eine internetbasierte, interprofessionelle Teamsimulation zur Überwindung organisatorischer Hürden in der interprofessionellen Ausbildung



‘Interprofessional collaboration (IPC) occurs when two or more professions work together to achieve common goals and is often used as a means for solving a variety of problems and complex issues’ (Green & Johnson, 2015, p. 1). IPC practice models are being used in healthcare settings, along with informatics, to improve patient care outcomes (Health Resources and Services Administration, 2013; World Health Organization (WHO), 2010; Cox & Naylor, 2013; Health Force Ontario, 2013; Hopkins, 2010; Cuff, 2013; Manos, 2012; Institute of Medicine, 2012). Future health providers need interprofessional (IP) communication and teamwork training to work collaboratively and provide patient-centred care (Hopkins, 2010). To qualify as interprofessional education (IPE), these activities must also involve students from two or more professions learning about, from and with each other to enable effective collaboration and improve health outcomes (WHO, 2010; Interprofessional Education Collaborative, 2011). Development and implementation of high-quality activities that achieve the core Interprofessional Education Collaborative (IPEC) competencies, which are required by accreditors of US health professions programmes, are important (Begley, 2009). However, many barriers to in-person IPE have been documented consistently in the literature. The most common barriers to IPE are logistical and resource issues such as scheduling conflicts, geographical location/physical space and faculty time (Begley, 2009; Oandasan & Reeves, 2005; Lawlis, Anson & Greenfield, 2014).

In addition to overcoming barriers, educators must also select the best instructional design methods for IPE. Two common instructional methods in the IPE literature are online module learning and in-person simulation (Abu-Rish et al., 2012). IPE using online modules as part of a stand-alone or blended activity or course has been reported with mixed results regarding educator and student reactions and attitudes (McKenna, Palermo, Molloy, Williams & Brown, 2014; Blue et al., 2010). In-person IPE simulations typically have favourable effects regarding learner satisfaction, reaction, perceived authenticity and attitudes (Shoemaker, Beasley, Cooper, Perkins, Smith & Swank, 2011; Zhang, Thompson & Miller, 2011). In addition, a recent example of combining online modules and online simulation principles using virtual patient technology demonstrated feasibility and positive student reactions (Shoemaker, Platko, Cleghorn & Booth, 2014). More examples of combining online IPE methods are needed.

The purpose of this article is twofold. First is to describe the process steps for creating the Interprofessional Plan of Care – Simulated E-hEalth Delivery System (IPOC-SEEDS) simulation, to serve as a model for replication or modification. Second is to disseminate the results from this innovative online simulation, which was specifically developed to achieve IPEC and informatics competency domains whilst minimising common barriers to IPE.

Methods

Over 10 months, a team of IP faculty and students met to create the IPOC-SEEDS simulation. Figure 1 outlines the five-step process that was used to plan, develop, implement, assess and revise the IPOC-SEEDS simulation. A lead faculty member was tasked with coordinating and scheduling team meetings, updating team members and identifying work to be completed.

Figure 1

IPOC-SEEDS Simulation Process Steps

Step 1 Plan

The first step, plan, began with assembling an IP team including faculty from advanced practice nursing, dietetics and nutrition, health information management, occupational therapy and pharmacy; staff experts in informatics, teaching technology and electronic health records; and graduate assistants. A review of current IPE simulation literature was conducted to inform the development of the simulation. The team identified learning objectives for the IPOC-SEEDS simulation to address selected IPEC core competencies targeting the domains of teams and teamwork (TT4, TT5, TT8) and roles and responsibilities (RR4; Interprofessional Education Collaborative Expert Panel, 2011). An additional simulation objective identified by the team was for all involved professions to navigate the electronic health record (Quality and Safety Education for Nurses, n.d.). Study objectives were to assess the effectiveness, value and technology used during the IPOC-SEEDS simulation. The planning step also included determination of the content to be developed and the technology systems to be used for delivery.

Step 2 Develop

The second step of the process, develop, involved development of content, assessment measures and technology utilisation (see Figure 1). Content development included a patient case, uni-professional audio patient encounters, team-led debriefing principles and assessment measures. The available technology used to develop the simulation included an electronic health record (EHR), a learning management system (LMS) and a web-conferencing system. Specific development tasks for each focus area are cross-walked in Table 1 with the review method used during each development step. The developmental tasks provide an idea of workload for each focus area. Review of these tasks mostly consisted of multiple iterations of IP development team feedback or peer review. Development took 10–30 h per week of the lead faculty’s time during the first year and approximately 2–10 h monthly for all others involved.

IPOC-SEEDS Simulation Development

| Focus area | Tasks | Review |
| Develop Content: | | |
| Patient case | Expand a standardised patient case (available from the National League for Nursing*) into an interprofessional case applicable to all professions involved. The four-day case included multiple IP encounters, orders, laboratory results, vital signs, imaging, a medication administration record and intake and output findings. | Multiple iterations of IP faculty and informatics/technology expert feedback (IP development team) |
| Audio patient encounters | Develop a patient-professional visit script for each of the four professions with actor/patient collaboration. Record, save and securely post audio of each uni-professional patient-professional encounter (3–8 min each) on the LMS. | Multiple iterations of IP faculty and patient expert feedback; successful completion of the task with technology expert assistance |
| Team-led debriefing | Develop a presentation on how to team debrief after the simulation (10 min). Develop student-led questions to be used during the team debrief. | Faculty peer review; modified from evidence with faculty expert feedback |
| Assessment measures | Develop a simulation evaluation survey to address the learning/study objectives. Select the PACT-novice tool*** to review the team encounters. Select and modify the DASH tool** to assess the student-led team debriefing. | Established competencies modified through multiple iterations of IP faculty and technical expert feedback; ROL to find and modify appropriate tools, with IP faculty feedback |
| Develop Technology: | | |
| Electronic health record (EHR) | Determine appropriate EHR documentation forms for each profession. Build the patient case in the EHR, including documentation by all providers, orders, vital signs, medication administration record, imaging and flow sheets. Develop student and faculty instructions for accessing and navigating the EHR. Develop the IP plan of care team documentation form. | Informatics expert assessment; informatics and HIM/EHR experts consulted to create the EHR; informatics and EHR expert consulted; informatics expert researcher consulted |
| Learning management system (LMS) | Develop the LMS simulation shell (welcome, simulation directions, 3-week content, evaluation, access to the EHR, web-conferencing and e-mailing of team and faculty). Load students and faculty into the simulation shell. | Faculty peer review; LMS technology expert consulted |
| Synchronous web-conferencing system | Create directions to access, use and record in the web-conferencing system. | Technology expert consulted |

ROL: Review of Literature

*Boese T. Advancing Care Excellence for Seniors unfolding case on Red Yoder. New York: National League for Nursing. 2011. Accessed 2013. http://www.nln.org/professional-development-programs/teaching-resources/aging/ace-s/unfolding-cases/red-yoder

**Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc. 2012; 7(5): 288-94. 10.1097/SIH.0b013e3182620228.

***University of Washington. Performance Assessment of Communication and Teamwork Novice Observer Form. http://collaborate.uw.edu/sites/default/files/files/PACT_RealTime_ShortForm_Generic_110611_copyright.pdf, Published 2011. Accessed January 14, 2013.

Content development. The content of the simulation included a patient case, uni-professional audio patient encounters for each profession, team-led debriefing and assessment measures. In order to create a need for team collaboration, each profession was given incomplete information about the case. For example, the nursing student audio recording was the initial assessment interview with no other team members present, so the nursing student had to share important information with the team. Audio recordings were chosen to attain a realistic encounter that was practical and feasible whilst minimising costs. The goal in developing the team-led debriefing content was to frame the importance of debriefing and to assure quality because the students would not have faculty oversight.

Simulation assessment measures development. Three assessment measures were developed or modified to assess the IPOC-SEEDS simulation content regarding IP knowledge gained, collaborative teamwork, team debriefing, simulation objectives and technology delivery.

Collaborative teamwork (PACT-novice). The Performance Assessment of Communication and Teamwork Novice Observer Form (PACT-novice) evaluates the five domains of team structure, leadership, situation monitoring, mutual support and communication rated on a five-point scale (1=poor, 3=average, 5=excellent). The PACT-novice tool was chosen for evaluation because it was designed and validated for novice raters and is simple to use for real-time or retrospective video observations (University of Washington, 2011).

Team debriefing (modified DASH). After an extensive search of the literature, no tools were found for evaluating student-led debriefing. Therefore, the Debriefing Assessment for Simulation in Healthcare (DASH) tool was modified prior to observations (elements one and five were removed) to be more applicable to the simulation and the instructions given to the student debriefing leaders (Brett-Fleegler, Rudolph & Eppich, 2012). The teams were scored on a seven-point rating scale (1 = extremely ineffective/detrimental to 7 = extremely effective/outstanding) in the following areas:

Element Two: Maintains an engaging learning environment

Element Three: Structures the debriefing in an organised way

Element Four: Provokes engaging discussion

Element Six: Helps trainees achieve or sustain good future performance

Simulation objectives and technology delivery (IPOC-SEEDS evaluation). A review of literature yielded no reliable and valid measures for assessing that the students met the IPOC-SEEDS simulation or technology delivery objectives. Therefore, the IP development team created these measures. Multiple iterations were piloted, and ultimately, the survey consisted of 34 questions: 15 demographic questions that concern profession, age, gender, ethnicity, city, state, online education experience, technology experience, EHR experience and IPE simulation experience; 10 five-point Likert-scale questions [Not at all (0–5%); Sometimes (25%); Often (50%); Most of the time (75%); Always (95–100%)] that ask about individual and team behaviour during team meetings; 4 yes/no questions that address the simulation and technology used; and 5 open-ended questions that ask about IP knowledge, evaluation of the IP simulation content and technology delivery.

Technology development and utilisation. Online technologies selected to deliver this simulation included an EHR, an LMS and a web-conferencing system. EHR documentation forms for each profession were selected or developed by the respective profession faculty in consultation with the informatics expert, avoiding free text forms and considering the level of student. The simulation was housed in an LMS delivered course that was developed to meet Quality Matters criteria and supported by teaching and learning technology experts (Quality Matters, 2014).

Step 3 Implement

The third step of the process, implement, occurred over three weeks (see Figure 1). First-week implementation activities involved introducing students to IP teamwork and providing technology training on the EHR, the LMS and the web-conferencing system (see Figure 2).

Figure 2

IPOC-SEEDS Online Simulation Methods

Students were also assigned to IP teams and used secure email to schedule their own team meeting times for week 3. During the second week, students individually reviewed the patient’s chart in the EHR, listened to the uni-professional audio patient encounter recording and documented as the provider in the EHR. During the third and final week, students met synchronously in IP teams to discuss and determine an IP collaborative care plan and document their team plan of care in the EHR. Afterwards, the students managed the team-led debrief of the simulation, without faculty, discussing team functioning and areas for improvement. Finally, the students anonymously and voluntarily completed a short evaluation of the simulation in REDCap, a data capture system for research (Harris, Taylor, Thielke, Payne, Gonzalez & Conde, 2009). This project was approved by the institutional review board before implementation.

Step 4 Assess

The fourth step, assess, included assessment of the simulation (see Figure 1). Table 2 displays the objectives along with how they were assessed. The PACT-novice measure was used to assess the team meeting videos. The modified DASH measure was used to assess the student-led debriefings. The rest of the implementation was evaluated by student feedback on the IPOC-SEEDS evaluation or faculty review of student team content. Formal and informal student and faculty feedback was also gathered throughout to improve future simulation delivery.

IPOC-SEEDS Simulation Objectives with Respective Assessment

| Domain | Objective | Assessment |
| IPEC*: Roles and responsibilities | 1 (RR4) Identify the roles and responsibilities for yourself and other care providers and how the team works together to provide care. | IPOC-SEEDS evaluation: How often did your team introduce themselves and describe their profession? How often did you fulfil your professional role? Did you feel your team collaborated to come to a consensus on the plan of care? |
| IPEC: Teams and teamwork | 2 (TT4) Integrate the knowledge and experience of other professions, appropriate to the specific care situation, to inform care decisions, whilst respecting patient and community values and priorities/preferences for care. | Assessed during the IP team meeting: scores on five domains. IPOC-SEEDS evaluation: How often did you contribute to the plan of care? How often did your team keep the patient’s perspective considered throughout the meeting? |
| IPEC: Teams and teamwork | 3 (TT5) Apply teamwork principles that support collaborative practice and team effectiveness. | Assessed during the IP team meeting: scores on five domains. IPOC-SEEDS evaluation: How often did you feel empowered to speak freely? How often did you feel your opinion was valued? How often did you fulfil your professional role? |
| IPEC: Teams and teamwork | 4 (TT8) Reflect on individual and team performance for individual, as well as team, performance improvement. | PACT measure. DASH measure: scores on debriefing effectiveness, with ratings across four areas (maintain an engaging learning environment, keep the debrief structured and organised, provoke an engaging discussion and sustain good future performance). IPOC-SEEDS evaluation: How often did you speak up/advocate for your profession? How often did you contribute to the plan of care? How often did you fulfil your professional role? How often did you fulfil your team role? |
| Informatics** | Navigate the electronic health record. | IPOC-SEEDS evaluation: number of first-time EHR users; percent of students who completed the individual documentation assignment. |
| Simulation effectiveness | Was the IPOC-SEEDS simulation effective? | IPOC-SEEDS evaluation: Was the case realistic? Did your team collaborate and come to a consensus on the plan? What would you change to improve the effectiveness of this simulation in the future? Did you have enough training to debrief as a team without faculty? |
| Simulation value | What was the value of the simulation? | IPOC-SEEDS evaluation: What value does this simulation have for you and your profession? |
| Technology utilisation | Did technology utilisation and delivery work? | IPOC-SEEDS evaluation: What would you change to improve the effectiveness of this simulation in the future? |

* Interprofessional Education Collaborative Expert Panel. Core competencies for interprofessional collaborative practice: Report of an expert panel. Washington, D.C.: Interprofessional Education Collaborative. http://www.aacn.nche.edu/education-resources/ipecreport.pdf. Published 2011.

**Quality and Safety Education for Nurses. QSEN competencies. QSEN.org (n.d.). Accessed January 14, 2013.

Quantitative Assessment Methods. Quantitative questions from the IPOC-SEEDS evaluation regarding demographics, individual and team behaviour related to IPEC objectives and simulation effectiveness were analysed. Descriptive statistics for these items included central tendency (mean, median), dispersion (standard deviation, range, minimum and maximum score), distribution (skewness and kurtosis) and normality (histograms and distribution analysis).
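As an illustration only (not the authors’ actual analysis code), the descriptive statistics named above can be computed with standard tools; the Likert responses in this sketch are hypothetical:

```python
import statistics

def describe(scores):
    """Descriptive statistics of the kind reported for the IPOC-SEEDS
    Likert items: central tendency, dispersion and distribution shape.
    (Illustrative helper; the input data are hypothetical, not study results.)"""
    n = len(scores)
    mean = statistics.fmean(scores)
    sd = statistics.stdev(scores)  # sample standard deviation
    # Moment-based skewness and excess kurtosis
    m2 = sum((x - mean) ** 2 for x in scores) / n
    m3 = sum((x - mean) ** 3 for x in scores) / n
    m4 = sum((x - mean) ** 4 for x in scores) / n
    return {
        "mean": round(mean, 2),
        "median": statistics.median(scores),
        "sd": round(sd, 2),
        "range": (min(scores), max(scores)),
        "skewness": round(m3 / m2 ** 1.5, 2),
        "kurtosis": round(m4 / m2 ** 2 - 3, 2),
    }

# Hypothetical five-point Likert responses (1 = Not at all ... 5 = Always)
print(describe([5, 4, 5, 3, 5, 4, 5, 5, 2, 4]))
```

A negative skewness here mirrors the pattern in the study data, where responses clustered toward ‘most of the time’ and ‘always’.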

Content Analysis Assessment Methods. Inductive content analysis (Patton, 1990) of the IPOC-SEEDS evaluation was completed to identify common emergent themes and patterns in student responses to two open-ended questions on the simulation assessment measure: ‘What value does this simulation have for you and your profession?’ and ‘What would you change to improve the effectiveness of this simulation in the future?’ Because technology issues were of particular interest, answers to the latter question were further evaluated for any complaints about technology. Two researchers independently coded each item. Raters agreed on 75.9% (n = 469/618) of comments, and they met to discuss differing results in order to reach a consensus for all items (Patton, 1990).
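The agreement figure above is simple percent agreement: the proportion of comments to which both coders assigned the same code. A hypothetical helper (function name and sample codes are assumptions, not from the article) illustrates the arithmetic:

```python
def percent_agreement(codes_a, codes_b):
    """Inter-rater percent agreement between two coders over the same
    comments. (Hypothetical helper illustrating the reported 469/618.)"""
    if len(codes_a) != len(codes_b):
        raise ValueError("coders must rate the same set of comments")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# The agreement reported in the article: 469 matching codes of 618 comments
print(round(469 / 618 * 100, 1))  # prints 75.9
```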

Behavioural Observation Assessment Methods. Nine teams out of 36 were randomly selected for evaluation using Internet-based software (Urbaniak & Plous, n.d.). Two validated rubrics were used by two independent student observers to retrospectively assess students’ behaviours regarding IP teamwork and the quality of student-led debriefing via videos of the online simulation and debriefing session. The two observers conducted a training session to discuss the rubrics and then independently assessed a randomly selected team to establish inter-rater reliability. During the training session, team scores between observers varied by less than 10%, and inter-rater reliability was established. See the Development section for further discussion of the PACT-novice and modified DASH tools that were used (University of Washington, 2011; Brett-Fleegler, Rudolph & Eppich, 2012).
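The team sampling step can be mirrored with a standard library sampler. This sketch is only an analogue of the study’s procedure (the article used an Internet-based randomiser; the seed and team names here are hypothetical): it draws 9 of 36 teams without replacement.

```python
import random

# Hypothetical re-creation of the sampling step: select 9 of 36 teams.
teams = [f"Team {i:02d}" for i in range(1, 37)]
rng = random.Random(2014)      # fixed seed for reproducibility (illustrative)
selected = rng.sample(teams, k=9)  # sampling without replacement
print(sorted(selected))
```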

Step 5 Revise

The final step, revise, focused on the changes in IPE and IP technology based on the previous results (see Figure 1). Revisions were based on assessment results and feedback from students and faculty. Examples of content revisions include clarifying student directions and reducing the length of the IPOC-SEEDS evaluation survey. Examples of technology revisions include changing the web-conferencing system to another commercially available platform with a quicker connection time and having information technology experts available in class during the first week to aid in troubleshooting student issues.

Results

The IPOC-SEEDS was a student-directed online simulation of the care planning process for a patient transitioning from one level of care to another, such as from the hospital to a rehabilitation facility. The simulation involved an IP team meeting and an EHR developed for academic purposes. The students scheduled and coordinated their IP teams, progressing through three weeks of content and simulation activities (see Figure 2).

Demographics

Students from advanced practice nursing, dietetics and nutrition, occupational therapy and pharmacy participated in the fall 2013 pilot (N = 100) and spring 2014 (N = 106) simulations, with both sets of data presented here (see Table 3). A majority of students were Caucasian (83%, 81%), female (87%, 75%) and in their 20s (72%, 71%). Fourteen percent (fall 2013) and 27% (spring 2014) of students participated online from a location at least 3 h away from the Midwest university location. Most students rated their technology experience as ‘competent, 2–3 years’ or ‘proficient, 3–5 years’. This simulation was predominantly the students’ first EHR experience in fall 2013, whilst EHR experience was more evenly distributed amongst spring 2014 students.

Demographics

| Demographic | Category | Fall 2013 Pilot (N = 100) | Spring 2014 (N = 106) |
| Profession | Nutrition | 16 | 0 |
| | MSN | 29 | 30 |
| | DNP | 6 | 7 |
| | Occupational therapy | 34 | 29 |
| | Pharmacy | 15 | 40 |
| Ethnicity | African American | 2 | 2 |
| | Asian | 6 | 9 |
| | Caucasian | 83 | 86 |
| | Hispanic | 6 | 2 |
| | Native American | 0 | 1 |
| | Other or not specified | 3 | 7 |
| Age | 20–29 | 72 | 75 |
| | 30–39 | 18 | 20 |
| | 40–49 | 8 | 7 |
| | 50–59 | 2 | 2 |
| Gender | Male | 13 | 23 |
| | Female | 87 | 80 |
| | Unspecified | 0 | 3 |
| Student location | < 3 h from KUMC | 86 | 77 |
| | > 3 h from KUMC | 14 | 29 |
| Technology experience self-rating | Novice | 3 | 10 |
| | Advanced beginner | 21 | 19 |
| | Competent (2–3 yr) | 30 | 23 |
| | Proficient (3–5 yr) | 29 | 31 |
| | Expert (5–10 yr) | 17 | 23 |
| EHR experience | First EHR experience | 55 | 34 |
| | Using EHR 6 months or more | 16 | 37 |
| | Using EHR 1 yr or more | 29 | 34 |

Quantitative

Results for individual and team behaviour reported during the student team meetings related to IPEC objectives were positive overall (see Table 4).

Student Evaluation Results

Response percentages are listed in the order: Not at all (0–5%) / Sometimes (25%) / Often (50%) / Most of the time (75%) / Always (95–100%).

| How often did you: | Fall 2013 Pilot (N = 100) | Spring 2014 (N = 106) |
| Speak up/advocate for profession? | 1% / 12% / 7% / 28% / 52% | 1% / 9% / 14% / 36% / 40% |
| Feel empowered to speak freely? | 0% / 3% / 8% / 21% / 68% | 0% / 2% / 13% / 24% / 62% |
| Feel your opinion was valued? | 0% / 0% / 3% / 11% / 86% | 0% / 1% / 5% / 15% / 80% |
| Contribute to the plan of care? | 0% / 5% / 7% / 24% / 64% | 0% / 4% / 15% / 35% / 47% |
| Fulfil your professional role? | 0% / 0% / 3% / 19% / 78% | 0% / 2% / 4% / 27% / 67% |
| Fulfil your team role? | 1% / 5% / 6% / 21% / 67% | 1% / 3% / 10% / 26% / 60% |

| How often did your team: | Fall 2013 Pilot (N = 100) | Spring 2014 (N = 106) |
| Introduce selves/describe profession? | 0% / 3% / 5% / 5% / 87% | 0% / 2% / 9% / 14% / 75% |
| Keep patient’s perspective considered throughout meeting? | 0% / 3% / 4% / 22% / 71% | 0% / 2% / 2% / 22% / 74% |
| Percent of time all opinions were valued | 0% / 0% / 1% / 9% / 90% | 0% / 0% / 1% / 8% / 92% |

For both the fall 2013 pilot (N = 100) and spring 2014 (N = 106) iterations, the majority of students selected ‘most of the time’ (75% or more) or ‘always’ (95% or more) in response to the following questions: [How often did you] speak up/advocate for your profession?; feel empowered to speak freely?; feel your opinion was valued?; contribute to the plan of care?; fulfil your professional role?; and fulfil your team role? (see Table 4).

Assessment of Interprofessional Team Work Using the Performance Assessment of Communication and Teamwork (PACT) Tool during an Online Simulation

| PACT domain | Team structure | Leadership | Situational monitoring | Mutual support | Communication |
| Nine-team overall range | 3–4 | 2–5 | 2–5 | 3–5 | 3–5 |
| Nine-team overall mean | 3.4 | 3.6 | 3.4 | 4.6 | 3.9 |

PACT Scoring: 5 Excellent; 4 Average to excellent; 3 Average; 2 Poor to average; and 1 Poor

For both the fall 2013 pilot (N = 100) and spring 2014 (N = 106) iterations, the majority of students selected ‘most of the time’ or ‘always’ in response to the following questions: [How often did your team] introduce selves/describe profession? and keep patient’s perspective considered throughout meeting? Ninety percent (pilot 2013, N = 100) and 92% (spring 2014, N = 106) of students said that all opinions were valued during the team meeting at all times (always/95–100% of the time).

Likewise, ratings of the simulation’s effectiveness were positive overall. Ninety-eight percent (pilot 2013, N = 100) and 95.3% (spring 2014, N = 106) of students found the case realistic. One hundred percent (pilot 2013, N = 100; spring 2014, N = 106) of students felt their team collaborated to come to a consensus on the plan of care. Eighty-four percent (pilot 2013, N = 100) and 92.5% (spring 2014, N = 106) of students felt that they had adequate training to debrief as a team. Ninety-eight percent (pilot 2013, N = 100) and 96.2% (spring 2014, N = 106) of students felt that it was appropriate to have students lead the team debrief.

Content Analysis. One hundred comments from the pilot and 106 comments from the spring 2014 simulation were received in response to ‘What value does this simulation have for you and your profession?’ The most frequent answers from both iterations concerned ‘learning roles and responsibilities’ (n = 29, n = 34, respectively) and ‘gaining interprofessional practice’ (n = 34, n = 35, respectively). ‘Improving patient care’ (n = 15, n = 7) and ‘improving confidence in communication’ (n = 5, n = 12) were also identified as valuable contributions to the learning experience.

Ninety-nine comments from the pilot and 106 comments from the spring 2014 simulation were received in response to ‘What would you change to improve the effectiveness of this simulation in the future?’ The most common recommendations for change from those in the pilot included ‘deliver the simulation in-person’ (n = 32) and ‘address technology issues’ (n = 28). In the second iteration, fewer students recommended to ‘deliver the simulation in-person’ (n = 17), whilst a similar number commented ‘address technology issues’ (n = 29). Overall, the percentage of comments that identified technology issues improved from 54% in the pilot to 46% in the second iteration. Technology revisions made after the pilot are discussed in the Revise step.

Behavioural Observation

The observed teams effectively demonstrated IP teamwork during the simulation: the majority of teams scored average to excellent across all five domains of the PACT-novice tool (University of Washington, 2011). The most impressive domain was the students’ ability to provide mutual support, where six of nine teams scored excellent. The most common example of mutual support was students from one profession asking questions of another profession about a potential part of the care plan to engage everyone on the team. There were no scores of ‘poor’, and only two teams scored ‘poor to average’ across all five domains. For the complete results of IP teamwork, see Table 5.

Team-led debriefing appeared to be effective based on scores from the modified DASH tool for all nine teams (Brett-Fleegler, Rudolph & Eppich, 2012). Across the four observed elements of the DASH tool, the majority of team scores ranged from mostly effective/good to extremely effective/outstanding. The scores demonstrated the students’ ability to maintain an engaging learning environment, keep the debrief structured and organised, provoke an engaging discussion and sustain good future performance. For the complete results of the student team-led debriefing, see Table 6.

Evaluation of Student Team-led Debriefing with Debriefing Assessment for Simulation in Healthcare (DASH) tool

| DASH element | Element 2: Maintains an engaging learning environment | Element 3: Structures the debriefing in an organised way | Element 4: Provokes engaging discussion | Element 6: Helps trainees achieve or sustain good future performance |
| Nine-team overall range | 5–7 | 6–7 | 4–7 | 5–6 |
| Nine-team overall mean | 5.9 | 6.3 | 5.7 | 5.8 |

DASH Scoring: 7 Extremely effective/outstanding; 6 Consistently effective/very good; 5 Mostly effective/good; 4 Somewhat effective/average; 3 Mostly ineffective/poor; 2 Consistently ineffective/very poor; and 1 Extremely ineffective/detrimental.

Discussion

The online IPOC-SEEDS simulation effectively provided IP students with a collaborative team experience using the EHR that overcame the common barriers of scheduling, physical space and faculty time conflicts. The IPOC-SEEDS simulation objectives, based on IPEC and informatics competencies, were achieved. Simulation effectiveness and value were established as well. Technology utilisation results were adequate but did improve after modifying the technology selected. The simulation provided an experience where students demonstrated IP collaborative skills that they can use in their future practice.

Simulation limitations include that some assessment measures used were newly developed or modified and that students were not assessed for providing socially desirable responses. Additionally, technology can present its own limitations and challenges (Lawlis, Anson & Greenfield, 2014). For example, technologies such as the LMS, EHR and web-conferencing system are crucial, and when they fail to work appropriately, disruption occurs, halting the team’s productivity and increasing frustration levels. Backup plans that include technology support should be developed. Resources should be acquired to help with technology, including administration buy-in, technology staff support and possibly financial support for faculty/facilitator time, purchasing new technology and upkeep, depending on the needs of the simulation.

Online simulation utilisation is not restricted to academic educators (Sutter, Arndt, Arthur, Parboosingh, Taylor & Deutschlander, 2009). Future online IPE simulation developers (hospitals, health departments, universities, conferences) can use the authors’ process steps (plan, develop, implement, assess and revise) as a model for online simulation replication. Developers can modify the process to meet their students’/participants’ needs, simulation content and available resources.

Online delivery of IP simulation has many possible benefits for its users (IP students, IP providers; Lawlis, Anson & Greenfield, 2014). First, online IPE simulation can reduce the common barriers of scheduling conflicts, limited physical space and limited faculty/facilitator time. For example, online technology can assist participants in coordinating and scheduling team meeting times that fit their schedules. Online simulation eliminates the need for large amounts of physical space for collaboration and thereby also eliminates the need to reserve meeting rooms months in advance. Online self-directed participant simulations can reduce the faculty/facilitator time spent coordinating multiple teams of participants and leading multiple simulation debriefs. Second, online technologies can support users in actively engaging in IP team collaboration and EHR utilisation to meet academic or provider accreditation requirements (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012). Third, the online platform provides access for distance participants to take part in effective IPE simulations in which they could not otherwise participate due to long travel distances (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012).

Funding

Interprofessional New Clinical Investigator Research Grant and the Kansas Reynolds Program in Aging Interprofessional Faculty Scholars.
