‘Interprofessional collaboration (IPC) occurs when two or more professions work together to achieve common goals and is often used as a means for solving a variety of problems and complex issues’ (Green & Johnson, 2015, p. 1). IPC practice models are being used in healthcare settings, along with informatics, to improve patient care outcomes (Health Resources and Services Administration, 2013; World Health Organization (WHO), 2010; Cox & Naylor, 2013; Health Force Ontario, 2013; Hopkins, 2010; Cuff, 2013; Manos, 2012; Institute of Medicine, 2012). Future health providers need interprofessional (IP) communication and teamwork training to work collaboratively and provide patient-centred care (Hopkins, 2010). To qualify as interprofessional education (IPE), these activities must also involve students from two or more professions learning about, from and with each other to enable effective collaboration and improve health outcomes (WHO, 2010; Interprofessional Education Collaborative, 2011). Development and implementation of high-quality activities that achieve the core Interprofessional Education Collaborative (IPEC) competencies, which are required by accreditors of US health professions programmes, are important (Begley, 2009). However, many barriers to in-person IPE have been documented consistently in the literature. The most common barriers to IPE include logistical and resource issues such as scheduling conflicts, geographical location/physical space and faculty time (Begley, 2009; Oandasan & Reeves, 2005; Lawlis, Anson & Greenfield, 2014).
In addition to overcoming barriers, educators must also select the best instructional design methods for IPE. Two common instructional methods in the IPE literature are online module learning and in-person simulation (Abu-Rish et al., 2012). IPE using online modules as part of a stand-alone blended activity or course has been reported with mixed results regarding educator and student reactions and attitudes (McKenna, Palermo, Molloy, Williams & Brown, 2014; Blue et al., 2010). In-person IPE simulations typically have favourable effects on learner satisfaction, reaction, perceived authenticity and attitudes (Shoemaker, Beasley, Cooper, Perkins, Smith & Swank, 2011; Zhang, Thompson & Miller, 2011). In addition, a recent example of combining online modules and online simulation principles using virtual patient technology demonstrated feasibility and positive student reactions (Shoemaker, Platko, Cleghorn & Booth, 2014). More examples of combining online IPE methods are needed.
The purpose of this article is twofold: first, to describe the process steps used to create the IPOC-SEEDS simulation; and second, to report assessment results on its effectiveness, value and technology utilisation from two deliveries.
Over 10 months, a team of IP faculty and students met to create the IPOC-SEEDS simulation. Figure 1 outlines the five-step process that was used to plan, develop, implement, assess and revise the IPOC-SEEDS simulation. A lead faculty member was tasked with coordinating and scheduling team meetings, updating team members and identifying work to be completed.
The first step, plan, began with assembling an IP team including faculty from advanced practice nursing, dietetics and nutrition, health information management, occupational therapy and pharmacy; staff from informatics, teaching technology and electronic health record experts; and graduate assistants. A review of current IPE simulation literature was conducted to inform the development of the simulation. The team identified learning objectives for the IPOC-SEEDS simulation to address selected core competencies of IPEC targeting domains of teams and teamwork (TT4, TT5, TT8) and roles and responsibilities (RR4; Interprofessional Education Collaborative Expert Panel, 2011). An additional simulation objective identified by the team was for all involved professions to navigate the electronic health record (Quality and Safety Education for Nurses, n.d.). Study objectives were to assess the effectiveness, value and technology used during the IPOC-SEEDS simulation. The planning step also included determination of the content to be developed and technology systems to be used for delivery.
The second step of the process, develop, involved development of content, assessment measures and technology utilisation (see Figure 1). Content development included a patient case, uni-professional audio patient encounters, team-led debriefing principles and assessment measures. The available technology used to develop the simulation included an electronic health record (EHR), a learning management system (LMS) and a web-conferencing system. Specific development tasks for each focus area are cross-walked in Table 1 with the review method used during each development step. The developmental tasks provide an idea of workload for each focus area. Review of these tasks mostly consisted of multiple iterations of IP development team feedback or peer review. Development took 10–30 h per week of the lead faculty’s time during the first year and approximately 2–10 h monthly for all others involved.
Table 1. IPOC-SEEDS Simulation Development

| Focus area | Development tasks | Review |
|---|---|---|
| Patient case | Expand a standardised patient case* (available from the National League for Nursing) to an interprofessional case applicable to all professions involved. The four-day case included multiple IP encounters, orders, laboratory results, vital signs, imaging, medication administration record and intake and output findings. | Multiple iterations of IP faculty and informatics/technology expert feedback (IP development team) |
| Audio patient encounters | Develop a patient–professional visit script for each of the four professions with actor/patient collaboration. Record, save and securely post audio of each uni-professional patient–professional encounter (3–8 min each) on the LMS. | Multiple iterations of IP faculty and patient expert feedback; successful completion of task with technology expert assistance |
| Team-led debriefing | Develop a presentation on how to team debrief after the simulation (10 min). Develop student-led questions to be used during the team debrief. | Faculty peer review; modified from evidence with faculty expert feedback |
| Assessment measures | Develop a simulation evaluation survey to address learning/study objectives. Select the PACT-novice tool*** to review the team encounters. Select and modify the DASH tool** to assess the student-led team debriefing. | Modified established competencies through multiple iterations of IP faculty and technical expert feedback; ROL to find and modify appropriate tools, IP faculty feedback |
| Electronic health record (EHR) | Determine appropriate EHR documentation forms for each profession. Build the patient case in the EHR, including documentation by all providers, orders, vital signs, medication administration record, imaging and flow sheets. Develop student and faculty instructions for accessing and navigating the EHR. Develop an IP plan of care team documentation form. | Informatics expert assessment; informatics and HIM/EHR experts consulted to create the EHR; informatics and EHR expert consulted; informatics expert researcher consulted |
| Learning management system (LMS) | Develop the LMS simulation shell (welcome, simulation directions, 3-week content, evaluation, access to EHR, web-conferencing and e-mailing of team and faculty). Load students and faculty into the simulation shell. | Faculty peer review; LMS technology expert consulted |
| Synchronous web-conferencing system | Create directions to access, use and record in the web-conferencing system. | Technology expert consulted |

ROL: review of literature.
*Boese T. Advancing Care Excellence for Seniors unfolding case on Red Yoder. New York: National League for Nursing; 2011. Accessed 2013.
**Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc. 2012; 7(5): 288–94. doi:10.1097/SIH.0b013e3182620228.
***University of Washington. Performance Assessment of Communication and Teamwork Novice Observer Form.
The third step of the process, implement, occurred over three weeks (see Figure 1). The first week implementation activities involved introducing students to IP teamwork and technology training about an EHR, an LMS and a web-conferencing system (see Figure 2).
Students were also assigned to IP teams and used secure email to schedule their own team meeting times for week 3. During the second week, students individually reviewed the patient’s chart in the EHR, listened to the uni-professional audio patient encounter recording and documented as the provider in the EHR. During the third and final week, students met synchronously in IP teams to discuss and determine an IP collaborative care plan and document their team plan of care in the EHR. Afterwards, the students managed the team-led debrief of the simulation, without faculty, discussing team functioning and areas for improvement. Finally, the students anonymously and voluntarily completed a short evaluation of the simulation in REDCap, a data capture system for research (Harris, Taylor, Thielke, Payne, Gonzalez & Conde, 2009). This project was approved by the institutional review board before implementation.
The fourth step, assess, included assessment of the simulation (see Figure 1). Table 2 displays the objectives along with how they were assessed. The modified PACT-novice measure was used to assess the team meeting videos. The modified DASH measure was used to assess debriefing responses. The rest of the implementation was evaluated by student feedback on the IPOC-SEEDS evaluation or faculty review of student team content. Formal and informal student and faculty feedback was also gathered throughout to improve future simulation delivery.
Table 2. IPOC-SEEDS Simulation Objectives with Respective Assessments

| Domain | Objective | Assessment |
|---|---|---|
| 1 (RR4) | Identify the roles and responsibilities for yourself and other care providers and how the team works together to provide care.* | PACT measure: assessed during the IP team meeting; scores on five domains. IPOC-SEEDS evaluation: How often did your team introduce themselves and describe their profession? How often did you fulfil your professional role? Did you feel your team collaborated to come to a consensus on the plan of care? |
| 2 (TT4) | Integrate the knowledge and experience of other professions – appropriate to the specific care situation – to inform care decisions, whilst respecting patient and community values and priorities/preferences for care.* | PACT measure: assessed during the IP team meeting; scores on five domains. IPOC-SEEDS evaluation: How often did you contribute to the plan of care? How often did your team keep the patient’s perspective considered throughout the meeting? |
| 3 (TT5) | Apply teamwork principles that support collaborative practice and team effectiveness.* | PACT measure. IPOC-SEEDS evaluation: How often did you feel empowered to speak freely? How often did you feel your opinion was valued? How often did you fulfil your professional role? |
| 4 (TT8) | Reflect on individual and team performance for individual, as well as team, performance improvement.* | DASH measure: ratings across four areas (engaging learning environment, structured and organised debrief, engaging discussion, good future performance); scores on debriefing effectiveness. IPOC-SEEDS evaluation: How often did you speak up/advocate for your profession? How often did you contribute to the plan of care? How often did you fulfil your professional role? How often did you fulfil your team role? |
| — | Navigate the electronic health record.** | IPOC-SEEDS evaluation: number of first-time EHR users; percentage of students who completed the individual documentation assignment. |
| — | Was the IPOC-SEEDS simulation effective? | IPOC-SEEDS evaluation: Was the case realistic? Did your team collaborate and come to a consensus on the plan? Did you have enough training to debrief as a team without faculty? |
| — | What was the value of the simulation? | IPOC-SEEDS evaluation: What value does this simulation have for you and your profession? |
| — | Did technology utilisation and delivery work? | IPOC-SEEDS evaluation: What would you change to improve the effectiveness of this simulation in the future? |

*Interprofessional Education Collaborative Expert Panel. Core competencies for interprofessional collaborative practice: Report of an expert panel. Washington, DC: Interprofessional Education Collaborative; 2011.
**Quality and Safety Education for Nurses. QSEN competencies. QSEN.org (n.d.). Accessed January 14, 2013.
The final step, revise, focused on the changes in IPE and IP technology based on the previous results (see Figure 1). Revisions were based on assessment results and feedback from students and faculty. Examples of content revisions include clarifying student directions and reducing the length of the IPOC-SEEDS evaluation survey. Examples of technology revisions include changing the web-conferencing system to another commercially available platform with a quicker connection time and having information technology experts available in class during the first week to aid in troubleshooting student issues.
The IPOC-SEEDS was a student-directed online simulation of the care-planning process for a patient transitioning from one level of care to another, such as from the hospital to a rehabilitation facility. The simulation involved an IP team meeting and an EHR developed for academic purposes. Students scheduled and coordinated their IP teams while progressing through three weeks of content and simulation activities (see Figure 2).
Students from advanced practice nursing, dietetics and nutrition, occupational therapy and pharmacy participated in the fall 2013 pilot (N = 100) and spring 2014 (N = 106) simulations, with both sets of data presented here (see Table 3). A majority of students were Caucasian (83%, 81%), female (87%, 75%) and in their 20s (72%, 71%). Fourteen percent (fall 2013) and 27% (spring 2014) of students participated online from a location at least 3 h away from the Midwest university. Most students rated their technology experience as ‘competent, 2–3 years’ or ‘proficient, 3–5 years’. The simulation was predominantly students’ first EHR experience in fall 2013, whereas EHR experience was more evenly distributed amongst spring 2014 students.
Table 3. Demographics

| Demographic | Fall 2013 Pilot (N = 100) | Spring 2014 (N = 106) |
|---|---|---|
| Profession | | |
| Nutrition | 16 | 0 |
| MSN | 29 | 30 |
| DNP | 6 | 7 |
| Occupational therapy | 34 | 29 |
| Pharmacy | 15 | 40 |
| Race/ethnicity | | |
| African American | 2 | 2 |
| Asian | 6 | 9 |
| Caucasian | 83 | 86 |
| Hispanic | 6 | 2 |
| Native American | 0 | 1 |
| Other or not specified | 3 | 7 |
| Age | | |
| 20–29 | 72 | 75 |
| 30–39 | 18 | 20 |
| 40–49 | 8 | 7 |
| 50–59 | 2 | 2 |
| Gender | | |
| Male | 13 | 23 |
| Female | 87 | 80 |
| Unspecified | 0 | 3 |
| Distance | | |
| < 3 h from KUMC | 86 | 77 |
| > 3 h from KUMC | 14 | 29 |
| Technology experience | | |
| Novice | 3 | 10 |
| Advanced beginner | 21 | 19 |
| Competent (2–3 yr) | 30 | 23 |
| Proficient (3–5 yr) | 29 | 31 |
| Expert (5–10 yr) | 17 | 23 |
| EHR experience | | |
| First EHR experience | 55 | 34 |
| Using EHR 6 months or more | 16 | 37 |
| Using EHR 1 yr or more | 29 | 34 |
Results for individual and team behaviour reported during the student team meetings related to IPEC objectives were positive overall (see Table 4).
Table 4. Student Evaluation Results

Response scale: not at all (0–5%); sometimes (25%); often (50%); most of the time (75%); always (95–100%). Cells show fall 2013 pilot (N = 100) / spring 2014 (N = 106).

| How often did you… | Not at all | Sometimes | Often | Most of the time | Always |
|---|---|---|---|---|---|
| Speak up/advocate for profession? | 1% / 1% | 12% / 9% | 7% / 14% | 28% / 36% | 52% / 40% |
| Feel empowered to speak freely? | 0% / 0% | 3% / 2% | 8% / 13% | 21% / 24% | 68% / 62% |
| Feel your opinion was valued? | 0% / 0% | 0% / 1% | 3% / 5% | 11% / 15% | 86% / 80% |
| Contribute to the plan of care? | 0% / 0% | 5% / 4% | 7% / 15% | 24% / 35% | 64% / 47% |
| Fulfil your professional role? | 0% / 0% | 0% / 2% | 3% / 4% | 19% / 27% | 78% / 67% |
| Fulfil your team role? | 1% / 1% | 5% / 3% | 6% / 10% | 21% / 26% | 67% / 60% |

| How often did your team… | Not at all | Sometimes | Often | Most of the time | Always |
|---|---|---|---|---|---|
| Introduce selves/describe profession? | 0% / 0% | 3% / 2% | 5% / 9% | 5% / 14% | 87% / 75% |
| Keep patient’s perspective considered throughout meeting? | 0% / 0% | 3% / 2% | 4% / 2% | 22% / 22% | 71% / 74% |
| Percent of time all opinions were valued? | 0% / 0% | 0% / 0% | 1% / 1% | 9% / 8% | 90% / 92% |
For both the fall 2013 pilot (N = 100) and spring 2014 (N = 106), the majority of students selected ‘most of the time’ (75% or more) or ‘always’ (95% or more) in response to the following questions: [How often did you] speak up/advocate for your profession?; feel empowered to speak freely?; feel your opinion was valued?; contribute to the plan of care?; fulfil your professional role?; and fulfil your team role? (see Table 4).
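As an illustrative sketch only (the authors’ actual analysis pipeline is not described here; the raw responses below are hypothetical, chosen to mirror one fall 2013 column), percentage distributions like those in Table 4 can be tabulated from individual Likert-scale responses:

```python
from collections import Counter

# Five-point frequency scale used on the IPOC-SEEDS evaluation
SCALE = ["Not at all", "Sometimes", "Often", "Most of the time", "Always"]

def likert_distribution(responses):
    """Percentage of respondents choosing each scale option (rounded)."""
    counts = Counter(responses)
    n = len(responses)
    return {option: round(100 * counts[option] / n) for option in SCALE}

# Hypothetical raw responses to "How often did you fulfil your team role?"
responses = (["Always"] * 67 + ["Most of the time"] * 21 + ["Often"] * 6
             + ["Sometimes"] * 5 + ["Not at all"] * 1)

print(likert_distribution(responses))
```

With N = 100 respondents, each percentage simply equals the response count, which makes the published tables easy to verify.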
Table 5. Assessment of Interprofessional Teamwork Using the Performance Assessment of Communication and Teamwork (PACT) Tool during an Online Simulation

PACT scoring: 5 = excellent; 4 = average to excellent; 3 = average; 2 = poor to average; 1 = poor.

| Nine-team overall scores | Team Structure | Leadership | Situational Monitoring | Mutual Support | Communication |
|---|---|---|---|---|---|
| Range | 3–4 | 2–5 | 2–5 | 3–5 | 3–5 |
| Mean | 3.4 | 3.6 | 3.4 | 4.6 | 3.9 |
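For illustration, the per-domain range and mean reported for the nine teams can be reproduced from raw rubric scores. The scores below are hypothetical (not the authors’ observed data), chosen so the summary statistics match two of the published PACT domains:

```python
from statistics import mean

# Hypothetical PACT-novice scores (1 = poor ... 5 = excellent) assigned to
# nine teams on two of the five domains; not the authors' observed ratings.
domain_scores = {
    "Team Structure": [3, 3, 3, 4, 3, 4, 3, 4, 4],
    "Mutual Support": [5, 5, 4, 5, 5, 4, 5, 3, 5],
}

for domain, scores in domain_scores.items():
    print(f"{domain}: range {min(scores)}-{max(scores)}, "
          f"mean {round(mean(scores), 1)}")
```

The same computation applies to the DASH element scores summarised in Table 6.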
For both the fall 2013 pilot (N = 100) and spring 2014 (N = 106), the majority of students selected ‘most of the time’ or ‘always’ in response to the following questions: [How often did your team] introduce selves/describe profession? and keep patient’s perspective considered throughout meeting? Ninety percent (fall 2013 pilot) and 92% (spring 2014) of students said that all opinions were valued during the team meeting at all times (always/95–100% of the time).
Likewise, ratings of the simulation’s effectiveness were positive overall. Ninety-eight percent (fall 2013 pilot, N = 100) and 95.3% (spring 2014, N = 106) of students found the case realistic. One hundred percent of students in both cohorts felt their team collaborated to come to a consensus on the plan of care. Eighty-four percent (fall 2013) and 92.5% (spring 2014) of students felt that they had adequate training to debrief as a team. Ninety-eight percent (fall 2013) and 96.2% (spring 2014) of students felt that it was appropriate to have students lead the team debrief.
Ninety-nine comments from the pilot and 106 comments from the spring 2014 simulation were received in response to ‘What would you change to improve the effectiveness of this simulation in the future?’ The most common recommendations from pilot respondents were ‘deliver the simulation in-person’ (n = 32) and ‘address technology issues’ (n = 28). In the second iteration, fewer students recommended delivering the simulation in-person (n = 17), whilst a similar number commented on technology issues (n = 29). Overall, the percentage of students identifying technology issues improved from 54% in the pilot to 46% in the spring 2014 simulation. Technology revisions made after the pilot are discussed in the Revise step.
The observed teams effectively demonstrated IP teamwork during the simulation, with the majority of teams scoring in the average-to-excellent range across all five domains of the PACT-novice tool (Brett-Fleegler, Rudolph & Eppich, 2012). The most impressive domain was mutual support, where six of the nine teams scored excellent. The most common example of mutual support was students from one profession asking another profession questions about a potential part of the care plan to engage everyone on the team. There were no scores of ‘poor’, and only two teams scored ‘poor to average’ across all five domains. For the complete results of IP teamwork, see Table 5.
Team-led debriefing appeared to be effective based on scores from the modified DASH tool for all nine teams (Brett-Fleegler, Rudolph & Eppich, 2012). Across the four observed elements of the DASH tool, the majority of team scores ranged from mostly effective/good to extremely effective/outstanding. The scores demonstrated the students’ ability to maintain an engaging learning environment, keep the debrief structured and organised, provoke an engaging discussion and sustain good future performance. For the complete results of the student team-led debriefing, see Table 6.
Table 6. Evaluation of Student Team-led Debriefing with the Debriefing Assessment for Simulation in Healthcare (DASH) Tool

DASH scoring: 7 = extremely effective/outstanding; 6 = consistently effective/very good; 5 = mostly effective/good; 4 = somewhat effective/average; 3 = mostly ineffective/poor; 2 = consistently ineffective/very poor; 1 = extremely ineffective/detrimental.

| Nine-team overall scores | Element 2: Maintains an engaging learning environment | Element 3: Structures the debriefing in an organised way | Element 4: Provokes engaging discussion | Element 6: Helps trainees achieve or sustain good future performance |
|---|---|---|---|---|
| Range | 5–7 | 6–7 | 4–7 | 5–6 |
| Mean | 5.9 | 6.3 | 5.7 | 5.8 |
The online IPOC-SEEDS simulation effectively provided IP students with a collaborative team experience using the EHR that overcame the common barriers of scheduling, physical space and faculty time conflicts. The IPOC-SEEDS simulation objectives, based on IPEC and informatics competencies, were achieved. Simulation effectiveness and value were established as well. Technology utilisation results were adequate but did improve after modifying the technology selected. The simulation provided an experience where students demonstrated IP collaborative skills that they can use in their future practice.
Simulation limitations include that some assessment measures were newly developed or modified and that students were not assessed for providing socially desirable responses. Additionally, technology can present its own limitations and challenges (Lawlis, Anson & Greenfield, 2014). For example, technologies such as the LMS, EHR and web-conferencing system are crucial; when they fail to work appropriately, the disruption halts the team’s productivity and increases frustration. Back-up plans that include technology support should be developed. Resources to support the technology should also be secured, including administration buy-in, technology staff support and, depending on the needs of the simulation, financial support for faculty/facilitator time, new technology purchases and upkeep.
Online simulation utilisation is not restricted to academic educators (Sutter, Arndt, Arthur, Parboosingh, Taylor & Deutschlander, 2009). Future online IPE simulation developers (hospitals, health departments, universities, conferences) can use the authors’ process steps (plan, develop, implement, assess and revise) as a model for online simulation replication. Developers can modify the process to meet their students’/participants’ needs, simulation content and available resources.
Online delivery of IP simulation has many possible benefits for its users (IP students, IP providers; Lawlis, Anson & Greenfield, 2014). First, online IPE simulation can reduce the common barriers of scheduling conflicts, limited physical space and limited faculty/facilitator time. For example, online technology can assist participants in coordinating and scheduling team meeting times that fit their schedules. Online simulation eliminates the need for large amounts of physical space for collaboration, and thereby the need to reserve meeting rooms months in advance. Online self-directed participant simulations can reduce the faculty/facilitator time spent coordinating multiple teams of participants and leading multiple simulation debriefs. Second, online technologies can support users in actively engaging in IP team collaboration and EHR utilisation to meet academic or provider accreditation requirements (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012). Third, the online platform provides access for distance participants to effective IPE simulations in which they could not otherwise participate due to long travel distances (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012).
Funding: Interprofessional New Clinical Investigator Research Grant and the Kansas Reynolds Program in Aging Interprofessional Faculty Scholars.