Introduction

After the onset of the coronavirus pandemic, universities all over the world had to move their operations into online spaces. The move also affected assessment practices, particularly where the locus had, pre-pandemic, been on campus or at teaching or examination centres. By upscaling and accelerating the adoption of digital assessment practices in online or distance education, the pandemic gave educators the opportunity to rethink assessment processes as an integral part of the authentic digital life experience for students and staff (Hatzipanagos et al., 2020).

These changes have accelerated the transition to a ‘new normal’ regime of assessing students online. The abrupt transition to online assessment immediately posed important questions about the quality of the student experience of that transition. The priority in the sector was to ensure that fundamental principles of assessment, including integrity, authenticity, access and the provision of a secure environment, were supported in the design and implementation of assessments, so that no assessed student was disadvantaged in accessing digital learning opportunities. The Covid-related transition had a catalytic impact on revising assessment strategies in relation to the content and format of assessment and the right timing of the transition. It also posed three related questions:

What measures would institutions need to put in place to ensure that the student experience of assessment is positive and unaffected by technology, format or location?

What was the impact of the transition on pedagogy of assessing student performance?

Was it the right time, taking the outcome of evaluations into account, to move to online assessment practices for the longer term?

The paper draws on a longitudinal evaluation at the University of London that collected data about a transition of this type. The aim of our evaluation was to generate understanding of the move to online assessment, primarily from the perspective of the experience of students and staff. Students at the University of London are distributed across over 180 countries, studying at a distance but generally attending local examination centres, where they take paper-based, fixed-time examinations (the university’s established standard examination practice). During the COVID-19 pandemic, students had to move to online assessment in place of conventional examinations, and approximately 110,000 exam events were affected by the closure of exam centres.

In late March 2020 the University communicated to all students that examinations in 2020 would have to move online, as conventional examinations in examination centres would not be possible due to COVID-19 restrictions. Students also received communication from programme teams about the modes and formats of assessment that would replace conventional examinations. A range of examination formats was adopted differentially by Programme Boards, including (a) online, open-book exams, with a paper to be downloaded from the virtual learning environment (VLE) and returned within a prescribed time, and (b) digitally proctored exams. The submission window varied between programmes, from a few hours (similar to the time allocated for a traditional exam) to 7 days in some cases. Text-matching software was used on the majority of examination submissions to ensure that academic integrity considerations were respected.

The response was made in emergency mode, and in the event 38,000 students, or 93% of eligible students, sat examination events, approximately 8% more than in 2019. Embracing a rapid response to the impact of COVID-19 on students’ ability to participate in scheduled exams at a local examination centre during the summer of 2020, the University provided over 100,000 exam opportunities online (Amrane-Cooper, 2020).

The paper seeks to evaluate the impact of this transition on the basis of empirical evidence that has been collected and analysed. In addition, it explores the implications for designing end-of-term examinations. Our claim is that the extraordinary circumstances higher education went through during the pandemic had silver linings: the transition to online assessment encouraged universities to innovate and to review assessment practices.

In the following section we explore the theoretical background of this research.

What Research Says about Exam-Based Assessment
Assessment theories and online assessment in practice

The dominant rationale for final examinations is to authenticate learning, that is, to ensure that the student whose name is on the student register is the student who is completing the assessed work (Butler-Henderson & Crawford, 2020). Research has highlighted that all assessment activities (including exams) should be governed by the same key principles: authenticity, accessibility, appropriate automation, the provision of continuous/periodic assessment opportunities and security (JISC, 2020a, b). Table 1 elaborates on the meaning of these principles, whereas Table 2 presents an evaluation of the exam format against them.

Table 1: Assessment principles

Authentic: Preparing the learner for what they are going to do next, testing knowledge and skills in a more realistic, contextualised and motivating way and meeting employer needs.

Accessible: Designed throughout to be usable by everyone to the greatest possible extent, including those who have special needs, for example, those who have a long-term disability, a short-term injury or a mental health challenge.

Appropriately automated: Easing teachers’ marking and feedback workload, and providing quicker, more detailed and more actionable feedback for students.

Continuous/periodic: Providing assessment opportunities that are rich in practice and reflect the fact that students need to be capable of lifelong learning, to adapt to changes in the world of work and across their lives, rather than succeeding at one high-stakes, high-stress exam.

Secure: Ensuring that the right student is taking the right assessment and that the work they are submitting is their own and abides by the rules.

Table 2: Assessment principles and exams

Authentic (×/√): Exams traditionally do not score highly on all related aspects, particularly when they test reproduction of knowledge from textbooks. However, questions in an exam can be designed to be authentic.

Accessible (×): Accessibility may be compromised in exam centres where students’ needs are not catered for. Equally, students with special needs can be disadvantaged in an online exam where their needs have not been considered in the design of the assessment.

Appropriately automated: Online exams offer the opportunity to embed automation to support marking and feedback. Automation may be difficult if the exam includes open-ended questions, for instance.

Continuous/periodic (×): Participating in high-stakes, high-stress exams does not comply with the principle of continuous assessment.

Secure: Personation and invigilation are issues in exams, whether online or in exam centres. However, these can be designed out by setting up an appropriate online environment and/or through assessment design.


Recent research has explored the relationship between students’ performance and preferences when using online and offline assessments in order to improve digital assessment practice and student motivation and engagement (Abrar & Mumtaz, 2017; Chase et al., 2017). Attitudes to formative and summative assessment in open and distance learning environments have also been explored (Hatzipanagos, 2009), since experience of appropriate assessment, good teaching, clear goals and standards, good materials and good tutoring are all positively associated with increased attainment for distance learning students (Ertl et al., 2008).

Authenticity of assessment and the exam format

End-of-term assessments taking place in examination centres face several challenges in demonstrating how authenticity is embedded in the assessment process. From a process point of view, technology is left at the door of the assessment centre: students are asked to use pen/pencil and paper, a reality far from their daily set-up of using a laptop or other mobile devices. This may also contrast with other pre-end-of-term assessment, for example formative or summative coursework, which is often submitted online. In addition, different subjects have their own complexities when it comes to digital assessment, whether that is the need for large amounts of storage space in some visual or media subjects or the difficulty of showing ‘working out’ online in STEM exams.

University commitments to the development of authentic assessment environments require a radical revision of current examination practice to incorporate real-life learning processes and unstructured problem-solving (JISC, 2020a; Williams & Wong, 2009).

Research has identified some challenges in enhancing the authenticity aspect of examinations. These relate to technology barriers that might present issues in accessing materials and submitting work; to exam questions that call for fact-based responses rather than unique case studies, scenarios or comparative analysis (Pauli et al., 2020); and to promoting equality, diversity and inclusivity for students.

A criticism of examinations is that in their context formative and summative assessment may be perceived as mutually exclusive. Too much emphasis on a limited number of high-stakes assessment points places stress on individuals and institutional processes and reduces the time and effort that can be put into formative opportunities (JISC, 2020a). Economic considerations also frequently determine assessment practices: continuous/periodic and formative assessment increase staff workloads, for example.

Exams and academic integrity

The shift to online assessment and the employment of online invigilation and/or proctoring systems have generated debates on academic integrity (Farrell, 2020), highlighting good practice. There seem to be two dominant threads in such debates, which are far from complementary: one involves promoting creative design of authentic assessment and clear guidelines to students about expectations around referencing and plagiarism; the other provides technological and practical safeguards to protect academic integrity, such as moderation of marking, text-matching software and the use of mechanisms, for example vivas, to verify student academic work.
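To make the second thread concrete, the following minimal sketch (in Python) illustrates the basic idea behind text-matching: comparing submissions by the overlap of their word n-grams (‘shingles’). The shingle size and flagging threshold here are illustrative assumptions, not the workings of any particular commercial tool.

```python
def shingles(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Split an answer into overlapping n-word shingles.

    Lower-cases and splits on whitespace; production tools also
    normalise punctuation, inflection and word order.
    """
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str, n: int = 5) -> float:
    """Jaccard overlap of shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

answer_1 = "the supply curve shifts right when production costs fall across the industry"
answer_2 = "the supply curve shifts right when production costs fall so prices decline"

# A high overlap score flags a pair of submissions for human review;
# it is not, by itself, evidence of misconduct.
score = similarity(answer_1, answer_2)
if score > 0.25:  # threshold is an illustrative assumption
    print(f"flag for review: {score:.2f}")
```

In practice, flagged pairs are referred to markers for judgement, consistent with the moderation-of-marking safeguard noted above.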

Access to multiple question banks through effective quiz design and delivery is a mechanism to reduce the propensity to cheat, by reducing the stakes through multiple delivery attempts (Sullivan, 2016). The research on cheating has focused mainly on technical challenges rather than on ethical and social issues; the latter have been researched in more depth for traditional assessment methods, for example by Wright (2015).
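The question-bank mechanism Sullivan (2016) describes can likewise be sketched in a few lines. In the hypothetical example below, each student and attempt draws a different but reproducible selection from topic banks, which lowers the value of sharing any single paper; the bank contents and seeding scheme are invented for illustration.

```python
import random

# Hypothetical banks: each topic maps to a pool of interchangeable items.
QUESTION_BANKS = {
    "topic_1": ["t1_q1", "t1_q2", "t1_q3", "t1_q4"],
    "topic_2": ["t2_q1", "t2_q2", "t2_q3"],
    "topic_3": ["t3_q1", "t3_q2", "t3_q3", "t3_q4", "t3_q5"],
}

def draw_quiz(student_id: str, attempt: int, per_topic: int = 1) -> list[str]:
    """Draw a per-student, per-attempt random selection of questions.

    Seeding on (student_id, attempt) keeps each paper reproducible for
    later moderation while differing between students and attempts.
    """
    rng = random.Random(f"{student_id}:{attempt}")
    paper: list[str] = []
    for _topic, bank in sorted(QUESTION_BANKS.items()):
        paper.extend(rng.sample(bank, k=min(per_topic, len(bank))))
    return paper

# Two students sitting the same quiz receive different, auditable papers.
print(draw_quiz("student_001", attempt=1))
print(draw_quiz("student_002", attempt=1))
```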

Most responses to online examinations use bring-your-own-device models, where laptops are brought to traditional lecture theatres or exam centres, software is used on personal devices in any location desired, or prescribed devices are used in a physical setting. The primary goal of each is to balance the authentication of students with maintaining the integrity and value of achieving learning outcomes. Some scholars recommend employing online synchronous assessments as an alternative to traditional proctored examinations, while maintaining the ability to manually authenticate (Chao et al., 2012).

Examinations and regulatory frameworks

Recent debates on assessment in the higher education sector have revolved around changes and uncertainty for the future, mainly investigating whether the pandemic has marked the beginning of the end for in-person, fixed-time, paper-based assessment (Emerge Education & JISC, 2020; JISC, 2020a). Universities saw a need for a balance between mitigating student stress and meeting the demands for academic rigour, both to satisfy student expectations and to achieve the recognition of professional, statutory and regulatory bodies (PSRBs), which is critical to the career paths of many students (JISC, 2020b).

In higher education, the use of an end-of-course summative examination as a form of validating knowledge has been informed significantly by accreditation bodies. In-country and PSRB regulators often place requirements on universities to include examinations in approved and recognised programmes. For instance, the American Bar Association has required a final course examination as a condition for programmes to remain accredited (Sheppard, 1997). Other PSRBs might require proctored exams (Pauli et al., 2020). PSRBs’ recognition of online assessment may be problematic and, similarly, regional regulators may not approve of online assessment.

Student attitudes to online assessment

Debates in this particular area have considered whether students prefer online examinations to traditional exam centre-based ones and whether students benefit from an end-of-term assessment if it is not preceded by some form of continuous (or, more accurately, periodic) assessment. Butler-Henderson and Crawford’s (2020) systematic review of online examinations indicates that students seem to prefer online examinations to paper exams, appreciating the flexibility and opportunities afforded by online exams.

Methodology
Aims and objectives

The project this paper reports on aimed to collect data about, and generate understanding of, this transition to online assessment, primarily from the perspective of the experience of the students affected, and to answer a fundamental question: What was the impact of the transition to online assessment on the experience of students and on student outcomes? The objectives were as follows:

Analyse the views of respondents, gathered via an online survey, on their engagement with online assessment;

Explore issues arising from the survey results by interviewing students;

Investigate the rationale for selecting and establishing new formats of assessment and why these particular formats have been chosen;

Analyse the views of examiners (markers) on their experience of the transition to online assessment.

Evaluation focus

We focused our evaluation on four key areas: student behaviours, student sentiment, student outcomes, and operational issues (Figure 1).

Figure 1: Areas under investigation.

Online survey

A survey was disseminated to a target audience estimated at 38,000, comprising all students who had booked to take exams that summer. A total of 8,595 responses were received while the survey was live, an overall response rate of 29.5%. Figure 2 provides an overview of the demographics of the students who participated in the survey.

Figure 2: Demographics of student respondents.

The survey and the interviews explored students’ understanding of the need to change the traditional examinations; whether they received adequate information; whether they sat all examinations as planned; which format their exam took; whether they had reliable IT connectivity; their overall experience; whether their performance was affected; whether academic offences were more likely under the circumstances; whether the recognition of their qualification had been affected; and whether they would like the alternative form of assessment to continue in the future.

Student interviews

Semi-structured interviews were conducted with 22 students (14 undergraduates and 8 postgraduates) to elaborate on issues identified in the student survey.

Programme director interviews

Semi-structured interviews were conducted with a sample of 12 programme directors (undergraduate: 3, postgraduate: 8, foundation: 1) to examine their views on the choices made and explore their perspectives on the quality of what was achieved.

Examiner attitudes from survey

A survey with a target audience of 621 examiners was disseminated. A total of 176 responses were received, with an overall response rate of 27%. As the examiners completed their marking online, rather than marking paper exam scripts couriered to them, which was the dominant practice until then, we had the opportunity to explore the experience of the academic staff undertaking the marking. The following section describes the findings of our investigation.

Findings

With exams completed, approximately 93% of students were able to participate in online assessment. On average, this was a higher rate of exam engagement than under the traditional model of unseen exams taken at the University’s 600 examination centres (between 83% and 87% in 2019). The assessments evaluated were drawn from undergraduate and postgraduate subjects across the range of disciplines in which online examinations took place. Findings reveal that students felt there were benefits in moving to online assessment and that they were given the opportunity to demonstrate their learning despite the pandemic.

The intention was also to examine differences in the format of examinations. Three types of online examination were used: proctored exams, fixed-time unseen closed-book exams and unseen open-book exams. In open-book exams, the questions were changed to accommodate the open-book format and students were allowed to refer to notes and other course materials while taking the exam, with a longer response time (24 hours or several days). Student behaviour data were easily accessible via the VLEs, and in addition to strong uptake of online exams, trends were identified. For example, with open-book exams, students still predominantly accessed their exam paper as soon as their zonal exam became live for their location (the university operates zonal exam papers for larger programmes to accommodate the need for security of the exam papers across global time zones). Some students appeared to submit within a few hours of receiving the paper, not taking the opportunity of the longer timeframe to evaluate the quality of their answers and reflect on their work. This has implications for how students are supported to prepare for future assessments, for example by helping them develop strategies for a format that includes open-book assessment over a long period.
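The zonal release described in this paragraph can be illustrated with a short scheduling sketch (Python). The zone groupings, the 10:00 local release time and the 24-hour window below are illustrative assumptions rather than the University’s actual configuration; the point is that each zone receives its own paper variant at a sensible local time, with open and close instants resolved to UTC for the VLE.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Hypothetical zone groupings: each zone gets its own variant of the
# paper so an early-released variant cannot leak to a later zone.
ZONES = {
    "zone_A": ZoneInfo("Asia/Singapore"),    # e.g. Asia / Oceania
    "zone_B": ZoneInfo("Europe/London"),     # e.g. Europe / Africa
    "zone_C": ZoneInfo("America/New_York"),  # e.g. Americas
}

RELEASE_LOCAL = (10, 0)       # paper goes live at 10:00 local time
WINDOW = timedelta(hours=24)  # illustrative open-book submission window

def release_schedule(year: int, month: int, day: int) -> dict[str, tuple[datetime, datetime]]:
    """Return the UTC open/close instants for each zonal paper."""
    utc = ZoneInfo("UTC")
    schedule = {}
    for zone, tz in ZONES.items():
        opens = datetime(year, month, day, *RELEASE_LOCAL, tzinfo=tz)
        schedule[zone] = (opens.astimezone(utc), (opens + WINDOW).astimezone(utc))
    return schedule

for zone, (opens, closes) in release_schedule(2020, 6, 15).items():
    print(f"{zone}: {opens:%Y-%m-%d %H:%M} -> {closes:%Y-%m-%d %H:%M} UTC")
```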

Student survey

The majority of students indicated a positive experience with online assessment: 82% agreed that they were able to take the online assessment in a suitable environment; 80% agreed that the platform used for online assessment worked well; and 79% agreed that they were able to demonstrate their learning through the online assessment. On average, agreement levels were higher among independent learners than among students at teaching centres, and higher among postgraduates than undergraduates. Agreement levels were also significantly lower in programmes which experienced problems with online invigilation arrangements.

Two-thirds of respondents (66%) agreed they would like to see online assessment continue in the future; agreement with this statement was correlated with students feeling well prepared to use the online assessment platform and feeling able to demonstrate their learning. Men (68%) were slightly more likely than women (65%) to indicate interest in continued online assessment, while students aged 25+ (74%) were more open to its continued use than younger students (60%). Analysis of the open-text comments suggested that many students recognised that online forms of assessment are less costly and easier logistically (e.g., no exam centre, travel or accommodation costs) and can afford greater flexibility for students in terms of where they take their exams.

The survey indicated that students who undertook online assessment had significant support needs: 54% of survey respondents said they needed to ask a question about online assessment, with undergraduates, those at teaching centres and those in countries with developing digital infrastructure (e.g., Pakistan, Malaysia, Sri Lanka) more likely than the average student to have needed to contact the University. Nevertheless, 76% agreed that the communication provided by the University about online assessment was clear, with lower proportions of agreement among undergraduates and students at teaching centres than among their counterparts.

Only a minority of respondents (12%) believed that the use of online assessment had a negative impact on the grades they thought they would obtain through the 2020 assessment round. A majority of students (55%) said that online forms of assessment made no difference to the risk of students cheating compared with unseen written examinations; however, a significant minority (39%) believed that the risk of cheating was higher.

There were also responses from students who did not complete all or some of their online exams, and these reveal that, for this early sample, lack of engagement with the assessment was generally COVID-19 related rather than a matter of access to WiFi or suitable computer equipment. It appears that illness, disruption and mental health issues resulting from the pandemic meant that these students were not able to engage with the assessment. Where online proctoring (via a third-party platform) was used on a limited basis, it did not work for students in some locations and circumstances.

Student interviews

All interviewed students, from a diverse range of disciplines, sat the exams they were registered for in the summer session of 2020. Our data indicated that students understood the need for change from standard examinations and were sympathetic to the university’s decision to move exams online in an unprecedented situation, citing reasons of physical safety and fairness across the board.

The students reported that the University employed a wide range of communication channels: the VLE portal, emails from the University and the teaching teams, videos, webinars, online lectures and sample exam papers. The majority of interviewees believed the information provided by the University of London was adequate, although in a few cases they felt that there was some confusion (information coming both centrally and from programme teams) and a lack of timeliness or delays in communication.

Overall, students indicated they had a positive experience with the online assessment. Advantages were perceived to be mainly flexibility and lower cost (including travel and accommodation). In terms of process, the majority of students (20 out of 22) did not report any technological problems that undermined their performance during the exams. Five students conveyed that the technology created anxiety for them. Two interviewees feared their experience could be affected by unreliable Internet connectivity, but they were able to take the assessment without it becoming a challenge. Another two respondents found the online form of assessment more difficult than the standard physical assessment and feared that a stricter approach would be adopted in marking the examinations.

The opportunity to type their exams rather than write them by hand was well received by most students; however, it was claimed that this could be problematic for some disciplines (where students would have to use formulae in their answers, e.g., STEM); see Figure 3.

Figure 3: Students and assessment responses.

For most students who had been given a wider submission window, the added advantage of moving to the alternative assessment was having extra time to complete their exam within the allocated window. For some of them, however, the additional time was not viewed as a positive change, as it was anxiety-inducing and might encourage academic offences.

In terms of content, the majority of students (21 out of 24) did not have any objections to the alternative formats used or to the open-book exam format. Most students did not feel that their performance in the exams was affected by the alternative format, indicating that there was no difference between the conventional and the online exam experience. Of those who did feel affected, most thought that their results were affected positively, and again flexibility was the main reason cited. Two students felt that their performance was affected negatively (for one of them, the content of the exam was different from the questions they had been given for practice during the academic year).

Overall, students agreed that the recognition of their qualification would not be negatively affected by the online examinations, as universities across the world were required to have an alternative form of assessment. A number of students stressed the need to maintain the credibility of their programme if online assessment were to continue and underlined the importance of making academic integrity a key issue in online examinations in order to retain credibility for their qualification.

Students were divided on whether cheating was more likely with the online assessment they had undertaken. In particular, many felt that it depended on the particular discipline, as some subjects made it more difficult to cheat. Some referred to the need to introduce online invigilation to maintain the rigour of assessment.

Finally, a large majority of students (16/22 or 73%) indicated that they would like the alternative format to continue in the future.

Examiner survey

The majority of examiners seemed to welcome the move to mark typed examination scripts, compared with marking handwritten scripts, as the former tend to be more legible and easier to access. However, a majority of examiners agreed that the online marking tools the university provided needed further development. The opening and reading of scripts, recording of marks and comments, reviewing of marks and comments of other markers, flagging of answers for possible plagiarism, and agreement and confirmation of final marks were all identified as being areas that would benefit from improvement if online assessment were used by the university in the future. This has helped the University to identify the areas for development in the VLE.

A total of 48% of respondents said that their overall experience of being an examiner in 2020 had been worse than in previous years, 24% said it had been better, while 23% said it was much the same as in previous years.

In all, 60% of examiners agreed that they had received sufficient information for completing the marking process; however, examiners noted that there were some information gaps and that the change to the timing of the examinations, because of the pandemic, had negatively affected the marking experience.

We asked examiners to identify the top adjustments they would recommend the University make for future online assessment to minimise assessment offences: the top two responses were for exams to be invigilated and for exams to be converted into an open world/open book format making use of text-matching software at submission. The open-text comments called for greater levels of student training on academic integrity and on avoiding plagiarism and collusion.

In addition, in the open-text comments, many examiners recognised the significant effort the university had undertaken to transition to online assessment during a global health pandemic and used the survey to praise the effort and dedication shown by assessment teams.

Programme director interviews

The programme directors we interviewed were from pre-degree (1), undergraduate (3) and postgraduate (8) programmes and a range of disciplines. They reported on changes in assessment design when moving exams online.

In some programmes, changes in assessment were accelerated by the pandemic. The rapid pivot to online exams, although challenging organisationally, made visible some limitations of exams and enabled enhancements to student learning that might otherwise have taken much longer to achieve. Changes, when implemented, were prompted by feedback received from students and external examiners and by the initiatives of programme teams to transform assessment practice. Changes to assessment included the adoption of alternative forms of assessment, broadly coursework (in some cases submitted online), to complement or replace exams. Alternative forms of assessment, including coursework, group work and peer learning, were considered advantageous because they allowed students to develop a range of skills. Formative assessment, when adopted, served the purpose of allowing students to practise ahead of the exam and in most cases was optional.

Even as programme directors reviewed assessment options in their courses, exams remained the predominant assessment format. Programme directors cited as reasons for this: (1) the perceived established academic rigour of the summative assessment process, (2) the recognition of exams by PSRBs (including regional regulators) as a rigorous assessment format and (3) the impact any radical change might have on confidence in the quality of degrees.

In terms of exam content, changes in most cases consisted of moving to open-book exams and redesigning questions to discourage plagiarism (including self-plagiarism from students’ previous assessed work). The rationale was to reduce reliance on rote learning, to introduce an element of flexibility and to ease anxiety by establishing submission windows of variable length.

Technology was an issue in some programmes and disciplines, linked to the difficulties examiners had with an online environment they were not used to and which they thought was not optimal for online marking.

While some programme directors reported that the move to online assessment would be temporary and that they would return to previous assessment methods post pandemic, the majority (9 out of 12) reported that they would continue with online exams.

Support offered to students before the online exam included webinars, training workshops and revision forums, all of them initiatives to allow students to practise ahead of the exam and to achieve clarity about the format and process of the online assessments.

Discussion

Our evidence showed an increase in the proportion of eligible students taking exams and indicated the resilience of both the student body and the institution on which this evaluation focused in response to the extraordinary circumstances of the pandemic.

It seems that, in the context of distance learning, moving exams online represented a very desirable prospect for students, who appreciated the flexibility and the lower cost of travel. As for the future of such changes, there is a strong incentive for institutions to retain online exams in order to save students travel time, stress and exam centre fees, and to allow further flexibility.

As noted in the findings, where students did not complete all or some of their online exams, their lack of engagement with the assessment was generally COVID-19 related rather than a matter of access to networks or suitable computer equipment: illness, disruption and mental health issues resulting from the pandemic meant these students were not able to engage with assessment.

The data also revealed some challenges institutions have faced in:

strengthening communication with students and examiners about the process and format of online assessment; and

addressing lack of digital literacies or competences and assessment literacy (Price et al., 2012) of students that might undermine performance or student experience.

Lack of access to IT equipment, networks and broadband may undermine the student experience and also adversely impact student retention and progression; however, this was not an issue of concern in the findings of this investigation. A logical explanation could be that the institution’s programmes, as advertised, make it clear that IT equipment and Internet access are prerequisites for engaging with distance learning. It might, however, be a serious issue in campus-based learning, where such prior requirements are not imposed because there are on-campus ICT facilities and supporting broadband networks for students to benefit from.

This change of paradigm in assessment had implications for staff workload, as examiners noted. Apart from the impact on workload management, the transition raised a professional development issue for staff who may not be confident in using new technologies and need to be given the opportunity to develop related skills before they embrace change.

Strong research-based voices in the sector advocate the reduction of online examinations and the replacement of exams with alternative forms of assessment. This was not a dominant thread in our discussions with programme directors, though 5 out of 12 did indicate a planned move away from examinations. Going further in revisiting the exam format, by reviewing online exam formats and selecting those that strengthen links to authenticity and employability, seemed to be a valid response.

In the survey, 66% of students reported that they would like online assessment to continue. That number is only likely to increase as broadband infrastructure improves, along with student and staff confidence in virtual working, presenting institutions with the opportunity to move examinations online in a planned way and on a permanent basis. This has to be done, however, with consideration and mitigation of the risk to academic integrity (Amrane et al., 2022). Institutions planning such a move face a number of design questions:

1. Will the exam be fixed time or open, or replaced by coursework?

2. Will the exam be invigilated?

3. Will both an exam centre and an online exam for the same course/module be possible?

4. Do the choices in 1–3 meet PSRB requirements?

5. In what way will the examination format change (e.g., shorter exams, fewer questions, different types of questions, open-book format)?

For those examinations requiring invigilation, institutions should consider engaging online examination software that assures examinee identity and provides examination proctoring.

Finally, issues of perception of the value of online assessment should be addressed in order to increase awareness of digital assessment opportunities.

Conclusions

In conclusion, our investigation gathered valuable data for assessing the success of the 2020 alternative assessment arrangements and for looking to the future of assessment in distance learning environments.

Our investigation looked at the new landscape that emerged in higher education after the pandemic. The move to online assessment instigated an urgent review of assessment practices in distance learning environments. In these environments the standard practice before the pandemic, invigilated unseen exams, the mainstay of distance learning providers, needed urgent measures so that students were not disadvantaged in the unprecedented circumstances. What may initially have seemed a ‘like for like’ transfer of a face-to-face exam to an online environment, in response to a pandemic, was in fact a rather complex transition whose silver lining was a rethink of assessment practices. The introduction of alternative forms of assessment created opportunities for redesigning assessment. The main contender has been coursework, which could replace online exams or complement them where exams are required by PSRBs.

Our evidence indicated that the transition to online assessment during the pandemic has facilitated changes in approaches to assessment, including the increased use of online exams. The adoption also touched on designing authentic assessments that use digital tools and require digital information literacies that are relevant to the professional lives of students.

Moving examinations online has implications for assessment practice: how much it improves student motivation and engagement, whether authenticity, validity and reliability considerations are respected in assessment design, how practice is set up to address the needs of students in distance learning and campus-based environments, and how the higher education sector keeps abreast of societal developments.

A final recommendation for optimal use of exams would be to enhance the content of exam papers and focus on redesigning assessment to allow students to apply acquired knowledge to ‘real’ or authentic situations. This is not always feasible, as it requires staff who are assessment literate (Price et al., 2012), that is, who have a clear understanding of how assessment fits into a course and are able to make critical decisions about applying appropriate approaches and techniques to assessed tasks, for example engaging with alternative forms of assessment, such as peer review or group work, that are frequently considered problematic and unwieldy in the context of higher education.
