Issues in Educational Research, 2015, Vol 25(4), 535-554

Developing student assessment related to a work-placement: A bridge between practice and improvement

Pirjo Vuoskoski
University of Brighton, UK

Sari Poikela
University of Lapland, Finland

This paper explores the ways in which student assessment can be developed in higher education and work-related contexts to form a strong bridge between practice and improvement. Our aim is to provide a starting point for evaluation and improvement of assessment practices, which benefits the learners, instructors, and designers of the curricula, as well as developers of educational organisations. Empirical explorations in the article draw on interview data gathered from physiotherapy students, teachers, and workplace supervisors, on their experiences of student assessment, in two higher education and work-related contexts. Although the actual empirical study is not the focus of this paper, short descriptions of the participants' lived-through experiences are used to illustrate the explicit theoretical purpose of the discussion. In particular, we outline a theoretical model of 'zones and mirrors' that provides a compelling link between student assessment and learning, by exploring the stages and elements that can join the two, in dialogue with the empirical data.


Background

It is now generally acknowledged that assessment in higher education must, alongside certification, serve the purpose of supporting student learning (Boud, 2000; Poikela, 2002; Boud & Falchikov, 2005). It is also widely recognised that individuals learn throughout their lives, and that much of this learning takes place in the workplace and in work-related learning settings (Barnett, 1999; Boud & Garrick, 1999; Boud, Solomon & Symes, 2001; Poikela, 2004). Furthermore, workplaces, in the form of students' work-placement periods, and particularly in the context of education in the health professions, are appraised as potential sources for making student learning and course curricula more relevant, facilitating professional learning and preparing students for professional practice (Dall'Alba, 2009; Webster-Wright, 2009; Thistlethwaite, 2013; Virolainen & Stenström, 2013). At the same time, mixed pictures of student assessment practices and conceptions of assessment are noted in the literature (Ashgar, 2012; Evans, 2013; Li & De Luca, 2014), and the student assessment experience is becoming an increasingly challenging area in higher education (Ashgar, 2012; Dearnley, Taylor, Laxton, Rinomhota & Nkosana-Nyawata, 2013). This leads us to the notion that, whilst seeking clarity within this complexity, the development of student assessment practice related to a work-placement carries much potential for transforming assessment practice in ways that align with the demands of the contemporary era and provide a link between assessment and improvement in a wider scope.

Before giving a broader insight into the proposed model and its exploration with empirical data, we first clarify the main terminology used in the remainder of the paper. Then, in order to outline the background and motivation for this article - the definition of a model of 'zones and mirrors' of assessment and learning, and the need to build a bridge between practice and improvement - contemporary themes and debates connected with assessment and work-placements, and quality aspects in assessment and evaluation, are briefly highlighted.

Main terminology in this article

The notion of work-placement, in the sense it is used in this paper, refers to a fixed period of education within an authentic workplace context, as an integral part of the higher educational process and undergraduate course curriculum. In the Finnish higher education system, where the empirical data for the article were obtained, students' work-placement periods are generally grounded in established cooperation and contracts between higher education institutions and other (public, private, or community sector) work organisations. The term work-integrated learning is also used in the literature when referring to the integration of classroom and workplace learning (Cooper, Orrell & Bowden, 2010; Trede, 2012; Scott & van Etten, 2013). The notion of assessment, as discussed in this article, refers to student assessment in connection with a work-placement in the context of an ongoing higher education process. As such, it covers a whole range of activities, such as setting goals and assignments, formulating assessment criteria and/or grade symbols and/or descriptors, applying tools and methods of assessment, making judgments, marking or grading, receiving and providing feedback, and moderating and/or agreeing marks or grades. In that sense, student assessment related to a work-placement engages at least three parties, namely the student, the teacher, and the workplace supervisor (educator or instructor).

More generally, the term assessment in the literature is often used in parallel with feedback (Evans, 2013; Li & De Luca, 2014), and as a synonym for evaluation (Sadler, 2005, 2010). Furthermore, assessment feedback is in use as an umbrella concept "to capture the diversity of definitions, to include the varied roles, types, foci, meanings, and functions, along with the conceptual frameworks underpinning the principles of assessment and feedback, as well as all feedback exchanges generated within an assessment design" (Evans, 2013). For the Finnish scholar Raivola (2000), evaluation is the broadest of these concepts. According to Raivola, evaluation systems are needed to gather different types of assessment information (including, for example, feedback given to students or by students) for different purposes: for use in educational policy, and for the examination and enhancement of educational processes. In addition, he emphasises the contextual factors of quality and education, and states that quality is a multidimensional phenomenon, unique to every product and to the process creating the product. In this article the term 'assessment' is used to refer to any procedure for estimating student learning for whatever purpose, and 'evaluation' is understood more broadly, as defined by Raivola.

Student assessment and work-placements in higher education

The manifold challenges currently facing professional and higher education all over the world are well documented in the literature. One challenge for higher education institutions is the need to provide educational and pedagogical environments that are relevant to the nature and organisation of contemporary work and work environments (Boud & Garrick, 1999; Boud, Solomon & Symes, 2001; Poikela, 2002, 2004; Schmidt & Gibbs, 2009). Another is the demand for more effective, efficient, and evidence-based practices in education that deliver improved outcomes, based on effective assessment and quality systems (Poikela & Poikela, 2006; Boud & Lawson, 2011). Different responses to these demands are noted in the literature, including the world-wide inclusion of work-based learning programs (Boud & Garrick, 1999; Boud, Solomon & Symes, 2001) and work-related (or work-integrated) education in general (Schmidt & Gibbs, 2009; Von Treuer et al., 2011; Ahola & Hoffman, 2012; Trede, 2012; Virolainen & Stenström, 2013).

The ongoing shifts and transformations related to student assessment and learning practices are much discussed in the literature. It is noted that, traditionally, assessment in higher education has focused on 'measuring knowledge' rather than 'fostering learning', and the role given to students has been that of 'objects' of measurement rather than that of active 'agents' or 'participants' in their own assessment process (Hager & Butler, 1994; Biggs, 1996, 1999; Serafini, 2000; Boud, 2000; Poikela, 2002). It is suggested that the emphasis in the contemporary era is shifting towards student centred practices, with the needs of the learners becoming the key focus for institutional attention (Boud, 2000, 2007; Poikela, 2002, 2004; Boud & Falchikov, 2005, 2006; Poikela & Poikela, 2005, 2012). However, mixed pictures of assessment practices and existing conceptions of learning and assessment are noted (Hager & Hodkinson, 2009; Ashgar, 2012; Dearnley et al., 2013; Evans, 2013), adding to the complexity of the situation. It is claimed that while reforms and more innovative approaches to student assessment are welcomed by some, they can cause confusion for others (Dearnley et al., 2013), and the meaning of assessment appears to be a 'work-in-progress' (Bennett, 2011; Evans, 2013; Li & De Luca, 2014). Furthermore, it is suggested that the student assessment experience is becoming an increasingly challenging area in higher education (Ashgar, 2012; Dearnley et al., 2013), and that the lack of knowledge and understanding of the student assessment processes related to work-placements needs to be addressed (Ferns & Moore, 2012).

In exploring the ways in which student assessment could be developed in relation to work-placements, in this article we start from the premise that all the above-mentioned shifts and transformations of the contemporary era are noteworthy. We also believe that student assessment practices should not only be aligned with the applied pedagogical assumptions and work-life expectations, but should also foster learning and practice improvement in a wider sense. One implication of this is that students need to become assessors of their own learning within the context of participation in practice; recognised as one of the stakeholders of the assessment process, involved in making the decisions that affect them, as well as making a contribution in the workplace. Therefore, it is the claim of this paper that a more holistic (relational and contextual) approach to development is needed in higher education with regard to work-placements; one in which all aspects of student assessment - regardless of the environment or the tools in use - are collaboratively considered and tuned to support student learning and practice improvement, on different levels.

Theoretical basis for developing student assessment and learning

The theoretical basis adopted in this paper, for the development of student assessment and learning, and forming a bridge between practice and improvement, can be found in Esa Poikela's idea of context-based assessment, drawing from David Boud's (2000) notion of sustainable assessment, David Kolb's (1984) experiential learning theory, and Andrew Pettigrew's (1985) contextual analysis. Boud (2000) argued that assessment involves identifying appropriate standards and criteria and making (subjective) judgments about quality. In the sustainable approach to assessment, the purpose and methods of assessment are extended and regarded as an indispensable factor in all forms of lifelong learning. Kolb (1984), in his theory of experiential learning, highlighted a holistic, integrative perspective on learning that combines the experience, perception, cognition, and behaviour of the learner, and presents experience as the source of learning and development. This holistic perspective provides conceptual bridges across life-situations (such as school and work), also portraying learning as a continuous lifelong process. Learning is conceived as a continuum of a four-stage cycle: concrete experience - observation and reflection of experience - assimilation of observations into a 'theory' - testing implications or hypotheses in new situations. Pettigrew's (1985) contextual framework originally presented a contextualist approach to organisational change and development that involved continuous interplay between the context of change, the process of change, and the content of change. The key to his contextualist analysis hence lies in positing and establishing relationships between context, process, and outcome. For Pettigrew, the starting point of an analysis, however, is in the description of the process - explained by the external societal context and the internal organisational context. One of the tasks of analysis is to develop criteria for assessing activity and its effects on the process as a whole.

Esa Poikela (2002, 2004) found an analogical relationship between Kolb's (1984) experiential learning cycle, Boud's (2000) concept of judgmental assessment, and Pettigrew's (1985) contextualism as a mode of analysis. Poikela (2002, 2004) presented the idea of context-based assessment (CBA), which requires that situational and contextual factors (at different levels) are carefully considered. It offers a broad perspective on assessment processes in educational contexts, and presents the zones and mirrors of assessment and learning by exploring the stages and elements that can join the two (on different levels). This provides a starting point for the exploration of student assessment practice that could benefit the learners, instructors, and designers of the curricula, as well as the developers of organisations. The model is presented next.

The zones and mirrors of assessment and learning

According to Kolb (1984), reflective observation (of self and others) is an essential part of a learner's activities, and of learning in general. In this way, reflection can be seen as a factor that unites the processes of assessment and learning (Poikela & Poikela, 2005), and, in this case, practice evaluation and improvement in different contexts. Similarly, Boud (2000) chose to extend the purpose and methods of assessment and to regard them as an indispensable factor in all forms of lifelong learning. Furthermore, in Pettigrew's (1985) contextualist approach to development and change, the interplay between the context, the process, and the outcome was highlighted. Accordingly, from a context-based approach to assessment and learning (Poikela, 2004), the learner's ability to assess and reflect upon his/her own learning and knowledge is the most important factor in understanding and influencing the situation and the context of action. Thus, the learner is considered the owner not only of the learning process, but also of the assessment process. This, again, can be further expanded to the learning and improvement of all participants in the assessment/evaluation process. Process assessment - continuous reflection, feedback, and discussion between all participants throughout the educational process - creates a basis for guiding the learner's self-assessment, and for evaluating the outcomes or products of learning activities (Figure 1), from the perspective of the learner and the instructor(s), as well as of those interested in the development of the curriculum.

Figure 1: The mirrors of the assessment process

The core of the figure consists of the cycle of experiential learning (Kolb, 1984) with reflective observation as an essential part of that process. Self-assessment occupies the central zone of the core, process assessment the middle, and product assessment (in the sense of the assessment of the learning outcomes and achievements of the learner) the outer zone. Between the three zones are the boundaries needed for learning and development; not only from a student perspective, but also from the perspective of the instructors; for example, in developing assessment skills.

Having outlined the theoretical model that provides a compelling link between student assessment and learning, and forms a bridge between practice and improvement, we next explore the stages and elements that can join the two, in relation to empirical data obtained in two higher education and work-related contexts.

Empirical research context and methods

The interview data forming the basis for the empirical explorations in this article were obtained in the context of the work-placements of an undergraduate level physiotherapy course at two Finnish universities of applied sciences ('case A' and 'case B'), as part of a PhD research project by the first author of this article. Each work-placement was conducted within an authentic workplace learning site in a public or private sector health care organisation, within the framework of the established cooperation and contracts between the schools and the work organisations described earlier in this article. In both schools, a problem-based learning (PBL) approach was implemented across the entire course curriculum in a cross-disciplinary manner, and academic study blocks alternated with periods of work-placement in each year of study.

PBL, as an approach to learning and instruction and to organising the curriculum, does not follow the structure of academic subjects but that of problem solving within shared and individual learning processes (Barrows, 1996; Boud & Feletti, 1997). In the literature, certain 'core characteristics' are recognisable across the different applications of PBL, such as: students work in small groups guided by tutors, problems are used as the starting point of the learning process, and the learning process is organised around self-study and supported by complementary learning resources (Barrows, 1996; Boud & Feletti, 1997; Savin-Baden & Major, 2004; Schmidt et al., 2009; Poikela & Poikela, 2005, 2012). Assessment, however, is an area that appears to be somewhat problematic and controversial within the PBL literature as well. On the one hand, the potential of assessment is addressed, and on the other hand it is suggested that assessment practices may not fit well with the assumed pedagogical rationale, the expected educational outcomes, or the needs of the students (see Savin-Baden, 2003, 2004; Poikela & Poikela, 2005; Winning, Lim & Townsend, 2005; Poikela, Vuoskoski & Kärnä, 2009; Poikela & Moore, 2011).

With regard to the context of assessment in this paper, collaborative assessment methods based on student self-assessment and instructor feedback were addressed in the study plan of both courses. Accordingly, the learning goals and outcomes for the student, in both programs, were recorded on a written assessment form, and were expected to be considered by all parties (student, teacher, and workplace educator/supervisor) during the work-placement. At the end of the work-placement, the student's performance was assessed on a 'pass-fail' basis in case B, and on a numerical (0-5) scale in case A. In addition, in both courses the student had to write a work journal during the period. Here, the work journal was a reflective account of the student's clinical experiences together with relevant research literature, put into the form of a case report. The journal was marked at the end of the placement by the teacher and the workplace educator, numerically (0-5) in case A, and on a 'pass-fail' basis in case B.

During the work-placements, teachers in both schools were expected to visit the placements for supervision and assessment purposes when possible. In addition, online tools were used for asynchronous and/or synchronous communication between the parties (student, teacher, and workplace supervisor). Online discussion forums and/or emails were used for asynchronous communication, feedback, and information sharing in both cases. In case B, web cameras and headset microphones with desktop conferencing tools (Adobe Connect Professional) were used for synchronous communication when the teachers were not able to visit the workplaces; mainly when the student placement was located more than 40-50 kilometres from the university campus.

In practice, in-depth interviews were conducted soon after the students' work-placements (between August 2008 and April 2009), after obtaining the required ethical approvals (according to the requirements of the Finnish higher education and health care systems). The interviewees consisted of second year students (n=8), their teachers (n=4), and workplace supervisors (n=4). All sixteen participants were interviewed face-to-face in their first language (Finnish) by the first author of this article. Anonymity, confidentiality, and other ethical aspects, as well as the research interest and methods, were clarified at the beginning of each interview session. Each interview followed the same interview plan (see Appendix 1) and open, in-depth methods were used (see Kvale, 1996). All interviews lasted between 45 and 75 minutes, and were audio-recorded, then transcribed verbatim by the researcher-interviewer.

A qualitative approach, drawing from philosophical phenomenology and hermeneutics, was applied for the purposes of data analysis and discussion in this article. Adopting the phenomenological attitude, as understood in the study, means relating to the object of research (work-placement assessment as a lived-through experience) as a lifeworld phenomenon and experiential given, understanding the experience from the perspective of the experiencer, and suspending all theoretical assumptions about the experience when collecting and analysing the data and describing the phenomenon (see Giorgi, 2009). Hermeneutics, then, relates to the interpretive framework applied to the raw data, in this case seeking to find evidence in the given context from the applied theory perspective (see Pollio, Henley & Thompson, 2006). Although this way of combining the two approaches has been criticised elsewhere (Cloonan, 1995; Giorgi, 2006; Applebaum, 2011), it is assumed that the phenomenological criteria are not violated, as no phenomenological (descriptive) status, in the strict Husserlian sense of the term, is claimed for the interpretations of the raw data in this article.

The empirical findings and their implications have been discussed previously in another forum (Poikela & Vuoskoski, 2009). Given the more specific focus and limited scope of this paper, only some of the most representative quotations (translated from the original expressions in Finnish) from the raw data have been selected for exploration through the chosen theoretical lens, in the discussion that follows.

The first mirror: Developing reflective skills

In the theoretical model of the zones and mirrors of assessment and learning (Poikela, 2004), the boundary between self-assessment and process assessment provides a mirror that helps students/learners to develop reflective skills for assessing themselves, their performances, and their relations to other actors. The most essential mechanism for reflection, for developing reflective skills, and for learning through reflection, is feedback. Learners can observe themselves and others in action with the help, for example, of a case study or work journal. They can receive and consider instant feedback from instructors, and from other students or work colleagues. Obviously, this applies to the other participants in the process too, if they are aiming for (self-)improvement.

The most typical issue regarding the meaning of assessment, as expressed by participants in the empirical data, was the dynamics between self- and process assessment. The significance of shared discussions, and receiving feedback from others (peers and/or instructors) for learning and self-improvement, face to face and online, was particularly addressed by the students. For example, the students stated that asynchronous collaboration online enabled them to receive and post written comments and feedback and to reflect on the given feedback. At the same time they appreciated receiving timely feedback, positive encouragement, and constructive criticism from the instructors. With versatile feedback, students were able to recognise their own progression and future challenges.

It was very important for me to receive instant feedback especially from those in authority, and from professionals. I was able to utilise the teacher's and supervisor's feedback much better than my own reflection. (Student 3/case A)

Usually we talked with my supervisor after each patient situation. I was told in a very constructive manner if there was something I could have done better. I never felt offended or uncomfortable and afterwards I usually agreed with my supervisor, and I was able to understand what went wrong and why. (Student 1/case B)

In the data, self-assessment and process assessment were not considered easy tasks by any parties (students or instructors), but both were recognised as essential to the enhancement of (student) learning and improvement. The students of case B highlighted desktop conferencing as a versatile and efficient means of synchronous communication, allowing students to obtain immediate feedback and responses from their teacher during placement, and reassurance of being 'on the right track'. In addition, the textual online communications enabled students to follow parallel discussions about questions which also concerned them. However, most of the asynchronous collaborations in the online discussion forum were between students and teachers, and workplace supervisors were not actively involved (although they had similar access to the forum). If the students felt that the given positive feedback (face to face or online) was undeserved, or the feedback they anticipated was delayed, or they felt that no constructive criticism had been given, they found the situation frustrating and this decreased their motivation for self-improvement.

The discussions and feedback that I was receiving from my peers were very beneficial. Sometimes I was getting support for my own thoughts and ideas, and sometimes, if my peers didn't agree with me, I had to go and search for more information. (Student 3/case B)

At the beginning of the period, when everything was new to me, I needed all the positive feedback and encouragement I was given, but later on, when things were getting better, I would have expected more constructive and corrective feedback to be able to stay motivated and willing to improve my performance. (Student 4/case A)

Based on the data, the importance of students' self-assessment skills, and the need to develop them further, was well recognised by all parties. As implied above, the student participants addressed the significance of receiving versatile feedback from others, of enhancing reflection on and understanding of their own learning, and of acknowledging areas that needed further development. Similarly, teachers and workplace supervisors stated that their aim was to support student learning and improvement, and development in self-assessment.

I am trying to be less demanding for the second-year students. I am trying to get them used to conducting self-assessment by explaining to them what it is and what should be taken into account with it; and later on, when they are more experienced, I will expect them to be able to analyse their own learning much more systematically. (Teacher 1/case A)

However, assessment produces information that is needed by all parties committed to improvement. Continuous improvement in self-assessment and process assessment is important for both students and instructors. Because learning demands skills of reflection and interaction, meaningful and effective ways and tools are needed to support the quality and relatedness of individual and joint processes. The use of case reports and records of learning goals and achievements, for example, offers information for collaborative assessment, which can be mutually beneficial. Clearly, when more sophisticated technologies are implemented, attention must also be paid to training all actors to use these online environments and technologies, so that they can support learning and assessment, and facilitate the collaboration and communication of all parties within the assessment process.

The second mirror: Integration of process and product assessment

The aim of the mirror between process and product assessment is to examine the means involved in setting the learning goals and the criteria for achieving them. Traditionally, in more teacher-centred practices, the setting of goals and assessment criteria is not carried out in cooperation with the learners. Rather, it is assumed that the learners' task is simply to accept them and act accordingly. As noted by Esa Poikela (2004), in order to improve motivation, commitment, and responsibility for reflective learning, the underlying assumptions and means of assessment need to be made explicit; even if the criteria already exist, learners need to re-create them from their own perspective in order to engage meaningfully in the processes of learning and assessment. In the framework of the work-placements, the role of the instructors, especially the workplace educators, in communicating the specific workplace requirements is equally important. Hence, creating sustainable assessment for long-term learning (Boud, 2000) requires that learners and instructors work together constantly, building upon learners' capacity for self-judgment and self-improvement.

In the data, the integration of (collaborative) process and product assessment was challenging. At its best, it was described as repeated negotiation about the learning goals and achievements of the student, the criteria, and/or the possibilities and limitations of the work-placement. Supervisors and teachers both addressed the importance of giving timely feedback to the student, but the time required for continuous process assessment was not always in balance with their resources and other work responsibilities. Nor were the communication of the learning goals and achievements, and the understanding of the meaning of the assessment criteria, shared equally between all parties.

I found our assessment discussions useful. It all began from setting the goals together, and the teacher was helping with questions and different viewpoints. The supervisor was more concerned about my skills and what I could do, but it all helped me to understand my own capabilities and challenges at the workplace. And in the final evaluation we were considering the achievements together [with the supervisor], and it really helped me to notice my own progression. (Student 4/case A)

I found the assessment form very useful, especially the assessment criteria. Some of the students needed a lot of guidance for understanding the criteria, and the difference between the goals and the achievements, to be able to use the criteria in their own self-assessment; but for some students it was relatively easy. (Teacher 1/case B)

I tried to give positive feedback and encouragement for the student, but, like always, I was often too busy doing something else; and that meant a guilty conscience for not having enough time for the student. (Supervisor 2/case A)

Clearly, the circumstances, as present in the data, did not always support the implementation of collaborative assessment, equal participation, and the building of shared understanding between all parties. Nevertheless, the significance of continuous feedback and process assessment for the student was positively acknowledged by all parties. Participants, including students and instructors, also noted the benefits of the collaborative use of the assessment form and of regular discussion around the goals and achievements. When striving for students to become assessors of their own learning within the context of participation in practice (see Boud & Falchikov, 2006), and for the whole assessment system to be aligned with the applied pedagogical assumptions (Poikela & Poikela, 2005), there is a need for transparency; the processes of learning and instruction, including assessment, should be shared with and developed between the students, teachers, and workplace supervisors (Poikela, Vuoskoski & Kärnä, 2009). Furthermore, instead of complaining about or simply accepting the lack of resources, attention could (and should) be focused on planning and strategies for (collaborative) improvement of the assessment practice.

The third mirror: Assessment of learning and knowing

The third mirror, as stated by Poikela (2004), exists between product assessment and contexts; meaning that learners are engaged in a process of relating their own actions and achievements to the requirements of working life and society, as well as to the present organisational and educational circumstances. Employers, and workplace instructors as (co-)participants in the student's learning and assessment processes, are particularly interested in the capability of the learner: their ability to respond to the curricular and workplace requirements. Those involved in developing education and curricula share the interest in both perspectives, and also want to know whether any pedagogical changes are needed. The main question therefore concerns the ability of the assessment and evaluation system to identify what is needed, on different levels.

From a competency-based viewpoint (Hager, 2004), integration of product assessment within the context of working life is related to the student's professional knowledge and competence. This type of 'knowing', and how it is developed, can be characterised as a process involving decision-making and problem solving while accessing increasing amounts of tacit knowledge. Within a contextual framework (Pettigrew, 1985; Poikela, 2004), tacit knowledge (and also explicit knowledge) is owned not only by individuals but by communities of workers, and the whole work-organisation. Therefore, in a more holistic approach to assessment and development within higher educational and work-related contexts, not only is the integration of process and product assessment required, but also an appreciation of the limitations of local contexts. In other words, besides the needs and interests of different individuals, and of people working together as communities of practice, the needs and requirements embedded within organisational, cultural, and societal contexts need to be acknowledged.

Assessment of professional knowledge is difficult, because tacit knowledge becomes visible only in fluent personal or shared actions (Schön, 1983). Therefore, as noted by Poikela (2004), it is understandable that, in such circumstances, assessment easily becomes focused on the results of actions; that is, on learning outcomes. However, in this kind of assessment system, learners are left alone with their difficulties because they do not receive enough information about their knowledge, and also lack process information for self-improvement. According to Poikela, those involved in curriculum development are also left without meaningful information about the assessment process, although other types of 'assessment information' related to work-placements, for example student evaluation of teaching, are often routinely collected for development purposes. Furthermore, an assessment system which concentrates on measuring 'qualifications' has its mirror only between the products and contexts. This results in a control system that secures the qualifications of individual learners on the basis of detailed examinations. A more dynamic assessment system, by contrast, based on generating learning and improvement and building on the judgment and meaningful information of all stakeholders, provides opportunities for learning and improvement within the whole educational system, and for justifying the pedagogical (or other) changes needed.

In the data, the purpose of assessment as a guiding factor in the learning and development of individual students had clearly been internalised. Accordingly, the importance of student participation and continuous assessment, and of discussion and negotiation between all parties, was understood, yet not fully acted upon. Shared understanding and collaborative utilisation of the assessment form were considered to have been particularly challenging. The question was also raised as to whether the students were given too much responsibility for their own process assessment, especially in recording their learning and improvement. This may imply that the 'mirror' of assessment and learning lay predominantly between the products (learning outcomes) and the context.

I have learnt how to use the assessment form and I think I am quite good at it; but still I was hoping that my supervisors would have recorded their opinions of my suggested achievements on the form; or that we would at least have considered my achievements together; but there wasn't any real negotiation or discussion about them [during the work-placement]. (Student 2/case B)

I wasn't told what went wrong in my patient report, not until the end of the period, and then it was too late to make any corrections. I don't know if there was any sense in that. I think assessment in general [during the work-placement] was based too much on student self-assessment. I mean how the assessment was conducted; I nearly didn't get any constructive or corrective feedback at all. (Student 1/case A)

The dynamics between the assessment of learning outcomes and the context of assessment, as present in the data, appear somewhat distorted. There were some differences in the roles, responsibilities, and concerns of the teachers and the workplace supervisors related to student assessment, as expressed by the participants themselves. Teachers described their own role as that of a mediator between the student and the workplace supervisor, responsible for the pedagogical purposes of work-related assessment and the curricular expectations. Supervisors acknowledged being concerned mainly with the students' understanding of the workplace requirements, but also expressed feelings of uncertainty about their role as an assessor, the pedagogical expectations, and the meaning of the assessment criteria.

I see my own role in the early phase more as an advisor than an assessor. The early workplace periods are so exciting for the student; they need a lot of support and encouragement, and although we were having skilled and experienced supervisors, I still thought I had to ensure that they [workplace supervisors] really understood the study phase of the students correctly. (Teacher 1/case A)

I think it was really hard for the student to understand the expectations of the workplace; and for us it was hard to understand the expectations of the school; for example, what was the level of competence to be expected from the student and how could we be fair in our assessment for the student. (Supervisor 1/case A)

Overall, the exploration of the empirical data suggests that there is some mismatch between the formal learning-assessment environment, as expected in the curricula, and the actualised assessment practice related to a work-placement, as experienced by participants. Neither were the expectations of the different parties aligned with each other, nor was the meaning of assessment clear between participants. However, based on the data, it can be stated that there was a shared, unifying meaning within the individual variations of the assessment experience, highlighting the commitment to collaborative assessment practice; discussion between participants within the assessment process (student, teacher, and workplace supervisor); supporting student self-assessment; and learners receiving continuous feedback from various sources. In summary, the most problematic area needing further attention, in the light of the empirical data and the applied theoretical model, was the need for a more integrative and holistic approach to student assessment and learning related to work-placements; that is, integration of process and product assessment, of individual and shared processes, and of understandings and meaning(s) of assessment in different contexts, together with acknowledgement of the need for continuous improvement by all parties involved in the assessment process.

Conclusions

In this article, we have explored ways in which assessment related to work-placements can be developed so that it forms a strong bridge between practice and improvement. Our aim was to outline the 'zones and mirrors of assessment' model, which provides a compelling link between assessment and learning, by exploring the elements that can join the two. This was achieved through an interpretive framework and dialogue with empirical data obtained from two higher educational contexts involved in work-placements. The data consisted of interviews with the three parties involved in the work-placement assessment process: students, teachers, and workplace supervisors.

This provided a starting point for the theoretical exploration and discussion of the development of student assessment practices that could benefit the different stakeholders involved in student assessment in higher educational contexts more widely; namely the learners, instructors, and designers of curricula, as well as developers of higher educational and work organisations. As such, it aimed to highlight the development of student learning and assessment in higher education based on mutual communication and co-operation between all parties involved in the improvement of the educational processes. The paper also contributes to previously published research which argues that there is a need for knowledge of the assessment processes in work-related environments (Ferns & Moore, 2012), of the student assessment experience (Ashgar, 2012; Dearnley et al., 2013), and of the meaning of assessment (Bennett, 2011; Evans, 2013; Li & De Luca, 2014) in higher educational contexts, and for engagement with integrated work and learning perspectives in general (Johnsson, Boud & Solomon, 2012). Moreover, it offers a model for bridging the gap between learning and assessment and/or practice and development; that is, between the ways in which student learning and assessment in relation to work (or work-placements) are understood and talked about, the ways in which they are experienced and/or actually practised, and the ways in which they are being developed, in higher educational contexts.

Based on the data, the teachers and supervisors had a clear focus on supporting student learning, but limited resources were felt to be a hindrance to creating shared actions and understanding between participants. Nevertheless, the importance of developing students' (self-)assessment skills was highlighted by all parties. The participants also clearly stated that they had experienced benefits from the collaborative use of the assessment form and from regular discussion between parties. However, a question was also raised: namely, whether students were given too much responsibility for their own learning and assessment. Furthermore, it was not only the students who felt that they were not receiving enough information for learning and self-improvement; the instructors also felt that they lacked important information for examining the student learning and assessment processes, and for justifying the pedagogical goals. These results support the suggestions of previous research that, while reforms and more innovative approaches to student assessment are welcomed by some, they can cause confusion for others (Dearnley et al., 2013); that the student assessment experience is becoming an increasingly challenging area in higher education (Ashgar, 2012; Dearnley et al., 2013); and that there is a need for more integrated perspectives on work and learning, and interactive relationships between various individuals within organisational contexts (Johnsson, Boud & Solomon, 2012). In this paper, we suggest that the assessment and evaluation system as a whole, for work-placements and higher education learning environments, should be more focused on learning and development on multiple levels, and that a more holistic perspective towards development is also needed. Moreover, the exploration of the empirical data suggests that although the different elements needed for the learning and development of all parties in the assessment process are generally recognised, and the means for development are available, the focus of assessment is still largely on the products rather than the process of learning and improvement, in a wider sense.

In conclusion, this paper has highlighted that the assessment and evaluation of work-placements is a fundamental part of higher education development, and that it is essential to take them into account during the curriculum planning phase and its implementation in practice. Within the context of the ongoing shifts and transformations of the contemporary era, the need to develop assessment practices in parallel with the emergent views and various forms of work-related learning and pedagogy is becoming more and more pronounced. However, successful implementation of the anticipated aims and assumptions needs persistent development work, interactive conditions, and mutual recognition between the different stakeholders involved in the process, at all levels. In this article, we have provided a starting point for the evaluation and improvement of student assessment practices which may benefit parties on multiple levels. It is our intention to put the presented model forward for further discussion and development.

References

Ahola, S. & Hoffman, D. M. (Eds) (2012). Higher education research in Finland: Emerging structures and contemporary issues. Jyväskylä: Jyväskylä University Press.

Applebaum, M. H. (2011). (Mis)appropriations of Gadamer in qualitative research: A Husserlian critique (Part 1). Indo-Pacific Journal of Phenomenology, 11(1), 1-17. http://dx.doi.org/10.2989/IPJP.2011.11.1.8.1107

Ashgar, M. (2012). The lived experience of formative assessment practice in a British university. Journal of Further and Higher Education, 36(2), 205-223. http://dx.doi.org/10.1080/0309877X.2011.606901

Barnett, D. (1999). Learning to work and working to learn. In D. Boud & J. Garrick (Eds), Understanding learning at work. London, New York: Routledge, pp. 29-44.

Barrows, H. S. (1996). Problem-based learning in medicine and beyond. In M. Birenbaum & F. Dochy (Eds), Bringing problem-based learning to higher education: Theory and practice. San Francisco, CA: Jossey-Bass, pp.3-13.

Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5-25. http://dx.doi.org/10.1080/0969594X.2010.513678

Biggs, J. (1999). Teaching for quality learning at university: What the student does. Buckingham: The Society for Research into Higher Education and Open University Press.

Biggs, J. (1996). Assessing learning quality: Reconciling institutional, staff and educational demands. Assessment & Evaluation in Higher Education, 21(1), 5-16. http://dx.doi.org/10.1080/0260293960210101

Boud, D. (2007). Reframing assessment as if learning were important. In D. Boud & N. Falchikov (Eds), Rethinking assessment in higher education: Learning for the longer term. London: Routledge, pp.14-27.

Boud, D. (2000). Sustainable assessment: Rethinking assessment for the learning society. Studies in Continuing Education, 22, 151-167. http://dx.doi.org/10.1080/713695728

Boud, D. & Falchikov, N. (2006). Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education, 31(4), 399-413. http://dx.doi.org/10.1080/02602930600679050

Boud, D. & Falchikov, N. (2005). Redesigning assessment for learning beyond higher education. In Higher education in a changing world. Proceedings of the 28th HERDSA Annual Conference, Sydney, 3-6 July, pp. 34-41. http://www.herdsa.org.au/wp-content/uploads/conference/2005/papers/boud.pdf

Boud, D. & Feletti, G. (Eds). (1997). The challenge of problem-based learning. London: Kogan Page.

Boud, D. & Garrick, J. (1999). Understandings of workplace learning. In D. Boud & J. Garrick (Eds), Understanding learning at work. London, New York: Routledge, pp.1-12.

Boud, D. & Lawson, R. (2011). The development of student judgement: The role of practice in grade prediction. Paper presented at the 14th Biennial EARLI Conference, 29 August - 3 September, Exeter, UK.

Boud, D., Solomon, N. & Symes, C. (2001). New practices for new times. In D. Boud & N. Solomon (Eds), Work-based learning: A new higher education? Buckingham: The Society for Research into Higher Education & Open University Press, pp.3-17.

Cloonan, T. F. (1995). The early history of phenomenological psychological research in America. Journal of Phenomenological Psychology, 26(1), 46-126. http://dx.doi.org/10.1163/156916295X00033

Cooper, L., Orrell, J. & Bowden, M. (2010). Work integrated learning: A guide to effective practice. USA: Routledge.

Dall'Alba, G. (2009). Learning professional ways of being: Ambiguities of becoming. Educational Philosophy and Theory, 41(1), 34-45. http://dx.doi.org/10.1111/j.1469-5812.2008.00475.x

Dearnley, C. A., Taylor, J. D., Laxton, J. C., Rinomhota, S. & Nkosana-Nyawata, I. (2013). The student experience of piloting multi-modal performance feedback tools in health and social care practice (work)-based settings. Assessment & Evaluation in Higher Education, 38(4), 436-450. http://dx.doi.org/10.1080/02602938.2011.645014

Evans, C. (2013). Making sense of assessment feedback in higher education. Review of Educational Research, 83(1), 70-120. http://dx.doi.org/10.3102/0034654312474350

Ferns, S. & Moore, K. (2012). Assessing student outcomes in fieldwork placements: An overview of current practice. Asia-Pacific Journal of Cooperative Education, 13(4), 207-224. http://www.apjce.org/files/APJCE_13_4_207_224.pdf

Giorgi, A. (2009). The descriptive phenomenological method in psychology: A modified Husserlian approach. Pittsburgh PA: Duquesne University Press. http://www.dupress.duq.edu/products/psychology6-paper

Giorgi, A. (2006). Concerning variations in the application of the phenomenological method. The Humanistic Psychologist, 34(4), 305-319. http://dx.doi.org/10.1207/s15473333thp3404_2

Hager, P. (2004). The competence affair, or why vocational education and training urgently needs a new understanding of learning. Journal of Vocational Education & Training, 56(3), 409-433. http://dx.doi.org/10.1080/13636820400200262

Hager, P. & Butler, J. (1994). Problem-based learning and paradigms of assessment. In S. E. Chen, R. M. Cowdroy, A. J. Kingsland & M. J. Ostwald (Eds), Reflections on problem-based learning. Sydney: Australian PBL Network.

Hager, P. & Hodkinson, P. (2009). Moving beyond the metaphor of transfer of learning. British Educational Research Journal, 35(4), 619-638. http://dx.doi.org/10.1080/01411920802642371

Johnsson, M. C., Boud, D. & Solomon, N. (2012). Learning in-between, across and beyond workplace boundaries. International Journal of Human Resources Development and Management, 12(1/2), 61-76. http://dx.doi.org/10.1504/IJHRDM.2012.044200

Kolb, D. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.

Kvale, S. (1996). InterViews: An introduction to qualitative research interviewing. Thousand Oaks, California: SAGE.

Laitinen-Väänänen, S. (2008). The construction of supervision and physiotherapy expertise: A qualitative study of physiotherapy students' learning sessions in clinical education. Doctoral thesis. Jyväskylä: University of Jyväskylä (Studies in Sport, Physical Education and Health). http://urn.fi/URN:ISBN:978-951-39-3297-8

Li, J. & De Luca, R. (2014). Review of assessment feedback. Studies in Higher Education, 39(2), 378-393. http://dx.doi.org/10.1080/03075079.2012.709494

Parjanen, M. (2003). Amerikkalaisen opiskelija-arvioinnin soveltaminen suomalaiseen yliopistoon [The application of American student assessment to a Finnish university]. Korkeakoulujen arviointineuvoston julkaisuja. Helsinki: Edita. https://karvi.fi/app/uploads/2015/01/KKA_803.pdf

Pettigrew, A. M. (1985). Contextual research: A natural way to link theory and practice. In E. E. Lawler, A. M. Mohrman, S. A. Mohrman, G. E. Ledford & T. G. Cummings (Eds), Doing research that is useful for theory and practice. USA: Lexington Books. pp.222-274.

Poikela, E. (2004). Developing criteria for knowing and learning at work: Towards context-based assessment. The Journal of Workplace Learning, 16(5), 267-274. http://dx.doi.org/10.1108/13665620410545543

Poikela, E. (2002). Osaamisen arviointi [Assessing knowing]. In R. Honkonen (Ed), Koulutuksen lumo - Retoriikka, politiikka ja arviointi [Enchantment of education - Rhetoric, politics and assessment]. Tampere: Tampere University Press, pp.229-245.

Poikela, E. & Poikela, S. (Eds.) (2012). Competence and problem based learning - experience, learning and future. Selected papers book, International Conference on PBL, 12-13 April 2012. Rovaniemi University of Applied Sciences Publication Series A no 3: Rovaniemi.

Poikela, E. & Poikela, S. (2006). Learning and knowing at work - professional growth as a tutor. In E. Poikela & A. R. Nummenmaa (Eds), Understanding problem-based learning. Tampere: Tampere University Press, pp.183-207.

Poikela, E. & Poikela, S. (2005). The strategic points of problem-based learning - organising curricula and assessment. In E. Poikela & S. Poikela (Eds), Problem-based learning in context - bridging work and education. Selected papers book, International Conference on PBL, 9-11 June 2005, Lahti, Finland. Tampere: Tampere University Press, pp.7-22.

Poikela, S. (2005). Learning at work as a tutor - the processes of producing, creating and sharing knowledge in a work community. In Problem-based learning in context - bridging work and education. Selected papers book, International Conference on PBL, 9-11 June 2005, Lahti, Finland. Tampere: Tampere University Press, pp.177-194.

Poikela, S. & Moore, I. (2011). PBL challenges both curriculum and teaching. In T. Barrett & S. Moore (Eds.), New approaches to problem-based learning: Revitalising your practice in higher education. New York: Routledge. pp.229-238.

Poikela, S. & Vuoskoski, P. (2009). Developing context-based assessment in problem-based physiotherapy education. Presentation at ISATT 2009, 14th Biennial Conference of International Study Association on Teachers and Teaching. University of Lapland, Rovaniemi, Finland, 1-4 July 2009.

Poikela, S., Vuoskoski, P. & Kärnä, M. (2009). Developing creative learning environments in problem-based learning. In Tan Oon-Seng (Ed.), Problem-based learning and creativity. Singapore: Cengage Learning Asia. pp.67-85.

Pollio, H. R., Henley, T. & Thompson, C. B. (2006). The phenomenology of everyday life. NY: Cambridge University Press.

Raivola, R. (2000). Tehoa vai laatua koulutukseen? [Efficacy or quality for education?]. Helsinki: WSOY.

Sadler, D. R. (2010). Fidelity as a precondition for integrity in grading academic achievement. Assessment & Evaluation in Higher Education, 35(6), 727-743. http://dx.doi.org/10.1080/02602930902977756

Sadler, D. R. (2005). Interpretations of criteria-based assessment and grading in higher education. Assessment & Evaluation in Higher Education, 30(2), 175-194. http://dx.doi.org/10.1080/0260293042000264262

Savin-Baden, M. (2004). Understanding the impact of assessment on students in problem-based learning. Innovations in Education and Teaching International, 41(2), 221-233. http://dx.doi.org/10.1080/1470329042000208729

Savin-Baden, M. (2003). Assessment, the last great problem in higher education? PBL Insight, 6(1).

Savin-Baden, M. & Major, C. (2004). Foundations of problem-based learning. Maidenhead: Open University Press/SRHE.

Schmidt, H. G., Van der Molen, H. T., Te Winkel, W. W. R. & Wijnen, W. H. F. W. (2009). Constructivist, problem-based learning does work: A meta-analysis of curricular comparisons involving a single medical school. Educational Psychologist, 44(4), 227-249. http://dx.doi.org/10.1080/00461520903213592

Schmidt, R. & Gibbs, P. (2009). The challenges of work-based learning in the changing context of the European Higher Education Area. European Journal of Education, 44(3), 399-410. http://dx.doi.org/10.1111/j.1465-3435.2009.01393.x

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Scott, R. H. & van Etten, E. (2013). Environmental and conservation volunteering as workplace integrated learning for university students. In Special Issue: Teaching and Learning in Higher Education: Western Australia's TL Forum. Issues in Educational Research, 23(2), 242-257. http://www.iier.org.au/iier23/scott.html

Thistlethwaite, J. E. (2013). Practice-based learning across and between the health professions: A conceptual exploration of definitions and diversity and their impact on interprofessional education. International Journal of Practice-based Learning in Health and Social Care, 1(1), 15-28. http://dx.doi.org/10.11120/pblh.2013.00003

Trede, F. (2012). Role of work-integrated learning in developing professionalism and professional identity. Asia-Pacific Journal of Cooperative Education, 13(3), 159-167. http://www.apjce.org/files/APJCE_13_3_159_167.pdf

Virolainen, M. & Stenström, M-L. (2013). Building workplace learning with polytechnics in Finland: Multiple goals and cooperation in enhancing connectivity. Journal of Education and Work, 26(4), 376-401. http://dx.doi.org/10.1080/13639080.2012.661846

Von Treuer, K., Sturre, V., Keele, S. & McLeod, J. (2011). An integrated model for the evaluation of work placements. Asia-Pacific Journal of Cooperative Education, 12(3), 195-204. http://www.apjce.org/files/APJCE_12_3_195_204.pdf

Webster-Wright, A. (2009). Reframing professional development through understanding authentic professional learning. Review of Educational Research, 79(2), 702-739. http://dx.doi.org/10.3102/0034654308330970

Winning, T., Lim, E. & Townsend, G. (2005). Student experiences of assessment in two problem-based dental curricula: Adelaide and Dublin. Assessment & Evaluation in Higher Education, 30(5), 489-505. http://dx.doi.org/10.1080/02602930500187014

Appendix

The interview plan for the in-depth interviews in the phenomenological empirical study

* Authors' note: The original interview questions have been translated from Finnish to English.

Authors: Dr Pirjo Vuoskoski is a senior lecturer in the School of Health Sciences, University of Brighton, UK. A physiotherapist by background, she holds a PhD in education. Her interdisciplinary research interests integrate phenomenological and qualitative research, human lifeworld phenomena, higher education, work-related learning, problem-based pedagogy and physiotherapy. Email: p.vuoskoski@brighton.ac.uk
Web: https://www.brighton.ac.uk/about-us/contact-us/academic-departments/school-of-health-sciences.aspx

Dr Sari Poikela is an associate professor in the Faculty of Education, University of Lapland, Finland. She holds a PhD in adult education. Her research interests include development of higher education: problem-based learning, evaluation and assessment, university and international pedagogy; and the interaction between work-life and education. Email: sari.poikela@ulapland.fi Web: http://www.ulapland.fi/InEnglish/Units/Faculty-of-Education

Please cite as: Vuoskoski, P. & Poikela, S. (2015). Developing student assessment related to a work-placement: A bridge between practice and improvement. Issues in Educational Research, 25(4), 535-554. http://www.iier.org.au/iier25/vuoskoski.html

