Issues in Educational Research, 2015, Vol 25(4), 460-479

A perspective on supporting STEM academics with blended learning at an Australian university

Rachael Hains-Wesson
Swinburne University of Technology, Australia

Russell Tytler
Deakin University, Australia

Design-based educational research can provide a lens for understanding the complexities of imaginative methods, while also creating an avenue for sharing personal insights that support the solving of teaching and learning problems and direct future efforts. In this study, the 'I' narrative was utilised extensively, in the form of an autoethnographic perspective. This was achieved by incorporating three self-report methods within a design experiment, in order to explore the messiness associated with creating and modifying a faculty-wide blended learning framework for STEM teachers. Data generation from three sources provided the evidential basis for investigating this process: (1) self-reflection, (2) key literature findings, and (3) critical discussions within a community of inquiry. The findings identified three features of the change process that were challenging, and for which STEM academics required support: educators' professional context; finding models to support change in practice; and identifying the change agent. The paper argues for the value of a personal and complex methodology to inform practice, providing insights into the change process, because process is just as important as product.


In higher education, the number of students studying science, technology, engineering and mathematics (STEM) subjects is in decline. This decline has been attributed to a number of causes. Researchers point to poor teaching and a lack of academic support (Yarker & Park, 2012); out-dated and uninteresting teaching practices (Dobson, 2014; Marginson, Tytler, Freeman & Roberts, 2013; Office of the Chief Scientist, 2014; The White House, 2009); students' low ratings (45% to 69%) for learner engagement and student support (University Experience Survey National Report, 2013); and the quality of the entire educational experience, teaching and learning, and learner engagement (University Experience Survey National Report, 2014, p. 17). There is arguably a need for STEM to be taught in a "more enriching and interesting manner and [at times] interdisciplinary in nature to keep curiosity alive" (Yarker & Park, 2012 cited in Jayarajah, Saat & Rauf, 2014, p. 156). Additionally, there is a requirement for STEM teaching practices to shift toward student centred approaches (Marginson, Tytler, Freeman & Roberts, 2013; Office of the Chief Scientist, 2014) and, therefore, a justification for providing academic support around the improved teaching of STEM subjects (Office of the Chief Scientist, 2015; Prinsely & Baranyai, 2015; The Australian Industry Group, 2015).

The rapid growth in online teaching across the Australian university sector (Fletcher & Bullock, 2015) and the requirement to enhance the delivery of all STEM subjects make blended learning a positive option for such improvement (Marginson, Tytler, Freeman & Roberts, 2013; Picciano, 2009; Torrisi-Steele, 2011). For more than a decade, blended learning has been judged as "all pervasive in the training industry" (Reay, 2001, p. 6). The members of the American Society for Training and Development have also argued that blended learning is one of the top emerging trends in the knowledge delivery industry (Finn, 2002). However, the term blended learning can mean different things to different people. Torrisi-Steele's (2011) definition states, "blended learning aims to enrich student-centred learning experiences made possible by the harmonious integration of various strategies, achieved by combining f2f [face-to-face] interaction with ICT [information communications technology]" (p. 366). This is a useful definition, because of the emphasis on the "harmonious integration" of ICT into the face-to-face learning environment. Torrisi-Steele's (2011) work also suggests three dimensions for creating an effective blended learning model:

STEM teaching that focuses on inclusivity, adheres to effective constructive alignment and 'harmoniously' integrates ICT into the face-to-face learning experience provides some of the key ingredients for an effective blended learning model.

Deakin University

The faculty of Science, Engineering and Built Environment (SEBE) at Deakin University in Australia is one of four faculties, and the smallest in terms of student numbers and STEM teaching staff. It offers a range of STEM undergraduate and postgraduate courses and study areas. The majority of STEM courses are delivered at Deakin's metropolitan and rural campuses in both face-to-face and distance modes (a blended learning approach). An internal Student Evaluation of Teaching and Units (SETU) Summary Report for Trimester 3 (2013) indicated that SEBE required improved instructional change around student learning in the areas of "manageable workload, helpful feedback, library resources, online technologies and being educationally challenged" (p. 4). Senior management in the area of teaching and learning within SEBE's organisational structure agreed with these findings, and argued that blended learning would "likely... emerge as the predominant model of the future - and to become far more common than either [online or face-to-face instruction] alone" (Watson, 2008, p. 3). Therefore, two strategies for SEBE around the improvement of the teaching of STEM subjects were:


This study took place during a time when I was employed as SEBE's Lecturer in Blended Learning at Deakin University. I was responsible for the above-mentioned priority areas, and during this period I was engaged in the systematic study of my educational practice, in an effort to deepen my understanding of my own practice and the teaching practice of others, namely STEM teachers. My mindset, during this time, was focused on undertaking qualitative, design-based research (Hall & Herrington, 2010; Lefoe, Philip, O'Reilly & Parrish, 2009; Reeves, McKenney & Herrington, 2011; Wozniak, Pizzica & Mahony, 2012) that utilised an analytical autoethnography as its theoretical underpinning (Dyson, 2007; Hughes, Pennington & Makris, 2012; Merga, 2015; Ngunjiri, Hernandez & Chang, 2010). I desired to provide a narrative space for my personal insights so that I could provide "a direct link between research and [my] practice [and the practices of others], and thus [increase] the chances that it [would] have a meaningful impact" for those who operated in similar contexts (Reeves, et al, 2011, p. 58). There is power in the personal narrative, which I have termed elsewhere the 'I' experience (Hains-Wesson, 2013). Others have noted that this is an appropriate methodology to use in education because "my personal journey was woven into the fabric of a wider world study of the culture that I was researching" (Dyson, 2007, p. 37). Incorporating autoethnography as the study's theoretical underpinning further allowed me to explore how I initiated change (self, other and context) as a transformative learning experience (Starr, 2010), in the face of complexity. It allowed for documenting the seepage that occurred between the research and the researcher's life, especially when conducting research that is qualitative, self-focused, and context-conscious (Ngunjiri, et al, 2010, p. 2).

Throughout my time as a blended learning specialist, I was continually reflecting on self, other, and context, and began to value the recognition of myself as a subject of the research (Dyson, 2007), and a researcher within the research. For example, I found that one of the main challenges for STEM academics was the tension between the values associated with research output and the importance of implementing instructional change. On the one hand, to assist academics to improve instruction, Deakin University's LIVE 2020 Agenda (2014a) advocated for every faculty to complete a university-wide Course Enhancement Process by 2016 (Deakin University, 2014b). The program aimed to harness the capabilities of blended learning that used technology for the purpose of student engagement. It was a university-wide collaborative effort between faculty members in each school, and focused on enabling graduates to become highly employable through course experiences that were personal, relevant and engaging wherever learning took place (on campus, in the cloud, or in industry settings). On the other hand, the faculty's research culture encouraged STEM teachers to pursue discipline-specific research and funding outputs for promotion and tenure. The opposing interests did not sit easily side by side.

Further, SEBE's organisational structure was such that it was common for teaching and learning projects to operate within a top-down decision making culture - an organisational system that commonly operates within the higher education sector, especially when technology-driven initiatives are being implemented at a university-wide level (Hains-Wesson, Wakeling & Aldred, 2014). Usually, this type of operational style is implemented due to limited finances, resources and time. As an alternative, Slade, Murfin & Readman (2013) advocated a middle-out approach, where mid-career academics, professional staff and learning designers work together (as teaching and learning champions) to showcase good practice in order to influence instructional change over a longer period of time. However, this particular style was not always achievable in SEBE's context of operation, because of the need for urgent results via the university-wide Course Enhancement Process.

In a similar study (Hains-Wesson, et al, 2014) at a different university, my colleagues and I explored the challenges of operating within a top-down decision making model, which was similar to SEBE's. We discovered that functioning effectively as academic support staff, while being responsible for strategic initiatives within a multi-faceted system (Hains-Wesson et al, 2014, p. 156), presented pedagogical uncertainties, a circumstance demanding further investigation.

Therefore, this study first describes and shares (via the 'I' experience) my practitioner's journey as the blended learning framework was being developed and modified within SEBE's teaching and learning environment. Second, I interpret and analyse personal learnings, understandings and experiences across the pre-, during- and post-design phases of the model, in order to share my insights with those who operate in similar contexts (Reeves, et al, 2011; Wozniak, et al, 2012). The following research questions guided this qualitative, design-based inquiry.

  1. What changes, growth and understanding have occurred within my role that have fed into the provision of the blended learning process?
  2. What understandings have impacted on the refinement of the process leading to the development of the blended learning model within the design experiment?
In the next section, I present the methods, data generation and analysis, before moving on to the discussion and conclusion.


For the study, a design-based research approach was selected (Collins, Joseph & Bielaczyc, 2004; Hall & Herrington, 2010) that utilised an analytic autoethnography as the study's philosophical underpinning, as suggested by Anderson (2006) and Merga (2015). Hall and Herrington (2010) noted that design-based research methodologies allow for a "cyclic approach, in that empirical research findings can be applied to the theoretical design and then to the practical design, resulting in continuous modifications of both theory and practice" (p. 1018). Because I wanted to focus on the "bigger picture of lessons learned and not just the immediately developed results" (Reeves, et al, 2011, p. 62), the inclusion of an analytical autoethnography philosophy of practice fitted well. Within this approach, I also required a methodology that allowed for a range of data collection methods. Therefore the methodology needed to accommodate a data generation process that allowed for an analysis of the outcomes of intervention, and refinement of the blended learning framework, while it was being developed and used.

A further complicating factor was that STEM academics were already implementing blended learning at differing levels, and had different learning and teaching needs. I chose a design experiment that incorporated self-report methods because the methods were open, involved reflection that was based on a personal venture (Mooney, 1957), and were cyclic in nature while bridging theory with practice (Dyson, 2007; Opie, 2004). Design experiment and self-report methods allow teachers to be researchers and observers (Lincoln & Denzin, 1994; Lincoln & Guba, 1985), and to directly contribute to their own self-realisation (Mooney, 1957). This is for the reason articulated by Bakker (2004), "if you want to understand something you have to change it, and if you want to change something you have to understand it" (p. 37). However, Zeichner and Noffke (2001) cautioned that researchers who use such perspectives and methods might become limited by their preconceptions, because it is difficult to avoid bias towards self-validation. Nevertheless, others have acknowledged that such research thinking and practices strengthen educational discourse, and support change in curriculum and pedagogy that can improve the quality of students' learning experiences (Barab & Squire, 2004; Ruthven, 2005).


The self-report methods within the design experiment were used over a three to six month period to monitor and explore the unfolding process during the development and modification of the blended learning framework. Figure 1 shows the design experiment cycle, and Table 1 unpacks how I used the self-report processes that occurred at the centre of the design experiment, which is also where the self-report study was the main focus of critical review. Each iteration of the self-report method within the design experiment involved multiple players and functions. Via the self-report method, three data generation processes were employed: 1) field notes, 2) reflections on pertinent literature about STEM education, and 3) critical discussions and reflections within a community of inquiry. The process entailed triangulation across the multiple data sets being generated as I investigated my personal understandings (especially in the moment of operation), aiming to generate a robust, rich and comprehensive autoethnographic account (Walter, 2011).


Figure 1: The design experiment cycles as suggested by Nieveen & Folmer (2013, p. 159)

The self-report method also helped me to understand the internal 'goings-on' within the design experiment, which were often messy in nature. A highly useful way of documenting and exploring the 'messiness' associated with such a complex methodology was reflection on self, other and context around organic moments of discovery, which occurred in the present, continuously and/or retrospectively (see Table 1). Additionally, this process enabled me to navigate effectively and focus on personal understandings and those of my colleagues who operated within a top-down decision making culture. This, in turn, helped me to create alternative avenues in order to work effectively with STEM academics, who were often at the coal face of change, relying on initiatives and support to occur at the macro level. Table 1 illustrates how the key reflective strategies within the self-report method corresponded to the design experiment's cycle phases in terms of its inner, middle and outer circles (see Figure 1). To explain this further, I have numbered (1-6) the key 'I' narratives as reflective strategies alongside the inner, middle and outer circles associated with the design experiment phases depicted in Figure 1.

Hamilton and Pinnegar (1998) pointed out that self-report studies which use a similar process, "involve a thoughtful look at texts read, experiences had, people known and ideas considered" (p. 236). In the case of this study, this included "note-taking, memory work, narrative writing, observation and interview" (Hamilton, Smith & Worthington, 2008, p. 22), and formed the main data generation processes. Essentially, the methods allowed for a systematic approach to data collection, analysis, and interpretation about self, other and context (Ngunjiri, et al, 2010), which aided in developing and/or modifying the blended learning model. I discuss each data generation process in more detail in the following section.

Table 1: Self-report method within the design experiment

No. | Key reflective strategies according to Fletcher and Bullock (2015) | Key design experiment stages according to Nieveen & Folmer (2013)
1 | Being aware of my self-focused perspective | Outer circle: Design research
2 | Improving my understanding of self and my practice | Outer circle: Design research
3 | Reflections | Inner circle: Research - analyse, evaluate, reflect
4 | Literature findings | Inner circle: Research - analyse, evaluate, reflect
5 | Community of inquiry: critically informed discussions and feedback | Mid-circle: Construct, refine, continuing design, and design
6 | Reflections, community of inquiry and literature findings | Mid-circle: Specified design principles, refined design principles, final design principles, tentative design principles

Data generation

Reflective practice

I used reflective practice journal writing to generate relevant data, drawing on notes taken during meetings on my computer, email correspondence in which peers sent me feedback, and records of verbal communications. I chose this method of data generation because self-reflection is a powerful way to make sense of events and to learn from them (Boud, 1986; Nicholl & Higgins, 2004; Schon, 1983). The use of reflection as a form of learning and teaching scholarship (Hains-Wesson, 2013) is highly beneficial, because it helps to explore personal experiences, which leads to developing new understandings and appreciations (Nicholl & Higgins, 2004). I found this data generation process most useful while working within, and for, a top-down decision making culture, because I often perceived that teaching and learning initiatives for change occurred at the macro level rather than the micro level, which limited my efforts to influence change for the improvement of STEM teaching and learning. For example, I received only modest opportunities to present blended learning pedagogies to a faculty-wide STEM audience. Instead, I was encouraged to work towards instructional change via one-on-one consultations that were self- and/or peer-initiated.

STEM research: literature on blended learning in STEM

I completed a systematic examination of the literature by reviewing 120 peer-reviewed journal articles that focused on Australian blended learning for STEM. The systematic quantitative review was based on Pickering & Byrne's (2014, p. 535) method of surveying the literature to quantify where there was research and where there were gaps. This method has gained recognition in the fields of environmental science and geography (Guitart, Pickering & Byrne, 2012; Roy, Pickering & Byrne, 2012; Steven, Pickering & Castley, 2011). I located each journal article via the online resources: Education Resources Information Centre (ERIC), Google Scholar and the Excellence in Research for Australia (ERA) listing. I then reviewed and collected data pertaining to each journal article, such as abstracts, methods and empirical findings that centred on blended learning frameworks and models at Australian universities. I then undertook a reflective analysis of the empirical findings, offering a critical edge to the design experiment.

A community of inquiry (or a community of practice)

I tapped into a community of inquiry that operated in the faculty, where each member functioned within a similar framework to mine and/or was interested in blended learning as an active educator in STEM. Even though I termed the group a 'community of inquiry', its members did not necessarily view themselves in this way. Each member's common 'work' motivation (Zellermayer & Tabak, 2006) was centred on the best way to support teaching staff and/or their peers for the improved teaching of STEM subjects. A total of six members joined the community of inquiry for this project. The community of inquiry met fortnightly, or feedback sessions were completed one-on-one via specific appointments that I made. I facilitated the meetings and discussions, and transcribed the feedback sessions.

Data analysis

Data was collected and stored in files of personal reflections, notes from critical friend meetings and discussions, and an annotated bibliography of the relevant literature. A two-phase pattern of data analysis was conducted upon all data collected, to minimise personal and/or intuitive viewpoints that could influence the development and/or modification of the model. The first phase focused on analysis of 'in the moment' data that was used for continuous improvement within the design experiment. Data was only viewed as beneficial when a particular piece of information aligned with data from at least two other data generation processes, such as reflections and the literature. The data generation and continual critical analysis aided in responding to 'problems of practice' (Fletcher & Bullock, 2015) and pinpointing specific themes, which led to the discovery of 'turning points' (Fletcher & Bullock, 2015). Turning points focused on my understanding of the challenges faced by educators, and how best to support them. Turning points, as suggested by Bullock and Ritter (2011), are a common discovery within self-report studies. Simply put, 'turning points' are tensions or moments of understanding that challenge our prior assumptions (Fletcher & Bullock, 2015) and lead to new levels of insight.

The second phase involved an additional review of the data as a collective whole, in terms of reflections, feedback and notes from the community of inquiry, as well as pertinent literature findings. This was achieved via a thematic analysis of the content that was objective and retrospective. Thematic analysis is a pattern recognition technique that involves searching through the data for emerging themes. The data was reviewed line-by-line to identify recurring patterns, which, from my perspective, led to the re-establishment of the particular 'turning points' that I discovered in the first phase of the analysis.
Combining the two phases provided a scientific and systematic strategy that could be repeated. As a result, this analysis was fed back into the design experiment for improvement (see https://youtu.be/2YGkfBSfO58 for the blended learning artefact).


Throughout the study, I often found myself completing an 'in the moment' type of analysis, which occurred continuously as the model was being designed and modified. I also discovered that I was changing as a practitioner, influenced by the literature findings, the feedback, and discussions from the community of inquiry. In addition, the discovery of themes that led to the specific 'turning points' revealed to me that I was gaining critical understandings on both a continual and a retrospective level. The turning points were mostly concerned with:
  1. My understanding of STEM educators' learning styles;
  2. Alternative ways to share blended learning information, and
  3. Developing my personal identity as a blended learning specialist and researcher (see Table 2).
The specific turning points that were fed back into the design experiment, and which helped me to decide scientifically and systematically how to further develop and modify the model, were: a) discipline-specific needs, b) finding new ways to present information, and c) my professional identity. In Table 2, I present a snapshot extracted from the completed data sets from all of the collection phases, to illustrate the main turning points that were determined by the thematic analysis. It is important to note that for all three data collection phases I implemented a systematic approach to the analysis, such as the way in which the data sets were organised for review and analysis. The first column showcases examples of overall outcomes concerning the three self-report data sets, the second column describes the theme gained from the self-report data after analysis, and the final column identifies the particular turning point insight that flowed from these examples, which I elaborate on in the following section.

Turning Point One: My understanding of STEM educators' learning styles

As evidenced by the journal entries, there were times during the study when I felt frustrated, such as, "he/she does not want to discuss", "he/she wants to learn away from me" and "he/she wants basic stuff" (see Table 2). As a consequence, I felt de-motivated to take action. Once I realised, however, that STEM educators' time was precious due to the requirement of a high research output (Fairweather, 2008; Fairweather & Paulsen, 2008), I was less inclined to place such high expectations on STEM educators. The critical feedback from the community of inquiry allowed for a more reflective approach to my understanding of personal frustrations when supporting STEM educators at a faculty level, whom I perceived as being less engaged. The literature stated that one way to influence curriculum renewal in STEM-centric teaching and learning environments was to be part of a large-scale, high-quality professional development program (Wilson, 2013). This was difficult to instigate, because I was not always in a position to advocate or actively plan for instructional change at such a level, due to operating within a top-down decision making culture. Therefore, I required a new way to present the blended learning framework that was linked to practice (Penuel, Fishman, Yamaguchi & Gallagher, 2007), while avoiding an intrusive way of working.

Table 2: Examples of data generation, themes and turning points from journal entries

Self-report data generation (phases 1 and 2) - journal entries:
  1. He/she does not necessarily want to chat about everything with me - just the basic stuff, like setting up their rubric in the learning management system.
  2. He/she wants to learn away from me, come back another time - sometimes he/she doesn't come back at all.
  3. He/she came and talked to me. We talked about blended learning, their ideas and how I can support them in order to put them into practice. This was a great outcome. We have some plans now.
Literature findings:
  1. STEM educators often seek to optimise their time for discipline-specific research rather than focusing on the improvement of their teaching practice (Fairweather, 2008; Fairweather & Paulsen, 2008).
  2. Meeting these new standards (improved instruction) is a daunting enterprise, requiring large-scale professional development (PD) of high quality that is adaptable across myriad contexts (Wilson, 2013).
  3. STEM educators value PD activities that are close to practice (Penuel, Fishman, Yamaguchi & Gallagher, 2007).
Peer feedback:
  1. Try to work with educators who are interested in teaching and learning.
  2. [The model] will probably only be looked at by those already doing good practice anyway.
  3. Others [the uninterested] will take longer - don't give up - it's just better not to wait, that's all.
Theme: My understanding of STEM educators' learning styles and the pressures they face - a variety of ways to provide buy-in for instructional change.
Turning point: STEM educators desire professional development that is discipline-specific and easy to implement, so that they can spend time on their research.

Self-report data generation (phases 1 and 2) - journal entries:
  1. They don't seem to understand how important it is for me to talk, chat, and discuss this area with large groups of faculty members.
  2. I want to get the blended learning message across - that I am here to help - it is frustrating to only ever communicate with the converted. I think it is better to have a philosophy of blended learning - flexibility is the key.
  3. In the last twelve months, I have only presented once to a school on blended learning, and even then my presentation was cut in half due to someone else going over their presentation time.
Literature findings:
  1. Instruction is best designed to meet the needs of a variety of learners (Picciano, 2009).
  2. It is best to focus on content knowledge, opportunities for active learning and coherence with other learning activities (Garet, Porter, Desimone, Birman & Yoon, 2001).
  3. It is 'safe' to employ online PD, in light of concerns about what might be given up by face-to-face PD (Herold, 2013).
Peer feedback:
  1. He/she gave a great presentation. He/she always knows just what to say to the difficult ones.
  2. I find it difficult answering those tricky questions from painful staff who don't want to change anyway.
  3. Did you see his/her face [dislike] when he/she said: giving student feedback as a cycle of learning is just as important as the content!
Theme: The need to work in ways consistent with STEM educators' learning styles - online models must account for STEM educators' learning styles and how these will influence the delivery of professional development, resources and/or models.
Turning point: Finding new ways to present blended learning information that suits the teaching and learning culture for STEM educators.

Self-report data generation (phases 1 and 2) - journal entries:
  1. I get excited when I see educators exchanging roles with me, such as when they start telling me all the great things that they are doing in their unit to engage students more.
  2. I feel nervous and anxious when I have to think about delivering support for staff to develop eManuals for science laboratory learning, because the literature isn't clear yet - do students really desire and want this type of blended learning, and if so, why?
  3. I am always perplexed about understanding the 'best practice' for teaching hard core science, maths and technology in practicals - it's not my area of expertise! I need to be more open.
Literature findings:
  1. Change is difficult in higher education, with less risk taking and inquiry than is required for change to occur (Cohen, 1988).
  2. The place to begin to make immediate and measurable change is at the course level (Sunal et al., 2001).
  3. Research indicates faculty members are interested in, and have positive attitudes toward, the use of instructional consultation on a personal basis (Sunal et al., 2001).
Peer feedback:
  1. We need to communicate more within our group, especially if you're working with staff one-on-one.
  2. I think it is important to get staff who have implemented something new in learning and teaching to then showcase what they have done.
  3. Scholarship of teaching and learning - it is your thing. Can you do this more? We need to get them wanting to do research in this area. If you use their ideas, then invite them to be co-authors with you.
Theme: Developing my professional identity as an individual, a team member and a researcher in the area of blended learning.
Turning point: The development of the blended learning model will be influenced by my teacher identity in terms of leadership, skills, and self-efficacy.

For example, a simple response to an educator's learning and teaching requirement, such as placing a rubric online, opened doors for later supportive opportunities: "he/she does not necessarily want to chat with me [about blended learning] just the basic stuff like setting up a rubric" (see Table 2). At times, STEM academics required help, but only after they had purposely tried out innovations themselves, and away from me. Once this pattern had repeated itself a few times with good results, I found that academics often dropped in for 'just-in-time' support and/or phoned, emailed or made an appointment to see me face-to-face, such as: "thanks for the discussion today about using video for student feedback. I will get back to you if I have any trouble" (field note entry, 2015). When an academic felt confident to discuss the faculty's approach to blended learning, the conversations lasted for some time, and were robust, informative and a two-way problem solving event. Additionally, the community of inquiry often talked about problematic conversations that had occurred with particular staff who were struggling to make changes in their teaching practice and/or course curriculum design work. Advice was readily shared between members of the community of inquiry in order to boost self and team confidence, such as "don't give up - it's just better not to wait, that's all" (see Table 2).

Turning Point Two: Working in ways consistent with STEM educators' learning styles

Sourcing, reviewing and analysing various blended learning models in the literature, such as those developed at other Australian universities, helped me pinpoint what types of models were available, what evidence had been presented for their success, and how the models were designed. I used my reflection entries, notes and critical feedback from the community of inquiry to refine and improve the model around what users most needed. The modifications were always influenced by what I had learnt from self-reflection, the literature findings and peer feedback, in terms of: (1) making the model quick to use; and (2) providing insight for instructional change in the teaching of STEM subjects. My approach to developing the blended learning framework became more flexible as I realised that I was not in a position to discuss the framework at a faculty-wide level, but could proceed more effectively via individual consultations.

This led to providing the blended learning framework in an online format, so that academics could access it anywhere and at any time. Figure 2 shows the blended learning model as an artefact (see https://youtu.be/2YGkfBSfO58 for more detail). There were two main reasons for presenting additional information about the blended learning framework via an open access resource such as YouTube. First, it allowed academics to view and explore the resource in their own time, provided they had Internet access, because instruction is best designed to meet the needs of a variety of learners (Picciano, 2009). Second, research suggests that online professional development can be as effective as face-to-face delivery (Herold, 2013), so it was safe to rely on an online format when other options were limited.


Figure 2: Blended learning model for STEM for improving practice and instructional change
(see https://youtu.be/2YGkfBSfO58 for a full explanation)

Turning Point Three: Developing my professional identity

In terms of my identity as a practitioner in blended learning, I also came to realise that I had anxiety about my own lack of STEM discipline knowledge: "I find it difficult answering those tricky questions" (see Table 2). At the time, I believed this was a negative position to hold, and it was not until I began investigating the development and modification of the blended learning model as an online resource that academics could access anywhere and at any time that I started to feel more confident. My negative feelings about my lack of STEM discipline knowledge thus transformed into an advantage. For instance, offering the blended learning model online gave me the chance to reflect and respond in my own time to 'tricky questions', whereas a face-to-face discussion with a large group might demand quick responses that I was not ready to make.

I came to understand my practitioner's identity in a STEM-centric faculty as both positive and negative. For example, I found myself refining a process while working with others and receiving external feedback that then required further refinement of the model. In many ways, I was struggling to reconcile how instructional change works as a general concept with a STEM-centric way of doing things. As a consequence, I was developing a new way of working in which I felt torn between supporting academics at a course level and supporting individuals without extensive knowledge of STEM subjects, because the majority of my opportunities to support STEM academics were one-on-one and/or just-in-time. Sunal et al. (2001) advocated that faculty members are interested in just-in-time approaches because these are more personal, noting that change is always difficult. As I became more confident in this area, "it is your thing. Can you do this more" (see Table 2), I felt I had finally turned a corner: I had something worth offering staff, compared to my original perspective, which was based on intuition, personal perceptions and feeling stuck while operating within a top-down decision making culture.

Conclusion
Working within a top-down decision making culture in a STEM learning and teaching environment was challenging. Incorporating a design-based, autoethnographic philosophy of practice, using three self-report methods within a design experiment, helped me to unpack the messiness of the process while the blended learning framework was being developed, used and improved. Additionally, the study's focus on process rather than product/artefact enabled me to understand academics' needs, such as blended learning strategies and tools that are useful, accessible and easy to implement, rather than relying on my personal perceptions, intuition and feelings of being stuck. The systematic approach to reviewing and analysing the literature kept me reflexive, enforcing a rigorous process that prevented me from arbitrarily choosing literature that suited my personal understandings and/or preferred theoretical underpinnings. The community of inquiry supported the discussion of ideas for improving blended learning at an individual level, such as how to better support STEM academics whom 'I' perceived as unengaged. As a consequence, I learnt to be more flexible through the feedback received from the community of inquiry, which was collaborative in nature: a supportive network where critical thoughts and ideas could be discussed honestly and openly. As Ruthven (2005) explains:
Stronger forms of collaboration are important in developmental research aiming to define good practice in teaching and learning because of the centrality of practitioner knowledge and thinking in realising such practice. (p. 424).
The personal 'I' story of an academic support person's experiences of artefact making and process is important, because it highlights the difficulties of pedagogical change in higher education institutions, and the importance of this work for academic support professionals, who often work for and within top-down decision making cultures. I have critically challenged myself in terms of verifying or contradicting intuitions and/or personal biases. In turn, I have discovered a number of key learning issues:
  1. Multiple underlying pressures exist for STEM academics that impact instructional change;
  2. STEM academics' working culture and the faculty's decision making culture impact the work of support professionals;
  3. It is important to find new ways to engage with the unengaged;
  4. My identity as a professional blended learning support academic impacts change; and
  5. In learning about and communicating how to institute a blended learning model, the process was just as important as the product.


I argue that the process narrative of the 'I' experience is important in design-based educational research for understanding the nature and support of instructional change. The use of an autoethnographic perspective via three self-report methods within a design experiment can empower such process stories, and help share knowledge through a dual focus on critical perspectives on change and the experience of the change agent. This study has allowed me to communicate with others working in similar contexts, offering a deeper perspective on the factors determining change, the pressures on academics, and the processes that can support them. It was a dual inward-outward alignment, which emphasised the relationship between the change agent and STEM academics. By sharing my story, I have gained insight into the best way to present the model when options were limited. I have also developed a blended learning philosophy for working with STEM academics within a top-down decision making model of operation, one which centres less on assumptions and more on flexibility, on understanding colleagues' requirements, and on sharing insights with others via the 'I' experience. This is a powerful way to investigate and report on process, which is just as important as showcasing an end product.

References
Anderson, L. (2006). Analytic autoethnography. Journal of Contemporary Ethnography, 35(4), 373-395. http://dx.doi.org/10.1177/0891241605280449

Bakker, A. (2004). Design research in statistics education: On symbolizing and computer tools. Utrecht, The Netherlands: CD-Beta Press.

Barab, S. & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1-14. http://dx.doi.org/10.1207/s15327809jls1301_1

Boud, D. J. (1986). Implementing student self assessment. Higher Education Research and Development Society of Australia Green Guide No. 5. Sydney: Higher Education Research and Development Society of Australia.

Bullock, S. M. & Ritter, J. K. (2011). Exploring the transition into academia through collaborative self-study. Studying Teacher Education, 7(2), 171-181. http://dx.doi.org/10.1080/17425964.2011.591173

Cohen, D. (1988). Teaching practice: Plus ça change. (Issue paper No. 88-3). East Lansing, MI: Michigan State University, The National Center for Research on Teacher Education.

Collins, A., Joseph, D. & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences, 13(1), 15-42.

Deakin University (2014a). LIVE the Future AGENDA 2020, Course Enhancement Guidelines, 2014. http://www.deakin.edu.au/__data/assets/pdf_file/0003/224193/Course-enhancement-guidelines-2014.pdf

Deakin University (2014b). Deakin Curriculum Framework. http://www.deakin.edu.au/learning/deakin-curriculum-framework

Dobson, I. A. (2014). Staffing university science in the twenty-first century. Australian Council of Deans of Science. Victoria: Educational Policy Institute Pty Ltd. http://www.acds.edu.au/wp-content/uploads/sites/6/2015/05/ACDS-Science-Staffing-2014_August_Final.pdf

Dyson, M. (2007). My story in a profession of stories: Auto ethnography - an empowering methodology for educators. Australian Journal of Teacher Education, 32(1). http://dx.doi.org/10.14221/ajte.2007v32n1.3

Fairweather, J. (2008). Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education. The National Academies National Research Council Board of Science Education. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072637.pdf

Fairweather, J. S. & Paulson, K. (2008). The evolution of American scientific fields: Disciplinary differences versus institutional isomorphism. In Cultural Perspectives on Higher Education, pp. 197-212. Springer Netherlands, 2008. http://www.springer.com/us/book/9781402066030

Finn, A. (2002). Trends in e-learning. Learning Circuits, 3. [viewed at http://www.learningcircuits.org/2002/nov2002/finn.htm; not found 14 Nov 2015, see http://www.cedma-europe.org/newsletter%20articles/TrainingZONE/Trends%20in%20e-Learning%20(26%20Jul%2004).pdf]

Fletcher, T. & Bullock, S. M. (2015). Reframing pedagogy while teaching about teaching online: A collaborative self-study. Professional Development in Education, 41(4), 690-706. http://dx.doi.org/10.1080/19415257.2014.938357

Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F. & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915-945. http://dx.doi.org/10.3102/00028312038004915

Guitart, D., Pickering, C. & Byrne, J. (2012). Past results and future directions in urban community gardens research. Urban Forestry & Urban Greening, 11(4), 364-373. http://dx.doi.org/10.1016/j.ufug.2012.06.007

Hains-Wesson, R. (2013). Why do you write? Creative writing and the reflective teacher. Higher Education Research & Development, 32(2), 328-331. http://dx.doi.org/10.1080/07294360.2013.770434

Hains-Wesson, R., Wakeling, L. & Aldred, P. (2014). A university-wide ePortfolio initiative at Federation University Australia: Software analysis, test-to-production, and evaluation phases. International Journal of ePortfolios, 4(2), 143-156. http://www.theijep.com/pdf/IJEP147.pdf

Hall, A. & Herrington, J. (2010). The development of social presence in online Arabic learning communities. Australasian Journal of Educational Technology, 26(7), 1012-1027. http://ajet.org.au/index.php/AJET/article/view/1031/292

Hamilton, M. L., Smith, L. & Worthington, K. (2008). Fitting the methodology with the research: An exploration of narrative, self-study and auto-ethnography. Studying Teacher Education, 4(1), 17-28. http://dx.doi.org/10.1080/17425960801976321

Hamilton, M. L. & Pinnegar, S. (1998). Conclusion: The value and the promise of self-study. In M. L. Hamilton (Ed.), Reconceptualising teaching practice: Self-study in teaching education (pp. 235-246). London: Falmer Press.

Herold, B. (2013). Benefits of online, face-to-face professional development similar, study finds. Education Week - Digital Education, 20 June. http://blogs.edweek.org/edweek/DigitalEducation/2013/06/no_difference_between_online_a.html

Hughes, S., Pennington, J. L. & Makris, S. (2012). Translating autoethnography across the AERA standards: Toward understanding autoethnographic scholarship as empirical research. Educational Researcher, 41(6), 209-219. http://dx.doi.org/10.3102/0013189X12442983

Jayarajah, K., Saat, M. R. & Rauf, A. A. (2014). A review of science, technology, engineering and mathematics (STEM) education research from 1999-2013: A Malaysian perspective. Eurasia Journal of Mathematics, Science and Technology Education, 10(3), 155-163. http://dx.doi.org/10.12973/eurasia.2014.1072a

Lefoe, G., Philip, R., O'Reilly, M. & Parrish, D. (2009). Sharing quality resources for teaching and learning: A peer review model for the ALTC Exchange in Australia. Australasian Journal of Educational Technology, 25(1), 45-59. http://ajet.org.au/index.php/AJET/article/view/1180/408

Lincoln, Y. S. & Denzin, N. K. (1994). The fifth moment. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 575-586). Thousand Oaks, CA: SAGE.

Lincoln, Y. S. & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park: SAGE.

Marginson, S., Tytler, R., Freeman, B. & Roberts, K. (2013). STEM: Country comparisons, International comparisons of science, technology, engineering and mathematics (STEM) education. Australian Council of Learned Academies. http://www.acola.org.au/index.php/projects/securing-australia-s-future/project-2

Merga, M. (2015). Thesis by publication in education: An autoethnographic perspective for educational researchers. Issues in Educational Research, 25(3), 291-308. http://www.iier.org.au/iier25/merga.html

Mooney, R. L. (1957). The researcher himself. In Research for curriculum improvement. Association for Supervision and Curriculum Development, 1957 yearbook (pp. 154-186). Washington, DC: Association for Supervision and Curriculum Development.

Ngunjiri, F. W., Hernandez, K. C. & Chang, H. (2010). Editorial: Living autoethnography: Connecting life and research. Journal of Research Practice, 6(1), Article E1. http://jrp.icaap.org/index.php/jrp/article/view/241/186

Nicholl, H. & Higgins, A. (2004). Reflection in preregistration nursing curricula. Journal of Advanced Nursing, 46(6), 578-585. http://dx.doi.org/10.1111/j.1365-2648.2004.03048.x

Nieveen, N. & Folmer, E. (2013). Formative evaluation in educational design research. In T. Plomp & N. Nieveen (Eds.), Educational design research - part A: An introduction (pp. 152-169). Enschede, the Netherlands: SLO.

Office of the Chief Scientist (2014). Science, technology, engineering and mathematics: Australia's future. Australian Government, Canberra. http://www.chiefscientist.gov.au/wp-content/uploads/STEM_AustraliasFuture_Sept2014_Web.pdf

Office of the Chief Scientist (2015). STEM skills in the workforce: What do employers want? Australian Government, Canberra. http://www.chiefscientist.gov.au/2015/04/occasional-paper-stem-skills-in-the-workforce-what-do-employers-want/

Opie, C. (2004). Doing educational research. USA: SAGE.

Penuel, W. R., Fishman, B. J., Yamaguchi, R. & Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Education Research Journal, 44(4), 921-958. http://dx.doi.org/10.3102/0002831207308221

Picciano, A. G. (2009). Blending with purpose: The multimodal model. Journal of the Research Center for Educational Technology (RCET), 5(1), 4-14. http://www.rcetj.org/index.php/rcetj/article/viewFile/11/14

Pickering, C. & Byrne, J. (2014). The benefits of publishing systematic quantitative literature reviews for PhD candidates and other early-career researchers. Higher Education Research & Development, 33(3), 534-548. http://dx.doi.org/10.1080/07294360.2013.841651

Prinsley, R. & Baranyai, K. (2015). STEM skills in the workforce: What do employers want? Occasional Papers, Issue 9, Office of the Chief Scientist, Australia. http://www.chiefscientist.gov.au/wp-content/uploads/OPS09_02Mar2015_Web.pdf

Reay, J. (2001). Blended learning - a fusion for the future. Knowledge Management Review, 4(3), 6. [viewed at http://www.melcrum.com/products/journals/kmr.shtml; not found 14 November 2015]

Reeves, T. C., McKenney, S. & Herrington, J. (2011). Publishing and perishing: The critical importance of educational design research. Australasian Journal of Educational Technology, 27(1), 55-65. http://ajet.org.au/index.php/AJET/article/view/982/255

Roy, S., Byrne, J. & Pickering, C. (2012). A systematic quantitative review of urban tree benefits, costs, and assessment methods across cities in different climatic zones. Urban Forestry & Urban Greening, 11(4), 351-363. http://dx.doi.org/10.1016/j.ufug.2012.06.006

Ruthven, K. (2005). Improving the development and warranting of good practice in teaching. Cambridge Journal of Education, 35(3), 407-426. http://dx.doi.org/10.1080/03057640500319081

Schön, D. (1983). The reflective practitioner. New York: Basic Books.

Slade, C., Murfin, K. & Readman, K. (2013). Evaluating processes and platforms for potential ePortfolio use: The role of the middle agent. International Journal of ePortfolio, 3(2), 177-188. http://www.theijep.com/pdf/IJEP114.pdf

Starr, L. J. (2010). The use of autoethnography in educational research: Locating who we are in what we do. Canadian Journal for New Scholars in Education, 3(1) June 2010. http://cjnse.journalhosting.ucalgary.ca/ojs2/index.php/cjnse/article/viewFile/149/112

Steven, R., Pickering, C. & Castley, J. G. (2011). A review of the impacts of nature based recreation on birds. Journal of Environmental Management, 92(10), 2287-2294. http://dx.doi.org/10.1016/j.jenvman.2011.05.005

Student Evaluation of Teaching and Units (SETU) (2013). Summary report for Trimester 3. Deakin University. https://www.deakin.edu.au/planning-unit/surveys/public/setu-report-tri3-12.pdf

Sunal, D. W., Hodges, J., Sunal, C. S., Whitaker, K. W., Freeman, L. M., Edwards, L., Johnston, R. A. & Odell, M. (2001). Teaching science in higher education: Faculty professional development and barriers to change. School Science and Mathematics, 101(5), 246-257. http://dx.doi.org/10.1111/j.1949-8594.2001.tb18027.x

The Australian Industry Group (2015). Progressing STEM skills in Australia. http://www.aigroup.com.au/portal/binary/com.epicentric.contentmanagement.servlet.ContentDelivery

The White House (2009). Press release: President Obama launches 'educate to innovate' campaign for excellence in science, technology, engineering and mathematics (STEM) education. http://www.whitehouse.gov/the-press-office/president-obama-launches-educate-innovate-campaign-excellence-science-technology-en

Torrisi-Steele, G. (2011). This thing called blended learning - a definition and planning approach. In Research and Development in Higher Education: Reshaping Higher Education, 34, 360-371 (Proceedings HERDSA 2011). http://www.herdsa.org.au/wp-content/uploads/conference/2011/papers/HERDSA_2011_Torrisi-Steele.PDF

University Experience Survey National Report (2014). Australian Government, Department of Education and Training. https://education.gov.au/university-experience-survey

Walter, M. M. (Ed.) (2011). Social research methods. Australia: Oxford University Press.

Watson, J. (2008). Blended learning: The convergence of online and face-to-face education. Vienna, VA: North American Council for Online Learning. http://files.eric.ed.gov/fulltext/ED509636.pdf

Wilson, S. M. (2013). Professional development for science teachers. Science, 340(6130), 310-313. http://dx.doi.org/10.1126/science.1230725

Wozniak, H., Pizzica, J. & Mahony, M. J. (2012). Design-based research principles for student orientation to online study: Capturing the lessons learnt. Australasian Journal of Educational Technology, 28(5), 896-911. http://ajet.org.au/index.php/AJET/article/download/823/120

Yarker, M. B. & Park, S. (2012). Analysis of teaching resources for implementing an interdisciplinary approach in the K-12 classroom. Eurasia Journal of Mathematics, Science and Technology Education, 8(4), 223-232. http://www.ejmste.com/v8n4/eurasia_v8n4_yarker.pdf

Zeichner, K. & Noffke, S. (2001). Practitioner research. In V. Richardson (Ed.), Handbook of research on teaching (4th edn). Washington, DC: American Educational Research Association, 298-330.

Zellermayer, M. & Tabak, E. (2006). Knowledge construction in a teachers' community of enquiry: A possible road map. Teachers and Teaching: Theory and Practice, 12(1), 33-49. http://dx.doi.org/10.1080/13450600500364562

Authors: Dr Rachael Hains-Wesson is a Senior Lecturer in Work-Integrated Learning, Student Advancement at Swinburne University of Technology. She specialises in the following teaching and research areas: blended learning for science, technology, engineering and mathematics (STEM), reflective practice, work-integrated learning, creative and performance writing. Rachael has received numerous accolades for her work in the areas of work-integrated learning, performance arts, learning and teaching.
Email: rhainswesson@swin.edu.au Web: http://www.rachaelhainswesson.com

Russell Tytler is Professor of Science Education at Deakin University. He has researched and written extensively on student learning and reasoning in science, and science investigations. His interest in the role of representation in reasoning and learning in science extends to pedagogy and teacher and school change. He researches and writes on student engagement with science and mathematics, and STEM policy.
Email: russell.tytler@deakin.edu.au Web: http://www.deakin.edu.au/profiles/russell-tytler

Please cite as: Hains-Wesson, R. & Tytler, R. (2015). A perspective on supporting STEM academics with blended learning at an Australian university. Issues in Educational Research, 25(4), 460-479. http://www.iier.org.au/iier25/hains-wesson.html

© 2015 Issues In Educational Research. This URL: http://www.iier.org.au/iier25/hains-wesson.html