Critically evaluate the degree to which their own professional practice is learner-centered and the impact that the practice has on student learning
In the past I have not requested feedback for my modules directly, though I have sought general feedback as a course coordinator at programme level (which did give me some feedback on my own modules). I also gathered some feedback indirectly from student reflections, e.g. which learning resources they liked and which topics they found difficult. Unfortunately, with Blackboard modules being wiped after two years, a lot of valuable data has been lost, but over the years this feedback has given me a general sense of what works for students.
There are three main approaches to evaluating one's practice: student surveys, self-evaluations and peer evaluations. Stephen Brookfield (2002) lists these along with a fourth, theoretical literature. The latter requires one to be widely read in educational literature and, I would argue, to participate in continuing professional development, which is the main reason for taking this SPA: to challenge what I think I know and to expand on what I know. For the Cloud Software Engineering module, I carried out a mid-term student survey and followed it up with a self-evaluation. Peer evaluations are not the norm in CIT, though with online learning it would be possible to have another lecturer view a sample of online lectures and provide feedback.
For the 10-credit Cloud Software Engineering module, I sent a short mid-term evaluation form to the remaining 13 active students, asking three simple questions:
- What I particularly like about this module is: ____________
- What I particularly dislike about this module is: ____________
- If I could change one thing about the module, it would be: __________
I used Google Forms for this and received 9 responses, which pleased me considering response rates for surveys tend to be much lower. The responses and a thematic analysis are presented in Appendix A below. There was just enough data to draw some conclusions and raise follow-up questions for me. I performed a content analysis using the approach outlined in Graneheim and Lundman (2004); the terminology emphasised hereafter is defined fully in that article, including synonyms. I examined the text of the student responses for both manifest content and latent content: the former being what is obvious and the latter what is inferred. There is a level of interpretation in both, but more with the latter. I searched the content for meaning units (words or phrases relating to a central meaning), condensing where necessary, and then abstracted the higher-order headings: codes, categories and themes.
I began my analysis by breaking down the responses into meaning units, then condensed those to make coding easier. I then grouped the codes under categories and from those categories emerged three themes:
- Online experience
- Learning experience
- Instructional design
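The coding pipeline described above (meaning units condensed to codes, codes grouped into categories, categories abstracted into themes) can be sketched as a short script. The sample units and the particular code-to-category and category-to-theme mappings below are illustrative fragments only, not the full analysis from Appendix A.

```python
# Minimal sketch of the content-analysis pipeline from Graneheim and
# Lundman (2004): condensed meaning units carry codes, codes roll up
# into categories, and categories roll up into themes. The sample data
# and mappings here are illustrative, not the complete dataset.
from collections import defaultdict

# (condensed meaning unit, code) pairs drawn from the survey responses
coded_units = [
    ("It's very practical", "Practical"),
    ("Tools learned speed up development", "Development Speed"),
    ("Too much covered in the time allowed", "Information overload"),
    ("Unable to take part live due to commitments", "Isolation"),
]

# illustrative mappings; the real groupings emerged during analysis
code_to_category = {
    "Practical": "Practicality",
    "Development Speed": "Relevance to industry",
    "Information overload": "Workload",
    "Isolation": "Online group work challenges",
}
category_to_theme = {
    "Practicality": "Learning experience",
    "Relevance to industry": "Learning experience",
    "Workload": "Instructional design",
    "Online group work challenges": "Online experience",
}

# group every coded unit under its category, and each category under its theme
themes = defaultdict(lambda: defaultdict(list))
for unit, code in coded_units:
    category = code_to_category[code]
    themes[category_to_theme[category]][category].append(unit)

for theme, categories in themes.items():
    print(theme)
    for category, units in categories.items():
        print(f"  {category}: {len(units)} unit(s)")
```

Grouping this way makes the abstraction step auditable: each theme can be traced back through its categories to the verbatim student responses.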
The feedback was largely positive from several perspectives related to student-centred learning, with the negative feedback centred on information overload. By and large, students find the module very relevant to industry, with a wide range of topics delivered in an integrated way. The instruction is clear and the experience of the lecturer shows, which backs up my cognitive apprenticeship approach. The materials, such as tutorials, exercises and videos, are very useful, but students would like to see more of them. The comments indicated that my approach to scaffolding (as Vygotsky would put it, discussed in Berk and Winsler, 1995) some of the active learning (e.g. step-by-step tutorials followed by an exercise) was successful, allowing students to gain a foothold in autonomy (it emphasises student independence, as Boud, 2012 puts it).

The online experience was negative for some students, with feelings of isolation and occasional technical challenges, such as scheduling group interactions. Garrison (2011) writes about the opportunities inherent in e-learning platforms for a rich community of learners, but despite social media groups with Google+, synchronous Adobe Connect sessions with chat, and group project tools like Trello and Google Hangouts, there is room for improvement, and the suggestion of richer online interaction (e.g. using voice and webcams) warrants further investigation. A large percentage of the students reported that there were too many topics to cover in such a short space of time, while others pointed to the range of topics as a positive. It would seem a middle ground needs to be found where a wide range of relevant topics can be presented while slowing the pace and allowing even more time for active learning.
The latter point, information overload, is one I have been aware of for some time, but there is a factor playing into it: the Springboard+ programme for reskilling mature students demands that courses are relevant to industry and intensive (almost an extreme apprenticeship) so that graduates can quickly be assimilated into the workplace. However, my philosophical position is that I should look after the mental health of my students, or at least play my part in not over-stressing them. With that in mind, the main takeaway from this evaluation is to reduce the amount of content, to increase active learning through more activities like the GitHub collaboration exercise, and to make even more use of audience response with PollEV to revise before moving on. Having said that, dropout rates are down on previous years and the quality of assessment submissions is at least on a par with previous years, with some students excelling; so while students report being overloaded, on average they are able to submit their assignments on time and to a good standard.
For the self-evaluation, I chose a method that focuses not just on providing metrics or a simple score, but also prompts reflection. One such method is provided by Trigwell and Prosser (1999), which attempts to provide a link between teachers' approaches to teaching and students' approaches to learning. Their approach is based heavily on the research of Marton and Saljo (1976), Biggs (1978) and Ramsden (1983) from the perspective of deep versus surface learning (all quoted in Trigwell et al., 1999). In addition, they underpin their approach in terms of the links between teaching and learning with reference to Ramsden (1992), which suggests that student perceptions of learning are based on the environment they learn in: a deep approach is associated with perceptions of high-quality teaching. Trigwell et al. (1994) identified five approaches to teaching, ranging from approach A, the most teacher-focused (pure transmission), to approach E, the most student-focused (where students question what they learn, build self-awareness, etc.). All of this leads to Keith Trigwell's questionnaire, the 'Approaches to Teaching Inventory'.
The questionnaire has a number of sections: a series of twenty-two questions with Likert-scale answers, a scoring table, a simple bar graph, and a number of questions that prompt reflection on the results. I think the questions are best considered alongside the results of a student survey, as I have done, to see if there is a correlation between what I think and what the students are telling me. A scan of the questionnaire (which I filled in by hand) is in Appendix B.
When it came to answering the twenty-two questions, I found myself reluctant to provide any scores of 1 or 5, so all my responses were in the range 2 to 4. However, when I totted up the scores and shaded in the bar graph, there was a clear difference between teacher-focused and student-focused approaches. I do not think the inventory is intended to be completely accurate; rather, it enables a side-by-side comparison to see whether one is more student-centred than teacher-centred. My results indicated that I am more student- than teacher-focused (somewhere between D and E, perhaps), which is not surprising when looking at the results of the mid-term evaluation: on the one hand I tended to overload the students with too much information (somewhat teacher-focused), but I more than counterbalanced that with student-centred resources, choice of learning methods, the imparting of teacher experience to make what is taught relevant and engaging, and so on.
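The scoring step amounts to summing the Likert ratings for the teacher-focused items and the student-focused items separately and comparing the two totals. A minimal sketch follows; the item numbers assigned to each subscale and the sample ratings are hypothetical placeholders, not the inventory's published scoring key.

```python
# Sketch of scoring the Approaches to Teaching Inventory: each of the
# twenty-two items is rated 1-5 on a Likert scale, and each item belongs
# to either the teacher-focused or the student-focused subscale. The
# item-to-subscale assignment below is illustrative, not the real key.
def score_ati(responses, teacher_items, student_items):
    """Return (teacher_total, student_total) from {item: rating} responses."""
    teacher_total = sum(responses[i] for i in teacher_items)
    student_total = sum(responses[i] for i in student_items)
    return teacher_total, student_total

# illustrative responses kept in the 2-4 range, as in my own inventory
responses = {i: 3 for i in range(1, 23)}
for i in (2, 6, 10, 14):   # hypothetical student-focused items, rated higher
    responses[i] = 4
for i in (1, 5, 9, 13):    # hypothetical teacher-focused items, rated lower
    responses[i] = 2

teacher, student = score_ati(responses,
                             teacher_items=(1, 5, 9, 13),
                             student_items=(2, 6, 10, 14))
print(f"teacher-focused: {teacher}, student-focused: {student}")
```

The comparison of the two totals, rather than either absolute number, is what suggests whether one's approach leans student-centred or teacher-centred.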
The results, I think, show that I am primarily student-focused in my teaching, with evidence from the students to back that up, and the self-evaluation shows my preference for how I should teach a technical, hands-on subject like Cloud Software Engineering. It does point to areas for improvement, such as slowing the delivery of content to allow for more discussion. As Prosser et al. suggest, the results should be viewed in their context: this was a busy time with my PhD, this SPA and other draws on my time; with more time to dedicate, I am sure I would have done better.
Graneheim, U.H., Lundman, B., 2004. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Education Today 24, 105–112.
Appendix A – Midterm Evaluation and Thematic Analysis
Like:

| Meaning Unit | Condensed Meaning Unit | Code |
| --- | --- | --- |
| It’s very practical. | It’s very practical | Practical |
| The tools used and learned enabled us to create things relatively easily at a fast pace. | Tools learned speed up development | Development Speed |
| It is very interesting, seeing how everything is coming together. | Everything comes together | Integration |
| It is very hands on | Hands on and applied | Practical |
| we apply what we learn. | Apply what we learn | Applied |
| How up-to-date and relevant the content is. Also, the right mix of coding and development tools/practices. | Up-to-date and relevant | Relevant |
| Clear explanations in the lectures and good supporting material. | Clear explanations and good materials | Clarity |
| Tutorials – practical learning | Tutorials – practical learning | Practical |
| Although I’m new to programming, the demos are well explained and a great help in learning how to go about writing my own programs. Larkin is obviously a well-experienced programmer himself and this come across in his explanations | Clear demos by experienced practitioner | Clarity |
| When we are provided with a sample code example and learn how to implement particular elements into the code already provided. I feel it gives a great template to work from, and provides reference material for when we are attempting our own projects. | Building on existing code / template, which gives a good start to projects. | Scaffolding |
| I also think small scale video tutorials are incredibly useful. | Video tutorials are useful | Video |
| The huge scope of information and different topics that are covered, and the fact that it genuinely prepares students for the real world of software engineering | Wide range of topics that are linked to industry | Range |
Dislike:

| Meaning Unit | Condensed Meaning Unit | Code |
| --- | --- | --- |
| Some of the provided documentation was slightly out of date and required investigation of previous software versions and other extra activity to get working. | Some out of date documentation wasted time | Out of date documentation |
| I find it difficult at times not having the lecturer or even other students…in front of me to just bounce an idea or sometimes just having a watchful eye over the code when its being created just to say yes that seems ok or try it this way or just to say no that is not going to work | Isolation makes learner unsure if going in right direction | Isolation |
| … Interaction online is slow, because by the time I’ve formed the question and typed it I’ve missed the point I was thinking I had. | Slow response online | Slow latency |
| Lots of information to cover. | Lots of information to cover. | Information overload |
| I understand there is a lot to cover, but I found the pace a little fast – moving on to the next topic before I fully understood the first was like playing catch-up each week. | Too much information too fast | Information overload |
| Also, having a lot of links in the lecture notes is great for supporting information, but please bear in mind the time it takes to read and process all of that information when looking at workload hours/week! | Not enough time to do directed research | Not enough time |
| Too many new concepts/technologies in short space of time | Too many concepts | Information overload |
| Working full-time in a position where I work mainly evenings and anti-social hours in general it is difficult to watch lectures live and to interact on group projects. I usually work more on my own and tie up with team members through social media means (google+ and google hangouts). It is often difficult to keep up to date, but I don’t think there is an easy solution to this. | Unable to take part live due to commitments leads to isolation. | Isolation |
| Too much covered and not enough time to take it in sufficiently to understand. As one concept is almost understood the next is upon us and that prior concept is lost. | Too much covered in the time allowed | Information overload |
| By the nature of the reskilling course it is incredibly dense with material to unpack. I feel like it would be easy to fall behind. | Very dense course | Information overload |
| Quicker turn around with assessment results would also help to show the student that they are headed in the right direction. | Would like faster feedback | Slow feedback |
| Again probably the size of the scope. It is a part time course and I’m finding it difficult to keep up with the sheer amount of new information we need to learn. | Scope of material makes it difficult to keep up | Information overload |
Change one thing:

| Meaning Unit | Condensed Meaning Unit | Code |
| --- | --- | --- |
| Just updating the extra documentation provided. | Bring documentation up to date | Out of date documentation (same issue being raised by same person, so don’t count again) |
| Have all the students online to video chat so that better interaction can take place. I know this may cause band width problems but I think it would help when Someone wanted to say something rather than waiting endlessly for the typing to stop. before getting an answer back. In an Ideal world where fibre broadband means fast internet connection… | Allow richer student interaction online | Richer interaction |
| More tutorials. Found them very helpful. | More tutorials | Tutorials |
| For the coding, to have a short weekly lab exercise/worked example to cement the concepts before moving on the following week. The Github examples are very useful, but I find that having to build something myself helps me to understand it better. | Exercise at week end to help understand before moving on | More exercises |
| time to complete course | time to complete course | Extend time to complete |
| Maybe run over a slightly longer timescale in a less intense format | Longer, less intense timeframe | Extend time to complete |
| More practical coding examples for each new topic with different use case examples too | More examples | More examples |
| I don’t think the group project can work well in our situation with everyone on different schedules. | Group projects difficult for dispersed students | Online group work challenges |
| I would like more example code bases(there are already a lot of information provided, I am merely picking this as I think it is a great asset and I would enjoy even more of these) and lots of individual tutorial videos | More example code and more tutorial videos | More examples |
| The granular dissection of lectures based on topic is also an excellent resource. | Breaking up lecture videos is excellent | Navigation |
| If the volume of work could be slightly reduced, maybe the fast pace of the course could be slowed down even a fraction to give us time to catch up with everything it would make the course a lot more manageable time wise, but otherwise it is an excellent course with some great content and great lecturers | Slowing down the pace of delivery would help | Information overload |
The codes above were grouped under the following categories:

- Practicality
- Relevance to industry
- Assistance to student learning
- Workload
- Feedback
- Flipped classroom
- Presentation
- Quality
- Active Learning