
Course QA cycle


The Course QA Cycle comes to life through the systematic iteration of the four PDCA (Plan-Do-Check-Act) steps.

[Figure: the Course QA Cycle and its inputs and outputs — Course Improvement Plan, Course Development Roadmap, Pedagogical Model, Accessibility and Usability Guidelines, Trainings and Course Creation Resources, Platform Data, Learners' Survey Data, Course, Moderation Feedback Log, Assessment Results, Complaints Log, Course Evaluation Report/Survey, Course Evaluation Meeting, Feedback from Instructors and Learning Developers]

Plan

Ensure that our approach to course and curriculum design and development is carried out across our entire portfolio using pertinent course development processes, training and resources. Our approach follows our unique pedagogical model, based on the Online Learning Experience Principles, a learner-centred approach, and open practices. Learning Developers are responsible for providing Course Teams with the necessary guidance to adhere to the Quality Standards by following quality control checks throughout the course creation process.

Do

Staff need to have access to resources and complete the necessary training before they can start any moderation activity as part of course delivery, in compliance with the relevant quality standards. Moderators monitor activity in the courses and resolve or escalate any issues that may arise and hinder learners from completing the course.

Check

Conduct a course evaluation at the conclusion of the 1st and 2nd runs of every course for continuous improvement. After the first two runs, courses will be evaluated at regular intervals of three runs (see figure on the next page), taking into consideration the five perspectives identified in the Portfolio Review.

Participants involved in the evaluations are:

• Course Team

• Faculty Coordinator

• Marketer, Learning Developer, and Portfolio & Product Manager.

The evaluations take the following input into account:

• course evaluation report (for the 1st runs of MOOCs and for Programmes) or learner survey results (for 2nd runs, ProfEds, and further runs)

• moderation feedback

• learning developer feedback

• instructor feedback.

Part E contains a complete list of these evaluation instruments.

Utilise a whole range of learner data gathered through the completion of pre- and post-course surveys, aggregated platform data and activity, course assessment results, and learner feedback. The Data Analyst in the Learning Technology Team collects and analyses the data to inform the full evaluation that takes place once the course has concluded.
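As a minimal sketch of what this aggregation step might look like — the field names (`completed`, `score`, `satisfaction`) are illustrative assumptions, not the platform's actual schema:

```python
# Illustrative only: aggregate hypothetical learner records into the kind of
# summary metrics a Data Analyst might prepare for a course evaluation.

def evaluation_summary(learners):
    """Compute completion rate, mean assessment score (among completers),
    and mean post-survey satisfaction from a list of learner records."""
    total = len(learners)
    completed = [l for l in learners if l["completed"]]
    scores = [l["score"] for l in completed]
    satisfaction = [
        l["satisfaction"] for l in learners
        if l.get("satisfaction") is not None
    ]
    return {
        "completion_rate": len(completed) / total if total else 0.0,
        "mean_score": sum(scores) / len(scores) if scores else None,
        "mean_satisfaction": (
            sum(satisfaction) / len(satisfaction) if satisfaction else None
        ),
    }
```

In practice the inputs would come from the platform data exports and survey tools named above; the point is only that several separate data sources are reduced to a small set of comparable indicators before the evaluation meeting.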

Produce and share a course improvement plan that includes considerations based on the evaluation. The improvement plan may refer, as necessary, to the:

• didactic quality of the course

• value proposition/About Page, promotional activities

• course delivery methods (instructor-paced vs self-paced, moderation by Learning Experience Team, trainings necessary for TA)

• list of actions for improving the course before the next run.

Evaluation cycle of individual courses

Evaluations will be planned at regular intervals of three runs. This allows us to systematically monitor course quality and learners' satisfaction.
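The schedule described above — a full evaluation after each of the first two runs, then at regular intervals of three runs — can be sketched as a simple rule (the function name is our own, purely for illustration):

```python
# Illustrative sketch of the evaluation schedule: evaluate after the
# 1st and 2nd runs, then every third run thereafter (runs 5, 8, 11, ...).

def is_evaluation_run(run_number: int) -> bool:
    """Return True if a course evaluation is due after this run."""
    if run_number in (1, 2):
        return True  # evaluation after each of the first two runs
    return run_number > 2 and (run_number - 2) % 3 == 0

evaluated = [n for n in range(1, 12) if is_evaluation_run(n)]
# runs 1, 2, 5, 8, and 11 trigger an evaluation
```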

Course evaluation process

The teams in the figure below are involved in the various phases of the evaluation process and are responsible for defining and implementing the course improvements.
