When MOOCs (massive open online courses) exploded in 2012 they promised to disrupt the higher education sector by giving students the freedom to learn what they wanted, when they wanted, for free. Yet they have failed to make the kind of revolutionary impact needed to seriously challenge traditional bricks-and-mortar institutions.
Why is this?
Part of the problem is that most MOOC users are not looking to replace a traditional university experience with an online DIY education: only 15% of Coursera learners are college-aged, and 77% already have degrees. On edX the proportion of students with degrees is 72%.
Another problem is low completion rates, which average between 10% and 15%. This has led Udacity to pivot away from providing undergraduate courses towards corporate training, since employees are much more incentivised to complete a course that is explicitly connected with career advancement.
But perhaps the biggest problem is that MOOCs have ended up reinforcing a very traditional model of higher education: in most cases an “expert” instructor imparts their specialist knowledge to a large cohort of students, who then take tests or write papers to prove they have sufficiently learnt the content. Furthermore, the partnerships between several major universities and a range of online platforms have not diminished the value of a university degree, but have instead strengthened the perceived importance of gaining a widely recognised credential from a prestigious institution.