WikiToLearn migration, why?

Well, currently WikiToLearn runs on MediaWiki, which is a good model for an encyclopedia but doesn’t fit well when you are trying to build more structured content.

For release 1.0 we developed CourseEditor, which tries to bring more structure to the content, for example by offering a drag-and-drop UI to manage a course structure.

However, this isn’t enough: the issue with the MediaWiki data structure is that versioning happens only at the single-page level. That is a good design when there is little to no need for unambiguous references between pages, but it becomes a very big deal when you are talking about a course.
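To make the problem concrete, here is a minimal sketch (the structure and field names are hypothetical, not the actual WikiToLearn schema) of what course-wide versioning could look like: a course revision that pins each chapter to a specific page revision, something MediaWiki’s per-page history has no place for.

```python
# Hypothetical sketch: a single course revision pinning each chapter to a
# specific page revision, so references inside the course stay unambiguous.
course_revision = {
    "course": "Linear Algebra",
    "version": 12,                       # course-wide version, not per page
    "chapters": [
        {"page": "Vector Spaces", "page_revision": 381},
        {"page": "Linear Maps",   "page_revision": 407},
    ],
}
# MediaWiki only keeps the per-page histories: there is no object tying
# "version 12 of the course" to a consistent set of page revisions.
```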

This is the main reason why we had to think about something new.

Since we need something new, we have the opportunity to build something that makes it easier to add features, functions and capabilities, and to try new things without worrying too much about the whole stack.

Thanks to the generous support of GARR, we have access to a lot of computing power, not only in terms of raw CPU/RAM/storage/network but also in terms of the number of servers (or VMs). This means we can build something distributed, and therefore fault-tolerant and resilient.

The first thing that comes to mind in this scenario is: microservices! Microservices everywhere!

With containers, containers all around the place!

Yes, microservices: not in the “put a LAMP stack in Docker” way of doing it, but with a properly layered design.

MediaWiki is not designed with microservices in mind, and we already have MediaWiki running inside Docker, so we need to design our platform from scratch.

The first step for this ambitious project is the storage: we have to store user data in a safe way, and this is the critical service, the one we can’t get wrong: there is no second chance here.

Right now we are testing Eve to store everything in an object storage exposed through a RESTful API; the backend at the moment is MongoDB (with replication).
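To give an idea of why Eve is attractive here, this is a minimal sketch (resource and field names are made up, not our actual configuration) of how Eve exposes a MongoDB collection as a RESTful resource:

```python
# run.py -- minimal Eve sketch (resource and field names are hypothetical,
# not the actual WikiToLearn schema)
from eve import Eve

settings = {
    "MONGO_URI": "mongodb://localhost:27017/wtl",   # replicated in production
    "RESOURCE_METHODS": ["GET", "POST"],
    "ITEM_METHODS": ["GET", "PUT", "PATCH", "DELETE"],
    "DOMAIN": {
        "pages": {
            "schema": {
                "title": {"type": "string", "required": True},
                "content": {"type": "string"},
            }
        }
    },
}

app = Eve(settings=settings)

if __name__ == "__main__":
    app.run()
```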

Now the really big question is: how do we transform the MediaWiki data structure into something with rigid internal references and course-wide versioning?

For this we used MongoDB as temporary storage to work on the data, processing every page to find and resolve every reference.
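As a rough illustration of that pass (a sketch only: collection and field names are hypothetical, and the real migration handles far more cases), the idea is to scan the wikitext of every imported page for internal links and store the resolved targets next to the page:

```python
# Hypothetical sketch of the reference-resolution pass (collection and
# field names are made up, not the actual migration code).
import re
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["wtl_migration"]
WIKILINK = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]")  # [[Target|label]]

for page in db.raw_pages.find():
    # Find every internal [[link]] in this page's wikitext.
    targets = set(WIKILINK.findall(page["wikitext"]))
    resolved = []
    for title in targets:
        target = db.raw_pages.find_one({"title": title})
        if target is not None:
            resolved.append({"title": title, "page_id": target["_id"]})
    # Store the resolved references alongside the page for the next step
    # (building courses with course-wide versioning).
    db.raw_pages.update_one(
        {"_id": page["_id"]}, {"$set": {"references": resolved}}
    )
```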

The migration is now working quite well; it’s not done yet, but we are confident we can pull off the magic trick very soon.

Bye!