Drupal data migration

Code is like fish. Data is like wine.

Over time code will go off. It needs to be maintained, updated, tweaked, refactored. Sometimes it is thrown out and you start again.

Data, on the other hand, gets better with age. Or at least it should. Over time it can be cleaned, augmented, annotated, indexed and linked. All of these activities increase the utility of the data and its value to your organisation and stakeholders. Data represents the intellectual property of your organisation and the efforts of your people. If your data isn’t getting better, then your business is going backwards.

Deciding to migrate your site to Drupal is a big move which can make life easier for you from a code perspective. You gain access to a mature, successful open source codebase with many modules and developers contributing to its ongoing development. The Drupal community ensures that your codebase won’t start to stink over time.

What about your data though? There are many questions to consider:

  • How will it be migrated out of that legacy system?
  • How can I migrate my users, all of whom have hashed passwords?
  • What data structures will best support it inside Drupal?
  • What features of Drupal could be used to add value to the data?
  • Are there new ways for users to retrieve, reference and search for data?
  • How does this relate to SEO, shareability, linked data and the semantic web?

The answers to each of these questions will influence the success of your migration project.

The Drupal Data Model

Drupal has very flexible internal data structures which can handle a wide range of data modelling requirements. There are built-in structures for users, taxonomy terms, files and nodes. Nodes can be subtyped into familiar content types such as articles, pages, people, places, organisations, events, slideshows and galleries, and they can be extended further to handle any other content you may wish to model. Drupal also supports a more general Entity system which can be used for other data-like structures. The Field API can be used to easily add properties to all of these types.
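
To give a flavour of the Field API, here is a minimal sketch, assuming Drupal 7; the field name and bundle are illustrative:

    <?php
    // Define the field itself: a single-value plain text field.
    $field = array(
      'field_name' => 'field_subtitle',
      'type' => 'text',
      'cardinality' => 1,
    );
    field_create_field($field);

    // Attach an instance of the field to the article node type.
    $instance = array(
      'field_name' => 'field_subtitle',
      'entity_type' => 'node',
      'bundle' => 'article',
      'label' => 'Subtitle',
      'widget' => array('type' => 'text_textfield'),
    );
    field_create_instance($instance);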

All of this structure, with the exception of entities, can be easily built out through the Drupal UI and then captured as code. The hard part is working out what you want to model and how you want to model it. Along the way you need to make modelling decisions based on a knowledge of the APIs and what works well in practice, with an eye on what future requirements may be. This takes experience and know-how.

The Migration Process

There are a number of ways to populate data in Drupal. Besides doing it manually, automated options include using:

  • the Feeds module to consume data
  • custom scripts to create content using Drupal’s APIs directly
  • custom SQL scripts to write data to the database directly
  • the Migrate module, which handles data consumption, manipulation, identity management, job queuing and rollback.

Each of these options has its respective pros and cons. We believe that the best all-round performer is the Migrate module: it has all of the bases covered. Using it effectively requires a good knowledge of Drupal, PHP and object oriented (OO) principles, but once mastered it offers the most flexible, well documented and repeatable process for importing data into Drupal.
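
To give a sense of the shape of the work, here is a minimal sketch of a migration class, assuming Drupal 7 and the Migrate 2.x API; the class name, CSV path, columns and content type are all illustrative:

    <?php
    /**
     * Imports articles from a legacy CSV export into article nodes.
     */
    class LegacyArticleMigration extends Migration {

      public function __construct($arguments) {
        parent::__construct($arguments);
        $this->description = t('Import legacy articles from a CSV export.');

        // Track legacy IDs against Drupal node IDs so the migration
        // can be re-run, updated and rolled back.
        $this->map = new MigrateSQLMap($this->machineName,
          array('legacy_id' => array('type' => 'int', 'not null' => TRUE)),
          MigrateDestinationNode::getKeySchema()
        );

        // Source: the columns of the legacy CSV export.
        $columns = array(
          array('legacy_id', 'Legacy ID'),
          array('title', 'Title'),
          array('body', 'Body'),
        );
        $this->source = new MigrateSourceCSV('/path/to/articles.csv',
          $columns, array('header_rows' => 1));

        // Destination: nodes of type article.
        $this->destination = new MigrateDestinationNode('article');

        // Map source columns onto node properties.
        $this->addFieldMapping('title', 'title');
        $this->addFieldMapping('body', 'body');
        $this->addFieldMapping('uid')->defaultValue(1);
      }
    }

Migrations like this are registered through hook_migrate_api() and run with drush. The same machinery covers the hashed password question above: the module’s user destination, MigrateDestinationUser, can accept legacy MD5 hashes via its md5_passwords option and rehash them into Drupal’s own format on import.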

Morpht Expertise in Data Migration

At Morpht we specialise in data migration for Drupal. We work with other agencies and clients directly to migrate legacy data into Drupal. We are able to work with data from a variety of sources, including SQL databases and CSV files. We have successfully worked on several large-scale migrations into Drupal.

Marji Cermak is a programmer and system administrator with several years of database design and administration under his belt. In the past he has worked on commercial migrations for online bookstores and other clients.

Murray Woodman is an information architect and all round Drupal code monkey who has worked with data in a variety of capacities. For fun one day he decided to import a semantic version of Wikipedia into Drupal. Thirteen million nodes and hundreds of millions of relationships later, he had a Drupal system complete with faceted search across a number of languages.

We love working with data and making it work well with Drupal.

How do we work?

The initial stages of an engagement involve an examination of the legacy data. This includes an audit of the data’s structure and features, as well as any work needed to clean or de-duplicate the objects.

You may already have a Drupal site with a target data model defined; you may not. Generally we prefer to be involved with the planning and building of the initial site to ensure that migration is as smooth as possible.

Once these issues have been worked through we are able to start coding and running migrations. This can take days or weeks depending on the complexity and amount of data to be converted. Once completed, the data is made available for review on an external server. Any changes can then be fed back into the process until the data has been successfully converted. We then hand over the code, database and files containing the migrated data.
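
In practice the review loop is driven from the command line. Assuming the Migrate module and the migration sketched above, an iteration looks something like:

    drush migrate-status                  # list migrations and their progress
    drush migrate-import LegacyArticle    # run the migration
    drush migrate-rollback LegacyArticle  # undo it, adjust the code, run again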

Please get in touch with us to discuss your data migration needs. 
