Transmogrify Your Content Migration

We are currently in the process of building a new website in Plone 5. The current site is running on Plone 4.3, but it was originally built in Plone 2 and upgraded several times along the way. Instead of migrating the site yet again, we've decided to start fresh with a new buildout, a new theme, and mostly new content. While much of the site content will be created from scratch, we still want to keep the existing news items and blog posts and turn them into Dexterity items. Hence the need for a Transmogrifier migration...

The Mighty Transmogrifier

In the Calvin and Hobbes universe, the Transmogrifier is an invention of Calvin's that turns one thing into another. Similarly, collective.transmogrifier lets you build pipelines that turn one kind of content into another, which makes content import and export far easier. This is a life saver when you need to migrate content from one site to another, regardless of the CMS platform.

A Simple Content Export Process

Exporting content from the old site turned out to be quite easy with collective.jsonify. Full instructions and options are clearly detailed on that site, but the basic process for exporting all content is to:

1. Install collective.jsonify into the buildout of the old site.

2. Add an External Method at the root of the site with the following properties:

  • id: export_content
  • module name: collective.jsonify.json_methods
  • function name: export_content

3. Go to http://[your site]/export_content.

4. Check the instance log output to see where the export wrote the content (mine went into /tmp).

5. Copy the numbered folders from the export into the new buildout, into a folder at the root called content-import, and add that folder to your .gitignore. (Each exported item is a JSON file along the lines of the sample shown below.)
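For reference, each exported object ends up as its own JSON file inside those numbered folders. The exact keys depend on the content type and your export, but a News Item looks roughly like the following (the values here are made up for illustration; the keys shown are the ones the import pipeline below relies on, such as _type, _id, _path, and _uid):

{
  "_type": "News Item",
  "_id": "some-announcement",
  "_path": "/SFUPSite/company/news/news/some-announcement",
  "_uid": "...",
  "title": "Some Announcement",
  "description": "A short summary.",
  "text": "<p>The body text...</p>",
  "_datafield_image": { ... }
}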

Write a Content Import Pipeline

Now you are ready to start importing the content into the new site. This part requires writing a pipeline that determines how to handle the content coming into the new site. For this we recommend creating a custom package just for the migration code, which makes it easy to remove once everything has been migrated.

First, install the necessary add-ons into the buildout: you will need collective.jsonmigrator and transmogrify.dexterity. You may also need to pin zope.app.container below 4.0, because the 4.0 release requires newer Zope packages than are pinned for Plone 5.1.
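As a sketch, the relevant parts of the buildout might look like this, assuming your instance part picks up ${buildout:eggs}; verify the exact zope.app.container release to pin against PyPI, as 3.9.2 is only an example of a pre-4.0 version:

[buildout]
# these add-ons go into the new Plone 5.1 buildout;
# collective.jsonify only needs to live in the old site's buildout
eggs +=
    collective.jsonmigrator
    transmogrify.dexterity

[versions]
# keep zope.app.container below 4.0 for compatibility with the Plone 5.1 pins
zope.app.container = 3.9.2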

Use mr.bob to create a custom package. In a virtualenv:

$ pip install mr.bob
$ pip install bobtemplates.plone
$ mrbob -O yourproject.migration bobtemplates:plone_addon

This will ask a series of questions. Create a 'Basic' add-on, and supply the author information.

Below is part of our import_content.cfg. Each section names the blueprint to use and provides the details of what that step should manipulate.

[transmogrifier]
# Each pipeline corresponds to a blueprint section below
pipeline =
 jsonsource
 limittypes
 removeid
 pathfixer
 movenews
 constructor
 copyuid
 deserializer
 schemaupdater
 workflowhistory
 datesupdate
 properties
 logger

[jsonsource]
# Specify the source folder in the buildout we are importing from
blueprint = collective.jsonmigrator.jsonsource
path = content-import

[limittypes]
# Only import News Items, Blog Posts, Images, and Files
# Everything else will be skipped if it doesn't match
blueprint = collective.transmogrifier.sections.condition
condition = python:item['_type'] in ['News Item', 'BlogPost', 'Image', 'File']

[removeid]
# The jsonify export puts the id we want to use in the `_id` field,
# so remove the plain `id` key from the data since we don't want it
blueprint = collective.transmogrifier.sections.manipulator
delete = id

[pathfixer]
# Remove the old Plone site ID from the paths so we can import into a new Plone
# site regardless of its site ID.
blueprint = plone.app.transmogrifier.pathfixer
stripstring = /SFUPSite

[movenews]
# Rewrite the path from the old site's news folder to the new site's news folder
blueprint = collective.transmogrifier.sections.inserter
key = string:_path
value = python:item['_path'].replace('/company/news/news', '/company/whats-happening/company-news')

[constructor]
# Here's where the actual magic happens, create the content object in the site
blueprint = collective.transmogrifier.sections.constructor

[copyuid]
# Needed so that the schemaupdater can set the UUID correctly
blueprint = collective.transmogrifier.sections.manipulator
keys = _uid
destination = string:plone.uuid

[deserializer]
# If the data was contained inside of an attached JSON file, stuff that data
# back into the pipeline for the next step.
blueprint = transmogrify.dexterity.deserializer

[schemaupdater]
# Now update the created item from the data in the dictionary that has been
# passed down the pipeline
blueprint = transmogrify.dexterity.schemaupdater

[workflowhistory]
# Custom blueprints can also be created and used,
# for example if you need to carry over workflow history
# (a rough sketch of this blueprint appears after the ZCML below)
blueprint = sixfeetup-plone5.migration.workflowhistory

[properties]
# Copy each item's Zope properties into the property sheet of the object
blueprint = collective.jsonmigrator.properties

[datesupdate]
# Plone itself includes some nice blueprint sections that are ready to use
# Set creation, modification and effective dates on the object
blueprint = plone.app.transmogrifier.datesupdater

[logger]
# Critical to developing your pipelines is the ability to see log output of what has happened
# Use the logger blueprint section to pick and choose what to output into your logs
blueprint = collective.transmogrifier.sections.logger
name = IMPORTER
level = DEBUG
delete =
 text
 _datafield_file
 _datafield_image

This pipeline is registered with the following ZCML:

<configure
    xmlns="http://namespaces.zope.org/zope"
    xmlns:transmogrifier="http://namespaces.plone.org/transmogrifier"
    xmlns:genericsetup="http://namespaces.zope.org/genericsetup"
    i18n_domain="collective.transmogrifier">

  <include package="collective.transmogrifier"/>
  <include package="collective.transmogrifier" file="meta.zcml"/>

  <transmogrifier:registerConfig
    name="sixfeetup-plone5.migration.import_content"
    title="Import sixfeetup-plone5 content"
    description="This pipeline imports content into a single Plone site"
    configuration="import_content.cfg"/>

</configure>
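The [workflowhistory] step in the pipeline points at a custom blueprint we wrote for this project. The real implementation is specific to our export, but a rough sketch looks something like this, assuming the exported JSON carries the old history in a _workflow_history key (that key name, and where you put the module, are illustrative):

from DateTime import DateTime
from collective.transmogrifier.interfaces import ISection
from collective.transmogrifier.interfaces import ISectionBlueprint
from persistent.mapping import PersistentMapping
from zope.interface import implementer
from zope.interface import provider


@provider(ISectionBlueprint)
@implementer(ISection)
class WorkflowHistorySection(object):
    """Copy the exported workflow history onto the newly created object."""

    def __init__(self, transmogrifier, name, options, previous):
        self.previous = previous
        self.context = transmogrifier.context

    def __iter__(self):
        for item in self.previous:
            path = item.get('_path')
            history = item.get('_workflow_history')
            if path and history:
                obj = self.context.unrestrictedTraverse(path.lstrip('/'), None)
                if obj is not None:
                    # The exported dates come through as strings, so rebuild
                    # the history with real DateTime objects in a PersistentMapping
                    new_history = PersistentMapping()
                    for wf_id, entries in history.items():
                        new_history[wf_id] = tuple(
                            dict(entry, time=DateTime(entry['time']))
                            for entry in entries
                        )
                    obj.workflow_history = new_history
            yield item

The class is then registered as a named utility providing ISectionBlueprint in the same configure.zcml, under the name used in the pipeline (sixfeetup-plone5.migration.workflowhistory).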

One more step is needed to hook the pipeline up to the default profile. Add a file called transmogrifier.txt to the profiles/default folder containing only the name of the pipeline, which in the example above is sixfeetup-plone5.migration.import_content.
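So our profiles/default/transmogrifier.txt is just this one line:

sixfeetup-plone5.migration.import_content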

To run the import, go to portal_setup in the ZMI and import the migration package's default profile. Make sure to test the import and adjust it for what you need before running it in production.
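If you prefer to drive the import from code, for example from a bin/instance debug session or a run script, the same pipeline can be invoked directly through collective.transmogrifier. A minimal sketch, assuming portal is your Plone site object:

from collective.transmogrifier.transmogrifier import Transmogrifier
import transaction

# Run the registered pipeline against the portal, then commit the results
transmogrifier = Transmogrifier(portal)
transmogrifier(u'sixfeetup-plone5.migration.import_content')
transaction.commit()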
