Moving Data to a Distributed Environment

Streamlining a large-scale data migration with the IBM Data Movement Tool

One day, without warning, I was informed that a new data warehouse was being created. The data warehouse was to reside on a platform running something other than IBM® z/OS®, and I was to lead this effort. The distributed platform would run IBM DB2® for Linux, UNIX, and Windows (LUW) on Solaris. I was excited to be given this opportunity, and I looked forward to starting the project. One of the first things I did was gather requirements and look for tools to help me migrate data and objects from the mainframe to the distributed LUW platform.

The tool had to be easy to understand and use. It had to migrate data and objects to the LUW platform from z/OS, transporting large volumes of data in a reasonable amount of time.

After spending weeks researching, I found a tool that met my requirements: the IBM Data Movement Tool. I found it to be a great tool for copying extremely large volumes of data.

The Data Movement Tool is easy to install and flexible to operate. It can be run from the command line or a GUI. Importantly, the Data Movement Tool lets you select the objects you want to copy: individual objects, an entire database, or objects by authorization ID or schema.

Data movement in action

Let’s walk through the steps of using the Data Movement Tool. I’ll use my project—in which I migrated data from the z/OS platform to a distributed LUW platform—as an example.

After downloading and configuring the tool, you follow basic prompts to migrate the data. I chose the command-line method. After making a Telnet connection to the target server, issue the following command: ./

You will then be guided through the data and object migration to the LUW server by answering a series of wizard-like prompts, such as:

  • Is Target DB2 Local = Yes
  • Extract DDL = y
  • Extract data = y
  • Extract objects = y
  • Compress objects = y

You can then choose the source database. In my case, I selected option “7” from a list of possibilities:

Other than DB2

  • MS SQL Server = 2
  • Sybase = 3
  • MS Access = 4
  • MySQL = 5
  • PostgreSQL = 6
  • z/OS = 7
  • DB2 iSeries = 8


Next, enter the additional information requested for the source database, including the IP address, port number, database name, user ID, and password. After you enter the information, the tool connects to the source database, verifying the user ID and password.

Then select a schema name for the objects to copy (for example, “Test.DB”). Provide an output directory on the new platform where the data will be unloaded (for example, c:/luw/data/unload).

Enter the target database information, including the IP address, port number, database name, user ID, and password. The tool then connects to the target database, verifying the user ID and password.

Once the connection to the LUW server is made, change to the output directory you entered earlier (c:/luw/data/unload). With connections to both platforms established, you are ready to run the following commands:

  • ./unload unloads all the data from the selected objects on z/OS.
  • ./db2ddl.sql contains the DDL for the objects to be created.
  • ./db2load.sql contains the load control statements (cards) that load the data into the objects on LUW.
  • ./db2runstats.sql gathers statistics on the objects after loading.
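Before running any of these, it can help to confirm that the tool actually generated all four files in the output directory. Here is a minimal sketch in shell; the `check_unload_dir` helper name is my own, not part of the tool, while the four file names come from the list above:

```shell
#!/bin/sh
# check_unload_dir DIR -- verify that the Data Movement Tool output
# files exist in DIR before starting the load. (Helper name and
# structure are my own; the file names come from the tool's output.)
check_unload_dir() {
    dir="$1"
    for f in unload db2ddl.sql db2load.sql db2runstats.sql; do
        if [ ! -e "$dir/$f" ]; then
            echo "missing: $dir/$f" >&2
            return 1
        fi
    done
    echo "all migration files present in $dir"
}
```

For example, `check_unload_dir c:/luw/data/unload` reports a missing file before you are partway through a long load.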

To conduct the data migration, first execute the ./unload command, which unloads the data from z/OS. Then execute db2ddl.sql to create the objects. (This step is optional if the objects already exist on LUW.) Then execute db2load.sql to load the data into the LUW objects. After the data is loaded, execute db2runstats.sql to gather statistics on the objects.
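Because the same four steps run in the same order every time, they can be wrapped in a small driver script. This is a sketch under two assumptions of mine: the generated .sql files are run through the DB2 command processor (`db2 -tvf`), and a `DRY_RUN` flag (my own addition, on by default) makes the sketch safe to execute without touching either database:

```shell
#!/bin/sh
# migrate.sh -- run the four Data Movement Tool steps in order (a sketch).
# Assumes you have changed into the output directory (e.g. c:/luw/data/unload)
# and connected to the target database. DRY_RUN=1 (the default here) only
# prints each step; set DRY_RUN=0 to execute for real.
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@" || { echo "step failed: $*" >&2; exit 1; }
    fi
}

run ./unload                   # 1. unload the data from z/OS
run db2 -tvf db2ddl.sql        # 2. create the objects (skip if they already exist)
run db2 -tvf db2load.sql       # 3. load the data into the LUW objects
run db2 -tvf db2runstats.sql   # 4. gather statistics after the load
```

Running the script as-is prints the planned steps; rerunning it with DRY_RUN=0 repeats the whole migration, which is useful when the warehouse is refreshed.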

This is essentially the task of migrating data and objects from z/OS to LUW. The process is repeatable.

I hope this article helps anyone who may be faced with a similar challenge of copying data from a mainframe platform to a distributed environment. This tool made my job much easier and took away some of the fear I had of working in a distributed environment.

For more instructions on downloading, configuring, and optimizing the tool for high performance, please visit

Any questions? What migration projects do you have on the horizon? Whatever they may be, this tool can assist you with that task.


Lloyd Matthews

IBM Champion Lloyd Matthews has 29 years of information systems experience, including pre- and post-sales training for DB2 products and database administration for DB2 for z/OS and DB2 for LUW. His background covers maintaining databases, SQL performance tuning, database backup and recovery, development and implementation of archiving strategies, teaching DB2 administration courses, and presenting at IOD 2010 and 2011 and at IDUG 2012. He has been Vice Chairman of the Baltimore Washington DB2 Users Group, participated in the IDUG RUG Liaison Committee, and is a Certified Database Administrator on DB2 10 for z/OS. You can follow Lloyd on Twitter: @aldbaguy