Documentum – EMC World 2013/Momentum – Day 2 – Migrations and Upgrades


May 8, 2013

Interesting topic for an 8:30 a.m. session after an open Vegas bar (and a pretty good party – kudos to IIG).

The session, Migrations and Upgrades: Introducing EMA, the EMC Migration Appliance, was presented by Chris Dyde and Mike Mohen. This post presents our thoughts on the presentation.

Why the need for EMA?

As we mentioned during Rick's keynote yesterday, Documentum is concerned that clients are not upgrading to 7.0 or to the new D2 and xCP interfaces. A significant issue is the need to migrate, given the change in object model when moving from Webtop or xCP 1.0 to D2 and xCP 2.0. Documentum Consulting's response is to provide a migration tool, complete with a "counter" showing a live migration running during EMC World.

Phasing Out/Marginalizing Webtop, DCM, CenterStage and eRoom

The presentation began by restating that Webtop, DCM, CenterStage and eRoom are slowly being replaced by new products, since there will be no new features in these interfaces. To get clients to the new interfaces, EMA will offer:

  • Extract, Transform & Load
  • Reporting
  • Validation
  • Cloning scripts that move a database from one platform to another, etc.
  • Sizing Spreadsheet
  • OnDemand Tools

The stat presented was 1.2 million documents per hour, with no specific detail on the type of documents, size of documents, renditions, hardware or other components. EMA processes at the database level and can preserve document IDs and audit trails while transforming objects from the old object models to the new ones.

Major Use Cases

  • DCM to D2 life sciences
  • OnDemand and Cloning
  • Webtop to D2

EMA Components

  • EMA Cloner
  • EMA Migrate
  • EMA Morph (used for instances such as migration into D2)
  • EMA Replatform
  • EMA API – build plugins and extend the tool
  • EMA Plugins (File Share, etc.)

EMA Engine

  • Java-based
  • Uses Spring and Spring Batch (see the sketch after this list)
  • Web-based user interface
  • Uses MongoDB (a NoSQL database)
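
EMA's internals are not public beyond the bullets above, but the Spring Batch mention points at a chunk-oriented job structure. As a rough illustration only – with hypothetical class and bean names, not EMA code – a chunked migration step in the Spring Batch API of that era looks roughly like this:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class DocumentMigrationJobConfig {

    // Placeholder source/target records -- stand-ins for whatever a real
    // migration engine reads from the source repository and writes to the target.
    public static class SourceDoc { public String objectId; public String oldType; }
    public static class TargetDoc { public String objectId; public String newType; }

    @Bean
    public Job migrationJob(JobBuilderFactory jobs, Step migrateStep) {
        return jobs.get("documentMigrationJob").start(migrateStep).build();
    }

    @Bean
    public Step migrateStep(StepBuilderFactory steps) {
        return steps.get("migrateDocuments")
                .<SourceDoc, TargetDoc>chunk(1000)   // commit every 1,000 documents
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    public ItemReader<SourceDoc> reader() {
        // A real reader would page through the source repository's database
        // (e.g. a JdbcCursorItemReader); returning null signals "no more input".
        return () -> null;
    }

    @Bean
    public ItemProcessor<SourceDoc, TargetDoc> processor() {
        // Transform the old object model to the new one while preserving the id.
        return source -> {
            TargetDoc target = new TargetDoc();
            target.objectId = source.objectId;
            target.newType = source.oldType;   // identity type mapping for the sketch
            return target;
        };
    }

    @Bean
    public ItemWriter<TargetDoc> writer() {
        // A real writer would insert into the target repository and record
        // progress/validation results (the MongoDB mention above suggests
        // that is where run data would be kept).
        return items -> items.forEach(d -> System.out.println("migrated " + d.objectId));
    }
}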

Migration Options

  • < 1 million objects – IIG suggests copying the content along with the data
  • > 1 million objects – copy all content over en masse at the end of the migration
  • Morph – this is not a migration but a separate utility, used when changing document types during a migration (e.g. migrating to D2)

DCM to D2 (Morph Tool)

  • You would need to use Change Object 3-4 times for each object type, and you would lose the data associated with it
  • EMA allows the migration to bypass this by going straight to the database

Product, Solution or Tool

EMA is a tool that ONLY the Documentum Professional Services team uses, and only for as long as the team is engaged. Once the engagement is over, EMA leaves with the team and cannot be used for additional migrations.

From the presentation, EMA can transfer workflows and attached documents.

Presently, an EMA migration can only select a cabinet; it migrates all of the folders under that cabinet and does not support migrating only certain document types. A future release is intended to allow selection via a DQL statement. Documents linked in from other cabinets are currently not pulled.
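
For context on what DQL-based selection would add over cabinet-level selection, below is a generic DFC sketch that scopes a candidate list to a single document type within one cabinet. The type name (acme_invoice) and cabinet path are hypothetical, and this is our own illustration of standard DFC query usage, not anything EMA exposes today:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class ScopedSelectionExample {

    // List only documents of one type under one cabinet, instead of
    // everything under the cabinet regardless of type.
    public static void printCandidates(IDfSession session) throws Exception {
        String dql = "SELECT r_object_id, object_name "
                   + "FROM acme_invoice "                            // hypothetical custom type
                   + "WHERE FOLDER('/Accounting Cabinet', DESCEND)"; // hypothetical cabinet

        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        IDfCollection results = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            while (results.next()) {
                System.out.println(results.getString("r_object_id")
                        + "  " + results.getString("object_name"));
            }
        } finally {
            results.close();
        }
    }
}

Since the FOLDER predicate matches any folder link, a query-driven selection like this should also pick up documents linked in from other cabinets – the gap noted above.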

Some Concerns – Is Speed All That Is Important?

We see some pretty large flaws in regards to having it productized, including:

  • Delta Migrations – Most clients prefer not to be down during the migration. Our understanding from other presentations was that EMA is a one-shot deal – use it to migrate content once, with no delta passes afterward.
  • Database Approach – We are concerned that the migration is focused on speed and not accuracy. Not leveraging the API runs the risk that key components like lifecycles, ACLs, folder links, TBOs and a ton of other components might not be created correctly (and not be verified until the document is accessed) – see the sketch after this list.
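
One way to mitigate the database-approach risk is to spot check migrated documents through the API rather than only at the database level. A minimal DFC sketch of that kind of check follows – our own illustration, assuming a session to the target repository and a known object id; if the ACL, folder links, lifecycle or content were not wired up correctly, these are the calls where it tends to surface:

import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;
import com.documentum.fc.common.DfId;

public class ApiSpotCheck {

    // Fetch a migrated document through the API so the server resolves its
    // ACL, folder links, lifecycle and type -- problems a raw database copy
    // can hide until a user first opens the document.
    public static void verify(IDfSession session, String objectId) throws Exception {
        IDfSysObject doc = (IDfSysObject) session.getObject(new DfId(objectId));

        System.out.println("name:      " + doc.getObjectName());
        System.out.println("type:      " + doc.getString("r_object_type"));
        System.out.println("acl:       " + doc.getACLDomain() + "/" + doc.getACLName());
        System.out.println("lifecycle: " + doc.getPolicyId());
        System.out.println("folders:   " + doc.getFolderIdCount());

        // Touching the content stream confirms the content object and storage
        // location came across, not just the metadata rows.
        doc.getContent().close();
    }
}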

Also, throughput varies greatly between migrations, which is why we always recommend running benchmark migrations in environments as similar as possible to those that will be used in the actual migration. Below are some items that can impact throughput for OpenMigrate migrations and that we think would be concerns with EMA as well (a simple extrapolation example follows the list):

  • Number of OM threads
  • Average size of native content
  • Average size of renditions
  • Average number of document versions
  • Average number of document renditions
  • Inclusion of audit trail information or other related items
  • Source server performance capabilities
  • Target server performance capabilities
  • Physical distance between source and target systems
  • Complexity of migration logic
  • Existing target system TBOs (this can definitely have a big impact)
  • Applying overlays
  • Looking up additional information in database tables/repository
  • Writing additional information in database tables/repository
  • Complex metadata mappings
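
Speed claims only become meaningful once they are grounded in a benchmark against your own content and infrastructure. As a back-of-the-envelope illustration (all numbers below are made up – they are not EMA or OpenMigrate figures), extrapolating from a small benchmark run looks like this:

public class MigrationEstimate {
    public static void main(String[] args) {
        // Hypothetical benchmark: 100,000 documents migrated in 2.5 hours
        double benchmarkDocs = 100_000;
        double benchmarkHours = 2.5;
        double docsPerHour = benchmarkDocs / benchmarkHours;   // 40,000 docs/hour

        // Hypothetical production migration of 10 million documents
        double totalDocs = 10_000_000;
        double estimatedHours = totalDocs / docsPerHour;       // 250 hours

        System.out.printf("Observed throughput: %,.0f docs/hour%n", docsPerHour);
        System.out.printf("Estimated duration:  %,.0f hours (before delta passes and validation)%n",
                estimatedHours);
    }
}

Even a rough estimate like this makes it clear why a single headline number from a demo migration is not enough to plan a production cutover window.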

Summary

EMA is currently a "one and done" migration tool used as part of an upgrade to D2. We have been recommending that clients need a migration infrastructure, one that covers other migration needs beyond just upgrades. Some of the concerns regarding a migration that we are not sure EMA addresses include:

  • Ability to repeat the process for ongoing migration needs
  • Ability to apply business logic throughout the migration process
  • Incorrect assumption that all migrations are the same
  • Ability to address documents/data that failed to migrate
  • Ability to repeat the process for different data sources

Migration consultants spend most of their time understanding the current data and setting up the migration – defining transformations, locating exceptions, modifying the migration to handle exceptions, etc. Faster migration time is always nice, but the time spent executing the migration is a small fraction of the overall project. Typically, consultants not only review migrated data from the back end (the way technical IT folks usually want to verify a migration) but also review migrated data and content using the front-end user interface (Webtop, D2, etc.) in both the source and target systems. Migration and transformation requirements are often missed when reviewing the migrated data and content only from the back end. Because the EMA solution is so focused on moving back-end database rows, we would recommend our clients confirm there is a detailed review of the migration results from the front-end interface.

Lastly, as a tool that can only be used with professional services, clients should evaluate EMA versus other migration solutions based on their trust and confidence in the consulting resources that will leverage the tool and be assigned to the project.

Let us know your thoughts below:

Filed Under: D7, Documentum, Migrations, News, OpenMigrate


Comments

  1. Anhtuan Doventry says

    May 13, 2013 at 10:21 am

    Can TSG’s OpenMigrate handle the migration that we need?

    We plan on doing a 6.6 to 7.0 or 7.1 early next year. Can OpenMigrate handle what we need if we continue to keep the Webtop front-end?

    • TSG Dave says

      May 13, 2013 at 1:13 pm

      Anhtuan,

      Some thoughts on the migration/upgrade.

      On the front end – you probably want to go to Webtop 6.7 SP2 (our understanding is that it will be the last Webtop release) – as we mentioned in the Roadmap post.
      https://www.tsgrp.com/2013/05/08/documentum-emc-world-2013-momentum-emc-documentum-7-0-roadmap/
      This will give Webtop the ability to work with D7 while still working against your existing 6.6 repository (that is our understanding).

      At a later point in time, when you are ready to change the back end, you can either upgrade in place or migrate to D7 or D7.1. You would only migrate if you are changing hardware or want to change the object model; upgrade in place if everything is staying the same.

      Does that make sense?

      Dave

      P.S. 7.0 has better memory and session management for Windows back ends (like yours), with significant performance benefits. 7.1 will attempt the same benefits for Linux and other back ends.

    • eryan says

      May 17, 2013 at 9:16 am

      Anhtuan –

      Of course it can! OpenMigrate does migrations at the API layer, so moving to 7.0 or 7.1 will not be difficult, especially if you are sticking with Webtop as the front end. A few of our clients are planning to do the same thing – upgrade the back end to 7.1 eventually while keeping Webtop 6.7. The next major move (D2, HPI, etc.) is a big decision and will take some significant analysis and prototyping.

      Ellen

  2. Mike Gration says

    May 14, 2013 at 3:32 am

    First of all, thanks for the great blog! With regards to the live migration of EMA at EMC World, here are some clarifications. 1.2 million documents per hour is not the correct figure! EMC migrated ~39 million objects in 54 hours, and just 1.2 million documents with content, ~2,500 folders and 300 users. It is not as large a migration as was communicated at EMC World. At this customer, EMC copied all objects (garbage in, garbage out) without clean-up of, e.g., old ACLs (dm_xxx), etc. We know that this is not the approach my customers are looking for!


Trackbacks

  1. Documentum – EMC World/Momentum 2013 – TSG Recap | TSG Blog says:
    May 10, 2013 at 11:50 am

    […] « Documentum – EMC World 2013/Momentum – Day 2 – Migrations and Upgrades […]

  2. Documentum D2 – Do you need to migrate? | TSG Blog says:
    May 30, 2013 at 2:03 pm

    […] about a new migration tool if you don't always need it?  As we pointed out in our post on EMA, whether it is upgrading or just evolving your Documentum repository, a migration tool is a big […]

  3. Documentum and Alfresco Migrations – Why the Migration Tool is Only Part of the Equation | TSG Blog says:
    August 22, 2013 at 8:33 am

    […] This was true in regards to EMC World when Documentum unveiled their own migration tool, EMA.   As mentioned then, EMA can migrate content at 1.2 million documents per hour by skipping the […]

  4. Documentum – Momentum EMC World 2014 – Day 1 – Content Bridge and DEMA | TSG Blog says:
    May 6, 2014 at 8:38 am

    […] One of the big announcements in last year's 2013 keynote was the "Enterprise Migration Appliance" or EMA.  This has been renamed, rather appropriately, to the "Documentum Enterprise Migration Appliance" or DEMA.  For background, see our thoughts on Chris Dyde and Mike Mohen's presentation on EMA last year. […]

