Data Center Transformation: Decommissioning Applications with Hitachi Content Platform
by Hu Yoshida on Jun 3, 2011
As we talk to customers about data center transformation through virtualization for tiering, migration, and dynamic provisioning, one issue that comes up is the problem of decommissioning applications. We make it easier to move to new applications and infrastructure, but what about the old applications that we need to decommission?
The problem is the data. While the application may no longer serve a useful purpose, its data still has to be retained for compliance and tax audit reasons. If that data can only be referenced through the application, the application and the infrastructure that supports it must be retained as well. This not only incurs infrastructure, facilities, and maintenance costs, but also adds compliance risk if the immutability of that data is compromised.
How can we eliminate this problem?
The first step is to virtualize, or dereference, the data from the application so that the application itself can be eliminated. This dereferencing can be done with a content platform like the Hitachi Content Platform (HCP). When we ingest data into HCP, we store the object in bitmap format, then store the metadata, policies, and permissions for that data object in a database. All the information needed to identify the data is stored with the data, so it no longer needs the application for reference. Using standard protocols, an authorized client can do a content-aware search for any data object in HCP.
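To make the idea of dereferencing concrete, here is a minimal sketch of a self-describing archive object. The field names and the `ingest`/`search` helpers are invented for illustration, not HCP's actual object model; the point is that payload, metadata, and policy travel together, so a search needs only the stored metadata and never the source application.

```python
import time

def ingest(data: bytes, metadata: dict, retention_days: int) -> dict:
    """Package application data as a self-describing archive object:
    the payload travels together with its descriptive metadata and
    retention policy, so the source application is no longer needed
    to interpret the data later. (Illustrative sketch, not HCP's
    real object format.)"""
    return {
        "payload": data.hex(),                        # the archived content itself
        "metadata": metadata,                         # searchable descriptive fields
        "policy": {"retention_days": retention_days}, # policy stored with the object
        "ingested_at": time.time(),
    }

def search(store: list, **criteria) -> list:
    """A content-aware search over stored metadata alone --
    no original application involved."""
    return [obj for obj in store
            if all(obj["metadata"].get(k) == v for k, v in criteria.items())]

# Archive an invoice, then find it again purely by its metadata.
store = [ingest(b"invoice 2010-17, total 4200 EUR",
                {"type": "invoice", "year": 2010}, 2555)]
hits = search(store, type="invoice")
```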
The second step is to ensure the immutability of the data. HCP hashes each data object as it is ingested, so the hash can be rechecked on retrieval to verify that the object has not changed.
The third step is to ensure the long-term privacy and availability of the data. This is done through encryption of the data at rest and replication of the data to a second and/or third site. A virtualized storage system like the Hitachi Virtual Storage Platform (VSP) can non-disruptively migrate the data across technology refreshes. With VSP, we can minimize storage costs through dynamic tiering, thin provisioning, and single-instance store. VSP can also shred the data at end of life.
One way this is done is through the integration of our Hitachi Content Platform with SAP’s lifecycle management platform, NetWeaver Release 7. Through the “ILM – WebDAV storage interface 2.0” (BC-ILM 2.0), we enable SAP NetWeaver to define archive and retention policies, apply legal holds, conduct e-discovery requests, and decommission legacy systems. Hitachi provides a multi-tier storage solution that lets customers store and manage both their active and archived data on a common, integrated platform, leveraging existing storage infrastructure and investments to simplify operations and reduce costs.
This integration enables SAP application data to be stored in HCP directly via WebDAV, without the cost and complexity of a third-party connector. Once data is stored in HCP, it can be replicated to another HCP, which eliminates the need for backup. The archive and retention policies set in the application for objects and unstructured content are applied to the archived data as well. SAP application data is protected through HCP data protection, replication, and encryption of data at rest. The HCP archive can also be shared with other content applications, such as email and file archiving. With the Hitachi Data Discovery Suite, we gain the additional advantage of content-aware searches across emails, files, and SAP archival data.
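Because WebDAV is just HTTP with extra methods, "no third-party connector" means the archiving client can speak to the store with an ordinary PUT request. The sketch below assembles such a request; the host, path, credentials, and the `X-ILM-Retention` header name are all hypothetical placeholders, not the actual properties defined by BC-ILM 2.0.

```python
import base64

def build_webdav_put(host: str, path: str, body: bytes,
                     user: str, password: str,
                     retention_header: str) -> bytes:
    """Assemble a raw HTTP/WebDAV PUT request: standard protocol,
    no connector layer. All names here are illustrative."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    head = "\r\n".join([
        f"PUT {path} HTTP/1.1",
        f"Host: {host}",
        f"Authorization: Basic {auth}",
        f"Content-Length: {len(body)}",
        retention_header,  # hypothetical retention property set by the application
        "", "",
    ])
    return head.encode() + body

# Hypothetical archive write of an SAP archive file with a retention date.
req = build_webdav_put("hcp.example.com", "/ilm/archive/FI_DOCUMNT/0001",
                       b"<archive-file/>", "sapuser", "secret",
                       "X-ILM-Retention: 2021-12-31")
```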
The NetWeaver solution with HCP solves the problem of decommissioning. Data can be referenced without maintaining the application and infrastructure that were used to create it. If you have SAP applications, check out our press release, which has links to our certified SAP solutions.