ESG Validation

ESG Lab Validation: Hitachi Vantara’s Content Platform Portfolio

Introduction

ESG Lab performed hands-on evaluation and testing of the Hitachi Content Platform Portfolio, consisting of the Hitachi Content Platform (HCP), Hitachi Content Platform Anywhere (“HCP Anywhere”) online file sharing, Hitachi Data Ingestor (HDI), and Hitachi Content Intelligence (HCI) data aggregation and analysis. Testing focused on integration of the platforms, global access to content, public and private cloud tiering, data quality and analysis, and the ease of deployment and management of the solution.

Background

If data is royalty, then unstructured data is the heir to the throne and analytics is the power behind the throne.

As the digital world evolves, unstructured data becomes ever more important as the language of media, logs, and Internet of Things (IoT) devices. This data lives outside the rigid structure of the traditional OLTP castle with its OLAP tools and data warehouses. Unstructured data (e.g., flat file logs and messages), and semi-structured data (e.g., video clips with geolocation metadata), are well suited to object storage.

The ESG research data in Figure 1 shows how the major challenges of simplifying management of unstructured data, providing a foundation for cloud solutions, and delivering a repository for analytics drive the need for object storage.1

Additional ESG research shows that initiatives directly addressed by object storage rank highly with respondents: 17% of respondents ranked using data analytics for real-time business intelligence and customer insight as most important, and 12% named mobility their most important IT initiative for 2017.2

Cloud architectures result in simple user access to data services. More significantly, the cloud makes data accessible from any location; the decision to place data in the cloud effectively makes every office a remote office because the data is located elsewhere. As a result, IT departments face the same challenges as cloud service providers when providing services to remote and branch offices (ROBOs): They must cost-effectively deliver acceptable application performance and availability, keep data secure and protected, enable employees to collaborate, and minimize the cost of bandwidth for data replication, backup, etc.

Hitachi Vantara Content Platform Portfolio

Hitachi Content Platform Portfolio is a comprehensive hybrid cloud solution spanning small, medium, and enterprise organizations. The portfolio includes: Hitachi Content Platform (HCP), a massively scalable object storage system designed for secure private and hybrid cloud storage, content distribution, backup-free archiving, and compliance; HCP Anywhere, a solution for enterprise mobility with collaboration, user data protection, and file synchronization and sharing; Hitachi Data Ingestor (HDI), a bottomless and backup-free on-ramp for remote locations and cloud customers; and Hitachi Content Intelligence (HCI), a data aggregation and processing capability to enhance analytics.

Hitachi Content Platform (HCP)

The centralized core infrastructure is the massively scalable, multi-tiered, multi-tenant HCP. This object store can be divided into thousands of virtual content platforms: Tenants and underlying namespaces have configurable attributes to deliver varying service levels for different users and applications (or, in the case of a cloud service provider, different organizations). Network File System (NFS), Common Internet File System (CIFS), and Representational State Transfer (REST) protocols such as Amazon’s S3 are supported, as well as Active Directory authentication.

The HCP easily accommodates changes in both scale and storage technology so that data can reside for decades or longer with minimal disruption. It scales from a few terabytes to over half an exabyte of capacity, and manages data according to “service plans” that define service levels and how data is stored, accessed, tracked, and ultimately deleted and shredded across spin-down media, commodity storage, cloud services, or removable media. These features enable HCP to help organizations eliminate storage sprawl and reduce the cost and complexity of storing unstructured data.

The HCP back-end provides object storage with compression and single instancing for capacity efficiency. Designed for cloud deployments, it enables organizations to store and protect unstructured content such as documents, files, images, and video with massive scalability (supporting over half an exabyte in a single HCP cluster/cloud) and it can retain content indefinitely. A single device can store data from different tools such as file servers, email, S3-based applications, and Microsoft SharePoint, and both storage tiering and configurable multi-tenancy are supported.

Each piece of content is stored as an object, which is basically a container that includes the data and metadata used to define the structure and administration of that data. This provides IT with a deep understanding of the nature of the content and enables IT to assign policies and automate storage tiering with greater intelligence. HCP can automatically apply data retention and disposition, deleting expired content and reclaiming storage. Content is accessed through HTTP/REST APIs like Amazon’s S3, NFS, CIFS, SMTP, and more. Monitoring, reporting, and audit capabilities are built in and enable chargeback.
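As an illustrative sketch of this object model (field names such as retention_days are invented here, not HCP’s actual schema), an object pairs content with the metadata that governs it, and a disposition pass can delete objects whose retention has expired:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class StoredObject:
    # An object is a container pairing the content with its metadata.
    key: str
    data: bytes
    metadata: dict = field(default_factory=dict)

def is_expired(obj: StoredObject, now: datetime) -> bool:
    """Return True once the object's retention period has elapsed."""
    ingested = obj.metadata["ingest_time"]
    retention = obj.metadata.get("retention_days")
    if retention is None:          # no retention set: keep indefinitely
        return False
    return now >= ingested + timedelta(days=retention)

def apply_disposition(store: dict, now: datetime) -> list:
    """Delete expired objects and return the reclaimed keys."""
    expired = [k for k, o in store.items() if is_expired(o, now)]
    for k in expired:
        del store[k]               # storage is reclaimed
    return expired
```

Running apply_disposition over a store removes only objects whose retention window has elapsed, mirroring the automated retention and disposition behavior described above.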

A combination of built-in reliability factors can eliminate the need for backup of HCP itself. RAID-6, local and geo-distributed erasure coding, dynamic data protection levels, encryption, data integrity checking, retention of multiple versions of content, extensive metadata, and self-service recovery keep content protected and preserved without tape backups—although backups are supported. Data can also be replicated from HCP for disaster recovery. When combined with HCP Anywhere, the solution has the potential to eliminate the need for endpoint backups, and when used with HDI, eliminate the need for backups of remote and branch office data.

Hitachi Content Intelligence (HCI)

HCI is a set of data processing, aggregation, and exploration capabilities that enable users to create insights from the wide variety of unstructured and semi-structured file and object data available to the Hitachi Content Platform framework. HCI can gather data from sources including HCP, flat files, cloud stores accessible via the Amazon S3 protocol, and user-defined sources. HCI processes and normalizes object metadata to make it accessible for further processing. User-defined filtering then helps the user focus on what’s important. HCI can enrich metadata to improve analysis, for example by adding geocoding to photos or videos. HCI can then load, or store, its results on an HCP for multi-user access and processing by HCI’s included search utility, Hitachi Vantara’s Pentaho tool, and/or third-party or user-developed tools.

HCI is implemented as a highly available cluster of micro-service instances, on physical servers, virtual machines, or cloud-based compute services. A single HCI instance requires four CPU cores. For high availability, HCI requires a minimum of four instances in a cluster. Each instance supports infrastructure and value-added services running in 64-bit Linux Docker containers. Infrastructure services include cluster management, security, load balancing, monitoring, orchestration, and high availability. Value-added services include the task scheduling pipeline, indexing, data management, and search.

Hitachi Content Platform Anywhere (HCP Anywhere)

HCP Anywhere is engineered to be a secure, compliant solution for cloud home directories, collaboration, end-user data protection, and file sharing and content distribution designed to improve end-user productivity while maintaining corporate control of data. HCP Anywhere leverages internal corporate resources that are managed and protected by IT.

HCP Anywhere consists of the HCP Anywhere pod containing the server and database infrastructure as well as HCP Anywhere software, with the corporate network connecting to a Hitachi Content Platform object store. Other parts of the corporate infrastructure can be integrated, such as Active Directory servers for user authentication and permissions, DNS servers, mobile device management (MDM), virus scanning, etc. These features enable content distribution and file sharing to benefit from full IT governance and management.

HCP Anywhere can also be run as software on VMware ESXi, enabling organizations to leverage existing resources to gain the always-available file sharing and content distribution benefits of HCP Anywhere.

Hitachi Data Ingestor (HDI)

HDI serves as a bottomless, backup-free cloud file gateway designed for remote locations with little or no IT presence. Its standards-based file interface requires no recoding or changes in application usage; storage is accessed using standard NFS or SMB protocols. This allows HDI to act as a protocol translator between traditional applications that speak NFS or SMB (CIFS) and object storage protocols when applications require greater performance than HCP’s native file system protocols can provide. HDI is essentially a cache that provides users and applications with seemingly endless storage capacity along with advanced, centralized storage management and protection.

HDI provides numerous benefits to distributed organizations with many small remote locations as well as cloud service providers:

  • Elastically scalable backup-free file services.
  • Centralized configuration, management, and reporting via HCP Anywhere.
  • Remote provisioning: ship server to remote site and boot.
  • Intelligent caching and automated capacity management.
  • Compliance and lifecycle management.

ESG Lab Validation

ESG Lab performed hands-on evaluation and testing of Hitachi Content Platform at a Hitachi Vantara facility in Santa Clara, California. Testing was designed to demonstrate the ease with which Hitachi Content Platform, HCP Anywhere online file synchronization, and Hitachi Data Ingestor can be integrated into an organization’s infrastructure to provide limitless unstructured data storage to a distributed IT environment. Further testing explored the integration of HCI into the Hitachi Content Platform Portfolio and its ease of use. Also of interest were the scalability, secure multi-tenancy, and resilience of the solution.

Hitachi’s solution combines the object-based Hitachi Content Platform (HCP) at the core; HCP Anywhere on user devices for enterprise mobility and end-user data protection; Hitachi Data Ingestor (HDI) at remote and branch offices, a minimal-footprint physical or virtual appliance that caches data for fast retrieval at the edge and sends data to the core infrastructure; and Hitachi Content Intelligence for data collection, normalization, and analysis. By connecting to HCP, data from remote sites, user devices, and data center applications gains advanced storage and data management capabilities, while Hitachi Content Intelligence distills knowledge and information from the data accessible to the platform. This configuration provides full-featured IT services to end-users and remote deployments while minimizing costs and complexity. In this way, edge sites and individual users gain seamless scalability while benefiting from the centralized management and protection capabilities at the core. Remote deployments no longer need IT staff to handle storage management or backup, and corporate IT teams (or cloud service providers) don’t have to design and build their own edge-to-core infrastructure. HCI adds analytic capabilities to the HCP Portfolio’s ability to provide data access.

Hitachi Content Platform (HCP)

The Hitachi Content Platform (HCP), at the core of the HCP Portfolio, is a clustered system that enables users to build a cloud architecture primarily for the storage and access of unstructured data. Recent releases include software enhancements and new hardware nodes for constructing an HCP cloud consisting of access components (S nodes or the HCP VM), and storage components (G nodes, local storage, and public cloud storage). Much to Hitachi’s credit, the recent hardware and software releases, while adding features and evolving the architecture, preserve the HCP GUI and user experience.

ESG Lab Testing

ESG Lab began with an environment designed to simulate a multi-tenant enterprise environment, such as an enterprise with each division being a separate tenant, or a more traditional multi-tenant service provider. In the data center environment, HCP was installed and populated with unstructured (file and object) data in multiple namespaces. One HDI was installed in a simulated remote office with Windows clients mounting the file system using the SMB protocol. Connectivity between the clients, HDI, and HCP was accomplished via 1 GbE. A second HCP was also configured, simulating synchronization between geographically separate enterprise data centers.

First, ESG Lab reviewed the current state of the HCP using the management GUI, which provides at-a-glance usage and health information. Next, ESG added a storage component to the HCP. Content Platform can store objects in internal disk drives or a variety of external storage components including NFS servers and cloud storage services from Hitachi Cloud Services, Amazon S3, Microsoft Azure, Google Cloud, and any Amazon S3-compatible third-party cloud.

Adding an Amazon S3 cloud storage component was accomplished through the create storage component wizard, which walks the administrator through the necessary steps. With a few mouse clicks, followed by entry of the Amazon S3 bucket and private key information, Amazon S3 was added to the HCP (see Figure 4). Then a new storage pool using only the Amazon S3 storage was created.

Next, ESG Lab created a new service plan. As with adding a storage component and creating a storage pool, creating new service plans was wizard-driven, and required just a few mouse clicks and the entry of some basic information.

The first step in creating a service plan was to generate a set of storage tiers. The storage tier specified the trigger, which is the action that causes data to be moved from one tier to the next tier. Also specified was the storage pool for the tier and the number of copies of data.

As can be seen in the summary of the service plan in Figure 5, every time data is ingested, or stored, the service plan will maintain two copies of each object along with two copies of metadata on the primary storage pool. Two days after data is stored, the number of copies of the object on the primary storage will be reduced to one. Simultaneously, a copy of the object will be stored on the DemoS3 pool, which is dedicated Amazon S3 cloud storage.

Using service plans, administrators have the flexibility to control how objects are stored on the system, and when they are moved around. This powerful storage tiering environment enables administrators to balance reliability, compliance, speed, bandwidth, cost, and latency for all objects within the system.
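The tiering behavior of the service plan in Figure 5 can be expressed as a small placement function (the two-day trigger and the DemoS3 pool name come from the tested plan; the function itself is an illustrative sketch, not HCP’s implementation):

```python
def placement(age_days: float) -> dict:
    """Copies per pool under the tested service plan: on ingest, two
    copies of each object (and its metadata) live on the primary pool;
    two days after ingest, one copy remains on primary and one copy is
    tiered to the DemoS3 (dedicated Amazon S3) pool."""
    if age_days < 2:
        return {"primary": 2, "DemoS3": 0}
    return {"primary": 1, "DemoS3": 1}
```

An administrator reasoning about a plan can check any object age against such a function; the real system evaluates its triggers continuously as data ages.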

The final step in deploying the multi-tenant configuration was to create a tenant (see Figure 6). As with the previous steps, tenant creation was wizard-based, and took just a few mouse clicks to complete.

In the tenant creation wizard, administrators can configure object replication and versioning and retention policies; assign service plans; and apply storage quotas. This enables IT to balance total storage used against the business needs of the tenant. For instance, a tenant representing the legal department might want all files preserved forever, thus needing different versioning and retention policies than a tenant representing a marketing department.

As enterprises grow to have more than one data center, they can connect multiple HCP storage systems into a single universal storage environment. As shown in Figure 7, using the create link wizard, ESG Lab configured a link between two HCP systems.

Hitachi supports three types of links between HCP systems: outbound, inbound, or active/active. An outbound link creates a master-slave relationship in which data on the master HCP is always pushed to the outbound slave HCP. If the master is in the corporate data center and the slave is located at a remote divisional data center, the divisional HCP is kept up to date with any changes created at the corporate office. An inbound link also creates a master-slave relationship, with the remote HCP as the master and the local HCP as the slave; this configuration can be used to ensure that the corporate HCP is kept up to date with data created at a remote site.

An active/active link is a bidirectional link, where both HCP units are completely synchronized. Any changes on the local HCP are pushed to the remote HCP, and any changes on the remote are sent to the local. As can be seen in Figure 7, data can be compressed and encrypted during data transfer between systems for efficiency and security. An inbound or outbound link can be converted to an active/active link, but the conversion is one-way: once converted, the link cannot be changed back.
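The three link types and the one-way conversion rule can be modeled as follows (a conceptual sketch; the names are illustrative, not Hitachi’s API):

```python
from enum import Enum

class LinkType(Enum):
    OUTBOUND = "outbound"            # local is master: push local -> remote
    INBOUND = "inbound"              # remote is master: pull remote -> local
    ACTIVE_ACTIVE = "active/active"  # fully bidirectional synchronization

def sync_directions(link: LinkType) -> set:
    """Directions in which changes are replicated over the link."""
    return {
        LinkType.OUTBOUND: {"local->remote"},
        LinkType.INBOUND: {"remote->local"},
        LinkType.ACTIVE_ACTIVE: {"local->remote", "remote->local"},
    }[link]

def convert(link: LinkType, target: LinkType) -> LinkType:
    """Inbound or outbound links may be converted to active/active,
    but an active/active link cannot be converted back."""
    if link is LinkType.ACTIVE_ACTIVE and target is not LinkType.ACTIVE_ACTIVE:
        raise ValueError("active/active conversion is one-way")
    return target
```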

Why This Matters

ESG asked IT managers to name their most important IT priorities. Managing data growth, regulatory compliance initiatives, and building a private cloud infrastructure were all among the most-cited responses.3

As the size and number of files that need to be kept online continue to grow, capital equipment and operating budgets are being stretched to their limits. Scaling capacity at remote offices can lead to lost productivity, and, in some cases, lost revenue as legacy storage systems are filled, becoming increasingly difficult to back up. ESG Lab has confirmed that HCP with HDI offers effectively unlimited, backup-free file and content storage for distributed users with advanced data availability and functionality that protects all file and unstructured content in the data center, while providing local file system performance to users. The addition of active/active replication enables the creation of a global access topology, providing content synchronization across an entire enterprise.

ESG Lab was extremely impressed with the ability of HCP to provide mobility between private and public clouds using adaptive cloud tiering without compromising enterprise control.


Hitachi Content Intelligence (HCI)

Hitachi Content Intelligence is a data collection, aggregation, processing, normalization, indexing, and analysis software suite. HCI implements processing as Workflows, each consisting of Data Connections for the sources that are crawled; Processing Pipelines to extract, transform, and enrich the data; and Index Collections to load (store) results. As shown in Figure 8, a single HCI deployment can handle multiple sources, processed through multiple pipelines, for multiple downstream users, including HCI’s included search capability. HCI Search provides users with a powerful tool to perform analytics directly within the HCI suite.

The heart of HCI’s capabilities is the ETL processing pipeline. Each pipeline can transform and enrich the selected files and objects. Pipelines are composed of multiple stages that execute serially and can be conditionally controlled. HCI provides prebuilt stages, and users can augment processing with their own custom stage plug-ins.
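A pipeline of serial, conditionally controlled stages can be sketched as follows (the Stage interface and the example stages are invented for illustration; HCI’s actual stage plug-in API differs):

```python
import re

class Stage:
    """A pipeline stage: runs its transform only when the condition holds."""
    def __init__(self, transform, condition=lambda doc: True):
        self.transform = transform
        self.condition = condition

    def run(self, doc: dict) -> dict:
        return self.transform(doc) if self.condition(doc) else doc

def run_pipeline(stages, doc: dict) -> dict:
    """Stages execute serially; each sees the previous stage's output."""
    for stage in stages:
        doc = stage.run(doc)
    return doc

# Example stages: extract a year from the file name, then enrich
# (geotag) only documents whose type is "photo".
extract_year = Stage(
    lambda d: {**d, "year": re.search(r"\d{4}", d["name"]).group()})
geotag = Stage(
    lambda d: {**d, "geo": "tagged"},
    condition=lambda d: d.get("type") == "photo")
```

Here the second stage is skipped for non-photo documents, which is the kind of conditional flow control the pipeline builder provides.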

ESG Lab Testing

ESG Lab tested HCI for its integration into the Hitachi Content Platform Portfolio and infrastructure. As Hitachi developed HCI to provide structure to the classic Extract-Transform-Load process especially for unstructured and semi-structured data, ESG Lab explored the software’s workflow management and data handling capabilities.

First, ESG Lab brought up the HCI Administrative Interface. The GUI includes capabilities for managing HCI’s services and instances, as well as application characteristics that are important to the user’s infrastructure such as security and certificates. The screen in Figure 9 shows plug-ins for the various directory services in the environment, like Active Directory and OpenLDAP.

Next, ESG Lab explored HCI’s Workflow Designer. Workflow Designer, shown in Figure 10, is a wizard-driven tool to create and manage multiple data processing workflows.

Workflow Designer includes a drag and drop Processing Pipeline builder, shown in Figure 11, with the ability to conditionally control processing flow. The Pipeline builder includes stages for text and metadata management—analyze, extract, transform, filter, enrich, and store.

The HCI Pipeline Builder’s Content Classes capability, shown in Figure 12, provides additional conditional control of processing flow based on user-defined text or metadata pattern matching.
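Content Classes amount to named pattern matchers over text or metadata that route documents through the pipeline; a minimal sketch (the class names and patterns here are hypothetical) might look like:

```python
import re
from typing import Optional

# Hypothetical content classes: each maps a class name to a pattern
# matched against a document's text or metadata.
CONTENT_CLASSES = {
    "invoice": re.compile(r"INV-\d{6}"),
    "medical_image": re.compile(r"\.dcm$"),
}

def classify(text: str) -> Optional[str]:
    """Return the first content class whose pattern matches, else None."""
    for name, pattern in CONTENT_CLASSES.items():
        if pattern.search(text):
            return name
    return None
```

A pipeline could then branch on the returned class name, applying different stages to invoices than to medical images.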

Why This Matters

In a recent survey, ESG research found that 95% of midmarket respondents ranked data analytics projects in the top ten of their IT priorities, putting data analytics tools high in their consideration.4

HCI’s ability to process many disparate sources of unstructured and semi-structured data into a consistent set for further processing elevates the Hitachi Content Platform Portfolio from an elegant storage capability to a powerful analytical tool. HCI’s included search capability makes it valuable out of the box.


Hitachi Content Platform Anywhere (HCP Anywhere)

HCP Anywhere is designed to provide cloud home directories, end-user data protection, and secure collaboration with file sharing and synchronization to improve end-user productivity while maintaining corporate control of data. HCP Anywhere leverages internal corporate resources that are managed by IT according to established practices. Public networks are used only when external users access a shared file or when internal users are accessing data from outside the corporate network.

As Figure 13 shows, HCP Anywhere consists of the HCP Anywhere pod containing the server and database infrastructure as well as HCP Anywhere software, with the corporate network connecting to a Hitachi Content Platform object store. Other parts of the corporate infrastructure can also be integrated, such as Active Directory servers for user authentication and permissions, DNS servers, virus scanning, etc. These features enable content distribution and file sharing to benefit from full IT governance and management.

The HCP Anywhere pod, as tested, included dual active/active clustered servers for both load balancing and automatic failover. Each server is configured with an Intel Xeon E5 processor; six 300GB, 10,000RPM SAS drives; internal RAID; 8 GB of memory; and 1 GbE connectivity. Dual Dell PowerConnect 2824 switches are also included. The HCP Anywhere application installed on each server includes sync, notification, and web server components as well as the REST API and a PostgreSQL database. The HCP back-end provides object storage with compression and single instancing for capacity efficiency. Other configurations are available.

ESG Lab Testing

ESG Lab began exploring usability by launching the Hitachi Content Platform Anywhere management interface. As shown in Figure 14, the main console provides an overview of system status, including a detailed list of major events. Recent usage of the platform, including namespace usage, operations, connection rate, and data transfer rate, is also displayed, giving administrators an at-a-glance view of the health and usage of HCP Anywhere. Also from this page, just above the Major Events view, additional filtered information can be selected for different components by selecting Hardware Status, System Status, or File Sync and Share Status.

ESG Lab used the HCP Anywhere management interface to add two new user groups to the HCP Anywhere validation test environment. Because HCP Anywhere integrates directly with Microsoft Active Directory (AD), a preexisting AD group was selectable from the corporate environment. Each group created in HCP Anywhere inherits the security attributes configured for its users in AD.

HCP Anywhere provides the ability to brand the user interface. Both the product logo and the “powered-by” logo can be customized, enabling the presentation of a complete branded solution suitable for enterprises and service providers. The user interface can also be switched to multiple different locales and languages, including English, simplified Chinese, Korean, Japanese, and Canadian French.

ESG Lab next explored the browser-based user interface. Multiple devices were configured and assigned to the same user account that was set up for validation testing. A Windows desktop, a web browser, and an Android mobile device were added to the account; all of these devices can be used to access and share HCP Anywhere data. The interface also provides an activity log (or audit trail) showing all file activity as well as the history for a specific file.

Users can download HCP Anywhere client software to be installed directly on Mac or Windows PCs. Once installed, an HCP Anywhere directory is created in users’ local file systems. Any file or subdirectory within the HCP Anywhere directory tree is automatically synchronized with the HCP Anywhere servers. The client also provides a dashboard view to provide users with at-a-glance information regarding their account.

As seen in Figure 15, HCP Anywhere can be integrated with Windows File Explorer in traditional PCs as well as persistent and non-persistent virtual desktops. The client creates a ‘cloud home directory’ where users can see all their files as if they were local, while many or all of their files may actually be stored in the HCP Anywhere system. Users can pin files to their local storage for offline use while keeping less-used data in the HCP Anywhere cloud to save space.

Figure 16 shows the properties of a file that is stored only in HCP Anywhere, not stored locally. It is visible to the user and easily accessible, yet consumes no space on the user’s device. This is ideal for the newer generation of small form-factor devices such as lightweight laptops and tablets that do not have the on-board storage capacity of more traditional laptop and desktop PCs.

In addition to traditional access through web browsers, Windows, and Mac PCs, HCP Anywhere supports iOS (iPhone and iPad), Android, and Windows Phone devices. ESG Lab examined the Android phone app, as shown in Figure 17. On the far left is the Settings panel, where users can see the amount of storage consumed and set limits on file sizes and whether HCP Anywhere will use cellular bandwidth or Wi-Fi bandwidth.

Users can easily find important or frequently used files in the Favorites panel. Files appear in the favorites panel (middle-left) when marked by a star in the file explorer panel (far right).

The middle-right panel shows the Create Link panel, enabling users to share files and folders directly from the mobile user interface. Shared links automatically expire in a maximum of 730 days, and users can set the links to expire sooner. This forces users to reevaluate the necessity of a long-lived share, helping to control access to corporate data. Shared links can have an associated password, and can be restricted to just those internal to the organization.
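The link-expiration policy can be sketched as follows (function and field names are hypothetical; this is not HCP Anywhere’s actual API):

```python
from datetime import datetime, timedelta

MAX_LINK_DAYS = 730  # shared links expire after at most 730 days

def create_link(path: str, created: datetime,
                requested_days: int = MAX_LINK_DAYS,
                password: str = None,
                internal_only: bool = False) -> dict:
    """Build a shared-link record; the expiry a user requests is
    clamped to the 730-day maximum, and may be set shorter."""
    days = min(requested_days, MAX_LINK_DAYS)
    return {
        "path": path,
        "expires": created + timedelta(days=days),
        "password": password,            # optional link password
        "internal_only": internal_only,  # restrict to users in the org
    }
```

However long a user requests, no link outlives the 730-day ceiling, which is what forces the periodic reevaluation of long-lived shares.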

HCP Anywhere supports versioning and rollback of files to past points in time, as shown in Figure 18.

Why This Matters

ESG research shows that a clear majority of midmarket and enterprise organizations consider the ability for their employees to access business applications and IT services anytime from any location to be either critical (43%) or at least important (45%).5 The smartphone is driving the way businesses deliver and enhance an end-user workspace.

ESG Lab validated that the HCP Anywhere solution was easy to use and deploy with a clean, uncluttered GUI. For administrators, setting up user accounts was simple and fast. System monitoring and management is enhanced with the dashboard view, and IT policies can be implemented by group with easy drill-down to individual users or devices if needed. The redundant nodes in the HCP Anywhere pod and robust HCP back-end ensure a rock-solid foundation for file sharing.

Integration with Active Directory ensures corporate user authentication, and integration with virus scanning and other corporate security initiatives is supported. Data is shared only by using intelligent links that expire. From the administration GUI, IT can easily perform tasks such as enabling and disabling accounts, editing user credentials, and remotely wiping anywhere-specific data from devices in case of loss or theft. Mobile devices are secured with a user profile and require an extended lock code.

From the user perspective, the similar look and feel among all platforms (computer/mobile device/web) makes it easy to access and share files regardless of location, without using portable drives or filling up mailbox quotas with attachments. The HCP Anywhere personal folder is quickly accessible on all devices, which simplifies and speeds file sharing. Additional features such as alert and device management and activity viewing are available through the self-service web portal.


Hitachi Data Ingestor

Hitachi Data Ingestor is designed for remote locations with little or no IT presence. HDI presents a standards-based file system interface to users that is tightly integrated with HCP to provide seamless data access and a wide range of advanced storage features. Users at the remote site use CIFS or NFS to access the HDI as a standard NAS file server. HDI securely migrates inactive content to a central HCP and maintains a local link to the migrated content, referred to as a stub. When a user accesses a stubbed file, the full file contents are automatically retrieved by HDI, replacing the stub. This delivers effectively bottomless storage capacity (up to 400 million files per HDI) to remote sites.
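The stubbing behavior can be modeled with a toy cache (a conceptual sketch only; the real HDI operates on a file system and the central HCP, not Python dictionaries):

```python
class HdiCache:
    """Toy model of HDI stubbing: inactive files migrate to the central
    HCP, leaving a local stub; reading a stub recalls the full content."""
    STUB = object()  # sentinel marking a stubbed (migrated) file

    def __init__(self, hcp: dict):
        self.hcp = hcp    # stands in for the central HCP object store
        self.local = {}   # local cache: path -> content or STUB

    def write(self, path: str, content: bytes) -> None:
        self.local[path] = content

    def migrate(self, path: str) -> None:
        """Move inactive content to HCP; keep only a stub locally."""
        self.hcp[path] = self.local[path]
        self.local[path] = self.STUB

    def read(self, path: str) -> bytes:
        # On access to a stub, retrieve the content from HCP and
        # replace the stub, transparently to the user.
        if self.local[path] is self.STUB:
            self.local[path] = self.hcp[path]
        return self.local[path]
```

Because stubs occupy almost no local space, the cache can present far more capacity than it physically holds, which is what makes the remote site's storage effectively bottomless.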

For distributed organizations and service providers with many small remote locations or customers with little or no dedicated IT staff, HDI provides centralized configuration, management, and reporting via HCP Anywhere as well as remote provisioning, where organizations ship HDI to remote sites that then simply connect it to the network and power it on.

As Figure 19 shows, users at remote sites have full access to all their content, including content that has been migrated to the HCP.

ESG Lab Testing

Hitachi developed HDI for ease of administration and deployment. In addition to the traditional method of configuring all parameters through a web-based wizard interface directly on the HDI, administrators can also use the HCP administration console to create predefined provisioning templates for multiple HDI units. When a unit is deployed in the field, only a few parameters are required. The rest of the configuration is retrieved from the HCP and applied directly to the HDI.

ESG Lab followed the steps an organization or service provider would take to set up an HDI before shipping it to a remote site. Completing setup and applying the configuration took a matter of minutes, at which point HDI was ready to use. The web-based administration console also provides a dashboard, as shown in Figure 20. Administrators are provided with basic status information, local storage capacity, and the status of any scheduled tasks.

Next, ESG Lab used the HCP Anywhere administration console to create a provisioning template, enabling quick and painless deployments of multiple HDI units to the field. Using the web-based interface, the create template wizard was started. The template was named, and, like the local configuration wizard, basic parameters for networking, time, and authentication were provided.

The next step was to configure the HCP information, the file services provided to local users, and the status reporting interval. Once the settings were saved, the provisioning template was available to be applied to any HDI unit in the inventory.

Next, ESG Lab imported an XML file containing a list of the HDI units in inventory. This populated the available inventory list in the HCP Anywhere administration console. An HDI unit was selected, and the recently created provisioning template was applied to the unit.
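The shape of such an inventory import can be sketched as follows. This is a hypothetical illustration only: the actual HCP Anywhere inventory schema is not public, so the element and attribute names below (`hdiInventory`, `unit`, `serial`, `site`) are assumptions, not the product's real format.

```python
# Hypothetical sketch: parse an inventory file listing HDI units awaiting
# provisioning. Element/attribute names are illustrative assumptions.
import xml.etree.ElementTree as ET

INVENTORY_XML = """\
<hdiInventory>
  <unit serial="HDI-0001" model="HDI Remote Server" site="Branch-Austin"/>
  <unit serial="HDI-0002" model="HDI Remote Server" site="Branch-Denver"/>
</hdiInventory>
"""

def parse_inventory(xml_text):
    """Return one dict of attributes per HDI unit in the inventory file."""
    root = ET.fromstring(xml_text)
    return [dict(unit.attrib) for unit in root.findall("unit")]

units = parse_inventory(INVENTORY_XML)
for u in units:
    print(u["serial"], u["site"])
```

Once parsed, each unit in the resulting list would appear in the console's available-inventory view, ready to have a template applied.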

The final step was to set a unique admin password for the unit. The unit and password could then be sent to a remote site, where the only steps required by the onsite administrator would be to plug in power and networking. At that point, HDI would be fully operational.
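Conceptually, the workflow above separates a shared template (networking, time, authentication, HCP endpoint, file services) from the handful of unit-specific values set before shipment. The sketch below models that merge; all field names are assumptions for illustration, not the product's actual configuration schema.

```python
# Hypothetical sketch: merge a central provisioning template with the few
# site-specific parameters set before an HDI unit ships. Field names are
# illustrative assumptions only.
TEMPLATE = {
    "hcp_tenant": "https://tenant.hcp.example.com",
    "ntp_server": "ntp.example.com",
    "auth": "active-directory",
    "file_services": ["CIFS", "NFS"],
    "report_interval_min": 60,
}

def provision(template, serial, admin_password):
    """Combine shared template settings with unit-specific parameters."""
    config = dict(template)  # copy so the template itself is unmodified
    config.update({"serial": serial, "admin_password": admin_password})
    return config

cfg = provision(TEMPLATE, "HDI-0001", "s3cr3t!")
print(sorted(cfg))
```

The design point this models is why field deployment needs no onsite expertise: everything except identity and credentials is defined centrally once and reused across every unit.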

Why This Matters

ESG research found that nearly half (43%) of respondents operate more than 60% of their sites with no dedicated on-premises IT staff.6 This means that any data sharing product must be usable out of the box, without onsite support.

HDI and HCP offer native file system access to a hugely scalable distributed object store with advanced storage and management capabilities. Using easy-to-follow configuration wizards, ESG Lab created a provisioning template and configured multiple HDI systems at once to provide file services to small business clients or remote office users with native, local access.


ESG Lab Validation Highlights

  • ESG Lab confirmed that HCP with HDI offers effectively unlimited, backup-free file and content storage while providing local file system performance to users. Active/active replication enables the creation of a global access topology, providing content synchronization across an entire enterprise.
  • ESG Lab was extremely impressed with the ability of HCP to provide mobility between private and public clouds using adaptive cloud tiering without compromising enterprise control.
  • ESG Lab validated that the HCP Anywhere solution was easy to deploy for administrators and easy to use. Setting up user accounts was simple and fast, while the dashboard view provided clear monitoring.
  • IT policies can be implemented by group with easy drill-down to individual users or devices if needed. The redundant nodes in the HCP Anywhere pod and robust HCP back-end ensure a rock-solid foundation for file sharing.
  • Security was robust, using integration with Active Directory, virus scanning, and other corporate security initiatives. Data is shared using intelligent links that expire. IT can easily perform tasks such as enabling and disabling accounts, editing user credentials, and remotely wiping HCP Anywhere-specific data from devices in case of loss or theft. Mobile devices are secured with a user profile and require an extended lock code.
  • The similar look and feel among all platforms (computer/mobile device/web) makes it easy for users to access and share files regardless of location, without using portable drives or filling up mailbox quotas with attachments. The HCP Anywhere personal folder is quickly accessible on all devices, which simplifies and speeds file sharing. Additional features such as alert and device management and activity viewing are available through the self-service web portal.
  • HDI and HCP provided native file system access to a hugely scalable distributed object store with advanced storage and management capabilities.
  • ESG Lab quickly and easily created a provisioning template and configured multiple HDI systems at once to provide file services to small business clients or remote office users.
  • ESG Lab found that the HCI administrative interface was easy to use and that HCI seamlessly integrated into the HCP Portfolio environment as well as with existing IT security and access practices.
  • HCI’s wizard-driven workflow designer gave ESG Lab extensive control over the data aggregation and normalization process, with a unique set of Content Class controls in the extraction step.
  • HCI provides a powerful end-user capability to search the output of the data processing pipeline, with ease of use comparable to familiar web search tools.

Issues to Consider

  • The results and data presented in this document are based on testing executed in a controlled lab environment. Due to the many variables in each production environment, it is still important to perform testing in your own environment to validate the applicability of the solution to your needs.

The Bigger Truth

Cloud architectures have gained significant ground in corporate IT departments and with service providers in recent years because the cost and business-process benefits are compelling. They include simplified provisioning, better application availability, centralized data management and protection, and the opportunity to improve service levels while minimizing capital and operational costs. However, users still plan to implement over 50% of their net-new analytics deployments on-premises, i.e., on hardware and software they own and operate.7

These are not the only challenges facing IT. ESG confirms that managing data growth remains a top IT priority.8 In addition, as IT delivery becomes more service-focused, user expectations for application availability and data access are soaring. Providing these services to remote and branch offices is particularly difficult, and as a result, IT and cloud providers are eagerly seeking a faster, simpler way to provide a full suite of data services on demand, without increasing cost or complexity. Providing a comprehensive suite of file and content services can make a big difference, but IT and service providers must maintain data control, providing security and ensuring compliance, efforts that are often at cross-purposes with sharing files online.

The Hitachi Content Platform Portfolio creates an integrated product offering that provides distributed consumers of IT, such as remote office workers, branch office workers, or cloud storage users, with a seamlessly scalable, backup-free storage solution. HDI serves as an easy cloud entry point at the edge, caching active data for rapid access, while all data is stored, protected, retained, and governed centrally on HCP. With HCP Anywhere, Hitachi Vantara takes advantage of its own proven, rock-solid object storage platform to provide an on-premises file sharing solution that makes users more productive without jeopardizing data. Remote office and cloud consumers obtain data services equal to or better than public online file sharing (OFS) services, while cloud providers and IT departments can consolidate service delivery and data management.

ESG Lab testing demonstrated that HDI can be set up quickly and easily by central IT or a service provider so that a remote site need only plug it into the network and power it on. HDI management via HCP Anywhere is intuitive and complete. ESG Lab used HCP Anywhere to provide mobility between private and public clouds using adaptive cloud tiering without compromising enterprise control. Active/active replication enabled ESG Lab to create a global access topology, providing content synchronization across an entire enterprise or service provider network with robust multitenant controls. ESG Lab also validated that HCP Anywhere provides employees with an easy and effective method of sharing files that maintains corporate IT control and security, limiting data vulnerability and compliance risk.

ESG Lab confirmed that HCI is a powerful addition to the Hitachi Content Platform Portfolio, providing a capability for users to extract information from their data. HCI includes end-user tools and the ability to connect to ISV analysis tools.

Over the years, Hitachi Vantara has proven it has an innovative approach to solving IT challenges, and the Hitachi Content Platform Portfolio, consisting of HCP, HCP Anywhere, HDI, and HCI, proves the point yet again. The combined solution is secure, efficient, and easy for IT and users to operate; it delivers on all the critical requirements for online file sharing and enterprise content management. For both corporate IT organizations and cloud service providers, this solution can ease the pain of massive unstructured data growth while delivering an easy entry to a cloud deployment.



1. Source: ESG Research Report, 2015 Data Storage Market Trends, October 2015.
2. Source: ESG Research Report, 2017 IT Spending Intentions Survey, March 2017.
3. Source: ESG Research Report, 2017 IT Spending Intentions Survey, March 2017.
4. Source: ESG Research Report, Big Data Trends: A Midmarket Perspective, March 2016.
5. Source: ESG Research Report, Security, Productivity, and Collaboration: Trends in Workforce Mobility, May 2016.
6. Source: ESG Research Report, Remote Office/Branch Office Technology Trends, May 2015.
7. Source: ESG Research Report, Enterprise Big Data, Business Intelligence and Analytics Trends, Redux, July 2016.
8. Source: ESG Research Report, 2017 IT Spending Intentions Survey, March 2017.

Appendix


ESG Lab Reports

The goal of ESG Lab reports is to educate IT professionals about data center technology products for companies of all types and sizes. ESG Lab reports are not meant to replace the evaluation process that should be conducted before making purchasing decisions, but rather to provide insight into these emerging technologies. Our objective is to highlight some of the more valuable features and functions of products, show how they can be used to solve real customer problems, and identify any areas needing improvement. ESG Lab's expert third-party perspective is based on our own hands-on testing as well as on interviews with customers who use these products in production environments.
