Hybrid cloud use cases with Teradata’s new Everywhere offering
Article by Teradata data warehouse support director Rob Armstrong
Teradata’s new offering Teradata Everywhere gives customers new opportunities to extend traditional multi-system solutions while leveraging the flexibility and elasticity of cloud environments.
These alternatives complement on-premises, physical hardware solutions and significantly change the cost equation, as well as the resources necessary to manage and maintain the analytic ecosystem.
Not all clouds are equal, and these use cases can be applied to the public cloud as well as to a private or managed cloud.
The three hybrid cloud use cases are:
- Cloud Bursting – Load and resource balancing to manage peak periods
- Cloud Data Lab – Allowing greater end-user self-service and exploration
- Cloud Disaster Recovery – Providing lower-cost, off-premises environments for disaster recovery
Cloud Bursting is a capability that can help during peak times or in response to unexpected demand.
An example would be a retailer during the peak holiday shopping season, or the management of a regular quarterly spike for larger-than-normal financial reporting requests.
The idea is to temporarily instantiate a secondary cloud environment that is either kept up-to-date with the production system or simply holds a copy of the data necessary to provide the extra processing relief.
Workloads can be directed to each system as service-level and performance demands dictate.
Under normal operations, the workload may be less rigidly controlled, but during peak or seasonal demands one can use Teradata Unity routing rules to ensure that lower-priority workloads do not interfere with the highest-priority ones.
This has been done in the past with dual on-premises systems, but with a cloud environment one can lower the cost and the data centre management burden of the secondary system while maintaining the same level of responsiveness.
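The routing idea above can be sketched in a few lines. This is a hypothetical illustration only: the `Query` class, `route` function, and system names are invented for this sketch and do not represent Teradata Unity's actual routing-rule interface.

```python
from dataclasses import dataclass

@dataclass
class Query:
    user: str
    priority: int  # 1 = highest priority

# Toggled on for a peak period, e.g. the holiday shopping season
# (hypothetical flag, not a Teradata Unity setting).
BURST_WINDOW_ACTIVE = True

def route(query: Query) -> str:
    """Keep the highest-priority work on the primary system and
    offload everything else to the temporary cloud instance while
    the burst window is active."""
    if BURST_WINDOW_ACTIVE and query.priority > 1:
        return "cloud-burst-system"
    return "primary-system"
```

Outside the burst window, the flag is cleared and every query flows to the primary system again, so the secondary instance can be torn down.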
Cloud Data Lab
The idea of a data lab – or sandbox – is not new, but data labs have evolved over the years.
In the early 90s, users were given some space and allowed to create tables for data exploration.
Back then the tools were limited, the users needed special training, and they shared resources with the production platform.
All of these limited the effectiveness of the data lab environment.
Data labs improved over time with better tools and mixed workload management tools to prevent the exploratory activity from impacting the production workloads.
The key requirement, though, was that the data lab be co-located with the production data, so that users could not only load and explore new data but also integrate core production and reference data into their queries.
Today, users can very easily instantiate a short-term public cloud and have control over their resources and processes with little impact to the production warehouse.
When the user is done with their data lab trials, they can easily terminate the public cloud instances and thereby stop the additional cost clock from ticking.
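The lab lifecycle described above, provision, explore, terminate, can be sketched as follows. The `CloudDataLab` class and its `provision`/`terminate` methods are stand-ins invented for this illustration; a real deployment would call the cloud provider's own APIs.

```python
import time

class CloudDataLab:
    """Hypothetical short-lived cloud data lab that accrues cost
    only while it is running."""

    def __init__(self, name: str, hourly_rate: float):
        self.name = name
        self.hourly_rate = hourly_rate
        self.started: float | None = None

    def provision(self) -> None:
        # The cost clock starts ticking when the instance comes up.
        self.started = time.time()

    def terminate(self) -> float:
        """Tear down the lab and return the cost accrued so far,
        stopping the cost clock."""
        if self.started is None:
            return 0.0
        hours = (time.time() - self.started) / 3600
        self.started = None
        return hours * self.hourly_rate
```

The point of the sketch is the last step: terminating the instance is what stops the additional cost clock, which is the economic advantage over a permanently provisioned sandbox.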
Cloud Disaster Recovery
In a similar vein to the cloud bursting example above, companies can now lower the cost and complexity of having accessible disaster recovery (DR).
Most companies have moved beyond the “tape drive” mentality of DR and have gone to secondary systems that are kept warm, or active copies of the production data warehouse.
But “real” DR requires geographic separation as well as rigorous methodologies to ensure data is accurately maintained and available whenever needed.
With the Teradata Hybrid Cloud solutions, companies can easily have a cloud system housing the separate, protected DR copy of data.
Using Teradata Unity, the data can easily be managed across the two systems (primary and secondary). When disaster strikes or a routine maintenance window opens, workloads can quickly and transparently be routed to the secondary environment.
Users need not know the difference.
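The transparent failover described above can be sketched as a simple health-based router. Again, this is a hypothetical illustration: the `SYSTEMS` table and `route_session` function are invented for this sketch and are not Teradata Unity behaviour.

```python
# Map each system to a health flag (hypothetical; a real setup would
# use actual health checks against the primary and the cloud DR copy).
SYSTEMS = {"primary": True, "cloud-dr": True}

def route_session() -> str:
    """Return the system a new user session should connect to.
    Sessions land on the primary when it is healthy and fail over
    to the cloud DR system otherwise; users need not know which
    system actually serves them."""
    if SYSTEMS["primary"]:
        return "primary"
    if SYSTEMS["cloud-dr"]:
        return "cloud-dr"
    raise RuntimeError("no healthy system available")
```

Marking the primary unhealthy, whether for a disaster or a planned maintenance window, is all it takes to redirect new sessions to the DR environment.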
Hybrid Cloud Solutions – Agility without Anarchy
It is becoming a complicated world for data management and business analytics.
Users are asking for more self-serve capability with less IT interference.
But in the rush to be agile, we cannot allow anarchy to take root because that just creates more future chaos.
To be agile, you need coordination, cooperation, and consistency across the total analytic ecosystem.
This article has been edited for clarity and brevity. It was originally published on the Teradata blog.