On Analytics, Data Platforms and Smart Applications
Disaster recovery and development and testing are obvious starting points, but there are many other hybrid-cloud DW use cases as well as pitfalls to avoid.
We’ve already witnessed a seismic shift of mainstream corporate workloads into the cloud, but the movement has been slower to take off where data and analytics are concerned, and with good reason.
Most companies view data as their most valuable asset, so they’ve been more conservative about the digital treasure chest otherwise known as the data warehouse. You can argue all you want about cloud vs. on-premises security, but some businesses and, broadly speaking, some industries just aren’t going to move the bulk of their data into public clouds. In some cases, regulatory or data-residency requirements make public-cloud deployment challenging. There’s also the issue of control: some businesses face tight service-level agreements demanding performance levels that public cloud service providers won’t guarantee.
All of the above are among the reasons some companies choose private-cloud or hybrid deployment options that combine on-premises systems with private-cloud services. One example is Core Digital Media, a marketing services firm I recently interviewed for this case study report. Core Digital handles lots of customer data, so it chose a hybrid approach combining its on-premises production system with disaster recovery (DR) running on private-cloud Teradata Database-as-a-Service.
A second Teradata customer I interviewed for the research is currently running DR and development and testing (dev-test) on Teradata DBaaS. But this company does not deal in customer data, so it’s also investigating the public-cloud Teradata Database on AWS set to debut later this month.
As I’ll detail in an in-depth Webinar set for this Thursday, DR and dev-test are typically among the first data warehousing workloads that companies move into the cloud. Another common use case is unpredictable workloads, where you’re not sure of the road ahead and want to avoid disruption or performance impacts on your production environment. These might be new applications that have emerged from testing and development but have yet to prove their business value. They could be fast-growing or compute-intensive workloads that you didn’t foresee in long-range capacity planning. Or they could be spiky workloads that occasionally impact production performance.
Another hybrid use case is tapping cloud services to handle unique analysis requirements. One of the Teradata customers I spoke to, for example, periodically runs data-discovery queries against large-scale historical data. This querying can impact the performance of the production system, so the company is considering copying data from its Teradata Cloud DR instance into Teradata Cloud for Hadoop for discovery analysis.
No matter what database management system you’re using, and whether you’re considering public- or private-cloud database services, register for this week’s Webinar (Thursday at 1 pm ET/10 am PT, also available on demand) to hear about hybrid deployment use cases in more detail. I’ll also share advice on pitfalls to avoid, such as lack of familiarity with cloud capacity and performance characteristics and the related mistake of over- or under-provisioning. Joining me will be Dominique Jean of Core Digital Media, who will offer a first-hand account of hybrid-deployment dos and don’ts.