August 19, 2019

Enterprise data warehouses are getting more expensive to maintain. Traditional data warehouses are hard to scale and often involve numerous data silos. Business teams need data insights quickly, but technology teams have to grapple with managing and providing that data using old tools that aren't keeping up with demand. Increasingly, enterprises are migrating their data warehouses to the cloud to take advantage of the speed, scalability, and access to advanced analytics that it offers.
With this in mind, we introduced the BigQuery Data Transfer Service to automate data movement to BigQuery, so you can lay the foundation for a cloud data warehouse without writing a single line of code. Earlier this year, we added the capability to move data and schema from Teradata and Amazon S3 to BigQuery via the BigQuery Data Transfer Service. To help you take advantage of the scalability of BigQuery, we've now added to that list a service, currently in beta, that transfers data from Amazon Redshift.
Data and schema migration from Redshift to BigQuery is handled by a combination of the BigQuery Data Transfer Service and a special migration agent running on Google Kubernetes Engine (GKE), and can be performed via the UI, the CLI, or the API. In the UI, you can initiate a Redshift-to-BigQuery migration from the BigQuery Data Transfer Service by choosing Redshift as the source.
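As a sketch of the CLI path, a transfer configuration can be created with the `bq mk --transfer_config` command. The parameter names and values below are illustrative assumptions about a typical Redshift transfer configuration, not verified flags, so check the current documentation before use:

```shell
# Hypothetical sketch: create a Redshift-to-BigQuery transfer configuration.
# All parameter names and values here are illustrative assumptions.
bq mk --transfer_config \
  --project_id=my-gcp-project \
  --data_source=redshift \
  --target_dataset=warehouse_dataset \
  --display_name="Redshift migration" \
  --params='{
    "jdbc_url": "jdbc:redshift://example.cluster.us-east-1.redshift.amazonaws.com:5439/dev",
    "database_username": "migration_user",
    "database_password": "REDACTED",
    "access_key_id": "AWS_ACCESS_KEY_ID",
    "secret_access_key": "AWS_SECRET_ACCESS_KEY",
    "s3_bucket": "s3://my-unload-bucket",
    "redshift_schema": "public",
    "table_name_patterns": "orders;customers"
  }'
```

The AWS credentials and S3 bucket are needed because, as described below, the migration stages data through S3 before it reaches Cloud Storage and BigQuery.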
The migration process has three steps:
- UNLOAD from Redshift to S3: The GKE agent initiates an UNLOAD operation from Redshift to S3. The agent extracts Redshift data as compressed files, which helps minimize egress costs.
- Transfer from S3 to Cloud Storage: The agent then moves the data from Amazon S3 to a Cloud Storage bucket using the Cloud Storage Transfer Service.
- Load from Cloud Storage to BigQuery: The Cloud Storage data is loaded into BigQuery (up to 10 million files).
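The compression in the first step matters because egress from AWS is billed by the byte. The effect can be illustrated locally with a toy example (this is only a stand-in for the idea, not the agent's actual code):

```python
import gzip


def compressed_size(rows):
    """Serialize rows as CSV text and return (raw_bytes, gzip_bytes).

    Toy stand-in for the agent's compressed UNLOAD: repetitive,
    column-oriented data typically compresses well, which is what
    keeps S3 egress costs down during the transfer.
    """
    raw = "\n".join(",".join(map(str, r)) for r in rows).encode("utf-8")
    return len(raw), len(gzip.compress(raw))


# Simulated extract: many rows sharing the same status and region values.
raw, gz = compressed_size([(i, "status_ok", "us-east-1") for i in range(10_000)])
print(raw, gz)  # the gzipped size is far smaller for repetitive data
```

Real warehouse extracts are rarely this uniform, but typical fact-table data still shrinks substantially under compression, which is why the agent compresses before the data ever leaves AWS.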