How to import data to Google Cloud Storage

Quick steps to set up a command-line tool to copy big files to Google Cloud Storage

In production, AWS (Amazon Web Services) and DO (Digital Ocean) have been my go-to services, but I recently had to use Google Cloud SQL for some analysis on Swaggable preferences data.

Google Cloud Platform gives you access to Google's core infrastructure, data analytics, and machine learning tools. To use them, you normally need to import your data first. Unfortunately, that's not as straightforward as it should be. Below is the problem I faced and the steps I followed to solve it.

Problem: I need to import a large MySQL dump into Google Cloud SQL for computation.

  1. Cloud SQL can only import files that are already in Google Cloud Storage, so we need to upload the dump to a storage bucket before Cloud SQL can use it.
  2. The dump file (.sql) sits on the database server, so it wouldn't be practical to download it and then re-upload it through the Google Storage web uploader. We should send the file directly from the server.
  3. Install the Google Cloud SDK on the database server (https://cloud.google.com/storage/docs/gsutil_install#deb). We need Google's utilities on the server's terminal to send files to a Google Cloud Storage bucket; of those utilities, we'll use the cp command (see the install sketch after this list).
  4. Find the dump file on the server. In my case it's in /home/ubuntu/dumps.
  5. Copy it from there to Google Cloud Storage: gsutil cp file-name.sql.gz gs://swag-101 (a fuller copy example follows below).
  6. Once the SQL file is in storage, it can be used by any Google Cloud Platform product, including Cloud SQL (see the import sketch below).
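Here's a minimal sketch of step 3, following the Debian/Ubuntu instructions on the linked docs page, plus the one-time authentication that gsutil needs. Run it on the database server:

    # Add the Cloud SDK apt repository and its signing key
    export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
    echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | \
        sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

    # Install the SDK, which ships both gcloud and gsutil
    sudo apt-get update && sudo apt-get install google-cloud-sdk

    # Authenticate and pick a default project (one-time, interactive)
    gcloud init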
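Step 5 in full. A sketch assuming the bucket swag-101 doesn't exist yet, with the dump's placeholder file name from above:

    # Create the destination bucket (skip if it already exists)
    gsutil mb gs://swag-101

    # Copy the dump straight from the server; gsutil uses resumable
    # uploads for large files, so an interrupted transfer can be retried
    gsutil cp /home/ubuntu/dumps/file-name.sql.gz gs://swag-101

    # Confirm the object arrived
    gsutil ls gs://swag-101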
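And step 6 in my case: importing the uploaded dump into Cloud SQL. A sketch with hypothetical instance and database names; depending on your SDK version the command may live under gcloud beta, and the Cloud SQL instance's service account needs read access to the bucket:

    # Import the gzipped dump into a Cloud SQL instance
    gcloud sql import sql my-instance gs://swag-101/file-name.sql.gz \
        --database=my-database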