Explore how to add data to - or download from - your Workspace bucket or an external Google bucket. The best approach depends on how many files you have and what size they are, whether you're moving to or from local storage, and how comfortable you are with command-line tools.
Transfer between local storage and a workspace bucket in Terra
Uploading/downloading data in Terra
You can move files through the Terra website, rather than the command line. However, there are restrictions on how you can use this feature:
- Only for transfers between workspace storage and local storage (e.g., laptop).
- Recommended only for small numbers of small data files.
Note: This is the familiar sort of transfer you use when uploading or downloading a file on the internet. Because your local storage has no cloud-native "path", you can only transfer files stored on the machine running your browser.
To upload files from your local machine:
1. Start in the workspace Data page.
2. Select the Files icon on the lower left side.
3. In the Files section, click on the upload button with a cloud icon in the upper right corner.
4. Use the finder window to select the file(s) to upload.

To download files from the Files section:
1. Start from the workspace Data page.
2. Select the Files icon on the bottom of the left column (underneath Other Data).
3. Find the file you want to download (you may have to navigate down several levels of folders).
4. Click on the file to download. This will open a pop-up window with instructions for downloading the data in multiple ways, and the cost of the download.
To download files from a data table:
1. Start from the workspace Data page.
2. Click on the table with the data file to download on the left side of the screen. The example below is for the sample table.
Any files available for download will be shown as a link in the sample row.
3. Click on a file link to open a pop-up window describing the size of the file, instructions for downloading the data in multiple ways, and the cost of the download. For example, in the screenshot below the file could be downloaded via a terminal command or by clicking a button.
4. Click on the “Download for [cost to download your file]” button to initiate the download. Note: The download starts immediately, with no further confirmation, but you can cancel it at any time during the process.
5. Repeat for any additional files you would like to download.
Transfer using gcloud storage
When to use gcloud storage
- Works well for all size transfers
- Ideal for large file sizes or 1000s of files
- Can be used for transfers between local storage and a bucket, workspace virtual machine (VM) or persistent disk and a Google bucket, as well as between Google buckets (external and workspace)
What is the gcloud storage CLI? gcloud storage is a Python-based command-line tool that lets you access Cloud Storage from a terminal.
The terminal you use can be run on your local machine (local instance) or built into the workspace Cloud Environment (workspace instance).
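Before starting a transfer, it can help to confirm the CLI is actually available in whichever terminal you chose. A minimal check (in a Terra workspace terminal gcloud is preinstalled; on a local machine you may need to install the Google Cloud SDK first, as described below):

```shell
# Quick check that the gcloud CLI is available before attempting transfers.
# In a Terra workspace terminal it is preinstalled; on a local machine you
# may need to install the Google Cloud SDK first (see the local steps below).
if command -v gcloud >/dev/null 2>&1; then
  STATUS="gcloud is installed: $(gcloud --version | head -n 1)"
else
  STATUS="gcloud is not installed - install the Google Cloud SDK first"
fi
echo "$STATUS"
```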
gcloud storage in a terminal - Step-by-step instructions
Step 1. Open gcloud storage in a terminal
You can run a terminal locally or in your workspace. Which you use depends on where your data are located.
Which terminal instance should you use?
Moving data to or from the Cloud Environment VM/PD?
- Use the workspace terminal instance.
Moving data to or from local storage?
- Use a local terminal instance.
Google bucket to Google bucket transfer?
- You can use either instance.
Use for moving data to/from a Cloud Environment
1.1. Start a Cloud Environment if one is not already running, as this is the virtual machine the terminal runs on.
1.2. Scroll to the right of your workspace page to find one of the best-kept secrets of Terra: a command-line interface. Click the ( >_ ) icon to open what resembles a UNIX terminal.
Opening the terminal from an RStudio Cloud Environment: If you're running a Cloud Environment with RStudio, you won't see the terminal icon in the right-hand panel. Instead, follow the instructions in Using the terminal in RStudio to open the terminal.
1.3. From here, you can perform command-line tasks, including gcloud storage cp.
Use for moving data to/from local storage
1.1. Follow Google's instructions to install the Google Cloud SDK, which includes the gcloud storage CLI.
1.2. Open a Google Cloud SDK shell and run gcloud init to authenticate. You will be asked to sign in to your Google Cloud account and select your Google Cloud project.
1.3. Set a default project name using gcloud config set project MY_PROJECT.
1.4. Verify the gcloud storage installation by running gcloud storage ls to see all of the Cloud Storage buckets you have access to.
List the buckets for a specific project: run gcloud storage ls --project PROJECT_NAME. You will need owner access to the project to run this command.
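The local setup steps above can be sketched as the short command sequence below. MY_PROJECT is a placeholder for your own Google Cloud project ID; the script only prints each command so you can review it before running it yourself.

```shell
# Placeholder -- substitute your own Google Cloud project ID.
PROJECT="MY_PROJECT"

# Print (rather than run) the one-time local setup sequence.
for CMD in \
  "gcloud init" \
  "gcloud config set project ${PROJECT}" \
  "gcloud storage ls"
do
  echo "$CMD"
done
```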
Step 2. Run gcloud storage commands
Once in a terminal (either on your local machine or in a Terra workspace), you can copy data from one place to another using the cp command:
gcloud storage cp WHERE_TO_COPY_DATA_FROM/FILENAME WHERE_TO_COPY_DATA_TO
For example, to copy a file from one location in the workspace bucket to a folder called 'favorites' in the same bucket, your command would look something like this:
gcloud storage cp gs://fc-3dfd2d6a-d382-4c2b-b593-39651709b7bf/myFile.txt gs://fc-3dfd2d6a-d382-4c2b-b593-39651709b7bf/favorites
Finding the full path to the workspace bucket
In Terra, you can find the full path to the workspace bucket in the Cloud Information box on the right-hand side of the workspace's Dashboard tab. Copy this path by clicking the clipboard icon to the right of the path.
You can find the full path to an individual file in the workspace by clicking on the clipboard icon to the right of the file's name in the files section of the workspace's data tab.
Additional details on the gcloud storage cp command can be found in the Google gcloud documentation.
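In a workspace terminal you can often skip the copy-paste: Terra Cloud Environments typically export the bucket path in the WORKSPACE_BUCKET environment variable (an assumption worth verifying in your own terminal with echo $WORKSPACE_BUCKET). A minimal sketch, with a hypothetical file name, that prints the upload command rather than running it:

```shell
# WORKSPACE_BUCKET is exported in Terra Cloud Environments (assumption --
# verify with `echo $WORKSPACE_BUCKET`); fall back to a placeholder here.
BUCKET="${WORKSPACE_BUCKET:-gs://your-workspace-bucket}"

# Build the upload command for a hypothetical local file, then print it.
# Remove the echo to actually run the copy.
UPLOAD_CMD="gcloud storage cp myFile.txt ${BUCKET}/"
echo "$UPLOAD_CMD"
```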
You must be an Owner or Writer to upload to a Google bucket, including the workspace bucket!
- To generate a manifest when uploading, use the -L option:
gcloud storage cp WHERE_TO_COPY_DATA_FROM/FILENAME WHERE_TO_COPY_DATA_TO -L WHERE_TO_SAVE_MANIFEST
- To copy the file "Example.bam" from an external bucket "gs://My_GCP_bucket" into the "gene_files" folder in a workspace bucket "gs://fc-7ac2cfe6-4ac5-4a00-add1-c9b3c84a36b7", use the command:
gcloud storage cp gs://My_GCP_bucket/Example.bam gs://fc-7ac2cfe6-4ac5-4a00-add1-c9b3c84a36b7/gene_files
- To download data from a bucket, reverse the order of the arguments: bucket URL first, then the local file path:
gcloud storage cp [bucket URL]/[file name] [local file path]
Make sure to leave a space between the bucket URL and the file path. For example:
gcloud storage cp gs://WorkspaceBucket/GeneFiles/example.bam /Users/Documents
Note that operating systems specify local file paths differently -- for example, on a Windows system the local path in the example above might be Users\Documents.
To download data from a bucket that has requester pays enabled, specify a billing project with the --billing-project flag:
gcloud storage cp --billing-project GOOGLE_BILLING_PROJECT gs://BUCKET_URL/FILE_NAME LOCAL_FILE_PATH
To learn more about accessing files from a requester-pays enabled Google bucket, see the Google requester pays docs.
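A sketch of a requester-pays download follows; the billing project and bucket names are hypothetical, and note that --billing-project is the gcloud storage spelling of the flag (the -u form comes from the older gsutil tool). The script prints the command for review rather than running it:

```shell
# Hypothetical billing project and requester-pays bucket.
BILLING_PROJECT="my-terra-billing-project"
SRC="gs://requester-pays-bucket/example.bam"

# Build the download command; egress is charged to BILLING_PROJECT.
CMD="gcloud storage cp --billing-project=${BILLING_PROJECT} ${SRC} ."
echo "$CMD"
```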
- If you're downloading a folder, use the -R (or --recursive) flag to copy the folder and its contents:
gcloud storage cp -R gs://EXAMPLE_BUCKET/FOLDER_1 LOCAL_FILE_PATH
The cp command automatically runs parallel (multi-threaded/multi-processing) copies as needed. The same flag recursively copies subdirectories when uploading. For example, to copy a local directory named top-level-dir, including its subdirectories, to a bucket:
gcloud storage cp top-level-dir gs://EXAMPLE_BUCKET/FOLDER_1 --recursive
More gcloud instructions for working with large data can be found in Google's documentation. Note that gsutil's -m (parallel transfer) flag is not needed here: gcloud storage parallelizes transfers automatically.
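The recursive upload described above can be sketched as follows (directory and bucket names are hypothetical; the script prints the command instead of running it):

```shell
# Hypothetical local directory and destination bucket folder.
LOCAL_DIR="top-level-dir"
DST="gs://example-bucket/folder-1/"

# Build the recursive upload command; --recursive also copies subdirectories.
CMD="gcloud storage cp --recursive ${LOCAL_DIR} ${DST}"
echo "$CMD"
```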
File validation / checksum generation
At the end of every upload or download, the gcloud storage cp command validates that the checksum it computes for the source file/object matches the checksum the service computes. If the checksums do not match, gcloud storage deletes the corrupted object and prints a warning message. You can learn more about this in Google's documentation. This very rarely happens, but if it does, please contact gs-team@google.com.
Troubleshooting
The following are the most common errors our users encounter when moving data using gcloud storage. If you experience a different error, please note the error in the comments of this article and contact Frontline Support by clicking on contact us under Support in the main Terra menu.
- You may have trouble accessing your Terra workspaces if you authorized your gcloud SDK installation with a Google account that is not registered in Terra and granted access to your workspace. You can verify which Google account you've authorized by running the following command:
gcloud auth list
- If the Google ID returned matches the one on your Terra workspace, you should be able to access your workspace. If you still cannot, please contact your Project Manager.
- If the Google ID returned does not match the one on your Terra workspace, run the following command to specify the correct account:
gcloud auth login GOOGLE_ACCOUNT
- When working on a Unix system without a desktop browser available, you need to tell gcloud not to try to launch one. You will then receive a URL you can paste into the browser on your desktop machine. To do this, use the command:
gcloud auth login --no-launch-browser
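The troubleshooting steps above amount to a short check-then-reauthenticate sequence (the account name is hypothetical; the script prints the commands for review rather than running them):

```shell
# Hypothetical account -- substitute the Google account registered in Terra.
ACCOUNT="researcher@example.com"

# Print the check-then-reauthenticate sequence from the steps above.
for CMD in \
  "gcloud auth list" \
  "gcloud auth login ${ACCOUNT} --no-launch-browser"
do
  echo "$CMD"
done
```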