Saving data from an interactive Cloud Environment (such as Galaxy, Jupyter Notebooks, or RStudio) is a useful skill. If you're worried about losing work you've done in an interactive environment because you need to delete or modify your persistent disk, you can use "gcloud storage" commands to copy that work to your workspace bucket. This article gives step-by-step instructions for doing this from either Jupyter or RStudio.
Be careful not to lose data when reducing disk size! Reducing your disk size mid-analysis can lead to loss of data stored on that disk, so before you reduce your Persistent Disk (PD) size, save your data to workspace storage first.
To learn more, see the documentation on Detachable Persistent Disks.
Why copy generated data to workspace storage?
Below are the primary reasons you might want to copy data generated in a notebook analysis to workspace storage (or to an external Google bucket).
Use generated data as input for a workflow
Files generated by a notebook are not automatically saved in workspace storage (the Google bucket) and are not accessible outside your personal virtual Jupyter Cloud Environment. To use generated files as inputs for a workflow, you must first copy them to the workspace bucket.
Share generated data with collaborators - even in a shared workspace
For the same reason, you need to copy data to workspace storage if you want to grant colleagues access. This is true even if you are working in a shared workspace, since each user has their own Cloud Environment and Persistent Disk that is inaccessible to anyone else.
Archive data
If you want to archive data, especially if you want to copy it to less expensive Nearline or Coldline storage, you need to copy it to an external bucket first.
To safeguard data when re-creating or deleting the Persistent Disk
At times, you may need to reconfigure your Cloud Environment (e.g., if you are moving between a notebook and an RStudio analysis) or delete your PD. In some cases, you can lose some or all of your generated data unless you explicitly save your output to workspace or external storage (i.e., a Google bucket). For example, you may want to decrease your PD size because you overestimated how much you would need and don't want to pay for unused space. In this case, you should back up your data before decreasing the disk size, in case the part of the disk that is deleted includes some of the data generated by your analysis.
Notebook (i.e., .ipynb) files are autosaved to workspace storage
When working in a Jupyter Notebook on Terra, your notebook is regularly autosaved to your workspace bucket, so normally you don't need to worry about saving the changes to the notebook itself (i.e., code or documentation cells). Outputs displayed in the notebook itself (e.g., plots) will also autosave. However, output files (e.g., matrices) that aren't displayed in the notebook are saved to the PD, not to workspace storage.
Copy data from a Jupyter notebook to workspace storage
You can explicitly save generated outputs to permanent cloud storage (the workspace bucket) using code in the notebook itself by following the directions below.
Step 1. Set environment variables in a Jupyter Notebook
Setting the environment variables lets the notebook grab values such as the workspace's name and its Google bucket directly. The result is cleaner and more flexible notebooks that don't require you to hardcode these specific values.
Run the commands below in a code cell:
Python:
import os
BILLING_PROJECT_ID = os.environ['WORKSPACE_NAMESPACE']
WORKSPACE = os.environ['WORKSPACE_NAME']
bucket = os.environ['WORKSPACE_BUCKET']

R:
project <- Sys.getenv('WORKSPACE_NAMESPACE')
workspace <- Sys.getenv('WORKSPACE_NAME')
bucket <- Sys.getenv('WORKSPACE_BUCKET')
Step 2. Save output files to a bucket with bash commands
Workspace storage is a Google bucket, so basic file commands in notebooks (such as cp and ls) need to be preceded by "gcloud storage."
These commands will work only if you first ran the Step 1 code above to set the environment variables. Once you execute the code below, the data files should be visible in the workspace bucket.
To save all files, use the commands below. If you want to copy individual files, you can replace `*` with the name of the file that you want to copy.
Python:
# Copy all files generated in the notebook into the bucket
!gcloud storage cp ./* $bucket
# Run the list command to check that the files are in the bucket
!gcloud storage ls $bucket

R:
# Copy all files generated in the notebook into the bucket
system(paste0("gcloud storage cp ./* ", bucket), intern = TRUE)
# Run the list command to check that the files are in the bucket
system(paste0("gcloud storage ls ", bucket), intern = TRUE)
Copy data from the persistent disk to workspace storage
Below are instructions for saving files or folders from your Cloud Environment to your workspace bucket.
Step 1: Find the files in the PD
1.1. Your Cloud Environment comes with its own storage, which you can access with command line tools by clicking on the terminal icon in the Cloud Environment sidebar (as long as your Jupyter Cloud Environment is running). This opens a command line terminal directly to your Cloud Environment virtual machine (VM).
When you first open this terminal, you're in your PD /home directory. If you use the ls command to list the contents of this directory, you'll notice that the files DON'T necessarily correspond to the notebooks listed in your workspace's Analyses tab or the "notebooks" folder in your workspace bucket files.
1.2. List the files in the "notebooks" folder (in the Terminal) using the ls command:
Compare to the files listed in the workspace's Analyses or Data tabs:
Files in the Analyses tab:
Files in workspace storage:
Why are the notebooks different in different places? The .ipynb files of the notebooks in your workspace exist before you've launched a Jupyter Cloud Environment. That's what you see stored in workspace storage (on the Data tab) and listed in the Analyses tab.
However, launching Jupyter does not automatically bring the notebooks into your Cloud Environment. The .ipynb files aren't copied to the PD until you open your notebook - either for editing, or in playground mode.
1.3. When you click to open any notebook, Terra will create a new folder in the /home directory of your Jupyter Cloud Environment, which will be named after your workspace.
Now you'll see a new directory in the Terminal, named after the workspace:
1.4. If you list the contents of this new directory, you'll find a folder named /edit. This subdirectory contains copies of all of the .ipynb files in your workspace, and these files include whatever edits you've saved to those notebooks during your current interactive session.
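For example, after opening a notebook you can confirm this layout from the terminal. In the sketch below, My-Workspace is a hypothetical stand-in for your workspace's name; substitute your own.
# List the PD home directory; a folder named after the workspace should now appear
ls
# List the notebook copies Terra placed on the PD (workspace name is hypothetical)
ls My-Workspace/edit/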
Persistent Disk file structure
It's useful to understand the file structure of your Cloud Environment and workspace bucket storage, and how to transfer things between the locations. This is especially true if you're generating output files from both workflows and interactive analyses (Galaxy, Jupyter, and RStudio), as they're stored by default in different locations.
Step 2: Copy files (gcloud storage commands)
2.1. Use gcloud storage to copy any of these files (or even the entire folder containing all of the files) to your workspace storage by copying the address of your workspace bucket from the dashboard, and using that as the destination for the copy.
To copy a single file:
Use the following command.
gcloud storage cp [file name] gs://[workspace bucket address]
HINT: You can find the Workspace bucket address in the Cloud Information section on the right side of the Dashboard page.

To copy an entire folder:
Remember to add the -r argument to the cp command to copy all contents recursively.
gcloud storage cp -r [folder name] gs://[workspace bucket address]
HINT: You can find the Workspace bucket address in the Cloud Information section on the right side of the Dashboard page.
Expected output (screenshot below)
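As a concrete sketch of the commands in Step 2.1, with hypothetical placeholders for the notebook file, the workspace folder, and the bucket address:
# Copy a single notebook file to the workspace bucket (all names below are placeholders)
gcloud storage cp My-Notebook.ipynb gs://your-workspace-bucket
# Or copy the entire edit folder recursively
gcloud storage cp -r My-Workspace/edit gs://your-workspace-bucket
# List the bucket to confirm the files arrived
gcloud storage ls gs://your-workspace-bucket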
2.2. The files you've copied are now safely in workspace storage, regardless of what you do with your Jupyter Cloud Environment or persistent disk! You can find and download them either by navigating to Files in the Data tab, or directly through the GCP console. See these two options for navigating workspace storage below.
Option 1: Open the bucket in a browser (from the Dashboard)
Option 2: Find files in the GCP console
Copy RStudio data to workspace storage
To move data generated in RStudio to permanent cloud storage, follow the directions below. Note: Directions apply to workspace storage or an external Google bucket.
Step 1. Open the built-in RStudio terminal
You can access a bash terminal from the Terminal tab in the main RStudio pane:
Step 2. Set the variable "bucket" for the destination storage
To use the workspace bucket for storage, run the command bucket="$WORKSPACE_BUCKET".
To save data to an external Google bucket, run the command bucket="gs://<your-bucket-name>".
Setting this as a variable allows you to copy/paste the commands from the documentation below.
WORKSPACE_BUCKET is an environment variable that is predefined when you use the terminal in Terra. Using environment variables lets RStudio grab the workspace Google bucket directly, so you don't have to hardcode the bucket address into the code that moves the data.
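To confirm the variable is set as expected (assuming you ran one of the commands above), you can echo it in the RStudio terminal; it should print an address starting with gs://.
# Print the destination bucket address
echo "$bucket"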
Step 3. Save files to "bucket" with bash commands
Note: Workspace storage is a Google bucket, so to move files to or from it in the RStudio terminal you use "gcloud storage" commands rather than basic bash commands such as cp or ls.
To copy all files generated in RStudio into the bucket, use the command: gcloud storage cp * "$bucket"
To make sure the files are in the bucket, run the following: gcloud storage ls "$bucket"
Be careful when copying all files
Using `*` can mean copying a lot of large files, which can be expensive. Check the size of the files before copying! If you want to copy individual files, you can replace `*` with the name of the file to copy.
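A minimal sketch of a more selective copy, using a hypothetical file name: check file sizes first, then copy only what you need.
# Check the size of the files in the current directory before copying
du -sh ./*
# Copy only the output you need (file name is hypothetical)
gcloud storage cp results_summary.csv "$bucket"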
Additional resources
To learn more about your workspace Cloud Environment storage, see Detachable Persistent Disks.
For additional bash capabilities, see Using the terminal and interactive shell in Terra.
A deeper dive: Terra's Cloud Environment
To understand what's under the hood and why RStudio and notebooks have these characteristics, see this article about key notebook components or this article about key notebook operations.