Terra is a cloud-native platform for biomedical researchers to access data, run analysis tools, and collaborate.
Terra powers important scientific projects like FireCloud, AnVIL, and BioData Catalyst.

Terra uses pipelines written in the Workflow Description Language (WDL) for batch processing of genomics data. Here you'll find documentation on setting up and running WDL workflows using sample data or your own data in a Google Cloud bucket.
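For orientation, here is a minimal, hypothetical WDL workflow (not one of the showcase workflows mentioned below); it wraps a single command-line task in the workflow/task structure the articles in this section refer to:

```wdl
version 1.0

# Hypothetical minimal example: count the lines in one input file.
# In Terra, in_file would typically point at a gs:// path in your bucket.
workflow CountLines {
  input {
    File in_file
  }
  call Count { input: in_file = in_file }
  output {
    Int n_lines = Count.n
  }
}

task Count {
  input {
    File in_file
  }
  command <<<
    wc -l < ~{in_file}
  >>>
  output {
    Int n = read_int(stdout())
  }
  runtime {
    docker: "ubuntu:20.04"  # tasks run inside Docker containers on the cloud
  }
}
```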

  • Get started running workflows

    We think the best way to get started running workflows on Terra is to dive in! The Terra QuickStart workspace is a hands-on practice workspace that guides you through the process. You'll follow the steps below to get experien...

  • Getting started with GATK workflows in the cloud FAQs

    If you're new to running GATK on a cloud-based platform, or new to Terra, this information will help get you started. From pre-processing raw sequencing data through variant calling and joint calling, showcase workspaces provide fully reproducible...

  • Create, edit, and share a new workflow

    Creating or editing a workflow in Terra currently requires working in the "Broad Methods Repository." Below are step-by-step instructions for how to: A. Create a new workflow, B. Edit a workflow, C. Share or clone a workflow. A. Create a new workflo...

  • Finding the workflow (method) you need in the Methods Repository

    Many workflows already exist in Terra to help carry out different steps in your research journey. Below is a step-by-step outline for finding workflows in the Broad Methods Repository. Note that...

  • Configure a workflow to process your data

    Configuring a workflow means customizing the parameters the WDL needs to run to meet your specific needs: defining Inputs - including reference files, compute parameters, and input data file names and locations - and Outputs. Some workflows (such...

  • Getting workflows up and running faster with a JSON file

    Configuring with a JSON file helps avoid manually typing in workflow inputs and outputs and can get your WDL up and running in record time. This is especially useful when you want to quickly test that a WDL workflow can run successfully on Terra w...
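As a sketch, an inputs JSON maps each fully qualified input name (WorkflowName.input_name) to a value; the workflow name and bucket paths below are placeholders, not real Terra resources:

```json
{
  "MyWorkflow.input_bam": "gs://your-bucket/data/sample1.bam",
  "MyWorkflow.ref_fasta": "gs://your-bucket/references/hg38.fasta",
  "MyWorkflow.threads": 4
}
```

The dotted naming convention (workflow name, then input name) is how Cromwell resolves each value to the matching WDL input declaration.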

  • Creating a list file of reads for input to a workflow

    Raw genomics data comes in the form of many reads from the sequencer. Since it would be messy and time-consuming to type in the location of every one of these data files as input for a WDL, the input is often a 'list' file. This file is just a list o...
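A sketch of building such a list file from the command line: the gsutil line shows the real-data pattern (the bucket path is a placeholder), while the local stand-in below it can be run anywhere:

```shell
# With real data in a Google Cloud bucket you would list it directly, e.g.:
#   gsutil ls gs://your-bucket/ubams/*.bam > ubam.list
# Local stand-in with dummy files so the commands run anywhere:
mkdir -p demo_reads
touch demo_reads/sampleA.bam demo_reads/sampleB.bam
ls demo_reads/*.bam > ubam.list   # one file path per line
cat ubam.list
```

The resulting ubam.list can then be supplied as the workflow's list-file input.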

  • How to test workflows locally

    When creating WDL workflows, you can test them on a local installation of the Cromwell execution engine before uploading and testing in Terra. Cromwell is the workflow execution service used to run and test WDL workflows. Cromwell is a Java applicati...
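A sketch of the local test loop, assuming Java is installed and you have downloaded a Cromwell release jar from the Broad's GitHub releases page (the file names here are placeholders):

```shell
# Run a single workflow locally in Cromwell's one-off "run" mode:
java -jar cromwell-<version>.jar run my_workflow.wdl --inputs my_inputs.json
# Cromwell prints task logs and, on success, the workflow outputs as JSON.
```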

  • How does pipeline versioning work?

    Terra allows you to execute predefined bioinformatics workflows against data in your workspace. Each workflow is a chain of individual software tasks. Each task comprises a set of software instructions to execute (code), as well...

  • Updating workflows to the latest version

    This article shows how to add a new version of a workflow to your Terra workspace in two instances - workflows hosted on Dockstore and those in the Broad Methods Repository. Note for FireCloud users: you may be familiar with the "snapshots" creat...

  • Monitor and troubleshoot in the Job History tab

    The Job History tab is your workspace operations dashboard, where you can check the status of past and current workflow submissions, drill down to see what’s going on and find direct links to all input and output files involved. This article walks...

  • CPUs and persistent disk quotas: What are they and how do you request more?

    In the Google Cloud Platform (GCP), quotas specify how many resources, such as central processing units (CPUs) and persistent disks (PDs), can be used by a Google Project at any given time. Quotas prevent unforeseen spikes in usage so resources ...
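One way to inspect your current quota limits and usage, assuming the gcloud CLI is installed and authenticated (the region name is only an example):

```shell
# Print quota metrics, limits, and current usage for one region:
gcloud compute regions describe us-central1
# Quota increases are requested in the Cloud Console under IAM & Admin > Quotas.
```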
