
dataopsly import from YAML

dataopsly offers a feature for its enterprise users to import their environments, projects, and jobs into dataopsly as a YAML file, hassle-free. dataopsly then takes care of turning that file into the corresponding environments, projects, and jobs.

    name: environments
    type: snowflake
    account: <account_name>
    user: <username>
    password: <password>
    database: <db_name>
    warehouse: <warehouse_name>
    schema: <schema>

Check the environments page, or see which values are required for an environment based on your database here.
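
For illustration, here is what a filled-in environments entry might look like. The account, credential, and object names below are hypothetical placeholders, not real values; use the ones for your own Snowflake account.

    name: environments
    type: snowflake
    account: acme-xy12345
    user: dataops_user
    password: ********
    database: ANALYTICS
    warehouse: TRANSFORMING
    schema: PUBLIC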

    name: projects
    url: <git_url>
    ghp_token: <password>
    python_env: <dbt adapter for your database, e.g. dbt-postgres==1.7.5>
    environment: <name of your available environment>{optional}

    name: jobs
    args: dbt seed, dbt build, dbt run
    environment: <name of your available environment>{optional}
    project: <name of your available project>{optional}
    branch: <name of the respective branch>{optional}
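
As a sketch, a combined projects and jobs entry might look like the following. The repository URL, token, adapter version, environment name, project name, and branch are hypothetical examples (assuming an existing environment called snowflake-prod) and should be replaced with your own values.

    name: projects
    url: https://github.com/acme/analytics-dbt.git
    ghp_token: ghp_xxxxxxxxxxxxxxxx
    python_env: dbt-snowflake==1.7.5
    environment: snowflake-prod

    name: jobs
    args: dbt seed, dbt build, dbt run
    environment: snowflake-prod
    project: analytics-dbt
    branch: main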

After completing these steps, make sure you link the respective environment in the project.