
Blog

Read our Medium blogs

Integrating Snowflake OAuth with Django

Hi, dev ninjas 🥷 !

Read more on Medium

Django Application with Allauth Configuration

Hi, dev ninjas 🥷 !

Read more on Medium
Snowflake notifications ❄

dataopsly lets its users connect their Snowflake environment, and dataopsly will send email or webhook notifications by executing SYSTEM$SEND_EMAIL.

Then, when connected to any job inside dataopsly, the notification will be sent via Snowflake.

Refer here for more info!
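As a rough sketch of what happens under the hood, the statement dataopsly executes is a call to Snowflake's SYSTEM$SEND_EMAIL stored procedure. The helper below only builds that SQL string; the integration name and recipient are placeholder assumptions, not dataopsly internals.

```python
# Hypothetical sketch of the SYSTEM$SEND_EMAIL call dataopsly would execute.
# The integration name and recipient below are assumed placeholder values.

def build_send_email_sql(integration: str, recipients: str,
                         subject: str, body: str) -> str:
    """Build the CALL statement for Snowflake's SYSTEM$SEND_EMAIL procedure."""
    def q(s: str) -> str:
        return "'" + s.replace("'", "''") + "'"  # escape single quotes for SQL
    args = ", ".join(q(v) for v in (integration, recipients, subject, body))
    return f"CALL SYSTEM$SEND_EMAIL({args})"

sql = build_send_email_sql(
    "my_email_integration",       # notification integration (assumed name)
    "team@example.com",
    "dataopsly job finished",
    "Job completed successfully.",
)
print(sql)
```

In practice dataopsly would run this statement against your configured Snowflake environment, which requires a notification integration to exist in your account.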


Trigger jobs in dataopsly from Airbyte

Do you have numerous connections in Airbyte and a plethora of dbt jobs? Are you looking to trigger these jobs automatically as soon as a connection succeeds?

We understand your needs, and we're excited to present dataopsly with integrated Airbyte job capabilities.

Now, with dataopsly, you can trigger jobs directly via Airbyte Cloud. Simply navigate to the scheduler section when creating jobs, select the Airbyte job option, and enter the required credentials.

Streamline your workflows and enhance efficiency with dataopsly's new Airbyte job integration.

airbyte_job_1

airbyte_job_2

To generate a token for Airbyte Cloud, you first need an enterprise account. If you have one, go to Settings > Applications, create an application, and then click Generate token.

airbyte_job_4

airbyte_job_5

Copy the token from there, or generate one from the API token page by logging in and creating a token with a name.

For Airbyte Cloud, the API URL is usually https://api.airbyte.com/v1/connections

Add the webhook URL of dataopsly in Airbyte.

Head to Settings > Notifications > switch on Webhook > enter the URL http(s)://{your_respective_url}/airbyte/webhook/ and save the settings.
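To sanity-check the token you generated, you can build an authenticated request against the Cloud API URL mentioned above. This is a minimal sketch; the token value is a placeholder you would replace with your own, and the actual call is left commented out so it runs without credentials.

```python
# Sketch: using the generated token against the Airbyte Cloud API.
# Only the URL comes from the text above; the token is a placeholder.
import urllib.request

API_URL = "https://api.airbyte.com/v1/connections"
token = "YOUR_AIRBYTE_TOKEN"  # placeholder: paste the token you generated

req = urllib.request.Request(
    API_URL,
    headers={
        "Authorization": f"Bearer {token}",  # Airbyte expects a Bearer token
        "Accept": "application/json",
    },
)
# urllib.request.urlopen(req) would perform the request; it is omitted here
# so the snippet stays runnable without real credentials.
print(req.full_url)
```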

Now, with dataopsly, you can trigger jobs directly via Airbyte Local/OSS. Simply navigate to the scheduler section when creating jobs, select the Airbyte job option, and enter the required credentials.

Streamline your workflows and enhance efficiency with dataopsly's new Airbyte job integration.

airbyte_job_1

airbyte_job_3

To generate a token for a local install, you need the Base64 value of username:password. The default value is:

YWlyYnl0ZTpwYXNzd29yZA==
which corresponds to the username:password airbyte:password.

But if you have a different username and password, use a Base64 converter: enter them as username:password in the text box, convert, then copy the Base64 value and paste it into the token section.
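If you'd rather skip an online converter, the same Base64 value can be produced in a couple of lines. The default airbyte:password from the text yields exactly the token shown above; any other credentials work the same way.

```python
# Reproduce the local Airbyte token: Base64-encode "username:password".
import base64

def basic_token(username: str, password: str) -> str:
    """Base64-encode 'username:password' for the local Airbyte token field."""
    return base64.b64encode(f"{username}:{password}".encode()).decode()

print(basic_token("airbyte", "password"))  # → YWlyYnl0ZTpwYXNzd29yZA==
```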

For local installs, the API URL might vary; check the Airbyte website. As of the latest update to this document, it is /api/public/v1/workspaces/

Note that the URL cannot be accessed via localhost or 127.0.0.1; instead, set up port forwarding and use the forwarded link.

Add the webhook URL of dataopsly in Airbyte.

Head to Settings > Notifications > switch on Webhook > enter the URL http(s)://{your_respective_url}/airbyte/webhook/ and save the settings.

Fact
  • Just connecting the job with the respective Airbyte connection is enough; dataopsly takes care of the rest.
  • You can change the connection anytime, and you can also switch a job from an Airbyte job to a normal job and vice versa at any time!

Multi job runs

dataopsly allows its users to select multiple jobs at a time and run them all at once.