

Using the Rivery API to schedule data transformations in Apache Airflow

The Rivery API allows you to integrate the functionality of Rivery’s platform into other applications or schedulers written in other programming languages.

Apache Airflow is a very popular open-source, Python-based platform used to author, monitor, and schedule workflows. Airflow’s scalability, extensibility, and integration with DevOps tools such as Docker have made it the go-to platform for data engineers building data ingestion and transformation workflows. We believe that Airflow’s customizability, dynamic nature, and scheduling options, combined with Rivery’s intuitive UI for building ELT pipelines in the cloud, make for an exciting combination: one that allows, for the first time, both technical and non-technical teams in an organization to build data workflows.


In this tutorial, geared toward advanced users coming from a data engineering background, we walk through how to enable and execute Rivers using the Apache Airflow platform, so as to better integrate Rivery with existing enterprise data engineering architecture. While Rivery can stand on its own as a fully fledged data integration, management, and orchestration tool, part of Rivery’s value comes from its adaptability to existing organizational data architecture. This tutorial assumes a basic familiarity with Apache Airflow and Docker, including creating a containerized environment using a .yml docker file and configuring a local webserver to run Apache Airflow.
First, we need to create Rivery API credentials. To do this, log into your Rivery account and select the button on the left-hand panel, give your token a name of your choice, and hit “Create”. IMPORTANT: after you name your token, you will be shown a screen with your unique token identifier. This will be the only time you will be shown this info, so copy/paste it or write it down and keep it in a safe place. You can read more about how to set up authentication credentials to enable the Rivery API here.
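Before wiring the token into Airflow, it can be worth a quick sanity check from plain Python. The sketch below is illustrative only: the RIVERY_API_TOKEN environment variable name is our choice, and the base URL, the /rivers path, and the bearer-auth header are assumptions, so substitute the actual values from the Rivery API documentation linked above.

```python
import os

import requests

# Placeholder base URL and endpoint: check the Rivery API docs
# for the real values; these are assumptions for illustration.
BASE_URL = "https://console.rivery.io/api"

# Read the token created above from an environment variable so it
# never gets hard-coded into source control.
token = os.environ["RIVERY_API_TOKEN"]

# Assumed bearer-token authentication scheme.
headers = {"Authorization": f"Bearer {token}"}

# Hypothetical "list rivers" call, used here only to verify that
# the token is accepted by the API.
response = requests.get(f"{BASE_URL}/rivers", headers=headers, timeout=30)
response.raise_for_status()
print(response.json())
```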

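With a working token, executing a River can be wrapped in an Airflow task. The following is a minimal sketch under stated assumptions: the run endpoint URL, the request payload shape, and the RIVER_ID value are placeholders we introduced (replace them with the real endpoint and identifiers from the Rivery API docs), while the DAG and PythonOperator pieces are standard Airflow 2.

```python
import os
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholders, not documented values: replace with the real run
# endpoint and your river's ID from the Rivery API documentation.
RIVERY_RUN_URL = "https://console.rivery.io/api/run"
RIVER_ID = "your-river-id"


def trigger_river() -> None:
    """Ask the Rivery API to execute a river run (assumed endpoint and payload)."""
    token = os.environ["RIVERY_API_TOKEN"]  # token created in the step above
    response = requests.post(
        RIVERY_RUN_URL,
        headers={"Authorization": f"Bearer {token}"},  # assumed bearer auth
        json={"river_id": RIVER_ID},  # assumed payload shape
        timeout=30,
    )
    response.raise_for_status()


with DAG(
    dag_id="rivery_river_run",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="trigger_river", python_callable=trigger_river)
```

A PythonOperator keeps the example self-contained; in a real deployment you might prefer storing the token in an Airflow Connection or Variable rather than an environment variable, and adding a follow-up task that polls the run’s status.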
