Amazon Managed Workflows for Apache Airflow (MWAA)
Amazon Web Services (AWS)

Highly available, secure, and managed workflow orchestration for Apache Airflow

Vendor

Amazon Web Services (AWS)

Product details

Secure and highly available managed workflow orchestration for Apache Airflow

Why Amazon MWAA?

Amazon MWAA is a managed service for Apache Airflow that lets you use your current, familiar Apache Airflow platform to orchestrate your workflows. You gain improved scalability, availability, and security without the operational burden of managing underlying infrastructure.

Amazon MWAA is accessible in the next generation of Amazon SageMaker

With Amazon MWAA in the next generation of Amazon SageMaker, you can deploy and scale Apache Airflow seamlessly without operational burden. With automated scaling and built-in fault tolerance, MWAA in Amazon SageMaker helps your workflows run reliably, so you can focus on innovation rather than infrastructure.

Benefits

Deploy Apache Airflow

Deploy Apache Airflow at scale without the operational burden of managing underlying infrastructure.

Run Apache Airflow workloads

Run Apache Airflow workloads in your own isolated and secure cloud environment.

Monitor environments

Monitor environments through Amazon CloudWatch integration to reduce operating costs and engineering overhead.

Connect to AWS, cloud, or on-premises resources

Connect to AWS, cloud, or on-premises resources through Apache Airflow providers or custom plugins.

Build, execute, and monitor workflows in Amazon SageMaker

Amazon MWAA powers workflows for the next generation of Amazon SageMaker, giving you access to a personal, open-source Airflow deployment that runs alongside Jupyter notebooks in Amazon SageMaker Unified Studio. You can easily develop Airflow Directed Acyclic Graphs (DAGs) that orchestrate your project artifacts, such as notebooks, queries, and training jobs.

Use cases

Support complex workflows

Create scheduled or on-demand workflows that prepare and process complex data from big data sources.

Coordinate extract, transform, and load (ETL) jobs

Orchestrate multiple ETL processes that use diverse technologies within a complex ETL workflow.

Prepare ML data

Automate your pipeline to help machine learning (ML) modeling systems ingest and then train on data.