Transform how you build event-driven applications by continuously processing streams of data with a familiar developer experience.
Vendor: MongoDB
Atlas Stream Processing
Simplify integrating MongoDB with Apache Kafka to build event-driven applications.
A data model built for streaming data
Schema management is critical to data correctness and developer productivity when working with streaming data. The document model gives developers a flexible, natural data model for building apps with real-time data.
A unified developer experience
Developers can use one platform—across API, query language, and data model—to continuously process streaming data from Apache Kafka alongside the critical application data stored in their databases.
Fully managed in Atlas
With a few lines of code, developers can quickly integrate streaming data from Apache Kafka with their database to build reactive, responsive applications—all fully managed with Atlas.
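As a sketch of what "a few lines of code" can look like, the pipeline below reads events from a Kafka topic and writes matching documents into an Atlas collection. The connection names (`myKafka`, `myCluster`) and the topic, database, and collection names are illustrative assumptions; in practice they would first be registered in the Atlas Connection Registry.

```javascript
// Aggregation-style stream pipeline: Kafka source -> filter -> Atlas sink.
// Connection, topic, and collection names are hypothetical examples.
const pipeline = [
  { $source: { connectionName: "myKafka", topic: "orders" } },
  { $match: { status: "completed" } },
  {
    $merge: {
      into: {
        connectionName: "myCluster",
        db: "sales",
        coll: "completedOrders"
      }
    }
  }
];

// In mongosh connected to a stream processing instance, one would then run
// something like:
//   sp.createStreamProcessor("completedOrders", pipeline);
//   sp.completedOrders.start();
```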
Features
- **Integrate with Apache Kafka data streams:** Atlas Stream Processing makes querying data from Apache Kafka as easy as querying a MongoDB database. A stream processor is made up of a source stage, any number of processing stages, and a sink stage.
- **Perform continuous analytics using window functions:** Window operators in Atlas Stream Processing allow you to analyze and process specific, fixed-size windows of data within a continuous data stream, making it easy to discover patterns and trends in near real-time.
- **Validate schema on complex events:** In Atlas Stream Processing, developers can perform continuous validation, detecting potential message corruption and late-arriving data to ensure that events are properly formed before processing.
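A windowed stage might look like the sketch below, which groups a continuous stream into fixed-size, non-overlapping one-minute windows and computes a per-sensor average. The field names (`sensorId`, `temp`) are illustrative assumptions, not part of any fixed schema.

```javascript
// Sketch of a tumbling-window stage: fixed-size, non-overlapping windows.
// Every 60 seconds, emit the average "temp" per "sensorId" seen in that
// window. Field names are hypothetical examples.
const windowStage = {
  $tumblingWindow: {
    interval: { size: 60, unit: "second" },
    pipeline: [
      { $group: { _id: "$sensorId", avgTemp: { $avg: "$temp" } } }
    ]
  }
};
```

This stage would sit between the source and sink stages of a stream processor's pipeline; a hopping (overlapping) window is the other common variant.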
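Continuous validation can be sketched as a stage that checks each event against a JSON schema and routes failures to a dead-letter queue rather than halting the processor. The required fields and types below are illustrative assumptions for an example event shape.

```javascript
// Sketch of a continuous-validation stage: events failing the JSON-schema
// check are sent to the configured dead-letter queue ("dlq") instead of
// stopping the stream. The schema itself is a hypothetical example.
const validateStage = {
  $validate: {
    validator: {
      $jsonSchema: {
        required: ["orderId", "ts"],
        properties: {
          orderId: { bsonType: "string" },
          ts: { bsonType: "date" }
        }
      }
    },
    validationAction: "dlq" // alternative: "discard" failing events silently
  }
};
```

Placing this stage early in the pipeline ensures malformed or corrupted messages are diverted before any windowing or aggregation runs on them.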