Macrometa GDN allows you to integrate streaming data and take action on it in real time. Typical stream processing use cases involve collecting, analyzing, integrating, and acting on data generated during business activities by various sources:
| Stage | Description |
|-------|-------------|
| Collect | Receive or capture data from various data sources. |
| Analyze | Analyze data to identify interesting patterns and extract information. |
| Act | Take action based on the results and findings from processing the data. The action can be executing custom code, calling an external service, or triggering a complex integration. |
| Integrate | Make processed data available for consumers to consume globally, in the right format and with very low latency. |
Stream Workers is currently an Enterprise-only feature. We will be rolling it out to all users in Q1 of 2022. Contact [email protected] if you have any questions.
The architecture of the Macrometa stream processing engine fits this natural flow; its major components are described below. The engine receives data event by event and processes it in real time to produce meaningful information (a minimal sketch follows this list):
- Accept event inputs from many different types of sources.
- Process them to transform, enrich, and generate insights.
- Publish them to multiple types of sinks.
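For example, a minimal stream application wires these three stages together. The sketch below is illustrative only: it assumes the Siddhi-style syntax used by stream workers, and the `c8streams` source/sink type, stream names, and attributes are assumptions rather than a fixed API.

```sql
-- Minimal sketch: source -> transform -> sink (names and connector types assumed).
@App:name('TemperatureConverter')

-- Accept JSON events from an input stream.
@source(type = 'c8streams', stream.list = 'SensorReadings', @map(type = 'json'))
define stream SensorReadings (room string, tempC double);

-- Publish generated events to an output stream.
@sink(type = 'c8streams', stream = 'ConvertedReadings', @map(type = 'json'))
define stream ConvertedReadings (room string, tempF double);

-- Transform each event on the fly: convert Celsius to Fahrenheit.
from SensorReadings
select room, (tempC * 9.0 / 5.0) + 32.0 as tempF
insert into ConvertedReadings;
```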
To use the stream processor, you write the processing logic as a stream application using the streaming SQL language, which is discussed in the Stream Query Guide.
When the stream application is published, it:
- Consumes data one by one as events.
- Pipes the events to queries through various streams for processing.
- Generates new events based on the processing done by the queries.
- Finally, sends the newly generated events to output streams, as sketched below.
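The hedged sketch below traces that lifecycle: one query consumes raw events, pipes them through an intermediate stream to a second query, which generates new events and sends them to the output stream. All stream names, attributes, and thresholds here are hypothetical.

```sql
-- Event flow through a published application (all names are hypothetical).
define stream OrderStream (orderId string, amount double);
define stream ValidOrders (orderId string, amount double);   -- intermediate stream
define stream TaggedOrders (orderId string, amount double, tier string);

-- Consume events one by one and keep only the valid ones.
from OrderStream[amount > 0.0]
select orderId, amount
insert into ValidOrders;

-- Events piped through ValidOrders generate new, enriched events.
from ValidOrders
select orderId, amount,
       ifThenElse(amount > 100.0, 'priority', 'standard') as tier
insert into TaggedOrders;   -- newly generated events go to the output stream
```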
The Macrometa stream processing engine allows you to write rich and complex stream processing logic using an intuitive SQL-like language. You can perform the following actions on the fly using stream queries and constructs.
| Capability | Description |
|------------|-------------|
| Realtime ETL | Extract data as it becomes available, transform it on the fly, and integrate it using sinks (HTTP, streams, MQTT, and so on). |
| Consume & Publish Events | Consume events from and publish events to streams. |
| Data Filtering | Filter events based on conditions such as value ranges, string matching, regex, and others (see the filtering sketch after this table). |
| Data Cleansing | Filter out corrupted, inaccurate, or irrelevant data from a data stream based on one or more conditions. Modify or replace content to hide or remove unwanted parts of a message. |
| Data Transformations | Transform data on the fly from one format or structure to another. |
| Data Enrichment | Enrich the data received in the stream with data from c8db, another data stream, or an external service to derive the expected result. |
| Data Summarization | Aggregate data over windows of time or events (see the summarization sketch after this table). |
| Scripting | Write custom functions and use them within stream queries. |
| Pattern & Trend Mining | Identify event patterns and trends across streams over time. |
| Sequence Processing | Identify continuous sequences of events from streams. |
| Scatter-Gather | Process complex messages by dividing them into simpler messages, processing them individually, and combining the results. |
| Data Pipelines | Periodically trigger data pipelines based on time intervals and cron expressions. |
| Geo-Replicated Data Store | Query, modify, and join the data stored in tables, which support primary key constraints and indexing. |
| Rule Processing | Execute rules based on single events using conditional constructs. |
| Realtime Decisions as Service | Provide REST APIs to query the processed data for on-demand decision making. |
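To make the Data Filtering and Data Cleansing rows concrete, here is a hedged sketch that keeps only in-range readings from plausibly named rooms. The stream names, value ranges, and the room-name pattern are assumptions; `regex:matches` is the Siddhi regex extension function.

```sql
-- Filtering/cleansing sketch (stream names, ranges, and pattern assumed).
define stream RawReadings (room string, temp double);
define stream CleanReadings (room string, temp double);

-- Drop corrupted or irrelevant events: out-of-range values and
-- room names that do not match the expected pattern.
from RawReadings[temp > -50.0 and temp < 150.0
    and regex:matches('room-[0-9]+', room)]
select room, temp
insert into CleanReadings;
```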
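Similarly, the Data Summarization row can be sketched with a window-based aggregation. The one-minute tumbling window and the stream definitions below are illustrative assumptions.

```sql
-- Summarization sketch: per-room statistics over a one-minute tumbling window.
define stream CleanReadings (room string, temp double);
define stream RoomSummary (room string, avgTemp double, maxTemp double);

from CleanReadings#window.timeBatch(1 min)
select room, avg(temp) as avgTemp, max(temp) as maxTemp
group by room
insert into RoomSummary;
```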
These features allow you to build robust, global data processing and integration pipelines at the edge by combining powerful stream processing, a multi-model database, and geo-replicated streams.