## Overview
The Ingestion Pipeline domain provides data normalization and ingestion coordination. It manages normalization jobs with queuing, retry logic, and status tracking for transforming raw data into standardized formats through a multi-stage pipeline process.
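The job model described above might be sketched as follows. This is a minimal illustration, not the platform's actual API: the names `NormalizationJob`, `JobStatus`, and `Stage` and every field are assumptions based on the statuses and stages listed on this page.

```python
# Hypothetical sketch of a normalization job record; names and fields are
# assumptions inferred from this page, not the platform's real schema.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class JobStatus(Enum):
    PENDING = "pending"
    RUNNING = "running"
    RETRYING = "retrying"
    COMPLETED = "completed"
    FAILED = "failed"


class Stage(Enum):
    VALIDATION = "validation"
    EXTRACTION = "extraction"
    NORMALIZATION = "normalization"
    MAPPING = "mapping"
    ENRICHMENT = "enrichment"
    COMPLETION = "completion"


@dataclass
class NormalizationJob:
    correlation_key: str              # links normalized output back to its source
    raw_payload: dict                 # original connector data, preserved as-is
    max_retries: int = 3              # configurable retry limit
    attempts: int = 0
    status: JobStatus = JobStatus.PENDING
    stage: Stage = Stage.VALIDATION
    normalized_payload: Optional[dict] = None  # populated as stages complete
```

A newly created job starts in `pending` at the `validation` stage, mirroring the lifecycle the sections below describe.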
## Key Features
- Normalization job submission with configurable retry limits
- Multi-stage pipeline processing (validation, extraction, normalization, mapping, enrichment, completion)
- Job status tracking (pending, running, retrying, completed, failed)
- Queue depth monitoring for operational visibility
- Manual job processing and retry capabilities
- Correlation key tracking for data linking
- Raw and normalized payload preservation with metrics
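The features above can be tied together in a small, self-contained sketch of submission, queue-depth monitoring, stage-by-stage processing, and retry on transient failure. Everything here is an assumption for illustration (`submit`, `process_next`, `queue_depth`, `TransientError` are invented names); the real pipeline is not documented on this page.

```python
# Minimal in-memory sketch of queuing with retry, assuming jobs are plain
# dicts and each stage has an optional handler function. Illustrative only.
from collections import deque

STAGES = ["validation", "extraction", "normalization",
          "mapping", "enrichment", "completion"]

class TransientError(Exception):
    """Recoverable failure that should trigger a retry."""

queue = deque()

def submit(job):
    """Enqueue a job with pending status and a fresh attempt counter."""
    job.update(status="pending", attempts=0)
    queue.append(job)

def queue_depth():
    """Backlog size, for operational visibility."""
    return len(queue)

def process_next(handlers):
    """Run the next job through every stage; re-enqueue on transient errors."""
    job = queue.popleft()
    job["status"] = "running"
    try:
        for stage in STAGES:
            job["stage"] = stage
            handlers.get(stage, lambda j: None)(job)
        job["status"] = "completed"
    except TransientError:
        job["attempts"] += 1
        if job["attempts"] <= job.get("max_retries", 3):
            job["status"] = "retrying"
            queue.append(job)  # back onto the queue for another attempt
        else:
            job["status"] = "failed"
    return job
```

Re-enqueuing a retrying job rather than looping in place keeps a single job from blocking the rest of the queue, which is one plausible reason the status model distinguishes `retrying` from `running`.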
## Use Cases
- Normalizing external data from various connectors into standardized platform formats
- Monitoring ingestion queue health and processing throughput
- Retrying failed normalization jobs after transient errors
- Tracking data through multi-stage processing pipelines
## Integration
The Ingestion Pipeline domain integrates with Connector Registry for data connectors, Data Source Integration for source management, and Entity Resolution for deduplication.
Last Reviewed: 2026-02-05