Title: Pipeline Overview
Locale: en
URL: https://sensorswave.com/en/docs/data-center/pipeline/overview/
Description: Learn about the Pipeline feature and how to use data export pipelines

Pipeline is Sensors Wave's data export system that automatically syncs your event and user data to external data warehouses on a recurring schedule, enabling unified cross-platform analytics.

With Pipeline, you can:

- Automatically sync user behavior events from Sensors Wave to an external data warehouse for deeper analysis by your data team
- Regularly export user profile data to your data warehouse for joins with other business datasets
- Backfill historical data so your data warehouse has a complete event history

## Core Concepts

### Pipeline

A pipeline is an automated data sync channel from Sensors Wave to an external data warehouse. Each pipeline targets a specific destination and runs automatically on a configured schedule.

### Connector

A connector is the integration type between Sensors Wave and an external system. There are two categories:

- **Source connector**: Brings external data into Sensors Wave, e.g., Android SDK, iOS SDK
- **Destination connector**: Exports data from Sensors Wave to an external system, e.g., Snowflake

### Data Source Types

Each pipeline exports one type of data:

| Data Source | Description |
|-------------|-------------|
| **Events** | User behavior events, synced incrementally by time |
| **Users** | User profile attributes, synced incrementally by update time |

### Execution Frequency

Pipelines support two scheduling modes:

- **Interval Execution**: Triggered at fixed intervals (1–23 hours), e.g., every 1 hour or every 6 hours
- **Scheduled Execution**: Triggered at a fixed time in a specified timezone, e.g., daily at 08:00

### Historical Data Backfill

Backfill lets you load data for a historical time range after a pipeline is created.
Backfill runs serially in calendar-day windows and, if interrupted, can resume from the last completed window without re-processing data that has already been exported.

Pipelines are managed in **Data Center → Pipeline**, where you can:

- Create event export or user export pipelines
- Configure connection settings and execution frequency
- View run status, exported row counts, and logs for each execution
- Trigger historical data backfills
- Monitor export metric trends

## Supported Connectors

| Connector | Status | Description |
|-----------|--------|-------------|
| [SDK](sources/sdk.mdx) | Available | Collect events and user data from client and server apps via SDK |
| [Snowflake](destinations/snowflake.mdx) | Available | Export events and user data to Snowflake |

Support for additional data warehouse targets is planned for future releases.

## Relationship with Other Modules

- **Data Integration**: Pipeline exports data that has been collected via SDK or server-side API and stored in Sensors Wave. Data Integration is Pipeline's upstream
- **Data Dictionary**: The events and properties exported by Pipeline correspond to the tracked events and properties managed in the Data Dictionary
- **Analytics**: After syncing data to an external data warehouse, you can use SQL in the target system for custom analysis, complementing Sensors Wave's built-in analytics

## Next Steps

- **[Snowflake](destinations/snowflake.mdx)**: Configure Snowflake connection and start syncing data to your data warehouse

---

**Last updated**: April 13, 2026
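To make the two execution frequencies described under Core Concepts concrete, here is a minimal sketch of how the next run time could be derived in each mode. The function names and signatures are illustrative assumptions, not Sensors Wave's API.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def next_interval_run(last_run: datetime, every_hours: int) -> datetime:
    """Interval execution: fire again a fixed number of hours (1-23) after the last run."""
    if not 1 <= every_hours <= 23:
        raise ValueError("interval must be between 1 and 23 hours")
    return last_run + timedelta(hours=every_hours)

def next_scheduled_run(now: datetime, hour: int, tz: str) -> datetime:
    """Scheduled execution: fire daily at a fixed hour in the given IANA timezone.

    Sketch only: DST edge cases around the target hour are ignored.
    """
    local = now.astimezone(ZoneInfo(tz))
    candidate = local.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= local:  # today's slot already passed; schedule tomorrow
        candidate += timedelta(days=1)
    return candidate
```

For example, a pipeline on a 6-hour interval that last ran at 08:00 fires next at 14:00, while a pipeline scheduled daily at 08:00 that is evaluated at 10:30 waits until 08:00 the following day.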