Observability Pipelines

Transform, enrich, redact, reduce, and parse your observability data with real-time and scheduled pipelines. Streamline data ingestion, enhance data quality, and gain actionable insights faster.

GET STARTED FOR FREE

Why Use OpenObserve Pipelines?

OpenObserve simplifies telemetry processing with flexible pipelines and powerful enrichment capabilities. Handle any data format, enrich data on the fly, and ensure data quality for reliable analysis.


Pipeline Types

Real-time

Transform data as it arrives in your streams in real time. Parse, filter, and enrich data on the fly for immediate insights through powerful stream processing capabilities.

Scheduled

Run pipelines at defined intervals for batch processing and periodic transformations, such as computing aggregations or converting logs to metrics.


Data Transformation

VRL Functions

Create custom transformations using Vector Remap Language (VRL). Parse, enrich, and filter data with powerful VRL functions that support complex logic.
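As a minimal sketch (the field names here are illustrative, not tied to a specific stream), a VRL function might normalize one field and derive another:

```vrl
# Normalize the log level and tag events coming from production hosts.
.level = downcase(string!(.level))
.env = if contains(string!(.host), "prod") { "production" } else { "staging" }
```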

Data Parsing

Parse structured and unstructured data using prebuilt VRL functions to extract meaningful information. Convert complex log formats into structured data for easier querying.
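For example, a common-log-format line can be turned into structured fields with one of VRL's prebuilt parsers (a sketch; which parser applies depends on your source):

```vrl
# Parse an Apache/nginx common-log line into structured fields,
# merge them into the event, and drop the raw message.
parsed = parse_common_log!(string!(.message))
. = merge(., parsed)
del(.message)
```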


Data Enrichment

Enrichment Tables

Add context to your data using CSV-based lookup tables. Enrich events with location data, user information, and other external metadata.
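For instance, given a hypothetical "ip_locations" table uploaded as a CSV with ip, city, and country columns, a pipeline function could attach the matching row:

```vrl
# Look up the event's client IP in a hypothetical "ip_locations"
# enrichment table (CSV columns: ip, city, country).
row, err = get_enrichment_table_record("ip_locations", { "ip": .client_ip })
if err == null {
    .geo = row
}
```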

Dynamic Lookups

Enrich data with external information from APIs and other dynamic sources. Seamlessly integrate with external data sources to enhance the value of your observability data.


Pipeline Components

Function Nodes

Process data with VRL functions that execute custom logic. Create reusable function nodes for common data transformation tasks.

Stream Operations

Route and transform data streams with flexible stream operations. Filter data, clone streams, and route data to multiple destinations.

Get Started with Pipelines

Begin building data processing pipelines with OpenObserve. Start with our free tier or schedule a demo.

Fair and transparent pricing

Only pay for what you use.

View pricing plans

OpenObserve Cloud Free Tier

Monthly Limits:

  • Ingestion - 50 GB logs, 50 GB metrics, 50 GB traces
  • Query volume - 200 GB
  • Pipelines - 50 GB of data processing
  • 1K RUM & Session Replay
  • 1K Action Script Runs
  • 3 Users
  • 7-Day Retention
Get started for free

Get started in minutes—no credit card required.

Pipeline Management FAQs

What types of pipelines does OpenObserve support?


OpenObserve supports two main types of pipelines: real-time and scheduled. Real-time pipelines process data as it arrives, transforming and enriching it on the fly. These pipelines can include multiple processing steps using functions, conditions, and stream operations. Scheduled pipelines run at defined intervals, allowing for batch processing and periodic data transformations.

How do functions work in pipelines?


Functions in OpenObserve use Vector Remap Language (VRL) for data transformation. Each function contains VRL code that can parse, transform, and enrich data. Functions can access the incoming data fields, perform conditional processing, and modify or create new fields. For example, a function might parse JSON logs, extract specific fields, and enrich them with geographical information using enrichment tables.
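Sketched in VRL, that example might look like the following (the field names and the "geoip" table are hypothetical):

```vrl
# Parse a JSON log line, extract the fields of interest, and enrich
# the event with a hypothetical "geoip" enrichment table.
parsed = object!(parse_json!(string!(.message)))
.user_id = parsed.user_id
.client_ip = parsed.client_ip
row, err = get_enrichment_table_record("geoip", { "ip": .client_ip })
if err == null {
    .geo = row
}
```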

What are enrichment tables and how are they used?


Enrichment tables are CSV-based lookup tables that allow you to add additional context to your data. You can upload CSV files containing reference data, which can then be queried within pipeline functions using the get_enrichment_table_record function. Common use cases include IP to location mapping, user agent parsing, and adding business context to technical data.

How does data parsing work in pipelines?


Data parsing in OpenObserve pipelines is handled through VRL functions. The platform supports parsing various formats including JSON, structured logs, and custom formats. Functions can include conditional logic to apply different parsing rules based on the data source or content. Error handling is built into the parsing functions, allowing graceful handling of malformed data.
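A sketch of that error-handling pattern: attempt the parse, and fall back to flagging the event rather than dropping it (field names are illustrative):

```vrl
# Attempt JSON parsing; on failure keep the raw message and flag the event.
structured, err = parse_json(string!(.message))
if err == null {
    . = merge(., object!(structured))
} else {
    .parse_error = true
}
```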

What stream operations are available?


Pipeline streams can be configured with various operations including source selection, transformation steps, and destination routing. The platform supports both logs and metrics streams. You can create complex workflows with multiple processing steps, conditions, and parallel processing paths. Stream operations maintain data consistency while allowing for flexible transformation chains.

How do pipeline conditions work?


Conditions in pipelines allow for selective processing based on data attributes. Using VRL expressions, you can create sophisticated routing and processing logic. Conditions can check field values, apply pattern matching, and implement complex business rules. This enables targeted processing of specific data streams or message types.
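As an illustrative sketch, a condition is simply a VRL expression that evaluates to true or false for each event (the field names are assumptions):

```vrl
# Match only error-level events from the payments service.
.service == "payments" && includes(["error", "fatal"], .level)
```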

What monitoring and debugging features are available?


The platform provides visibility into pipeline execution through the pipeline viewer. You can monitor pipeline performance, track processing errors, and debug transformation logic. The system maintains metrics about pipeline throughput and processing latency. Test functions allow you to verify transformation logic before deployment.

How are pipeline changes managed?


Pipeline configurations are managed through the OpenObserve interface, where you can create, edit, and delete pipelines. Changes to pipeline functions can be tested before deployment using sample data. The platform maintains version control for pipeline configurations, allowing you to track changes and roll back if needed.

Want to Learn More? Check out our blog.

Explore log management best practices and OpenObserve's capabilities on our blog.


Complete Fortinet Firewall Monitoring Guide: Log Analysis

Learn how to monitor Fortinet firewalls using OpenObserve. Step-by-step guide for syslog setup, log transformation, and creating dashboards for real-time security monitoring.


Token Exchange & OpenObserve Service accounts

Discover OpenObserve’s Service Accounts feature, designed for secure programmatic access to APIs. Learn how token exchange enhances security and simplifies automation.


OpenObserve Reaches 15,000 GitHub Stars: A Journey to Provide Simple, Efficient, and Performant Observability for All

OpenObserve has just surpassed 15,000 stars on GitHub, a milestone that fills me with both pride and gratitude. When we started this project three years ago, the goal was simple yet ambitious: to build an open-source observability platform that is easier, faster, and dramatically more cost-effective than anything out there.


SEE ALL BLOGS

Platform

  • Logs
  • Metrics
  • Traces
  • Frontend Monitoring
  • Pipelines
  • Alerts
  • Visualizations & Dashboard

Solutions

  • Azure Monitoring
  • AWS Monitoring
  • GCP Monitoring
  • Kubernetes Observability
  • Database Monitoring
  • OpenTelemetry
  • DevOps & SRE
  • Development Teams

Company

  • About
  • Careers
  • Contact Us
  • Why OpenObserve?

Resources

  • Documentation
  • Blog
  • FAQs
  • Articles

Community

  • Slack
  • GitHub
  • Twitter
  • LinkedIn
  • YouTube

Pricing

  • View Plans

SOC 2 Type 2 Certified


OpenObserve Inc. © 2025

3000 Sand Hill Rd Building 1, Suite 260, Menlo Park, CA 94025

Terms of Service | Privacy Policy