Commit bb25255

Remove unnecessary condition to have at least one processor
Signed-off-by: Bogdan Drutu <[email protected]>
1 parent 9aa2386 commit bb25255

2 files changed: 2 additions & 10 deletions


config/config.go

Lines changed: 2 additions & 8 deletions
@@ -781,14 +781,8 @@ func validatePipelineProcessors(
 	pipeline *configmodels.Pipeline,
 	logger *zap.Logger,
 ) error {
-	if pipeline.InputType == configmodels.TracesDataType {
-		// Traces pipeline must have at least one processor.
-		if len(pipeline.Processors) == 0 {
-			return &configError{
-				code: errPipelineMustHaveProcessors,
-				msg: fmt.Sprintf("pipeline %q must have at least one processor", pipeline.Name),
-			}
-		}
+	if len(pipeline.Processors) == 0 {
+		return nil
 	}
 
 	// Validate pipeline processor name references
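To make the effect of the change concrete, here is a minimal, self-contained Go sketch of the validation shape after this commit. The Pipeline struct below is a trimmed-down stand-in invented for the sketch (the real function takes a *configmodels.Pipeline and a *zap.Logger and goes on to check processor name references); only the early return mirrors the diff above.

package main

import "fmt"

// Pipeline is a hypothetical, trimmed-down stand-in for configmodels.Pipeline,
// declared here only so the sketch is self-contained.
type Pipeline struct {
	Name       string
	Processors []string
}

// validatePipelineProcessors sketches the function's shape after this commit:
// a pipeline with no processors is simply valid, regardless of its data type,
// so the function returns early instead of rejecting traces pipelines.
func validatePipelineProcessors(pipeline *Pipeline) error {
	if len(pipeline.Processors) == 0 {
		return nil
	}
	// The real function continues by validating that every referenced
	// processor name exists; that part is unchanged by the commit and
	// omitted here.
	return nil
}

func main() {
	// A traces pipeline with an empty processor list is now accepted.
	err := validatePipelineProcessors(&Pipeline{Name: "traces"})
	fmt.Println(err) // <nil>
}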

docs/design.md

Lines changed: 0 additions & 2 deletions
@@ -110,8 +110,6 @@ In the above example “jaeger” exporter will get data from pipeline “traces
 
 A pipeline can contain sequentially connected processors. The first processor gets the data from one or more receivers that are configured for the pipeline, the last processor sends the data to one or more exporters that are configured for the pipeline. All processors between the first and last receive the data strictly only from one preceding processor and send data strictly only to the succeeding processor.
 
-The traces pipeline must have at least one processor. Metrics pipeline does not require processors since we currently do not have any implemented metrics processors yet.
-
 Processors can transform the data before forwarding it (i.e. add or remove attributes from spans), they can drop the data simply by deciding not to forward it (this is for example how “sampling” processor works), they can also generate new data (this is how for example how a “persistent-queue” processor can work after Collector restarts by reading previously saved data from a local file and forwarding it on the pipeline).
 
 The same name of the processor can be referenced in the “processors” key of multiple pipelines. In this case the same configuration will be used for each of these processors however each pipeline will always gets its own instance of the processor. Each of these processors will have its own state, the processors are never shared between pipelines. For example if “queued_retry” processor is used several pipelines each pipeline will have its own queue (although the queues will be configured exactly the same way if the reference the same key in the config file). As an example, given the following config:
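Beyond the deleted sentence, the context paragraphs above describe two properties of pipelines: processors are strictly chained between the receivers and the exporters, and a processor name referenced from several pipelines yields a separate instance, with separate state, per pipeline. The Go sketch below illustrates both points; Consumer, countingProcessor, and buildPipeline are hypothetical stand-ins invented for the sketch, not the Collector's actual interfaces.

package main

import "fmt"

// Consumer is a hypothetical stand-in for the Collector's consumer interface:
// anything that can receive a batch of spans.
type Consumer interface {
	ConsumeTraces(spans []string)
}

// exporter is the terminal consumer at the end of a pipeline.
type exporter struct{ name string }

func (e *exporter) ConsumeTraces(spans []string) {
	fmt.Printf("%s exporting %d spans\n", e.name, len(spans))
}

// countingProcessor keeps per-instance state (a counter) and forwards data
// strictly to the single next consumer in the chain.
type countingProcessor struct {
	seen int
	next Consumer
}

func (p *countingProcessor) ConsumeTraces(spans []string) {
	p.seen += len(spans)
	p.next.ConsumeTraces(spans)
}

// buildPipeline wires processors back to front so each one feeds exactly one
// successor and the last one feeds the exporter. Calling it once per pipeline
// gives every pipeline its own processor instances, even when the pipelines
// reference the same processor name (and thus the same configuration).
func buildPipeline(exp Consumer, processorCount int) Consumer {
	next := exp
	for i := 0; i < processorCount; i++ {
		next = &countingProcessor{next: next}
	}
	return next
}

func main() {
	// Two traces pipelines built from the same settings: separate chains,
	// separate processor state.
	tracesA := buildPipeline(&exporter{name: "jaeger"}, 1)
	tracesB := buildPipeline(&exporter{name: "jaeger"}, 1)
	tracesA.ConsumeTraces([]string{"span-1", "span-2"})
	tracesB.ConsumeTraces([]string{"span-3"})
}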
