Closed
Labels: bug (Something isn't working)
Description
Describe the bug
Intermittently, calling LogRecordCount inside the exporter batcher causes a panic in a goroutine.
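For context, LogRecordCount walks the nested resource/scope slices. A paraphrased sketch of its shape (based on the pdata/plog source referenced in the stack trace below; not verbatim) shows why a concurrent mutation can make At panic with an index-out-of-range error:

```go
// Paraphrased sketch of plog.Logs.LogRecordCount (pdata/plog/logs.go); not verbatim.
func (ld Logs) LogRecordCount() int {
	logCount := 0
	rss := ld.ResourceLogs()
	for i := 0; i < rss.Len(); i++ {
		// If another goroutine shrinks the slice between Len() and At(i),
		// At(i) indexes out of range and panics, matching the trace below.
		rs := rss.At(i)
		sls := rs.ScopeLogs()
		for j := 0; j < sls.Len(); j++ {
			logCount += sls.At(j).LogRecords().Len()
		}
	}
	return logCount
}
```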
Steps to reproduce
Not sure exactly how this happens; it occurs from time to time while data is flowing. A hypothetical race sketch is shown below.
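This is not a confirmed reproduction, but a minimal hypothetical sketch of the kind of data race that would produce this panic (pdata objects are not safe for concurrent use; everything below is illustrative):

```go
package main

import (
	"sync"

	"go.opentelemetry.io/collector/pdata/plog"
)

// Hypothetical race sketch: one goroutine counts records while another
// mutates the same plog.Logs. Because pdata is not goroutine-safe, this
// can intermittently panic inside ResourceLogsSlice.At.
func main() {
	ld := plog.NewLogs()
	var wg sync.WaitGroup
	wg.Add(2)
	go func() { // reader
		defer wg.Done()
		for i := 0; i < 100000; i++ {
			_ = ld.LogRecordCount()
		}
	}()
	go func() { // writer
		defer wg.Done()
		for i := 0; i < 100000; i++ {
			ld.ResourceLogs().AppendEmpty()
			ld.ResourceLogs().RemoveIf(func(plog.ResourceLogs) bool { return true })
		}
	}()
	wg.Wait()
}
```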
What did you expect to see?
Data ingested and exported successfully.
What did you see instead?
A goroutine panics (see the stack trace under Additional context).
What version did you use?
collector v0.96 with pdata v1.5
What config did you use?
- exporter helper is enabled
- persistent queue is enabled
- exporter timeout is enabled (see the construction sketch below)
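For reference, a minimal sketch of how an exporter with these options might be wired up through exporterhelper (using the plain NewLogsExporter variant for brevity, although the trace below goes through NewLogsRequestExporter; pushLogs and the storage extension ID are placeholders):

```go
package myexporter

import (
	"context"
	"time"

	"go.opentelemetry.io/collector/component"
	"go.opentelemetry.io/collector/exporter"
	"go.opentelemetry.io/collector/exporter/exporterhelper"
	"go.opentelemetry.io/collector/pdata/plog"
)

// createLogsExporter builds a logs exporter with a persistent queue and a
// timeout, mirroring the configuration described above. All names here are
// illustrative placeholders.
func createLogsExporter(ctx context.Context, set exporter.CreateSettings, cfg component.Config) (exporter.Logs, error) {
	pushLogs := func(ctx context.Context, ld plog.Logs) error {
		// hypothetical: deliver ld to the backend
		return nil
	}

	queueCfg := exporterhelper.NewDefaultQueueSettings()
	storageID := component.MustNewID("file_storage") // hypothetical storage extension
	queueCfg.StorageID = &storageID                  // setting a StorageID makes the queue persistent

	return exporterhelper.NewLogsExporter(ctx, set, cfg, pushLogs,
		exporterhelper.WithQueue(queueCfg),
		exporterhelper.WithTimeout(exporterhelper.TimeoutSettings{Timeout: 10 * time.Second}),
	)
}
```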
Environment
Linux and macOS
Additional context
Panic stacktrace example:
panic({0x449e640?, 0x8083de0?})
runtime/panic.go:914 +0x21f
go.opentelemetry.io/collector/pdata/plog.ResourceLogsSlice.At(...)
go.opentelemetry.io/collector/[email protected]/plog/generated_resourcelogsslice.go:56
go.opentelemetry.io/collector/pdata/plog.Logs.LogRecordCount({0xc009189050?, 0xc00535cda4?})
go.opentelemetry.io/collector/[email protected]/plog/logs.go:48 +0x20
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).ItemsCount(0xc0015df260?)
go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:63 +0x1d
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send(0xc001929e90, {0x5739760?, 0xc00a8cfdb0?}, {0x5706360?, 0xc009189068?})
go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:156 +0x98
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send(0x0?, {0x5739760?, 0xc00a8cfdb0?}, {0x5706360?, 0xc009189068?})
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:35 +0x30
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseExporter).send(0xc0013bf7c0, {0x5739760?, 0xc00a8cfdb0?}, {0x5706360?, 0xc009189068?})
go.opentelemetry.io/collector/[email protected]/exporterhelper/common.go:211 +0x66
go.opentelemetry.io/collector/exporter/exporterhelper.NewLogsRequestExporter.func1({0x5739760, 0xc00a8cfdb0}, {0xc009189050?, 0xc00535cda4?})
go.opentelemetry.io/collector/[email protected]/exporterhelper/logs.go:131 +0x325
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
go.opentelemetry.io/collector/[email protected]/logs.go:25
github.com/open-telemetry/opentelemetry-collector-contrib/processor/routingprocessor.(*logProcessor).route(0xc001922690, {0x5739760, 0xc00a8cfdb0}, {0xc009188f90?, 0xc00535ccf4?})
github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/logs.go:139 +0x45f
github.com/open-telemetry/opentelemetry-collector-contrib/processor/routingprocessor.(*logProcessor).ConsumeLogs(0xc001959a70?, {0x5739760?, 0xc00a8cfdb0?}, {0xc009188f90?, 0xc00535ccf4?})
github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/logs.go:79 +0x32