Deltatocumulative processor panics #42163

@madaraszg-tulip

Component(s)

processor/deltatocumulative

What happened?

Description

We are running the deltatocumulative processor in a Grafana Alloy pipeline, and today it started panicking in a crash loop in one of our clusters.

Steps to Reproduce

I cannot give exact reproduction steps, as the panic depends on the incoming metric data, which I did not record. Based on the stack trace below, a hypothetical input shape is sketched here:
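
Judging by the innermost frames, the trigger appears to be a delta exponential histogram data point whose bucket-counts slice is empty while a scale change is required. This is an unverified guess; the metric name and scale value below are made up for illustration:

```go
package main

import "go.opentelemetry.io/collector/pdata/pmetric"

// buildSuspectMetrics constructs the input shape I suspect triggers the
// panic: a delta exponential histogram data point with empty bucket
// counts. Unverified; names and values are illustrative only.
func buildSuspectMetrics() pmetric.Metrics {
	md := pmetric.NewMetrics()
	m := md.ResourceMetrics().AppendEmpty().
		ScopeMetrics().AppendEmpty().
		Metrics().AppendEmpty()
	m.SetName("suspect_histogram")

	eh := m.SetEmptyExponentialHistogram()
	eh.SetAggregationTemporality(pmetric.AggregationTemporalityDelta)

	dp := eh.DataPoints().AppendEmpty()
	dp.SetScale(8) // differs from the stored series, forcing a downscale
	// Positive()/Negative() bucket counts left empty on purpose: Len() == 0
	return md
}

func main() {
	_ = buildSuspectMetrics() // would be sent over OTLP into the pipeline
}
```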

Expected Result

No crashes

Actual Result

panic: runtime error: index out of range [0] with length 0
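
For reference, the message matches what any pcommon slice raises on an out-of-bounds read; a minimal standalone demonstration (not from the real pipeline, only to show the failure mode):

```go
package main

import "go.opentelemetry.io/collector/pdata/pcommon"

func main() {
	s := pcommon.NewUInt64Slice() // freshly created, Len() == 0
	_ = s.At(0)                   // panic: runtime error: index out of range [0] with length 0
}
```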

Collector version

v0.122.0 (from Grafana Alloy v1.10.2)

Environment information

Environment

Running on Kubernetes, official Grafana Alloy image v1.10.2.

OpenTelemetry Collector configuration

No response

Log output

panic: runtime error: index out of range [0] with length 0

goroutine 64263495 [running]:
go.opentelemetry.io/collector/pdata/pcommon.UInt64Slice.At(...)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pcommon/generated_uint64slice.go:58
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/data/expo.Collapse({0x405c94c9f0?, 0x403c410ee0?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/data/expo/scale.go:98 +0x218
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/data/expo.Downscale({0x405c94c9f0?, 0x403c410ee0?}, 0x3c410ee0?, 0x6)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/data/expo/scale.go:57 +0x50
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/data.Adder.Exponential({}, {0x405c94c9a0?, 0x403c410ee0?}, {0x400f830ee0?, 0x4025d8929c?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/data/add.go:91 +0xac
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/delta.Aggregate[...]({0x405c94c9a0, 0x403c410ee0?}, {0x400f830ee0?, 0x4025d8929c?}, 0x4001e822d0?)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/delta/delta.go:60 +0xf0
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/delta.Aggregator.Exponential(...)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/delta/delta.go:77
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor.(*Processor).ConsumeMetrics.func1.1.4({0x405c94c9a0?, 0x403c410ee0?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:161 +0x60
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor.(*mutex[...]).use(0x9ec78bd7da203cd9, 0x400bab3500?)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:244 +0x94
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor.(*Processor).ConsumeMetrics.func1.1({{{{{...}}, {0x400bab3500, 0xe}, {0x0, 0x0}, {0x0, 0x0, 0x0, 0x0, 0x0, ...}}, ...}, ...}, ...)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:160 +0x920
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Metric.Filter.func4({0x400f830ee0, 0x4025d8929c})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/metrics.go:102 +0xc4
go.opentelemetry.io/collector/pdata/pmetric.ExponentialHistogramDataPointSlice.RemoveIf({0x40113274e0?, 0x4025d8929c?}, 0x4001e82908)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pmetric/generated_exponentialhistogramdatapointslice.go:127 +0x80
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Metric.Filter({{0x400221a138, 0x4025d8929c}, {0x40329dc5b0, 0x4025d8929c}, {0x400c764a68, 0x4025d8929c}}, 0x4001e82b90)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/metrics.go:100 +0x378
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor.(*Processor).ConsumeMetrics.func1({{0x400221a138, 0x4025d8929c}, {0x40329dc5b0, 0x4025d8929c}, {0x400c764a68, 0x4025d8929c}})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:96 +0xe8
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Filter.func1.1.1({0x400c764a68?, 0x4025d8929c?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/iter.go:14 +0x78
go.opentelemetry.io/collector/pdata/pmetric.MetricSlice.RemoveIf({0x40329dc5f0?, 0x4025d8929c?}, 0x4001e82cb8)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pmetric/generated_metricslice.go:127 +0x80
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Filter.func1.1({0x40329dc5b0?, 0x4025d8929c?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/iter.go:13 +0x64
go.opentelemetry.io/collector/pdata/pmetric.ScopeMetricsSlice.RemoveIf({0x400221a158?, 0x4025d8929c?}, 0x4001e82d58)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pmetric/generated_scopemetricsslice.go:127 +0x80
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Filter.func1({0x400221a120?, 0x4025d8929c?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/iter.go:12 +0x54
go.opentelemetry.io/collector/pdata/pmetric.ResourceMetricsSlice.RemoveIf({0x4018d97ad0?, 0x4025d8929c?}, 0x4001e82de8)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pmetric/generated_resourcemetricsslice.go:127 +0x80
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor/internal/metrics.Filter({0x4018d97ad0?, 0x4025d8929c?}, 0x400221a218?)
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/internal/metrics/iter.go:11 +0x3c
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor.(*Processor).ConsumeMetrics(0x40040f6ea0, {0xbe8d060, 0x40640d3800}, {0x4018d97ad0?, 0x4025d8929c?})
	/go/pkg/mod/github.com/open-telemetry/opentelemetry-collector-contrib/processor/[email protected]/processor.go:88 +0x1f4
github.com/grafana/alloy/internal/component/otelcol/internal/lazyconsumer.(*Consumer).ConsumeMetrics(0x40040f6e10, {0xbe8d060, 0x40640d3800}, {0x4018d96870?, 0x4025d88e8c?})
	/src/alloy/internal/component/otelcol/internal/lazyconsumer/lazyconsumer.go:113 +0x164
github.com/grafana/alloy/internal/component/otelcol/processor.(*Processor).Update.func2({0xbe8d060, 0x40640d3800}, {0x4018d96870?, 0x4025d88e8c?})
	/src/alloy/internal/component/otelcol/processor/processor.go:211 +0x1a4
github.com/grafana/alloy/internal/component/otelcol/internal/interceptconsumer.(*MetricsInterceptor).ConsumeMetrics(0x4009b85698?, {0xbe8d060?, 0x40640d3800?}, {0x4018d96870?, 0x4025d88e8c?})
	/src/alloy/internal/component/otelcol/internal/interceptconsumer/metrics.go:41 +0x48
go.opentelemetry.io/collector/processor/processorhelper.NewMetrics.func1({0xbe8d060, 0x40640d3800}, {0x4018d96870?, 0x4025d88e8c?})
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/processorhelper/metrics.go:66 +0x21c
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics(...)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/metrics.go:27
github.com/grafana/alloy/internal/component/otelcol/internal/lazyconsumer.(*Consumer).ConsumeMetrics(0x40020d8f30, {0xbe8d060, 0x40640d3800}, {0x4018d963d8?, 0x4025d88b8c?})
	/src/alloy/internal/component/otelcol/internal/lazyconsumer/lazyconsumer.go:113 +0x164
github.com/grafana/alloy/internal/component/otelcol/processor.(*Processor).Update.func2({0xbe8d060, 0x40640d3800}, {0x4018d963d8?, 0x4025d88b8c?})
	/src/alloy/internal/component/otelcol/processor/processor.go:211 +0x1a4
github.com/grafana/alloy/internal/component/otelcol/internal/interceptconsumer.(*MetricsInterceptor).ConsumeMetrics(0x4009c5bea8?, {0xbe8d060?, 0x40640d3800?}, {0x4018d963d8?, 0x4025d88b8c?})
	/src/alloy/internal/component/otelcol/internal/interceptconsumer/metrics.go:41 +0x48
go.opentelemetry.io/collector/processor/processorhelper.NewMetrics.func1({0xbe8d060, 0x40640d3800}, {0x4018d963d8?, 0x4025d88b8c?})
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/processorhelper/metrics.go:66 +0x21c
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics(...)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/metrics.go:27
github.com/grafana/alloy/internal/component/otelcol/internal/lazyconsumer.(*Consumer).ConsumeMetrics(0x40020d9440, {0xbe8d060, 0x40640d3800}, {0x4018db9f38?, 0x4025d88808?})
	/src/alloy/internal/component/otelcol/internal/lazyconsumer/lazyconsumer.go:113 +0x164
github.com/grafana/alloy/internal/component/otelcol/internal/fanoutconsumer.(*metricsFanout).ConsumeMetrics(0x400aaa2690, {0xbe8d060, 0x40640d3800}, {0x4018db9f38?, 0x4025d88808?})
	/src/alloy/internal/component/otelcol/internal/fanoutconsumer/metrics.go:80 +0x13c
github.com/grafana/alloy/internal/component/otelcol/receiver.(*Receiver).Update.func2({0xbe8d060, 0x40640d3800}, {0x4018db9f38?, 0x4025d88808?})
	/src/alloy/internal/component/otelcol/receiver/receiver.go:198 +0x1a4
github.com/grafana/alloy/internal/component/otelcol/internal/interceptconsumer.(*MetricsInterceptor).ConsumeMetrics(0x4009f699f0?, {0xbe8d060?, 0x40640d3800?}, {0x4018db9f38?, 0x4025d88808?})
	/src/alloy/internal/component/otelcol/internal/interceptconsumer/metrics.go:41 +0x48
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/metrics.(*Receiver).Export(0x40098d61f8, {0xbe8d060, 0x40640d37d0}, {0x4018db9f38?, 0x4025d88808?})
	/go/pkg/mod/go.opentelemetry.io/collector/receiver/[email protected]/internal/metrics/otlp.go:41 +0x9c
go.opentelemetry.io/collector/pdata/pmetric/pmetricotlp.rawMetricsServer.Export({{0xbe04180?, 0x40098d61f8?}}, {0xbe8d060, 0x40640d37d0}, 0x4018db9f38)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/pmetric/pmetricotlp/grpc.go:88 +0xec
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/metrics/v1._MetricsService_Export_Handler.func1({0xbe8d060?, 0x40640d37d0?}, {0xa41ed20?, 0x4018db9f38?})
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/metrics/v1/metrics_service.pb.go:311 +0xd0
go.opentelemetry.io/collector/config/configgrpc.(*ServerConfig).getGrpcServerOptions.enhanceWithClientInformation.func9({0xbe8d060?, 0x40640d3770?}, {0xa41ed20, 0x4018db9f38}, 0x80?, 0x4018db9f50)
	/go/pkg/mod/go.opentelemetry.io/collector/config/[email protected]/configgrpc.go:535 +0x4c
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/metrics/v1._MetricsService_Export_Handler({0x8f85700, 0x4000c14ba0}, {0xbe8d060, 0x40640d3770}, 0x4004433180, 0x40098da160)
	/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/internal/data/protogen/collector/metrics/v1/metrics_service.pb.go:313 +0x148
google.golang.org/grpc.(*Server).processUnaryRPC(0x400ac84000, {0xbe8d060, 0x40640d35f0}, 0x404573cf00, 0x40098d57a0, 0x11ca3360, 0x0)
	/go/pkg/mod/google.golang.org/[email protected]/server.go:1405 +0xc9c
google.golang.org/grpc.(*Server).handleStream(0x400ac84000, {0xbe92208, 0x4007cb36c0}, 0x404573cf00)
	/go/pkg/mod/google.golang.org/[email protected]/server.go:1815 +0x900
google.golang.org/grpc.(*Server).serveStreams.func2.1()
	/go/pkg/mod/google.golang.org/[email protected]/server.go:1035 +0x84
created by google.golang.org/grpc.(*Server).serveStreams.func2 in goroutine 1090
	/go/pkg/mod/google.golang.org/[email protected]/server.go:1046 +0x138
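
Reading the trace from the bottom up: an OTLP/gRPC export reaches the processor's ConsumeMetrics, which aggregates the incoming delta exponential histogram into the stored cumulative state; Adder.Exponential calls expo.Downscale to bring the two data points to a common scale, Downscale calls expo.Collapse to merge adjacent bucket pairs, and Collapse (scale.go:98) reads index 0 of a bucket-counts slice whose length is 0. An empty-slice guard in Collapse should avoid the crash; a minimal sketch of such a guard, my guess rather than the actual upstream fix:

```go
package expo // hypothetical placement next to the existing Collapse

import "go.opentelemetry.io/collector/pdata/pmetric"

// collapseSafe guards the pairwise bucket merge against empty slices:
// with zero buckets there is nothing to merge, so skip indexing entirely.
func collapseSafe(bs pmetric.ExponentialHistogramDataPointBuckets) {
	counts := bs.BucketCounts()
	if counts.Len() == 0 {
		// keep the offset consistent with the halved scale
		bs.SetOffset(bs.Offset() / 2)
		return
	}
	// ... existing pairwise merging over counts ...
}
```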

Additional context

No response
