Unable to hook up OpenTelemetry Collector with Grafana Tempo


I am unable to get the OpenTelemetry Collector to work. I'm new to OpenTelemetry, so it feels like I'm making a stupid error in the configuration somewhere.

This is my Python sample script. It creates a sample trace that should be exported, picked up by the opentelemetry-collector, and pushed into the Grafana Tempo backend:

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter,
)
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import (
    BatchSpanProcessor,
    ConsoleSpanExporter,
)

span_exporter = OTLPSpanExporter(
    # endpoint="http://tempo.monitoring:3100"
    endpoint="10.120.4.111"
    # endpoint="http://10.120.7.235:4317"
)
provider = TracerProvider()
processor = BatchSpanProcessor(ConsoleSpanExporter())
span_processor = BatchSpanProcessor(span_exporter)
provider.add_span_processor(processor)
provider.add_span_processor(span_processor)
trace.set_tracer_provider(provider)


tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("foo"):
    with tracer.start_as_current_span("bar"):
        with tracer.start_as_current_span("baz"):
            print("Hello world from OpenTelemetry Python!")

This is the output when I run it:

description should only be set when status_code is set to StatusCode.ERROR
Hello world from OpenTelemetry Python!
{
    "name": "baz",
    "context": {
        "trace_id": "0xbc86870da9156c784c340ca16042bf6b",
        "span_id": "0xdd38d089ff192445",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": "1c559a143ee166cd",
    "start_time": "2021-10-08T22:56:27.973528Z",
    "end_time": "2021-10-08T22:56:27.973569Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "telemetry.sdk.language": "python",
        "telemetry.sdk.name": "opentelemetry",
        "telemetry.sdk.version": "1.0.0",
        "service.name": "unknown_service"
    }
}
{
    "name": "bar",
    "context": {
        "trace_id": "0xbc86870da9156c784c340ca16042bf6b",
        "span_id": "0x1c559a143ee166cd",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": "0a5bd46eef0c046a",
    "start_time": "2021-10-08T22:56:27.973497Z",
    "end_time": "2021-10-08T22:56:27.973611Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "telemetry.sdk.language": "python",
        "telemetry.sdk.name": "opentelemetry",
        "telemetry.sdk.version": "1.0.0",
        "service.name": "unknown_service"
    }
}
{
    "name": "foo",
    "context": {
        "trace_id": "0xbc86870da9156c784c340ca16042bf6b",
        "span_id": "0x0a5bd46eef0c046a",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": null,
    "start_time": "2021-10-08T22:56:27.973419Z",
    "end_time": "2021-10-08T22:56:27.973622Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "telemetry.sdk.language": "python",
        "telemetry.sdk.name": "opentelemetry",
        "telemetry.sdk.version": "1.0.0",
        "service.name": "unknown_service"
    }
}

However, this trace is nowhere to be found in Grafana Tempo:

curl http://tempo.monitoring:3100/api/traces/bc86870da9156c784c340ca16042bf6b
trace not found in Tempo

The opentelemetry-collector pods are not showing anything useful:

monitoring/opentelemetry-collector-agent-fhcd2[opentelemetry-collector]: 2021-10-08T22:58:16.162Z   INFO    loggingexporter/logging_exporter.go:375 MetricsExporter {"#metrics": 15}
monitoring/opentelemetry-collector-agent-bbsk8[opentelemetry-collector]: 2021-10-08T22:58:21.490Z   INFO    loggingexporter/logging_exporter.go:375 MetricsExporter {"#metrics": 15}
monitoring/opentelemetry-collector-agent-z8jlp[opentelemetry-collector]: 2021-10-08T22:58:22.314Z   INFO    loggingexporter/logging_exporter.go:375 MetricsExporter {"#metrics": 15}

Grafana Tempo is quiet as well, only logging the failed lookups:

monitoring/tempo-0[tempo]: level=info ts=2021-10-08T22:40:33.403096615Z caller=frontend.go:114 tenant=single-tenant method=GET traceID=470d349a2aa1a574 url=/api/traces/742a5bfccbfb8df24251a19c85600818 duration=1.040366ms response_size=0 status=404
monitoring/tempo-0[tempo]: level=info ts=2021-10-08T22:45:11.168260856Z caller=frontend.go:114 tenant=single-tenant method=GET traceID=281b800cb325d227 url=/api/traces/d2da9c5a4638245a8b7d8807934778bc duration=1.969761ms response_size=0 status=404
monitoring/tempo-0[tempo]: level=info ts=2021-10-08T22:57:32.889567771Z caller=frontend.go:114 tenant=single-tenant method=GET traceID=2f1eee7bb623353d url=/api/traces/bc86870da9156c784c340ca16042bf6b duration=1.123504ms response_size=0 status=404
The host 10.120.4.111 has the port open:

[root@utils /]# curl -X POST 10.120.4.111:4317
curl: (1) Received HTTP/0.9 when not allowed

The OpenTelemetry Collector is running via this Helm chart: https://open-telemetry.github.io/opentelemetry-helm-charts

My values for the opentelemetry-collector Helm chart:

config:
  exporters:
    logging: 
      loglevel: info
    otlp:
      endpoint: tempo.monitoring:4317
  receivers:
    otlp:
      grpc:
      http:
  service:
    pipelines:
      traces:
        receivers:
          - otlp
          - jaeger
          - zipkin
        processors:
          - batch
          - memory_limiter
        exporters:
          - logging
          - otlp

nodeSelector:
  apps: "true"            
standaloneCollector:
  enabled: false

I tried different values for config.exporters.otlp.endpoint, with no effect.

Any ideas?

1 Answer


I've encountered a similar issue where I had traces arriving in the opentelemetry-collector but was not able to export them to Grafana Tempo.

I suspect your problem is the receivers configuration, which looks invalid (but it's just a guess).

I think it should be like this:

config:
  receivers:
    otlp:
      protocols:
        grpc:
          endpoint: 0.0.0.0:4317
        http:
          endpoint: 0.0.0.0:4318
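
On the application side, the exporter from the script in the question also needs an explicit scheme and port. Here is a minimal sketch, assuming the collector's OTLP gRPC receiver is reachable at 10.120.4.111:4317 (the address tested with curl in the question; adjust it to your environment):

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Point the exporter at the collector's OTLP gRPC receiver with an explicit
# scheme and port; insecure=True because TLS is disabled in this example.
span_exporter = OTLPSpanExporter(
    endpoint="http://10.120.4.111:4317",  # assumption: where the collector's gRPC receiver listens
    insecure=True,
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(span_exporter))
trace.set_tracer_provider(provider)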

Once traces are reaching the opentelemetry-collector, you should see log lines like these:

2022-10-21T14:38:22.641Z        info    TracesExporter  {"kind": "exporter", "data_type": "traces", "name": "logging", "#spans": 5}
2022-10-21T14:38:32.658Z        info    TracesExporter  {"kind": "exporter", "data_type": "traces", "name": "logging", "#spans": 30}

Here's my working configuration on Kubernetes using community Helm charts:

Helm charts used: opentelemetry-collector and tempo-distributed.

opentelemetry-collector values:

config:
  exporters:
    logging:
      loglevel: info
    otlp:
      # Doc: https://github.com/open-telemetry/opentelemetry-collector/tree/main/exporter/otlpexporter
      endpoint: tempo-distributed-distributor.monitoring.svc.cluster.local:4317
      tls:
        # Disabled TLS for this example
        # Doc : https://github.com/open-telemetry/opentelemetry-collector/tree/main/config/configtls
        insecure: true
  receivers:
    otlp:
      protocols:
        grpc:
          endpoint: 0.0.0.0:4317
        http:
          endpoint: 0.0.0.0:4318
  service:
    pipelines:
      traces:
        exporters:
          - logging
          - otlp
        processors:
          - memory_limiter
          - batch
        receivers:
          - otlp

tempo-distributed values:

distributor:
  config:
    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: "0.0.0.0:4317"
          http:
            endpoint: "0.0.0.0:4318"
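
To check end to end, you can query Tempo for the trace_id printed by the ConsoleSpanExporter, as in the question. A minimal sketch; the query-frontend Service name and port below are assumptions for a typical tempo-distributed install, so adjust them to your release name and namespace:

import requests

# trace_id printed by the ConsoleSpanExporter, without the "0x" prefix
trace_id = "bc86870da9156c784c340ca16042bf6b"

# Assumed Service name/port for the tempo-distributed query frontend.
url = (
    "http://tempo-distributed-query-frontend.monitoring.svc.cluster.local:3100"
    f"/api/traces/{trace_id}"
)

resp = requests.get(url)
print(resp.status_code)  # 200 if the trace was found, 404 if not
print(resp.text[:200])   # start of the response body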

Hope this helps others!