I noticed that Vector already has an OpenTelemetry sink, so I tried the following config:

```yaml
sources:
  otel_traces:
    type: "opentelemetry"
    http:
      address: "0.0.0.0:4318"
    grpc:
      address: "0.0.0.0:4317"

sinks:
  store_traces:
    type: opentelemetry
    inputs:
      - otel_traces.traces
    # My OpenTelemetry traces backend.
    # It accepts protobuf-encoded trace data over HTTP.
    protocol:
      type: "http"
      uri: "http://localhost:4000/v1/otlp/v1/traces"
      method: post
      encoding:
        codec: "native"
```

Unfortunately, my trace backend can't parse the data. It seems the data has been transformed into another format inside Vector. My Vector version is: Also, I can't find any documentation that shows an example of the OpenTelemetry sink. How can I use it?
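One way to see exactly what the backend is being sent (a debugging sketch, not part of the thread's eventual solution): stand in for the real backend with a tiny HTTP receiver that logs the `Content-Type` and the first bytes of each POST, which makes it easy to tell OTLP protobuf/JSON apart from Vector's native serialization. The port and path below just mirror the config above and are otherwise arbitrary.

```python
# Minimal debug receiver: run it instead of the real trace backend and
# point the sink's `uri` at http://localhost:4000/v1/otlp/v1/traces.
from http.server import BaseHTTPRequestHandler, HTTPServer

class DebugHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Log enough to identify the wire format: the content type plus
        # a hex preview of the payload's first bytes.
        print(f"POST {self.path}")
        print(f"  Content-Type: {self.headers.get('Content-Type')}")
        print(f"  First bytes:  {body[:32].hex()}")
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        # Silence the default per-request access log.
        pass

def serve(port=4000):
    """Block forever, printing a summary of every request received."""
    HTTPServer(("0.0.0.0", port), DebugHandler).serve_forever()
```

Call `serve()` and watch what Vector actually posts; an OTLP/JSON body starts with `{`, while a protobuf or native payload will show as opaque binary.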
Hey @zyy17, if I understand correctly you are doing OTEL traces -> OTEL traces via Vector.

Edit: Please refer to #22231 (comment)

Older DIY solution (not recommended):
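A sketch of what the sink half of an OTEL-to-OTEL passthrough might look like. The key names follow the config in the question; switching the codec away from `native` is an assumption based on `native` being Vector's own serialization format, which an OTLP backend cannot parse. Whether `json` is the right codec for a given backend and Vector version should be verified against the sink reference.

```yaml
# Hypothetical fix sketch, not a verified config: same sink as in the
# question, but with the encoding codec switched from "native" (Vector's
# internal format) to "json".
sinks:
  store_traces:
    type: opentelemetry
    inputs:
      - otel_traces.traces
    protocol:
      type: "http"
      uri: "http://localhost:4000/v1/otlp/v1/traces"
      method: post
      encoding:
        codec: "json"
```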
Moving to a new thread @rauanmayemir

Edit: Here is a full OTEL-to-OTEL example.

References:
Facing a similar issue, I think... I tried looking into the network traffic and found Vector sending an array instead of JSON, even after setting the appropriate fields. The suggestion by @pront was very useful; https://vector.dev/docs/reference/configuration/sinks/opentelemetry/#how-it-works also helped with dealing with the format Vector sends versus what OTLP expects: hence adding this to transforms worked out well. I'm not sure why using the endpoint

Hope this helps in some small way
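On the "array instead of JSON" symptom: the OTLP/HTTP JSON encoding for traces expects a single top-level object with a `resourceSpans` array (resource -> scope -> spans nesting), not a bare array of events. A small Python sketch of that envelope; the field names follow the OTLP JSON encoding, while the span record itself is a made-up placeholder, not the transform the commenter above used.

```python
import json

def wrap_as_otlp_traces(spans):
    # OTLP/JSON traces payload: one top-level object whose
    # "resourceSpans" key holds resource -> scope -> span nesting.
    return {
        "resourceSpans": [
            {
                "resource": {"attributes": []},
                "scopeSpans": [{"scope": {}, "spans": spans}],
            }
        ]
    }

# Placeholder span record; real OTLP spans carry many more fields
# (timestamps, kind, status, attributes, ...).
spans = [{"name": "GET /health", "traceId": "00" * 16, "spanId": "00" * 8}]
body = json.dumps(wrap_as_otlp_traces(spans))  # a JSON object, not a bare array
```

Posting `body` (rather than `json.dumps(spans)`) is the difference between a payload an OTLP backend can parse and the bare array described above.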