I am using "github.com/confluentinc/confluent-kafka-go/kafka" in a very simple consumer. It's pretty much what confluent has as a kafka/go tutorial.
"go build ." and "go run ." succeed, "docker build ." does not.
The error is:
```
 > [builder 7/7] RUN go build -o /app/bin/main .:
#12 9.826 # gitlab.com/.....
#12 9.826 ./main.go:24:24: undefined: kafka.ConfigMap
#12 9.826 ./main.go:30:18: undefined: kafka.NewConsumer
```
Here is my Dockerfile:
```dockerfile
FROM golang:1.20.2-alpine3.16 AS builder
RUN apk --update add git
WORKDIR /app
COPY go.mod go.sum /app/
RUN go mod download
COPY . .
RUN go build -o /app/bin/main .

FROM scratch
WORKDIR /app
COPY --from=builder /app/bin/main /app/bin/main
ENTRYPOINT ["/app/bin/main"]
```
I am at a loss as to why this happens, especially since it claims that some functions in the package are undefined. I have not redeclared "kafka" anywhere in my code.
That's because you need the native (C/C++) librdkafka dependencies in order to build it, which in most cases also means having gcc available. librdkafka is the underlying Kafka library used by your code. You also need to make go build aware of that. I added and modified two lines in your example and was able to build it.
Functional example:
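A sketch of the two lines in question, relative to the question's Dockerfile (the complete version is under the Update below):

```dockerfile
# Add a C toolchain so cgo can compile and link against librdkafka
RUN apk --update add git gcc musl-dev

# Make go build aware of it: enable cgo and select the musl librdkafka build on Alpine
RUN CGO_ENABLED=1 go build -tags musl -o /app/bin/main .
```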
I've used this quickstart code and the build process was OK.
Update
As Zeke pointed out, the Kafka Go client already ships with prebuilt librdkafka binaries.
Breaking down the Dockerfile below:
- `gcc` and `musl-dev` are required for compilation with the selected Alpine base image.
- The `-tags musl` option is needed for Alpine, as per the Kafka SDK docs.
- `-ldflags '-extldflags "-static"'` statically compiles everything into a single binary, including dependencies such as the C standard library, which keeps the scratch image clean. If you link dynamically instead, you have to add the shared-library dependencies to the scratch image stage directly.

The Dockerfile below generated an image of roughly 10MB.
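A sketch assembled from the points above (the base-image tag mirrors the question's Dockerfile, and the explicit CGO_ENABLED=1 is optional once gcc is installed):

```dockerfile
FROM golang:1.20.2-alpine3.16 AS builder

# gcc and musl-dev give cgo a C toolchain to compile and link librdkafka
RUN apk --update add git gcc musl-dev

WORKDIR /app

COPY go.mod go.sum /app/
RUN go mod download

COPY . .

# -tags musl selects the musl build of the bundled librdkafka (needed on Alpine);
# the static extldflags bake all C dependencies into the single binary,
# so nothing extra has to be copied into the scratch stage
RUN CGO_ENABLED=1 go build -tags musl -ldflags '-extldflags "-static"' -o /app/bin/main .

FROM scratch
WORKDIR /app
COPY --from=builder /app/bin/main /app/bin/main
ENTRYPOINT ["/app/bin/main"]
```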
Integration with Kafka using test code was successful when running my demo.
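For reference, a typical build-and-run sequence (the image tag is arbitrary, and `--network host` assumes a broker reachable on localhost:9092):

```sh
# Build the ~10MB image from the Dockerfile above
docker build -t kafka-consumer-demo .

# Run it against a locally reachable Kafka broker
docker run --rm --network host kafka-consumer-demo
```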