How can I run a Kafka consumer in a Django project so that messages are passed to the Django project?


I am running my Kafka consumer (from the confluent_kafka library) in a separate Django management command, roughly sketched below. (I did this because I couldn't find a way to run a Kafka consumer inside Django's runserver process without blocking it. If there is a better way of doing this, I would love to hear suggestions.)
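For reference, the management command looks roughly like this (bootstrap servers, group id, and topic name are placeholders, not my real config):

```python
# myapp/management/commands/run_consumer.py -- sketch of my current setup.
# Bootstrap servers, group id, and topic are placeholders for my real config.
from confluent_kafka import Consumer

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Blocking Kafka consumer loop run outside of runserver"

    def handle(self, *args, **options):
        consumer = Consumer({
            "bootstrap.servers": "localhost:9092",   # placeholder
            "group.id": "django-consumer",           # placeholder
            "auto.offset.reset": "earliest",
        })
        consumer.subscribe(["model1-events"])         # placeholder topic
        try:
            while True:
                msg = consumer.poll(1.0)
                if msg is None:
                    continue
                if msg.error():
                    self.stderr.write(str(msg.error()))
                    continue
                payload = msg.value().decode("utf-8")
                # This is the part I'm unsure about: how do I get this
                # payload over to the runserver process?
                self.stdout.write(payload)
        finally:
            consumer.close()
```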

My project's requirements are:

1. The Django REST API project will receive Kafka messages on a designated topic.
2. The project will create an instance of model1 from each message.
3. Creating a model1 instance will trigger further actions/logic, such as creating/updating other models and sending further Kafka messages (see the sketch after this list).
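The existing signal-based code for requirement 3 looks roughly like this (Model1 and the downstream steps are placeholders for what's already implemented):

```python
# myapp/signals.py -- rough shape of the existing requirement-3 logic.
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import Model1  # placeholder model name


@receiver(post_save, sender=Model1)
def on_model1_created(sender, instance, created, **kwargs):
    if not created:
        return
    # Existing logic runs here: create/update related models and
    # publish further Kafka messages keyed off the new instance.
    ...
```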

My issue comes when designing the logic to meet these requirements. Requirement 3 can be met using Django signals, and there is already code that implements those further actions/logic. My problem is with requirement 2, and partly with requirement 1: my consumer lives in a separate process from my server process, so I need a way of communicating the consumed messages to the server process. What would be the best way of doing this? Using a cache would require yet another process to poll that cache for updates, which leaves me in the same situation as before. The fact that Django code runs synchronously makes designing a listener very difficult without running it in a separate process. Am I missing something here?

Or is my approach of trying to use Django's signals to meet requirement 3 flawed? Should I instead separate out and move all the logic that can follow from reading a Kafka message into the Kafka consumer process, roughly as in the sketch below?
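For context, the alternative I'm weighing would mean doing something like this inside the consumer loop, so that post_save (and everything hanging off it) fires in the consumer process rather than in runserver (payload format and field mapping are assumptions):

```python
import json

from myapp.models import Model1  # placeholder model name


def handle_message(raw_value: bytes) -> None:
    """Called from the consumer loop instead of forwarding to runserver."""
    data = json.loads(raw_value.decode("utf-8"))  # assumes a JSON payload
    # Creating the instance here fires post_save in the consumer process,
    # so the requirement-3 logic runs without any cross-process messaging.
    Model1.objects.create(**data)                 # assumes fields match keys
```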

Has anyone encountered this use case, or something similar, in Django? How is this usually done? I am also designing my first microservice architecture in Django, so if there is a flaw in my overall design I would appreciate some help/resources. Thanks!
