High load: reading messages from SQL in real time and sending them to a Kafka broker. What architectural pattern is suitable here?


I need to build a simple Spring Boot streaming application in Java that selects messages from a database table and sends them to a Kafka topic; on success, the messages are deleted from PostgreSQL. The main problem is the high load: there can be more than 50 million incoming messages, and they must be delivered in real time. I'm considering multithreading, but I understand that concurrent threads selecting/deleting rows from the database will compete with each other and create locks.

My idea is to have: INPUT: one thread reading and deleting rows in the database, and OUTPUT: many concurrent threads taking those messages from an in-memory queue and sending them to Kafka.
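The single-reader / many-senders split above can be sketched with a bounded `BlockingQueue`. This is only a shape of the threading pattern: the database read/delete and the Kafka send are simulated with hypothetical stand-ins, not real JDBC or `KafkaProducer` calls.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: one reader thread feeds a bounded in-memory queue, a pool of
// sender threads drains it. DB and Kafka are simulated stand-ins.
public class PipelineSketch {

    public static int run(int totalMessages, int senderThreads) {
        // Bounded queue gives natural back-pressure: the reader blocks
        // when the senders fall behind.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1_000);
        AtomicInteger sent = new AtomicInteger();

        // INPUT: single reader thread (stands in for SELECT ... / DELETE ...)
        Thread reader = new Thread(() -> {
            for (int i = 0; i < totalMessages; i++) {
                try {
                    queue.put("msg-" + i);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        reader.start();

        // OUTPUT: pool of sender threads (stands in for KafkaProducer.send)
        ExecutorService senders = Executors.newFixedThreadPool(senderThreads);
        for (int t = 0; t < senderThreads; t++) {
            senders.submit(() -> {
                while (sent.get() < totalMessages) {
                    try {
                        String msg = queue.poll(100, TimeUnit.MILLISECONDS);
                        if (msg != null) sent.incrementAndGet(); // "send" succeeded
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            });
        }

        try {
            reader.join();
            senders.shutdown();
            senders.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return sent.get();
    }

    public static void main(String[] args) {
        // Every message is handed off exactly once.
        System.out.println(run(10_000, 4));
    }
}
```

Note that in a real setup the reader should delete (or mark) a row only after the corresponding Kafka send is acknowledged, otherwise a crash between delete and send loses messages.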

I haven't done any measurements, but I expect that sending N messages to the queue takes longer than reading and deleting N rows from the database table. Is that true?

In other words: under high load, which is faster: executing the SQL query or sending a message to the broker?

If the timings for fetching data from SQL and sending it to Kafka are similar, it makes sense to read data in several threads: PostgreSQL 9.5 and later supports SELECT ... FOR UPDATE ... SKIP LOCKED, which makes implementing working queueing systems a lot simpler. It becomes easy to fetch n rows that no other session has locked, keep them locked while they are processed, and release the locks by committing once the work is confirmed done.
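A minimal sketch of the claim query for the multi-reader variant is below. The table and column names (`outbox`, `id`, `payload`) are assumptions for illustration; they do not come from the question. Each worker claims a batch that no other session has locked, sends it to Kafka, then deletes the rows and commits, so the locks are released only after the work is confirmed.

```java
// Hypothetical SQL for concurrent readers using SKIP LOCKED.
// Table/column names (outbox, id, payload) are assumed, not from the question.
public class ClaimQuery {

    public static String claimSql(int batchSize) {
        return "SELECT id, payload FROM outbox "
             + "ORDER BY id "
             + "LIMIT " + batchSize + " "
             + "FOR UPDATE SKIP LOCKED";
    }

    public static String deleteSql() {
        // Used with a PreparedStatement and an array of the claimed ids.
        return "DELETE FROM outbox WHERE id = ANY (?)";
    }

    public static void main(String[] args) {
        System.out.println(claimSql(500));
    }
}
```

Because each transaction skips rows locked by other sessions, several workers can run this loop in parallel without blocking each other on the same rows.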

