Kafka Streams in hexagonal architecture


I'm creating a service in hexagonal architecture that consumes data from a topic. In the same project I want to use Kafka Streams to merge a few topics into a single one, and then consume the data from that merged topic in my adapter. My problem is: where should I put the Kafka Streams code in my hexagonal architecture? Adapters don't seem like a good place for it, and neither does my domain. It's just a topic data transformation whose result will be used later in those adapters.


3 Answers

Hikmet Cakir

Hexagonal architecture aims to separate business code from framework code. In addition, framework code can access the business code, but business code should not access the framework code directly.

In the structure below, I simulated a bank account process in hexagonal architecture. The client sends an account-opening request, and the request goes to the domain for the bank-account-opening process; if the business logic needs to save account details to the database, it uses ports (interfaces or other abstract structures) to access the infrastructure.

-account
   - infrastructure 
      - rest
         - AccountRestController.java
      - dto
         - BankAccountOpeningRequest.java
         - BankAccountOpeningResponse.java
      - adapter
         - AccountJpaDataAdapter.java
      - jpa
         - repository
            - AccountJpaRepository.java
         - entity
            - AccountEntity.java
         
   - domain 
      - port
         - AccountDataPort.java
      - handler
         - BankAccountOpeningUseCaseHandler.java
      - usecase
         - BankAccountOpening.java
      - model
         - Account.java

Of course, the package structure, class structure, and other details can be changed by the developer, but one thing must not change: the separation of business code from framework code.
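That separation can be sketched in plain Java, following the names in the tree above (`Account`, `AccountDataPort`, `BankAccountOpeningUseCaseHandler`); the in-memory adapter here is a hypothetical stand-in for `AccountJpaDataAdapter`:

```java
import java.util.HashMap;
import java.util.Map;

public class HexagonalSketch {

    // domain/model
    public record Account(String id, String owner) {}

    // domain/port: the only persistence abstraction the domain knows about
    public interface AccountDataPort {
        void save(Account account);
        Account findById(String id);
    }

    // domain/handler: business code, depends only on the port
    public static class BankAccountOpeningUseCaseHandler {
        private final AccountDataPort accountDataPort;

        public BankAccountOpeningUseCaseHandler(AccountDataPort accountDataPort) {
            this.accountDataPort = accountDataPort;
        }

        public Account handle(String owner) {
            Account account = new Account("acc-" + owner, owner);
            accountDataPort.save(account); // domain never touches JPA/Kafka here
            return account;
        }
    }

    // infrastructure/adapter: framework side implements the port
    // (hypothetical in-memory stand-in for AccountJpaDataAdapter)
    public static class InMemoryAccountAdapter implements AccountDataPort {
        private final Map<String, Account> store = new HashMap<>();
        public void save(Account account) { store.put(account.id(), account); }
        public Account findById(String id) { return store.get(id); }
    }

    public static void main(String[] args) {
        AccountDataPort adapter = new InMemoryAccountAdapter();
        var handler = new BankAccountOpeningUseCaseHandler(adapter);
        Account opened = handler.handle("alice");
        System.out.println(adapter.findById(opened.id()).owner()); // prints "alice"
    }
}
```

Note the dependency direction: the handler knows only the port interface, while the adapter implements it, so swapping the in-memory adapter for a JPA one changes nothing in the domain.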

For your question: it depends on your business, but Kafka code cannot be put into the domain (business). You can create a Kafka consumer class such as ABCKafkaConsumer.java on the framework side, and that consumer class can then be used in your Kafka adapter.
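A minimal sketch of what such a framework-side consumer might look like; the actual Kafka plumbing (a poll loop or a `@KafkaListener` method) is deliberately left out, and `AccountEventPort` is a hypothetical domain port name:

```java
// Framework-side consumer: receives messages and hands them to the domain
// through a port, never the other way around.
public class ABCKafkaConsumer {

    // domain-side port: the only thing the consumer is allowed to call
    public interface AccountEventPort {
        void onAccountOpened(String accountId);
    }

    private final AccountEventPort accountEventPort;

    public ABCKafkaConsumer(AccountEventPort accountEventPort) {
        this.accountEventPort = accountEventPort;
    }

    // In a real adapter, the Kafka client would invoke this for each record
    // polled from the topic; here it just models the framework-to-domain hand-off.
    public void onMessage(String value) {
        accountEventPort.onAccountOpened(value);
    }

    public static void main(String[] args) {
        java.util.List<String> seen = new java.util.ArrayList<>();
        ABCKafkaConsumer consumer = new ABCKafkaConsumer(seen::add);
        consumer.onMessage("acc-42");
        System.out.println(seen); // prints "[acc-42]"
    }
}
```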

Nico de Wet

You are referencing what appears to be more than one software system, only one of which enforces a hexagonal architecture. The topics you wish to merge are presumably governed by other software systems in terms of lifecycle, so they are merely your external dependencies. Now, if you were to merge topics with particular semantics, and the output topic had its own schema and specification (perhaps AsyncAPI), then the application doing the merging is probably not part of the software system with the hexagonal architecture, though that system will probably have an adapter that consumes from the output topic (the merged one).

Romain Goussu

I once participated in a project that had the exact same use case (merging several topics into a single output topic that would then be consumed by a consumer of the service).

Kafka Streams, in that regard, is not really tied to your hexagonal architecture, since it never actually interacts with your domain at all :)

Meaning the logic to build your stream topology makes sense only in the context of the topology itself.
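The merging itself is typically only a few lines. A sketch, assuming the kafka-streams dependency on the classpath, placeholder topic names, and default serdes configured via the streams properties:

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

// Topology module: merges two input topics into one output topic.
// Note it references only topics, never the domain.
public class MergeTopology {

    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> a = builder.stream("topic-a");
        KStream<String, String> b = builder.stream("topic-b");
        // merge() interleaves both streams, with no ordering guarantee between them
        a.merge(b).to("topic-c");
        return builder.build();
    }
}
```

The runnable artifact then wraps this in a `KafkaStreams` instance (`new KafkaStreams(MergeTopology.build(), props).start()`), which is exactly the kind of self-contained "stream" module described below.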

In that particular project, we had the following layout :

  application
  |_ rest
  |_ consumer // Module where your actual, runnable consumer resides, having
  |           // dependencies on the domain and the infrastructure package
  |_ stream   // Module harboring the topology itself, with no dependencies on
  |           // the domain
  domain
  |_ core     // Domain implementation
  |_ contract // Domain contract (DTO and port definitions)
  infrastructure

We deployed the stream application in an isolated container, and its module had no interactions with, or dependencies on, the domain.

So, to summarize: you can treat the Kafka Streams application as an independent "application" module (that is, a module whose build produces an executable artifact) if you wish to colocate it in the same project (meaning the same codebase).

If you do not need or want to colocate it in the same codebase, it could reside in a repository of its own, because it will never be tied to your domain logic.

You can see it as a technical component performing a sort of Kafka topic arithmetic (topic A + topic B = topic C) that, in the end, has no ties to, and no impact on, your domain.

The only dependency between your stream and the rest of your service would be the topology's output topic contract, which would be a dependency of your consumer :)

Hope that helps, feel free to reach out if you have any questions/need any precisions :D