FluentD Aggregator - When to use?


We are contemplating using Fluentd to send Blue Coat log events from our proxy and DHCP servers to Elasticsearch. One of the key requirements is to flag outliers and anomalies as soon as possible.

For such a requirement, does it make sense to use a Fluentd aggregator to take in events from several (potentially hundreds of) Fluentd collector agents? Alternatively, is it better to have the Fluentd agents send events to a Kafka cluster that writes to Elasticsearch?

It would be really helpful if I could get some guidelines on when to use a Fluentd aggregator.

Thanks, Sharod

1 Answer
Generally speaking, there are two common cases:

1 - When you need to use several filter plugins
2 - When you need to throttle throughput to a destination service/DB/etc.

1 - Filter plugins consume a lot of CPU and memory, which can affect the server that generates the logs. The Fluentd agent on that server should therefore do minimal work: just forward the logs to an aggregator, Kafka, or a similar intermediary.
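A minimal sketch of such a lightweight agent config (the log path, tag, and aggregator host are hypothetical placeholders):

```
# Lightweight agent: tail the log and forward it unchanged -- no filters here.
<source>
  @type tail
  path /var/log/bluecoat/access.log        # hypothetical Blue Coat log path
  pos_file /var/log/td-agent/bluecoat.pos
  tag bluecoat.access
  <parse>
    @type none                             # no parsing on the source server
  </parse>
</source>

<match bluecoat.**>
  @type forward
  <server>
    host aggregator.example.com            # hypothetical aggregator host
    port 24224
  </server>
  <buffer>
    flush_interval 5s                      # small buffer to ride out short network blips
  </buffer>
</match>
```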

As an additional benefit, if you keep the filters on the aggregator servers, you can change the filter configuration without restarting the Fluentd agent on every source server.
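On the aggregator side, the heavy filtering and the Elasticsearch output live together, so only the aggregator needs a restart when filters change. A sketch, again with hypothetical hosts and an example enrichment filter:

```
# Aggregator: receive forwarded events, filter/enrich, then write to Elasticsearch.
<source>
  @type forward
  port 24224
</source>

# Expensive filtering happens here, not on the source servers.
<filter bluecoat.**>
  @type record_transformer
  <record>
    environment production                 # example enrichment field (hypothetical)
  </record>
</filter>

<match bluecoat.**>
  @type elasticsearch                      # requires fluent-plugin-elasticsearch
  host elastic.example.com                 # hypothetical Elasticsearch host
  port 9200
  logstash_format true
</match>
```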

2 - An external service or database generally has limits on how fast data can be inserted or updated. If many Fluentd agents write to it directly and frequently, they can easily trigger throttling errors or exceptions. To avoid this, an aggregator should sit in between, batching and pacing the writes.
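The pacing is done with the aggregator's buffer settings: events are batched into chunks and flushed at a controlled rate, with backoff on retry, instead of hundreds of agents each issuing small writes. A sketch of such a buffered output (hosts and paths are hypothetical; the buffer parameters are illustrative, not tuned values):

```
# Aggregator output with a file buffer to throttle writes to Elasticsearch.
<match bluecoat.**>
  @type elasticsearch                      # requires fluent-plugin-elasticsearch
  host elastic.example.com                 # hypothetical Elasticsearch host
  port 9200
  logstash_format true
  <buffer>
    @type file
    path /var/log/td-agent/buffer/elastic  # survives aggregator restarts
    flush_interval 10s                     # batch writes instead of per-event inserts
    chunk_limit_size 8MB                   # size of each bulk request
    flush_thread_count 2                   # cap concurrent writes to Elasticsearch
    retry_max_interval 30s                 # back off when the destination throttles
    retry_forever true                     # keep buffering rather than dropping events
  </buffer>
</match>
```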