I am a beginner with Elasticsearch and Logstash. After going through any number of documents, I still can't figure out why a broker is needed between the log-shipping and indexing components. Can't we send the logs directly to Elasticsearch and start indexing?
Why is Redis, AMQP or 0MQ needed along with Elasticsearch and Logstash?
993 Views, asked by Naresh
There are 2 best solutions below
Answer from Jettro Coenradie:
Yes, you can send the logs straight to the indexer. However, there are scalability and maintainability reasons to use a broker. If the indexer becomes overloaded at some point, sending logs directly would slow everything down. Also, if you want to restart the indexer for any reason, the broker lets you keep sending logs while it is down.
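As a rough sketch of what this looks like in practice, the shipper can write events to a Redis list and the indexer can read from it. The hostnames, file paths, and key name below are illustrative, and option names (e.g. `host` vs `hosts` on the elasticsearch output) vary between Logstash versions, so check the docs for your release:

```
# shipper.conf -- runs on each application server (hypothetical paths/hosts)
input {
  file { path => "/var/log/app/*.log" }
}
output {
  redis {
    host      => "broker.example.com"   # the Redis broker
    data_type => "list"
    key       => "logstash"             # the list both sides agree on
  }
}

# indexer.conf -- runs on the central indexer
input {
  redis {
    host      => "broker.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
output {
  elasticsearch { host => "es.example.com" }
}
```

With this split, you can stop and restart the indexer freely: events pile up in the Redis list and are consumed once the indexer comes back.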
The message queue
The role of the message queue is to act as a gatekeeper and protect the parser from being overloaded with too many log messages. It comes down to the maximum number of events your parser can process per second: when the incoming rate exceeds that limit, the parser will drop events, causing data loss. To prevent this, a message queue is essential.
Pull vs Push
When you send log messages directly from the log shipper to the log parser, you are pushing the messages and hoping the parser can handle the rate at which they arrive. With a message queue, the parser instead pulls messages at the rate it can handle. When the rate is too high and the parser can't keep up, messages accumulate in the queue; once the rate drops, the parser pulls the backlog and clears the queue. A message queue is your best protection against temporary high load on your centralized logging solution.
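The pull model above can be sketched in a few lines of Python. The queue stands in for Redis/AMQP/0MQ, and the event names and "parsing" step are purely illustrative:

```python
import queue
import threading

broker = queue.Queue()   # stands in for Redis / AMQP / 0MQ
parsed = []

def shipper(n_events):
    # Producer: pushes events into the broker as fast as it can.
    for i in range(n_events):
        broker.put(f"event-{i}")

def parser():
    # Consumer: pulls one event at a time, at whatever rate it sustains.
    while True:
        event = broker.get()
        if event is None:              # sentinel: no more events
            break
        parsed.append(event.upper())   # stand-in for real parsing work

t = threading.Thread(target=parser)
t.start()
shipper(1000)
broker.put(None)   # signal end of stream
t.join()

print(len(parsed))  # all 1000 events arrive, regardless of relative speed
```

The key point is that the shipper never needs to know how fast the parser is: if the parser falls behind, events simply wait in the queue instead of being dropped.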
The database crisis
In rare cases your database server will crash, and during that time the parser has no destination for its parsed log messages. On the input side it will keep receiving messages from the log shipper and will start dropping them, so all log messages generated during the outage are lost. A message queue is a great solution here: the parser simply stops pulling events and lets them accumulate in the queue. Once the connection to the database is restored, the parser pulls all the queued events and sends them to the database. Parsing and writing such a large backlog may take some time, but eventually you will have complete access to your generated log data and nothing will be lost.
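The outage scenario can be sketched the same way. While the "database" is down, the parser stops pulling, events accumulate in the queue, and nothing is dropped; the numbers and names here are illustrative:

```python
import queue

broker = queue.Queue()
stored = []        # what eventually reaches the database
db_up = False

def arrive(event):
    broker.put(event)   # the shipper never blocks and never drops

def drain():
    # The parser pulls only while the database can accept writes.
    while db_up and not broker.empty():
        stored.append(broker.get())

# 1. Database is down; 500 events arrive during the outage.
for i in range(500):
    arrive(i)
drain()                                   # db_up is False: nothing is pulled
backlog_during_outage = broker.qsize()    # all 500 events safely queued

# 2. Database is restored; the parser clears the backlog.
db_up = True
drain()

print(backlog_during_outage, len(stored))  # 500 500 -- nothing lost
```

Without the queue in the middle, those 500 events would have had nowhere to go and would simply have been discarded.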
A layer of safety
In some cases your log files are scattered across servers outside your data center, and you want those servers to send data to your centralized logging solution. A message queue lets you keep this data safe: you can send it encrypted and limit inbound access to a single port on your message queue server. It is important to consider the security aspects of a centralized logging solution, especially in a distributed server environment.
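As a minimal sketch of the single-port idea with Redis as the broker (directive names are real Redis config options, but the values are hypothetical, and encryption in transit typically requires a TLS tunnel such as stunnel or a Redis version with native TLS support):

```
# redis.conf fragment (illustrative, not a full hardening guide)
bind 0.0.0.0                  # or only the interface remote shippers reach
port 6379                     # the single inbound port opened in the firewall
requirepass S0meLongSecret    # hypothetical password; shippers must AUTH
```

Everything else (Elasticsearch, the indexer) can then stay on the private network, with only the broker port exposed to the outside.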