
Installation and Configuration of Filebeat and Logstash



Prerequisites


  1. We’ll require 2 servers:
    • one to install Beats (Filebeat) and Apache2, generate some logs, and forward them to Logstash
    • a second server to run Logstash, which will take input (logs) from Filebeat on the first server, parse them according to the pipeline script, and forward the parsed output to Elasticsearch
  2. Log parsing: the generated logs are plain text, and searching through raw text is tedious. To make them easy to query, we convert the logs into a structured format, in this case JSON.

  3. If the logs are already generated in JSON format, we can send them directly to Elasticsearch/Elastic Cloud; there is no need for Logstash.

  4. We could configure 2 more servers for Elasticsearch and Kibana respectively, but I’ll be using Elastic Cloud for both of them.

  5. Kibana is our UI for communicating with Elasticsearch, and also for performing analysis and building dashboards over the parsed logs we receive from Logstash.
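To make point 2 concrete, here is a raw Apache access log line next to what a parsed event might look like. The JSON field names are illustrative (they depend on the grok pattern your pipeline uses), not output copied from a real run:

```
# Raw Apache access log line (plain text):
10.0.0.7 - - [12/Jan/2024:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 3041

# The same event after parsing (illustrative JSON):
{ "clientip": "10.0.0.7", "verb": "GET", "request": "/index.html", "response": 200, "bytes": 3041 }
```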

Let’s get started

1. Beats (Filebeat) Server Configuration


sudo apt update
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
sudo apt-get install apt-transport-https
echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update && sudo apt-get install filebeat
# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input-specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream

  # Unique ID among all inputs, an ID is required.
  id: Apache_filestream

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/apache2/access.log
# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.0.0.4:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

sudo systemctl daemon-reload
sudo systemctl enable filebeat.service
sudo systemctl start filebeat.service
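Before relying on the service, it's worth sanity-checking the setup. Filebeat ships `test` subcommands for exactly this; they must run on the Filebeat server after installation (so they can't be reproduced outside that host):

```shell
# Validate the syntax of /etc/filebeat/filebeat.yml
sudo filebeat test config

# Test connectivity to the configured Logstash output (10.0.0.4:5044)
sudo filebeat test output
```

If `test output` fails, check that port 5044 is reachable from this server and that Logstash is listening on it.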

2. Logstash Server Configuration

sudo apt update
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elastic-keyring.gpg
sudo apt-get install apt-transport-https
echo "deb [signed-by=/usr/share/keyrings/elastic-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-8.x.list
sudo apt-get update && sudo apt-get install logstash
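Before starting the service, the pipeline mentioned in the prerequisites has to be defined. A minimal sketch of a pipeline file (e.g. /etc/logstash/conf.d/apache.conf) that listens for Beats on port 5044, parses Apache access logs with grok, and ships to Elastic Cloud; the cloud_id and credentials are placeholders you must replace with your own:

```
input {
  beats {
    port => 5044          # matches the Logstash output port in filebeat.yml
  }
}

filter {
  grok {
    # Parse the standard Apache combined log format into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    cloud_id   => "<your-cloud-id>"        # placeholder: from your Elastic Cloud deployment
    cloud_auth => "elastic:<your-password>" # placeholder credentials
  }
}
```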
sudo systemctl daemon-reload
sudo systemctl enable logstash.service
sudo systemctl start logstash.service