HowTo: Setup ELK Stack 2025: Centralize Logs

Centralize logs with ELK Stack: install Elasticsearch, Logstash and Kibana, configure Beats, secure the cluster with TLS and build insightful dashboards.

1. Introduction

In today’s complex IT environments, centralized log management is essential for effective cybersecurity, troubleshooting, and compliance. The ELK Stack—comprising Elasticsearch, Logstash, and Kibana—has become the industry standard for aggregating, searching, and visualizing logs from diverse sources. This comprehensive tutorial, HowTo: Setup ELK Stack 2025: Centralize Logs, guides you step-by-step through deploying the ELK Stack, securing your setup, and leveraging its full potential for log centralization and analysis.

Whether you’re a security analyst, DevOps engineer, or IT administrator, mastering the ELK Stack will empower you to detect threats, ensure compliance, and optimize system performance. Let’s get started!

2. What is the ELK Stack?

The ELK Stack is a powerful open-source platform for log management and data analytics. It enables organizations to collect logs from various sources, process and enrich them, and visualize insights in real time. Here’s a breakdown of its core components:

2.1 Overview of Elasticsearch

Elasticsearch is a distributed, RESTful search and analytics engine. It stores, indexes, and enables fast querying of large volumes of structured and unstructured data. Its scalability and speed make it ideal for log analytics, security monitoring, and operational intelligence. Learn more at Elastic’s official documentation.

2.2 Overview of Logstash

Logstash is a data processing pipeline that ingests, transforms, and forwards data. It supports a wide range of inputs (files, syslog, cloud sources), filters (parsing, enrichment), and outputs (Elasticsearch, files, alerts). Logstash is highly extensible and crucial for log centralization.

2.3 Overview of Kibana

Kibana is a visualization and exploration tool for data stored in Elasticsearch. It provides interactive dashboards, search capabilities, and reporting features. Security teams use Kibana to monitor threats, investigate incidents, and demonstrate compliance.

2.4 Why Centralize Logs?

Centralizing logs with the ELK Stack offers several advantages:

  • Improved Security Monitoring: Detect threats and anomalies across your infrastructure.
  • Faster Troubleshooting: Correlate events from multiple systems for rapid root cause analysis.
  • Regulatory Compliance: Meet requirements for log retention and audit trails (see CIS Controls: Log Management).
  • Operational Insights: Gain visibility into application and system performance.
For more on log management strategies, see Log Management Best Practices 2025.

3. Prerequisites

Before installing the ELK Stack, ensure your environment meets the following requirements.

3.1 System Requirements

For a basic setup, the following minimum specifications are recommended:

  • CPU: 2+ cores
  • RAM: 4GB+ (8GB+ for production)
  • Disk: SSD storage, 20GB+ free space
  • OS: Ubuntu 22.04 LTS, CentOS 8, or compatible Linux distribution
  • Network: Reliable connectivity between ELK components and log sources
For larger environments, refer to Elastic’s hardware guidelines.
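As a quick sanity check, a short script along these lines (a sketch based on the minimums above; adjust the thresholds to your own sizing) can flag an undersized host before you install anything:

```shell
#!/bin/sh
# Preflight check against the suggested minimums: 2+ cores, 4GB+ RAM, 20GB+ free disk.
cpus=$(nproc)
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
disk_gb=$(df -BG --output=avail / | tail -1 | tr -dc '0-9')

[ "$cpus" -ge 2 ]         || echo "WARN: only $cpus CPU core(s); 2+ recommended"
[ "$mem_kb" -ge 4000000 ] || echo "WARN: less than ~4GB RAM"
[ "$disk_gb" -ge 20 ]     || echo "WARN: less than 20GB free on /"
echo "Preflight check complete."
```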

3.2 Network and Security Considerations

Network segmentation is crucial. Place ELK components on trusted networks, restrict access using firewalls, and avoid exposing Elasticsearch or Kibana directly to the internet. For best practices, consult CISA’s guidance on securing open-source software. To further harden your setup, you may also want to review Secure Coding Practices 2025: Top 10 Tips.
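As an illustration, with ufw on Ubuntu you might restrict the stack's ports to a trusted admin subnet (10.0.0.0/24 here is a placeholder for your own network):

```shell
# Placeholder subnet: replace 10.0.0.0/24 with your trusted admin network.
sudo ufw default deny incoming
sudo ufw allow from 10.0.0.0/24 to any port 9200 proto tcp   # Elasticsearch HTTP
sudo ufw allow from 10.0.0.0/24 to any port 5601 proto tcp   # Kibana
sudo ufw allow from 10.0.0.0/24 to any port 5044 proto tcp   # Beats -> Logstash
sudo ufw enable
```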

3.3 Choosing Your Deployment Environment

You can deploy the ELK Stack:

  • On-premises: Full control, but requires hardware and maintenance.
  • Cloud (AWS, Azure, GCP): Scalable, managed options available.
  • Containers (Docker, Kubernetes): Flexible, portable, ideal for DevOps workflows.
Select the environment that aligns with your organization’s needs and compliance requirements.

4. Preparing Your Environment

Proper preparation ensures a smooth installation and secure operation of the ELK Stack.

4.1 Updating System Packages

Update your system to the latest packages and security patches:

sudo apt update && sudo apt upgrade -y   # For Ubuntu/Debian
sudo dnf update -y                         # For CentOS/RHEL 8+

4.2 Installing Java (if required)

Recent Elasticsearch and Logstash packages bundle their own JDK, so a separate Java installation is usually unnecessary. If you prefer to run Logstash on your own JVM, it supports Java 17; verify an existing installation with:

java -version
To install OpenJDK 17:
sudo apt install openjdk-17-jdk -y          # Ubuntu/Debian
sudo dnf install java-17-openjdk -y         # CentOS/RHEL

4.3 Setting Up User Permissions

For security, create dedicated system users for each ELK component:

sudo adduser --system --no-create-home --group elasticsearch
sudo adduser --system --no-create-home --group logstash
sudo adduser --system --no-create-home --group kibana
Assign correct ownership to their respective directories and files.
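The official .deb/.rpm packages create these users and set ownership automatically; if you install from a tar.gz archive instead, something like the following (assuming the default directory layout) does it by hand:

```shell
# Only needed for archive installs; package installs handle ownership for you.
sudo chown -R elasticsearch:elasticsearch /var/lib/elasticsearch /var/log/elasticsearch
sudo chown -R logstash:logstash /var/lib/logstash /var/log/logstash
sudo chown -R kibana:kibana /var/lib/kibana /var/log/kibana
```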

5. Installing Elasticsearch

Elasticsearch is the backbone of the ELK Stack, responsible for storing and indexing logs.

5.1 Download and Install

Import Elastic’s GPG key (apt-key is deprecated, so store the key under /usr/share/keyrings) and add the repository:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --dearmor -o /usr/share/keyrings/elasticsearch-keyring.gpg
sudo apt install apt-transport-https
echo "deb [signed-by=/usr/share/keyrings/elasticsearch-keyring.gpg] https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt update
sudo apt install elasticsearch
For RPM-based systems, see official instructions.

5.2 Configuring Elasticsearch

Edit /etc/elasticsearch/elasticsearch.yml:

network.host: 127.0.0.1
http.port: 9200
cluster.name: elk-cluster
node.name: elk-node-1
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
For production, configure cluster.initial_master_nodes and adjust heap size in jvm.options.
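As an illustration, a production elasticsearch.yml might add bootstrap and discovery settings like these (node names and addresses are placeholders), with the heap pinned in a jvm.options.d drop-in:

```yaml
# elasticsearch.yml (production sketch; names and addresses are placeholders)
cluster.initial_master_nodes: ["elk-node-1", "elk-node-2", "elk-node-3"]
discovery.seed_hosts: ["10.0.0.11", "10.0.0.12", "10.0.0.13"]
```

```
# /etc/elasticsearch/jvm.options.d/heap.options
# Rule of thumb: set Xms equal to Xmx, at most ~50% of system RAM.
-Xms4g
-Xmx4g
```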

5.3 Securing Elasticsearch

Enable built-in security features (since Elastic 8.x, security is enabled by default):

  • Set strong passwords for built-in users with the elasticsearch-reset-password tool (elasticsearch-setup-passwords is deprecated in 8.x)
  • Enable TLS for HTTP and transport layers (TLS configuration guide)
  • Restrict network access to trusted IPs only
If you are managing sensitive or regulated data, you may also benefit from reviewing Password Policy Best Practices 2025 to strengthen authentication.
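For example, the elasticsearch-reset-password tool (which supersedes elasticsearch-setup-passwords in 8.x) can set or auto-generate credentials for the built-in users:

```shell
# Reset the built-in superuser; add -i to choose the password interactively.
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u elastic
# The kibana_system account is referenced later in kibana.yml.
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u kibana_system
```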

5.4 Starting and Testing Elasticsearch

Start and enable Elasticsearch:

sudo systemctl enable --now elasticsearch
Test with:
curl -u elastic 'https://localhost:9200' --cacert /etc/elasticsearch/certs/http_ca.crt
You should see cluster information in JSON format.

6. Installing Logstash

Logstash ingests and processes logs before sending them to Elasticsearch.

6.1 Download and Install

Install Logstash from the Elastic repository:

sudo apt install logstash

6.2 Basic Logstash Configuration

Logstash uses configuration files in /etc/logstash/conf.d/. A basic pipeline:

input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    user => "logstash_internal"
    password => "your_password"
    ssl => true
    cacert => "/etc/elasticsearch/certs/http_ca.crt"
  }
}
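Before (re)starting the service, it is worth validating the pipeline syntax with Logstash's built-in config test:

```shell
# Parses all files under /etc/logstash/conf.d and exits; no events are processed.
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t
```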

6.3 Setting Up Logstash Pipelines

Organize pipelines for different log sources (e.g., syslog, application logs). Use filters for parsing and enrichment:

filter {
  grok {
    match => { "message" => "%{SYSLOGBASE} %{GREEDYDATA:msg}" }
  }
  date {
    match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
See Logstash pipeline documentation for advanced options.
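If you split sources into separate configuration files, pipelines.yml keeps them isolated from each other. A minimal sketch (pipeline ids and file names are illustrative):

```yaml
# /etc/logstash/pipelines.yml — one entry per independent pipeline
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"
- pipeline.id: apps
  path.config: "/etc/logstash/conf.d/apps.conf"
```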

6.4 Testing Logstash

Start Logstash and check logs for errors:

sudo systemctl enable --now logstash
sudo journalctl -u logstash -f
Test with sample data. Note that the Beats input on port 5044 speaks the binary Lumberjack protocol, so raw text piped in with nc will be rejected; verify end-to-end with Filebeat (section 8), or add a temporary tcp input (e.g. tcp { port => 5000 }) and send:
echo "Test log message" | nc localhost 5000

7. Installing Kibana

Kibana provides the web interface for visualizing and analyzing logs.

7.1 Download and Install

Install Kibana:

sudo apt install kibana

7.2 Configuring Kibana

Edit /etc/kibana/kibana.yml:

server.host: "127.0.0.1"
elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.username: "kibana_system"
elasticsearch.password: "your_password"
server.ssl.enabled: true
server.ssl.certificate: /etc/kibana/certs/kibana.crt
server.ssl.key: /etc/kibana/certs/kibana.key
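One way to obtain the certificate files referenced above is a self-signed pair generated with openssl (a sketch for testing only; in production, use certificates issued by your organization's CA, and replace the CN placeholder with your hostname):

```shell
# Self-signed certificate for Kibana's web UI; paths match kibana.yml above.
sudo mkdir -p /etc/kibana/certs
sudo openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
  -keyout /etc/kibana/certs/kibana.key \
  -out /etc/kibana/certs/kibana.crt \
  -subj "/CN=your-kibana-server"
sudo chown -R kibana:kibana /etc/kibana/certs
```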

7.3 Securing Kibana Access

Restrict access to trusted IPs or use a reverse proxy (e.g., NGINX) with authentication. Enable HTTPS to encrypt web traffic. See Kibana security settings.
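A minimal NGINX reverse-proxy sketch with basic authentication might look like this (server name, certificate paths, and the htpasswd file are all placeholders for your own values):

```nginx
# Illustrative only: front Kibana with NGINX, TLS, and basic auth.
server {
    listen 443 ssl;
    server_name kibana.example.com;
    ssl_certificate     /etc/nginx/certs/kibana-proxy.crt;
    ssl_certificate_key /etc/nginx/certs/kibana-proxy.key;
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;
    location / {
        proxy_pass https://127.0.0.1:5601;
        proxy_set_header Host $host;
    }
}
```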

7.4 Starting and Testing Kibana

Start Kibana:

sudo systemctl enable --now kibana
Access the web UI at https://localhost:5601. Log in with your credentials. If you see the dashboard, Kibana is running correctly.

8. Forwarding Logs to Logstash

Use Filebeat to forward logs from endpoints to Logstash.

8.1 Installing Filebeat

Install Filebeat on your log sources:

sudo apt install filebeat

8.2 Configuring Filebeat

Edit /etc/filebeat/filebeat.yml:

output.logstash:
  hosts: ["elk-server-ip:5044"]
  # The CA listed here must be the one that signed the certificate presented by
  # Logstash's Beats input (copy it onto the endpoint); omit the ssl setting if
  # that input is plaintext.
  ssl.certificate_authorities: ["/etc/elasticsearch/certs/http_ca.crt"]

filebeat.inputs:
- type: filestream
  id: system-logs
  enabled: true
  paths:
    - /var/log/syslog
    - /var/log/auth.log
(The older log input type is deprecated in Filebeat 8.x; filestream replaces it and requires a unique id.)
Enable and start Filebeat:
sudo systemctl enable --now filebeat
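Filebeat ships with built-in self-tests that catch most misconfigurations before any data is sent:

```shell
# Verify the YAML parses and the Logstash output is reachable over the network.
sudo filebeat test config
sudo filebeat test output
```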

8.3 Testing Log Forwarding

Check Filebeat logs:

sudo journalctl -u filebeat -f
In Kibana, verify that new logs appear in the Discover section.

9. Visualizing and Analyzing Logs in Kibana

Once logs are flowing into Elasticsearch, use Kibana for visualization and analysis.

9.1 Accessing Kibana Dashboards

Navigate to https://your-kibana-server:5601 and log in. Use the Discover tab to search and filter logs. Pre-built dashboards are available for common log types (e.g., system, nginx, auditd).

9.2 Creating Custom Visualizations

Go to Visualize in Kibana to create:

  • Bar, line, and pie charts
  • Data tables
  • Geo maps
Combine visualizations into dashboards for real-time monitoring. For guidance, see Kibana visualization documentation.

9.3 Setting Up Alerts and Reporting

Kibana’s Alerting feature lets you trigger notifications (email, Slack, webhooks) based on log patterns or thresholds. Set up scheduled reports for compliance and management. For more, see Kibana alerting guide.

10. Security Best Practices for ELK Stack

Securing your ELK Stack is vital for protecting sensitive log data and maintaining compliance.

10.1 User Authentication and Authorization

Enable role-based access control (RBAC) in Elasticsearch and Kibana. Assign users only the permissions they need. Integrate with LDAP/Active Directory for enterprise environments. See Elastic security documentation.

10.2 Encrypting Data in Transit

Use TLS to encrypt all communications:

  • Between Elasticsearch nodes
  • Between Logstash and Elasticsearch
  • Between Filebeat and Logstash
  • Between users and Kibana
Refer to TLS configuration for details.
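Elastic's bundled certutil tool can generate a CA and node certificates for these connections (output paths here are illustrative, and the commands prompt for passwords):

```shell
# Create a certificate authority, then node certificates signed by it.
sudo /usr/share/elasticsearch/bin/elasticsearch-certutil ca --out /etc/elasticsearch/certs/elastic-ca.p12
sudo /usr/share/elasticsearch/bin/elasticsearch-certutil cert \
  --ca /etc/elasticsearch/certs/elastic-ca.p12 \
  --out /etc/elasticsearch/certs/elastic-nodes.p12
```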

10.3 Monitoring and Auditing the Stack

Enable audit logging in Elasticsearch and Kibana to track access and configuration changes. Monitor system health and resource usage. For advanced monitoring, use Elastic Observability or third-party SIEM solutions. You may also consider comparing with Network Monitoring Tools 2025: Top 10 Compared for additional insights.

11. Troubleshooting Common Issues

Even with careful setup, issues can arise. Here’s how to resolve common problems.

11.1 Installation Errors

Check system logs (/var/log/elasticsearch/, /var/log/logstash/, /var/log/kibana/). Ensure all dependencies are installed and compatible. For package conflicts, clear the cache and retry.

11.2 Connection Problems

Verify network connectivity and firewall rules. Confirm TLS certificates are valid and trusted. Use curl or telnet to test service ports.
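A few concrete checks (paths assume the default 8.x layout) usually narrow the problem down quickly:

```shell
# Is anything listening on the expected ports?
sudo ss -tlnp | grep -E '9200|5044|5601'
# Does the HTTPS endpoint answer, and does its certificate verify against the CA?
curl -v --cacert /etc/elasticsearch/certs/http_ca.crt https://localhost:9200
# Inspect the certificate's subject and validity dates directly:
openssl s_client -connect localhost:9200 </dev/null 2>/dev/null | openssl x509 -noout -subject -dates
```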

11.3 Performance Tuning Tips

For optimal performance:

  • Allocate sufficient heap memory to Elasticsearch and Logstash
  • Use SSDs for Elasticsearch data storage
  • Scale horizontally with additional nodes for large environments
  • Regularly review and optimize index settings and mappings
See Elastic’s tuning guide.
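One concrete lever for write-heavy log indices is a slower refresh interval, which trades search freshness for indexing throughput. A hedged example via the index settings API (the index pattern and value are illustrative):

```shell
# Relax refresh on log indices; 1s is the default, 30s suits heavy ingest.
curl -u elastic --cacert /etc/elasticsearch/certs/http_ca.crt \
  -X PUT 'https://localhost:9200/logs-*/_settings' \
  -H 'Content-Type: application/json' \
  -d '{"index":{"refresh_interval":"30s"}}'
```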

12. Conclusion

The ELK Stack is a robust, scalable solution for centralized log management and security analytics. By following this tutorial, you’ve learned how to install, configure, secure, and leverage Elasticsearch, Logstash, and Kibana to gain actionable insights from your logs. Remember to apply security best practices and monitor your stack regularly to stay ahead of evolving cyber threats.

13. Further Resources and References

For advanced threat detection and incident response, consider integrating the ELK Stack with threat intelligence feeds and SIEM platforms. Stay updated with the latest security advisories from CISA and BleepingComputer.

Posted by Ethan Carter
Ethan Carter is a seasoned cybersecurity and SEO expert with more than 15 years in the field. He loves tackling tough digital problems and turning them into practical solutions. Outside of protecting online systems and improving search visibility, Ethan writes blog posts that break down tech topics to help readers feel more confident.