Elastic Log Analysis Examples

Analyzing Linux System Logs with ELK

  • Create a Logstash configuration file: vim /etc/logstash/conf.d/syslog.conf
input {
    file {
        path => ["/var/log/secure"]
        type => "syslog"
    }
}

filter {
    if [type] == "syslog" {
        grok {
            match => { "message" => "%{SYSLOGPAMSESSION}" }
        }
        mutate {
            remove_field => ["message"]
        }
    }
}

output {
    if [type] == "syslog" and "_grokparsefailure" not in [tags] {
        elasticsearch {
            hosts => ["10.10.10.15:9200"]
            index => "logstash-syslog-%{+YYYY.MM}"
        }
    }
}
  • Run Logstash
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d --config.reload.automatic &

# -f: load a configuration file or directory
# --config.reload.automatic: automatically reload configuration files when they change
# -t: test the configuration and exit
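The SYSLOGPAMSESSION grok pattern used above pulls the PAM module, session state, and username out of lines in /var/log/secure. As a rough illustration only (the real grok pattern also captures the timestamp, host, and PID), a simplified stand-in regex might look like:

```python
import re

# Simplified stand-in for the SYSLOGPAMSESSION grok pattern;
# the actual pattern in logstash-patterns-core captures more fields.
PAM_SESSION = re.compile(
    r"(?P<pam_module>\w+)\((?P<pam_caller>[^)]*)\): "
    r"session (?P<pam_session_state>\w+) for user (?P<username>\S+)"
)

line = ("Jun  1 10:00:00 host sshd[1234]: "
        "pam_unix(sshd:session): session opened for user root by (uid=0)")

m = PAM_SESSION.search(line)
print(m.group("pam_module"), m.group("pam_session_state"), m.group("username"))
# pam_unix opened root
```

Lines that do not match this shape are tagged _grokparsefailure by Logstash, which is why the output block above filters those events out before indexing.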

Analyzing Nginx Logs with ELK + Filebeat

The Nginx access log is written in JSON format, so Filebeat must be configured to parse it.
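For this pipeline to work, Nginx must already be emitting JSON. A log_format along these lines is one way to do that; the field names are illustrative, except that the remote_addr key is assumed by the geoip filter in the Logstash configuration below:

```nginx
# Hypothetical JSON log_format (requires nginx >= 1.11.8 for escape=json);
# adjust fields to taste, but keep remote_addr for the geoip filter.
log_format json_log escape=json '{"remote_addr":"$remote_addr",'
    '"time_local":"$time_local",'
    '"request":"$request",'
    '"status":"$status",'
    '"body_bytes_sent":"$body_bytes_sent",'
    '"http_user_agent":"$http_user_agent"}';

access_log /usr/local/nginx/logs/access.log json_log;
```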

  • Configure Filebeat: vim /etc/filebeat/filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/local/nginx/logs/access.log
  json.keys_under_root: true
  json.overwrite_keys: true
# Comment out the kibana and elasticsearch output sections
output.logstash:
  hosts: ["10.10.10.15:5044"]
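The two json.* options tell Filebeat to decode each log line as JSON and promote the decoded keys to the top level of the event, overwriting any conflicting keys already present. The effect is roughly:

```python
import json

# Rough sketch of what json.keys_under_root + json.overwrite_keys do
# to a single Filebeat event (field names here are illustrative).
def decode_event(raw_line, event):
    decoded = json.loads(raw_line)
    event.update(decoded)  # keys land at the top level, overwriting duplicates
    return event

raw = '{"remote_addr": "203.0.113.9", "status": "200"}'
event = {"source": "/usr/local/nginx/logs/access.log", "status": "init"}
print(decode_event(raw, event))
```

Without keys_under_root, the decoded fields would instead be nested under a json object inside the event.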
  • Configure Logstash
input {
    beats {
        port => 5044
        type => "nginx"
    }
}

filter {
    if [type] == "nginx" {
        mutate {
            remove_field => ["message"]
        }
    }
    geoip {
        source => "remote_addr"
    }
}

output {
    if [type] == "nginx" {
        elasticsearch {
            hosts => ["10.10.10.15:9200"]
            index => "logstash-nginx-%{+YYYY.MM}"
        }
    }
}
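The %{+YYYY.MM} suffix in both elasticsearch outputs is Logstash date math: the event's @timestamp is formatted (Joda-style, YYYY.MM) into the index name, yielding one index per month. In Python terms, roughly:

```python
from datetime import datetime, timezone

# Sketch of how Logstash expands "logstash-nginx-%{+YYYY.MM}":
# the event's @timestamp (UTC by default) is formatted into the index name.
def index_name(ts):
    return "logstash-nginx-" + ts.strftime("%Y.%m")

print(index_name(datetime(2024, 3, 5, tzinfo=timezone.utc)))
# logstash-nginx-2024.03
```

Monthly (or daily, with YYYY.MM.dd) indices make retention simple: old data is dropped by deleting whole indices rather than individual documents.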