
Logs not arriving

 



Good afternoon.

I'm using ELK for postfix.

To test the connection between logstash and filebeat, I have this in logstash:

input {
  beats {
    port => 5044
  }
}


output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "postfix1-%{+YYYY.MM.dd}"
  }

  file {
    path  => "/some/debug"
    codec => rubydebug
  }
}
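
A simpler way to check that events reach Logstash at all is a stdout output with the rubydebug codec, since it avoids any file path or permission issues; a minimal sketch of such a debug output (the rest of the pipeline unchanged):

output {
  stdout {
    codec => rubydebug
  }
}

Events then appear on Logstash's standard output (or in its service log) as soon as Filebeat delivers them.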

In filebeat.yml I have:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/maillog*

output.logstash:
  hosts: ["192.168.199.146:5044"]
xpack.monitoring:
  enabled: true
  elasticsearch:
    hosts: ["http://192.168.199.146:9200"]

But the file /some/debug is not created, and there is no index in Kibana.

The only warning is in the filebeat log:

WARN    beater/filebeat.go:152  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.

What can I do to make the index appear?


Good afternoon. Could you tell me which version of ELK you are using? Also, judging by the config, your logs should be going to both Logstash and Elasticsearch at the same time. Do you have a filebeat index in Elasticsearch? As for Logstash, by default it can only create the output file if it does not exist; it cannot create directories.
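
Following that advice, a minimal sketch of the file output pointed at a file under a directory that already exists (the exact path here is only an illustration):

output {
  file {
    path  => "/tmp/beats-debug.log"   # parent directory must already exist
    codec => rubydebug
  }
}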

Fess02
()
In reply to: comment by Fess02

Thanks.

ELK 7.4

I removed file { path => "/some/debug" codec => rubydebug } and restarted logstash and filebeat.

But the logs still aren't arriving. There is no filebeat index in Elasticsearch.

beren
() topic author
In reply to: comment by beren

Could you send the complete startup logs for both logstash and filebeat? Also, check the permissions, and there is a mistake in your filebeat config: only the Elasticsearch host is specified, but the output itself is not configured:

output.elasticsearch:
  hosts: ["localhost:9200"]
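
To get more detail into the Filebeat startup log while troubleshooting, its log level can also be raised; a minimal sketch using Filebeat's standard logging option:

logging.level: debug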

Fess02
()
In reply to: comment by Fess02

With the filebeat.yml config from the first post, /var/log/filebeat/filebeat contains:

2019-10-11T03:30:30.292-0400    INFO    instance/beat.go:607    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2019-10-11T03:30:30.292-0400    INFO    instance/beat.go:615    Beat ID: 1a3f0e19-0b95-4d3e-881a-d37400468513
2019-10-11T03:30:30.292-0400    INFO    [beat]  instance/beat.go:903    Beat info       {"system_info": {"beat": {"path": {"config": "/etc/filebeat", "data": "/var/lib/filebeat", "home": "/usr/share/filebeat", "logs": "/var/log/filebeat"}, "type": "filebeat", "uuid": "1a3f0e19-0b95-4d3e-881a-d37400468513"}}}
2019-10-11T03:30:30.292-0400    INFO    [beat]  instance/beat.go:912    Build info      {"system_info": {"build": {"commit": "f940c36884d3749901a9c99bea5463a6030cdd9c", "libbeat": "7.4.0", "time": "2019-09-27T07:45:44.000Z", "version": "7.4.0"}}}
2019-10-11T03:30:30.292-0400    INFO    [beat]  instance/beat.go:915    Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":1,"version":"go1.12.9"}}}
2019-10-11T03:30:30.293-0400    INFO    [beat]  instance/beat.go:919    Host info       {"system_info": {"host": {"architecture":"x86_64","boot_time":"2019-10-10T06:15:26-04:00","containerized":false,"name":"mail.test.ru","ip":["127.0.0.1/8","::1/128","192.168.199.145/24","fe80::277c:61b3:ac2:bc8c/64"],"kernel_version":"3.10.0-957.el7.x86_64","mac":["00:0c:29:cb:1a:3d"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":6,"patch":1810,"codename":"Core"},"timezone":"EDT","timezone_offset_sec":-14400,"id":"b27e5adbf8a1485faffe0eeec83f47f2"}}}
2019-10-11T03:30:30.294-0400    INFO    [beat]  instance/beat.go:948    Process info    {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"ambient":null}, "cwd": "/var/log", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 11533, "ppid": 10829, "seccomp": {"mode":"disabled"}, "start_time": "2019-10-11T03:30:29.400-0400"}}}
2019-10-11T03:30:30.294-0400    INFO    instance/beat.go:292    Setup Beat: filebeat; Version: 7.4.0
2019-10-11T03:30:30.296-0400    INFO    [publisher]     pipeline/module.go:97   Beat name: mail.test.ru
2019-10-11T03:30:30.296-0400    WARN    beater/filebeat.go:152  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.

Which permissions should I check?

beren
() topic author
In reply to: comment by Fess02

Logstash logs:

Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-10-11T02:45:47,708][INFO ][org.reflections.Reflections] Reflections took 123 ms to scan 1 urls, producing 20 keys and 40 values
[2019-10-11T02:45:49,170][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-10-11T02:45:49,186][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x45782123 run>"}
[2019-10-11T02:45:50,413][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-10-11T02:45:50,476][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-10-11T02:45:50,639][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-10-11T02:45:51,087][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2019-10-11T02:45:51,611][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-10-11T02:56:31,168][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
[2019-10-11T02:56:36,435][WARN ][org.logstash.execution.ShutdownWatcherExt] {"inflight_count"=>0, "stalling_threads_info"=>{"other"=>[{"thread_id"=>19, "name"=>"[main]<beats", "current_call"=>"[...]/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.1-java/lib/logstash/inputs/beats.rb:204:in `run'"}, {"thread_id"=>18, "name"=>"[main]>worker0", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:243:in `block in start_workers'"}]}}
[2019-10-11T02:56:36,437][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
[2019-10-11T02:56:37,807][INFO ][logstash.javapipeline    ] Pipeline terminated {"pipeline.id"=>"main"}
[2019-10-11T02:56:37,918][INFO ][logstash.runner          ] Logstash shut down.
[2019-10-11T02:57:10,262][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-10-11T02:57:13,217][INFO ][org.reflections.Reflections] Reflections took 88 ms to scan 1 urls, producing 20 keys and 40 values
[2019-10-11T02:57:14,867][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-10-11T02:57:15,298][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-10-11T02:57:15,383][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-10-11T02:57:15,387][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-10-11T02:57:15,478][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-10-11T02:57:15,616][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-10-11T02:57:15,635][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x3de4ad4e run>"}
[2019-10-11T02:57:15,704][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-10-11T02:57:16,142][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-10-11T02:57:16,973][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-10-11T02:57:17,130][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-10-11T02:57:17,229][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-10-11T02:57:17,414][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2019-10-11T02:57:18,119][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

If I add output.elasticsearch: hosts: ["localhost:9200"] to filebeat.yml, filebeat does not start.
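
That failure is expected: Filebeat allows only one output to be enabled at a time, so with output.logstash already configured, enabling output.elasticsearch as well makes it refuse to start. A sketch of the two mutually exclusive variants, with only one left uncommented at a time:

# either ship events to Logstash ...
output.logstash:
  hosts: ["192.168.199.146:5044"]

# ... or send them straight to Elasticsearch, but not both at once
#output.elasticsearch:
#  hosts: ["localhost:9200"]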

beren
() topic author
In reply to: comment by beren

Solution

Thanks, everyone. Solved. It turned out that no new log entries were arriving.

beren
() topic author