Logstash: configuration error, but the configuration is OK

#logstash

Question:

I checked my Logstash configuration:

 root@learn-elk:/etc/logstash/conf.d# /opt/logstash/bin/logstash -t /etc/logstash/conf.d/
Configuration OK
  

but the error is still thrown and the pipeline aborts right afterwards:

 ==> /var/log/logstash/logstash.log <==
{:timestamp=>"2016-10-22T17:48:28.391000 0000", :message=>"Pipeline aborted due to error", :exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
{:timestamp=>"2016-10-22T17:48:31.424000 0000", :message=>"stopping pipeline", :id=>"main"}
  
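
Since the configtest passed here while the pipeline still failed at startup, the error evidently comes from plugin setup rather than from config syntax, so the useful next step is a foreground run with extra logging to see which plugin is rejected. A minimal sketch, assuming the same install path and config directory as above (the `--verbose`/`--debug` flags are the ones mentioned in the comments below):

 # foreground run with verbose/debug logging; -f points at the same conf.d directory
/opt/logstash/bin/logstash -f /etc/logstash/conf.d/ --verbose --debug
  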

After running logstash with '-v --debug --verbose' I got a lot more information:

 starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 1
Registering file input {:path=>["/opt/logstash/GOOG.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_0a3b7d0b4841f166ec450717c6ce4124", :path=>["/opt/logstash/GOOG.csv"], :level=>:info}
Pipeline aborted due to error {:exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
Closing inputs {:level=>:info}
Closed inputs {:level=>:info}
  

After fixing the problem by replacing { host => "localhost" } with { hosts => ["localhost"] } in the output, I combined the configuration into the single file below and used stdout output instead of elasticsearch (a sketch of the matching elasticsearch output is shown right after the configuration):

 input {
  file {
    path => "/opt/logstash/GOOG.csv"
    start_position => "beginning"
    type => "google"
  }
}
filter {
  if [type] == "google" {
    csv {
      columns => ["date_of_record","open","high","low","close","volume","adj_close"]
      separator => ","
    }
    date {
      match => ["date_of_record","yyyy-MM-dd"]
    }
    mutate {
      convert => ["open","float"]
      convert => ["high","float"]
      convert => ["low","float"]
      convert => ["close","float"]
      convert => ["volume","integer"]
      convert => ["adj_close","float"]
    }
  }
}
output {
  stdout {
  }
}
  
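
For completeness, once elasticsearch is swapped back in for stdout, a minimal sketch of an output section that the Logstash 2.x elasticsearch plugin accepts; the port 9200 and the index name are illustrative assumptions, not values from the original config:

 output {
  elasticsearch {
    # Logstash 2.x expects the plural "hosts" option with an array value;
    # the older singular "host" option is what triggered the
    # LogStash::ConfigurationError during output registration above
    hosts => ["localhost:9200"]
    # hypothetical index name, for illustration only
    index => "google-stock-%{+YYYY.MM.dd}"
  }
  # keep stdout alongside while debugging to see the parsed events
  stdout { }
}
  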

Comments:

1. There is not much detail in that stack trace. Try starting with --verbose or --debug?

2. question updated with the debug output, thanks

3. After changing the output from elasticsearch { host => "localhost" } to elasticsearch { hosts => ["localhost"] }, logstash starts with no errors ;) But there is still no index created in elasticsearch, so I have to review the input and filter (one possible cause is sketched after the comments).. in fact this is taken from the book safaribooksonline.com/library/view/learning-elk-stack/…

4. Could you show us your conf file?
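
Regarding comment 3 (logstash starts, but no index appears in elasticsearch): the debug output above shows that the file input generated a sincedb for /opt/logstash/GOOG.csv, and the file input only tails files it has already seen, so a CSV that was fully read in an earlier run is not re-read on restart and no events ever reach the output. A minimal sketch of the input for repeated test runs; pointing sincedb_path at /dev/null is a testing assumption, not part of the original config:

 input {
  file {
    path => "/opt/logstash/GOOG.csv"
    start_position => "beginning"
    type => "google"
    # test runs only: do not persist the read position, so the whole
    # CSV is read again from the beginning on every Logstash restart
    sincedb_path => "/dev/null"
  }
}
  

Whether an index was actually created can then be checked with elasticsearch's _cat/indices API.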