
ElasticSearch 6.x Study Notes: 17. Term Queries





1. Download

https://www.elastic.co/downloads/past-releases/logstash-6-1-1

[es@node1 ~]$ tar -zxvf logstash-6.1.1.tar.gz
[es@node1 ~]$ cd logstash-6.1.1
[es@node1 logstash-6.1.1]$ ls
bin     CONTRIBUTORS  Gemfile       lib      logstash-core             modules     tools
config  data          Gemfile.lock  LICENSE  logstash-core-plugin-api  NOTICE.TXT  vendor
[es@node1 logstash-6.1.1]$
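To confirm the extraction, the binary itself can report its version; it should print something like logstash 6.1.1:

[es@node1 logstash-6.1.1]$ bin/logstash --version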

2. Quick Start Example

https://www.elastic.co/guide/en/logstash/6.x/index.html

[es@node1 logstash-6.1.1]$ bin/logstash -e 'input { stdin { } } output { stdout {} }'
Sending Logstash's logs to /home/es/logstash-6.1.1/logs which is now configured via log4j2.properties
[2018-03-31T00:11:53,157][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/es/logstash-6.1.1/modules/fb_apache/configuration"}
[2018-03-31T00:11:53,198][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/es/logstash-6.1.1/modules/netflow/configuration"}
[2018-03-31T00:11:53,383][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/home/es/logstash-6.1.1/data/queue"}
[2018-03-31T00:11:53,388][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/home/es/logstash-6.1.1/data/dead_letter_queue"}
[2018-03-31T00:11:54,199][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-31T00:11:54,305][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"6614aafd-2e89-47a2-bce4-c2e04e460644", :path=>"/home/es/logstash-6.1.1/data/uuid"}
[2018-03-31T00:11:56,129][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.1.1"}
[2018-03-31T00:11:57,182][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-31T00:11:59,874][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x24296acb run>"}
[2018-03-31T00:12:00,045][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
The stdin plugin is now waiting for input:
[2018-03-31T00:12:00,267][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
hello
2018-03-31T04:12:31.435Z node1 hello
   -e          run the configuration given directly on the command line
   input       input section (here, reading from standard input)
   { stdin }   the stdin input plugin
   output      output section (here, writing to standard output)
   { stdout }  the stdout output plugin
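The default stdout codec prints a single plain line per event, as shown above. To get the structured output used later in this article straight from the command line, the same one-liner can request the rubydebug codec (a minimal variation of the example above):

[es@node1 logstash-6.1.1]$ bin/logstash -e 'input { stdin { } } output { stdout { codec => rubydebug } }'

Each line typed in should then come back as a hash with message, @version, host and @timestamp fields.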

3. Configuration Files

[es@node1 logstash-6.1.1]$ cd config/
[es@node1 config]$ ls
jvm.options  log4j2.properties  logstash.yml  pipelines.yml  startup.options
[es@node1 config]$ vi logstash.yml

Append the following at the end of the file:

path.data: /var/data/logstash
path.config: /home/es/logstash-6.1.1/config/*.conf
path.logs: /var/log/logstash

Create the directories:

[es@node1 config]$ sudo mkdir /var/data/logstash
[sudo] password for es: 
[es@node1 config]$ sudo chown -R es:es /var/data/logstash
[es@node1 config]$ sudo mkdir /var/log/logstash
[es@node1 config]$ sudo chown -R es:es /var/log/logstash
[es@node1 config]$
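A quick, optional sanity check that both directories now belong to the es user:

[es@node1 config]$ ls -ld /var/data/logstash /var/log/logstash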

4. A Complete Example

(1) Start Elasticsearch

[es@node1 elasticsearch-6.1.1]$ bin/elasticsearch -d
[es@node1 elasticsearch-6.1.1]$ jps
2371 Elasticsearch
2376 Jps
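Before pointing Logstash at it, it is worth confirming the cluster answers on the address used in the pipeline below (node1:9200); a quick check with curl:

[es@node1 elasticsearch-6.1.1]$ curl -XGET 'http://node1:9200/_cluster/health?pretty'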

(2) Edit the pipeline configuration file

[es@node1 config]$ vi test.conf
[es@node1 config]$ cat test.conf 
input {
    # read events from standard input
    stdin {}
}
filter {
    # no filters; events pass through unchanged
}
output {
    # print each event to the console in a structured, human-readable form
    stdout {
      codec => rubydebug
    }
    # and index it into Elasticsearch, one index per day
    elasticsearch {
      hosts => "node1:9200"
      index => "log-%{+YYYY.MM.dd}"
    }
}
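Before starting the pipeline, Logstash can validate the file without running it via --config.test_and_exit (short form -t); if the file parses cleanly it reports that the configuration is OK:

[es@node1 logstash-6.1.1]$ bin/logstash -f config/test.conf --config.test_and_exit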

(3) Run

[es@node1 logstash-6.1.1]$ bin/logstash -f config/test.conf
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
[2018-03-31T00:27:34,512][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/es/logstash-6.1.1/modules/fb_apache/configuration"}
[2018-03-31T00:27:34,553][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/es/logstash-6.1.1/modules/netflow/configuration"}
[2018-03-31T00:27:34,719][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/var/data/logstash/queue"}
[2018-03-31T00:27:34,744][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/var/data/logstash/dead_letter_queue"}
[2018-03-31T00:27:35,548][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-31T00:27:35,759][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"c51f4a1a-61f2-4b08-8a9c-c716199a1b7d", :path=>"/var/data/logstash/uuid"}
[2018-03-31T00:27:36,791][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.1.1"}
[2018-03-31T00:27:38,147][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-31T00:27:44,300][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://node1:9200/]}}
[2018-03-31T00:27:44,335][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://node1:9200/, :path=>"/"}
[2018-03-31T00:27:47,267][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://node1:9200/"}
[2018-03-31T00:27:47,610][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-03-31T00:27:47,615][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-03-31T00:27:47,652][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-31T00:27:47,826][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-31T00:27:48,044][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2018-03-31T00:27:48,304][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//node1:9200"]}
[2018-03-31T00:27:48,364][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0xb1a4d7c run>"}
[2018-03-31T00:27:48,624][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
The stdin plugin is now waiting for input:
[2018-03-31T00:27:48,908][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}

(4) Test input

hello
{
       "message" => "hello",
      "@version" => "1",
          "host" => "node1",
    "@timestamp" => 2018-03-31T04:28:08.951Z
}
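Because the output sets index => "log-%{+YYYY.MM.dd}", the event is written to a daily index named from its @timestamp, here log-2018.03.31. From a second shell, the matching indices can be listed with the cat API:

[es@node1 ~]$ curl -XGET 'http://node1:9200/_cat/indices/log-*?v'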

(5) Start Kibana

[es@node1 kibana-6.1.1-linux-x86_64]$ bin/kibana &
[1] 2497
[es@node1 kibana-6.1.1-linux-x86_64]$   log   [04:30:30.037] [info][status][plugin:kibana@6.1.1] Status changed from uninitialized to green - Ready
  log   [04:30:30.177] [info][status][plugin:elasticsearch@6.1.1] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log   [04:30:30.220] [info][status][plugin:console@6.1.1] Status changed from uninitialized to green - Ready
  log   [04:30:30.281] [info][status][plugin:metrics@6.1.1] Status changed from uninitialized to green - Ready
  log   [04:30:31.231] [info][status][plugin:timelion@6.1.1] Status changed from uninitialized to green - Ready
  log   [04:30:31.265] [info][listening] Server running at http://node1:5601
  log   [04:30:31.284] [info][status][plugin:elasticsearch@6.1.1] Status changed from yellow to green - Ready
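Once the log shows the server listening on http://node1:5601, open that address in a browser; alternatively, Kibana's status endpoint can be queried from the shell:

[es@node1 ~]$ curl http://node1:5601/api/status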

(6) Query from Kibana

GET /log-2018.03.31/_search

{
  "took": 183,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 1,
    "hits": [
      {
        "_index": "log-2018.03.31",
        "_type": "doc",
        "_id": "AQ1QemIBrpzkDC_bEMun",
        "_score": 1,
        "_source": {
          "message": "hello",
          "@version": "1",
          "host": "node1",
          "@timestamp": "2018-03-31T04:28:08.951Z"
        }
      }
    ]
  }
}
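Beyond fetching everything, an exact-value lookup can be written as a term query against the keyword sub-field that Elasticsearch's default dynamic mapping adds to string fields; a minimal sketch using the document indexed above:

GET /log-2018.03.31/_search
{
  "query": {
    "term": {
      "message.keyword": "hello"
    }
  }
}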


