I've had a brief play with a new ( to me ) Docker image, ELK: -
Collect, search and visualise log data with Elasticsearch, Logstash, and Kibana.
using this documentation: -
This time around, I built it using Docker Compose ( on my Mac ) : -
Create a Docker Compose YAML
vi docker-compose.yml
elk:
  image: sebp/elk
  ports:
    - "5601:5601"
    - "9200:9200"
    - "5044:5044"
Spin up the Container
docker-compose up elk
…
Creating elk_elk_1 ...
Creating elk_elk_1 ... done
Attaching to elk_elk_1
elk_1 | * Starting periodic command scheduler cron
elk_1 | ...done.
elk_1 | * Starting Elasticsearch Server
elk_1 | ...done.
elk_1 | waiting for Elasticsearch to be up (1/30)
elk_1 | waiting for Elasticsearch to be up (2/30)
elk_1 | waiting for Elasticsearch to be up (3/30)
elk_1 | waiting for Elasticsearch to be up (4/30)
elk_1 | waiting for Elasticsearch to be up (5/30)
elk_1 | waiting for Elasticsearch to be up (6/30)
elk_1 | waiting for Elasticsearch to be up (7/30)
elk_1 | Waiting for Elasticsearch cluster to respond (1/30)
elk_1 | logstash started.
elk_1 | * Starting Kibana5
elk_1 | ...done.
elk_1 | ==> /var/log/elasticsearch/elasticsearch.log <==
elk_1 | [2017-10-20T09:58:07,375][INFO ][o.e.p.PluginsService ] [Q6xLn7b] no plugins loaded
elk_1 | [2017-10-20T09:58:09,062][INFO ][o.e.d.DiscoveryModule ] [Q6xLn7b] using discovery type [zen]
elk_1 | [2017-10-20T09:58:09,753][INFO ][o.e.n.Node ] initialized
elk_1 | [2017-10-20T09:58:09,753][INFO ][o.e.n.Node ] [Q6xLn7b] starting ...
elk_1 | [2017-10-20T09:58:09,960][INFO ][o.e.t.TransportService ] [Q6xLn7b] publish_address {172.17.0.2:9300}, bound_addresses {0.0.0.0:9300}
elk_1 | [2017-10-20T09:58:09,974][INFO ][o.e.b.BootstrapChecks ] [Q6xLn7b] bound or publishing to a non-loopback or non-link-local address, enforcing bootstrap checks
elk_1 | [2017-10-20T09:58:13,044][INFO ][o.e.c.s.ClusterService ] [Q6xLn7b] new_master {Q6xLn7b}{Q6xLn7bNR66inZlv5JcUaQ}{HPqd_E_QSJ2eHModlSUT6A}{172.17.0.2}{172.17.0.2:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
elk_1 | [2017-10-20T09:58:13,080][INFO ][o.e.h.n.Netty4HttpServerTransport] [Q6xLn7b] publish_address {172.17.0.2:9200}, bound_addresses {0.0.0.0:9200}
elk_1 | [2017-10-20T09:58:13,080][INFO ][o.e.n.Node ] [Q6xLn7b] started
elk_1 | [2017-10-20T09:58:13,143][INFO ][o.e.g.GatewayService ] [Q6xLn7b] recovered [0] indices into cluster_state
elk_1 |
elk_1 | ==> /var/log/logstash/logstash-plain.log <==
elk_1 |
elk_1 | ==> /var/log/kibana/kibana5.log <==
…
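As an aside, the same stack could be started detached and the logs tailed separately, rather than holding a terminal as above: -
docker-compose up -d elk
docker-compose logs -f elk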
See what's running
docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
be3d5ee65642 sebp/elk "/usr/local/bin/st..." 2 minutes ago Up 2 minutes 0.0.0.0:5044->5044/tcp, 0.0.0.0:5601->5601/tcp, 0.0.0.0:9200->9200/tcp, 9300/tcp elk_elk_1
4f54bc00b67d websphere-liberty:wlp101 "/opt/ibm/docker/d..." 8 days ago Exited (143) 45 hours ago dazzling_mestorf
Start a shell on the container
docker exec -it be3d5ee65642 /bin/bash
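The container ID comes from docker ps above; the container name ( elk_elk_1 ) should work just as well: -
docker exec -it elk_elk_1 /bin/bash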
Run a Logstash pipeline in the foreground
Note the use of single quotes ( ' ) around the pipeline definition
/opt/logstash/bin/logstash --path.data /tmp/logstash/data -e 'input { stdin { } } output { elasticsearch { hosts => ["localhost"] } }'
Sending Logstash's logs to /opt/logstash/logs which is now configured via log4j2.properties
[2017-10-20T10:24:30,729][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/opt/logstash/modules/fb_apache/configuration"}
[2017-10-20T10:24:30,741][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/opt/logstash/modules/netflow/configuration"}
[2017-10-20T10:24:31,388][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-10-20T10:24:31,390][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-10-20T10:24:31,516][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-10-20T10:24:31,607][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-10-20T10:24:31,614][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-10-20T10:24:31,620][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2017-10-20T10:24:31,623][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-10-20T10:24:31,792][INFO ][logstash.pipeline ] Pipeline main started
The stdin plugin is now waiting for input:
[2017-10-20T10:24:31,976][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
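As an aside, the same pipeline could live in a small config file and be passed with -f rather than the -e one-liner, which is handy once it grows beyond one line ( the file name below is just an example ): -
cat > /tmp/stdin-to-es.conf <<'EOF'
input { stdin { } }
output { elasticsearch { hosts => ["localhost"] } }
EOF
/opt/logstash/bin/logstash --path.data /tmp/logstash/data -f /tmp/stdin-to-es.conf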
Send a test message
The Quick Brown Fox Jumped Over The Lazy Dog!
Check that the test message made it into Elasticsearch
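Something along these lines against the Elasticsearch search API, from the host, returns the documents shown below: -
curl -s 'http://localhost:9200/_search?pretty'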
{
"took" : 2,
"timed_out" : false,
"_shards" : {
"total" : 6,
"successful" : 6,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : 8,
"max_score" : 1.0,
"hits" : [
{
"_index" : ".kibana",
"_type" : "config",
"_id" : "5.6.3",
"_score" : 1.0,
"_source" : {
"buildNum" : 15554
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85PK0Ji95TIyQOdvFj",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:03:18.652Z",
"message" : "this is a dummy entry"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85PT5gi95TIyQOdvFm",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:03:55.857Z",
"message" : "I love it !"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85UBqvi95TIyQOdvFp",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:24:31.867Z",
"message" : "Hello Fluffy"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85UWpEi95TIyQOdvFr",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:25:57.808Z",
"message" : "The Quick Brown Fox Jumped Over The Lazy Dog!"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85PKzri95TIyQOdvFi",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:03:17.729Z",
"message" : "this is a dummy entry"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85PK9Si95TIyQOdvFk",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:03:19.238Z",
"message" : "this is a dummy entry"
}
},
{
"_index" : "logstash-2017.10.20",
"_type" : "logs",
"_id" : "AV85UCuXi95TIyQOdvFq",
"_score" : 1.0,
"_source" : {
"@version" : "1",
"host" : "be3d5ee65642",
"@timestamp" : "2017-10-20T10:24:36.234Z",
"message" : "Hello Fluffy"
}
}
]
}
}
So we have Kibana running on port 5601, and Elasticsearch on port 9200.
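A quick check of each from the host ( on the Mac ) - curl the Elasticsearch endpoint, and open Kibana in a browser: -
curl http://localhost:9200/
open http://localhost:5601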
Next job is to wire my BPM Event Emitter up to this - but that's the easy part :-)
*UPDATE*
And, as expected, it just worked. I completed one of my running BPM processes, and immediately saw messages in Elasticsearch, including: -
elk_1 | [2017-10-20T13:14:49,377][INFO ][o.e.c.m.MetaDataCreateIndexService] [Q6xLn7b] [bpm-events] creating index, cause [api], templates [], shards [5]/[1], mappings []
elk_1 | [2017-10-20T13:14:50,641][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] create_mapping [ProcessEvent]
elk_1 | [2017-10-20T13:14:50,738][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] create_mapping [ActivityEvent]
elk_1 | [2017-10-20T13:14:50,828][INFO ][o.e.c.m.MetaDataCreateIndexService] [Q6xLn7b] [restore_task_index] creating index, cause [auto(bulk api)], templates [], shards [5]/[1], mappings []
elk_1 | [2017-10-20T13:14:52,022][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [restore_task_index/HMBr8hw4RAmDJrNzZCX-ag] create_mapping [configuration_type]
elk_1 | [2017-10-20T13:18:30,329][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] update_mapping [ActivityEvent]
elk_1 | [2017-10-20T13:18:38,529][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] update_mapping [ActivityEvent]
elk_1 | [2017-10-20T13:18:38,718][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] update_mapping [ProcessEvent]
elk_1 | [2017-10-20T13:18:38,810][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] update_mapping [ActivityEvent]
elk_1 | [2017-10-20T13:18:38,836][INFO ][o.e.c.m.MetaDataMappingService] [Q6xLn7b] [bpm-events/cWG124C1SOqS4UR_6QQboA] update_mapping [ActivityEvent]
which is nice.
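Out of interest, the new bpm-events index can be queried in the same way: -
curl -s 'http://localhost:9200/bpm-events/_search?pretty'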