Elasticsearch and Logstash: transform a message

I'm trying to use Elasticsearch with Logstash and Kibana. It works fine, but I want to parse a specific field to extract additional fields.
After some research, I found that it was supposedly necessary to install specific plugins in Elasticsearch, BUT I can't find how to install plugins with the Elasticsearch installed from the catalog.


OK, sorry, I had misunderstood this part. After more research, I found that parsing a field to create new fields is done in the filter section of the Logstash configuration, using grok.

I tried this in my config:

grok {
  match => {
    "message" => "\[32m%{LOGLEVEL:loglevel}\[39m: memory: %{NOTSPACE:memory}, uptime \(seconds\): %{NUMBER:uptime}, load: %{NUMBER:load1},%{NUMBER:load5},%{NUMBER:load15}"
  }
}
mutate {
  rename => {
    "docker.id"       => "container_id"
    "docker.name"     => "container_name"
    "docker.image"    => "docker_image"
    "docker.hostname" => "docker_hostname"
  }
}

To transform this type of message:

e[32minfoe[39m: memory: 76Mb, uptime (seconds): 5529.927, load: 0.05322265625,0.1298828125,0.19384765625

Into these fields:

load15 0.19384765625
uptime 5529.927
load1 0.05322265625
load5 0.1298828125
memory 76Mb
loglevel info
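Since grok patterns compile down to regular expressions, the extraction can be sanity-checked outside Logstash with a plain regex. A minimal Python sketch, approximating `%{LOGLEVEL}`, `%{NOTSPACE}` and `%{NUMBER}` (the short alternation for the log level is a simplified stand-in for the full LOGLEVEL pattern):

```python
import re

# Sample log line as rendered above; the "e" before "[32m" and "[39m"
# is likely a raw ANSI escape byte (\x1b) in the real message -- an
# assumption worth verifying against the actual input.
line = ("e[32minfoe[39m: memory: 76Mb, uptime (seconds): 5529.927, "
        "load: 0.05322265625,0.1298828125,0.19384765625")

# Rough regex equivalents of the grok patterns used in the filter:
#   %{LOGLEVEL} -> small alternation, %{NOTSPACE} -> \S+, %{NUMBER} -> [\d.]+
pattern = re.compile(
    r"\[32m(?P<loglevel>info|warn|error|debug).*?"
    r"memory: (?P<memory>\S+), "
    r"uptime \(seconds\): (?P<uptime>[\d.]+), "
    r"load: (?P<load1>[\d.]+),(?P<load5>[\d.]+),(?P<load15>[\d.]+)"
)

match = pattern.search(line)
for name, value in match.groupdict().items():
    print(name, value)
```

If the regex extracts the expected values here but grok still produces nothing, the problem is likely in how the raw message reaches Logstash rather than in the pattern itself.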

I tested the pattern at http://grokconstructor.appspot.com/do/match and my matches work fine. But in Kibana I can't retrieve these fields.
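One thing worth checking: when grok fails to match at runtime, Logstash adds a `_grokparsefailure` tag to the event instead of the new fields, even if the same pattern matches in the constructor (for instance, if the raw message contains an ANSI escape byte before `[32m` that the pasted sample does not). Printing events to the console makes this visible; a minimal debug output section for the pipeline could look like:

```
output {
  stdout { codec => rubydebug }
}
```

If the printed events carry `_grokparsefailure` in their tags, the pattern needs to be adjusted to the raw message as Logstash actually receives it.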