Background:
I set up a Rancher 2.x cluster with RKE.
OS: CentOS 7
I am able to access the Rancher UI and deploy containers in my cluster.
I deployed an ELK stack on a separate server VM and I am trying to configure logging for my Rancher environment.
Elasticsearch, Logstash, Kibana versions: v7.8
https://rancher.com/docs/rancher/v2.x/en/cluster-admin/tools/logging/
https://rancher.com/docs/rancher/v2.x/en/cluster-admin/tools/logging/elasticsearch/
Requirements:
I am trying to configure logging for Rancher Projects. I want my container logs to end up in Elasticsearch.
To do this, I need one program to ship the logs and one program to parse them, or a single program that does both.
Problems:
Rancher 2.x forces fluentd to be used to ship the logs directly to Elasticsearch; it does not allow us to send them to Logstash first.
How can I parse the log messages before they are indexed into Elasticsearch?
If I send the log messages to Elasticsearch as-is, I get all of these fields, and most of them are not important for people to see:
@timestamp
_id
_index
_score
_type
my-test-field
docker.container_id
kubernetes.container_image
kubernetes.container_image_id
kubernetes.container_name
kubernetes.host
kubernetes.labels.pod-template-hash
kubernetes.labels.workload_user_cattle_io/workloadselector
kubernetes.master_url
kubernetes.namespace_id
kubernetes.namespace_labels.cattle_io/creator
kubernetes.namespace_labels.field_cattle_io/projectId
kubernetes.namespace_name
kubernetes.pod_id
kubernetes.pod_name
log
log_type
projectID
stream
tag
I would like to limit these default fields and keep only a few of them.
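To make it concrete, this is roughly the kind of fluentd filter I was hoping to add somewhere in the pipeline to drop the fields I don't need. This is only a sketch: record_transformer and the "$." record_accessor notation for nested keys are standard fluentd, but the "**" match pattern and the chosen keys are just examples, and I don't know where (or whether) Rancher would let me plug such a section in.

```
# Sketch only: a filter I would add if Rancher allowed it.
# record_transformer is a built-in fluentd filter plugin;
# "$.kubernetes.xxx" is fluentd's record_accessor syntax for nested keys.
<filter **>
  @type record_transformer
  remove_keys $.kubernetes.master_url,$.kubernetes.pod_id,$.kubernetes.namespace_id,$.kubernetes.container_image_id,$.docker.container_id,stream,tag
</filter>
```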
I would like to be able to parse my log messages so that additional fields are generated in Elasticsearch based on their content. I usually do this with Logstash Grok patterns, but with Rancher this doesn’t seem possible.
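For example, with Logstash I would normally add a filter along these lines (the pattern is purely illustrative, for a log line that starts with a timestamp and a log level; the real pattern depends on the application's log format):

```
filter {
  grok {
    # Illustrative pattern only; "log" is the field that carries the raw container log line
    match => { "log" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}" }
  }
}
```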
What is the equivalent configuration for doing this in Rancher?
There is an “Edit as a file” option for fluentd, but from what I can see, only the output section is supposed to be written there. Is it possible to write a full fluentd configuration with input, filter, parser, and output plugins?
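What I would like to be able to put into that file is something along the lines of the sketch below. The parser filter and the regexp parse section are built-in fluentd features, but I don't know whether Rancher's generated configuration would accept extra sections like this, and the regular expression is just an example for a timestamp/level/message log line.

```
# Sketch of what I would like to add before the output section.
# "parser" is fluentd's built-in filter_parser plugin; it parses the raw
# "log" field and, with reserve_data, keeps the original fields as well.
<filter **>
  @type parser
  key_name log
  reserve_data true
  <parse>
    @type regexp
    expression /^(?<log_timestamp>\S+ \S+) (?<log_level>[A-Z]+) (?<log_message>.*)$/
  </parse>
</filter>
# ... the Elasticsearch output (match) section that Rancher already generates would follow here ...
```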
Why are we not allowed to send the logs to Logstash for parsing before they are indexed?