This page provides examples and explanations for configuring Promtail, a log collection agent for Loki. It covers various scenarios including log parsing, transformation, and filtering.
Here are some useful examples and resources for setting up and configuring Promtail:
- Promtail Gist Examples
- Install Promtail as a Service
- Promtail Pipelines Documentation
- Parsing Labels from Log Tag
- Docker Logging Configuration
- Create Labels from Filename
- Drop Logs Based on Content
- Drop Logs with Match Stage
- Using Promtail with Loki (YouTube)
- Promtail Scraping Documentation
- Promtail Setup on AWS EC2
- Promtail Pipeline Templating
Stay updated with ongoing development and known issues:
Learn how to query your logs effectively using LogQL:
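For example, assuming the job and level labels created by the pipelines further down this page, a query might look like this (the "timeout" filter string is purely illustrative):

{job="app1", level="error"} |= "timeout"

This selects entries from the app1 job labelled error and keeps only lines containing "timeout".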
Promtail pipelines allow you to process logs before they are sent to Loki. This includes stages for parsing, transforming, and dropping logs.
More Info:
The pipeline example below takes the current value of `level` from the extracted map and converts its value to all lowercase. For example, if the extracted map contained `level` with a value of `INFO`, this pipeline would change its value to `info`.
Pipeline Transform example: Change uppercase log levels to lowercase (e.g., INFO to info):
scrape_configs:
  - job_name: app1
    static_configs:
      - targets:
          - localhost
        labels:
          job: app1
          environment: production
          host: app1.mydomain.com
          service: app1
          __path__: /var/log/app1_*.log
    pipeline_stages:
      - match:
          selector: '{service="app1"}'
          stages:
            - regex:
                expression: "(?P<level>(INFO|WARNING|ERROR))(.*)"
            - template:
                source: level
                template: '{{ ToLower .Value }}'
            - labels:
                level:
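In this example the match stage scopes the transformation to entries matching {service="app1"}, the regex stage extracts level into the extracted map, the template stage lowercases it, and the labels stage attaches the result as a level label so it can be used in LogQL selectors.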
Convert stdout to info, stderr to error:
pipeline_stages:
  - regex:
      expression: '(?P<level>(stdout|stderr))'
  - template:
      source: level
      template: '{{ if eq .Value "stdout" }}{{ Replace .Value "stdout" "info" -1 }}{{ else if eq .Value "stderr" }}{{ Replace .Value "stderr" "error" -1 }}{{ else }}{{ .Value }}{{ end }}'
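If level should also be available as a label here, the template stage would typically be followed by a labels stage, as in the previous example:

  - labels:
      level: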
In this scenario, we want to drop specific logs to prevent them from appearing in Loki.
Reference: Drop Logs with Match Stage
scrape_configs:
  - job_name: qa/docker
    entry_parser: raw
    static_configs:
      - targets:
          - localhost
        labels:
          job: preprod/docker
          service: app1
          __path__: /var/lib/docker/containers/*/*-json.log
    pipeline_stages:
      - match:
          pipeline_name: 'drop_elb_healthchecks'
          selector: '{service="app1"} |= "ELB-HealthChecker"'
          action: drop
      - match:
          pipeline_name: 'drop_ecs_agent_logs'
          selector: '{service="app1"} |~ ".*(Managed task|Task engine).*" |~ "arn:aws:ecs:eu-west-1:000000000000:task"'
          # selector: '{service="app1"} |~ ".*Managed task.*" |= "eu-west-1:000000000000:task"'
          action: drop
      - match:
          pipeline_name: 'drop_blackbox_exporter_checks'
          selector: '{service="app1"} |~ ".*Go-http-client.*" |= "GET /login"'
          action: drop
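As a side note, recent Promtail releases also ship a dedicated drop stage. A minimal sketch of the first rule above expressed that way (assuming your Promtail version includes the drop stage):

pipeline_stages:
  - drop:
      # Without a source, the expression is matched against the whole log line;
      # any line containing the health-checker user agent is discarded.
      expression: ".*ELB-HealthChecker.*"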
This configuration demonstrates how to parse the Docker log --log-opt tag to extract container and image information.
scrape_configs:
  - job_name: system
    static_configs:
      - targets:
          - localhost
        labels:
          job: varlogs
          __path__: /var/log/*log
  - job_name: containers
    entry_parser: raw
    static_configs:
      - targets:
          - localhost
        labels:
          job: containerlogs
          __path__: /var/lib/docker/containers/*/*log
    # --log-opt tag="{{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}"
    pipeline_stages:
      - json:
          expressions:
            stream: stream
            attrs: attrs
            tag: attrs.tag
      - regex:
          expression: (?P<image_name>(?:[^|]*[^|])).(?P<container_name>(?:[^|]*[^|])).(?P<image_id>(?:[^|]*[^|])).(?P<container_id>(?:[^|]*[^|]))
          source: "tag"
      - labels:
          tag:
          stream:
          image_name:
          container_name:
          image_id:
          container_id:
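For reference, the tag format shown in the comment above is configured on the Docker side. An illustrative invocation (the image name is a placeholder):

docker run --log-driver=json-file --log-opt tag="{{.ImageName}}|{{.Name}}|{{.ImageFullID}}|{{.FullID}}" your-image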
This workaround uses file-based service discovery to manage Promtail targets for Docker containers.
Generate the targets file with `docker ps --format`:
docker ps --format '- targets: ["{{.ID}}"]
  labels:
    container_name: "{{.Names}}"' > /etc/promtail/promtail-targets.yaml
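With two containers running, the generated promtail-targets.yaml would look roughly like this (the IDs and names here are made up):

- targets: ["7c2f8a9b1d3e"]
  labels:
    container_name: "nginx"
- targets: ["0f1e2d3c4b5a"]
  labels:
    container_name: "app1"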
Promtail configuration using file_sd_configs:
scrape_configs:
  - job_name: containers
    entry_parser: docker
    file_sd_configs:
      - files:
          - /etc/promtail/promtail-targets.yaml
    relabel_configs:
      - source_labels: [__address__]
        target_label: container_id
      - source_labels: [container_id]
        target_label: __path__
        replacement: /var/lib/docker/containers/$1*/*.log
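The first relabel rule copies the discovered target address (the container ID written into the targets file) into a container_id label; the second uses that ID to build the __path__ glob that Promtail tails.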
This section demonstrates how to extract nested JSON data, specifically the level key from a log line.
Example log line:
{"log": {"level": "info", "timestamp":"2023-02-26 21:55:24.702"}}
The Promtail configuration to extract the nested level:
pipeline_stages:
  - cri: {}
  - multiline:
      firstline: ^\d{4}-\d{2}-\d{2} \d{1,2}:\d{2}:\d{2},\d{3}
      max_wait_time: 3s
  - json:
      expressions:
        log:
  - json:
      expressions:
        level:
      source: log
  - labels:
      level:
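With level attached as a label, the nested value can be used directly in a selector such as {job="containerlogs", level="info"} (the job name here is illustrative).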