Developing Logstash pipelines can be frustrating. Finding ways to send data to Logstash, editing the filters and testing them can be painful.
So I set out to make pipeline development easier. For the project to be successful, the following criteria had to be met:
- The tool had to support any client platform, be it Windows, Linux or macOS
- It had to be easy to use without knowing Linux commands
- The interface should give direct results from the Logstash output section
The result is a combination of a UI built with Node.js and Logstash, running together in Docker.
Before you start
Please note that this tool is here to test the filter section of a pipeline, not the input or output.
- The script natively supports tcp/udp + syslog inputs
- If you want to use other types of input, you need to send the data directly to Logstash and also expose the corresponding port
- Each output section needs to be defined as stated below
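For the direct-input case, a minimal sketch of what exposing an extra port could look like (the port number 5045 and service name `logstash` are assumptions; adjust them to your own `docker-compose.yml`):

```yaml
# Hypothetical docker-compose.yml fragment: publish an extra port so data
# can be sent straight to a custom Logstash input (5045 is an example).
services:
  logstash:
    ports:
      - "5045:5045"
```

The corresponding input in your pipeline would then listen on the same port inside the container.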
How to start
Video on how to get started:
Please note that the web interface is listening to port 8080, not 3000 as depicted in the video.
Written instructions
- If needed, install docker on your machine
- Clone the repository on GitHub.
- Copy your pipeline folder to `logstash/logstash-config/pipeline`
- Replace the output section in each pipeline with the tool's predefined output:

```
output {
  http {
    format => "json"
    http_method => "post"
    url => "http://pipeline-ui:8080/api/v1/receiveLogstashOutput"
  }
}
```
- Enter the repository directory
- Run `docker-compose build`
- Run `docker-compose up`
- Open up http://localhost:8080 in your browser to access the interface
- It will directly attempt to connect to the backend and logstash. Wait until both status buttons are green before starting to send data.
IMPORTANT – Depending on your Docker host, you might be asked whether you want to share your drive with the Docker service. If you get this message, share your drive, stop the containers and run `docker-compose up` again.
Adding pipelines
There are two ways: either modify the existing pipelines, or follow the steps below.
- Modify `logstash-config/pipelines.yml`:

```yaml
# Example covers the creation of a pipeline called mypipeline
- pipeline.id: mypipeline
  path.config: "/usr/share/logstash/pipeline/mypipeline"
```
- Then create a directory in `logstash-config/pipeline` called `mypipeline`
- Copy your Logstash config to `logstash-config/pipeline/mypipeline`
- Last but not least, restart the containers using `docker-compose up`
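As an illustration, a minimal config file for the new pipeline could look like the sketch below (the file name, the TCP port 5045 and the grok pattern are assumptions for the example; only the output section is fixed by the tool):

```
# logstash-config/pipeline/mypipeline/mypipeline.conf (hypothetical example)
input {
  # Listen for raw lines over TCP; the port must also be exposed in docker-compose.yml
  tcp { port => 5045 }
}
filter {
  # Replace with the filter logic you actually want to test
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}
output {
  # The tool's predefined output, which feeds results back to the web UI
  http {
    format => "json"
    http_method => "post"
    url => "http://pipeline-ui:8080/api/v1/receiveLogstashOutput"
  }
}
```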
Modifying pipelines
Since the Logstash pipeline directory is mounted in the containers, Logstash should detect file changes and reload the pipeline. If this does not happen for some reason, restart the containers.
Troubleshooting
Check the docker-compose window for any exceptions thrown.
Reporting issues
First, please check if there are any existing issues that match your problem. If not, feel free to submit an issue here on GitHub.
I have very limited time, so I won't be able to act fast on any issue, but it's always good to have it logged, and who knows, maybe someone else will pick it up and make a PR.
Application flow diagram
Screenshot from the application
Some tips on workflow
- Wait until the pipelines have started and the Logstash state turns green.
- Enter one or more raw log lines in the log line input field.
- Pick a predefined pipeline or enter port and protocol manually.
- Click send.
- Check out the result in the logstash text area to the right.
- Update the pipeline file if you're not happy with the outcome; Logstash will automatically re-read the config when you save.
- Rinse, repeat.
Here are a few tips in case things don't work:
- Pipelines in the development environment need to use the http output; look at other pipelines and copy if unsure.
- Pipelines can be configured to drop events, in which case you won't get any result.
- A grok failure means that the line you sent could not be parsed by a grok expression in the pipeline.
- To troubleshoot Grok I can highly recommend the Heroku Grok Debugger.
- Add an issue on GitHub
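To make the grok-failure case above concrete, here is a sketch of a filter (the pattern and field names are invented for the example). If an incoming line does not match the pattern, Logstash tags the event with `_grokparsefailure` instead of extracting the fields:

```
filter {
  grok {
    # Matches lines like: 2023-10-11T22:14:15 ERROR something broke
    # Non-matching lines get the _grokparsefailure tag in the output
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
```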