Lab 3: Kibana Dashboard

Build a Kibana dashboard

Kibana has a set of visualizations that you can configure and deploy into a dashboard. When you enable Auto-refresh, you get near-real-time monitoring for your web server. In the following sections, you will set up a number of visualizations and create a dashboard from those visualizations.

Set up index patterns for the web server and proxy server logs

In lab 2, you enabled logs to flow from Fluentd to Kinesis, and then had Logstash pull those logs and push them to Amazon Elasticsearch Service. In the Logstash configuration, you directed the logs from the Amazon Kinesis Data Stream to split based on a field value in the JSON document. This gave you the ability to view different formats of logs in separate indexes with entirely separate mappings. To work with the data, you need to set up index patterns for those logs. In lab 1, you practiced creating index patterns with Metricbeat data. Navigate to the Settings plugin. You are going to create two additional index patterns so that you can build dashboards and visualizations.

img

Click “Index Patterns” to create a new pattern. As in lab 2, a list of existing indexes appears to assist your selection. Start typing proxy-* to match all indexes whose names fit that pattern.
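If your Kibana deployment includes the Dev Tools console, you can double-check which indexes the pattern will match with the _cat API (a sketch; the proxy-* index names come from this lab's Logstash configuration):

```json
GET _cat/indices/proxy-*?v
```

The `?v` flag adds a header row to the output, making the index names, document counts, and sizes easier to read.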

As you did in lab 2, create the pattern and select the @timestamp field for the time series relationship. Then repeat the same steps for the webapp-* index.

With the index patterns in place, you can create visualizations and dashboards.

A word on Elasticsearch aggregations

Kibana builds visualizations based on the Elasticsearch aggregations feature. To understand how to build visualizations, you need to understand aggregations.

Elasticsearch is a search engine first, and an analytics engine second. When you send log data into an Elasticsearch cluster, you, or the ingest technology you are using, parse each log line and build structured JSON documents from the values in it. Here is an example log line:

192.168.0.167 - - [21/Nov/2017:00:15:18 +0000] "GET / HTTP/1.1" 200 12943 "-" "ELB-HealthChecker/2.0"

When Fluentd receives that line, it parses the full string and assigns the values to JSON elements. Each element represents a single field, whose value is the value from the log line. Fluentd parses and structures the above log line to produce:

{
	"request": "/",
	"agent": "\"ELB-HealthChecker/2.0\"",
	"auth": "-",
	"ident": "-",
	"verb": "GET",
	"referrer": "\"-\"",
	"@timestamp": "2017-11-21T00:15:18.949Z",
	"response": "200",
	"bytes": "12943",
	"clientip": "192.168.0.167",
	"httpversion": "1.1",
	"timestamp": "21/Nov/2017:00:15:18 +0000"
}

When you perform a search, you specify fields (explicitly or implicitly), and values to match against those fields. Elasticsearch retrieves documents from its index whose field values match the fields you specified in the query. The result of this retrieval is a match set.

Elasticsearch then creates an aggregation by iterating over the match set. It creates buckets according to the aggregation (e.g., time slices) and calculates a numeric value (e.g., a count) for each bucket, placing each document in the appropriate bucket based on its field value. For example, a search for documents with a @timestamp in the range of 15 minutes ago to now might yield 60 matches. An aggregation of those values with 1-minute buckets would increment the count in the newest bucket (1 minute ago to now) for each document whose @timestamp falls in that bucket's range.
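The time-slice example above corresponds to a range query combined with a date_histogram aggregation. A sketch in Kibana Dev Tools style (the index pattern and field come from this lab; note that newer Elasticsearch versions rename the interval parameter to fixed_interval or calendar_interval):

```json
GET webapp-*/_search
{
  "size": 0,
  "query": {
    "range": { "@timestamp": { "gte": "now-15m", "lte": "now" } }
  },
  "aggs": {
    "per_minute": {
      "date_histogram": { "field": "@timestamp", "interval": "1m" }
    }
  }
}
```

Setting `"size": 0` suppresses the matched documents themselves, so the response contains only the buckets with their counts.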

Aggregations nest. Elasticsearch can take all of the documents in a bucket and create sub-buckets based on a second field. For example, if the top-level buckets are time slices, a useful sub-bucket is the response field. Continuing the example, Elasticsearch creates a sub-bucket for each value of the response field present in the documents in that bucket, and increments a counter in the sub-bucket for each document with that value. This analysis displays as a stacked bar chart, with one bar per time slice and the height of each sub-bar proportional to its count.
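A sub-bucket is expressed as an aggs clause nested inside the parent aggregation. A sketch, assuming the string response field was indexed with a default .keyword sub-field (this depends on your mapping):

```json
GET webapp-*/_search
{
  "size": 0,
  "aggs": {
    "per_minute": {
      "date_histogram": { "field": "@timestamp", "interval": "1m" },
      "aggs": {
        "by_response": {
          "terms": { "field": "response.keyword" }
        }
      }
    }
  }
}
```

Each 1-minute bucket in the response then carries its own terms buckets, one per distinct response code, which is exactly the shape a stacked bar chart needs.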

Count is not the only function that Elasticsearch can perform. It can also compute sums, averages, minimums, maximums, standard deviations, and more. These functions combine to form a rich basis for what Kibana displays.
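For example, swapping a count for an average or a maximum is just a different aggregation body. A sketch, assuming the bytes field is mapped as a numeric type (in the sample document above it appears as a string, so this works only if your template maps it to a number):

```json
GET webapp-*/_search
{
  "size": 0,
  "aggs": {
    "avg_bytes": { "avg": { "field": "bytes" } },
    "max_bytes": { "max": { "field": "bytes" } }
  }
}
```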

Simple metrics

The simplest thing you can do is to count the requests to your web server and display that count as a number. Click the Visualize tab at the left of the Kibana page, then click Create a visualization or the img button.

img

You can select from many different visualizations. Under New Visualization, click Metric.

img

You need to tell Kibana the indexes to search, and you do that by specifying the index pattern that you want to use. Click proxy-*.

img

You immediately get a metric named Count. Count totals the number of documents (web log lines) that the domain ingested in the last 15 minutes.

img

Add to that by creating another metric that reports the number of unique referrers that have sent requests to the domain. Click Add metrics.

img

Under Select metrics type, click Metric. Open the menu under Aggregation, and select Unique Count.

img

This reveals another menu: Field. Select referrer.keyword. Click img at the top of the entry panel to show the second metric in the visualization.

img

Repeat this process to add a Unique Count for the instance_type.keyword field.

img

This lets you know how many unique referrers are sending requests to your web servers, and from how many instance types. Your visualization should look like this:

img
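Under the hood, Kibana's Unique Count metric is the cardinality aggregation. A sketch of roughly what this visualization asks Elasticsearch (field names from this lab; cardinality counts are approximate by design):

```json
GET proxy-*/_search
{
  "size": 0,
  "aggs": {
    "unique_referrers": {
      "cardinality": { "field": "referrer.keyword" }
    },
    "unique_instance_types": {
      "cardinality": { "field": "instance_type.keyword" }
    }
  }
}
```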

Save your visualization for later use in your dashboard. At the top of the screen, click Save.

img

img

Name the visualization Metrics, and click the Save button. Navigate to the Simple Search Page in your browser and run a few searches. Come back to Kibana and you should see the counts increase.

Track result codes

To make sure that your website is functioning, you need to track result codes. You can build a simple visualization to see result codes over time. Click the Visualize tab; if you see a visualization instead of the screen below, click Visualize again to clear it. Click img to create a new visualization.

img

Select Vertical Bar as the type.

img

Click webapp-* under Name as the index pattern.

img

Many of Kibana's visualizations work with both an X- and a Y-axis. When you build these visualizations, you will usually start by dividing the X-axis into time slices (a Date Histogram aggregation) and then further subdividing by the value you want to graph.

img

Under Buckets, click X-Axis.

img

Then, select Date Histogram from the Aggregation menu.

Kibana automatically selects the @timestamp field. If you click img now, you will see a duplicate of the Discover pane, with a histogram of traffic in time slices. You will subdivide the time slices by the values in the response field. Click the Add sub-buckets button, then click Split Series under Select Buckets Type. Select Terms from the Sub Aggregation menu.

Then select code from the Field menu. Click img, and you will see something like this:

img

Now Save this visualization as Response codes. In the example here, nearly all responses are 200, along with a single 404.

That is somewhat interesting, but it is more interesting to monitor for error codes. Remember, Elasticsearch is a search engine: you can modify the results by adding a search in the search bar. Replace the * in the large text box with NOT response:200 and press Enter. This filters the data for this visualization to only those documents that do not have an HTTP 200 response; that is, errors. (If you do not have any error responses, the visualization may be empty.)
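Combining the search bar with the visualization's buckets, the request Kibana sends looks roughly like this sketch. The query_string query narrows the match set before the aggregation runs; the lab uses response in the search bar and code as the terms field, so both appear here, but the exact field names depend on your mapping:

```json
GET webapp-*/_search
{
  "size": 0,
  "query": {
    "query_string": { "query": "NOT response:200" }
  },
  "aggs": {
    "per_minute": {
      "date_histogram": { "field": "@timestamp", "interval": "1m" },
      "aggs": {
        "by_code": { "terms": { "field": "code" } }
      }
    }
  }
}
```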

Create a dashboard for monitoring

You can gather all of your visualizations into a dashboard. With Auto-refresh on, you can monitor the key metrics of your website in near real time. Click the Dashboard tab, and then Create a dashboard.

img

Click the Add button.

img

You will see all of your saved visualizations. Click each one once to add it to the page, and then click Add again to collapse the panels drop-down.

img

The img control in the lower-left corner of each panel reveals a data table view.

You can save your dashboard by clicking Save at the top of the screen. Kibana stores the data for visualizations and dashboards in Elasticsearch itself.

Jump to Lab 4