
Tech Notes


Elasticsearch is a great tool for analyzing the logs generated by your application.

The Elastic Stack (ELK) is made up of 3 parts:

  1. Elasticsearch –> the core search engine
  2. Logstash –> parses the logs
  3. Kibana –> visualization
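For orientation, a minimal Logstash pipeline for this use case might look like the sketch below. The log path, the custom pattern name (CLOUDSTACKLOG), and the index name are assumptions for illustration; the real filter lives in the GitHub repository mentioned later in this post.

```conf
input {
  file {
    # Assumed location of the CloudStack management server log
    path => "/var/log/cloudstack/management/management-server.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Custom grok patterns shipped with the repo
    patterns_dir => ["/etc/logstash/pattern"]
    # CLOUDSTACKLOG is a hypothetical pattern name used here for illustration
    match => { "message" => "%{CLOUDSTACKLOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cloudstack-%{+YYYY.MM.dd}"   # assumed index name
  }
}
```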

Here I have imported the CloudStack management server logs into the ELK stack, so I can perform various analytics.

  1. Follow the link below to set up your ELK infrastructure:

https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elastic-stack-on-ubuntu-18-04

  2. Once done, make sure your Elasticsearch, Logstash, and Kibana processes are able to communicate with each other.

  3. Get the Logstash configuration from the following GitHub repository.

This GitHub repo contains the Logstash filtering for the CloudStack management server:

https://github.com/kiranchavala/logstash-cloudstack

  4. Example

Make sure your logstash.conf is under /etc/logstash.

The grok pattern folder is under /etc/logstash/pattern
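As a quick sanity check before running the full pipeline, you can simulate what a grok pattern would extract from a single log line using standard shell tools. The sample log line and field names below are assumptions for illustration, not taken from the repository:

```shell
# Hypothetical CloudStack management server log line (assumption for illustration)
line='2023-05-10 12:01:02,345 INFO  [o.a.c.f.j.i.AsyncJobManagerImpl] (API-Job-Executor-1:ctx-abc) submit async job-514, cmd: deployVirtualMachine'

# Pull out the pieces a grok pattern would typically capture
log_level=$(echo "$line" | awk '{print $3}')
job_id=$(echo "$line" | sed -n 's/.*async job-\([0-9][0-9]*\).*/\1/p')

echo "level=$log_level job=$job_id"   # level=INFO job=514
```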

Run the Logstash binary:

/usr/share/logstash/bin/logstash -f /etc/logstash/cloudstack.conf

This outputs the logs into the Elasticsearch index and creates various fields.

  5. Sample KQL queries

To identify the total number of deployVirtualMachine jobs:

api_command.keyword : deployVirtualMachine and api_command_status : START

To trace a specific async job (e.g. job-514) from submission to completion:

log_message : "submit async job-514" or log_message : "Complete async job-514"
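The first count above can also be approximated directly on a raw log file with grep, which is handy for cross-checking the Kibana numbers. The sample log lines below are assumptions for illustration:

```shell
# Hypothetical sample of management server log lines (assumptions for illustration)
cat > /tmp/cs-sample.log <<'EOF'
2023-05-10 12:01:02,345 INFO  [o.a.c.a.ApiServer] (API-1:ctx-a) submit async job-514, cmd: deployVirtualMachine
2023-05-10 12:01:09,120 INFO  [o.a.c.a.ApiServer] (API-2:ctx-b) submit async job-515, cmd: listVirtualMachines
2023-05-10 12:02:44,901 INFO  [o.a.c.a.ApiServer] (API-3:ctx-c) submit async job-516, cmd: deployVirtualMachine
EOF

# Count deployVirtualMachine submissions, analogous to the first KQL query above
count=$(grep -c 'deployVirtualMachine' /tmp/cs-sample.log)
echo "$count"   # 2
```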



About

I am a Software Engineer at Persistent Systems, working on CloudStack orchestration and various technologies related to cloud infrastructure and containers, such as Docker and Kubernetes.