
Tech Notes



The ELK stack is a great tool for analyzing logs generated by your application.

The ELK stack is made up of 3 parts:

  1. Elasticsearch –> the core search engine
  2. Logstash –> to parse the logs
  3. Kibana –> visualization

Here I have imported the CloudStack management server logs into the ELK stack so I can perform various analytics.

  1. Follow the link below to set up your ELK infrastructure.

  2. Once done, make sure your Elasticsearch, Logstash, and Kibana processes are able to communicate with each other.

  3. Get the Logstash configuration from the following GitHub repository.

This GitHub repo contains the Logstash filtering for the CloudStack management server.
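Step 2 above — the three processes being able to talk to each other — can be spot-checked over each service's HTTP endpoint. The ports below are the stock defaults and assume everything runs on one host; adjust to your setup:

```shell
# Elasticsearch cluster health (default port 9200)
curl -s localhost:9200/_cluster/health?pretty

# Logstash monitoring API (default port 9600)
curl -s localhost:9600/_node/pipelines?pretty

# Kibana status endpoint (default port 5601)
curl -s localhost:5601/api/status
```

If any of these hang or refuse the connection, fix the network or service configuration before loading the pipeline.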

  1. Example

Make sure your Logstash configuration file is under /etc/logstash.

The grok pattern folder is under /etc/logstash/pattern
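The repository's configuration will differ in detail, but a pipeline of this shape typically looks like the sketch below. The log path, index name, and grok pattern here are assumptions for illustration, not the repo's exact contents:

```conf
# /etc/logstash/cloudstack.conf -- minimal sketch, not the repo's exact config
input {
  file {
    path => "/var/log/cloudstack/management/management-server.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # Custom patterns live in the folder mentioned above
    patterns_dir => ["/etc/logstash/pattern"]
    # Assumed layout: timestamp, level, class, thread, free-text message
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log_level} \[%{DATA:class}\] \(%{DATA:thread}\) %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "cloudstack-%{+YYYY.MM.dd}"
  }
}
```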

Run the Logstash binary:

/usr/share/logstash/bin/logstash -f /etc/logstash/cloudstack.conf
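Before leaving it running, the same binary can validate the configuration file without starting the pipeline:

```shell
# Check the config for syntax errors, then exit
/usr/share/logstash/bin/logstash -f /etc/logstash/cloudstack.conf --config.test_and_exit
```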

This outputs the logs into the Elasticsearch index and creates various fields.
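As a rough illustration of what the filter produces, here is a hypothetical CloudStack-style log line pulled apart with sed into the same kind of fields; the exact field set in your index depends on the repo's grok patterns:

```shell
# Hypothetical sample line in the CloudStack management server log format
line='2021-05-10 12:34:56,789 INFO [c.c.a.ApiServer] (qtp-12:ctx-abc) submit async job-514'

# Roughly what the grok pattern extracts, sketched with sed:
timestamp=$(echo "$line" | sed -E 's/^([0-9-]+ [0-9:,]+) .*/\1/')
log_level=$(echo "$line" | sed -E 's/^[0-9-]+ [0-9:,]+ ([A-Z]+) .*/\1/')
log_message=$(echo "$line" | sed -E 's/.*\) //')

echo "$timestamp / $log_level / $log_message"
```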

  1. Sample KQL query

To identify the total number of deployVirtualMachine jobs:

api_command.keyword : deployVirtualMachine and api_command_status : START

log_message : "submit async job-514" or log_message : "Complete async job-514"
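For reference, the first KQL query above corresponds roughly to this query DSL call against the search API. The index name "cloudstack-*" is an assumption; use whatever index your Logstash output writes to:

```shell
curl -s localhost:9200/cloudstack-*/_search -H 'Content-Type: application/json' -d '
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "api_command.keyword": "deployVirtualMachine" } },
        { "match": { "api_command_status": "START" } }
      ]
    }
  }
}'
```

The total number of matching jobs is reported in the response under hits.total.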





I am a Software Engineer at Persistent Systems, working on CloudStack orchestration and various technologies related to cloud infrastructure and containers, such as Docker and Kubernetes.