Question

I am having a difficult time getting Logstash, Elasticsearch and Kibana working together in my Windows 7 environment.

I have set all three up and they all seem to be running fine: Logstash and Elasticsearch run as Windows services, and Kibana runs as a website in IIS.

Logstash is running at http://localhost:9200

I have a web application that creates .txt log files in the format:

Datetime=[DateTime], Value=[xxx]

The log files get created in this directory:

D:\wwwroot\Logs\Errors\

My logstash.conf file looks like this:

input {
  file {
    format => ["plain"]
    path => ["D:\wwwroot\Logs\Errors\*.txt"]
    type => "testlog"
  }
}

output {
  elasticsearch {
    embedded => true
  }
}

My Kibana config.js file looks like this:

define(['settings'],
function (Settings) {
  return new Settings({
    elasticsearch: "http://localhost:9200",
    kibana_index: "kibana-int",
    panel_names: [
      'histogram',
      'map',
      'pie',
      'table',
      'filtering',
      'timepicker',
      'text',
      'fields',
      'hits',
      'dashcontrol',
      'column',
      'derivequeries',
      'trends',
      'bettermap',
      'query',
      'terms'
    ]
  });
});

When I view Kibana I see the error:

No index found at http://localhost:9200/_all/_mapping. Please create at least one index. If you're using a proxy ensure it is configured correctly.

I have no idea how to create the index, so if anyone can shed some light on what I am doing wrong, that would be great.


Solution

It seems like nothing is making it to Elasticsearch currently.

For the current version of Elasticsearch (0.90.5), I had to use the elasticsearch_http output. The elasticsearch output plugin seemed to be tied too closely to Elasticsearch 0.90.3.

For example, here is my config for getting log4j-format logs into Elasticsearch:

input {
  file {
    path => "/srv/wso2/wso2am-1.4.0/repository/logs/wso2carbon.log"
    path => "/srv/wso2/wso2as-5.1.0/repository/logs/wso2carbon.log"
    path => "/srv/wso2/wso2is-4.1.0/repository/logs/wso2carbon.log"
    type => "log4j"
  }
}

output {
  stdout { debug => true debug_format => "ruby" }

  elasticsearch_http {
    host => "localhost"
    port => 9200    
  }
}
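
To verify that events are actually reaching Elasticsearch, you can query it directly. This is just a quick check; Logstash writes to indices named logstash-YYYY.MM.dd by default, and a browser works just as well as curl on Windows:

# list the indices Elasticsearch knows about
curl "http://localhost:9200/_aliases?pretty"

# search across the default logstash-* indices for any document
curl "http://localhost:9200/logstash-*/_search?q=*&pretty"

If those return no indices or no hits, the problem is on the Logstash side rather than in Kibana.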

For my file format, I have a grok filter as well, to parse it properly:

filter {
  if [message] !~ "^[ \t\n]+$" {
    # if the line is a log4j type
    if [type] == "log4j" {
      # parse out fields from log4j line
      grok {
        match => [ "message", "TID:%{SPACE}\[%{BASE10NUM:thread_name}\]%{SPACE}\[%{WORD:component}\]%{SPACE}\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}%{LOGLEVEL:level}%{SPACE}{%{JAVACLASS:java_file}}%{SPACE}-%{SPACE}%{GREEDYDATA:log_message}" ]
        add_tag => ["test"]
      }

      if "_grokparsefailure" not in [tags] {
        mutate {
          replace => ["message", " "]
        }
      }
      multiline {
        pattern => "^TID|^ $"
        negate => true
        what => "previous"
        add_field => {"additional_log" => "%{message}"}
        remove_field => ["message"]
        remove_tag => ["_grokparsefailure"]
      }
      mutate {
        strip => ["additional_log"]
        remove_tag => ["test"]
        remove_field => ["message"]
      }
    }
  } else {
    drop {}
  }
}
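
For the log format in the question (Datetime=[DateTime], Value=[xxx]) a much simpler filter should be enough. This is only a sketch, assuming the timestamp is ISO8601-like and the value is a single field; the pattern and the date format string are guesses and will need adjusting to the real log lines:

filter {
  if [type] == "testlog" {
    # hypothetical pattern for lines such as: Datetime=2013-11-20 10:15:00, Value=42
    grok {
      match => [ "message", "Datetime=%{TIMESTAMP_ISO8601:logdate}, Value=%{GREEDYDATA:value}" ]
    }
    # use the parsed timestamp as the event's @timestamp (adjust the format to the actual files)
    date {
      match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
    }
  }
}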

Also, I would install the elasticsearch-head plugin to monitor your content in Elasticsearch, so you can easily verify the data and the state it is in.
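
If I remember right, on Elasticsearch 0.90.x the head plugin can be installed with the plugin script that ships with Elasticsearch (plugin.bat on Windows) and then opened in a browser:

# run from the Elasticsearch installation directory (bin\plugin.bat on Windows)
bin/plugin -install mojombo/elasticsearch-head

# then browse to http://localhost:9200/_plugin/head/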

Licensed under: CC-BY-SA with attribution