Kibana GeoIP example: How to index geographical location of IP addresses into Elasticsearch


The relationship between an IP address and its geolocation is simple to look up. There are numerous services available today, such as MaxMind, IP2Location, IPstack, and Software77, where you can track the geolocation of an IP address. What's the benefit? It's very simple: it gives you another dimension along which to analyze your data.

Let's say my data shows where most of the user traffic is coming from. That doesn't make complete sense until I can say, specifically, that most of the traffic is coming from New Jersey.

When I say geolocation, it includes multiple attributes such as city, state, country, continent, region, currency, country flag, country language, latitude, and longitude. Most of the websites that provide geolocation are paid services.

But there are a few, like IPstack, that provide a free access token for calls to their REST APIs. Still, there are limitations, such as how many REST API calls you can make per day and which attributes you can pull. Suppose I want to show a specific city in a report, but the API only provides access to country and continent; then that data is obviously useless for me.

Now the best part: the Elastic Stack provides a free plugin called "GeoIP" that lets you look up millions of IP addresses. You might be wondering where it gets the location details. The answer is MaxMind, which I mentioned earlier. The GeoIP plugin internally does a lookup against a stored copy of the MaxMind database, which is updated regularly, and creates a number of extra fields including geo coordinates (latitude & longitude). These geo coordinates can be used to plot maps in Kibana.

ELK Stack Installation

I am installing the ELK stack on macOS; for installation on a Linux machine, refer to the Linux installation instructions. ELK installation is very easy on Mac with Homebrew. It's hardly a few minutes' task if done properly.

1. Homebrew Installation

Run this command in your terminal. If you have already installed Homebrew, move to the next step; if this command doesn't work, copy the latest install command from the Homebrew website.

$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

2. Java Installation

Check if java is installed on your machine.

$ java -version

java version "9.0.1"

If Java is not installed, run the following commands to install it.

$ brew tap caskroom/cask

$ brew cask install java

$ brew cask info java

3. Elasticsearch Installation

$ brew tap elastic/tap

$ brew install elastic/tap/elasticsearch-full

$ elasticsearch

If you see only INFO messages without any errors, the installation went fine. Let this run; don't kill the process.

Now, simply open localhost:9200 in your local browser. You will see a JSON response that includes the Elasticsearch version.
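You can also check from the terminal with curl; a typical response looks roughly like the sketch below (the name, cluster name, and version details will differ on your machine):

```
$ curl http://localhost:9200
{
  "name" : "your-hostname",
  "cluster_name" : "elasticsearch_...",
  "version" : {
    "number" : "7.x.x",
    ...
  },
  "tagline" : "You Know, for Search"
}
```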

[TIP] You might face a permission issue if you are not logged in as the root user. The root user is disabled by default on Mac for security reasons; it can be enabled via the Directory Utility app.

However, another solution is to change the folder permissions themselves. Run these commands if you want to change the folder permissions:

$ sudo chown -R $(whoami) /usr/local/include /usr/local/lib/pkgconfig

$ chmod u+w /usr/local/include /usr/local/lib/pkgconfig

Install the Xcode command line tools if they are missing:

$ xcode-select --install

4. Kibana Installation

$ brew install elastic/tap/kibana-full

$ kibana

Let this process run; don't kill it. Now open localhost:5601 in your local browser to check that Kibana is running properly.

5. Logstash Installation

$ brew install elastic/tap/logstash-full

Configuring Logstash for GeoIP

Let's begin with a few sample IP addresses, listed below. I generated this sample data randomly, so please ignore any known IP address that happens to be in the list. Honestly speaking, even I don't know where these IP addresses will point when we generate the maps.

Sample Data

1. Copy and paste these records into a flat file with an "ipaddress" header (sampleip.csv).
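If you just want to try the pipeline end to end, a minimal sampleip.csv can be created like this (the addresses below are well-known public DNS resolvers used purely as placeholders, not the original sample set):

```shell
# Create a small sampleip.csv with an "ipaddress" header.
# These IPs are public DNS resolvers, used here only as placeholder
# sample data for the GeoIP lookup.
cat > sampleip.csv <<'EOF'
ipaddress
8.8.8.8
1.1.1.1
9.9.9.9
208.67.222.222
EOF
```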


2. Make sure your Elasticsearch and Kibana services are up and running. If not, refer to my previous blog on how to restart them.

3. [Update 9/Aug/2019: not a mandatory step anymore] Install the GeoIP plugin for Elasticsearch. Run the command below from your Elasticsearch home directory.

Once the GeoIP plugin is installed successfully, you will find the plugin details under the Elasticsearch home plugin directory ("/elasticsearch/plugins"). If you are working in a clustered environment, you need to run the installation command on each node and then restart the services.

/elasticsearch/bin/elasticsearch-plugin install ingest-geoip

Newer versions of the Elastic Stack have a built-in GeoIP module, so you don't need to install it separately.
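Since the GeoIP lookup ships as an ingest processor in newer versions, you can sanity-check it with Elasticsearch's Simulate Pipeline API before touching Logstash. The request below (in Kibana Dev Tools syntax) is only a sketch; the field name "ip" and the test document are made up for illustration:

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "geoip": { "field": "ip" } }
    ]
  },
  "docs": [
    { "_source": { "ip": "8.8.8.8" } }
  ]
}
```

If the module is available, the response contains a geoip object with country, city, and location (latitude/longitude) fields for the document.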

Configure Logstash

Configure the Logstash config file to create a "logstash-iplocation" index. Please note that your index name should start with "logstash-", otherwise your attributes will not be mapped to the geo_point datatype properly.

This is because the default index pattern in the Logstash template is declared as logstash-*. You can change it if you want, but for now let's move ahead with logstash-iplocation.
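You can inspect that default template yourself. A trimmed sketch of the relevant part of the response is shown below; the exact JSON varies by Logstash and Elasticsearch version:

```
$ curl http://localhost:9200/_template/logstash?pretty
{
  "logstash" : {
    "index_patterns" : ["logstash-*"],
    "mappings" : {
      "properties" : {
        "geoip" : {
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}
```

Any index whose name matches logstash-* picks up this mapping automatically, which is why geoip.location becomes a geo_point.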

Below is the sample input, filter, and output configuration.

input {
  file {
    path => "/Volumes/MYLAB/testdata/sampleip.csv"
    start_position => "beginning"
    sincedb_path => "/Volumes/MYLAB/testdata/logstash.txt"
  }
}

filter {
  # Parse each line into an "ipaddress" column.
  csv {
    columns => ["ipaddress"]
  }
  # Look up the raw event line (the IP) against the GeoIP database.
  geoip {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "logstash-iplocation"
  }
  stdout { codec => rubydebug }
}


Important Notes

  • Your index name should be in lowercase and start with logstash-, for example logstash-abcd.

  • Also, a sincedb entry is created once per file input, so if you want to reload the same file, make sure you delete the sincedb file first.

  • You invoke the geoip plugin from the filter configuration; it has no relation to the input/output sections.
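For example, to reload the same CSV used in this post, you could delete the sincedb file from the config above before rerunning Logstash (adjust the path for your own setup):

```shell
# Delete the sincedb entry so Logstash re-reads the file from the beginning.
# This path matches sincedb_path in the config above; the -f flag keeps rm
# quiet if the file does not exist yet.
rm -f /Volumes/MYLAB/testdata/logstash.txt
```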

Run Logstash

Load the data into Elasticsearch by running the command below (it's a single-line command). Then wait; it will take a few seconds to load.

Change the home location according to your setup; for me it's the Homebrew-linked path shown below.

/usr/local/var/homebrew/linked/logstash-full/bin/logstash -f /usr/local/var/homebrew/linked/logstash-full/libexec/config/logstash_ip.config


Important Notes

  • Check that the geoip filter was invoked when you loaded the data into Elasticsearch.

  • Also, the datatype of the location field should be geo_point; otherwise there is some issue with your configuration.

  • The latitude and longitude datatypes should be float.

  • These datatypes act as confirmation that Logstash loaded the data as expected.
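One way to confirm these datatypes is to query the index mapping directly. A trimmed sketch of the response is shown below; the exact field list depends on your Elasticsearch version and the geoip filter's output:

```
$ curl http://localhost:9200/logstash-iplocation/_mapping?pretty
{
  "logstash-iplocation" : {
    "mappings" : {
      "properties" : {
        "geoip" : {
          "properties" : {
            "location"  : { "type" : "geo_point" },
            "latitude"  : { "type" : "float" },
            "longitude" : { "type" : "float" },
            ...
          }
        }
      }
    }
  }
}
```

If location shows up as anything other than geo_point, revisit the index-naming notes above.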

Kibana Dashboard Creation

1. Once the data is loaded into Elasticsearch, open the Kibana UI and go to the Management tab => Kibana Index Patterns.

2. Create a Kibana index pattern matching "logstash-iplocation" and hit Next.

3. Select the timestamp field if you want to use it with your index, and hit Create index pattern.

4. Now go to the Discover tab and select "logstash-iplocation" to see the data we just loaded.

You can expand the fields and see that geoip.location has the geo_point datatype. You can verify this by the "globe" icon just before the geoip.location field. If it's not there, something has gone wrong and the datatype mapping is incorrect.

5. Now go to the Visualize tab, select Coordinate Map from the visualization types, and choose "logstash-iplocation" as the index.

6. Apply the bucket settings (Buckets: Geo coordinates, Aggregation: Geohash, Field: geoip.location) and hit the "Play" button. That's it!! You have located all the IP addresses.

Thank you!! If you have any questions, please comment.

Next: Loading data into Elasticsearch using Apache Spark


©2020 by Data Nebulae