ELK stack Installation on OEL (Oracle Enterprise Linux)

Updated: Oct 25, 2019


Refer to my previous blog to install the Oracle Enterprise Linux operating system on your machine. If you are on another Linux distribution such as CentOS, Ubuntu, or Red Hat Linux, these steps will be very similar.


Elasticsearch Installation


Before we start the Elasticsearch installation, make sure you have Java installed on your machine; if not, please refer to this.
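As a quick sanity check, you can verify a command is available before proceeding. The `require_cmd` helper below is just an illustration of the idea, not part of the Elasticsearch distribution:

```shell
# Hypothetical helper: check that a command (e.g. java) is on the PATH
# before proceeding with the installation.
require_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 found"
    return 0
  else
    echo "$1 missing - please install it first" >&2
    return 1
  fi
}

# Example: verify Java before installing Elasticsearch
require_cmd java || echo "Install Java first, then come back to this step"
```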


Once you have installed Java successfully, go to the link below and download the latest version of Elasticsearch.


https://www.elastic.co/downloads/



I downloaded the TAR file (elasticsearch-6.2.4.tar.gz) for this blog.


For machines with a GUI (e.g. CentOS or Ubuntu desktops): once you have downloaded the file on your local machine, move it to the Linux environment where you want to run Elasticsearch. I use MobaXterm (a free SSH/SFTP client) to transfer files from my Windows machine to the Linux environment (a Red Hat Linux client without a GUI in this case).


For non-GUI Linux machines: simply run wget on your Linux machine (if wget is not installed, run this command as root to install it: yum install wget -y).


Run the commands below to install Elasticsearch as any user except root. Change the version according to your requirements; the final command renames the directory to drop the 6.2.4 version suffix for simplicity.


wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.2.4.tar.gz

tar -xvzf elasticsearch-6.2.4.tar.gz


rm -f elasticsearch-6.2.4.tar.gz

mv elasticsearch-6.2.4 elasticsearch
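If you expect to repeat this for other releases, the same four steps can be sketched with the version held in a variable. `ES_VERSION` is my own name here, not anything Elastic defines; the download/extract commands are commented out so you can review the URL first:

```shell
# Sketch: parameterize the Elasticsearch version so the same steps
# work for any release. ES_VERSION is an assumption - set it to the
# release you actually want.
ES_VERSION="6.2.4"
ES_TARBALL="elasticsearch-${ES_VERSION}.tar.gz"
ES_URL="https://artifacts.elastic.co/downloads/elasticsearch/${ES_TARBALL}"

echo "Would download: ${ES_URL}"
# wget "${ES_URL}"
# tar -xvzf "${ES_TARBALL}"
# rm -f "${ES_TARBALL}"
# mv "elasticsearch-${ES_VERSION}" elasticsearch
```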



Start Elasticsearch


To start Elasticsearch, navigate to the Elasticsearch directory and launch the elasticsearch binary.


cd elasticsearch/

./bin/elasticsearch


Running Elasticsearch in Background


You can also start Elasticsearch in the background with the commands below.

  • Run nohup and disown the process.

  • Later you can find the Java process running on your machine, or simply note down the PID that is printed after executing nohup.



In the case below, 25605 is the PID.

[hadoop@elasticsearch elasticsearch]$ nohup ./bin/elasticsearch &
[1] 25605
[hadoop@elasticsearch elasticsearch]$ nohup: ignoring input and appending output to ‘nohup.out’
disown
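Instead of reading the PID off the terminal, you can capture it with the shell's `$!` variable. The sketch below uses `sleep` as a stand-in for `./bin/elasticsearch` so it is safe to try anywhere; on a real install you would substitute the Elasticsearch command and not kill it afterwards:

```shell
# Sketch of the nohup-and-disown approach, with `sleep` standing in
# for ./bin/elasticsearch. $! holds the PID of the last background job.
nohup sleep 30 >/dev/null 2>&1 &
PID=$!
disown 2>/dev/null || true   # detach from the shell's job table (bash)
echo "Started with PID ${PID}"
kill "${PID}"                # only for this demo; leave Elasticsearch running
```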

[hadoop@elasticsearch elasticsearch]$ ps -aux | grep java
hadoop 25605 226 6.1 4678080 1257552 pts/0 Sl 11:54 0:31 /usr/java/java/bin/java -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.io.tmpdir=/tmp/elasticsearch.zbtKhO5i -XX:+HeapDumpOnOutOfMemoryError -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintTenuringDistribution -XX:+PrintGCApplicationStoppedTime -Xloggc:logs/gc.log -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=32 -XX:GCLogFileSize=64m -Des.path.home=/home/hadoop/apps/installers/elasticsearch -Des.path.conf=/home/hadoop/apps/installers/elasticsearch/config -cp /home/hadoop/apps/installers/elasticsearch/lib/* org.elasticsearch.bootstrap.Elasticsearch



Note: If you get the error below, make sure you are not logged in as root. Remove the file, log in as a different user, and redo the steps above. Remember, Elasticsearch must be installed as a user other than root.

Error: java.nio.file.AccessDeniedException: /home/hadoop/apps/installers/elasticsearch/config/jvm.options

Verify Elasticsearch installation


[hadoop@localhost etc]$ curl http://localhost:9200
{
  "name" : "akY11V_",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "3O3dLMIDRYmJa1zrqNZqug",
  "version" : {
    "number" : "6.2.4",
    "build_hash" : "ccec39f",
    "build_date" : "2018-04-12T20:37:28.497551Z",
    "build_snapshot" : false,
    "lucene_version" : "7.2.1",
    "minimum_wire_compatibility_version" : "5.6.0",
    "minimum_index_compatibility_version" : "5.0.0"
  },
  "tagline" : "You Know, for Search"
}

Or, if your operating system has a GUI, you can simply open http://localhost:9200 in your browser.
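If you want to check the installed version from a script, you can extract the "number" field from the JSON that curl returns. The `es_version` helper below is my own sketch using only sed, shown here on a sample of the output above:

```shell
# Sketch: pull the version number out of the JSON returned by
# `curl http://localhost:9200`, using only sed.
es_version() {
  sed -n 's/.*"number" *: *"\([^"]*\)".*/\1/p'
}

# On a live node you would pipe curl into it:
# curl -s http://localhost:9200 | es_version
```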


Kibana Installation


Follow similar steps to download the latest Kibana release from the link below:

https://www.elastic.co/downloads/kibana



Move the TAR file to your Linux machine, or simply run wget to download it. Modify the version according to your requirements.


wget https://artifacts.elastic.co/downloads/kibana/kibana-6.2.4-linux-x86_64.tar.gz

tar -xvzf kibana-6.2.4-linux-x86_64.tar.gz


rm -f kibana-6.2.4-linux-x86_64.tar.gz

mv kibana-6.2.4-linux-x86_64 kibana


Now, uncomment this line in the kibana.yml file: elasticsearch.url: "http://localhost:9200"

cd kibana

vi config/kibana.yml
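If you prefer not to edit the file by hand, the same change can be made with sed. The sketch below demonstrates it on a throwaway temp file; on a real install you would point sed at kibana/config/kibana.yml instead:

```shell
# Sketch: uncomment the elasticsearch.url line with sed instead of vi.
# Demonstrated on a temp copy so it is safe to run anywhere.
CONF=$(mktemp)
echo '#elasticsearch.url: "http://localhost:9200"' > "${CONF}"

sed -i 's|^#\(elasticsearch\.url:.*\)|\1|' "${CONF}"

cat "${CONF}"   # the line is now active
rm -f "${CONF}"
```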


Start Kibana


[hadoop@localhost kibana]$ ./bin/kibana
  log [21:09:12.958] [info][status][plugin:kibana@6.2.4] Status changed from uninitialized to green - Ready
  log [21:09:13.091] [info][status][plugin:elasticsearch@6.2.4] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log [21:09:13.539] [info][status][plugin:timelion@6.2.4] Status changed from uninitialized to green - Ready
  log [21:09:13.560] [info][status][plugin:console@6.2.4] Status changed from uninitialized to green - Ready
  log [21:09:13.573] [info][status][plugin:metrics@6.2.4] Status changed from uninitialized to green - Ready
  log [21:09:13.637] [info][listening] Server running at http://localhost:5601
  log [21:09:13.758] [info][status][plugin:elasticsearch@6.2.4] Status changed from yellow to green - Ready

You can also start Kibana in the background by executing the command below:

[hadoop@elasticsearch kibana]$ ./bin/kibana &
[2] 23866
[hadoop@elasticsearch kibana]$
  log [15:30:26.029] [info][status][plugin:kibana@6.2.4] Status changed from uninitialized to green - Ready
  log [15:30:26.164] [info][status][plugin:elasticsearch@6.2.4] Status changed from uninitialized to yellow - Waiting for Elasticsearch
  log [15:30:26.676] [info][status][plugin:timelion@6.2.4] Status changed from uninitialized to green - Ready
  log [15:30:26.701] [info][status][plugin:console@6.2.4] Status changed from uninitialized to green - Ready
  log [15:30:26.718] [info][status][plugin:metrics@6.2.4] Status changed from uninitialized to green - Ready
  log [15:30:26.781] [info][listening] Server running at http://localhost:5601
  log [15:30:26.861] [info][status][plugin:elasticsearch@6.2.4] Status changed from yellow to green - Ready
disown



Logstash Installation


Follow similar steps to download the latest Logstash release from the link below:

https://www.elastic.co/downloads/logstash



Or run the commands below, using wget, to download and install it:


wget https://artifacts.elastic.co/downloads/logstash/logstash-6.2.4.tar.gz

tar -xvzf logstash-6.2.4.tar.gz

rm -f logstash-6.2.4.tar.gz

mv logstash-6.2.4 logstash



Create a sample config file


cd logstash/config

vi logstash-simple.conf


input {
  stdin { }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}



Start Logstash


[hadoop@localhost logstash]$ ./bin/logstash -f ./config/logstash-simple.conf
Sending Logstash's logs to /home/hadoop/apps/installers/logstash/logs which is now configured via log4j2.properties
[2018-05-25T17:29:34,107][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/hadoop/apps/installers/logstash/modules/fb_apache/configuration"}
[2018-05-25T17:29:34,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/hadoop/apps/installers/logstash/modules/netflow/configuration"}
[2018-05-25T17:29:34,385][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/home/hadoop/apps/installers/logstash/data/queue"}
[2018-05-25T17:29:34,396][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/home/hadoop/apps/installers/logstash/data/dead_letter_queue"}
[2018-05-25T17:29:35,467][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-05-25T17:29:35,554][INFO ][logstash.agent] No persistent UUID file found. Generating new UUID {:uuid=>"1aad4d0b-71ea-4355-8c21-9623927af557", :path=>"/home/hadoop/apps/installers/logstash/data/uuid"}
[2018-05-25T17:29:37,391][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-05-25T17:29:38,775][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-25T17:29:48,843][INFO ][logstash.pipeline] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-05-25T17:29:50,008][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-05-25T17:29:50,030][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-05-25T17:29:50,614][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-05-25T17:29:50,781][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-05-25T17:29:50,789][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-05-25T17:29:50,834][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-25T17:29:50,873][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-25T17:29:50,963][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2018-05-25T17:29:51,421][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-05-25T17:29:51,646][INFO ][logstash.pipeline] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3c15ca66 run>"}
The stdin plugin is now waiting for input:
[2018-05-25T17:29:51,902][INFO ][logstash.agent] Pipelines running {:count=>1, :pipelines=>["main"]}
hello world
{
      "@version" => "1",
    "@timestamp" => 2018-05-25T21:34:46.148Z,
          "host" => "localhost.localdomain",
       "message" => "hello world"
}



Accessing Kibana dashboard


In order to access the Kibana dashboard remotely, edit the kibana.yml file in the kibana/config directory and set server.host: "0.0.0.0".

vi kibana.yml



Now try opening the link on your local browser.

http://{your machine ip}:5601

Note: If the link doesn't work, try stopping the firewall services on your server with the commands below (run as root):

service firewalld stop

service iptables stop
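Before blaming the firewall, it can help to check whether the port is answering at all. The `port_open` helper below is my own sketch; it relies on bash's /dev/tcp pseudo-device, so it needs bash rather than a plain POSIX sh:

```shell
# Sketch (bash): test whether a TCP port accepts connections,
# using bash's /dev/tcp redirection. Prints "open" or "closed".
port_open() {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

# Example: check Kibana's port once it is running
# port_open localhost 5601
```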


Here is a sample; in my case it's http://192.16x.x.xxx:5601




If you don't know your Linux machine's IP address, you can find it by running ifconfig on your machine (inet is your IP).
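If you want to script that lookup, the first non-loopback "inet" address can be picked out with awk. The `first_inet` helper is a sketch that assumes the modern ifconfig layout (`inet <addr> netmask ...`); older versions print `inet addr:<addr>` instead, so adjust the field if needed:

```shell
# Sketch: extract the first non-loopback IPv4 "inet" address
# from ifconfig-style output.
first_inet() {
  awk '/inet / && $2 != "127.0.0.1" { print $2; exit }'
}

# Real usage on your machine:
# ifconfig | first_inet
```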


I hope you enjoyed this post. Please comment below if you have any questions. Thank you!



Next: Loading data into Elasticsearch with Logstash



#InstallElasticsearch #Installation #ELKStack #Elasticsearch #Kibana #Logstash
