Splunk Forwarder

The Splunk forwarder is one of the components of the Splunk infrastructure. It acts as an agent for log collection from remote machines: it collects logs on those machines and forwards them to the indexer (the Splunk database) for further processing and storage. Unlike many traditional monitoring-tool agents, the Splunk forwarder consumes very little CPU, typically only 1-2%.

Splunk Universal Forwarders provide reliable, secure data collection from remote sources and forward that data into Splunk Enterprise for indexing and consolidation. They can scale to tens of thousands of remote systems, collecting terabytes of data with minimal impact on performance.

There are two types of Splunk forwarder:

Universal forwarder (UF)

Heavy weight forwarder (HWF)

Universal forwarder (UF) - a lightweight Splunk agent installed on a non-Splunk system to gather data locally; it cannot parse or index data.

Heavy weight forwarder (HWF) - a full instance of Splunk with advanced functionality.

A heavy weight forwarder works as a remote collector, intermediate forwarder, and possible data filter. Because heavy forwarders parse data, they are not recommended for production systems.

Forwarder functionalities:

- Tagging of metadata (source, sourcetype, and host)
- Configurable throttling and buffering
- Data compression
- SSL security
- Transport over any available network ports
- Local scripted inputs
- Centralized management
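
As a quick illustration of the metadata tagging above, a monitor stanza in a forwarder's inputs.conf can set the host, sourcetype, and target index explicitly (the path and names below are placeholders, not from this article):

[monitor:///opt/myapp/logs/app.log]     # hypothetical log path
host = web-server-01                    # override the host metadata field
sourcetype = myapp_log                  # sourcetype used for parsing on the indexer
index = main                            # index the events should land in
disabled = false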

These core tutorials will help you to learn the fundamentals of the Splunk Forwarder. For an in-depth understanding and practical experience, explore Splunk Training.

Installing Splunk forwarder 

Linux installation steps 

Download Splunk Universal Forwarder:

http://www.splunk.com/download/universalforwarder (choose the 64-bit package if applicable)

Install the forwarder using the commands below:

rpm -i splunk_install_file.rpm    # replace splunk_install_file.rpm with the downloaded file name
# Start the forwarder from its install directory and accept the license
su splunk -c "/opt/splunkforwarder/bin/splunk start --accept-license"   # replace /opt/splunkforwarder with your installation path
# Enable Splunk to start on boot
/opt/splunkforwarder/bin/splunk enable boot-start -user splunk   # enables boot start as the splunk user
# Set up the admin username and password
su splunk -c "/opt/splunkforwarder/bin/splunk edit user admin -password <your new password> -auth admin:changeme"   # change the default credentials
# Optional: only if you want to use the Deployment Server feature of your Splunk server
su splunk -c "/opt/splunkforwarder/bin/splunk set deploy-poll <ip:port>"
/etc/init.d/splunk restart
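
Once the forwarder has restarted, a quick sanity check (assuming the default /opt/splunkforwarder install path) is:

/opt/splunkforwarder/bin/splunk status     # should report that splunkd is running
/opt/splunkforwarder/bin/splunk version    # prints the installed forwarder version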

Enable receiving input on the index server. Configure the Splunk index server to receive data, either in the manager (Manager -> Forwarding and receiving -> Configure receiving -> New) or via the CLI:

/opt/splunk/bin/splunk enable listen 9997

where 9997 (the default) is the receiving port for Splunk forwarder connections.
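
Equivalently, receiving can be enabled on the indexer through inputs.conf rather than the GUI or CLI; a minimal sketch, assuming the standard 9997 receiving port:

# $SPLUNK_HOME/etc/system/local/inputs.conf on the indexer
[splunktcp://9997]
disabled = 0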

Configure the forwarder connection to the index server:

/opt/splunkforwarder/bin/splunk add forward-server hostname.domain:9997

where hostname.domain is the fully qualified address or IP of the index server (for example, indexer.splunk.com), and 9997 is the receiving port you created on the indexer (Manager -> Forwarding and receiving -> Configure receiving -> New).

Test the forwarder connection:

/opt/splunkforwarder/bin/splunk list forward-server

Add data:

/opt/splunkforwarder/bin/splunk add monitor /path/to/app/logs/ -index main -sourcetype %app%

where /path/to/app/logs/ is the path to the application logs on the host that you want to bring into Splunk, and %app% is the name you want to associate with that type of data. This creates an inputs.conf file in /opt/splunkforwarder/etc/apps/search/local/ -- documentation on inputs.conf is at http://docs.splunk.com/Documentation/Splunk/latest/admin/Inputsconf. Note: system logs in /var/log/ and application logs under /var/log/*/ can be added with similar monitor entries, as covered in the configuration steps below.
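
For reference, the add monitor command above writes a stanza roughly like the following into that inputs.conf (shown with the same placeholder path and sourcetype):

[monitor:///path/to/app/logs/]
index = main
sourcetype = %app%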

Configure Splunk forwarder to collect and forward logs

You can add data to the forwarder directly by clicking Settings >> Add data and providing the location of a log file on a local or remote server. But if you have to monitor logs on hundreds of servers, using the GUI every time is not practical. In that case, we can use Splunk configuration files to collect logs from multiple servers and locations. The configuration file for collecting/monitoring logs on a local or remote machine is inputs.conf. There are multiple inputs.conf files on a Splunk server; we will be editing the one in the $SPLUNK_HOME/etc/system/local directory. Below are the steps to add logs to the forwarder on Linux.

Splunk forwarder configuration step by step

Log in to the Splunk forwarder server.

Create and populate the app directory

First, create a folder for your “app”. An app is a directory of scripts and configuration files. By creating your app directory, you can control the behavior of its contents.

mkdir /Applications/splunkforwarder/etc/apps/yourappname/

Inside your app folder, create two more folders called bin and local:

mkdir /Applications/splunkforwarder/etc/apps/yourappname/bin
mkdir /Applications/splunkforwarder/etc/apps/yourappname/local

The bin folder is a Splunk security requirement. Any executable, such as a script, must reside in this folder. The local folder will contain two plain-text configuration (.conf) files:

inputs.conf
outputs.conf

Put simply, inputs.conf is the configuration file that controls executing the script and getting its data into the Splunk forwarder, and outputs.conf is the configuration file that controls sending the data out to the indexing server, or “Splunk receiver”. These files can be very simple or very complex depending on your needs.
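
To tie the two together, a scripted input placed in the bin folder is referenced from inputs.conf like this (a sketch; the script name, interval, and sourcetype are hypothetical):

# local/inputs.conf
[script://./bin/collect_metrics.sh]     # hypothetical script stored in the app's bin folder
interval = 60                           # run the script every 60 seconds
sourcetype = myapp_metrics              # hypothetical sourcetype for the script output
index = main
disabled = false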

Edit inputs.conf at $SPLUNK_HOME/etc/apps/yourappname/local to monitor logs, as in the example below.

Add a stanza like the one below with a sourcetype (the type of log, such as syslog) and, optionally, an index name if you want the data stored in a specific index. After the monitor keyword, specify the location of the log file to collect.

You can also open port 514 to listen for syslog data sent over the network from the source machines (the servers generating the logs).
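
A minimal network-input stanza for that case, assuming syslog arrives over UDP on port 514, would look like this:

[udp://514]
sourcetype = syslog
connection_host = ip      # record the sender's IP address as the host field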

Below are a few inputs.conf and outputs.conf sample configurations.

inputs.conf example:

[monitor:///var/log/secure]
disabled = false
sourcetype = linux_secure

[monitor:///var/log/messages]
disabled = false
sourcetype = syslog
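
If the events should go to a specific index rather than the default one, an index setting can be added per stanza (the index name below is a placeholder and must already exist on the indexer):

[monitor:///var/log/secure]
disabled = false
sourcetype = linux_secure
index = os_logs           # hypothetical index name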

After collecting logs from the server, we have to forward them to the indexers. The Splunk forwarder uses port 9997 to forward collected logs to an indexer. We can configure these settings in the outputs.conf file.

outputs.conf example:

## outputs.conf
[tcpout]                        # send events out to the indexers over TCP
disabled = false
defaultGroup = indexCluster     # name of the indexer group to which we want to forward data
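
The defaultGroup above normally points to a matching target-group stanza that lists the receiving indexer(s); a sketch with a placeholder indexer address:

[tcpout:indexCluster]
server = indexer.example.com:9997      # placeholder indexer host; 9997 is the default receiving port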

Enable receiving on the indexer on port 9997. On the indexer, go to Settings >> Forwarding and receiving >> Configure receiving and enable receiving.

Verify in Splunk that your data is being indexed by searching for the logs or the hostname through the Splunk search GUI.
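
A simple search for this check could look like the following (the sourcetype and host values are placeholders for whatever you configured on the forwarder):

index=main sourcetype=linux_secure host=<forwarder_hostname> | head 10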

How do you get a list of all the forwarders installed in your environment?

Using the search query below, you can list the forwarders available in your environment:

index=_internal source=*metrics.log group=tcpin_connections

 | eval sourceHost=if(isnull(hostname), sourceHost,hostname)
 | rename connectionType as Type
 | eval Type=case(fwdType="uf","Universal Forwarder", fwdType="lwf","Lightweight Forwarder", fwdType="full","Heavy Forwarder", Type="cooked" OR Type="cookedSSL","Splunk Forwarder", Type="raw" OR Type="rawSSL","Legacy Forwarder")
 | rename version AS "Version", sourceIp AS "Source IP", sourceHost AS "Host", destPort AS "Port"
 | fields Type, "Source IP", Host, Port, kb, tcp_eps, tcp_Kprocessed, tcp_KBps, splunk_server, Version
 | eval Hour=relative_time(_time,"@h")
 | stats avg(tcp_KBps), sum(tcp_eps), sum(tcp_Kprocessed), sum(kb) by Hour, Type, "Source IP", Host, Port, Version
 | fieldformat Hour=strftime(Hour,"%x %Hh")

Sample output: a table of forwarders broken out by Hour, Type, Source IP, Host, Port, and Version, with average KB/s and event totals for each.
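
If you only need a quick count of connected forwarders rather than throughput figures, a shorter variation over the same _internal data should also work:

index=_internal source=*metrics.log group=tcpin_connections
| stats count by hostname, sourceIp, fwdType, version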
 