LONG VERSION COMING SOON
---------------------------------------------

Instructions (short version):

1) Copy the "spdash" perl script into the web server's executable bin
   (cgi-bin) directory, so that the script can be executed by a browser
   from its URL.

2) Modify the variables at the top of the script to match your
   environment.

3) It is VERY important that $STATDIR is set to the same path in both
   checksplunk and spadmin.

4) Collect the data that the dashboard needs to read by running:

   # ./checksplunk spdash

   Note: Set this in a cron to run every 2 minutes. Spdash auto-refreshes
   every 15 seconds and will pick up the changes.

Example commands for these steps appear at the end of this section.

There are 2 pages to spdash: the Main page and the Server page. If
spdash sees only one set of collected server files in $STATDIR, it
automatically displays the Server page. If there are multiple sets of
collected server files, it displays the Main page, and you can click
into each server to get to its Server page.

To create multiple sets of collected server files in $STATDIR, you have
2 options (I prefer option #1); examples of both follow at the end of
this section:

1) NFS-mount $STATDIR from the multiple splunk servers that are running
   checksplunk, so that they all write their collected server files into
   the same $STATDIR directory over the network. With this option you do
   not really have to run the web server on a splunk server; it can be
   run from any web server that can see $STATDIR. Otherwise, if splunk
   is running on port 80, the web server has to be assigned to another
   port.

2) Run checksplunk on each splunk server and scp the files over to the
   web server's $STATDIR directory. The splunk servers have to be
   trusted by the web server.
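
EXAMPLES
---------------------------------------------

Step 1 -- a minimal sketch of the install, assuming an Apache-style
layout where cgi-bin is /var/www/cgi-bin (adjust the path to wherever
your web server keeps executable CGI scripts):

   # cp spdash /var/www/cgi-bin/spdash
   # chmod 755 /var/www/cgi-bin/spdash

The dashboard would then be reachable at a URL along the lines of
http://<webserver>/cgi-bin/spdash.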
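
Steps 2 and 3 -- $STATDIR is the only variable these instructions name;
a sketch of what the top of the script might look like, using
/opt/spdash/stats as a stand-in path (the real script may define other
variables as well):

   # Directory where checksplunk drops the collected server files.
   # This MUST be the same path in checksplunk and in this script.
   $STATDIR = "/opt/spdash/stats";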
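
Step 4 -- a crontab entry to run the collector every 2 minutes, assuming
checksplunk lives in /opt/spdash (the path is a stand-in):

   */2 * * * * /opt/spdash/checksplunk spdash

With collection every 2 minutes and the page refreshing itself every 15
seconds, the dashboard is never much more than 2 minutes behind the
servers.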
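
Option 1 -- one way to share $STATDIR is to export it from the web
server and mount it on every splunk server that runs checksplunk. The
hostnames and paths below are stand-ins, and the export could just as
well live on a dedicated filer:

   On the web server, in /etc/exports:

      /opt/spdash/stats   splunk01(rw,sync) splunk02(rw,sync)

   On each splunk server:

      # mount webserver:/opt/spdash/stats /opt/spdash/stats

Note that the mount point matches the export path, which keeps $STATDIR
identical everywhere, as step 3 requires.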
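
Option 2 -- each splunk server could run checksplunk from cron and push
its files to the web server in the same step. This assumes passwordless
ssh keys are already in place so each splunk server can scp to the web
server (hostname and paths are stand-ins):

   */2 * * * * /opt/spdash/checksplunk spdash && scp /opt/spdash/stats/* webserver:/opt/spdash/stats/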