Wednesday, January 23, 2013

Zabbix: Monitor customized applications by SNMP and JSON Low-level discovery

Low-level discovery (LLD) is an amazing feature in Zabbix: once you have defined a template, you can start monitoring hundreds of hosts in a few minutes by importing host definitions from an XML file.
Low-level discovery is used to discover dynamic items. For example, you can’t monitor the disk space usage of a volume by static binding, because the index number of the volume is dynamic.
The customized application in this example is a shell script that checks a database and a web service. It writes to stdout in “key=value” format, with both string and numeric values:
db-status= [OK]
web-status=[OK]: http-code: 200

The goal is to create a monitoring item and a trigger for each line of output automatically.

Execute the script by SNMP

Net-snmp allows you to execute an arbitrary command via the “exec” parameter. Edit /etc/snmp/snmpd.conf and map an OID to the script: exec . testscript1 /usr/local/bin/ The lines of output then appear under MIBOID.101.x and can be retrieved by doing an snmpwalk on that OID.
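A minimal snmpd.conf sketch of this mapping; the base OID and script path here are hypothetical, substitute your own (the “200” under the UCD-SNMP enterprise tree is chosen to line up with the '200.101' OID used by the script further below):

```
# /etc/snmp/snmpd.conf -- hypothetical OID and script path, adjust to your setup
# exec <base-OID> <name> <command>
exec .1.3.6.1.4.1.2021.200 testscript1 /usr/local/bin/checkapp.sh
```

After restarting snmpd, the output lines would sit under .1.3.6.1.4.1.2021.200.101.x, e.g. snmpwalk -v 2c -c public server1 .1.3.6.1.4.1.2021.200.101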


Discovery of SNMP OIDs

It is possible to automatically add the script’s output via SNMP OID LLD, but this has a drawback: once discovery completes, the 4 lines of output are added as 4 items, and the value of each item is retrieved by a separate snmpget, which means the script is executed 4 times on the monitored host.

Discovery item JSON format

Zabbix also supports LLD by parsing output in JSON format. This method overcomes the drawback of SNMP OID LLD. The idea is to run an external script that does an snmpwalk on the OID and saves the output to a text file on the Zabbix server; a second script then retrieves the value of each item by simply reading that text file. The second script is still executed 4 times, but it runs locally and makes no remote connections.
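What the caching script does to the snmpwalk output can be sketched like this. The snmpwalk lines are simulated here with the sample values from above (the real command queries the host; `-O v` prints values only, wrapped in STRING: "..."):

```shell
# Simulated output of: snmpwalk -v 2c -O v -c public server1 <exec-output-OID>
walk_output='STRING: "db-status= [OK]"
STRING: "web-status=[OK]: http-code: 200"'

# Strip the STRING: " prefix and the trailing quote, as the caching script does,
# leaving plain key=value lines to be written to the local text file.
printf '%s\n' "$walk_output" | sed -e 's/^STRING: "//' -e 's/"$//'
# -> db-status= [OK]
# -> web-status=[OK]: http-code: 200
```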

The snmpwalk output is split into two cache files for the different kinds of values: one for string values and the other for numeric (so the numeric items can be graphed). If you don’t need to graph the numeric values, one file is enough.


#retrieve string values with “-s” and output in JSON format
#The macro names KEY/VALUE are arbitrary
[root@zabbix:externalscripts]# ./ -s server1
{
	"data":[

	{
		"{#KEY}":"db-status",
		"{#VALUE}":" [OK]"
	}
	,
	{
		"{#KEY}":"web-status",
		"{#VALUE}":"[OK]: http-code: 200"
	}

	]
}

#Item prototype details for scripts-getstringoutput


The “Name” column is _{#KEY}, where {#KEY} is the macro from the JSON output; the underscore is simply prefixed to make it a valid name. The “Key” column (which actually retrieves the value) runs:

[root@zabbix:externalscripts]#./ -s server1 web-status
[OK]: http-code: 200
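Since the script name is elided above, here is a hypothetical sketch of what the item prototype’s “Key” could look like. Zabbix external check keys pass the bracketed parameters to the script as arguments, which matches the script’s `[-i|-s] hostname key` usage (the script name `getoutput.pl` is an assumption):

```
getoutput.pl[-s,{HOST.HOST},{#KEY}]
```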

Just link the template to a host, and the keys and values of the items will be discovered automatically.


LLD is such an amazing feature: even when there are hundreds of items, they can all be discovered automatically in a few minutes.

Some tips for the Zabbix implementation:

1) Define a global macro for the SNMP community string in Administration -> General -> Macros; the macro can also be defined at the template or host level.
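For example, a macro along these lines keeps the community string out of individual items ({$SNMP_COMMUNITY} is the common naming convention; any name can be used):

```
{$SNMP_COMMUNITY} = public
```

Items and discovery rules can then reference {$SNMP_COMMUNITY} wherever the community string is needed.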

2) Items won’t be updated straight away after being added; they have to wait for the next poll specified by the update interval.

3) Define regular expressions in Administration -> General -> Regular expressions.

For example, if there are some virtual interfaces you want to exclude, create a regular expression called

“Real Adapters” with expression: (^lo$|^Microsoft|^RAS|^WAN|-0000$|^Teredo Tunneling|^Software Loopback|^sit)

and refer to it by @Real Adapters in the default “Template SNMP Interfaces”.


The discovery (caching) script:

#!/usr/bin/perl -w
use warnings;
use FileHandle;
###---------- main ----------------------
my ($cdir)=$0=~m|(.*)/|;
my $type=$ARGV[0];
my $host=$ARGV[1];
my $oid='200.101';
if ( ! defined $host ) {
 print "hostname is required\n Usage: $0 [-i|-s] hostname \n";
 exit 1;
}
$type =~ s/^-//;   # "-s" -> "s", so the suffix matches the cache file names below
my $fname="${cdir}/logs/${host}";
my $afname="${fname}a.out"; my $ifname="${fname}i.out"; my $sfname="${fname}s.out";
my $timegap=120;
my $mtime = (stat($afname))[9];
my $ctime=time;
#no need to do snmpwalk, if the file is recent
if ( ( ! defined $mtime ) or ( ($ctime - $mtime) > $timegap ) ) {
 my @output=&snmprun ("$host", "$oid");
 map (s/^STRING: "//,@output);
 map (s/\"$//,@output);
 #all values
 open(OUTFILE, ">$afname") or die "Can't write to $afname: $!";
 print OUTFILE @output;
 close (OUTFILE);
 #numeric values
 open(OUTFILE, ">$ifname") or die "Can't write to $ifname: $!";
 print OUTFILE grep(/=\s*\d+(\.\d+)?\s*$/, @output);
 close (OUTFILE);
 #string values
 open(OUTFILE, ">$sfname") or die "Can't write to $sfname: $!";
 print OUTFILE grep(/=.*[a-zA-Z]/, @output);
 close (OUTFILE);
}
&printjson ( "${fname}${type}.out" );

sub snmprun {
  my $host=$_[0];
  my $oid=$_[1];   # must resolve to the exec output OID configured in snmpd.conf
  my $snmpwalk='/usr/bin/snmpwalk -v 2c -O v -c public';
  my @rt0=`$snmpwalk $host $oid 2>&1`;
  return @rt0;
}

#Read a file and print out in JSON format
sub printjson {
 my $first = 1;
 open (INFILE,"$_[0]") or die "Can't open $_[0] $!";
 print "{\n";
 print "\t\"data\":[\n\n";
 while (<INFILE>) {
    chomp;   # drop the newline so it doesn't end up inside the JSON value
    my ($key, $value) = split (/=/,$_,2);
    print "\t,\n" if not $first;
    $first = 0;
    print "\t{\n";
    print "\t\t\"{#KEY}\":\"$key\",\n";
    print "\t\t\"{#VALUE}\":\"$value\"\n";
    print "\t}\n";
 }
 print "\n\t]\n";
 print "}\n";
}

The value-retrieval script:

#!/usr/bin/perl -w
use warnings;
use FileHandle;
my $type=$ARGV[0];
my $host=$ARGV[1];
my $key=$ARGV[2];
my ($cdir)=$0=~m|(.*)/|;
my $timegap=600;
if ( $#ARGV != 2 )  {
 print "\nUsage: $0 [-i|-s] hostname key \n";
 exit 1;
}
$type =~ s/^-//;   # "-s" -> "s", matching the cache file suffix
#assumes the same ${host}<type>.out naming as the caching script
my $fname="${cdir}/logs/${host}${type}.out";
open(INFILE, "<$fname") or die "Can't open $fname: $!";
my @lines=<INFILE>;
close (INFILE);
my $mtime = (stat($fname))[9];
my $ctime=time;
if ( ($ctime-$mtime) > $timegap ) {
  print "$key=<CRITICAL>: $fname hasn't been updated for > $timegap seconds, the outdated value will not be retrieved\n";
  exit 1;
}
my @line1=grep(/^$key=/, @lines);
#print "@line1\n";
if ( ! @line1 ) {
 print "key $key is not found in $fname\n";
 exit 1;
}
my (undef, $value) = split (/=/, $line1[0], 2);
chomp $value;
print "$value \n";

