An update on a couple of old posts

As you may recall, I have posted earlier on installing snort as an intrusion detection system (IDS).

I have also posted earlier on dynamically adding firewall blacklist rules based upon traffic hitting my website, using apache rewrite rules to trap known hacker attempts and custom error pages to do the same.

This post is just a quick update on those two posts.

For snort it is simply a reminder that you do need such tools installed to get a clear description of what the hackers are trying to do. I have a nagios/nrpe script that monitors the ‘alert’ file and reports to nagios when events have been logged; I then review those events to ensure the ip-addresses reported by snort have already been blacklisted by my dynamic blacklist scripts.
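As an aside, a minimal sketch of what such an nrpe check could look like is below; the alert file path, state file and the script itself are illustrative assumptions, not my actual monitoring script.

#!/bin/bash
# check_snort_alerts.sh - illustrative nagios/nrpe check sketch only; the
# alert file path and state file location are assumptions.
ALERT_FILE="/var/log/snort/alert"
STATE_FILE="/var/tmp/check_snort_alerts.state"
[ -r "$ALERT_FILE" ] || { echo "UNKNOWN: cannot read $ALERT_FILE"; exit 3; }
# Compare the current line count against the count recorded on the last run.
CURR=$(wc -l < "$ALERT_FILE")
PREV=$(cat "$STATE_FILE" 2>/dev/null || echo 0)
echo "$CURR" > "$STATE_FILE"
# If the alert file was rotated the count drops; treat all lines as new.
[ "$CURR" -lt "$PREV" ] && PREV=0
NEW=$((CURR - PREV))
if [ "$NEW" -gt 0 ]; then
   echo "WARNING: $NEW new snort alert(s) logged"
   exit 1
fi
echo "OK: no new snort alerts"
exit 0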

The logs from the blacklisting scripts just record the URL used, whereas snort logs meaningful alerts such as those below

06/06-18:08:00.381925  [**] [1:648:14] INDICATOR-SHELLCODE x86 NOOP [**] [Classification: Executable Code was Detected] [Priority: 1] {TCP} 136.186.1.76:80 -> 192.168.1.170:44208
06/09-06:27:41.904537  [**] [1:402:11] PROTOCOL-ICMP Destination Unreachable Port Unreachable [**] [Classification: Misc activity] [Priority: 3] {ICMP} 110.249.208.7 -> 192.168.1.189
06/14-00:19:15.538150  [**] [1:401:9] PROTOCOL-ICMP Destination Unreachable Network Unreachable [**] [Classification: Misc activity] [Priority: 3] {ICMP} 101.98.0.98 -> 192.168.1.189
07/01-20:21:27.076636  [**] [1:1390:11] INDICATOR-SHELLCODE x86 inc ebx NOOP [**] [Classification: Executable Code was Detected] [Priority: 1] {TCP} 113.20.13.217:80 -> 192.168.1.173:54192
07/13-13:38:57.644088  [**] [1:402:11] PROTOCOL-ICMP Destination Unreachable Port Unreachable [**] [Classification: Misc activity] [Priority: 3] {ICMP} 202.180.64.10 -> 192.168.1.170
07/18-17:46:49.818183  [**] [1:1394:15] INDICATOR-SHELLCODE x86 inc ecx NOOP [**] [Classification: Executable Code was Detected] [Priority: 1] {TCP} 116.66.162.254:80 -> 192.168.1.189:5673
12/16-13:19:25.353700  [**] [1:31978:5] OS-OTHER Bash CGI environment variable injection attempt [**] [Classification: Attempted Administrator Privilege Gain] [Priority: 1] {TCP} 95.213.176.146:55241 -> 192.168.1.193:80
12/17-01:26:50.840063  [**] [1:44687:3] SERVER-WEBAPP Netgear DGN1000 series routers authentication bypass attempt [**] [Classification: Attempted Administrator Privilege Gain] [Priority: 1] {TCP} 1.82.196.135:28768 -> 192.168.1.193:80
12/17-01:26:50.840063  [**] [1:44688:3] SERVER-WEBAPP Netgear DGN1000 series routers arbitrary command execution attempt [**] [Classification: Attempted Administrator Privilege Gain] [Priority: 1] {TCP} 1.82.196.135:28768 -> 192.168.1.193:80

The updates to the automated blacklist scripts were a bit more involved. The major change is that I can retire the apache rewrite rules, as I now use apache custom error pages to create the dynamic blacklist rules.

The rewrite rules, as you may recall, were to trap known ‘hacker attempts’ such as requests to phpmyadmin and redirect them to a cgi script that added an iptables drop rule for the requesting ip-address.
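For context, the shape of that older setup was roughly as below; the rule pattern, script name and iptables details are illustrative sketches, not my actual configuration or scripts.

RewriteEngine On
# Illustrative: send any phpmyadmin probe to a trap CGI script.
RewriteRule ^/phpmyadmin /cgi-bin/trap.sh [PT,L]

#!/bin/bash
# trap.sh - illustrative sketch only. Apache sets REMOTE_ADDR for CGI requests.
echo "Content-type: text/plain"
echo ""
echo "Page not found."
# Drop all further traffic from the probing address (in practice the web
# server user needs sudo or a privileged helper to run iptables).
sudo /usr/sbin/iptables -I INPUT -s "$REMOTE_ADDR" -j DROP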

Rather than use apache rewrite rules, and bearing in mind that URL requests to non-existent resources will trigger a webserver 404 error (page not found), you can simply change the apache error page configuration for 404 errors to run the same script that was called by the rewrite rules.
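In apache configuration terms that is a one-line change; a sketch, reusing the illustrative script path from above:

# Illustrative: run the trap script for every 404 instead of per-URL rewrites.
ErrorDocument 404 /cgi-bin/trap.sh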

Issues with that method (and with the rewrite rules method) are simply that it is unwise to call a CGI script with any data passed by a client, and while a script can do a lot of character translation to try to be safe it is better to use existing libraries; so my new method is to have a small php error page that sanitises the input and then invokes the original script, passing it the parameters it needs. An example of the php error page would be

<?php
// Send the 404 status to the client before any page output.
header("HTTP/1.0 404 Not Found");
// escapeshellarg() safely quotes each value itself, so no manual quoting
// is needed (and combining it with escapeshellcmd() would leave stray
// backslashes inside the quoted argument).
$uri    = escapeshellarg( $_SERVER['REQUEST_URI'] );
$addr   = escapeshellarg( $_SERVER['REMOTE_ADDR'] );
$method = escapeshellarg( $_SERVER['REQUEST_METHOD'] );
$cmd = "/home/httpd/newsite/cgi-bin/error_404_handler.sh $uri $addr $method";
exec( $cmd, $output );
// Echo the handler script's output back as the 404 page body.
foreach( $output as $line ) {
    echo $line . "\n";
}
?>
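To wire the page in, the apache 404 error configuration is then pointed at it instead of directly at the CGI script; assuming the page above is saved as /error404.php under the document root (the filename is my assumption):

ErrorDocument 404 /error404.php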

Important notes on the script are that I had to modify the script to detect whether parameters were being passed, for the following reasons (a sketch of the detection logic follows the list)

  • if parameters are passed to the CGI script they are used instead of the environment variables apache normally provides, as when the script is called from PHP those variables will not be set
  • if parameters are passed to the CGI script it assumes that the 404 response header has already been set by PHP and does not set the header itself; if no parameters are passed the script sets the 404 header field in its text output, allowing it to still be used by rewrite rules
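A sketch of that detection logic, assuming positional parameters in the order the php page passes them (the blacklisting work itself is omitted here):

#!/bin/bash
# error_404_handler.sh - sketch of the parameter-detection logic only.
if [ $# -ge 3 ]; then
   # Called from the php error page: use the sanitised parameters; php has
   # already sent the 404 header so only page body text is output.
   REQ_URI="$1"; REQ_ADDR="$2"; REQ_METHOD="$3"
else
   # Called directly as a CGI script (e.g. from a rewrite rule): use the
   # environment variables apache provides and set the 404 header ourselves.
   REQ_URI="$REQUEST_URI"; REQ_ADDR="$REMOTE_ADDR"; REQ_METHOD="$REQUEST_METHOD"
   echo "Status: 404 Not Found"
   echo "Content-type: text/plain"
   echo ""
fi
echo "Page not found."
# ... blacklist REQ_ADDR (e.g. add an iptables drop rule) here ...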

It is also of course extremely important that you check every link on your website to make sure it does not refer to a non-existent page, or you will blacklist valid users. I used owasp-zap on the ‘kali’ distribution to search for all bad links, and also fixed a lot of redirects that pointed at http so they now point at https; I was 100% confident there were no bad links before switching on the 404 error page as a blacklisting tool.

I also created a 400 error page handler (bad requests to the server), similar to the 404 page not found handler, to trap requests with illegal http headers and deliberately bad characters/requests in the URI that indicate hacking attempts. That is because requests such as those below cause server 400 errors rather than 404 page not found errors, and we do want to trap them

172.105.94.201 - - [05/Jan/2020:19:29:28 +1300] "\x16\x03\x01" 400 226 "-" "-"
172.105.94.201 - - [05/Jan/2020:19:29:31 +1300] "\xbd\xff\x9e\xffE\xff\x9e\xff\xbd\xff\x9e\xff\xa4\xff\x86\xff\xc4\xff\xbe\xff\xc7\xff\xdb\xff\xee\xffx\\d9\xff\xed\xff\xa4\xff\x9d\xff\xcf\xff\xd8\xff\xe5\xff\x04\xff\x12\xff0\xff\xb1\xff\xbd\xff\xe7\xff\xe2\xff\xdd\xff\xdc\xff\xde\xff\xc8\xff\xcc\xff\xbe\xff\xf8\xff&\xff\x01\xff\x0f\xff\xf5\xff\x06\xff\xff\xff\xf7\xff!\xff\xde\xff\x02\xff&\xff\x0c\xff\x01\xff\xf5\xff" 400 226 "-" "-"
223.155.162.30 - - [06/Jan/2020:01:44:17 +1300] "POST /HNAP1/ HTTP/1.0" 400 226 "-" "-"
81.213.225.47 - - [07/Jan/2020:02:28:20 +1300] "GET / HTTP/1.1" 400 226 "-" "-"
112.184.218.41 - - [07/Jan/2020:04:41:03 +1300] "GET / HTTP/1.1" 400 226 "-" "-"
185.100.87.248 - - [07/Jan/2020:17:46:19 +1300] "\x16\x03\x01\x02" 400 226 "-" "-"

Note that the 400 errors for root path URIs, which may seem like normal requests, occur when the client request contained non-compliant http headers or headers that indicate session spoofing is occurring.
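Wiring the 400 handler up is the same one-line change as for the 404 case; a sketch, assuming the 400 variant of the php page is saved as /error400.php (the filename is my assumption, and the page itself differs only in sending a "HTTP/1.0 400 Bad Request" header rather than a 404 one):

ErrorDocument 400 /error400.php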

And of course you need to review why ip-addresses are being blacklisted, to ensure you have not accidentally created a link to a non-existent page on the site and started blacklisting valid users and search crawlers.
