
How to see what IP address is running a perl script on *NIX web server?


freddy333


Is there a command I can run on a *NIX web server (ps, netstat, etc.) that will show me which foreign IP address is accessing the server and repeatedly running a perl script? I have root access.

 

For the past week or so, I have been seeing a lot of unusual activity on our web server. Our page counters are not reflecting ANY new page views during these times, so whoever these 'visitors' are, they are not typical. However, top reports the same few process states cycling constantly (I am not sure of the order): accept, select, kqread, run, pipdwt, sbwait. These repeat over & over again, with about a 10-second pause between each cycle.

 

When I run netstat, the output often contains IPs and domains for university & state government websites, which are VERY rare to see on this server, especially with 3-5 of them appearing at the same time. Between the increased activity and the types of sites involved, my gut tells me these remote servers have been hacked and are now being used as bots, either to generate spam through some perl script on our web server or to constantly access the server, slow our connection & block legitimate visitors.

 

Again, my question is: how do I see which IPs are accessing which files on the server, or running which perl scripts?
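For concreteness, the sort of thing I have been hunting for would look roughly like this (a sketch only - 1234 stands in for the perl PID, and I am not sure which of these tools this old system actually has):

sockstat -4 | grep perl        # FreeBSD base system: one line per socket, with command, PID & foreign address
fstat -p 1234                  # FreeBSD base system: open files & sockets for a single PID
lsof -nP -i -a -p 1234         # if lsof is installed: network connections for that one PID only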

 

Here are some screenshots of top running on the web server so you can see what I am seeing. Again, these processes keep cycling over & over & over again (without any page views), which is something I have not seen in the 18 years this server has been running -

[attachments: 4 screenshots of top output]

And here are 2 netstat screenshots that show some of the questionable items. Unfortunately, these screenshots do not include any of the .gov or 'police.website.org' (or similar) entries I keep finding, and I often find several of these appearing concurrently, which never happens on this server -

[attachments: 2 netstat screenshots]


A few more pieces of the puzzle -

 

1. The single constant IP that is always involved when our bandwidth grinds to a halt is 100.43.91.24, which belongs to the Russian search engine, Yandex.

 

Now, I know Yandex, like Google, spiders websites intermittently throughout the day, adding them to its search index, which is fine. But this is something more insidious, since they are actively running one of our perl scripts almost non-stop, 24/7, the effect of which is almost like a DoS attack: the connection is so bogged down that honest visitors are unable to connect to the server or view content.

 

One question is whether running perl is just Yandex's way of wasting bandwidth to block access to our server, or whether they are using perl to run some other application with more nefarious purposes (e.g., using us as a spam server, or as a proxy to attack other sites, or as something else).

 

The whole point of this exercise is to figure out what Yandex is running on our server and how they are accessing it, and then to find a way either to block them from doing that or to block them entirely from accessing anything!

 

 

 

As you can see in these screenshots, taken while the problem was occurring, Yandex is always connected along with one of these other IPs (the 2 blurred IPs are our internal accounts used to monitor the console). Also, Yandex is always simply connected via an HTTP port, while its 'partner' IP is the one that is running perl -

[attachments: 4 screenshots taken while the problem was occurring]

 

The other IP always changes. But here is the key - the other IP is always a major US government or commercial site, like the State of Florida or Akamai or the FBI, that has little reason to be accessing this server. And they only appear while Yandex is connected. Once Yandex is booted, they are too.

 

2. I have found that if I kill the perl process, both Yandex & the other IP are instantly disconnected. No other actual visitor IPs are ever affected. Only these 2.

 

This tells me that Yandex is either spoofing the other IP (I have no idea how they would do this) &/or they have hacked into another legitimate server & are using it (without the owner knowing about it) as a drone or bot to attack other servers like ours.

 

Trouble is that, within 20 seconds of being booted, Yandex is back again, but with a different secondary IP.

 

Anyone seen anything like this before or, more importantly, have an idea how to block Yandex? (We have had Yandex blocked via our .htaccess for years, but, clearly, that has had no effect.)


Chances are the box has been compromised.

 

You can try and figure it out from the /proc directory. Grab the PID of the running perl process, then check

/proc/PID/cwd which will be a symlink to the current working directory of the script

and check

/proc/PID/cmdline which will contain the full command line of the running process

(in both those cases replace PID with the pid of the process)
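In other words, run something like this (1234 is a placeholder PID, and this assumes a Linux-style procfs is actually mounted on /proc):

ls -l /proc/1234/cwd                      # the symlink target is the script's working directory
tr '\0' ' ' < /proc/1234/cmdline; echo    # cmdline is NUL-separated, so space out the arguments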

 

To start, I would block the constant IP address in the firewall, since the random one drops when it's disconnected. I'd probably block the whole range to be safe. If it comes back using a new IP not attached to the Russian search engine, then you will know that a hacker is working your box.
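On an older FreeBSD box that usually means ipfw; roughly like this (the /16 is only an illustration of 'the whole range to be safe' - check whois for the actual allocation, and add the rules to whatever firewall script you load at boot so they persist):

ipfw add deny ip from 100.43.0.0/16 to any     # drop inbound traffic from the range
ipfw add deny ip from any to 100.43.0.0/16     # and anything going back out to it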

 

I assume all your security patches and the IP Board software updates are current? Anything else running on the box? Was the Shellshock patch done?

 

Also, if you have the Perl library libwww-perl installed, check the logs. It's a common library used by hackers. You can grep the access log to see if you find anything there.
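For example (adjust the path to wherever your Apache access log actually lives):

grep -i 'libwww-perl' /usr/local/apache/logs/access_log    # path varies by install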

 

Good luck!

Edited by tomhorn

Ok, now we are getting somewhere.

I am embarrassed to say this, but we are running a very old version of BSD (4.11), which has not been updated for many years. Yes, I know.

There are many reasons for this, which I cannot go into on a public site. But I can provide a few more details via PM if needed (the reasons are not really relevant to the problem).

 

It is difficult to run anything based on the PID because it changes so often/quickly.

 

Running /proc/PID/cwd returns 'command not found'. This may be due to our stale system.

 

Running /proc/PID/cmdline returns either 'command not found' or a permissions error (I think this is due to the PID having changed).

 

We blocked Yandex's IP a few years ago via .htaccess & we upped this to blocking the entire range (i.e., blocking '100.43') last week when this all began.

Unfortunately, this does not appear to have had ANY effect.
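For reference, the block is the standard mod_access partial-IP form, something like this (from memory, so treat it as a sketch):

Order allow,deny
Allow from all
Deny from 100.43

As I understand it, though, .htaccess only applies to requests Apache itself serves in the directories it covers, so a firewall-level block would be more thorough.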

 

On the positive side, I forgot about the access_log (I told you I was rusty). So I did a tail for their IP &, sure enough, I both caught them in the act & instantly figured out what script they were running & what they were doing (I have blacked out our domain for security).
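(The tail itself was nothing fancy - roughly this, with the path adjusted to wherever your access_log lives, and the dots escaped so they match literally:)

tail -f /usr/local/apache/logs/access_log | grep '100\.43\.'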

[attachment: screenshot of the access_log tail]

 

The nph-proxy333.cgi is a perl script our admins used to use when conducting security evaluations of hacking attempts from foreign sites. The GET requests are coming in at about 1/second, so you can see how they have been killing our bandwidth.

 

I just disabled the script, but tail shows that they are still hitting it somehow without any slowdown. I do not know how they can do that since I have disabled the script??
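One thing I still need to rule out: disabling the file only stops new invocations - it does not kill copies of the script that were already running, and Apache keeps logging the incoming GET requests in access_log whether or not they succeed. Something like this should show any leftovers (a sketch; 1234 is a placeholder PID):

ps aux | grep '[n]ph-proxy'    # the [n] keeps grep itself out of the results
ps aux | grep '[p]erl'
kill 1234                      # then kill each leftover PID found above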


I don't know. Seems pretty shady though if that's what they are doing ... :(

 

I would set an impossibly high crawl delay in the robots.txt file (hours or days), and see if anything changes.

 

You can also disallow robots from parts of the server, so try disallowing the directory where the Perl script is (usually /cgi-bin/).
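Something along these lines in the web root's robots.txt (the number is only for illustration, and Yandex is supposed to honor Crawl-delay):

User-agent: Yandex
Crawl-delay: 3600
Disallow: /cgi-bin/

Keep in mind robots.txt is only honored by well-behaved crawlers, so if this is abuse rather than ordinary indexing it may change nothing.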

 

I would also still block the domain and the IP address range at the firewall.


This morning, there appears to be 0 activity from that 100.43.... (Yandex) IP. And our bandwidth has returned to normal.

 

Looking at the log &/or (live) screenshots, it is pretty obvious that Yandex (or someone within Yandex) was using our proxy to access other sites - mostly US .gov & major commercial organizations. And this is not the type of activity that is in any way normal for a legitimate search engine spider or indexer.

 

Because of the way the proxy works, it is also not activity that can occur 'by accident'.

 

The questions remain - why & what were they doing it for?

 

How can I pull all the activity for that IP from our server logs for the past week into a .txt file?


Been ages, but something like

grep -n "100.43.000.000" * > file

should get you close...

By default grep prints the entire matching line; -o "ip" * would give you only the matched text, if I remember right.

I think -C 1 will get you the line plus the one before and after... (To possibly show the other IPs connected with each connection of your visitor)
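Putting it together, something like this should dump the past week's hits into a text file (escape the dots so they match literally, and widen the glob to pick up rotated logs; the path is just a guess at your Apache layout):

grep -h '100\.43\.91\.24' /usr/local/apache/logs/access_log* > yandex_hits.txt    # -h drops the filename prefix when grepping several files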

Edited by POTR
