Significant number of non-HTTP requests hitting my site

@Smith883

Posted in: #Nginx #Security

I'm seeing a significant number of non-HTTP requests hitting a site I just launched. They show up in the server (nginx) logs as non-ASCII and get rejected (correctly) with a 400 status. Here are some lines from the log:

95.132.198.189 - - [09/Jan/2011:13:53:30 -0500] "œ$A\x10õœ²É9J" 400 173 "-" "-"
79.100.145.126 - - [09/Jan/2011:13:57:42 -0500] "#§i²¸oYi á¹„\x13VJ—x·—œ\x04N \x1DÔvbÛè½\x10§¬\x1E0œ_^¼+\x09ÜÅ\x08DÌÃiJeT€¿æ]œr\x1EëîyIÐ/ßýúê5Ǹ" 400 173 "-" "-"
79.100.145.126 - - [09/Jan/2011:13:58:33 -0500] "¯Ú%ø=Œ›D@\x12¼\x1C†ÄÀe\x015mˆàd˜Û%pÛÿ" 400 173 "-" "-"

What should I make of this? Is this some sort of scripted attack? Or could these be correct requests that have somehow been garbled?

They're not affecting the site's performance, and I'm not seeing any other signs of attack (e.g., no strange POSTs), so at this point I'm more curious than afraid.


@Gloria169

I've worked on some pretty large sites (1bn pv/day on the largest); roughly 1-2% of their traffic was random exploit probing, spiders, and crawlers. It broke down into:

- Classic path-manipulation stuff (`..`-traversal probes aimed at things like `cmd`), etc.
- More sophisticated buffer overruns in POST and/or GET data that looked similar to those in your logs.
- Plain screwed-up home-rolled crawlers that mis-encoded or omitted key aspects of the HTTP protocol.
- Glitched client connections garbling transmitted data.

It seems plausible that this could be a buffer-overrun attack; do you see other instances of the same request pattern? Screwed-up clients and connections are quite plausible too. Switching on logging for other fields (W3C-style variables like the user agent) may also give you more clues, e.g. the absence of regular headers such as Accept, or a missing method.
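In nginx this is a log_format tweak rather than a switch to flip; note the trailing `"-" "-"` in your log lines are already the referer and user-agent fields of the default combined format. As a sketch, a custom format that also records the raw request length might look like this (`probe_debug` is a hypothetical name; the `$...` variables are standard nginx log variables):

```nginx
http {
    # Hypothetical custom format: request line, status, raw request
    # length in bytes, plus referer and user agent.
    log_format probe_debug '$remote_addr - [$time_local] '
                           '"$request" $status $request_length '
                           '"$http_referer" "$http_user_agent"';

    access_log /var/log/nginx/access.log probe_debug;
}
```

For the binary garbage in your logs, `$request_length` is often the most telling field: probes and misdirected binary protocols tend to have very different sizes than real browser requests.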

We generally:

- made sure our servers were patched regularly, just in case ;-)
- used tools like UrlScan (we were on IIS) to throw out malformed requests early in the request pipeline.
- capped our POST and GET request sizes to prevent some of the crazier DDoS attacks.
- pawed through all non-2xx/3xx traffic in the logs on a regular basis, watching for new patterns. Believe it or not, Excel is an excellent tool for quickly scanning, slicing, and dicing the data.
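That last point is easy to script before (or instead of) reaching for Excel. A minimal sketch, assuming nginx's default combined log format (`filter_errors` is a hypothetical helper, not part of any library):

```python
import re
from collections import Counter

# Matches nginx's default "combined" format:
# ip - user [time] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(.*)" (\d{3}) ')

def filter_errors(lines):
    """Yield (ip, time, request, status) for non-2xx/3xx log lines."""
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # badly garbled lines may not parse at all
        ip, when, request, status = m.groups()
        if not status.startswith(("2", "3")):
            yield ip, when, request, int(status)

sample = [
    '1.2.3.4 - - [09/Jan/2011:13:53:30 -0500] "GET / HTTP/1.1" 200 512 "-" "-"',
    '5.6.7.8 - - [09/Jan/2011:13:53:31 -0500] "\\x16\\x03\\x01" 400 173 "-" "-"',
]
errors = list(filter_errors(sample))
by_status = Counter(status for *_, status in errors)
print(errors)     # only the 400 line survives
print(by_status)  # Counter({400: 1})
```

Feeding a day's access log through this and watching the `Counter` output over time is a quick way to spot an emerging pattern before it becomes a problem.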


While some of these actions are IIS-specific, I've no doubt that nginx has similar tools. In general there is now so much noise on the internet from random bots that I've learned to just stay vigilant and watch for new trends as they emerge.
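The request-capping step, for instance, has direct nginx counterparts; a sketch using standard ngx_http_core_module directives (the limits shown are illustrative, not recommendations):

```nginx
http {
    # UrlScan-style request capping, nginx flavour:
    client_max_body_size 1m;           # reject oversized POST bodies (413)
    client_header_buffer_size 1k;      # typical request line + headers
    large_client_header_buffers 2 4k;  # cap oversized request lines/headers
}
```

Requests exceeding these limits are rejected early with 413 or 414 responses, before they tie up any application resources.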


