Why are buffer overflows unlikely in web applications?
There are two reasons why we rarely see buffer overflows in web applications.
First, discovering a buffer overflow usually requires either access to the source code or local access to the system the application runs on: an attacker learns whether an app is vulnerable by feeding it overly long inputs and tracing how the program handles them. Web apps are typically attacked remotely, and the adversaries rarely see the source code, which makes discovering a buffer overflow in a web app very difficult.
Secondly, to perform a buffer overflow, the language must allow data to be written into memory that has not been properly allocated. Memory-safe languages like Java do not allow this (unlike C and C++): every array access is bounds-checked at runtime. So buffer overflows are effectively impossible in pure Java code. FAQS.org explains this well here.
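To make the bounds-checking point concrete, here is a minimal sketch (the class name is my own) showing what the JVM does when code tries to write past the end of a buffer. In C the same write could silently corrupt adjacent memory; in Java the runtime refuses it and raises an exception instead:

```java
// Minimal sketch: the JVM bounds-checks every array access, so an
// out-of-bounds write raises an exception instead of corrupting memory.
public class BoundsCheckDemo {

    // Attempts to write one element past the end of a fixed-size buffer.
    // Returns true if the runtime stopped the write.
    static boolean overflowIsCaught() {
        int[] buffer = new int[8];
        try {
            buffer[8] = 42;   // index 8 is out of bounds for length 8
            return false;     // unreachable: the JVM never permits the write
        } catch (ArrayIndexOutOfBoundsException e) {
            return true;      // the runtime detected the out-of-bounds access
        }
    }

    public static void main(String[] args) {
        System.out.println("overflow caught: " + overflowIsCaught());
    }
}
```

This is exactly the check that C omits for performance reasons, and it is why the classic "write past the buffer into the return address" attack has no Java equivalent.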
On a cautionary note, there have been instances of PHP library functions (which are themselves written in C) being vulnerable to buffer overflows. Here are two examples [1, 2]. So PHP web apps need to take special care against buffer overflows when they call such unsafe functions.
Jeremiah Grossman wrote a lucid article debunking the myth of buffer overflows in web applications: Myth-Busting Web Application Buffer Overflows. Do check it out.