When compiling code that works against List<> objects, the compiler has to emit interface calls, which are more expensive than plain virtual calls, for those call sites.
Of course, in code where the compiler can see the object being created and can prove that something declared as List<> is always in fact an ArrayList<>, it ought to be able to emit plain virtual calls instead of interface calls; but methods that are not inlined and that operate on already-constructed objects will not benefit from this optimization. The sketch below illustrates the two kinds of call.
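A minimal sketch (class and method names are mine, purely illustrative): compiling these two methods and inspecting the result with javap -c shows the call through the List-typed parameter becoming an interface call, and the call through the ArrayList-typed parameter becoming an ordinary virtual call.

    import java.util.ArrayList;
    import java.util.List;

    class CallKinds {                           // illustrative name
        static int viaInterface(List<String> xs) {
            return xs.size();                   // javap -c: invokeinterface java/util/List.size
        }
        static int viaClass(ArrayList<String> xs) {
            return xs.size();                   // javap -c: invokevirtual java/util/ArrayList.size
        }
    }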
The ubiquitous optimization that speeds up virtual calls, namely inline caches (often Polymorphic Inline Caching, or PIC, not to be confused with position-independent code), benefits from only ever seeing instances of a single subclass accessed through a variable of a given declared type. In that case, after the code has been running for a while, the JIT may optimistically assume that the List<> will only ever be an ArrayList<>, insert a trap in case that assumption turns out to be wrong, and then proceed with a direct ArrayList<> call.
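To make the idea concrete, here is a rough hand-written approximation, in plain Java with made-up names, of the guarded fast path that a monomorphic inline cache effectively produces at such a call site. Real JIT output works at the machine-code level and deoptimizes on a guard failure rather than simply falling back to a slow call.

    import java.util.ArrayList;
    import java.util.List;

    class InlineCacheSketch {                   // illustrative, not actual JIT output
        // Approximates the guarded fast path for a call site like list.get(i),
        // once the JIT has only ever observed ArrayList receivers there.
        static Object cachedGet(List<Object> list, int i) {
            if (list.getClass() == ArrayList.class) {
                // Fast path: receiver class matches the cached class, so the
                // call can be devirtualized (and potentially inlined).
                return ((ArrayList<Object>) list).get(i);
            }
            // Slow path: the assumption failed; real compiled code would
            // deoptimize here instead of just doing a full interface dispatch.
            return list.get(i);
        }
    }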
Modern processors execute that check very cheaply (they are superscalar and have good branch prediction), so you won't notice the cost of all these checks for virtual calls and single-implementation interfaces. But it does keep the VM busy tuning, generating and patching all of this code.
For server software running in a steady state on HotSpot this does not matter, but for fast startup on a mobile device it might - I do not know how good Google's VM is.
A good article about this from Dr. Cliff Click, Jr. (of big-iron hardware and HotSpot VM fame): http://www.azulsystems.com/blog/cliff-click/2010-04-08-inline-caches-and-call-site-optimization
And, of course, "Inline caching" on Wikipedia.