About raytracing: raytracing is cool, but standard raytracing does not give you realistic lighting, because the rays are cast from the camera (the position of your eyes as you sit in front of the monitor) through the viewing plane (your computer screen) to see what they hit.
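To make the "one ray per pixel" idea concrete, here is a minimal sketch (Python with NumPy; the function name and parameters are my own, not any engine's API) of how a backward raytracer builds the primary ray for a pixel: origin at the eye, direction through that pixel's spot on the virtual screen plane.

```python
import numpy as np

def primary_ray(px, py, width, height, fov_deg=60.0):
    """Build the primary ray for pixel (px, py): origin at the eye,
    direction through that pixel's spot on the virtual screen plane."""
    aspect = width / height
    scale = np.tan(np.radians(fov_deg) / 2)
    # Map the pixel center to [-1, 1] screen coordinates, y pointing up.
    x = (2 * (px + 0.5) / width - 1) * aspect * scale
    y = (1 - 2 * (py + 0.5) / height) * scale
    origin = np.zeros(3)                # the camera sits at the origin
    direction = np.array([x, y, -1.0])  # screen plane one unit ahead of the eye
    return origin, direction / np.linalg.norm(direction)

# One ray per pixel: the renderer then asks "what does this ray hit?"
origin, direction = primary_ray(400, 300, 800, 600)
print(origin, direction)
```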
The real world does not work that way. Your eyes do not emit radar- or sonar-like rays and check what they hit; instead, objects emit (or reflect) energy, and some of that energy ends up on your retina.
So the correct way to compute lighting is something like photon mapping, where each light source emits energy that travels through a medium (air, water) and reflects off or refracts through materials.
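A rough sketch of that forward idea, with a toy one-plane scene standing in for a real one (hit_floor, trace_photon, and the Russian-roulette absorption below are illustrative choices of mine, not any particular engine's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def hit_floor(pos, dirn):
    """Toy scene: a single diffuse floor plane at y = 0 (albedo 0.6)."""
    if dirn[1] >= -1e-6:                 # ray parallel to or pointing away from the floor
        return None
    t = -pos[1] / dirn[1]
    return pos + t * dirn, np.array([0.0, 1.0, 0.0]), 0.6

def trace_photon(pos, dirn, power, photon_map, max_bounces=5):
    """Forward light transport: follow one photon from the light through
    the scene, depositing energy at each surface it hits."""
    for _ in range(max_bounces):
        hit = hit_floor(pos, dirn)
        if hit is None:
            return
        point, normal, albedo = hit
        photon_map.append((point, power))  # record where the energy landed
        if rng.random() > albedo:          # Russian roulette: photon absorbed
            return
        v = rng.normal(size=3)             # diffuse bounce: random direction...
        v /= np.linalg.norm(v)
        if v @ normal < 0:                 # ...flipped into the upper hemisphere
            v = -v
        pos, dirn = point + 1e-4 * normal, v

# Emit photons from a point light above the floor, in random directions.
light, photon_map = np.array([0.0, 2.0, 0.0]), []
for _ in range(1000):
    d = rng.normal(size=3)
    trace_photon(light, d / np.linalg.norm(d), 1.0, photon_map)
print(len(photon_map), "photon hits stored")
```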
Think about it: shooting a ray from the camera through a pixel gives you a single direction to check for light intensity, while in reality light can arrive from many different angles and still end up in that same pixel. So "standard" raytracing does not give you light-scattering effects unless you add a special hack to account for them. And aren't hacks exactly why people want a method other than polygon rasterization in the first place?
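Here is a minimal sketch of what accounting for those many angles looks like: a Monte Carlo estimate that averages the light arriving over the whole hemisphere above a surface point, rather than checking one direction. The incoming_radiance function is a hypothetical stand-in for tracing a real ray, and normalization constants are folded away for brevity:

```python
import numpy as np

rng = np.random.default_rng(1)

def incoming_radiance(point, dirn):
    """Hypothetical stand-in for "how much light arrives at `point` from
    direction `dirn`" -- a real renderer would trace a ray here. This toy
    version just pretends there is a bright patch of sky straight up."""
    return max(dirn[1], 0.0) ** 8

def diffuse_radiance(point, normal, samples=256):
    """One camera ray fixes one direction, but a diffuse surface collects
    light from the whole hemisphere above it. Average many random sample
    directions to estimate that (normalization constants omitted)."""
    total = 0.0
    for _ in range(samples):
        v = rng.normal(size=3)             # uniform random direction...
        v /= np.linalg.norm(v)
        if v @ normal < 0:                 # ...flipped into the upper hemisphere
            v = -v
        total += incoming_radiance(point, v) * (v @ normal)  # cosine-weighted
    return total / samples

print(diffuse_radiance(np.zeros(3), np.array([0.0, 1.0, 0.0])))
```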
Raytracing is not the ultimate solution.
The only real solution is an endless process: lights emit energy that bounces around the scene, and if you're lucky, some of it reaches the camera. Since an endless process is rather hard to simulate, we have to approximate it at some point. Games use hacks to make things look good, but in the end every rasterizer / renderer / tracer / whatever strategy has to impose a limit, a hack, at some point.
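As a sketch of where that limit shows up in code, here is a toy recursive tracer (again Python/NumPy with a one-plane scene, all names mine) that simply cuts the bounce chain at MAX_DEPTH and drops whatever energy deeper bounces would have contributed:

```python
import numpy as np

rng = np.random.default_rng(2)
MAX_DEPTH = 4                              # the cutoff: the "hack" every renderer needs

def hit_floor(pos, dirn):
    """Toy scene: one diffuse floor plane at y = 0 under a bright sky."""
    if dirn[1] >= -1e-6:
        return None
    t = -pos[1] / dirn[1]
    return pos + t * dirn, np.array([0.0, 1.0, 0.0])

def radiance(pos, dirn, depth=0):
    """Light along a ray. Physically this recursion never ends; in code we
    cut it off at MAX_DEPTH and accept the (usually invisible) error. In
    this open toy scene rays escape quickly anyway, but in a closed scene
    the cutoff is the only thing that stops infinite recursion."""
    if depth >= MAX_DEPTH:
        return 0.0                         # energy from deeper bounces is simply dropped
    hit = hit_floor(pos, dirn)
    if hit is None:
        return 1.0                         # ray escaped: it "sees" the sky
    point, normal = hit
    v = rng.normal(size=3)                 # one random diffuse bounce
    v /= np.linalg.norm(v)
    if v @ normal < 0:
        v = -v
    return 0.6 * radiance(point + 1e-4 * normal, v, depth + 1)

print(radiance(np.array([0.0, 1.0, 0.0]), np.array([0.3, -1.0, 0.2])))
```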
The important question is: does it really matter? Are we aiming for a 100% simulation of real life, or is it good enough to compute a picture that looks 100% real, regardless of the technique used?
If you cannot tell whether an image is real or CGI, does it matter which methods or hacks were used?