"developer quality" is hard to evaluate. Java and (to a lesser extent) C # are used a lot in schools and universities to educate students in the rudiments of programming. Many of them find themselves in support forums with questions about homework and will be considered somehow to be programmers (and poor) using this language. In fact, the vast majority of them will never write one line of code after completing this required introductory course, and most of the rest probably will not write in this language.
--- talk about "comparative studies" of overall programmer competence... ---
As said, it is very difficult, if not impossible, to compare the cost of implementing something in different languages, at least in a general way that would hold across all projects. Some things are done better in .NET, others in Java, and still others are best done with Excel macros.
And development cost usually makes up only a small part of a system's total cost of ownership, especially for something like a multi-tier application running on application servers with databases behind it. If the client already has servers running IIS with MS SQL Server databases as the backend, selling them a Java EE application with an Oracle backend does them a disservice, even if that would otherwise be the most logical choice for the application. The development cost may be lower, but the ongoing cost to the client will be much higher.
At the other end of the scale, a website for the local grocery store that wants to start taking online orders for neighborhood delivery should not be implemented in .NET or Java EE at all. The cost of such a solution (especially hosting) would far outweigh the benefits. Something simple, based on PHP or Rails for example, will serve this client much better: hosting costs are lower, there are no expensive license fees for databases and application servers, and the owner might actually make some money from the resulting website.
jwenting May 31 '10 at 13:12