Maybe I am asking a very obvious and stupid question here, but I couldn't get anything on Google so here I go:
Why is there so much enthusiasm about the fact that Java is platform independent and some other languages aren't? I mean, the total difference, as far as my understanding goes, is just the presence or absence of a compilation step, isn't it?
In Java, you don't have to compile the code again when you are running the bytecode on a different platform, whereas in C or C++, you will have to compile the code again in order to run it on a different platform. (Am I wrong here?)
So, being platform dependent just means one additional compilation step. Is that really such a big deal? I don't have much experience in programming, so maybe I am missing some obvious practical point here.
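To make sure I have the mechanics right, here is how I picture it (a minimal sketch; the class name is made up). You compile once with `javac`, and the resulting `.class` file runs unchanged on any platform that has a JVM:

```java
// HelloPortable.java -- hypothetical example.
// Compile ONCE, on any machine:   javac HelloPortable.java
// Run the SAME HelloPortable.class on Windows, Linux, or macOS:
//                                 java HelloPortable
// The bytecode is identical everywhere; only the JVM differs per platform.
public class HelloPortable {
    public static void main(String[] args) {
        // The JVM supplies the platform-specific details at run time.
        System.out.println("Running on: " + System.getProperty("os.name"));
    }
}
```

With C or C++, by contrast, I would have to run the compiler again on each target platform to produce a new native binary.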
There are different grades of platform independence: at one end, source-level portability, where the same source must be recompiled for each platform (C, C++); at the other, binary-level portability, where one compiled artifact runs everywhere (Java bytecode).
So, the question is what platforms you want to cover and how much effort you want to devote to the variety of platforms.
For the client, you have Windows 32 and 64 bit, Macintosh, Linux variants, Android and iOS, to name the most popular ones. Alas, because of the different user interaction styles of smartphones and tablets, it's difficult to cover all these platforms with the same source code. Out of the box, Java only covers the classical desktops. HTML 5 and JavaScript promise to cover the whole client range.
On the server side, there are mainly Linux versions and Windows 64 bit, and here Java's platform independence really rocks. And that's the reason why many web and application servers are Java-based. But other technologies can do the same here.
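And to answer the "just one more compile" point directly: with a platform-dependent language, you often need not only a separate compile but separate code paths per platform (different system APIs, path conventions, and so on). Java pushes most of that behind the standard library. A minimal sketch (the file name is made up) of how platform differences are absorbed rather than compiled in:

```java
// PortablePaths.java -- hypothetical example.
// In C you might need #ifdef _WIN32 blocks to handle "\" vs "/" path
// separators; in Java the standard library resolves this at run time,
// so one binary serves every platform.
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PortablePaths {
    public static void main(String[] args) {
        // File.separator is "\\" on Windows and "/" on Unix-like systems,
        // but this code never has to know which.
        Path config = Paths.get("app", "conf", "settings.properties");
        System.out.println("Separator: " + File.separator);
        System.out.println("Path: " + config);
    }
}
```

The same `.class` file prints a backslash-separated path on Windows and a slash-separated one on Linux, which is the "effort per platform" the answer is weighing.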