In 1995, Netscape announced a nifty little scripting language called JavaScript. Why JavaScript? Because Java was hip, cool and trendy (believe it or not). A language with no real connection to Java other than a vaguely C-like syntax (you have got to have those curly brackets) borrowed some of Java’s fame and glory by using its name.
JavaScript quickly found its place on the web, and support was added in more and more web browsers. For a long time, however, it was used only for the lesser task of “scripting”, not “real programming”. Validating that an email address entered in a form looked vaguely like an email address, or that two passwords matched, was the normal level of use in a web page, and if you wanted to be really fancy you could build something like a web chat! But no one would dream of doing anything so crazy as building an enterprise-level office suite with it.
If you wanted to build an application on the web, rather than a web page, you could use Java applets, though no more than a handful of developers ever bothered. Flash appeared on the scene and became popular for animations and games, and later for streaming video. Microsoft launched Silverlight and offered a slightly more business-focused take on web applications. While all this was good, and used by many, the Internet kept moving.
Google launched Gmail and was so unhappy with its performance that it built its own web browser, Google Chrome, which beat all other browsers of the time in terms of JavaScript performance. Web 1.0 turned into Web 2.0, and even if no one could agree on exactly what that was, it sure appeared to use a lot of JavaScript. People started talking about HTML5, the semantic web, and software as a service. Microsoft confused themselves and everyone around them by saying that their vision for the web was HTML and JavaScript (but rushed to add that Silverlight was nice too when people started asking questions). Adobe said “no, you take it” and washed their hands of Flex. It was clear that JavaScript was the future of the web.
Then something funny happened.
As if owning the web were not enough, JavaScript started digging into the realm of Applications. Not Web Applications, not “Rich Internet Applications”, but Applications. Google added its own framework for offline applications in Chrome; software projects like PhoneGap came on the scene and promised “native” apps for mobile devices built on HTML and JavaScript, without sacrificing the advantages of touch, tight platform integration, and the app monetization system called the app store. Microsoft took it one step further and announced that JavaScript would be a first-class citizen in the new Windows 8 Metro world, with as much access to the Windows Runtime as .NET or C++. And Node.js took JavaScript to the server!
It is nice, though somewhat ironic, to see that JavaScript, the bastard half-brother of the Java that once promised “write once, run anywhere”, is now much closer to fulfilling that promise than its big brother ever was. But looking at JavaScript as a language, and at the historical accidents leading up to its success, one cannot help but wonder: “could we not have done better?”