Consider this. It always feels easier to stay in our comfort zones.
As developers, our comfort zones are typically defined by the technologies we know, whether it be an operating system, a language, a framework, or a library. However, what we know best can often be a legacy technology, which in the world of software can leave an individual or a team behind the technology curve.
Legacy technologies have a substantial amount of inertia within software organizations. Core products are, almost by definition, built on legacy technology. Capital investments have been made, and development teams have been trained to expert level on the legacy technology. Its strengths and weaknesses are understood.
Yet organizations rarely make an explicit decision to stick with a legacy technology that has run past its prime. As the incumbent, it simply becomes the default choice. Even worse, the choice between the legacy and replacement technologies is too often seen as a binary decision, with legacy winning through acquiescence, when in fact a gradual transition is often both optimal and achievable.
For these reasons it makes sense to stop periodically and consciously compare how in-house technologies measure up against the corresponding state of the art. After all, "state-of-the-art" simply means the best currently available. Applying the best when building applications typically translates into better development efficiency. More top developers are attracted to projects. Perceptions from both inside and outside the organization improve. Ultimately, better products can be delivered faster and at lower cost while enhancing a team's skill set: a win-win for all involved when the timing is right.
However, there is almost always an initial hurdle: agreeing on which in-house technologies are past their prime is difficult, even controversial. Nor is it always clear what constitutes a compelling state-of-the-art technology worth adopting.
Too often, the hype around new technology blurs objective comparison. Many unproven tools and products are touted as the state of the art, trumpeting the weaknesses of legacy technologies before their own weaknesses are well understood and addressed. Early adoption comes with risks and headaches, so it is critically important that organizations look for objective evidence when making such a crucial analysis.
HTML5 is the most compelling state-of-the-art technology for building virtually any GUI-based cross-device application. HTML5 is well past the tipping point: it is here to stay, thoroughly vetted, and mature, and in the field it is already supporting a new generation of applications. Faster processors, increased bandwidth, standardization, and better implementations have all contributed to the evolution of today's browser engines into full-fledged application infrastructures. And the lingua franca of all browsers is HTML5.
CTO Terry Thorsen discusses the once-in-a-generation shift to HTML5:
HTML5 runs within browsers on virtually every user-facing computer, whether desktop, smartphone, or tablet. It is by definition "connected" to the power of the Internet and cloud-based services, and with recent innovations it can run in standalone "browser containers" that mimic traditional desktop applications. HTML5 is not resource heavy. It supports all the key media types, from PDFs to streaming video. It is stable and secure, and because browsers update themselves, applications require no client-side installation or update process.
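As a minimal illustration of the built-in media support mentioned above, the sketch below is a self-contained HTML5 page that plays streaming video and draws on a canvas using only standard elements, with no plugins. The file names and dimensions are placeholders, not part of the original article.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>HTML5 media demo</title>
</head>
<body>
  <!-- Native video playback: no Flash or other plugin required.
       "movie.mp4" and "movie.webm" are hypothetical file names;
       multiple <source> elements let the browser pick a format it supports. -->
  <video controls width="640">
    <source src="movie.mp4" type="video/mp4">
    <source src="movie.webm" type="video/webm">
    Your browser does not support the video element.
  </video>

  <!-- Native 2D drawing surface, also standard in HTML5. -->
  <canvas id="chart" width="640" height="200"></canvas>
  <script>
    const ctx = document.getElementById("chart").getContext("2d");
    ctx.fillStyle = "steelblue";
    ctx.fillRect(10, 10, 200, 100); // draw a simple rectangle
  </script>
</body>
</html>
```

The same page runs unmodified on desktop and mobile browsers, which is the cross-device point the passage makes.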
If you aren't using HTML5 to build user-facing applications, we encourage you to compare it to what you are using now; an abundance of online resources is available to help. HTML5 is ready now and an exceptional technology to invest in.