Dude, Best Viewed With a WTF??? What Year is This?
I love web standards, I do. I teach a class on them at a local university and extol the benefits of everyone using them to create an open and accessible web. Why? They allow us to publish to an audience larger than any ever assembled. Without them, the web would be a fragmented mess, and far less useful. They also allow us to do some fantastically cool stuff, like the latest video from one of my favorite artists, Arcade Fire. Have you tried it yet? It’s worth a viewing, for sure. Check out the overview from Mashable, too.
But when I see things like this (which accompanied the aforementioned video):
I wonder… what are we doing? Sure, I know it’s a cool demo. Yes, it’s a fun and innovative use of technology, and hopefully we’re learning things by creating these experiments… They’re lovely. But at what cost? Are we fueling some sort of Browser War II? Is rich media in this “post-Flash” world (which I’m not really sure we’re in) bound to ghettoize the cool sites and force us back to the “Best viewed on a…” web-mullet bumper stickers of 1999? Dude, I’ve been there… I have the scars to prove it, and the burnt weekends and late nights of many a browser-debugging session to recall not so fondly. Remember this? Am I detecting shades of it here, or what?
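For what it’s worth, the standards-friendly alternative to a “best viewed in…” badge is feature detection: ask the browser what it can actually do, not who it claims to be. Here’s a minimal sketch in TypeScript, assuming an experience that needs canvas and HTML5 video; these checks are illustrative, not the demo’s actual code:

```typescript
// Feature detection: probe for capabilities instead of
// sniffing the user-agent string and gating on a brand name.

function supportsCanvas(): boolean {
  const el = document.createElement("canvas");
  return typeof el.getContext === "function" && el.getContext("2d") !== null;
}

function supportsHtml5Video(): boolean {
  return typeof document.createElement("video").canPlayType === "function";
}

if (supportsCanvas() && supportsHtml5Video()) {
  console.log("Rich experience it is.");   // start the fancy stuff here
} else {
  console.log("Falling back gracefully."); // no "best viewed in…" badge needed
}
```

Any browser that passes the checks gets the full experience, whatever its logo; everyone else gets a graceful fallback instead of a bumper sticker.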
So, the next time you bash Flash or any other tech for not being open, or for taking too much CPU, or whatever the complaint du jour is, take a look at this CPU output from my nearly new 15″ MBP with 8GB of RAM… This is what was happening while that beautiful piece of open content was playing…
Yeow. Every piece of tech is capable of eating up processors, standards-compliant or not. Just saying.
Oh, and by the way, I did write a postcard to myself. Check it out.