If you hopped in a time machine, traveled back to 1999 -- ten years after the birth of the web and what would turn out to be roughly the high water mark of the "web browser wars" -- and told web developers that one day hundreds of them would pony up cold hard cash to get a feature into a web browser, no one would believe you.

After all, why would developers put up their own money when ultimately they will never truly control the future of a web browser? Web browsers are made by large companies, not groups of web developers.

Fast forward to earlier this year, though, and that's exactly what happened. Enough web developers wanted to see a new feature on the web that they <a href="https://www.indiegogo.com/projects/picture-element-implementation-in-blink">put up enough money</a> for Chromium developer Yoav Weiss and others to implement support for the nascent <picture> element in their spare time.

Sure, there were some T-shirts on offer and some workshops available for bigger donations, but at the end of the day most of the money came from smaller donations. That is, from web developers who wanted to see the Picture element working in a web browser and were willing to pay for it.

At first glance it seems like a crazy idea: asking people to fund development of a feature that most will ultimately end up using in a browser owned and controlled by a large company (Google, in this case). And yet developers went for it.

Why? Perhaps because it was a novel idea. But perhaps because it was a chance to directly shape the future of the web. Perhaps because the separation between those who build the web and those who build web browsers is disappearing.

Weiss explained the decision to try funding his work on Picture as a result of time constraints. "During discussions with the Blink team, we understood that getting picture into Blink required some infrastructure that wasn't there. So I had two options: either wait for the infrastructure to happen naturally over the course of the next 2 years, or make it happen myself."

Weiss opted for the latter, but working only in his spare time in the evenings, the job took a lot longer than he initially expected. "I thought it would be cool if I could do that during the days as well," says Weiss, "as some sort of client-work... crowdfunding was the obvious choice to make that happen."

And happen it did, which is just one of many signs that web development today is a far cry from what it was even just a few years ago.

Picture support is available today in the Canary release of Chrome (you'll need to enable the "experimental Web Platform features" flag in chrome://flags), and it appears to be on target to make a final release later this year in either Chrome 37 or 38. Weiss's work is also in the process of being ported to WebKit (which would make it available to Apple, should the company choose to add Picture support to its iOS Safari browser). Not to be left out, the Firefox team will soon have a working implementation of Picture.
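
For the curious, here is a rough sketch of what the markup looks like under the current draft of the spec (the file names and breakpoints below are placeholders, and the details could still shift before Picture ships in a final release):

    <picture>
        <!-- The browser picks the first source whose media query matches -->
        <source media="(min-width: 64em)" srcset="photo-large.jpg">
        <source media="(min-width: 38em)" srcset="photo-medium.jpg">
        <!-- Browsers without Picture support simply fall back to the plain img -->
        <img src="photo-small.jpg" alt="A description of the photo">
    </picture>

The nice part of the design is that the ordinary img element doubles as the fallback, so the markup degrades gracefully in browsers that haven't caught up yet.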

If all goes well, the Picture element should be available in most major, modern browsers in the near future -- marking the first time that a browser feature has been crowd-funded into existence.

Welcome to web development 25 years on. Web developers still may not truly control the future of web browsers, but they increasingly control the future of the web, which ultimately matters more, since browsers come and go.

It used to be that web browsers and standards bodies handed down new features from on high, but that is changing. Picture is just one example of a larger democratization of web development. Even the W3C, once the stodgiest thing on the web, has opened up "community groups" that allow any web developer to get involved in the standards process. 

An even more profound change is just over the horizon as browsers begin to support a collection of new tools referred to as Web Components. Web Components promises to make it even easier for developers to create new features for the web (see part two for more on Web Components and how they will change web development).
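
The heart of the idea is that developers can teach the browser entirely new HTML tags. The registration API has shifted between drafts, so treat the following as a sketch of the concept rather than final syntax (the <user-card> tag and its attribute are invented for this example, and the call shown is the form modern browsers support):

    <user-card name="Yoav Weiss"></user-card>

    <script>
      // Define what the browser should do when it encounters <user-card>
      class UserCard extends HTMLElement {
        connectedCallback() {
          // Render the element's own content once it's attached to the page
          this.textContent = 'Card for: ' + this.getAttribute('name');
        }
      }
      customElements.define('user-card', UserCard);
    </script>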

Once upon a time being a "web developer" just meant you knew how to coerce Internet Explorer into rendering a page the way you wanted.

Thankfully, wrestling with Internet Explorer is largely a thing of the past. These days IE 6 is less of a "problem" than the limited mobile browsers found on feature phones -- which account for a staggering amount of web traffic worldwide. And unlike Internet Explorer, which wasn't just limited but flat-out wrong, mobile browsers can be handled by modern web development tools -- like <a href="https://en.wikipedia.org/wiki/Responsive_Web_Design">responsive web design</a> -- combined with long-standing best practices like <a href="https://en.wikipedia.org/wiki/Progressive_enhancement">progressive enhancement</a>.

The great thing about progressive enhancement is that developers can also offer modern browsers running on more powerful devices a first-class web experience, with features that would have dropped jaws just a few years ago.
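
As a minimal sketch of how the two fit together (the class name and breakpoint below are invented for the example): the baseline markup and styles work in even the most limited browser, and a media query layers a richer layout on top only where the browser and screen can handle it.

    <style>
      /* Baseline: a simple single-column layout every browser understands */
      .story { padding: 1em; }

      /* Enhancement: wider screens (and browsers that understand media
         queries) get a centered, two-column layout */
      @media (min-width: 40em) {
        .story { max-width: 60em; margin: 0 auto; column-count: 2; }
      }
    </style>

    <article class="story">
      <h1>Readable on a feature phone, nicer on a laptop</h1>
      <p>The content itself never depends on the enhancements.</p>
    </article>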

So, if modern web development is so incredibly powerful and web developers can crowdfund whole new features into existence, why does your bank's website still suck so bad?

As William Gibson <a href="http://quoteinvestigator.com/2012/01/24/future-has-arrived/">says</a>, "The future is already here -- it's just not very evenly distributed."

Perhaps even worse for those actually using some of these so-called modern websites, the future is often <em>wrongly</em> distributed. That is, for every truly great example of progressively enhanced, future-friendly responsive design there are, regrettably, other sites that use the same tools to produce horrific, bloated websites worse than the ones they might have replaced.

In other words, democratization or no, there are still plenty of developers who are "doing it wrong". 

The same can be said of any field, but pundits and lovers of absolutist headlines would have you believe that this -- along with a host of other "problems" -- is why the web will never be able to compete with applications tailored to vendor-specific platforms. 

That is of course nonsense on several levels, not the least of which is that the web doesn't need to compete with platform applications in order to succeed (unless of course you're setting up an argument to show why it can't). 

The web was succeeding long before and will likely succeed long after the current crop of vendor-specific platforms fades away.

Furthermore, the distinction between what are often labeled "native" applications and web applications doesn't even exist in any meaningful way anymore. Most "native" applications would be utterly useless without a web to connect to and share data through. The Facebook app wouldn't be much without Facebook.com behind it.

On the other side, web applications can tap into an increasingly wide array of native hardware -- cameras, accelerometers, GPS and more. Mozilla has built an entire mobile OS around emerging web standards for accessing device hardware.
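
To take one long-established example, a web page can ask for the device's location with a few lines of JavaScript (the browser prompts the user for permission first; the logging here is just for illustration):

    <script>
      // Ask the browser -- and, through it, the user -- for the device's position
      navigator.geolocation.getCurrentPosition(function (position) {
        console.log('Latitude: ' + position.coords.latitude);
        console.log('Longitude: ' + position.coords.longitude);
      }, function (error) {
        console.log('Location unavailable: ' + error.message);
      });
    </script>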

The main distinction between "native" and "web" applications at this point is really which tools the developer uses to build said application. 

That said, there is a distinction to be made between the process and tools used to build for the web and those you'd use for a platform-specific application. In other words, there might be little difference between web apps and native apps for those <em>using</em> a website/application, but there are some very important differences for developers <em>building</em> such sites/applications.

The most important difference comes down to this: no one owns the web.

We have app stores and native platforms precisely because no one can own the web. That drives many companies crazy and leads them to build their own platforms. But it's the web's lack of central authority that makes it what it is. There is no father figure in the form of Apple or Google or Facebook to benevolently (or not) hand down new tools.

This makes the web very much like human culture -- a messy, fluid thing that's impossible to pin down.

For some this is a bug, but for web developers this is the web's greatest feature. 

And thanks to efforts like the Picture element and new tools like Web Components, this is going to be an even bigger, better feature of the web five years from now.

One consequence of this decentralization though is that no one is going to make the web better for you. It's up to you. That means that every now and then someone will <a href="http://alistapart.com/article/responsive-web-design">coin a phrase</a> that helps usher in a new way of building websites. It means that something you want to use on the web can be crowd-funded into existence.

Twenty-five years after it first launched, the web is still more or less a collection of developer hacks pieced together around a rough consensus that browser makers agree to support -- an imperfect, messy and slow process even when it's working at its best, which it often isn't.

Still, nearly every taken-for-granted feature of the web today started as some kind of terrible hack -- a developer just like you wanted to do something that wasn't possible, so they stretched the limits of what was.

Web development 25 years on is more complex than it was even five years ago, but participating today feels more like standing on the cusp of something great than it did just a few years ago. With every passing app store rejection, every shuttered "open" API, every dying vendor-specific platform, the open and, yes, messy nature of web development feels more like a feature and less like a bug.