> Which is still open so I would qualify that if I mention it.
>
> Okay, here are a few specific questions I have about this effort, though
> again, I'd much prefer to have something more like a discussion because
> my take on all this is somewhat open ended right now.
>
> 1) First and foremost what is the big win for users visiting a flat HTML
> site (that is, no login, no data exchange)? Which is to say, how does
> HTTPS help users outside of situations where they already have it (e.g.
> their bank, Facebook, et al)?

HTTPS gives you authentication and *integrity*. Visitors know they're connecting to you, and they know the content is what you intended to provide.

- The Subresource Integrity spec (https://w3c.github.io/webappsec-subresource-integrity/#use-casesexamples) lets that benefit propagate: if my first connection is HTTPS and the page pulls in a resource from some other site, SRI specifies a constraint on that resource (an image, a dependency like jQuery, etc.), so it protects against tampered dependencies. (See the sketch after this list.)
- Systemic benefit: the more of the web is HTTPS, the more attacks like the Great Cannon, which injected code into unencrypted traffic from a widely used site, can be prevented. Little sites are more likely to get caught up in tracking or advertising.
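Since the notes point at the SRI spec, here is a minimal sketch (my own illustration, not from the interview) of how an `integrity` value is produced: hash the exact bytes of the dependency, base64-encode the digest, and prefix the algorithm name. The result is what goes into the `integrity` attribute of a script or link tag.

```python
# Sketch: compute a Subresource Integrity value for a local file.
# sha384 is the commonly recommended algorithm; sha256/sha512 also work.
import base64
import hashlib
import sys

def sri_hash(path: str) -> str:
    with open(path, "rb") as f:
        digest = hashlib.sha384(f.read()).digest()
    # SRI format: "<algorithm>-<base64 digest>"
    return "sha384-" + base64.b64encode(digest).decode("ascii")

if __name__ == "__main__":
    # e.g. python sri.py jquery.min.js
    # Use the output as: <script src="..." integrity="sha384-..." crossorigin="anonymous">
    print(sri_hash(sys.argv[1]))
```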
> 2) The best answer I have come up with to the above question is that
> HTTPS stops unsophisticated MitM attacks. Do you have any numbers or
> research of any kind on how common such attacks are?

No one knows. Mozilla is trying to get such stats, but so far, says Barnes, "we don't have it."

> 3) HTTPS consists of several layers, will Firefox be grading these
> layers on a per-site basis and letting the user know the overall level
> of security? That is, I might have implemented HTTPS, but done so in
> such a way that my server is vulnerable to Heartbleed, BEAST, POODLE,
> etc or supports a weak, possibly compromised cipher suite, will Firefox
> warn users about the potential vulnerability? If so how? If not, why
> not?

Is there a date? It's already happening. New features are HTTPS-only: FIDO hardware auth, etc.

- Gradually phasing out access to browser features for non-secure websites.

For every feature that goes away, the question becomes, "how much are you going to break the web for its own good?"

To be completely frank, I don't care about URLs; I care about secure connections. So if you can get a secure connection via HSTS et al., that should count. Features being gated include the geolocation API and getUserMedia (mic, camera). The tools here are HSTS and the upgrade-insecure-requests CSP directive, but an http:// subresource is still treated as mixed content, and HSTS you only discover as you browse, which is what the "HSTS priming" spec is meant to address.

> 4) Let's Encrypt is great, but it's still way beyond the capabilities of
> non-technical users. Yet part of what makes the web amazing is how
> simple it is to just create a few text files, put them in the folder,
> upload it to a server and you have a site (this is I believe one of the
> central parts of Mozilla's Maker efforts, that anyone can create things
> on the web).

Fix the transport level.

Big site concerns:
- not too complex
- dependencies -- media sites can't go HTTPS without their ads being HTTPS; as the ecosystem moves in that direction, the big sites don't have to worry as much

Little site concerns:
- complexity (config, etc.)
- the same level of automation as DNS (e.g. Caddy server)
- dependencies

> 5) Tim Berners-Lee has called the move from HTTP to HTTPS "arguably a
> greater threat to the integrity of the web than anything else in its
> history." Given that URLs breaking, changing and disappearing is already
> a massive problem, and that this move will absolutely mean more broken
> sites, how is that a win for the web? Is a secure web that's only 10% of
> the web better than an insecure web?

Tim has been a really useful contrarian voice. His views have driven the browser and web community to address the concerns he has raised. HSTS priming is designed to address exactly this kind of breakage.
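For reference, the HSTS and upgrade-insecure-requests mechanisms mentioned in the answers to 3 and 5 are just HTTP response headers. Below is a minimal sketch of what a site would send; it uses Python's standard http.server purely as an illustration, and note that browsers only honor Strict-Transport-Security when it arrives over HTTPS.

```python
# Sketch: a server that sends the two headers discussed above.
from http.server import BaseHTTPRequestHandler, HTTPServer

class SecureHeadersHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # HSTS: tell the browser to use HTTPS for this host for the next year.
        self.send_header("Strict-Transport-Security",
                         "max-age=31536000; includeSubDomains")
        # Ask the browser to upgrade http:// subresources on this page to https://.
        self.send_header("Content-Security-Policy", "upgrade-insecure-requests")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<p>hello, ideally over https</p>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), SecureHeadersHandler).serve_forever()
```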