Explaining Douglas Crockford's Updating the Web
This is my attempt to extract his slides from his presentation and annotate them slightly. I write my interpretations under the headings. Watch the full talk here.
What’s wrong with the web?
Insecure
Complexity
He believes that the security vulnerabilities are due to the overall complexity.
HTTP
Key / Value Pairs
Negotiation
Request / Response
Certificate Authorities
Not trustworthy, vulnerable
HTML
Not for applications…really for describing technical documents
Templating
XSS attacks
Document Object Model
The worst API, very insecure
CSS
Awkward and not intended for application usage
JavaScript
A hot mess; pretty terrible, but there are some good parts
Many have tried
- Microsoft, Adobe, Apple, Oracle, many more
- In most cases, the technology was much better
- In most cases, the solution was not open
- There was no transition
Upgrade the Web
Keep the things it does well.
Replace the parts that are currently vulnerable with a new, more secure path.
The HDTV transition was made possible by the set-top box.
Helper App
Helper apps were once used to open external protocols that the browser did not support. For a new “web” protocol, there would be a new helper application that executes these applications.
Transition Plan
- Convince one progressive browser maker to integrate.
- Convince one secure site to require its customers to use that browser.
- Risk mitigation will compel the other secure sites.
- Competitive measures will move the other browser makers.
- The world will follow for improved security and faster application development.
- Nothing breaks!
Strong Cryptography
- ECC 521
- AES 256
- SHA3-256
Built upon paranoid levels of cryptography, beyond what today's standards deem necessary, to keep things secure and future-proof.
ECC 521 public keys as unique identifiers
No more passwords, no more usernames. This is you.
Secure JSON over TCP
HTTP is limited and not really needed for this. JSON can be encrypted and pushed over the wire asynchronously.
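To make the contrast with request/response concrete, here is a small framing sketch (the newline-delimited framing is my choice; the talk did not specify one): each side can push a JSON message at any time over a single TCP connection, with no request needed first.

```javascript
// Frame one message: a JSON text followed by a newline
function frame(message) {
  return JSON.stringify(message) + "\n";
}

// Unframe a received buffer back into message objects
function unframe(buffer) {
  return buffer
    .split("\n")
    .filter((line) => line !== "")
    .map(JSON.parse);
}

// A server could push these down the socket unprompted:
const wire =
  frame({ type: "notice", text: "price drop" }) +
  frame({ type: "notice", text: "order shipped" });
const messages = unframe(wire);
```

In the real design these frames would be encrypted with the session keys from the crypto layer; the framing itself stays this simple.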
web:// publickey @ ipaddress / capability
It’s not pretty, but it’s clear. It takes the certificate authorities out of the loop: the public key itself identifies the site, so there is no third party to trust.
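A hypothetical parser for this address form (the function and field names are my own):

```javascript
// Split a web://publickey@ipaddress/capability address into its parts
function parseWebAddress(address) {
  const match = /^web:\/\/([^@]+)@([^/]+)\/(.*)$/.exec(address);
  if (!match) {
    throw new Error("not a web:// address");
  }
  return { publicKey: match[1], host: match[2], capability: match[3] };
}

// Example with placeholder values
const parts = parseWebAddress("web://abc123@203.0.113.7/orders");
```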
Trust Management / Petnames
This is the way to make the long, completely foreign scheme identifiable to the user. The initial relationship, like going to “amazon.com”, would come from search engines or directories. The idea is that once you know about a site, there is a “relationship” that you wish to maintain. Simply typing in a domain name like “money.com”, which was always a hit-or-miss proposition, would no longer really be possible.
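A minimal petname registry sketch (my illustration): the browser maps a user-chosen name to the unreadable `web://` address, so the user never types or reads the raw public key after the relationship is established.

```javascript
// petname -> web:// address, kept locally by the browser
const petnames = new Map();

function remember(petname, webAddress) {
  petnames.set(petname, webAddress);
}

function resolve(petname) {
  if (!petnames.has(petname)) {
    throw new Error("no relationship with " + petname);
  }
  return petnames.get(petname);
}

// After first contact (via a search engine or directory), save the relationship
remember("amazon", "web://abc123@198.51.100.4/storefront");
const addr = resolve("amazon");
```

The security property is that a lookalike site cannot impersonate “amazon”: the petname resolves to the exact key the user originally trusted, not to whatever a typed domain happens to point at.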
Vat
More than a sandbox…only has access to what is granted.
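A capability-style sketch of that idea (my own minimal model, not code from the talk): a vat receives only the powers it is explicitly granted, rather than ambient access to the whole system.

```javascript
// Build a vat that can see only the granted capabilities
function makeVat(capabilities) {
  // Freeze so the vat cannot add or swap powers on itself
  const granted = Object.freeze({ ...capabilities });
  return Object.freeze({
    run(program) {
      // The program sees only what was granted, nothing else
      return program(granted);
    },
  });
}

// This vat may read one value; it has no network, no filesystem, no DOM
const vat = makeVat({ readGreeting: () => "hello" });
const result = vat.run((caps) => caps.readGreeting().toUpperCase());
```

A real vat would enforce isolation at a much lower level, but the shape is the same: authority flows in only through the granted object.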
Cooperation under mutual suspicion
I think here he was saying to assume that everything is potentially malicious. Applications can still work together, but only through the APIs they deliberately expose to one another.
JavaScript Message Server / Qt
Isolated components communicate entirely by sending JSON back and forth. He wants to use something like NodeJS, but with better security in place, to handle the messaging. This would talk to the remote server and perhaps to the individual applications; it is somewhat like a secure AMQP bus. Qt is the interaction and rendering framework, and it is very widely used. I'm inclined to say that if you want to approach things this way, let's not limit the potential: why not allow Qt at a minimum, or a JVM that can plug in? Essentially this creates a platform for building web-based applications.
He didn’t go into too much detail here at all. The only thing he really mentioned was the clean separation this provides. Qt would handle the rendering of content as well as user interaction. The messaging bus would transfer the content.
I’m thinking that by taking his Vat approach we can almost describe a secure web-based ecosystem. He didn’t discuss how this would work in any way, but I can propose a possible design.
I’m thinking that “applications” have dependencies, but those aren’t included in your code at all. You can refer to the original repository of a dependency, or bundle it if you want to. The idea is that applications will in fact “live” on the web but are installed into the user’s browser, much like an extension in Chrome. They will have versions. When updates need to be pushed out, part of the application checks a specific address for updates and performs the update accordingly.
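The update check I have in mind could be as simple as this (entirely hypothetical; the version format and comparison rule are my assumptions):

```javascript
// Compare dotted version strings numerically, segment by segment
function needsUpdate(installedVersion, manifestVersion) {
  const a = installedVersion.split(".").map(Number);
  const b = manifestVersion.split(".").map(Number);
  for (let i = 0; i < Math.max(a.length, b.length); i += 1) {
    const x = a[i] || 0;
    const y = b[i] || 0;
    if (x !== y) {
      return x < y; // update only if the manifest is newer
    }
  }
  return false;
}
```

The installed app would fetch a manifest from its known update address, call `needsUpdate` with its own version, and apply the update when it returns true.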
I’m inclined to say that, much like an operating system, this platform can have standalone applications as well as applications that let other applications interact with them. Say you have an eBay application installed because you want to buy things. Say you also have your bank’s application installed, for example Wells Fargo. There is a `PaymentProvider` interface that can be implemented, which defines what is necessary to provide payments to other applications. When you want to pay for the item you won on eBay, you can choose any of your installed applications that implement the `PaymentProvider` interface.
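To make the `PaymentProvider` idea concrete, here is a sketch of that interaction (the interface shape, class names, and payee value are all my own invention):

```javascript
// A bank app implementing the hypothetical PaymentProvider interface:
// anything with a pay(amountCents, payee) method qualifies
class WellsFargoApp {
  pay(amountCents, payee) {
    // A real implementation would prompt the user and talk to the bank
    return { status: "approved", amountCents, payee };
  }
}

// A store app that accepts any installed PaymentProvider at checkout
class EbayApp {
  checkout(amountCents, providers) {
    // In practice the user would pick from the list; take the first here
    const provider = providers[0];
    return provider.pay(amountCents, "ebay-seller-42");
  }
}

const receipt = new EbayApp().checkout(1999, [new WellsFargoApp()]);
```

The point of the interface is that eBay never sees bank credentials; it only sees the capability to request a payment, which fits the vat model above.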
I’m not entirely certain, but I believe the intention was that the JS message server may be more than just a message bus; the front-end application itself would be written there. The business logic would live on the messaging end, and it would broadcast messages to the Qt side for rendering updates and so on. Assuming I am correct, I think we can play around with this more and refine it.
I really like the application ecosystem concept. Pulling it off, coming up with logical interfaces and ways of interacting, will not be easy. It is, however, a tremendous improvement over the chaos that we have today: an interconnected system that talks together in many different ways but has very little security and no clear-cut boundaries in place. OAuth2 does have the notion of granting certain roles to different user types. This is a start, but to allow applications to collaborate, a more descriptive mechanism is needed. This concept is going to be essential in the development of IoT technologies.
The Old Web: Promiscuity
The New Web: Commitment
The mantra for this. The old web will remain, and I’m pretty sure that isn’t just for the transitional period. Rather, for “finding” new content, what we call “browsing the web”, we would use the old web. Once we establish a relationship with a site, we will use the new web to maintain that relationship. Perhaps you can think of it the way HTTP is used for insecure content while some sites “switch” to HTTPS for secure content; so too here.
There’s nothing new here
No new technology at all. Just bringing current technologies together.