iSEC Partners Conducts Tor Browser Hardening Study

by mikeperry | August 18, 2014

In May, the Open Technology Fund commissioned iSEC Partners to study current and future hardening options for the Tor Browser. The Open Technology Fund is the primary funder of Tor Browser development, and as a standard practice it commissions security analysis and review for all of the projects that it funds. We worked with iSEC to define the scope of the engagement to focus on the following six main areas:

  1. Review of the current state of hardening in Tor Browser
  2. Investigate additional hardening options and instrumentation
  3. Perform historical vulnerability analysis on Firefox, in order to make informed vulnerability surface reduction recommendations
  4. Investigate image, audio, and video codecs and their respective libraries' vulnerability histories
  5. Review our current about:config settings, both for vulnerability surface reduction and security
  6. Review alternate/obscure protocol and application handlers

The complete report is available in the iSEC publications GitHub repo. All tickets related to the report can be found using the tbb-isec-report keyword. General Tor Browser security tickets can be found using the tbb-security keyword.

Major Findings and Recommendations

The report had the following high-level findings and recommendations.

  • Address Space Layout Randomization is disabled on Windows and Mac
  • Due to our use of cross-compilation and non-standard toolchains in our reproducible build system, several hardening features have ended up disabled. We knew about the Windows issues prior to this report, and should have a fix for them soon. However, the Mac OS issues are news to us, and appear to require that we build 64-bit versions of the Tor Browser for full support. The parent ticket for all basic hardening issues in Tor Browser is bug #10065.

  • Participate in Pwn2Own
  • iSEC recommended that we find a sponsor to fund a Pwn2Own reward for bugs specific to Tor Browser in a semi-hardened configuration. We are very interested in this idea and would love to talk with anyone willing to sponsor us in this competition, but we're not yet certain that our hardening options will have stabilized with enough lead time for the 2015 contest next March.

  • Test and recommend the Microsoft Enhanced Mitigation Experience Toolkit on Windows
  • The Microsoft Enhanced Mitigation Experience Toolkit is an optional toolkit that Windows users can run to further harden Tor Browser against exploitation. We've created bug #12820 for this analysis.

  • Replace the Firefox memory allocator (jemalloc) with ctmalloc/PartitionAlloc
  • PartitionAlloc is a memory allocator designed by Google specifically to mitigate common heap-based vulnerabilities by hardening free lists, creating partitioned allocation regions, and using guard pages to protect metadata and partitions. Its basic hardening features can be picked up by using it as a simple malloc replacement library (as ctmalloc). Bug #10281 tracks this work.

  • Make use of advanced PartitionAlloc features and other instrumentation to reduce the risk from use-after-free vulnerabilities
  • The iSEC vulnerability review found that the overwhelming majority of vulnerabilities to date in Firefox were use-after-free, followed closely by general heap corruption. In order to mitigate these vulnerabilities, we would need to make use of the heap partitioning features of PartitionAlloc to actually ensure that allocations are partitioned (for example, by using the existing tags from Firefox's about:memory). We will also investigate enabling assertions in limited areas of the codebase, such as the refcounting system, the JIT, and the Javascript engine. (A brief illustrative sketch of this bug class appears after this list.)
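
To make the use-after-free class concrete, here is a minimal, self-contained C++ sketch. It is not taken from the report or from Firefox code; the type and function names are invented for illustration. It shows how reuse of a freed heap slot lets attacker-influenced data overwrite a stale object, and why an allocator that keeps different allocation types in separate partitions (as PartitionAlloc does) makes that reuse much harder to arrange.

    // Illustrative use-after-free sketch (not from the report or Firefox).
    // With a conventional free-list allocator, the freed slot below can be
    // refilled by an attacker-influenced allocation of the same size. A
    // partitioned allocator keeps raw buffers and objects containing
    // function pointers in separate regions, so this overlap cannot occur.
    #include <cstdio>
    #include <cstdlib>
    #include <cstring>

    struct Handler {
      void (*callback)();   // heap-resident function pointer: a typical UAF target
    };

    static void expected() { std::puts("expected code path"); }

    int main() {
      Handler* h = static_cast<Handler*>(std::malloc(sizeof(Handler)));
      h->callback = expected;
      std::free(h);                      // the object is freed here...

      // ...but a same-sized allocation may land in the freed slot and
      // overwrite the stale object's function pointer with attacker data.
      char* buf = static_cast<char*>(std::malloc(sizeof(Handler)));
      std::memset(buf, 0x41, sizeof(Handler));

      h->callback();                     // dangling pointer: undefined behavior
      std::free(buf);
      return 0;
    }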

Vulnerability Surface Reduction (Security Slider)

A large portion of the report was also focused on analyzing historical Firefox vulnerability data and other sources of large vulnerability surface for a planned "Security Slider" UI in Tor Browser.

The Security Slider was first suggested by Roger Dingledine as a way to make it easy for users to trade off between functionality and security, gradually disabling features ranked by both vulnerability count and web prevalence/usability impact.

The report makes several recommendations along these lines, but a brief distillation can be found on the ticket for the slider.

At a high level, we plan for four levels in this slider. "Low" security will be the current Tor Browser settings, with the addition of JIT support. "Medium-Low" will disable most of the JIT, and make HTML5 media click-to-play via NoScript. "Medium-High" will disable the rest of the JIT, disable JS on non-HTTPS URL bar origins, and disable SVG. "High" will fully disable Javascript, block remote fonts via NoScript, and disable all media codecs except for WebM (which will remain click-to-play).
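
As a rough way to visualize the proposal, the following C++ sketch renders the four positions described above as a simple lookup table. It is purely illustrative and is not Tor Browser code; the real slider will toggle Firefox and NoScript preferences rather than anything resembling this.

    // Conceptual sketch only: the four proposed slider positions, expressed
    // as a lookup table of what each position disables (per the paragraph above).
    #include <cstdio>

    struct SliderLevel {
      const char* name;
      const char* disables;   // summary of what this position turns off
    };

    static const SliderLevel kLevels[] = {
      {"Low",         "nothing extra; current Tor Browser settings plus JIT support"},
      {"Medium-Low",  "most of the JIT; HTML5 media becomes click-to-play via NoScript"},
      {"Medium-High", "the rest of the JIT; JS on non-HTTPS URL bar origins; SVG"},
      {"High",        "all Javascript; remote fonts via NoScript; all media codecs except click-to-play WebM"},
    };

    int main() {
      for (const SliderLevel& level : kLevels)
        std::printf("%-12s disables: %s\n", level.name, level.disables);
      return 0;
    }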

The Long Term

A web browser is a very large and complicated piece of software, and while we believe that the privacy properties of Tor Browser are better than those of any other web browser currently available, it is very important to us that we also raise the bar to successful code execution and exploitation of Tor Browser.

We are very eager to see the deployment of sandboxing support in Firefox, which should go a long way toward improving the security of Tor Browser as well. To improve security for their users, Mozilla has recently shifted 10 engineers into the Electrolysis project, which provides the groundwork for a multiprocess sandbox architecture for desktop Firefox. This will allow them to provide a Google Chrome-style security sandbox for website content, reducing the risk from software vulnerabilities and generally impeding exploitability.

In the meantime, we will also investigate providing hardened builds of Tor Browser using the AddressSanitizer and Virtual Table Verification features of newer GCC releases. While this will not eliminate all vectors of memory corruption-based exploitation (in particular, the hardening properties of AddressSanitizer are not as strong as those provided by SoftBound+CETS, for example, but that compiler is not yet production-ready), it should raise the bar to exploitation. We are hopeful that these builds, in combination with PartitionAlloc and the Security Slider, will satisfy the needs of our users who require high security and who are willing to trade performance and usability in order to get it.
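
For readers unfamiliar with these tools, here is a trivial C++ example of the kind of heap error an AddressSanitizer-instrumented build catches at the first bad access instead of letting it silently corrupt adjacent memory. The build line in the comment is only an assumed illustration using GCC 4.9+ flags, not our actual build configuration.

    // Illustrative only: a one-byte heap overflow that an AddressSanitizer
    // build aborts on with a heap-buffer-overflow report.
    //
    // Assumed example build line (GCC 4.9+), not the Tor Browser configuration:
    //   g++ -g -fsanitize=address -fvtable-verify=std overflow.cpp -o overflow
    #include <cstdlib>

    int main() {
      char* buf = static_cast<char*>(std::malloc(16));
      buf[16] = 'A';   // writes one byte past the allocation; ASan flags this
      std::free(buf);
      return 0;
    }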

We also hope to include optional application-wide sandboxes for Tor Browser as part of the official distribution.

Why not Google Chrome?

It is no secret that in many ways, both we and Mozilla are playing catch-up to reach the level of code execution security provided by Google Chrome, and in fact closely following the Google Chrome security team was one of the recommendations of the iSEC report.

In particular, Google Chrome benefits from a multiprocess sandboxing architecture, as well as several further hardening options and innovations (such as PartitionAlloc).

Unfortunately, our budget for the browser project is still very constrained compared to the amount of work that is required to provide the privacy properties we feel are important, and Firefox remains a far more cost-effective platform for us for several reasons. In particular, Firefox's flexible extension system, fully scriptable UI, solid proxy support, and its long Extended Support Release cycle all allow us to accomplish far more with fewer resources than we could with any other web browser.

Further, Google Chrome is far less amenable to supporting basic web privacy and Tor-critical features (such as solid proxy support) than Mozilla Firefox. Initial efforts to work with the Google Chrome team saw some success in terms of adding APIs that are crucial to addons such as HTTPS-Everywhere, but we ran into several roadblocks when it came to Tor-specific features and changes. In particular, several bugs required for basic proxy-safe Tor support for Google Chrome's Incognito Mode ended up blocked for various reasons.

The worst offender on this front is the use of the Microsoft Windows CryptoAPI for certificate validation, without any alternative. This bug means that certificate revocation checking and intermediate certificate retrieval happen outside of the browser's proxy settings, and are subject to alteration by the OEM and/or the enterprise administrator. Worse, beyond the Tor proxy issues, the use of this OS certificate validation API means that the OEM and enterprise also have a simple entry point for installing their own root certificates to enable transparent HTTPS man-in-the-middle, with full browser validation and no user consent or awareness.

All of this is not to mention the need for defenses against third-party tracking and fingerprinting, which are required to prevent the linking of Tor activity to non-Tor usage and which would also be useful for the wider non-Tor userbase.

While we'd love for this situation to change, and are open to working with Google to improve things, at present it means that our only option for Chrome is to maintain an even more invasive fork than our current Firefox patch set, with much less likelihood of a future merge than with Firefox. As a ballpark estimate, maintaining such a fork would require somewhere between 3 and 5 times the engineering staff and infrastructure we currently have at our disposal, in addition to the ramp-up time to port our current feature set over.

Unless either our funding situation or Google's attitude towards the features we require changes, Mozilla Firefox will remain the best platform for us to demonstrate that it is in fact possible to provide true privacy by design for the web for those who want it. It is very distressing that this means playing catch-up and forcing our users to make usability tradeoffs in exchange for improved browser security, but we will continue to do what we can to improve that situation, both with Mozilla and with our own independent efforts.

Comments

Please note that the comment area below has been archived.

August 18, 2014

Mozilla was not mentioned as part of the PRISM system described by NSA whistleblower Edward Snowden. Google and Microsoft apparently are part of the governmental grid. "Privacy" is much more important than "usability" concerns when using the modern-day internet. Mozilla is efficient enough when considering the alternative PRISM-affiliated Google and Microsoft browsers.

I agree. Also, there have been lots of sandboxing problems with V8 recently. There is a lot of hype around sandboxing, because it is a good idea, but you may end up trusting it too much.

I'd love to see your patches to work around the proxy bypass and privacy issues I linked to in the post!

Otherwise, I'm afraid you're very easy to deanonymize and track.

August 19, 2014

And how about Chromium? Have you looked through what you can do with it? I know it is different from Tor, but there is a project called Epic Privacy Browser, for example. How about teaming up with Opera? They are trying to use and extend Chrome differently. Or does it not matter which fork you use, because all of WebKit/Blink is limited and there isn't much you can do with it?
Also, why isn't there a 64-bit version for Windows?! Do you have to wait until Mozilla officially supports it (again)? I read somewhere that a 64-bit browser can add extra security, but I don't know whether that is true or not. However, if it is true, I think we need to go with it as soon as possible. THANKS FOR YOUR HARD WORK!

We mostly don't build 64-bit Windows because Mozilla doesn't. You're right that it would provide better ASLR. It is something we will consider if we manage to get AddressSanitizer and other hardening options working for Windows, but because it is not officially supported by Mozilla, it may end up being a bit of work.

However, in my opinion, a hardened Windows build should be available as 64-bit only. Same for Mac and Linux.

August 20, 2014

In reply to mikeperry

Mike--this is off-topic so feel free to ignore, but do you know *why* Mozilla doesn't do 64-bit builds of Firefox? I had searched around to find some answers to this in the past, but there seemed to be a real reluctance to accept patches from forks like Waterfox. But it was never clear to me why...

August 20, 2014

In reply to mikeperry

Two things to consider:
1) 32-bit software can't snoop on or hook into a 64-bit process, so malware targeted at 32-bit x86 can't affect the 64-bit process, or at least not without more effort.
2) 64-bit builds of Firefox and Chrome (Chromium) are available, but if you read the history of both, you'll find the performance is often worse because the JavaScript engines are only optimized for 32-bit. Hell, you see a lot of these problems show up in the notes for asm.js/Emscripten.

So if you're trying to use/create a Tor browser on a 64-bit platform, you have to take the performance hits with it. I'd still rather take the performance hits than leave the browser open to tampering.

When they say Google Chrome, they're talking about Chromium. Chrome is Chromium plus some non-free (and closed source) components; as there's no way of telling for sure what any closed-source components are doing, there's a security risk.

August 19, 2014

Won't the security slider fragment the anonymity set and make it more easy to fingerprint users?

The section of the report on the security slider only discusses vulnerability mitigation and doesn't consider the effect on fingerprinting and the anonymity set at all :-(

That's why we only want 3 or 4 positions on the slider, so that at worst people are fragmented only into those groups. Not ideal, but not terrible, and better than people tweaking all of these options themselves.

August 21, 2014

In reply to mikeperry

There's been so much discussion of "strong crypto" in the media since Snowden. Between Schneier and other sources who seem to be citing crypto experts, quite a lot of Tor Browser's default ciphersuites seem somewhat vulnerable.

Might the deactivation of certain ciphersuites (those which utilize RC4 and MD5, or which do not support forward secrecy), as well as possibly the exclusion of SSL 3.0, be under consideration for inclusion in the Security Slider portion of this new hardening project?

"Won't the security slider fragment the anonymity set and make it more easy to fingerprint users?"

Not to suggest that this question is anything less than completely legitimate and valid in its own right but I must take the opportunity to (yet again) apply the very question you ask to something else.

This is the striking discrepancy between the stand-alone Tor Browser (the subject of this thread) and Tails with regard to the AdBlock Plus extension: Tails includes it in their version of Iceweasel (which they now call "Tor Browser"), while stand-alone Tor Browser does not.

This is a striking discrepancy that remains, for years now, while in just about every /other/ area that I am aware of, the trend has been in the other direction: to make Tor Browser and Tails' version of Iceweasel as /like/ each other as possible.

Have you ever asked or wondered about /this/ seeming contradiction?

I have, many times now, and have yet to see it addressed.

Shouldn't stand-alone Tor Browser and Tails be uniform in this regard? (I.e., whether an extension such as AdBlock Plus is included or not.)

August 19, 2014

Mike, I have two questions for you:

1. How much money do you need to participate in Pwn2Own? Funding could be realized by crowdfunding (e.g. via Indiegogo).

2. Regarding the "Security Slider", to what extent will this user-specific configuration set affect the anonymity group? Until now, I thought it is crucial that every user appears identical without any distinguishable set-up.

1. We'd probably want to run several prize levels, depending on the hardening configuration. I hear that the largest Pwn2Own prizes reach $50-75k, though. We still need to think about this, and I suspect we won't have a clear idea until after we start to deploy these hardening options.

2. The hope is that with only 3-4 positions, it will not be that much fragmentation. (2 bits of fingerprintable entropy, with one of those positions being harder to fingerprint without JS).

August 19, 2014

Re: Firefox, A question rather than a comment:
are current releases of the Firefox browser, especially such (modified) builds as used in Tor Browser bundles, still compiled to run on x86-32 processors with SSE but without support for SSE2 (such as the AMD "K7" Athlon XP)?

August 19, 2014

The ultra-fast browsers QupZilla and Midori are such pretty, nimble little things. What a delight it would be if Tor could be integrated into them.

August 19, 2014

I'm personally disappointed. Looking at the report, it's almost entirely a bunch of "do more things we consulted with another client about" and fairly obvious improvements, many of which the team has already discussed internally. Little to no new fuzzing, code-coverage improvement, or specific unit-test guidance appears to have taken place.

To a trained eye, OTF/Tor Project could've saved a lot of money by hiring a few summer students to dig through trac.torproject.org and Mozilla's Bugzilla.

Mike Perry is an awesome, talented coder and security thinker, but this report basically just captures some of the stuff that's been on his mind and is in his wake, instead of adding to it.

To echo a similar note, the review of about:config settings could have been more useful. On an upstream level, the documentation and expected vs. actual behavior of many relevant about:config settings in stock Firefox remains lousy, even on Bugzilla. There's all kinds of cruft left over from earlier versions as well.

That makes me wonder what else might've been found via fuzzing with a broader scope, casting a wider net, and even just pointing out the need to parse this documentation for Firefox by release version and keep it up-to-date.

August 19, 2014

With regard to funding sources, the German Federal Government has just declared, as part of its "digital agenda", that it is willing to fund anonymizing technologies. That sounds crazy in the current era of government surveillance, but hey Mike, that is your chance! Go and ask the German Federal Government whether they have spare money for the Tor Project.

August 19, 2014

PRISM had nothing to do with the browsers. The class of attacks associated with PRISM was network redirection and man-in-the-middle attacks that would impact any browser not using SSL.

Mozilla wasn't mentioned because they don't run a large-scale international network like Google and Microsoft do. In those cases, the NSA was attacking unencrypted intra-data-center email traffic.

To extend this to the browser design is a logical fallacy.

Yeah, except when Google and Microsoft write their browsers to specifically use services that their networks provide; for instance, sending the text of a webpage you visited to one of their servers to translate it for you because it's in a foreign language, and thereby informing Google of the text of the webpage you're accessing. Of course, Firefox does make use of some of Google's services (unfortunately).

August 19, 2014

Hi,
will any of these security patches be sent back to stock Firefox?

Getting our patches into versions that work for stock Firefox is a little challenging. Mozilla is unwilling to take many of the fingerprinting and tracking defenses as-is, because they are enabled by default.

That said, there is talk that they want to support an "Anonymous Mode" for enhanced private browsing. We intend to augment as many of our patches as we can with prefs and private browsing mode detection in preparation for this idea. We expect to have another merge push shortly after we finish the switch to Firefox 31 ESR.

August 20, 2014

In reply to mikeperry

Just so Mike doesn't have to address this, I wanted to point out that whatever technical/practical issues there may be with upstreaming TBB code to Mozilla, the reason this isn't happening is that Mozilla is heavily dependent on Google for funding and as a result, hesitant to do anything that undermines Google's ability to track Firefox users out of the box.

That's one of the reasons we've seen the farce of Firefox trying to have its cake and eat it too with things like "Committed to you, your privacy, and an open web!" on their front page... which also serves Google Analytics if you leave JS enabled.

There are great people working on Firefox at Mozilla, but they need more financial and other support from people like us so that their privacy features aren't always informally up for review in terms of Google's ability to track users for advertising.

August 19, 2014

Wait a minute, am I misunderstanding, or just plain naive?

"Google Chrome"

Do you actually mean "Chromium"? Because you obviously are aware of Google Chrome's mother, and what it is.

August 20, 2014

[NOTE: Although currently appearing in the comments for the "How to use the “meek” pluggable transport" entry, I had intended to post the following *here* ("iSEC Partners Conducts Tor Browser Hardening Study"). My apologies for the resulting cross-post.]

RESTRICTING IMAGES TO DECREASE ATTACK SURFACE

I recall at least one post from some time back in which one of the Tor devs (either arma or Mike Perry, IIRC) pointed out that image rendering in any browser presents a major vector for attack.

So, in the meantime, for those who can get by without them, wouldn't blocking all images, or at least third-party images, from loading greatly decrease the attack surface within TBB?

For those wishing to try this, here are instructions: (Omit the quotation marks when entering the phrases below)

1.) Type "about:config"into the URL bar, then hit 'enter' (A warning will appear that you will need to accept before you can proceed.)

A search-bar should appear on top of the page, just under the URL bar.

2.) Enter "permissions.default.image" into said search-bar.

The default setting, "1" (load all images), should appear.
To change this:

3.) Click to highlight "permissions.default.image", the string of text that should appear just below "Preference Name"
(which, in turn, appears just under the search-bar)

4.) Now, either double-click the highlighted string or right-click and select "Modify"

5.) Change to either "2" (do not load any images), or "3" (do not load third-party images)

Note 1: Even "2" does not block /all/ images for me; a number of clickable images that are actually links still appear. Example: news.google.com

Note 2: When viewing this site, vbdvexcmqi.oedi.net, with images disabled, links become camouflaged by the green background color of the page. The workaround is to highlight those areas where a link appears to be missing; this should make them visible.

August 20, 2014

I would definitely like to see more focus on hardening for Linux as well. AppArmor and SELinux should cover the bases for most distros, especially the biggest several (Debian, Ubuntu, Fedora, etc.). And AppArmor at least should be relatively straightforward to make portable, since Tor Browser runs mostly (but not entirely) in its own directory.

I will check the bug tracker to see if there is a ticket and make a feature request, and of course volunteer to help do what I can. Thanks for all that you guys and gals do.

Yes, after posting I realized I forgot to mention this. The ticket is https://trac.torproject.org/projects/tor/ticket/5791. There are some experimental AppArmor and Seatbelt profiles already.

We hope to add them as an optional item in the official bundles, but the Linux sandboxes in particular require root and often cause evidence of violations, as well as the policy itself, to exist outside of the Tor Browser directory; hence those sandboxes will be optional.

August 22, 2014

"Note 1: Even "2" does not block /all/ images for me; a number of clickable images that are actually links still appear. Example: news.google.com"

It's embedded image code. If you want, look into 'Page source' and create a special ADB rule (-: