Break open Dropbox SSL traffic with Squid Proxy
Some threats creep up on you over time, evolving until they pose a risk to almost every organisation at around the same time. Data Loss Prevention (DLP) is a relatively recent solution area that organisations now see as a ‘must have’. The mix of personal (rather than corporate) identity, cloud, encryption and synchronisation ensures that corporate data is leaking all the time. Sure, this is nothing new in some respects: you could always have emailed that sensitive document out or used a USB key.
Whether organisations like it or not, they are going to have to start inspecting encrypted traffic. This is going to cause friction with employees (the talent), but it’s a conversation that has to begin soon, before organisations feel the sting of data loss. In this blog post I examine the visibility a typical security team has into Dropbox traffic, and how Squid Proxy could be used in a whitelist or blacklist configuration to decrypt and inspect SSL traffic.
Standard SSL Proxy Requests
HTTP proxies tunnel encrypted traffic using the HTTP ‘CONNECT’ method. When you look at the Squid Proxy access log you will see a series of CONNECT requests with the destination domain name and IP address. However, you can’t see what the user is doing. Are they uploading? Are they downloading? With Dropbox you get an ‘idea’ of what is going on because of the separate DNS domains (dl-web and photos-1).
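At the protocol level the tunnel looks something like the exchange below (an illustrative sketch, not a capture): the browser asks the proxy for a tunnel, the proxy answers 200, and everything after that is opaque TLS.

```
CONNECT www.dropbox.com:443 HTTP/1.1
Host: www.dropbox.com:443
Proxy-Connection: keep-alive

HTTP/1.1 200 Connection established

(from this point on the proxy relays encrypted TLS bytes it cannot read)
```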
In the example below I logged into Dropbox, downloaded a picture, deleted a picture, uploaded a picture and then viewed it in the gallery.
1306911721.016 100708 ::1 TCP_MISS/200 58002 CONNECT www.dropbox.com:443 - DIRECT/18.104.22.168 -
1306911721.674 98606 ::1 TCP_MISS/200 7693 CONNECT www.dropbox.com:443 - DIRECT/22.214.171.124 -
1306911721.936 98871 ::1 TCP_MISS/200 121798 CONNECT www.dropbox.com:443 - DIRECT/126.96.36.199 -
1306911722.025 98963 ::1 TCP_MISS/200 16674 CONNECT www.dropbox.com:443 - DIRECT/188.8.131.52 -
1306911724.024 100945 ::1 TCP_MISS/200 80967 CONNECT www.dropbox.com:443 - DIRECT/184.108.40.206 -
1306911724.729 101649 ::1 TCP_MISS/200 118764 CONNECT www.dropbox.com:443 - DIRECT/220.127.116.11 -
1306911897.287 67655 ::1 TCP_MISS/200 87629 CONNECT dl-web.dropbox.com:443 - DIRECT/18.104.22.168 -
1306912018.476 63442 ::1 TCP_MISS/200 2420 CONNECT dl-web.dropbox.com:443 - DIRECT/22.214.171.124 -
1306912084.914 61758 ::1 TCP_MISS/200 3204 CONNECT photos-1.dropbox.com:443 - DIRECT/126.96.36.199 -
1306912085.113 61956 ::1 TCP_MISS/200 5636 CONNECT photos-1.dropbox.com:443 - DIRECT/188.8.131.52 -
1306912098.433 154744 ::1 TCP_MISS/200 55489 CONNECT www.dropbox.com:443 - DIRECT/184.108.40.206 -
Squid Proxy with ssl-bump
Squid Proxy’s ssl-bump feature allows you to man-in-the-middle (Squid-in-the-middle) the SSL traffic. Traffic from the user to Dropbox is intercepted by Squid, and a private certificate is used to terminate the SSL session. The private certificate in this case was one I knocked up using openssl, but in reality it would be a certificate from an organisation’s CA whose Trusted Root Certificate is installed on users’ desktops.
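For reference, a throwaway signing certificate like the one I used can be generated along these lines (the file names and subject are my own choices, and a real deployment would use a certificate chained to the corporate CA rather than a self-signed one):

```shell
# Generate a private key and a self-signed CA certificate (10-year validity).
# In production this would be signed by the corporate CA, whose root is
# already trusted on every desktop.
openssl req -new -newkey rsa:2048 -sha256 -days 3650 -nodes -x509 \
    -subj "/C=GB/O=Black Foundry/CN=Black Foundry Proxy CA" \
    -keyout proxyCA.key -out proxyCA.crt

# Keeping key and certificate in a single PEM file is convenient for
# pointing Squid's ssl-bump port at them.
cat proxyCA.key proxyCA.crt > proxyCA.pem
```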
When I navigate to Dropbox and enter my credentials I am prompted to accept the Firefox Security Exception, as the Trusted Root Certificate of the CA I am using is not installed. Creating a signed certificate within your organisation and rolling out the Trusted Root Certificate across all desktops is a relatively simple task.
Once I accept the security exception the site operates as normal, but I can tell that I am now using a Black Foundry certificate rather than Dropbox’s certificate.
The difference this makes to visibility is immediately evident. You can see every single GET and POST, together with the URI of each request (as if it were plain HTTP).
Again for this test I navigated to Dropbox, logged in, downloaded a photo, uploaded a photo and so on. Note: I have deleted a number of lines related to page assets (images and CSS) to make it clearer.
I downloaded a file named ‘jamie-eason_2.jpg’ and then I did a POST to delete a file. After that I browsed the photo gallery and then uploaded something in another POST.
This is a totally different level of visibility into the traffic. I could go further and dissect the POSTs to determine what each user was uploading to Dropbox. I could store the files that were uploaded and downloaded, or their hashes (MD5 or SHA), and compare them to the hashes of sensitive internal files.
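That hash comparison could be sketched roughly as follows. The watch-list format, file names and the use of SHA-256 are all my own assumptions for illustration; the extracted file here is just an empty stand-in for a body reconstructed from a bumped POST, and the watch-list entry is the SHA-256 of an empty file so the example matches.

```shell
# Hypothetical watch list: one SHA-256 hash per line, followed by a label.
# The hash below is the SHA-256 of an empty file (stand-in data).
cat > sensitive-hashes.txt <<'EOF'
e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855  stand-in-document
EOF

# A file reconstructed from a decrypted upload (empty stand-in here).
: > extracted-upload.bin

# Hash the extracted file and check it against the watch list.
HASH=$(sha256sum extracted-upload.bin | awk '{print $1}')
if grep -q "^$HASH" sensitive-hashes.txt; then
    echo "ALERT: sensitive file uploaded"   # prints: ALERT: sensitive file uploaded
fi
```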
Greater levels of inspection are inevitable, due both to the risk of exploits carried in encrypted traffic and to the risk of users leaking sensitive information. In the immediate future users are going to have to get used to higher levels of inspection and surveillance (e.g. packet capture).
The implementation will be the most important aspect (rather than whether you can decrypt traffic on the fly), because so much encrypted personal traffic (Internet Banking, Facebook, Twitter) is carried across corporate networks.
A whitelist would let Internet Banking traffic avoid decryption with the organisation’s private certificate whilst allowing inspection of all other social media and cloud file-hosting providers. A blacklist would allow the specific targeting of sites such as Dropbox but would really miss the mark: if not Dropbox, users could upload directly to Amazon S3.
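A whitelist along those lines might look something like this in squid.conf. This is a sketch only: the peek/splice/bump syntax is from later Squid releases (3.5+; older versions used ssl_bump allow/deny), and the bank domains are purely illustrative placeholders.

```
# squid.conf sketch (Squid 3.5+ syntax; illustrative only)
http_port 3128 ssl-bump cert=/etc/squid/proxyCA.pem generate-host-certificates=on

# Domains we promise never to decrypt (example banking domains)
acl no_bump ssl::server_name .examplebank.co.uk .anotherbank.com

ssl_bump peek all          # read the TLS client hello / SNI first
ssl_bump splice no_bump    # pass whitelisted traffic through untouched
ssl_bump bump all          # decrypt everything else, including Dropbox
```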
So whitelisting is the key. I just don’t want to be around for the conversation you have with your users.