HTTPS Proxy and Script Beautifier
Minification helps developers protect their intellectual property and reduce bandwidth, but it gets in the way of finding high-quality XSS or other code-injection holes, like indirect evals off XMLHttpRequest responses. This article adds a new hammer to the toolbox to help with the problem: an integrated HTTP/HTTPS proxy (without SSL stripping), inline beautifier, and local caching.

Usage:
$ openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout mycert.pem -out mycert.pem
$ # add CA to browser (Firefox: Preferences -> Advanced -> Encryption -> View Certificates -> Import -> Trust CA to identify websites)
$ # set browser proxy (Firefox: Preferences -> Advanced -> Network -> Settings -> Manual Proxy -> HTTP Proxy 127.0.0.1 Port 8080)
$ python proxy.py 8080 mycert.pem

HTTP(S) Proxy

What we need is an interface that can collect and modify relevant HTTP/HTTPS traffic (i.e., a man in the middle), which is more generally recognized as an HTTP/HTTPS proxy. Moxie Marlinspike made waves releasing sslstrip, and mitmproxy is excellent as well. pdp followed soon after with httpservers.py. Fiddler appears to do the same, but it's closed source. The open source tools provide a great place to turn to when code stops working as expected (especially the SSL tunnels).

I find my final product more elegant, particularly in how ProxyHandlers stack so that the last one always sees clear-text traffic. The self-signed SSL certificate still needs to be added to your browser's trusted CAs, like the others, but the proxy automatically generates custom certificates per destination. Bonus: integrated support for gzip and zlib compression.
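
To make the clear-text claim concrete, the CONNECT unwrap boils down to something like the following. This is a simplified sketch rather than the attached proxy.py: it reuses mycert.pem for every host instead of generating per-destination certificates, and it skips the upstream forwarding.

# Simplified sketch of intercepting a CONNECT tunnel (not the attached proxy.py).
# mycert.pem is the self-signed cert/key pair created by the openssl command below.
import BaseHTTPServer, ssl

class MITMHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_CONNECT(self):
        # Tell the browser its tunnel is up, then wrap our side of the socket
        # with our own certificate; the handshake only succeeds because
        # mycert.pem was imported as a trusted CA.
        self.connection.sendall('HTTP/1.1 200 Connection Established\r\n\r\n')
        self.connection = ssl.wrap_socket(self.connection, certfile='mycert.pem',
                                          server_side=True)
        self.rfile = self.connection.makefile('rb', 0)
        self.wfile = self.connection.makefile('wb', 0)
        self.close_connection = 0   # keep parsing the now clear-text requests

    def do_GET(self):
        # Requests arriving through the tunnel show up here decrypted; a real
        # proxy would forward them upstream, then beautify and cache the reply.
        print self.command, self.path, self.headers.getheader('host')
        self.send_error(501, 'forwarding left out of this sketch')

if __name__ == '__main__':
    BaseHTTPServer.HTTPServer(('127.0.0.1', 8080), MITMHandler).serve_forever()

Once the socket is wrapped, everything downstream of the handler (beautifier, cache) deals only with plain HTTP.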

openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout mycert.pem -out mycert.pem
# add CA to browser (Firefox: Preferences -> Advanced -> Encryption -> View Certificates -> Import -> Trust CA to identify websites)
# set as browser proxy (Firefox: Preferences -> Advanced -> Network -> Settings -> Manual Proxy -> HTTP Proxy 127.0.0.1 Port 8080)
python proxy.py 8080 mycert.pem
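
The per-destination certificates come from pyOpenSSL (the python-openssl package installed in the next section). Here's a rough sketch of that step; the function name and serial-number choice are mine, not lifted from proxy.py. The mycert.pem created above acts as the CA that signs each throwaway host certificate.

# Rough sketch of on-the-fly per-host certificate generation with pyOpenSSL.
# Names here are illustrative, not from the attached source.
import time
from OpenSSL import crypto

CA_PEM = open('mycert.pem').read()
ca_cert = crypto.load_certificate(crypto.FILETYPE_PEM, CA_PEM)
ca_key = crypto.load_privatekey(crypto.FILETYPE_PEM, CA_PEM)

def make_host_cert(hostname):
    # Fresh key pair for this destination.
    key = crypto.PKey()
    key.generate_key(crypto.TYPE_RSA, 1024)
    # Certificate whose CN matches the host the browser asked for,
    # signed by the CA the browser already trusts.
    cert = crypto.X509()
    cert.get_subject().CN = hostname
    cert.set_serial_number(int(time.time() * 1000))
    cert.gmtime_adj_notBefore(0)
    cert.gmtime_adj_notAfter(365 * 24 * 60 * 60)
    cert.set_issuer(ca_cert.get_subject())
    cert.set_pubkey(key)
    cert.sign(ca_key, 'sha1')
    return (crypto.dump_certificate(crypto.FILETYPE_PEM, cert),
            crypto.dump_privatekey(crypto.FILETYPE_PEM, key))

Write the returned PEM pair to a temporary file and hand it to ssl.wrap_socket in place of mycert.pem, and the browser sees a certificate whose common name matches the site it asked for.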


Inline Beautifier

The primary motivation was inline beautification of script-like text from interesting servers. js-beautify, via GitHub, makes this painless once the files are clear text. Beautifying .js files was a huge step forward, but I found more and more webservers (Rails/web2py) automatically concatenating and inlining scripts, so BeautifulSoup was roped in to pull out <script> sections for js-beautify to fix up (see the sketch after the install commands).

sudo apt-get -y install python-beautifulsoup python-openssl git
git clone https://github.com/einars/js-beautify.git
cd js-beautify/python
sudo python setup.py install
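
Roughly, the beautify step looks like the helper below. It's illustrative (the beautify_response name and content-type checks are mine, not from proxy.py): plain .js responses go straight through jsbeautifier, while HTML responses get their inline <script> bodies rewritten via BeautifulSoup.

# Illustrative sketch of the beautify step; names are assumptions, not proxy.py.
from BeautifulSoup import BeautifulSoup   # python-beautifulsoup (BS3)
import jsbeautifier                       # installed from js-beautify/python

def beautify_response(content_type, body):
    if 'javascript' in content_type:
        return jsbeautifier.beautify(body)          # plain .js responses
    if 'html' in content_type:
        soup = BeautifulSoup(body)
        for tag in soup.findAll('script'):
            if tag.string:                          # inline script, not src=
                tag.string.replaceWith(jsbeautifier.beautify(tag.string))
        return str(soup)
    return body                                     # leave everything else alone

Dropping something like this between the decompression step and the cache write is enough to make view-source readable everywhere.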


Local Caching

Finally, the second reason behind all this was to support modifying target sites for enhanced access, debugging, and visibility. All requests and responses are cached (headers and data) post-beautification and are seamlessly open to modification with your editor of choice. Everything else behaves completely as normal from the browser's perspective, allowing exceptionally rich analysis of JavaScript frameworks.
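
As a sketch of the idea (assumed names and layout, not the attached cache.py): each request maps to a pair of files on disk, one for headers and one for the body, so either can be edited in place and re-served on the next refresh.

# Sketch of a simple on-disk cache keyed by the request; illustrative only.
import os, hashlib

CACHE_DIR = 'cache'

def cache_paths(method, url):
    key = hashlib.sha1(method + ' ' + url).hexdigest()
    return (os.path.join(CACHE_DIR, key + '.headers'),
            os.path.join(CACHE_DIR, key + '.data'))

def cache_store(method, url, headers, body):
    # Written post-beautification, so the files on disk are already readable.
    if not os.path.isdir(CACHE_DIR):
        os.makedirs(CACHE_DIR)
    hpath, dpath = cache_paths(method, url)
    open(hpath, 'wb').write(headers)
    open(dpath, 'wb').write(body)

def cache_load(method, url):
    # Returns (headers, body) if this request has been seen before, else None;
    # serving from here instead of the network is what lets local edits show
    # up in the browser on the next refresh.
    hpath, dpath = cache_paths(method, url)
    if os.path.exists(hpath) and os.path.exists(dpath):
        return open(hpath, 'rb').read(), open(dpath, 'rb').read()
    return None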


Python Source

proxy.py
cache.py


Quick Accuracy Test

It turns out constantly ctrl+shift+F5-ing to test broken HTTPS sucks, so I made a quick script to automate it...
#!/bin/bash
#wget -e "https_proxy=$proxy" -e "http_proxy=$proxy" --no-check-certificate https://encrypted.google.com -O garbage

proxy="127.0.0.1:8080"
options="--no-check-certificate"

url=${1:-"https://encrypted.google.com"}
out=${2:-"google.html"}

export http_proxy="$proxy"
export https_proxy="$proxy"
wget $options "$url" -O "$out"

(I originally developed/deployed this several months ago. I probably missed important details on recollection; please shoot me an email if you catch 'em.)


- Kelson (kelson@shysecurity.com)