On-the-fly certificate generation in an intercepting proxy causes browser errors

I have written an intercepting proxy in Python 3 that uses the man-in-the-middle technique so it can inspect and modify pages passing through it on the fly. Part of "installing" or configuring the proxy involves generating a root certificate that must be installed in the browser. Each time a new domain is requested over HTTPS through the proxy, the proxy generates a new site certificate for that domain (and caches all generated certificates to disk, so it does not have to re-create certificates for domains it has already seen), signs it with the root certificate, and uses that site certificate to talk to the browser. (And, of course, the proxy creates its own HTTPS connection to the remote server. The proxy also verifies the correctness of the remote server's certificate, in case you were wondering.)
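
Roughly, the per-domain generation looks like the sketch below. This is simplified and not the literal cert.py; the names make_site_cert, ca_cert, ca_key, key and CACHE_DIR are placeholders I am using here for illustration:

    # Simplified sketch of the flow described above -- not the literal cert.py.
    # ca_cert, ca_key, key and CACHE_DIR are placeholder names.
    import os
    from random import getrandbits
    from OpenSSL import crypto

    CACHE_DIR = "/tmp/proxy-certs"

    def make_site_cert(cn, ca_cert, ca_key, key):
        path = os.path.join(CACHE_DIR, cn + ".pem")
        if os.path.exists(path):
            # Reuse a certificate generated earlier for this domain.
            with open(path, "rb") as f:
                return crypto.load_certificate(crypto.FILETYPE_PEM, f.read())

        cert = crypto.X509()
        cert.get_subject().CN = cn                    # one specific FQDN, no wildcards
        cert.set_serial_number(getrandbits(20 * 8))   # random serial per certificate
        cert.gmtime_adj_notBefore(0)
        cert.gmtime_adj_notAfter(365 * 24 * 60 * 60)
        cert.set_issuer(ca_cert.get_subject())        # issued by the installed root
        cert.set_pubkey(key)
        cert.sign(ca_key, "sha256")                   # signed with the root's private key

        # Cache to disk so the certificate is not regenerated next time.
        with open(path, "wb") as f:
            f.write(crypto.dump_certificate(crypto.FILETYPE_PEM, cert))
        return cert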

This works fine with the surf browser. (And this may be relevant: at least in some versions, surf did not check/validate certificates at all. I can't confirm whether that applies to later versions.) But Firefox gives the error SEC_ERROR_REUSED_ISSUER_AND_SERIAL on the second (and every subsequent) HTTPS request made through the proxy, and Chromium (which I have not tested myself) gives NET::ERR_CERT_COMMON_NAME_INVALID for every HTTPS request. Obviously, these are serious problems when trying to browse through my proxy.

The SSL library I'm using is pyOpenSSL 0.14, if that matters.

Regarding the Firefox error SEC_ERROR_REUSED_ISSUER_AND_SERIAL: I am quite sure I am not reusing serial numbers. (If anyone wants to check my work, you're welcome to: cert.py - note the call "crt.set_serial_number(getrandbits(20 * 8))" on line 168.) The root certificate, of course, does not change, but it shouldn't have to change, right? I'm not sure what exactly is meant by the "issuer" in the error message if it isn't the owner of the root certificate.
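
In other words, each certificate gets a fresh 160-bit random serial, along these lines (simplified; "crt" here stands in for the X509 object built in cert.py):

    # Sketch: a 20-byte (160-bit) random serial, as on line 168 of cert.py.
    from random import getrandbits
    from OpenSSL import crypto

    crt = crypto.X509()
    crt.set_serial_number(getrandbits(20 * 8))    # fresh random value per certificate
    print(format(crt.get_serial_number(), "x"))   # prints a different serial every run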

In addition, the Firefox certificate viewer dialog shows completely different serial numbers for the different certificates generated by the proxy. (As an example, I have one generated for www.google.com with serial number 00:BF:7D:34:35:15:83:3A:6E:9B:59:49:A8:CC:88:01:BA:BE:23:A7:AD, and another generated for www.reddit.com with serial number 78:51:04:48:4B:BC:E3:96:47:AC:DA:D4:50:EF:2B:21:88:99:AC:8C.) So I'm not entirely sure what Firefox is actually complaining about.

My proxy reuses the private key (and thus the public key/modulus) for all the certificates it creates on the fly. I wondered whether this was what Firefox was objecting to, and tried modifying the code to create a new key pair for every certificate the proxy generates on the fly. This did not solve the problem in Firefox; I still get the same error message. I have yet to check whether it fixes the Chromium problem.
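
For reference, the "new key pair per certificate" variant I tried looks roughly like this (a sketch assuming RSA keys; fresh_keypair is just an illustrative name, not a function in cert.py):

    # Sketch of generating a fresh key pair for each certificate instead of
    # reusing one module-level key for all of them.
    from OpenSSL import crypto

    def fresh_keypair(bits=2048):
        key = crypto.PKey()
        key.generate_key(crypto.TYPE_RSA, bits)   # new private/public pair each call
        return key

    # Per generated site certificate:
    #   key = fresh_keypair()
    #   cert.set_pubkey(key)          # certificate carries the new public key
    #   cert.sign(ca_key, "sha256")   # still signed by the root CA's key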

Regarding the Chromium NET::ERR_CERT_COMMON_NAME_INVALID error: the common name of a site certificate should be just the domain, right? I should not include a port number or anything else, right? (Again, if anyone wants to check my work, see cert.py.) If it helps, my intercepting proxy does not use any wildcards in the certificates' common names or anything like that. Each generated certificate is for one specific FQDN.
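
To illustrate what I mean (connect_target is a hypothetical variable name; assume the host comes from the CONNECT request):

    # Illustration only: the CN is the bare FQDN, with any port stripped off.
    from OpenSSL import crypto

    connect_target = "www.example.com:443"     # as it might arrive in a CONNECT request
    cn = connect_target.rsplit(":", 1)[0]      # -> "www.example.com", no port

    req = crypto.X509Req()
    req.get_subject().CN = cn                  # just the domain, nothing else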

I am quite sure that what I am attempting is possible, i.e. that it is possible to make this work with Firefox and Chrome (or Chromium, IE, etc.). A company I used to work for bought and set up a man-in-the-middle proxy through which all traffic from inside the corporate network had to pass. The PC administrators at that company installed a self-signed certificate as a certification authority in every browser on every company-owned computer used by employees, and as a result there were never any errors like the ones Firefox and Chromium give me with the certificates generated by my own intercepting proxy software. Perhaps the PC administrators configured some about:config settings in Firefox to make it all work, or something like that, but I doubt it.

To be fair, the proxy used by that company operated at the network or transport layer, not at the application layer like mine. But I would expect the same thing to be possible in an application-level HTTP(S) proxy.

Edit: I tried setting subjectAltName as suggested by brain99. Below is the line I added, at the location brain99 suggested:

r.add_extensions([crypto.X509Extension(b"subjectAltName", False, b"DNS:" + cn.encode("UTF-8"))])

I still get SEC_ERROR_REUSED_ISSUER_AND_SERIAL from Firefox (on the second and subsequent HTTPS requests), and I now get ERR_SSL_SERVER_CERT_BAD_FORMAT from Chromium.

Here are some proxy-generated certificates:

google.com: https://pastebin.com/YNr4zfZu

stackoverflow.com: https://pastebin.com/veT8sXZ4

firefox google-chrome certificate ssl proxy
1 answer

I noticed that you only set the CN in your X509Req. Chrome and Firefox require the subjectAltName extension; see for example this Chrome help page or this Mozilla wiki page for a discussion of CA requirements and recommended practices. To quote from the Mozilla wiki:

Some CAs mistakenly believe that one primary DNS name should go into the subject Common Name and all the others into the SAN.

According to the CA/Browser Forum Baseline Requirements:

  • BR #9.2.1 (section 7.1.4.2.1 in BR version 1.3), Subject Alternative Name extension
    • Required/Optional: Required
    • Contents: This extension MUST contain at least one entry. Each entry MUST be either a dNSName containing the fully qualified domain name or an iPAddress containing the IP address of the server.

You can do this easily with pyOpenSSL:

    if not os.path.exists(path):
        r = crypto.X509Req()
        r.get_subject().CN = cn
        r.add_extensions([crypto.X509Extension(b"subjectAltName", False, b"DNS:" + cn.encode("utf-8"))])
        r.set_pubkey(key)
        r.sign(key, "sha1")

If this does not solve the problem, or only partially solves it, please post one or two sample certificates that exhibit the problem.


In addition, I also noticed that you are signing with SHA-1. Note that certificates signed with SHA-1 are deprecated in several major browsers, so I would suggest switching to SHA-256:

 r.sign(key, "sha256") 