Description
I want the script to extract a list of FQDNs from SNI, as well as any returned certificate Common Names (and SANs), and then attempt to resolve each of them.
For any which are unresolvable, the FQDN, associated IP and port should be recorded
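A minimal sketch of the resolution side of that check (the host call and output file name here are illustrative placeholders, not the script's existing conventions):

    # Sketch: record any FQDN that doesn't resolve, along with the IP/port it was seen on
    check_fqdn() {
        local fqdn="$1" ip="$2" port="$3"
        if ! host "$fqdn" > /dev/null 2>&1; then
            echo "$fqdn,$ip,$port" >> unresolvable_fqdns.csv
        fi
    }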
Activity
2016-02-03 11:22:25
Unadvertised Services
Seeing (for example) an HTTPS connection go out with foo.bar.invalid as the SNI FQDN suggests that the destination server has a service responding to that FQDN. The fact that it isn't advertised in DNS makes it potentially interesting, as an apparent attempt has been made to keep it hidden from public view.
Identifying Tor Usage
Depending on the version of the Tor client the user is running, the value of the SNI and the certificate Common Name during an SSL handshake can help identify connections to a Tor Entry Guard.
The SNI FQDN tends to be a random (at least in appearance) string (e.g. www.3avkpvvrqtgkdk.com) with the guard then returning a different string as the cert Common Name (e.g. www.52iaby6bzurz7c4gugy.net)
So within tshark we'd be looking at the following fields:
- ssl.handshake.extensions_server_name
- x509sat.printableString
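For example, both could be pulled from a capture with something along these lines (a sketch only - the exact invocation will need to fit how the script already drives tshark, and $PCAP_FILE is just a placeholder):

    # Sketch: extract SNI names and certificate name strings, one comma-separated row per packet
    tshark -r "$PCAP_FILE" \
        -Y "ssl.handshake.extensions_server_name or x509sat.printableString" \
        -T fields -E separator=, \
        -e ip.src -e ip.dst -e tcp.dstport \
        -e ssl.handshake.extensions_server_name \
        -e x509sat.printableString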
The presence of two non-resolvable FQDNs in one handshake is a reasonable starting indicator that the connection may be Tor related. For further confirmation, any IPs meeting that criterion could then be cross-compared against the publicly available list of Tor nodes.
Whilst this method is more reliable than relying on destination port numbers (as a node's ORPort is configurable), it is contingent on tshark recognising that the SSL dissector should be applied to the stream (which won't be the case for some ports), so we may need to look into sane ways of forcing that. To begin with, though, simply grabbing the low-hanging fruit is a reasonable starting point.
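tshark's decode-as option is probably the sane way to force it where we know (or suspect) the port in use; something like the following, with the port number purely illustrative:

    # Sketch: force the SSL dissector onto traffic using a non-standard port (8443 is just an example)
    tshark -r "$PCAP_FILE" -d tcp.port==8443,ssl \
        -Y ssl.handshake -T fields -e ssl.handshake.extensions_server_name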
2016-02-03 11:42:49
Would then need to adjust processing of that temp file (https://github.com/bentasker/PCAPAnalyseandReport/blob/c157a45136b1fa11f9a2c09b67e1779530d24c01/PCAP_Analysis.sh#L499) to pull them out
If we add an additional printf into that loop we can simply write the IP, port, SNI name, CN and SANs out to a separate report file for processing later.
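Something like the line below would do it (variable and file names are illustrative rather than lifted from the existing loop):

    # Sketch: inside the existing per-handshake loop, write the fields out for later processing
    printf '%s,%s,%s,%s,%s\n' "$IP" "$PORT" "$SNINAME" "$CERTCN" "$CERTSANS" >> "$TMPDIR/sslnames.csv"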
In effect, it'll be very similar to the existing visitedsites.csv report, except that that report doesn't include certificate CN/SANs. An alternative might simply be to update that report to include the source of each FQDN (i.e. Host Header, SNI Hostname, Cert CN, Cert SAN) and then use it as an input here, avoiding duplication.
No part of that is active, so we don't need to honour PASSIVE_ONLY at this point.
2016-02-03 11:45:03
So splitting that into multiple parts so we can include a source identifier makes sense. We can then look at adjusting the generation of sslrequests.txt so that there are columns for CN, SANs etc. (or, if needed, dump them out to a separate temp file).
2016-02-03 11:51:27
Webhook User-Agent
View Commit
2016-02-03 12:19:06
One option for a little further down the road: PAS-13 will be implementing a DNS transaction log, so we could conceivably turn this into a passive check by searching its output for lookups of the extracted FQDNs. We can only extract SNI/CN etc. at the time of the handshake, so for the average connection there will likely have been a DNS lookup just before (assuming a previous query hasn't been cached).
But for the connections that this issue is interested in, there will likely have been no lookup (or at the very least, an NXDOMAIN). So, we could create a "lite" version of this feature which uses the information PAS-13 will ultimately capture.
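In rough terms, the passive variant would just check each extracted name against that log (the log file name and format are assumed here, pending PAS-13; the input file matches the sketch above):

    # Sketch: flag names which never appear in the DNS transaction log
    while IFS=, read -r ip port sni cn sans; do
        for name in "$sni" "$cn"; do
            [ -n "$name" ] || continue
            if ! grep -qF "$name" "$TMPDIR/dns_transactions.log"; then
                echo "$name,$ip,$port" >> unadvertised_services.csv
            fi
        done
    done < "$TMPDIR/sslnames.csv"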
I think it needs to be optional, so the active version of the check still needs to be implemented. As well as being controlled by PASSIVE_ONLY, it might be worth defining a configuration option to allow other active checks to be used alongside the passive version of this check.
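i.e. gating along these lines (the function and option names are placeholders, and the exact test would need to follow however PASSIVE_ONLY is already handled in the script):

    # Sketch: always run the passive check, only do the active lookups when allowed
    run_passive_fqdn_check
    if [ "$PASSIVE_ONLY" != "y" ] || [ "$ALLOW_ACTIVE_FQDN_CHECK" == "y" ]; then
        run_active_fqdn_check
    fi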
Either way, the passive version is currently blocked by PAS-13
2016-02-03 12:23:22
It does currently include an empty SNI line, but that should be reasonably easy to correct, so I'm going to come back to it once the lookups are implemented.
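One way of correcting it, assuming the SNI ends up as the third comma-separated column of the temp file (file names are placeholders):

    # Sketch: drop rows where the SNI column is empty before further processing
    awk -F, '$3 != ""' "$TMPDIR/sslnames.csv" > "$TMPDIR/sslnames.filtered.csv"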
2016-02-03 12:23:27
Webhook User-Agent
View Commit
2016-02-03 13:13:59
The report will contain the following columns:
- Src IP
- Dest IP
- Src IPv6
- Dest IPv6
- Src Port
- Dest Port
- SNI Name
- Certificate Names
Rows should all be unique, though a single connection may generate more than one row (for example where the certificate names contain two unresolvable names). For a Tor connection that'll almost certainly be the case, as the issuer name will likely also be unresolvable.
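Enforcing that uniqueness should just be a case of running the collected rows through sort -u before the report is written (file names here are illustrative):

    # Sketch: de-duplicate the collected rows under a header line
    echo "Src IP,Dest IP,Src IPv6,Dest IPv6,Src Port,Dest Port,SNI Name,Certificate Names" > unresolvable_names.csv
    sort -u "$TMPDIR/unresolvable_rows.tmp" >> unresolvable_names.csv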
2016-02-03 13:29:32
Webhook User-Agent
View Commit
2016-02-03 14:18:25
I'll raise a separate FR for that, though, as I don't want this issue becoming too Tor-specific - Raised PAS-28