Summerschool Aachen 2004/Network Reconnaissance Lab
Latest revision as of 17:36, 24 November 2010
== Notes on Lab Session ==
=== Fingerprinting ===
We also worked on our mandatory assignment. Our idea was basically to first parse the http://cr.yp.to/surveys/dns1.html page with a perl script and load the fingerprints into our data structure, into which the fingerprints of the ftp and smtp servers were also to be loaded. We planned to use the already implemented functions which compute the fingerprint of a server, pipe their output to our script, and compare it with the values stored in our data structure. The user was expected to call our script with a parameter N, and the script was supposed to output the names of the servers corresponding to the N stored fingerprints that best match the observed one. However, my project partner left the group at 6.30 pm, so I was left alone and consequently joined another project group.

--[[Samad Nasserian]]
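The best-N matching step described here was never finished; as an illustration only (this is not the group's actual code, and the fingerprint rows are made up, merely shaped like djb's survey table), a minimal sketch in Python could rank stored fingerprints by how many probe results agree with the observed one:

<pre>
def best_matches(observed, database, n):
    """Rank stored fingerprints by how many per-probe results
    agree with the observed fingerprint; return the n best names.

    observed: list of per-probe result strings
    database: dict mapping server name -> list of result strings
    """
    def score(stored):
        return sum(1 for a, b in zip(observed, stored) if a == b)
    ranked = sorted(database, key=lambda name: score(database[name]), reverse=True)
    return ranked[:n]

# Hypothetical example data, loosely shaped like djb's survey table.
db = {
    'BIND 9.2':     ['4q', '5', '5', '1q', '2'],
    'BIND 8.3':     ['4q', '5', '1', '1q', '2'],
    'tinydns 1.05': ['1',  't', 't', 't',  't'],
}
print(best_matches(['4q', '5', '5', '1q', '2'], db, 2))  # → ['BIND 9.2', 'BIND 8.3']
</pre>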
=== More SNMP Stuff ===
I decided to go for the suggestion of SNMP fingerprinting. Today I was mainly busy (or better: idle) setting up and using a scanner which would scan the 137.229.0.0/16 network for hosts that speak SNMP. I used perl for that, together with the Net::SNMP module from CPAN. The scanner would send SNMP GET requests to every IP address for the OID system.sysDescr.0 and log the results. I scanned in parallel, sending out about 15 requests per second.

In the end, I had files containing a lot of lines saying that the queried host does not answer to SNMP requests, and 1268 lines of responses from hosts which had an snmpd running, containing the description of the system. There were a lot of HP JetPrint printers, Cisco devices, some Windows 2000 systems and some Linux boxes. I did not get round to much fingerprinting, apart from the trivial thing to do (considering the content of the system.sysDescr field...). I uploaded the results of my scans, in case anyone wants to have that information.

Besides that, I got sidetracked and played a bit with some of the printers. As is not surprising, the somewhat homegrown webservers they were running were not very well implemented...
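The scanner above used Net::SNMP, but the probe itself is tiny. As an illustration only (hand-rolled, not the scanner's actual code; the community string 'public' is an assumption), an SNMPv1 GET for system.sysDescr.0 can be BER-encoded in a few lines; sending it is then just a UDP datagram to port 161:

<pre>
def ber(tag, payload):
    # BER TLV with short-form length (fine for packets under 128 bytes)
    assert len(payload) < 128
    return bytes([tag, len(payload)]) + payload

def encode_oid(oid):
    # The first two arcs share one byte; all arcs here are < 128,
    # so no multi-byte arc encoding is needed.
    arcs = [int(a) for a in oid.split('.')]
    return ber(0x06, bytes([40 * arcs[0] + arcs[1]] + arcs[2:]))

def snmpv1_get(community, oid, request_id=1):
    varbind = ber(0x30, encode_oid(oid) + ber(0x05, b''))  # OID + NULL value
    pdu = ber(0xA0,                                        # GetRequest-PDU
              ber(0x02, bytes([request_id])) +             # request-id
              ber(0x02, b'\x00') +                         # error-status
              ber(0x02, b'\x00') +                         # error-index
              ber(0x30, varbind))                          # varbind list
    return ber(0x30,
               ber(0x02, b'\x00') +                        # version = SNMPv1
               ber(0x04, community.encode()) +             # community string
               pdu)

pkt = snmpv1_get('public', '1.3.6.1.2.1.1.1.0')  # system.sysDescr.0
# send with: socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(pkt, (host, 161))
</pre>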
=== Some more fingerprinting ===
After having been one of the guys complaining about the assignment, too, I nevertheless sat down and started coding. After putting some thought into it and maybe also reducing my own expectations, I found a way to get the whole thing going with not too many lines of perl code.
The script right now is still a bit messy: it depends on certain subroutines being available inside the script itself instead of using a modular or OO-based structure. But it is conceptually extendable and not too hard to clean up. I then implemented the smtp parsing and scanning as a proof of concept. It should be possible to add other protocols quite easily.
The only problem I still have left is one of matching two strings. I definitely get the expected status codes from the servers I am fingerprinting, only something inside my script doesn't want me to successfully match against the database :-)
I didn't like the assignment either, but after I had complained quite a bit I sat down and had a look at ftpmap. As PHP (heh, you like it, don't you? ;)) is the scripting language I know best, I started implementing a parser for the ftpmap fingerprint files in PHP. The hardest part was understanding how ftpmap created its checksums on ftpd responses and how it tested these checksums against the fingerprint database; cpunkt helped me a great deal with that. After these difficulties were sorted, I worked until 21:00 and everything seemed to almost (but not quite) work, but it was getting late and so I went home, deciding to finish the project later that evening.
But the PHP code I wrote seemed very messy and actually did not work as intended, so I decided to learn Python and start from scratch. The code looks much cleaner now, but still doesn't work as intended ;). At around 04:00 I decided to give up and go to bed.
All in all the task was very frustrating. It was not about network reconnaissance or fingerprinting at all but rather about understanding someone else's code and parsing text files. I became even more frustrated after hearing what some of the people not working on the assignment had done yesterday, because those were all the things I wanted to learn or try while attending the Summerschool. I feel like I wasted a lot of time I could have put to much better use.
-- [[Lutz Böhne]]
=== SNMP Reconnaissance ===
This is a placeholder for the results of the SNMP scanning I've been doing, but here's a list of default passwords that others might find useful.
=== Fingerprinting ===
So we gave out some mandatory work for today. It was considered boring and frustrating by most, and they considered all other possibilities more entertaining. Is that a pattern about the grass being greener elsewhere? Alexander seemed to consider the requirement of doing something he doesn't enjoy for a whole afternoon too hard and left without notice. I'm disappointed about that.
To find out if the task was really unbearable I sat down myself and implemented what I asked for. The basic parser was quickly done:
<pre>
def loadFingerprints(self):
    # seek list of probes
    for l in sys.stdin:
        if 'Here are the DNS packets sent by the surveying program:' in l:
            break
    for l in sys.stdin:
        if l.startswith('<tr><td align=right>'):
            fields = l.split('<td>')
            # this IS exploitable
            tests.append((eval(fields[1].strip('</>tdtr')),
                          fields[2].strip('</>tdtr</td></tr>\n')))
        if '</table>' in l:
            break
    # seek list of probes
    for l in sys.stdin:
        if not l.startswith('<tr><td>'):
            continue
        if l.startswith('<tr><td>Software</td>'):
            continue
        probes.append([x.replace('</td>', '').replace('tr>', '').strip(' </\n')
                       for x in l.split('<td>')])
</pre>
Crude, but works. Mostly. I get entries like
<pre>
[, 'BIND 9.2', '4q', '5', '5', '1q', '2', '1q', '1q', '1q', '1q', '3AA', '0AA', '3AA', '3AA', '3AA', '3AA', '3AA', '4q', '4q', '4q', '3AA', '3AA', '5', '0AAD, 2, 5']
</pre>
that is fine, but others are not
<pre>
[, '1', '1', 't', 't', 't', 't', 't', 't', '1', 't', '0', 't', '0', '15', '0Z0', '0', '0', 't', 't', 't', '0', '0', 't', '4']
</pre>
I decided to leave that problem for later.
Scanning was easy now:
<pre>
def scanTargets(self, targetlist, timeout=1):
    for target in targetlist:
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.settimeout(timeout)
        s.connect((target, 53))
        for test, desc in tests:
            flags = []
            reply = None
            retries = 5
            while 1:
                print 'sending %r ...' % test,
                s.send(test)
                try:
                    reply = s.recv(1500)
                    print repr(reply)
                    break
                except socket.timeout:
                    print "timeout"
                    retries -= 1
                    if retries < 0:
                        flags.append('t')
                        break
            if reply:
                flags.extend(self.checkFlags(reply))
            print "xxx", flags
</pre>
I did parse the response with pydns:
<pre>
def checkFlags(self, reply):
    flags = []
    u = DNS.Lib.Munpacker(reply)
    r = DNS.Lib.DnsResult(u, [])
    # check RCODE
    flags.append(r.header['rcode'])
    if r.header['tc']:
        flags.append('TC')
    if r.header['rd']:
        flags.append('RD')
    if r.header['aa']:
        flags.append('AA')
    if r.answers:
        flags.append('D')
    if len(r.questions) == 0:
        flags.append('q')
    if len(r.questions) == 0:
        flags.append('Q2')
    # X is missing
    # print vars(r)
    return flags
</pre>
But I failed to implement matching against the fingerprints database. I also got far too many timeouts on my DNS queries. I didn't investigate further. :-(
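For what it's worth, the missing matching step might look roughly like this: a sketch with hypothetical table rows in the flag-string format shown above, where probes that timed out (flag 't') are skipped rather than counted against a candidate:

<pre>
def match_fingerprint(observed, table):
    """Return (best score, names) over rows of per-probe flag strings.

    observed: flag strings collected by the scanner, 't' meaning timeout
    table: list of (name, flag strings) rows parsed from the survey page
    """
    best = (-1.0, [])
    for name, row in table:
        pairs = [(a, b) for a, b in zip(observed, row) if a != 't']
        if not pairs:
            continue
        score = sum(a == b for a, b in pairs) / float(len(pairs))
        if score > best[0]:
            best = (score, [name])
        elif score == best[0]:
            best[1].append(name)
    return best

# Hypothetical rows, shaped like the parsed entries above.
table = [('BIND 9.2', ['4q', '5', '5', '1q']),
         ('tinydns',  ['1',  't', 't', 't'])]
print(match_fingerprint(['4q', 't', '5', '1q'], table))  # → (1.0, ['BIND 9.2'])
</pre>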
-- [[MaxDornseif]]
=== Automatic feature extraction for HTTP Server fingerprinting ===
A lot of the fingerprinting techniques we have seen during the lecture rely on manually identifying features that could indicate the type and version of software. The most obvious are banners, string formats, particular commands or headers, etc. During the lab session I wrote a set of scripts that, given a few thousand HTTP headers, extract strings that could be used as features to identify servers.
First of all I needed a database of HTTP server reply headers. To build one I crawled the [http://dmoz.org dmoz.org] directory and extracted all the domains. Then I sent requests and stored the reply headers.
How to suck the dmoz site:
<pre>
$ wget -k -r -l 2 -A html -O out.html http://dmoz.org
</pre>
How to extract the URLs: (Easy perl!)
<pre>
$ perl -ne 'if ($_ =~ /(http:\/\/[^\/") ]*)/) {print "$1\n";}' out.html | sort -u > hostlist.html
</pre>
A python script to create the reply header files: (note that curl works like wget)
<pre>
$ less web/heads/getcmds.sh
import os
f = file('hostlist.html','r')
x = 0
for i in f:
    os.system('curl --socks 172.17.23.1:1080 --connect-timeout 2 -i -o /dev/null -D out-%s.txt --connect-timeout 2 -v %s' % (str(x),i[:-1]))
    x += 1
</pre>
The feature extraction algorithm works stochastically by comparing random pairs of headers and extracting all matching strings longer than 3 characters. The matching strings are then sorted and the duplicates are discarded.
<pre>
# Cross correlation of two strings
# Based on AND not (*)
#
# This extracts the strings that are matching
def xstr(s1,s2):
    a = s1.lower()
    b = s2.lower()
    ss = []
    s = ''
    for x in range(0,len(a)):
        for y in range(0,len(b)):
            if a[(x + y) % len(a)] == b[y % len(b)]:
                s += a[(x +y)% len(a)]
            else:
                s = s.strip()
                if len(s) > 3:
                    ss +=[s]
                s = ''
    return ss

# Pick random pairs of header files and extract the matching strings
def compxstr(fname):
    w = file(fname,'w')       # file to save output
    f = gettraces()           # list of trace files
    random.shuffle(f)         # shuffle it around
    f2 = gettraces()          # A copy of the list of header files
    r = []
    for i in range(0,len(f)):
        print i
        s1 = file(f[i]).read()
        s2 = file(f2[i]).read()
        r += xstr(s1,s2)      # Extract all matching strings
    r.sort()                  # Sort all matches
    r2 = []
    p = ''
    for x in r:
        if p != x:            # Deduplicate
            p = x
            w.write('%s\n' % x)  # Save to file
            r2 += [x]         # Keep a clean, sorted, deduplicated list
    w.close()
</pre>
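As an aside (this is not part of the original scripts): the standard library's difflib can find the matching blocks of a pair of headers directly, which gives much the same per-pair feature extraction; the sample headers below are made up:

<pre>
import difflib

def common_features(h1, h2, minlen=4):
    # Find maximal matching blocks between two lowercased headers and
    # keep the stripped substrings of at least minlen characters.
    a, b = h1.lower(), h2.lower()
    sm = difflib.SequenceMatcher(None, a, b)
    feats = set()
    for m in sm.get_matching_blocks():
        s = a[m.a:m.a + m.size].strip()
        if len(s) >= minlen:
            feats.add(s)
    return sorted(feats)

h1 = "server: apache/1.3.29 (unix)\ndate: mon, 27 sep 2004"
h2 = "server: apache/2.0.40 (red hat)\ndate: tue, 28 sep 2004"
print(common_features(h1, h2))
</pre>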
The list of features extracted is the following. As expected it contains headers, version numbers, options and names.
<pre>
, (unix) frontpage/, (unix) frontpage/5.0.2.2, (unix) mod_, (unix) mod_ssl/2.8., , ) mod_, , 01, , 01 j, , 11, , 16, , 19, , 25, , 26, , 27, , 27 sep 200, , 27-s ep-, , 27-sep-200, -200, -4050, -412, -414, -age, -cache, -check=0, -encoding, - , .com., .com/, .com/index.htm, .com/w, .com/w3c/p3p.xml", cp=", .htm, .html, .n , /0.1, /1.0, /1.1, /1.2, /1.3, /1.3., /1.3.2, /1.3.26, /1.3.3, /1.3.31 (unix), /1.4, /2.0, /2.0., /2.0.4, /4.0, /4.0., /4.1, /4.1., /5.0, /5.0., /default.htm, , 00 gmt, 00:00, 00:00:, 00:00:00 gmt, 02 m, 03:17, 04 0, 04 1, 04 14:4, 04 14:5 , 1.2 mod_, 1.3., 1.3.2, 1.3.29, 109629, 13:0, 13:3, 14:4, 14:42, 14:5, 14a6, 15 , 2001, 2002, 2002 1, 2002 11:4, 2003, 2003 0, 2004, 2004 0, 2004 04:, 2004 1, 2 004 13:, 2004 14:, 2004 14:0, 2004 14:2, 2004 14:4, 2004 15:, 2004 15:02:, 2004 , : 1., : 1.1, : apache, : asp, : asp.net, : ch, : fr, : fri,, : http://, : http ://www., : max, : mi, : miss from, : mo, : mon, : mon,, : mon, 2, : mon, 27, : m on, 27 sep 2004, : mon, 27 sep 2004 1, : mon, 27 sep 2004 14:, : mon, 27 sep 200 4 14:4, : mon, 27 sep 2004 14:45:, : mon, 27 sep 2004 14:46:, : mon, 27 sep 2004 14:55:, : mon, 27 sep 2004 15:, : mon, 27 sep 2004 15:0, : mon, 27 sep 2004 15: 1, : mon, 27 sep 2004 15:2, : mon, 27 sep 2004 15:28:, : mon, 27 sep 2004 15:3, , : php, : po, : sun,, : th, : thu, : thu,, : thu, 1, : tue,, : we, : wed, 2, :0 , :06 gmt, :06:, :06:5, :07:, :08 gmt, :08:, :11 gmt, :13:, :14 gmt, :15 gmt, :1 , ; charset=iso-8859-1, ; expires=, ; expires=tue, 2, ; expires=tue, 27-sep-2005 , ; path=/; domain=, ; path=/; domain=., ; path=/; domain=.tripod.com, _id=, _lo g, _server/, a con, a iva, a mod_, a mod_ssl/2.8., a php, a ps, a psaa, a.co, ac , ache-, age=, ange, apache, apache/, apache/1.3., apache/1.3.2, apache/1.3.26, , bytes, c, c21:, c41:, cache, cache-, cache-control:, cache-control: max-age=, cache-control: no-, cache-control: no-cache, cache-control: no-cache="set-cookie , cache-control: no-cache="set-cookie,set-cookie2",
cache-control: no-store, no- cache, must-revalidate, cache-control: no-store, no-cache, must-revalidate, post , cache-control: p, cache-control: private, cation, cept, cfid=, cftoken=, cgi/, char, charset, charset=, charset=iso-8859-, charset=iso-8859-1, con, conne, con , content-length: 1, content-length: 10, content-length: 12, content-length: 14, , content-length: 3, content-length: 4, content-length: 45, content-length: 47, content-length: 5, content-length: 58, content-length: 6, content-length: 7, con tent-length: 73, content-length: 9, content-length: 98, content-location:, conte , content-location: http://w, content-t, content-type: text/, content-type: text , content-type: text/html;, content-type: text/html; charset=, content-type: tex t/html; charset=gbk, content-type: text/html; charset=iso-8859-, content-type: t , content-type: text/html; charset=utf-8, content-type: text/plain, control: no- , d5-415, date, date:, date: mon, date: mon, 27 sep 2004, date: mon, 27 sep 2004 1, date: mon, 27 sep 2004 14:, date: mon, 27 sep 2004 14:3, date: mon, 27 sep 2 004 14:4, date: mon, 27 sep 2004 14:40:, date: mon, 27 sep 2004 14:42:56 gmt, da te: mon, 27 sep 2004 14:45:, date: mon, 27 sep 2004 14:46:, date: mon, 27 sep 20 04 14:46:3, date: mon, 27 sep 2004 14:5, date: mon, 27 sep 2004 14:54:, date: mo n, 27 sep 2004 14:54:0, date: mon, 27 sep 2004 14:55:, date: mon, 27 sep 2004 14 :55:2, date: mon, 27 sep 2004 15:, date: mon, 27 sep 2004 15:0, date: mon, 27 se p 2004 15:01:, date: mon, 27 sep 2004 15:03:, date: mon, 27 sep 2004 15:04:, dat e: mon, 27 sep 2004 15:06:, date: mon, 27 sep 2004 15:1, date: mon, 27 sep 2004 15:14:, date: mon, 27 sep 2004 15:15:, date: mon, 27 sep 2004 15:18:, date: mon, 27 sep 2004 15:19:, date: mon, 27 sep 2004 15:2, date: mon, 27 sep 2004 15:26:, date: mon, 27 sep 2004 15:28:, date: mon, 27 sep 2004 15:3, date: mon, 27 sep 2 004 15:32:, date: mon, 27 sep 2004 15:37:, date: mon, 27 sep 2004 15:4, date: mo , dav/2, debian, domain=,
domain=., domain=.s, dsp cor, dsp cor cur, dsp cor cur , e-co, e/1., e/2., e: m, e: mo, e: r, e: t, e: te, ection, erre, etag:, etag: " , etag: "1, etag: "2, etag: "3, etag: "4, etag: "5, etag: "54, etag: "d, expires , expires:, expires: mon, 2, expires: mon, 27 sep 2004 1, expires: mon, 27 sep 2 004 14:, expires: mon, 27 sep 2004 15:, expires: thu, 01 jan 1970 00:00:00 gmt, , front, frontpage/, frontpage/5.0, frontpage/5.0.2.2, frontpage/5.0.2.2510 mod_ ssl/2.8.1, frontpage/5.0.2.26, frontpage/5.0.2.2623, frontpage/5.0.2.263, frontp age/5.0.2.2634a, frontpage/5.0.2.2634a mod_ssl/2.8.18 openssl/0.9.7a, frontpage/ , http://, http://p, http://w, http://www., i/2., id=1, ified, inde, index., ind ex.htm, ing:, ion:, it/1., jul 200, jul 2004, jul 2004 1, jun 200, jun 2001, jun , l/0., l/1., l/2., last, last-modified:, last-modified: fri, last-modified: fri ,, last-modified: mon, last-modified: mon,, last-modified: mon, 2, last-modified : mon, 27 sep 2004, last-modified: mon, 27 sep 2004 10:, last-modified: s, last- modified: sat, 0, last-modified: sat, 1, last-modified: sat, 2, last-modified: s un,, last-modified: t, last-modified: thu,, last-modified: thu, 05 dec 2002 14:1 , last-modified: thu, 1, last-modified: tue,, last-modified: tue, 0, last-modifi ed: tue, 2, last-modified: tue, 22 ju, last-modified: wed, last-modified: wed,, linux, linux), live, loca, location:, location: http://, location: http://www., , main, mar 200, max-age=0, may 200, microsoft, microsoftofficewebserver: 5.0_pu , ml; path=/, mod_, mod_auth_p, mod_auth_pa, mod_auth_passthrough/1.8 mod_log_by tes/1.2 mod_bwlimited/1.4 php/4.3.8 frontpage/5.0.2.2634a mod_ssl/2.8.1, mod_bw, mod_bwlimited/1., mod_f, mod_fastcgi/2.4., mod_gzip/1.3., mod_gzip/1.3.26.1a, m od_jk/1., mod_jk/1.2., mod_l, mod_log_bytes/, mod_log_bytes/1.2 mod_bwlimited/1.
4 php/4.3.8 frontpage/5.0.2.2634a mod_ssl/2.8.19 openssl/0.9.6b, mod_p, mod_perl /1.2, mod_python/3.0., mod_s, mod_ssl/2., mod_ssl/2.8., mod_ssl/2.8.1, mod_ssl/2 .8.18 openssl/0.9.6, mod_ssl/2.8.19 openssl/0.9.7, mon,, mon, 0, mon, 27, moved, , music, n, 1, n, 2, n, 27, n: c, ncod, nix), no-cache, no-cache, must-revalidat e, nov 1, ntpa, oct 200, on, 27 sep 2004 1, openssl, openssl/0.9., openssl/0.9.6 , os.com/w3c/p3p.xml", ound, our ind, p/1., p/4., p3p:, p3p: policyref="http://, p3p: policyref="http://www., p3p: policyref="http://www.lycos.com/w3c/p3p.xml", , page, path=/, path=/;, php/, php/4., php/4.0., php/4.1.2, php/4.1.2 mod_perl/1 .2, php/4.2., php/4.3., php/4.3.2, php/4.3.4, php/4.3.5, php/4.3.8, php/4.3.8-12 mod_ssl/2.8.19 openssl/0.9.7d, policyref="http://www., post-check=0, pre-check= , s.com, s.com/, sa ps, se, sep 2, sep 200, sep 2004, sep 2004 0, sep 2004 06:0, sep 2004 1, sep 2004 15:, sep 2004 15:1, sep 2004 15:2, sep 2004 16:, serv, ser , server: apache/, server: apache/1., server: apache/1.3, server: apache/1.3., s erver: apache/1.3.14 (unix), server: apache/1.3.2, server: apache/1.3.26 (unix), , server: apache/1.3.27, server: apache/1.3.27 (unix), server: apache/1.3.27 (un , server: apache/1.3.29, server: apache/1.3.29 (, server: apache/1.3.29 (unix), server: apache/1.3.29 (unix) php/4.3., server: apache/1.3.31 (debian gnu/linux), server: apache/1.3.31 (unix), server: apache/1.3.31 (unix) mod_, server: apache /1.3.31 (unix) mod_auth_passthrough/1.8 mod_, server: apache/1.3.31 (unix) mod_a uth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 php/4.3., server: apache /1.3.31 (unix) mod_auth_passthrough/1.8 mod_log_bytes/1.2 mod_bwlimited/1.4 php/ 4.3.8 frontpage/5.0.2.2634a mod_ssl/2.8.19 openssl/0.9.6b, server: apache/1.3.31 (unix) mod_gzip/1.3.26.1a, server: apache/2, server: apache/2.0, server: apache , server: microsoft-iis/, server: microsoft-iis/4.0, server: microsoft-iis/5.0, , sessi, sessionid, set-cookie, set-cookie:, set-cookie: a,
set-cookie: asp, set -cookie: aspsessionid, set-cookie: aspsessionida, set-cookie: c, set-cookie: cf, set-cookie: cfid=, set-cookie: cftoken=, set-cookie: cookiestatus=cookie_ok; pa th=/; domain=.tripod.com; expires=tue, 27-sep-2005 15:, set-cookie: jsessionid=b , set-cookie: member_page=, set-cookie: phpsessid=, sion, sion: 1., site, soft-, , t-ch, telo, thro, thu,, tion, tion:, tion: c, tran, trans, transfer-encoding: , x) d, x) frontpage/5.0.2.2, x) mod_, x) mod_ssl/2.8.1, x) mod_throttle/3.1.2 m od_, x-, x-cache, x-cache: miss from, x-powered-by:, x-powered-by: asp.net, x-po , x-powered-by: php/, x-powered-by: php/4., x-powered-by: php/4.1.2, x-powered-b y: php/4.3., x-powered-by: php/4.3.2, x-powered-by: php/4.3.4, x-powered-by: php /4.3.8, y: a,
</pre>
The next phase of the project is to use the above strings to classify different servers according to which features their headers contain. Note that the same scripts should be usable to extract features from and classify other protocols, such as HTTP clients, FTP transcripts, ...
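As a rough illustration of that classification step, one could score a server's headers against per-implementation feature lists like the ones above; the feature sets and class names in this sketch are invented examples, not the actual extracted database:

```python
# Toy classifier: count which known substring features occur in a header
# blob and return the class with the most hits. The feature lists below
# are hypothetical examples, not the real extracted feature database.
FEATURES = {
    "apache": ["server: apache/1.3", "mod_ssl/2.8.", "x-powered-by: php/"],
    "iis":    ["server: microsoft-iis/", "x-powered-by: asp.net",
               "set-cookie: aspsessionid"],
}

def classify(headers: str) -> str:
    blob = headers.lower()
    # one point per feature string found as a substring of the headers
    scores = {name: sum(feat in blob for feat in feats)
              for name, feats in FEATURES.items()}
    return max(scores, key=scores.get)

print(classify("Server: Microsoft-IIS/5.0\r\nX-Powered-By: ASP.NET"))
```

A real version would of course use the full feature list and weight rare features more heavily than ubiquitous ones.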
=== Honeyd ===
Did some work on a honeynet over the weekend. Here is some basic [[Honeyd/Networking Information]].
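For flavor, a Honeyd setup boils down to a small configuration file of host templates; the template name, personality string, ports, and address below are made-up examples (syntax recalled from the Honeyd documentation, so check it against your version):

```
create windows
set windows personality "Microsoft Windows NT 4.0 Server SP5-SP6"
set windows default tcp action reset
add windows tcp port 80 open
add windows tcp port 139 open
bind 192.168.1.10 windows
```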
--Mario Manno 21:31, 28 Sep 2004 (CEST)
=== Screwing with Output ===
I made a tool which reads the amap response file (I suppose this can be seen as the parser we had to write). It listens on a port and, for every connection it gets, randomly picks an entry from this file (seeded with the attacker's IP address) and returns it. The result is that amap will give you three totally unrelated guesses (which are obviously wrong).
-- Ilja van Sprundel
=== DNS Fingerprinting ===
I also used djb's DNS fingerprinting database to fingerprint name servers with a perl script. See the notes on Project Day I for more about this.
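djb's tables key name-server implementations on details of how they answer queries, much of which is visible in the reply's header flags. A small sketch of that decoding step (just the standard RFC 1035 header layout, not djb's actual matching code):

```python
import struct

def dns_flags(reply: bytes) -> dict:
    """Decode the flag bits from the first 12 bytes of a DNS reply."""
    _qid, flags, _qd, _an, _ns, _ar = struct.unpack(">6H", reply[:12])
    return {
        "qr":     flags >> 15 & 1,    # 1 = this is a response
        "opcode": flags >> 11 & 0xF,
        "aa":     flags >> 10 & 1,    # authoritative answer
        "rd":     flags >> 8 & 1,     # recursion desired (echoed back)
        "ra":     flags >> 7 & 1,     # recursion available
        "rcode":  flags & 0xF,        # 0 = NOERROR, 3 = NXDOMAIN, ...
    }
```

Feeding a server odd queries and comparing tuples like these against a fingerprint table is the gist of the approach.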
--Cpunkt 18:11, 30 Sep 2004 (CEST)