Friday, December 22, 2006

Encryption added

It's Christmas and we are still busy extending and fine-tuning serversniff. We added a bunch of encryption functions recently that let you check how common ciphers treat your text. While this is not designed for use in a production environment, it might be useful for cross-checking another implementation of these ciphers.

Please be aware of the possible key lengths and the block modes used!

We are currently working with the mcrypt lib and support AES (several modes), Blowfish, CAST, DES, 3DES (Triple-DES), GOST, LOKI97, RC2, RC4, Saferplus, Serpent, Twofish, Wake and XTEA.

Expect more ciphers (and block modes) to come once we have gathered some experience with the current set.
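For a feel of what one of the listed ciphers does to your text, here is a minimal pure-Python sketch of XTEA on a single 64-bit block. This is not the mcrypt implementation serversniff uses - just a reference you could cross-check a block against; function names and the test key are ours.

```python
# Minimal XTEA sketch: encrypt/decrypt one 64-bit block (two 32-bit words)
# with a 128-bit key (four 32-bit words). Illustrative only - real use
# needs a block mode (CBC, CFB, ...) and padding on top of this.

DELTA = 0x9E3779B9
MASK = 0xFFFFFFFF  # keep all arithmetic in 32 bits

def xtea_encrypt_block(v0, v1, key, rounds=32):
    s = 0
    for _ in range(rounds):
        v0 = (v0 + ((((v1 << 4) ^ (v1 >> 5)) + v1) ^ (s + key[s & 3]))) & MASK
        s = (s + DELTA) & MASK
        v1 = (v1 + ((((v0 << 4) ^ (v0 >> 5)) + v0) ^ (s + key[(s >> 11) & 3]))) & MASK
    return v0, v1

def xtea_decrypt_block(v0, v1, key, rounds=32):
    s = (DELTA * rounds) & MASK
    for _ in range(rounds):
        v1 = (v1 - ((((v0 << 4) ^ (v0 >> 5)) + v0) ^ (s + key[(s >> 11) & 3]))) & MASK
        s = (s - DELTA) & MASK
        v0 = (v0 - ((((v1 << 4) ^ (v1 >> 5)) + v1) ^ (s + key[s & 3]))) & MASK
    return v0, v1
```

A quick round-trip (encrypt, then decrypt, expect the original block back) is exactly the kind of sanity check the pages are meant for.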


Wednesday, December 20, 2006


We changed our hash-pages and updated stuff:

We now support hashes (strings, files), checksums (strings), several encodings (strings) and finally a bunch of cryptographic functions. Just have a look at the new "Crypto" submenu on
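Under the hood, pages like these just feed the same input through a set of digest and encoding functions. A sketch of the idea in Python (the function names here are Python's standard library, not serversniff's own code):

```python
# Run one string through a few common hashes, a checksum and an encoding -
# roughly what a hash/checksum/encoding page does per request.

import base64, hashlib, zlib

def digest_report(text):
    """Return common digests and encodings for one input string."""
    data = text.encode("utf-8")
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "crc32": "%08x" % (zlib.crc32(data) & 0xFFFFFFFF),
        "base64": base64.b64encode(data).decode("ascii"),
    }

report = digest_report("abc")
# md5("abc") is the well-known test value 900150983cd24fb0d6963f7d28e17f72
```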


Wednesday, December 06, 2006

Yes, we are...

We got a request to sell serversniff - hey, we are bribable. Direct your offers to - we'll negotiate all the rest.
If you just find serversniff useful or want to implement its functions in your website, there is no need to buy the whole thing. Just drop us a mail to get access to our API functions. We still don't charge anything for their use as long as you use them reasonably.


Tuesday, December 05, 2006

Finetuning and cleaning up

We started fine-tuning serversniff a little - fixing the ton of bugs still lying around, adding a bit of sorting or polish here and there, extending the explanations of a few scripts, switching a few scripts from a generic approach to serversniff's API - stuff like that. Not really noticeable at all, but eating up plenty of time on our side.
We're dreaming of many many new functions to come - if only time would allow...


Thursday, October 26, 2006

New HTTP-API-Functions

We updated our HTTP checks and added new API functions. You can craft your own HTTP requests now - e.g. do a GET with HTTP 1.0, with or without a Host header - and check the response: either just the server's header info or the complete file. You can even filter for special lines, e.g. to get only the Server header.
The check supports servers listening on multiple IPs and is also capable of doing HTTPS.
We started wiring this backend into serversniff's frontend - expect more HTTP checks based on this backend script to come.
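"Crafting your own HTTP request" essentially means building the raw request bytes by hand instead of letting a client library do it. A sketch of that idea (function names and defaults are ours, not the actual API's):

```python
# Build a raw HTTP request by hand and filter a single header out of a
# raw response - the two operations the HTTP checks described above rely on.

def build_request(path="/", version="1.0", host=None, headers=None):
    """Return a raw GET request string; pass host=None to omit the Host header."""
    lines = ["GET %s HTTP/%s" % (path, version)]
    if host is not None:
        lines.append("Host: %s" % host)
    for name, value in (headers or {}).items():
        lines.append("%s: %s" % (name, value))
    return "\r\n".join(lines) + "\r\n\r\n"

def filter_header(raw_response, name):
    """Pick one header (e.g. 'Server') out of a raw HTTP response, or None."""
    for line in raw_response.split("\r\n"):
        if line.lower().startswith(name.lower() + ":"):
            return line.split(":", 1)[1].strip()
    return None
```

Sending the crafted bytes over a plain socket (or a TLS-wrapped one for HTTPS) and reading the reply is all that's left to do.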


Saturday, October 07, 2006

New Scripts

The IP-Scripts are coming back.

We are on our way to extending serversniff with some new IP scripts. We started today with a simple ICMP ping that sends four ICMP Echo Requests to a given host.

Additionally - since a simple ping alone would be quite lame - we implemented what we call a "Ping-Row", which sends 16 ICMP Echo Requests with increasing packet sizes. You'll be surprised how many sites allow small pings while their routers or firewalls sort out the huge ICMP packets.
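Building such a row of pings comes down to constructing ICMP Echo Request packets with growing payloads and a correct internet checksum. A sketch (the payload sizes are a guess, not what serversniff actually uses; sending these requires a raw socket and root privileges):

```python
# Construct ICMP Echo Request packets of increasing size, as a Ping-Row
# would send them. Packet building only - no network I/O here.

import struct

def inet_checksum(data):
    """RFC 1071 internet checksum over a byte string."""
    if len(data) % 2:
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
    while total >> 16:                       # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return (~total) & 0xFFFF

def echo_request(ident, seq, payload_size):
    """ICMP type 8 (Echo Request) with a payload of the given size."""
    payload = b"\x41" * payload_size
    header = struct.pack("!BBHHH", 8, 0, 0, ident, seq)   # checksum = 0 first
    csum = inet_checksum(header + payload)
    return struct.pack("!BBHHH", 8, 0, csum, ident, seq) + payload

# 16 requests with growing payloads - roughly the idea of the Ping-Row
packets = [echo_request(1, seq, 64 * seq) for seq in range(1, 17)]
```

A handy property: recomputing the checksum over a correctly checksummed packet yields 0, which makes the packets easy to verify.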


Monday, September 25, 2006

Extending the API

Our domain database seems to be up and running: the scripts run very stably, and a few background tasks are feeding the database with around 50.000 new domains per day.

We are working on further extensions of our API in combination with our checkomatik. Although we missed OWASP's "Autumn of Code", we are confident that we will release a full-featured version of checkomatik by the end of this year.

While we have been using it internally for months, it still lacks a real user-dependent interface, security, translation and, most importantly, automation.

We got a few steps up the ladder by creating more API functions, stuff like the "pingrow" or a complete SSL check.


Sunday, August 06, 2006

Domain Kiting - how many hosts fit on one ip?

Bob Parsons wrote in his noteworthy blog about domain kiting - see - and I thought it should be possible to identify kited domains easily by querying our host database: a kited domain, I thought, would share its IP with many, many other hosts. So I started gathering a list of known IPs sorted by the count of known hostnames living on each IP. I ended up with the following list:

Hostnames | IP
142643 |
132972 |
123455 |
59117 |
46819 |
40765 |
36617 |
34971 |
32697 |
32415 |
31617 |
29830 |
29142 |
27024 |
25405 |
23997 |
22636 |
19653 |
19553 |
19543 |
19168 |
19036 |
18994 |
18387 |
18311 |
17638 |
17459 |
17360 |
17132 |
17018 |
16961 |
16677 |
16253 |
15815 |
15807 |
13998 |
13952 |
13597 |
13565 |
13142 |
13131 |
12883 |
12711 |
12458 |
12444 |
12439 |
12437 |
12437 |
12436 |

So the hostnames hosted at might not be kited - but the rest? Take the impressive figure of 132972 hostnames, for example. Kited? No, not at all. Whatever host- and domain name we checked on this domain was not kited, not even parked, but operational. There might be a load balancer behind it - but I still find this count of hostnames for one single IP impressive.

Checking the other hosts we found a lot of parked and not too many kited domains. By explicitly checking known kited domain names like we found that most kited domains live together with parked domain names on one host - often with as few as 4.000 known hostnames for that particular IP. But still: on the named IPs you might (or might not) find a lot of kited domains. If you bring a few minutes of patience, you can use the "host-on-ip" function on to check these IPs for the hostnames living there.
Restart your query if you don't get an answer after about a minute - it'll be faster then, since the database will have the data in its cache.
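The list above is essentially one GROUP BY over a hostname table. Sketched here with sqlite3 and an invented toy schema (the real database is PostgreSQL and certainly shaped differently):

```python
# Count hostnames per IP and sort busiest-first - the "kiting candidates"
# query behind the list above, on a tiny in-memory example table.

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hosts (hostname TEXT, ip TEXT)")
db.executemany("INSERT INTO hosts VALUES (?, ?)", [
    ("a.example", "192.0.2.1"),
    ("b.example", "192.0.2.1"),
    ("c.example", "192.0.2.1"),
    ("d.example", "192.0.2.2"),
])

rows = db.execute(
    "SELECT COUNT(*) AS n, ip FROM hosts GROUP BY ip ORDER BY n DESC"
).fetchall()
# rows is now [(hostname_count, ip), ...], busiest IP first
```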


Monday, July 31, 2006


We know:
  • 20.032.470 Hosts
  • 5.357.250 Domains
  • 7.812.218 IPs
  • 224.337 Nameservers
  • 39.914 Mailservers
(We just started with sorting in Mail- and Nameservers - we are sorting in MX- and NS-Records for around 200.000 Domains per Day, so MX- and NS- figures will continue to increase for about 20 to 30 days.)

Wednesday, July 26, 2006

5 Million Domains

We cracked the 5-million mark on our domain database. Serversniff now knows more than 5 million unique domain names, which should represent around 5 percent of all globally registered domain names.

We will keep inserting new hosts and domains daily, and the more you look up, the more we will know. The update speed might decrease slightly since we are in the process of updating our data with NS and MX records. We might finish this in a few months, and serversniff will offer many new functions then.

A domain search is implemented, and a hostname search looking up hosts in our 35-million-hostnames DB will be in place soon - both functions are available via our API only at the moment. Access to our API services is free but requires an informal registration - send an e-mail to to get a free personal account.


Thursday, July 06, 2006

Unstable Server

Serversniff doesn't like Virtuozzo. You can imagine that we call a lot of backend programs to let serversniff do its job. When there are too many background processes and not enough (shared) RAM, the Apache process silently dies and can't be restarted - it can't even be killed.

Virtuozzo is nice, but it's nothing compared to stuff like VMware. Providers like Virtuozzo because it makes every virtual machine on a host run on a single system installation, while each VMware engine has its own OS and therefore eats much more RAM and disk space.

We decided that we don't like Virtuozzo - so serversniff will (again) move to another server, with the old one simply acting as some kind of proxy. While we move, we will do some quality assurance and internal updates; it might take a few weeks until everything is moved completely.


Tuesday, June 20, 2006



serversniff has a bunch of security holes, and we are watching closely what people are doing here - and really, someone noticed that it was quite easy to get a glimpse of the MySQL log database.

the evil hacker might have been a script kiddie: he obviously got access to the MySQL db and used a MySQL exploit script unknown (at least to the major search engines) to try to create files on the system. the MySQL db died on the way to his goal.

the attacker created (and then deleted or emptied) several tables in the db mysql:

"db" - nice - contains all passwords from table "user" in cleartext!
"dat" - used to execute commands on the host
"fm" - contains php-code to upload files and execute commands
local - slightly different from "dat"
sploitdb - slightly different from "dat"
wip3r - slightly different from "dat"

It seems the guy used at least four slightly different exploits targeting the same problem.

Better luck next time.


Saturday, May 20, 2006

sorting it all in

we're still importing hosts, this time from zonetransfers from toplevel-domains.
in parallel we're sorting in domains - expect more functions to come.
we crosschecked our data with - while we still lack many .com-domains, it seems that we have many hostnames that they don't have.

Tuesday, May 09, 2006

fixing bugs

I'm pleased to announce that a simple forum posting at finally made me fix a few bugs in the subdomain lookup that were resulting from the migration of serversniff to the f*cking Virtuozzo server.

The migration of our hostname DB is nearly complete - we are at around 18 million known hostnames now, with around 1 million hosts left in the queue. After this, we have a few million hostnames from zone files waiting to get in. In parallel we started building a domain database, but this will take some time - we are at around a million known domains by now, adding around 70.000 new domains each day. Expect more functions to come once we have extracted broader data from our database.

And hey, we are still looking for a PostgreSQL/Perl geek willing to speed things up here.


Monday, May 08, 2006


I did a few bugfixes, mostly relating to the fact that the crazy Virtuozzo system of our shared server acts silly and resolves nonexistent hostnames as localhost or with its own IP number. I had to adjust many error handlers for this.

A user noticed some "can't write to log" errors on the IP stack check. These are noncritical; in fact we don't really need the log anymore, since the check does its job quite well. We'll fix them by removing the detailed logging that was initially used for debugging purposes. Please be aware that the IP check sits on a separate host that is rather slow, because the above-mentioned Virtuozzo stuff won't interact with hping2, the program this check is based upon.

We expanded our API with many new server checks. We are offering 16 different, configurable checks now - a list is available at Send us an e-mail to get free access to the API. We want to know who's using our resources, but we're still offering all this for free.

We also expanded serversniff's capabilities and are offering scripts to show the HTML source code of webpages, HTML comments inside webpages, hyperlinks inside webpages, cookies set when visiting a site, and a website's robots.txt.

Have fun and stay secure,


Monday, April 17, 2006


springtime, and everything's new: we moved hostonip to the new database. we are still sorting in servernames, but it should already be bigger than the old one. the old db was static, the new one is dynamic: when you look up a hostname, we automagically start some background queries trying to gather more info.

when looking up hosts, you should really consider repeating your query a minute later: you might be surprised by the findings, like we were when we did a lookup for

we fixed the last bugs (looking up IPs produced an error) and hope that it will stay as fast as it is now once it's grown bigger.

have fun,


Thursday, April 13, 2006

Where are you coming from today?

We implemented a little User-Info that's displayed on most pages and fixed a few minor bugs. Thanks to p0f and

Wednesday, April 12, 2006

Brothers in the Net

hoi, and are starting to team up on some services.
clemens from proposed an API for services like traceroute or ping. if everything works out, you'll be able to fire up traceroutes from different locations on both of our sites, maybe from even more sites around the world at some stage.
maybe you want to participate? - we are open to anyone willing to share services like traceroute or ping - all you need is a host where you are able to install and run software like hping, tcptraceroute etc.
drop us a note if you're interested:

Friday, April 07, 2006

Postgresql strikes back

We had Perl scripts hammering millions of new rows into our newly designed DB server, many of them running in parallel. The DB server strikes back now: it's running a postgres thread eating up around 1 gig of RAM and 90 percent of the CPU time. I don't know what's happening deep inside the db, and I don't even know how to find out. I'm learning new stuff every day, but I'm still far from being a PostgreSQL geek.

I suspended the Perl scripts, giving postgres a few hours to sort things out.
I'm praying for a small, knowledgeable ghost to help out with database design and operation. But I'm afraid I'll have to keep learning by failing, the hard way.


Wednesday, April 05, 2006


serversniff was offline for about 10 hours last night.
we had a full hd, requiring a bit of delete-action, log-tuning and finally a reboot.

after that, we were offline again, for the database didn't start with the system. i hope that everything's fixed now.


serversniffs brother

serversniff's got a brother: If I had found this site before, serversniff would not have been built, because Clemens did a very good job making many functions more mature and useful than they are on serversniff.
I like the layout. Serversniff used to have a similar, very plain layout, but people kept telling me that a website needs something fancier.

Check it out!


Friday, March 31, 2006

It's Volkswagen, and I think: lame users make lame cars

Many users want to have internal addresses checked. Guess what happens if you try - yup, nothing. It's a pity, and I sometimes think of giving an error message for these addresses. Now I've seen the first failure that makes me really happy. It's only because...

We bought a car four years ago. A new Volkswagen, a New Beetle. It lasted two years - since then the car has broken down regularly. Many, many things that shouldn't break in a three-year-old vehicle. Expensive parts.

Volkswagen told me that the warranty lasted two years - not a day more. Besides the "usual" stuff like inspections, oil, fluids, brakes and so on, we put many thousands of euros into the New Beetle in the last two years. Way too much.

If you are really interested in the details, send an e-mail - I'll be happy to send you many, many invoices from the friendly Volkswagen dealer.

And I didn't even bother to have everything repaired.

People, whatever you do, remember this:


I was so happy to see that ServerSniff didn't work for Volkswagen, even if it was only because Mr. or Mrs. Volkswagen asked the SSL check for a Volkswagen-internal servername (a TLD that is not known on the internet)...

Sunday, March 26, 2006

New Alpha-Version: Browsercheck

We made public a new Browsercheck that shows
  • your browser's request headers
  • your browser's capabilities
  • several JavaScript-related infos
  • names and versions of installed plugins
  • your Java version
  • your private IP number (Java-based)
Find it on at "Tools, Browsercheck". You might consider this an "alpha version", as it lacks some formatting and functionality, especially the plugin checks on Internet Explorer. We made it public to gather a bit of experience about what browsers are around and (a silent hope) to hear what functionality you want us to add to the browsercheck. Tell us what you think and what you miss - write to

Saturday, March 25, 2006

SSL-Check back online

The "old" SSL-Check is back online. The ancient OpenSSL 0.9.7e 25 Oct 2004 will be replaced with a more up to date version these days. Thanks again for the hint!


SSLCheck broken

Anonymous pointed out on the wiki that the script for checking SSL/TLS/HTTPS servers is broken. We are investigating and will restore functionality ASAP. It seems that the GnuTLS portion of the check is working while the OpenSSL part isn't.
We were on our way to redoing the SSL check anyway - we'll have a fully working version back online by Monday, be it the old one or a rewritten new one.


Monday, March 20, 2006


after migrating to the new v-server, the gethostbyname function behaves strangely: calling gethostbyname('') used to return the queried domain name if the host didn't exist. On the new server we get back the server's own IP number. I still can't imagine the reason for this strange behaviour, or whether it is PHP-related or system-related - but I fixed (hopefully all) scripts to have the lookups done directly via dig.

the subdomain check should have stopped reporting every nonexistent host with serversniff's IP address.


Sunday, March 19, 2006


switching to the new server broke the connection to the postgresql db; a few scripts relying on this connection were broken for about 36 hours. they should be up and running again now.

the subdomains function shows some strange results and reports every nonexistent host as "" - hey, folks, this is plain wrong. we'll investigate, but it might take until thursday, as i'm out of office until then.


Friday, March 17, 2006

work in progress

the transit of the domains to the new machine has already started and is in progress. some things may be a bit rough - the ipinfo script didn't work as advertised, but we hope to have it fixed now. we finally mastered the traceroute problem: we had to explicitly specify virtuozzo's (virtual) network adapter. hping2 didn't like this either - so hping moved to another machine with a "complete" network stack.


Digital Nomads

After one year on the same dedicated root server, we decided to switch to cheaper hosting and ordered a Virtuozzo-based vhost, costing only a quarter of the dedicated server. Everything eating up much memory or processor time will move to a secret backend.

Guess what: shit happens. Unlike the great VMware, the f*cking Virtuozzo doesn't support low-level IP, so tools like nessus, hping2 or even silly traceroute won't work anymore. We'll have to switch them to another backend, too, to keep our old script set.

You shouldn't notice anything while the domain is in transit as both servers look mostly the same.


Friday, March 10, 2006

we made it into papers

we made it into a "purple paper" - at least, they call us "worth our attention" - thanks, guys. we are still suffering from a static database and incomplete domain-data, for postgres is getting slow when it comes to insert huge amounts of data into indexed tables. we are experimenting and still thinking and fighting to get a sata-raid-system up and running with linux.


Sunday, March 05, 2006


The german serversniff wiki died silently due to a configuration change on the webserver. It's alive again. We are in the process of switching to a new, faster db format that will allow us to update our domain data faster and more frequently.


Tuesday, February 28, 2006


Switched the hostonip script to the new database - data should be dynamic again. The DB is under heavy load, so lookups might still not be as fast as they should be.

Feb 28 2006:

Known Hosts: 8.597.068
Known Domains: 3.459.664
Known IPs: 9.109.832
Hostnames in 2do-Queue: 4.194.170

Sunday, February 26, 2006

Database moved

The Domains-Database moved to a faster, bigger server. Queries should be faster now, and we hope to broaden our range of known hosts and domains quite fast.
Queries on serversniff will continue to run against the old (now static) database host until we have cleaned things up in the new one.


Saturday, February 25, 2006


Fixed various minor glitches in serversniff's webserver detection. We handle errors now and might link to more information about the various webservers and their modules. Check it out and mail us (or comment here) if we missed something. You are welcome to add text to the wiki!
Check it out:


Friday, February 24, 2006


Sshhhhh - don't tell anybody: we had nasty (and quite common) bugs in our hash creator: while it worked well with "usual" strings, some hashes failed, especially for strings containing a ', " or \. Since nobody complained, they might have gone completely unnoticed.
They are fixed now; only the NTLM/LM hashes weren't completely fixable: strings with both ' and " will give an empty hash and an error message for now. I'll try to fix this issue during the next week as well.
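Bugs of this class usually come from interpolating user input into a shell command line. Hashing the raw bytes directly in the language sidesteps quoting entirely - a sketch of the safer pattern, not serversniff's actual code:

```python
# Hash strings containing quotes and backslashes directly in-process:
# no shell, no quoting, so ', " and \ are just ordinary bytes.

import hashlib

def safe_md5(text):
    """MD5 over the raw UTF-8 bytes of the input string."""
    return hashlib.md5(text.encode("utf-8")).hexdigest()

tricky = ["it's", 'say "hi"', "back\\slash", "both ' and \""]
hashes = [safe_md5(t) for t in tricky]   # every tricky string hashes fine
```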
Mapping the Net crossed 3 million domains but is running out of disk space. The new server is here and will replace the old one by the end of March.


Friday, February 17, 2006

Roger Schwarz, again

Soultcer pointed in a comment to - that's what you find at Google, yup.

What I still ask myself is who he really was. Is compiling his memory into the serverstrings of many, many webservers some kind of running gag? Or is Roger's memory still present at T-Online? I doubt that there are many people, if anybody at all, working at T-Online who used to know Roger. IT business and the new economy used to mean: work fast, change often, don't think too much about your past colleagues.


Sunday, February 12, 2006

In memoriam Roger Schwarz

Did bugfixing, adding and removing...
  • Hardcore bugfixing: I disabled the DNS script until I've fixed it up. This may take a bit of time.
  • Fixed the HTTP header script to support port numbers and to check hostnames with multiple IPs (try
  • Added an HTTP server detection (for those who think the header script is too complex)
  • Fixed a few bugs in texts and links
  • Started customizing the english wiki - it's time to start doing this
  • Still bugs left to fix: HTTP header and servertype don't support HTTPS.
Wondered again: who was Roger Schwarz? His memory has been linked into's serverstring for many years now. Does anybody among the admins there still know their old colleague?

Twiddled the Domain-Reaper-Script to prefer the new domains from the queue. We should break 3 million unique Domains soon.

Tuesday, February 07, 2006

the good, the bad and the ugly...

or the f*cking second-level-domains.

And no - don't tell me anything about ISO and standards - it seems that some NICs set up standards for each and everything, while others just comply with "unwritten" standards.

Look at .pl with the "Standard-SLs",,,,,,,,,,,,,,,,,,,,,,,,,,,,

or .br with,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,

How many domains do you expect to find under second-level domains as fancy as "" or ""? Try a zone transfer and see if you even get 100 domain names.

Others "just do it" and set up pseudo-SLDs like

Others just do it and set up pseudo-SLDs for every part of the country: stuff like all the italian or american "", "", "", "" and so on...

Others just register a fancy domain and sell subdomains like,,,,, and so on.

Hey, NS-Admins, Hey ICANN:

this is UGLY!

- at least for me, trying to get things sorted out in a manner that makes queries simple and understandable for somebody who doesn't (want to) know about SLDs.
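One way to sort this mess out: keep a list of known second-level suffixes per TLD and split the registrable part off against it. A sketch with a tiny, purely illustrative suffix set (a real list - today the Public Suffix List - has thousands of entries):

```python
# Split the owner-registered domain off a hostname by matching the
# longest known public suffix, then keeping one more label.

KNOWN_SUFFIXES = {"com", "de", "pl", "com.pl", "net.pl", "com.br", "org.br", "co.uk"}

def registrable_domain(hostname):
    """Return the registrable domain for a hostname, or None if unknown."""
    labels = hostname.lower().rstrip(".").split(".")
    # i = 0 tries the longest candidate suffix first
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in KNOWN_SUFFIXES and i > 0:
            return ".".join(labels[i - 1:])
    return None
```

So "www.example.com.pl" resolves against the suffix "com.pl" rather than plain "pl" - exactly the distinction the pseudo-SLDs above make so painful.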


Sunday, February 05, 2006

Cleaning the mess

Our queue is down to around 200.000 hostnames, and it seems that we can slowly start filling it up again in a few days. There are still huge zone transfers every now and then. The net's huge.

Implemented part of the data in the subdomains script, as this is one of the most-used scripts on serversniff (I still can't imagine why).


Friday, February 03, 2006

Twiddling the Database

I fixed the broken ICMP trace yesterday - it now shows a "we know XX domain names for this host" line when you call the script and links to our "hostonip" script.
We seem to have finally passed the phase of the HUGE zone transfers - we constantly import new domains, but the queue of "hostnames to import" stays roughly the same - it has been hanging around 3.1 million hostnames for a few days now.
In the beginning, we transferred really huge zones, mostly .edu or other universities, also some second-level domains like and a few "zone spammers": they stuffed literally millions of hostnames into their zone files - judging by the content of the websites, they seem to use this simply for spamming. But then, I don't know any search engine that does zone transfers.

Besides the spam domains, we still have a few million more exotic hostnames in a separate queue that we will process once we are up to date with the "important" TLDs.

The queue is becoming more and more the bottleneck of MappingTheNet - we have around 10 tasks stuffing hosts into the queue on one side, and 4 tasks checking the hostnames and sorting them into the database on the other. The sort-in tasks take a long time to query the db for hosts to process, because they have to lock the database while they read and update an entry. I'm praying for a PostgreSQL guru, but I might end up having to solve this problem by myself.
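The lock contention described here is the classic work-queue problem: each sort-in task must claim hostnames no other task has. One common fix is to claim a batch with a single atomic UPDATE instead of a SELECT-then-UPDATE under a long lock. A sketch with sqlite3 and an invented schema (the real queue lives in PostgreSQL):

```python
# Work-queue sketch: several workers claim unclaimed hostnames atomically,
# so no two workers ever sort in the same host.

import sqlite3

db = sqlite3.connect(":memory:", isolation_level=None)
db.execute("CREATE TABLE queue (hostname TEXT PRIMARY KEY, claimed_by TEXT)")
db.executemany("INSERT INTO queue VALUES (?, NULL)",
               [("a.example",), ("b.example",), ("c.example",)])

def claim(worker, batch=2):
    """Atomically mark a batch of unclaimed hostnames for one worker."""
    db.execute(
        "UPDATE queue SET claimed_by = ? WHERE hostname IN "
        "(SELECT hostname FROM queue WHERE claimed_by IS NULL LIMIT ?)",
        (worker, batch))
    return [r[0] for r in db.execute(
        "SELECT hostname FROM queue WHERE claimed_by = ?", (worker,))]
```

In PostgreSQL the same idea would be the single-statement claim (and, in modern versions, `SELECT ... FOR UPDATE SKIP LOCKED`), which keeps the lock window tiny compared to locking the table for the whole read-and-update.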

We are now aware of

4.913.386 unique Hostnames
5.302.751 unique IPs (that have at least 1 hostname assigned)
2.155.215 unique Domains
3.186.760 hostnames waiting to be sorted in from our queue


Thursday, February 02, 2006

Adventures in diving through the web

We started a project called "Map the Net" recently: we created a crawler script that's mangling its way through the global network, trying to get hold of as many domains, IPs and hostnames as it can.

The results get stored - you guessed it - in a huge database, and will make us very, very famous and rich somewhere in time.

Nobody but us has this information - leaving aside the major search engines, the NSA, Dan Kaminsky and maybe the lovely guys at Netcraft.