Wednesday, February 21, 2007

Back to "normal" operation

We're back to normal operation; all tools should work fine now. Still missing are Crypt-Decode, Crypt-Encrypt/Decrypt and the Virus-Check.

Crypt-Decode will come back in a few days, Crypt-Encrypt/Decrypt too. The Virus-Check will be integrated into the new File-Info tool we're working on.

We're currently developing new tools and improving old ones.
The crypto and encoding scripts already support a few new hash, CRC and encoding algorithms; more will follow.
The SSL-Check will soon support even more ciphers, making it the best SSL check we know of - it checks more ciphers and functionality than any other SSL check, be it offline or online.
We're also working on a sort of "decompiler" for unpacking and decompiling a bunch of different formats, ranging from Macromedia Flash (.swf) and Microsoft's Winword (including macros and plain text) to full-blown Java applets.
We're considering a bunch of other applications to make life easier for security folks - but we'd love to have your input: can you think of a (not yet available) tool that would ease your work? Drop us a mail or write a comment - we're open to anything.
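For the curious: the kind of hash, CRC and encoding output our scripts produce can be reproduced locally with Python's standard library. This is a minimal sketch of the idea (the exact algorithm list is our example here, not necessarily what the live scripts run):

```python
import base64
import binascii
import hashlib
import zlib

def digest_report(data: bytes) -> dict:
    """Compute a handful of common hashes, a CRC and two encodings for one input."""
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
        "crc32": format(zlib.crc32(data) & 0xFFFFFFFF, "08x"),
        "base64": base64.b64encode(data).decode("ascii"),
        "hex": binascii.hexlify(data).decode("ascii"),
    }

for name, value in digest_report(b"serversniff").items():
    print(f"{name}: {value}")
```

Adding another algorithm is just one more dictionary entry, which is roughly why "more will follow" is cheap to promise.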


Monday, February 19, 2007


Serversniff was down and is up again - still with a reduced toolset. I kicked the old installation in a fit of rage when a bunch of tools failed to compile due to Suse's crazy path structure.

I've always hated Suse, and for years \me has refused to work with any non-Debian system whenever possible. We had to choose Suse because Serversniff's hoster Strato didn't offer anything else a year ago.

Things have changed, and Strato now offers other distributions. We set up a new system on Debian Sarge, updated from Sid to be more up to date. The restore from backup worked quite well, despite some upgrade hassle with MySQL (4->5) and some changed network functionality and text output from command-line tools like ping.

We're still missing the SSL and crypto functions, and many API functions still fail or behave a bit strangely. We will restore these during the next few days - they had to be revamped anyway, so expect more API functions and the public stuff to run a bit smoother once everything is restored. We apologize for any inconvenience.


Thursday, February 15, 2007

Showing hidden Meta-Information in DOC, PDF and more than 100 other file-formats

Did you hear about hidden information in formats like Microsoft's .doc?

We did. Yeah. You too. For most of us this is old news. Read here or here, or ask your favourite big-brother search engine.
Everybody should know this - but people everywhere, from governments to No Such Agencies, keep publishing Winword documents on their websites.

During our penetration tests (and during our internal FileInfo tests) we came across quite a few websites with chatty files, especially .doc. We were fed up with explaining this again and again, so we created a nifty little tool to analyze as many file formats as possible. If you want to give it a beta try, check out Serversniff's "FileInfo".

Currently this works ONLY on files on webservers - the file to be checked has to sit on some public webserver. Beware: the check is more than slow and supports only files smaller than 1 MB. It also fails on filenames with blanks or %20. It's BETA. Stuff will get better with our next server upgrade, which will finally kick SuSe-Linux into /dev/null.

The examples in Winword contained a bit of hidden information (and no, we won't post any files with hidden text here!)

It's not only Winword that is chatty - we also found loads of PDF files on websites containing the Windows usernames of the people who created them. This can get dangerous when it lets you determine the user structure and naming convention of an organisation. While many PDFs are clean, there seem to be a few PDF creator tools that we found to be chatty by default.

Especially Acrobat Distiller puts real names or Windows usernames into the PDF's meta-information - the examples we found showed usernames in both the "Author" and "Creator" fields.
This seems to be configurable: Google did a better job, while Yahoo puts usernames in many files.

Feel free to experiment. FileInfo will display internal Meta-Information for more than 100 File-Formats.
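To give an idea of where those chatty fields live: in a simple, uncompressed PDF they sit as literal strings in the document's Info dictionary. Here is a rough sketch of the kind of lookup a tool like FileInfo might do - our own toy parser for illustration, not FileInfo's actual code, and it ignores compressed or encrypted metadata streams entirely:

```python
import re

def pdf_info_fields(data: bytes) -> dict:
    """Pull literal-string entries such as /Author and /Creator out of a
    PDF's Info dictionary. Only handles plain (...)-style string values."""
    fields = {}
    # Matches e.g.  /Author (John Doe)  anywhere in the raw file.
    for match in re.finditer(rb"/(Author|Creator|Producer|Title)\s*\(([^)]*)\)", data):
        key = match.group(1).decode("ascii")
        fields[key] = match.group(2).decode("latin-1")
    return fields

sample = b"%PDF-1.4 ... /Author (DOMAIN\\jdoe) /Creator (Acrobat Distiller) ..."
print(pdf_info_fields(sample))
```

Running this over a PDF fetched from a webserver is exactly the kind of thing that turns "just a brochure" into a list of internal usernames.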

Please drop us a mail if you stumble over something funny or if you just like the tool - we'll do our best to fix stuff or add more file formats and functionality, and we're waiting for any user input.


Wednesday, February 14, 2007

The Big Big AXFR

I like the global AXFRs posted in one of the comments to my previous posting. Have a look at Maximilian Dornseif's well-hidden blog entry to get an idea of how to automate this. Maximilian missed that there are a hell of a lot of secondary TLDs, etc. etc.
He also missed that there are a few really big hosters and providers who return far more entries than many TLDs.

But all in all, this blog entry was one of the reasons to create this site.

Thanks, Maximilian.
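Automating this kind of survey mostly boils down to asking each zone's nameserver for an AXFR and parsing whatever comes back. A hedged sketch using `dig` - the zone and nameserver below are placeholders, and any well-run server will simply refuse the transfer:

```python
import subprocess

def parse_axfr_output(text: str) -> set:
    """Extract owner names from `dig axfr` output: one record per line in
    'name TTL class type rdata' order; comment lines start with ';'."""
    names = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(";"):
            continue
        parts = line.split()
        if len(parts) >= 4:
            names.add(parts[0].rstrip(".").lower())
    return names

def try_axfr(zone: str, nameserver: str) -> set:
    """Attempt a zone transfer; a refused transfer yields an empty set."""
    result = subprocess.run(
        ["dig", "+noall", "+answer", f"@{nameserver}", zone, "axfr"],
        capture_output=True, text=True, timeout=30,
    )
    return parse_axfr_output(result.stdout)

# Example (placeholder names - expect an empty set from a well-run zone):
# hosts = try_axfr("example.com", "ns1.example.com")
```

Loop that over a list of TLDs, secondary TLDs and big hosters and you have the automation Maximilian describes - minus the many, many servers that (rightly) say no.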


Tuesday, February 13, 2007

Where do I get domains...

Dear kids,

I understand completely that you ask me where to download a list of 100.000.000 domains.

I did, and it drove a few GB of traffic to a website. Can you imagine what would happen if I posted any such URL here? Boys, I guess you can. So please have a look around and try to have fun with our favourite search engine as well.

Another option: since most of the URLs listed on the site are .com/.net, you might also get them directly from Verisign. The TOS there are roughly the same as ours - in fact, I derived Serversniff's Terms of Service from Verisign's, for the last thing I wanted to do was support f*cking spammers.

Or: create something of your own. Something to offer - to me or the "community". Then come back to me and ask. I'd be happy to support you with a list of domains if you're able to explain what you're working on. I'd be happy to team up with you if there is any win-win situation.

If you're working for a company: I'd be happy to supply you with hostnames, domain names or known IPs, filtered by whatever you want, given that you're willing to agree to our ToS. Drop me a mail and I'm quite sure we'll negotiate a reasonable agreement.


Monday, February 05, 2007

before the flood....

We are in the process of updating our domain database: we just started an insert of roughly 150.000.000 hostnames, pushing our DB system to its limit.

We have ceased "regular" spidering and updates for a while to catch up with this bulk data. Currently we run at a rate of about 1.000.000 new domains per day, which we consider not really bad, but still unsatisfying. We are also testing a NAS array running on 6 SCSI disks (currently as an experiment - we will invest in more hardware if this proves to be faster than the current system).
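Bulk loads like this are usually throttled by per-row round trips, and batching many rows into one statement and one commit is the standard fix. A minimal sketch of that pattern, with sqlite3 standing in for our real database (the table name and batch size are our invention here):

```python
import sqlite3

def bulk_insert_hostnames(conn, hostnames, batch_size=10_000):
    """Insert hostnames in large batches - one executemany() and one commit
    per batch instead of one round trip per row. Returns rows processed."""
    conn.execute("CREATE TABLE IF NOT EXISTS hosts (name TEXT PRIMARY KEY)")
    batch, processed = [], 0
    for name in hostnames:
        batch.append((name.lower(),))
        if len(batch) >= batch_size:
            conn.executemany("INSERT OR IGNORE INTO hosts VALUES (?)", batch)
            conn.commit()
            processed += len(batch)
            batch = []
    if batch:
        conn.executemany("INSERT OR IGNORE INTO hosts VALUES (?)", batch)
        conn.commit()
        processed += len(batch)
    return processed

conn = sqlite3.connect(":memory:")
count = bulk_insert_hostnames(conn, (f"host{i}.example.com" for i in range(25_000)))
print(count)  # 25000
```

The `INSERT OR IGNORE` also makes re-runs of a partially loaded dump harmless, which matters when a 150-million-row import gets interrupted halfway.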