Tuesday, July 24, 2007

More on Flash hacking

A quick follow-up to my previous post on testing Flash sites: Stefano Di Paola recently delivered a superbly comprehensive presentation on Flash application security during this year's OWASP conference in Milan. It goes into great detail on the ActionScript security model, how Flash applications are sandboxed, and a variety of client-side attack vectors. The resulting exploits include classic cross-site scripting issues, as well as a variation known as cross-site "flashing". Stefano also covers a few concepts for more advanced hacks, and plans to release testing tools and additional whitepapers soon. You can check out his web-log at Wisec.it.

Wednesday, July 18, 2007

Threat analysis: Fast-Flux Service Networks

Ever wonder how phishing and malware sites manage to stay online? Through their analysis of botnets and infected hosts, the HoneyNet Project has documented an increasingly widespread technique used by online criminals: "Fast-Flux Service Networks". It's an admittedly clever approach that makes it much harder to shut down malicious operations.

The premise behind fast-flux service networks is simple: attackers register a fully qualified domain name, and then rotate hundreds or thousands of IP addresses that are assigned to it. A DNS name may only be mapped to a particular IP for a few minutes. Each IP is an infected member of a botnet - but they are not the source of content, such as a virus or a scam web-site. Instead, they simply act as proxies, relaying requests to one or more "mothership" servers that actually host the content. A more complex variation, the "double-flux" service network, implements additional misdirection by also rotating the authoritative name servers.
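The rotation scheme above can be modeled in a few lines. This is a purely illustrative sketch (the pool, TTL, and answer size are made-up values, and the addresses come from the RFC 5737 documentation range), not real botnet code:

```python
# Illustrative model of fast-flux DNS rotation (hypothetical data, not real botnet code).
# A large pool of infected proxy IPs is mapped to one domain, but each DNS answer
# exposes only a small, constantly rotating subset with a very short TTL.

BOT_POOL = [f"198.51.100.{i}" for i in range(1, 201)]  # 200 compromised hosts
TTL_SECONDS = 180      # each answer is only valid for a few minutes
ANSWER_SIZE = 5        # A records returned per query

def flux_answer(pool, now, ttl=TTL_SECONDS, size=ANSWER_SIZE):
    """Return the A records served during the TTL window containing `now`."""
    window = int(now // ttl)                # which TTL window are we in?
    start = (window * size) % len(pool)     # rotate through the pool, window by window
    return [pool[(start + i) % len(pool)] for i in range(size)]

# Two queries inside the same TTL window see identical records...
a = flux_answer(BOT_POOL, now=1000.0)
b = flux_answer(BOT_POOL, now=1050.0)
# ...but once the TTL expires, the next window serves a different subset.
c = flux_answer(BOT_POOL, now=1000.0 + TTL_SECONDS)
```

The takeaway for defenders: blocking the five addresses you see right now accomplishes almost nothing, because the next lookup hands out five different ones.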

Like most of the HoneyNet Project's work, the whitepaper is very well-written and includes a case study with real-world examples. Definitely worth checking out if you're interested in how the more sinister side of the Internet underground operates.

Tuesday, July 17, 2007

Decompiling and testing Flash-based web sites

I've recently been evaluating several tools to help our team perform security assessments on Flash-based web applications. We occasionally have to test client sites that are almost entirely written in Flash, and they can be even more annoying to assess than they are to use. I have never really worked with the language from a developer's perspective, so it's been a good learning experience.

I was first interested in decompilers, thinking that certain poorly-coded applications might have hard-coded host information, credentials, or other potentially sensitive information. I found that Flare is an effective (and free) tool for extracting ActionScript from SWF files. However, after using it on a number of projects I've come to realize that there's rarely much of interest in the ActionScript. (There are a few horribly coded Flash login portals that store passwords in the source code, but I've never seen them used in a "professional" client application. Google for "login.swf", and decompile a few of the results to see what I mean.)
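Once Flare has dumped the ActionScript, the triage step is just pattern matching over the output. Here's a quick-and-dirty sketch of that scan; the "decompiled" sample source below is entirely made up for illustration:

```python
import re

# Scan decompiled ActionScript (e.g. Flare output) for hard-coded secrets.
# The sample source is fabricated for demonstration purposes.

decompiled = '''
var loginURL = "https://admin.example.com/auth";
var password = "s3cr3t!";
if (_root.user == "admin" && _root.pass == password) {
    gotoAndPlay("granted");
}
'''

# Patterns that tend to flag hard-coded credentials and internal hosts.
PATTERNS = [
    re.compile(r'(?i)(password|passwd|pwd|secret|apikey)\s*=\s*"([^"]+)"'),
    re.compile(r'https?://[^\s"]+'),
]

findings = []
for pattern in PATTERNS:
    for match in pattern.finditer(decompiled):
        findings.append(match.group(0))

for f in findings:
    print(f)
```

Even when nothing this blatant turns up, the URL matches alone can reveal back-end hosts and paths that never appear in the visible application.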

Ordinary proxy tools like Paros or Burp will catch any inbound or outbound HTTP requests issued by a Flash application. However, some applications talk to the server using a SOAP-like messaging protocol known as Flash Remoting, in which messages are binary encoded in "Action Message Format" (AMF). Neither Paros nor Burp will decode AMF, making it difficult to analyze the transactions. However, there are a few applications that can: ServiceCapture and Charles Debugging Proxy are two of the more popular and well-regarded tools. Below are a few screenshots of a binary AMF response as seen in Paros, versus the same response decoded in ServiceCapture:

Binary AMF response in Paros

Decoded AMF response in ServiceCapture

As you can see, deserializing Flash Remoting traffic can provide a lot of information about an application, and even identify targets for parameter manipulation or SQL injection attacks.
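To give a feel for what those proxies are doing under the hood, here is a minimal decoder for three of the AMF0 value types (number, boolean, and string, per the AMF0 type markers). A real Flash Remoting decoder also has to handle objects, arrays, references, and the enclosing packet headers, so treat this as a sketch of the encoding only:

```python
import struct

def decode_amf0_value(data, offset=0):
    """Decode one AMF0 value starting at `offset`; return (value, next_offset)."""
    marker = data[offset]
    offset += 1
    if marker == 0x00:                       # number: IEEE 754 double, big-endian
        (value,) = struct.unpack_from(">d", data, offset)
        return value, offset + 8
    if marker == 0x01:                       # boolean: single byte
        return data[offset] != 0, offset + 1
    if marker == 0x02:                       # string: u16 length prefix + UTF-8 bytes
        (length,) = struct.unpack_from(">H", data, offset)
        offset += 2
        return data[offset:offset + length].decode("utf-8"), offset + length
    raise ValueError(f"unsupported AMF0 marker: 0x{marker:02x}")

# A number, a boolean, and a string, back to back:
payload = struct.pack(">Bd", 0x00, 1299.0) + b"\x01\x01" + b"\x02\x00\x05hello"
value, pos = decode_amf0_value(payload)
flag, pos = decode_amf0_value(payload, pos)
text, pos = decode_amf0_value(payload, pos)
```

Decoding recovers `1299.0`, `True`, and `'hello'` from the 19 opaque bytes - which is exactly why an AMF-aware proxy turns a wall of binary into parameter names and values you can actually tamper with.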

Unfortunately, I haven't been able to find any free tools with this capability. ServiceCapture and Charles offer downloadable trials, but require a reasonable license fee for continued use. The Burp Proxy team is soliciting feature requests for their next update, so I will be sure to submit this as a recommendation.

Monday, July 16, 2007

JavaScript Web Spider - Powered by Yahoo

pdp has released a proof-of-concept web spider written completely in JavaScript. It is a pure client-side tool, requiring no server support other than the Yahoo Site Explorer service it leverages. The spider is very efficient - it can index the files and directory structure of a web site within a few queries. The only limitation is that it can only fetch pages already indexed by Yahoo.
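The clever part is that the spider never touches the target site at all - it walks the search engine's existing index. The skeleton of that logic looks something like the sketch below (in Python rather than pdp's JavaScript, and with the `lookup` callable standing in for the Yahoo Site Explorer query, whose actual API I'm not reproducing here):

```python
from collections import deque

def spider(start_url, lookup, max_pages=100):
    """Collect indexed URLs reachable from start_url, without fetching any pages.

    `lookup` takes a URL and returns the URLs the search index knows beneath it.
    """
    seen = {start_url}
    queue = deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        for child in lookup(url):
            if child not in seen:
                seen.add(child)
                queue.append(child)       # breadth-first traversal of the index
    return sorted(seen)

# Stand-in index for demonstration:
fake_index = {
    "http://example.com/": ["http://example.com/a", "http://example.com/b"],
    "http://example.com/a": ["http://example.com/a/1"],
}
urls = spider("http://example.com/", lambda u: fake_index.get(u, []))
```

Because every request goes to the search engine rather than the target, the victim site sees nothing until the attack itself begins.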

As pdp points out in his writeup, it would be simple to modify this code to identify vulnerabilities in spidered sites and exploit them in real-time. We will almost certainly see XSS/AJAX worms leveraging this sort of technique to target and compromise other web applications, and they will be very difficult to block.

Thursday, July 12, 2007

Insecurity through stupidity - FTP servers expose DoD data

The Associated Press is running a story on how they discovered an extensive number of sensitive but unclassified military documents kept on unsecured FTP servers. Both government and contractor systems were found to allow anonymous access to goodies like project schematics, facility security information, building plans, and geological survey data. Some of the responses by the guilty parties are both hilarious and frightening. My favorite quote from the article [emphasis added]:

A spokeswoman for contractor SRA International Inc., where the AP found a document the Defense Department said could let hackers access military computer networks, said the company wasn't concerned because the unclassified file was on an FTP site that's not indexed by Internet search engines.
"The only way you could find it is by an awful lot of investigation."

Yeah, it's really no big deal, you never see port scanning or worms checking for anonymous FTP out on the Internet - it's far too much work. If the system isn't indexed on Google, no one will ever find it.
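And "an awful lot of investigation" here means roughly ten lines of code. A minimal anonymous-FTP check - the kind of test that would have caught these servers long before the AP did - looks like this (the host passed in is whatever target you're authorized to scan):

```python
from ftplib import FTP, error_perm

def check_anonymous_ftp(host, timeout=10):
    """Return a root directory listing if anonymous login succeeds, else None."""
    try:
        ftp = FTP(host, timeout=timeout)
        ftp.login()                   # ftplib defaults to the 'anonymous' user
        listing = ftp.nlst()          # names in the root directory
        ftp.quit()
        return listing
    except (OSError, error_perm):     # unreachable host, or login refused
        return None
```

Run that across an address range and you've reproduced the AP's "investigation" in an afternoon - which is precisely what worms and bulk scanners have been doing for years.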

When I first started doing this kind of work, I couldn't believe how many high-profile clients had no grip on their Internet presence or the systems within it. I eventually came to realize that it's a widespread problem, made even worse when companies have to track both in-house and outsourced systems and hosting. Every external penetration test we perform is preceded by a footprinting phase, where we identify the client's IP ranges and ensure we have approval to test them. Nine times out of ten, they end up shocked at what we discover. Clients often have no clue whether certain address ranges are actually theirs, never mind what systems are on them or what services they run.
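The first mechanical step of that footprinting phase is simply turning the ranges a client *thinks* it owns into a concrete target list. A minimal sketch, using made-up CIDR blocks from the RFC 5737 documentation ranges:

```python
import ipaddress

# Hypothetical ranges supplied by the client for scope confirmation.
claimed_ranges = ["192.0.2.0/28", "198.51.100.64/30"]

def expand_scope(cidrs):
    """Expand CIDR blocks into individual host addresses for the sweep."""
    targets = []
    for cidr in cidrs:
        network = ipaddress.ip_network(cidr)
        targets.extend(str(host) for host in network.hosts())  # skips network/broadcast
    return targets

targets = expand_scope(claimed_ranges)
# Each target then goes through reverse DNS, WHOIS, and port scans - and the
# client confirms ownership of every range before any testing begins.
```

It's the discrepancies between this list and the client's own inventory that produce the nine-out-of-ten shock mentioned above.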

As much as I love tracking the bleeding edge in vulnerabilities and attack techniques, articles like this are a good reminder of how important it is to keep perspective, and recognize that many organizations are still struggling with the most fundamental aspects of IT security.

Oh, and one take-away question...why the hell were these FTP servers discovered by the Associated Press, and not agencies' own vulnerability scans or penetration tests? Either they're not being performed, or the people doing them are incompetent. Neither would surprise me.