HP Screws the POODLE | TechSNAP 184

A new attack against SSL called POODLE hits the web, and there’s no easy fix. We’ve got all the details.

Plus the Zero day bug that exposes other zero-day bugs, HP signs malware, and then it’s a big batch of your questions, our answers!

Thanks to:

  • DigitalOcean
  • Ting
  • iXsystems

Direct Download:

HD Video | Mobile Video | MP3 Audio | Ogg Audio | YouTube | HD Torrent | Mobile Torrent

RSS Feeds:

HD Video Feed | Mobile Video Feed | MP3 Audio Feed | Ogg Audio Feed | iTunes Feeds | Torrent Feed

Become a supporter on Patreon:

— Show Notes: —

Zero day Bug in the Bugzilla bug tracker exposes zero day exploits for other software

  • When new flaws are found in important software such as Mozilla’s Firefox or operating systems from Red Hat and others, the details are put into a private bug in the Bugzilla bug tracker
  • Only those with a ‘need to know’, like the Security Officer, have access to the details of the flaw while a patch is prepared, tested, and shipped
  • Once a patch is shipped, some of the details may be made public
  • It is important that the details remain secret until users have had a chance to install the patches, to prevent a malicious actor from exploiting the flaw using the details and proof-of-concept provided by the people who reported the bug in the first place
  • The security and privacy of the bug tracker are therefore imperative
  • Researchers at security firm “Check Point Software Technologies” discovered that it was possible to create Bugzilla user accounts that bypass the email validation process
  • “Our exploit allows us to bypass that and register using any email we want, even if we don’t have access to it, because there is no validation that you actually control that domain,” said Shahar Tal, vulnerability research team leader for Check Point. “Because of the way permissions work on Bugzilla, we can get administrative privileges by simply registering using an address from one of the domains of the Bugzilla installation owner. For example, we registered as admin@mozilla.org, and suddenly we could see every private bug under Firefox and everything else under Mozilla.”
  • Bugzilla Security Advisory
  • “An attacker creating a new Bugzilla account can override certain parameters when finalizing the account creation that can lead to the user being created with a different email address than originally requested. The overridden login name could be automatically added to groups based on the group’s regular expression setting.” (a sketch of how this regexp-based group assignment can misfire follows this list)
  • This flaw is obviously very serious, as it might expose previously private zero-day exploits for many different open source products
  • “The fact is that this was there for 10 years and no one saw it until now,” said Tal. “If nation state adversaries [had] access to private bug data, they would have a ball with this. There is no way to find out if anyone did exploit this other than going through user list and seeing if you have a suspicious user there.”
  • “The perception that many eyes have looked at open source code and it’s secure because so many people have looked at it, I think this is false,” Tal said. “Because no one really audits code unless they’re committed to it or they’re paid to do it. This is why we can see such foolish bugs in very popular code.”
  • In response to the Krebs story, Mozilla made this statement:
  • “Regarding the comment in the first paragraph: While it’s a theoretical possibility that other Bugzilla installations expose security bugs to “all employees,” Mozilla does not do this and as a result our security bugs were not available to potential exploiters of this flaw.
    At no time did Check Point get “administrative privileges” on bugzilla.mozilla.org. They did create an account called admin@mozilla.org that would inherit “netscapeconfidential” privileges, but we stopped using this privilege level long before the reported vulnerability was introduced. They also created “admin@mozilla.com” which inherited “mozilla-employee” access. We do actively use that classification, but not for security bugs. In addition, on bugzilla.mozilla.org Mozilla regularly checks @mozilla.com addresses against the employee database and would have caught any fraudulently created @mozilla.com accounts quickly.”
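A rough illustration of the group-assignment problem (this is our own sketch, not actual Bugzilla code; the group names and patterns below are made up): if the login name an account ends up registered with is matched against a per-group regular expression, then being able to pick an arbitrary login after the email-validation step is all it takes to land in a privileged group.

    import re

    # Hypothetical group rules, in the spirit of Bugzilla's
    # "automatically add users whose login matches this regexp" setting.
    GROUP_RULES = {
        "mozilla-employee-confidential": r".*@mozilla\.(org|com)$",
        "security-team":                 r".*@security\.example\.org$",
    }

    def groups_for(login_email):
        """Return the groups a newly created account would be dropped into,
        based purely on a regexp match against its login name."""
        return [group for group, pattern in GROUP_RULES.items()
                if re.match(pattern, login_email)]

    # If account creation lets an attacker override the login after the
    # email-validation step, the regexp match is all that stands between
    # them and the privileged group:
    print(groups_for("admin@mozilla.org"))
    # ['mozilla-employee-confidential']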

POODLE Attacks

  • A new attack against the SSL protocol (the protocol itself, not the implementations like OpenSSL this time) was found by Bodo Möller, Thai Duong, and Krzysztof Kotowicz of the Google Security team
  • POODLE – Padding Oracle On Downgraded Legacy Encryption
  • For reasons of backwards compatibility (because the Internet is a mess of legacy crap), many SSL/TLS clients implement a ‘downgrade dance’ in the protocol handshake, rather than properly negotiating the version of the protocol to be used
  • Instead, the client tries the highest version it supports, and if that fails to make a successful connection, it drops the connection and retries with the next highest version, until it either connects successfully or runs out of options to try (a client-side sketch of this loop follows this list)
  • The problem with this approach is that an attacker in a position to perform a MiTM attack can interfere with the connection and cause the downgrade dance to happen
  • The downgrade could also be triggered coincidentally by dropped or malformed packets, such as on a weak WiFi or mobile signal, unnecessarily downgrading the security of the connection
  • If both the client and the server support TLS 1.2, but the attacker causes the connection to be dropped when the handshake proposes TLS 1.2, 1.1, and 1.0, then the client fails over to using SSL 3.0
  • SSL 3.0 is obsolete (it was released in 1996) and only supports the vulnerable RC4 cipher and some block ciphers in CBC mode, which have issues of their own, described in detail in the paper
  • In order to combat this attack, since completely disabling SSL 3.0 is not always an option (detailed later), they propose introducing TLS_FALLBACK_SCSV
  • This new extension to the TLS protocol requires that, if a client does perform a downgrade dance, it indicate in the subsequent handshakes that it is doing so. If the server supports a higher protocol version than the one the client is trying to negotiate and this flag is set, something funny is probably going on, and the server should reject the connection instead of allowing the downgrade to the weaker version of TLS or SSL (a server-side sketch of this check follows this list)
  • The issue with just disabling SSL 3.0 to stop this attack is that some older clients and servers only support SSL 3.0
  • These include Windows XP with Internet Explorer 6 (even if another browser is installed and used, many applications that embed a browser will still be vulnerable, including applications like Steam). IE6 includes support for TLS 1.0, but it is disabled by default
  • Many older appliances, including some load balancers that sit in front of major websites, only support the older protocols
  • 0.12% of the Alexa top 1 million websites only support SSL 3.0 and no version of TLS
  • The list includes citibank.com (SSL 3.0 only, with a weak 1024 bit certificate), but that is just a redirector to online.citibank.com, which is secure: it uses a 2048 bit certificate, does not allow SSL 3.0, and supports TLS 1.2
  • Google Security blog post
  • Adam Langley’s blog
  • Poodle Paper: This POODLE Bites: Exploiting The SSL 3.0 Fallback
  • Report on sites vulnerable to Poodle attack, instructions on disabling SSL 3.0 on servers and browsers
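To make the ‘downgrade dance’ concrete, here is a minimal client-side sketch in Python (our own illustration, not code from the paper): each attempt is capped at one protocol version, and the client steps down whenever the handshake fails, which is exactly the behaviour an active attacker can exploit by dropping the stronger handshakes. Note that ssl.TLSVersion.SSLv3 exists as an enum value, but most modern OpenSSL builds refuse to actually negotiate SSL 3.0, so on a current system the last step simply fails.

    import socket
    import ssl

    # Fallback order an old-style client might try, highest version first.
    FALLBACK_ORDER = [
        ssl.TLSVersion.TLSv1_2,
        ssl.TLSVersion.TLSv1_1,
        ssl.TLSVersion.TLSv1,
        ssl.TLSVersion.SSLv3,
    ]

    def insecure_downgrade_dance(host, port=443):
        """Connect the way a pre-POODLE client did: cap the protocol version
        and step down one version every time the handshake fails."""
        for version in FALLBACK_ORDER:
            ctx = ssl.create_default_context()
            ctx.minimum_version = ssl.TLSVersion.MINIMUM_SUPPORTED
            ctx.maximum_version = version      # cap this attempt at `version`
            try:
                with socket.create_connection((host, port), timeout=5) as sock:
                    with ctx.wrap_socket(sock, server_hostname=host) as tls:
                        return tls.version()   # e.g. 'TLSv1.2'
            except (ssl.SSLError, OSError):
                # An attacker-induced failure looks identical to a flaky
                # network, so the client happily moves on to a weaker version.
                continue
        raise ConnectionError("no protocol version worked")

    print(insecure_downgrade_dance("example.com"))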
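TLS_FALLBACK_SCSV closes that hole from the server side. The sketch below captures the check described in the draft (the 0x5600 signalling value comes from the draft; the function and exception names are our own, not a real TLS stack): if a ClientHello carries the fallback signal but asks for a lower version than the server supports, the server aborts instead of downgrading.

    # Versions are (major, minor) pairs: (3, 0) is SSL 3.0, (3, 3) is TLS 1.2.
    TLS_FALLBACK_SCSV = 0x5600  # signalling cipher suite value from the draft

    class InappropriateFallback(Exception):
        """Stands in for the inappropriate_fallback alert the server would send."""

    def check_client_hello(offered_version, offered_cipher_suites, server_max_version):
        """Reject a handshake that looks like an attacker-induced downgrade."""
        if TLS_FALLBACK_SCSV in offered_cipher_suites and offered_version < server_max_version:
            # The client says "this is a fallback retry", yet it is asking for
            # a lower version than both ends support: abort, don't downgrade.
            raise InappropriateFallback("possible POODLE-style downgrade")
        return "continue handshake"

    # A fallback retry at SSL 3.0 against a TLS 1.2-capable server is rejected:
    try:
        check_client_hello((3, 0), {0x002F, TLS_FALLBACK_SCSV}, (3, 3))
    except InappropriateFallback as exc:
        print("handshake rejected:", exc)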

Signed Malware = Expensive “Oops” for HP

  • “Earlier this week, HP quietly produced several client advisories stating that on Oct. 21, 2014 it plans to revoke a digital certificate the company previously used to cryptographically sign software components that ship with many of its older products. HP said it was taking this step out of an abundance of caution because it discovered that the certificate had mistakenly been used to sign malicious software way back in May 2010.”
  • Code signing is a way to ensure the software you are running on your system is actually from the author it claims to be from
  • On some highly secure systems, it is only permissible to run software that is signed, thus preventing most instances of malware
  • Except in the case where the malware authors manage to get their malware signed, either by gaining access to a trusted code signing certificate or by tricking someone into signing the code for them
  • One of the best-known previous instances of signed malware was Stuxnet, where a number of the components of that suite of malware had been signed by various, apparently stolen, code signing certificates
  • In Feb. 2013, whitelisting software provider Bit9 discovered that its system had been compromised and 32 pieces of malware had been whitelisted. Covered on TechSNAP episode 100
  • “according to HP’s Global Chief Information Security Officer Brett Wahlin, nothing quite so sexy or dramatic was involved in HP’s decision to revoke this particular certificate. Wahlin said HP was recently alerted by Symantec about a curious, four-year-old trojan horse program that appeared to have been signed with one of HP’s private certificates and found on a server outside of HP’s network. Further investigation traced the problem back to a malware infection on an HP developer’s computer.”
  • “HP investigators believe the trojan on the developer’s PC renamed itself to mimic one of the file names the company typically uses in its software testing, and that the malicious file was inadvertently included in a software package that was later signed with the company’s digital certificate. The company believes the malware got off of HP’s internal network because it contained a mechanism designed to transfer a copy of the file back to its point of origin.”
  • In this instance, HP believes that this is a case of ‘tricked into signing the malware’, not of their signing infrastructure being compromised
  • “When people hear this, many will automatically assume we had some sort of compromise within our code signing infrastructure, and that is not the case,” he said. “We can show that we’ve never had a breach on our [certificate authority] and that our code-signing infrastructure is 100 percent intact.”
  • Even if the security concerns from this incident are minimal, the revocation of this certificate is likely to create support issues for some customers. The certificate in question expired several years ago, and so it cannot be used to digitally sign new files. But according to HP, it was used to sign a huge swath of HP software — including crucial hardware and software drivers, and other components that interact in fundamental ways with the Microsoft Windows operating system.
  • “The interesting thing that pops up here — and even Microsoft doesn’t know the answer to this — is what happens to systems with the restore partition, if they need to be restored,” Wahlin said. “Our PC group is working through trying to create solutions to help customers if that actually becomes a real-world scenario, but in the end that’s something we can’t test in a lab environment until that certificate is officially revoked by Verisign on October 21.”
  • How practical is it to revoke single signatures on a specific file, rather than having to revoke the certificate that signed all of the files?
  • How will machines find out about the revocation of the HP certificate? Unlike a browser, these systems may not have an internet connection to be able to check the status of the certificate online, or download a CRL (Certificate Revocation List) (a minimal CRL-lookup sketch follows this list)
  • Will the revocation come via a Windows Update?
  • It’ll be interesting to see how this plays out
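For reference, this is roughly what a CRL lookup looks like in code. A minimal sketch using the third-party pyca/cryptography library (our choice of tooling; the file names are hypothetical): it only shows the mechanics of “is this certificate’s serial number on the revocation list”, not how Windows actually distributes or caches revocation information for offline machines.

    from cryptography import x509

    # Hypothetical file names: a CRL published by the issuing CA, and the
    # certificate that signed the software package we want to check.
    with open("issuing_ca.crl", "rb") as f:
        crl = x509.load_der_x509_crl(f.read())
    with open("hp_codesigning_cert.der", "rb") as f:
        cert = x509.load_der_x509_certificate(f.read())

    revoked = crl.get_revoked_certificate_by_serial_number(cert.serial_number)
    if revoked is not None:
        print("certificate was revoked on", revoked.revocation_date)
    else:
        print("certificate is not listed in this CRL")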

Feedback:


Round Up:


Question? Comments? Contact us here!