
News from a conference...

Well, the RSA Conference has turned out some interesting news today.

The first bit of news is that Microsoft is now planning an Internet Explorer 7 for Windows XP users. Details are scarce, but it looks like they're targeting a June or July beta and a final release around Christmas. (But they say they'll keep doing betas until they get it right, so expect the final release date to slip.)

The second is that SHA-1 has been broken. Cryptogeeks out there are probably already swearing like sailors just from that sentence, and everybody else would probably keep those blank expressions on their faces no matter how much explanation I typed, so I won't bother explaining further.

Comments

countalpicola
Feb. 16th, 2005 06:21 am (UTC)
> The second is that SHA-1 has been broken.

... wow. Suppose this'll speed the rolling out of SHA-256, at least...
opt513
Feb. 16th, 2005 06:27 am (UTC)
Maybe you could explain some of the implications without trying to say what this actually does?
countalpicola
Feb. 16th, 2005 06:56 am (UTC)
What it does: SHA-1 and other hash functions take a file of any size and convert it into a fixed length string. A good hash function will make this fixed length string as unique as possible, to the point that if you're given a hash, then download a file, you can tell if the file has been tampered with by hashing it and comparing it with the hash you were given.

Of course, there are more possible files than there are possible hashes, so all that anyone really asks for is that it should be "as hard as possible" to find two files that have the same hash.
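A quick sketch of the idea in Python, using the standard hashlib module (the input strings here are just made-up examples):

```python
import hashlib

original = b"Attack at dawn."
tampered = b"Attack at noon."

# Even a small change to the input produces a completely different
# fixed-length digest, which is how tampering gets caught.
h1 = hashlib.sha1(original).hexdigest()
h2 = hashlib.sha1(tampered).hexdigest()

print(h1)
print(h2)
```

Both digests are 40 hex characters (160 bits) no matter how big the input file is.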

What happened: For SHA-1, "as hard as possible" meant trying on the order of 2^80 files before you could expect to find two with the same hash. According to the article, someone found a way to do it in about 2^69 tries.

Implications: SHA-1 and related functions are frequently used to verify file integrity as described above. What this means is that it is a lot easier for attackers to get past this integrity check than it should be. While 2^69 is still a huge number, it isn't as big as it should be, so from a crypto standpoint, SHA-1 is broken and should be quickly considered obsolete.

Bittorrent relies on SHA-1 hashes to provide security for files you download, and I'm sure other p2p apps do as well. Certainly, these hashes show up in a lot of places we probably don't even realize. What this means is that, for things using SHA-1 to be considered secure, they will need to change what hash they use; a potentially huge undertaking.

A longer term consequence is that, once you figure out how to break a hash, any hash that works in kinda the same way is also at risk. This isn't really a problem for us, but for the people who invent these hashes, any break in any hash is a huge deal.
opt513
Feb. 16th, 2005 07:25 am (UTC)
Thanks.
brentdax
Feb. 16th, 2005 07:30 am (UTC)
A hash algorithm allows you to boil down a big chunk of data (like, say, a multi-gigabyte file) into a short piece of data (like, say, a twenty-byte number). The short number is said to be the "hash" of the large file.

Of course, since you're taking a lot of data and trying to express it as a little data, occasionally two files will have the same hash. This condition is called a "collision". A cryptographic hash function like SHA-1 is designed to make collisions hard to predict--that is, it's hard to generate a file with the intent of making it match a particular hash.

One use of hashes is to verify that a file received from a third party is genuine. For example, Mozilla could post the hash for the Firefox installer on its own site, then distribute the installer to its mirrors. If one of those mirrors changed the file (to, say, include a piece of spyware), the file's hash would be different, so you would know the file wasn't genuine. P2P applications use SHA-1 for this purpose, and certain types of cryptographic signing use SHA-1 as a component.
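As a rough sketch of that integrity check in Python (the "installer" bytes and published digest here are invented for illustration):

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    """Return the SHA-1 digest of some bytes as a hex string."""
    return hashlib.sha1(data).hexdigest()

# Pretend this is the digest the publisher posted on its own site...
installer = b"pretend this is the real installer"
published_hash = sha1_hex(installer)

# ...and this is what a mirror actually served us.
downloaded = installer

if sha1_hex(downloaded) == published_hash:
    print("hashes match: file looks genuine")
else:
    print("hash mismatch: file was altered")
```

The whole scheme rests on the attacker being unable to craft an altered file with the same digest, which is exactly the assumption the new result weakens.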

Another purpose is to scramble data in a way that makes it impossible to retrieve, but still possible to check if somebody can match it. For example, Filespace doesn't store users' passwords in its database; it stores a hash of the password. There's no way to "decrypt" these passwords--even if I look at the database manually, I can't tell what your password is. However, when you type your password in, it's hashed and compared to the already-hashed database field; if they match, you've just proven that you know the password.
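A minimal sketch of that password scheme in Python (real sites should also mix in a per-user salt, which I'm leaving out here for brevity):

```python
import hashlib

def hash_password(password: str) -> str:
    # Store only the digest, never the plaintext password.
    return hashlib.sha1(password.encode("utf-8")).hexdigest()

stored_digest = hash_password("hunter2")

def check_password(attempt: str) -> bool:
    # Hash the attempt and compare digests; the original
    # password is never recovered from the database.
    return hash_password(attempt) == stored_digest

print(check_password("hunter2"))  # True
print(check_password("letmein"))  # False
```

Even someone reading the database directly sees only `stored_digest`, not the password itself.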

Basically, what they found is that SHA-1 isn't nearly as collision-proof as they thought it was. They thought that finding two files with the same hash would take 2^80 tries; it turns out it'll only take 2^69. That makes the problem several thousand times easier--which means that someone with, say, a few thousand PCs and a couple weeks could find a collision. (And as computers continue to get faster, the barrier to entry continues to lower.) It also suggests that there may be other weaknesses in the algorithm.
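The "several thousand times easier" figure falls straight out of the exponents:

```python
# The attack drops the expected work from 2^80 tries to 2^69 tries.
speedup = 2**80 // 2**69
print(speedup)  # 2048, i.e. 2^11 -- about two thousand times easier
```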

Basically, if programs don't change, in a few years they may have useless security. But changing means rewriting all sorts of programs in ways that may not be backwards compatible. And some of these programs are doing Really Important Things, like ensuring the integrity of military documents.
opt513
Feb. 16th, 2005 09:49 am (UTC)
Also thanks.