Talk:Brute force attack
Revision as of 22:51, 4 August 2008
Much of this is taken from the FreeS/WAN docs [1], which I have permission (see User_talk:Sandy_Harris/Permission) to re-use here, but I have rewritten quite a lot, so I do not think it needs tagging as an external article at this point. Editors, care to comment?
More generally, can others improve this?
==Another dimension to key strength==
Something not often considered in crypto for the civil sector, but often examined in depth in the military and intelligence areas, is not just how long a brute-force (or more skilled) attack would take to yield the plaintext, but how long a period of protection is needed.
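A quick back-of-the-envelope sketch makes the comparison concrete; the trial rate, key sizes, and protection periods below are made-up figures for illustration, not claims about any real attacker:
<pre>
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def expected_break_seconds(key_bits, trials_per_second):
    # On average an exhaustive key search tries half the key space.
    return 2 ** (key_bits - 1) / trials_per_second

# Assumed attacker capability: 10**12 trials per second (purely illustrative).
rate = 1e12

for key_bits, needed_seconds, need_label in [
        (56, 15 * 60, "artillery scenario, 15 minutes"),
        (128, 50 * SECONDS_PER_YEAR, "espionage traffic, 50 years")]:
    t = expected_break_seconds(key_bits, rate)
    verdict = "long enough" if t > needed_seconds else "too weak"
    print(f"{key_bits}-bit key vs {need_label}: "
          f"expected break time {t / SECONDS_PER_YEAR:.2e} years -> {verdict}")
</pre>
The point is only that the right-hand side of the comparison, the protection period, is a policy question rather than a cryptographic one.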
A classic example is that if you can hit a target with artillery in 5 minutes, but it would take the intended target 15 minutes to move out of range, the main reason to encrypt at all is the equivalent, I suppose, of giving the condemned a blindfold. Now, there might be a rationale for using encryption with resistance just slightly longer than the period between unit code name changes.
On the other hand, espionage traffic really should be protected for decades, because there are literally families of spies.
It happened that I was on the U.S. Federal Telecommunications Standards Committee at a time when one of the military members wanted the option for a longer checksum on -- IIRC -- HDLC. They said it was needed to protect nuclear command and control, and I inquired when the U.S. government had decided that the risk of accidental nuclear war was unacceptable at 16 bits but acceptable at 32 -- or maybe it was 32 and 64. My observation was not appreciated.
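For what it is worth, the arithmetic behind such a request is just the chance that a random corruption slips past the check undetected, roughly 2^-n for an n-bit check (a simplified model that ignores the burst-error guarantees of real CRCs):
<pre>
# Simplifying assumption: a randomly corrupted frame produces a uniformly
# random check value, so an n-bit check misses it with probability ~2**-n.
for bits in (16, 32, 64):
    print(f"{bits}-bit check: about 1 undetected error "
          f"per {2 ** bits:,} corrupted frames")
</pre>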
For things like money and securities trading, you do need strong protection until the trades are made, at which point the information is public. If the typical period between placing the order and making the sale is 15 minutes, how quickly would you have to break it and give it to another trader who could exploit the information?
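Put as arithmetic, with invented figures, the attacker's real deadline is the disclosure time minus whatever time is needed to act on the stolen information:
<pre>
# All figures are invented for illustration.
minutes_until_public = 15   # the order becomes public once the trade executes
minutes_to_exploit = 5      # time a rival trader would need to act on it
required_resistance = minutes_until_public - minutes_to_exploit

print(f"The ciphertext only needs to resist attack for about "
      f"{required_resistance} minutes to make interception worthless.")
</pre>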
I've had some generically weird experiences in clinical computing. In one case, the doctors insisted on very strong crypto (for 1966) for lab charts they would then leave unattended in hard copy. In another case, I became extremely frustrated with a client who wanted strong security for an in-hospital system on which an authenticated physician could prescribe narcotics. Trying for a ''reductio ad absurdum'', I drew up a system that was generally more rigorous than the one used to order the launch of an ICBM: to get to the audit file, two people, at two locations, monitored by remote video links to two different guard centers, had to turn keys and enter their codes within 10 seconds of one another.
The client loved it. I went out and beat my head against the wall until it really felt good to stop.
Howard C. Berkowitz 23:51, 4 August 2008 (CDT)