Tuesday, April 13, 2021

Available right now: Uncrackable encryption, dirt cheap

In this article:

Speedy encryption

Your CEO, in a hotel room thousands of miles away, sends a message back to the office about a competitive bid he’s about to make. The security of that message–and therefore the future of the deal–depend on trade-offs your company has made between processing speed and encryption level.

S. Walter Packaging’s CIO Tony Patti.

Chances are, encryption got shortchanged.

The conventional wisdom among IT executives is that encryption bogs down a system’s processing speed. As a result, many companies don’t use enough encryption to protect their vital communications. Users are left with a false sense of security.

They’re like the users in a U.S. military organization with which I was once associated: They were shocked to discover that a systems administrator had made it part of her daily routine to read the private e-mail of various individuals–up to and including an admiral.

The typical cryptographer, in creating a cryptosystem, designs the smallest key–and thus the fastest algorithm–that meets the designer’s estimate of the user’s security needs. While certainly there are scenarios where small keys and fast encryption are more important than absolute security (e.g., hardware encryption of a satellite feed), electronic-commerce transactions and e-mail privacy require ever greater vigilance in this very untrustworthy world. And history has shown repeatedly that designers underestimate their cryptosystems’ vulnerabilities. The 1996 breach of Netscape security, through faults in Netscape’s pseudo-random-number generator, is a recent example.

Many IT managers don’t realize that today’s technology has eased the MIPS problems of encryption systems. The storage and processing of bits is so cheap that powerful, effective encryption is now within reach of every organization that wants to keep its communications private.

Hello, Moscow?

Perfectly secure communication has been available for most of this century. World War I-era work at AT&T gave us the One-Time Pad (OTP), a technology that ensures the privacy of the Washington-Moscow hotline, for instance. OTP requires hardware random-bit generation and keys that are as long as all intended messages. OTP technology is difficult to implement, partly because of key-distribution issues. An OTP can be handled by a PC, but for most communications, organizations usually opt for simpler systems that, with proper design, can approach the OTP’s security.
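The mechanics of the OTP itself fit in a few lines. The Python sketch below is only an illustration: `secrets.token_bytes` (the operating system’s entropy pool) stands in for the hardware random-bit generators described above, and the function names are my own.

```python
import secrets

# One-time pad sketch: the pad must be truly random, at least as long as the
# message, and never reused. Encryption and decryption are the same XOR.
def otp_xor(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must cover the whole message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"Hello, Moscow?"
pad = secrets.token_bytes(len(message))     # fresh key material, used once
ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # XOR is its own inverse
```

The key-distribution burden is visible even here: both ends need the same pad, delivered securely in advance, and a new pad for every message.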

Today, most secure communications among banks and financial institutions are encrypted via the Data Encryption Standard (DES), which was designed by IBM and adopted in 1976 as a federal standard. The security of a DES-encrypted communication lies not in the secrecy of the algorithm but in the key that is used to encrypt and decrypt the data. In fact, DES is the most widely studied and widely implemented cipher in the world. But DES was designed for hardware implementation, and it is relatively slow when implemented in software. Also, DES uses only a 56-bit key (plus 8 parity bits that provide no additional security). Many encryption specialists believe that the 56-bit key size was chosen as a federal standard because it could be cracked by U.S. government machines using massively parallel processing and a process known as “brute forcing”–trying trillions and quadrillions of DES keys until the decoding key is found.

Some experts were arguing even 20 years ago that the DES standard was too weak. That argument was proven true when the DES Challenge, posed by encryption pioneer RSA Data Security, was cracked in five months via a network of computers ranging from PCs to large university workstations. Tens of thousands of volunteers, linked via the Internet, tested 18 quadrillion keys, checking billions of keys per second. The key that encoded the challenge phrase, “Strong cryptography makes the world a safer place,” was eventually found by a PC with a 90MHz Pentium processor and 16MB of RAM (see http://www.rsa.com/des). An even more rapid and remarkable breach of DES security occurred last month, when the nonprofit Electronic Frontier Foundation (EFF) won RSA’s DES Challenge II in fewer than three days, with a massively parallel computer costing less than $250,000. Full details of this breach can be found on the web at http://www.eff.org/descracker, and in the new book written by EFF and published by O’Reilly, entitled Cracking DES (see http://www.oreilly.com/catalog/crackdes).
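The search pattern behind these attacks, trying every key until recognizable plaintext appears, can be sketched with a toy cipher. The 16-bit XOR scrambler below is my own stand-in, not DES: its 65,536-key space falls in milliseconds, whereas the same loop over DES’s 2^56 keys required the distributed effort and custom hardware described above.

```python
# Toy brute-force demonstration. The "cipher" is a repeating two-byte XOR,
# chosen only so the exhaustive search finishes instantly.
def toy_encrypt(data: bytes, key: int) -> bytes:
    kb = key.to_bytes(2, "big")
    return bytes(b ^ kb[i % 2] for i, b in enumerate(data))

secret_key = 0xBEEF
challenge = b"Strong cryptography makes the world a safer place"
ciphertext = toy_encrypt(challenge, secret_key)   # XOR: decrypt == encrypt

# Exhaustive search: test every candidate key for recognizable plaintext.
recovered = next(k for k in range(2 ** 16)
                 if toy_encrypt(ciphertext, k).startswith(b"Strong"))
assert recovered == secret_key
```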

For greater security, there’s triple DES, which puts each encrypted or decrypted block through three DES operations in a row. But even triple DES uses only 24 bytes of key material, which is equivalent to a few words in this sentence. No successful attacks on triple DES have been published, but of course that doesn’t mean triple DES can’t be–or hasn’t been–cracked. Attacks on cryptosystems by foreign government agencies intent on industrial espionage are a realistic concern.
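The encrypt-decrypt-encrypt structure of triple DES is easy to show in outline. In the sketch below, `toy_e` and `toy_d` are trivial stand-ins of my own for the single-DES encrypt and decrypt operations; only the composition pattern carries over.

```python
# Toy 8-byte "block cipher" standing in for single DES (NOT secure).
def toy_e(block: bytes, key: bytes) -> bytes:
    return bytes((b + k) % 256 for b, k in zip(block, key))

def toy_d(block: bytes, key: bytes) -> bytes:
    return bytes((b - k) % 256 for b, k in zip(block, key))

def triple_ede_encrypt(block, key1, key2, key3):
    return toy_e(toy_d(toy_e(block, key1), key2), key3)   # E-D-E chain

def triple_ede_decrypt(block, key1, key2, key3):
    return toy_d(toy_e(toy_d(block, key3), key2), key1)   # inverse, reversed

keys = (b"11111111", b"22222222", b"33333333")            # 24 bytes in all
block = b"8bytes!!"
assert triple_ede_decrypt(triple_ede_encrypt(block, *keys), *keys) == block
```

A side effect of the E-D-E arrangement is worth noting: setting the keys equal collapses the construction to a single encryption, which is how triple-DES hardware stays backward compatible with plain DES.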

For communications to be secure overall, hundreds of factors must be considered. Many of these involve assessment of risk and trust of one sort or another. But because no one can know the capabilities of all adversaries, it is prudent to err on the side of using more security rather than less.

“I believe that cryptography is the most important single control measure in the information security area,” says Charles C. Wood, president of Baseline Software (http://www.baselinesoft.com), a Sausalito, Calif.-based provider of information security solutions. “A wide variety of critical controls are built with cryptography,” he says. “These include fixed password systems, digital signatures, and digital certificates. If this basic building block is unduly constrained (with short key lengths, for instance), then all other controls which rely on cryptography are needlessly jeopardized.”

A thousand times more power

The 400MHz Pentium II-based PC that you can pick up at your local computer store represents more than a thousandfold improvement in raw computational power and memory size over the desktop computers available 15 years ago (see table, “Speedy encryption“). Even a lowly 486-based PC can encrypt 100,000 bytes per second using triple DES. But cryptography hasn’t kept pace. Cryptosystem designers haven’t fully utilized the improvements in PC horsepower to provide the most security possible. With PCs 1,000 times more powerful than they used to be, our encryption keys can and should be 1,000 times bigger too. That means cryptokeys of at least 56,000 bits. While encrypting with such a large key is going to be slower than encrypting with a smaller one, if all you need today is 1,000 or 2,000 bytes per second (which might be typical for encrypting vital e-mail), then the larger key provides increased security, with minimal processing delay. And as computers become even faster, you can continue to use larger keys, or shrink the delay into the subsecond range. Each additional key bit doubles the number of keys to search, so brute-forcing such an enormous key would be unrealistic, requiring an astronomical amount of processing.
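The doubling argument is plain arithmetic, sketched below. The keys-per-second rate is an assumption of mine, set near the class of the EFF machine, purely to make the scale visible.

```python
# Each extra key bit doubles the keyspace a brute-force attacker must search.
def keyspace(bits: int) -> int:
    return 2 ** bits

RATE = 90_000_000_000  # keys/sec; an assumed figure for a dedicated cracker

for bits in (56, 64, 128):
    years = keyspace(bits) / RATE / (365 * 24 * 3600)
    print(f"{bits}-bit key: {keyspace(bits):.2e} keys, ~{years:.2e} years to exhaust")

assert keyspace(57) == 2 * keyspace(56)   # one more bit, twice the work
```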

Some users might say that such keys are ridiculously large and unnecessary, but they miss the point that today’s fast technology makes these keys essentially free, even if they are larger than really required. And keep in mind the rapid improvements in processor performance: How many people when using IBM’s 5MHz personal computers in 1983 were planning for 400MHz processor power to be available 15 years later? This is not too long a planning horizon, given how long DES has lasted.

In the “big leagues” of encryption, the best way to implement large keys is via an extensible algorithm, which allows support for keys of variable sizes ranging beyond a megabit.

Few of today’s cryptosystems allow users to select the key size that best meets their need for security and their assessment of the risks. Designers seem to follow a “take it or leave it” approach. Designers should make their cryptosystems as extensible as possible, preferably over several orders of magnitude.
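One way to sketch such extensibility in code: derive the keystream from the key with an extendable-output function, so the same routine accepts any key length without modification. This is my own illustration, not a vetted cipher, and note the trade-off it smuggles in: the hash function itself caps the effective strength no matter how large the key, exactly the kind of hidden design decision discussed above.

```python
import hashlib
import secrets

# Extensible-key sketch: SHAKE-256 stretches a key of ANY length into a
# keystream matching the message length. Illustration only, not a real cipher.
def xof_encrypt(data: bytes, key: bytes) -> bytes:
    stream = hashlib.shake_256(key).digest(len(data))
    return bytes(b ^ s for b, s in zip(data, stream))

# The same code handles 1,024-bit, 56,000-bit, and megabit-plus keys.
for key_bits in (1_024, 56_000, 1_048_576):
    key = secrets.token_bytes(key_bits // 8)
    ct = xof_encrypt(b"variable-size keys", key)
    assert xof_encrypt(ct, key) == b"variable-size keys"
```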

Ensuring randomness

I have provided consulting services to a company whose CEO must make significant business deals, in competitive bidding situations, while traveling internationally. His laptop allows him to communicate worldwide and, most importantly, to communicate securely with others in his company on various aspects of the deals he must make. He has valid concerns about the security of the information he sends and receives electronically, given the environments he works in, and the value of this information demands a much higher degree of security than is typically used commercially.

I worked with the company on hardware-generated key bits, a technology used by organizations that push the limits of security. Ideally, the bits that make up a key should be perfectly random–Netscape was vulnerable to the 1996 attack because its bits were somewhat predictable.
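The difference between predictable and unpredictable key bits can be shown directly. Python’s `random` module is a deterministic generator, fine for simulations but fatal for keys; `secrets` draws from the OS entropy pool, which hardware generators feed on many systems. The seed value below is arbitrary.

```python
import random
import secrets

# A deterministic PRNG: anyone who learns (or guesses) the seed reproduces
# every "random" key bit exactly -- the flaw behind the Netscape attack.
rng = random.Random(1996)
predictable_key = rng.getrandbits(128)
attacker_rng = random.Random(1996)                  # attacker tries the seed
assert attacker_rng.getrandbits(128) == predictable_key

# The OS entropy source: successive 128-bit draws are unpredictable, and a
# collision is overwhelmingly unlikely.
assert secrets.randbits(128) != secrets.randbits(128)
```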

Many organizations work with inadequate encryption, even though cryptosystems with variable key lengths well over a thousand bits are readily available. If someone takes 10 minutes to type a business e-mail message that contains important details of a new business opportunity, is it worth a few seconds of computational time to encrypt the message with a 56,000-bit or million-bit cryptokey? In many situations, it is well worth the time, as this can provide enormous increases in privacy. Nor is storage of such large keys a problem, since, for example, one 3.5-inch diskette can hold over 10 million bits and can fit into your pocket.
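The storage claim checks out with quick arithmetic; the byte count below follows the diskette marketing convention of “1.44 MB” meaning 1.44 × 1000 × 1024 bytes.

```python
# A 3.5-inch "1.44 MB" diskette versus a 56,000-bit cryptokey.
DISKETTE_BITS = int(1.44 * 1000 * 1024 * 8)
KEY_BITS = 56_000

print(DISKETTE_BITS)               # 11796480 -- over 10 million bits
print(DISKETTE_BITS // KEY_BITS)   # 210 -- that many 56,000-bit keys fit
```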

Pretty good privacy

PGP–“pretty good privacy”–is a public-key system that has been called the “de facto standard for e-mail encryption.” It was made available, with complete source code, by Philip R. Zimmermann. Like all public-key systems, it allows a secure exchange of messages without a secure exchange of keys beforehand. PGP allows public keys to vary from 384 bits up to 1,024 bits, and some might argue that such a size should be big enough for anyone. But PGP, when performing encryption, uses only a 128-bit IDEA key (operating on 64-bit blocks), no matter what public key is specified. This is an example of a situation in which people may assume one level of performance (in terms of key sizes) about a cryptosystem but get something else as a result of trade-offs that the designers make. I recognize that the strength of PGP is its ability to use public-key protocols to establish session keys without the two individuals needing to have a secure key exchange beforehand. However, the question then becomes whether this convenience carries with it the disadvantage of providing less security than is possible with today’s processors.
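The session-key pattern PGP uses can be outlined in a few lines. Everything below is a deliberately weakened toy of my own: the “public-key” stage is textbook RSA with a tiny modulus wrapping one byte at a time, and the symmetric stage is a repeating XOR rather than IDEA. Only the shape of the protocol is the point.

```python
import secrets

# Textbook toy RSA parameters (p=61, q=53); real systems use huge moduli.
N, E, D = 3233, 17, 2753

def stream(data: bytes, key: bytes) -> bytes:          # toy symmetric stage
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Sender: fresh random session key encrypts the message; only the session
# key is wrapped with the recipient's public key (here, byte by byte).
session_key = secrets.token_bytes(16)                  # 128 bits, like IDEA
ciphertext = stream(b"meet at dawn", session_key)
wrapped = [pow(b, E, N) for b in session_key]

# Recipient: unwrap the session key with the private key, then decrypt.
recovered_key = bytes(pow(c, D, N) for c in wrapped)
assert recovered_key == session_key
assert stream(ciphertext, recovered_key) == b"meet at dawn"
```

No prior secure key exchange is needed, which is PGP’s great convenience; the article’s question is whether the fixed 128-bit session key then becomes the weakest link.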

It must also be pointed out that there is a significant disadvantage to using encryption standards. While standards are vital for interoperability between the widest number of participants, the creation of a standard provides an attractive target for an adversary.

Have you had problems balancing the need for security and communication at your business? Tell us how you solved the problem. E-mail us at [email protected] and tell us about your experience.

The best way to ensure security is to encrypt your message with multiple algorithms before transmission. This is called multiple encryption (or a cascade cipher). The output of the first stage is fed back to the input of the second stage, with the two stages using different encryption algorithms. Today’s computational horsepower makes this approach quick and simple. So if you want to encrypt with a standard system such as DES, triple DES, or PGP/IDEA, be sure to perform a second encryption with a key of at least 56,000 bits. It is worth the extra few seconds to assure yourself of the highest level of security possible. You have nothing to lose by performing the second encryption using a cryptosystem with a huge key, and everything to gain.
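A cascade is just function composition. The sketch below uses two toy stages of my own, an XOR stream and an add-mod-256 stream, in place of real ciphers such as triple DES followed by a large-key system; what carries over is the structure: independent keys, different algorithms, and decryption in reverse order.

```python
import secrets

def xor_stage(data: bytes, key: bytes) -> bytes:       # stage 1: XOR stream
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def add_stage(data: bytes, key: bytes) -> bytes:       # stage 2: add mod 256
    return bytes((b + key[i % len(key)]) % 256 for i, b in enumerate(data))

def sub_stage(data: bytes, key: bytes) -> bytes:       # inverse of stage 2
    return bytes((b - key[i % len(key)]) % 256 for i, b in enumerate(data))

k1, k2 = secrets.token_bytes(32), secrets.token_bytes(32)  # independent keys
msg = b"competitive bid details"
ct = add_stage(xor_stage(msg, k1), k2)                 # stage 1, then stage 2
assert xor_stage(sub_stage(ct, k2), k1) == msg         # undo in reverse order
```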

Tony Patti is founder, editor, and publisher of Cryptosystems Journal. He is also chief information officer at S. Walter Packaging of Philadelphia, the country’s largest retail packaging specialist, and former CIO of Neumann College. He was one of the 16 charter members of Datamation’s 21st Century Club.

Speedy encryption

While the DES encryption standard is more suitable to hardware implementation than software implementation, the following table shows how much faster today’s chips are at encrypting data.

Processor                           8088    80286   80386   80486     HP 9000/887
Speed (MHz)                         4.7     6       25      66        125
DES blocks (per second)             370     1,100   5,000   43,000    196,000
DES encryption (bytes/sec)          3,000   9,000   40,000  340,000   1,500,000
Triple DES encryption (bytes/sec)   1,000   3,000   13,000  110,000   500,000

Source: Applied Cryptography: Protocols, Algorithms and Source Code in C, 2nd edition, by Bruce Schneier, John Wiley & Sons, 1995
