Cryptography: A field at the intersection of mathematics and computer science that is concerned with the security of information, typically the confidentiality, integrity and authenticity of some message.
The term cryptography comes from Greek κρυπτός kryptós "hidden," and γράφειν gráfein "to write". In the simplest case, the sender hides (encrypts) a message (plaintext) by converting it to an unreadable jumble of apparently random symbols (ciphertext). The process involves a key, a secret value that controls some of the operations. The intended receiver knows the key, so he can recover the original text (decrypt the message). Someone who intercepts the message sees only apparently random symbols; if the system performs as designed, then without the key an eavesdropper cannot read messages.
Various techniques for obscuring messages have been in use by the military, by spies, and by diplomats for several millennia and in commerce at least since the Renaissance; see History of cryptography for details. With the spread of computers and electronic communication systems in recent decades, cryptography has become much more broadly important.
Banks use cryptography to identify their customers for Automatic Teller Machine (ATM) transactions and to secure messages between the ATM and the bank's computers. Satellite TV companies use it to control who can access various channels. Companies use it to protect proprietary data. Internet protocols use it to provide various security services; see below for details. Cryptography can make email unreadable except by the intended recipients, or protect data on a laptop computer so that a thief cannot get confidential files. Even in the military, where cryptography has been important since the time of Julius Caesar, the range of uses is growing as new computing and communication systems come into play.
With those changes comes a shift in emphasis. Cryptography is, of course, still used to provide secrecy. However, in many cryptographic applications, the issue is authentication rather than secrecy. The personal identification number (PIN) for an ATM card is a secret, but it is not used as a key to hide the transaction; its purpose is to prove that the person at the machine is the customer, not someone with a forged or stolen card. The techniques used for this are somewhat different from those for secrecy, and the techniques for authenticating a person are different from those for authenticating data — for example checking that a message has been received accurately or that a contract has not been altered. However, all these fall in the domain of cryptography. See information security for the different types of authentication, and hashes and public key systems below for techniques used to provide them.
Over the past few decades, cryptography has emerged as an academic discipline. The seminal paper was Claude Shannon's 1949 "Communication Theory of Secrecy Systems"[1].
Today there are journals, conferences, courses, textbooks, a professional association and a great deal of online material; see our bibliography and external links page for details.
In roughly the same period, cryptography has become important in a number of political and legal controversies. Cryptography can be an important tool for personal privacy and freedom of speech, but it can also be used by criminals or terrorists. Should there be legal restrictions? Cryptography can attempt to protect things like e-books or movies from unauthorised access; what should the law say about those uses? Such questions are taken up below and in more detail in a politics of cryptography article.
Up to the early 20th century, cryptography was chiefly concerned with linguistic patterns. Since then the emphasis has shifted, and cryptography now makes extensive use of mathematics, primarily information theory, computational complexity, abstract algebra, and number theory. However, cryptography is not just a branch of mathematics. It might also be considered a branch of information security or of engineering.
As well as being aware of cryptographic history and techniques, and of cryptanalytic methods, cryptographers must also carefully consider probable future developments. For instance, the effects of Moore's Law on the speed of brute force attacks must be taken into account when specifying key lengths, and the potential effects of quantum computing are already being considered. Quantum cryptography is an active research area.
Cryptography is difficult
Cryptography, and more generally information security, is difficult to do well. For one thing, it is inherently hard to design a system that resists efforts by an adversary to compromise it, considering that the opponent may be intelligent and motivated, will look for attacks that the designer did not anticipate, and may have large resources.
To be secure, the system must resist all attacks; to break it, the attacker need only find one effective attack. Moreover, new attacks may be discovered and old ones may be improved or may benefit from new technology, such as faster computers or larger storage devices, but there is no way for attacks to become weaker or a system stronger over time. Schneier calls this "the cryptographer's adage: attacks always get better, they never get worse."[2]
Also, neither the user nor the system designer gets feedback on problems. If your word processor fails or your bank's web site goes down, you see the results and are quite likely to complain to the supplier. If your cryptosystem fails, you may not know. If your bank's cryptosystem fails, they may not know, and may not tell you if they do. If a serious attacker — a criminal breaking into a bank, a government running a monitoring program, an enemy in war, or any other — breaks a cryptosystem, he will certainly not tell the users of that system. If the users become aware of the break, then they will change their system, so it is very much in the attacker's interest to keep the break secret. In a famous example, the British ULTRA project read many German ciphers through most of World War II, and the Germans never realised it.
Cryptographers routinely publish details of their designs and invite attacks. In accordance with Kerckhoffs' Principle, a cryptosystem cannot be considered secure unless it remains safe even when the attacker knows all details except the key in use. A published design that withstands analysis is a candidate for trust, but no unpublished design can be considered trustworthy. Without publication and analysis, there is no basis for trust. Of course "published" has a special meaning in some situations. Someone in a major government cryptographic agency need not make a design public to have it analysed; he need only ask the cryptanalysts down the hall to have a look.
Having a design publicly broken might be a bit embarrassing for the designer, but he can console himself that he is in good company; breaks routinely happen. Even the NSA can get it wrong; Matt Blaze found a flaw[3] in their Clipper chip within weeks of the design being de-classified. Other large organisations can too: Deutsche Telekom's Magenta cipher was broken[4] by a team that included Bruce Schneier within hours of being first made public at an AES candidates' conference. Nor are the experts immune — they may find flaws in other people's ciphers, but that does not mean their own designs are necessarily safe. Blaze and Schneier designed a cipher called MacGuffin[5] that was broken[6] before the end of the conference at which they presented it.
In any case, having a design broken — even broken by (horrors!) some unknown graduate student rather than a famous expert — is far less embarrassing than having a deployed system fall to a malicious attacker. At least when both design and attacks are in public research literature, the designer can either fix any problems that are found or discard one approach and try something different.
The hard part of security system design is not usually the cryptographic techniques that are used in the system. Designing a good cryptographic primitive — a block cipher, stream cipher or cryptographic hash — is indeed a tricky business, but for most applications designing new primitives is unnecessary. Good primitives are readily available; see the linked articles. The hard parts are fitting them together into systems and managing those systems to actually achieve security goals. Schneier's preface to Secrets and Lies[7] discusses this in some detail. His summary:
If you think technology can solve your security problems, then you don't understand the problems and you don't understand the technology.[7]
For links to several papers on the difficulties of cryptography, see our bibliography.
Then there is the optimism of programmers. As with databases and real-time programming, cryptography looks deceptively simple. The basic ideas are indeed simple and almost any programmer can fairly easily implement something that handles straightforward cases. However, as in the other fields, there are also some quite tricky aspects to the problems, and anyone who tackles the hard cases without both some study of relevant theory and considerable practical experience is almost certain to get it wrong. This is demonstrated far too often.
For example, companies that implement their own cryptography as part of a product often end up with something that is easily broken. Examples include the addition of encryption to products like Microsoft Office[8], Netscape[9], Adobe's Portable Document Format (PDF)[10], and many others. Generally, such problems are fixed in later releases. These are major companies, and both programmers and managers on their product teams are presumably competent, but they routinely get the cryptography wrong.
Even when they use standardised cryptographic protocols, they may still mess up the implementation and create large weaknesses. For example, Microsoft's first version of PPTP was vulnerable to a simple attack[11] because of an elementary error in implementation.
There are also failures in products where encryption is central to the design. Almost every company or standards body that designs a cryptosystem in secret, ignoring Kerckhoffs' Principle, produces something that is easily broken. Examples include the Contents Scrambling System (CSS) encryption on DVDs, the WEP encryption in wireless networking[12], and the A5 encryption in GSM cell phones[13].
Such problems are much harder to fix if the flawed designs are included in standards and/or have widely deployed hardware implementations; updating those is much more difficult than releasing a new software version.
Beyond the real difficulties in implementing real products are some systems that both get the cryptography horribly wrong and make extravagant marketing claims. These are often referred to as snake oil.
Principles and terms
Cryptography proper is the study of methods of encryption and decryption. Cryptanalysis or "codebreaking" is the study of how to break into an encrypted message without possession of the key. Methods of defeating cryptosystems have a long history and an extensive literature. Anyone designing or deploying a cryptosystem must take cryptanalytic results into account.
Cryptology ("the study of secrets", from the Greek) is the more general term encompassing both cryptography and cryptanalysis.
"Crypto" is sometimes used as a short form for any of the above.
Codes versus ciphers
In common usage, the term "code" is often used to mean any method of encryption or meaning-concealment. In cryptography, however, code is more specific, meaning a linguistic procedure which replaces a unit of plain text with a code word or code phrase. For example, "apple pie" might replace "attack at dawn". Each code word or code phrase carries a specific meaning.
A cipher (or cypher) is a system of algorithms for encryption and decryption. Ciphers operate at a lower level than codes, using a mathematical operation to convert understandable plaintext into unintelligible ciphertext. The meaning of the material is irrelevant; a cipher just manipulates letters or bits, or groups of those. A cipher takes as input a key and plaintext, and produces ciphertext as output. For decryption, the process is reversed to turn ciphertext back into plaintext.
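The key-plaintext-ciphertext relationship can be sketched in a few lines of Python. This is a deliberately toy XOR cipher with a repeating key, chosen only to show the shape of the operation; it is completely insecure and nothing like a modern cipher:

```python
def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR each byte of data with the key, repeating the key as needed.
    Insecure toy example; shown only to illustrate encrypt/decrypt symmetry."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret"
plaintext = b"attack at dawn"
ciphertext = xor_cipher(key, plaintext)

assert ciphertext != plaintext                    # ciphertext differs from plaintext
assert xor_cipher(key, ciphertext) == plaintext   # the same key reverses the process
```

Note that the same function serves for both encryption and decryption; applying the key-driven operation twice returns the original plaintext.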
Ciphertext should bear no resemblance to the original message. Ideally, it should be indistinguishable from a random string of symbols. Any non-random properties may provide an opening for a skilled cryptanalyst.
The exact operation of a cipher is controlled by a key, which is a secret parameter for the cipher algorithm. The key may be different every day, or even different for every message. By contrast, the operation of a code is controlled by a code book which lists all the codes; these are harder to change.
Codes are not generally practical for lengthy or complex communications, and are difficult to do in software, as they are as much linguistic as mathematical problems. If the only times the messages need to mention are "dawn", "noon", "dusk" and "midnight", then a code is fine; usable code words might be "John", "George", "Paul" and "Ringo". However, if messages must be able to specify things like 11:37 AM, a code is inconvenient. Also, if a code is used many times, an enemy is quite likely to work out that "John" means "dawn" or whatever; there is no long-term security.
An important difference is that changing a code requires retraining users or creating and (securely!) delivering new code books, but changing a cipher key is much easier. If an enemy gets a copy of your codebook (whether or not you are aware of this!), then the code becomes worthless until you replace those books. By contrast, having an enemy get one of your cipher machines or learn the algorithm for a software cipher should do no harm at all — see Kerckhoffs' Principle. If an enemy learns the key, that defeats a cipher, but keys are easily changed; in fact, the procedures for any cipher usage normally include some method for routinely changing the key.
For the above reasons, ciphers are generally preferred in practice. Nevertheless, there are niches where codes are quite useful. A small number of codes can represent a set of operations known to sender and receiver. "Climb Mount Niitaka" was a final order for the Japanese mobile striking fleet to attack Pearl Harbor, while "visit Aunt Shirley" could order a terrorist to trigger a chemical weapon at a particular place. If the codes are not re-used or foolishly chosen (e.g. using "flyboy" for an Air Force officer) and do not have a pattern (e.g. using "Lancelot" and "Galahad" for senior officers, making it easy for an enemy to guess "Arthur" or "Gawain"), then there is no information to help a cryptanalyst and the system is extremely secure.
Codes may also be combined with ciphers. Then if an enemy breaks a cipher, much of what he gets will be code words. Unless he either already knows the code words or has enough broken messages to search for codeword re-use, the code may defeat him even if the cipher did not. For example, if the Americans had intercepted and decrypted a message saying "Climb Mount Niitaka" just before Pearl Harbor, they would likely not have known its meaning.
There are historical examples of enciphered codes or encicodes. There are also methods of embedding code phrases into apparently innocent messages; see steganography below.
In military systems, a fairly elaborate system of code words may be used.
Keying
What a cipher attempts to do is to replace a difficult problem, keeping messages secret, with a much more tractable one, managing a set of keys. Of course this makes the keys critically important. Keys need to be large enough and highly random; those two properties together make them effectively impossible to guess or to find with a brute force search. See cryptographic key for discussion of the various types of key and their properties, and random numbers below for techniques used to generate good ones.
Kerckhoffs' Principle is that no system should be considered secure unless it can resist an attacker who knows all its details except the key. The most fearsome attacker is one with strong motivation, large resources, and few scruples; such an attacker will learn all the other details sooner or later. To defend against him takes a system whose security depends only on keeping the keys secret.
More generally, managing relatively small keys — creating good ones, keeping them secret, ensuring that the right people have them, and changing them from time to time — is not remarkably easy, but it is at least a reasonable proposition in many cases. See key management for the techniques.
However, in almost all cases, it is a bad idea to rely on a system that requires large things to be kept secret. Security through obscurity — designing a system that depends for its security on keeping its inner workings secret — is not usually a good approach. Nor, in most cases, are a one-time pad which needs a key as large as the whole set of messages it will protect, or a code which is only secure as long as the enemy does not have the codebook. There are niches where each of those techniques can be used, but managing large secrets is always problematic and often entirely impractical. In many cases, it is no easier than the original difficult problem, keeping the messages secret.
Basic mechanisms
In describing cryptographic systems, the players are traditionally called Alice and Bob, or just A and B. We use these names throughout the discussion below.
Secret key systems
Until the 1970s, all (publicly known) cryptosystems used secret key or symmetric key cryptography methods. In such a system, there is only one key for a message; that key can be used either to encrypt or decrypt the message, and it must be kept secret. Both the sender and receiver must have the key, and third parties (potential intruders) must be prevented from obtaining the key. Symmetric key encryption may also be called traditional, shared-secret, secret-key, or conventional encryption.
Historically, ciphers worked at the level of letters or groups of letters; see history of cryptography for details. Attacks on them used techniques based largely on linguistic analysis, such as frequency counting; see cryptanalysis.
Types of modern symmetric cipher
On computers, there are two main types of symmetric encryption algorithm:
A block cipher breaks the data up into fixed-size blocks and encrypts each block under control of the key. Since the message length will rarely be an integer number of blocks, there will usually need to be some form of "padding" to make the final block long enough. The block cipher itself defines how a single block is encrypted; modes of operation specify how these operations are combined to achieve some larger goal.
A stream cipher encrypts a stream of input data by combining it with a pseudo-random stream of data; the pseudo-random stream is generated under control of the encryption key.
To a great extent, the two are interchangeable; almost any task that needs a symmetric cipher can be done by either. In particular, any block cipher can be used as a stream cipher in some modes of operation. In general, stream ciphers are faster than block ciphers, and some of them are very easy to implement in hardware; this makes them attractive for dedicated devices. However, which one is used in a particular application depends largely on the type of data to be encrypted. Oversimplifying slightly, stream ciphers work well for streams of data while block ciphers work well for chunks of data. Stream ciphers are the usual technique to encrypt a communication channel, for example in military radio or in cell phones, or to encrypt network traffic at the level of physical links. Block ciphers are usual for things like encrypting disk blocks, or network traffic at the packet level (see IPsec), or email messages (PGP).
Another method, usable manually or on a computer, is a one-time pad. This works much like a stream cipher, but it does not need to generate a pseudo-random stream because its key is a truly random stream as long as the message. This is the only known cipher which is provably secure (provided the key is truly random and no part of it is ever re-used), but it is impractical for most applications because managing such keys is too difficult.
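The one-time pad is simple enough to sketch directly. In this illustrative Python fragment the pad is drawn from a cryptographically strong source and is as long as the message; the provable-security guarantee holds only if the pad is truly random and no part of it is ever reused:

```python
import secrets

def otp(pad: bytes, data: bytes) -> bytes:
    """One-time pad: XOR the message with a truly random pad of equal length.
    The same function decrypts, since XOR is its own inverse."""
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(k ^ b for k, b in zip(pad, data))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))   # truly random, used once, then discarded

ciphertext = otp(pad, message)
assert otp(pad, ciphertext) == message    # the same pad recovers the plaintext
```

The code makes the practical difficulty obvious: for every message, a fresh random key of the same length must somehow be shared securely in advance.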
Key management
More generally, key management is a problem for any secret key system.
- It is critically important to protect keys from unauthorised access; if an enemy obtains the key, then he or she can read all messages ever sent with that key.
- It is necessary to change keys periodically, both to limit the damage if an attacker does get a key and to prevent various attacks which become possible if the enemy can collect a large sample of data encrypted with a single key.
- It is necessary to communicate keys; without a copy of the identical key, the intended receiver cannot decrypt the message.
- It is sometimes necessary to revoke keys, for example if a key is compromised or someone leaves the organisation.
Managing all of these simultaneously is an inherently difficult problem. Moreover, the problem grows quadratically if there are many users. If N users must all be able to communicate with each other securely, then there are N(N−1)/2 possible connections, each of which needs its own key. For large N this becomes quite unmanageable.
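The quadratic growth is easy to see concretely. A small Python calculation of the pairwise-key count N(N−1)/2:

```python
def pairwise_keys(n: int) -> int:
    """Number of distinct keys needed if n users must all talk to each other
    securely: one key per pair, i.e. n*(n-1)/2."""
    return n * (n - 1) // 2

assert pairwise_keys(2) == 1
assert pairwise_keys(10) == 45
assert pairwise_keys(1000) == 499500   # a thousand users already need half a million keys
```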
One problem is where, and how, to safely store the key. In a manual system, you need a key that is long and hard to guess because keys that are short or guessable provide little security. However, such keys are hard to remember and if the user writes them down, then you have to worry about someone looking over his shoulder, or breaking in and copying the key, or the writing making an impression on the next page of a pad, and so on.
On a computer, keys must be protected so that enemies cannot obtain them. Simply storing the key unencrypted in a file or database is a poor strategy. A better method is to encrypt the key and store it in a file that is protected by the file system; this way, only authorized users of the system should be able to read the file. But then, where should one store the key used to encrypt the secret key? It becomes a recursive problem. Also, what about an attacker who can defeat the file system protection? If the key is stored encrypted but you have a program that decrypts and uses it, can an attacker obtain the key via a memory dump or a debugging tool? If a network is involved, can an attacker get keys by intercepting network packets? Can an attacker put a keystroke logger on the machine? If so, he can get everything you type, possibly including keys or passwords.
For access control, a common technique is two factor authentication, combining "something you have" (e.g. your ATM card) with "something you know" (e.g. the PIN). An account number or other identifier stored on the card is combined with the PIN, using a cryptographic hash, and the hash checked before access is granted. In some systems, a third factor is used, a random challenge; this prevents an enemy from reading the hash from one transaction and using it to perform a different transaction.
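A sketch of the challenge-response idea in Python follows. The message layout and field names here are purely illustrative, not any real ATM protocol, and real systems do not hold raw PINs the way this toy does; the point is only that a fresh random challenge makes each response unique, so a captured response cannot be replayed:

```python
import hashlib
import secrets

def response(account: str, pin: str, challenge: bytes) -> bytes:
    """Hash the identifier, the secret PIN, and the server's random challenge.
    Illustrative only; real protocols use standardised constructions."""
    return hashlib.sha256(account.encode() + pin.encode() + challenge).digest()

challenge = secrets.token_bytes(16)            # fresh random challenge per transaction
resp = response("12345678", "9876", challenge)

# The server recomputes the hash from its own records and compares:
assert resp == response("12345678", "9876", challenge)

# The same response is useless against a different transaction's challenge:
assert resp != response("12345678", "9876", secrets.token_bytes(16))
```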
Communicating keys is an even harder problem. With secret key encryption alone, it would not be possible to open up a new secure connection on the Internet, because there would be no safe way initially to transmit the shared key to the other end of the connection without intruders being able to intercept it. A government or major corporation might send someone with a briefcase handcuffed to his wrist, but for many applications this is impractical.
Another problem arises when keys are compromised. Suppose an intruder has broken into Alice's system and it is possible he now has all the keys she knows, or suppose Alice leaves the company to work for a competitor. In either case, all Alice's keys must be replaced; this takes many changes on her system and one change each on every system she communicates with, and all the communication must be done without using any of the compromised keys.
Various techniques can be used to address these difficulties. A centralised key-dispensing server, such as the Kerberos system, is one method. Each user then needs to manage only one key, the one for access to that server; all other keys are provided at need by the server.
The development of public key techniques, described in the next section, allows simpler solutions.
Public key systems
Public key or asymmetric key cryptography was first proposed, in the open literature, in 1976 by Whitfield Diffie and Martin Hellman[14]. The historian David Kahn described it as "the most revolutionary new concept in the field since polyalphabetic substitution emerged in the Renaissance"[15]. There are two reasons why public key cryptography is so important. One is that it solves the key management problem described in the preceding section; the other is that public key techniques are the basis for digital signatures.
In a public key system, keys are created in matched pairs, such that when one of a pair is used to encrypt, the other must be used to decrypt. The system is designed so that calculation of one key from knowledge of the other is computationally infeasible, even though they are necessarily related. Keys are generated secretly, in interrelated pairs. One key from a pair becomes the public key and can be published. The other is the private key and is kept secret, never leaving the user's computer.
In many applications, public keys are widely published — on the net, in the phonebook, on business cards, on key server computers which provide an index of public keys. However, it is also possible to use public key technology while restricting access to public keys; some military systems do this, for example. The point of public keys is not that they must be made public, but that they could be; the security of the system does not depend on keeping them secret.
One big payoff is that two users (traditionally, A and B or Alice and Bob) need not share a secret key in order to communicate securely. When used for content confidentiality, the public key is typically used for encryption, while the private key is used for decryption. If Alice has (a trustworthy, verified copy of) Bob's public key, then she can encrypt with that and know that only Bob can read the message since only he has the matching private key. He can reply securely using her public key. This solves the key management problem. The difficult question of how to communicate secret keys securely does not need to even be asked; the private keys are never communicated and there is no requirement that communication of public keys be done securely.
Moreover, key management on a single system becomes much easier. In a system based on secret keys, if Alice communicates with N people, her system must manage N secret keys all of which change periodically, all of which must sometimes be communicated, and each of which must be kept secret from everyone except the one person it is used with. For a public key system, the main concern is managing her own private key; that generally need not change and it is never communicated to anyone. Of course, she must also manage the public keys for her correspondents. In some ways, this is easier; they are already public and need not be kept secret. However, it is absolutely necessary to authenticate each public key. Consider a philandering husband sending passionate messages to his mistress. If the wife creates a public key in the mistress' name and he does not check the key's origins before using it to encrypt messages, he may get himself in deep trouble.
Public-key encryption is slower than conventional symmetric encryption so it is common to use a public key algorithm for key management but a faster symmetric algorithm for the main data encryption. Such systems are described in more detail below; see hybrid cryptosystems.
The other big payoff is that, given a public key cryptosystem, digital signatures are a straightforward application. The basic principle is that if Alice uses her private key to encrypt some known data then anyone can decrypt with her public key and, if they get the right data, they know (assuming the system is secure and her private key unknown to others) that it was her who did the encryption. In effect, she can use her private key to sign a document. The details are somewhat more complex and are dealt with in a later section.
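The sign-with-private, verify-with-public relationship can be shown with textbook-sized RSA numbers (p = 61, q = 53, so n = 3233). These values are far too small for real use, and real signature schemes also hash and pad the message; this is only the bare arithmetic:

```python
# Toy RSA key pair: public (n, e), private exponent d, with e*d = 1 mod lcm-style
# totient of n. Illustrative numbers only; real keys are thousands of bits.
n, e, d = 3233, 17, 2753

m = 65                      # a message (in practice, a hash of it), as a number below n
signature = pow(m, d, n)    # Alice applies her private key

# Anyone holding the public key can verify the signature came from Alice:
assert pow(signature, e, n) == m
```

Because only Alice knows d, a signature that verifies under her public key (n, e) could only have been produced by her, assuming the system is secure and her private key has not leaked.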
Many different asymmetric techniques have been proposed and some have been shown to be vulnerable to some forms of cryptanalysis; see the public key article for details. The most widely used public techniques today are the Diffie-Hellman key agreement protocol and the RSA public-key system[16]. Techniques based on elliptic curves are also used. The security of each of these techniques depends on the difficulty of some mathematical problem — integer factorisation for RSA, discrete logarithm for Diffie-Hellman, and so on. These problems are generally thought to be hard; no general solution methods efficient enough to provide reasonable attacks are known. However, there is no proof that such methods do not exist. If an efficient solution for one of these problems were found, it would break the corresponding cryptosystem. Also, in some cases there are efficient methods for special classes of the problem, so the cryptosystem must avoid these cases. For example, Wiener's attack on RSA works if the secret exponent is small enough, so any RSA-based system should be designed to choose larger exponents.
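The arithmetic behind Diffie-Hellman key agreement fits in a few lines. The prime below (the Mersenne prime 2^61 − 1) and generator are illustrative only; deployed systems use standardised groups with primes of 2048 bits or more:

```python
import secrets

p = 2**61 - 1   # a prime modulus (toy size; real systems use much larger primes)
g = 2           # a public base value (illustrative choice)

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent, never transmitted
b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent, never transmitted

A = pow(g, a, p)   # Alice sends A over the open channel
B = pow(g, b, p)   # Bob sends B over the open channel

# Each side combines the other's public value with its own secret:
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob   = pow(A, b, p)   # (g^a)^b mod p
assert shared_alice == shared_bob   # both arrive at the same shared secret
```

An eavesdropper sees p, g, A and B, but recovering the shared secret from those is the discrete logarithm problem, believed intractable for properly sized parameters.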
In 1997, it finally became publicly known that asymmetric cryptography had been invented by James H. Ellis at GCHQ, a British intelligence organization, in the early 1970s, and that both the Diffie-Hellman and RSA algorithms had been previously developed[17][18].
Cryptographic hash algorithms
A cryptographic hash or message digest algorithm takes an input of arbitrary size and produces a fixed-size digest, a sort of fingerprint of the input document. Some of the techniques are the same as those used in other cryptography but the goal is quite different. Where ciphers (whether symmetric or asymmetric) provide secrecy, hashes provide authentication.
Using a hash for data integrity protection is straightforward. If Alice hashes the text of a message and appends the hash to the message when she sends it to Bob, then Bob can verify that he got the correct message. He computes a hash from the received message text and compares that to the hash Alice sent. If they compare equal, then he knows (with overwhelming probability, though not with absolute certainty) that the message was received exactly as Alice sent it. Exactly the same method works to ensure that a document extracted from an archive, or a file downloaded from a software distribution site, is as it should be.
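The integrity check described above can be sketched with Python's standard hashlib module:

```python
import hashlib

# Alice hashes the message and sends the digest along with it.
message = b"attack at dawn"
sent_digest = hashlib.sha256(message).hexdigest()

# Bob recomputes the digest of what he received and compares.
received = b"attack at dawn"
assert hashlib.sha256(received).hexdigest() == sent_digest   # message intact

# Any change to the data, even one character, produces a different digest.
corrupted = b"attack at dusk"
assert hashlib.sha256(corrupted).hexdigest() != sent_digest  # change detected
```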
However, the simple technique above is useless against an adversary who intentionally changes the data. The enemy simply calculates a new hash for his changed version and stores or transmits that instead of the original hash. Resisting an adversary requires a keyed hash, known as a hashed message authentication code or HMAC. Sender and receiver share a secret key; the sender hashes using both the key and the document data, and the receiver verifies using both. Lacking the key, the enemy cannot alter the document undetected.
If Alice uses an HMAC and it verifies correctly, then Bob knows both that the received data is correct and that whoever sent it knew the secret key. If the hash is secure and only Alice and Bob know that key, then he knows Alice was the sender, since he did not send the message himself. An HMAC provides source authentication as well as data authentication.
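Python's standard hmac module implements this construction. The key below is a placeholder, and the comparison uses a constant-time function, which avoids leaking information through timing differences:

```python
import hashlib
import hmac

key = b"placeholder-shared-secret"    # known only to Alice and Bob (toy value)
message = b"Pay Carol 100 dollars"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()   # Alice sends message + tag

# Bob recomputes the tag with the shared key; compare_digest resists timing attacks.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)

# An attacker without the key cannot produce a matching tag.
forged = hmac.new(b"wrong-key", message, hashlib.sha256).hexdigest()
assert not hmac.compare_digest(tag, forged)
```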
See also One-way encryption.
Random numbers
Many cryptographic operations require random numbers, and the design of strong random number generators is considered part of cryptography. It is not enough that the outputs have good statistical properties; the generator must also withstand efforts by an adversary to compromise it. Many cryptographic systems, including some otherwise quite good ones, have been broken because a poor quality random number generator was the weak link that gave a cryptanalyst an opening.
For example, generating RSA keys requires large random primes, Diffie-Hellman key agreement requires that each system provide a random component, and in a challenge-response protocol the challenges should be random. Many protocols use session keys for parts of the communication; for example, PGP uses a different key for each message and IPsec changes keys periodically. These keys should be random. In any of these applications, and many more, using poor quality random numbers greatly weakens the system.
The requirements for random numbers that resist an adversary — someone who wants to cheat at a casino or read an encrypted message — are much stricter than those for non-adversarial applications such as a simulation. The standard reference is the "Randomness Requirements for Security" RFC.[19]
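In practice this means drawing from a cryptographically strong generator rather than a statistical one. As a sketch, Python's secrets module uses the operating system's cryptographic random source, unlike the random module, whose output is predictable and unsuitable for these purposes:

```python
import secrets

# Cryptographic-quality random values from the OS random source.
session_key = secrets.token_bytes(16)   # 128-bit symmetric session key
challenge = secrets.token_hex(16)       # random challenge for a challenge-response protocol
nonce = secrets.randbits(64)            # random 64-bit value, e.g. for a protocol nonce

assert len(session_key) == 16
assert len(challenge) == 32             # 16 bytes rendered as 32 hex digits
```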
One-way encryption
See One-way encryption.
Steganography
Steganography is the study of techniques for hiding a secret message within an apparently innocent message. If this is done well, it can be extremely difficult to detect.
Generally, the best place for steganographic hiding is in a large chunk of data with an inherent random noise component — photos, audio, or especially video. For example, given an image with one megapixel and three bytes for different colours in each pixel, one could hide three megabits of message in the least significant bits of each byte, with reasonable hope that the change to the image would be unnoticeable. Encrypting the data before hiding it steganographically is a common practice; because encrypted data appears quite random, this can make steganography very difficult to detect. In our example, three megabits of encrypted data would look very much like the random noise which one might expect in the least significant bits of a photographic image that had not been tampered with.
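The least-significant-bit scheme described above can be sketched in a few lines. The byte values here are made-up stand-ins for real pixel data:

```python
# Toy least-significant-bit steganography: hide one message bit per carrier byte.
def hide(carrier: bytes, bits: list) -> bytes:
    stego = bytearray(carrier)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the lowest bit
    return bytes(stego)

def extract(stego: bytes, n_bits: int) -> list:
    return [byte & 1 for byte in stego[:n_bits]]

pixels = bytes([200, 13, 77, 91, 164, 250, 33, 8])   # stand-in for image data
secret = [1, 0, 1, 1, 0, 0, 1, 0]

stego = hide(pixels, secret)
assert extract(stego, len(secret)) == secret
# Each byte changes by at most 1, so the "image" is visually unchanged.
assert all(abs(p - s) <= 1 for p, s in zip(pixels, stego))
```

If the hidden bits come from encrypted data, they are statistically indistinguishable from the sensor noise one expects in those low-order bits, which is what makes detection hard.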
One application of steganography is to place a digital watermark in a media work; this might allow the originator to prove that someone had copied the work, or in some cases to trace which customer had allowed copying.
However, a message can be concealed in almost anything. In text, non-printing characters can be used — for example a space added before a line break does not appear on a reader's screen and an extra space after a period might not be noticed — or the text itself may include hidden messages. For example, in Neal Stephenson's novel Cryptonomicon, one message is coded as an email joke; a joke about Imelda Marcos carries one meaning, while one about Ferdinand Marcos would carry another.
Often indirect methods are used. If Alice wants to send a disguised message to Bob, she need not directly send him a covering message. That would tell an enemy doing traffic analysis at least that A and B were communicating. Instead, she can place a classified ad containing a code phrase in the local newspaper, or put a photo or video with a steganographically hidden message on a web site. This makes detection quite difficult, though it is still possible for an enemy that monitors everything, or for one that already suspects something and is closely monitoring both Alice and Bob.
A related technique is the covert channel, where the hidden message is not embedded within the visible message, but rather carried by some other aspect of the communication. For example, one might vary the time between characters of a legitimate message in such a way that the time intervals encoded a covert message.
Combination mechanisms
The basic techniques described above can be combined in many ways. Some common combinations are described here.
Digital signatures
Producing a digital signature uses two cryptographic techniques together: a hash and a public key system.
Alice calculates a hash from the message, encrypts that hash with her private key, combines the encrypted hash with some identifying information to say who is signing the message, and appends the combination to the message as a signature.
To verify the signature, Bob uses the identifying information to look up Alice's public key and checks signatures or certificates to verify the key. He uses that public key to decrypt the hash in the signature; this gives him the hash Alice calculated. He then hashes the received message body himself to get another hash value and compares the two hashes. If the two hash values are identical, then Bob knows with overwhelming probability that the document Alice signed and the document he received are identical. He also knows that whoever generated the signature had Alice's private key. If both the hash and the public key system used are secure, and no-one except Alice knows her private key, then the signatures are trustworthy.
A digital signature has some of the desirable properties of an ordinary signature. It is easy for a user to produce, but difficult for anyone else to forge. The signature is permanently tied to the content of the message being signed, and to the identity of the signer. It cannot be copied from one document to another, or used with an altered document, since the different document would give a different hash. A miscreant cannot sign in someone else's name because he does not know the required private key.
Any public key technique can provide digital signatures. The RSA algorithm is widely used, as is the US government standard Digital Signature Algorithm (DSA).
Once you have digital signatures, a whole range of other applications can be built using them. Many software distributions are signed by the developers; users can check the signatures before installing. Some operating systems will not load a driver unless it has the right signature. On Usenet, things like new group commands and NoCeMs [1] carry a signature. The digital equivalent of having a document notarised is to get a trusted party to sign a combination document — the original document plus identifying information for the notary, a time stamp, and perhaps other data.
See also the next two sections, "Digital certificates" and "Public key infrastructure".
The use of digital signatures raises legal issues. There is an online survey of relevant laws in various countries.
Digital certificates
Digital certificates are the digital analog of an identification document such as a driver's license, passport, or business license. Like those documents, they usually have expiration dates, and a means of verifying both the validity of the certificate and of the certificate issuer. Like those documents, they can sometimes be revoked.
The technology for generating these is in principle straightforward; simply assemble the appropriate data, munge it into the appropriate format, and have the appropriate authority digitally sign it. In practice, it is often rather complex.
Public key infrastructure
Practical use of asymmetric cryptography, on any sizable basis, requires a public key infrastructure (PKI). It is not enough to just have public key technology; there need to be procedures for signing things, verifying keys, revoking keys and so on.
In typical PKIs, public keys are embedded in digital certificates issued by a certification authority. In the event of compromise of the private key, the certification authority can revoke the key by adding it to a certificate revocation list. There is often a hierarchy of certificates, for example a school's certificate might be issued by a local school board which is certified by the state education department, that by the national education office, and that by the national government master key.
An alternative non-hierarchical web of trust model is used in PGP. Any key can sign any other; digital certificates are not required. Alice might accept the school's key as valid because her friend Bob is a parent there and has signed the school's key. Or because the principal gave her a business card with his key on it and he has signed the school key. Or both. Or some other combination; Carol has signed Dave's key and he signed the school's. It becomes fairly tricky to decide whether that last one justifies accepting the school key, however.
Hybrid cryptosystems
Most real applications combine several of the above techniques into a hybrid cryptosystem. Public-key encryption is slower than conventional symmetric encryption, so a symmetric algorithm does the bulk data encryption. On the other hand, public key techniques handle the key management problem well, which is difficult with symmetric encryption alone, so public key methods manage the keys. Neither symmetric nor public key methods are ideal for data authentication; a hash is used for that. Many of the protocols also need cryptographic quality random numbers.
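A hybrid design can be sketched as follows. The shared secret stands in for the output of a key-agreement protocol such as Diffie-Hellman, and the keystream derived by hashing a counter is a stand-in for a real stream or block cipher; the point is the division of labour between the pieces.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Toy keystream from SHA-256 in counter mode -- stands in for a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

shared_secret = secrets.token_bytes(32)     # stands in for a key-agreement result

# Derive separate keys for encryption and authentication from the shared secret.
enc_key = hashlib.sha256(shared_secret + b"enc").digest()
mac_key = hashlib.sha256(shared_secret + b"mac").digest()

plaintext = b"hybrid systems combine several primitives"
pad = keystream(enc_key, len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()   # authenticate ciphertext

# Receiver: check the tag first, then decrypt with the same keystream.
assert hmac.compare_digest(tag, hmac.new(mac_key, ciphertext, hashlib.sha256).digest())
recovered = bytes(c ^ k for c, k in zip(ciphertext, keystream(enc_key, len(ciphertext))))
assert recovered == plaintext
```

Deriving distinct encryption and MAC keys from one shared secret, and authenticating the ciphertext rather than the plaintext, are both standard practice in real protocols.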
Examples abound, each using a somewhat different combination of methods to meet its particular application requirements.
In Pretty Good Privacy (PGP) email encryption the sender generates a random key for the symmetric bulk encryption and uses public key techniques to securely deliver that key to the receiver. Hashes are used in generating digital signatures.
In IPsec (Internet Protocol Security) public key techniques provide source authentication for the gateway computers which manage the tunnel. Keys are set up using the Diffie-Hellman key agreement protocol and the actual data packets are (generally) encrypted with a block cipher and authenticated with an HMAC.
In Secure Sockets Layer (SSL) or the later version Transport Layer Security (TLS) which provides secure web browsing (https), digital certificates are used for source authentication and connections are generally encrypted with a stream cipher.
Cryptographic hardware
Historically, many ciphers were done with pencil and paper but various mechanical and electronic devices were also used. See history of cryptography for details. Various machines were also used for cryptanalysis, the most famous example being the British ULTRA project during the Second World War which made extensive use of mechanical and electronic devices in cracking German ciphers.
Modern ciphers are generally algorithms which can run on any general purpose computer, though there are exceptions such as Solitaire, which is designed for manual use. However, ciphers are also often implemented in hardware; as a general rule, anything that can be implemented in software can also be done with an FPGA or a custom chip. This can produce considerable speedups or cost savings.
The RSA algorithm requires arithmetic operations on quantities of 1024 or more bits. This can be done on a general-purpose computer, but special hardware can make it quite a bit faster. Such hardware is therefore a fairly common component of boards designed to accelerate SSL.
Hardware encryption devices are used in a variety of applications. Block ciphers are often done in hardware; the Data Encryption Standard was originally intended to be implemented only in hardware. For the Advanced Encryption Standard, there are a number of AES chips on the market and Intel are adding AES instructions to their CPUs. Hardware implementations of stream ciphers are widely used to encrypt communication channels, for example in cell phones or military radio. The NSA have developed an encryption box for local area networks called TACLANE.
When encryption is implemented in hardware, it is necessary to defend against side channel attacks, for example to prevent an enemy analysing or even manipulating the radio frequency output or the power usage of the device.
Hardware can also be used to facilitate attacks on ciphers. Brute force attacks on ciphers work very well on parallel hardware; in effect you can make them as fast as you like if you have the budget. Many machines have been proposed, and several actually built, for attacking the Data Encryption Standard in this way; for details see the DES article.
A device called TWIRL [20] has been proposed for rapid factoring of 1024-bit numbers; for details see attacks on RSA.
Legal and political issues
A number of legal and political issues arise in connection with cryptography. In particular, government regulations controlling the use or export of cryptography are passionately debated. For detailed discussion, see politics of cryptography.
There were extensive debates over cryptography, sometimes called the "crypto wars", mainly in the 1990s. The main advocates of strong controls over cryptography were various governments, especially the US government. On the other side were many computer and Internet companies, a loose coalition of radicals called cypherpunks, and advocacy groups such as the Electronic Frontier Foundation. One major issue was export controls; another was attempts such as the Clipper chip to enforce escrowed encryption or "government access to keys". A history of this fight is Steven Levy's Crypto: How the Code Rebels Beat the Government — Saving Privacy in the Digital Age[21]. "Code Rebels" in the title is almost synonymous with cypherpunks.
Encryption used for Digital Rights Management also sets off passionate debates and raises legal issues. Should it be illegal for a user to defeat the encryption on a DVD? Or for a movie cartel to manipulate the market using encryption in an attempt to force Australians to pay higher prices because US DVDs will not play on their machines?
The legal status of digital signatures can be an issue, and cryptographic techniques may affect the acceptability of computer data as evidence. Access to data can be an issue: can a warrant or a tax auditor force someone to decrypt data, or even to turn over the key? The British Regulation of Investigatory Powers Act includes such provisions. Is encrypting a laptop hard drive when traveling just a reasonable precaution, or is it reason for a border control officer to become suspicious?
Two online surveys cover cryptography laws around the world, one for usage and export restrictions and one for digital signatures.
References
- ↑ C. E. Shannon (1949). Communication Theory of Secrecy Systems.
- ↑ Bruce Schneier (July 2009), Another New AES Attack
- ↑ Matt Blaze (1994), Protocol failure in the escrowed encryption standard
- ↑ Eli Biham, Alex Biryukov, Niels Ferguson, Lars Knudsen, Bruce Schneier and Adi Shamir (April 1999), Cryptanalysis of Magenta
- ↑ Matt Blaze and Bruce Schneier (1995), The MacGuffin Block Cipher Algorithm
- ↑ Vincent Rijmen & Bart Preneel (1995), Cryptanalysis of McGuffin
- ↑ 7.0 7.1 Bruce Schneier (2000), Secrets & Lies: Digital Security in a Networked World, ISBN 0-471-25311-1
- ↑ Hongjun Wu. The Misuse of RC4 in Microsoft Word and Excel.
- ↑ Ian Goldberg and David Wagner (January 1996). Randomness and the Netscape Browser: How secure is the World Wide Web?.
- ↑ David Touretsky. Gallery of Adobe Remedies.
- ↑ Bruce Schneier and Mudge, Cryptanalysis of Microsoft's Point-to-Point Tunneling Protocol (PPTP), ACM Press
- ↑ Nikita Borisov, Ian Goldberg, and David Wagner. Security of the WEP algorithm.
- ↑ Greg Rose. A precis of the new attacks on GSM encryption.
- ↑ Diffie, Whitfield (June 8, 1976), "Multi-user cryptographic techniques", AFIPS Proceedings 45: 109-112
- ↑ David Kahn, "Cryptology Goes Public", 58 Foreign Affairs 141, 151 (fall 1979), p. 153
- ↑ Rivest, Ronald L.; Adi Shamir & Len Adleman, A Method for Obtaining Digital Signatures and Public-Key Cryptosystems
- ↑ Clifford Cocks (November 1973), "A Note on 'Non-Secret Encryption'", CESG Research Report
- ↑ Malcolm Williamson (1974), "Non-Secret Encryption Using a Finite Field", CESG Research Report
- ↑ Eastlake, D. 3rd; J. Schiller & S. Crocker (June 2005), Randomness Requirements for Security, RFC 4086; IETF Best Current Practice 106
- ↑ Adi Shamir & Eran Tromer (2003). On the cost of factoring RSA-1024.
- ↑ Steven Levy (2001). Crypto: How the Code Rebels Beat the Government — Saving Privacy in the Digital Age. Penguin, 56. ISBN 0-14-024432-8.
[Navigation grid; one row of links per letter for each of the following lists:]
A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z
Use in English
Alphabetical word list
Retroalphabetical list
Common misspellings
English is notorious for its many varied, inconsistent and irregular spellings. This can be seen at its most extravagant in the field of proper nouns—for example, simply adding an 'h' to 'Maria' to make it rhyme with 'pariah', or calling oneself 'Cholmondeley Featherstonehaugh' while pronouncing it 'Chumley Fanshaw'. An example of a common misspelling is 'disasterous' for 'disastrous', retaining the 'e' of 'disaster'. Many words do not turn out to have the pronunciation they appear to have: 'do' and 'to' do not rhyme with 'go' and 'no', while 'seismic', instead of being 'seezmic' or 'sayzmic', or even 'sayizmic', is in fact 'size-mic'. The above grid (reproduced and explained below) provides links to three lists and a cluster of articles devoted to these things.
To show pronunciation, these articles use correct spellings with added accent marks, instead of relying on the International Phonetic Alphabet (IPA). In some cases incorrect respellings are placed next to the correct ones, signalled by a preceding asterisk, like thís *thíss. The accent marks show pronunciation, thús. A table of these accents (which are not part of the language[1]) can be found below; there is also an IPA key at English phonemes. Where there is more than one accent, the first is stressed, and the same is true after a hyphen, so in the respelling of Tchaikóvsky, *Chŷ-kóffskỳ, it is 'kóff' that has the main stress. (Another way of showing new stress is with a bar: Tchaî|kóvsky.) A sentence from the preceding paragraph can thus be rewritten as follows: "An example of a common misspelling is *disāsterous for disāstrous, retaining the E of disāster." Respelling may be used to exemplify an incorrect spelling, or show a correct pronunciation, or a bit of both. Unlike the IPA, where there can only be one version per pronunciation, as there must be an unambiguous one-to-one correspondence, there can be many respellings: if *disāsterous for disāstrous is a common mistake, we can also represent the pronunciation as *dizāstrus or *dizāstrous or *dizāstrəss (with 'ə', a special character – the only one used – for schwa); or we can contrast British English *dizàstrus with American *dizástrus.
Particular attention is given to homophones, words with the same pronunciation but different meanings. English is rich in homophones, many of which are also homonyms, having also the same spelling, as, for example, cán able, tin (the italicised words suggest meanings, in this case two); while homographs are words with the same spelling whose meanings are distinguished by different pronunciations.
Also of special note are words that many writers incorrectly divide. ôver and dûe, for example, combine to form overdûe, without a space in the middle. Such examples are included with ‘one word’ alongside them: alongsîde one word.
An equals sign = is placed between homophones (in some cases the approximately equals sign ≈ is more appropriate). Homographs and other similar-looking words are included after 'cf.' (Latin conferre, 'compare').
Some words from other languages, in most cases French, may sometimes appear in English with accents from those languages. Here, such spellings are shown using bold italics: touchè may be written with a French accent: touché *tooshây.
The apostrophe is an important part of spelling and so it is treated as a letter, with its own place at the end of the alphabet.
Fragments of words are in bold when correctly spelt: Ukrâine has -âine, not -âne.
Words in italics are used to suggest meanings (e.g. sêa water = sêe vision, where the equals sign denotes identical pronunciation). Words beginning with an initial capital may have no word in italics following: these are names of people, either personal or family, and/or commercial or place names. Such words are included because they often contrast with the spellings of homophones: a bank clerk might be named Clàrk or Clàrke, but probably not 'Clerk' (though BrE clerk = Clàrk/Clàrke). Unusual spellings can be explained by regular ones: Cloúgh = Clúff. An American called Maurìce Mŏrris could just as well be called Mórris Maurice ("Morris Morris") in Britain, where Maurice = Mórris (although it would be putting the conventional surname before the conventional given name).
Links to letter articles and lists
[Navigation grid; one row of links per letter for each of the following lists:]
A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z
Use in English
Alphabetical word list
Retroalphabetical list
Common misspellings
In the navigation table above (reproduced at the top of each article in the cluster) the cells in each row link as follows:
- Top row: articles on each letter and its use in English. There are similar articles on GH, the apostrophe and the hyphen.
- Second row: alphabetical lists of commonly misspelt and/or mispronounced words, alongside more regular words they may be confused with (words beginning with an apostrophe are here). Some incorrect spellings are also listed, signalled by an asterisk: *dispánd disbánd means that the word is 'disband'. (The bottom row is devoted entirely to misspellings and typos.)
- Third row: retroalphabetical lists, arranged alphabetically according to the final letter of the word and continuing backwards through it:
- In the retroalphabetical lists the headword is on the right. In this way, suffixes and other word endings can be seen grouped together, just as prefixes can be seen in normal alphabetical order. So, instead of ádd båll coúsin, we have réplicA fláB plástiC; and so for mûsiC, see under -C, for mûsicaL, see under -L, for pàrticlE, see under -E, and so on.
- Some suffixes are included separately; their pronunciation may or may not apply to following words ("always -ãrian" means there is no other pronunciation of -ãrian).
- Throughout, the apostrophe is treated as the last letter, after Z. (Words ending in an apostrophe are also here.)
- For clarity, italic association words are to the left of the example word:
- woman mâid = make mâde
- Some incorrect spellings are listed retroalphabetically, in which case the misspelling goes on the right, just as in the alphabetical list:
- wêasel *wêasal
- Bottom row: common misspellings including typos (blue-linked for checking purposes), followed by the correct versions.
Two main varieties are distinguished: British English (BrE), that of the UK and much of the Commonwealth (see also Commonwealth English), and American English (AmE), that of the USA and Canada (without the cåught = cót merger that has occurred in some parts of North America).
Unlike dictionaries, the lists include personal and place names for their own sake and for contrast.
Table of accents
These accents are intended to show the pronunciation while retaining the spelling: they are not part of the language. Those on i and y show the same sound; similarly with u, oo and w. Accented vowels are stressed (ỳ is normally unstressed, as in háppy). ā, not in the table, means that the sound is à in standard British and Commonwealth pronunciations but á in American and other British and Commonwealth speech.
Front vowels: e, i, y. Back vowels: a, o, u, oo, w.

| Sound | e | i | y[2] | a | o | u | oo | w[3] |
| The typical short sound, never occurring at the end of a word (acute accent) | pét | pít | crýpt | cát | dóg[4] | nút | | |
| The typical long sound, corresponding to the names of the letters A, E, I, O and U (circumflex accent) | sêe | nîce | mŷ | nâme | nôse | rûle | toô | neŵ |
| Sounds shown with the grave accent (ẁ- and qù- indicate the BrE ó sound of the following a, ẁad rhyming with qùad; òu and òw are diphthongs sounding like àù in àùtobahn: nòw has this sòund) | èight (= â) | machìne (= ê) | quaỳ water = kêỳ lock (= ê) | àre | òther, blòod (= ú) | fùll (= oò), qùantity (= w) | foòt (= ù) | ẁant (= wó)[5] |
| The ër sound (umlaut accent) | përson | bïrd | mÿrtle | (ëarth) | wörd | pürr | | |
| The åw/ŏr sound (ring accent)[6] | | (cŏin) | (jŏy) | åll | mŏre | (for some BrE speakers) sůre | | |
| The ãir sound (tilde accent) | (thére) | (ãir = | Ãyr) | stãre | | | | |
| Irregular (respelling needed) | sew (= sô) | meringue (*məráng) | | because (*bikóz) | woman (*wùmən), women (*wímən) | business (*bízníss) | | |
Example sentences
These sentences show how the accents may be used, for example, when teaching pronunciation. Words without accents are monosyllables with the schwa sound, a neutral grunt.
The usual short sound, acute accent:
The gínger cát was jéalous of the bláck cát: howéver, the tábby was a véry dífferent mátter - the stúff of réveries, ín fáct.
The usual long sound, circumflex accent:
Sây mŷ nâme thrêe tîmes with stŷle and Î’ll gô and fînd a tûne to plây for yoû.
The third sound, grave accent:
Christìna Grèy shoùld (and dòes?) lòve her mòther and fàther.
The ër sound, umlaut:
But fïrst, Mÿrtle, fürther dïrty, ïrksome and distürbing wörk for the nürses.
The ŏr sound (sůre here is with British pronunciation = Shåw), the ring, or half-ring:
Sůre yoû ŏught to cråwl ón åll fŏurs, m’lŏrd?
Irregular, without accent, instead with respelling:
Many women? Any woman! (pronounced: *Ménny wímmin? Énny wùman!)
Double letters
The following alphabetical table shows examples of how letters can be doubled in English.
Double consonant letters before suffixes are used (as often elsewhere) to preserve short vowel sounds, as in flípped (not *flîped), rebélled (not *rebêled) and pégged (not *pêged, which if regular would in any case be pronounced *pêjed). Compare scrâped, past of scrâpe, and scrápped, from scráp. In the case of t, doubling it after an unstressed vowel and before a suffix may seem unnecessary, but in some cases it can be doubled before -ed: either tàrgeted or tàrgetted (but always commítted).
The sign # indicates a double letter that is rare in that position; capital-letter words indicate that the double letter in this position is only found in names. An asterisk (*) indicates a respelling to show pronunciation, and an equals sign (=) introduces a homophone.
| letter | initial | medial | final | final + silent e |
| A | àardvark #[7] | bazàar # | bàa # | |
| B | | ríbbon | ébb # | Crábbe (= cráb) |
| C | | sóccer (*sócker), accépt (*əxépt) | | |
| D | | hídden | ádd | |
| E | êel | bêen | sêe | |
| F | Ffoùlkes | éffort | óff | Clíffe (= clíff) |
| G | | aggréssion (-g-), exággerate (-j-) | égg # | Légge (= lég) |
| H | | hítchhike # (accidental) | | |
| I | | skìíng # | Hawàìi # | |
| J | | | hàjj # (also spelt hàdj) | |
| K | | púkka; boòkkeeper (accidental)[8] # | | |
| L | llàma[9] # | fílling | wéll | bélle beauty (= béll ring) |
| M | | súmmer | Crámm (= crám) | grámme (= grám) |
| N | | dínner | ínn # pub | Ánne (= Ánn) |
| O | oôze, oòmph # | foôd, foòt, flòod, doŏr | toô | Loôe (= loô) |
| P | | flípped | | stéppe Asia # (= stép foot) |
| Q | | Sadìqqi # | | |
| R | | érror | pürr | |
| S | | méssy | lóss | crevásse |
| T | | bétter | ẁatt | couchétte -sh- |
| U | | vácuum # (*vákyoôm) | | |
| V | | révved # | | |
| W | | Lawwell # (accidental) | | |
| X | | Éxxon ™ # | Bób B. Sóxx # | |
| Y | | Khayyàm # | | |
| Z | | fízzy | búzz | |
Names of the letters
The names of the letters of the alphabet are rarely written out in English (a simple capital being the normal usage: "with a C, not a K") so that, unlike in many other languages, most of their spellings have a rather unofficial status. But they can be shown as follows, using real words where possible:
A: â (the indefinite article, when stressed), èh? what?
B: bê exist, bêe sting
C: occasionally cêe; sêe look, sêa ship
D: Dêe River, surname
E: ê as in êmail, ê-mail
F: éff as in the euphemism éff óff
G: gêe up, exclamation *jêe
H: âitch as in drópping your âitches
I: Î me, eŷe vision
J: jây bird
K: Kây person
L: él elevated railway (AmE)
M: ém dash
N: én dash
O: ôwe debt, ôh! exclamation
P: pêa pod, pêe urine, p pence (BrE)
Q: queûe line, cûe ball, prompt
R: àre be, BrE àh exclamation
S: occasionally éss
T: têa drink, têe golf, tì do-re-mi
U: yoû me, eŵe sheep
V: Vêe Bobby
W: "doúble you" (*dúblyu; cf. vácûum, which actually does have a doúble Û)
X: éx- past
Y: whŷ reason (voiced w, as in BrE)
Z: BrE zéd, AmE zêe
The Chaos
by Gerard Nolst Trenité
This poem on pronunciation irregularities was first published in 1920. Accent marks, respellings and editorial comments have been added to reflect current British English pronunciation. The unadorned poem, with an introduction, can be found here.
The Châós (*câyóss)
Dêarest crêature ín creâtion
Stúdying English (*Ínglish) pronunciâtion,
- Î wíll têach yoû ín mŷ vërse
- Sòunds lîke cŏrpse, cŏrps (*cŏr), hŏrse and wörse.
Î wíll kêep yoû, Sûsy, busy (*bízzy),
Mâke yŏur héad wíth hêat grôw dízzy;
- Têar ín eŷe, yŏur dréss yŏu'll téar;
- Quêer, fãir sêer (*sêe-er), hêar mŷ prãyer.
Prây, consôle yŏur lòving pôet,
Mâke mŷ côat loòk neŵ, dêar, sew (=sô) ít!
- Júst compãre heàrt, hêar and hëard,
- Dîes and dîet (*dîət), lŏrd and wörd.
Swŏrd (*sŏrd) and swård, retâin and Brítain
[Mînd the látter hòw ít's wrítten].
- Mâde hás nót the sòund of báde,
- Sây–said (*séd), pây–pâid, lâid but pláid.
Nòw Î sůrely wíll nót plâgue yoû
Wíth súch wörds as vâgue and âgûe,
- Bút bê cãreful hòw yoû spêak,
- Sây: gúsh, bùsh, steâk, strêak, breâk, blêak,
Prêvious, précious, fûchsia (*feŵsha), vîa,
Récipê, pîpe, stúdding-sâil, choîr (=quîre);
- Wôven, óven, hòw and lôw,
- Scrípt, recêipt (*rissêet), shoe (=shoô), pôem, tôe.
Sây, expécting fråud and tríckerỳ:
Dåughter (*dåwter), làughter (*làfter) ánd Terpsíchorê (*Terpsíckery),
- Brànch, rànch, mêasles, tópsails, aîsles (*îles),
- Míssîles, símilês, revîles.
Whôlly (=hôly), hólly, sígnal, sîgning (*sîning),
Sâme, exámining, but mîning,
- Schólar (*scóllar), vícar, and cigàr,
- Sôlar, mîca, wår and fàr.
From "desîre": desîrable - ádmirable from "admîre",
Lúmber, plúmber, biêr, but brîer,
- Tópsham, broûgham (*breŵəm), renòwn, but knôwn,
- Knówledge, dòne, lône, góne, nòne, tône,
Òne (=wòn), anémonê, Balmóral,
Kítchen, lîchen (=lîken), låundry, laurel (lórrel).
- Gërtrude, Gërman (J-), wínd and wînd,
- Beau (=Bô), kînd, kíndred, queûe, mankînd,
Tŏrtoise (*tŏrtus), türquŏise, chámois-léather (*shámwà-),
Rêading, Réading, hêathen, héather.
- Thís phonétic lábyrínth
- Gíves móss, grôss, broòk, brôoch, nînth, plínth.
Háve yoû éver yét endéavoured
To (=toô)[10] pronòunce revêred and sévered,
- Dêmon, lémon, ghoûl, fòul, sôul,
- Pêter, pétrol and patrôl?
Bíllet dòes nót énd lîke bállèt (*bállây);
Boûquèt, ẁallet, mállet, chálèt.
- Blòod and flòod are nót lîke foôd,
- Nŏr ís môuld lîke shoùld and woùld (=woòd).
Bánquet ís nót nêarly pàrquèt,
Whích exáctly rhŷmes wíth khàkì. —not usually nowadays
- Díscòunt, vîscòunt (*vîcòunt), lôad and brŏad,
- Towård, to fŏrward, to (=toô) rewård,
Rícochèted and crôchèting, crôquèt?
Rîght! Yŏur pronunciâtion's OK.[11]
- Ròunded, woûnded, griêve and síeve,
- Friénd and fiênd, alîve and líve.
Ís yŏur R corréct ín hîgher?
Kêats assërts ít rhŷmes Thalîa.
- Hûgh, but húg, and hoòd, but hoôt,
- Buŏyant, mínute, bút minûte.
Sây abscíssion wíth precísion,
Nòw: posítion ánd transítion;
- Woùld ít tálly wíth mŷ rhŷme
- Íf Î méntioned páradîgm?
Twòpence, thréepence, têase are êasy,
But cêase, crêase, grêase and grêasy?
- Cŏrnice, nîce, valìse, revîse,
- Râbíes, but lúllabîes.
Óf súch púzzling wörds as nåuseous,
Rhŷming wéll wíth cåutious, tŏrtious,
- Yoû'll envélop lísts, Î hôpe,
- Ín a línen énvelôpe.
Woùld yoû lîke some mŏre? Yoû'll háve ít!
Áffidâvit, Dâvid, dávit.
- To (=toô) abjûre, to përjure. Shèik
- Dòes nót sòund lîke Czéch but âche.
Líberty, lîbrary, hêave and héaven,
Râchel, lóch, moustàche, eléven.
- Wê sây hállôwed, bút allòwed,
- Pêople, léopard, tôwed but vòwed.
Màrk the dífference, moreôver,
Betwêen mover (*moôver), plòver, Dôver.
- Lêaches, brêeches, wîse, precîse,
- Chálíce, bút polìce and lîce,
Cámel, cònstable, únstâble,
Prínciple, discîple, lâbel.
- Pétal, pênal, and canál,
- Wâit, surmîse, pláit, prómíse, pál,
Sûit, suìte, rûín. Cïrcuít, cónduít
Rhŷme wíth "shïrk ít" and "beyónd ít". —still?
- Bút ít ís nót hàrd to téll
- Whŷ ít's påll, måll, but Páll Máll.
Múscle, múscular, gâol (=jâil), îron,
Tímber, clîmber, búllion, lîon,
- Wörm and stŏrm, chaise (*shézz), châós, chãir,
- Sénator, spectâtor, mãyor,
Îvy, prívy, fâmous; clámour
Hás thê Â of dráchm and hámmer.
- Pùssy, hússy ánd posséss,
- Désert, but desërt, addréss.
Gôlf, wolf (=Woòlf), còuntenance, lieuténants
Hŏist ín lieû of flágs léft pénnants.
- Coùrier, cŏurtier, tomb (*toôm), bómb, cômb,
- Còw, but Cowper (=Coôper), sòme and hôme.
"Sôlder, sôldier! Blòod ís thícker",
Quôth hê, "than liqueûr ŏr líquor",
- Mâking, ít ís sád but trûe,
- Ín bravàdo, múch ado (*adoô).
Strânger dòes nót rhŷme wíth ánger,
Neîther dòes devòur wíth clángour. —neither does anger: *áng-gə
- Pîlot, pívot, gåunt, but āunt,
- Fónt, frònt, wônt, wånt, gránd and grānt.
Àrsenic, specífic, scênic,
Rélic, rhétoric, hygìênic.
- Goòseberry, goôse, and clôse, but clôse,
- Páradise, rîse, rôse, and dôse.
Sây invèigh, nèigh, but invêigle,
Mâke the látter rhŷme wíth êagle.
- Mînd! Mêándering but mêan,
- Válentîne and mágazìne.
Ánd Î bét yoû, dêar, a pénny,
Yoû sây máni-(fôld) lîke many (*ménny),
- Whích ís wróng. Sây râpier, pìêr,
- Tîer (òne who tîes), but tìêr.
Àrch, archângel; prây, dòes ërring
Rhŷme wíth hérring ŏr wíth stïrring?
- Príson, bîson, tréasure trôve,
- Trêason, hóver, còver, côve,
Persevêrance, séverance. Ríbald
Rhŷmes (but pîebåld dòesn't) wíth níbbled.
- Phâeton, paêan, gnát, ghåt, gnåw,
- Liên, psŷchic, shóne, bône, pshåw.
Dôn't bê dòwn, mŷ ôwn, but roúgh ít,
Ánd distínguish bùffèt, búffet;
- Broôd, stoòd, roôf, roòk, schoôl, woòl, boôn,
- Worcester (*Wùster), Boleýn, to (=toô) impûgn.
Sây ín sòunds corréct and stërling
Hëarse, hêar, heàrken, yêar and yëarling —yëar and yêarling are about as likely
- Êvil, dévil, mézzotínt,
- Mînd the Z (zéd)! (A géntle hínt.)
Nòw yoû nêed nót pây atténtion
To (=toô) súch sòunds as Î dôn't méntion,
- Sòunds lîke pŏres, påuse, pŏurs and påws,
- Rhŷming wíth the prônòun yŏurs;
Nŏr are próper nâmes inclûded,
Thôugh Î óften hëard, as yoû díd,
- Fúnny rhŷmes to ûnicŏrn,
- Yés, yoû knôw them, Våughan and Stråchan —nowadays regularised to *Strákhən
Nô, mŷ mâiden, cŏy and còmely,
Î dôn't ẁant to spêak of Chòlmondeley (*Chúmley).
- Nô. Yét Froûde compãred wíth pròud
- Ís nô bétter thán McLeod (*McClòud).
But mînd trívial and vîal,
Trîpod, mênial, denîal,
- Trôll and trólley, réalm and rêam,
- Schédule, míschief, schísm, and schême.
Àrgil, gíll, Argŷll, gíll. Sůrely
Mây bê mâde to rhŷme wíth Råleigh,
- Bút yŏu're nót suppôsed to sây
- Pìquèt rhŷmes wíth sóbriquèt.
Hád thís ínvalid inválid
Wörthless dócuments? Hòw pállid,
- Hòw uncoûth hê, còuchant, loòked,
- Whén for Pŏrtsmouth Î had boòked!
Zeûs, Thêbes, Thales, Aphrodîtê,
Páramour, enámoured, flîghty,
- Épisôdes, antípodês,
- Ácquiésce, and óbsequies.
Plêase dôn't mònkey wíth the gêyser,
Dôn't pêel 'tâters wíth mŷ râzor,
- Rāther sây ín áccents pûre:
- Nâture, státure ánd matûre.
Pîous, ímpìous, límb, clîmb, glúmly,
Worsted (wùsted), wörsted, crúmbly, dúmbly,
- Cónquer, cónquest, vàse, phâse, fán,
- Ẁan, sedán and àrtisan.
The TH (*têe-âitch) wíll sůrely troúble you
Mŏre than R, CH ŏr W (*àh, cêe-âitch ŏr doúble-û)
- Sây thén thêse phonétic géms:
- Thómas, thŷme, Therêsa, Thames (*Témz).
Thómpson, Chátham, Wåltham, Stréatham,
Thére are mŏre but Î forgét 'em -
- Wâit! Î've gót ít: Ánthony,
- Lîghten yŏur anxîety.
Thê archâíc wörd ålbêít
Does nót rhŷme wíth èight - yoû sêe ít;
- Wíth and fŏrthwith, òne hás vŏice,
- Òne hás nót, yoû mâke yŏur chŏice.
Shoes (=shoôs), gôes, dòes. Nòw fïrst sây: fínger;
Thén sây: sínger, gínger, línger.
- Rêal, zêal, mauve (*môv), gåuze and gâuge,
- Márríage, fôlìage, mìràge, âge,
Hêro, héron, quêry, véry,
Párry, tárry, fûry, bury,
- Dòst, lóst, pôst, and dòth, clóth, lôth,
- Jób, Jôb, blóssom, bosom (*bùzm), ôath.
Fåugh, oppúgnant, kêen oppûgners,
Bòwing, bôwing, bánjo-tûners
- Hôlm yoû knôw, but nôes, canoes (*canoôz),
- Pûisnê (*poôny), trûísm, ûse (*yoûss), to ûse (*yoûz)?
Thôugh the dífference sêems líttle,
Wê sây áctual, but víctual,
- Sêat, swéat, châste, càste, Lêigh, èight, heîght,
- Pùt, nút, gránite, ánd unîte.
Rêefer dòes nót rhŷme wíth déafer,
Féoffer dòes, and zéphyr, héifer.
- Dúll bùll Géoffrey, Geŏrge ate (*ét) lâte,
- Hínt, pînt, sénate, but sedâte.
Gáelic, Árabic, pacífic, —Scottish; or regular Gâelic if Irish
Scîence, cónscience, scientífic;
- Toûr, but òur, doûr, súccour, fŏur,
- Gás, alás, and Àrkansås.
Sây manoeûvre, yacht (*yót) and vómit,
Néxt omít, whích díffers fróm ít
- Bôna fîdê, álibî
- Gŷrate, dòwry ánd awrŷ.
Sêa, idêa, guínea, ãrêa,
Psàlm, Marìa, bút malãria.
- Yoûth, sòuth, soúthern, cléanse and clêan,
- Dóctrine, türpentine, marìne.
Compãre âlien wíth Itálian,
Dándelîon wíth battálion,
- Rálly wíth állŷ; yeâ, yê,
- Eŷe, Î, ây, aŷe, whèy, kêy, quaỳ! —ây mê, archaic expression of sadness, ây = èh
Sây avër, but éver, fêver,
Neîther, léisure, skèin, recêiver.
- Néver guéss - ít ís nót sâfe,
- Wê sây càlves, válves, hālf, but Râlf.
Stàrry, gránary, canãry,
Crévice, but devîce, and éyrie,
- Fâce, but préface, thén grimâce,
- Phlégm, phlegmátic, áss, glāss, bâss.
Báss, làrge, tàrget, gín, gíve, vërging,
Ŏught, òust, jòust, and scòur, but scoürging;
- Êar, but ëarn; and ére and téar
- Do (*doô=) nót rhŷme wíth hêre but héir.
Mînd thê Ô of óff and óften
Whích mây bê pronòunced as ŏrphan, —scarcely heard nowadays
- Wíth the sòund of såw and såuce;
- Ålsô sóft, lóst, clóth and cróss.
Pùdding, púddle, pùtting. Pútting?
Yés: at gôlf ít rhŷmes wíth shútting.
- Réspîte, spîte, consént, resént.
- Lîable, but Pàrliament.
Séven ís rîght, but sô ís êven,
Hŷphen, roúghen, néphew, Stêphen,
- Mònkey, dónkey, clerk (=Clàrk) and jërk,
- Ásp, grāsp, ẁasp, demèsne, cŏrk, wörk.
Â of válour, vápid vâpour,
S of neŵs (-z) (compãre neŵspâper (-ss-)),
- G of gíbbet (j-), gíbbon, gíst (j-),
- Î of ántichrîst and gríst,
Díffer like divërse and dîvers,
Rívers, strîvers, shívers, fîvers.
- Ònce, but nónce, tôll, dóll, but rôll,
- Pólish, Pôlish, póll and pôll.
Pronúnciation - thínk of Psŷchê! -
Ís a pâling, stòut and spîky.
- Wôn't ít mâke yoû lose (=loôs) yŏur wíts
- Wrîting grôats and sâying 'gríts'? —no longer
Ìt's a dàrk abýss ŏr túnnel
Streŵn wíth stônes lîke rôwlock, gúnwale,
- Íslington, and Îsle of Wîght,
- Hòusewîfe, vërdíct and indîct.
Dôn't yoû thínk sô, rêader, ràther,
Sâying làther, bâther, fàther?
- Fînally, whích rhŷmes wíth enoúgh,
- Thôugh, throûgh, bòugh, cóugh, hóugh, sòugh, toúgh??
Hiccoúgh hás the sòund of súp.
Mŷ advîce ís: GÍVE ÍT ÚP!
Notes
- ↑ A few different accents, mostly from French, sometimes crop up in English, however; see French words in English.
- ↑ When not accented, y is usually the semi-consonant of yoû and yés.
- ↑ When not accented, w is usually the semi-consonant of wê and wíll.
- ↑ In American English this short British sound is replaced by the longer à in most positions, and by ŏ before r.
- ↑ Grave accents on w and on a u following a q indicate the sound of the following a: à in American English, but in British the extra sound ó as in the British pronunciation of hót.
- ↑ å and ŏ show the same sound: ideally the o too would have a ring over it, but this symbol is not available, so ŏ is used instead.
- ↑ àardvark and Transvàal are from Afrikàans, itself a further example.
- ↑ With a pause to indicate both k’s are pronounced.
- ↑ Also representing a Welsh sound in place names like Llandudno (-dídno) and Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogóch.
- ↑ Strong form of to, not normal in a verb's infinitive, necessitated by the metre.
- ↑ The pronunciation required by the metre is "ôkay", though the K is normally the stressed syllable: okây.
Previous Winners
- English spellings [r]: Lists of English words showing pronunciation, and articles about letters. [e]
- Folk saint [r]: A deceased person or spirit that is venerated as a saint but who has not been officially canonized by the Church. [e]
- Led Zeppelin [r]: English hard rock and blues group formed in 1968, known for their albums and stage shows. [e]
- Locality of reference [r]: A commonly observed pattern in memory accesses by a computer program over time. [e]
- Rabbit [r]: Long-eared, short-tailed, burrowing mammals of the family Leporidae of the order Lagomorpha, found in several parts of the world. [e]
- Scarborough Castle [r]: Ruined stone castle on the east coast of Yorkshire, England, begun in mid-twelfth century. [e] (September 3)
- The Clash of Civilizations and the Remaking of World Order [r]: Add brief definition or description (August 27)
- Mauna Kea [r]: One of the three main volcanic mountains on the island of Hawaii, the biggest island in Hawaii (U.S. state). [e] (August 20)
- Brute force attack [r]: An attempt to break a cipher by trying all possible keys; long enough keys make this impractical. [e] (August 13)
- Cruiser [r]: While definitions vary with time and doctrine, a large warship capable of acting independently, as a flagship, or a major escort; capabilities include anti-air warfare, anti-surface warfare, anti-submarine warfare, land attack, and possibly ballistic missile defense [e] (August 5)
- The Canterbury Tales [r]: Collection of stories in verse and prose by Geoffrey Chaucer. [e] (July 30)
- Milpa agriculture [r]: A form of swidden agriculture that is practiced in Mesoamerica. Traditionally, a "milpa" plot is planted with maize, beans, and squash. [e] (July 23)
- Domain Name System [r]: The Internet service which translates to and from IP addresses and domain names. [e] (July 16)
- Scuticaria [r]: A genus of orchids, closely related to Bifrenaria, formed by nine showy species of cylindrical leaves, which exist in three isolated areas of South America. [e] (July 9)
- Torture [r]: Add brief definition or description (July 2)
- Miltonia [r]: An orchid genus formed by nine showy epiphyte species and seven natural hybrids of Brazil, one species reaching Argentina and Paraguay. [e] (June 25)
- Ancient Celtic music [r]: The music and instruments of the ancient Celts until late Antiquity. [e] (June 18)
- Bifrenaria [r]: A genus of orchids formed by circa twenty species of South America, some widely cultivated because of their large and colored flowers; divided in two distinct groups, one with large flowers and short inflorescences and the other with small flowers and long inflorescences. [e] (June 11)
- Halobacterium NRC-1 [r]: A microorganism from the Archaea kingdom perfectly suited for life in highly saline environments giving biologists an ideal specimen for genetic studies. [e] (June 4)
- Animal [r]: A multicellular organism that feeds on other organisms, and is distinguished from plants, fungi, and unicellular organisms. [e] (May 28)
- Coal [r]: A combustible black rock formed over millions of years as heat and pressure acted on the decayed remains of plants and other organic matter in what were once swamps. [e] (May 21)
- Johannes Diderik van der Waals [r]: (1837 – 1923) Dutch scientist, proposed the van der Waals equation of state for gases. [e] (May 7)
- Scientific method [r]: The concept of systematic inquiry based on hypotheses and their testing in light of empirical evidence. [e] (Apr 14)
- Korematsu v. United States [r]: A U.S. Supreme Court case, in which the internment of Japanese-Americans was deemed constitutional due to military necessity [e] (Apr 7)
- Orchid [r]: Any plant classified under Orchidaceae, one of the largest plant families and the largest among Monocotyledons. [e] (Mar 31)
- Oliver Cromwell [r]: (1599-1658) English soldier, statesman, and leader of the Puritan revolution, nicknamed "Old Ironsides". [e] (Mar 24)
- Wisconsin v. Yoder [r]: 1972 U.S. Supreme Court decision in which it was held that the constitutional rights of the Amish, under the "free exercise of religion" clause, were violated by the state's compulsory school attendance law. [e] (Mar 17)
- Conventional coal-fired power plant [r]: Power plant that burns coal in a steam generator to produce high-pressure steam, which drives steam turbines that generate electricity. [e] (Mar 10)
- Battle of the Ia Drang [r]: First divisional-scale battle involving helicopter-borne air assault troops, with U.S. forces against those of North Vietnam [e] (Mar 3)
- Ether (physics) [r]: Add brief definition or description (Feb 24)
- Large-scale trickle filters [r]: Add brief definition or description (11 Feb)
- Homeopathy [r]: Add brief definition or description (28 Jan)
- Microeconomics [r]: Add brief definition or description (14 Jan)
- Speech Recognition [r]: Add brief definition or description (26 Nov)
- Mashup [r]: Add brief definition or description (19 Nov)
- Tux [r]: Add brief definition or description (14 Oct)
- Hydrogen bond [r]: Add brief definition or description (7 Oct)
- Lead [r]: Add brief definition or description (1 Sept)
- DNA [r]: Add brief definition or description (8 July)
- Augustin-Louis_Cauchy [r]: Add brief definition or description (1 July)
- Vasco da Gama [r]: Add brief definition or description (24 June)
- Phosphorus [r]: Add brief definition or description (17 June)
- Crystal Palace [r]: Add brief definition or description (10 June)
- Gross Domestic Product [r]: Add brief definition or description (3 June)
- RNA interference [r]: Add brief definition or description (27 May)
- Latino history [r]: Add brief definition or description (20 May)
- Navy Grog [r]: Add brief definition or description (13 May)
- Systems biology [r]: Add brief definition or description (6 May)
- Steroid [r]: Add brief definition or description (22 Apr)
- Lebanon [r]: Add brief definition or description (15 Apr)
- Wheat [r]: Add brief definition or description (7 Apr)
- Benjamin Franklin [r]: Add brief definition or description (1 Apr)
- Coherer [r]: Add brief definition or description (25 Mar)
- U.S. Civil War [r]: Add brief definition or description (18 Mar)
- Life [r]: Add brief definition or description (11 Mar)
- Petroleum refining processes [r]: Add brief definition or description (4 Mar)
- Shirley Chisholm [r]: Add brief definition or description (20 Feb)
- Telephone Newspaper [r]: Add brief definition or description (4 Feb)
- Wristwatch [r]: Add brief definition or description (28 Jan)
- Korean War of 1592-1598 [r]: Add brief definition or description (21 Jan)
- Andrew Carnegie [r]: Add brief definition or description (11 January 2008)
- Bowling [r]: Add brief definition or description (31 December 2007)
- Architecture [r]: Add brief definition or description (December 6)
- Civil society [r]: Add brief definition or description (November 29)
- Joan of Arc [r]: Add brief definition or description (November 22)
- Chemistry [r]: Add brief definition or description (November 15)
- Albert Gallatin [r]: Add brief definition or description (November 8)
- Prime number [r]: Add brief definition or description (November 1)
- Tennis [r]: Add brief definition or description (October 25)
- Rottweiler [r]: Add brief definition or description (October 18)
- Theodor Lohmann [r]: Add brief definition or description (October 9)
- William Shakespeare [r]: Add brief definition or description (October 2)
- Edward I [r]: Add brief definition or description (September 25)
- El Tío [r]: Add brief definition or description (September 18)
- Scotland Yard [r]: Add brief definition or description (September 11)
- Kilt [r]: Add brief definition or description (September 4)
- U.S. Electoral College [r]: Add brief definition or description (August 28)
- Butler [r]: Add brief definition or description (August 21)
- Tony Blair [r]: Add brief definition or description (August 14)
- Northwest Passage [r]: Add brief definition or description (August 7)
- Literature [r]: Add brief definition or description (July 31)
- Biology [r]: Add brief definition or description (July 25)
Rules and Procedure
Rules
- The article's status must be 0 or 1, i.e., only "Advanced Articles" may be nominated.
- Any Citizen may nominate an article.
- No Citizen may have more than one nominated article listed under "current nominees" at a time.
- The article's nominator is indicated simply by the first name in the list of votes (see below).
- At least for now--while the project is still small--you may nominate and vote for articles of which you are a main author.
- An article can be Article of the Week only once every six months. Nominated articles that have won top honors should be removed from the list.
- Comments on nominations should be made on the article's talk page.
- The list of nominees should be kept below 20, or thereabouts. Articles with very few supporters and which have not gained any new supporters in the last two weeks or so may be deleted to make room for new nominees.
- Any editor may entirely cancel the nomination of any unapproved article in his or her area of expertise if, for example, it contains obvious and embarrassing problems.
Voting
- To vote, add your name and date in the Supporters column next to an article title, after the other supporters for that article, by signing <br />~~~~. (The date is necessary so that we can determine when the last vote was added.) Your vote is allotted a score of 1.
- Add your name in the Specialist supporters column only if you are an editor who is an expert on the topic in question. Your vote is allotted a score of 1 for articles which you created and a score of 2 for articles which you did not create.
- You may vote for as many articles as you wish, and each vote counts separately, but you can only nominate one at a time; see above. You could, theoretically, vote for every nominated article on the page, but this would be pointless.
Ranking
- The list of articles is sorted by number of votes first, then alphabetically.
- Admins should make sure that the votes are correctly tallied, but anyone may do this. Note that "Specialist Votes" are worth 3 points.
Updating
- Each Thursday, one of the admins listed below should move the winning article to the Current Winner section of this page, announce the winner on Citizendium-L, and update the "previous winning articles" section accordingly.
- The winning article will be the article at the top of the list (i.e., the one with the most votes).
- In the event of two or more articles having the same number of votes:
- The article with the most specialist supporters is used. Should this fail to produce a winner, the article appearing first by English alphabetical order is used.
- The remaining winning articles are guaranteed this position in the following weeks, again in alphabetical order. No further voting would take place on these, which remain at the top of the table with notices to that effect. Further nominations and voting take place to determine future winning articles for the following weeks.
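The selection rules above amount to a simple sort with tie-breaks. As an illustrative sketch only (the field names and vote weights here are assumptions, not part of the official process), the ranking could be tallied like this:

```python
# Illustrative sketch of the ranking rules: sort by total score (descending),
# break ties by number of specialist supporters (descending), then by
# article title in English alphabetical order. The dict fields and the
# sample scores below are hypothetical.

def rank_nominees(nominees):
    """Return nominees ordered per the ranking and tie-break rules."""
    return sorted(
        nominees,
        key=lambda a: (-a["score"], -a["specialist_supporters"], a["title"]),
    )

nominees = [
    {"title": "Rabbit", "score": 5, "specialist_supporters": 1},
    {"title": "Coal",   "score": 5, "specialist_supporters": 2},
    {"title": "Animal", "score": 3, "specialist_supporters": 0},
]

winner = rank_nominees(nominees)[0]["title"]
print(winner)  # Coal wins the tie on specialist supporters
```

Negating the numeric keys lets a single ascending sort express "most votes first, most specialist supporters first, then alphabetical", matching the tie-break order described above.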
Administrators
These are people who have volunteered to run this program. Their duties are (1) to ensure that this page remains "clean," e.g., as a given article garners more votes, its tally is accurately represented and it moves up the list, and (2) to place the winning article on the front page on a weekly basis.
To become an administrator, you need not apply anywhere. Simply add your name below. Administrator duties are open to editors and authors alike.