Talk:Computer/Archive 3
This is archive 3 for Talk:Computer
archived everything; starting over
I have archived previous discussions, since I totally reorganized Computer, sending most of the old stuff either to Computer architecture or to History of computing (new article). Pat Palmer 17:14, 23 April 2007 (CDT)
question about microprocessor computing
A rapid evolution can also be described as a revolution, and if we talk about microprocessor computing we are talking about the digital revolution. Any objection to changing that part? Robert Tito | Talk 17:52, 23 April 2007 (CDT)
- I'm not sure what you're asking, but edit what you think best. I'd recommend avoiding getting too detailed about technology, however. Perhaps that discussion would belong better in history of computing, which hopefully will have a breakdown of developments by decade. Pat Palmer 17:57, 23 April 2007 (CDT)
- P.S. - I wouldn't want to use a phrase like "digital revolution" without first explaining it. I think in terms of my grandmother--could she read what I wrote and get anything out of it? If not, it's too "jargony". That's my corny standard, anyways. Pat Palmer 18:19, 23 April 2007 (CDT)
- The rapid change of society is called the digital revolution; sorry for anybody, but that is the generally accepted term for it. Robert Tito | Talk 18:23, 23 April 2007 (CDT)
- OK; can we put the definition in parentheses right after the first occurrence of the term? Pat Palmer 18:29, 23 April 2007 (CDT)
- I reworded yours just a bit. Would you care to take a stab at writing digital revolution? Sounds like a complete article to me! Pat Palmer 18:34, 23 April 2007 (CDT)
why can't I mark this as NOT from Wikipedia?
When I uncheck the "from Wikipedia" box on this article, it has no effect. This article is no longer anything like the original brought from Wikipedia. Someone please help! Pat Palmer 17:59, 23 April 2007 (CDT)
Pat, Jason is one of the few who can change it. Send an e-mail to constables@citizendium.org with the request. Robert Tito | Talk 18:06, 23 April 2007 (CDT)
- Thanks! I just did so. Pat Palmer 18:18, 23 April 2007 (CDT)
I just took the Wikipedia mention off the bottom of the page. --Matt Innis (Talk) 21:05, 23 April 2007 (CDT)
applications archived here temporarily
I don't yet know where (if at all) to include this stuff.
Shannon
Who added Shannon to the computer page? His major contribution to computer science is the definition of Shannon information. When a document contains NO Shannon information - after being acted upon by an enigma - no link can be obtained from the resulting document to its original. A Shannon enigma is seen as the perfect enigma where encryption comes to be important. Not many know about this person, as he made his major contributions in the 1940s and a few in the 1950s - well before encryption engines were available with the depth and speed we have now. Robert Tito | Talk 18:53, 23 April 2007 (CDT)
- Robert, I'll have to double-check my facts, but I believe that Shannon (in addition to his well-known work in information theory) made the crucial connection between boolean algebra and "switching algebra". Without that, none of the circuits in a CPU could ever have been made to work with their current level of complexity. Or maybe my memory deceives me and it was someone else? I'm afraid I have to completely dispute your claim that "not many know about this person". Try a Google search. Pat Palmer 18:58, 23 April 2007 (CDT)
- Here's the link: Claude Shannon; please read the 3rd sentence on the page--Shannon was indeed likely the first to note the crucial link between boolean algebra and logic design principles! Pat Palmer 17:24, 25 April 2007 (CDT)
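A minimal illustrative sketch of that boolean algebra/switching connection, in Python (the function name and the values are just for the example, not something from this discussion): a one-bit half-adder whose behaviour is written entirely as boolean operations, the kind of correspondence between logic circuits and boolean algebra that Shannon formalized.

 # Illustrative only: a one-bit half-adder described with boolean algebra.
 def half_adder(a: bool, b: bool):
     """Return (sum_bit, carry_bit) for two one-bit inputs."""
     sum_bit = a ^ b       # XOR gives the sum bit
     carry_bit = a and b   # AND gives the carry bit
     return sum_bit, carry_bit

 # Truth table: the circuit's behaviour follows from the algebra alone.
 for a in (False, True):
     for b in (False, True):
         print(a, b, half_adder(a, b))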
- While I'm at it, I don't think fiber optic communications would ever have come about without Shannon. While that's a complete specialty within computer science, it was world-changing. Not everyone knows that he is important, but a lot of people think he was. Still, do you have a suggestion for another name you'd like me to include instead? Pat Palmer 19:00, 23 April 2007 (CDT)
There are in effect two (unrelated) Shannons that played a role, one in the late 60s - I assume the one you are referring to - and one in the 40s/50s. As I recall (though I am doing this from memory), the latter contributed to running-pad enigmas, in fact defined them. He was at the foundation of symmetric and asymmetric encryption. Robert Tito | Talk 19:40, 23 April 2007 (CDT)
- If there were two Claude Shannons, I don't know about it, but stranger things have happened. He worked across multiple fields. Of course, there are several thousand Pat Palmers in this world (smile). Pat Palmer 21:23, 23 April 2007 (CDT)
similar data?
I'm removing the "similar" in the following sentence: "Some people define a computer as a machine for manipulating similar data according to a list of instructions known as a program." because I don't really see why it's there, and it makes an already long sentence longer without seeming to add much. Correct me if it really matters. Sorry to be so picky! Pat Palmer 20:54, 23 April 2007 (CDT)
similar
That word, however, is the crux: repetitive, similar data could be manipulated by the simple computers - only addresses AND another field or two - but they all had to have the SAME structure of information, hence similar. They were nothing more or less than large strings with positional meaning of the information. I created far too many punch cards not to know that. Robert Tito | Talk 21:39, 23 April 2007 (CDT)
- I'm not sure that this distinction is helpful to an average reader who happens not to be a computer geek. If you don't mind, I'd rather omit it for now. Pat Palmer 22:45, 23 April 2007 (CDT)
- OK, so since "data" means all data, was all data involved? Not really. It had to be similar; maybe call it structured data, as that is what it needed to be. Robert Tito | Talk 22:53, 23 April 2007 (CDT)
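A minimal sketch of that "positional meaning" idea, in Python (the record layout, field names, and values below are invented for illustration): a punch-card-style record is just a fixed-width string, and each field is recovered purely from its column positions.

 # Illustrative only: a fixed-width record whose meaning is positional.
 record = "0042JONES     000150"   # hypothetical 20-column "card"

 account = record[0:4]     # columns 1-4:   account number
 name = record[4:14]       # columns 5-14:  name, space-padded
 amount = record[14:20]    # columns 15-20: amount, zero-padded

 print(account, name.strip(), int(amount))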
plea for help gathering references
Before anything more gets added to this article, it would be nice to find references for the stuff that's already there. If anyone has time and can find any, I'd appreciate help in getting them on the page. I find editing a reference really tiring, and can only bring myself to do a few per day. Any help appreciated. Pat Palmer 21:21, 23 April 2007 (CDT)
rewording
The abacus, slide rule, etc. were known; in hindsight we call a pocket calculator a calculator, but in those days they would have been miracles. So omit the word computer and it makes more sense. Robert Tito | Talk 17:25, 25 April 2007 (CDT)
No mention of Charles Babbage?
Did I miss it? I would think any account of the history of computers would include his difference engine and (planned) analytical engine. Greg Woodhouse 16:18, 4 May 2007 (CDT)
That depends upon how deep you want to go; he can be added. But then, in the totality of the story, what does it actually add? To me that seems more like something for the history workgroup: to talk about the times leading up to modern computers, starting in some grey area - Greek, Egyptian, Chinese. Omitting Babbage here doesn't seem to influence the line of the article much, nor would adding him. Robert Tito | Talk 16:34, 4 May 2007 (CDT)
Certainly, Babbage is no less relevant to the history of computers than the abacus and the slide rule (mentioned in the intro). Greg Woodhouse 16:38, 4 May 2007 (CDT)
- Greg, by all means add Babbage - he will fit nicely in that line. We might even consider some very OLD cash registers as used in those days (NCR and other producers). So add along. All these should have more attention in a true history document, to appreciate them all in their own value and at their own circumstances/time. Robert Tito | Talk 16:53, 4 May 2007 (CDT)
- Greg, please do not add Babbage here, but rather on history of computing. Babbage's is not a short story, and I'm trying to keep this top-level article focused on essentials. Also, Babbage was not the only pioneer--there were a couple of Americans working in math for computing at the same time. Let's do them all on the secondary history page. Pat Palmer 18:03, 4 May 2007 (CDT)
- Clarification of goals for Computer vs. history of computing: Computer (top-level article) is to explain the context of its invention and its importance. Details of history, including the brilliant pioneers such as Babbage, should not go here (in my opinion) lest the article become too long. Once we name one here, we've got to name quite a few. It would dilute the main message of this article, which is to be an overview branching off to increasing levels of detail in sub-articles. Pat Palmer 18:08, 4 May 2007 (CDT)
Moore's law
Moore's law has generally been valid since the 1960s, but it is widely accepted that current technology is approaching fundamental physical limits (for a detailed discussion see Neil Gershenfeld, The Physics of Information Technology), and that Moore's law cannot continue to hold. Greg Woodhouse 11:28, 31 May 2007 (CDT)
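A rough sketch in Python of what Moore's law asserts (the starting transistor count and the two-year doubling period below are assumptions for illustration, not figures from this discussion): doubling at a fixed interval is exponential growth, which is why fundamental physical limits eventually end it.

 # Illustrative only: Moore's-law-style doubling with assumed parameters.
 initial_transistors = 2_300      # assumed starting point (early-1970s scale)
 doubling_period_years = 2        # commonly quoted doubling period

 for years in range(0, 41, 10):
     count = initial_transistors * 2 ** (years / doubling_period_years)
     print(f"after {years:2d} years: roughly {count:,.0f} transistors")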