منتديات ورقلة سعيد عتبة

Love, loyalty, and the reunion of the faithful
 
 

Computer generations in French and Arabic

Admin
Post subject: Computer generations in French and Arabic — Monday, November 15, 2010, 10:27 pm

History of computing hardware
First-generation machines
Design of the von Neumann architecture (1947)
Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a stored-program computer, EDVAC. John von Neumann was credited with a widely circulated report describing the EDVAC design in which both the programs and working data were stored in a single, unified store. This basic design, denoted the von Neumann architecture, would serve as the foundation for the worldwide development of ENIAC's successors.[54] In this generation of equipment, temporary or working storage was provided by acoustic delay lines, which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data. A series of acoustic pulses was sent along a tube; when each pulse reached the end of the tube, the circuitry detected whether it represented a 1 or a 0 and caused the oscillator to re-send it. Others used Williams tubes, which used the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. By 1954, magnetic core memory[55] was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s.
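Functionally, a delay line behaves like a circular shift register: bits are kept "in flight" and must be re-read in sequence. A minimal toy model in Python (the class name and store size are illustrative, not taken from any real machine's specification):

```python
from collections import deque

class DelayLine:
    """Toy model of an acoustic delay-line store: a fixed number of
    bits circulate, and each pulse interval one bit is detected at the
    far end, regenerated, and re-sent into the tube."""

    def __init__(self, n_bits):
        self.line = deque([0] * n_bits, maxlen=n_bits)  # oldest bit first

    def tick(self):
        """One pulse interval: detect the emerging bit, re-inject it."""
        bit = self.line.popleft()
        self.line.append(bit)
        return bit

    def write(self, bits):
        """Replace the recirculating contents with a new pattern."""
        self.line = deque(bits, maxlen=len(bits))

    def read(self):
        """Reading means waiting one full recirculation of the line."""
        return [self.tick() for _ in range(len(self.line))]

dl = DelayLine(8)
dl.write([1, 0, 1, 1, 0, 0, 1, 0])
assert dl.read() == [1, 0, 1, 1, 0, 0, 1, 0]  # data survives one pass
assert dl.read() == [1, 0, 1, 1, 0, 0, 1, 0]  # ...and keeps recirculating
```

Note that reading requires waiting for the line to recirculate: delay-line memory was serial-access, which is why machines of this generation had to schedule work around the recirculation period.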
Magnetic core memory. Each core is one bit.
EDVAC was the first stored-program computer designed, but it was not the first to run: Eckert and Mauchly left the project, and its construction floundered. The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine, developed by Frederic C. Williams and Tom Kilburn at the University of Manchester in 1948 as a test bed for the Williams tube;[56] it was followed in 1949 by the Manchester Mark 1, a complete system using Williams tubes and magnetic drum memory, and introducing index registers.[57] The other contender for the title "first digital stored-program computer" was EDSAC, designed and constructed at the University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This simpler design was the first to be implemented in each succeeding wave of miniaturization, and it gained in reliability. Some view the Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. Manchester University's machine became the prototype for the Ferranti Mark 1. The first Ferranti Mark 1 machine was delivered to the University in February 1951, and at least nine others were sold between 1951 and 1957.
The first universal programmable computer in the Soviet Union was created by a team of scientists under direction of Sergei Alekseyevich Lebedev from Kiev Institute of Electrotechnology, Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.[58]
Commercial computers
The first commercial computer was the Ferranti Mark 1, which was delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves).[59] A second machine was purchased by the University of Toronto, before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[60]
In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951 [61] and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored program computer.[62]
In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each ($8.38 million as of 2010).[63] UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). A key feature of the UNIVAC system was a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic media are still used in many computers.[64] In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general purpose programming language, Fortran, was also being developed at IBM for the 704 during 1955 and 1956 and released in early 1957. (Konrad Zuse's 1945 design of the high-level language Plankalkül was not implemented at that time.) A volunteer user group, which exists to this day, was founded in 1955 to share their software and experiences with the IBM 701.
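As a quick sanity check on the word size quoted above, 11 decimal digits plus a sign stored in a 72-bit word works out to 6 bits per character position, which matches a 6-bit character code:

```python
# UNIVAC I word: 11 decimal digits plus a sign in a 72-bit word.
chars_per_word = 11 + 1              # digits plus sign
bits_per_word = 72
bits_per_char = bits_per_word // chars_per_word
assert bits_per_char == 6            # 6 bits per character position

# Total delay-line capacity: 1,000 words of 72 bits each.
total_bits = 1_000 * bits_per_word
assert total_bits == 72_000          # i.e. 9,000 bytes in modern terms
print(bits_per_char, total_bits)
```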
IBM 650 front panel
IBM introduced a smaller, more affordable computer in 1954 that proved very popular.[65] The IBM 650 weighed over 900 kg, the attached power supply weighed around 1350 kg, and both were held in separate cabinets of roughly 1.5 meters by 0.9 meters by 1.8 meters. It cost $500,000 ($4.05 million as of 2010) or could be leased for $3,500 a month ($30 thousand as of 2010).[63] Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture (the instruction format included the address of the next instruction) and software: the Symbolic Optimal Assembly Program, SOAP,[66] assigned instructions to the optimal addresses (to the extent possible by static analysis of the source program). Thus many instructions were located, when needed, in the next row of the drum to pass under the read head, so no additional wait for drum rotation was required.
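The effect of SOAP-style optimal placement can be sketched with a toy drum model. The track size and execution time below are illustrative, not the 650's real figures:

```python
DRUM_WORDS = 50      # words per drum track (toy value)
EXEC_TIME = 3        # word-times needed to execute one instruction

def run(program, n_steps):
    """Return total word-times spent executing `n_steps` instructions,
    counting rotational latency. `program` maps each drum address to
    the address of the next instruction (as in the 650's format).
    The word at address a passes under the head when time % DRUM_WORDS == a."""
    t, addr = 0, 0
    for _ in range(n_steps):
        t += EXEC_TIME                  # execute current instruction
        nxt = program[addr]
        t += (nxt - t) % DRUM_WORDS     # wait for drum rotation, if any
        addr = nxt
    return t

# Naive layout: successors at consecutive addresses 0, 1, 2, ...
naive = {i: (i + 1) % DRUM_WORDS for i in range(DRUM_WORDS)}
# Optimal layout: each successor placed EXEC_TIME words ahead, so it
# arrives under the head exactly as execution finishes (zero wait).
optimal = {i: (i + EXEC_TIME) % DRUM_WORDS for i in range(DRUM_WORDS)}

print(run(naive, 20), run(optimal, 20))   # prints: 1020 60
```

With consecutive placement, each instruction just misses the head and waits nearly a full revolution; placing each successor a fixed distance ahead eliminates the wait, which is the optimization SOAP performed by static analysis.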
In 1955, Maurice Wilkes invented microprogramming,[67] which allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode).[68] It was widely used in the CPUs and floating-point units of mainframe and other computers, such as the Manchester Atlas [69] and the IBM 360 series.[70]
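The idea of microprogramming can be illustrated with a toy interpreter in which each visible opcode is *defined* as a sequence of primitive micro-operations held in a control store, so a "new" instruction can be added without new hardware. The opcodes and micro-ops here are entirely hypothetical:

```python
# Micro-operations: the primitive register-transfer steps the
# (imaginary) hardware implements directly.
def load_a(st, v): st["A"] = v
def load_b(st, v): st["B"] = v
def add_ab(st, v): st["A"] += st["B"]
def shl_a(st, v):  st["A"] *= 2        # shift left = double

# Control store: each machine opcode expands to a micro-op sequence.
CONTROL_STORE = {
    "LDA": [load_a],
    "LDB": [load_b],
    "ADD": [add_ab],
    # An instruction added purely in microcode: "double A, then add B".
    "DADD": [shl_a, add_ab],
}

def run(program):
    """Execute a list of (opcode, operand) pairs; return register A."""
    st = {"A": 0, "B": 0}
    for opcode, operand in program:
        for micro_op in CONTROL_STORE[opcode]:
            micro_op(st, operand)
    return st["A"]

print(run([("LDA", 5), ("LDB", 3), ("DADD", None)]))   # prints: 13
```

Extending the instruction set is just adding an entry to `CONTROL_STORE`, which is the essence of defining instructions in firmware rather than wiring.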
IBM introduced its first magnetic disk system, RAMAC (Random Access Method of Accounting and Control) in 1956. Using fifty 24-inch (610 mm) metal disks, with 100 tracks per side, it was able to store 5 megabytes of data at a cost of $10,000 per megabyte ($80 thousand as of 2010).[63][71]
Second generation: transistors
A bipolar junction transistor
The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs,[72] giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power.[73] The first transistorised computer was built at the University of Manchester and was operational by 1953;[74] a second version was completed there in April 1955. The later machine used 200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by using a lower clock frequency of 58 kHz when it became operational in February 1955.[75] Problems with the reliability of early batches of point-contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.[76]
Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, potentially indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards, such as the IBM Standard Modular System,[77] each carrying one to four logic gates or flip-flops.
A second generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than one hundred thousand 1401s between 1960 and 1964.
This RAMAC DASD is being restored at the Computer History Museum
Transistorized electronics improved not only the CPU (Central Processing Unit) but also the peripheral devices. The IBM 350 RAMAC, introduced in 1956, was the world's first disk drive. Second-generation disk data storage units were able to store tens of millions of letters and digits. Alongside the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack could be exchanged for another in a few seconds. Although removable disks held less data than fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data at a lower cost than disk.
Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One data bus would carry data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other data buses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second), because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.
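The PDP-1 throughput figure follows directly from the cycle counts; restated as arithmetic:

```python
# 5 µs core cycle; an arithmetic instruction needs two memory cycles
# (one for the instruction fetch, one for the operand fetch).
core_cycle_us = 5
cycles_per_instruction = 2

instr_time_us = core_cycle_us * cycles_per_instruction
ops_per_second = 1_000_000 // instr_time_us   # microseconds per second

assert instr_time_us == 10
assert ops_per_second == 100_000
print(instr_time_us, ops_per_second)          # prints: 10 100000
```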
During the second generation, remote terminal units (often in the form of teleprinters like the Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks: the Internet.[78]
Post-1960: third generation and beyond
Intel 8742 eight-bit microcontroller IC
The explosion in the use of computers began with "third-generation" computers, making use of Jack St. Clair Kilby's[79] and Robert Noyce's[80] independent invention of the integrated circuit (or microchip), which later led to the invention of the microprocessor,[81] by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.[82] The integrated circuit in the image on the right, for example, an Intel 8742, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.
During the 1960s there was considerable overlap between second and third generation technologies.[83] IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automata were also implemented later in minicomputers and microprocessors, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business, and universities.[84] It became possible to simulate analog circuits on minicomputers with the Simulation Program with Integrated Circuit Emphasis, or SPICE (1971), one of the programs for electronic design automation (EDA). The microprocessor led to the development of the microcomputer: small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, co-founder of Apple Computer, is sometimes erroneously credited with developing the first mass-market home computers. However, his first computer, the Apple I, came out some time after the MOS Technology KIM-1 and Altair 8800, and the first Apple computer with graphics and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.
Systems as complicated as computers require very high reliability. ENIAC remained in continuous operation from 1947 to 1955, eight years, before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system; by the simple strategy of never shutting down ENIAC, failures were dramatically reduced. The vacuum-tube SAGE air-defense computers became remarkably reliable: installed in pairs with one off-line, tubes likely to fail were found by intentionally running the computer at reduced power. Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when server farms are the delivery platform.[85] Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on the fly during a service event.[86][87]
In the 21st century, multi-core CPUs became commercially available.[88] Content-addressable memory (CAM)[89] has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a CMOS gate only draws significant current during the 'transition' between logic states, except for leakage.
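In software, the content-addressed lookup a hardware CAM provides corresponds to an associative array; in Python that is the built-in dict. A sketch using a hypothetical forwarding-table example (table lookup being a classic use of hardware CAMs), contrasting it with address-indexed storage:

```python
# Address-indexed memory (RAM-style): you must know *where* data is.
ram = ["10.0.0.0/8", "192.168.0.0/16", "172.16.0.0/12"]

# Content-addressed lookup (CAM-style): the key *is* the content.
forwarding = {
    "10.0.0.0/8":     "port 1",
    "192.168.0.0/16": "port 2",
    "172.16.0.0/12":  "port 3",
}

def find_in_ram(value):
    """Searching address-indexed memory for a value is a linear scan."""
    for i, v in enumerate(ram):
        if v == value:
            return i
    return None

assert find_in_ram("192.168.0.0/16") == 1        # O(n) scan
assert forwarding["192.168.0.0/16"] == "port 2"  # O(1) hash lookup
```

A hardware CAM performs the value-to-location match in parallel in one step; a dict approximates this in software via hashing, which is the programming-language-specific mechanism the text refers to.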
This has allowed computing to become a commodity which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. Computing hardware and its software have even become a metaphor for the operation of the universe.[90] Although DNA-based computing and quantum qubit computing are years or decades in the future, the infrastructure is being laid today, for example, with DNA origami on photolithography.[91] Fast digital circuits (including those based on Josephson junctions and rapid single flux quantum technology) are becoming more nearly realizable with the discovery of nanoscale superconductors.[92]
Fiber-optic and photonic devices, which already have been used to transport data over long distances, are now entering the data center, side by side with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects.[93]
An indication of the rapidity of this field's development can be inferred from the history of the seminal article:[94] by the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the pace of development has continued worldwide.

First generation (1951–1958): vacuum tubes
In 1952, Nathaniel Rochester designed the IBM 701, IBM's first production electronic digital computer, intended for scientific use; it was built around vacuum tubes.
Transistor computers: In 1954, Bell Labs produced the first general-purpose computer to use transistors throughout its design, named TRADIC, for Transistorized Airborne Digital Computer. Even so, this machine was not counted as the start of a new generation; the second generation of the computer industry is taken to begin after 1958.
Second generation (1958–1964): the transistor. This generation followed the invention of the transistor, which replaced the vacuum tube in computer designs.
Third generation (1964–1970): integrated circuits. In the summer of 1958, Jack S. Kilby designed the first integrated circuit, assembling several components into a single electronic circuit on one semiconductor chip, which opened the way for its later development; it was announced on 12 September 1958. In the same year, Seymour Cray built the first fully transistorized supercomputer.
IBM systems and the third generation: In 1964, IBM completed its System/360 family, built with transistor and integrated-circuit technology; more than thirty thousand machines of this line were sold, and the new technology began to impose itself on the computer market. The third generation (the integrated-circuit generation) dates from this point. A month after this launch, the BASIC language was running at Dartmouth College under its inventors Thomas Kurtz and John Kemeny, and it quickly became one of the best-known and easiest languages on all kinds of computers.
In 1973, the first personal computer to use icons, windows, graphics, and a mouse, the Alto, was produced at Xerox's laboratories.
Fourth generation (1971–1990): the microprocessor
In 1971, the first microprocessor chip appeared from Intel, numbered 4004: a 4-bit processor containing 2,250 transistors, with roughly the computing power of the ENIAC of 1946 (which filled a huge room and contained 18,000 vacuum tubes). The 4004 sat on a chip one-sixth of an inch long and one-eighth of an inch wide.
The early personal computer
The first personal computer appeared in 1971, though it was not yet called a personal computer. John Blankenbaker built and distributed it under the name Kenbak-1; it had 256 bytes of memory and displayed its data on a row of indicator lamps. Only 40 units were sold, at a price of $750 at the time, but it helped prepare the ground for the personal-computer revolution that followed.
In 1974, Intel announced the 8080, an 8-bit processor, as its first general-purpose microprocessor.
The personal computer
The year 1977 brought the first fully assembled personal computers, produced and sold by several companies, among them Commodore, Tandy, and Apple (whose Apple II was the first assembled machine to use a color display). These compact machines, carrying their makers' names, quickly spread into schools, laboratories, homes, and businesses. In the same year, the first local area network linking several machines was sold under the name ARCnet.
Reduced instruction sets and 32 bits: In 1980, the first reduced instruction set (RISC) design appeared from a development team at IBM, and in the same year Bell Labs produced a 32-bit microprocessor chip under the name Bellmac-32, increasing available computing power.
The IBM XT: When IBM produced its first personal computer in 1981, the personal-computer revolution gained enormous momentum. The machine contained a floppy disk drive, used the Intel 8088 processor with a text-mode display, carried memory of up to 128 or 256 kilobytes, and ran DOS.
IBM then produced its new IBM PC XT as an extended-technology version of the previous machine, adding a hard disk as extra storage and equipping it with graphics display capability. As soon as this machine took the lead in the personal-computer world, other companies began to imitate it; once its internal firmware in read-only memory (the ROM BIOS) had been reproduced as well, the other companies' machines became compatible with IBM's.
Evolution of the IBM machines: In 1984, IBM announced its new IBM PC AT, an advanced-technology machine with the Intel 80286 processor. Several basic features changed from its predecessors: it added the Enhanced Graphics Adapter (EGA) with 16 colors (older systems used two or four colors), ran on a 16-bit data bus (instead of the old 8-bit bus), and changed the keyboard layout and power supply.
The Intel 80486: In 1989, Intel introduced the famous 80486, a 32-bit processor that revolutionized the personal-computer field and was adopted by most of the world's computer manufacturers. It integrated the math co-processor on the same die, where previously the co-processor had been installed on the motherboard in a socket separate from the main processor.
Fifth generation (1992–present): In the early 1980s (around 1982), Japan laid out a vision of a fifth generation of computers, or a first generation of inference computers. The inference computer was not a goal in itself; the aim behind building it was to give the computer a measure of intelligence and the ability to reason. The designers' sub-goals included machine translation from one language to another, speech recognition and understanding, computer vision, theorem proving, and computer game playing. The fifth generation is dated from 1992, and with it came natural-language (voice) input of data, processed by artificial-intelligence techniques.
