AMD turns fifty – From Intel clone to core champion

AMD is celebrating its fiftieth anniversary. The milestone comes at a festive time, because AMD is in good shape after the introduction of its Ryzen processors, and brand-new hardware is on the way in the form of Zen 2 and Navi. It is not the first time AMD has gone through a positive period, but the company has also seen some difficult times. In honor of the anniversary, we look back at the history of the brand.

May 1, 1969 – The beginning

AMD, short for Advanced Micro Devices, was formally founded on May 1, 1969, by Jerry Sanders. The American worked at Fairchild Semiconductor but was dissatisfied with the way things were going there. He found seven like-minded colleagues, and together they decided to set up their own chip company. AMD is not the only processor manufacturer to have originated at Fairchild: Robert Noyce and Gordon Moore had left the same company a year earlier, founding Intel in July 1968.

In its early years, AMD made all kinds of integrated circuits and RAM. In 1975, the company introduced its first microprocessor: the Am9080, a clone of the Intel 8080 created through reverse engineering. A year later, Intel began adding microcode to its processors and struck a licensing deal with AMD for the use of that code.

AMD entered into a joint venture with Siemens in 1977, with the German company buying twenty percent of AMD's shares. Together they founded Advanced Micro Computers, with the aim of developing their own processors. The cooperation was short-lived, however: AMD bought out Siemens in 1979 because the two companies' views did not align, and AMC was closed down afterwards.

1981 – AMD makes processors for Intel

It was 1981 when IBM built its first personal computer, for which it needed Intel's x86 processors. However, IBM required that there be two suppliers for every chip in its products. Intel approached AMD, and the two signed a ten-year agreement for the production of processors. In 1982, AMD began production of the Intel 8086: the exact same design as Intel's own processor, but made in AMD's facilities. AMD also made copies of Intel's 8088, 80186, and 80188, and when Intel released its 286, AMD came up with its own version in 1984: the Am286. That was the last Intel processor AMD made to order, because the collaboration came to an early end when Intel no longer wanted to share its designs.

In the meantime, AMD also continued to develop its own microprocessors, based on the RISC architecture. In 1988, the first model in the Am29000 series was released: a 32-bit microprocessor that would live on in many variants and in all kinds of equipment, including laser printers, for years to come.

1991 – AMD clones the 386 and competes with Intel

Although AMD did not receive the design of the Intel 386, it made its own clone. Work on it began in 1988, and in 1991 AMD released its Am386, which was fully compatible with the Intel version. AMD had wanted to release the processor earlier, but was held up by a lawsuit. AMD itself had brought that case in 1987, after Intel stopped sharing designs. The long-running legal dispute ended in 1994 with AMD winning its case, though in the meantime it had been forced to reverse engineer in order to make x86 processors.

The Am386 that AMD released turned out to be more efficient than Intel's variant: AMD made a 386DX-40 running at 40MHz, while Intel's fastest 386 design peaked at 33MHz. That lead came from AMD's better production process at the time. In the years that followed, AMD continued to clone Intel's processors.

AMD released its Am486 in 1993, four years after Intel's 486. The AMD version was just as fast per clock cycle as the Intel original, but cheaper. AMD also released faster versions in the following years, topped by the Am486DX4, which ran at 120MHz; the fastest Intel model reached 100MHz. Intel had already released its first Pentiums, but those ran at 66MHz. AMD's fast 486 variants were popular because they still fit in old motherboards and outperformed the first Pentiums. According to WikiChip, the Am486's market share reached 40 percent in 1995.

1996 – AMD-K5, first proprietary x86 design

After the 486, Intel entered the Pentium era, and AMD countered with its K5. That was AMD's first completely proprietary x86 design, and a matching socket and chipset were developed as well. For its K5, AMD drew on the knowledge it had gained with RISC processors: the AMD-K5 was essentially a derivative of the Am29000 architecture with an x86-decoding front end added to it.

According to a Forbes article, the K in the name stands for Kryptonite, the fictional element from the Superman comics that can subdue the superhero. AMD saw the Intel Pentium processors as Superman, and the K processors had to take them on.

In the 5k86 line, AMD had processors with speeds ranging from 90 to 133MHz, but clock for clock they were faster than their Pentium counterparts. In its naming, the manufacturer therefore used a performance rating to indicate which Pentium a K5 processor competed with: the 90MHz model was sold as the K5-120 and the 133MHz version as the K5-200. The K5 processors fit into Socket 5 motherboards, which were also used for Intel's Pentium processors. The AMD-K6, which followed in 1997, likewise shared motherboards with its Pentium counterparts, this time via Socket 7.
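The arithmetic behind such a rating is straightforward: it expresses the clock speed of a Pentium that would deliver comparable performance, which is the K5's own clock multiplied by its per-clock advantage. A minimal sketch of that idea, with illustrative IPC ratios rather than AMD's real, benchmark-derived figures:

```python
# Sketch of the performance-rating idea. The IPC ratios below are
# assumptions picked for illustration; AMD based its actual ratings
# on benchmark comparisons, not a fixed formula.

def pentium_equivalent_mhz(clock_mhz: float, ipc_vs_pentium: float) -> float:
    """Clock speed of a Pentium with comparable performance."""
    return clock_mhz * ipc_vs_pentium

print(pentium_equivalent_mhz(90, 1.33))   # ~120: marketed as the K5-120
print(pentium_equivalent_mhz(133, 1.50))  # ~200: marketed as the K5-200
```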

1999 – Athlon with copper and the GHz barrier

AMD continued to develop its own chips and in 1998 entered into a partnership with Motorola. The company, now best known for smartphones, was also a leading chip manufacturer at the time. AMD and Motorola worked together to refine a manufacturing process with copper interconnects. Because copper is a better conductor than aluminum, the interconnects could be made smaller, which made the chips more power-efficient.

The new production method, in which processors were made at 180 nanometers, led to the arrival of the first Athlon processors, based on the K7 architecture, in 1999. The first series consisted of so-called slot processors, just like the Pentium II and III: the L2 cache sat next to the processor on a printed circuit board. AMD's Slot A, like Intel's Slot 1, used 242 contacts, but it was rotated 180 degrees so that users couldn't accidentally put an AMD processor in an Intel motherboard or vice versa. While the slot configuration was similar, the processors were not compatible.

The Athlons based on the K7 architecture ran at speeds of 500 to 700MHz. Not much later, AMD came up with the improved K75 architecture, and on that basis the manufacturer introduced the world's first gigahertz processor in March 2000. The 1GHz version cost $1300 and was not really recommended, because the new Athlon generation with the Thunderbird core was released only a few months later.

2000 – Thunderbird and Spitfire: overclocking with a pencil

In June 2000, AMD released its second generation of Athlon processors, with the Thunderbird core. The first models still appeared as Slot A variants, but Socket A versions followed immediately and eventually came out with speeds from 600MHz to 1.4GHz. The major difference between the first and second generations was the placement of the L2 cache: the first variants had 512kB next to the processor, while the second generation had 256kB integrated on the chip itself. Although the amount was halved, the on-chip cache was much faster. In June 2000, AMD was the only manufacturer with a seventh-generation x86 processor. Intel pushed its Pentium III to 1GHz as well, but only followed suit later that year with its Pentium 4.

During the same period, AMD also introduced the Duron processors as a counterpart to Intel's Celeron CPUs. The Durons had Spitfire cores based on the Thunderbird design, but with 64kB of L2 cache instead of 256kB. The 128kB of L1 cache was unchanged, however, and the front-side bus ran at 100MHz, as on the Athlons. That made the Durons far more interesting than the Celerons, because Intel's budget processors had a 66MHz FSB and only 32kB of L1 cache. The Durons were not only faster, but also much cheaper: in the summer of 2000 a Duron 600 cost 269 guilders, while a Celeron at the same speed cost 429 guilders.

Many users will think back to this processor generation with warm feelings. It was a fun time for overclockers, even though AMD locked the multiplier on its processors by laser-cutting the L1 bridges on top of the Thunderbird and Spitfire chips. Restoring those connections was a simple job; all you needed was a steady hand and a pencil. At first there was tinkering with the socket, and Bison Electro-Kit conductive paint was used to reconnect the contacts, but it soon turned out that an HB pencil worked just as well, thanks to the conductive properties of graphite. The Durons were popular with overclockers and users on a small budget: with a bit of luck, you bought a 600MHz model and ran it at 900MHz.
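The payoff of that trick comes down to one multiplication: the core clock is the front-side bus speed times the multiplier, so unlocking the multiplier directly scales the clock. A minimal sketch of the arithmetic, with example multiplier steps that are not an exhaustive list:

```python
# Core clock = front-side bus x multiplier. With the L1 bridges restored,
# the multiplier could be raised; on the Duron's 100MHz bus, going from
# 6x to 9x turns a Duron 600 into a 900MHz part.
FSB_MHZ = 100
for multiplier in (6.0, 7.5, 9.0):  # example steps only
    print(f"{multiplier}x -> {FSB_MHZ * multiplier:.0f}MHz")
```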

2003 – Athlon 64: 64-bit and dual cores

In the early years of the 21st century, AMD had a clear technological lead over Intel. The Pentium 4 processors, based on the NetBurst architecture, were not a success. Power consumption and heat generation were high, while clock speeds didn’t scale as well as Intel expected. Intel thought it could reach speeds of 5GHz with the design and even 10GHz with derivatives, but got stuck at 3.8GHz. That was higher than what AMD achieved, but the Athlon processors performed better at the same clock speed.

AMD continued to improve its Athlon processors and in 2003 released the Athlon 64: the first 64-bit consumer processor, based on the K8 architecture. The first versions were made at 130nm, and from 2005 at 90nm. AMD marketed the Athlon 64 processors with performance ratings instead of their actual speeds, in order to position them against the Pentium 4s. For example, AMD pitted its Athlon 64 3800+, which ran at 2.4GHz, against the 3.8GHz Pentium 4. Overclocking with a pencil was no longer possible, but AMD released an FX series of its Athlon 64 processors with an unlocked multiplier. These variants were aimed at gamers and also clocked higher.

The K8 architecture of the Athlon 64 processors was innovative because of its integrated memory controller. Previously, the memory controller sat on the motherboard, with the northbridge acting as the hub; removing that intermediate step significantly reduced latency. AMD was years ahead of Intel here, as the competitor did not integrate the memory controller until 2008, with the Nehalem architecture for the Core i processors.

In May 2005, AMD released its first dual-core processor: the Athlon 64 X2. At the same time, Intel released its Pentium D, a dual core based on the NetBurst architecture. As with the single-core variants, AMD's processor generally performed better, but it initially lost out on price: at release, the cheapest Athlon 64 X2 cost $537, while Intel offered a Pentium D for $237. The usefulness of more than one core was also limited at the time, as games did almost nothing with it. AMD therefore stuck to a single core for its Athlon FX variants.

The K8 architecture, and the K9 variant for dual cores, reappeared in many processors between 2003 and 2009. For example, AMD introduced the Sempron series as the successor to the Durons, eventually including X2 variants. AMD also used the architecture for its Turion 64 laptop processors and their X2 variants, which competed with Intel's Pentium M and mobile Core processors.

2006 – AMD acquires ATi

The Athlon era gave AMD good numbers, and the processor manufacturer went on an acquisition spree. That resulted in the takeover of video card maker ATi, for which AMD paid $5.4 billion. Initially, AMD continued to sell the video cards under the ATi brand name, but that came to an end in 2010. The Radeon name was retained, however; ATi had introduced its first GPU under that name in 2000, and AMD still uses it for its GPUs.

The acquisition enabled AMD not only to release video cards, but also to use ATi's knowledge in its processors. Ultimately, this would lead to the introduction of APUs: AMD processors with an integrated GPU. The acquisition of ATi had major consequences for AMD, not only technologically but also financially. The billion-dollar purchase left a large debt burden, which was exacerbated in the following years by heavy losses and fierce competition with Intel.

2007 – Phenom and fierce competition from Core 2

During the years that Intel stuck to its NetBurst architecture, AMD had a firm grip on the performance crown. Maintaining that edge became difficult when Intel decided to abandon NetBurst and go back to the drawing board. In 2006, Intel introduced the Core 2 generation. It was based on the economical Pentium M laptop processors, and the Pentium 4's power consumption and heat problems melted away like snow in the sun. Intel released Core 2 Solo, Duo, and Quad variants, making it the first with a consumer quad-core processor. At the time, Intel used a chip design with two dies for its quad cores; each die held at most two cores.

Only at the end of 2007 did AMD come up with an answer to Intel's multicore onslaught: the Phenom series, based on the K10 architecture. Phenoms came with two, three, or four cores, all monolithic designs with every core on a single die. AMD made the processors at 65nm, and although Intel initially did the same with its Core 2 models, 45nm versions that could be clocked higher soon appeared.

The introduction of the Phenom processors was also accompanied by an annoying bug: the quad cores had an error in the translation lookaside buffer, or TLB. The bug couldn't be fixed with software or microcode without significant performance loss, so AMD had to produce a new revision of the processors, which wasn't released until 2008.

AMD moved to 45nm in 2009 with its Phenom II generation. Again there were variants with two, three, and four cores, but AMD now used the same quad-core die in all cases, disabling cores for the dual- and triple-core models. The smaller process allowed the new Phenoms to clock higher than their predecessors, but Intel's Core 2 Quads were still faster. In return, AMD offered lower prices in order to gain some market share.

The lower-ranked Phenom II processors were interesting for users, as it soon became apparent that the disabled cores could be re-enabled in some revisions. Later there were also hexacores sold as quad cores, which could likewise be unlocked in some cases. Motherboard manufacturers responded by adding features that made this possible.

2008 – From own fabs to fabless

For years, AMD made its processors in its own production facilities, or fabs. That came to an end in 2008, when the company began to divest its manufacturing division, eventually resulting in a new company: GlobalFoundries. AMD's decision was driven by financial considerations: the cost of constantly investing in new process technology was too high for the company to bear on its own.

Since AMD no longer manufactures chips itself, it buys production capacity from other chipmakers. TSMC currently supplies AMD's latest processors and GPUs, which are made at 7nm. GlobalFoundries is still a major supplier of 14nm and 12nm products, including current Ryzen processors. Incidentally, GlobalFoundries has also proved unable to keep investing in smaller processes as an independent company: last year it announced that it would stop developing its 7nm, 5nm, and 3nm nodes, as it cannot compete with giants such as Samsung and TSMC.

Early 2011 – AMD Fusion: first APUs

After AMD sold its fabs, an internal reorganization followed in 2009. The company was split into four parts, and the CPU and GPU divisions were merged. AMD had already announced that it was working on Fusion APUs, which combine a CPU and a GPU on one chip. The merger of the two divisions made it clear that AMD wanted to invest heavily in this.

At the beginning of 2011, AMD introduced its first APUs. At first, these were only low-power Brazos variants for tablets, laptops, and small computers; the chips appeared in netbooks, for example. The performance of the CPU part was disappointing, with AMD's dual core proving slower than the Intel Atoms of the day. The faster GPU part made up for some of that, but Fusion was not really convincing.

In mid-2011, the Llano variants for desktops were released. Here too the CPU cores failed to convince, but the integrated GPU was a lot faster than that of Intel's processors. Still, most gamers saw more benefit in a separate CPU paired with a mainstream video card, which performed better. The Fusion APUs were interesting for HTPCs, for example, but never broke through.

End of 2011 – Bulldozer disappoints

At the end of 2011, AMD presented its Bulldozer architecture, reviving the FX name for the corresponding processors. It was a completely new chip design, built from the ground up with the intention of delivering better performance at lower power consumption. It had to make AMD competitive with Intel again.

On paper, the processors were impressive. The FX-8150, the first top model, was an octa-core clocked at 3.6GHz. The FX processors had a modular design, with each module consisting of two cores that shared part of the hardware. But Bulldozer failed to deliver on its promises: in practice, the FX processors sometimes performed worse in single-core applications than the Phenom II variants of the previous generation, and power consumption and heat output were very high.

Although the FX processors were not a great success, AMD did manage to break records with them. An FX-8150 was overclocked to 8.4GHz using liquid helium, good for a world record that has not been broken to this day. Such speeds were not feasible at home, but in 2013 AMD released the world's first 5GHz consumer processor: the FX-9590. With a TDP of 220 watts, that processor also demanded serious cooling, so AMD offered a bundle with water cooling.

Against the Intel Core processors, AMD stood weak with Bulldozer. During this period, the roles were reversed compared to a decade earlier: AMD kept developing its Bulldozer processors, just as Intel once had with its Pentium 4. This time, however, it was Intel that enjoyed continued success, with its Core i processors holding on to the performance crown year after year.

Between 2012 and 2016, AMD released Piledriver, Steamroller, and Excavator as Bulldozer's successors. The new versions brought higher clock speeds and better performance than their predecessors, but significant improvements failed to materialize, and they did not make AMD competitive with Intel again.

2013 – PlayStation 4 and Xbox One with AMD hardware

In the Bulldozer era, AMD struggled, and the Fusion APUs also failed to catch on in the PC market. The knowledge of combining a CPU and a GPU did come in handy in the console market, however: AMD became the supplier for both Sony and Microsoft. The Xbox One and PlayStation 4 were released at the end of 2013, and both contained an AMD chip combining a processor and a GPU.

The chip's processor cores consisted of Jaguar modules: CPU cores that were in fact developed for laptops. Both the Xbox One and the PS4 had eight of these cores. The GPU was based on the GCN architecture that AMD used in its Radeon video cards at the time. The version in the PS4 had 1152 stream processors, making it comparable to the Radeon HD 7850 or 7870; the Xbox One had a GPU with 768 stream processors, clocked slightly higher.

The deal assured AMD of years of orders for those chips, because consoles have a long lifespan. The faster console revisions that came out a few years later, the PlayStation 4 Pro and the Xbox One X, were also equipped with AMD hardware. Sony has since announced that the PlayStation 5 will again use AMD hardware; whether Microsoft will choose the same path for a new Xbox is not yet known.

2014 – Reorganization and arrival of Lisa Su

After years of poor financial results, AMD announced Lisa Su as its new CEO in October 2014. She had been active at AMD as COO since 2012. In the same month, the brand-new CEO announced the dismissal of 710 people, about seven percent of all employees at the time. AMD had to cut costs and reorganize further to stay afloat.

AMD had already been changing course for some time by then. Competition with Intel had been put on the back burner, and the company wanted to enter more new markets. The intention was for custom APUs, such as those in the Sony and Microsoft consoles, to play a greater role. Meanwhile, AMD was also working on a new processor design, though that would only become clear years later.

2017 – AMD rises again with Ryzen

In early 2017, AMD released its first Ryzen processors. Bulldozer had been thrown overboard, and behind the scenes the manufacturer had been working on the completely new Zen architecture for several years. Expectations were sky-high, because progress in the processor market had almost ground to a halt: Intel ruled the roost with its Core i processors, and although new versions came out every year, there was hardly any improvement. For years, AMD had been unable to do anything about it.

Benchmarks showed that AMD had made a huge leap with Zen. The performance gain over the Excavator cores was enormous, and Ryzen could compete with Intel again. AMD still trails the competition in maximum clock speed and single-core performance, but when it comes to cores, AMD is the absolute winner: Ryzen brought octa-cores to the mainstream platform at a relatively low price, and Threadripper processors with sixteen cores followed not much later on the HEDT platform. Performance per watt is excellent, eliminating a major problem of the previous generation.

In 2018, AMD continued on the same course with a second generation of Ryzen and Threadripper CPUs. The new versions are made on a slightly smaller 12nm process and have higher clock speeds. With the 32-core Threadripper 2990WX, AMD is the absolute core champion.

AMD's resurgence is due in part to chip architect Jim Keller, who worked on Zen at AMD from 2012 to 2015. In the past, he was also responsible for the K8 architecture of the Athlon 64 processors, which brought AMD success for years. A striking detail is that Keller has been working as a chip designer at competitor Intel since April 2018. Keller changes employers regularly: after his first stint at AMD, he went to Apple, where he helped lay the foundation for the iPhone maker's own SoCs, and in 2016 and 2017 he worked for Tesla.

2019 – Zen 2 and Navi: new technological edge?

AMD celebrates its fiftieth anniversary in May. That birthday comes a month before an important announcement from the company: during a keynote at the Computex trade show in Taipei on May 27, CEO Lisa Su will reveal information about new products. The first details about the new Ryzen processors with 7nm cores and the Navi GPUs will probably emerge there. Those products should reach the market in the second half of this year.

This year, everything at AMD revolves around 7nm. At the beginning of the year, AMD released its Radeon VII, the first consumer video card with a GPU made on TSMC's 7nm process. Although the GPU is baked on the latest process, the chip itself is a further development of the older Vega GPU. Because of its HBM2 memory, AMD cannot make cheaper versions of this video card, and it remains an expensive model. The upcoming Navi cards may use GDDR6 and will at least be cheaper than the Radeon VII. However, AMD does not seem set to compete with Nvidia's fastest RTX cards for the time being; the Navi cards will probably mainly serve the mainstream segment.

Little is known about the new Ryzen processors yet, except that they are based on the Zen 2 architecture, about which AMD has already revealed information. That processor design consists of 7nm CPU chiplets and a 14nm I/O die. The combination of chips made on different processes is cleverly put together, because only the CPU cores need to be made on the most expensive 7nm process. Moreover, those are small chips, several of which can be combined when more cores are needed. That is much cheaper than making one large chip with all the cores, as Intel does. The use of separate chips also has drawbacks, because communication between the various components introduces latency. AMD has developed a new version of its Infinity Fabric interconnect for this, and practice will show how well that works out.
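Why small dies are so much cheaper can be illustrated with a classic first-order yield model, in which the chance that a die comes out of the fab defect-free drops exponentially with its area. The defect density and die sizes below are illustrative assumptions, not actual TSMC or AMD figures:

```python
import math

# Poisson yield model: Y = exp(-D * A), with defect density D (per cm^2)
# and die area A (in cm^2). All numbers are assumptions for this sketch.
DEFECT_DENSITY = 0.2  # defects per cm^2 (assumed)

def die_yield(area_cm2: float) -> float:
    """Fraction of dies expected to be free of defects."""
    return math.exp(-DEFECT_DENSITY * area_cm2)

chiplet = 0.8    # cm^2: a hypothetical small 8-core CPU chiplet
monolith = 3.2   # cm^2: a hypothetical monolithic 32-core die

print(f"small chiplet yield:  {die_yield(chiplet):.0%}")   # ~85%
print(f"monolithic die yield: {die_yield(monolith):.0%}")  # ~53%
```

In this sketch, a defect costs only one small chiplet instead of an entire 32-core die, which is where the cost advantage of combining several small chips comes from.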

AMD can celebrate its fiftieth anniversary in good spirits, because the company is currently at the forefront of processor technology. Intel has struggled for years to get its 10nm process going and continues to make many processors at 14nm, while AMD is releasing 7nm processors this year. How Zen 2 performs in practice and what exactly the next generation of Ryzen processors will look like remains to be seen, but the future looks bright for AMD.
