why is amd slower than intel?

Discussion in 'hardware' started by TechOutsider, Jul 18, 2009.

Thread Status:
Not open for further replies.
  1. tipstir

    tipstir Registered Member

    Joined:
    Jun 9, 2008
    Posts:
    830
    Location:
    SFL, USA
    Sounds like nobody here is going to agree on either AMD or Intel. I run both and would still go with AMD. Intel commercials will lead you to believe they're the only CPU on the market; well, guess again. A larger cache is not always a good thing in some applications. Still, the added buffer feature won't hurt.
     
  2. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    Huh? What does that mean?

    Well, of course they do. AMD would like you to believe the same, just as Ford and Toyota would prefer you buy only from them. For those who have done the research, there's a clear leader - but that does not mean the follower is a slouch.
     
  3. tipstir

    tipstir Registered Member

    Joined:
    Jun 9, 2008
    Posts:
    830
    Location:
    SFL, USA
    I am more for AMD than for Intel controlling the market. I used to go for Intel and not AMD, but I found that AMD CPUs, though cheaper and able to be overclocked by 1 GHz or more, tend to do basically the same job. This race to see which CPU is faster is going nowhere today, as not every application can use such speeds yet.

    Intel should never have released the Celeron. It just confused the market with a poor, cut-down CPU with limited cache. I still feel AMD's own Sempron is better than the Celeron. A full CPU with its larger cache just adds more speed to the I/O stream; other than that, they're all going to perform pretty much the same. Some applications tend to see better results with a single core than with a dual core. Now with quad cores, how many applications can actually use that? And if they do, are you going to pay for the latest CPU only to find out another, faster CPU has topped the one you just bought?
     
  4. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    My first CPU (well, my bigger brother's, I didn't have my own PC back then) was an Intel Pentium I (200 MHz). After that, I found myself going with AMD every time (Athlon 1100 -> Athlon XP 2500+ -> Athlon 64 3200+ -> Athlon 64 x2 3800+ -> Athlon 64 x2 5050e).

    I am not a fanatic gamer anymore (I play older games) and mainly need the PC for the internet, listening to music, watching films, and P2P.

    For that purpose, all the comparison between "slower" and "faster" is IMHO a moot point. Even the 3800+ x2 shows no noticeable difference from the 5050e. I can run P2P, burn a DVD, watch an HD film, and browse the internet at the same time with no delay. What else do I need?

    Every time I did my shopping enquiries, I ended up with AMD, simply because it gave me enough power for MUCH less money.

    As for heat, EE (Energy Efficient) AMD models run very cool. My current 5050e, using the crappy stock cooler, sits at 38 °C with a 32 °C room temperature (CnQ enabled).

    Yeah, Intel is faster. But how many really NEED that performance boost? AMD gives better bang for the buck, and for non-gamers and average home users it is more than enough. The AMD entry-level dual core (the old S939 x2 3800+) is more than enough even for Blu-ray playback, coupled with a half-decent video card.


    The rest of this "faster" war is part of the promotional advertising of the companies, trying to convince you that you NEED the newest 8-core (soon) CPU, while in reality, for most adult home users, any dual core will do.


    My current setup

    Asrock A790GMH/128M
    AMD x2 5050e
    4 GB RAM Kingston 800Mhz DDR2 CAS 4
    VGA: using onboard integrated ATI HD 3300
    2 HD

    Good luck trying to build an equivalently powered rig with Intel-based components using the SAME amount of money... That's AMD for me... BTW, the above CPU/motherboard combo (the 780G and 790G are very efficient chipsets) is one of the lowest in power consumption around, decreasing overall heat and the electricity bill.

    (With a fraction of the money I would have paid for Intel, I bought 1 spare motherboard and 4 GB more RAM for a future 64-bit upgrade to Windows 7.)
     
    Last edited: Aug 30, 2009
  5. Fuzzfas

    Fuzzfas Registered Member

    Joined:
    Jun 24, 2007
    Posts:
    2,753
    Exactly. The CPU power currently available is not yet used by software. You can "upgrade" in cores and actually find a performance decrease.

    We will arrive at 12 cores and they will still sell us the story that we "need" the newer CPU, because it finishes a synthetic test a whole 10 seconds faster than our "old" CPU... :D
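
    Fuzzfas's point about extra cores going unused is the classic Amdahl's law argument: if only part of a workload can run in parallel, piling on cores yields rapidly diminishing returns. A minimal Python sketch (the 50% parallel fraction is an illustrative assumption, not a measurement of any real CPU):

    ```python
    # Amdahl's law: best-case speedup on n cores when only a fraction p
    # of the workload is parallelizable. Illustrative numbers only.

    def amdahl_speedup(p: float, n: int) -> float:
        """Speedup on n cores if fraction p of the work runs in parallel."""
        return 1.0 / ((1.0 - p) + p / n)

    # Assume half the work parallelizes (typical of 2009-era desktop apps):
    for cores in (1, 2, 4, 8, 12):
        print(f"{cores:2d} cores -> {amdahl_speedup(0.5, cores):.2f}x")
    ```

    With p = 0.5, even 12 cores give less than a 2x speedup over one core, which is why a "12-core upgrade" can feel no faster in everyday use.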
     
  6. TechOutsider

    TechOutsider Registered Member

    Joined:
    Sep 26, 2008
    Posts:
    549
    I agree with Fuzzfas and tipstir. Both offer comparable performance at most price points. AMD is often cheaper though.

    Do we need that performance boost? For me, no. I have an Intel Q6600 and I want to replace it with a Celeron 440 (a lot less heat), since I don't like running my A/C.

    I agree.

    What about you guys? Do you think you absolutely need an i7 over a Phenom II X4 for your computing needs? Competition is nice, though.
     
  7. westom

    westom Registered Member

    Joined:
    Aug 21, 2009
    Posts:
    41
    AMD is cheaper, with minuscule profit margins (and saddled by a major mistake in buying the video controller company). Its processors are no longer as fast. The reason is obvious: transistors had gotten so small that the CMOS gate insulator (silicon dioxide, essentially glass) was as few as three atoms thick. Too much electricity was leaking through that glass, and making the processor faster means even more leakage. Faster meant too hot - not because heat is destructive (at those temperatures it is not), but because heat causes too many timing and conductivity changes, i.e., data errors.

    Nobody could make a solution for that gate leakage work. DuPont's solution was used on some IBM chips - which then peeled off during fabrication.

    Intel took a major gamble. Intel committed to hafnium dioxide in its next generation of ICs long before the process was even proven - a decision made at the presidential level, because the risk was that massive (and because their management actually understands how the product works). Intel's proprietary hafnium dioxide worked. Everyone else was playing catch-up with Intel's smaller and faster transistors, because nobody else could make transistors that small using the new high-K material.

    Intel has a long history of doing this. AMD is simply pushing Intel to be innovative - just as the Zilog Z80, Motorola's 68000, IBM's PowerPC, and a long list of smaller challengers (e.g., Transmeta) did. To stay competitive, AMD chips must now operate slower so as not to overheat. AMD is now trying new solutions such as outsourcing, because its R&D has not kept up with Intel - who took major gambles when the technology was not even known to work.

    Also mentioned was AMD's major bungle in grossly overpaying for its video controller company - and then discovering that the video controller does not integrate well with processor needs. This has further impeded AMD's R&D budgets.
     
  8. tipstir

    tipstir Registered Member

    Joined:
    Jun 9, 2008
    Posts:
    830
    Location:
    SFL, USA
    Strange you say this - I am using an AMD laptop with an integrated ATI PCI-E GPU without any issues. Face it, AMD is just as good as Intel. VIA is another CPU maker to consider, though it isn't mentioned as much as AMD. VIA has been around for a while too, but its CPUs don't offer a lot of what AMD has for L1/L2 cache.
     
  9. westom

    westom Registered Member

    Joined:
    Aug 21, 2009
    Posts:
    41
    So you know AMD is making a profit on that integrated solution? You can see it is profitable only because your computer works? Others have attempted this and failed. That user also saw it working - not the financial failure behind it, and not the slower processing speeds that mean the computer will be retired earlier.

    AMD is now suffering R&D cutbacks. Financial analysts attribute that to paying much too much for the video processor company (ATI). When will you see the results of that reduced R&D? Typically four years later, when their newer products are not doing what Intel's already are. Some R&D cutbacks have already appeared.

    AMD is now outsourcing some of its manufacturing. AMD has been slow to grasp technologies such as high-K processes and the smaller transistors that Intel has been shipping for some time now. Therefore AMD needed outside help, and therefore AMD processors cannot operate at the faster speeds.

    AMD is not making profits in an industry that is highly profitable. What AMD was doing in its products years ago only appears in its bottom line years later. When AMD had a better architecture, profit growth appeared in its financials many years afterward. That started somewhere around the K6, when a new engineering group was isolated from AMD top management; from it came some superb work, especially in architectural design. Today, AMD's technology is not advancing relative to Intel's.

    So AMD integrated your CPU and GPU - at how much loss? Others have done this too, with no success. The high-K process is but one example of a major technology leap by Intel that AMD could not make, having wasted so much capital on buying a video controller company. Capital that could have been used for R&D is not available.

    Whispers are asking if AMD can survive. Yes. But the fact that whispers are even asking suggests a problem that most attribute to wasting so much capital on the video controller company with so little to be gained.
     
  10. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    :D Yeah, my Chevy truck works too, but I would hardly call General Motors profitable.

    Highly profitable? Not hardly! Typical of cutthroat industries, the profit margins are minimal and that's much of the problem. The only way to compensate for low profits per unit is to move large quantities. AMD's decline had more to do with Intel's cutting edge advances, AND Intel lowering prices. Unexpected manufacturing problems and the recession at the same time sure didn't help any either.

    Note that for the entire IT industry, the profit margin is less than 7.5% - and a careful look shows some serious concerns, like RAM makers' profit margin at -44%!

    Look at Dell - a profit margin of 3.43% barely covers inflation.

    Intel Profits down 90%.

    Better design attracted the previous AMD-only gaming community, among other enthusiasts, and Intel lowering prices attracted everyone.

    I think the ATI deal can save AMD, if they can get past today's economic woes - if they are smart about it. If AMD fails to design their ATI graphics solutions to perform equally well on both AMD and Intel based motherboards, the company will surely fail too. That would be very bad for consumers as we need AMD and ATI to keep Intel and nVidia honest, and priced within reach.

    And that's a very good thing for all of us, including AMD. AMD also offers consumers choices - also a very good thing.
     
  11. tipstir

    tipstir Registered Member

    Joined:
    Jun 9, 2008
    Posts:
    830
    Location:
    SFL, USA
    Sure, an AMD laptop is much cheaper than an Intel laptop. Today, who wants to spend $1,500 or more on a laptop when you can get them for $348, $548, or $699? These netbooks run the Intel Atom, and AMD has its own Atom equivalent. Still, price matters. Don't forget VIA; in the past there were a few others making CPUs too. Intel is huge, but AMD can hang around if they don't screw up. The popular AMD X2 64-bit mobile CPU with ATI video is pretty darn good. I have some Dell and Toshiba laptops running Intel; they're okay too.
     
  12. westom

    westom Registered Member

    Joined:
    Aug 21, 2009
    Posts:
    41
    Ignored is that other CPU manufacturers have already tried this - and failed. No significant performance gain was achieved by integrating the video controller with the CPU.

    Meanwhile, Dell et al. do not manufacture CPUs. The CPU manufacturers that have concentrated only on their core objectives are highly profitable, including Intel and TI.

    The ATI deal has caused AMD to reduce its R&D budgets just when Intel's new technologies require AMD to increase R&D. Another symptom of AMD's reduced competitive edge is the outsourcing of some of its production, because AMD needs to buy outside help. There is insufficient R&D at AMD to innovate smaller transistors and high-K materials - innovation breakthroughs already in Intel products, making faster chips that consume less energy.

    How does integrating a video controller into a CPU make the CPU faster with less energy consumption? It does not. But it does imply AMD is no longer trying to match Intel's innovations.

    This is why Intel's move was such a massive technology leap. Intel committed to a proprietary high-K material even though the process was not proven. Other companies committed to high-K materials only to have whole production runs fail (e.g., IBM). Intel committed to a risky process - and succeeded where so many others had failed. AMD is still playing catch-up, does not have those solutions, and is now saddled with a massive ATI debt that further reduces R&D.

    Or has AMD decided to stop competing technically? Has AMD decided to become a commodity producer of lower-cost, low-performance CPUs? Has AMD finally admitted it can no longer innovate fast enough to compete with Intel? Then a video controller, the north bridge, and the south bridge all integrated into one package would make sense. Has AMD decided to commoditize its product as a slower "system on a chip"? Maybe, since it paid so much for ATI and does not have the capital for the necessary R&D.

    And finally, consider what industry analysts are also saying - another point you ignored. AMD paid too much for ATI. If the acquisition was successful, then explain why even the analysts are wrong.

    All the points made above remain unanswered. Some still believe that AMD will somehow compete with Intel - the OP's original question. It is possible that AMD has decided to stop trying to compete and to instead find some other CPU market. Will an integrated video processor somehow solve AMD's technology problems? Of course not. How does AMD address the challenge with reduced R&D budgets after paying too much for ATI?

    AMD is losing money in a CPU industry where the leading innovators are extremely profitable. To remain competitive, AMD must sell its products on razor-thin - almost non-existent - profit margins. So maybe AMD has decided to stop trying to be a leading-edge manufacturer?
     
  13. Bill_Bright

    Bill_Bright Registered Member

    Joined:
    Jun 29, 2007
    Posts:
    4,042
    Location:
    Nebraska, USA
    That statement is illogical. Because others failed before, all future attempts by AMD will fail too? o_O Good thing Orville and Wilbur did not have that defeatist attitude. Or NASA.

    And you assume too much about improved performance. It is not always about simply increasing performance. It is also about cutting costs, increasing energy efficiency, and reducing heat. If improvements in these areas can be realized without degrading data crunching performance, the product is still a success.

    But let's look at performance for a minute. Why was Intel's integration of the floating-point processor into the 486 die a success, even though it increased the price of the CPU? Why was AMD's integration of the memory controller such a big success that Intel is now doing something similar? The reason was, and is, obvious: no bus to transfer data to distant devices, therefore no latencies. But not just that - integration also simplifies (lowers the cost of) motherboard/chipset design and manufacturing. If AMD succeeds with its Fusion line of CPUs with integrated graphics (still 2 years out), there will be one less bus to travel and fewer parts to integrate onto the motherboard.

    And finally, at least on the video front, AMD has NO plans for the Fusion chips to replace the high-end graphics solutions (cards) that hard-core gamers and professional graphics designers demand. Rather, AMD is marketing Fusion as a very viable, affordable option for "mainstream" computers - that is, the vast majority of computers out there - those that are not extreme game machines but are used for work, school, email, the Internet, listening to music, and watching DVDs, all of which can be done easily with just about any graphics solution today.

    It outsources to cut costs, as does just about every industry in the world - or at least in countries with high labor costs. I assure you, Intel does not believe AMD is not a contender. And with the world economy improving, people and companies are starting (albeit slowly) to buy computers again, so it is not as if AMD has no money coming in - or bill collectors banging at the door. In fact, income from the ATI side of the house has gone up, and as Global Foundries (the new manufacturing company's name) completes retooling to the new technologies, AMD CPU sales are expected to improve too.

    That's an illogical question. NO graphics solution makes ANY CPU faster. But a good graphics solution clearly helps define the overall performance of the computer, because the more (and faster) the CPU can hand off graphics crunching to the GPU, the better the overall performance, as that leaves the CPU more cycles to complete its other (and now fewer) duties. And power consumption as seen at the wall outlet (not necessarily at the CPU) is reduced, because only one device needs power and no power is wasted shuffling data long distances across a bus. And I note this integration is based on new manufacturing techniques that pack more transistors into the same space - which by itself improves performance (due to less latency) and allows for even more integration.

    Of course not.
    Not entirely. Lower costs, yes. But they are still striving for high performance, and certainly many of their offerings achieve that, and will continue to do so for years to come, IF they can survive this recession - and avoid any more buggy drivers.
     
  14. MasterTB

    MasterTB Registered Member

    Joined:
    Jun 19, 2007
    Posts:
    547
    Location:
    Paraná, Argentina
    Just to add some fire to the conversation :ninja:

    I'm running a system with an "old" Athlon 64 x2 5000+ @ 2.8 GHz, 4 GB of RAM, a 1 TB Toshiba HDD, and an XFX 9800GT video card driving a Samsung SyncMaster 2343NWX monitor at 2048x1152 (60 Hz), and I can easily keep up with a Core2Duo (@ 2.66 GHz, default) with a Zotac GTX 260 (same RAM and HDD) running all games... and BTW, the other system has a 1280x1024 monitor.

    So, I don't really see AMD being slower than Intel.
    Never have and hopefully never will.

    I also don't trust benchmarks. They never show you what your own system can actually do. I've built machines in the past going by what the benchmarkers built and never could get close to the results they got.

    Of course this is my experience and my opinion, and opinions vary.
     
  15. TechOutsider

    TechOutsider Registered Member

    Joined:
    Sep 26, 2008
    Posts:
    549
    Intel has always been, and will likely remain for quite a while, the 800 lb gorilla of CPU land.

    "Chimpzilla" stole 25% of Intel's bananas during the Netburst fiasco, and right now the gorilla is very, very angry.

    Intel's executing very well right now, and they have vast resources to throw at the problem.

    AMD will adapt and live, or fail to adapt and die.

    A snippet from someone else's forum post... I think it summarizes this thread nicely.
     