Page 7 of 9
Results 61 to 70 of 89

Thread: Intel Vs. Amd

  1. #61
    Xanex's Avatar Poster
    Join Date
    Nov 2002
    Location
    UK
    Posts
    192
It's all about the architecture

Can someone give me the temperature of a P4 3.06 HT CPU at sustained 100% load?

Take a look at the die size of the P4 compared to the Athlon line: the P4 has a huge die, which means you get fewer chips out of each wafer, and fewer chips per wafer = more money per chip.

The P4 also has a metal casing over the die, so the area available to transfer heat is greater on an Intel than on an AMD. AMDs do run hotter: they have a smaller die and less area to shed the heat. I have to run with my case open (since the new chip) because I've got a poor case and cooling, and cheap silver thermal paste. If I were to fit my old Thunderbird copper-base heatsink with some Arctic Silver 3 instead of the OEM one (the old brace on the copper heatsink is a real pain to attach), I'd drop from 51C to 40-odd C, even with the case closed.

As it stands, my AMD 2000+ runs at 44C idle and 51C at 100% CPU load with the case open; case closed, it's 51C idle and 60C at full load. 60C is a stable temperature for an AMD; people are just used to overclockers' temperatures, where they try to get the chip as cool as possible. AMDs can go as high as 90C before they burn out. My motherboard can be set to shut down at certain temperatures, and the lowest setting is 75C (Chaintech 7NJS Ultra 400, which I think is one of the best AMD boards around right now).

My old AMD T-bird ran at 37C at full whack with 100% CPU load, with that same pain-to-fit copper-base heatsink and Arctic Silver 2, case closed!

And P4s only stay cooler because they down-clock themselves when they get hot.

Leave an AMD on overnight and it still has the same performance in the morning. Leave a P4 on overnight and it has probably dropped its clock by 200MHz to bring the CPU temperature down to whatever the manufacturer has deemed "cool".

    my 2p

    Xan

  2. #62
    Can someone give me the temperature of a P4 3.06 HT CPU at sustained 100% load?
    That depends entirely on the cooling; there is no single figure for it. What you want is the chip's heat output in watts, which I don't know offhand.

    The P4 also has a metal casing over the die, so the area available to transfer heat is greater on an Intel than on an AMD. AMDs do run hotter.
    If the heatsink and thermal paste are set up properly, the metal casing actually makes for poorer cooling, not better. But it is true that the smaller die size means a lower rate of heat dissipation.

    And P4s only stay cooler because they down-clock themselves when they get hot.

    Leave an AMD on overnight and it still has the same performance in the morning. Leave a P4 on overnight and it has probably dropped its clock by 200MHz to bring the CPU temperature down to whatever the manufacturer has deemed "cool".
    They only lower their operating frequency when the temperature becomes dangerous to the chip (around 80 degrees or so). A P4 that is adequately cooled will run at the same speed in the morning. If it were really poorly cooled, it would slow down as you say, while an AMD left poorly cooled would simply shut off. (I think the old problem of AMD chips burning out has been addressed, and recent chips shouldn't do that.)
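    The difference between the two behaviours can be put as a short sketch. This is purely illustrative, not vendor firmware: the throttle temperature and 200MHz step are taken from the figures mentioned in this thread, not from any datasheet.

    ```python
    # Illustrative sketch of the two behaviours described above.
    # Thresholds and step size are assumptions from this thread, not datasheet values.

    P4_THROTTLE_TEMP_C = 80    # assumed temperature at which a P4 starts throttling
    AMD_SHUTDOWN_TEMP_C = 75   # lowest shutdown setting quoted for the Chaintech board

    def p4_effective_clock_mhz(base_mhz, temp_c, step_mhz=200):
        """A P4 drops its clock when it runs too hot; otherwise it runs at full speed."""
        return base_mhz - step_mhz if temp_c >= P4_THROTTLE_TEMP_C else base_mhz

    def amd_is_running(temp_c):
        """The motherboard simply powers the Athlon off at its configured cutoff."""
        return temp_c < AMD_SHUTDOWN_TEMP_C
    ```

    The point being made is visible in the first function: a well-cooled P4 never reaches the throttle temperature, so it keeps its full clock overnight.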

  3. #63
    OK, you guys can debate clock speeds and all that other stuff, but the hard fact is that Intel's stock beats AMD's stock hands down. The only reason AMD is above the $10 mark right now is that they announced the 64-bit Opteron, which also runs 32-bit code. BTW, Intel had 64-bit before AMD, it's just that their chips don't run 32-bit code like AMD's (they're called Xeon).

  4. #64
    Ex-member
    Join Date
    Jan 2003
    Posts
    5,450
    Originally posted by Secret Squirrel@8 September 2003 - 23:43
    OK, you guys can debate clock speeds and all that other stuff, but the hard fact is that Intel's stock beats AMD's stock hands down. The only reason AMD is above the $10 mark right now is that they announced the 64-bit Opteron, which also runs 32-bit code. BTW, Intel had 64-bit before AMD, it's just that their chips don't run 32-bit code like AMD's (they're called Xeon).
    Did you see that bit about not posting if you don't know what you're talking about?

    No?

    Oh well...

  5. #65
    lynx's Avatar .
    Join Date
    Sep 2002
    Location
    Yorkshire, England
    Posts
    9,759
    Originally posted by Secret Squirrel@9 September 2003 - 00:43
    OK, you guys can debate clock speeds and all that other stuff, but the hard fact is that Intel's stock beats AMD's stock hands down. The only reason AMD is above the $10 mark right now is that they announced the 64-bit Opteron, which also runs 32-bit code. BTW, Intel had 64-bit before AMD, it's just that their chips don't run 32-bit code like AMD's (they're called Xeon).
    Just about to correct a couple of errors, then saw Lamsey's post, so never mind.
    .
    Political correctness is based on the principle that it's possible to pick up a turd by the clean end.

  6. #66
    clocker's Avatar Shovel Ready
    Join Date
    Mar 2003
    Posts
    15,305
    I've seen this exact debate on several forums that I visit and it always seems to degenerate into throwing statistics and test numbers back and forth.

    What is the real world difference?

    I have to assume that this argument is mostly of interest to gamers, as most users aren't pushing their CPUs at 100% for hours on end.
    So, is there a real difference between comparable chips when actually playing a game?
    "I am the one who knocks."- Heisenberg

  7. #67
    Originally posted by clocker@8 September 2003 - 17:41
    I've seen this exact debate on several forums that I visit and it always seems to degenerate into throwing statistics and test numbers back and forth.

    What is the real world difference?

    I have to assume that this argument is mostly of interest to gamers, as most users aren't pushing their CPUs at 100% for hours on end.
    So, is there a real difference between comparable chips when actually playing a game?
    a lot of the people who get worked up about CPU comparisons are video game players. that is true. sure, if you just check your email, surf the web, download a bit of pr0nz, listen to some mp3s, you're not going to make a modern CPU break a sweat.

    but games are not the only things that push CPUs to 100% of their abilities. there are plenty of people who get paid to use computers all day, who would be concerned about CPU comparisons. editing audio, video, creating multimedia/internet content, using photoshop or illustrator, running scientific programs (i admit, this is a bit more rare than the other professional types)... any of that can and will easily use 100% of a CPU's cycles and CPU speed can make a huge difference in how long it takes you to complete your tasks. a lot of people do some of those more creative things at home as part-time work, or just for their own amusement-- faster CPUs do benefit their activities.

    as for the effect of CPU speed on games, it depends. the results vary wildly from one game to the next because any modern 3D game depends on the combined speeds of the CPU and the 3D-accelerated video card (designed to remove much of the burden from the CPU, of rendering 3D graphics). some games rely more heavily on the speed of the CPU, some rely more heavily on the speed of the video card. two classic examples are Quake 3 Arena and Unreal Tournament. in these reflex-intensive games, displaying a high rate of frames per second benefits the player by presenting a more accurate, smoothly animated representation of the action, so the player can respond more confidently based on more reliable visual information. Quake 3 Arena's frame rate flies when you play it with a high-end video card and a lower-end CPU-- it is not especially dependent on CPU speed. Unreal Tournament on the other hand benefits to some extent from video card upgrades, but fast CPUs are what really make its frame rate improve.

    of course, those two games are a few years old... so there is such a thing as "more than enough computer power" for older games. after you pass a certain threshold, there's really no discernible difference between 110 frames per second or 210 frames per second. but then the next generation of games is released, following the same pattern of hardware-dependence but wanting MORE, rendering last year's best-PC-on-the-block into an outdated piece of junk because it doesn't have enough speed.

  8. #68
    clocker's Avatar Shovel Ready
    Join Date
    Mar 2003
    Posts
    15,305
    Brainiac,

    Thanks for the reply.
    Well written.

    I realize that there are other apps that can intensively use a CPUs cycles.

    I guess my point is that, so far, all I've read here are number comparisons,
    or theoretical debates about the advantages of different architectures, etc.

    I think it would be more informative if someone could post something like: " I've played Quake with both setups and I like AMD because..." .

    Most of this discussion seems to be pretty blue-sky.

    When I was building my first system a couple of weeks ago, I asked the guys at my local comp shop this same question.
    They didn't launch into a dissertation about stats or theoretical advantages.
    Their reply: "How much money ya got?".
    "I am the one who knocks."- Heisenberg

  9. #69
    lynx's Avatar .
    Join Date
    Sep 2002
    Location
    Yorkshire, England
    Posts
    9,759
    The very high frame rates of some modern games are ridiculous: above about 100fps (about 50fps for an LCD), the monitor's phosphor (or the LCD pixel) cannot refresh fast enough to keep up, so anything above that is totally wasted. In any case, the human eye cannot react at anything like those speeds (more like 8fps), although the frame rate does need to be much faster than the eye to avoid strobing effects.
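    The "wasted frames" argument comes down to a one-liner. A quick sketch, assuming the display simply never shows frames beyond its refresh rate:

    ```python
    def displayed_fps(rendered_fps, refresh_hz):
        """A monitor can only show as many frames per second as its refresh rate;
        anything the GPU renders beyond that is never actually seen."""
        return min(rendered_fps, refresh_hz)

    # e.g. rendering 210fps on a display refreshing 100 times a second
    # still only gets 100 distinct frames onto the screen each second.
    ```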

    Game developers should concentrate on getting more detail rather than higher frame rates, or better still make better games based on current processor speeds rather than trying to produce games that stretch pc's further and further. I suspect we are getting past the limit of what is required in terms of frame rate, and getting near to the limit of what is required in terms of picture detail.

    I often have periods where my processor is running at 100%, but not so often that I can justify spending a lot of money on the top processor chips. I set myself a budget and find out what sort of performance I can get for that money. I don't care whose tests you use, the answer is always AMD. I've currently got an XP1700+; Sandra tells me its performance is better than a 1.6GHz Pentium and worse than a 1.8GHz Pentium, but it cost me a lot less than any Intel chip. I'm currently looking to upgrade: the XP2400+ beckons (I think that's the fastest chip my current motherboard will take), and £63.43 seems a very reasonable price, far better than £132.92 for a 2.4GHz P4.
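    Using the two prices quoted above (and taking the claim of comparable performance at face value, which is the poster's Sandra-based estimate rather than an independent benchmark), the price gap works out as follows:

    ```python
    # Prices quoted in the post above (September 2003, GBP).
    # The "comparable performance" assumption is the poster's, not a benchmark result.
    amd_xp2400_gbp = 63.43
    intel_p4_24ghz_gbp = 132.92

    price_ratio = intel_p4_24ghz_gbp / amd_xp2400_gbp  # roughly 2.1x the price
    ```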

    I think someone was quoting stock prices earlier - all I can say is that if so many people are foolish enough to pay excessive prices for their processor chips, it is hardly surprising that Intel has a high stock valuation.
    .
    Political correctness is based on the principle that it's possible to pick up a turd by the clean end.

  10. #70
    well, the effect of a CPU's brand on games is relatively intangible. upgrading your CPU may or may not contribute to an increase in frames-per-second, it may or may not make the game run more smoothly. i would not expect a person to be able to identify Intel or AMD (like a Coke vs Pepsi taste test) if they were given a chance to play the same game on two unmarked computers.

    in contrast, you upgrade a 3D video card, and you get more frames per second, you can turn up the detail levels, you may be able to enable more special effects that your older card couldn't produce, etc. some people can pretty easily tell which picture is produced by which card, because of certain quirks or characteristics of each brand's display methods. the characteristics and features of the video card are obvious.

    but CPUs? i would be incredibly surprised if someone could identify the two brands in a "blind" test, to the point where a preference is justified on performance or stability alone. their brand-exclusive features are entirely speed-related (aside from throttling/idling behavior). both brands perform well, both brands make 100% stable CPUs. CPUs either work or don't work, period-- improper cooling, shoddy motherboards & RAM, etc are completely separate issues. there just is no obvious difference to identify the CPU brand, if you haven't already been told which one you're using. "i play Quake on AMD because AMD makes Quake look better, sound better, feel better." anyone who claims such a thing (about Intel or AMD) is just fooling themselves.

