Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 18, 2011 12:21:53 GMT
In the past when I have purchased computers, the main consideration behind their specifications was Simming. So I would purchase them with the intent of having an ATI graphics card put in, if it wasn't already part of the package deal. I fought tooth and nail for the last one I bought; the guy was really pushing me toward an NVIDIA. Sorry, ATI only for me, bucko.

So now I am possibly in the market for a new computer. Well... I pretty much am for certain, as my beast is coming up on the three year mark and it has gotten really laggy and tired, and it's looking like it will cost about the same to get a fairly decent new one as it would to get this one spruced up. Plus, I just want an excuse for a new computer! LOL

But here's what I am wondering: with technological advances being what they are, is it still imperative that I get a special graphics card? I am really only looking to run TS2, as TS3 didn't jazz me. At this point, are integrated graphics not enough? I do not own Starcraft 2, but was hoping to get it if I got a new computer. Their system specs do list requiring a graphics card... here's the link to the system specs: us.blizzard.com/support/article.xml?locale=en_US&articleId=26242

Considering I'm pretty sure that the ATI Radeon 9800 Pro is already a few years old, do I still really need to look at upgraded graphics? Are there any integrated graphics processors that are coming up to these levels?

Also, when looking, I tend to focus on computers sporting AMD processors, simply because they bought out ATI and that's the sort of brand loyalty I've been groomed for. Should I be more open to Intel processors? Are there any thoughts as to one having better integrated graphics?

I am probably going to be buying from www.tigerdirect.ca, but am open to suggestions for specific machines if anyone has one. I have a good monitor and so on, I really just need the computer itself. Any thoughts/insight/links/suggestions are appreciated.
|
|
Sylphide
Long Term Member
Prima Simmerina
La Sylphide with James
Posts: 955
|
Post by Sylphide on Jun 18, 2011 13:02:25 GMT
Hi Moon, hooray for a new PC. This is fun! I am no geek so I can't possibly answer your questions, but I just wanted to tell you that I bought mine at MicroBytes, where they build the machines with exactly the components you want instead of having only prebuilt machines. I don't know if they have locations in Ontario, though? Tell us what you got when the deal is done!
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 18, 2011 16:29:31 GMT
I looked into MicroBytes, Margot, looks like they are only in Quebec. No luck there!
I used to be a big fan of MDG, had one of my first computers built by them. The other day my husband went and looked on their website, and they only had the computer prices listed as "daily payments" of a dollar or so... as though you are buying on credit. So we called to ask what the deal was, why can't I just see a full price so I know how much I'm paying outright? The guy on the phone told us the full price of a computer just as an example (we didn't ask about any one in particular) but said they only sell them now if you buy through their in store credit program. Yeah, no thanks.
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 19, 2011 2:36:17 GMT
Avoid buying from Tiger Direct, if possible!
They may be an "established" company, but they also have a long history of customer service issues, and sometimes problems with the merchandise itself. Most recently, they were caught selling old, refurbished Dells as new (by law, anything refurbished/repaired has to be labeled as such). Dell also sued them over this, because it violated the contract under which they were allowed to sell the machines at all: Tiger Direct acquired these systems as Dell was trying to unload them due to age, and Dell only agreed to let Tiger Direct sell them on the condition that the Dell name be stripped off the machines, as Tiger Direct had promised. Tiger Direct instead sold them as Dells, in violation of the contract, and was even telling customers these systems came equipped with Dell's tech support. That was never the case, and when problems arose and customers began calling Dell for help, Dell learned of it (and began investigating), and many customers began to complain. So Dell complained; Tiger Direct stripped the Dell logo off, but it was more of the same. In the end, Dell demanded their hardware back and sued Tiger Direct, and even reported them to the authorities on the grounds that it was fraud (with the customers who bought these computers, especially those whose machines were acting up, caught in the middle). The last I heard, the computers were pulled from sale, but the lawsuits are still ongoing. It was neither the first nor the last major situation involving Tiger Direct, either...
Most people in the industry generally tend to avoid doing business with that company whenever possible because of that and the fact that price-wise, you can sometimes get a better deal by shopping elsewhere.
As far as the AMD vs Intel processor debate goes, it's nearly an even split these days between the two of them. Also, the ATI name is being completely phased out of the graphics/video market by AMD (in favor of branding it all under AMD's name instead--most of the boards I've run across these days have been branded as AMD, even on the graphics end). If you do opt for Intel, though, definitely stick with their own chipset on the motherboard.
However, on the integrated graphics end, avoid Intel's integrated graphics chipsets! They have made improvements, but I still wouldn't recommend them for gaming--especially not with the games you have mentioned!
Generally, though, any time you go with integrated graphics, it's going to eat into your system RAM/memory regardless of what you choose. It also sometimes puts more of a demand on your processor to compensate. The way computers are designed, there's an interdependence between certain hardware components--the CPU/processor, RAM/memory, the video/graphics card or chipset, and the motherboard chipset--in how they perform and work with each other. Any time you run something performance-"heavy" like video as an integrated chipset, the other components that work with it end up getting pulled in more to compensate for what a dedicated card would have but the integrated chipset lacks (i.e. dedicated memory and sometimes GPU performance).
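As a rough, purely illustrative sketch of that carve-out (the 512 MB figure below is my own assumed example, not any particular chipset's spec), this is all the "shared memory" trade-off really amounts to:

```python
# Rough sketch: shared ("integrated") graphics memory is carved out of system
# RAM, so it is simply unavailable to the OS and applications.
# The 512 MB default below is an assumed, illustrative figure.
def usable_ram_gb(installed_gb, shared_video_mb=512):
    """RAM left over after the integrated-graphics carve-out (approximate)."""
    return installed_gb - shared_video_mb / 1024

print(usable_ram_gb(4))  # a 4 GB system effectively drops to 3.5 GB
print(usable_ram_gb(8))  # an 8 GB system drops to 7.5 GB
```

A dedicated card with its own onboard memory avoids that carve-out entirely, which is part of why it's generally preferred for gaming.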
Also, avoid "over-clocked"/"hyperclocked" hardware at all costs! It's a recipe for disaster (especially where replacing components is concerned). The process may give a short-term "gain" in performance, but it comes at the expense of the health of the hardware: it dramatically increases both the strain on the components and the amount of heat they produce, which can lead to premature "aging", very unhappy hardware, and the problems that follow, including with performance. The toll it takes is not worth the risks involved, as it can kill your hardware. It's also among the reasons most companies have a clause that overclocking voids the warranty.
These days, overheating is likely to just cause the system to shut itself off to "protect" your hardware and itself, thanks to safety standards adopted into the current form factor by the hardware standards bodies.
In years past, though, it was known to sometimes start fires or cause smoking inside the computer, due to the heat involved and to crucial "jumpers" being bypassed that would otherwise have kept things in check or monitored them. Earlier revisions of the standards introduced changes to try to prevent this, but some overclockers bypassed them or moved jumpers around (mostly with respect to CPUs back then, when the jumpers were on the motherboard), disabling safety measures built into the board because, they complained, those measures prevented them from overclocking. That, in turn, led to further revisions, to where the computer will now largely just shut itself off instead.
Up until about a year ago, there was a company that was very well-known for this practice--in fact, they even advertised their hardware as such, catering to the overly-enthusiastic group of people (sometimes labeled, due to their almost fanatical zeal about it, "the cult of overclockers") who either don't know how to do it properly or just don't want to do it themselves.
However, some consumers--unaware of the nature of the practice or its risks, and blinded by the "lifetime warranty" or by thinking the card was "faster" than the same card from a competitor not engaging in it--bought these cards and then ended up with overheating, burned-out cards, or other issues, because they didn't know what had been done to them. I have actually seen this even within the Sims community, and this brand has come up quite frequently in connection with the overheating-card issue some people have had.
The result was sometimes a high return rate for replacements and--toward the end--warranty issues as well as complaints. That's only one part of the reason behind their filing for bankruptcy, however.
I also know, from having spoken to an engineer who worked on the designs they licensed (I'm not going to mention any company names), that the company they were licensing from was not happy about what they were doing.
Incidentally, licensing hardware for others to manufacture is standard practice these days--almost everyone except Intel engages in it, as it's far less expensive than building and maintaining their own fab/fabrication facilities. (To provide some insight into this: Intel once defaulted on a payment in the process of building one, due to how astronomically expensive they are to build, combined with astronomical daily maintenance costs in running one.) AMD (which now includes what was ATI) also no longer runs its own fabs or manufactures things in-house anymore; they spun off their fab facilities to form a new, separate company, and currently contract their manufacturing and production work to another company in an effort to cut costs.
On a semi-related note, given the changes in slot types and standards since the days of the ATI 9800 Pro (which is an AGP card) versus now (PCI-E/PCI Express), it wouldn't work in a newer system anyway. You'd still be looking at upgrading to a newer card that uses the new standard. (Among the advantages of PCI-E: the cards are generally cheaper to produce and therefore cost less, perform faster, and use less wattage/are more efficient than the older 8x AGP slots allowed for.)
As you're looking at an entirely new system, though, the key questions that need to be asked are not just about the hardware, but about the budget involved. I'll try to post a list of some of the better or more reliable companies out there who build systems (or, at least, allow customers to choose some of the hardware that goes into them).
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 19, 2011 22:37:19 GMT
Thanks for the in-depth reply, CharlieChomper. And thanks for the heads up on TigerDirect! My father-in-law is in love with them, LOL

I totally should have mentioned budget in my original post... duh! I don't want to pay more than $700 for the computer (before taxes and possible shipping costs), though I suppose I could be flexible for a superior system.

Since posting this thread, I've been doing a little bit more research, and lurked a bit on a Starcraft 2 forum. A lot of people there are really serious gamers and had much higher expectations for their systems, but one thing they really push is having a larger power supply. Is that a good idea all around, or would I be okay with a standard power supply since my level of gaming is pretty casual?

So I've been looking around a bit, and I am wondering if it is technological blasphemy to even consider getting a Dell, and I hate to admit that I've seen a system that has grabbed my interest. I have heard that the quality of their systems has improved leaps and bounds since their "Dude, you're getting a Dell!" days, but I've been carrying around a bias against them for a long time, LOL. They do seem to have a pretty good deal for this system, though. Almost everything else in my price range is around 4GB, and since Windows 7 64-bit is what comes with all the systems and takes 2GB just for the OS alone, I am really hoping to find something with more than 4GB, and this one by Dell has 8. Also, that price includes the AMD Radeon 6450 which, although not a gaming card, should still be able to stand up to most games at reasonable settings (for the time being)... right? www.dell.com/ca/p/xps-8300/pd

Plus it includes shipping, so extra score. And it has pretty good ratings. Should I just get over my Dell phobia?

And then there's this machine, also interesting. But it raises entirely different questions.
It has an admirable graphics card and a beefed-up power supply, but only 2x2GB, which is decent from what I read online, but still not the 8 I would prefer. www.canadacomputers.com/product_info.php?cPath=7_121&item_id=037503

I don't know enough to tell whether or not everything in these computers is going to get used to the full extent, either. There's no point in having 8GB of RAM if the computer can't access it all, right? I remember putting extra RAM in one computer only to find out that my version of XP wouldn't recognize it past a certain point (or something like that). There had also been a couple of computers I had been considering on TigerDirect, but I'll steer clear of them now!
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 22, 2011 11:20:01 GMT
If he's just buying components, he may want to look into Newegg instead (they have a Canadian version of their site as well). It's historically where I've bought hardware over the years, including for my own systems, and I've honestly never had any issues with them whatsoever. These days I have more options open to me for purchasing hardware, but I still use them. For a period of time, Newegg was basically the "only" reliable/reputable option in that sector, and usually very reasonable as well (Tiger Direct was around back then too, but even then they didn't have the best reputation, and it has only seemed to get worse...). It's where even many engineers, techs, and companies would often purchase their components; during that period, the reviews were also far more useful and accurate, as they were written by those same groups.

Regarding the power supply (sometimes referred to as the PSU, for Power Supply Unit): it's sadly one of those components many people seem to think they can skimp on, or that "anything" will work for, as long as they get a bigger wattage than what they have. That ignores other extremely important factors: voltages, stability/reliability, brand name, how "clean" it runs (not in the "green" sense--it's a factor that helps determine its stability), quality of the internal components, number of "rails" and "rail balancing", processor compatibility (which doesn't come into play as much as it used to), video card configuration (if applicable), number of connectors (molex, SLI, SATA, etc.), motherboard compatibility (again, if applicable--some motherboards have been fairly notorious about not playing nicely with some power supplies, sometimes with the motherboard manufacturer being aware of it and not providing a list of which PSUs to avoid), form factor, and so on.

The reality, however, is that the power supply is the one component it's actually hazardous to "skimp" on or just go with "anything" for. It's one of the most important components in the system: if you're not careful, it can cause major performance issues (to say the least...), shutdowns, or in some cases damage or even kill your other hardware (especially if there's an overload). Also, as a power supply ages, its performance degrades as well.

In other words, it's not just a nice, big wattage you want to look for (in some cases--and I've especially noticed this amongst the "hardcore gamer" group--that's going to be overkill and possibly just a waste of money). What you really want to keep in mind is what else is going into that system, and especially the voltage readings on the "rails"! This is especially important with respect to the video/graphics card! There are calculators out there that should provide some idea of how much overall wattage may be needed to run at optimal performance--but take them with a grain of salt, as again, wattage is not the only factor in deciding what to buy.

On a related note, there's a very short list of companies I'd even consider buying power supplies from, because with them I don't have to worry about issues arising and the manufacturing is good (the list sadly shrank this past year...). That's more than can be said for the vast majority out there (to re-emphasize: that's one component you never, ever want to take any chances with!). It's a bit like digital cameras (to paraphrase a professor I had this past semester), where most consumers look at the number of megapixels instead of other, sometimes more important aspects of the camera. The wattage is important, obviously, but there are other important things that need to be considered as well.
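Just to illustrate what those online wattage calculators are doing under the hood (the per-component draws here are made-up example numbers, not real specs, and as said above, wattage alone tells you nothing about rails, voltages, or build quality):

```python
# Toy version of a PSU wattage calculator: sum estimated peak draws and add
# headroom so the supply isn't running flat out. All figures are assumptions
# for illustration only -- check your actual components' specifications.
COMPONENT_DRAW_W = {
    "cpu": 95,            # e.g. a mid-range quad core's rated draw
    "video_card": 70,     # a modest, non-gaming card
    "motherboard": 50,
    "hard_drive": 10,
    "fans_and_misc": 30,
}

def recommended_psu_wattage(components, headroom=0.3):
    """Total estimated draw plus ~30% headroom (a common rule of thumb)."""
    total = sum(COMPONENT_DRAW_W[c] for c in components)
    return round(total * (1 + headroom))

print(recommended_psu_wattage(list(COMPONENT_DRAW_W)))  # 332 (watts)
```

A figure like that is only a starting point; the rail voltages, connector types, and build quality discussed above still decide whether a given unit is actually suitable.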
As far as Dell goes, they have definitely improved compared to how they once were. But one of my concerns with them has historically been their love of using and relying upon components manufactured "in-house". On the surface, that may not sound so bad (and usually I wouldn't take issue with it), except that it extends to the power supply, to the extent that historically, if you opted to upgrade it later, you couldn't go with an "off the shelf" model by another manufacturer without running into issues, as the system was designed to work only with a Dell-manufactured power supply. Sometimes it was strange cabling issues (no longer the case, thankfully--Dell used to include connectors that were exclusive to their systems, which no other manufacturer's power supplies would work with, and which were needed = very annoying if looking to upgrade...). Sometimes they literally fused the power supply to the inside of the case, making the task literally impossible (again, thankfully, a practice they have long discontinued...). And then there are the ongoing issues where, if you replace that Dell power supply with something else, performance issues may arise, along with mostly stability-related problems.

I've encountered more than a few techs who like to brag about having gotten a non-Dell power supply into a Dell. The part they either leave out, or just don't follow up on, is that these same systems inevitably end up experiencing problems of one sort or another because of it. Some even like to deny this--it's not that the computer suddenly became "possessed", it's that it's rejecting the non-Dell power supply, as it was never designed to run anything but Dell's. As terrible as the comparison may sound, it's almost the equivalent of giving a person the wrong blood type, or an organ rejection, or putting the wrong gas/petrol in a car, in terms of how the computer reacts to it. At one point in the Sims 2 era, I remember seeing this with alarming regularity: people not understanding why their computers had suddenly begun experiencing problems, or thinking it odd that when they put the old Dell power supply back in, the computer was suddenly "behaving".

On a related note, one issue that has sometimes arisen with Dell and power supplies (though only sometimes) is when that power supply needs to be replaced for whatever reason (such as a problem). There have been times where an appointment needed to be made, possibly with a trip to a special location, as in those cases they won't allow anyone aside from their own techs to deal with it (for various reasons, including just the warranty). Having said all that, though, they're probably amongst the better prebuilt/commercial builders on the market at the moment.

Regarding the 4GB RAM limit issue with Windows, as well as the links you posted, I'll post about those sometime later today when I get a moment.
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 22, 2011 20:58:58 GMT
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 24, 2011 11:24:40 GMT
The odd thing about power supplies is that, unlike most other types of components, selecting one has actually become a more complicated "science" than it used to be. It's a combination of form factor revisions (which wouldn't impact a new system--that only really applies where a power supply needs to be replaced or upgraded, and depends on the age of the computer relative to when the revisions were released), the existence of different form factors (typically you're not as likely to see this with consumer systems as in more specialized circumstances), and especially advances and changes in the hardware that relies on the power supply--processors and especially video/graphics card technologies--that made it necessary.
Regarding the 4GB limit: it only really applies to 32-bit operating systems, regardless of which one it is (Windows, Linux or any other 'nix, OS X, etc.), as it's a physical limitation tied to the 32-bit architecture itself.
These days, unless you're planning on running a 32-bit edition of Windows (32-bit edition of Windows 7, 32-bit edition of Vista, or older...), it's extremely unlikely you're going to run into this issue given the move toward 64-bit operating systems and software (with the exception of a motherboard having this limit--but as with software, it's unlikely you're going to encounter a motherboard that has a 4GB limit these days).
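The arithmetic behind that 4GB figure is just address width--nothing vendor-specific:

```python
# A 32-bit address can name at most 2**32 distinct bytes -- that IS the 4GB
# limit (and part of that range is reserved for devices, which is why 32-bit
# Windows typically reports somewhat less than 4GB as usable).
print(2 ** 32)               # 4294967296 bytes
print(2 ** 32 / 1024 ** 3)   # 4.0 (GiB)

# 64-bit addressing raises the ceiling so far (16 EiB) that, in practice,
# the limit becomes whatever the motherboard and OS edition support.
print(2 ** 64 / 1024 ** 6)   # 16.0 (EiB)
```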
Regarding HP: historically, you used to pay more for the "brand name" than for the actual hardware they used (this is not as much the case as it once was). One concern I have had with them in the past, however--and I admittedly can't say with absolute certainty whether this has changed, though I haven't seen it as much lately, at least on the desktop end--is that they had a history of motherboard problems (usually a failure or short somewhere that resulted in the board having to be replaced). While I haven't seen this as much with their desktops, I have definitely encountered it with their laptops of late (most recently, with one that was barely a year old, if that...).
The other brand which HP puts out (and which had previously been its own company), the lower-end Compaq, should be avoided (long history of issues, some of which persist...).
Also, in looking at the links you had posted: that particular series from Dell (XPS) is from their entry-level gaming "rigs". It was originally developed in an attempt to compete with the higher-end "boutique" system builders out there, such as Alienware--which Dell now owns. Dell bought Alienware mostly to get better access to AMD, due to their own customer demand for AMD at one point, and to better compete with other companies. Alienware had been aimed mostly at the "hardcore" gaming market, and given the strong demand for AMD in that demographic at the time, Alienware had very good access to AMD processors. Dell, by contrast, had limited access back then, due to their long-standing history as an Intel-exclusive "shop" and an equally long-standing relationship with Intel--Dell had also been one of Intel's biggest customers in the PC market. (Although I really can't emphasize this fact enough: computers of any type, PC or Mac, make up a very tiny percentage of Intel's overall processor and chipset sales--maybe less than 2%, if even that...)
One possible issue that may arise with some software and multiple cores (and all the systems you have listed are multi-core) is the same as with any new hardware: "backwards compatibility" with older software. I mainly mention this with respect to Sims 2--admittedly, I have not been keeping tabs on Sims 2 and quad cores closely enough to fully comment one way or the other, so it is something I will have to look into. (Hexa-core processors are actually fairly new--however, as most software out there hasn't really made full use of quad-core technology, much less even dual-core, hexa-core at this point would seem like overkill for the types of tasks you're running.)
In looking over all four systems so far, the HP Pavilion P7-1038 has the fastest processor of the lot, but the trade-off is that the hard drive seems to be the slowest (only 5400RPM, which seems rather low by current standards; this mainly affects the access speed of the drive). I also have to question why it would be using (or even need) shared memory if it has dedicated video memory. It also has the lowest-end video card of those you have listed.
For the money between those two HPs, the Pavilion P6741F would be the better deal (that is not in comparison to the other systems you have mentioned, however--just between the two HPs).
The Prodigy system I have to admit I'm not familiar with as far as companies go (so I can't comment on their workmanship/care, their warranty situation, or how they handle things like rescue discs, etc.), but it seems to have the slowest processor speed. The RAM vendor is also one that's not very well-known. The case/power supply, however, is a very well-known brand--though the jury's currently out as to whether they're still a good company or not. Their quality assurance/control appears to have gone downhill over the past year; I'll just say that they used to be in the upper echelons of good manufacturers, but are now in the caution zone. Nor are their units as efficient as they should be: typically, anything below 70% efficiency on a power supply isn't that great. Low efficiency can also have a potentially negative impact on hardware, as the lower the percentage, the greater the risk that power may not be sufficiently routed to where it needs to go (or that enough is produced to supply the components that need it). This is another very important factor with power supplies.
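To make the efficiency point concrete (the 300 W load below is an arbitrary example figure, not any of these systems' actual draw):

```python
# A PSU's efficiency is delivered power / power drawn at the wall; the
# difference is shed as heat inside the case. Lower efficiency means more
# wall draw and more heat for exactly the same load.
def wall_draw_watts(delivered_watts, efficiency):
    """Power pulled from the outlet to deliver a given load."""
    return delivered_watts / efficiency

for eff in (0.70, 0.80, 0.90):
    print(f"{eff:.0%} efficient: {wall_draw_watts(300, eff):.0f} W at the wall")
```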
The video card, however, appears to be the fastest of the offerings so far.
The Dell appears to have the most customization options listed, as well as better video/graphics card choices; between the second HP system (the quad core) and the Dell, it comes down to the cost ratios involved for the options and the trade-offs that may be involved. However, between any of the Intel offerings by Dell and the AMD Phenom II 955 that HP is offering, the Intel processors actually outperform that particular AMD processor in the benchmark results.
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 24, 2011 23:32:05 GMT
Thanks sooooo much for helping me with this.
I *think* the Prodigy is a build by the Canada Computers people. Explains why no one has heard of it, LOL
I've pretty much, 99%, decided to go for the Dell. A friend is really pushing me to go for a quad-core processor, but the only one I can find at a decent price is that HP, and after looking into what you said about the Intel vs the Phenom, it looks like EVERYONE is saying to get the Intel even though it's not the i7.
Plus I like that Dell makes the PSU info available, and lots of other computers I've looked at don't, even on extensive searching into the matter.
Do you think it would be worth it for me to spend the $50 more to get the HD 5670?
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 26, 2011 5:34:50 GMT
Most companies tend to be very guarded as a rule when it comes to who manufactured the hardware going into their systems--with laptops, even more so.
In some cases, it's due to contracts they may have in place or not wanting to advertise whom they're using (in some cases, there may be a reason behind it due to quality issues...).
With laptops, the reason there's more secrecy, however, is that it may surprise people to know there are actually very few physical manufacturers of them. Most companies who sell laptops to "end users" (consumers, businesses, etc.) actually choose from a pool of laptop designs made by those same few manufacturers, then rebrand them and maybe make a few modifications or tweaks before reselling them (the companies never advertise this, nor do they generally want it to get out).
There are a few exceptions, where some companies design their own and then outsource production to another manufacturer (usually through an exclusive contract, and never from that same "pool" I mentioned earlier), but the majority these days work from within that same small pool, as it's seen as more cost-effective. Companies have realized that the main selling point of a laptop is portability--not quality--which is also why laptop quality has been in decline, and why some people end up replacing them sooner than they would a desktop, or sooner than they used to...
Also, in looking into Prodigy, it actually appears they're a small American system builder.
With respect to Phenoms, you have to keep in mind that it's a series (like Athlon, Opteron, etc.)--so there could be one core release or generation that's not that great, and then the next release might be better. Unfortunately for AMD, during the particular core release for what you had listed, they were still very much in "catch up" mode to Intel and didn't fare so well (I've seen it described as more like taking a major beating, or even a bloodbath, given how poorly they performed against their Intel counterparts--the benchmarking results came back with Intel beating them by double digits).
However, given what was going on internally within both companies, it wasn't a big surprise. Intel came back fighting after initially being badly defeated in the first round of the 64-bit processor war, while AMD was undergoing a combination of bad management (it would be too kind to say their CEO at the time merely deserved a spot in the top 5 all-time worst and most incompetent CEOs in any industry in history...), the aftermath of the merger with ATI (which led to even further bad management decisions that negatively impacted both companies involved), and resting on their earlier success rather than continuing to innovate (see notes about bad management decisions...).
In AMD's case, they've luckily parted ways with that horrible CEO. I won't get too far into the industry politics, but given his history of mismanagement (he nearly ran another otherwise successful industry giant into the ground, one that's still trying to recover all these years later), many people were shocked by his appointment and concerned about AMD's future. After his ouster from AMD, yet another otherwise "healthy" company unfortunately named him to lead it, and if not for his ongoing problems with the American authorities (namely the SEC, who are investigating him), there was talk of that company going bankrupt because of what he did.
AMD had been trying until fairly recently to recover from the messes he created, including what he did to what had been ATI. To put that in perspective: a few years ago there were ongoing discussions and serious concerns that ATI would simply cease to exist. There was even a running joke said to circulate within AMD at the time (along with a rivalry, bordering on hostility, between AMD and ATI employees) that whenever layoffs were announced, people would ask how many ATI employees were actually still left, since more often than not ATI bore the brunt of the cuts, to the point that very few pre-merger employees remained.
ATI's products began to suffer within a year of the merger. They lost many of their researchers, scientists, and general staff (some to the merger itself, some to post-merger layoffs and bad management decisions), and their budget was reduced to the point where it hurt both the products and the company. Their driver support during that period also left something to be desired, for similar reasons. It led some to question whether AMD had bitten off more than it could handle. With their products lagging so far behind nVidia in performance and other crucial areas, it didn't look good for them on the whole.
That same mismanagement then began affecting their processors in similar fashion: they suddenly found themselves playing catch-up with a leaner, more aggressive Intel, and being beaten (sometimes badly...) by their main rival.
However, one also has to take into account the old commentary about the difference between management people ("suits") and techies (engineers, developers/programmers, techs, researchers, sysadmins, etc.): what looks good on paper to a suit may make little sense to a techie, who often has a far better grasp of the inner workings of the industry (including the various psychologies at work) but a harder time explaining to the suit why things are the way they are.
That merger was definitely a great example of that. AMD's then-CEO felt he "needed" to make his mark on the company (his predecessor was largely responsible for their success, including the hugely significant and successful introduction of the 64-bit processor as we now know it), so he wanted to be remembered for something he felt would boost his standing (some have argued he was bolstering his own ego while he was at it...). Around that same time, rumors began circulating that Intel was working on significant new technology that would effectively fuse their CPU and GPU technologies to increase the performance and efficiency of both. At the time it was a rumor with some credibility, but nothing specific was known. AMD's CEO, knowing they had no graphics chipset business of their own, decided this would be how he marked his legacy, and that for cost reasons the route would be a merger with a company already in that market.
This is where the suits divide from the techies. To the techies, the obvious choice by a long shot was nVidia (as mentioned earlier, it's difficult to fully describe why, but on every level, including both companies' standing and philosophies, it made the most sense). In fact, within a week of the ATI-AMD announcement, I remember running into an engineer at nVidia who told me many of their employees were surprised AMD had opted for ATI instead.
To the suits, ATI made more sense, for reasons I'm about to get into (and which proved nearly fatal for ATI).
Apparently, AMD's CEO had approached nVidia's CEO about it first. The deal fell through not because nVidia carried a higher price tag, but purely because of ego on the part of AMD's CEO. Everyone concerned (the boards of both companies, Wall Street, etc.) felt nVidia's CEO was the better choice to lead the combined company, with the stronger record of performance and leadership; he was just a better CEO, period. AMD's CEO, mindful that his whole reason for doing this was to leave his mark, wouldn't relinquish control and demanded the top spot as the only condition under which he'd agree. nVidia refused, and the negotiations broke down and ended.
He next went to ATI, which carried a lower price tag and whose CEO was agreeable to the demand. For ATI, the deal also solved an ongoing problem: better access to fab facilities and to the suppliers AMD had. ATI had a history of not being able to produce enough to keep up with demand, sometimes due to supplier issues or shortages of key resources, which made their cards hard to find at retail and was hurting them.
Anyway, getting back to things: since that CEO left, the current CEO of AMD (who so far appears much better than his predecessor...) has been trying to undo the mess and put the company back on track. The result has been having to catch up in both markets, versus being the leader, as AMD was prior to the merger.
That said, they have gotten better with some recent cores on the processor end (please excuse my almost-"kitchen sync" explanation there), and on the graphics side they've improved by leaps and bounds over the past few years, to the point where they're roughly back to their pre-merger position in that market. Price usually comes into play as well: about a year ago, when AMD released its first-generation hexacores, the price tag was about $300 US (roughly 212 euros). In contrast, Intel's six-month-old quad-core i7s during that same period ran about $1000 US (roughly 706 euros).
That's often the trade-off between the two: Intel has sometimes had the edge, but AMD has historically been cheaper and has sometimes done better with backwards compatibility where hardware is concerned, for example keeping new multi-core processors working on existing motherboard socket types, whereas Intel has a history of releasing new chipsets and sockets alongside new cores.
Sadly, though, with that generation of Phenom quad core you had listed, it didn't even take an i7 to outpace it; even the i5 cores ripped it to shreds in performance.
As far as graphics cards and performance go, I have to admit I'm a bit confused by your question: upgrading from what card to the HD 5670?
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 26, 2011 18:49:43 GMT
Oh sorry... the Dell comes with the HD 6450 at the $699 price tag, but for $50 more you can opt for a 5670. I know the 6450 is from their newer series, but the 5670 should still outperform it, right? I read that anything below a 5 in the second digit signifies a more "budget" graphics card.
I guess I am pretty naive when it comes to business. I figure business is... well... business! That the only thing that really comes into play is making money. But man, you make it sound like an absolute soap opera! There's so much politics and drama.
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 27, 2011 4:50:45 GMT
You'll have to excuse my having gone into all that. I used to joke with a friend that if I ever wrote a book about even a fraction of what has gone on or is happening in the industry, it would have to be sold by the pound, and it would be even larger if you went into the history of things... For better or worse, it's impossible to separate business from industry politics in any aspect of this industry. What seems logical from a business perspective (and looks like it would simply earn a company money) may not line up with the politics, which often play a larger role in whether something succeeds or fails, and by how much. It's a slightly different "beast," if you'll excuse the expression, as far as how things operate. But yes, you're right: it's like one big soap opera or "drama." I mainly mentioned it to put the current standings into perspective; I usually try to leave all that out, and probably should have this time as well.

As for those two cards: while the lower numbers in a series (usually 5 and below) typically do denote "budget" (this applies to nVidia as well as AMD), there are trade-offs between them. The 5670 requires almost twice the power of the 6450, but from the benchmark results I've seen, it still greatly outperforms even the newer 6450. The exact figures vary depending on which benchmarks were used and how they were run, but the gap between the two has consistently been significant.
So, to answer your question, if it's only about $50 more, you may want to look into the 5670 over the 6450 given the performance gap alone (from the last time I checked, that price is also less than what that card is even retailing for).
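If it helps to picture that numbering rule, here's a rough sketch in Python. The segment cutoffs below are my own loose reading of AMD's Radeon HD lineup at the time, not anything official from AMD:

```python
def radeon_segment(model: int) -> str:
    """Classify a Radeon HD model number (e.g. 5670) by its second digit.

    First digit = generation/series; second digit = market segment.
    Cutoffs are a rough, unofficial reading of the lineup:
    4 and below = budget, 5-6 = mainstream, 7-9 = performance/enthusiast.
    """
    second = (model // 100) % 10  # e.g. 5670 -> 6, 6450 -> 4
    if second <= 4:
        return "budget"
    elif second <= 6:
        return "mainstream"
    else:
        return "performance/enthusiast"

# A newer series (higher first digit) doesn't guarantee more speed:
# the 6450 sits in the budget tier while the older 5670 is a segment up.
for m in (6450, 5670):
    print(m, radeon_segment(m))
```

Which is basically why, despite the 6450 being the newer card, the 5670 is still the stronger of the two.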
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jun 27, 2011 15:43:54 GMT
Excellent, that's the answer I was hoping for, LOL
Thanks again for all your help!
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jun 27, 2011 20:40:53 GMT
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jul 9, 2011 14:41:29 GMT
So, I sucked it up and actually bought the Dell that's a step up from what I was looking at... an i7 instead of an i5, and 8GB of RAM instead of 6GB. But at a $100 difference I feel I'm getting a really good deal. I just got my email from Dell yesterday saying it's in the process of shipping!!!!! Now to play the waiting game.
|
|
CharlieChomper
Long Term Member
Totally Technical Helper
Please call me CharlieChomper (or even CC or "the other CC" on this forum). Thanks!
Posts: 1,756
|
Post by CharlieChomper on Jul 10, 2011 21:01:14 GMT
Congratulations!
|
|
Moon
Long Term Member
Crazy Dog Lady
Sexayness via Steph
Posts: 1,356
|
Post by Moon on Jul 13, 2011 0:56:24 GMT
Oh yeah baby, I am on it right now! Everything is working beautifully thus far. I just got it all fired up so no installing sims yet, LOL.. but I am so excited!
|
|