Actually, your laptop doesn't... not enough VRAM... it's a VRAM [I]whore[/I] :)
I'm surprised your desktop manages it. Then again... this game is pissing me off... if there's one thing I hate it's 'planned obsolescence': being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what amounts to a newer, supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...
So it should at least not crash to the desktop. I could understand having to turn down the video settings, but to just crash?
As for my desktop, it not only runs it, but runs it well with all graphics options set to high (except for DX10 support).
I do wish I'd sprung the extra $100 for the 8600 256 meg card for my laptop; wonder if it's user-replaceable...
Biggles | The Man Without a Face
They are usually a separate component, but they are often fairly solidly attached to the motherboard (due to the needs of the cooling system), and only accessible by pulling the entire laptop apart to get at the motherboard.
[QUOTE=shadow boxer;164260]I'm surprised your desktop manages it. Then again... this game is pissing me off... if there's one thing I hate it's 'planned obsolescence': being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what amounts to a newer, supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...[/QUOTE]
How is developers making better-looking games that require more computing power "planned obsolescence"? Vista has nothing to do with it, the game runs fine on XP. Your hardware is just too old. It's like demanding that SimCity 2000 run on a 286, since that's a CPU too, just like a 386 is a CPU.
[QUOTE=shadow boxer;164260]Actually, your laptop doesn't... not enough VRAM... it's a VRAM [I]whore[/I] :)
I'm surprised your desktop manages it. Then again... this game is pissing me off... if there's one thing I hate it's 'planned obsolescence': being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what amounts to a newer, supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...[/QUOTE]
SB, I normally tolerate your rants without comment, but in this case you're A. wrong and B. being stupid.
Yeah, the basic engine for the game is the Unreal Engine, but they added a lot of SM 3.0 effects, and they had to do that to get the pretty effects in without it playing at 5 fps. YOUR VIDEO CARD WILL NOT PLAY THE GAME JUST FINE AS IT STANDS. GET OVER IT.
With your opinion you should just go back to playing freaking Pong, 'cause that's what you're basically arguing for.
I'm fully aware that my computer, built using technology from 2002 and 2004, is in no way capable of playing a game made for top of the line computers today. Hell, I can still barely play Supreme Commander, but I know that it isn't the fault of developers for attempting to intentionally obsolete my hardware. The gear I run is ancient, and as such I expect better looking games to [B]require[/B] better hardware to play.
A 9800 drives over 90% of the game's engine without a problem; in fact it drives it very well... only the few extra farkles from SM 3.0 cause the issues.
That, is not fair.
That's like denying me use of the latest car because I have 9 fingers instead of ten...
I like devices which have a lifespan greater than 3 or 4 years.
Sure, advance stuff as much as you like, and I'll also concede computers are at the bleeding edge of hyper-fast development and innovation, but that's no excuse for elitism and what really amounts to [I]waste[/I].
How many gazillions of computer waste get chucked out every year? That's a cubic assload of glass, plastic, precious metals and highly refined materials being sent (hopefully) to recyclers because of what? Basically because we want the latest and greatest and we want to keep up with the cyber-Joneses. It's bullshit.
It's possible to design things to last longer than they currently do. It's a conscious choice which nobody seems to want to entertain.
Perhaps in this instance it needs to start with a mobo with a truly massive and wide system bus and super-wide architecture, which can cope with a decade of Moore's Law. Same with some other components.
We shouldn't be required to upgrade as often as we do.
What's wrong with a video card with swappable cores and RAM?
Probably not much... aside from not being as profitable...
As a root cause for a good deal of the world's ills, [B]greed[/B] is my first choice.
Yes, I read the link. Yes, it ran. But it didn't run well. Your analogy is flawed. It is more like having a V8 but only running it with seven spark plugs and a cog or three missing from the transmission. It might work, but it'll run like shit. The screenshots of the shader hack illustrate exactly that. Without utilizing SM3.0, the game would require vastly more horsepower to achieve the look and feel the developers want, which would have obsoleted your card all the same.
Hell, you are already contradicting yourself. The Radeon 9800 is a very old card by any measure. The R300 core was first put onto the market in early August of 2002 with the 9500 and 9700 lines, and the 9800 line was released back in [B]May[/B] of 2003, which is itself already [B]longer ago[/B] than the "3 or 4 years" you wished for in a product. Even counting the later iterations of the 9800, you still get a card on the high side of three years old. Stop playing dumb, deaf, and blind and just research this information. It's all over the internet.
As of now, my 9800 Pro is second-hand and runs 100% fine, no hassles whatsoever. My particular card is fairly young, but the design is at least three years old, likely a full four.
As far as recycling goes, computer enthusiasts don't typically just throw their parts away. An oddly high number of them collect their parts and re-use or re-sell as necessary. Hand-me-downs are very common in this field. It's not like I'll just throw my 9800 in the trash when I get a newer card. That aside, stop skirting around the topic by adding environmentalism to the mix. That has nothing to do with this.
We aren't [B]required[/B] to upgrade, either. You aren't mandated by law to have the highest level system possible by modern measure every month of every year. Hell, this system was first built in December of 2002, and the only changes over its entire run have been the DVD and hard drives (added both), the Video Card (onboard to GeForce Ti500 to Radeon 9800 Pro), and the Processor (2200+ to 2800+). Wow, how wasteful I am!
Hell, I'm not even going to touch the "obsolescence-proof motherboard." I'll let Biggles tackle that can of worms with much more knowledge [B]and[/B] humor.
And as far as a modular setup goes, doesn't that achieve [B]exactly[/B] the same results as upgrading the card? Congratulations, you just replaced the two most costly components of the video card at once!
You need to get over the fact that your card is obsolete. I knew that when I first got it, and it doesn't bother me one bit. Time flies while you're still breathing, and in the computer world things get obsoleted. It's just how it goes. Deal with it, or at least try to learn *why* what you are saying sounds so ridiculous.
Biggles | The Man Without a Face
[QUOTE=shadow boxer;164279]Did you guys read the link Ahash posted?
I suggest you do.
A 9800 drives over 90% of the game's engine without a problem; in fact it drives it very well... only the few extra farkles from SM 3.0 cause the issues.[/quote]
What if the artists and designers felt those "few extra farkles from SM 3.0" were necessary to accurately convey their vision to the player? Why should they have to compromise their design and vision of the game [i]they[/i] are making so that people with quite old hardware can play it?
[quote]That, is not fair.
That's like denying me use of the latest car because I have 9 fingers instead of ten...[/quote]
No, that would be if someone designed a keyboard with ten buttons in perfect position for use by each finger, which then made it hard to use for people who lack the sufficient quantity of fingers. And I wouldn't blame such a designer for not ensuring that people with fewer fingers can use the keyboard.
[quote]I like devices which have a lifespan greater than 3 or 4 years.[/quote]
Your card is still alive. It still runs. It still drives games from its time. There's no reasonable basis for demanding that it drive games from [i]this[/i] time that are designed for [i]this[/i] time's hardware.
[quote]Sure, advance stuff as much as you like, and I'll also concede computers are at the bleeding edge of hyper-fast development and innovation, but that's no excuse for elitism and what really amounts to [I]waste[/I].
How many gazillions of computer waste get chucked out every year? That's a cubic assload of glass, plastic, precious metals and highly refined materials being sent (hopefully) to recyclers because of what? Basically because we want the latest and greatest and we want to keep up with the cyber-Joneses. It's bullshit.[/quote]
It's not elitism; it's technological advancement, and taking advantage of that advancement without hampering the design by having to take into account old hardware that lacks features the developers feel they require.
[quote]It's possible to design things to last longer than they currently do. It's a conscious choice which nobody seems to want to entertain.
Perhaps in this instance it needs to start with a mobo with a truly massive and wide system bus and super-wide architecture, which can cope with a decade of Moore's Law. Same with some other components.[/quote]
The only way to achieve what you suggest would be to hold back the development of the components that slot into such a motherboard. It's not just the bus width that changes, nor is it just the speed. About the only thing that doesn't change is that it all gets done by sending electrical signals along bits of copper. It's simply not possible to predict what the requirements will be in ten years' time. Even the bit about electrical signals and copper is likely to change soon enough.
[quote]We shouldn't be required to upgrade as often as we do.[/quote]
I don't remember hearing about any laws being passed making it a requirement that people upgrade their computers. Nobody is forcing you to upgrade.
[quote]What's wrong with a video card with swappable cores and RAM?
Probably not much... aside from not being as profitable...[/quote]
That's exactly what a video card [i]is[/i]. Apart from the core and the RAM (which are very tightly linked to the point of being a single design), the rest of the card is some capacitors and resistors to ensure a smooth power supply and maybe an external display signal controller and external bus controller.
[quote]As a root cause for a good deal of the world's ills, [B]greed[/B] is my first choice.[/QUOTE]
I hope you're not trying to imply that companies are greedy for producing newer products that are better than their previous ones.
The bottom line is that your card is too old. If you don't like the fact that it can't play the latest games, I would suggest you go and buy an Xbox 360 or a PS3. They'll still be making games for those in 5 years. But then you'd probably complain about the Xbox 720 or the PS4 coming out and making your console obsolete, or that PS4 games don't work on your PS3.
[I]The term "futureshock" refers to a psychological state having to do with informational overload most easily defined as too much change in too little time. It is invariably tied to technological paradigm shifts and can be applied to individuals as well as societies.[/I]
Read some William Gibson.
Then look at the sublimely ironic situation we have here...
It's a game whose major theme is one of [I]technology gone mad...[/I]
Why do you think its name is 'BioShock'...
And why do laptop makers refuse to use a modular design? At least make it so those of us with technical knowledge can swap out some major components easily.
Forget the modular design; external PCI Express is much better. Asus is developing [URL="http://www.asus.com/news_show.aspx?id=5369"]this[/URL]. Too bad it's only PCI-Express 1x, but the new PC Card standard that is coming will support 16x PCI-Express.
Well SB... it's also good to note that transistor counts (and, very roughly, the speed of technology) double about every 24 months
(See [url=http://en.wikipedia.org/wiki/Moore's_law]Moore's Law[/url]).
Given that, a 6-year-old card is about 1/8 the speed of the new cards (roughly, not exactly).
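That halving arithmetic is easy to sanity-check. A quick sketch (the 24-month doubling period is the rough rule of thumb above, not a precise law):

```python
def relative_speed(age_months, doubling_months=24):
    """Rough rule-of-thumb estimate: the fraction of current performance
    to expect from hardware that is age_months old, assuming performance
    doubles every doubling_months."""
    return 0.5 ** (age_months / doubling_months)

# A six-year-old card sits three doubling periods behind:
print(relative_speed(72))  # 0.125, i.e. about 1/8 of a brand-new card
```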
Granted, the 9800 is a wonderful card, as was the x800 series, but frankly, they are outdated. Why ask the developers not to use the tools made available to them? Most avid gamers are pressuring developers to make more powerful games with better graphics. Shader Model 3.0 has been out for what, 3 years now? As an example, DX10 has been out less than a year, and at several gaming forums people are upset that no game currently uses the DX10 codepath to its fullest advantage.
Now, I will grant you that making a game backwards compatible is a WONDERFUL goal, and companies such as Epic Games and Valve are AWESOME at doing that. However, most development teams are under pressure to release the game as fast as possible, and making a game backwards compatible is lower on the list than bugfixes. Rewriting shaders to use older hardware when newer isn't available is a LOT of work. Development teams must pick a target system (generally what most of their core audience owns) and code for that, taking into account that a number of players will upgrade as development progresses.
Cheap answer: go buy an x1950 for $150 (PCIe) and slap it in; be happy.
(And a good dual-core chip for about $150 would work as well, I suppose... :D)
RubberEagle | What's a rubber eagle used for, anyway?
I'm actually with SB on this one, at least partly (no, I too think the 9800 is too old). But the release cycle of new graphics cards has just gotten ridiculous over the last few years. For example, I bought an x1800 last year for ~400€. A few months ago I got an 8600 GT (for my media center) for ~100€ that has roughly the same power.
(warning: crazy conspiracy theory coming up)
Doesn't it seem odd to anyone else that a publisher would not "encourage" a developer to widen the target audience as far as possible? Why not include a bare-bones mode that runs on older machines? The lost atmosphere is of less interest to the publisher than the fact that more people would be encouraged to buy the game, since they could actually play it.
Unless there was some other incentive for the publisher not to do that. After all, the game is part of nVidia's "The way it's meant to be played" campaign, and nVidia's biggest competitor started supporting SM3.0 one card generation later, thus shutting out a lot of people without nVidia cards...
If SB is using a Radeon 9800, I bet he doesn't have a PCIe slot, but there are still good cards out there for AGP. Also remember that not everyone will buy a GeForce 8800 GTX just to play a game. That's a high-end video card, and most people buy mainstream video cards like the GeForce 7600 series. The raw performance of the X800XT PE still takes on the GeForce 8600GT in many games. The only reason it's starting to get outdated is that ATI didn't put SM3 in it, while nVidia did put SM3 into the GeForce 6600-6800 cards.
But I'm still all for game developers doing whatever they want with their games, no one is forcing you to buy it.
RubberEagle: I don't know if nVidia told them to use SM3.0 or not, but I do know that nVidia and ATI go to great lengths to get developers to design games so they work better on their cards.
Well, ditto, it's a hardware-heavy game. I should be at the forefront of the nagging, since I played SS2 through seven times and still love it to this day, and I boast a rig that was built in 2003. And that's like needing a V8 to run the game while having a dumbed-down Vespa.
There's a formula to how you handle BioShock:
Have rig: Play it.
Don't have rig: Play WoW.
It's [b]that[/b] simple. :D Now I'm sure I'll get meself around to buying a new rig in a year or so, and by that time all this rootkit crap and unnecessary jumping through activation hoops will probably be over with, and you can pick the game up for roughly a third of its original price. Me like. :)
Nice rig, but if you're buying it mostly for gaming, then I would switch the Core 2 Duo e6400 for a Core 2 Duo e4400 and use the $75 you'd save to get a better video card, like the GeForce 8600GTS or ATI x1950pro.
I'd actually suggest sticking with the e6400, or possibly the e6600 for $20 more if you're not buying immediately. The extra power is worth the small cost increase. Another option: the e6750 is faster [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16819115029]and on sale right now for $204.99[/url], a far better deal than your current item. My suggestion: [B]buy the processor while the sale is still going on[/b] and settle the rest later!
Also, if you're going with any of these Core 2 Duo units, you'll be perfectly fine with the OEM CPU cooler. The systems idle just fine and dandy with it. My roommate's e6600 idles in the mid-to-high 30s and loads in the 40s. I imagine the e6550 would run slightly cooler and the e6750 just a tad warmer. Save the $35 and skip it. Or better yet, put it into a good-quality power supply. Name brand, preferably. [URL="http://www.newegg.com/Product/Product.aspx?Item=N82E16817371005"]Such as this example.[/URL] It's not 550 watts, but your system doesn't sound like it is going to require more than 400 in the worst possible situation, so this seems like a safe bet.
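For what it's worth, that 400-watt call can be sanity-checked with a back-of-the-envelope power budget. The figures below are illustrative assumptions for a mid-range build of this era, not measured values or manufacturer specs:

```python
# Rough worst-case draw estimates in watts; every figure is an assumption.
draws = {
    "cpu": 65,          # Core 2 Duo class TDP
    "video_card": 75,   # mid-range card under load
    "motherboard": 30,
    "ram": 10,
    "drives": 30,       # hard drive plus optical drive
    "fans_misc": 20,
}

total = sum(draws.values())
print(f"estimated load: {total} W, headroom on a 400 W unit: {400 - total} W")
```

The point isn't the exact numbers; it's that a build like this sits well under the 400 W mark even with pessimistic per-component estimates.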
For the video card, [URL="http://www.newegg.com/Product/Product.aspx?Item=N82E16814150230"]here's an EVGA 8600gt[/URL] with a $20 mail in rebate, taking it down to $100 flat.
The difference between the e4400 and e6400 in games is too small, in my opinion, to justify taking the e6400 over a better video card. Now, if you could get a GeForce 8800 GTS 320MB, that'd be awesome; they are around the $290 price point.
While some games may not fully utilize it, the benefits of the faster FSB and the slightly higher clock will add up in the future. It's just a few dollars now for a little extra insurance. Certainly not like stepping up to a Core 2 Extreme or anything along those lines.
This is also why I suggested the other Conroe-based chips, especially given their fantastic retail price. On top of that, those I suggested come with an additional 2MB of L2 cache and put out less heat than the Allendale cores.
At the present, an 8600GT is perfectly reasonable as a graphics card. A notable step up over the 7600, but not a significant leap in budget as with the 8800GTS. On top of that, getting a card like that without getting a processor capable of handling data fast enough is wasted money.
Well, looking at both aspects of this argument, this is how I see it.
1) Yes, it's unfair that we have to keep upgrading to keep up with the newer games.
2) It's true we're not forced to upgrade; we're just forced to look for compatible software when the only shop you've got is the local Wal-Mart.
3) I have 4 systems, ranging from an Athlon 650 to an XP 1700+, a Celeron 1.5GHz, and my current P4 3.2GHz. I know the tragedy some of us go through, especially if you want to keep your favorite older systems going, but there is nothing we can do except deal with it.
4) I still have a hankering for the Voodoo series, my favorite, but I have to keep that to the oldest system, which is gathering dustballs.
5) And playing my favorite games requires me to keep my older 650, and there are some I still can't even play on that.
Otherwise... yes, it is unfair, "but that's life"; yes, there are great ideas that could make things better, "but companies want money, not user needs"; yes, things are getting bad because things are more expensive and we are making less little by little, "but you're forced to pay it anyway". I personally don't know how a depression hasn't hit yet, because people have completely spent hellacious amounts of money they don't have. I don't know the exact percentage, but it was something like 75-85% of Americans who spent more than they made last year. This year is probably worse. I know everyone's pain; I know how this junk is going on. But we all just have to deal with everything the best way we can. Hell, all of the work I've done in the past is ignored because it isn't eye candy, but it can outperform anything made.
Sorry, but this old fossil is trying (and I'm sorry if it isn't trying hard enough) to spread some wisdom. We can't convince the world to change, but we can convince ourselves to change. (OMG, that came from me???)
I can see the change from the e6400 to the e6600 or e6750, but there's no reason to go from the e4400 to the e6400 when the price difference is $80 and there's almost no gain. And the difference between a 2MB L2 cache and a 4MB one is like 1 fps in games.
The fun part is that I recommend overclocking. The Core 2 Duo is no fun without it. I got an e4300 overclocked to 3GHz and a GF8800GTS. I really do love to play games with AA and AF turned on, so that's why I'm recommending a GF8800GTS or Radeon x1950XT if you think you'll never need DX10.
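The arithmetic behind an overclock like that is just the CPU multiplier times the front-side-bus clock. A sketch (the e4300's 9x multiplier and 200 MHz stock FSB are the commonly cited figures; treat them as assumptions):

```python
def core_clock_mhz(fsb_mhz, multiplier):
    # Core clock = FSB frequency x (locked) CPU multiplier.
    return fsb_mhz * multiplier

stock = core_clock_mhz(200, 9)  # 1800 MHz: the e4300's stock 1.8 GHz
oc = core_clock_mhz(333, 9)     # FSB raised to 333 MHz: 2997 MHz, ~3 GHz
print(stock, oc)
```

Since the multiplier is locked on these chips, raising the FSB is the only knob, which is why motherboard and RAM quality matter so much for this kind of overclock.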
It's funny how this discussion about upgrading keeps coming back, and it's the same story every time a new "game of the year" hits the shelves.
I believe Oblivion was the latest one, Half-Life 2 before that, Doom 3 before that... I got GRAW a couple weeks ago and it's barely playable on a Core 2 Duo with 2GB of RAM, and forget about playing GRAW 2; it probably won't even start up.
Anyway, yes, it's frustrating and no, there's nothing you can do about it, other than switching to a video console of course.
I'm surprised nobody's been complaining about Vista yet, because that's another suspicious reason to upgrade. Well, XP still has a lot of life in it, so there's no urgent reason to upgrade yet.
I agree. I'm still a bit puzzled as to why there isn't something like a generic laptop standard, like home-built desktops, where you can buy your own parts and assemble the machine yourself.
The conspiracy revolving around vid card requirements is also that publishers/devs get some money from vid card manufacturers.
(Remember nVidia's "The way it's meant to be played" and ATI's "Get in the Game" campaigns?)
Of course, that only works when consumers have a demand for better graphics... which a large part do, for some reason... :D
[QUOTE=Stingray;164359]
I believe Oblivion was the latest one, Half Life 2 before that, Doom 3.... I got GRAW a couple weeks ago and it's barely playable on a Core 2 Duo with 2GB of RAM and forget about playing GRAW2, it probably won't even start up.[/QUOTE]
You can't tell the performance of a computer for games just by looking at the processor and memory. If your system is a Core 2 Duo with 2GB of RAM but with a GeForce 3, then I can see why it can't play GRAW. Or you're playing at some insane resolution with 16x AA and AF.
And Core 2 Duos are meant to be overclocked. One of the big reasons a lot of people got excited about the C2D when it was released was that you could get a 50% OC with the stock cooler.
PS: here is a link to a review comparing different video cards and how they perform with different processors. [url]http://www.legionhardware.com/document.php?id=672&p=3[/url]
[QUOTE=C_Mon;164361]You can't tell the performance of a computer for games just by looking at the processor and memory. If your system is a Core 2 Duo with 2GB of RAM but with a GeForce 3, then I can see why it can't play GRAW.[/QUOTE]
I forgot to mention, it's a Dell Precision M65, a business type of laptop, so there's not much choice in video hardware, but it still has an above-average GPU with 256MB of video RAM. You can't buy it anymore; a sign of its obsolescence, I guess.
But it runs sweet on my desktop PC (Core 2 Duo E6600 2.4GHz, 2.5 gigs RAM, GeForce 7600 128 megs RAM).
Now I'm just debating buying the game or not. It's visually stunning, and creepy as hell.
A little miffed that it won't run on my laptop, as it meets specs.
I'm surprised your desktop manages it. Then again... this game is pissing me off... if it's one thing I hate it 'planned obsolescence', being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what ammounts to a newer supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...
So it should at least not crash to the desktop, i could understand having to turn down the video settings, but to just crash?
As for my desktop, it not only runs it, but runs it well with all graphics options set on high (except for DX10 support)
I do wish i'd sprung the extra $100 for the 8600 256 meg card for my laptop, wonder if its user replacable...
[QUOTE=shadow boxer;164260]I'm surprised your desktop manages it. Then again... this game is pissing me off... if it's one thing I hate it 'planned obsolescence', being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what ammounts to a newer supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...[/QUOTE]
How is developers making better-looking games that require more computing power "planned obsolescence"? Vista has nothing to do with it, the game runs fine on XP. Your hardware is just too old. It's like demanding that SimCity 2000 run on a 286, since that's a CPU too, just like a 386 is a CPU.
I'm surprised your desktop manages it. Then again... this game is pissing me off... if it's one thing I hate it 'planned obsolescence', being forced to upgrade when I know for a fact my video card would manage just fine with the game as it stands, provided they didn't adopt what ammounts to a newer supposedly sexier codec... fucking window dressing on a 'Buy Vista Bitches' drive...[/QUOTE]
SB I normally tolerate your rants without comment, but in this case your A. Wrong and B. being stupid
Yeah the basic engine for the game is the Unreal engine, but they added alot of SM 3.0 effects, and they had to do that to get the pretty effects in, without it playing at 5 fps. YOUR VIDEO CARD WILL NOT PLAY THE GAME JUST FINE AS IT STANDS GET OVER IT.
With your opinion you should just go back to playing freaking Pong, cause thats what your basicly arguing for.
I suggest you do.
A 9800 drives over 90% of the games engine without a problem, in fact it drives it very well... only the few extra farkles from SM 3.0 cause the issues.
That, is not fair.
That's like denying me use of the lastest car because I have 9 fingers instead of ten...
I like devices which have a lifespan greater than 3 or 4 years.
Sure, advance stuff as much as you like and I'll also concede computers are at the bleeding edge of hyper-fast development and innovation but that's no excuse for elitism and what really ammounts to [I]waste[/I].
How many gazillions of computer waste gets chucked out every year ? That's a cubic assload of glass, plastic, precious metals and highly refined materials being sent (hopefully) to recyclers because of what ? Basically because we want the latest and greatest and we want to keep up with the cyber-joneses. It's bullshit.
It's possible to design things to last for longer than they currently do. it's a conscious choice which nobody seems to want to entertain.
Perhaps in this instance it needs to start with a Mobo with a truly massive and wide system bus and super wide architecture, which can cope with a decade of Moores Law. Same with some other components.
We shouldn't be required to upgrade as often as we do.
Whats wrong with a video card with swappable cores and ram ?
Probably not much.. aside from not being as profitable...
As a root cause for a good deal of the worlds ills, [B]greed[/B] is my first choice.
Hell, you are already contradicting yourself. The Radeon 9800 is an a very old card by any measure. The R300 core was first put onto the market in early August of 2002 with the 9500 and 9700 lines, and the 9800 line was released back in [B]May[/B] of 2003, which itself [B]far longer[/B] than the "3 or 4 years" you wished for in a product. Even adding onto it the various iterations of the 9800, you still get on the high-side of three years old. Stop playing dumb, deaf, and blind and just research this information. It's all over the internet.
As of now, my 9800 Pro is second-hand and runs 100% fine, no hassles whatsoever. It is a fairly young card, but it is at least three years old, likely a full four.
As far as recycling goods go, computer enthusiasts don't typically just throw their parts away. An oddly high number of them collect their parts and re-use or re-sell as necessary. hand-me-downs are very common in this field. It's not like I'll just throw my 9800 in the trash when I get a newer card. That aside, stop skirting around the topic by adding environmentalism to the mix. That has nothing to do with this.
We aren't [B]required[/B] to upgrade, either. You aren't mandated by law to have the highest level system possible by modern measure every month of every year. Hell, this system was first built in December of 2002, and the only changes over its entire run have been the DVD and hard drives (added both), the Video Card (onboard to GeForce Ti500 to Radeon 9800 Pro), and the Processor (2200+ to 2800+). Wow, how wasteful I am!
Hell, I'm not even going to touch the "obsolescence-proof motherboard." I'll let Biggles tackle that can of worms with much more knowledge [B]and[/B] humor.
And as far as a modular setup goes, doesn't that achieve [B]exactly[/B] the same results as upgrading the card? Congratulations, you just replaced the two most costly components of the video card at once!
You need to get over the fact that your card is obsolete. I knew that when I first got it, and it doesn't bother me one bit. Time flies while you're still breathing, and in the computer world things will get obsoleted. It's just how they go. Deal with it or just at least try to learn *why* what you are saying sounds so ridiculous.
I suggest you do.
A 9800 drives over 90% of the games engine without a problem, in fact it drives it very well... only the few extra farkles from SM 3.0 cause the issues.[/quote]
What if the artists and designers felt those "few extra farkles from SM 3.0" were necessary to accurately convey their vision to the player? Why should they have to compromise their design and vision of the game [i]they[/i] are making so that people with quite old hardware can play it?
[quote]That, is not fair.
That's like denying me use of the lastest car because I have 9 fingers instead of ten...[/quote]
No, that would be if someone designed a keyboard with ten buttons in perfect position for use by each finger, which then made it hard to use by people who lack the sufficient quantity of fingers. And I wouldn't blame such a designer for not ensuring that people with less fingers can use the keyboard.
[quote]I like devices which have a lifespan greater than 3 or 4 years.[/quote]
Your card is still alive. It still runs. It still drives games from its time. There's no reasonable reason to demand that it drives games from [i]this[/i] time that are designed for [i]this[/i] time's hardware.
[quote]Sure, advance stuff as much as you like and I'll also concede computers are at the bleeding edge of hyper-fast development and innovation but that's no excuse for elitism and what really ammounts to [I]waste[/I].
How many gazillions of computer waste gets chucked out every year ? That's a cubic assload of glass, plastic, precious metals and highly refined materials being sent (hopefully) to recyclers because of what ? Basically because we want the latest and greatest and we want to keep up with the cyber-joneses. It's bullshit.[/quote]
It's not elitism; it's technological advance, and taking advantage of that advance without hampering the design by forcing them to accommodate old hardware that lacks features they feel they require.
[quote]It's possible to design things to last longer than they currently do. It's a conscious choice which nobody seems to want to entertain.
Perhaps in this instance it needs to start with a mobo with a truly massive and wide system bus and super-wide architecture, which can cope with a decade of Moore's Law. Same with some other components.[/quote]
The only way to achieve what you suggest would be to hold back the development of the components that slot into such a motherboard. It's not just the bus width that changes, nor is it just the speed. About the only thing that doesn't change is that it all gets done by sending electrical signals along bits of copper. It's simply not possible to predict what the requirements will be in ten years' time. Even the bit about electrical signals and copper is likely to change soon enough.
[quote]We shouldn't be required to upgrade as often as we do.[/quote]
I don't remember hearing about any laws being passed making it a requirement that people upgrade their computers. Nobody is forcing you to upgrade.
[quote]Whats wrong with a video card with swappable cores and ram ?
Probably not much.. aside from not being as profitable...[/quote]
That's exactly what a video card [i]is[/i]. Apart from the core and the RAM (which are very tightly linked to the point of being a single design), the rest of the card is some capacitors and resistors to ensure a smooth power supply and maybe an external display signal controller and external bus controller.
[quote]As a root cause for a good deal of the worlds ills, [B]greed[/B] is my first choice.[/QUOTE]
I hope you're not trying to imply that companies are greedy for producing newer products that are better than their previous ones.
The bottom line is that your card is too old. If you don't like the fact that it can't play the latest games, I would suggest you go and buy an xbox 360 or a ps3. They'll still be making games for those in 5 years. But then you'd probably complain about the xbox 720 or the ps4 coming out and making your console obsolete, or that ps4 games don't work on your ps3.
Read some William Gibson.
Then look at the sublimely ironic situation we have here...
It's a game, and the major theme therein is [I]technology gone mad...[/I]
Why do you think its name is 'Bioshock'..........
(See [url=http://en.wikipedia.org/wiki/Moore's_law]Moore's Law[/url])
Given that, a six-year-old card is 1/8 the speed of the new cards (roughly, not exactly).
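That 1/8 figure falls straight out of the doubling rule. Here's a back-of-the-envelope sketch in Python; the doubling period is the usual 18-24 month rule-of-thumb assumption, nothing exact:

```python
# Moore's-law-style speedup estimate: performance roughly doubles
# every ~2 years, so an N-year-old card trails the newest parts
# by a factor of about 2 ** (N / doubling_period).

def speed_gap(age_years, doubling_years=2.0):
    """Rough factor by which an old card trails a current one."""
    return 2 ** (age_years / doubling_years)

print(speed_gap(6))       # 8.0  -> the "1/8 the speed" figure above
print(speed_gap(6, 1.5))  # 16.0 -> with the aggressive 18-month doubling
```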
Granted, the 9800 is a wonderful card, as was the x800 series, but frankly, they are outdated. Why ask the developers not to use the tools made available to them? Most avid gamers are pressuring developers to make more powerful games with better graphics. Shader Model 3.0 has been out for what, three years now? As an example, DX10 has been out less than a year, and on several gaming forums people are upset that no game currently uses the DX10 codepath to its fullest advantage.
Now, I will grant you that making a game backwards compatible is a WONDERFUL goal, and companies such as Epic Games and Valve are AWESOME at doing that. However, most development teams are being pressured to release the game as fast as possible, and making a game backwards compatible is lower on the list than bugfixes. Rewriting shaders to use older hardware when newer isn't available is A LOT of work. Development teams must pick a target system (generally what most of their core audience owns) and code for that, taking into account that a number of players will be upgrading as the development period progresses.
Cheap answer: go buy an x1950 for $150 (PCIe) and slap it in, be happy.
And a good dual-core chip for about $150 would work as well, I suppose... :D
(warning: crazy conspiracy theory coming up)
Doesn't it seem odd to anyone else that a publisher would not "encourage" a developer to widen the target audience as far as possible? Why not include a bare-bones mode that runs on older machines? The lost atmosphere matters less to the publisher than the fact that more people would be encouraged to buy the game, since they could actually play it.
Unless there was some other incentive for the publisher not to do that. After all, the game is part of the nVidia "The way it's meant to be played" campaign, and nVidia's biggest competitor started supporting SM3.0 one card generation later, thus shutting out a lot of people without nVidia cards...
But I'm still all for game developers doing whatever they want with their games, no one is forcing you to buy it.
RubberEagle: I don't know if nVidia told them to use SM3.0 or not, but I do know that nVidia and ATI go to great lengths to get developers to design games so they work better on their cards.
There's a formula to how you handle BioShock:
Have rig: Play it.
Don't have rig: Play WoW.
It's [b]that[/b] simple. :D Now I'm sure I'll be able to get meself around to buying a new rig in around a year or so, and by that time all this rootkit-crap and unnecessary jumping through activation hoops is probably over with, and you get to pick the game up for a rough third of its original price. Me like. :)
[QUOTE]EVGA 256-P2-N615-TX GeForce 7600GT 256MB 128-bit GDDR3 PCI Express x16 SLI Supported Video Card - Retail
Item #: N82E16814130062
$89.99
Eagle Tech Cool Power ET-PSCP 550 ATX12V 550W Power Supply - Retail
Item #: N82E16817193018
$19.99
G.SKILL 2GB (2 x 1GB) 240-Pin DDR2 SDRAM DDR2 800 (PC2 6400) Dual Channel Kit Desktop Memory Model F2-6400CL5D-2GBNQ - Retail
Item #: N82E16820231098
$84.99
GIGABYTE GA-965P-DS3 LGA 775 Intel P965 Express ATX Intel Motherboard - Retail
Item #: N82E16813128012
$99.99
Intel Core 2 Duo E6400 Conroe 2.13GHz LGA 775 Processor Model BX80557E6400 - Retail
Item #: N82E16819115004
$205.00
ARCTIC COOLING Freezer 7 Pro 92mm CPU Cooler - Retail
Item #: N82E16835186134
$34.99
Subtotal: $534.95
Shipping: $15.63
Grand Total: $550.58[/QUOTE]
Also, if you're going with any of these Core 2 Duo units, you'll be perfectly fine with the OEM CPU cooler. The systems idle just fine and dandy with it. My roommate's e6600 is idling in the mid to high 30s and loading in the 40s. I imagine the e6550 would run slightly cooler and the e6750 just a tad warmer. Save $35 and skip it. Or better yet, put it into a good quality power supply. Name brand, preferably. [URL="http://www.newegg.com/Product/Product.aspx?Item=N82E16817371005"]Such as this example.[/URL] It's not 550 watts, but your system doesn't sound like it is going to require more than 400 in the worst possible situation, so this seems like a safe bet.
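To put a number on that "doesn't need more than 400 watts" claim, here's a rough power budget for the build quoted above. Every wattage here is my own ballpark guess (the CPU figure is Intel's rated TDP; the rest are rules of thumb), not a measured draw:

```python
# Rough worst-case power budget for the quoted build.
# All wattages are ballpark assumptions, not measurements.
draw_watts = {
    "E6400 CPU":          65,  # Intel's rated TDP
    "7600GT video card":  40,  # approximate board power
    "motherboard + RAM":  50,
    "hard drive":         15,
    "optical drive":      20,
    "fans and misc":      20,
}

peak = sum(draw_watts.values())
print(peak)        # 210 W at full tilt
print(peak * 1.5)  # 315.0 W even with 50% headroom -- well under 400 W
```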
For the video card, [URL="http://www.newegg.com/Product/Product.aspx?Item=N82E16814150230"]here's an EVGA 8600gt[/URL] with a $20 mail in rebate, taking it down to $100 flat.
This is also why I suggested the other Conroe-based chips, especially given their fantastic retail price. On top of that, those I suggested come with an additional 2MB of L2 cache and put out less heat than the Allendale cores.
At the present, an 8600GT is perfectly reasonable as a graphics card. A notable step up over the 7600, but not a significant leap in budget as with the 8800GTS. On top of that, getting a card like that without getting a processor capable of handling data fast enough is wasted money.
1) Yes, it's unfair that we have to keep upgrading to keep up with the newer games.
2) It's true we're not forced to upgrade; we're just forced to look for compatible software when the only shop you've got is the local Wal-Mart.
3) I have 4 systems, ranging from an Athlon 650, an XP 1700+, and a Celeron 1.5GHz to my current P4 3.2GHz. I know the tragedy some of us go through, especially if you want to keep your favorite older systems going, but there is nothing we can do except deal with it.
4) I still have a hankering for the Voodoo series, my favorite, but I have to keep that to the oldest system that's gathering dustballs.
5) And playing my favorite games requires me to keep my older 650, and there are some I still can't even play on that.
Otherwise... yes, it is unfair, "but that's life"; yes, there are great ideas that could make things better, "but companies want money, not user needs"; yes, things are getting bad because everything is more expensive and we're making less, little by little, "but you're forced to pay it anyway". I honestly don't know how a depression hasn't hit yet, because people have completely spent hellacious amounts of money they don't have. I don't know the exact percentage, but it was something like 75-85% of Americans who spent more than they made last year. This year is probably worse. I know everyone's pain, I know how this junk goes on. But we all just have to deal with everything the best way we can. Hell, all of the work I've done in the past is ignored because it isn't eye candy, even though it can outperform anything made.
Sorry but this old fossil is trying "And I'm sorry if it isn't trying hard enough" to spread some wisdom. We can't convince the world to change, but we can convince ourselves to change. (OMG that came from me???)
The fun part is that I recommend overclocking. The Core 2 Duo is no fun without it. I've got an e4300 overclocked to 3GHz and a GF8800GTS. I really do love playing games with AA and AF turned on, so that's why I'm recommending a GF8800GTS, or a Radeon x1950XT if you think you'll never need DX10.
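For anyone wondering how an e4300 hits 3GHz: a Core 2's core clock is just the base (FSB) clock times a locked multiplier, so you overclock by raising the base clock in the BIOS. A quick sketch — the 9x multiplier and 200MHz stock base clock are the e4300's published specs, as far as I know:

```python
# Core 2 core clock = base (FSB) clock x multiplier.
# The E4300's multiplier is locked at 9x; overclocking means
# raising the base clock in the BIOS.

def core_clock_mhz(base_mhz, multiplier=9):
    """Resulting core clock in MHz for a given base clock."""
    return base_mhz * multiplier

print(core_clock_mhz(200))  # 1800 -> stock 1.8 GHz
print(core_clock_mhz(333))  # 2997 -> the ~3 GHz overclock mentioned above
```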
I believe Oblivion was the latest one, Half Life 2 before that, Doom 3.... I got GRAW a couple weeks ago and it's barely playable on a Core 2 Duo with 2GB of RAM and forget about playing GRAW2, it probably won't even start up.
Anyway, yes, it's frustrating and no, there's nothing you can do about it, other than switching to a video console of course.
I'm surprised nobody's been complaining about Vista yet, because that's another suspicious reason to upgrade. Well, XP has still a lot of life in it, so there's no urgent reason to upgrade yet.
I agree. I'm still a bit puzzled as to why there isn't something like a generic laptop standard, like home-built desktops, where you can buy your own parts and assemble them yourself.
(Remember nVidia's "The way it's meant to be played" and ATI's "Get in the Game" campaigns?)
Of course, that only works when consumers have a demand for better graphics....of which a large part do for some reason... :D
I believe Oblivion was the latest one, Half Life 2 before that, Doom 3.... I got GRAW a couple weeks ago and it's barely playable on a Core 2 Duo with 2GB of RAM and forget about playing GRAW2, it probably won't even start up.[/QUOTE]
You can't tell a computer's gaming performance just by looking at the processor and memory. If your system is a Core 2 Duo with 2GB of RAM but a GeForce 3, then I can see why it can't play GRAW. Or you're playing at some insane resolution with 16x AA and AF.
And Core 2 Duos are meant to be overclocked. One of the big reasons a lot of people got excited about the C2D when it was released was that you could get a 50% OC with the stock cooler.
PS: here's a link to a review comparing how different video cards perform with different processors. [url]http://www.legionhardware.com/document.php?id=672&p=3[/url]
I forgot to mention, it's a Dell Precision M65, a business type of laptop, so not much choice in video hardware but still got an above average GPU, 256MB of video RAM. You can't buy it anymore, a sign of its obsolescence, I guess.