
NVIDIA G80 initial pictures released


REVENGE

NVIDIA's upcoming GPU, the G80, has just been snapped by a few hacks at PCOnline. Reportedly consuming over 300W and requiring dual PCIe power plugs and water cooling, this is sure to be a "hot" product. Feast your eyes; a picture is worth a thousand words.

 

UPDATE: The NVIDIA press launch event for G80 will occur on October 18 in Santa Clara, CA. Be advised, this is a PAPER LAUNCH ONLY, so I suggest you take the time between the press launch and hard launch to get a 1 kW PSU.
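
For rough context, here is a hypothetical back-of-the-envelope power budget; only the ~300W card figure comes from the report above, and every other number is a ballpark assumption for a high-end gaming rig of this era:

    G80 card (reported):                        ~300 W
    High-end CPU (assumed):                     ~100-130 W
    Motherboard, RAM, drives, fans (assumed):   ~100 W
    Rough system total:                         ~500-530 W

Assuming you want the PSU loaded at only 50-70% of its rating for efficiency and headroom, that sketch lands you somewhere in the 750-1000 W range.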


User Feedback

Recommended Comments



Dear God, that is too much. I mean, it is getting expensive to game on the PC/Mac. 300 watts for a fricking card? Please. If the gaming companies would truly use the power of the cards currently on the market and programmers would optimize their games, we wouldn't need to go this far. Companies wonder why PC gaming is declining; from the figures I have read, consoles are ten times as profitable, and people wonder why so many are tired of Windows and gaming and go Linux or OS X.


There's only one word for this: STUPID. The recent reviews of the best chipsets for Core 2 Duos showed that entire systems (including video cards, but not monitors) were drawing less than 200W. 300W for a video card alone is so far beyond stupid that I don't think the appropriate words exist to describe it...


Well, I'd like to clarify that it is LIKELY the card consumes 300W; NVIDIA has yet to comment on this. The conclusion is based on the fact that this is a single chip requiring two PCIe plugs and water cooling; the statement itself comes from HardOCP. And oh yes, did I mention there were rumors that this card would have its own discrete PSU? Yes, that's right: a PSU dedicated to powering a video card. This is stupid stuff indeed.

 

Also, IMO, ATI's R600 will likely consume even more power than the G80. Expect things to get better as product refreshes shrink the die from 90nm down to something smaller.


Cool GPU!

Nvidia rocks.

 

I'm a Mac noob, but what's the deal with 'PC' hardware in Macs?

Can you install any hardware as long as there are Mac drivers?

Or is a 7300GT for the Mac different hardware from a 7300GT for the PC?


Well, I'm posting this because, after all, there is a chance that NVIDIA GPUs could be used exclusively in the next Macs, considering the DAAMIT vs. Intel fiasco. Or, if the rumors that Apple will switch to AMD are true, maybe we'll never see high-end NVIDIA in Macs again...


OK, so just kidding a bit... I'll build a rig with Apple's "8-cores-in-one" solution and a quad-GPU setup (all G80s, of course) + water cooling (with Siberian water :P). How many PSUs will I need, in your opinion? :)


This is a nice way to blow every fuse in your house simultaneously. I think they should offer a bundle deal: buy this video card, get a complimentary mini fire extinguisher.

 

Who's going to be the guinea pig and spend their life's savings on this thing?


sandmanfvrga, I completely agree with you.

Gaming on computers in general is becoming stupid. As powerful as this card is, with properly optimized code and hardware that developers feel comfortable with, the video cards from 2004 could in all probability produce better graphics.

Consoles are the way to go. They're less expensive, and they will become very PC-like in the new generation as peripherals like mice and keyboards can be added through USB ports.

It'll shut up the people who don't like controllers for things like FPSes, and the graphics of the 360 and PS3 will be beyond computers for a few years (I think I read that somewhere...).

IMHO, I'll be happy with the Wii standing opposite my Mac mini and hooked up to my LCD monitor.

 

I'm just not a fan of games on computers. Not the ones that require super-systems to function, that is. I think the devs should optimize their code as they do for consoles. It would open up a larger market, since more people would play, and I wouldn't have to set my game settings to "low" just to play most games on the market.

 

-Urby


sandmanfvrga, I completely agree with you.

 

 

-Urby

 

It is way too much, as in "way".

 

Dual SLI was premature, as were the first Pentium IV and Rambus...

 

Gessssh. Heat, Heat, Heat, that's the problem, heat.

 

The water-cooled Mac G5 was the biggest buffoon joke that I had seen, and now this.

 

They just have to wait for smaller dies with SOI insulation, and in time there should be some breakthroughs, such as smaller wafers and, instead of one chip, spreading the instructions out over 2-way or 4-way GPUs on one card. 300 WATTS! :)

 

 

Simply too much.


Don't forget, folks, that this is nothing more than a mock-up, or a "beta board". It will probably see release in another 3 to 5 months, after it has undergone some revisions. And whoever said the consoles are ahead of the PC is sorely mistaken and hasn't paid ANY attention to the PC game market.

 

Sales do not indicate that the game industry is dying on the PC; it's plateauing.

 

With Intel's new quad core, and AMD just a few months away from having quad core as well, this card will fit perfectly. Many games do not require an "uber" PC to run, but most of these games can't be run on hardware that is more than 3 years old. I work on computers (PCs) for a living, repairing them and building gaming rigs. I get people coming in all the time bitching about how their retarded GeForce 4 MX won't run Prey, WoW, DOOM 3, or Quake 4. My answer to them: it was never meant to run ANY of those games, ANYWAY! If you want your PC to run games that are coming out today, you can't be using hardware from 2002. I know people who have tried running Oblivion on a Radeon 9800 Pro with an Intel Celeron 1.8GHz. Then they do nothing but {censored} about how crappy the game was coded, built, made, and supported. It's called UPGRADE YOUR POS BEFORE YOU {censored}. 99 times out of 100, the game isn't responsible for how it runs on your system. YOU ARE. If you want to game on your PC, don't buy a console; take the $500+ that it would cost you to get going with an Xbox 360, put a decent video card in your system, and update your CPU/mobo and RAM. I'm sick of hearing people whine about how their games won't run on their 5-year-old computer. You dump and update your TVs and iPods at the drop of a hat, and then {censored} that your computer isn't running fast.

 

So before you {censored} about PC gaming, look at the rig you're running, whether it's a Mac or a PC. If you haven't upgraded in the last 3 years, STFU. If you haven't bought a new video card in the last 4 years, STFU. The designers of games are doing everything they can with the hardware that they have. It takes 6 months for them to adapt to the new standards, widgets, gadgets, and gizmos that ATI & nVidia put on their cards. Let's not forget that we've had DirectX 9.0 since early 2003, and OpenGL only got updated to 2.0 not that long ago. Plus, DirectX 10 is on the horizon and promises even better graphics, better sound, and better immersion within game environments. This card will be the groundbreaker and pathfinder for the next generation of video cards. Let's not forget that the consoles are kinda on par with the PC right now. With DX10, the PC will once again leapfrog the consoles, and it'll be another 5 years before the consoles catch back up. That is, if Sony ever gets their POS PS3 out. I'm sick of hearing that system being hyped and not seeing ANY actual gameplay footage from it. Put up or shut up, Sony.

 

Go nVidia. I can't wait to slap this card in and take it for a spin!


This is way too much, and in fact it's going to be "the thing that kills PC gaming."

 

I was thinking that myself: why use this card with its stupid requirements when you can just run 2-4 of the previous card, and that will run faster.

 

 

max

 

"i don' wanna hook up ma' damn fishtank to da' video card now y'all" :lol:


After reading this, I think I am DONE with PC gaming. I am going Mac and not going back. I just have to get rid of things and switch. I can game on the Wii and then the 360 when its price comes down to compete with the PS3. 300 watts for a damn card? Whatever. Let the Vista kiddies have it.


After reading this, I think I am DONE with PC gaming. I am going Mac and not going back. I just have to get rid of things and switch. I can game on the Wii and then the 360 when its price comes down to compete with the PS3. 300 watts for a damn card? Whatever. Let the Vista kiddies have it.

 

Did you even bother to read my post? That this is a prototype?! My god, you folks know nothing about computers and computer gaming.


The Doc, quit being a prick. I have two degrees in computer science and build my own PCs, so I am no idiot. I know what this means, and I know it may be a prototype, but NVIDIA's prototypes pretty much come to fruition as mainstream cards with close to, if not the same, specs as the prototypes. Plus, I am not the only one who thinks like this. I don't know what your problem is, but get over it. This much money and power in video cards is ridiculous.


I am not going to buy any graphics cards until they make use of multi-core GPUs on a 65nm process or smaller, because the power consumption of these graphics monsters that are going to be released is just insane!

Anyway, I think that classic PCs are going to be replaced by those home entertainment machines, which will have more gaming performance in the future, in case you don't already own a gaming console (Xbox 360, PS3, Wii).


I think PC gaming is dying, and frankly, I think it MIGHT be a good thing. Why? When PC gaming dies (I know this is a stretch and will make some people mad), then Windows has zero edge over OS X and Linux. Then maybe Windows will go down in popularity. I don't know; it's far-fetched, but it might happen.

 

I do know that PC gaming is falling WAY behind the consoles. Why? PCs have no edge. Sure, new stuff comes out on the PC that makes it "better", but with what the 360 and PS3 can do, who needs a PC? You can play multiplayer, have high-end games, and I would bet you money, if I were a betting man, that MMORPGs like World of Warcraft will be coming to these systems with keyboards and mice. Consoles already patch their games and allow add-ons; the hard drive on the consoles made that possible. I really see no need for PC gaming now.


When PC gaming dies, there will be somewhat of a crunch in graphics cards. That's a definite, since they tend to narrowcast toward the gaming population.

 

As for starting to kill Windows, I don't think it will. People will continue to want their Windows insecurity blanket on their $50 Dell boxes. But we can hope, right?


Did you even bother to read my post? That this is a prototype?! My god, you folks know nothing about computers and computer gaming.

 

 

Lol, yep, and it's not like you necessarily need this card; you can SLI, or soon quad-SLI, cards anyway. I'm also sure they will figure out how to get the requirements down by the final release... dev cards are the equivalent of alphas and betas in OSes: they are nowhere near ready, just a basis showing what they did right and what needs fixing.

 

 

Oh, and sandmanfvrga, this is what my dad says: "If Linux were as good an OS as Windows, then wouldn't OEMs flock to it? OEMs have to pay $125 a copy to Microsoft, so wouldn't you think that if Linux were anywhere near as good and as functional as Windows, they would flock to it, since they'd have the license fee waived?" And I think that's quite logical, actually... but this GPU will not kill Windows gaming, or anything of the sort. It might teach nVidia/ATI a lesson, but it will not harm anything or change anything...

 

max



