arthurking83
01-04-2012, 6:24pm
It started off relatively normally: I got a call last weekend .... "you got to come and fix my PC, it's got bricks"
Me: .. :confused: (obviously a technical term I've not yet come across :p)
My sister tells me that her screen flickers with a brick-like pattern when she plugs it into a particular wall socket, but it's not as bad if she runs it via an extension lead from another wall socket.
This has me thinking ... :beer_mug: it's time to switch off, as my sis has done it again (PC-wise).
Yesterday I had the time to go and help her, thinking that it'd probably be another easy fix, simply putting the DVI plug back in and tightening the retaining screws properly or something, but no!
This time she was for real.
The brick-like flickering pattern was immediately diagnosed as a dead graphics chip on her Medion PC's motherboard, and I'm thinking it's a $50 fix .. I'll just nip down the road to DSE, get whatever they have in stock, and she's right.
Too late in the day, though, so I brought it home, not having a spare PCIe graphics card lying around (I have heaps of older AGP cards, but no PCIe cards).
But! I have one in my current PC, so maybe I'll just pull it out to confirm my suspicion that it's the card and not some other m/b issue somewhere along the line.
Yep! Beauty ... it's just the card and she's right to go .. but it's now 11:59PM Saturday, about to tick over to Sunday, and there's no shop in the world that'd be open at 1AM Sunday. I just want to get it sorted and back to her so I don't have to go through the process of getting it to her later in the week, and so on.
So the decision is made: MY PC is going to get downgraded from an upper mid-range graphics card (nvidia 400 series, from way back) to the piddly onboard VGA, in the form of a very low-end AMD Radeon 4290 chip.
From the benchmarks I quickly flicked through, there's quite a substantial loss in GPU performance and total memory in going from the nvidia graphics card to the onboard AMD chip.
Oh well, for the sake of sanity and time management (my own time) it was easier to do this, and one day replace the now-missing graphics card in my PC.
First of all, I had a fright after initialisation: the max screen resolution was a minuscule 1400x1024, and massively distorted compared to the current minimum standard of 1920x1080 (which I've been used to on this 24" for the past two years).
Installed the latest Catalyst drivers and the other peripheral software, and bingo! .. I'm back at 1920x1080 again .. cool as!
The screen looked redder than 'a bright red thing', so I had to redo the calibration, which is quite easy and quick now with the BasICColor software, and we're back to square one ... except for now having a far inferior graphics chip.
But I'll be damned if I can see where it's made any difference.
All apps open, close, and render as fast as they did with the $100 graphics card.
I was worried about CaptureNX2 rendering its 100MB files woefully slowly, but nope! Not one iota of difference that I can perceive.
Mind you, I'm not a gamer and don't really care for games. I played a bit of Open Transport Tycoon a few months back for a laugh (rekindling some long-lost youthfulness, I guess), but that's about it; I haven't touched it in at least 3 months.
I don't need 2000fps in Doom of Fortune CCIX or whatever, but I used to assume that even for mild photo editing a more substantial graphics card was helpful.
It's a fairly painless job to grab a mid-range $100 67xx or 68xx series graphics card and get going with high-end graphics rendering again, but is there any point?
Years back I ran onboard graphics chips as the sole GPU, and there was always a noticeable difference in screen rendering between a graphics card and a motherboard graphics chip, but now I'm wondering whether there's any point any longer.
And for future PC builds:
Can I stop stressing over the slight performance hit of having the graphics chip onboard the motherboard, and stop trying to source the best motherboard with no onboard graphics?
Is anyone else running a home-built PC with an onboard graphics chip instead of a mid-to-high-end graphics card?
I suppose if I ever get off my posterior and get myself a 27" Dell U2711 or something similar, and need a GPU able to drive its 25-million-by-19-billion-pixel rendering ability, I'll have to get a graphics card once again. But as I see it currently, for the 1920x1080 res this (or any other) screen is capable of producing, I need not!
.. just a useless observation I made last night.