
Thread: Trying To Replace GPU

  1. #1
    Jazz
    Member #
    41522
    Join Date
    Jul 2012
    Tablet
    TF700
    Posts
    31
    Liked
    4 times

    Trying To Replace GPU

Hey folks, I'm having trouble here. I'm lost because my GPU's temperature reaches an amazing 120 degrees, and I've sent it in for repairs about 3 times with no luck, so I was thinking of replacing it with another GPU.

My Computer
    Motherboard: Benicia-GL8E
    Power Supply: 460 W
    Windows: Vista
    RAM: 4 GB
    GPU: NVIDIA GeForce 8800 GT

I want to replace it with the 2 GB version of the GTX 650; here are the details of both.
    Is it a good replacement or not? I'm not that good with PCs, but I'm trying to get better.

    GPU.jpg

  2. #2
    Bumblebee
    Member #
    16239
    Join Date
    Nov 2011
    Location
    Memphis, TN
    Tablet
    TF101
    Posts
    2,966
    Liked
    392 times
You shouldn't have any issues swapping them out. I'm not sure what CPU you have or what its max TDP is, but a 460 W PSU _should_ power it fine. You will need to make sure you are using the right drivers, but I would think that would be a non-event using the install media that comes with the card.
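    If you want to sanity-check the power side, here's a rough back-of-the-envelope tally (a sketch only; the TDP figures for the Q6600 and GTX 650 are approximate spec-sheet numbers, and the ~80 W for the rest of the system is just a ballpark guess):

    Code:
    # Rough PSU headroom check -- illustrative numbers only; check your own
    # parts' spec sheets. TDP is a worst-case-ish figure, not typical draw.
    parts_watts = {
        "CPU (Core 2 Quad Q6600, ~105 W TDP)": 105,
        "GPU (GeForce GTX 650, ~64 W TDP)": 64,
        "Motherboard, RAM, drives, fans (ballpark)": 80,
    }
    psu_watts = 460
    total = sum(parts_watts.values())

    for name, watts in parts_watts.items():
        print(f"{name:45s} {watts:4d} W")
    print(f"{'Estimated total':45s} {total:4d} W")
    print(f"PSU rating: {psu_watts} W -> headroom: {psu_watts - total} W")

    That still leaves roughly 200 W of headroom, which is why the 460 W supply should be OK.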

Another thing to look at with heat problems in a desktop is how good your cooling is. Make sure all your fans are clean and working correctly, and add extra fans to your enclosure if possible. One thing I have seen is desktops put together with all the fans pointed inward, which isn't very good for cooling. Make sure you have at least one fan blowing in and one fan blowing out; usually in at the bottom and out at the top works best, since heat rises and you'll be working with physics instead of against it.
    Swipe likes this.
    TF101! - Running Domination built on my beautiful ArchLinux box!
    Samsung Galaxy S4 - Stock Sadly...
    Custom Built Desktop - Intel I7 w/ Windows7 & ArchLinux Dual boot - Go LINUX!
    I firmly believe that any man's finest hour, the greatest fulfillment of all that he holds dear, is that moment when he has worked his heart out in a good cause and lies exhausted on the field of battle - victorious. - Vince Lombardi

    Great Help Guide Here -> Frederucos Fantastical Forum Favorites
    And Here -> Master Help Guide

  3. #3
    Jazz
    Member #
    41522
    Join Date
    Jul 2012
    Tablet
    TF700
    Posts
    31
    Liked
    4 times
Thanks bfmetcalf, I'll see what I can do, and I also have fans. The CPU is an Intel Core 2 Quad, a Q6600 model.

    One more thing I forgot to ask: what's the difference between PCI-E x16 and PCI-E 2.0 or 3.0?
    Last edited by MrJagger; 03-16-2013 at 07:54 AM. Reason: forgot to ask a question

  4. #4
    Starscream
    Member #
    42104
    Join Date
    Aug 2012
    Tablet
    TF300T
    Posts
    463
    Liked
    38 times
MrJagger, they describe different things: x16 is the number of lanes the slot has (the physical slot size), while PCI-E 2.0 and 3.0 are versions of the spec, and each newer version roughly doubles the transfer rate of every lane...
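    To put rough numbers on it (a quick sketch using the raw rates from the PCIe specs; the encoding overhead per generation is folded in):

    Code:
    # Effective bandwidth per lane and for a x16 slot, per PCIe generation.
    # Raw rate is in GT/s; 1.x and 2.0 use 8b/10b encoding, 3.0 uses 128b/130b.
    generations = {
        "PCIe 1.x": (2.5, 8, 10),
        "PCIe 2.0": (5.0, 8, 10),
        "PCIe 3.0": (8.0, 128, 130),
    }
    lanes = 16
    for name, (gts, payload, total) in generations.items():
        per_lane_mb = gts * payload / total * 1000 / 8  # MB/s per lane
        print(f"{name}: ~{per_lane_mb:.0f} MB/s per lane, "
              f"~{per_lane_mb * lanes / 1000:.1f} GB/s for a x16 slot")

    So the slot stays x16 in every generation; what the 2.0/3.0 labels change is how fast each of those 16 lanes runs.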
    Its mind over matter - What does it matter never mind - by me

  5. #5
    Bumblebee
    Member #
    9512
    Join Date
    Aug 2011
    Tablet
    Other - Windows
    Posts
    2,397
    Liked
    467 times
Why on earth do you still have Windows Vista?
    URL Removed by Jeffrey

  6. #6
    TRANSFORMER FORUMS LEGEND
    Supporting Member

    Member #
    2978
    Join Date
    Jun 2011
    Location
    /EARTH/USA/NC/
    Tablet
    None
    Posts
    15,780
    Liked
    2676 times
    Quote Originally Posted by MrJagger View Post
Thanks bfmetcalf, I'll see what I can do, and I also have fans. The CPU is an Intel Core 2 Quad, a Q6600 model.

    One more thing I forgot to ask: what's the difference between PCI-E x16 and PCI-E 2.0 or 3.0?
PCIe 3.0 is faster:

    PCI Express 2.0

PCI-SIG announced the availability of the PCI Express Base 2.0 specification on 15 January 2007.[18] The PCIe 2.0 standard doubles the transfer rate compared with PCIe 1.0 to 5 GT/s and the per-lane throughput rises from 250 MB/s to 500 MB/s. This means a 32-lane PCIe connector (×32) can support throughput up to 16 GB/s aggregate.

    PCIe 2.0 motherboard slots are fully backward compatible with PCIe v1.x cards. PCIe 2.0 cards are also generally backward compatible with PCIe 1.x motherboards, using the available bandwidth of PCI Express 1.1. Overall, graphic cards or motherboards designed for v2.0 will work with the other being v1.1 or v1.0a.

    The PCI-SIG also said that PCIe 2.0 features improvements to the point-to-point data transfer protocol and its software architecture.[19]

    Intel's first PCIe 2.0 capable chipset was the X38 and boards began to ship from various vendors (Abit, Asus, Gigabyte) as of October 21, 2007.[20] AMD started supporting PCIe 2.0 with its AMD 700 chipset series and nVidia started with the MCP72.[21] All of Intel's prior chipsets, including the Intel P35 chipset, supported PCIe 1.1 or 1.0a.[22]

    Like 1.x, PCIe 2.0 uses an 8b/10b encoding scheme, therefore delivering, per-lane, an effective 4 Gbit/s max transfer rate from its 5 GT/s raw data rate.

    PCI Express 3.0

    PCI Express 3.0 Base specification revision 3.0 was made available in November 2010, after multiple delays. In August 2007, PCI-SIG announced that PCI Express 3.0 would carry a bit rate of 8 gigatransfers per second (GT/s), and that it would be backward compatible with existing PCIe implementations. At that time, it was also announced that the final specification for PCI Express 3.0 would be delayed until 2011.[23] New features for the PCIe 3.0 specification include a number of optimizations for enhanced signaling and data integrity, including transmitter and receiver equalization, PLL improvements, clock data recovery, and channel enhancements for currently supported topologies.[24]

    Following a six-month technical analysis of the feasibility of scaling the PCIe interconnect bandwidth, PCI-SIG's analysis found out that 8 gigatransfers per second can be manufactured in mainstream silicon process technology, and can be deployed with existing low-cost materials and infrastructure, while maintaining full compatibility (with negligible impact) to the PCIe protocol stack.

    PCIe 3.0 upgrades the encoding scheme to 128b/130b from the previous 8b/10b, reducing the overhead to approximately 1.54% ((130-128)/130), as opposed to the 20% of PCIe 2.0. This is achieved by a technique called "scrambling" that applies a known binary polynomial to a data stream in a feedback topology. Because the scrambling polynomial is known, the data can be recovered by running it through a feedback topology using the inverse polynomial.[25] PCIe 3.0's 8 GT/s bit rate effectively delivers 985 MB/s per lane, double PCIe 2.0 bandwidth. PCI-SIG expects the PCIe 3.0 specifications to undergo rigorous technical vetting and validation before being released to the industry. This process, which was followed in the development of prior generations of the PCIe Base and various form factor specifications, includes the corroboration of the final electrical parameters with data derived from test silicon and other simulations conducted by multiple members of the PCI-SIG.

    On November 18, 2010, the PCI Special Interest Group officially published the finalized PCI Express 3.0 specification to its members to build devices based on this new version of PCI Express.[26]

AMD's latest flagship graphics card, the Radeon HD 7970, launched on January 9, 2012, was the world's first PCIe 3.0 graphics card.[27] Initial reviews suggest that the new interface would not improve graphics performance compared to earlier PCIe 2.0, which, at the time of writing, is still under-utilized. However, the new interface would prove advantageous when used for general purpose computing with technologies like OpenCL, CUDA and C++ AMP.[28]
    (both from Wikipedia)
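    For what it's worth, the overhead and per-lane figures in that excerpt are easy to sanity-check (just re-deriving the quoted numbers, nothing new):

    Code:
    # Check the encoding overhead and effective per-lane rates quoted above.
    for name, payload, total, raw_gts in [
        ("8b/10b   (PCIe 2.0 @ 5 GT/s)", 8, 10, 5.0),
        ("128b/130b (PCIe 3.0 @ 8 GT/s)", 128, 130, 8.0),
    ]:
        overhead = (total - payload) / total
        mb_per_s = raw_gts * payload / total * 1000 / 8
        print(f"{name}: {overhead:.2%} overhead, ~{mb_per_s:.0f} MB/s per lane")

    That gives 20% / 500 MB/s and ~1.54% / ~985 MB/s, matching the article.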

    Quote Originally Posted by goodintentions View Post
Why on earth do you still have Windows Vista?
    Agreed....


    Asus Transformer 16 GB & Dock - Munching on Jelly Beans
    crApple iP*one 5 - Stock

    Just once I want someone to call me Sir without adding you're making a scene.
    - Homer Jay Simpson

  7. #7
    Jazz
    Member #
    41522
    Join Date
    Jul 2012
    Tablet
    TF700
    Posts
    31
    Liked
    4 times
    Quote Originally Posted by goodintentions View Post
Why on earth do you still have Windows Vista?
Erm... because it's expensive, and I only have a moth in my wallet...

    Sorry for the late response

 

 
