Recommended System Requirements

Game | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Cyberpunk 2077 | 22961% | 5911%
Hitman 3 | 29900% | 7719%
Assassin's Creed Valhalla | 20992% | 5397%
The Medium | 28676% | 7400%
Resident Evil 8 | 22961% | 5911%
FIFA 21 | 11533% | 2932%
Call of Duty: Black Ops Cold War | 20308% | 5219%
Grand Theft Auto VI | 32349% | 8357%
Genshin Impact | 22961% | 5911%
Far Cry 6 | 33267% | 8597%
In terms of overall gaming performance, the AMD Radeon HD 3200 IGP is significantly more capable than the Nvidia GeForce 6150 LE.
The HD 3200 has a 75 MHz higher core clock speed and 3 more Texture Mapping Units than the GeForce 6150 LE, giving it 1.6 GTexel/s more texturing throughput. Both are old parts (DirectX 10.1 at best), so Texture Rate holds more weight here than it would when comparing more modern GPUs.
The HD 3200 also has a 75 MHz higher core clock speed and 3 more Render Output Units than the GeForce 6150 LE, giving it 1.6 GPixel/s more pixel fill rate. Pixel fill rate is mainly relevant when comparing older cards such as these; on more modern GPUs it is rarely the bottleneck.
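The fill-rate figures above follow directly from core clock multiplied by unit count. A minimal sketch of the arithmetic, using the clocks, TMU counts, and ROP counts from this comparison's specification tables:

```python
# Fill rate = core clock (MHz) x number of units, expressed in G/s.
# Clocks and unit counts are taken from this comparison's spec tables.

def fill_rate_gps(core_mhz: float, units: int) -> float:
    """Peak fill rate in GTexel/s (for TMUs) or GPixel/s (for ROPs)."""
    return core_mhz * units / 1000.0

geforce_texture = fill_rate_gps(425, 1)  # 1 TMU  -> 0.425, quoted as 0.4 GTexel/s
radeon_texture = fill_rate_gps(500, 4)   # 4 TMUs -> 2.0 GTexel/s
print(radeon_texture - geforce_texture)  # ~1.6 GTexel/s advantage, as cited above

# The same arithmetic with ROP counts (1 vs 4) reproduces the pixel
# rates of 0.4 and 2 GPixel/s and the matching 1.6 GPixel/s gap.
```

The quoted 1.6 figure is simply the rounded difference between the two peak rates.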
The HD 3200 was released more than three years after the GeForce 6150 LE, so it is likely to have far better driver support, meaning it will be better optimized and ultimately superior to the GeForce 6150 LE when running later games.
Both GPUs exhibit very poor performance, so rather than upgrading from one to the other you should consider looking at more powerful GPUs. Neither of these will be able to run the latest games in any playable way.
Neither the GeForce 6150 LE nor the Radeon HD 3200 IGP has dedicated video memory: both are integrated parts that borrow system RAM, so the experience when displaying game textures at high resolutions will depend on the host system's configuration.
Memory bandwidth is not listed for either GPU; as integrated parts they draw on system RAM, so their graphical data-transfer limits depend on the speed of that RAM rather than on on-board memory.
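Whatever the actual figures, peak memory bandwidth follows from bus width multiplied by effective memory clock. The clock value in this sketch is an illustrative assumption (typical DDR2-800 system RAM), not a figure quoted anywhere on this page; the bus widths are the 128-bit and 32-bit values from the memory table:

```python
def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective clock."""
    return (bus_bits / 8) * effective_mhz / 1000.0

# Assumed 800 MHz effective clock (DDR2-800) -- an illustrative value only;
# the real figure depends on the host system's RAM.
print(bandwidth_gbs(128, 800))  # 12.8 GB/s over a 128-bit bus
print(bandwidth_gbs(32, 800))   # 3.2 GB/s over a 32-bit bus
```

This is why two IGPs with identical cores can still differ in practice: the system builder's RAM choice sets the effective clock.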
The GeForce 6150 LE has 2 Shader Processing Units while the Radeon HD 3200 IGP has 40, and their rated shader performance scores are 1 and 13 respectively. That 12-point shader advantage, combined with the HD 3200's lead in the other metrics here, means it delivers a noticeably smoother and more efficient experience when processing graphical data than the GeForce 6150 LE.
The HD 3200 is built on a 55 nm process, 35 nm smaller than the GeForce 6150 LE's 90 nm process. The HD 3200 can therefore be expected to run cooler and reach higher clock frequencies than the GeForce 6150 LE, and to consume less power while delivering similar graphical performance.
Specification | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Core Speed | 425 MHz | 500 MHz
Boost Clock | - | -
Architecture | C51 | RS780
OC Potential | - | None
Driver Support | - | -
Release Date | 11 Oct 2004 | 01 Mar 2008
Memory | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Memory | N/A | N/A
Memory Speed | - | -
Memory Bus | 128 Bit | 32 Bit
Memory Type | - | GDDR3
Memory Bandwidth | - | -
L2 Cache | - | -
Delta Color Compression | No | No
Memory Performance | 0% | 0%
Shading & Process | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Shader Processing Units | 2 | 40
Actual Shader Performance | 0% | 1%
Technology | 90 nm | 55 nm
Texture Mapping Units | 1 | 4
Texture Rate | 0.4 GTexel/s | 2 GTexel/s
Render Output Units | 1 | 4
Pixel Rate | 0.4 GPixel/s | 2 GPixel/s
Display | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Max Digital Resolution (WxH) | 2560x1600 | -
VGA Connections | 1 | 0
DVI Connections | 1 | 0
HDMI Connections | 1 | 0
DisplayPort Connections | - | -
Power | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Max Power | - | -
Recommended PSU | - | -
API Support | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
DirectX | 9.0c | 10.1
Shader Model | 3.0 | 4.1
OpenGL | 2.1 | 3.3
OpenCL | - | -
Notebook GPU | No | No
SLI/Crossfire | No | No
Dedicated | No | No
Recommended System | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Recommended Processor | - | -
Recommended RAM | - | -
Maximum Recommended Gaming Resolution | - | -
Mini Review

GeForce 6150 LE: Weak integrated graphics. None of today's modern games will run smoothly.

Radeon HD 3200 IGP: The Radeon HD 3200 is an integrated GPU on the AMD 780G chipset, based on the 55 nm, second-generation unified shader architecture, R600. Built on the RS780 core, it offers 40 Shader Processing Units, 4 TMUs and 4 ROPs, with a 32-bit memory interface of either DDR2 (more commonly) or GDDR3. The core typically runs at up to 500 MHz, while the memory clock depends on the speed of the system RAM. Performance therefore varies with the user's system configuration, which sets the memory clock, and with the desktop manufacturer, which sets the core clock. As a result, the HD 3200 may offer performance similar to the Radeon HD 3300, although the most common versions, unlike the HD 3300, are paired with DDR2. DirectX 11 games are not supported.
Upgrade Options | GeForce 6150 LE | Radeon HD 3200 IGP
---|---|---
Recommended CPU | - | -
Possible GPU Upgrades | N/A | N/A
GPU Variants | - | -