Win Nvidia GeForce GT 1030 graphics cards - Perfect for low cost budget 4K HTPC
#1
I did not see any other dedicated threads about Nvidia's new entry-level model in its GeForce 10-series of graphics cards, the Nvidia GeForce GT 1030.

https://www.geforce.com/hardware/desktop...ifications

GeForce GT 1030 is what you want if you plan on upgrading an old PC into a dedicated and quiet HTPC capable of playing 4K HEVC HDR videos @ 60fps.

GeForce GT 1030 GPU specifications

Reference specifications from Nvidia that are relevant to HTPC (Home Theater PC) usage with Kodi:
  • OS Certification = Windows 7 - 10, Linux, FreeBSD
  • Standard Display Connectors = HDMI 2.0b (including HDR + HLG + WCG & up to 32 audio channel output support), DisplayPort 1.4, and/or Single-Link DVI (DVI-D)
  • HDCP (High-bandwidth Digital Content Protection) = 2.2
  • Maximum Digital Resolution = 3840 × 2160 @ 120 Hz or 7680 × 4320 @ 60 Hz
  • VDPAU Feature Set = Feature Set H (8th generation PureVideo HD), meaning HEVC Main 10 (10-bit) and 10-bit VP9 hardware-accelerated decoding up to 7680 × 4320 @ 60fps
  • Hardware decoders = MPEG-1, MPEG-2, MPEG-4 Part 2, MJPEG, VC-1, H.264 (8-bit), HEVC (8-bit & 10-bit), VP9 (8-bit & 10-bit)
  • HDMI Audio = Yes (High Definition Audio)
  • Multi Monitor support = Yes
  • PCIe = 3.0 (PCI Express backwards compatible with PCIe 2.0 & 1.0)
  • Graphics Card Dimensions = Width = 2-slot, 6.68 inches in Length, and 2.72 inches in Height for Low-Profile models or 4.72 inches in Height for Full-Height models
  • Graphics Card Power = 30 W (Watt) TDP
  • Maximum GPU Temperature (in C) = 97 C
  • Minimum Recommended System Power (PSU) = 300 W (Watt)
  • Supplementary Power Connectors = None
  • Standard Memory Config = 2 GB
  • Microsoft DirectX = Yes (DirectX 12)
  • OpenGL = 4.5
  • Vulkan API = Yes
On paper, all GeForce GTX 10-series GPUs are capable of playback and output of 8K HEVC HDR and 8K VP9 HDR videos @ 60fps over DisplayPort.

GeForce GT 1030 also meets the Netflix certification requirements for 4K playback, as long as your computer runs the Windows 10 operating system with the latest patches.


Good to know is that PCI Express is a backwards-compatible format: most newer cards will also work in PCI-E 2.0 and even PCI-E 1.0 slots, so you can most likely use them on most older motherboards as long as the board has a free PCI-Express x16 slot with enough lanes (the GeForce GT 1030 only uses x4 PCIe lanes, but the card requires a PCI-Express x16 sized slot). These cards should then work fine with almost any PC that has an Intel or AMD CPU, has a free PCI-Express x16 slot, and meets the minimum hardware requirements for Microsoft Windows 10 for desktop editions (Home, Pro, Enterprise, and Education) as an operating system.
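
If you want to double-check which PCIe link your card actually negotiated on an older motherboard, a minimal Python sketch like the one below (assuming a Linux system with the standard sysfs PCI attributes) prints the current and maximum link width/speed for every Nvidia card it finds:

Code:
# Minimal sketch (assumption: Linux with sysfs mounted at /sys). Reports the
# PCIe link width/speed each Nvidia card (vendor ID 0x10de) has negotiated,
# so you can confirm an x4 link in an x16-sized slot.
from pathlib import Path

def attr(dev, name):
    f = dev / name
    return f.read_text().strip() if f.exists() else "n/a"

for dev in Path("/sys/bus/pci/devices").iterdir():
    if attr(dev, "vendor") != "0x10de":  # 0x10de = Nvidia
        continue
    print(f"{dev.name}: link width {attr(dev, 'current_link_width')} "
          f"(max {attr(dev, 'max_link_width')}), "
          f"speed {attr(dev, 'current_link_speed')} "
          f"(max {attr(dev, 'max_link_speed')})")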


GeForce GT 1030 cards are 4K / 2160p / Ultra HD (UHD) capable and already available in the PCI-E 3.0 format from many different manufacturers.

Do a search for "GeForce GT 1030 silent" and, among others, you will find these fanless models (passively cooled, heatsink only); "LP" stands for Low-Profile, and these are normally intended for HTPCs:
  • MSI GeForce GT 1030 2GH LP OC Passive LP with 2GB VRAM has one HDMI 2.0b port and one DisplayPort 1.4 port <= Recommended!
  • MSI GeForce GT 1030 2GH LP OCV1 Passive OCV1 with 2GB VRAM has one HDMI 2.0b port and one DVI port
  • Asus GeForce GT 1030 Silent LP (GT1030-SL-2G-BRK) with 2GB VRAM has one HDMI 2.0b port and one DVI-D port
  • Inno3D GeForce GT 1030 0dB Passive LP with 2GB VRAM has one HDMI 2.0b port and one DVI-D port
  • Gigabyte GT 1030 Silent Low Profile 2G with 2GB VRAM has one HDMI 2.0b port and one DVI-D port
All of the GeForce GT 1030 cards listed above have passive cooling and come in a low-profile format, so they are practically made for HTPCs.

If you don't know which card you should buy today, then I recommend that you just get the "MSI GeForce GT 1030 2GH LP OC", because it also has a DisplayPort (which is more future-proof and backwards compatible, so it can easily be converted to a DVI port with an adapter if needed), and you can find it for less than $70 (US). You will most likely end up only using the HDMI port in any case. You can read a detailed review of this specific card on Phoronix here => http://www.phoronix.com/scan.php?page=ar...force-1030 which, in summary, saw an average temperature of 46C and a peak of 55C during 1080p video decode benchmarks, with an average power draw of just 34 Watts and a peak of 72 Watts. Others report that any GeForce GT 1030 is capable of 4K HEVC HDR and 4K VP9 HDR videos @ 60fps using the HDMI port (and over the DisplayPort this card is on paper even capable of 8K HEVC HDR and 8K VP9 HDR video playback and output @ 60fps).
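
If you want to reproduce the temperature and power figures from that review on your own card, a rough sketch like this (assuming the standard nvidia-smi tool is installed and on the PATH) can poll the GPU once per second while you play a video in Kodi:

Code:
# Rough sketch: poll GPU temperature, power draw and GPU load once per second
# while a video is playing. Assumes nvidia-smi is installed and on the PATH.
import subprocess
import time

QUERY = "temperature.gpu,power.draw,utilization.gpu"

try:
    while True:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)  # e.g. "46, 28.50 W, 12 %"
        time.sleep(1)
except KeyboardInterrupt:
    pass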

#2
We need benchmarks of 4K @ 60fps on 4K displays, maybe HDR too, which I haven't seen anywhere.
#3
(2017-07-27, 12:04)nsnhd Wrote: We need benchmarks of 4K @ 60fps on 4K displays, maybe HDR too, which I haven't seen anywhere.
https://forum.doom9.org/showthread.php?s...ost1810727

"Observations: The GT 1030 can play all of the videos above perfectly and with only about 20% CPU utilization on an ancient 8 year old 2 core 3Ghz wolfdale PC. This lowest-end (so far) Pascal GPU furthermore has the lowest TDP in the 10x series, which means that no old PC system has to worry that their old power supply can support the graphics card. The 2 GB video memory seems perfectly adequate to play the most demanding 4k videos, while *simultaneously* downscaling to whatever resolution the current PC monitor has. This seems to me to demonstrate that this GPU is ideally suited to the millions of old PC systems, and all of the HTPC systems, that would like to upgrade their systems to allow playback of what I think will be the universal video encoding standard. Unlike some other posters in this forum, I would heartily recommend this GPU. At $69 (ZOTAC GT 1030) I don't think the card has any drawbacks at all. "

IMHO we are at a stage today where you don't really need benchmarks when NVIDIA states in the specifications that the Pascal GP108 GPU core used in the GeForce GT 1030 supports hardware-accelerated video decoding. If it did not, their reputation would be ruined. As noted by the guy who did the benchmark linked above, all videos he tested (which included 4K HDR videos) played "with only about 20% CPU utilization on an ancient 8 year old 2 core 3Ghz" PC. So I think that for video decoding purposes you can safely buy Nvidia cards today and simply trust what the specification on the box says, without benchmarks, knowing that Nvidia knows what they are doing.
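
If you still want a quick sanity check of your own instead of taking the spec sheet on faith, a small sketch along these lines (assuming an FFmpeg build with NVDEC/CUVID support on the PATH and a local sample clip, here hypothetically named test_4k_hevc.mkv) shows the CPU-time difference between software and hardware decode of the same file:

Code:
# Sketch: decode the same clip once in software and once on the GPU, and print
# FFmpeg's own "bench:" CPU-time report for each run.
# Assumes an FFmpeg build with NVDEC/CUVID enabled; the clip name is made up.
import subprocess

CLIP = "test_4k_hevc.mkv"

def decode(extra_args, label):
    cmd = ["ffmpeg", "-hide_banner", "-benchmark", *extra_args,
           "-i", CLIP, "-f", "null", "-"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    for line in result.stderr.splitlines():
        if line.startswith("bench:"):  # e.g. "bench: utime=... rtime=..."
            print(f"{label}: {line}")

decode([], "software decode")
decode(["-hwaccel", "cuda"], "NVDEC (hardware) decode")
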
#4
One thing to note is that Nvidia cards are not able to accelerate 10-bit H.265 content with VDPAU, so you need a really beefy CPU to play them. I don't think this feature will come to VDPAU either - from what I have read Nvidia has abandoned it.
#5
(2017-07-27, 12:40)teeedubb Wrote: One thing to note is that Nvidia cards are not able to accelerate 10-bit H.265 content with VDPAU, so you need a really beefy CPU to play them. I don't think this feature will come to VDPAU either - from what I have read Nvidia has abandoned it.
The Nvidia Linux video driver only does 8bit, the Nvidia Windows video driver does 4K hevc 10bit just fine on the GT1030.
#6
(2017-07-27, 12:45)Klojum Wrote:
(2017-07-27, 12:40)teeedubb Wrote: One thing to note is that Nvidia cards are not able to accelerate 10-bit H.265 content with VDPAU, so you need a really beefy CPU to play them. I don't think this feature will come to VDPAU either - from what I have read Nvidia has abandoned it.
The Nvidia Linux video driver only does 8bit, the Nvidia Windows video driver does 4K hevc 10bit just fine on the GT1030.

Apparently it is possible in Linux via CUVID, but this isn't compatible with Kodi.
#7
(2017-07-27, 12:45)Klojum Wrote: The Nvidia Linux video driver only does 8bit, the Nvidia Windows video driver does 4K hevc 10bit just fine on the GT1030.
The drivers on Linux and/or Windows might not support it yet, but NVIDIA's 8th generation PureVideo HD video decode engine ("VP8") available in all the GeForce GTX 10-series (codename "Pascal") of graphics cards supports Nvidia VDPAU Feature Set H, meaning it should even be capable of HEVC Main 12 (12-bit) hardware-accelerated decoding up to 4096 × 2304 pixel resolution, and even higher resolutions for HEVC Main 8 (8-bit) and HEVC Main 10 (10-bit).

https://en.wikipedia.org/wiki/Nvidia_Pur...reVideo_HD

I'm not exactly sure about the details of 10-bit and 12-bit bit depth VP9 decoding, as Wikipedia only states VP9 hardware video decoding up to 4096 × 2304 pixels resolution. People on doom9.org report that the GeForce GT 1030 has a fixed-function HW decoder for 10-bit VP9, which has been tested with YouTube 4K HDR clips.
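
On Linux you can also simply ask the driver what it currently exposes. Here is a small sketch (assuming the common vdpauinfo utility is installed, and that its output lists decoder profiles by name the way recent versions do):

Code:
# Sketch: grep vdpauinfo's decoder capability listing for HEVC/VP9 entries to
# see which profiles the installed Nvidia VDPAU driver currently advertises.
# Assumes the vdpauinfo utility is installed.
import subprocess

out = subprocess.run(["vdpauinfo"], capture_output=True, text=True).stdout

print("HEVC/VP9 decoder profiles advertised by the driver:")
for line in out.splitlines():
    if "HEVC" in line or "VP9" in line:
        print(" ", line.strip())
# If the HEVC Main 10 or 10-bit VP9 profiles are missing here, the limitation
# sits in the driver/VDPAU layer, not in the GT 1030's PureVideo hardware.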

(2017-07-27, 12:40)teeedubb Wrote: One thing to note is that Nvidia cards are not able to accelerate 10-bit H.265 content with VDPAU, so you need a really beefy CPU to play them. I don't think this feature will come to VDPAU either - from what I have read Nvidia has abandoned it.
You of course need each and every part in the full chain to support it before you can actually play back those types of files in Kodi with hardware-accelerated decoding. For example, even if and when Nvidia adds support for HEVC Main 12 (12-bit) in their drivers, you still have to wait until both FFmpeg and Kodi add support for it too, and vice versa if the case is reversed. All parts need to support it. It's a SPOF (Single Point Of Failure) chain:

Kodi's core video player with renderer and demuxer (including dependencies on FFmpeg) => HW decode API support => Manufacturer device drivers for the OS => Capable HW video decode engine.
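
A quick way to check a couple of links in that chain on a Linux box is sketched below (assuming ffmpeg and nvidia-smi are both on the PATH; hevc_cuvid and vp9_cuvid are the names FFmpeg uses for its Nvidia hardware decoder wrappers):

Code:
# Sketch: verify two links of the chain - does the local FFmpeg build expose
# Nvidia HEVC/VP9 hardware decoders, and which driver version is installed?
# Assumes ffmpeg and nvidia-smi are both on the PATH.
import subprocess

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True).stdout

decoders = run(["ffmpeg", "-hide_banner", "-decoders"])
for name in ("hevc_cuvid", "vp9_cuvid"):
    print(f"FFmpeg decoder {name}: {'yes' if name in decoders else 'NO'}")

driver = run(["nvidia-smi", "--query-gpu=driver_version,name",
              "--format=csv,noheader"]).strip()
print(f"Installed driver / GPU: {driver}")
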
#8
(2017-07-27, 12:45)Klojum Wrote: The Nvidia Linux video driver only does 8bit, the Nvidia Windows video driver does 4K hevc 10bit just fine on the GT1030.

So how can this Nvidia card be future-proof if that sort of 10-bit HEVC decoding is not possible in, say, LibreELEC?
Or have I got it all wrong?

#9
(2017-07-27, 12:33)RockerC Wrote:
(2017-07-27, 12:04)nsnhd Wrote: We need benchmarks of 4K @ 60fps on 4K displays, maybe HDR too, which I haven't seen anywhere.
https://forum.doom9.org/showthread.php?s...ost1810727

"Observations: The GT 1030 can play all of the videos above perfectly and with only about 20% CPU utilization on an ancient 8 year old 2 core 3Ghz wolfdale PC. This lowest-end (so far) Pascal GPU furthermore has the lowest TDP in the 10x series, which means that no old PC system has to worry that their old power supply can support the graphics card. The 2 GB video memory seems perfectly adequate to play the most demanding 4k videos, while *simultaneously* downscaling to whatever resolution the current PC monitor has. This seems to me to demonstrate that this GPU is ideally suited to the millions of old PC systems, and all of the HTPC systems, that would like to upgrade their systems to allow playback of what I think will be the universal video encoding standard. Unlike some other posters in this forum, I would heartily recommend this GPU. At $69 (ZOTAC GT 1030) I don't think the card has any drawbacks at all. "

IMHO we are at a stage today where you don't really need benchmarks when NVIDIA states in the specifications that the Pascal GP108 GPU core used in the GeForce GT 1030 supports hardware-accelerated video decoding. If it did not, their reputation would be ruined. As noted by the guy who did the benchmark linked above, all videos he tested (which included 4K HDR videos) played "with only about 20% CPU utilization on an ancient 8 year old 2 core 3Ghz" PC. So I think that for video decoding purposes you can safely buy Nvidia cards today and simply trust what the specification on the box says, without benchmarks, knowing that Nvidia knows what they are doing.
Have you read through to the end of that thread? The guy only tested on his 1080p display, and he himself was unsure about the performance of this GT 1030 on 4K displays.
https://forum.doom9.org/showthread.php?p...ost1812734
#10
(2017-07-27, 13:37)nsnhd Wrote: Have you read through to the end of that thread? The guy only tested on his 1080p display, and he himself was unsure about the performance of this GT 1030 on 4K displays.
https://forum.doom9.org/showthread.php?p...ost1812734
Yes, I read it, and it's irrelevant: since output scaling is done in hardware, it doesn't matter what monitor he used to run his tests. He could decode all 4K and 4K HDR videos in HW.

Read the rest of that thread and you will see that others there also confirmed that the GeForce GT 1030 has no problems with 4K HEVC & VP9 HDR videos @ 60fps HW decode.
#11
(2017-07-27, 13:33)wrxtasy Wrote:
(2017-07-27, 12:45)Klojum Wrote: The Nvidia Linux video driver only does 8bit, the Nvidia Windows video driver does 4K hevc 10bit just fine on the GT1030.

So how can this Nvidia card be future-proof if that sort of 10-bit HEVC decoding is not possible in, say, LibreELEC?
Or have I got it all wrong?
Correction: Not possible TODAY! But that is not because of a hardware limitation; it is only a software limitation in Nvidia's current device drivers. And this device driver limitation is not limited to the GeForce GT 1030 cards alone but applies to all GeForce GTX 10-series graphics cards, even the high-end ones. It is in Nvidia's best interest to fix this soon.

The hardware supports it, so of course Nvidia will add support for it in its device drivers sooner or later; it is just a matter of time. And personally I believe that it will be sooner rather than later, because Nvidia will lose money and reputation the longer it takes for them to add the support to their device drivers. People will buy fewer cards if the hardware knowingly supports features that are not yet enabled in the available device drivers, and others who already bought the cards will complain that they can't use the functions promised by the hardware specifications, which will give Nvidia, and especially all its GeForce GTX 10-series graphics cards, a bad reputation.

As a comparison, the Nvidia Shield TV did not support 10-bit HEVC on its launch day either, as that support came with a later software/firmware upgrade. Today Nvidia has a good reputation for driver/firmware support and I'm sure that they want to keep that good reputation.
#12
It's not only about decoding, it's about output at 4K resolution. You won't have any issue decoding 4K and then downscaling to 1080p, but you will when outputting at 4K. The 2GB VRAM is the bottleneck. And apps like madVR use 16-bit buffers, not 8-bit.
#13
(2017-07-27, 13:56)nsnhd Wrote: It's not only about decoding, it's about output at 4K resolution. You won't have any issue decoding 4K and then downscaling to 1080p, but you will when outputting at 4K. The 2GB VRAM is the bottleneck. And apps like madVR use 16-bit buffers, not 8-bit.
Sorry, but you are just plain wrong here, so here is one last reply to explain it to you, and then I'm choosing not to reply to you any longer on this topic. This is about video decoding and video output, and 2GB VRAM is enough for 4K output when 4K HDR videos are hardware decoded using the dedicated video decode engine on these cards. You are probably confusing hardware-accelerated video decoding playback with the GPU GFLOPS graphics requirements for playing AAA PC games, which is not at all the same thing; that is like comparing apples and oranges.

Do you honestly believe that Nvidia would release a card with an "HDMI 2.0b" port that is not capable of 4K output (not even 4K video playback output)? Just read the specification:

https://www.geforce.com/hardware/desktop...ifications

You can compare this to the first Raspberry Pi with only 256MB of RAM, on which Kodi can today decode H.264 and output 1080p video, and that is with shared RAM used for both system and video. So why should 2GB of dedicated VRAM not be enough to decode and output 4K when the GPU has a dedicated video decode engine? You can also compare this to, for example, the Amazon Fire TV 4K, which only has 2GB of total RAM, and again that is shared between graphics and system; it too has a dedicated video decode engine, so it is capable of decoding and outputting 4K video without breaking a sweat. And many non-Android 4K hardware players have much less RAM than that.
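
A back-of-the-envelope calculation shows why; the sketch below only counts the raw decoded frame buffers (ignoring driver overhead and the extra reference frames the decoder keeps around):

Code:
# Back-of-the-envelope: how much VRAM do decoded 4K frames actually need?
# NV12 (8-bit 4:2:0) uses 1.5 bytes per pixel, P010 (10-bit 4:2:0) uses 3.
def frame_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1024 / 1024

for label, bpp in (("8-bit NV12", 1.5), ("10-bit P010", 3.0)):
    one = frame_mb(3840, 2160, bpp)
    print(f"4K {label}: {one:.1f} MB per frame, "
          f"about {one * 16:.0f} MB for a 16-frame queue")
# Even a generous 16-frame queue of decoded 10-bit 4K frames is under 400 MB,
# nowhere near the 2 GB of VRAM on a GT 1030.
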
#14
Image

Just to clarify, you misread the wiki: it is referring to GM206, which is limited to HEVC Main10/VP9 4K decoding. The GT 1030 (GP108 GPU), as you can see, has full fixed-function 8K hardware decoding for both HEVC Main10 and VP9 10-bit Profile 2.

You should only buy the MSI card with the DP 1.4 output, which supports 4K and 5K resolution monitors, because DVI-D on the GT 1030 is limited to 1080p only due to single-link DVI.

Also on Linux, Nvidia Pascal cards can decode HEVC Main10 just fine with NVDEC using Nvidia Video Codec SDK 8.

https://developer.nvidia.com/nvidia-video-codec-sdk

Quote:10/12-bit decoding support with HEVC/VP9

GT 1030 might not have enough VRAM for Netflix 4K.

http://nvidia.custhelp.com/app/answers/d...vidia-gpus

Quote:NVIDIA Pascal based GPU, GeForce GTX 1050 or greater with minimum 3GB memory
#15
@RockerC Thanks for the info.

(2017-07-27, 11:25)RockerC Wrote: Good to know if that PCI-Express is a backwards compatible format most newer cards will also work in PCI-E 2.0 and even PCI-E 1.0 slots, so you could possibly use them on many older motherboards as long as it has a free right-size PCI-Express x16 slot with enough lanes (uses x4).

By this, do you mean that if my crappy motherboard has one of those situations where there are 2 full-length x16 slots that share lanes ("1 x16 slot, 1 x8 slot; when the x8 slot is occupied, the x16 slot runs at x8"), and I populate the 2nd slot with, say, a RAID card, this card won't be affected since it only uses x4 lanes anyway? I'm not very knowledgeable about these things (I don't game), and didn't know an x16 slot card might not actually use all the lanes available. If this is the case, this card could be a great find!