I want to host security cameras and a Plex server. Does this mean that my server needs a GPU (or would benefit from one)? I heard Plex does fine with just a CPU.

  • Still-Snow-3743@alien.top · 11 months ago

    I have found transcoding to work noticeably better when using QuickSync (the encoder built into Intel CPUs) rather than a discrete GPU.

    At this point, I think the only real reason you would want a GPU is for LLMs.
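
    To make that concrete, here's a rough, hedged sketch of what a QuickSync transcode looks like from the command line with ffmpeg. It assumes an ffmpeg build with QSV support and an Intel iGPU exposed as a /dev/dri render node; the file names are just placeholders.

    ```
    # Hypothetical example: decode H.265 and re-encode to H.264 on the iGPU via QuickSync.
    # Assumes ffmpeg was built with QSV support and /dev/dri/renderD128 exists.
    ffmpeg -hwaccel qsv -c:v hevc_qsv -i input.mkv \
           -c:v h264_qsv -preset medium -b:v 8M \
           -c:a copy output.mkv
    ```

    Plex and Jellyfin drive the same hardware through their own transcoder settings, so this is only meant to show where the work happens, not something you'd normally run by hand.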

    • jimheim@alien.top · 11 months ago

      A decent, recent Nvidia GPU is going to beat most CPUs. I wouldn’t shell out extra for a GPU just for transcoding, though. Good enough is good enough.

  • releenc@alien.top · 11 months ago

    Plex can only use the GPU if you have the paid version; those features are disabled in the free version. I run my Plex server on Windows Server 2016 with an i7-3770K CPU @ 3.5GHz and 16GB RAM, with no dedicated graphics card, just the Intel graphics on the CPU. I have no performance problems transcoding videos with the free version of Plex.

  • jerwong@alien.top · 11 months ago

    I use Jellyfin, which is similar to Plex. I have it on a Raspberry Pi 4 8 GB. It's perfectly fine if I'm sending H264, but most modern browsers do not support H265, so that forces the server to transcode. That will consume almost all processing power if it's CPU-only and is a very slow process.
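
    If you're not sure which of your files will direct-play and which will force a transcode, a quick hedged check is to look at the codec with ffprobe (the path below is just a placeholder):

    ```
    # Print the video codec and resolution of the first video stream.
    # "h264" will usually direct-play in browsers; "hevc" (H.265) will usually force a transcode.
    ffprobe -v error -select_streams v:0 \
            -show_entries stream=codec_name,width,height \
            -of default=noprint_wrappers=1 movie.mkv
    ```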

    • agent_kater@alien.top · 11 months ago

      > That will consume almost all processing power if it's CPU-only and is a very slow process.

      This is a complicated topic and the terminology is a bit ambiguous.

      Yes, non-hardware-accelerated transcoding is slow and will consume the CPU.

      However, you don't necessarily need an external GPU to do hardware-accelerated transcoding. When you use Intel QuickSync, for example, the codec hardware is part of the CPU. On the other hand, it is only present in CPUs that have integrated graphics, so you could still say the transcoding is done “by the GPU”, just not by the additional one that you put in. In fact, putting in a dedicated graphics card often disables the integrated graphics, and you have to use tricks to re-enable it before you can use it for transcoding again.
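
      If you want to check which case you're in on a Linux box, a minimal sketch (assuming the vainfo tool from the vainfo/libva-utils package is installed):

      ```
      # Is there an iGPU render node at all? If /dev/dri is empty, there's nothing to accelerate with.
      ls -l /dev/dri/
      # Which codecs the iGPU can encode/decode through VA-API (what QuickSync exposes on Linux).
      vainfo
      ```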

  • ArgoPanoptes@alien.top · 11 months ago

    If you want object detection with your IP cameras, you can use Frigate, and for good performance you can buy a Google Coral to handle the object detection part.
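
    As a rough sketch only (the paths and port are assumptions, and the detector itself is set up in Frigate's config file), running Frigate in Docker with a USB Coral passed through looks roughly like this:

    ```
    # Hypothetical example: Frigate with a USB Coral TPU handed into the container.
    docker run -d \
      --name frigate \
      --restart unless-stopped \
      --shm-size=256m \
      --device /dev/bus/usb:/dev/bus/usb \
      -v /opt/frigate/config:/config \
      -v /opt/frigate/media:/media/frigate \
      -p 5000:5000 \
      ghcr.io/blakeblackshear/frigate:stable
    ```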

    • Krieg@alien.top · 11 months ago

      As long as you have Plex Pass, hardware transcoding is extremely good with modern QuickSync Intel processors, and especially good if you run Linux.

      • agent_kater@alien.top · 11 months ago

        In fact, putting in a dedicated graphics card often disables the integrated graphics, including QuickSync, and you might have to set up a virtual screen to re-enable it.

  • Powerstream@alien.top · 11 months ago

    I bought an Intel Arc A750 for transcoding on my Jellyfin server. The CPU did fine for just 1 or 2 people watching at the same time, but it pegged the CPU at 100% and had limited support for which formats could be transcoded. The A750 can easily handle several people watching at the same time and can transcode any format, with little CPU usage, so everything else running on the server stays fast.
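
    For anyone wanting to try the same, a hedged sketch of handing an Intel GPU (Arc or iGPU) to a Jellyfin container; the render node path and host folders are assumptions, and hardware acceleration still has to be switched on in Jellyfin's Playback settings (QSV or VA-API):

    ```
    # Hypothetical example: Jellyfin in Docker with the Intel GPU's render node passed through.
    docker run -d \
      --name jellyfin \
      --device /dev/dri/renderD128:/dev/dri/renderD128 \
      -v /opt/jellyfin/config:/config \
      -v /mnt/media:/media:ro \
      -p 8096:8096 \
      jellyfin/jellyfin:latest
    ```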

  • fliberdygibits@alien.top · 11 months ago

    Just about anything machine learning or AI, transcoding for a media server, maybe a render farm for something like Blender?

  • Kltpzyxmm@alien.top · 11 months ago

    Get an Intel CPU with an iGPU (most have one) and you'll be good to go for everything you're doing there.

  • I_EAT_THE_RICH@alien.top · 11 months ago

    Yes, a GPU for transcoding is reason enough, but object detection using Frigate, or other AI stuff, is nice too. Buy an Nvidia.

  • sphereatmos@alien.top · 11 months ago

    With Plex, an Intel Quick Sync iGPU will be fine unless you plan to do more than 10 transcodes at a time.

  • jewbasaur@alien.top · 11 months ago

    My main server has a 3070 and I use it to stream games through Moonlight to all my TVs and computers around the house. That way I get the most value from the card instead of it being locked into one machine.

  • DarkKnyt@alien.top · 11 months ago

    I have two GPUs in a single tower.

    A GTX 750 that I share with my LXCs. It does Jellyfin transcoding, Frigate NVR for 3 cameras, Kasm accelerated desktops, XFCE4 acceleration for the PVE host, Jupyter TensorFlow, ErsatzTV transcoding, and I plan to use it for Immich. At most it is taxed about 25 percent, but I plan to add a lot more NVR and Jellyfin streams.
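
    For context, sharing one NVIDIA card across LXCs like that usually comes down to bind-mounting the host's device nodes into each container. A rough sketch of the relevant lines in a Proxmox LXC config file (the device major numbers vary per system, and the NVIDIA driver has to be installed on both host and container):

    ```
    # Sketch of /etc/pve/lxc/<id>.conf entries for sharing an NVIDIA GPU with a container.
    # 195 is the usual major for /dev/nvidia*; nvidia-uvm gets a dynamic major, so check
    # the real numbers with "ls -l /dev/nvidia*" and add a matching devices.allow line.
    lxc.cgroup2.devices.allow: c 195:* rwm
    lxc.mount.entry: /dev/nvidia0 dev/nvidia0 none bind,optional,create=file
    lxc.mount.entry: /dev/nvidiactl dev/nvidiactl none bind,optional,create=file
    lxc.mount.entry: /dev/nvidia-uvm dev/nvidia-uvm none bind,optional,create=file
    ```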

    I also have a 1660 Ti passed through to a Windows 11 VM for gaming. I use Sunshine and Moonlight for remote gaming, but I also run Easy Diffusion for some image generation. I had an LLM running (https://github.com/oobabooga/text-generation-webui) but it was too slow for what I'm used to; I just use Bing Chat, and now Meta on WhatsApp, for personal use, plus an LLM I have access to at work.

  • davepage_mcr@alien.top · 11 months ago

    There's the possibility of self-hosted speech recognition for use with Mycroft or other personal assistants. It saves you from having Amazon listen to your every word.