• 38 Posts
  • 240 Comments
Joined 9 months ago
Cake day: February 10th, 2024

  • I built a new machine pretty recently, also with an RX 7800XT GPU (factory overclocked). When sitting idle at the desktop, the system draws about the same amount of power as my old machine did with an RX 480. So I think trying to put the big GPU to sleep during desktop use might be barking up the wrong tree.

    I suggest getting a power monitor, like a Kill-A-Watt, and taking measurements while you experiment. Here are some ideas to consider:

    • Are you using multiple monitors? I have read that newer AMD GPUs sometimes draw more power than they should in this case. It might depend on the resolution and/or windowing system in use. (I don’t remember whether the reports I read were on Wayland or Xorg.) It almost certainly is a driver issue. (A quick way to check from software is shown after this list.)
    • Are you using nonstandard timings? Have you tried different refresh rates? https://community.amd.com/t5/graphics-cards/which-monitor-timing-parameter-allows-gpu-vram-frequency-to/td-p/318483
    • Have you been playing games for hours every day, with no frame rate limit? The graphics card can draw considerably more power pushing polygons at 1440p@180Hz than it does at 90Hz, for example, and I don’t think the wattage progression from idle to full load is linear.
    • Are you using recent kernel and firmware versions?
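
    If you want to check from software as well, the amdgpu driver exposes clock and power information through sysfs. Here’s a quick sketch; the card number, and whether the power sensor is named power1_average or power1_input, vary by system and kernel version:

      # VRAM clock states the card supports; the active one is marked with '*'
      cat /sys/class/drm/card0/device/pp_dpm_mclk

      # Current board power draw, reported in microwatts
      cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average

    If the VRAM clock sits at its highest state while the desktop is idle, that points at the multi-monitor/timing issue above rather than normal load.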



  • AFAIK, RetroArch is just a front-end for the emulators that actually use the controller, so getting this to work depends on the emulator you’ll be using.

    I would expect any decent emulator on Linux to work with the standard Linux joystick and/or evdev APIs, which are supported by the Linux DualShock 4 driver. That driver is built into the Linux kernel, so nothing extra should need to be installed. However:

    It’s possible that your distro might not load that driver automatically. To check, connect the DS4, power it up with the PlayStation button (if its light isn’t already on), and run lsmod | grep -E 'hid_sony|hid_playstation' in a terminal. If the output contains lines mentioning hid_sony or hid_playstation, the driver is loaded.

    It’s possible that your distro might not have labeled the DS4 as a joystick device in udev. That isn’t strictly required, but some software expects to see it. On the distros I’ve used, the easiest way to get this done is to install the steam-devices package, though I think most desktop distros handle it out of the box these days.
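
    If you want to confirm the controller is visible to the system, something like this should work (the DS4 usually shows up with “Wireless Controller” in its name; the event/js node numbers will differ on your machine, so eventN below is a placeholder):

      # Find the controller among the kernel's input devices
      grep -iA4 'wireless controller' /proc/bus/input/devices

      # Check that udev tagged it as a joystick (replace eventN with the node found above)
      udevadm info --name=/dev/input/eventN --query=property | grep ID_INPUT_JOYSTICK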

    You don’t want DS4Windows. That’s Windows software. There is a program (not a driver) called ds4linux, which creates a virtual Xbox controller alongside the real DS4, similar to what Steam Input does when you use it. You shouldn’t need this for games/emulators that were written properly for Linux, but it’s there for cases when a developer took a shortcut and assumed Microsoft game hardware is standard on our non-Microsoft OS. Alternatively, I think you can use Steam Input when launching non-Steam games in Steam.

    There are various joystick test programs for Linux that can give you an idea of whether the OS sees the controller. (This can be helpful when a game doesn’t appear to see it, to determine whether it’s the game’s problem or a connection/driver problem.) KDE Plasma has one built in to its System Settings. There’s also a generic one called jstest-gtk, available in most desktop distros. There are probably more out there.
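
    If you prefer the terminal, the classic jstest and evtest tools do the same job (package names vary by distro; jstest usually comes from a “joystick” or “linuxconsole” package). For example:

      # Dump axis/button events from the first joystick device
      jstest /dev/input/js0

      # Or inspect raw evdev events (pick the controller from the list it shows)
      sudo evtest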

    Keep in mind that test programs like that don’t necessarily know which inputs map to which buttons/sticks on the controller. Don’t panic if they look mixed up in a test program; try it in a game first. If they’re still mixed up, look for a way to remap the inputs.





  • I think dropping loadable module support would severely limit what users can do when a driver misbehaves or doesn’t handle a particular device as well as an (in-tree) alternative.

    Also, I wonder how they expect to deliver on “The KDE operating system” that “doesn’t break” when their existing distro has been more than a little rocky so far. Who do they think will do the long-term work of raising and maintaining the quality bar?

    It would be kool to have a solid reference distro where Plasma could shine, especially for organisations and newer users who don’t know how to replace GNOME on existing distros. But this proposal gives me the impression that they underestimate the effort required, so I am skeptical.










  • Diablo IV is a DirectX 12 game. Those don’t use DXVK directly, though I think they might still use the DXGI component that comes with it, even though vkd3d-proton is providing the Direct3D 12 support.

    DXVK_CONFIG_FILE is not a flag, but an environment variable. It is for overriding the location where dxvk.conf is expected to be. By default, that file is expected to be in the game’s current working directory when it starts, so you don’t need this environment variable at all if you figure out what directory that is. It’s sometimes the directory where the game executable lives, but not always. (Hint: look for a dxvk or vkd3d log file.) Details here.

    Note that one person in that reddit thread says dxvk.conf can be in “any folder of the wine prefix”. As far as I know, that’s just plain wrong. It has to be where DXVK is expecting to find it.
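
    For reference, dxvk.conf is just a plain-text file with one option per line. Assuming the memory limits being discussed for this bug (the same values as in the launch option below), it would look like:

      # dxvk.conf, placed in the game’s working directory
      dxgi.maxDeviceMemory = 8192
      dxgi.maxSharedMemory = 8192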

    If you can’t figure out where to put the config file, you might try applying those dxgi settings using an environment variable instead. In Steam, the game launch options would be: DXVK_CONFIG="dxgi.maxDeviceMemory = 8192; dxgi.maxSharedMemory = 8192" %command%

    Here’s a different possible workaround, to be put in Steam’s game launch options: PROTON_HIDE_NVIDIA_GPU=1 %command%
    Or if using Lutris with a Proton Wine runner, you would add an environment variable to the game (or launcher) profile, with key: PROTON_HIDE_NVIDIA_GPU and value: 1.

    If none of those workarounds help, you’ll want to get involved in these discussions:

    https://github.com/HansKristian-Work/vkd3d-proton/issues/1588

    https://github.com/ValveSoftware/Proton/issues/7199

    Edit: Several people have reported that this VRAM bug doesn’t happen on AMD cards. If you happen to have one, you might give it a try.


  • Unfortunately, I don’t think D is good enough to prove your point. From your follow-up comment:

    A language that for all intents and purposes is irrelevant despite being exactly what everyone wanted,

    As someone who uses D, I can attest that it is not what everyone wanted; at least not yet. Despite all the great things in the language, the ergonomics around actually using it are mediocre at best: Several of its appealing features quickly turn it into a noisy language, error messages are often so obtuse as to be useless (especially with templates and contracts in play), and Phobos (the standard library) is practically made of paper cuts. Also, the only notable async support is a fragile mess, and garbage collection is too deeply embedded into both the stdlib and the ecosystem.

    (To be fair, D could be vastly improved with better defaults and a better standard library. That might happen in time, as Walter and the other maintainers have shown interest, but for now it’s just wishful thinking.)

    Also, D is an entirely different language from C++, and as such, would require code rewrites in order to bring safety to existing projects. It’s not really comparable to a C++ extension.