![](https://lemm.ee/pictrs/image/67f8d7b0-0ac8-4605-8d24-e14bda710a46.webp)
![](https://lemmy.world/pictrs/image/1f477879-f269-4fc2-805c-3cb0fe552f40.png)
Xcloud streaming does indeed work very well via Greenlight. I’m also using GeForce Now combined with PC Game Pass via gfn-electron so I can play Diablo. Very happy with it. On Debian Bookworm, btw
You followed the setup instructions in the welcome tab after adding the .xpi, right? It works great for me on Debian. If you run an update with dnf, does it check the repo for the extension like it should?
It’s a slightly different use case: I default to LosslessCut and switch to Shotcut when I need a video filter, or when I’m generally willing to concede to making a lossy cut.
Shotcut is way more flexible, but I can make a quick clip in LosslessCut with probably a third of the user inputs. To say nothing of trying to remember every ffmpeg parameter under the sun just to get consistent, usable output.
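For comparison, here’s roughly what the equivalent hand-typed ffmpeg command looks like for a lossless (stream-copy) cut. File names and timestamps are placeholders, and the snippet just echoes the command rather than running it:

```shell
start=00:00:10   # placeholder in-point
end=00:00:25     # placeholder out-point
# -c copy stream-copies both audio and video, so nothing is re-encoded.
echo ffmpeg -ss "$start" -to "$end" -i input.mp4 -c copy clip.mp4
```

Note that with `-c copy` the cut can only begin at a keyframe, which is exactly the snapping limitation discussed below.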
That’s what I’ve always assumed it does since back when quicktime player barely even ran on my PC yet for timeline operations it was significantly more responsive than WMP/MPC.
For LosslessCut I just get around this by encoding my input from source using keyint=n:scenecut=0 in ffmpeg, where n is a manually set keyframe interval.
So e.g. if my expected cut falls on a frame at t+10 seconds of footage, n can be set equal to the fps, and then there’ll always be a keyframe exactly at timestamp 00:00:01, 00:00:02 and so on. I can then open it in LosslessCut, easily snap to the frame I want and make the cut losslessly.
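A sketch of that prep encode, with file names and frame rate as assumptions: scenecut=0 stops x264 from inserting extra keyframes at scene changes, so with keyint equal to the frame rate a keyframe lands on every whole second. The snippet echoes the command since source.mp4 is a placeholder:

```shell
fps=30
n=$fps   # keyframe interval in frames = exactly one keyframe per second
# -x264-params passes the keyint/scenecut options straight through to x264;
# -c:a copy leaves the audio untouched.
echo ffmpeg -i source.mp4 -c:v libx264 \
  -x264-params "keyint=${n}:scenecut=0" -c:a copy prepped.mp4
```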
Yeah, the first encode generally means a lossy transcode by the time I get to my final video, but realistically that’d be part of my workflow either way, and this way it’s less
I couldn’t find it in my comment history, but months ago I saw a thread where someone was lamenting that after migrating from Reddit, they could no longer just google “episode ### discussion” for the show they were watching and find a corresponding thread; it wasn’t working for them with Lemmy. Someone else pointed out that it might be because Google personalises some search results now, so I tried their example query, and the top link was the post I was commenting on. Google had already indexed it as the most relevant result about an hour after the original post.
Agreed! Personally, my willingness to trust a service is generally a function of the utility I get from the service. My data has value, but I certainly wouldn’t consider it priceless!
If you did that though, wouldn’t you get pretty similar activity to what you’ve posted here? Just to servers other than Google’s
I’m not meaning to be contrarian to your point about this being a reason to de-Google. It’s just that, absent context, someone reading this post might be compelled to do so without understanding that it will compromise functions of their device they’re likely accustomed to.
To each his own, sure, but for most people that includes push notifications, and that’s how they work.
It’s seen as an investment, yes. Those are important factors for a currency, I agree.
Is there a part where you meant to connect these dots to substantiate the first statement about it being a problem that it’s seen as an investment?
Edit: I get it, you’re saying it’s a problem with the idea that Bitcoin should be used as a currency in everyday transactions. I don’t think that’s a popular use case for Bitcoin, though. I wouldn’t use “digital gold” for everyday transactions, similarly to how I wouldn’t use real gold. That’s not really a problem with Bitcoin though, more of a misunderstanding of it
It’s sad, but as a crypto user I’d be sketched out enough about using a centralised hot wallet app like Exodus in an official capacity, let alone entering my private key in something installed via a 3rd party app store. This probably happens on the Play Store a few times a week, and that’s on a bigger platform with a full security review process. It’s ultimately unavoidable.
Am noob on Debian, it’s great. Watched some videos to help introduce me, and it seems like the onboarding experience since 12 is way better than it used to be.
It’s the website that’s shitty, not the installer. And you’re not stupid, OP: the bootable live image installer should be the default download. Make sure you link directly to it in your post, if you make one. I should be able to go to the Debian website, hit download and get the best option, like I can on the Ubuntu site. I got the normal installer instead, but that was fine for me.
I wouldn’t necessarily say I’m representative of the average newbie, as I had brief forays into Linux many years ago. But it’s been painless: it took like an hour to set up, try a couple of DEs and add Flatpak sources, and then I was away, back to being immersed in my apps.
Wayland by default, the inclusion of non-free firmware sources and GNOME 43 are highlights for me and reasons why it deserves some focus. New users are coming from Windows, not Fedora. I’ve tried GNOME 2; that was a problem for me as a Windows user. GNOME 43 is not a problem for Windows users, and it’s noticeably more performant and stable, to the point that I only just realised it’s an older version when you pointed that out. Could’ve fooled me.
The reason I tried Debian first is because I wanted a blank slate, especially coming from Win11. That’s what I got after minimal and easy configuration. I’m satisfied with it and don’t feel curious about trying other distros, at least not right now.
mkvtoolnix-gui
thanks, edited to correct
All the developer needs to do is push a button to make EAC work. They’re probably busy hotfixing the 1.0 release, but I’m sure it’ll work soon; by not pressing it, they’re excluding all Steam Deck users.
Edit: apparently it’s not EAC that’s the problem. The game has its own anti-cheat, which can also potentially ban your account if you try to play on Linux: https://www.protondb.com/app/2073850
Sure, but this is largely because currently each client doesn’t need to aggregate the whole fediverse. In a decentralised network, you can’t split the sum total of processing required to run the fediverse equally amongst peers. Each peer would need to do more or less the same aggregation job, so the total processing across the network would scale with the number of peers rather than the number of instances, making it many times greater than the current setup. You could still argue it’s a negligible processing cost per client, but it’s certainly far less efficient overall, even assuming perfect I/O etc in the P2P system, and even if each client only federates the user-selected content.
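A back-of-envelope sketch of that scaling argument, with entirely made-up numbers for network activity, peer count and instance count:

```shell
posts_per_day=100000   # activity items to aggregate network-wide (made up)
users=1000000          # peers in a hypothetical fully P2P network (made up)
instances=500          # servers in the current federated model (made up)
# Each instance aggregates the firehose once, vs. every peer doing it itself;
# the ratio of total work is simply users/instances.
echo $(( (users * posts_per_day) / (instances * posts_per_day) ))
```

With these numbers the P2P design does 2000x the total aggregation work, and the multiplier grows with every peer that joins.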
Also, practically speaking, deploying a client app that can federate/aggregate constantly in the background (more or less required for full participation) and scale with the growth of the fediverse without becoming a resource hog would be pretty tough. Maybe possible, yeah, but I feel like it makes sense why it isn’t built that way.
The instance is the aggregator; if it’s P2P, then the aggregation is done by the client. In a torrent swarm you contribute bandwidth, not processing power.
Not without unlocking the bootloader, which is locked up tight on all CCwGTV models especially since the OTA update to Android 12
I use Shotcut for more or less any video operations that require re-encoding. It’s great for basic editing but also simple transcoding jobs too