I’ve been checking for the Flatpak daily 😭
This is where you can track the issue
Another shitty thing about Plexamp is that there’s no easy way to download your entire library in a converted format and auto-download any new additions.
The developer said that “this is not the intended use of Plexamp”, but the reasoning is flawed IMO
The only thing keeping me on Plex is that iOS downloads are supported natively.
The second Swiftfin gets that, I will be switching fully to Jellyfin.
Unless Plex adds something new and exciting that pushes them beyond the FOSS offerings.
Restic and borg are the best I’ve tried for remote, encrypted backups.
I personally use Restic for my remote backups and rsync for my local.
Restic beats out borg for me because it supports a lot more storage backends.
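For anyone curious, the whole restic loop is just a handful of commands. A sketch (the repository URL and paths below are made up, and any backend string restic supports works for RESTIC_REPOSITORY):

```shell
# Example only -- repo URL and paths are placeholders, adjust for your setup.
# restic reads these env vars instead of asking interactively:
export RESTIC_REPOSITORY="s3:s3.amazonaws.com/my-backup-bucket"
export RESTIC_PASSWORD_FILE="$HOME/.config/restic/password"

restic init                        # one time: create the encrypted repository
restic backup "$HOME/Documents"    # deduplicated, incremental snapshot
restic snapshots                   # list what you have
restic forget --keep-daily 7 --keep-weekly 4 --prune   # retention policy
```

Stick the backup line in a cron job or systemd timer and it stays completely hands-off.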
YES this.
Back when I was on Windows 10, I meticulously deleted all pre-installed crap (candy crush, Netflix, etc.), and turned off all tracking, ads, etc.
About a month later they pushed a major update and all those pre-installed apps were back, with more. All the settings I turned off were reverted.
I won’t ever go back. The only games I really can’t play are all online (League, etc.), and TBH good riddance. Wasn’t adding value to my life anyway.
I was on Pop for a while; if I was still using an Nvidia card I would still be on Pop. Their built-in Nvidia support/installer is just so convenient and seamless for the most part.
Nvidia is just such a pain on Linux. Like if it works then great, but I have had just so many minor problems in the past.
My Nvidia card is essentially just a backup now in my server in case I need video output for a terminal.
Used: yes
Contributed: no
I know I know, I am sorry. Just started using it a few months ago (through Organic Maps on iOS), and honestly have started using it more than Google/Apple Maps. This is a good reminder for me to get off my ass and start contributing.
Yeah, potentially overkill, but all the power to anyone who wants to try them out. Freedom of choice is one of the best parts of Linux.
And sorry for the long response. It’s hard to gauge the proficiency that someone might have with Linux, so I tend to lean towards detailed explanations just in case
I think that there are definitely valuable/valid use cases for the software in the OP, but I think that the built in bash tools can get most people most of the way there. And learning the common bash/shell conventions is way more valuable than learning a custom tool that some distros/environments won’t support.
If someone already uses aliases, creates some custom scripts, and sets some useful environment variables (along with effective use of piping and redirection) and still needs something more specialized, then getting a new tool could help.
The downside is a reliance on another piece of software just to use the terminal. So I would only use something like this if I had a really solid, specific use case I couldn’t accomplish with what I already use.
I wouldn’t install a program for this if your use case is simple. You will end up relying on it when there are already some built in tools that can get you 99% of the way there.
For example, you can bind a commonly used rsync command to an alias: alias rsync-cust="rsync -avuP"
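To make that concrete, a few lines in ~/.bashrc cover a surprising amount of ground (names below are just examples, adjust to taste):

```shell
# ~/.bashrc -- example names, nothing here is mandatory

# Aliases for commands you run constantly
alias rsync-cust='rsync -avuP'
alias ll='ls -lAh'

# A tiny function beats a dedicated tool for simple chores:
# make a directory and cd into it in one step
mkcd() { mkdir -p "$1" && cd "$1"; }

# Environment variables picked up by other programs
export EDITOR=vim

# Piping and redirection compose the rest, e.g.:
# history | grep rsync             # find that command you ran last week
# du -sh * 2>/dev/null | sort -h   # biggest items in the current dir
```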
Edit: rephrased to not discount the tools shared. I am sure if you had a specific reason to use them they could be helpful. But I think for many users the above options are more than enough and are supported pretty universally.
Yeah I saw a post about it a long time ago on Reddit for users with lots of devices
Basically it is just setting up one or two “central devices” that know all the client devices, but not linking the client devices individually.
IE: One server is connected to your phone, laptop, tablet, desktop, etc. But the phone is not directly connected to your laptop or desktop or tablet.
To be fair I don’t actually know if this is the best approach anymore or if just connecting all of them in a mesh is better 🤷
Here is a forum post describing it.
Check out LibreOffice instead, it’s more modern and actively maintained.
I would go from the bottom up instead of top down.
Make a list of software and tools you use, and search for functional Linux native equivalents. Then find the distro that supports up to date versions of that software (through flatpak or the package manager).
You can honestly do 100% of this without even touching the command line if you choose something user friendly like Mint, Pop OS, Ubuntu, or Fedora. Don’t fall into the rabbit hole of finding the perfect distro. Go from what you need to what supports it.
Keep the Windows partition around for a while until you are 100% confident you can fully make the switch.
Also, it’s good to get into the habit of using man or --help instead of, or in combination with, searching the internet. It makes you less reliant on searches and also ensures that you are using commands that correspond to the version of the software you actually have installed.
IE: man rm | grep recursive
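The same trick works with --help output, e.g. to find rm’s recursive flag (pattern shortened so it matches both GNU and BusyBox wording):

```shell
# Search the built-in help text for the flag you half-remember
rm --help | grep -i recurs
```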
Plex, PiHole, Photoprism, Home Assistant, Syncthing in a hub and spoke config, Caddy for reverse proxy, custom containers for: yt-dlp, restic, and rsync.
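The Caddy part is only a few lines of config. A minimal Caddyfile sketch (hostnames are placeholders; 2342 and 8123 are the usual Photoprism and Home Assistant defaults):

```caddyfile
photos.example.com {
    reverse_proxy localhost:2342    # Photoprism's default port
}

home.example.com {
    reverse_proxy localhost:8123    # Home Assistant's default port
}
```

Caddy fetches and renews the TLS certificates for those hostnames automatically, which is most of the reason I use it.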
Yeah, I am a bit salty about the whole “opt-out” telemetry thing. I know it’s just a proposal, but it just feels a bit slimy.
Fedora is upstream of RHEL, which is supposed to result in a mutually beneficial arrangement where Fedora users are essentially testers / bug reporters of code that will eventually make its way into RHEL. It’s just part of the collaborative, fast, and “open” nature of FOSS. Adding sneaky opt-out telemetry just feels like a slap in the face.
A super small example: I am a big Podman user these days, and have submitted a few bug reports to the Podman GitHub repos which have been fixed by Red Hat staff. This makes it faster for them to test and release stable code to their paying customers. It’s just a small example, but it adds up across all users to make RHEL a better product for them to sell. Just look at the Fedora discussion forum: there is so much bug reporting and fixing going on that will make its way into RHEL eventually.
Making and arguing for “Opt-out only” telemetry is just so tone deaf to the Linux community as a whole, but I think they got the memo after the shit storm that ensued over the past few days.
But HEY one of the biggest benefits of Linux is that I can pretty painlessly distro hop. I’ve done it before and can do it again. All my actual data is on my home server so no sweat off my back. openSUSE is looking pretty good, maybe I will give it a try.
I’m conflicted on this. I 100% think CLI applications should remain as packages but Flatpak IMO is superior for GUI. It just has a lot of “step in the right direction” sorts of things that address some of Linux’s faults.
The big two positives for me are:
I am on Fedora Silverblue and the concept of a base OS + Flatpaks just feels right for workstations. OCI containers (podman/docker/distrobox) as a bonus for development environments without borking your host.
But with this recent Fedora news (I know nothing has changed YET but I am just sussed out tbh), I am considering switching to OpenSUSE Aeon/Kalpa.
Ahhh I see. Might be an issue with the Nvidia drivers and Wayland.
I would try the following in order and see if any of them fixes it:
1. Update your system ("rpm-ostree update") and reboot.
2. Make sure you are on a Wayland session. This also provides instructions to see what apps are in X11 mode (which I suspect Firefox and Software Center are in your case).
2.1 If you are already on Wayland (it’s the default in Fedora 37): install Flatseal and force Wayland for Firefox (toggle off X11 and ‘Fallback to X11’).
2.2 If you are on X11, log out, switch to Wayland on the login screen, and follow 2.1 to force Firefox to Wayland.
3. If that doesn’t work, I would follow these specific steps to install Nvidia on Silverblue. RPM Fusion also has some specific guides for Silverblue that you should check to see if you missed a step.
I would also consider upgrading to Fedora 38, and bumping your RPM Fusion major version to track Fedora 38.
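For reference, both checks can be done from the terminal too. The flatpak override flags below are the CLI equivalent of Flatseal’s Wayland/X11 toggles (app ID shown is Flathub’s Firefox):

```shell
echo "$XDG_SESSION_TYPE"    # prints "wayland" or "x11" for your session

# Force the Firefox Flatpak onto Wayland (same effect as the Flatseal toggles)
flatpak override --user --socket=wayland --nosocket=x11 org.mozilla.firefox

# Undo it if anything misbehaves
flatpak override --user --reset org.mozilla.firefox
```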
I bought a .com for like $10 CAD from Cloudflare, with a domain name that isn’t linked to me.
Maybe overly paranoid, but it also makes it easy to get SSL certificates for my lab.
Nah @exu is right: non-IT focused companies do not have the skills or desire to reliably set up and maintain these systems. There is no benefit to them creating their own server stack based on a community distro to save a few bucks.
Smaller companies will hire MSPs to get them set up and maintain what they need. And medium-to-large companies would want an enterprise solution (IE: RHEL) that they can reliably integrate into their operations.
This is for a few high value reasons. Taking Red Hat as an example:
When lots of money is on the line, companies want as many safety/contingency plans as they can get, which is why Red Hat makes sense.
The only companies that will roll their own solution are either very small with knowledgeable IT people (smaller startups), or MASSIVE companies that will create very custom solutions and then train their own IT operations divisions (talking like Apple, Microsoft, Amazon levels).
Not to say what Red Hat did is justified or good, because hampering the FOSS ecosystem is destructive overall, but just putting this into context.