• 0 Posts
  • 1.19K Comments
Joined 2 years ago
Cake day: March 8th, 2024




  • Hah. You do you. I get how it’d be obnoxious to be called out, but man, it’s not my fault that you chose the worst possible example for this. Like, literally the worst iteration of Windows for the specific metric you called out, in a clearly demonstrable way that a ton of people measured because it was such a meme.

    You can block me, but “they are what they are” indeed.

    Incidentally, this is a classic opportunity to remind people that blocking on AP (ActivityPub) applications sucks ass: the only effect it has is that the blocker stops seeing what the blockee is saying about them, while everybody else still gets access to both. Speaking of software degradation, somebody should look into that.


  • Myyyyyeeeeh. A lightweight distro or a contemporaneous distro, sure.

    If I’m running GPU-accelerated Steam, tons of tabs in Firefox and the same highly customized KDE desktop full of translucent components and extra animations, I am willing to bet they’d both chug.

    Which is what the conversation is about: new software doesn’t suck, it’s doing more stuff.

    For sure, all things being equal Linux does run lighter on RAM and VRAM, so if you’re using something that is specifically memory-limited, such that Windows and Linux fall on opposite sides of overflowing the available memory, you’ll definitely see better performance on Linux. But that’s not an inherent issue with poorly made software having a huge performance overhead.


  • EVER is a long time.

    The current implementation? Not unless they stop training along the same lines they currently are. I think there’s some value, and you can access it pretty easily with the open source, freely available models that are out there and some semi-decent hardware, but hundreds of billions to trillions in revenue for multiple corporations? Nah.

    They’ll maaaaybe mitigate it by shifting people away from home computing and into connected systems, but I suspect that the moment the bubble pops or hardware production catches up with current demand, people will end up realizing they can run 90% of what’s being offered on a gaming laptop from 2020.


  • Then you’re either lying about it or haven’t booted a newer PC. Fast Boot was a back-of-the-box feature for Windows 8 for a reason. It was becoming a huge meme at the time how slow Win7 was to boot.

    If your 2020s PC with Windows 11 is taking 45 seconds sitting on the Windows logo like Win7 does (as seen in the benchmarks above), then you need some tech support, because something is clearly not working as expected. I don’t think even my weaker Win11 machines take longer than 10 seconds from the start of boot to the password screen.

    That may be true anyway, because the tiny hybrid laptop I’m using to write this is reporting 2-5% CPU utilization even with a literal hundred tabs open in this browser. So… yeah, either you have a knack for hyperbole or something broken.




  • This is not it. Not only is there a microinverter and a breaker there to address that issue, but my understanding as a layman is that the load on the circuit comes down to how much you’re drawing (i.e. if you’re generating 1200W behind the microinverter and pulling 1500W, you’re pulling 1500W through the circuit, not 2700W; rough numbers on that below).

    The bigger fire hazard here is the battery many of these come with for storage, honestly.

    That’s not to say there isn’t a bit of a risk. If you need to do something in the installation, you need to be careful to disable both the grid breaker and the microinverter; otherwise it’s entirely possible for the grid safety to blow and the inverter to keep pumping power into your house. But as the previous poster says, there’s a reason these are legal to install in apartments all over Europe, and it’s not just European grids being set for higher amps. FWIW, most of these kits max out at 800W of output. My understanding is they’re perfectly fine to use as a cost mitigation and they’ll keep your fridge going in a blackout, but no, they won’t be constantly tripping your fuse.
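
    To put rough numbers on that layman model (these are just the figures from my example above, assuming the microinverter’s output offsets whatever the circuit is drawing rather than adding to it; a back-of-the-envelope sketch, not an electrical spec):

    ```python
    # Back-of-the-envelope for the example above (layman model, not an electrical spec):
    # the microinverter's output offsets the draw on the circuit rather than adding to it.
    load_w = 1500      # what the appliances on the circuit are pulling
    inverter_w = 1200  # what the panels feed in behind the microinverter

    grid_w = max(load_w - inverter_w, 0)  # what still comes in through the breaker
    print(f"Through the circuit: {load_w} W (not {load_w + inverter_w} W); from the grid: {grid_w} W")
    ```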


  • Except the Linux userbase has been saying that exact thing for the past ten years, so again: has Linux also degraded in sync, or, hear me out here, is this mostly a nostalgia thing that makes you forget the kludgy performance issues of the software you used when you were younger, and that things have mostly gotten snappier over time across the board?

    As a current dual-booter I’ll say that Windows and Linux don’t feel fundamentally different these days, for good and ill. Windows has a remarkably crappy and entirely self-inflicted issue with its online-search-in-Start-menu feature, which sucks but is at least toggleable. Otherwise I have KDE and Win11 set up the same way and they both work pretty much the same. And both measurably better than their respective iterations from 10, let alone 15 or 20, years ago.



  • Well, the problem is less setting up the birthdate and more whether the birthdate needs to be verified.

    Plenty of OSs already query for a birthdate, particularly on gaming devices. And yes, they already provide age-based protections.

    The question is, does the parent/account creator need to enter an accurate birthdate or not, and how does the system know?

    If they don’t, then whatever, it’s the same self-declaration we already have all over the Internet. No biggie. Everybody was born in 1901 and we’re all chill about it. It still makes for an absurd situation where you HAVE to have a personal profile for every user on every computer, which a ton of computers aren’t expecting, so it’s still dumb on top of being useless, but it’s a solvable problem.

    If they do, then you now have one of the biggest cryptographic and data management challenges in computing history. How do you have every single device across the entire planet interface with every single piece of software and server to authenticate a piece of personal data and safely store it so you don’t have to constantly re-check? It’s insane. Plus it removes a parent’s ability to enable their children to engage with content at whatever speed they see fit. And there are potentially different regulations in different areas, where both the server and user location may change the required behavior, so the whole thing is an absolute mess from the concept up.


  • That’s… not really true, and not what that link shows. Those latency tests still show then-modern devices topping the list. They’re arguing that some then-modern low-end devices have more button-to-screen latency than older hardware (which they would, given he’s comparing single-threaded, single-tasking bare-metal stuff from the 80s spitting signals out to a CRT against laptops with integrated graphics). And they’re saying that at the time (I presume the post dates from 2017, when the testing ends) this wasn’t well understood, because people were benching the hardware and not the end-to-end latency factoring in the I/O… which was kinda true then but absolutely not anymore.

    I’d get into the weeds about how much or how little sense it makes to compare an Apple II drawing text on a CRT to typing in a PowerShell/Linux terminal window inside a desktop environment, but that’d be kind of unfair. Ten years ago this wasn’t a terrible observation to make with the limited tools the guy had available, and this sort of post made it popular to think about latency and pushed controller, monitor and GPU manufacturers to focus on it more.

    What it does not show, though, is that an Apple II was faster than a modern gaming PC by any metric. Not in 2017, and sure as hell not in 2026, when 240Hz monitors are popular, 120Hz TVs are industry-standard, VRR is widely supported and keyboard, controller, monitor and GPU manufacturers are obsessed with latency measurements. It’s not just fallacious, it’s wrong.


  • When was the last time you booted a 2011 machine? Because man, is that not true.

    And that’s a 2016-2017 era PC.

    Windows 7 didn’t even have Fast Boot support at all. I distinctly remember recommending that people let their PCs sit for a couple of minutes after booting, so that Windows could finish whatever the hell it was trying to do in the background instead of clogging up whatever else you were trying to do.

    Keeping my old hardware around compulsively really impacts my perception of this whole “things were better when I was a teenager” stuff.


  • That’s some nonsense, though.

    For one thing, it’s one of those tropes that people have been repeating for 30 years, so it kinda stops making sense after a while. For another, the reason it doesn’t make sense is that it doesn’t account for modern computers doing more now than they did then.

    In 2016 I had a 970 that’s still in an old computer I use as a retro rocket, and I can promise you that, wonderful as it was, I couldn’t have been playing Resident Evil this week on that thing. So yeah, I notice.

    And I had a Galaxy S7 then, which is still in use as a bit of a display, and I assure you my current phone is VERY noticeably faster, even discounting the fact that it’s displaying 120fps rather than 60.

    Old people have been going “things were better when I was a kid” for millennia. I’m not assuming we’re gonna stop now, but… maybe we should.



  • I keep thinking back to all the conversations with alleged leftists here about how they were both the same, how Biden was too soft on Israel, which disqualified Harris, and how at least Trump was running on ending foreign wars.

    Still haven’t seen any “oh, wow, yeah, that’s way worse than I thought it’d be, I was kinda wrong on that one”, either.

    I know I should not be pushing the issue in hopes that they quietly show up for the midterms, but at this point US politics is not worth engaging with, and you can only take so many middle-class cosplayers smugly calling you a naive centrist for even entertaining a gradient of madness between US political factions before you start getting flashbacks as the circling of the drain speeds up.