• 0 Posts
  • 98 Comments
Joined 1 year ago
Cake day: July 14th, 2023

  • I have worked hard to remind some of the Trump-voting Massholes I have the pleasure of interacting with on a regular basis of this. I think I’ve made inroads with at least one of them. I also make sure to bring this up:

    “The political folks believed that because it was going to be relegated to Democratic states, that they could blame those governors, and that would be an effective political strategy.”

    Trump committed negligent homicide in blue states for political gain.

  • Let’s just take NYT for example. Subscription costs $325/year. Why would I ever pay that much? It’s not 1954. I’m not sitting down with my morning coffee and reading the damn thing front to back. I’m reading maybe one article a week from 15 different sources. Am I supposed to pay $5000/year just to cover my bases?

    As with everything else in [CURRENT YEAR] the value proposition is so absurdly out of step with reality that fixing it basically relies on rolling out the guillotines.

  • It’s not a Ponzi scheme because withdrawals and deposits are scheduled and mandatory. You don’t get into a situation where investors lose faith and ask for money that you don’t have because that’s not allowed. You get your money when you reach the requisite age. You also don’t get into a situation where you run out of investors because it’s a mandatory payroll tax.

    Essentially the only issue facing social security is the fact that since 1974, wage growth has become decoupled from productivity gains, meaning the payroll tax that funds social security captures a smaller proportion of business revenue over time - because businesses spend less of their revenue on payroll than they used to.

    Social security doomerism of the type on display here is a tacit acceptance of conservative propaganda on the subject. The government is fully capable of indefinitely maintaining a pension fund, we just have to stop accepting the lie that it isn’t.

  • I guess I’m wondering if there’s some way to bake the contextual understanding into the model itself instead of keeping it all in VRAM. Like if you’re talking to a person and you refer to something that happened a year ago, you might have to provide a little context and it might take them a minute, but eventually they’ll usually remember. Same with AI: you could say, “hey, remember when we talked about [x]?” and then it would recontextualize by bringing that conversation back into VRAM.

    Seems like more or less what people do with Stable Diffusion by training custom models, LoRAs, or embeddings. It would just be interesting if it were a more automatic process that happened as part of interacting with the AI: the model is always being updated with information about your preferences instead of having to be told explicitly.

    But mostly it was just a joke.
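    The “hey, remember when we talked about [x]?” idea above is roughly what retrieval-based memory does: old conversations live outside the context window, and a similarity search pulls the relevant one back in when the user alludes to it. Below is a minimal toy sketch of that pattern; the `ConversationMemory` class is hypothetical, and the bag-of-words cosine similarity is a stand-in for real embeddings.

    ```python
    # Toy sketch of retrieval-based conversation memory: past chats are stored
    # outside the model's context ("not in VRAM") and retrieved on demand.
    import math
    import re
    from collections import Counter

    def vectorize(text):
        # Bag-of-words term counts; a real system would use learned embeddings.
        return Counter(re.findall(r"[a-z0-9]+", text.lower()))

    def cosine(a, b):
        # Cosine similarity between two sparse count vectors.
        dot = sum(a[w] * b[w] for w in set(a) & set(b))
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    class ConversationMemory:
        def __init__(self):
            self.archive = []  # (text, vector) pairs; persisted, not held in context

        def store(self, text):
            self.archive.append((text, vectorize(text)))

        def recall(self, query, top_k=1):
            # Rank archived conversations by similarity to the user's allusion
            # and return the best matches for re-injection into the context.
            qv = vectorize(query)
            ranked = sorted(self.archive,
                            key=lambda item: cosine(qv, item[1]),
                            reverse=True)
            return [text for text, _ in ranked[:top_k]]

    memory = ConversationMemory()
    memory.store("We discussed training LoRAs for Stable Diffusion last spring.")
    memory.store("You asked me about sourdough starters in January.")

    # "Hey, remember when we talked about LoRAs?"
    context = memory.recall("remember when we talked about LoRAs?")
    ```

    The automatic version would just run this retrieval step on every message, so the model “remembers” without being asked, rather than the user explicitly triggering a LoRA or fine-tune.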