Ex-Redditor. I have big autism, big sad-all-the-time, and weird math energy.

Interests

  • extreme metal
  • audio engineering
  • electrical engineering
  • math
  • programming
  • anarchism

Dislikes

  • proprietary software
  • advertisements
  • paywalls
  • capitalism
  • bigotry
  • people who defend the above
0 Posts • 14 Comments
Joined 1 year ago
Cake day: June 17th, 2023




  • set up shop on an instance

    Don’t do that. You probably should have multiple accounts on different instances. If you really need a continuous, single identity, post links to all your usernames in each.

    This is why the move from Reddit was so difficult for Redditors: we put all our eggs in Reddit Inc.’s basket. All our content is under Reddit’s control. This analysis can be applied to any centralized social media service. If your instance shits the bed or defederates from everyone else, you can move somewhere else. You can start your own in the worst case. It’s annoying, but at least there is a real path to move on.

    We shouldn’t be putting our eggs in any one basket. We shouldn’t have been doing it before the Fediverse, and we shouldn’t be doing it here either. Your social media access should not be dependent on the goodwill of one person or entity. Eventually, that entity will corrupt.

    Also, I’m on vlemmy.net. Right now, they haven’t defederated from anyone, and I believe we’re still not banned from Beehaw or anyone else. If you really want the whole Fediverse (and you probably don’t), make an account on vlemmy or one of the top three instances on this page.

    Why don’t you have a second account?

    Lazy. Don’t care if my shit gets fucked. But if you do care if your shit gets fucked, then you shouldn’t rely on centralized social media.


  • because all of them can also be said of DC electrical current.

    I mean, I can’t and wouldn’t force you to think a certain way, but that premise is false, and I thought I demonstrated as much in the previous comment.

    What I can add is that actual “DC current”, e.g. that delivered by a physical, nearly-constant current source that turned on at some point in time and ostensibly will be turned off before the heat death of the universe, does have an AC component! At the very least, it will turn on and off, which is a variation in time. When we design circuits for “DC current” (or voltage), we make the assumption that the AC component is too small to be considered, and thus we just pretend that we have an ideal DC current.

    So when we talk about DC current with any kind of precision, we really mean the constant part of the current waveform, equal to the average value of the signal. Blowing, as a set of related signals across all its media, is not a constant signal. A recording would demonstrate this, and the requirement for sound to have a nonzero frequency also rules out the possibility of a DC sound.
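
    To make that concrete, here’s a rough numerical sketch in Python with NumPy (everything here is synthetic; the sample rate, lengths, and noise level are arbitrary choices for illustration): the DC value is just the time average, a “DC” source that switches on still has energy above 0 Hz, and a blowing-like noise burst has essentially no DC part at all.

    ```python
    import numpy as np

    fs = 48_000                      # sample rate (arbitrary choice for illustration)
    t = np.arange(2 * fs) / fs       # two seconds of time

    # A "DC" source that switches on at t = 1 s: zero before, constant afterwards.
    dc_switched = np.where(t >= 1.0, 1.0, 0.0)

    # A blowing-like signal: a burst of noise standing in for a real recording.
    rng = np.random.default_rng(0)
    blow = rng.normal(0.0, 0.2, t.size) * np.where((t >= 0.5) & (t <= 1.5), 1.0, 0.0)

    for name, x in [("switched-on constant", dc_switched), ("blowing-like noise", blow)]:
        dc = x.mean()                        # the DC value: the average over the record
        spectrum = np.abs(np.fft.rfft(x))    # magnitude spectrum
        ac_energy = spectrum[1:].sum()       # everything except the 0 Hz bin
        print(f"{name}: DC (mean) = {dc:+.4f}, energy above 0 Hz = {ac_energy:.1f}")

    # The switched-on "DC" source still has plenty of energy above 0 Hz (the turn-on
    # edge), and the blowing-like signal has a mean near zero: no meaningful DC part.
    ```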

    Now I know that analogies are loose comparisons, and if your analogy aids your understanding then more power to you, but I genuinely cannot find any way that they are analogous.


  • [Air] being blown out of your mouth is similar to DC ( direct current ) and that it’s a continuous wave of air with frequency zero.

    Nope. You can’t have sound without a vibration. A vibration of zero frequency is constant for all time. When you blow air, you get a bunch of “not-zero” frequency noise from the actual movement of air. Even if you could somehow blow a perfectly DC (0 Hz) wave, the fact that you started at some point in time mathematically implies that there are higher frequencies in the signal. [1]

    To convince yourself of this, record an audio clip of yourself blowing into a microphone. Any mic will do, just don’t overload it.[2] Open up the audio file in Audacity, Ardour, or any other audio program that can display waveforms. It will be oscillating quite a bit.

    This also indicates that approximating sound as a constant waveform is not a good engineering decision. Speaking as a hobbyist audio programmer and electrical engineering major: my life would be a lot easier if blowing sounds were constant, because then I could do away with frequency analysis and digital filtering, which are so easy to screw up. We would simply sample the constant audio waveform in whatever medium [3] it happens to be constant.
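
    If you’d rather script that check than eyeball it in Audacity, something like this does the job (a sketch assuming NumPy/SciPy; “blow.wav” is a hypothetical file name, substitute your own recording):

    ```python
    import numpy as np
    from scipy.io import wavfile

    rate, data = wavfile.read("blow.wav")   # hypothetical file; use your own recording
    data = data.astype(np.float64)
    if data.ndim > 1:                       # mix stereo down to mono for the check
        data = data.mean(axis=1)

    dc = data.mean()                        # the would-be "DC" part
    ac = data - dc                          # what's left after removing it
    print(f"sample rate: {rate} Hz")
    print(f"DC offset:   {dc:.2f}")
    print(f"AC swing:    {ac.min():.2f} .. {ac.max():.2f}  (RMS {np.sqrt((ac**2).mean()):.2f})")
    # If blowing were really a 0 Hz signal, the AC swing and RMS would be ~0. They won't be.
    ```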

    [1] I actually had a much more detailed post in mind where I discussed Fourier series, Fourier transforms, and the exact definitions of DC values in electrical engineering, but unfortunately Jerboa ate the comment before I could submit it. Oh well, I can’t be mad since the app is so early in its lifecycle. If you need any help navigating the above pages, feel free to comment. I can also point you to more rigorous references if you need some reading material.

    [2] Really, I mean not to clip any element in the signal chain. All digital audio devices have a maximum loudness. If the signal has a bunch of flat tops, like it was going to keep going higher or lower and then some jerk clipped off the highest and lowest points with scissors, you’ve clipped the signal. This is especially important for blowing because it (intentionally) moves a lot more air than ordinary talking, so try to physically back away from the microphone when you blow. Technically you can damage a microphone by blowing at it, but you probably can’t blow hard enough to blow it. It’s mostly a signal integrity issue.
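
    If you want a quick after-the-fact check for clipping, you can look for samples pinned at (or within a hair of) full scale. A rough heuristic sketch, assuming NumPy and 16-bit integer samples; real metering tools are more sophisticated:

    ```python
    import numpy as np

    def looks_clipped(samples: np.ndarray, full_scale: float = 32767.0,
                      tolerance: float = 0.999) -> bool:
        """Rough heuristic: flag the recording if a noticeable fraction of samples
        sit right at the maximum or minimum value (the "flat tops")."""
        pinned = np.abs(samples) >= tolerance * full_scale
        return pinned.mean() > 0.001        # more than ~0.1% of samples pinned

    # Example with 16-bit audio (full scale = 32767), e.g. from wavfile.read():
    # rate, data = wavfile.read("blow.wav")
    # print(looks_clipped(data.astype(np.float64)))
    ```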

    [3] I have been using the word “waveform” rather loosely. The sound is physically propagated through space as related waves in pressure and particle velocity. Microphones typically respond to changes in pressure, which are converted into an analog voltage waveform. Now, the pressure waveform exists over time, but also over space. Mathematically, this expresses the fact that a sound might be louder or quieter depending on where in space you are relative to the sound’s source. If the electrical system is competently designed, the variation of the voltage over space should be negligible. This expresses the reality that audio played through headphones sounds the same regardless of where the player is located relative to the headphones, so long as all the wires are connected correctly. Ideally, once you have the pressure at a point, or more realistically an average over a small region of space, the reading is converted to a voltage that is directly proportional to the pressure waveform. In reality, there are going to be some nonlinearities, but the hope is that the waveform is as close to the original as possible under reasonable restrictions on frequency content and signal size, e.g. that the signal isn’t too fast or too big.

    Furthermore, the analog waveform needs to be sampled. This generates a new waveform that only exists at discrete points in time. Then, because computers have a finite number of storage bits, the sampled waveform is quantized, or forced into one of a discrete set of values. This is the digital waveform seen in Audacity or a similar program. Finally, your computer has to reverse that process so it can send a voltage signal to the headphones, which in turn generates the pressure variations that reach your ears.
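
    A toy sketch of those two steps (sampling, then quantization) in Python with NumPy; the 48 kHz rate and 16-bit depth below are just common choices, nothing mandated by the above:

    ```python
    import numpy as np

    fs = 48_000                     # sampling rate: how many instants per second we keep
    bits = 16                       # bit depth: how many distinct levels we can store
    f0 = 440.0                      # a 440 Hz test tone standing in for the analog signal

    # "Sampling": evaluate the continuous waveform only at the discrete instants n/fs.
    n = np.arange(fs // 100)                       # 10 ms worth of samples
    sampled = np.sin(2 * np.pi * f0 * n / fs)      # still real-valued, not yet digital

    # "Quantization": force each sample onto one of 2**bits evenly spaced levels.
    levels = 2 ** (bits - 1)                       # signed 16-bit range: -32768 .. 32767
    quantized = np.round(sampled * (levels - 1)).astype(np.int16)

    print(sampled[:4])      # e.g. 0.0, 0.0576, 0.1149, ... (continuous-valued)
    print(quantized[:4])    # the same instants, snapped to integer codes
    ```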

    We can use the term “audio waveform” interchangeably for all of these things, including the digital ones, because they carry (approximately; ideally exactly) the same information. This is not some hand-wavy term; information theory posits that the amount of information that a signal carries can be quantified. However, the hand-wavy explanation for it is that all of these waveforms are simply different ways to represent the same thing. For the purposes of classifying signals, sound signals should share common properties despite being in different mediums.



  • It seems there’s a lot of discussion about getting rid of tipping, but I don’t know how much has changed in this regard.

    Nothing has changed, and it never will, as it concerns poor and “therefore” “deserving” people. Americans’ talk is cheap.

    The system seems ridiculously unfair, and that extra expense in a country where everything is already so expensive really makes a difference.

    Agreed. So when you go to a restaurant and you have a maximum amount you can spend, divide the amount of money you have by (100% + the local sales tax), then by (100% + the expected tip percentage), then subtract any surcharges added by the restaurant (assume $5.00 if you cannot look it up), which often masquerade as a tip; what’s left is the most you can order off the menu. I know it’s a lot of math, but you have a computer in your pocket. You’ll manage.
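
    And since you do have that computer in your pocket, here is the same arithmetic as a sketch (the tax, tip, and surcharge figures are placeholders; look up your local ones):

    ```python
    def max_menu_order(budget: float, sales_tax: float, tip: float,
                       surcharge: float = 5.00) -> float:
        """The most you can order off the menu without blowing the budget.
        sales_tax and tip are fractions (0.08 for 8%, 0.20 for 20%); surcharge is
        the restaurant's flat fee, with $5.00 as the guess from above."""
        return budget / (1 + sales_tax) / (1 + tip) - surcharge

    # Example: $60 in your pocket, 8% sales tax, 20% tip, $5 surcharge
    print(f"${max_menu_order(60.00, 0.08, 0.20):.2f}")   # about $41.30
    ```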

    In my view, the US is a fractal scam. At every level, everything is an attempt to extract money from ill-informed “suckers”, from the running of the government, to the prices of supermarket groceries, to the tipping culture at restaurants, to even finding a place to put your car [1]. Every single thing is someone’s grift. In order to function in America, you need to be willing to be suckered to some extent. There’s no way around it. Unfairness is baked into every transaction, and into more and more social interactions.

    Everything in America is ridiculously unfair. We wear this on our sleeves, and for many Americans this fact defines their personality. Unfortunately, you will have to deal with it in the short term at least.

    Now if you would like to be the one to lead the charge against the tipping culture and the foisting of responsibility for servers’ compensation onto the customer, then be my guest. Refuse to tip and make a big scene about it. Make plans for how to take the momentum of your big struggle and turn it into a mass movement. I would be thrilled to join you. However, I somehow doubt that you’re ready to go that far; none of the customers who stiffed me ever went on to start anti-tipping movements.

    So will AITA if I don’t tip?

    Yes. You are expected by all members of the public here to tip. That is our culture, something we’re proud of for some reason, and our expectation. For some servers, tips are the primary source of income at work.

    Is it really my personal responsibility to make sure my server is paid enough?

    No, it is the responsibility of the employer. However, when no employer lives up to that responsibility and you sit yourself down at a restaurant anyway, the logical conclusion is that either you pay that part of the server’s wages, or they get stiffed. You know that this is the conclusion. (Or if not, now you do.)

    If you want to participate in our unique restaurant scam, you gotta accept that you’re going to get suckered into paying the server’s wages. Otherwise, don’t go to restaurants. When you go to a restaurant, you waste the employees’ finite time on this planet doing tedious, physically and mentally demanding bullshit that no sane person would choose to engage with, if not faced with the threats of homelessness and starvation. [2] At least make it worth their while.

    Sorry if I come off as having a chip on my shoulder, but that’s only because I totally do. So many customers used to concern-troll me as a pizza delivery person and give me shit like “sorry, couldn’t afford to tip, they should really pay you more.” Yeah, they should, but you absolutely could have tipped; all you had to do was order one less topping. I’d love to see some actual solidarity with food service employees, but that would require challenging deep-rooted assumptions about our culture and we’re too shit-for-brains to do that. Americans are so compassionate and empathetic until the moment they actually have to lift a finger.

    So when someone brings up “unfairness” or “it’s X’s responsibility to pay the workers” in response to tipping, I just kinda die a little inside from all the times those sentiments have been used against me and my colleagues.

    [1] And don’t even get me started on the process of buying a car, or how the public was scammed into accepting a car-centric infrastructure.

    [2] This is really a special case of the logic behind the antiwork movement: nobody actually wants to go to work. We only go to work under the threats of starvation and homelessness imposed by capitalism.


  • Reddit’s drawcard was finding THE sub for a topic

    IMO Reddit’s drawcard was containing the sub, and therefore the community, for a topic. Reddit is where the discussion was, and for many communities still is. Rather than hosting a dedicated forum, people interested in starting a community can just create it and begin moderating and discussing without setting up a backend; it lets users get to the “socializing” step of building a community in fewer steps. Lemmy also does this, albeit with smaller communities, likely distributed over several instances, and at an earlier point in the system’s lifecycle.

    Hopefully, Lemmy will implement a “multi-community” option like the multireddit concept so that users can group multiple related communities into one feed.

    That being said, I think that similar communities ought to find each other and work together to best serve their members. Some communities will benefit from collaborative non-competition (for example, a community for discussion about how to use a specific complex product) while some have no need to be centralized (for example, a community for sharing dank memes). However, even in communities that would benefit from non-competition in good times, users should always be free to form their own communities in case the parent community (or its moderation) becomes too odious to bear. This process was much more difficult on Reddit because sub names had to be unique, so new communities would need to pick a weird name.




  • These two are now the first apps I install on any new device:

    • Kiss launcher (simple and fast)
    • Articons icon pack

    Basically, my approach is to (mostly) prioritize text over icons, and reduce the colors I need to process.

    Other apps:

    • Brave browser (for YouTube and built-in anti-tracking features)
    • Librera (ebook/PDF reader with lots of features)
    • Odyssey (local music player optimized for speed. My library is so large that all the other players were having trouble finding songs.)
    • Graph 89 (TI graphing calculator emulator)
    • Feeder (RSS feed aggregator)

  • Other historical artefacts like pottery, vellum writing, or stone tablets

    I mean, I could just smash or burn those things, and lots of important physical artefacts were smashed and burned over the years. I don’t think that easy destructibility is unique to data. As far as archaeology is concerned (and I’m no expert on the matter!), the fact that the artefacts are fragile is not an unprecedented challenge. What’s scary IMO is the public perception that data, especially data in the cloud, is somehow immune from eventual destruction. This is the impulse that guides people (myself included) to be sloppy about archiving our data, specifically by placing trust in the corporations that administer cloud services to keep it, as if out of the kindness of their hearts.