• 0 Posts
  • 41 Comments
Joined 1 year ago
Cake day: July 5th, 2023



  • Let’s not pretend like Blizz or Bethesda will see the end of this decade anyway.

    So if you’re management, you face a choice: try to dump everyone now in a reorganization on a moment’s notice, while it’s still Biden’s NLRB, or negotiate a CBA that probably bakes in substantial severance and job protections that will be expensive when they do try to reorganize for business reasons?

    If it’s true that the workers were likely to get dumped within the decade, then negotiating protections now actually protects them, or forces management to pay a high cost.



  • It’s just a type of injury. Injuries themselves don’t give you a right to sue; you have to be injured by someone else doing something wrong.

    Can I sue for blindness? Yes, if someone caused my blindness in a way that they’d be liable for. Same with other injuries like broken bones or lost employment or embarrassment or paralysis.

    So if someone drives drunk and hits you with their car, paralyzing you and causing loss of enjoyment of life, you can sue them. You would have to prove liability (they caused your injury in a way that makes them legally responsible for it) and damages (the amount of money they owe you based on how injured you are). Something like loss of enjoyment of life would be part of the second part of the analysis.



  • > I think that it’s foolish to concentrate people and activity there even further, it defeats the point of a federation.

    It defeats some of the points of federation, but there are still plenty of reasons why federation is worth doing even if there’s essentially one dominant provider. Not least of which is that sometimes the dominant provider does get displaced over time. We’ve seen it happen with email a few times: the dominant provider loses market share to upstarts, one of which becomes the new dominant provider in some specific use case (enterprise vs consumer; mobile vs desktop vs automation/scripting; differences by nation or language), and federation between them still allows the systems to communicate with each other.

    Applied to Lemmy/kbin/mbin and other forum-like social link aggregators, I could see LW being dominant in the English-speaking, American side of things, but with robust options outside of English language or communities physically located outside of North America. And we’ll all still be able to interact.


  • For my personal devices:

    • Microsoft products from MS DOS 6.x or so through Windows Vista
    • Ubuntu 6.06 through maybe 9.04 or so
    • Arch Linux from 2009 through 2015
    • MacOS from 2011 through current
    • Arch Linux from 2022 through current

    I’ve worked with work systems that used Red Hat and Ubuntu back in the late 2000s, plus decades of work computers with Windows. But I’m no longer in a technical career field, so I haven’t kept on top of the latest and greatest.



  • I’m still a skeptic of the Nova system’s 4 categories (1: unprocessed or minimally processed; 2: processed culinary ingredients; 3: processed foods; 4: ultra-processed foods), because it’s simultaneously an oversimplification and a complication. It’s an oversimplification because “processing” covers such a broad range of things one can do to food that it isn’t, by itself, all that informative, and it’s a complication in that experts struggle to classify actual prepared dishes as eaten (homemade or otherwise).

    So the line drawing between regular processed food and ultra-processed food is a bit counterintuitive, and a bit inconsistent between studies. Even guided by the definitions, experts struggle to place unsweetened yogurt into Nova 1 (minimally processed), 2 (processed culinary ingredients), 3 (processed food), or 4 (ultra-processed food). As it turns out, experts aren’t very consistent in classifying foods, which introduces inconsistency into the studies investigating the differences. Bread, cheese, and pickles are a particular challenge.

    And if the whole premise is that practical nutrition is more than just a list of ingredients, then you have to grapple with the fact that merely mixing ingredients in your own kitchen might make a food that’s more than the sum of its parts. Adding salt and oil catapults pretty much any dish into category 3, so does that mean my salad becomes a processed food when I season it? Doesn’t that still make it different from French fries (category 3 if I make them myself, probably, unless you count refined oil as category 4 ultra-processed, at which point my salad should probably be ultra-processed too)? At that point, how useful is the category?
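    The line-drawing problem above can be made concrete with a toy rule-based classifier. Everything here (the ingredient sets, the rules, the function name) is invented for illustration and is much cruder than the actual Nova guidance; the point is just that a seasoning step mechanically bumps a dish into a higher category:

```python
# Toy rule-based Nova classifier. These rules are invented for
# illustration; they are NOT the official Nova criteria.

INDUSTRIAL_ONLY = {"emulsifier", "artificial flavoring", "protein isolate"}
CULINARY = {"salt", "oil", "sugar"}


def naive_nova(ingredients):
    """Assign a Nova group with simple rules; real guidance is far fuzzier."""
    ings = set(ingredients)
    if ings & INDUSTRIAL_ONLY:
        return 4  # any industrial-only ingredient -> ultra-processed
    if ings & CULINARY and ings - CULINARY:
        return 3  # whole food + culinary ingredients -> "processed food"
    if ings and ings <= CULINARY:
        return 2  # culinary ingredients alone
    return 1  # unprocessed / minimally processed


print(naive_nova(["lettuce", "tomato"]))                 # 1: a plain salad
print(naive_nova(["lettuce", "tomato", "salt", "oil"]))  # 3: same salad, seasoned
print(naive_nova(["potato", "oil", "salt"]))             # 3: homemade fries
```

    Under these toy rules, the seasoned salad and the homemade fries land in the same bin, which is exactly the counterintuitive result described above.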

    So even someone like me, who does believe that nutrition is so much more than linear relationships between ingredients and nutrients, and is wary of global food conglomerates, isn’t ready to run into the arms of the Nova system. I see that as a fundamentally flawed solution to what I agree is a problem.





  • I don’t think the First Amendment would ever require the government to host private speech. The rule is basically that if you host private speech, you can’t discriminate by viewpoint (and you’re limited in your ability to discriminate by content). Even so, you can always regulate time, place, and manner in a content-neutral way.

    The easiest way to do it is simply to follow one of the suggestions in the linked article and only permit government users and government servers to federate inbound, so that government-hosted servers never have to host anything private, while still fulfilling the general purpose of publishing public government communications, which everyone else can host and republish on their own servers if they choose.
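    A sketch of what that inbound-only policy could look like at the application layer. The domain names and the policy function are hypothetical; a real ActivityPub server would enforce something like this in its federation middleware:

```python
# Hypothetical inbound-only federation policy: a government instance
# publishes its own content for anyone to pull, but refuses to host
# (ingest) anything from private servers. Domain names are made up.

GOV_DOMAINS = {"social.example.gov", "alerts.example.gov"}


def accept_activity(origin_domain, direction):
    """direction: 'inbound' (content arriving to be hosted here)
    or 'outbound' (our public posts being fetched by others)."""
    if direction == "outbound":
        return True  # anyone may pull and republish public posts
    # only federate inbound from other government servers
    return origin_domain in GOV_DOMAINS


print(accept_activity("lemmy.world", "outbound"))        # True
print(accept_activity("lemmy.world", "inbound"))         # False
print(accept_activity("alerts.example.gov", "inbound"))  # True
```

    The government servers then never host private speech, so the viewpoint-discrimination question never arises for inbound content.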


  • None of what I’m saying is unique to the mechanics of open source. It’s just that the open source ecosystem as it currently exists today has different attack surfaces than a closed source ecosystem.

    Governance models for a project are a very reasonable thing to consider when deciding whether to use a dependency for your application or library.

    At a certain point, though, that’s outsourced to trusting whoever someone else trusts. When I trust a specific distro (because I’m certainly not rolling my own distro), I’m trusting how they maintain their repos, as well as which packages they include by default. Then, each of those packages has dependencies, which in turn have dependencies. The nature of this kind of trust is that we select people one or two levels deep and assume that they have vetted the dependencies another one or two levels down, all the way to the bottom. The xz backdoor hid in liblzma, which systemd links against, which in turn opened a vulnerability in sshd as compiled for certain distros.
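    The “dependencies of dependencies” point can be sketched as a transitive closure over a dependency graph. The package names below are made up (loosely echoing the xz incident), and real package resolvers are far more involved:

```python
# Sketch: the set you actually rely on is the transitive closure of the
# dependency graph, not just your direct dependencies. Names are invented.

DEPS = {
    "myapp": ["distro-openssh"],
    "distro-openssh": ["libsystemd"],
    "libsystemd": ["liblzma"],
    "liblzma": [],
}


def trusted_set(pkg, deps):
    """Walk the dependency graph and collect everything reachable."""
    seen = set()
    stack = [pkg]
    while stack:
        p = stack.pop()
        if p in seen:
            continue
        seen.add(p)
        stack.extend(deps.get(p, []))
    return seen


print(sorted(trusted_set("myapp", DEPS) - {"myapp"}))
# ['distro-openssh', 'liblzma', 'libsystemd'] -- each one a maintainer
# you are implicitly trusting, whether or not you ever vetted them
```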

    > You’re assuming that 100% of the source code used in a closed source project was developed by that company and according to the company’s governance model, which you assume is a good one.

    Not at all. I’m very aware that some prior attacks by very sophisticated, probably state-sponsored attackers have abused the chain of trust in proprietary software dependencies. Stuxnet relied on stolen private keys trusted by Windows for signing hardware drivers. The SolarWinds hack relied on compromising the vendor’s build system, so that trojaned updates went out signed with SolarWinds’ own trusted certificate.

    But my broader point is that there are simply more independent actors in the open source ecosystem. If a vulnerability takes the form of a weakest link, where compromising any one of many independent links is enough to gain access, that broadly distributed ecosystem is more vulnerable. If a vulnerability requires chaining compromises together, so that multiple parts of the ecosystem must be breached at once, then distributing decision-making makes the ecosystem more robust. That’s the tradeoff I’m describing, and spreading things too thin introduces the first type of vulnerability.
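    The tradeoff in that paragraph can be put in back-of-the-envelope terms. Assuming each of n independent components falls to an attacker with probability p (purely illustrative numbers, not measured rates), the two attack models point in opposite directions as n grows:

```python
# Toy probability model of the weakest-link vs. full-chain tradeoff.
# p and n are illustrative; independence is assumed for simplicity.

def weakest_link(p, n):
    """Attacker wins by compromising ANY one of n components."""
    return 1 - (1 - p) ** n


def full_chain(p, n):
    """Attacker must compromise ALL n components."""
    return p ** n


p, n = 0.01, 100
print(f"weakest-link exposure: {weakest_link(p, n):.3f}")  # grows with n
print(f"full-chain exposure:   {full_chain(p, n):.2e}")    # shrinks with n
```

    With p = 0.01 and n = 100, the weakest-link exposure is already about 63%, while the compromise-everything exposure is vanishingly small: more independent actors hurt you in the first model and protect you in the second.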



  • In the broader context of that thread, I’m inclined to agree with you: the circumstances by which this particular vulnerability was discovered show that it took a decent amount of luck to catch it, and one can easily imagine a set of circumstances where it would’ve slipped past the formal review processes applied to updates in these types of packages. And while it would be nice if the billion-dollar companies that rely on certain packages would provide financial support for the open source projects they use, the question remains how we should handle it when those corporations don’t. Do we front the cost ourselves, or just live with the knowledge that our security posture isn’t optimized for safety, because nobody will pay for that improvement?


  • GamingChairModel@lemmy.worldtolinuxmemes@lemmy.worldBackdoors
    100%.

    In many ways, distributed open source software presents more social attack surface, because the ecosystem itself is designed to be distributed, with a lot of people each handling a different responsibility. Almost every open source license includes an explicit disclaimer of warranty, with language that reads something like this:

    THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.

    Well, bring together enough dependencies, and you’ll see that certain widely distributed software packages depend on the trust of dozens, if not hundreds, of independent maintainers.

    This particular xz vulnerability affected sshd by way of systemd, using a socially engineered attack on a weak point in the dependency chain. And this particular type of social engineering (maintainer burnout, looking for a volunteer to take over) fits more directly into open source culture than closed source/corporate development culture.

    In the closed source world, there might be fewer places to probe for a weak link (socially or technically), which makes certain types of attacks more difficult. In other words, it might truly be the case that closed source software is less vulnerable to certain types of attacks, even if detection/audit/mitigation of those types of attacks is harder for closed source.

    It’s a tradeoff, not a free lunch. I still generally trust open source stuff more, but let’s not pretend it’s literally better in every way.