• 0 Posts
  • 42 Comments
Joined 8 months ago
Cake day: December 18th, 2023




  • I thought of something that maybe gets this across. Think about roads. We all pay for them with taxes. Companies use these roads for free to make a profit, e.g. Amazon runs delivery vehicles on public roads.

    The (center-)left take on that is: “You didn’t build that.” It can be an argument for progressive taxation and even a wealth tax.

    Then there are people who say that we should privatize all the roads. Let Amazon pay a toll for using those roads. Is it clear that this is a conservative policy?





  • Private ownership ≠ capitalism.

    Right. It’s private ownership of capital; aka the means of production. You’re saying that data should be owned because it can be used productively. That’s exactly capitalism for capitalism’s sake.

    This is a typical economically right-wing approach. There is a problem, so you just create a new kind of property and call it done. The magic of the market takes care of it, or something. I don’t understand why one would expect a different result from trying the same thing.









  • But it’s not “from each according to his ability”. FOSS is what people feel like contributing. And it’s not “to each according to their need”. It’s take it or leave it, unless someone feels like fulfilling requests.

    Traditionally, the slogan meant a duty to work. Contributing what you feel like is just charity.

    Capitalism, at its core, is private control of capital. Copyright law turns code into intellectual property, i.e. capital. I’ve read the argument that copyleft requires strong copyrights. That argument implicitly makes copyleft a feature of capitalism. You know how rich people or corporations sometimes donate large sums to get their name on something, e.g. a hospital wing? That’s not so different from a FOSS license that requires attribution.




  • Assuming you want to know why France is Islamophobic…

    It has historical roots. France invaded majority-Muslim North Africa in the 19th century. Present-day Algeria was French territory. The native Muslim population was brutally oppressed, somewhat comparable to the oppression of Black people in the US. Nevertheless, these Muslims were French and fought for France in its wars, such as in the trenches of 1914–18.

    Algeria eventually won its independence after a brutal war lasting from 1954 to 1962. The brutality of this war is illustrated by the Paris massacre of 1961: police attacked a peaceful demonstration for independence, murdering dozens, perhaps hundreds, of citizens. The police chief was a Nazi collaborator, later convicted for his role in the Holocaust. For decades, information about the massacre was suppressed in France.

    President Charles de Gaulle - formerly the leader of Free France, the French forces that did not surrender to the Nazis - brokered independence for Algeria. In response, far-right traitors attempted a coup d’état and tried to assassinate him.

    In many ways this history is comparable to the terrorist campaign that the US far right unleashed against African Americans and the civil rights movement in the 1950s and 60s. But the struggle in France was far more brutal: hundreds of thousands were killed, and over a million people, mainly of European descent, were forced to flee what became Algeria.

    The decades after Algerian independence will seem quite familiar to Americans. North African Muslims had become a minority in metropolitan France (the mainland). This hated minority was quietly, without much legal upheaval, pushed to the fringes of society. Information about past atrocities against them was suppressed, and small-scale terror attacks continued to happen.

    These are the origins of the French far right and its Islamophobia.


  • Text explaining why the neural network representation of common features (typically with weighted proportionality to their occurrence) does not meet the definition of a mathematical average. Does it not favor common response patterns?

    Hmm. I’m not really sure why anyone would write such a text. There is no “weighted proportionality” (or pathways). Is this a common conception?

    You don’t need it to be an average of the real world to be an average. I can calculate as many average values as I want from entirely fictional worlds. It’s still a type of model that favors what it sees often over what it sees rarely. That embeds a form of probability, which corresponds to a form of average.

    I guess you picked up on the fact that transformers output a probability distribution. I don’t think anyone calls those an average, though you could have an average distribution. Come to think of it, before you use that to pick the next token, you usually mess with it a little to make it more or less “creative”. That’s certainly no longer an average.
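    That “messing with it” is usually done with a temperature parameter. A minimal sketch of the idea (the logits vector here is made up, and the function name is my own; this isn’t any particular library’s API):

```python
import numpy as np

def temperature_softmax(logits, temperature):
    """Scale logits by 1/temperature, then softmax into a distribution."""
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()                      # numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [2.0, 1.0, 0.1]                        # made-up next-token logits
cold = temperature_softmax(logits, 0.1)         # sharpens toward the top token
hot = temperature_softmax(logits, 10.0)         # flattens toward uniform
rng = np.random.default_rng(0)
next_token = rng.choice(len(hot), p=hot)        # sample the next token id
```

    Low temperature makes the model nearly deterministic; high temperature makes it more “creative”. Either way, the distribution you sample from is no longer the one the model computed.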

    You can see a neural net as a kind of regression analysis, though I don’t think I have ever heard someone call that a kind of average. I’m also skeptical that you can see a transformer as a regression, but I don’t know this stuff well enough. When you train on some data more often than on other data, that’s not how you would do a regression. Certainly, once you start RLHF training, you have left regression territory for good.
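    To illustrate the regression point: a single linear layer trained by gradient descent on a mean-squared-error loss converges to the same coefficients as ordinary least squares. A toy sketch with synthetic data (variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))          # synthetic inputs
true_w = np.array([1.5, -2.0, 0.5])    # ground-truth coefficients
y = X @ true_w + rng.normal(scale=0.1, size=200)

# closed-form ordinary least squares
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# the same model viewed as a minimal "neural net": one linear layer,
# no activation, trained by gradient descent on mean squared error
w = np.zeros(3)
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of MSE w.r.t. w
    w -= 0.1 * grad

# both routes land on (nearly) the same coefficients
```

    Once you add nonlinearities, uneven sampling of the training data, and RLHF, that equivalence breaks down, which is the point above.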

    The GPTisms might be because they are overrepresented in the finetuning data. It might also be from the RLHF and/or brought out by the system prompt.