• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: August 9th, 2023


  • We spent more on the Manhattan project than the disorganized fusion projects have spent in a decade, and will spend in the next decade as well.

    That cost was overwhelmingly slanted towards implementation though, not research. The theory for fission was very simple compared to nuclear fusion: Gather enough fissile material in one place rapidly, and it explodes. Once the basic parameters and theory were proven, the actual project cost went overwhelmingly to just enriching enough nuclear material and then, separately, getting the Silverplate Superfortresses ready. They were so sure of the science that they didn’t even bother to test the bomb they dropped on Hiroshima. It wasn’t like fusion research at all, where for over half a century every new device that’s supposed to produce power instead just discovers new plasma instabilities which mean it simply doesn’t work.

    Also, the cost comparison you’ve made is simply false. The Manhattan project cost no more than $20-30 billion, inflation-adjusted. ITER’s cost (from 2008 through to ~2025) is going to be at least €22 billion, and apparently $65 billion if the US is to be believed. That’s of course not even counting the various other “disorganized fusion projects”, like the ongoing operating costs for W7X, the NIF, JET, and whatever the Z machine, Shiva Star, etc., and the assorted Chalk River, Los Alamos, Sandia, and Livermore national laboratories are doing for fusion research. Still worth it, probably— Hell, if it cost $10 trillion, it would probably still be worth it, as long as it actually works— But let’s not pretend it’s cheap or free or a safe bet or an easy solution.

    Thorium is a safe bet, but it also needs significant research.

    On the other hand, why not both?

    That would be far too much foresight, obviously.

    …But there’s also never enough resources to go around, and you don’t want to be the country that sank all its money into a technology that didn’t pan out.


  • Current uranium reserves are expected to be depleted by the end of the century, at current use.

    More like somewhere between 200 years and a couple million years, assuming we fire back up and finish developing some 60-year-old technologies.

    Fission as a serious replacement for just coal plants is a pipe dream without asteroid mining.

    pipe dream without asteroid mining

    …Yeah, no. At least, not yet. Plus, the energetic and engineering challenges glossed over by just throwing “asteroid mining” into the conversation are insane— You’re burning either fossil or synthetic/biofuels for the launch, then presumably electric ion propulsion (which is itself insanely difficult and expensive, and needs nuclear or solar power anyway) for in-orbit maneuvering, all for rocks that aren’t even that big and which you don’t even have the technology to do anything with.

    We have most minerals in sufficient quantities in the Earth’s crust. And more importantly, we have the industrial processes to extract them efficiently. Fission is viable, has been for a long time, and will remain so for the foreseeable future.

    contrary to what people pretend we still don’t have a good answer for the waste.

    It’s rocks. Processed “nuclear waste” is literally just rocks. (Well, technically it’s solid glass covered in welded steel.) It’s not like air pollution that we end up breathing in, and it’s not like the chemical waste from other industries (including from batteries and rare earth extraction) which finds its way to the water cycle where it then bioaccumulates. If you’re picturing a glowing green river, or a barrel full of leaking sludge— Well, that’s not it.

    It can’t hurt you unless you powder it and huff it or build furniture with it or do something insanely stupid like that. And malicious actors have much easier and more dangerous ways to hurt you, none of which involve breaking into secure facilities to steal some of the heaviest elements known to exist.

    Dig a big hole and toss the waste a kilometer or two down into the Canadian Shield, and it will sit there inert for a billion years, long after it has burnt through its dangerous levels of residual radioactivity.

    We need a global fusion research project

    We already have a couple of those. If everything goes perfectly for them, they might become commercially widespread right around the time, in the second half of this century, when the hard-to-reverse effects of climate change could become truly apocalyptic. If the history of this field is any indication, they quite possibly won’t really work, or will work but a decade or two behind schedule and several times over budget, or will lead nowhere except for some media coverage that’s good for military-industrial stock prices or whatever.

    This isn’t Sid Meier’s Civilization, where you can click “Global Fusion Research Project” and get a +100% boost to production after 20 turns. To quote Randall Munroe, “Magnetohydrodynamics combines the intuitive nature of Maxwell’s equations with the easy solvability of the Navier-Stokes equations”. Fusion is hard, or else we’d already be doing it, and though we know it’s definitely possible, there’s no guarantee of anything when it comes to actually engineering it.

    orbital solar.

    Uhh… No. Spending hundreds of millions of dollars to blast photovoltaics into an incredibly hostile environment, where they can’t even be cooled by dissipating heat into the atmosphere, is probably not going to bring energy costs down, at current or near-future technology levels.

    Plus any system capable of precisely beaming terawatts of power from space into localized collectors on the planetary surface is (1) probably by definition an omnipresent death ray and (2) probably at least going to fuck up a lot of migrating birds and components of the atmosphere.

    Simple as that.



  • Lmao, I’ve had literally 40-70GB of highly active application swap on an SSD for the last couple months now because I opened stuff and then didn’t close it.

    That said, I chose and installed that drive years ago specifically for this use case (though originally for less intensive/more reasonable cases), and I’m aware of the stupidity of letting it be used like this now.
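    For reference, a quick way to sanity-check how much swap is actually in use on a setup like that, using stock Linux tools (nothing Nix- or distro-specific about these):

        swapon --show     # which devices/files provide swap, and how full each one is
        free -h           # RAM vs. swap usage, human-readable totals
        cat /proc/swaps   # the same numbers straight from the kernel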



  • Disk space is an issue… I’ve seen the OS take as much as 100 GB. But in a world of 2TB SSDs for $100, is that a big deal?

    Yes? Storage used for the OS is space not used for projects, entertainment, docs, redundancy, snapshots, avoiding fragmentation (EXT4), etc. Money spent on SSDs is money not spent on going out, food, meeting people, basic needs, other hardware, etc.

    I don’t see why NixOS would be any worse for the lifetime of a disk than other distros.

    Untested, but I’d assume high space use combined with high update frequency, plus occasional builds-from-source and multiple simultaneous package versions, means more disk writes.

    Biased, maybe, because manual GC means you see disk use tick up more than in other package managers, and also because I personally repeatedly rebuilt a custom gigabyte-sized Derivation dozens/hundreds of times. But I think it’s a reasonable point of caution.
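    For what it’s worth, this is roughly how I keep an eye on it; these are stock Nix commands, with the 30-day retention window purely as an example:

        du -sh /nix/store                             # how big the store has actually grown
        nix-env --list-generations                    # old profile generations still pinning store paths
        nix-collect-garbage --delete-older-than 30d   # drop generations older than 30 days, then collect garbage
        nix-store --gc                                # plain GC of anything no longer referenced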

    I’ve only hit binary cache misses for packages I created, or where I changed build options.

    Broken packages are, if anything, less of a problem [than] with Debian. Debian has lots of packages that are…not broken, but incomplete, requiring lots of manual config or whatever.

    Maybe this is a NixPkgs vs NixOS thing. Also, using Nix mostly to supplement packages I hadn’t already installed through my distro probably meant I hit more fringe areas. But I’ve even encountered cache misses and failed builds for some pretty big Python libraries on certain commits.

    Debian-based out-of-the-box functionality is indeed also Not Great, IIRC— Stable, but yeah, sometimes maybe a bit “incomplete”. Arch-based distros, on the other hand, have worked well IME.

    And on the flip side: you can change package build options! Neat!

    But oh man…you should’ve seen how trivial it was to switch from PulseAudio to PipeWire (including Jack support etc), leaving no trace that Pulse was ever installed… Or switching from X to Wayland, on a system that I’ve been doing rolling updates on since 2017, all with a clear conscience… It’s beautiful.

    Yeah. I personally don’t care about that stuff unless it directly impacts something I’m working on.

    And that’s why I say Nix is a great tool for package management, but not something I’d personally want to use as an OS base. When you’re already elbow-deep in the plumbing anyway, Nix makes it way easier to swap components out. But when you just want to install and use an application, editing Nix configs feels like more work, and it’s so much easier to just pacman/yum/apt-get install firefox or whatever and get on with your day.
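    To make the contrast concrete, this is roughly what the two workflows look like, with firefox standing in for any package and assuming a stock NixOS setup managed through configuration.nix by hand:

        # imperative, Arch-style: one command and you're done
        sudo pacman -S firefox

        # declarative, NixOS-style: edit /etc/nixos/configuration.nix to include
        #   environment.systemPackages = with pkgs; [ firefox ];
        # then rebuild the whole system from that file
        sudo nixos-rebuild switch

    The declarative route buys you rollbacks and reproducibility, but it’s undeniably more ceremony for a one-off install.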


    Plus, some specific red flags surrounding stability and interoperability:

    1. ALSA is apparently hardcoded to just straight-up not work with a Nix root. Not sure how NixOS handles it, but in my specific use case, I had to symlinkJoin{paths=[alsa-lib alsa-plugins]} so they could find each other (roughly; see the sketch after this list). Pretty sure it took a lot of strace -f -e trace=file and nix-locate for me to figure this one out, just to get sound working.

    2. QtWebEngine/Chromium has to be put through some kind of sed -e "whatever.so" pass to “Patch library paths in Chromium sources” in order to even run, because it’s also hardcoded to just not work with a Nix root. IIRC, I figured this one out by straight-up grepping the compiled binaries after seeing the errors in strace or wherever. Seems a bit ridiculous, using a regex to patch a web browser at install time just so it can run.

    3. Binaries aren’t safe either, because they probably need patchelf to be able to run on Nix.

    4. Flakes are apparently hosted as user repositories on a Microsoft-owned website, and can just randomly disappear sometimes.

    5. Qt generally takes a ton of extra steps to be able to run on Nix. And have you ever actually opened the wrapper the Nix hooks generate to see what it’s doing? For one of my applications just now, you get a 43 kB Bash script with apparently 581 assignments to just a handful of Qt- and XDG-related environment variables.

    6. OpenGL doesn’t look safe either. Nix handles the drivers its own way, so to get OpenGL for Nix packages to work on other systems, you have to jump through some hoops. I assume the same amount of work in the opposite direction would be needed to use e.g. proprietary or statically compiled graphics applications on NixOS too.

    7. Running precompiled binaries on Nix looks… Involved, as well. Sure, there are tools to automate it. But that only hides the complexity, and adding an opaque dependency sorta defeats the entire purpose of configurability and composability IMO.
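    Since item 1 above is terse, here’s roughly what that ALSA workaround looked like, from memory, so treat the exact names and layout as approximate rather than gospel:

        # build a single tree containing both alsa-lib and alsa-plugins,
        # so the library can find the plugins under one prefix
        nix-build -E 'with import <nixpkgs> {};
          symlinkJoin {
            name = "alsa-joined";
            paths = [ alsa-lib alsa-plugins ];
          }'
        # ./result then points at the merged tree:
        ls result/lib

    And even then you still have to point the consuming application at that merged tree somehow, which is exactly the kind of glue an FHS layout never needed.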

    I’m sure most of these problems are “solved”, in the sense that NixOS implements workarounds that are the default when you install the affected derivations, and there are wrappers written for most other cases. But all of that adds maintenance, fragility, and complexity. Remarkably, it works well enough for userspace, but stuff like this still feels a bit house-of-cards-y for the basic OS and desktop. It’s not Nix’s fault, but so much of the work that goes into Nix seems to be just forcing software that was never designed for it to run on it. Ultimately, the Linux FHS has momentum and adoption. Nix’s technical design might be compelling, but so are interoperability, stability, and simplicity.

    The NixOS enthusiasts are doing a lot of technically interesting work, but I personally find the results of that work most useful outside the NixOS ecosystem. And I do think Nix as a package manager is really great. Ever since I installed it, I’ve incorporated it as a major component or tool in basically every sizable software project I’ve started since. But I just personally wouldn’t want to base an entire OS on it.


  • I’m saying that’s a way I might personally consider going if I were to set up a new computer. Rock solid base that you can still get normal packages and binaries to run on without much hassle if needed, plus Nix with more up-to-date packages that you can customize however you find most useful.

    Personally I have a mix of rolling/regular repos, AUR, Nix, Flatpak, and static binaries. They all have their uses, TBH.


  • Nix is great. But I don’t think I’d want to use it for a desktop OS base.

    (Disk space/cycle life potential, binary cache misses, broken packages, and complete incompatibility with everything else. User error, TBH, but also stuff that’s not really a problem with other systems. Well worth it as a package manager, though.)