Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(2026 is off to a great start, isn’t it? Credit and/or blame to David Gerard for starting this.)

  • rook@awful.systems · 22 hours ago

    Been listening to the latest oxide and friends podcast (predictions 2026), and ugh, so much incoherent ai boosting.

    They’re an interesting company doing interesting things with a lot of very capable and clever engineers, but every year the ai enthusiasm ramps up, to the point where it seems like they’re not even listening to what they’re saying and how contradictory it is… “everyone will be a 10x vibe coder” and “everything will be made with some level of llm assistance in the near future” vs “no-one should be letting llms access anything where they could do permanent damage” and “there’s so much worthless slop in crates.io”. There’s enthusing over llm law firms, without any awareness of the recent robin ai collapse. Talk of llms generating their own programming language that isn’t readily human-readable but is somehow more convenient for llms to extrude, but also talk of the need for more human review of vibe code. Simon Willison is there.

    I feel like there’s a certain kind of very smart and capable vibe coder who really cannot imagine how people can and are using these tools to avoid having to think or do anything, and aren’t considering what an absolute disaster this is for everything and everyone.

    Anyway, I can recommend skipping this episode and only bothering with the technical or more business oriented ones, which are often pretty good.

    • istewart@awful.systems · 19 hours ago

      I’m sure it’s all meant to bolster a sales pitch to corporate clients that “this is YOUR AI, that YOU CONTROL!”

      I’ve been wondering, since Rust has a more complex compiler that can take longer to run, and people are typically farming it out to a build/CI server anyway… are these otherwise accomplished vibe coders like Klabnik and the Oxide bros pursuing an experience similar to the REPL/incremental compilation of Lisp or Smalltalk? We’ve already discussed how the mechanics are similar to a slot machine, but if you can convince yourself you’re getting a “liveness” that you wouldn’t otherwise get with a compiled, rigorously type-checked language, you’re probably more than willing to ignore all that. I’m curious, but not curious enough to go pin one of these people up against the wall, or start poking the slop machine myself.

    • BlueMonday1984@awful.systems (OP) · 22 hours ago

      Anyway, I can recommend skipping this episode and only bothering with the technical or more business oriented ones, which are often pretty good.

      AI puffery is easy for anyone to see through. If they’re regularly mistaking it for something of actual substance, their technical/business sense is likely worthless, too.

      • rook@awful.systems · 21 hours ago

        There’s room for some nuance there. They make some reasonable predictions, like chatbot use seeming likely to enter the dsm as a contributing factor for psychosis, and they’re all experienced systems programmers who immediately shot down Willison when he said that an llm-generated device driver would be fine because device drivers either obviously work or obviously don’t, but then they fall foul of the old gell-mann amnesia problem.

        Certainly, their past episodes have been good, and the back catalogue stretches back quite some time, but I’m not particularly interested in that sort of discussion here.

    • rook@awful.systems · 22 hours ago

      Ugh, I carried on listening to the episode in the hopes it might get better, but it didn’t deliver.

      I don’t understand how people can say, with a straight face, that ai isn’t coming for your job and it is just going to make everyone more productive. Even if you ignore all the externalities of providing llm services (which is a pretty serious thing to ignore), have they not noticed the vast sweeping layoffs in the tech industry alone, let alone the damage to other sectors? They seem to be aware that the promise of the bubble is that agi will replace human labour, but seem not to think any harder about that.

      Also, Willison thinks that a world without work would be awful, and that people need work to give their lives meaning and purpose and bruh. I cannot even.

      • istewart@awful.systems · 19 hours ago

        Even if you ignore all the externalities of providing llm services (which is a pretty serious thing to ignore)

        Beyond the obvious and well-discussed material externalities, it strikes me that we don’t know and can’t yet know the true total cost of the LLM-driven development cycle. The manifestation of security holes and rewrites is possibly still years off in the future, maybe decades in the case of lower-level code. And yet, given industry practice and the mentality of most of the management strata, I have little doubt that such future costs will either a) be ignored completely and thus rendered true externalities or b) become somebody else’s problem, I done got my bag, brah, see ya…

        • rook@awful.systems · 19 hours ago

          I feel like one day that “no guarantee of merchantability or fitness for any particular purpose” thing will have to give.

      • Jonathan Hendry@iosdev.space · 21 hours ago

        @rook

        I figure two things will happen:

        a) In a year or two companies will realize that LLMs aren’t going to improve enough, and that they need skilled people because AI has turned their software into a shit show, and start hiring desperately.

        or

        b) In a year or two LLMs will get good enough for code that the software developed is just good enough despite the deskilling effects, and companies can get by with drastically reduced staff.

        • rook@awful.systems · 21 hours ago

          My gloomy prediction is that (b) is the way things will go, at least in part because there are fewer meaningful consequences for producing awful software, and if you started from something that was basically ok it’ll take longer for you to fail.

          Startups will be slopcoded and fail quick, or be human coded but will struggle to distinguish themselves well enough to get customers and investment, especially after the ai bubble pops and we get a global recession.

          The problems will eventually work themselves out of the system one way or another, because people would like things that aren’t complete garbage and will eventually discover how to make and/or buy them, but it could take years for the current damage to go away.

          I don’t like being a doomer, but it is hard to be optimistic about the sector right now.

  • BlueMonday1984@awful.systems (OP) · 2 days ago

    Found someone showing some well-founded concern over the state of programming, and decided to share it before heading off to bed:

    alt text:

    Is anyone else experiencing this thing where your fellow senior engineers seem to be lobotomised by AI?

    I’ve had 4 different senior engineers in the last week come up with absolutely insane changes or code, that they were instructed to do by AI. Things that if you used your brain for a few minutes you should realise just don’t work.

    They also rarely can explain why they make these changes or what the code actually does.

    I feel like I’m absolutely going insane, and it also makes me unable to trust anyone’s answers or analyses, because I /know/ there is a high chance they just asked AI and passed it off as their own.

    I think the effect AI has had on our industry’s knowledge is really significant, and it’s honestly very scary.

    • Soyweiser@awful.systems · 24 hours ago

      Haha, wow, the reactions to that: 3 levels deep and suddenly people are talking about screws. (I’m being positive here btw, it’s funny to see what people have made/learned and how happy they seem with it.)

  • bitofhope@awful.systems · 2 days ago

    How do I neither downplay the death of Renee Good nor ignore the uncountable number of people, mostly people of color, who were murdered by nearly-as-fascist police forces without the public outrage her murder finally, rightly elicited? I am tired, and yet I feel bad to even complain about it, because look at this shit.

    • Soyweiser@awful.systems · 1 day ago

      Yeah, just learned a black person was killed in the USA, and the only reason this was getting some attention was because Renee Good was also killed.

      With the benefit of a sea of distance between me and the USA this is just really fucked up. Two different Americas.

    • V0ldek@awful.systems · 1 day ago (edited)

      There are two things here in my opinion:

      1. American cops are trained murderers, but they are trained, in particular, to avoid causing massive PR disasters with their murders*. A paramilitary goon with a rifle in a government organisation so opaque we still don’t even know his identity is materially worse than a cop. It also looks much worse: the police have some completely undue public trust, while ICE just looks like a military force.
      2. We immediately had video of the full event. When cops kill people of colour there’s usually no evidence since, again, they know how to pull a murder off without causing a PR disaster. Basically the only reason George Floyd’s murder wasn’t successfully brushed aside is that we had video of it, and they tried hard to bury that footage. In this case I don’t even think the victim being white or a citizen matters; the event itself is so fucking horrifying it’d elicit outrage anyway. I am 100% sure that if there wasn’t video, just witness reports, it’d be out of the media cycle already.

      * I don’t want this to seem like a moral distinction; if anything, the decorum granted to police forces is arguably a stepping stone that brought the USA here. Recall Mamdani’s recent words: “For too long, those fluent in the good grammar of civility have deployed decorum to mask agendas of cruelty”. HOWEVER, to me personally this is a rather chilling escalation. It shows that the PR part doesn’t actually matter anymore. America is so far into the fascist pipeline that paramilitary forces can just execute citizens in broad daylight on the street. They don’t need to hide it, they don’t need to play coy about it, they can just post-facto label the victim as an Enemy of the State and move on. I’m sorry, but to me this is like one step away from just lining people up against a wall for fun. Human life is not only practically worthless to state actors, it’s proudly and openly worthless as a matter of policy.

      • flere-imsaho@awful.systems · 1 day ago

        i’m a bit conflicted here: on the one hand it’s true that the american fascists are now escalating, buoyed by the feeling of being virtually untouchable, but on the other hand, this is not a distinct change of behaviour; it’s that they’ve basically widened their target group to include white people too.

        the blm protests were fueled not by new knowledge or radically changed police behaviour after all, but by the wider availability of documentation (mainly phone videos).

        (and on the gripping hand, extending brutal repressions to a majority group is a sign of escalation. but that only means that a large population of u.s. residents, i.e. the non-white ones, live and have always lived in a totalitarian state; the totalitarianism just wasn’t evenly distributed until trump.)

        • V0ldek@awful.systems · 1 day ago

          this is not a distinct change of behaviour

          This is what I disagree with. The theatrics of justifying police brutality don’t change the outcomes of police brutality – people still die – but the fact that the theatrics can now be dispensed with in favour of paramilitaries directly using violence to terrorise the people is a distinct change of behaviour towards fascism.

          And I think it’s important to recognise that because, as many scholars of fascism have warned time and time again, this is not a binary where a switch gets flipped and haha, as of today you’re in a fascist state. It’s a progressive erosion of the social contract. ICE as deployed by the Trump regime right now is basically a textbook run: create a paramilitary force, recruit from existing criminal militias to select for loyalists and violent personalities, normalise them as keepers of order, push out or integrate any other enforcement structures so that the paramilitary becomes dominant. Basically the only difference is that Trump didn’t have to create ICE; it was already there, just waiting to be pushed through the pipeline.

          Does this event fundamentally change how you and I perceive America? No; if you were paying attention you knew the rot inside, and you’ve been shouting that Trump is a fascist since the very beginning. It is, however, a sign that the situation is much worse than it was months ago, that fascism is progressing, and if this is the point at which someone not paying attention wises up and goes “shit, we are moving towards a totalitarian nightmare” then good, welcome, grab a pitchfork.

          • flere-imsaho@awful.systems · 1 day ago

            It is, however, a sign that the situation is much worse than it was months ago, that fascism is progressing, and if this is the point at which someone not paying attention wisens up and goes “shit, we are moving towards a totalitarian nightmare” then good, welcome, grab a pitchfork.

            oh, i’m not a pitchfork purist. anyone is welcome to grab one at any time.

  • o7___o7@awful.systems · 2 days ago

    Hey I think I discovered a way to fix America! What if we rewrite the US Constitution in Rust?

  • sinedpick@awful.systems · 3 days ago

    got my Urbit newsletter for this quarter (or whatever the fuck the cadence is) and what stood out to me this time was nockchain.org. I was going to sit and do a deep dive to come up with sneers for this but I just don’t have the executive function right now. @self thoughts?

  • saucerwizard@awful.systems · 3 days ago (edited)

    OT: He’s gone. Last thing he saw was my face and then there was no more pain. His veins had all collapsed (vet had to inject the phenobarb into the liver), so I was right to bring him in when I did.

    • self@awful.systems · 2 days ago

      I’m so sorry. it’s never easy when this happens, but for what it’s worth it sounds like you gave him the best life possible. it takes a great deal of strength to be with a pet until the very end, and I hope you’re able to take the time you need to grieve and recover your emotional strength.

      • saucerwizard@awful.systems · 2 days ago

        I adopted him from the shelter. He’d spent months if not close to a year there and no one wanted him. If I hadn’t adopted him, he would have been put down the next day. That was close to eight years ago.

        • flere-imsaho@awful.systems · 2 days ago

          my condolences; we’ve gone through similar with our previous cats, all rescues, and even when you know it’s the right decision, the pain is still there.

        • self@awful.systems · 2 days ago

          thanks to you he had 8 more years of life and a much happier existence than any he’d known before he met you. I think that’s remarkable.

    • saucerwizard@awful.systems · 3 days ago

      also the universe has granted me a small mercy and for once the alcohol/semaglutide thing I mentioned a thread or so ago seems to be totally impotent against the might of scottish chemical engineering. thank you jesus

    • aio@awful.systems · 3 days ago (edited)

      Sorry for you and your cat. You did the right thing, but that doesn’t make it any easier.

  • maol@awful.systems · 3 days ago

    I’m doing this shitey online optional module for my college course because I left it too late to pick a proper one, and Christ in heaven people are using a lot of AI. This is meant to be a class about sustainability

    • froztbyte@awful.systems · 2 days ago

      I’ve started seeing people make paperwork with explicit carve-outs for prompt use (merely referred to as “AI” with no explicit definition, so …. gonna be fun when the lawsuits start)

      • maol@awful.systems · 3 days ago

        Genuinely don’t understand how dependent these people are on AI. “I’ll generate an image of a skeleton at a bus stop” one of them said (it was relevant). I’ll go to Google images, I said. It’s faster! And you don’t end up with a mangled version of the Dublin Bus logo!

      • Seminar2250@awful.systems · 3 days ago (edited)

        in the fall of 2024, i was getting teams messages from my students that were clearly llm-generated

        The purpose of this block of code is to efficiently BLAH FUCKING BLAH WHAT THE FUCK ARE YOU EVEN TALKING ABOUT

        i have to assume it’s only gotten worse