Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this. What a year, huh?)

  • blakestacey@awful.systems
    16 hours ago

    Chris Lintott (@chrislintott.bsky.social):

    We’re getting so many journal submissions from people who think ‘it kinda works’ is the standard to aim for.

    Research Notes of the AAS in particular, which was set up to handle short, moderated contributions especially from students, is getting swamped. Often the authors clearly haven’t read what they’re submitting (descriptions of figures that don’t exist or don’t show what they purport to).

    I’m also getting wild swings in topic. A rejection of one paper will instantly generate a submission of another, usually on something quite different.

    Many of these submissions are dense with equations and pseudo-technological language which makes it hard to give rapid, useful feedback. And when I do give feedback, often I get back whatever their LLM says.

    Including very LLM-like responses such as ‘Oh yes, I see that <thing that was fundamental to the argument> is wrong, I’ve removed it. Here’s something else’

    Research Notes is free to publish in and I think provides a very valuable service to the community. But I think we’re a month or two from being completely swamped.

    • Evinceo@awful.systems
      10 hours ago

      people who think ‘it kinda works’ is the standard to aim for

      I swear that this is a form of AI psychosis or something because the attitude is suddenly ubiquitous among the AI obsessed.

    • BlueMonday1984@awful.systems (OP)
      12 hours ago

      It gets worse:

      One of the great tragedies of AI and science is that the proliferation of garbage papers and journals is creating pressure to return to more closed systems based on interpersonal connections and established prestige hierarchies that had only recently been opened up somewhat to greater diversity.