• mox@lemmy.sdf.org

    Also:

    • Simple sites allow visitors to stay safe from browser exploits by keeping scripts disabled.
    • Simple sites pose very little threat of fingerprinting or other invasive tracking techniques.
    • Simple sites can look beautiful, with a bit of well-crafted CSS.
    • Kushan@lemmy.world

      I don’t think your second point is correct. You can still embed analytics on a static website. I believe you’re conflating it with your first point by assuming that scripts are disabled on the browser side, in which case it’s a bit of a redundant point.
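
      For example (a minimal sketch; the endpoint and payload shape are hypothetical), a tiny beacon on an otherwise static page:

      ```javascript
      // Pageview beacon on an otherwise static page (sketch; the
      // /collect endpoint and payload are hypothetical).
      // sendBeacon queues the request without delaying navigation.
      navigator.sendBeacon(
        "https://analytics.example.com/collect",
        JSON.stringify({
          page: location.pathname,
          referrer: document.referrer,
          time: Date.now(),
        })
      );
      ```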

      I also think it’s a bit unrealistic in this day and age to run with scripts completely disabled. I know it sucks, but we need better ways of protecting our privacy, and disabling all scripts is an extreme measure given how much of the modern web relies on them.

      • thesystemisdown@lemmy.world

        I think it’s impossible if you want things to work. JavaScript is so ubiquitous that it’s been baked into the browser since 1995.

      • mox@lemmy.sdf.org
        1. My first two points make a distinction between fingerprinting and more invasive attacks that JavaScript has enabled, including data exfiltration. You might not have encountered the latter, but that doesn’t make them the same thing. (Also, the analytics you refer to that are possible without scripts are far less invasive than what scripts can do, as is hinted in my second point.)
        2. It’s not unrealistic, since scripts can be turned off by default and enabled selectively when needed. (But were that not the case, it would be reason to use them less, not more.)
  • pixxelkick@lemmy.world

    I think the reason experienced devs tend to have minimalist websites that look like they’re from the ’90s is that software devs aren’t UX experts.

    At a senior level at large companies, someone else designs the look and the Figmas to make the site pretty. I don’t do that shit.

    I can do some basic stuff as a front-end dev, but React has nothing to do with CSS animations and all the stuff you typically associate with a “pretty” website.

    Reactive frameworks are just handy for updating the DOM on a mutable website (i.e. forms, WebSocket stuff, data in/out, pulling data from a DB).

    Blogs tend to be statically generated, so there should be zero reason to use reactive frameworks anyway, unless you add something dynamic like a comment box folks can log in to and leave comments/likes/shares etc. Loading those comments will prolly want a framework.

    Aside from that, it’s mostly CSS to do fancy stuff.

    • knfrmity@lemmygrad.ml

      Pretty sites are cool and all, but in my experience the super simple things on them often just don’t work. I’m not patient anymore when it comes to stuff like that, so I’ll close the tab real quick and find the information elsewhere or move on to the next thing.

      • Lmaydev@programming.dev

        Pretty sites aren’t aimed at us though.

        For the average consumer, grabbing their attention is really important, and first impressions mean a lot.

        If they go onto a site that looks “basic” it’ll give a bad impression of the business.

        • knfrmity@lemmygrad.ml

          I know I’m not part of the target audience for pretty sites, but the average user gets frustrated with poor design choices and outright broken websites as well.

          Just one recent (and therefore fresh in my mind) example: I was on a pretty site the other day and nothing happened when I clicked on “About Us”. The next thing I did was close the tab. As you say, first impressions mean a lot.

          I hear complaints about these kinds of things at work constantly as well. As an internal product owner of sorts, users assume the devs and I make poor design choices on our own, but all we can do is manage as best we can with the UX garbage Microsoft comes up with.

  • UNWILLING_PARTICIPANT@sh.itjust.works

    Agree with the article (and the 10 other ones I’ve already read on the topic) but Paul Graham’s website looks like ass on mobile as of 2024. I couldn’t even figure out how to get to the content, at least on cursory examination.

    Good point about solo/team or simple/scalable though. Right tool for the job and all that. Good stuff

      • jadero@programming.dev

        There was a thread elsewhere asking whether a toggle should show the current state or the desired state. There was enough disagreement that it quickly became apparent that, whatever else the toggle does, there should be something external to it showing the possible states, indicating which way to move the toggle regardless of its appearance.

      • qaz@lemmy.world

        It did nothing the first time I clicked it and only changed on the second click.

        • snowe@programming.dev

          I’ve wondered what this problem was for years but never cared to figure it out, because it always resolved after the first button press (just refresh the page and it all works properly). Turns out it’s something wrong with my use of localStorage to save your theme state: if you don’t have the key in localStorage, then it does what you mentioned. I just need to switch this to prefers-color-scheme anyway.
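
          Roughly this shape (a sketch, not the exact site code):

          ```javascript
          // Fall back to prefers-color-scheme when the localStorage
          // key is missing, so the first toggle starts from a known
          // state instead of doing nothing.
          const stored = localStorage.getItem("theme");
          const initial =
            stored ??
            (matchMedia("(prefers-color-scheme: dark)").matches
              ? "dark"
              : "light");
          document.documentElement.dataset.theme = initial;

          function toggleTheme() {
            const next =
              document.documentElement.dataset.theme === "dark"
                ? "light"
                : "dark";
            document.documentElement.dataset.theme = next;
            localStorage.setItem("theme", next);
          }
          ```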

      • fd93@programming.dev

        Hey - the poster isn’t actually the author. That would be me! Thanks for the feedback though. I normally just use Dark Reader for switching theme.

        • im sorry i broke the code@sh.itjust.works

          It works if you’ve visited the website already, but the first time it breaks: reloading fixes it, while emptying the caches breaks it again. As another user pointed out, the first click after emptying the cache doesn’t do anything at all, even though the animation plays out just fine.

          EDIT: I’m on Safari on iOS, latest stable (17.3.1 if I’m not mistaken)

  • No1@aussie.zone

    I’d rather take any website than be continuously forced to download apps, or be told to go to Facebook for some business’s information.

    There are two things a website should respect (they’re simple, so do them more often), and not doing these will earn you my wrath:

    • You should be able to at least zoom/shrink text. Some websites have things so locked down, I can never read their teeny tiny text. Fuck you, ESPN. Why would you allow zoom on desktop and stop it on mobile, where my screen is smaller and I need it most? (I’ll leave the original intent of the web, separating presentation from content, for another day.)
    • The browser’s Back button should take you back to the previous ‘page’. I’m terrified to use it because you’re really showing multiple ‘pages’ on 1 real page, so who knows where I’ll end up. (A sketch of how sites can get this right follows below.)
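
    For the devs reading: single-page sites can keep the Back button honest by recording each view change in session history. A minimal sketch (the view names are made up):

    ```javascript
    // Show one view at a time by toggling the hidden attribute.
    function showView(name) {
      document.querySelectorAll("[data-view]").forEach((el) => {
        el.hidden = el.dataset.view !== name;
      });
    }

    // Record each view change in session history.
    function navigateTo(name) {
      history.pushState({ view: name }, "", "#" + name);
      showView(name);
    }

    // Back/Forward fire popstate; restore whichever view was recorded.
    window.addEventListener("popstate", (event) => {
      showView(event.state?.view ?? "home");
    });
    ```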
  • Phoenix3875@lemmy.world

    Static websites can be beautiful and easy to use without being complex.

    PG’s blog and HN could definitely use some CSS tweaks. I can’t remember how many times I’ve clicked the wrong thing on HN.

    On the other hand, it’s easy to get reader mode/custom CSS/alt frontend working for such websites, so maybe it’s alright after all.

  • MonkderZweite@feddit.ch

    For style examples, take Paul Graham’s site

    Nope, it displays nothing with no JS on mobile, while the other two examples seem to be desktop-only. You can do better with only HTML and CSS.

  • stoy@lemmy.zip

    I have had an account on DeviantArt for almost 20 years, and up until last year I used to upload my photos to my gallery there.

    However, over the years it has only gotten worse: it is slow, annoying, and has had features removed that I wanted.

    So last year I set up a simple menu system, started generating photo galleries in digiKam, and now upload galleries there instead, and it is so much more responsive.

    The menu I wrote is built in HTML and CSS; the galleries digiKam exports for me do use JavaScript, but only to aid in navigating the galleries with the arrow keys, so everything loads instantly.
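
    The arrow-key part is tiny; something along these lines (a sketch, not digiKam’s actual exported code):

    ```javascript
    // Keyboard navigation as an optional enhancement: the
    // prev/next links work as plain HTML, and this only adds
    // arrow-key shortcuts on top of them.
    document.addEventListener("keydown", (event) => {
      const link =
        event.key === "ArrowLeft"
          ? document.querySelector("a[rel=prev]")
          : event.key === "ArrowRight"
            ? document.querySelector("a[rel=next]")
            : null;
      if (link) link.click();
    });
    ```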

    When I publish new galleries I do need to edit the HTML code in the menu (and one line in the gallery) but it is as easy as I can make it while still giving me some options.

    • mox@lemmy.sdf.org

      The menu I wrote is built in HTML and CSS; the galleries digiKam exports for me do use JavaScript, but only to aid in navigating the galleries with the arrow keys, so everything loads instantly.

      I love sites like this. Fully functional with plain HTML and CSS. JavaScript used only for optional enhancements. Fast, light, and trustworthy.

      • stoy@lemmy.zip

        Exactly. Even now, after half a year of using it, I am blown away by how fast it loads, and I love knowing exactly what is going on when it loads.

        I even tried it on my phone, and the galleries have a responsive design. Better yet, they recognize swipes, making them easy to navigate on phones and tablets.

    • Gloria@sh.itjust.works

      How do you solve the discoverability issue? A platform gives you some place where people could stumble upon you, while a website is an island in the middle of an ocean that people have to actively browse to. Do you now crosspost your new work more to get it seen by others? I find it hard to believe that people would like to browse to x different websites to see if an artist has new works, only to find out that they don’t. For finding new artists, a central place or a feed, like a platform can provide, seems nearly impossible to replace.

      • stoy@lemmy.zip

        I don’t really use it for advertising. I have actively added the directory to the robots file and requested that search engines not index the page. I like it being hidden, but available for me to show people on their own computer. I also have a link to the page on my CV under hobbies.

      • jadero@programming.dev

        Edit: the bits barely had a chance to dry on my comment when I came across https://rss-parrot.net/

        This is a way of integrating RSS feeds into your personal timeline on Mastodon. I don’t know how this affects the work I describe at the bottom of this comment, but I bet it has a role to play.


        I find it hard to believe that people would like to browse to x different websites to see if an artist has new works, only to find out that they don’t.

        RSS FTW!

        Every site I’ve ever created or been involved with in even the tiniest capacity has supported RSS. Sometimes it was enabled just to shut me up.

        I’m not sure how to better promote the use of RSS and get people to use feed readers, but I think it is the answer to at least that particular issue.

        My personal opinion is that a “platform” should really be just a collection of searchable and categorized feeds with its own feed. That way there is both discoverability and the ability for individuals to construct their own personal feed on their own personal device (no server required!) while staying abreast of new feeds on the master feed aggregation “platform.”

        There are innumerable ways for people to get their own content into something that supports RSS and that feed could be easily submitted to the master feed aggregation “platform” to deal with the discoverability issue. For example, Mastodon and most compatible systems support RSS and registration is child’s play on any server that allows public registration.

        In fact, the “platform” could set up a crawler to automatically discover RSS feeds. If the author has done the metadata right, the results would even be automatically categorized.
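
        Autodiscovery is nearly free, since feeds announce themselves in the page head. A sketch (Node-style JavaScript with global fetch; a real crawler would use a proper HTML parser instead of a regex):

        ```javascript
        // Fetch a page and look for the standard feed declaration:
        // <link rel="alternate" type="application/rss+xml" href="...">
        async function discoverFeed(siteUrl) {
          const html = await (await fetch(siteUrl)).text();
          const match = html.match(
            /<link[^>]+type="application\/(?:rss|atom)\+xml"[^>]*href="([^"]+)"/i
          );
          // Resolve relative feed URLs against the site URL.
          return match ? new URL(match[1], siteUrl).href : null;
        }
        ```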

        Done right, the “platform” might actually run on a pretty small server, because it would be linking to sites, and only pulling summaries from them.

        Even comments could be supported with a little creativity. As I said, there are innumerable ways for people to get their own content out there. If there were a standard metadata tag “comment: <link to article or another comment>”, some fancy footwork could produce a threaded discussion associated with a particular article, even if the original author has no internal commenting system. (And my favoured internal comment system would permit nothing but pure HTTPS links to the commenter’s own content, extracting a short summary for display.)

        Side note: I acquired a domain explicitly for the purpose of setting up such a feed aggregation “platform.” Now that I’m retired, I’m slowly working on creating it. Everything is highly experimental at this point and, to be honest, shows no visible progress to that end, but that is my ultimate goal.

        • brisk@aussie.zone

          This is an interesting-sounding project. Do you have a feed/blog/Mastodon/mailing list you’re likely to announce on?

          • jadero@programming.dev

            Thanks for your interest!

            Apart from here and “self-hosting” and other communities, if you’re a glutton for punishment, you can see what’s up at https://walloftext.ca. I’m currently in the process of rebuilding everything from the ground up, including an associated Mastodon-compatible instance. I’ve not yet rewritten my project outline to account for all the new stuff I’ve learned about in the past few months, but it’s coming in the next few days.

            Just note the most important part of my tagline: “Unstable by nature”. Some would argue that applies more to me than the stability of the site and projects. 😛 Either way, chaos is probably the order of the day for at least the rest of this year. (And I mostly take summers off to reenergize by fishing, working in my shop, etc.)

      • lemmyvore@feddit.nl

        Not OP but I would use a CDN like bunny.net. It’s cheap and you get geo redundancy and all kinds of perks with it.

        You can set Bunny CDN to pull from your home server, or you can upload your files to Bunny storage and have it pull from there, so it doesn’t matter whether your home server is on or not.

        I’m currently running only the dynamic parts at home (CMS, generators, etc.) and I “host” all the statically generated stuff on there.

        • projectmoon@lemm.ee

          Yeah, that sounds like a good idea. I am using PhotoPrism for photo management. It doesn’t really support S3 or any CDN. You could use a FUSE filesystem or something, but it’s very slow.

          • lemmyvore@feddit.nl

            It’s probably better to export the photos if you want to make a public presentation gallery. Many image viewers can create static HTML pages for a given set of images: GThumb, digiKam, etc. But it could work with a photo management app too, if it has presentation gallery support and can be configured to serve images from a CDN prefix.

            The catch with CDN support in dynamic apps is that they need to be aware that you want to use a CDN, so they can both provide a dynamic view (of whatever resource you’re trying to cache) for the CDN to pull the original from, and use the CDN URL in their main pages so they take advantage of the caching.
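
            In other words (a sketch; the hostname is made up): pages embed the CDN prefix, while the app keeps serving the original files for the CDN to pull on a cache miss:

            ```javascript
            // A CDN-aware app rewrites asset URLs to the CDN prefix
            // but still serves the originals itself, so the CDN's
            // origin pull has a source on a cache miss.
            const CDN_PREFIX = "https://example.b-cdn.net"; // hypothetical pull zone

            function assetUrl(path) {
              return CDN_PREFIX + path; // e.g. assetUrl("/photos/123.jpg")
            }
            ```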

            Alternatively, if they don’t have CDN support, or you want to isolate the dynamic app from the public: if the app makes good static-looking URLs, you can scrape it, make static pages, and upload those to the CDN.

            I recently did this for someone who was using a gallery app that was made with super old MySQL and PHP versions and was super unsafe and inefficient. I used URL-rewriting to make all pages end in .html and .jpg, then scraped it with wget, uploaded the whole thing to CDN and pointed a CNAME of their domain to the CDN domain. The dynamic app now lives on a private server in docker containers which they access over VPN, and when they change stuff I just run a script that takes a new snapshot and uploads it to the CDN.

            • projectmoon@lemm.ee

              Definitely a good way to do it. PhotoPrism supports uploading to WebDAV for sharing. Could front a CDN upload with a WebDAV server 🤔

      • stoy@lemmy.zip

        Currently I borrow space on my dad’s web host; he wasn’t using it and was OK with me doing it.

  • litchralee@sh.itjust.works

    I’m a fan of Pelican for static blog generation from Markdown files. Separating template and content into CSS/HTML and Markdown files, and keeping it all in a Git repo for version control, comes to only a few hundred kilobytes. Lightweight to work on, and lightweight to deploy. It’s so uncomplicated, I could probably pick it right back up after leaving it alone for ten years.

    • owenfromcanada@lemmy.world

      I’ve found GetSimple to have similar advantages. It’s not so much a “static” site generator, but it uses flat XML storage for content instead of a database, so I can back it up in a Git repo and deploy by just copying files.

  • Omega_Haxors@lemmy.ml

    I once went to a professional to get a website done (as my ability (read: patience) to code websites had proved inadequate) and they constantly tried to upsell me on just the most stupid bullshit. When I pointed out that a lot of moving parts just means more things that could possibly break, they blew me off and acted like it was a completely unreasonable concern. Needless to say, I ended up using a website builder instead, and despite a few small glitches it works pretty well with JS completely disabled.

    EDIT: I was particularly concerned with how heavily they were leaning on JS, to the point that it flat out wouldn’t load at all for some users. Having JS flair on the side is perfectly fine, but when you can’t even get fucking text to load without it, that’s a problem.