i absolutely hate how the modern web just fails to load if one has javascript turned off. i, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on. it’s not a hard concept, people.

but you ask candidates to explain “graceful degradation” and they’ll sit and look at you with a blank stare.

  • Sertou@lemmy.world · 5 months ago

    The web isn’t just HTML and server side scripting anymore. A modern website uses Javascript for many key essentials of the site’s operation. I’m not saying that’s always a good thing, but it is a true thing.

    It is no longer a reasonable expectation that a website work with JavaScript disabled in the browser. Most of the web is now in content management systems that use JavaScript for browser support, accessibility, navigation, search, analytics and many aspects of page rendering and refreshing.

    • katy ✨@piefed.blahaj.zone (OP) · 5 months ago

      The web isn’t just HTML and server side scripting anymore. A modern website uses Javascript for many key essentials of the site’s operation.

      which is why the modern web is garbage

  • cley_faye@lemmy.world · 5 months ago

    it’s not a hard concept, people.

    Depends. Webapps are a thing, and without JavaScript, there isn’t much to show at all.

    Websites that mostly serve static content, though? Yeah. Some of them can’t even manage a basic one-line message asking you to turn on JavaScript; you get a completely white page, even though the data is there. I blame the “new framework every week” approach. Doubly so for sites that start loading, actually show the content, and then load some final element that covers everything up.

    • Scrollone@feddit.it · 5 months ago

      It depends. Inertia.js can pre-render pages server side, so you don’t need JavaScript to see the content.

      • cley_faye@lemmy.world · 5 months ago

        React can do SSR, too. The issue is that some sites simply mean nothing if they aren’t dynamic. It makes sense to have SSR and sprinkle some JS on the client for content delivery; no issue there.
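
        The SSR idea is simply that the server emits finished HTML before any client script runs, so the content is readable even with JS off. A minimal, dependency-free sketch of that idea (hand-rolled for illustration, not React’s or Inertia.js’s actual API):

        ```javascript
        // Sketch of server-side rendering: the server turns data into complete
        // HTML, so the page content exists before (and without) any client JS.
        function escapeHtml(s) {
          return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
        }

        function renderArticle({ title, body }) {
          return (
            "<article>\n" +
            `  <h1>${escapeHtml(title)}</h1>\n` +
            `  <p>${escapeHtml(body)}</p>\n` +
            "</article>"
          );
        }

        console.log(renderArticle({ title: "Hello", body: "Readable without JS" }));
        ```

        A client script can still “hydrate” the markup afterwards; the point is that the baseline page does not depend on it.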

  • Supervisor194@lemmy.world · 5 months ago

    It’s even worse than that. I have an old Raspberry Pi 3B+ (1 GB) that I got in 2018. I hooked it up the other day to mess around with it; it had been maybe 2 years since I did anything with it, ever since I got a Pi 4 (4 GB). 1 gigabyte of RAM is now insufficient to browse the web. The machine freezes when loading any kind of interactive site. Web dev is now frameworks piled on frameworks with zero consideration for overhead, and it’s pure shit. Outrageous.

    • rottingleaf@lemmy.world · 5 months ago

      We’re encouraged to use things whose supply chain is easily poisoned.

      There’s a Heisenberg effect at play here: when an observer is present, like a huge audit of something, nothing happens; and when no observer is present, there’s nobody to dig through piles of constantly changing crap every day to detect whether something has happened.

      Also, not even easily poisoned, but easily denied. It’s about control. Militaries and producers of complex industrial equipment were the first to start doing this, however nuts that may seem. It’s useful to sell your allies a system they can use, but only when allowed. Or to sell industrial equipment that can’t be smuggled to a third country without your permission.

      These things may be legal, even moral, but at some point in discussing them the common good arises as a thing in itself, separate from morality. For the common good, such systems of control are clear poison.

    • Possibly linux@lemmy.zip · 5 months ago

      If you want to see something terrible, try looking at the network tab in the browser’s inspector.

      “Modern” pages load hundreds of large assets instead of keeping things small and clean.

  • NigelFrobisher@aussie.zone · 5 months ago

    I built an internal tool that works with or without JS turned on, but web devs want something simple for them, with a framework, which is why you have to download 100 MB just for a basic form page.

    • memfree@piefed.social · 5 months ago

      This is correct. Web devs are told to make sure ads load before content. They don’t want users who don’t generate profit.

      • Mose13@lemmy.world · 5 months ago

        So in this example, what’s the underlying issue: shitty business requirements or JavaScript?

            • Venia Silente@lemmy.dbzer0.com · 5 months ago

              Just because there are other ways to serve you ads doesn’t at all mean we shouldn’t be able to stop at least one of them, and especially the one that is most dangerous, since it literally allows RCE on all clients. By design.

              • Mose13@lemmy.world · 5 months ago

                The browser is supposed to be a sandbox environment for RCE. That’s why the sandbox part is important. Maybe instead of removing the RCE, we can lock down the sandbox better and reduce the amount of information advertisers can collect.

                If you remove code execution in the browser, then many websites will need to ship desktop apps instead. So now you’ve bypassed the browser sandbox altogether and that application can do much more damage.

                I’m not arguing that all websites need to execute in the browser, but without code execution in the browser, you remove a whole class of apps and the web becomes much less useful.

                Edit: calling it RCE is also kinda obnoxious, because at that point you might as well call everything RCE. By that definition, if I push a Docker image update, do I have RCE inside every container pulling that image? If there’s a way to break out of the Docker or browser sandbox, by all means call it RCE, but this is not that.

                • Venia Silente@lemmy.dbzer0.com · 5 months ago

                  Maybe instead of removing the RCE, we can lock down the sandbox better and reduce the amount of information advertisers can collect.

                  By all means, but then someone actually has to do it, because it’s 2025 and even Firefox sends all this information that is absolutely not needed to show a webpage. We’re at least 25 years late on this.

                  If you remove code execution in the browser, then many websites will need to ship desktop apps instead.

                  Which in quite a few cases would be a good thing, precisely because some things should be native programs instead of requiring the web browser to provide basically all the functions of an OS.

  • normalexit@lemmy.world · 5 months ago

    Developers are still familiar with the concept; there are even ideas like server-side rendering in React to make sites more SEO-friendly.

    I think the biggest issue is that there is very little business reason to support these users. Sites can be sued over a lack of accessibility, and they can lose business from bad UX, so they are going to focus on those two areas ten times out of ten before focusing on noscript and lynx users. SEO might be a compelling reason to support it, but only companies that really have their house in order focus on those concerns.

  • Korne127@lemmy.world · 5 months ago

    I, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on.

    I mean… many websites rely on JavaScript, so it’s kind of obvious that they don’t work without it. If they worked without JS in the first place, the website wouldn’t need to embed any JS code.

    • adarza@lemmy.ca · 5 months ago

      website wouldn’t need to embed any JS code.

      other than the 20 trackers and ad scripts.

    • katy ✨@piefed.blahaj.zone (OP) · 5 months ago

      many websites rely on JavaScript,

      which is the problem: most people don’t understand the concept of graceful degradation

    • Azzu@lemmy.dbzer0.com · 5 months ago

      There’s a difference between “wouldn’t work” and “wouldn’t work as nicely”. That’s what this post is about :D Most websites would still work in the same basic way without js.

      • Kazumara@discuss.tchncs.de · 5 months ago

        OP really muddied the waters by writing:

        exactly as it does with javascript turned on

        That’s obviously impossible and wouldn’t be degraded.

          • Kazumara@discuss.tchncs.de · 5 months ago

            It’s either exactly the same, or it’s gracefully degraded. You’re asking for two opposite things at once.

            For what it’s worth I support the notion that fundamental functionality should be supported without Javascript, with good old form submissions.

            But I also recognise that you can’t get the exact same behaviour without javascript initiated background GETs and POSTs. Easy example: A scrollable map that streams in chunks as you move it.
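
            That baseline-plus-enhancement split can be sketched on the server: the same route serves a full page to a plain form submission and a bare fragment to a JS-initiated fetch. The route and header names here are made up for illustration, not any real framework’s API:

            ```javascript
            // Graceful-degradation sketch: one route answers a plain HTML form
            // submission with a complete page, and a background fetch with just a
            // fragment. With JavaScript off, <form action="/search"> still works
            // as a normal page load.
            function handleSearch(query, headers) {
              const results = `<ul><li>Match for "${query}"</li></ul>`;
              if (headers["x-requested-with"] === "fetch") {
                // Client JS set this (hypothetical) header on its background
                // request: return only the fragment for it to swap into the page.
                return results;
              }
              // No JS involved: render a complete page around the same results.
              return (
                "<!doctype html><title>Search</title>" +
                '<form action="/search" method="get"><input name="q"></form>' +
                results
              );
            }

            console.log(handleSearch("lemmy", {}));
            console.log(handleSearch("lemmy", { "x-requested-with": "fetch" }));
            ```

            Both paths hit the same logic, so the no-JS baseline never drifts out of sync with the enhanced one.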

      • Possibly linux@lemmy.zip · 5 months ago

        Why would someone spend tons of time on something that isn’t needed? Only a few people even know how to turn off JavaScript, and chances are they’ll just turn it back on once nothing works.

    • Björn@swg-empire.de · 5 months ago

      Most websites out there could work fine without JavaScript. They rely on it because they can’t be bothered to be better.

      • Possibly linux@lemmy.zip · 5 months ago

        Have you ever tried building a modern page without JavaScript?

        You can do a lot with HTML5 and CSS. It’s just very complicated and painful. It isn’t intuitive, and the behavior varies across browsers. What could be a little JavaScript turns into a ton of write-only CSS.

        • Swedneck@discuss.tchncs.de · 4 months ago

          sure, it’s painful and pointless to build a fucking virtual machine without JS, but you can do 95% of normal website things with pretty bog-standard HTML+CSS these days. You don’t even have to fiddle about to do pretty complex things; that’s mostly just built in.

        • Björn@swg-empire.de · 5 months ago

          Yes, that’s my job.

          The point isn’t to emulate the JavaScript functionality somehow. The point is to simply fetch the desired information as a new page load when necessary. The page should work in lynx.

    • MonkderVierte@lemmy.zip · 5 months ago

      so it’s kind of obvious that they don’t work without it.

      Uhm, the web is for sharing content, not for running JS. That’s what graceful degradation is for: the primary use case should still work even if the secondary or tertiary one doesn’t.

      • Korne127@lemmy.world · 5 months ago

        Uhm, the web is to share content, not to play JS

        The web doesn’t have a single unified purpose. Even if I hate it as a programming language, JavaScript is the basis almost all client-side browser operations build upon.

        Sure, a simple website that just contains information works without it, but if you design a website in which the client does anything interactive, and not everything should be processed server-side, it’s not really possible. No matter whether you’re talking about a web game, something like Google Earth, or an in-browser editor.

  • hperrin@lemmy.ca · 5 months ago

    It is substantially harder to make a modern website work without JavaScript. Not impossible, but substantially harder. HTML forms are not good at doing most things. Plus, a full page refresh on nearly any button click would be a bad experience.

    • Mose13@lemmy.world · 5 months ago

      Graceful degradation is for people who are angry about the future. Progressive enhancement is for people who respect the past. And it’s stupid not to hire someone just because they don’t know a term that you know.

        • Mose13@lemmy.world · 5 months ago

          I was referring to OP’s post because they mentioned “candidates”. I was also agreeing with your comment regarding progressive enhancement lol

  • swelter_spark@reddthat.com · 5 months ago

    Love it when a page loads, and it’s just a white blank. Like, you didn’t even try. Do I want to turn JS on or close the tab? Usually, I just close the tab and move on. Nothing I need to see here.

    • Croquette@sh.itjust.works · 5 months ago

      React tutorials are like that. You create a simple HTML page with a script, and the script generates everything.

      I had to build a simple webpage for an embedded web server, and the provider of the library recommended Preact, the lightweight version of React. Having no webdev experience, I used Preact as recommended, and it’s a nightmare to use and debug.