• limer@lemmy.ml
    13 days ago

    I think the best web page is a photo of a handwritten page of paper; several photos if one has a lot to say.

    Today’s bandwidth and powerful computers can easily handle it

    • Hellfire103@lemmy.caOP
      13 days ago

      I flip back and forth between Brave and Tor Browser, depending on which one appears less fingerprintable; and I’ve disabled all of the analytics.

      • moseschrute@lemmy.world
        13 days ago

        The more things you block, the more unique and fingerprintable you become. Blocking JavaScript altogether may mitigate some of that, but you can be fingerprinted even without JS.

        Tor Browser is a little better because it makes your browser blend in with other Tor Browsers, so instead of being a unique 1-of-1 you’re more like 1 out of all Tor users.

        I haven’t looked into this in a couple years, but that is my takeaway last time I went down the privacy/fingerprint rabbit hole.

        • Hellfire103@lemmy.caOP
          12 days ago

          I know, and I’m still researching the best way to mitigate this. So far, I’ve come away with the impression that Tor Browser and Brave do the best jobs of minimising fingerprinting, otherwise I would have just disabled JS in Vanadium and called it a day.

        • LifeInMultipleChoice@lemmy.world
          13 days ago

          (Not talking about a specific browser, just in general.) Maybe I’m misunderstanding, but when the VPN makes a request for the page information, the request isn’t forwarding the browser information, is it? So wouldn’t most of that be mitigated there?

          As in, the VPN’s server making the request should be what shows up when they scrape that information, not the end user. Maybe I’m not understanding that though.

          • moseschrute@lemmy.world
            13 days ago

            A VPN doesn’t alter the requests your browser is making. It just masks your IP address. So any information about your browser is still sent. The exception would be if your VPN provides some sort of tracker/ad blocking feature where it can block certain requests. But it’s not really a magic switch that prevents websites from tracking you.

      • _stranger_@lemmy.world
        13 days ago

        it’s still owned by a homophobe that loves crypto, and is likely an antivaxxer.

        He was run out of Mozilla after only eleven days as CEO, and he helped found it!

        the guy is an asshole, and he’s very likely using brave money for evil shit.

  • witty_username@feddit.nl
    14 days ago

    If I wanted to write a site with JS-equivalent functionality and UX without using JS, what would my options be?

      • unwarlikeExtortion@lemmy.ml
        14 days ago

        You can’t modify the DOM.

        But most dynamic behaviour can stay - sites can be built freely server-side, and even some “dynamic” functionality like menus can be made using CSS pseudo-classes.

        Sure, you won’t have a Google Docs or Gmail webapp, but 90% of stuff doesn’t actually need one.

        A basic website doesn’t require js.

        A webshop, for example, does for the part around adding to cart and checkout - but it doesn’t for merely browsing.
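        For instance, a minimal sketch of a JS-free dropdown using the `:hover` pseudo-class (the class names and URLs here are made up):

```html
<style>
  .menu .dropdown { display: none; }
  /* Reveal the submenu when the parent item is hovered or keyboard-focused */
  .menu li:hover .dropdown,
  .menu li:focus-within .dropdown { display: block; }
</style>
<ul class="menu">
  <li><a href="/shop">Shop</a>
    <ul class="dropdown">
      <li><a href="/shop/books">Books</a></li>
      <li><a href="/shop/music">Music</a></li>
    </ul>
  </li>
</ul>
```

        No script runs; the browser handles the open/close state itself.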

    • dondelelcaro@lemmy.world
      14 days ago

      htmx or equivalent technologies. The idea is to render as much as possible server side, and then use JS for the things that can’t be rendered there or require interactivity. And at the very least, serve the JS from your server, don’t leak requests to random CDNs.
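      As a sketch of that approach (the `/more-rows` endpoint, the ids, and the script path are hypothetical): the server returns HTML fragments, and htmx swaps them into the page without a full reload.

```html
<!-- htmx served from your own host, not a third-party CDN -->
<script src="/js/htmx.min.js"></script>

<table>
  <tbody id="rows">
    <!-- server-rendered rows ... -->
  </tbody>
</table>

<!-- On click, GET /more-rows and append the returned HTML fragment -->
<button hx-get="/more-rows" hx-target="#rows" hx-swap="beforeend">
  Load more
</button>
```

      Everything stays server-rendered; the only client-side JS is the small, cacheable htmx library itself.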

    • TrickDacy@lemmy.world
      14 days ago

      I mean, you could build a site in Next.js, ironically. Which is very counterintuitive, because it literally is JS you are writing, but you can write it to avoid dynamic things, so it effectively becomes a static, server-rendered site that, if JS is enabled, gets things like a loading bar and quick navigation transitions for free. If JS is disabled, it functions just like a standard static site.

    • Hellfire103@lemmy.caOP
      14 days ago

      HTML and CSS can do quite a lot, and you can use PHP or cgi-bin for some scripting.

      Of course, it’s not a perfect alternative. JavaScript is sometimes the only option; but a website like the one I was trying to use could easily have just been a static site.

  • Mwa@thelemmy.club
    14 days ago

    I just use NoScript to do this. It’s annoying to visit websites that need JavaScript, but NoScript is handy because I can turn on just the JavaScript a website needs for functionality (this should also speed up load times).
    Sometimes, if I’m using a browser without extension support (like GNOME Web), I just disable JavaScript on websites or frontends that don’t need it, like Invidious (if I’m facing issues).

  • MonkderVierte@lemmy.zip
    14 days ago

    Skill issue - on the devs’ side.

    A lot of pages even fail if you only disable 3rd-party scripts.

    I consider them broken: the platform’s job is to render a Document Object Model, scripting is secondary functionality, and having no fallbacks is bad practice. Imagine if a PDF/EPUB behaved that way.

    • Spice Hoarder@lemmy.zip
      14 days ago

      Personally, I love server-side rendering, I think it’s the best way to ensure your content works the way YOU built it. However, offloading the processing to the client saves money, and makes sense if you’re also planning on turning it into an electron app.

      I feel it’s better practice to use a DNS resolver that blocks known telemetry and malware domains.

      Personally, I used to blacklist all scripts and turn them on one at a time till I had the functionality I needed.

    • katy ✨@piefed.blahaj.zone
      14 days ago

      wild thing is that with modern css and local fonts (nerdfonts, etc), you can make a simple page with a modern grid and nested css without requiring a single third party library or js.

      devs are just lazy.

    • JustARaccoon@lemmy.world
      14 days ago

      But they’re not PDFs/EPUBs, they’re live pages that support changing things in the DOM dynamically. I’m sorry, I’m not trying to be mean, but people not wanting scripting on their sites are a niche inside a niche. In terms of prioritising fixes, that’s a very small audience with a very small ROI, and the fix might require a huge rewrite. It’s just not financially feasible, for not much of a reason other than puritan ones.

      • MonkderVierte@lemmy.zip
        13 days ago

        Simpler websites have some advantages: less work to maintain, plus responsiveness and accessibility by default.

        Sure, that only goes for what already exists - it starts with the choice of frameworks.

    • josefo@leminal.space
      13 days ago

      It’s like JavaScript is used way beyond its reasonable use cases, and you need a thick layer of framework indirection to be able to do anything - and yet it still sucks.

  • fxdave@lemmy.ml
    14 days ago

    As a web developer, I see JS as a quality improvement. No page reloads, nice smooth UI. Luckily, the PHP era has ended, but even in the PHP era disabling jQuery could cause problems.

    We could generate static HTML pages; it just adds complexity.

    Personally, I use only client-side rendering, and I think that’s the best from a dev perspective. Easy setup, no magic, nice UI. And that results in a blank page when you disable JS.

    If your motivation is to stop tracking:

    • replace all foreign-domain sources with file URIs, e.g. load Google Fonts from a local cache.
    • disable all foreign script files unless they’re legitimate, like JS packages from public CDNs - in which case load them from a local cache.

    If your motivation is to see old HTML pages with minimal style, well, it’s impossible to do that reliably. If you’re worried about closed-source JS, you shouldn’t be: it’s an isolated environment. If something is possible for JS and you want to limit its capability, contribute to browsers. That’s the clear path.

    I can be convinced. What’s your motivation?

    • unwarlikeExtortion@lemmy.ml
      14 days ago

      As a web dev, and primarily user, I like my phone having some juice left in it.

      The largest battery hog on my phone is the browser. I can’t help but wonder why.

      I’d much rather wait a second or two rather than have my phone initialize some js framework 50 times per day.

      Dynamic HTML can be done - and is - server-side. Of course, not using a framework is harder, and all the current ones are client-side.

      Saying unbloated pages are impossible to do right just makes it seem like you’re ill-informed.

      On that note, “closed-source” JS doesn’t really exist (at least client-side) - all JS is source-available in-browser. Some may obfuscate, but that isn’t a privacy concern.

      The problem is that my phone does something it doesn’t have to.

      Having my phone fetch potentially 50 MB (usually 5-15) for each new website is a battery hog. And on a slow connection - to quote your words, “great UX”.

      The alternative is a few KB for the HTML, CSS and a small amount of tailor-made JS.

      A few KBs which load a hundred times faster and don’t waste exorbitant amounts of computing power - while in essence losing nothing over your alternative.

      “Old pages with minimal style” is a non sequitur. Need I remind you, CSS is a thing. In fact, it may be more reliable than JS: since it isn’t Turing-complete, it’s much easier for browser engines not to fuck it up. It’s also not nearly the vulnerability vector JS is.

      And your message for me and people like me, wanting websites not to outsource their power-hogging frameworks to my poor phone?

      Go build your own browser.

      What a joke.

      • fxdave@lemmy.ml
        14 days ago

        Who said making unbloated pages is impossible? Your comment would be more serious without the emotion.

        Source code is the code that gets transformed into some target code. Obfuscated code is not source code.

        A reminder: in the past, large pages downloaded everything at once. In contrast, with dynamic imports the first load is much, much faster - and that matters most. Any change in dynamic content only requires the dynamic data to be downloaded. My phone lasts at least two days on one charge (average usage), but I charge it every night, so that’s not an issue.

        • unwarlikeExtortion@lemmy.ml
          14 days ago

          Source code is the code devs write.

          For compiled languages like C, only the compiled machine code is made available to the user.

          JS is interpreted, meaning it doesn’t get compiled ahead of time; the interpreter runs the source code directly at runtime.

          Obfuscated code, while not technically unaltered source code, is still source code - the key word being unaltered. It only fails to be source code by virtue of not coming straight from the source (i.e. because it’s altered).

          However, obfuscated code is basically source code. The only things to obfuscate are variable and function names, and perhaps some pre-compile order-of-operations optimizations. The core syntax and structure of the program have to remain “visible”, because otherwise the interpreter couldn’t run the code.

          Analyzing obfuscated code is much closer to analyzing source code than reverse-engineering compiled binaries.

          It may not be human-readable, but other programs can analyze it (as they can even compiled code) and, more importantly, alter it trivially. Because it’s essentially source code with the names censored out, evaluating it is only a bit harder than if it were truly “closed-source”.

          That’s why website source code is basically almost source-available.
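          For example (a contrived sketch): minification renames identifiers and strips whitespace, but the program’s structure and behaviour have to survive, because the interpreter still has to run it.

```javascript
// Readable source, as the dev wrote it
function addToCart(item, cart) {
  cart.push(item);
  return cart.length;
}

// A typical minified/obfuscated equivalent: names shortened,
// whitespace stripped, but syntax and structure intact
function a(b, c) { c.push(b); return c.length; }

console.log(addToCart('book', []));  // 1
console.log(a('book', []));          // 1 -- identical behaviour
```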

          A reminder: in the past, large pages downloaded everything at once. In contrast, with dynamic imports the first load is much, much faster - and that matters most. Any change in dynamic content only requires the dynamic data to be downloaded.

          Unfortunately, you’re very mistaken.

          In the past, pages needed to download whatever they wanted to display to the user. Now, here’s the kicker: that hasn’t changed!

          Pages today are loaded more dynamically and sensibly. First the basic stuff (text), then styles, then scripts, then media.

          However, it’s not Angular, React, Bootstrap or any other framework doing the fetching. It’s the browser. Frameworks don’t change that. What they do, instead, is add additional megabytes of (mostly) bloat to download every day or week (depending on the cache timeout).

          Any web page gets its HTML loaded first - since the dawn of the Web, that’s the page itself. Even IE did that. At first, browsers loaded sequentially, but then they figured out it’s better UX to load CSS first, then the rest. Media probably takes precedence over frameworks as well (because that’s what the user actually sees).

          Browsers are smart enough to cache images themselves. No framework can do it even if it wanted to because of sandboxing. It’s the browser’s job.

          What frameworks do is make devs’ lives easier. At the cost of performance for the user.

          That cost is multiple-fold: first, the framework has to load. That takes bandwidth, which may or may not be a steeply-priced commodity depending on your ISP contract. Loading also takes time, i.e. waiting, i.e. bad UX.

          Other than that, the framework needs to run. That uses CPU cycles, which wastes power and lowers battery life. It’s also less efficient than letting the browser do it on its own, because it’s a higher level of abstraction.

          With phones being as trigger-happy as they are about killing “unused” apps, all the frameworks in use by various websites need to spin up from being killed, as often as every few minutes. A less extreme amount of “rebooting” the framework happens when low-powered PCs run out of RAM and a frameworked site is chosen by the browser to be “frozen”.

          What a framework does is, basically, fill a hole in HTML and CSS - it adds functionality needed for a website which is otherwise unattainable. Stuff like cart, checkout, some complex display styles, etc.

          All of this stuff is fully doable server-side. Mind you, login is so doable it didn’t even slip onto my little list. It’s just simpler for the programmer to do it all client-side (as opposed to making forms and HTML requests that much more often), plus the tiny UX addition of not needing to wait for the back-and-forth to finish.

          Which itself isn’t really a problem. In fact, the “white flashes” are more common on framework sites than not.

          When a browser loads any site, it loads HTML first. That’s “the site”. The rest is just icing on the cake. First comes CSS, then media and JS (these two are heavily browser-dependent as far as load priority goes).

          Now comes the difference between “classic”, “js-enhanced” and “fully js-based” sites.

          A classic site loads fast. First the HTML. The browser fetches the CSS soon enough, not even bothering to show the “raw HTML” for a few hundred milliseconds if the CSS loads fast enough. So the user doesn’t even see the “white flash” most of the time, since networks today are fast enough.

          As the user moves through different pages of the site, the CSS is cached - any HTML page wishing to use the same CSS won’t even need to wait for it to load again!

          Then there’s the JS-enhanced site. It’s like the classic site, but with some fancy code to make it potentially infinitely more powerful. Stuff like responsive UIs and the ability to do fancy math one would expect of a traditional desktop/native app. Having JS saves sending every little thing needing some computation to the server, when the browser can do it. It’s actually a privacy benefit, since a lot fewer things need to leave the user’s device. It can even mend its HTML, its internal structure and its backbone to suit its needs. That’s how powerful JS is.

          But, as they say, with great power comes great responsibility: the frameworked-to-hell site. Initially, its HTML is pretty much empty. It’s less like ordering a car and more like building a house. When you “buy the car” (visit the site), it has to get built right in front of your eyes. Fun the first few times, but otherwise very impractical.

          A frameworked site also loads slower by default - the browser gets HTML first, then CSS. Since there’s no media there yet, it goes for the JS. Hell, some even leave the CSS out of the empty shell of the page, so on first visit you really get blasted by the browser’s default (usually white, though today theme-based) stylesheet. Only once the JS loads the framework can the foundation of the site (the HTML) start being built.

          Once that’s been built, it has CSS, and you no longer see the white sea of nothing.

          As you move through pages of the site, each is being built in-browser, on demand. Imagine the car turning into a funhouse where, whenever you enter a new room, a bell rings. An employee has to hear it and react quickly! They have to bring the Build-A-Room kit and deploy it, lest you leave before that happens!

          Not only is that slow and asinine, it’s just plain inefficient. There’s no need for it in 99% of cases. It slows stuff down, wastes bandwidth, wastes data and wastes energy.

          There’s another aspect of frameworked sites’ inefficiency I’d like to touch on.

          It’s the fact that they’re less “dynamic” and more “quicksand”.

          They change. A lot. Frameworks get updates, and using multiple isn’t even unheard of. Devs push updates left and right, which are expected to be visible and deployed faster than the D-Day landings.

          Which in practice means that max resource age is set very low. Days, maybe even hours. Which means that instead of having the (on average ~15 MB) framework fetched once a week or month, it’s fetched more like four to dozens of times per week. Multiply that by each site’s preferred framework and version, and add their own custom code, which also takes up some (albeit usually less-than-framework) space.

          That can easily cross into gigabytes a month. Gigabytes wasted.

          Sure, in today’s 4K HDR multimedia days that’s a few minutes of video, but it isn’t 0 minutes of nothing.

          My phone also reliably lasts a day without charge. It’s not about my battery being bad, but about power being wasted. Do you think it normal that, checking battery use, Chrome accounted for 64% according to the Android settings?

          You bet I tried out Firefox the very same day. Googling for some optimizations led me down a privacy rabbit hole. Today I use Firefox, and battery use fell from 64% to 24% - a 40-point decrease! I still can’t believe it myself.

          I admit, I tend to use my phone less and less so my current 24% may not be the best metric, but even before when I did, the average was somewhere between 25% and 30%.

          There’s a middle-ground in all of this.

          Where the Web is today is anything but.

          The old days, while not as golden as they might seem to me, are also not as brown as you paint them.

    • lichtmetzger@discuss.tchncs.de
      13 days ago

      Luckily, the PHP era has ended

      I guess I earn my living with nothing, then. What an absurd take. PHP powers WordPress, Shopware, TYPO3 and many other CMSes, and it’s still going strong - especially in Europe.

      (Apart from that, a lot of people shitting on PHP base it on outdated knowledge or have never used it at all. With modern OOP practices, you can write really clean code.)

    • TrickDacy@lemmy.world
      14 days ago

      even in the PHP era disabling jQuery could cause problems.

      WTF. Do you think jQuery is what JavaScript used to be called or something? Pretty much everything you wrote is insane, and I say that specifically because I’ve been building webpages for 25 years. You’ve clearly never heard of progressive enhancement.

  • Victor@lemmy.world
    14 days ago

    People in this thread who aren’t web devs: “web devs are just lazy”

    Web devs: Alright buddy boy, you try making a web site these days, with the required complexity, using only HTML and CSS. 😆 All you’d get is static content and maybe some forms. Any kind of interactivity goes out the door.

    Non web devs: “nah bruh this site is considered broken for the mere fact that it uses JavaScript at all”

    • owsei@programming.dev
      13 days ago

      That site is literally just static content. Yes JS is needed for interactivity, but there’s none here

      • Victor@lemmy.world
        13 days ago

        If you have static content, then sure, serve up some SSR HTML. But pages with even static content usually have some form of interactivity, like searching (suggestions/auto-complete), etc. 🤷‍♂️

        • Limonene@lemmy.world
          13 days ago

          Search is easier to implement without Javascript than with.

          <form method="GET" action="/search">
            <input name="q">
            <input type="submit">
          </form>
          
          • Victor@lemmy.world
            13 days ago

            Does that little snippet include suggestions, like I mentioned? Of course it’s easier with less functionality.

            • humorlessrepost@lemmy.world
              13 days ago

              Back in my day, we’d take that fully-functional form and do progressive enhancement to add that functionality on top with js. You know, back when we (or the people paying us) gave a fuck.
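              A sketch of that pattern (the `/suggest` endpoint and the ids are hypothetical): the plain form works with JS disabled, and a small script layers suggestions on top via `<datalist>`.

```html
<!-- Works as-is with JS disabled: a normal GET round-trip -->
<form method="GET" action="/search">
  <input name="q" list="hints" autocomplete="off">
  <datalist id="hints"></datalist>
  <input type="submit" value="Search">
</form>

<script>
  // Enhancement only: if this never runs, the form above still works.
  const input = document.querySelector('input[name=q]');
  const hints = document.getElementById('hints');
  input.addEventListener('input', async () => {
    const res = await fetch('/suggest?q=' + encodeURIComponent(input.value));
    const words = await res.json();  // e.g. ["pizza", "pasta"]
    hints.replaceChildren(...words.map(w => {
      const opt = document.createElement('option');
      opt.value = w;
      return opt;
    }));
  });
</script>
```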

      • cerothem@lemmy.ca
        14 days ago

        That would make the website feel ultra slow, since a full page load would be needed every time. Something as simple as a slide-out menu needs JavaScript and couldn’t really be done server-side.

        Even if you said to just send the parts of the page that changed, that dynamic content loading would still be JavaScript. Maybe an iframe could get you somewhere, but that’s a hacky workaround, and you couldn’t interact between different frames.

        • Limonene@lemmy.world
          13 days ago

          a slide out menu needs JavaScript

          A slide out menu can be done in pure CSS and HTML. Imho, it would look bad regardless.
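          For reference, the pure-CSS version is usually done with the “checkbox hack” (a minimal sketch; the class names are made up): the hidden checkbox holds the open/closed state, and CSS slides the panel in.

```html
<style>
  #nav-toggle { display: none; }
  .drawer {
    position: fixed; left: -250px; width: 250px; top: 0; bottom: 0;
    transition: left 0.2s ease;
  }
  /* When the checkbox is ticked, slide the drawer in */
  #nav-toggle:checked ~ .drawer { left: 0; }
</style>

<input type="checkbox" id="nav-toggle">
<label for="nav-toggle">☰ Menu</label>
<nav class="drawer">
  <a href="/">Home</a>
  <a href="/about">About</a>
</nav>
```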

          Even if you said to just send the parts of the page that changed, that dynamic content loading would still be JavaScript

          OP is trying to access a restaurant website that has no interactivity. It has a bunch of static information, a few download links for menu PDFs, a link to a different domain to place an order online, and an iframe (to a different domain) for making a table reservation.

          The web dev using javascript on that page is lazy, yet also creating way more work for themself.

        • unwarlikeExtortion@lemmy.ml
          14 days ago

          JS is just a janky hotfix.

          As it was, HTML was all sites had. When that was called “ugly”, CSS was invented for style and presentation. When the need arose for advanced interactivity (not doable on the Internet speeds of 20-30 years ago), someone just said “fuck it, do whatever you want” and added scripting to browsers.

          The real solution came in the form of HTML5. You no longer needed - and I can’t stress this enough - Flash to play a video in-browser. The same went for other things as well.

          Well, HTML5 is over 15 years old by now. Maybe the time has come to bring new functionality into either HTML, CSS, or a new, third component of websites (maybe even JS itself?).

          Stuff like menus. There’s no need for them to be limited by the half-assed workaround known as CSS pseudo-classes, or for every website to have its own JS implementation.

          Stuff like basic math. HTML has had forms since forever. Letting it do some more - like counting down, accessing an equivalent of the Date and Math classes, and tallying up a shopping cart on a webshop - seems like a better fix than a bunch of frameworks.

          Just make a standardized “framework” built directly into the browser - it’d speed up development, lower complexity, reduce bloat and increase performance. And that’s just the stuff off the top of my head.

        • Sir_Kevin@lemmy.dbzer0.com
          14 days ago

          Something as simple as a slide out menu needs JavaScript and couldn’t really be done server side.

          I’m not trying to tell anyone how to design their webpages. I’m also a bit old fashioned. But I stopped making animated gimmicks many years ago. When someone is viewing such things on a small screen, in landscape mode, it’s going to be a shit user experience at best. That’s just my 2 cents from personal experience.

          I’m sure there are examples of where JS is necessary. It certainly has its place; I just feel like it’s overused. Now, if you’re at the mercy of someone else who demands x, y and z, then I guess you gotta do what you gotta do.

        • expr@programming.dev
          14 days ago

          https://htmx.org/ solves the problem of full page loads. Yes, it’s a JavaScript library, but it’s a tiny JS library (14k over the wire) that is easily cached. And in most cases, it’s the only JavaScript you need. The vast majority of content can be rendered server side.

          • cerothem@lemmy.ca
            14 days ago

            While fair, now you have to have JavaScript enabled on the page, which I think was the point. It was never about having only a little bit; it was that you had to have it enabled at all.

      • Victor@lemmy.world
        14 days ago

        If you want to zoom into a graph plot, you want each wheel scroll tick to be sent to the server to generate a new image and a full page reload?

        How would you even detect the mouse wheel scroll?

        All interactivity goes out the door.

    • Frostbeard@lemmy.world
      14 days ago

      Stop, I can only get so erect. Give me that, please, rather than the bullshit I have to wade through today to find information. When is the store open? Email address/phone. Like fuck do I want to “engage”.

      • Victor@lemmy.world
        14 days ago

        😆 F—ck, I hear you loud and clear on that one. But that’s a different problem altogether, organizing information.

        People suck at that. I don’t think they ever even use their own site or have it tested on anyone before shipping. Sometimes it’s absolutely impossible to find information about something, like even what a product even is or does. So stupid.

    • MotoAsh@lemmy.world
      14 days ago

      Ehhhhh it kinda’ depends. Most things that are merely changing how something already present on the page is displayed? Probably don’t need JS. Doing something cool based on the submit or response of a form? Probably don’t need JS. Changing something dynamically based off of what the user is doing? Might not need JS!

      Need to do some computation off of the response of said form and change a bunch of the page? You probably need JS. Need to support older browsers simply doing all of the previously described things? Probably need JS.

      It really, really depends on what needs to happen and why. Most websites are still in the legacy support realm, at least conceptually, so JS sadly is required for many, many websites. Not that they use it in the most ideal way, but few situations are ideal in the first place.

      A lot of this is just non-tech savvy people failing to understand the limitations and history of the internet.

      (this isn’t to defend the BS modern corporations pull, but just to explain the “how” of the often times shitty requirements the web devs are dealing with)

      • Victor@lemmy.world
        13 days ago

        Of course it depends, like all things. But in my mind, there’s a few select, very specific types of pages that wouldn’t require at least a bit of JavaScript these days. Very static, non-changing, non-interactive. Even email could work/has worked with HTML only. But the experience is severely limited and reduced, of course.

    • BackgrndNoize@lemmy.world
      13 days ago

      A lot of this interactivity is complete bullshit, especially on sites that are mostly just for static data like news articles, the JS is there for advertisement and analytics and social media and other bullshit

      • humorlessrepost@lemmy.world
        13 days ago

        News site dev here. I’ll never build a site for this company that relies on JS for anything other than video playback (yay HLS patents; and they won’t let me offer MP4 as an alternative, because preroll pays our bills, despite everyone feeling entitled to free news with no ads).

    • puppinstuff@lemmy.ca
      13 days ago

      I can do it, but it’s hard convincing clients to double their budget for customers with accessibility needs that they’re not equipped to support in other channels.

      That being said, my personal sites and projects all do it. And I’m thankful for the accessibility laws where I’m from that make it mandatory for companies over a certain size to include accessibility supports that work when JS is disabled.

      • Victor@lemmy.world
        13 days ago

        What country or area would that be?

        And what do you mean by “do it”? What is it exactly that you do or make without JavaScript?

        • puppinstuff@lemmy.ca
          13 days ago

          Some provinces in Canada require that businesses’ websites meet or exceed the WCAG 2.0 accessibility guidelines once they pass a certain employee headcount, which includes screen-reader support ensuring all content is available to a browser that doesn’t have JavaScript enabled.

          • neclimdul@lemmy.world
            12 days ago

            Also the EU, and technically a lot of US sites that provide services to or for the government, have similar requirements. The latter are largely unenforced, though, unless you’re interacting with states that also have accessibility laws.

            And honestly, a ton of sites that should be covered by these requirements just don’t care, or get rubber-stamped as compliant, because unless someone actually complains, they have no reason to care.

            I kind of thought the EU requirements, which carry some actual penalties, would change this indifference, but other than some busy accessibility groups helping people who already care, I haven’t heard of enforcement that suggests it’s actually changed.

          • Victor@lemmy.world
            13 days ago

            That’s excellent.

            And what do you make that doesn’t include JavaScript? Like what kind of software/website/content? If you don’t mind sharing, of course.

            • puppinstuff@lemmy.ca
              12 days ago

              Mostly marketing and informational websites for the public. Businesses, tourism spots, local charities and nonprofits, etc. Nothing that’s going to change the world but hopefully makes somebody’s day a little easier when they need to look something up.

            • neclimdul@lemmy.world
              12 days ago

              It doesn’t have to exclude JavaScript; that would be quite difficult and unreasonable. Accessible sites are not about limiting functionality but about providing the same functionality.

              I haven’t gone fully down the rabbit hole on this, but my understanding is that even something like Nuxt, if you follow best practices, will deliver HTML that can be interacted with and will serve individual pages.

              That said, screen readers and other assistive tools shouldn’t require running without any JavaScript. Having used them to test sites, that might be the pragmatic approach, but they actually have a lot of tools for announcing dynamic website changes, built into ARIA properties at the HTML level, so it’s very flexible. There are, of course, also JavaScript APIs for announcing changes.

              They just require additional effort and forethought to implement, and can be buggy if you do really weird things.
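
              As a concrete sketch of the ARIA side: a live region is just an ordinary element that assistive tech watches for changes. The element id and message below are made up for illustration:

              ```html
              <!-- role="status" implies aria-live="polite": screen readers announce
                   changes to this element's text once the user is idle. -->
              <div id="search-status" role="status" aria-live="polite"></div>

              <script>
                // After a dynamic update, setting the text is all that's needed;
                // the announcement comes from the live region, not a separate API.
                document.getElementById('search-status').textContent =
                  '12 results loaded';
              </script>
              ```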

              • Victor@lemmy.world
                12 days ago

                I think we’re on the same page here. Your reply seems to me to argue against the people who are completely against JavaScript and who treat its very presence like a complete site-breaking bug. I am not of their opinion either. But I do sympathize with the sentiment that it is being used for evil.

                • neclimdul@lemmy.world
                  12 days ago

                  Yeah, I don’t think that’s what the screenshot shows though since there’s no content at all 😅

        • _stranger_@lemmy.world
          13 days ago

          I unironically use Lynx from my home lab when I’m SSH’d in, since it’s headless. Sometimes at work I miss the simplicity. I used to use Pine for Gmail as well. 😁

    • mrgoosmoos@lemmy.ca
      13 days ago

      it sounds like you’re saying there’s an easy solution for getting websites that don’t have shit moving nonstop, with graphics and non-content frames taking up 60% of the available screen

      it’s crazy that on a 1440p monitor, I still can’t just see all the content I want on one screen. nope, gotta show like 20% of it and scroll for the rest. and even if you zoom out, it automatically resizes to keep the proportions; it won’t show any of the other 80%

      I’m not a web dev, but I am a user, and I know the experience sucks.

      if I’m looking at the results of a product search and I see five results at a time because of a shitty layout, I just don’t buy from that company

      • Victor@lemmy.world
        13 days ago

        I had a bit of trouble following that first paragraph; I don’t quite understand what you think I’m saying.

        Either way, I don’t disagree with anything you wrote. I feel the same: bad design does not elicit trust.

      • Victor@lemmy.world
        12 days ago

        Not sure that was the issue. I mean more that if you use only HTML and CSS, all you can create are static sites that change the contents of the page only through full reloads. 🙂
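
        For what it’s worth, even that model handles basic interactivity: a plain form round-trips through the server and the browser renders the fresh page. A minimal sketch (the /search endpoint is hypothetical):

        ```html
        <!-- Zero-JS search: the browser serializes the field into the query
             string and performs a full-page GET to /search. -->
        <form action="/search" method="get">
          <input type="search" name="q" required>
          <button type="submit">Search</button>
        </form>
        ```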

        • NigelFrobisher@aussie.zone
          12 days ago

          There’s this ancient thing called the LAMP stack. Most of the web runs on it, and what it does will blow your mind.