• caschb@lemmy.world · 2 months ago

    The problem is not so much badly written programs and websites in terms of algorithms, but rather latency. The latency of loading things from storage, and sometimes over the internet, is the real bottleneck and the reason things feel so slow.

    • furry toaster@lemmy.blahaj.zone · 2 months ago

      Even with modern SSDs, things sometimes feel slower than they did on HDDs running period-accurate software. Windows 7 was snappy even on an HDD; Windows 11 is slow and sluggish everywhere.

  • Valmond@lemmy.dbzer0.com · 2 months ago

    Had to install (an old one mind you, 2019) Visual Studio on Windows…

    First, it’s like 30GB, what the hell?? It’s an advanced text editor with a compiler and some …

    Crashed a little less than I remember 🥴😁

      • shynoise@lemmy.world · 2 months ago

        OP was clearly using a rhetorical reduction to make a point that VS is bloated.

      • Valmond@lemmy.dbzer0.com · 2 months ago

        Visual Code is another project; Visual Studio is indeed an IDE, and it integrates it all. VS Code is also an integrated development environment. I don’t really know what more to say.

        • The Stoned Hacker@lemmy.world · 2 months ago

          VS Code is considered a highly extensible text editor that can be used as an IDE, especially for web-based tools, but it isn’t an IDE. It’s more comparable to Neovim or Emacs than to IntelliJ in terms of the role it’s supposed to fill. Technically. VS Code definitely is used more as an IDE by most people, and those people are weak imo. I’m not one to shill for companies (I promise this isn’t astroturf), but if you need to write code, JetBrains probably has the best IDE for that language. Not always true, but more often than not it is imo.

          • Valmond@lemmy.dbzer0.com · 2 months ago

            Ooh, a flame war 🔥🔥🔥 ! It has been so long since I was involved in one, thank you 🙋🏻‍♀️! 😊

            Who uses Visual Code for something other than writing and launching code? I only use it for C#/Godot on Linux, but it has all the bells and whistles to make it an IDE IMO (BTW anyone who doesn’t code in C/C++ is weak ofc ☺️! 🔥).

            Let me just add that JetBrains (at least PyCharm) have started their enshittification death cycle, and I’m looking for a lightweight Python IDE that doesn’t hallucinate (but lets you use venvs and debug). If you have any ideas I’m listening!

            Cheers

            • The Stoned Hacker@lemmy.world · 2 months ago

              I wanna clarify that when I say VS Code, I’m talking about Visual Studio Code. I was only commenting on the difference between Visual Studio and Visual Studio Code because you said you downloaded Visual Studio and were confused why a text editor was 30GB, and it’s possible you downloaded the IDE rather than the text editor. I apologize if you thought I was talking about Visual Code; I wasn’t.

              And I agree that JetBrains has started to enshittify, but I also think their enshittification has been pretty slow, because they sell professional tools that still have to perform the basic functionality of an IDE. And for the most part I’ve been able to disable all AI features save the ones I’m required to use at work (yay AI usage metrics ;-;)

  • Tartas1995@discuss.tchncs.de · 2 months ago

    I really dislike the framing of this.

    Yes, the average software runs much less efficiently. But is efficiency what the user wants? No. It is not.

    How many people will tell you that they stick to Windows instead of switching to Linux because Linux is all terminal? And the terminal is quicker, more efficient for most things. But the user wants a GUI.

    And if we compare modern GUIs to old GUIs… I don’t think modern is 15x worse.

    • Buddahriffic@lemmy.world · 2 months ago

      There isn’t anything fundamentally slower about using a GUI vs. just text in a console. There’s more to draw, but it scales linearly. The drawing-things-on-screen part isn’t the slow bit for slow programs. Well, it can be if it’s coded inefficiently, but there are plenty of programs with GUIs that are snappy… like games, which generally draw even more complex things than your average GUI app.

      Slow apps are more likely slow because of an inefficient framework (like running in a web browser with heavy reliance on scripts rather than native code), inefficient algorithms that scale poorly, poor resource use, bad organization that results in doing the same operation more times than necessary, etc.
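
      To make that last failure mode concrete, here’s a toy sketch (Rust purely for illustration; the function names and numbers are made up). Both functions count distinct values, but one rescans the whole list for every element:

      ```rust
      use std::collections::HashSet;

      // Quadratic: for every element, rescan everything before it.
      fn count_unique_slow(items: &[u32]) -> usize {
          items.iter()
              .enumerate()
              .filter(|&(i, x)| !items[..i].contains(x))
              .count()
      }

      // Linear: remember what we've already seen.
      fn count_unique_fast(items: &[u32]) -> usize {
          let mut seen = HashSet::new();
          items.iter().filter(|&&x| seen.insert(x)).count()
      }

      fn main() {
          let data: Vec<u32> = (0..20_000).map(|i| i % 1_000).collect();
          // Same answer, wildly different work:
          // ~2x10^8 comparisons vs ~2x10^4 hash operations.
          assert_eq!(count_unique_slow(&data), count_unique_fast(&data));
      }
      ```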

        • Buddahriffic@lemmy.world · 2 months ago

          Can you elaborate on that? I disagree but would like to understand why you think that. Maybe you’re referring to something I wouldn’t disagree with.

          • Tartas1995@discuss.tchncs.de · 2 months ago

            E.g. from the terminal, I open a known file far more quickly than through a GUI. Even if I want to use a GUI for the file, issuing the opening command is quicker in the terminal.

            GUIs often require the user to scan the interface to find the relevant information, because the developer didn’t know what you are actually searching for.

            With a terminal, the user can be much more precise about what they’re seeking; consequently, less information is presented and less information needs to be scanned by the user.

            The average user doesn’t want to remember and type a specific phrase to do something, though. Even if it is “faster” and more “efficient”, the user wants to be guided towards the information. The user wants a good user experience, not a fast/efficient one.

            Pretty and guided, that is what the average user wants. Modern software is pretty and guided, not efficient and fast. Yes, developers became lazy about optimisation and like to use big frameworks to save dev time. But the user also wanted it that way, by wanting pretty GUIs, because that is easier with the big frameworks.

            • Buddahriffic@lemmy.world · 2 months ago

              Ah, that’s efficiency of use, and it depends more on how familiar you are with the software, as well as the design and the task. Like editing an image or video is going to be a lot easier with a GUI than a command line interface (other than generating slop, I guess).

              When people talk about how efficient software is, it usually refers more to the amount of resources (including time) it uses to run its processes.

              E.g. an Electron app is running a browser that is manipulating and rendering HTML elements and running JavaScript (or other scripted/semi-compiled code). There is an interpreter that needs to process whatever code does the manipulation, and then an HTML renderer to turn that into an image to display on the screen. The interpreter and renderer run as machine code on the CPU, interacting with the window manager and the kernel.

              A native app doesn’t bother with the interpreter and HTML renderer; it runs as machine code on the CPU itself and interacts with the window manager and kernel. This saves a bunch of memory, since there isn’t an intermediate HTML state that needs to be stored, and time, by cutting out the interpreter and HTML-render steps.

              • Tartas1995@discuss.tchncs.de · 2 months ago

                I know. That is why I started my statement by saying that I don’t like the framing. It treats “efficiency” as the point of software, as the thing we should care about when judging software.

                But it isn’t. It is user experience. And yes, efficiency is part of that. Both efficiency of execution and efficiency of use.

                And the user experience has improved a lot (ignoring intentional anti-patterns that exploit the user, which are fairly common, but I think we can agree to set those aside for the sake of the conversation).

        • mnemonicmonkeys@sh.itjust.works · 2 months ago

          Technically true, but there’s a threshold on responsiveness. If both user interfaces respond in milliseconds, it doesn’t matter that one is more efficient.

          • Tartas1995@discuss.tchncs.de · 2 months ago

            It does, because it highlights that instead of being excited to “have to use the terminal” because it is more “efficient”, they prefer the “slower”, prettier GUI. The user wants the stupid animations and the flashy nonsense. The user doesn’t want quick software. They want pretty software.

              • Tartas1995@discuss.tchncs.de · 2 months ago

                How am I gatekeeping?

                I am not telling anyone to use Linux in any way, or not to use it in any way. I am just pointing out that the average user wants a pretty/convenient GUI and not the most efficient tool. That isn’t bad. I don’t want to eat some weird mixture of nutrients because it is optimal; I want to eat food that I enjoy eating.

                I am calling out the weird focus on efficiency of software when the average user wants a good user experience. The user’s desire is neither good nor bad; it just highlights that criticising the efficiency of software is a strange thing to do if the customer desires something else. It is like complaining that Grindr is lacking heterosexual people.

    • UnderpantsWeevil@lemmy.world · 2 months ago

      But the user wants a GUI.

      Firstly, plenty of Linux installs have GUIs. I installed Mint precisely because I wanted to keep the Windows/Mac desktop experience I was familiar with. GUIs add latency, sure. But we’ve had smooth GUI experiences since Apple’s 1980s OSes. This isn’t the primary load on the system.

      Secondly, as the Windows OS tries to do more and more online interfacing, the bottleneck that used to be CPU or free memory or even graphics is increasingly internet latency. Even just opening the Start menu means making calls out online. Querying your local file system has built-in calls to OneDrive. Your system usage is being constantly polled, tracked, and monitored as part of the Microsoft initiative to feed their AI platforms. And because all of these off-platform calls create external vulnerabilities, the (abhorrently designed) antivirus and firewall systems are constantly getting invoked to protect you from online traffic you didn’t ask for.

      It’s a black hole of bloatware.

      • ZombiFrancis@sh.itjust.works · 2 months ago

        TVs became smart TVs and now need the internet to turn on. TVs need an OS now to do internet to do TV.

        Antenna broadcast TV seems like ancient magic.

        • UnderpantsWeevil@lemmy.world · 2 months ago

          We’ve deprecated a lot of the old TV/radio signal bandwidth in order to convert it to cellphone signal service.

          But, on the flip side, digital broadcasts can carry a lot more information than the old analog signals. So now I’ve got a TV with a mini-antenna that gets 500 channels (virtually none of which I watch). My toddler son has figured out how to flip the channel to the continuous broadcast of Baby Einstein videos, and he periodically hijacks the TV for that purpose when we leave the remote where he can reach it.

          So there’s at least one person I can name who likes the current state of affairs.

          • ZombiFrancis@sh.itjust.works · 2 months ago

            I always have to remind myself that being able to stream audio from a cellphone while driving across a city is also a pretty crazy development.

      • Tartas1995@discuss.tchncs.de · 2 months ago

        I am not saying Linux is terminal-only. I am saying that people will tell you that Linux is all terminal and that they want a GUI.

        Linux GUIs are much prettier than Windows’ anyway.

  • Whitebrow@lemmy.world · 2 months ago

    I still remember playing StarCraft 2 shortly after release on a $300 laptop, and it ran perfectly well on medium settings.

    Looked amazing. Felt incredibly responsive. Polished. Optimized.

    Nowadays it’s RTX this, framegen that, you need an SSD or loading times are abysmal, oh, and don’t forget that you need 40GB of storage and 32GB of RAM for a 3-hour-long walking simulator. How about you optimize your goddamn game instead? Don’t even get me started on the price tags for these things.

    Software and game development is definitely a spectrum, but holy shit, the ratio of sloppy releases is so disproportionate that it’s hard to see it at times.

    • addie@feddit.uk · 2 months ago

      StarCraft 2 was released in 2010, and a quick search indicates the most common screen resolution was 1024x768 that year. That feels about right, anyway. A bit under a million pixels to render.

      A modern 4K monitor has a bit over eight million pixels, slightly more than ten times as many. So you’d expect the textures and models to be about ten times the size. But modern games don’t just have ‘colour textures’; they’re likely to have specular, normal and parallax ones too, so that’s another three times. The voice acting isn’t likely to be in a single language any more either, so there’ll be several copies of all the sound files.
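
      Spelling out that pixel arithmetic (taking 4K as 3840×2160):

      $$\frac{3840 \times 2160}{1024 \times 768} = \frac{8{,}294{,}400}{786{,}432} \approx 10.5$$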

      A clean Starcraft 2 install is a bit over 20 GB. ‘Biggest’ game I have is Baldur’s Gate 3, which is about 140 GB, so really just about seven times as big. That’s quite good, considering how much game that is!

      I do agree with you. I can’t think of a single useful feature that’s been added to e.g. MS Office since Office 97, and that version is so tiny and fast compared to the modern abomination. (In fact, in a lot of ways the modern one is worse - it has had some functionality removed and not replaced.) And modern AAA games do focus too much on shiny and not enough on gameplay, but the fact that they take a lot more resources has more to do with our computers being expected to do a lot more.

      • glimse@lemmy.world · 2 months ago

        Excel is sooo much better than it used to be in Office 97. And it’s way better than any other spreadsheet software I’ve tried.

        Speaking of, anyone know of any alternative that handles named tables the same as Excel? Built-in filtering/sorting and formulas that can address the table itself instead of a cell range?? Please?

      • Teepo@sh.itjust.works · 2 months ago

        Why are you comparing the most common screen resolution in 2007 to a 4k monitor today? 4k isn’t the most common today. This isn’t a fair comparison.

          • Denys Nykula@piefed.social · 2 months ago

            BTW the demand for bigger screens and higher resolutions is something I don’t easily understand. I notice some difference between 1366x768 and 1920x1080 on a desktop, but the difference from further increases is of so little use to me that I’d classify it as a form of bloat. If anything, I now habitually download 480p and 720p instead of higher definitions by default, because it saves me traffic and battery power, and much more fits on a single disk that’s easy to back up.

            • glimse@lemmy.world · 2 months ago

              Pixel density is more important than resolution. Outside of design work, higher resolution is only useful if the screen size matches.

              IMO the ideal resolutions for computer monitors are 24" @ 1080p, 27" @ 2k, and 32"+ @ 4k+. For TVs it’s heavily dependent on viewer distance. I can’t tell the difference between 2k and 4k on my 55" TV from the couch.

      • MonkderVierte@lemmy.zip · 2 months ago

        ‘Biggest’ game I have is Baldur’s Gate 3, which is about 140 GB, so really just about seven times as big. That’s quite good, considering how much game that is!

        Not at all. For example, RimWorld saves all the map and world data in one big XML file (which is bad btw, don’t do that): about 2 million lines @ 75 MB for a 30-pawn mid-game colony.

        So you see, data is not what uses space. What uses space is not properly re-using objects/textures (so-called “assets”), or even copying and repacking the same assets per level/map, because that saves dev time.

        Ark: Survival Evolved, with “only” about a 100 GB requirement, was known as an unoptimized mess back then.

        The Witcher 3 mod “HD Reworked Next-Gen” is barely 20 GB with 4k textures and high-res meshes. And you can’t say that The Witcher 3 is not a vibrant and big open-world game.
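
        As a rough illustration of the “one big text dump” point, here’s a sketch in Rust (assuming the serde, serde_json and bincode crates; JSON stands in for XML, since both repeat every field name for every record, and the Pawn struct is made up):

        ```rust
        // Cargo.toml: serde = { version = "1", features = ["derive"] },
        //             serde_json = "1", bincode = "1"
        use serde::Serialize;

        #[derive(Serialize)]
        struct Pawn {
            name: String,
            x: i32,
            y: i32,
            health: f32,
        }

        fn main() {
            let colony: Vec<Pawn> = (0..30)
                .map(|i| Pawn { name: format!("pawn{i}"), x: i, y: i, health: 1.0 })
                .collect();

            // A text format spells out "name", "x", "y", "health"
            // for every single record...
            let text = serde_json::to_string(&colony).unwrap();
            // ...while a binary format stores no field names at all,
            // just the values.
            let binary = bincode::serialize(&colony).unwrap();

            println!("text: {} bytes, binary: {} bytes", text.len(), binary.len());
        }
        ```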

    • chunes@lemmy.world · 2 months ago

      Absolutely. Every time I play a game from before 2016 or so, it runs butter smooth and in many cases looks even better than modern games. I don’t know what we’re doing nowadays.

    • Einskjaldi@lemmy.world · 2 months ago

      Then the Factorio dev blog comes in and spends months optimizing the tick of one broken gear on a conveyor belt to slightly improve efficiency.

  • Michal@programming.dev · 2 months ago

    PCs aren’t faster; they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also, the form factor keeps getting smaller, more people use laptops now, and you can’t cheat thermal efficiency.

    • leftzero@lemmy.dbzer0.com · 2 months ago

      My first PC ran at 16MHz on turbo.

      PCs today are orders of magnitude faster. Way less fun, but faster.

      What’s even more orders of magnitude slower and infinitely more bloated is software. Which is the point of the post.

      It’s almost impossible to find any piece of actually optimised software these days (with some exceptions like SQLite), to the point that 99% of the software currently in use can be considered unintentional (or intentional) malware.

      Particularly egregious are web browsers, which seem designed to waste the maximum possible amount of resources and run as inefficiently as possible.

      And the fact that most supposedly desktop software these days runs on top of one of those pieces of intentional malware (it’s impossible to achieve such levels of inefficiency and bloat unintentionally; it requires active effort) obviously doesn’t help.

      • Auli@lemmy.ca · 2 months ago

        Browsers are not the same as they were. They are basically entire operating systems in themselves now.

        • Blue_Morpho@lemmy.world · 2 months ago

          Only on some name-brand PCs, which used it for compatibility. For home-built or local-store machines, the turbo button would overclock. I remember telling a friend that although their 16MHz could run at 20, not to do it because it would compromise longevity! Ha! Mind you, the CPUs in those days didn’t have heat sinks, but still - oh no, your 386 might not work 20 years from now from running too hot!

    • CovfefeKills@lemmy.world · 2 months ago

      It’s all about memory latency and bandwidth now, which have improved greatly; PCs are still getting faster. There’s a new RAM standard being pushed right now, CAMM2, which is really exciting: it pushes back the need for soldered memory.

      • Kairos@lemmy.today · 2 months ago

        The faster single-core, out-of-order execution performance of newer x86 CPUs lets them work through that higher bandwidth of data too.

    • ragas@lemmy.ml · 2 months ago

      I came from C and C++ and had learned that parallelism is hard. Then I tried parallelism in Rust in a project of mine, and it was so insanely easy.
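
      For anyone curious, a minimal sketch of what that ease looks like (assuming the rayon crate; the workload is made up):

      ```rust
      // Cargo.toml: rayon = "1"
      use rayon::prelude::*;

      fn main() {
          let inputs: Vec<u64> = (0..1_000_000).collect();

          // Sequential sum of squares.
          let seq: u64 = inputs.iter().map(|x| x * x).sum();

          // Parallel version: swap `iter` for `par_iter` and rayon
          // spreads the work across all cores, with the compiler
          // ruling out data races at build time.
          let par: u64 = inputs.par_iter().map(|x| x * x).sum();

          assert_eq!(seq, par);
      }
      ```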

    • EddoWagt@feddit.nl · 2 months ago

      What do you mean PCs aren’t faster? Yes, they have more cores, but they also clock higher (mostly) and execute more instructions per clock. Computers now perform way better than ever before in every single metric; most tasks, even linear ones, could be way faster.

  • ZILtoid1991@lemmy.world · 2 months ago

    They often are worse, because everything needed to be an Electron app so they could hire cheaper web developers for it, and also boast about “instant cross-platform support” even when they don’t release Linux versions.

    Qt and GTK could do the cross-platform support, but not the data collection for big-data purposes.

    • boonhet@sopuli.xyz · 2 months ago

      There’s no difference whatsoever between Qt or GTK and Electron for data collection. You can add networking to your application with any of those frameworks.

    • Echo Dot@feddit.uk · 2 months ago

      I don’t know why Electron has to use up so much memory, though. It seems to use however much RAM is available when it boots; the more RAM the system has, the more Electron seems to think it needs.

      • GamingChairModel@lemmy.world · 2 months ago

        Chromium is basically Tyrone Biggums asking if y’all got any more of that RAM, so bundling that into Electron is gonna lead to the same behavior.

      • Buddahriffic@lemmy.world · 2 months ago

        Inb4 “uNusEd RAm iS wAStEd RaM!”

        No, unused RAM keeps my PC running fast. I remember the days when accidentally hitting the Windows key while in a game meant waiting a minute for it to swap the desktop pages in, only to have to swap the game pages back when you immediately clicked back into it, expecting it to either crash your computer or probably disconnect you from whatever server you were connected to. Fuck that shit.

        • boonhet@sopuli.xyz · 2 months ago

          I mean unused RAM is still wasted: You’d want all the things cached in RAM already so they’re ready to go.

          • Echo Dot@feddit.uk · 2 months ago

            I mean, I have access to a computer with a terabyte of RAM. I’m gonna go ahead and say that most applications aren’t going to need that much, and if they use that much, I’m gonna be cross.

            • boonhet@sopuli.xyz · 2 months ago

              Wellll

              If you have a terabyte of RAM sitting around doing literally nothing, it’s kinda being wasted. If you’re actually using it for whatever application can make good use of it, which I’m assuming is some heavy-duty scientific computation or running full size AI models or something, then it’s no longer being wasted.

              And yes if your calculator uses the entire terabyte, that’s also memory being wasted obviously.

              • Echo Dot@feddit.uk · 2 months ago

                That’s a different definition of wasted, though. The RAM isn’t lost just because it isn’t currently being utilised. It’s sitting there waiting for me to open an intensive task.

                What I am objecting to is programs using more RAM than they need simply because it’s currently available. Aka Chromium.

          • Buddahriffic@lemmy.world · 2 months ago

            I don’t want my PC wasting resources trying to guess every possible next action I might take. Even I don’t know for sure what games I’ll play tonight.

            • boonhet@sopuli.xyz · 2 months ago

              Well, you’d want your OS to cache the Start menu in the scenario you highlighted above. The game could also run better if it cached assets not currently in use instead of waiting until the last moment to load them. Etc.
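
              The load-once-then-reuse idea, as a sketch (Rust purely for illustration; the AssetCache type is hypothetical):

              ```rust
              use std::collections::HashMap;

              // Trades RAM for speed: each asset is read from disk at most once.
              struct AssetCache {
                  loaded: HashMap<String, Vec<u8>>,
              }

              impl AssetCache {
                  fn new() -> Self {
                      Self { loaded: HashMap::new() }
                  }

                  // Returns the asset, hitting the disk only on first use.
                  fn get(&mut self, path: &str) -> &[u8] {
                      self.loaded
                          .entry(path.to_string())
                          .or_insert_with(|| std::fs::read(path).expect("asset missing"))
                  }
              }
              ```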

              • Buddahriffic@lemmy.world · 2 months ago

                Yeah, for things that will likely be used, caching is good. I just have a problem with “memory is free, so find more stuff to cache to fill it” or “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”.

                • boonhet@sopuli.xyz · 2 months ago

                  “memory is free, so find more stuff to cache to fill it”

                  As long as it’s being used responsibly and freed when necessary, I don’t have a problem with this.

                  “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”

                  On anything running on the end user’s hardware, this I DO have a problem with.

                  I have no problem with a simple backend REST API being built on Spring Boot and requiring a damn gigabyte just to provide a /status endpoint or whatever, because it runs on one or a few machines, usually controlled by the company developing it.

                  When a simple desktop application uses over a gigabyte because of shitty UI frameworks, I start having a problem, because that’s a gigabyte used by every single end user, and end users are more numerous than servers AND they expect their devices to do multiple things rather than run just one application.

  • Auster@thebrainbin.org · 2 months ago

    Dunno about this paradox theory, but the impression I get is that when you’re part of the process, it’s harder to notice changes. But if you put the two devices side by side, trying to run the same systems, programs, etc., the difference is glaring. And from tests I did, if the software doesn’t work on either of the devices, slapping a VM on the newer one to test older programs still tells you quite a lot.

  • merc@sh.itjust.works · 2 months ago

    You do really feel this when you’re using old hardware.

    I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most web sites are unbearably slow, and some simply don’t work: JavaScript hangs, and some elements simply never load. The device is too old to get OS updates, which means I can’t update some of the apps. But that’s a good thing, because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.

    • ssfckdt@lemmy.blahaj.zone · 2 months ago

      It’s the pages. It’s all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn’t exactly a computationally modest language.

      Of the 200kB loaded on a typical Wikipedia page, about 85kB is JS and CSS.

      Another 45kB goes to a single SVG, which in complex cases is a computationally nontrivial image format.

      • 87Six@lemmy.zip · 2 months ago

        I don’t agree. It’s both. I’ve opened basic no-JS sites on old tablets to test them out, and even those pages BARELY load.

          • Passerby6497@lemmy.world · 2 months ago

            Probably just the browser itself, considering how bloated they’re getting. It’s not super surprising: if the browser runs about as fast on a new phone (on a good day) as it did 5-10 years ago, it’s gonna run like dogshit on a phone from that era.

    • NecroParagon@midwest.social · 2 months ago

      I can’t update YouTube on my iPad 2, which I got running again for the first time in years. It said it had been ~70,000 hours since the last full charge. I wanted to use it to watch videos when I’m going to bed, but I can’t actually log in to YouTube because the app is so old, and I seemingly can’t update it.

      I was using the web browser, and yeah, I don’t remember it being so damn slow. It’s crazy how that is.

      • Yaky@slrpnk.net · 2 months ago

        Is your iPad on iOS 9.3.5? It is infamously slow.

        It is possible to downgrade it to 8.4.1 (faster, partially more broken) or even 6.1.3 (fast and old school, many apps don’t work, but there are apps in Cydia to fix stuff).

        Biggest issue I encountered is sites requiring TLSv1.3 for HTTPS, which those old browsers simply do not support.

      • merc@sh.itjust.works · 2 months ago

        I have an old YouTube app on my iPad, and it still works fine. One of the more responsive apps on the device. I get nagged nearly every time I use it to update to the newest YouTube release, but that’s impossible. I’d first have to upgrade my OS, and Apple no longer releases new OSes for this generation of iPads. So, I’m stuck with an old YouTube, which mostly works fine, and an occasional nag message.

        I’m sure within a year or two mine will be like yours and YouTube will simply no longer work. But, for now it’s in a relatively good spot where I can use a version of YouTube designed for this particular hardware that doesn’t feel sluggish.

  • Quicky@piefed.social · 2 months ago

    I feel like this is Windows specific. Linux is rapid on PCs and my MacBook is absurdly quick.

    • Auster@thebrainbin.org · 2 months ago

      Mint Xfce on my 2015 laptop, compared to its previous system, was the difference between usable and waiting 10 minutes for it to even boot; things like gaming, VMs, comically large spreadsheets (surprisingly the memory hog), etc. were an eternal challenge on it. On my current laptop, I have the luxury of picking systems by aesthetics and non-optimization features instead. And for comparison, I’ve run even the same updates on the two laptops, as the older one still works.

    • RaccoonBall@lemmy.ca · 2 months ago

      App launch time can be annoyingly slow on a Mac unless you’re offline or blocking the server it phones home to.

      It can be the difference between one bounce and seven bounces of the icon on my end.

      • Quicky@piefed.social · 2 months ago

        What apps out of interest? I’m a new Mac owner, so limited experience, but everything seems insanely quick so far. Even something like Xcode is a one-bounce on this M4 Air.

        • RaccoonBall@lemmy.ca · 2 months ago

          All of them. The device has to phone home to Apple to ask permission to run them.

          To test: close the app (really quit it; make sure the dot on the icon isn’t glowing), then open it and measure the time.

          Then close the app, disconnect from the internet, and launch it again.

          The speed difference depends on how overloaded Apple’s servers are.

          • Quicky@piefed.social · 2 months ago

            I’ve not come across this but I’ll check it out. Is that App Store apps only?

            I think probably 90% of the apps I’ve installed have been through the Homebrew package manager, which likely means they don’t do any phoning home, but I’ll check out the pre-installed stuff and see if I can replicate.

            • RaccoonBall@lemmy.ca · 2 months ago

              Not sure about Homebrew, honestly, though I was under the impression it was for every executable.

    • doleo@lemmy.one · 2 months ago

      Anyone opening the app menu (from the dock or Home Screen) on an iPad will tell you that it’s not exclusive to Windows PCs.

    • sp3ctr4l@lemmy.dbzer0.com · 2 months ago

      PC games are software.

      Unfortunately many PC games are also like this: astoundingly poorly optimized, they just assume everyone has a $750 GPU.

      Proton can only do so much.

      … and Metal basically can’t even do that much.

      Look at Metal Gear Solid 5 or TitanFall 2, and tell me realtime video game graphics have dramatically increased in visual fidelity in the last decade.

      They haven’t really.

      They shifted to a poorly optimized, more expensive paradigm for literally everyone involved; publisher, developer, player.

      Everything relating to realtime raytracing and temporal antialiasing is essentially a scam, in the vast majority of actual implementations of it.

      • Quicky@piefed.social · 2 months ago

        I guess the counter-argument for games is that load times have dramatically improved, though that’s less about software development than hardware improvements.

        If we put consoles in the same bracket as computers, the literally instant quick-resume feature on an Xbox (for example) feels like sci-fi.

        • sp3ctr4l@lemmy.dbzer0.com · 2 months ago

          Yeah, you kinda defeated your own argument there, but you do seem to recognize that.

          You can instant resume on a Steam Deck, basically.

          You can alt tab on a PC, at least with a stable game that is well made and not memory leaking.

          Yeah, better RAM / SSDs do mean lower loading times and higher streaming speeds/bus bandwidths, but literally, at what cost?

          You could just actually take the time to optimize things and find cleverer, less insanely computationally expensive ways to do them, instead of just throwing more/faster RAM at it.

          RAM and SSD costs per gig are going up now.

          Moore’s Law is not only dead, it has inverted.

          Constantly cheaper memory going forward turned out not to be the best assumption to make.

          • Quicky@piefed.social · 2 months ago

            With respect to OP’s post, they say “you can’t even tell the computers we are on are 15x faster…”, and I reckon that quick resume etc, is an example of “you absolutely can tell that we now have extremely fast hardware” when compared to what came before, irrespective of the quality of the software.

            I’m not disagreeing with you, I’m just picking apart the blanket “computers feel the same as they did a decade ago”. Some computers might feel the same, and a lot of software might be unoptimised, but there’s a good selection of examples where that’s not the case.

  • Flames5123@sh.itjust.works · 2 months ago

    For my home PC, sure. Running some Windows apps in Wine on my Linux machine is a little weird and sluggish. Discord is very oddly sluggish for known reasons. Proton is fine, though.

    But for my work? Nah. My M3 MacBook Pro is a beast compared to even the last Intel MacBook. Battery life is way better, unless you’re like me, constantly running a front-end UI for a single local service. But without that, it can last for hours. My old one could only last two meetings before it started dying.

    • prime_number_314159@lemmy.world · 2 months ago

      Apple put inadequate coolers in the later Intel MacBooks to make Apple Silicon feel faster by contrast. When I wake mine, loading the clock takes 1.5 seconds, and it flips back and forth between recognizing and not recognizing key presses in the password field for 12 seconds. Meanwhile, the ThinkPad T400 (running Arch, btw) that I had back in 2010 could boot in 8.5 seconds and didn’t have a blinking cursor that would ignore key presses.

      Apple has done pretty well, but they aren’t immune from the performance massacre happening across the industry.

      The battery life is really good, though. I get 10-14 hours without trying to save battery life, which is easily enough to not worry about whether I have a way to charge for a day.

  • ssfckdt@lemmy.blahaj.zone · 2 months ago

    The program expands so as to fill the resources available for its execution

    – C.N. Parkinson (if he were alive today)