It always feels like some form of VR tech comes out with some sort of fanfare and with a promise it will take over the world, but it never does.

  • IronpigsWizard@lemmy.world · ↑7 · 1 month ago

    Since at least 1970, every decade there seems to be a “The VR takeover is here!” fad, and it falls flat every time.

    Those VR rollercoaster shuttle rides in malls during the 1980s and early 1990s, thinking that was the future. Oh boy, we were all so silly.

  • tal@lemmy.today · ↑5 · edited · 1 month ago

    I’m long-term bullish on VR, if you mean having an HMD designed to provide an immersive 3D environment. Like, I don’t think that there are any fundamental problems with VR HMDs, and one day we will have HMDs that will probably replace monitors (unless some kind of brain-computer interface gets there first) and then expand to VR, if dedicated VR headsets don’t get there first. They’d be more portable, private, and power-efficient than conventional displays.

    But the hardware to reasonably replace monitors just isn’t there today; the angular resolution isn’t sufficient to compete with conventional monitors. And I just don’t think that at current prices and with the current games out there, dedicated VR HMDs are going to take over.

    I do agree with you that there have been several “waves” by companies trying to hit a critical mass that haven’t hit that point, but I think that there will ultimately come a day where we do adopt HMDs and that even if it isn’t the first application, VR will eventually be provided by those.

    • unmagical@lemmy.ml · ↑3 · 1 month ago

      Honestly, I kinda want VR theatres. You don’t gotta wear glasses that block half the light, and each viewer could adjust the screen size individually, while still having the audio/snack/social experience theatres offer today.

      • Limerance@piefed.social · ↑3 · 1 month ago

        That would combine the downsides of a cinema with the downsides of VR.

        The social experience suffers immensely when the audience wears bulky headsets. Can’t kiss your sweetheart for example.

    • Tollana1234567@lemmy.today · ↑3 · edited · 1 month ago

      Pretty much. We will never make it to Cylon-level or Skynet-level intelligence. The former requires a human mind in a convoluted process, which is probably more realistic than Skynet/Kaylon.

  • markovs_gun@lemmy.world · ↑12 · 1 month ago

    I’ll go against the grain and say literally all of it. Every piece of technology that exists is a compromise between what the designer wants to do and the constraints of what is practical or possible to actually pull off. Therefore, all technology “fails” on at least some metric the designer would like it to achieve. Technology is all about improvement and working with imperfection. If we don’t keep trying to make things better, then innovation stops.

    With your example of VR, I’d say that after having seen multiple versions of VR in my lifetime, the one we have now is way more successful and impactful, especially in commercial uses rather than consumer products. Engineers can now tour facilities before they are built with VR headsets to spot design flaws that they might not have caught with a traditional model review, for example.

    Furthermore, what we have now is just an iteration on what we had before. It doesn’t happen in a vacuum; people take what came before, look at what worked and what didn’t, and at what could be fixed with other technologies that have developed in the meantime. That’s the iteration process.

    • untorquer@lemmy.world · ↑3 · 1 month ago

      Iteration isn’t a claim that the predecessor was a failure though, you iterate on the successes of the prior generation. It used to be that technology advanced so rapidly that the cutting edge became obsolete in a matter of a few years, but for that time it was a success.

      I think there’s also an assumption of design philosophy here. One designer might put many generalized requirements into their design; then you get Google Glass, AI, NFTs and so on. Everything is a failure because it couldn’t achieve the requirements. Others may pick a small set of very specific requirements; then you get the iPhone or a Toyota Hilux. These are massive successes because they had cohesion in the idea and planned around compromise.

    • ClassifiedPancake@discuss.tchncs.de · ↑1 · 1 month ago

      If you mean flying cars that will replace regular cars, I don’t think anyone ever really tried it. There have been prototype cars with wings, but no one took that seriously. More recently, what everyone keeps trying is drones as taxis, but I hope that fails because I don’t want that noise pollution.

    • hactar42@lemmy.ml · ↑7 · 1 month ago

      The LaserJet 4P driver was the GOAT. It worked on every HP printer for years. It’s been all downhill since.

  • sugarfoot00@lemmy.ca · ↑22 ↓2 · 1 month ago

    Probably not top ten of mind, but Carbon Capture and Storage (CCS) has been trotted out by the fossil fuel industry for a generation as a panacea for carbon emissions, in order to prevent any real legislation limiting the combustion of hydrocarbons.

  • Formfiller@lemmy.world · ↑14 ↓1 · 1 month ago

    AI, mass surveillance, privatization of services people need to live, and national security technology.

  • HypergolicRunoff@lemmy.org · ↑21 ↓1 · 1 month ago

    Pesticides.

    We came up with this brilliant idea of planting a single crop per field which creates the perfect environment for the things we call “pests”. We invented pesticides to kill the pests, which incidentally also kill their predators and competitors, making the environment even more favorable when the pest returns. So we started using more and stronger pesticides, creating a dependency cycle, with the added bonus of poisoning the ground, the water table, the oceans and ourselves.

        • agedcorn@lemmy.world · ↑5 · 1 month ago

          Heh, ‘polyculture’ sounds like a word a friend would use when describing the new folk they’re hanging with after turning a 20-year monogamous marriage into an open relationship.

  • kurmudgeon@lemmy.world · ↑17 ↓3 · 1 month ago

    Twitter/X. It is not a free speech platform. Give it up and move on to something else. Stop supporting these billionaires and stop giving them your time.

    • early_riser@lemmy.world · ↑6 · 1 month ago

      Social media as a whole, honestly. Way back in 2014 I read an article about the “social media cycle” (not their words IIRC). Basically, a new platform gets popular with teens and college-age kids, then their parents join, then the kids have to move to something else because they don’t want to be on the same platform as their parents. I could be misremembering. It was a comparison between Facebook and Snapchat.

      Anyway, the Fediverse helps, but since fedi platforms are largely clones of their normie counterparts (Lemmy/PieFed = reddit, Mastodon = Twitter, PeerTube = YouTube) they inherit many of the same problems. I know I bring this up a lot, but on these platforms, content is the focus, but on traditional forums, people are the focus.

  • unknown1234_5@kbin.earth · ↑8 ↓1 · 1 month ago

    VR is useful, but it’s too wrapped up in corporate BS to really take off for now. It’s dominated by companies obsessed with AI and by pathetic startups that never finish a product. It just needs Meta to be less dominant.

    • lucullus@discuss.tchncs.de · ↑9 · 1 month ago

      I sometimes wonder what would happen to VR if it got into the same situation as 3D printing. That took off because some patents were expiring and it was then easy to build your own version. We had/have many open source/FOSS printers, and nearly all the companies currently in this space wouldn’t exist if it weren’t for the many open source developments and the extension of the market that they created. I know this is highly improbable for VR, but one should be allowed to dream.

      • HobbitFoot @thelemmy.club · ↑2 ↓1 · 1 month ago

        I think that you always run into the issue that you look like an idiot using it and you need to do something special to use it.

        • lucullus@discuss.tchncs.de · ↑3 · 1 month ago

          Though that is mostly a problem for big tech’s vision of VR, where we use it every day, all day (like all the shit with business meetings in VR). It was always a niche technology. But 3D printing is also a niche, and it got to be a big niche. And even with the current developments we got quite a reduction in size (thus better wearing comfort). I think an open hardware and software system would help the whole VR industry get better, though it would still be a niche.

          • HobbitFoot @thelemmy.club · ↑3 · 1 month ago

            3D printing has found its niche in being able to create custom plastic models at a cost far lower than injection molding. That’s been big for RPG and wargaming as a way to create better boards at an acceptable cost. I’ve also seen some toys sold that are obviously 3D prints, which shows the technology’s viability as a part of a commercial production line. These are use cases where 3D printing is the best option available, so the technology gets used.

            I don’t see that equivalent for consumer-scale VR/AR. The state of the industry for VR tech seems to be to sell rented experiences, where the VR tech is integrated into an experience with other equipment or defined spaces. That’s the equivalent of when computer games were rented in arcades.

      • unknown1234_5@kbin.earth · ↑6 · 1 month ago

        Well, most (tech-related) industries don’t really get much traction when it’s just private companies. Generally a private company starts something, and then open-source projects keep the underlying tech working while major companies rebrand stuff every year.

        That’s part of why I’m so excited for the Steam Frame. It’ll finally give us a VR platform that doesn’t rely on proprietary stuff, freeing people up to do stupid things with it and accidentally make something really cool. What we really needed was for the bubble it was in to burst so the companies that had it in a chokehold would let go, but it just got smaller and they held on. It’s a lot like the AI situation right now, where there are useful and sustainable use cases, but it’s too wrapped up in shareholder circlejerks for anyone to get the chance to set it up right.

        Also, I need to get my Ender V3 working again. That thing was fun.

  • early_riser@lemmy.world · ↑46 ↓9 · 1 month ago

    I’m going to get downvoted for this.

    Open source has its place, but the FOSS community needs to wake up to the fact that documentation, UX, ergonomics, and (especially) accessibility aren’t just nice-to-haves. Every year has been “The Year of the Linux Desktop™” but it never takes off, and it never will until more people who aren’t developers get involved.

    • TheV2@programming.dev · ↑2 · 1 month ago

      If you make that argument about the state of software in general, I’d agree to an extent in the sense that it should be more prioritized. But I don’t see how that applies to open source in particular?

      In those aspects proprietary software is just as bad, if not worse. The difference is simply that the default choice of software for most tasks is proprietary. It can be a shit ton of unusable, confusing mess, even with intentional dark patterns, but users will adapt.

      • early_riser@lemmy.world · ↑3 · 1 month ago

        There’s a reason why Apple is the poster child for accessibility. They control the entire stack from hardware to OS, and have an ocean of money to devote to what is effectively a tiny marginalized portion of their user base.

        Open source is the exact opposite. Any given open source project (especially any given Linux distro) is standing atop a precarious mound of other open source projects that the distro maintainers themselves have no control over. So when accessibility breaks, the maintainers say “It’s not us, it’s GNOME”. Then GNOME says “It’s not us, it’s Wayland”, and so on.

        Imagine I handed you a laptop without a working screen, then when you complained you couldn’t use it, I said “It’s not my problem” or “We’ll get to it eventually” or “I wouldn’t know how to help you”. That’s desktop Linux when you’re blind.

        Apologies if this comes across as a rant. I’m just bitter about the fact there’s all this free, privacy-respecting software out there that’s out of my reach, and I’m stuck selling my soul to Microsoft and Apple.

    • djdarren@piefed.social · ↑6 ↓2 · 1 month ago

      I’m a reasonably new Linux user, at the stage of trying to learn how to improve/optimise my system, and honestly, Google’s Gemini has become my user manual.

      If I can’t figure something out, I could trawl through a bunch of forums where the issue doesn’t really match mine, or the fix has changed since the OP had the same problem, or I could just go straight to an LLM. I understand that they have a tendency to make shit up on the fly (this is a great example), but when it comes to troubleshooting setup issues they’re really helpful. And yes, I know that’s because they’ve already ingested the support forums. But it is genuinely so much quicker to sort things out, while learning as you go.

      • sugarfoot00@lemmy.ca · ↑4 ↓1 · 1 month ago

        It’s made a world of difference in my IT support services business. It’s not always right, but it’s always helpful even when it isn’t. It’s far better than I am at looking at a page of log information and picking out the one bit that explains why the thing I need to work isn’t working. I’ve been emboldened to take on a lot of projects that I was previously uncomfortable with. The key is that I know enough about nearly anything that I can tell when I’m being led down a garden path.

        The quality of the prompt is everything.

        • djdarren@piefed.social · ↑3 ↓1 · 1 month ago

          It’s far better at looking at a page of log information and picking out the one bit that explains why the thing I need to work isn’t working

          Yes. I can post a terminal output into it and it’ll tell me exactly what’s not working and why. And that’s incredibly valuable.

          Ironically, I used Gemini to help me build a little app that takes a copied YouTube link and uses yt-dlp to download it to my Jellyfin server in a format that’ll play nicely on my Apple TV. I can’t imagine how I’d approach achieving that if I had to start from scratch.
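
          For what it’s worth, the core of a wrapper like that is surprisingly small. Here’s a minimal sketch in Python, assuming yt-dlp is installed on the server; the Jellyfin library path and the format string are illustrative guesses (Apple TV natively plays H.264 video and AAC audio in an MP4 container, so we ask for those):

```python
import shlex

# Hypothetical Jellyfin library path -- adjust to your own setup.
JELLYFIN_DIR = "/media/jellyfin/YouTube"

def build_ytdlp_command(url: str, out_dir: str = JELLYFIN_DIR) -> list[str]:
    """Compose a yt-dlp command line for an Apple TV-friendly download."""
    return [
        "yt-dlp",
        # Prefer H.264 (avc1) video + AAC (m4a) audio; fall back to best mp4.
        "-f", "bestvideo[vcodec^=avc1]+bestaudio[ext=m4a]/best[ext=mp4]/best",
        # Remux into a container the Apple TV accepts without re-encoding.
        "--remux-video", "mp4",
        # Name the file after the video title, inside the Jellyfin library.
        "-o", f"{out_dir}/%(title)s.%(ext)s",
        url,
    ]

# Example: build (but don't run) the command for a pasted YouTube link.
cmd = build_ytdlp_command("https://www.youtube.com/watch?v=dQw4w9WgXcQ")
print(shlex.join(cmd))
```

          The app would then hand that list to `subprocess.run`. This is just a sketch of the approach, not the actual app described above.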

    • Felis_Rex@lemmy.zip · ↑2 · 1 month ago

      Huge difference between having it and not needing it, and needing it and not having it.

      I think the person you’re replying to is 100% correct, since you’re coming at them so heated.

    • VitoRobles@lemmy.today · ↑13 · edited · 1 month ago

      Not here to downvote. But I will say there have been some good changes over the past five years.

      From a personal perspective: there’s a lot of GOOD open-source software that has great user experiences. VLC. Bitwarden. OBS. Joplin. Jitsi.

      Even WordPress (the new Blocks editor, not the ugly classic stuff) has picked up a lot of thought and design for end users in the past decade.

      For all the GIMP/LibreOffice software that has backwards-ass choices for UX, or those random terminal apps that require understanding the command line: they seem to be the ones everyone complains about, imprinted as “the face of open-source”. Which is a shame.

      There are so many good open-source projects that really do focus on the casual, non-technical end user.

    • lucullus@discuss.tchncs.de · ↑12 · 1 month ago

      While you generally have a point, the year of the Linux desktop is not hindered by that. Distributions like Linux Mint, Ubuntu and the like are just as easy to install as Windows, the desktop environments preinstalled on them work very well, and the software is more than sufficient for 70% to 80% of people (and that’s not even counting anything you cannot install with a single click from the app store/software center of the distribution).

      Though Linux is not the default. Windows pays big money to be the default. So why would “normal people” switch? Hell, most people will just stop messaging people instead of installing a different messenger on their phone. Installing a different OS on your PC/notebook is a way bigger step than that.

      So we probably won’t get the “Year of the Linux Desktop” unless someone outpays Microsoft for quite some time, or unless Microsoft and Windows implode by themselves (not likely either).

  • chicken@lemmy.dbzer0.com · ↑56 ↓1 · 1 month ago

    “Smart” TVs. Somehow they have replaced normal televisions despite being barely usable, laggy, DRM infested garbage.

    • OpenPassageways@lemmy.zip · ↑1 · 1 month ago

      Curious what your preferred streaming box is. I’m considering changing my Android TV so that it launches straight to HDMI, disconnecting it from the internet, and using a streaming box instead that isn’t as slow and has a hardwired connection.

    • spankinspinach@sh.itjust.works · ↑4 · 1 month ago

      Only if you use it as a smart TV. I just never signed the user agreements, and now I have a big OLED TV. I switch to the source I want, and off I go. Television can still just be television!

    • IronBird@lemmy.world · ↑3 · 1 month ago

      you can buy business-grade stuff without all the spyware shit, it’s just much more expensive

    • VitoRobles@lemmy.today · ↑9 · 1 month ago

      You’re not kidding. It’s pretty difficult to not buy them.

      It’s a $250 smart TV vs a $2000 non-infested TV.

    • Professorozone@lemmy.world · ↑6 · 1 month ago

      Man, I haven’t really faced this yet. My flat screen is a really old Panasonic plasma and it is “barely” smart. It came with a few apps on it. I ignore them and use it as a dumb monitor, running everything through my receiver instead. When it dies, I don’t know what I’ll do.

      • AvailableFill74@lemmy.ml · ↑8 · 1 month ago

        You can disconnect them from the WiFi and block their ability to connect and then use a third party device for any apps you want.

        • Professorozone@lemmy.world · ↑3 · 1 month ago

          I recently bought a TV on behalf of a friend (because it was cheaper at Costco), and when we got it to his house and connected it, it asked him to give up his privacy like 11 times. If he’d said no, would it still have worked?

          • AvailableFill74@lemmy.ml · ↑4 · 1 month ago

            Mine had the ability to turn off WiFi in settings. I provided it no real information, didn’t create an account, and didn’t use their app or interface.

            It was a Samsung. YMMV with other brands.

      • swab148@lemmy.dbzer0.com · ↑5 · 1 month ago

        They’re more expensive, but check out commercial displays. They’re basically just big “dumb” TVs for businesses to display menus and whatnot, usually with a single HDMI and no sound, but those limitations can easily be bypassed with a stereo receiver.

    • early_riser@lemmy.world · ↑5 · 1 month ago

      The concept confuses and infuriates me. I’m just going to stick a game console or Blu-ray player on it, but you can’t buy a TV these days that doesn’t have a bloated “smart” interface. The solution, for me at least, is a computer monitor. I don’t need or want a very large screen, and a monitor does exactly one thing, and that’s show me what I’ve plugged into it.

    • RedGreenBlue@lemmy.zip · ↑26 · 1 month ago

      They are surveillance and ad-delivery platforms. The user experience is as bad as the consumer can tolerate. They work as intended.

      • chicken@lemmy.dbzer0.com · ↑6 · 1 month ago

        I don’t buy it. They would be better at whatever nefarious crap if navigating between menu options didn’t take a full second, or if the UI were designed by someone competent. Even people who have subscriptions to the services the TV is a gateway to have a hard time figuring out how to use them. These things aren’t even good at exploitation; they are decaying technology.

        • djdarren@piefed.social · ↑3 · 1 month ago

          If every smart TV you buy is the same, then you have no viable choices, and as such they’re doing the bare minimum of what’s expected for the bare minimum of cost.

          • chicken@lemmy.dbzer0.com · ↑3 ↓1 · edited · 1 month ago

            You can choose not to have a TV. I only know about the current state of smart TVs from sometimes being around the ones other people have; I would never buy one myself, there’s no need. Any media you want to see can be viewed in other ways.

            • djdarren@piefed.social · ↑4 ↓1 · 1 month ago

              Do you have a 55" OLED laptop screen to watch movies and play games on?

              I mean, all power to you, but I really like having a nice sized TV.

              • chicken@lemmy.dbzer0.com · ↑2 · 1 month ago

                That’s fair. I think if I wanted a larger screen I’d look into big monitors and some kind of expansion of my homelab setup to display things to it, but I can see why people might want a dedicated device with less setup required, even one where the setup is still pretty confusing.

                I looked up some statistics and it seems, depressingly, that consumers are in fact buying more televisions and it’s projected to increase, so I guess I have to concede the point that what they are doing is successful despite all reason.