• chinpokomon@beehaw.org · 1 year ago

    Long term, there is some benefit to this sort of concept. You aren’t going to have as much freedom to turn your cloud-based OS into a custom build, but what you will have is a machine that never has downtime for patches and security updates. The user runs their app remotely, using all the power and hardware of a data center, and the instance of the app can migrate from one host PC to another, seamlessly and imperceptibly to the end user. Furthermore, a user can access all their applications and data from whatever client they are using, and the session will follow them from their terminal, to their phone, to their AR HMD.

    It isn’t going to be a change which happens overnight, and it will be more like how car engines have become less user-serviceable but more reliable and efficient. It will be a different experience for sure, but it has potential value beyond being a way to charge people subscriptions.

    • Scrubbles@poptalk.scrubbles.tech · 1 year ago

      Ehhh, we’ve been down that road before with thin clients. Anyone who has had to do their job on a thin client will tell you the experience never compares to running things locally.

      • chinpokomon@beehaw.org · 1 year ago

        We have, and there are still things to solve before this is completely practical. But this is different from connecting to a mainframe over a 3270 terminal. A closer example of how this would work is forwarding X11 from a remote system, or using SSH to tunnel to a server where I’ve run screen. Once I’ve connected to a GUI application running on a server, or reconnected my SSH session, it matters much less where I’m connecting from. Extend this concept to Windows and you wouldn’t even need local storage for most needs.

        It won’t be practical for places with poor network connectivity, but where the connection is reliable, high-bandwidth, and low-latency, it won’t be discernible from local use for most business applications. This is probably the biggest driving force behind XCloud: if Microsoft can make games run across networks with minimal problems, business applications are going to do just fine. XCloud works great for me, allowing me to stream with few problems. That’s less true for others in my family, so clearly this isn’t something that can roll out to everyone, everywhere, all at once.

        I think it would be great to be able to spin up additional vCPU cores, or grow drive space or system RAM, as needed per process, so that I’m not wasting cycles or underutilizing my hardware. That seems like it would become possible with this sort of platform.
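
        To make the X11/screen comparison concrete, it’s roughly this (user and hostname are just placeholders):

            ssh -X me@devbox              # X11 forwarding: the remote GUI app draws on my local display
            ssh -t me@devbox screen -r    # reattach the screen session I left running on the server

        Either way, the work lives on the server; my local machine is just the window into it, and I can reconnect from a different one.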

    • sfera@beehaw.org · 1 year ago

      “a user can access all their applications and data from whatever client they are using”

      Also, users won’t own their most basic data anymore, nor will they be able to control how it is used. Canceling a subscription (or being locked out) could mean losing it all.

      • chinpokomon@beehaw.org · 1 year ago (edited)

        For a business, I see this as a strong benefit of this design. Work done for a company is the property of that company under most hiring contracts, so work done on a remote system can be tightly controlled. At the same time, it would allow someone to use their own thin client for both professional and personal work and keep the two isolated. For someone doing freelance work, it makes sharing a natural extension of the process, and access can be granted or revoked as contracts require. That seems like an advantage to corporate IT departments.

        As for individuals, I don’t see how this takes away ownership. Regulations will be updated so that users can request their data in compliance with GDPR, so nothing would become completely locked up. Should that ever be challenged, I don’t think any jurisdiction would say that Microsoft owns the data. What a user will be able to do with the bits they receive is a different question.

        • sfera@beehaw.org · 1 year ago (edited)

          I understand your point (regarding protection of intellectual property and having a homogeneous, controlled IT infrastructure), but I’d like to add that as a business (disregarding what my employees might like or consider more effective) I am still not in control of anything if my data and applications are somewhere “in the cloud”, out of my reach. As a company I would be bound to that provider (in this case Microsoft) and would have to pay whatever they require for whatever they offer, good or bad. A small alleviation would be to have that “cloud” on premises, but I think that’s highly unrealistic. In this regard, a business is very similar to the plain user in my previous reply.

          Also, don’t forget that GDPR doesn’t apply everywhere. It’s just an EU requirement, which might or might not be fully implemented even where it is required. As I mentioned, there’s no guarantee that your company data isn’t misused when it’s completely out of your hands. To say nothing of what a security breach or outage would mean, and what kind of impact it would have.

          Please don’t get me wrong. I’m not trying to spread FUD, but I am generally skeptical and try to think critically. Moving “everything” into “the cloud”, into the hands of one single actor, requires a level of trust which I’m not able to provide, and it introduces single points of failure which I wouldn’t like to have, neither as an individual nor as a company.

          Thanks for reading my longest post ever. ;-)