I found that idea interesting. Will we consider it the norm in the future to have a “firewall” layer between news and ourselves?

I once wrote a short story where the protagonist receives news of a friend's death, but it is intercepted by their AI assistant, which says: “when you have time, there is some emotional news that does not require urgent action, which you will need to digest.” I feel this could become the norm.

EDIT: For context, Karpathy is a very famous deep learning researcher who just came back from a two-week break from the internet. I don't think he is talking about politics there, but it applies quite a bit.

EDIT2: I find it interesting that many reactions here are (IMO) missing the point. This is not about shielding oneself from information one may be uncomfortable with, but about tweets specifically designed to elicit reactions, which are becoming a plague on Twitter due to its new incentives. It is about the difference between presenting news in a neutral way and presenting it as “incredibly atrocious crime done to CHILDREN and you are a monster for not caring!”. The second one feels a lot like an exploit of emotional backdoors, in my opinion.

      • neuracnu@lemmy.blahaj.zone

        Just like diet, some people prefer balancing food types and practicing moderation, and others overindulge on what makes them feel good in the moment.

        Having food options tightly controlled would restrict personal liberty, but doing nothing and letting people choose will lead to bad outcomes.

        The solution is to educate people on what kinds of choices are healthy and what are not, financially subsidize the healthy options so they are within reach to all, and only use law to restrict things that are explicitly harmful.

        Mapping that back to news and media, I’d like to see public education promoting the value of a balanced media and news diet. Put more money into non-politically-aligned news organizations. Look closely at news orgs that knowingly peddle falsehoods and either bring libel charges against them or create new laws that address the public harm done by maliciously spreading misinformation.

        But I’m no lawyer, so I don’t know how to do that last part without creating some form of tyranny.

    • halfway_neko@lemmy.dbzer0.com

      isn’t that what the upvote/downvote buttons are for? although to be fair, i’d much rather the people of lemmy decide which things are good and interesting than some “algorithm”

      • fine_sandy_bottom@discuss.tchncs.de

        There’s a real risk to this belief.

        There are elements of lemmy who use votes to manipulate which ideas appear popular, with the intention of manipulating discourse rather than having open discussions.

        • halfway_neko@lemmy.dbzer0.com

          yeah. you’re right.

          it’s not like i blindly trust the votes to tell me what’s right and wrong, but they still influence my thoughts. i could just sort by new, but i feel like that’s almost as easy to manipulate.

          i guess it comes back to the topic of the post. where and how i get my information is always going to affect me.

          i’m sure other platforms are no better than lemmy with manipulating content, but maybe for different reasons. i just have to choose the right places to spend my time.

          • fine_sandy_bottom@discuss.tchncs.de

Yeah, this is an “unpopular opinion”, but I don’t believe the lemmyverse in its current form is sustainable, for this reason.

            Instances federate with everyone by default. It’s only when instances are really egregious that admins will defederate from them.

Sooner or later Lemmy will present more of a target for state actors wishing to stoke division and foment unrest. At that point the only redress will be for admins to defederate from other instances by default, and only federate with those whose moderation policies align with their own.

            You might say, the lemmyverse will shatter.

            I don’t think that’s necessarily a bad thing.

            End rant.

          • Alexstarfire@lemmy.world

That really just reinforces my point. Most people aren’t setting that up themselves. The app defaults do that, i.e. someone/something else is making that determination. Sure, maybe you can still check out the post if you really want, but how many will do that? Can you change how it works? Depends on the app.

            If people want to opt-in to it then I don’t really care. I mostly HATE being forced to opt-out of things though.

            • keepthepace@slrpnk.netOP

              Well then we can argue about defaults, but in an open source app, I think the point is moot: anyone can make a new version with different defaults.

              [some content is masked, change the settings if you want to see it or disable this warning] sounds like an acceptable default over almost anything filterable in my opinion.

  • GrymEdm@lemmy.world

    Without wanting to be too aggressive, with only that quote to go on it sounds like that person wants to live in a safe zone where they’re never challenged, angered, made afraid, or have to reconsider their world view. That’s the very definition of an echo chamber. I don’t think you’re meant to live life experiencing only “approved” moments, even if you’re the one in charge of approving them. Frankly I don’t know how that would be possible without an insane amount of external control. You’d have to have someone/something else as a “wall” of sorts controlling your every experience or else how would things get reliably filtered?

    I’d much prefer to teach people how to be resilient so they don’t have to be afraid of being exposed to the “wrong” ideas. I’d recommend things like learning what emotions mean and how to deal with them, coping/processing bad moments, introspection, how to get help, and how to check new ideas against your own ethics. E.g. if you read something and it makes you angry, what idea/experience is the anger telling you to protect yourself from and how does it match your morality? How do you express that anger in a reasonable and productive way? If it’s serious who do you call? And so on.

    • OKRainbowKid@feddit.de

      I see where you’re coming from, but if you look up Karpathy, you’ll probably come to a different conclusion.

      • GrymEdm@lemmy.world

        He’s talking about wanting some system to filter out Tweets that “elicit emotion” or “nudge views”, comparing them to malware. I looked him up and see he’s a computer scientist, which explains the comparison to malware. I assume when he’s designing AI he tries to filter what inputs the model gets so as to achieve the desired results. AI acts on algorithms and prompts regardless of value/ethics and bad input = bad output, but I think human minds have much more capability to cope and assess value than modern AI. As such I still don’t like the idea of sanitizing intake because I believe in resilience and processing unpleasantness as opposed to stringent filters. What am I missing?

        • OKRainbowKid@feddit.de

I don’t think you’re missing anything. Maybe you’re just taking his tweet more seriously or literally than he intended. To me, it’s just an interesting perspective to consider tweets that are meant to influence your opinion as malware. Sure, somebody aware of the types of “bad input” in the form of misinformation campaigns, propaganda, or advertisement might not be (as) susceptible to that - but considering the average Twitter user, comparing this type of content to malware seems appropriate to me.

    • keepthepace@slrpnk.netOP

I think you are getting it wrong. I added a small edit for context. It is more about emotional distraction. I kinda feel like him: I want to remain informed, but please let me prepare a bit before telling me about civilians cut to pieces in a conflict, sandwiched between a funny cat video and machine learning news.

For the same reason we filter porn or gore images out of our feeds, highly emotional news should be filterable.

      • GrymEdm@lemmy.world

        I don’t think there’s anything wrong with taking a break from social media or news. There are days I don’t visit sites like Lemmy or when I do I only click non-news links because I’m not in the mood or already having a bad day. That’s different than filtering (as per Karpathy’s example) Tweets so that when you do engage it’s consistently a very curated, inoffensive, “safe” experience. Again, I only have the one post to go off of, but he specifically talks about wishing to avoid Tweets that “elicit emotions” or “nudge views” and compares those provocative messages to malware. As far as your point regarding blatantly sensationalist news, when I recognize it’s that kind of story I just stop reading/watching and that’s that.

        I WANT to have my emotions elicited because I seek to be educated and don’t want to be complacent about things that should make me react. “Don’t know, don’t care” is how people go unrepresented or abused - e.g. almost no one reads about what Boko Haram is doing in Nigeria (thus it’s already “filtered out” by media), and so very little has been done in the 22 years they’ve been affecting millions of lives. I WANT to have my “views nudged” because I’m regularly checking my worldview to make sure it stays centered around my core ethics, and being challenged has prompted me to change bad stances before. Being exposed to objectionable content before and reassessing is also how I’ve learned to spot BS attempts to manipulate. It doesn’t matter how many times MAGA Tweets tell me that God is upset at drag queens and only Donald Trump can save the world because now I recognize ragebait when I see it. Having dealt with it before, no amount of exposure is going to make me believe their trash and knowing what is being said is useful for exposing and opposing harmful governmental policies/bad candidates (sometimes even helping deprogram others).

        • GrymEdm@lemmy.world

I’m putting this in its own response because it’s a less important addendum to my main points above and I don’t want to put everyone off with a single huge brick of text.

If just knowing bad news exists makes life difficult for someone, even if they don’t click the link, then I’d (respectfully, not as an insult) recommend learning coping techniques like meditation, diaphragmatic breathing, or cognitive behavioral therapy that can add resilience. I am NOT suggesting someone feeling like that is innately weak or flawed, but there are techniques to move the impact of knowing bad things are happening toward something manageable. If it’s still immediately and extremely distressing, there may be related past trauma that needs sorting out.

          Physical analogy for social media breaks - I work out regularly. Even though it’s a healthy habit, I don’t work out every day because it’s tiring and that would make it unhealthy. When I do work out though I want it to be difficult because that’s how gains are made. So I’m not saying you or I need to batter ourselves with torturous news every day - breaks aren’t just ok they’re how you stay healthy. When I read the news though, I want the whole truth even if that truth has parts that are uncomfortable or challenge my worldview, and I also want to be experienced/trained enough to handle those emotions and thoughts.

      • bloodfart@lemmy.ml

        That’s the point.

        The information that’s upsetting has leaked around the existing mechanisms for preventing it from ending up in your view.

        You’re supposed to be angry, not wish there was a better way to keep from seeing it.

        I swear to god we got motherfuckers here who took the wrong message from the damn matrix.

  • dejected_warp_core@lemmy.world

    The real question then becomes: what would you trust to filter comments and information for you?

    In the past, it was newspaper editors, TV news teams, journalists, and so on. Assuming we can’t have a return to form on that front, would it be down to some AI?

    • keepthepace@slrpnk.netOP

      Why do people, especially here in the fediverse, immediately assume that the only way to do it is to give power of censorship to a third party?

      Just have an optional, automatic, user-parameterized, auto-tagger and set parameters yourself for what you want to see.

      Have a list of things that should receive trigger warnings. Group things by anger-inducing factors.

I’d love a way to filter things by actionable items: for things I can get angry about but have little ability to change, no need to give me more than a monthly update.
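The mechanism described above could be sketched as a small client-side routine. This is only a minimal illustration, assuming a keyword-based tagger (all tag names, keyword lists, and the `FilterPrefs` structure here are made up; a real client would likely use a local classifier rather than keyword matching):

```python
from dataclasses import dataclass, field

@dataclass
class FilterPrefs:
    # Tags the user wants collapsed behind a content warning.
    warn_tags: set = field(default_factory=lambda: {"graphic-violence"})
    # Tags the user wants batched into an occasional digest
    # (anger-inducing but not actionable).
    digest_tags: set = field(default_factory=lambda: {"outrage-no-action"})

# Toy keyword lists standing in for a real classifier.
TAG_KEYWORDS = {
    "graphic-violence": ["maimed", "gruesome"],
    "outrage-no-action": ["you are a monster", "atrocious"],
}

def tag_post(text: str) -> set:
    """Assign tags to a post based on keyword matches."""
    lower = text.lower()
    return {tag for tag, words in TAG_KEYWORDS.items()
            if any(w in lower for w in words)}

def route_post(text: str, prefs: FilterPrefs) -> str:
    """Decide how a post is shown: inline, masked, or held for a digest."""
    tags = tag_post(text)
    if tags & prefs.warn_tags:
        return "masked"   # shown as "[some content is masked...]"
    if tags & prefs.digest_tags:
        return "digest"   # held for a periodic summary
    return "inline"

prefs = FilterPrefs()
print(route_post("New open weights model released", prefs))       # inline
print(route_post("Gruesome images from the front line", prefs))   # masked
```

The point of the sketch is that the parameters live entirely with the user, so no third party ever decides what is hidden outright.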

      • zaphod@sopuli.xyz

        Because your “auto-tagger” is a third party and you have to trust it to filter stuff correctly.

    • tamal3@lemmy.world

      Most recent Ezra Klein podcast was talking about the future of AI assistants helping us digest and curate the amount of information that comes at us each day. I thought that was a cool idea.

      *Edit: create to curate

      • dejected_warp_core@lemmy.world

        It makes a lot of sense. It also presents an opportunity to hand off such filtering to a more responsible entity/agency than media companies of the past. In the end, I sincerely hope we have a huge number of options rather than the same established players (FANG) as everything else right now.

  • MonkderDritte@feddit.de

    Our mind is built on that “malware”. I think it’s more accurate to compare brain + knowledge to our immune system: the more samples you have, the better you are armed against mal-information.

    • FooBarrington@lemmy.world

      But that leaves out the psychological effects of long-term exposure to ideas. If you know for a fact that the earth is round, and for the next 50 years all the media you consume keeps telling you that the earth is flat, you will at some point start believing that (or at least become unsure).

      Every piece of information you receive has some tiny effect on you.

    • EntirelyUnlovable@lemmy.world

I was thinking the same: you need to be exposed to some bullshit every now and then to give contrast and context to what you believe to be true.

    • keepthepace@slrpnk.netOP

This sounds like the theories that were prevalent before germ theory, when surgeons and obstetricians would argue that washing their hands was a disservice to the bodies they worked on.

Immune systems can still be overwhelmed, and people still get sick. There is a mental hygiene that needs to exist.

  • Lemvi@lemmy.sdf.org

    I think the right approach would be to learn to deal with any kind of information, rather than to censor anything we might not like hearing.

    • keepthepace@slrpnk.netOP

We already have tons of filters in place trying to serve us information that we are interested in, knowledgeable enough to digest, not spammy, in the correct language, not porn or gore, etc. He is just proposing another interesting dimension. For instance, I follow AI news and news about the Ukraine conflict, but I prefer to keep them separate and not be distracted by one while I get my fill of the other.

The only way I found with Twitter (and now Mastodon) to do that is to devote Twitter only to tech news.

      • Lemvi@lemmy.sdf.org

        I don’t think he is proposing another dimension, but rather another scale. As you already said, we already filter the information that reaches us.

        He seems to take this idea of filtering/censorship to an extreme. Where I see filtering mostly as a matter of convenience, he portrays information as a threat that people need to be protected from. He implies that being presented with information that challenges your world view is something bad, and I disagree with that.

        I am not saying that filtering is bad. I too have blocked some communities here on Lemmy. I am saying that it is important not to put yourself in a bubble, where every opinion you see is one you agree with, and every news article confirms your beliefs.

        • keepthepace@slrpnk.netOP

          Emotion != information

You can know that the Israeli-Palestinian conflict is going on without having pictures of maimed bodies pushed into your news feed. I have actually blocked people I agree with just because they could not stop spamming angrily about it. I also have a militant ecologist friend who thinks saving the planet means pushing the most anxiety-inducing news as much as possible. Blocked.

I don’t think that blocking content that focuses on pathos locks us in a bubble; quite the opposite. Emotions block analysis.

    • keepthepace@slrpnk.netOP

I really think that just as the 20th century saw the rise of basic hygiene practices, we are putting in place mental hygiene practices in the 21st.

    • keepthepace@slrpnk.netOP

Kind of, but the guy being a prominent LLM researcher, it hints at the possibility of doing this without inflicting the filtering work on humans, and without having to design an apolitical structure for it.

  • fine_sandy_bottom@discuss.tchncs.de

Not really. An executable controlled by an attacker could likely “own” you. A toot, tweet, or comment cannot; it’s just an idea or thought that you can accept or reject.

We already distance ourselves from sources of consistently bad ideas. For example, we’re all here instead of on Truth Social.

  • bloodfart@lemmy.ml

    We already have a firewall layer between outside information and ourselves, it’s called the ego, superego, our morals, ethics and comprehension of our membership in groups, our existing views and values. The sum of our experiences up till now!

    Lay off the Stephenson and Gibson. Try some Tolstoy or Steinbeck.

  • YoFrodo@lemmy.world

Reading, watching, and listening to anything is like this. You accept communications into your brain and sort them out there. It’s why people censor things: to shield others and/or to prevent the spread of certain ideas/concepts/information.

Misinformation, lies, scams, etc. function entirely by exploiting this.

  • rtxn@lemmy.world

    Nah man, curl that shit into my bash and let me deal with it

    • TrickDacy@lemmy.world

      Yeah, op seems to think minds are weak and endlessly vulnerable. I don’t believe that, not about myself at least

      • Dave.

        Yeah, op seems to think minds are weak and endlessly vulnerable. I don’t believe that, not about myself at least

        Your mind is subject to cognitive biases that are extremely difficult to work around. For example, your statement is an example of egocentric bias.

        All you need is content that takes advantage of a few of those biases and it’s straight in past your defences.

        • keepthepace@slrpnk.netOP

I am fairly armored intellectually, but emotionally I find it draining to be reminded that war is at my doorstep and that kids are dying gruesome deaths in conflicts I barely know about.

        • TrickDacy@lemmy.world

Yeah, I understand people are pretty flawed and vulnerable to some degree of manipulation. I just think that the idea proposed in this post is not only an overreaction but also underestimates people’s ability to reject bullshit. We can’t always tell what’s bullshit, sure, but we don’t need to be treated like we’re too fragile to think for ourselves. Once that happens, we would literally become unable to do so.

      • xxd@discuss.tchncs.de

        I think you’re too optimistic as to how difficult it is to influence people. Just think of the various, obviously false, conspiracy theories that some people still believe. I think that for every person there is some piece of information/news, that is just believable enough without questioning it, that is going to nudge their opinion just ever so slightly. And with enough nudges, opinions can change.

        • TrickDacy@lemmy.world

You’re referring to fringe groups. There are a lot of them, but they’re also a vast minority. Even so, treating adults like especially fragile children isn’t going to help.

          • xxd@discuss.tchncs.de

Yes, only fringe groups believe outlandish conspiracies, but it’s unrealistic to believe that most people, including you, can’t be influenced. Just think of ads or common misconceptions. Everyone is susceptible to this to some degree; no one can have their guard up 24/7, regardless of being a child or an adult. Having a “firewall” for everything isn’t a good solution, I’d say, but it’s not as if everybody is as resilient as you think.

            • TrickDacy@lemmy.world

I don’t think we’re actually disagreeing. I’m not saying people are super resilient, just that we are resilient at all, which the post appears to doubt.

  • perestroika@slrpnk.net

    I think most people already have this firewall installed, and it’s working too well - they’re absorbing minimal information that contradicts their self-image or world view. :) Scammers just know how to bypass the firewall. :)

  • xxd@discuss.tchncs.de

Leaving aside the dystopian echo chamber this could result in, you could argue that it would help a lot with fake news. Fake news is easier to spread and more present than ever. And for every person there is probably one piece of news that is just believable enough not to question. And then the next just-believable piece of news. And another. I believe no one is immune to being influenced by fake stories, maybe even radicalized if they are targeted just right. A firewall just filtering out everything non-factual would already prevent so much societal damage, I think.

    • GrymEdm@lemmy.world

      There are enormous issues with who decides what makes it through the filter, how to handle things that are of unknown truth (say ongoing research), and the hazards of training consumers of information to assume everything that makes it to them is completely factual (the whole point of said fake news filter). If you’d argue that people on the far side of the filter can still be skeptical, then just train that and avoid censorship via filter.

      • xxd@discuss.tchncs.de

Yeah, I agree. It’s not easy to determine truth, and whoever decides truth might introduce bias that then gets rolled out to everyone. With ongoing research or unknown information, you could just have a “currently being researched” or “not confirmed yet” label attached. I’m just saying that in an ideal world where this does work, it could be safer than relying on people being skeptical, because everyone fails to be skeptical about something eventually.

  • theneverfox@pawb.social

I remember watching a video from a psychiatrist with Eastern monk training. He was explaining why yogis spend decades meditating in remote caves: he said it was to control information/stimuli exposure.

Ideas are like seeds: once they take root, they grow. You can weed out unwanted ones, but it takes time and mental energy. It pulls at your attention and keeps you from functioning at your best.

    The concept really spoke to me. It’s easier to consciously control your environment than it is to consciously control your thoughts and emotions.