FAQ

Q: Why not organize and stop treating the bus as a legitimate entity? Why aren’t you working to stop the bus?

A: Do both. Cut the fuel line. Break windows. Put oatmeal in the gas tank. But maybe your efforts don’t succeed this election cycle. And if so, don’t fucking throw away your vote if it can help your neighbors fucking survive. “Harm reduction” is not a political strategy for action. It is a last-minute, end-of-the-line decision to save lives, after all other resources have been exhausted.

  • HACKthePRISONS@kolektiva.social · 3 months ago

    A deontological system places the morality in the action itself, so you know before you do it whether it’s the right thing to do. Consequentialist systems change the morality of the action depending on its future results.

    What if we need Trump to be elected in order to escape Earth before the sun goes nova? It’s an unknowable proposition, but are you willing to risk all of humanity on voting for Biden?

    • zea@lemmy.blahaj.zone · 3 months ago

      If you can convince me voting for Trump gives greater expected value, then I’ll do it, but absurd possibilities like the one you describe usually come with an exact inverse that cancels out their expected value.

      Should I let that butterfly flap its wings? What if it causes a tornado somewhere?! Or, what if its not flapping causes a tornado somewhere?! Both are equally plausible, so there’s no point in choosing my actions based on them.
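
      A minimal sketch of that cancellation point, in Python (my own illustration with made-up numbers, not anything posted in the thread): if a far-fetched scenario and its exact inverse are treated as equally likely, their probability-weighted payoffs sum to zero and leave the overall expected value untouched.

      # Expected value of a set of (probability, payoff) scenarios.
      def expected_value(scenarios):
          return sum(p * v for p, v in scenarios)

      tiny_p = 1e-9   # probability assigned to the far-fetched scenario (hypothetical)
      huge_v = 1e12   # payoff if it comes true (hypothetical)

      baseline = [(1.0, 1.0)]            # the ordinary, foreseeable stakes
      with_absurdities = baseline + [
          (tiny_p,  huge_v),             # the outlandish scenario as stated
          (tiny_p, -huge_v),             # its equally plausible exact inverse
      ]

      print(expected_value(baseline))          # 1.0
      print(expected_value(with_absurdities))  # 1.0 -- the pair nets to zero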

      • HACKthePRISONS@kolektiva.social · 3 months ago

        I think you understand the problem of the unknowability of the effects of our actions, and consequently how absurd it is to use that as the basis of our morality.

        I’m not trying to get you to vote for Trump; I’m trying to get you to choose a useful moral framework.

          • HACKthePRISONS@kolektiva.social · 3 months ago

            The uncertainty shifts within the framework from whether my actions will have a good outcome to whether I know which actions are moral. I suppose it’s possible that I might not know, but the categorical imperative is pretty easy to apply, so my confidence is much higher than I imagine is possible for any action within a utilitarian frame: there you are totally dependent on unknowable circumstances to determine the morality of past actions.

            • zea@lemmy.blahaj.zone · 3 months ago

              I want good outcomes, not the feeling of personal moral purity. Outcomes are inherently uncertain. You can say “murder bad, no uncertainty”, but that still leaves the outcome, the part I care about, uncertain.

              If I wanted moral certainty above all else, I could just say everything’s moral.