Yet psychopathic megalomaniacal leaders are a feature of the human race further back than recorded history, whereas remote mass destruction of estranged populations is a very recent development. Therefore it is immoral to develop, create and deploy weaponry like this, and “we will be the victims of it if we do not” is a similarly weak moral argument to the one above. Just because we expect someone else to do the immoral thing does not render us any more moral for having done it. I don’t think.

Yes, you can argue necessity, but how far does that go? If a pacifist somehow held in their hands a button which would kill every non-pacifist in the world, should they push it?

And, in creating any new technology, we do need to ask, “is introducing this worth the risk of it falling into the wrong hands?” It’s similar to how anti-privacy laws creep in. If you’ve got nothing to hide, you’ve got nothing to fear, until the next government gets in and you need to hide being gay, or brown, or a woman. It’s not a question of whether or not “the good guys” get the weapon, it’s a question of what happens when the bad guys do, because they certainly will, because that’s what bad guys do.
If a pacifist somehow held in their hands a button which would kill every non-pacifist in the world, should they push it?
Did you intend this to be paradoxical? If a pacifist pushed a button to kill non-pacifists, he would cease to be a pacifist in doing so, and would obviously die from it too.
Yet psychopathic megalomaniacal leaders are a feature of the human race further back than recorded history, whereas remote mass destruction of estranged populations is a very recent development
This is likely wrong. In Sapiens, Yuval Harari discusses at length how genocide is as old as humanity. Some of us would brutally murder each other with sticks and stones if we had nothing better.
And, in creating any new technology, we do need to ask, “is introducing this worth the risk of it falling into the wrong hands?”
I guess I can more or less agree with this question. But most defense work is not creating the atomic bomb. Most of it is incremental improvements aimed at more effectively engaging a military target. Which is why the US did so poorly against guerrilla warfare in Afghanistan… But that’s beside the point. Excuse my tangent. I am a defense contractor; I have left programs whose existence made me uncomfortable.
Anyway, we agree that psychopathic megalomaniacs are a feature of the human creature. And whether they are flying drones, driving tanks, or leading a horde of mounted Visigoths at your village, I think most of us would rather remove them as a threat from a safe distance… Like with a missile.
A bit, yes. There’s an inherent paradox in the argument about necessity. Put it another way: if the next technology turns all of your enemies into steam, but as a side effect, also does the same to their families, are you forced to develop it, because the people on the other side of the world will just get there first if you don’t? What if the one after that is super low-resource, yet it also kills anyone who has ever shaken hands with your enemy? And so on. I would argue that creating a new weapon, or developing existing ones further is not made more or less moral on the basis that your enemy might be doing it, because if you knew your enemy’s mind that well, you could easily defeat them using a slingshot.
This is likely wrong…Some of us would brutally murder each other with sticks and stones if we had nothing better.
Not sure I follow; this seems to be what I was saying. Read it back. The difference is that now we have technology capable of remotely erasing huge populations, and no means whatsoever of keeping it out of the hands of the freaks who invariably take power. It’s therefore immoral to develop weapons, because if you are clever enough to know how to do that, you should be clever enough to know how the resulting products will end up being used.
most defense work is not creating the atomic bomb. Most of it is incremental improvements
So the difference between them then is just one of scale. Oppenheimer probably never got a good night’s sleep again in his life, but it’s easy to persuade a thousand people to each do a thousandth of what he did. Then each person is only a thousandth as responsible as Oppenheimer. But each increment is still an evil deed, just a smaller one.
“Concern for man himself and his fate must always form the chief interest of all technical endeavors…in order that the creations of our mind shall be a blessing and not a curse to mankind. Never forget this in the midst of your diagrams and equations.” People working on weapons are ignoring, forgetting or equivocating over this simple principle. Good people don’t make bombs and sleep well at night. Find another job, where you can look back at your life’s work and honestly believe you made the world a better place.
Anyway, we agree that psychopathic megalomaniacs are a feature of the human creature. And whether they are flying drones, driving tanks, or leading a horde of mounted Visigoths at your village, I think most of us would rather remove them as a threat from a safe distance… Like with a missile.
Most of us would prefer our enemies killed at range, without having to look them in the eye, sure. But look at what you’re mixing up here: the psychopathic megalomaniacs who are sitting barking orders a world away from the lethality radii, and the grunts and (invariably) innocent collateral who are atomised inside them.
Not sure I follow; this seems to be what I was saying. Read it back. The difference is that now we have technology capable of remotely erasing huge populations, and no means whatsoever of keeping it out of the hands of the freaks who invariably take power.
Were you initially arguing, then, that today’s weapons are worse because they make murder further removed from oneself, or because the scale of death is larger? Or both?
If the first argument, I disagree. Murder is no more moral for being gritty and physical. Tasting the blood of your victim doesn’t redeem the act. Perhaps you would argue that it is worse to allow the murderer to obfuscate the brutality of his actions from himself. But either way, he is a murderer just the same, with the same suffering resulting from his actions. Others should not be held accountable because he found a way to lie to himself. Removing the killing from the immediate vicinity of others allows it to be more targeted and to involve fewer innocents, and that far outweighs the mental gymnastics it enables for the murderer.
If the second argument, I agree the scale of death, especially the scale of imprecise killing, affects the morality of a weapon, hence why I mentioned nuclear weapons. I kind of thought you did NOT agree with that though, based on this argument:
So the difference between them then is just one of scale.
The number of innocent deaths enabled by a fusion bomb in a single instance far outstrips that of a conventional bomb. And I would argue it is a weapon that could not be used in any way that would not involve millions of innocent deaths. This inability to be harnessed in any productive way (besides as a threat, I suppose) is what clearly places it in the realm of immoral weapons, and this is fundamentally different from (e.g.) designing sensors that enable us to better monitor the activities of our adversaries. You are making an argument about the cumulative effects of people’s actions, but still the net effects of the people who worked on these two examples are very different.
the next technology turns all of your enemies into steam, but as a side effect, also does the same to their families…I would argue that creating a new weapon, or developing existing ones further is not made more or less moral on the basis that your enemy might be doing it,
I argued that arming yourself was moral based on the fact that psychopaths would likely attack you. I am not trying to justify absolutely every type of weapon in existence, but the post is saying ALL weapons and their production are immoral, which I do disagree with. And again, I would largely view a weapon that cannot be effective without harming innocents as immoral (another example: chemical warfare agents that cannot be removed from the environment). I do not think the morality of any object is based on whether it can be used to harm innocents, though, because as previously argued, that is true of every facet of existence in the hands of a psychopath.
One facet of military development is development of CONOPS (Concept of Operations - how the weapon is used), and there are absolutely immoral CONOPS of weapons (like carpet bombing).
But look at what you’re mixing up here: the psychopathic megalomaniacs who are sitting barking orders a world away from the lethality radii, and the grunts and (invariably) innocent collateral who are atomised inside them.
I feel like you are arguing that because grunts are being exploited (I can agree with this), they are innocent. But if you are hired to kill others on behalf of a psychopath, even if you really need the money, you are still accountable for carrying out those orders. They are not innocents for having been duped. They are tools of destruction in the hands of the psychopath and must be disabled as much as a bomb or drone.
Find another job, where you can look back at your life’s work and honestly believe you made the world a better place.
I think it is a tall order to demand everyone dedicate all of their energies only to improving the world. Most people do a job they think is fine (especially since ideological work usually doesn’t pay) and contribute to the world and their communities as they can. My husband and I went around and around about this with Trump’s most recent election. We settled on working on programs we don’t think are actively harmful, donating generously with time and money, and political activism as it seems useful. The issues I worry most about require collective action (climate change, the malevolence of the current US administration), and I have never been one skilled at persuading others.