Very interesting guest! I wish the interview was longer!
https://johnhorgan.org/
Thanks for the link, and for the free access. Interesting. Interested!
Horgan said, "I wrote my book The End of War, which argues we can and must end war"
I haven't read the book, and so I'll ask, does the book address the following?
Almost all the violence in the world, globally and in our own neighborhoods, arises from a single, easily identified source: violent men.
GOOD NEWS: We don't need to edit the entire species or the entire social structure. Few women are violent, and most men are peaceful. The primary threat to civilization arises from a small fraction of humanity which needs to either be transformed, removed, or managed effectively.
BAD NEWS: To my knowledge, no society in history has figured out how to keep the many peaceful men while ridding itself of the minority of violent men. If true, then at least one of the options we should be considering is a world without men. I lay out that case over 14 pages here:
https://www.tannytalk.com/s/peace
I don't seek agreement on that claim, only intellectually honest friends who will challenge the "world without men" concept in earnest, and either defeat it, or admit so if they can't.
Yes, we MUST end war. And to that end, all options should be on the table for review.
Well, I have to say, Phil, you bring some interesting ideas! Have you read Shakespeare's Macbeth? Seems like the total calculation of violence might include folks not pulling the trigger. I love your passion, man. Thank you.
Yes, a total calculation would include voters and others, agreed.
I'm not proposing a total solution to war or violence. I define world peace as a radical reduction in violence, not a total elimination. The most obvious place to begin in seeking a radical reduction in violence is a focus on that small fraction of humanity committing most of the violence.
Well, no, that's not quite right. The most obvious place to begin would be to develop our understanding of the NEED for a radical reduction in violence. The status quo is unsustainable. Until we get that, we're unlikely to be interested in the degree of change that is necessary.
I was referring to Lady Macbeth.... Lady Macbeth is a pivotal character who incites violence primarily through her manipulation and provocation of Macbeth himself. One of her most significant moments is in Act 1, Scene 7, where she questions Macbeth's manhood and resolve, accusing him of being a coward when he hesitates to murder King Duncan. Her taunting speech includes lines like:
"Art thou afeard
To be the same in thine own act and valour
As thou art in desire? Wouldst thou have that
Which thou esteem'st the ornament of life,
And live a coward in thine own esteem,
Letting 'I dare not' wait upon 'I would,'
Like the poor cat i' the adage?"
So I think men and women as human beings are part of the story of war and violence. Big issue, though, I agree. Thanks for bringing this up Phil, it's important to think about these issues. Erik
Being part of the story is not the same as being the source of the overwhelming majority of the violence. Yes, women aren't saints, they are imperfect, even violent sometimes. But they represent a tiny percent of the problem.
Hi Phil,
I guess I just don't find the gendered thing that interesting when it comes to understanding human beings and violence. Men certainly pull the trigger more than women. Women historically have insisted on having a safe place to raise children, which pretty much means someone is going to pull the trigger somewhere. I don't think we're hitting paydirt here. Men commit more violence than women. We can go to race and further subdivide. Last time I checked, in the US Blacks commit more violence than whites or Asians. Sigh. It all strikes me as a nonstarter. Men also disproportionately die in wars, and I suppose that's "violence" as well. But I suppose you and I would be pretty grateful if "men" protected the US from foreign invasion and so on. This will go round and round in a circle; my view is that there's a better analysis. Thanks for your comment, Phil.
Horgan asks, "How far can science go in explaining reality?"
Before we ask that, we might first ask this:
How far can human beings go in successfully managing what is explained?
If human beings can only go to X, then we don't really want science going to X+. The fact that we can learn something doesn't automatically equal us being able to manage what we've learned.
We're talking about two different things here. Scientists are very intelligent, highly educated people who typically have the best of intentions. The population that will inherit the powers which flow from scientific explanations would be described quite differently.
Hi Phil,
I don't find this persuasive, unfortunately. Are we supposed to figure out what scientists will come up with before they come up with it? Then, through some magic, figure out whether "everyone can handle it"? We don't know what the human mind will discover, so we have to invest in science and learning and human intelligence and see. If it's disturbing, once we know what it is--what we're actually talking about--then we can bring in legal systems and cultural pressure and all the rest. The idea that we would somehow "prepare" people for scientific discoveries that we haven't yet experienced strikes me as, well, just not an idea that is likely to help anything, anywhere.
There's an immediate problem: what would you actually teach people about a discovery we don't yet have? We don't know what it is (huh?). There's another problem: "human beings only going to X" is not up to us, and certainly not up to a single individual, to say. This strikes me as a nonstarter and frankly not even consistent with a free society. I get how it must feel different to you, but I'd humbly suggest you think it through a bit more, as I can't find any way to prepare people for stuff they don't know is coming and give them some sort of management training (that sounds terrible, frankly!) until we know what we've discovered. That there are idiots or simpletons who don't know what to do with quantum mechanics, or for that matter automobiles or even shovels, I would grant you. Should we stop making shovels? Making everyone an idiot by attempting to stop science sounds pretty bad. Thanks for your comment. This is how we sharpen ideas.
Hi Erik,
There are lots and lots of problems with the questions I ask above, agreed. There are lots and lots of problems with the status quo too. As you know, a single human being can now erase modern civilization in a few minutes. Should we keep racing in that direction?
The heart of my argument is that the status quo simply isn't sustainable. The greater the powers we give ourselves, and the faster they are delivered, the higher the chance that some "one bad day" event will crash the system and deliver us to game over.
To argue otherwise is essentially to claim that human beings possess unlimited ability, and can successfully manage any amount of power delivered at any rate. That is the assumption a "more is better" relationship with knowledge is built upon.
It's considered an obvious given that children have limited ability and thus the powers available to them should also be limited. But once the child reaches 18, we throw that common sense out the window and switch to the "more is better" paradigm. And so we find ourselves accumulating vast powers like nukes, AI, and genetic engineering that we really have no idea how to make safe.
Try this? My argument is that we should learn how to control and manage the pace of the knowledge explosion. The scientific community would have us not learn that.
Who is the Luddite? Who is against learning?
Thanks for the engagement!
I do understand what you’re saying, Phil. It’s such a big issue….
Thank you for the engagement! I love this. Thanks much for taking the time. I think you're saying something really important here.
Phil—
I would push back on your premise in a few ways.
First, we’re seeing an explosion of information (not all of it high quality) and technological applications, but not necessarily an explosion of knowledge. Being able to do CRISPR gene editing is a much less profound change in knowledge than discovering the structure of DNA.
The same thing is true of technologies. We’re able to get better, more sophisticated technologies in most categories, but if you compare the technological changes from 1970 to the present with those from 1900 to 1950, you’ll find that the former are far less profound.
I also think you start with the premise that science is intrinsically designed to produce advances in the capabilities of technologies. Science should help us sharpen our understanding of the world, but technological advancement is not the only outcome of humans better understanding the world around us. We also have to learn to grow up and live better lives than our parents. Technology is a side dish of science, not the main plate.
My view is that the kind of linear/functional science that has dominated the last century or two is, in fact, producing diminishing returns, and at some point it will become less an object of fascination and a source of wealth. The next intellectual revolution will be focused on humans better conceptualizing their place in the world, and while the new ideas will have to be technical, their effects will be more spiritual than technological.
If I can jump in: I've been trying to catch your angle, Jeffrey, in prior communication, and this really helps! You're looking at a very big-picture view of science and tech. Anyway.... just wanted to say that....
Hi Jeffrey, thanks for engaging.
I agree, technology is a by-product of science. But that doesn't really matter, imho.
So long as science is developing new knowledge, some of that knowledge will be converted into technology. Some of that tech will be used for good, and some for bad.
A key fact to recall is that as the scale of technology grows, the odds of disaster grow with it. For example, while nuclear energy is very useful, one bad day with nuclear weapons and it's game over.
We need to update our knowledge philosophy to include the realization that the equation is no longer "some good, and some bad, and it evens out". 99% of technology can be great, but if just one technology of massive scale slips out of control, the good stuff no longer matters. Nuclear weapons illustrate this principle in the simplest possible manner.
I agree that the accelerating pace of change may inspire more spirituality. That is indeed an interesting consideration. But again, that doesn't really matter. To illustrate: if almost everybody becomes "enlightened", almost everybody isn't enough in a world of existential-scale technology. The matter is not decided by the average level of morality throughout the population. We don't get a vote over what psychopaths like Putin may decide to do with their technology.
Thanks for adding your thoughts!
I don't think you're wrong to highlight the perils of technology. The biggest threat to humanity in the industrial age is our own capacity for self-destruction, and I spend plenty of time thinking about nuclear weapons, bio-mishaps, climate change, etc.
My view, however, is that we should frame these risks and problems in a way that looks forward, and doesn't seek to impose top-down constraints on knowledge or how knowledge is applied. I would argue that contemporary institutional constraints on research and thinking already impose a tremendous conformity that exacerbates the problems you're concerned with, even as they produce remarkable technologies. The changes we need for a better future are foundational, not regulatory, even as the risks of various catastrophes loom on the horizon.
Here's one place to start: we can observe the bifurcation of human knowledge into some sort of "faith" that deals with matters of the spirit, ethics, deep metaphysical questions, etc. and a "science" that focuses on functional and mathematical formulations that drive towards advancing technologies -- and decide that this is a strange way to organize our minds. We could take a different approach, where, for instance, "spiritual" questions are taken up technically, while technical pursuits are understood to have a spiritual, rather than a merely social valence. Such changes can move people collectively towards a different, generally less dangerous set of outcomes, where each individual still has to do his or her own work. There is no philosophy that provides a cheat code for confronting and defeating the darkness and ugliness in every human heart, but we can equip the average individual with better tools. I really think this is how people worried about these matters should be spending their time.
Hi again Jeffrey, thanks for the ongoing discussion.
I am looking forward. I'm looking forward to a day when humanity learns to control the awesome power of knowledge, just as we do today with electricity, water, land, space and other elements of the natural world. Controlling knowledge doesn't mean allowing it to explode in every direction as fast as it can.
I applaud the holistic nature of your thinking, and agree especially with your ideas of approaching spiritual questions technically. One of my current obsessions is the idea that space itself may contain some property related to intelligence.
https://www.tannytalk.com/p/intelligence-is-intelligence-a-property
You write, "There is no philosophy that provides a cheat code for confronting and defeating the darkness and ugliness in every human heart"
But it's not "every human heart" that is the primary concern, but instead a small fraction of humanity, violent men, who are responsible for the vast majority of the threat.
https://www.tannytalk.com/s/peace
Violent men have always been with us. But today, it's the marriage between violent men and an accelerating knowledge explosion which represents the biggest threat to the modern world.