Thread: GEEK OUT; For the Geek in All of Us


    Finally, we need a word or two about opiates. The modern condemnation of religion has followed the Marxian rebuke that religion is an opiate administered indirectly by state power in order to secure a docile populace — one that accepts poverty and political powerlessness, in hopes of posthumous supernatural rewards. “Religion is the sigh of the oppressed creature,” Marx claimed, “the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.”

    Marx, Mao and even Malcolm X leveled this critique against traditional religion, and the critique lives on as a disdainful last insult to be hurled at the believer. I hurled it myself many times, thinking that it was a decisive weapon. In recent years, however, I’ve changed my mind about this criticism.

    First, religion is energizing as often as it is anesthetizing. As often as it numbs or sedates, religion also riles up and invigorates the believer. This animating quality of religion can make it more hazardous to the state than it is tranquilizing, and it also inspires a lot of altruistic philanthropy.

    Second, what’s so bad about pain relief anyway? If my view of religion is primarily therapeutic, I can hardly despair when some of that therapy takes the form of palliative pain management. If atheists think it’s enough to dismiss the believer on the grounds that he should never buffer the pains of life, then I’ll assume the atheist has no recourse to any pain management in his own life. In which case, I envy his remarkably good fortune.

    For the rest of us, there is aspirin, alcohol, religion, hobbies, work, love, friendship. After all, opioids — like endorphins — are innate chemical ingredients in the human brain and body, and they evolved, in part, to occasionally relieve the organism from misery. To quote the well-known phrase by the German humorist Wilhelm Busch, “He who has cares has brandy, too.”

    We need a more clear-eyed appreciation of the role of cultural analgesics. It is not enough to dismiss religion on the grounds of some puritanical moral judgment about the weakness of the devotee. Religion is the most powerful cultural response to the universal emotional life that connects us all.

Stephen Asma is a professor of philosophy at Columbia College Chicago and the author of the forthcoming "Why We Need Religion."

From The New York Times online:

    Rebooting the Ethical Soldier

    By Robert H. Latiff

    Mr. Latiff is a retired Air Force major general.

    July 16, 2018

    In 2014, the United States Army Research Laboratory published a report predicting what the battlefield of 2050 would look like. Not surprisingly, it was a scenario largely driven by technology, and the report described a sort of warfare most people associate with video games or science-fiction movies — combined forces of augmented or enhanced humans, robots operating in swarms, laser weapons, intelligence systems and cyberbots fighting in a highly contested information environment using spoofing, hacking, misinformation and other means of electronic warfare.

In one sense, this is nothing new. The way wars are fought has always changed with technology. But humans themselves don’t change so rapidly. As a retired Air Force major general with special interests in both technology and military ethics, I have a specific concern: that as new weapons technologies make soldiering more lethal, our soldiers will find it more difficult than ever to behave ethically and to abide by the long-established conventions regarding the rules of war.

As the nation with the most powerful military on earth, we are obligated to acknowledge these fundamental changes in the experience of soldiering, to confront them and to prepare to adapt.

    On battlefields of the past, killing was intentional and intensely personal. In the future, the automated nature of combat, the artificial enhancement of soldiers and the speed and distances involved will threaten to undermine the warrior ethos. This applies not only to conduct between opposing soldiers, but among those fighting on the same side as well. In the past, soldiers took risks for one another. It is not at all clear how the introduction of autonomous machines will change that dynamic. Would a soldier be willing to entrust his life to a machine? Would a fellow soldier put his life or the lives of other soldiers in danger to save an important robot? Should he?

    Many soldiers will no longer have to fear close contact and hand-to-hand combat because they will be able to deploy robots and unmanned vehicles at great ranges. However, fear acts as a modulator of behavior, and by reducing it we will likely also remove constraints on unethical behavior. If a soldier cannot see, hear and understand the context of a battlefield or a particular engagement, he is less likely to concern himself with decisions requiring such nuance.

    Perhaps a machine can make a utilitarian calculation on proportionality of force, but can it make an empathetic decision? Would it be able to sense an enemy’s wavering determination, for instance, and call off an attack to prevent unnecessary loss of life? Commanders and troops on the ground, in contact with an enemy, have a “feel” for the complexities of the battlefield that cannot be reproduced by a machine.

    Computers, artificial intelligence, robots and autonomous systems will create an environment too complex and fast for humans to keep up with, much less control.

    Increasingly, warfare will exceed the capacity of the human senses to collect and process data. Department of Defense policy on autonomy in weapons systems says a human will always be part of the decision-making for lethal weapons, but the Pentagon nonetheless spent $2.5 billion in 2017 on artificial intelligence, and every military service is funding work in autonomous systems. In 2019, the Defense Department expects to spend $9.4 billion on unmanned systems, with over $800 million for autonomy, human-machine teaming and swarm research.

    Gradually, perhaps imperceptibly, automated systems will function so much more efficiently that humans will become mere bystanders. The soldier will become the slowest element in an engagement, or will simply become irrelevant. Adherence to the rules of war will become less relevant as well.

A separate set of ethical questions is raised by the technologies of human “enhancement” and augmentation, which include improving physical strength, stamina and pain tolerance, as well as using neurological implants and stimulation to restore brain function and enhance learning.

    Can soldiers under the influence of behavior-modifying drugs or electronics be held to account for their actions? If the soldier is using drugs to enhance his cognition or reduce his fear, what is the role of free will? Might a soldier who fears nothing unnecessarily place himself, his unit or innocent bystanders at risk? What about the impact of memory-altering drugs on the soldier’s sense of guilt, which might be important in decisions about unnecessary and superfluous suffering?

These are important decisions in war, and they form the basis for many of the tenets of “just war” theory. Gen. Paul Selva, the vice chairman of the Joint Chiefs of Staff, supports “keeping the ethical rules of war in place lest we unleash on humanity a set of robots that we don’t know how to control.”

    The role of revising and recasting these conventions should be taking place at the highest levels of government. So far, it hasn’t. The White House’s Select Committee on Artificial Intelligence, formed in May, has not even acknowledged the major ethical issues surrounding A.I. that have been very publicly raised by an increasing number of scientists and technology experts like Elon Musk, Bill Gates and Stephen Hawking.

While it is important that leaders openly recognize the critical nature of these issues, the Department of Defense needs to follow up on its 2012 directive on autonomy with guidelines for researchers and commanders. It should require that both researchers and military commanders question — throughout the development process and long before the systems are ready for deployment — how the systems will be used and whether that use might violate any laws of armed conflict or international humanitarian law.

    Historically, the United States has led the world in technology development, and thus our use of questionable weapons or methods will be well noted by others. Sadly, in the period after the Sept. 11 attacks, the United States resorted to torture of enemy detainees. While most senior leaders have denounced the practice, the fact remains that the nation crossed an important moral threshold. Knowing that, future enemies — even civilized ones — may be less inhibited in employing the same methods against us. The same will be true for advanced technologies.

    With new warfare technologies, we now have an opportunity to again demonstrate our leadership in human rights by ensuring that our young soldiers know how to use the new weapons in ethical and humane ways.

    Robert H. Latiff is a retired Air Force major general and the author of “Future War: Preparing for the New Global Battlefield.”
