ROME — Even though modern tools and technologies are hardly human, the Pontifical Academy for Life is zeroing in on the world of robots and machines powered by artificial intelligence.
While the academy’s focus is on the protection of human life and dignity, the rapidly shifting and radical capabilities of robotics are having an ever-increasing impact on human lives, people’s relationships, communities and creation, said Jesuit Father Carlo Casalone, an academy member and consultant.
The need to reflect on the effects, opportunities and risks posed by artificial intelligence and robotics has led the pro-life academy to launch a special look at this complex field, adding robotics to its list of specialized projects, which already include palliative care, neuroscience, bioethics and human genome editing.
A major workshop on “Robo-ethics: Humans, Machines and Health” will be held at the Vatican Feb. 25-26 as part of this increased study; the workshop will focus on the use of robots and artificial intelligence, specifically in medicine and health care.
The use of industrial and personal-service robots is on the rise, according to industry reports. They are being used in manufacturing, housekeeping, assisting with surgery and even caring for the elderly. People with reduced mobility can be assisted with brain control technology, which converts brain waves into digital signals that can command or control external devices, such as artificial limbs or machines.
Casalone, who studied medicine and worked as a cardiologist before joining the Society of Jesus in 1984, helped organize the workshop. He became a member of the pontifical academy in 2017 and works in its scientific section.
He told Catholic News Service in December that the workshop will bring together ethicists, health care workers and researchers, including Hiroshi Ishiguro, a Japanese robotics engineer who creates humanoid robots and promotes discussion about the essence of being human. His lab has developed the interactive “Actroid,” a lifelike humanoid robot that can operate autonomously or be teleoperated, and has created an uncanny replica of Ishiguro himself known as the “Geminoid.”
Casalone said the academy wanted the workshop to include experts like Ishiguro who could explain “what sort of vision” guides their work, so that members could “truly listen to what is going on in today’s world and to engage with this historic moment in time.”
“We are seeking to be fully aware of what’s happening so that we know what is possible” in the rapidly advancing world of “cognitive machines,” he said, and to highlight the ethical, social, cultural and economic impact these tools may have.
For example, cheaper automated machine labor may threaten emerging economies, and mineral-rich African nations often see their resources extracted and exported without receiving the benefits, in what has become a new “robot divide,” Casalone said.
Using robots for military applications can be “very dangerous and very deceptive” if nations use such machines to cover up their responsibility and destroy others “behind the scenes,” he said. Automated systems also can lead to “a sort of gaming mentality” when soldiers can control weaponry remotely, far away from its effects.
Home automation or “domotics” — such as security systems or robot vacuum cleaners — also presents certain risks, he said, if “houses begin to be built in a way that makes them more robot-friendly, more suitable for machines than for humans.”
And the use of robots in assisting the elderly or infirm, while it “could be of great help,” could also “risk triggering an attitude of delegating” the care of the most fragile and vulnerable in society “as if it were a task to be entrusted to machines” and not to fellow human beings, he said.
Similar problems may “also apply to the natural world,” he said, for example, when using robots for farming and livestock “changes our relationship with animals” and nature.
Casalone said the answer isn’t a stance against technology but “guiding development so that it respects human dignity and the common good as much as possible.”
“It is about becoming aware of and agreeing about regulating these radically new possibilities we have before us, which are able to increasingly and more deeply affect living beings and the human body,” he said.
The two-day workshop in February will not be proposing specific guidelines, he said, but rather will lay the groundwork for drawing up “some criteria, given what is at play with the emergence of these cognitive systems in our lives.”
The radical and pervasive impact today’s new technologies will have on human beings and their relations “demands greater oversight,” public discussion and concern not just among experts or special interest groups, but by everyone, he said.
Throughout history, science and technology have invented or developed new capabilities that have taken the world by surprise and “transformed our lives,” Casalone said. “So, we have to expect something new” will always be around the corner and be ready to respond.
Current controls on “the atomic bomb and its destructive potential,” he said, show how human beings are capable of not using every new technology, “which means there are options for guiding development” so that it can better respect human life.
“This, in every case, is what we are committed to,” he said.