Danger from Artificial Intelligence


Sam Harris is one of the most prominent critics of a careless and uncontrolled development of artificial intelligence - there are good reasons to assume that it carries considerable risk potential.

The critical point is not only reached once a machine knows me completely. The critical point is already reached when the machine knows me better than I know myself. (Yuval Noah Harari)

Collection of material: "Podcasts"

The following podcasts offer an excellent overview of this complex topic.

AI as the core topic

    • Sam Harris speaks with Eliezer Yudkowsky about the nature of intelligence, different types of AI, the “alignment problem,” IS vs OUGHT, the possibility that future AI might deceive us, the AI arms race, conscious AI, coordination problems, and other topics.
    • Sam Harris speaks with Max Tegmark about his new book Life 3.0: Being Human in the Age of Artificial Intelligence. They talk about the nature of intelligence, the risks of superhuman AI, a nonbiological definition of life, the substrate independence of minds, the relevance and irrelevance of consciousness for the future of AI, near-term breakthroughs in AI, and other topics.
    • Sam Harris speaks with computer scientist Stuart Russell about the challenge of building artificial intelligence that is compatible with human well-being.
    • Sam Harris speaks with Kevin Kelly about why it’s so hard to predict future technology, the nature of intelligence, the “singularity,” artificial consciousness, and other topics.
    • Sam Harris speaks with Kate Darling about the ethical concerns surrounding our increasing use of robots and other autonomous systems.
    • Sam Harris speaks with MIT cosmologist Max Tegmark about the foundations of science, our current understanding of the universe, and the risks of future breakthroughs in artificial intelligence.

AI as a side topic

    • Sam Harris speaks with Thomas Metzinger about the scientific and experiential understanding of consciousness. They also talk about the significance of WWII for the history of ideas, the role of intuition in science, the ethics of building conscious AI, the self as a hallucination, how we identify with our thoughts, attention as the root of the feeling of self, the place of Eastern philosophy in Western science, and the limitations of secular humanism.
    • Sam Harris speaks with Anil Seth about the scientific study of consciousness, where consciousness emerges in nature, levels of consciousness, perception as a “controlled hallucination,” emotion, the experience of “pure consciousness,” consciousness as “integrated information,” measures of “brain complexity,” psychedelics, different aspects of the “self,” conscious AI, and many other topics.
    • Sam Harris speaks with Robert Sapolsky about his work with baboons, the opposition between reason and emotion, doubt, the evolution of the brain, the civilizing role of the frontal cortex, the illusion of free will, justice and vengeance, brain-machine interface, religion, drugs, and other topics.
    • Sam Harris speaks with Zeynep Tufekci about “surveillance capitalism,” the Trump campaign’s use of Facebook, AI-enabled marketing, the health of the press, Wikileaks, ransomware attacks, and other topics.
    • Sam Harris speaks with Tristan Harris about the arms race for human attention, the ethics of persuasion, the consequences of having an ad-based economy, the dynamics of regret, and other topics.
    • Sam Harris talks to biologist David Krakauer about information, complex systems, and the future of humanity.
    • Sam Harris speaks with physicist David Deutsch about the foundations of knowledge, the moral landscape, possible futures for conscious beings, and other topics.
    • Sam Harris talks to physicist David Deutsch about the reach and power of human knowledge, the future of artificial intelligence, and the survival of civilization.

Further pages on the topic "Künstliche Intelligenz"

gefahr_durch_kuenstliche_intelligenz.txt · Last modified: 2019/03/05 16:01 by