AI Can Now Self-Reproduce - Should Humans Be Worried (YouTube Content)

{{InfoboxAppearance
|title=AI Can Now Self-Reproduce—Should Humans Be Worried?
|image=[[File:AI_Self-Reproduce_Cover.jpg]]
|host=
|guests=[[Eric Weinstein]]
|length=00:05:30
|releasedate=22 May 2017
|youtubedate=
|customlabel1=
|customdata1=
|customlabel2=
|customdata2=
|customlabel3=
|customdata3=
|customlabel4=
|customdata4=
|link1title=YouTube
|link1=[https://www.youtube.com/watch?v=Wu8s0tp9yzY Watch]
|link2title=Portal Blog
|link2=[https://theportal.group/ai-can-now-self-reproduce-should-humans-be-worried/ Read]
|link3title=
|link3=
|link4title=
|link4=
|prev=How mental sandboxes can help humanity reach new heights (YouTube Content)
|next=Capitalism 2.0 Will Include a Healthy Dose of Socialism (YouTube Content)
}}


{{Stub}}


'''AI Can Now Self-Reproduce—Should Humans Be Worried?''' was a video with [[Eric Weinstein]] on Big Think.


{{#widget:YouTube|id=Wu8s0tp9yzY}}


== Description ==
Those among us who fear world domination at the metallic hands of super-intelligent AI have gotten a few steps ahead of themselves. We might actually be outsmarted first by fairly dumb AI, says Eric Weinstein. Humans rarely create products with a reproductive system; you never have to worry about waking up one morning to see that your car has spawned a new car in the driveway (and if it did: cha-ching!). Artificial intelligence, however, has the capability to respond to selective pressures, to self-replicate, and to spawn daughter programs that we may not easily be able to terminate. Furthermore, there are examples in nature of organisms without brains parasitizing more complex and intelligent organisms, like the mirror orchid. Rather than spend its energy producing costly nectar as a lure, it merely fools the bee into mating with its lower petal through pattern imitation: the orchid hijacks the bee's brain to serve its own agenda. Weinstein believes that all the elements necessary for AI programs to parasitize humans and have us serve their needs already exist, and although it may be a "crazy-sounding future problem which no humans have ever encountered," he thinks it would be wise to devote energy to these possibilities that are not as often in the limelight.


Eric Weinstein is a mathematician, economist, and managing director of Thiel Capital. In a recent interview with Rebel Wisdom, Weinstein spoke about the origins of the Intellectual Dark Web and his theory of how our institutions are plagued by an "embedded growth obligation." Disagreeable people, Weinstein says, could help institutions correct themselves.
 
ERIC WEINSTEIN:
 
Eric Weinstein is an American mathematician and economist. He earned his Ph.D. in mathematical physics from Harvard University in 1992, is a research fellow at the Mathematical Institute of Oxford University, and is a managing director of Thiel Capital in San Francisco. He has published works and is an expert speaker on a range of topics including economics, immigration, elite labor, mitigating financial risk, and incentivizing creative risks in the hard sciences.


== Transcript ==
