Latest Breaking News
In reply to the discussion: Elon Musk warns against unleashing artificial intelligence 'demon'
bananas (27,509 posts)
8. A number of organizations are looking at global existential risks
The Bulletin of the Atomic Scientists' Doomsday Clock isn't just about nuclear war:
http://thebulletin.org/overview
The Doomsday Clock is an internationally recognized design that conveys how close we are to destroying our civilization with dangerous technologies of our own making. First and foremost among these are nuclear weapons, but the dangers include climate-changing technologies, emerging biotechnologies, and cybertechnology that could inflict irrevocable harm, whether by intention, miscalculation, or by accident, to our way of life and to the planet.
Nassim Taleb, famous for "The Black Swan", is creating an institute at NYU:
http://nassimtaleb.org/tag/extreme-risk-institute/
Extreme Risk Institute
Nassim Taleb is starting the new academic year with a new role. Along with Charles Tapiero, Taleb will be co-director of the EXTREME RISK INITIATIVE, which is expected to develop into an Extreme Risk Institute within the NYU School of Engineering. Here is the official description from his Facebook Page:
In spite of the importance of extreme/hidden risks, there has not been a rigorous methodology to deal with them; statistical or mathematical approaches have not been formally reconciled with real-world decision-making the way engineering has traditionally integrated mathematics and real world heuristics. Extreme risks require both more mathematical and more practical rigor.
The Extreme Risks Initiative, ERI, is an NYU School of Engineering interdisciplinary open research agenda, based on research axes defined by its members and global research collaborations. Its approaches are at the intersection of the technical and the practical, based on a rigorous merger of theory and practice across interdisciplinary lines. These may include financial and economic engineering, urban risk engineering, transportation networks, bio-systems, as well as global and environmental problems. A selected series of research axes, as well as publications drawing on members' initiatives, are included in the ERI working paper series and its current research enterprises.
Martin Rees and others created the Centre for the Study of Existential Risk at Cambridge:
http://en.wikipedia.org/wiki/Centre_for_the_Study_of_Existential_Risk
The Centre for the Study of Existential Risk (CSER) is a research centre at the University of Cambridge, intended to study possible extinction-level threats posed by present or future technology. The co-founders of the centre are Huw Price (a philosophy professor at Cambridge), Martin Rees (a cosmologist, astrophysicist, and former President of the Royal Society) and Jaan Tallinn (a computer programmer and co-founder of Skype).[1] According to its website, CSER's advisors include philosopher Peter Singer, computer scientist Stuart J. Russell, statistician David Spiegelhalter, and cosmologists Stephen Hawking and Max Tegmark.[2] According to its website, its "goal is to steer a small fraction of Cambridge's great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."[2][3]
Their website:
http://cser.org/
Safeguarding our passage through the 21st Century
The Centre for Study of Existential Risk is an interdisciplinary research centre focused on the study of human extinction-level risks that may emerge from technological advances. We aim to combine key insights from the best minds across disciplines to tackle the greatest challenge of the coming century: safely harnessing our rapidly-developing technological power.
51 replies
#2 (bananas, Oct 2014): Tesla boss Elon Musk warns artificial intelligence development is 'summoning the demon'
#42 (24601, Oct 2014): You need to catch up. Samaritan, the 2nd machine - the one that doesn't care - has access to all the
#45 (24601, Oct 2014): Vigilance surfaces from time - they are dupes of Samaritan/Decima. Root & I are destined to be
#11 (cstanleytech, Oct 2014): Ya know what if he wants a real concern he should look at the job we humans have done.
#43 (24601, Oct 2014): Underestimating your adversary is a quick & sure way to lose. Ask John Kerry how that feels.
#13 (rhett o rick, Oct 2014): His worry might be moot. The race is on. Will AI take over (computer singularity) before human-
#19 (tclambert, Oct 2014): When we studied machine intelligence back in college, we encountered many unknowns.
#22 (MisterP, Oct 2014): dead-hand automatic-response missile systems have been an issue since, what, the 50s?
#30 (JVS, Oct 2014): Oh, so when it comes to him selling cars he wants the government to butt out, but when others...
#33 (Kablooie, Oct 2014): That combined with the artificial stupidity demon of Fox News would end us all.