NATIONAL BUREAU OF ECONOMIC RESEARCH

Self-Regulating Artificial General Intelligence

Joshua S. Gans

NBER Working Paper No. 24352
Issued in February 2018
NBER Program(s):The Productivity, Innovation, and Entrepreneurship Program

This paper examines the paperclip apocalypse concern for artificial general intelligence. This concern arises when a superintelligent AI with a simple goal (e.g., producing paperclips) accumulates power so that all resources are devoted to that goal and are unavailable for any other use. Conditions are provided under which a paperclip apocalypse can arise, but the model also shows that, under certain architectures for recursive self-improvement of AIs, a paperclip AI may refrain from allowing power capabilities to be developed. The reason is that such developments pose the same control problem for the AI as they do for humans (over AIs) and hence threaten to deprive it of resources for its primary goal.


Document Object Identifier (DOI): 10.3386/w24352


National Bureau of Economic Research, 1050 Massachusetts Ave., Cambridge, MA 02138; 617-868-3900; email: info@nber.org