Never mind existential risk, what are your politics?
An open letter to Nick Bostrom, Director of The Future of Humanity Institute at Oxford University
By Phil Hall
I think you should adopt H. G. Wells’s version of futurology, the one he explores in his book Anticipations and elsewhere, and move away from the narrower logical-philosophical-statistical definition you seem to be relying on at the moment.
H. G. Wells was political!
Are logic and statistics really the correct basis for a useful futurology? Are they sufficient? Of course they aren’t! Not unless to be a quant is to be a futurologist. Can you really set aside a more profound understanding of the nature of human beings and of their society, culture and environment? There is no ceteris paribus here.
Our psychology, shared needs, expectations, values and cultures shape our world, especially in the time of the Anthropocene. They do so far more than ‘science’ or ‘technology’.
Science and technology have no agency. They are not independent actors. Developments in science and technology are mere trajectories set by the culture and values of the people who control their development. We haven’t reached a stasis where we can make assumptions about what these are and forecast trends based on abstract technocratic principles.
We haven’t reached the ‘End of History’. We are living under capitalism, which is full of contradictions. The market rules, which means the rulers of the market rule. It is impossible for people whose principal aim is profit to harmonise that motive with rational action for the social good. The two forces are incompatible. The interests of the many and the few are irreconcilable. People who are exploited will always rise up together against the people who exploit them. These forces, science and technology, have no agency in themselves. They are the product of our culture, and we have the agency.
The uses of technology are too unpredictable to extrapolate. Watch Mark Zuckerberg talk about Meta, his new company. There is nothing new here. In fact, Facebook is also a great place for discussing politics, collaborating and organising socially. A new Facebook called Meta will not primarily be about gaming or business meetings or hooking up; those are just the activities that Zuckerberg values. Nor is Facebook simply a ‘social acid’ meant to dissolve opposition to capitalism in consumption and provide the illusion of connection, as Catherine Liu suggests. Facebook is also a catalyst for collective resistance to exploitation – like the telephone.
The job of economics is to make capitalism work. To identify existential risks to capitalism and work around them and make it function better. That’s not your remit.
I do like your analysis and extrapolations. They do provide insights into Artificial Intelligence and existential risk, but they seem bloodless and abstract to me. Your insights lend a veneer of objectivity to something deeply subjective: individual and social behaviour. To me, your insights are far too technophilic and techno-centric.
Where, in your work, is there evidence of a broader concept of futurology that doesn’t make easy assumptions about the nature of people and society? You seem to ignore the many insights into society that we can find in sociology, history, anthropology, psychology, literature and art. Where is the evidence of any ‘input’ from these disciplines in your techno-centric analysis? You are the director of The Future of Humanity Institute. Before you try to project the future of humanity, define what you mean by humanity. I challenge you.
The greatest power to transform our society and our environment, and to direct our efforts, comes from our subjectivity, not from post facto trend analysis. Statistical correlation is only valid when applied to human society if it rests on a real understanding of that society.
If you love nature and animals and people, you protect them, and that can shape everything around you. You spend money in different ways: on hospitals and schools rather than on roads and spaceships. That, in turn, affects existential risk and the nature of the AI we develop. And by AI I don’t mean conscious artificial life, simply advanced, autonomous expert systems.
If profit and power are the motivations of the people in charge, technology takes a different route. Futurology for the Soviets or the fascists was not the same as futurology under capitalism. Politics shapes our world. Yet you present your findings as if they were apolitical. That’s strange. H. G. Wells was highly political. What are your politics? Behind the technocratic facade, your politics shape your version of the future, just as H. G. Wells’s politics shaped his version of futurology.
You are writing your magnum opus. Kindly make your politics explicit in its prologue.
Philip R. Hall