Book details
How society can shape individual actions in times of uncertainty.
When we make decisions, our thinking is informed by societal norms, “guardrails” that guide our decisions, like the laws and rules that govern us. But what are good guardrails in today’s world of overwhelming information flows and increasingly powerful technologies, such as artificial intelligence? Based on the latest insights from the cognitive sciences, economics, and public policy, "Guardrails" offers a novel approach to shaping decisions by embracing human agency in its social context.

In this visionary book, Urs Gasser and Viktor Mayer-Schönberger show how the quick embrace of technological solutions can lead to results we don’t always want, and they explain how society itself can provide guardrails more suited to the digital age, ones that empower individual choice while accounting for the social good, encourage flexibility in the face of changing circumstances, and ultimately help us to make better decisions as we tackle the most daunting problems of our times, such as global injustice and climate change.

Whether we change jobs, buy a house, or quit smoking, thousands of decisions large and small shape our daily lives. Decisions drive our economies, seal the fate of democracies, create war or peace, and affect the well-being of our planet. "Guardrails" challenges the notion that technology should step in where our own decision making fails, laying out a surprisingly human-centered set of principles that can create new spaces for better decisions and a more equitable and prosperous society.
- Binding: Hardcover
- Author(s): Gasser, Urs / Mayer-Schönberger, Viktor
- ISBN13: 9780691150680
- ISBN10: 0691150680
- Pages: 232
- Publication year: 2024
- Language: English
Guardrails: Empowering Human Decisions in the Age of AI
- Urs Gasser, Viktor Mayer-Schönberger
- Publisher: PRINCETON
- ISBN: 9780691150680