In June 2023 I wrote:
«Will consumers perhaps come to see the phrase "AI-Powered System" in the same light as "Diesel-Powered SUV"?»
Well, not yet, it would seem.
In The Elements of AI Ethics from June of last year, I built on The Elements of Digital Ethics from 2021, which itself was the output of reading about digital harms for many years.
Seeing all of the categories of harms just get worse year on year is disheartening.
What goal is worth all this? I tend to fall back on a sentiment I use in my talks and teaching:
The more a privileged group benefits from a technology, the more inclined they will be to ignore the harms done unto others by that same technology, because drawing attention to the harm would suggest they should give up their personal gain to help someone else.
This appears to be true in the short term. In the long term, the beneficiaries of technology will happily ignore harm done unto themselves as well, as long as they get the experience boost in the moment.
What hope is there?
In my June 11 session for Ambition Empower, I will be talking about how to champion technologies of compassion, drawing on work related to nature connectedness by P. Wesley Schultz, Marianne E. Krasny, F. Stephan Mayer and Cynthia M. Frantz.
Technologies of compassion work in unison with an acknowledgement of our connection not only to each other but also to nature. Technology tends to separate us from nature, making us value it less, and causing us, over time, to worsen our own living conditions and those of all other species.
But we can choose to design technology that takes nature into account. Technology that works with, not against, nature. I believe this is what all schools must start teaching. Now.
Expect me to write more about this over the next year.
https://per.ax/aie
#solarpunk