I accept the risks but reject the analogy to nuclear weapons and virology. That analogy is dangerous because it leads people to think that controlling specific atoms (as we do with fissile material and pathogens) will stop matrix multiplications, or that stopping matrix multiplications de-risks AI.
https://nitter.moomoo.me/tegmark/status/1663509691161346049#m
https://nitter.moomoo.me/gfodor/status/1663537442333536256#m