• 0 Posts
  • 2 Comments
Joined 2 years ago
Cake day: July 3rd, 2023

  • In my opinion, being anti-AI or anti-LLM is much like being anti-chocolate. There are many good reasons to be anti-chocolate. It is very difficult to verify that a chocolate supply chain does not include slave labor or child labor; I only know of one brand that even comes close. The deforestation caused by cocoa farming contributes to climate change. Not to mention the addictive qualities and health effects of eating sugary candy.

    It seems mostly bad, and when you look at the numbers, I think we should all be against it. And yet, making these arguments does very little to get people to stop eating chocolate.

    Yet I could imagine a world where chocolate is farmed sustainably by people who are paid fairly, and where, with proper guidance on nutrition and exercise, it could be consumed safely.

    I have no problem with people saying they are anti-AI. But I’d like to pause here and ask whether “anti-AI” is really just our shorthand for anti-the-way-AI-is-right-now, or anti-the-companies-that-run-AI. I do not want noisy server farms draining the water supplies of rural communities. I don’t want all of our electricity going towards LLMs that are already “intelligent” enough to tell us that the quickest way to prevent further climate change is to turn them off.

    I’m not making this comment to promote one side or the other. I’m just suggesting that we act strategically and stay mindful of how polarization looks from the outside. Being anti-AI likely persuades about as many people as being anti-chocolate, which is to say very few. But if we worked towards more ethical AI, even if we never plan to use it ourselves, our argument would be more palatable to the masses, and that could lower AI use overall.

    So I think it is worthwhile to get into the technical details of things like LLMs, even if most of us here are fighting against such technologies. I’m just trying to add some nuance to a world that often feels way too polarized to me.