I agree with what you said. The only thing I want to point out is with your statement:
(+ its better for the environment)
Running models locally doesn’t necessarily mean it’s better for the environment. The hardware in cloud data centers is usually far more efficient at running intensive workloads like LLM inference than your average home setup.
You would have to factor in whether your electricity provider uses green energy (or whether you have solar), and likewise whether the model is running in a green data center (or with a company that uses sustainable data centers).
That being said, in line with what you stated before, given the sensitive nature of the conversations this individual will be having with the LLM, a locally run option (or at least renting a server from a green data center) is definitely the recommended choice.