Hi Jakub,
Good news: it is very possible.
The only catch is that you will be using less capable models when offline. The biggest and most capable models we can access today are referred to as frontier models. Examples are OpenAI's GPT series (e.g. GPT-4o), Google's Gemini series, and Anthropic's Claude series.
You know these models, so it will make sense to you that they are very capable: they have been trained on massive amounts of data, and they require significant resources to run. You couldn't reasonably replicate the infrastructure required to run GPT-4 offline, so even if you could copy GPT-4 to your offline setup, it wouldn't run.
And even if you did have the hardware, you still couldn't get GPT-4 running, because it is a closed-source model.
We interface with these frontier models by 'talking' to them and having them respond, much like phoning a friend. So it comes down to this: if you can't reach the model, you get nothing. Obvious, really.
Okay, so that's all the bad news: totally offline means no frontier models.
Now, remember the idea of taking GPT-4 home? Well, there are open-source models you can take home, e.g. DeepSeek V3 and R1, Meta's Llama series, and more. The full-size versions of those are also considered frontier models; what you run at home will be a scaled-down version of them.
I think the easiest way for you to get started would be to download a product like LM Studio ( https://lmstudio.ai/ ) and play with it.
It's straightforward and will help you clarify your picture.
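Once LM Studio is running with a model loaded, it can serve that model over a local OpenAI-compatible HTTP API (by default on localhost port 1234), so you can script against it with no internet connection at all. Here's a rough sketch in plain Python; the model name and prompt are just placeholders, and the port assumes LM Studio's default local server settings.

```python
# Sketch: talk to a model running locally in LM Studio via its
# OpenAI-compatible local server (default: http://localhost:1234/v1).
# Assumptions: the local server is started in LM Studio and a model
# is loaded; "local-model" is a placeholder name.
import json
import urllib.request


def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local_model(prompt, base_url="http://localhost:1234/v1"):
    """POST the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_local_model("Summarise what a frontier model is in one line."))
```

Nothing in that script ever leaves your machine, which is the whole point of the offline setup.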
And in my opinion, what I'm getting from these open-source models covers roughly 80% of small-business use cases.
So yes, you can use your AI offline, and I think it's pretty damn good that way, given the FREE price tag.
Start with LM Studio, and good luck!