In Module 2 you learned to talk with AI, but have you ever wondered where it “lives” when you use it? It’s like choosing between a hotel (everything ready, but you follow their rules) and your home (more work, but total control). When you use ChatGPT or Claude, your data travels to distant servers. When you use local models, everything stays on your PC. Each has pros and cons: privacy, costs, speed, control. Understanding the difference helps you choose the right tool for each project. And no, there’s no one right answer for everyone!
🧠 Don’t worry if these concepts seem complex; we’ll explain them carefully!
Imagine having to prepare an important dinner. You have two options:
You at computer → Your message → Internet →
OpenAI/Anthropic servers → Processing →
Response → Internet → Back to you
It’s like: Calling a brilliant friend on the other side of the world. He knows everything but you always have to call him.
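To make that round trip concrete, here is a minimal sketch (in Python, using the popular `requests` library) of what an app does when it talks to a cloud AI. It assumes you have an OpenAI account and an API key saved in the `OPENAI_API_KEY` environment variable; the model name is just an example and will change over time.

```python
# Minimal sketch of a cloud AI call: your text leaves your PC,
# is processed on OpenAI's servers, and the answer travels back.
import os
import requests

response = requests.post(
    "https://api.openai.com/v1/chat/completions",   # remote servers, not your PC
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",                      # example model name
        "messages": [{"role": "user", "content": "Explain cloud vs local AI in one sentence."}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])   # the reply comes back over the internet
```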
You at computer → Your message →
Your processor/GPU → Processing →
Response → On screen
Works even without an internet connection
It’s like: Having that brilliant friend living with you. Always available, no calls needed. BUT… he eats up all your resources, complains if the PC is slow, and isn’t always as good as his cloud cousin!
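For comparison, here is the same request sent to a model running on your own machine. This is a sketch under assumptions: it supposes you have installed a local runner such as Ollama, already downloaded a small model (here `mistral` as an example), and that it is listening on its default local port; check the documentation of whatever tool you actually use.

```python
# Minimal sketch of a local AI call: the request never leaves your computer.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",   # your own machine, works offline
    json={
        "model": "mistral",                   # example of a small open model
        "prompt": "Explain cloud vs local AI in one sentence.",
        "stream": False,                      # one complete answer instead of a token stream
    },
    timeout=300,                              # local models can be slow on modest hardware
)
print(response.json()["response"])            # produced entirely by your own CPU/GPU
```

Notice how similar the two snippets are: the code barely changes; what changes is where the processing (and your data) ends up.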
Some models run directly in the browser! Like having a mini-chef cooking with the office microwave: perfect for testing or light devices. But don’t expect a fancy dinner from that kitchen.
When you use ChatGPT, Claude, Gemini, DeepSeek online:
It’s like: Discussing secret business in a restaurant. Probably nobody’s listening, but…
When you use local models:
It’s like: Talking to yourself in the kitchen. Unless your house is bugged, you’re safe!
⚠️ Warning: Local doesn’t mean invisible! For example, if you work on a shared computer, others might read conversations saved in temporary files. Learn about encryption if you need real privacy.
Here we provide indicative costs reflecting the market at the time of writing. Always compare market offers to find the best option for you.
ChatGPT Plus: ~$20/month
Claude Pro: ~$20/month
API usage: ~$0.001 per 1,000 tokens (just an example; it varies a lot by provider and model; see the quick estimate below)
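If token prices are hard to picture, here is a tiny back-of-the-envelope estimate in Python. The usage numbers are invented just to show the arithmetic; plug in your own.

```python
# Rough monthly cost estimate for pay-per-token API usage (illustrative numbers only).
price_per_1k_tokens = 0.001   # dollars, the example rate above
tokens_per_request = 1_500    # prompt + answer for a typical short exchange (assumption)
requests_per_day = 50         # assumption

monthly_tokens = tokens_per_request * requests_per_day * 30
monthly_cost = monthly_tokens / 1_000 * price_per_1k_tokens
print(f"~{monthly_tokens:,} tokens/month, about ${monthly_cost:.2f}")  # ~2,250,000 tokens, about $2.25
```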
| Pros | Cons |
|---|---|
| Predictable cost | You pay forever |
| Always have latest version | Usage limits on free plans |
| No hardware investment | If prices increase… |
Decent GPU: $500-3000+
Electricity: $10-50/month extra
Setup time: 3-10 hours (first time)
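To compare the one-time hardware cost with recurring cloud bills, here is a rough break-even sketch. Every number is an assumption picked from the ranges above, so treat it as a template, not a verdict.

```python
# Back-of-the-envelope break-even: when does buying a GPU pay for itself?
gpu_cost = 1_500           # one-time, mid-range of the $500-3000+ estimate
electricity_month = 25     # extra dollars/month, middle of the $10-50 range
cloud_spend_month = 60     # assumption: subscriptions + API usage for an intensive user

monthly_saving = cloud_spend_month - electricity_month    # 35 dollars/month in this scenario
months_to_break_even = gpu_cost / monthly_saving
print(f"Break-even after about {months_to_break_even:.0f} months")   # about 43 months
```

If your cloud spending is only a single $20 subscription, the extra electricity alone may eat the savings: local really pays off with intensive use, exactly as the comparison table further down suggests.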
| Pros | Cons |
|---|---|
| Unlimited use | High initial investment |
| No recurring costs | You’re the IT technician |
| Total control | Less powerful models (for now) |
| When to use what? | AI in Cloud | AI locally |
|---|---|---|
| Sensitive data | ⚠️ Caution | ✅ Better |
| Available power | ✅ Very high | ❌ Limited |
| Intensive use | ❌ Expensive | ✅ Convenient |
| Need offline | ❌ No | ✅ Yes |
| Technical setup | ✅ Zero | ❌ Complex |
Practical rule:
Develop with cloud AI using generic placeholders (“my-app”, “client-name”, “secret-api”) and substitute real data only in your local code. Or: do 95% of the app with ChatGPT/Claude, but that super-secret feature? Develop it with local AI!
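Here is a toy sketch of the placeholder idea in Python. All the names and values are hypothetical; the point is that the real data only appears in code that stays on your machine.

```python
# The "generic placeholders" trick: everything you paste into a cloud AI chat
# mentions only fake names; the real values live in a mapping that never leaves your PC.

code_from_cloud_ai = (
    'API_URL = "https://client-name.example.com"\n'     # what the AI wrote for you
    'API_KEY = "secret-api"\n'
    'APP_NAME = "my-app"\n'
)

real_values = {                                          # kept locally, never shared
    "client-name.example.com": "portal.acme-corp.it",    # hypothetical real endpoint
    "secret-api": "sk-REAL-KEY-0000",                    # hypothetical real key
    "my-app": "acme-invoice-tool",                       # hypothetical real app name
}

local_code = code_from_cloud_ai
for placeholder, real in real_values.items():
    local_code = local_code.replace(placeholder, real)

print(local_code)   # the finished code with real data, produced only on your machine
```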
But also ask yourself: if you don’t have the revolutionary idea of the century, do you really need all this secrecy?
Don’t marry a single AI! Use plans from multiple services. You can split the information you reveal between ChatGPT, Claude, Gemini, Perplexity, …, without giving anyone the complete picture. Also remember, each AI has its own “character”, creativity and way of thinking… you’ll find different perspectives on the same problem.
Bonus: if one has server problems or you run out of credits, you always have a backup voice! 🎯
There are ways to rent a powerful computer in the cloud that you fully manage from home. It’s like renting an apartment in the cloud: it’s your home for as long as you keep it. On that computer you can download open-source models and use them privately, inside your own “apartment”.
These possibilities will become clearer as we progress through the course; for now it’s enough to mention them so you know the options available. 📚
For companies or serious commercial projects: there are cloud solutions with extra privacy guarantees (Azure OpenAI, AWS Bedrock). Higher costs but sleep soundly. It’s like having a private suite in the hotel instead of the standard room.
Perfect when data is really sensitive but you also need all the cloud power. 💼
🚫 WRONG
"which AI is better?"
"local or cloud use?"
"is it safe to use ChatGPT?"
"how much does AI cost?"
✅ RIGHT
"This app handles medical data. Is it safe to use cloud AI?"
"I have $0 budget and no GPU. What options do I have to use AI?"
"Here's my PC: [details]. What local models can I run?"
"I'm developing [app type]. Do you recommend cloud or local AI for my case?"
Reality: If your PC has malware, gets hacked, or you lose it, your local data is just as at risk as data in the cloud!
Reality: Serious companies (OpenAI, Anthropic) have clear policies. The risk exists, but it’s lower than you think, and they probably have bigger priorities than snooping on our projects.
Reality: Models like Phi or Mistral 7B “run”, but prepare for coffee breaks while they respond… for decent performance you need dedicated hardware.
Reality: Prices are falling! Competition helps us users.
Trends coming:
❌ “I use ONLY ChatGPT for everything”
❌ “I use ONLY local models for everything”
✅ “I use the right tool for each situation”
Cloud: API costs can explode if a project grows
Local: Electricity + setup time + hardware
❌ “I NEVER use cloud, they spy on me!”
❌ “I put everything on cloud, what’s the problem?”
✅ “I evaluate case by case, with common sense”
Before choosing, ask yourself:
Remember:
Start simple: If you’re starting out, begin with cloud (free/cheap). When you grow and understand your needs, evaluate local or hybrid.
Before moving to the next module:
If you’ve checked everything → Great! Now you know how to consciously choose where to make your AI “live”!
If something’s unclear → Normal! It’s a complex topic. The important thing is understanding you have choices, and each option makes sense in different contexts.
In Module 4 we’ll discover the building blocks of programming: the 6 fundamental concepts at the base of EVERY program. They’re like LEGO pieces you can build anything with. And no, you don’t need to memorize syntax: AI writes, you just need to understand what to ask it!
Remember: Even Google uses a mix of cloud and local AI. Facebook too. There’s no “right way”, there’s the right way FOR YOU, at THIS moment, for THIS project.