Beginner's Roadmap to AI: 90 Days from Zero

Three months is the right unit of time to plan around. Less is impatient: you'll memorise terminology and call yourself "an AI person" before you've built anything that works. More than that and you're either procrastinating or learning the wrong way. The plan below assumes you have no programming background, no math beyond high school, and 10 hours a week. By day 90 you will have shipped a working AI tool that someone other than you uses, joined two communities of practitioners, and arrived at an honest answer to "what should I learn next?" If you have more time per week, compress the plan. If you have less, stretch it, but don't break the sequence. Order matters more than speed.

Week-by-week schedule

The schedule below is built around two principles: ship something every two weeks, and let the building drive the studying. Most beginners reverse this — they study for three months, never ship, and quit. We won't.

| Weeks | Goal | Hours | Output |
| --- | --- | --- | --- |
| 1-2 | Orientation and first contact | 20 | API key working, hello-world script |
| 3-4 | First real project | 20 | Tool you actually use |
| 5-6 | Reliability and evaluation | 20 | Test suite, half the failures fixed |
| 7-8 | First multi-step agent | 20 | Agent that uses tools |
| 9-10 | Theory you finally need | 15 | Notes doc explaining your projects |
| 11-12 | Polished portfolio piece | 25 | Public, deployed, README, blog post |

Weeks 1–2. Sign up for an OpenAI or Anthropic developer account, get an API key, install Python (3.11 or newer), VS Code or Cursor, and run a hello-world. Read the OpenAI Cookbook's "How to call the API" page in full. Don't skim. Then read Anthropic's quickstart for the Claude API. By the end of week two you should have a script that takes a question on the command line and returns an answer. This is small. That is the point.
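The week-2 milestone can be this small. Below is a minimal sketch of the command-line script, assuming the official `openai` Python package (`pip install openai`) and an `OPENAI_API_KEY` environment variable; the model name is a placeholder to swap for whatever is current.

```python
# hello_llm.py - a question on the command line in, an answer out.
import sys


def build_messages(question: str) -> list[dict]:
    """Turn a command-line question into a chat message list."""
    return [
        {"role": "system", "content": "Answer concisely."},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # Imported here so the helper above is testable without the package.
    from openai import OpenAI

    question = " ".join(sys.argv[1:])
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: use a current model name
        messages=build_messages(question),
    )
    print(response.choices[0].message.content)


if __name__ == "__main__" and len(sys.argv) > 1:
    main()  # only call the API when a question was actually passed
```

Run it as `python hello_llm.py "What is a token?"`. Separating `build_messages` from the API call is a habit worth starting now: the part you can test for free stays apart from the part that costs money.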

Weeks 3–4. Pick a problem from your real life. Examples that have worked for past learners: summarising newsletters, generating standup notes from a git log, drafting replies to a parent's emails, naming files in a screenshot folder, summarising long YouTube transcripts. Build the smallest version that solves it. Deploy it somewhere a human can use it; Streamlit Cloud is free and takes ten minutes. Get one person other than you to use it for a week.

Weeks 5–6. Look at how often your week-3 tool fails. Make a Google Sheet of 30 inputs and their outputs. Mark the failures. Try to halve the failure rate. This is where you'll do real prompt engineering (chain-of-thought, structured output, few-shot examples), and it is where most learners panic and give up because their cute demo turns out to be flaky in real use. Push through. This is the work.
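A spreadsheet works, but the same habit takes a dozen lines of code. A minimal sketch: `summarise` is a stand-in for whatever your week-3 tool actually calls, and each check encodes what "correct" means for that logged input.

```python
# Run every logged input through the tool, apply a pass/fail check,
# and report the failure rate. `summarise` is a placeholder for your
# real tool; in practice it would make the API call.
def summarise(text: str) -> str:
    return text.split(".")[0] + "."  # placeholder: keep the first sentence


CASES = [
    # (input, check): the check encodes what "correct" means for that input
    ("Rates rose. Markets fell.", lambda out: "Rates" in out),
    ("Short note.", lambda out: len(out) > 0),
    ("Long rambling text. With filler.", lambda out: len(out) < 40),
    ("No real sentences here", lambda out: "bullet" in out),  # known failure
]


def failure_rate(cases) -> float:
    failures = [inp for inp, check in cases if not check(summarise(inp))]
    for inp in failures:
        print("FAIL:", inp)
    return len(failures) / len(cases)
```

Rerun `failure_rate(CASES)` after every prompt change; the number either goes down or it doesn't, and that is far more honest than eyeballing outputs.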

Weeks 7–8. Build a project that takes more than one step. The simplest pattern: an agent that researches a question, writes a draft, and emails it to you. Use OpenAI's Assistants API, Anthropic's tool use, or LangGraph. Pick whichever has the documentation that makes sense to you. Don't switch frameworks mid-project.
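Whichever framework you pick, the underlying shape is the same: a loop in which the model chooses a tool, your code runs it, and the result goes back into the conversation. Here is a dependency-free sketch of that loop with the planner stubbed out; a real agent replaces `plan_next_step` with an API call that returns tool-use requests, and the tool functions with real search and email code.

```python
# The smallest agent shape: plan a step, run the chosen tool, feed the
# result back, repeat until the planner says stop.
def search(query: str) -> str:
    return f"notes about {query}"      # stand-in for a real research tool


def draft(notes: str) -> str:
    return f"Draft based on: {notes}"  # stand-in for a drafting step


TOOLS = {"search": search, "draft": draft}


def plan_next_step(history):
    """Stub planner: search first, then draft, then stop.
    A real agent asks the model what to do next."""
    if not history:
        return ("search", "our question")
    if history[-1][0] == "search":
        return ("draft", history[-1][1])
    return None


def run_agent():
    history = []
    while (step := plan_next_step(history)) is not None:
        tool_name, arg = step
        result = TOOLS[tool_name](arg)
        history.append((tool_name, result))
    return history
```

If you understand this loop, every agent framework's documentation becomes easier to read, because they are all elaborations of it.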

Weeks 9–10. Now read theory. The 3Blue1Brown neural networks series, the Hugging Face NLP course's first three chapters, Andrej Karpathy's "Intro to Large Language Models" talk. You will absorb ten times more in week nine than you would have in week one because every concept now has a hook in your memory.

Weeks 11–12. Take your best project, polish it, write a README, write a blog post explaining your design choices, deploy it publicly. This becomes the thing you point hiring managers at. We covered the broader 90-day plan in the context of the full learn AI roadmap.

Free resources only

The defining fact of learning AI in 2026 is that almost everything you need is free. The temptation to buy a 5,000 USD bootcamp comes from not knowing what's already available. Here is the short list, in the order you'll want them.

For the first contact (weeks 1-2): OpenAI Cookbook (cookbook.openai.com), Anthropic's prompting guide and quickstart, the Hugging Face course chapters 1-2.

For the first project (weeks 3-4): DeepLearning.AI's free short course "Building Systems with the ChatGPT API" by Isa Fulford and Andrew Ng (1.5 hours), Streamlit's "30 days of Streamlit" challenge for the deployment side.

For evaluations (weeks 5-6): the OpenAI Cookbook's evals section, Hamel Husain's blog post "Your AI product needs evals," and the Promptfoo docs.

For agents (weeks 7-8): LangChain's Academy short courses (free), Anthropic's "Building effective agents" guide.

For theory (weeks 9-10): 3Blue1Brown's neural networks playlist, Andrej Karpathy's "Let's build GPT from scratch" video, Lilian Weng's blog (lilianweng.github.io).

For ongoing learning: the Latent Space podcast, The Sequence newsletter, Import AI weekly.

These resources alone, combined with discipline and project work, will make you more capable in three months than 80 percent of self-described "AI experts" on LinkedIn. We covered the paid-vs-free trade-offs in our free vs paid AI courses guide.

First project to ship

The first project matters more than any course you take. It commits you. It is also where most beginners freeze, because they overthink the picking. Use this rule: pick the smallest problem in your own life that AI could solve and would save you 15 minutes a week. Tiny. Personal. Boring. Boring is good. Boring is achievable.

Three projects that have worked well for true beginners:

The newsletter summariser. A script that pulls last week's emails from one specific newsletter, summarises them in three bullets, and emails the summary to you. Touches: API calls, basic email parsing, prompt design, deployment as a daily cron job. Time to first working version: two weekends.

The meeting notes structurer. Paste a transcript, get back a structured doc with action items, decisions, and unresolved questions. Touches: long-context handling, structured outputs (JSON schema), prompt iteration. Time to first version: one weekend.
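The "structured outputs" piece of that project can be as small as a schema and a guard. A sketch, assuming you ask the model to return JSON matching this shape; the field names are illustrative, not a fixed API.

```python
import json

# The shape you'd request from the model (e.g. via a provider's
# JSON-schema / structured-output feature) for the notes structurer.
MEETING_SCHEMA = {
    "type": "object",
    "properties": {
        "action_items": {"type": "array", "items": {"type": "string"}},
        "decisions": {"type": "array", "items": {"type": "string"}},
        "open_questions": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["action_items", "decisions", "open_questions"],
}


def parse_meeting_notes(raw: str) -> dict:
    """Parse model output and check the required keys are present."""
    data = json.loads(raw)
    missing = [k for k in MEETING_SCHEMA["required"] if k not in data]
    if missing:
        raise ValueError(f"model output missing keys: {missing}")
    return data
```

The guard matters: models occasionally drop a field, and catching that at the parse step beats discovering it in your rendered doc.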

The "what did I do this week" generator. Reads your git log, calendar, or task list, produces a standup or weekly review. Touches: tool calls (reading a file or API), structured output, summarisation. Time to first version: two weekends.
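For that third project, the input step is plain string processing before any AI is involved. A sketch that turns `git log --pretty=format:%s` output into grouped standup bullets; the `fix:`/`feat:` prefixes are an assumption about your commit style, not a requirement.

```python
# Group one-line commit messages by their conventional-commit-style
# prefix and emit standup bullets. Lines without a prefix go to "other".
def standup_bullets(git_log: str) -> list[str]:
    groups: dict[str, list[str]] = {}
    for line in git_log.strip().splitlines():
        prefix, _, rest = line.partition(": ")
        key = prefix if rest else "other"
        groups.setdefault(key, []).append(rest or line)
    return [f"- {key}: " + "; ".join(msgs) for key, msgs in groups.items()]
```

From here, the AI step is one prompt: hand these bullets to the model and ask for a two-sentence weekly summary.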

Whichever you pick, deploy it. Streamlit Cloud, Vercel, Replit, anywhere you can hand a URL to a friend. The act of deploying forces you to handle config, secrets, and other "professional engineering" details that tutorials skip. This is the gap between hobbyists and people who get hired.

When to spend money

You will not need to spend much. The honest list:

20 USD per month: a ChatGPT Plus or Claude Pro subscription. The free tiers will throttle you and crush momentum. This is the only consistent paid recommendation in this entire roadmap.

30-100 USD over the 90 days: API credits. You will spend a small amount calling APIs to test your projects. Set a hard monthly cap in the OpenAI dashboard at 20 USD until you know your usage; raise as needed.
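Those budgets are easy to sanity-check, because API cost is just tokens times a per-million-token rate. The default rates below are placeholders, not real prices; look up current pricing for the model you actually use.

```python
def estimated_cost_usd(input_tokens: int, output_tokens: int,
                       in_price_per_m: float = 0.15,
                       out_price_per_m: float = 0.60) -> float:
    """Back-of-envelope API cost; prices are USD per million tokens."""
    return (input_tokens * in_price_per_m
            + output_tokens * out_price_per_m) / 1_000_000
```

At the placeholder rates, a month that burns 10 million input and 2 million output tokens costs about 2.70 USD, comfortably under a 20 USD cap, which is why beginner projects rarely get expensive.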

0-50 USD per month: optional Coursera specialisation. Pay for this only if you have demonstrably failed to finish unstructured courses before. The structure is the product. The Andrew Ng Machine Learning Specialization is the standard recommendation.

Things that look like good investments but aren't: bootcamps for a beginner with no programming background (you'll fall behind in week one), expensive "AI Master" certificates from non-accredited online schools (recruiters don't recognise them), one-on-one tutoring before you've struggled enough to know what to ask. Save the bigger investments for after the 90 days, when you actually know which gap you have.

Communities to join

The right community will reduce your learning time by half. The wrong one will fill your head with hype and make you feel behind. Pick carefully.

Join two communities, no more. Three is overload, one is fragile. The most useful for true beginners in 2026:

The DeepLearning.AI Discord is large, active, and friendly to beginners. People help each other on assignments. The signal-to-noise is good.

Local AI meetups on Meetup.com or Lu.ma. The "AI Tinkerers" chapters in most major cities are the gold standard. Showing up in person twice will teach you more than three months of Twitter scrolling.

Hugging Face's forums are the right place when you have specific technical problems with NLP or open-source models.

The OpenAI developer forum for API-specific questions.

Skip in the first 90 days: large general-purpose subreddits like r/MachineLearning (advanced research, intimidating), AI Twitter/X (mostly hype and hot takes), and "AI study groups" with no accountability or named maintainers (often dead).

One specific tactic: in your first month, post one beginner question per week in your chosen Discord. Not "where do I start" (that is unanswerable) but something specific: "I'm stuck on this error from the OpenAI quickstart, here's my code." You will get answered fast and you'll meet the regulars.

Common quitting points and how to push through

The quitting curve is well-documented. Most beginners drop out at one of four points. Naming them in advance lets you recognise the moment when you hit one and decide that you're going to keep going.

Quitting point 1, around week 3: "this is harder than I thought." Your first real project broke. Your prompt is unreliable. The thing that worked in your hello-world doesn't work on your real input. This is normal. Do not start a new project. Stay on the broken one for at least one more week. Most learners switch projects at this point and end up with three abandoned ones.

Quitting point 2, around week 6: "everyone on Twitter knows more than me." You discover that someone shipped a viral demo using techniques you've never heard of. You feel hopelessly behind. Cure: unfollow the highlight reel for the next 30 days. Compare yourself to your week-1 self, not to a 22-year-old who's been doing this for two years.

Quitting point 3, around week 8: "agents are too hard." Multi-step agents are genuinely brittle in 2026 and your first one will misbehave in surprising ways. This is not a personal failing. Even teams at major labs are struggling with reliable long-horizon agents. Build the smallest possible agent (two steps), get it working, declare victory, and move on.

Quitting point 4, around week 11: "polishing is boring." The portfolio polish step feels less rewarding than the learning step. Push through anyway. The portfolio is what gets you hired. We covered this in the broader 90-day learning roadmap.

The single best technique to push through any of them: a 30-minute weekly call with one accountability partner who is also doing the roadmap. You will both make it.

Frequently asked questions

Can I really learn AI with no programming background in 90 days?

Yes for applied use. Yes for very simple builds (Python scripts that call an API). No for becoming a fully employable AI engineer; that takes longer because you also have to learn software engineering. If you have no programming background and want a developer job, plan for 9-12 months total: three months for Python and software engineering basics, then this roadmap on top.

How many hours a week do I need?

Ten hours a week is the floor for the schedule above. Below that, the gaps between sessions become long enough that you forget what you learned. Above that, you ramp faster: at 20 hours a week the same plan compresses to about six weeks. The hours have to be focused; passive video-watching does not count.

Do I need a Mac, or is Windows OK?

Windows is fine. Most of the work is in cloud APIs, and Python runs identically on both. The main wrinkle is that some open-source tools assume Linux conventions; on Windows you can use WSL (Windows Subsystem for Linux) to get Linux-style tooling for the rare moments it matters. For the first 90 days, vanilla Windows is plenty.

Should I learn ChatGPT or Claude first?

The differences matter less than learning either properly. Pick one and use it daily for a month. The skills transfer. By month two you should try the other to see what feels different. If you want a specific recommendation: ChatGPT has the broader plugin ecosystem, Claude tends to be better at long-form writing and adherence to instructions. We covered the comparison in the wider learn AI hub.

What if I miss a week?

Start that week again, do not skip. The order is built around stacking: each week's project assumes the last week's skills. Skipping breaks the stack and you'll feel lost. One week of slip is fine; declaring "I'll catch up later" almost never works.

Is it OK to use AI to help me build my AI projects?

Yes, and you should. Use Cursor, Claude Code, or GitHub Copilot to write the boilerplate. Use Claude or ChatGPT to debug. The skill is the meta-skill of using AI productively, not memorising syntax. The one rule: when something breaks, don't immediately paste it back to the AI. Read the error first. Try to predict the cause. Then check.

The bottom line

The 90 days will work if you ship something every fortnight and accept that the first few things you ship will be bad. They will. Bad working code beats perfect-in-your-head code in every dimension that matters. Set up your tools tonight, ship a hello-world by Sunday, ship a real project by week four. Pay for one subscription and not much else. Join two communities, post one specific question per week. Don't switch frameworks mid-project. Don't compare yourself to people who started two years ago. By day 91 you will be ahead of every "AI expert" who never built anything, which is most of them, and you will have a portfolio piece that is worth more than any certificate. The hard part is not the material. The hard part is finishing the second project after the first one breaks. Plan for that, and you'll get to day 91.

Last updated: May 2026