Lesson Goal
In Lesson 13 you aligned your store page and positioning with your monetization so players know what to expect. This final lesson ties everything together: a launch checklist so you go live without skipping critical steps, and a post-launch rhythm so you keep improving offers, store presence, and retention using the data and experiments you designed earlier.
By the end you will have:
- A short pre-launch checklist (store, IAP/ads, analytics, positioning).
- A post-launch cadence (when to review data, when to run the next experiment).
- Clarity on what to do next after the course (links to guides and help).
Step 1 – Pre-Launch Checklist
Before you flip the switch, run through these in order. They assume you have already done Lessons 1–13 (revenue model, loop mapping, first store and IAP flow, analytics, KPIs, live ops roadmap, A/B tests, segmentation, LTV and forecasts, budget decisions, experiment backlog, and store presence).
Store and positioning
- [ ] Store page copy matches your positioning and revenue model (premium vs F2P, what optional spending adds).
- [ ] Screenshots and trailer show the real experience; no misleading promises.
- [ ] Price (or “Free”) is visible, and your main IAP or ad message is clear.
Monetization implementation
- [ ] IAP or ads are implemented and tested in a build (purchase flow, restore purchases, ad placement).
- [ ] Prices and product IDs match what you designed; no test-only values left in production.
- [ ] First analytics events are in place (e.g. purchase completed, ad viewed, key progression milestones).
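Those first events can be a thin wrapper around whatever analytics SDK you use. A minimal sketch — `send_event` here is a hypothetical stand-in for your real SDK call or HTTPS POST, and the event and parameter names are illustrative, not a required schema:

```python
import json
import time

def send_event(name, params):
    """Hypothetical transport: replace the print with your analytics
    SDK call (e.g. Firebase, Unity Analytics, or a plain HTTPS POST)."""
    payload = {"event": name, "ts": int(time.time()), **params}
    print(json.dumps(payload))  # stand-in for the network send
    return payload

# The three launch-critical events from the checklist above:
def track_purchase(product_id, price_usd):
    return send_event("purchase_completed",
                      {"product_id": product_id, "price_usd": price_usd})

def track_ad_view(placement):
    return send_event("ad_viewed", {"placement": placement})

def track_milestone(milestone):
    return send_event("progression_milestone", {"milestone": milestone})
```

Keeping every event behind one wrapper like this makes it easy to swap providers later and to confirm in a test session that each event actually fires.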
Data and safety
- [ ] Analytics dashboard or tool is receiving events (run a test session and confirm).
- [ ] You know where to look for revenue, retention, and conversion (even if it is a simple spreadsheet or one dashboard).
- [ ] You have a backlog of 2–3 post-launch experiments (from Lesson 12) so you are not deciding what to test at the last minute.
Pro tip: Do a “soft” or regional launch first if you can (e.g. one store or one region) so you can fix critical issues before a big push.
Step 2 – Launch Day (and the First Week)
Launch day
- Flip the switch: publish the store page and build.
- Monitor for crashes, payment failures, and missing analytics.
- Avoid changing prices or running new experiments in the first 24–48 hours; let baseline data collect.
First week
- Check retention (D1, D7) and revenue per user (or ad revenue) at least once.
- Compare to your forecast (from Lessons 10–11): are you in the right ballpark? If not, note whether the gap is traffic, conversion, or retention.
- Do not overreact to day-one numbers; wait for a few days of data before changing offers or store copy.
Common mistake: Changing too much too soon (e.g. dropping price or adding a new IAP on day two) so you never get a clear before/after read. Prefer one change at a time and measure.
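If your tool does not chart retention for you, D1 and D7 can be computed by hand from a session export. A minimal sketch, assuming your analytics export can give you, per user, the set of day offsets since install on which they played:

```python
def retention(sessions, day):
    """D[day] retention: share of users active on `day` among users who
    installed on day 0. `sessions` maps user id -> set of day offsets
    since install (day 0 = install day)."""
    cohort = [u for u, days in sessions.items() if 0 in days]
    if not cohort:
        return 0.0
    returned = sum(1 for u in cohort if day in sessions[u])
    return returned / len(cohort)

# Illustrative data: four installs on day 0.
sessions = {
    "u1": {0, 1, 7},  # came back on D1 and D7
    "u2": {0, 1},     # came back on D1 only
    "u3": {0},        # never returned
    "u4": {0, 7},     # came back on D7 only
}
d1 = retention(sessions, 1)  # 2 of 4 -> 0.5
d7 = retention(sessions, 7)  # 2 of 4 -> 0.5
```

Even this spreadsheet-level view is enough for the first week: you only need to know whether players come back, not model the whole curve yet.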
Step 3 – Post-Launch Rhythm (When to Iterate)
Set a simple cadence so iteration is deliberate, not reactive.
Weekly (first month)
- Review retention and revenue; note any obvious drop-offs or spikes.
- Check store page (reviews, wishlists, or installs) and fix any broken or confusing copy.
- Pick one experiment from your backlog (e.g. one new price, one new offer, or one store A/B test) and schedule it for the next week.
Monthly (ongoing)
- Update your LTV or revenue forecast with real data (Lessons 10–11).
- Revisit budget and runway (Lesson 11) if revenue or retention shifted.
- Plan the next 2–3 experiments from your backlog and retire or reprioritize ones that no longer apply.
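Updating the forecast can be as simple as plugging measured numbers into a small function. Here is a sketch using one common simplification — a constant day-over-day retention rate (real curves flatten over time, so treat this as a ballpark, and keep whatever richer model you built in Lesson 10 if you have one). The input numbers are illustrative:

```python
def expected_lifetime_days(daily_retention, horizon_days=180):
    """Expected active days per install, assuming a constant
    day-over-day retention rate. Sums r^0 + r^1 + ... up to the horizon."""
    return sum(daily_retention ** d for d in range(horizon_days))

def ltv(arpdau, daily_retention, horizon_days=180):
    """LTV estimate = revenue per daily active user x expected active days."""
    return arpdau * expected_lifetime_days(daily_retention, horizon_days)

# Example with measured numbers: $0.05 ARPDAU, 85% day-over-day retention.
estimate = ltv(0.05, 0.85)  # roughly $0.33 per install
```

Re-running this monthly with real ARPDAU and retention tells you quickly whether launch-time assumptions still hold, which feeds directly into the budget and runway check.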
Quarterly
- Revisit positioning and store presence (Lesson 13): does the one-sentence pitch still fit? Do screenshots and trailer still represent the game?
- Consider larger experiments (e.g. a new content drop or a new monetization lever) and add them to the backlog.
Pro tip: Keep a short “live ops log” (dates and what you changed) so you can correlate changes with retention and revenue later.
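The log really can be that simple: an append-only CSV with a date, what changed, and which metric you expect it to move. A sketch (it writes to an in-memory buffer so it runs as-is; in practice you would open a file such as `liveops_log.csv` in append mode):

```python
import csv
import datetime
import io

def log_change(f, change, metric_to_watch):
    """Append one dated row per change so you can line changes up
    against retention and revenue later."""
    csv.writer(f).writerow(
        [datetime.date.today().isoformat(), change, metric_to_watch])

# In practice: with open("liveops_log.csv", "a", newline="") as f: ...
buf = io.StringIO()
log_change(buf, "Starter pack price $2.99 -> $1.99", "conversion rate")
log_change(buf, "New store screenshots", "store page installs")
```

Naming the metric you expect to move at the moment you make the change keeps later analysis honest: you wrote down the hypothesis before seeing the result.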
Step 4 – What Success Looks Like (and What to Avoid)
Healthy signs
- Retention is stable or improving week over week (or you know why it dropped and have a plan).
- Revenue per user (or ad revenue) is in line with your conservative forecast, or you understand the gap (e.g. traffic mix, conversion).
- You are running one experiment at a time and learning from each before the next.
- Store page and positioning still match the game and monetization; reviews do not consistently complain about “misleading” or “pay-to-win” in a way that suggests a mismatch.
Warning signs
- Changing prices or offers every few days with no clear hypothesis or measurement.
- Ignoring retention and only chasing day-one revenue (leads to burnout and churn).
- No analytics or no time set aside to look at data (you cannot iterate safely without it).
- Store page and actual experience drift apart (e.g. you added heavy IAP but the page still says “no pay-to-win”).
Step 5 – Course Wrap-Up and Next Steps
You have covered the full arc of the course:
- Choosing a revenue model (Lesson 1) and mapping monetization into your loop (Lesson 2).
- Designing offers and building your first store and IAP flow (Lessons 3–4).
- Reading data and iterating (Lesson 5), then KPIs and live ops basics (Lesson 6).
- Live ops roadmap and content calendar (Lesson 7) and safe A/B tests (Lesson 8).
- Player segmentation and fair offers (Lesson 9).
- LTV and revenue forecasting (Lesson 10) and budget and runway (Lesson 11).
- Post-launch experiment backlog (Lesson 12) and store presence and positioning (Lesson 13).
This lesson gave you a launch checklist and a post-launch rhythm so you can go live and keep improving.
What to do next
- Bookmark this course and the Monetize Your Indie Game index so you can revisit the lesson list.
- Use our help section for implementation snags (e.g. Unity IAP, store page optimization) and our guides for deeper dives on engines and tools.
- Share your launch or first experiment with the community (forums, Discord, or social) so others can learn from your results.
Mini-task:
Write down your next three actions:
1. One item from the pre-launch checklist you will complete this week.
2. One post-launch experiment you will run first.
3. One date in the next month when you will review retention and revenue and update your forecast.
FAQ
When should I run my first A/B test after launch?
After you have at least a few days of stable data (e.g. 3–7 days). Avoid testing on day one; baseline first, then change one thing and measure.
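Once you have before/after numbers, a quick way to check whether a conversion difference is more than noise is a two-proportion z-test. A minimal standard-library sketch (the counts below are illustrative; very small samples deserve more care than this):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates.
    |z| > 1.96 roughly corresponds to p < 0.05 (two-sided)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 1000 players per arm: 30 purchases on A, 45 on B.
z = two_proportion_z(30, 1000, 45, 1000)
significant = abs(z) > 1.96
```

In this example the lift looks large (3% vs 4.5%) but does not clear the 1.96 bar, which is exactly why the baseline-first, one-change-at-a-time advice matters: without enough players per arm, a promising difference can still be noise.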
What if my revenue is much lower than my forecast?
Check traffic (are the right players finding you?), conversion (are they reaching the paywall or store?), and retention (are they staying long enough to convert?). Fix the biggest gap first; do not change everything at once.
How often should I update my store page?
When positioning or the game meaningfully changes (e.g. new content, new monetization), or when reviews consistently point to the same confusion. Do not tweak copy every week without a reason.
Summary
Use the pre-launch checklist (store, IAP/ads, analytics, positioning, experiment backlog) so you go live without skipping steps. In the first week, monitor stability and baseline data; avoid big changes. Set a post-launch rhythm (weekly reviews and one experiment at a time, monthly forecast and budget check, quarterly positioning review) so you keep iterating safely. Success is stable retention, revenue in line with expectations (or understood gaps), and a clear link between what you change and what you measure. You have finished the course; next step is launch and your first iteration cycle.
Congratulations on completing Monetize Your Indie Game. Bookmark the course and come back whenever you need a refresher on forecasts, experiments, or store presence.