Companion document: Delivery Plan (stories, tasks, dependencies, timeline)
We want to test whether a lower-priced Welcome Offer ($10/SC20) converts better than the current offer ($20/SC40) for new registrations. The supporting infrastructure partially exists; completing it requires a small backend change and a client update. Estimated effort: ~1-2 days backend, ~2-3 days client. The test can be stopped at any time without a deploy by deactivating the variant package.
Agreed test parameters (from meeting 2026-03-27):
- 10% minimum detectable effect (relative), two-sided test (two-sided because we also want to detect if the cheaper offer converts worse, to limit costs)
- 10% significance level (alpha = 0.10), 80% power. This is more lenient than the standard 5% alpha; the team accepted the higher false positive risk to keep the sample size manageable and shorten the test by ~1 week.
- ~10,000 total samples (across both arms); ~6-week runtime at ~1,700 registrations/week
- A/A test first to validate infrastructure
- Primary metric: first purchase conversion within 24 hours of registration
- Secondary metrics: second purchase conversion within 7 days; revenue per user
Recommended approach: Assign the correct variant at registration time, server-side. Both the popup and coin store automatically show the right offer. No client-side filtering needed.
We're considering an A/B test for the Global Poker First Purchase Offer:
- Control: $20 for a $40 Gold Coin Package (200k GC + SC40) - the current offer
- Treatment: $10 for a $20 Gold Coin Package (100k GC + SC20) - proposed variant
This document covers how the Welcome Offer works today, what controls exist, and the proposed engineering approach for a 50/50 split test on new registrations. Existing users are unaffected - they've already seen (or dismissed) their Welcome Offer.
The Welcome Offer (internally "FTO" - First Time Offer) reaches new players through two independent surfaces:
A modal shown automatically when a new player lands in the lobby after registration.
Trigger chain:
- During registration, the registration app sets a new-player-login cookie
- On first login, the game client checks for this cookie, confirms the feature flag is enabled, and confirms the user isn't in a restricted market
- If all conditions pass, the popup opens as a modal overlay on the lobby
- The cookie is consumed immediately, so the popup can only appear once per player
What the popup displays:
[LIMITED TIME banner]
WELCOME TO GLOBAL POKER
Sign Up Special
Get a $40 Gold Coin Package for just $20!
200,000 Gold Coins
FREE! - SC40 Sweeps Coins
[$40 strikethrough] $20 <-- purchase button
Not right now <-- dismiss link
[X] <-- close button
All display text is currently hardcoded in the client. The "Limited Time" label is static, not driven by any timer or countdown.
Existing logging: FTO loaded, FTO clicked, FTO declined - all specific to the popup. There is no separate logging for coin store views or purchases of the Welcome Offer package.
The Welcome Offer package also appears as a special offer inside the coin store, rendered alongside other purchasable packages whenever the player opens the store. This is independent of the popup - the coin store displays whatever packages the backend store service returns for that user.
All scenarios assume a newly registered player in a non-restricted market. S1 = first session after registration. S2 = a subsequent session on the same day.
Note: the registration flow goes directly from form submission to the lobby in a single redirect. Package assignment and popup display happen in the same session. Users blocked by email verification, geo-restrictions, or market restrictions never reach the point where the package is assigned. The only edge case where a user is assigned a package but doesn't see the popup is a network failure or browser crash mid-redirect, which should be rare.
| # | S1: Coin Store? | S1: Purchase? | S2: Coin Store? | Popup in S1? | Popup in S2? |
|---|---|---|---|---|---|
| 1 | Yes | -- | N/A | Yes (on lobby landing, before store) | N/A |
| 2 | No | -- | N/A | Yes (on lobby landing) | N/A |
| 3 | No | -- | Yes | Yes (on lobby landing) | No (cookie was consumed in S1) |
| 4 | Yes | Other item | Yes | Yes (on lobby landing) | No (cookie was consumed in S1) |
| 5 | Yes | None | Yes | Yes (on lobby landing) | No (cookie was consumed in S1) |
The popup always appears in Session 1 on lobby landing. It never re-appears in Session 2 because the cookie is consumed in S1 regardless of what the player does after.
| # | S1: Coin Store? | S1: Purchase? | S2: Coin Store? | In store S1? | In store S2? |
|---|---|---|---|---|---|
| 1 | Yes | -- | N/A | Yes | N/A |
| 2 | No | -- | N/A | N/A (not visited) | N/A |
| 3 | No | -- | Yes | N/A (not visited) | Yes (no purchase made, package still active) |
| 4 | Yes | Other item | Yes | Yes | Yes (unassign-on-next-purchase is not enabled) |
| 5 | Yes | None | Yes | Yes | Yes (no purchase made, package still active) |
The coin store visibility is controlled by the backend. The current production package config (confirmed):
- Claim limit: 1 - disappears after one purchase of this specific package
- Relative expiry: 7 days from assignment (604,800,000 ms). Each user's clock starts at registration.
- Unassign on next purchase: no - buying a different package does not remove the Welcome Offer
Why buying another package does not remove the Welcome Offer: Two mechanisms could theoretically cause this, and neither applies. First, unassign-on-next-purchase is false, so no purchase of any other package triggers unassignment. Second, the claim limit is scoped per-customer-per-package, not per-customer. Each customer-package pair has its own event stream ({userId}|{packageId}), its own aggregate with its own claimLimit field, and its own row in the read model. Purchasing a $5 package from the store decrements the $5 package's claim limit, not the Welcome Offer's. The two are completely independent.
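The per-customer-per-package scoping described above can be sketched as a simplified model (this is illustrative, not the actual store service code; the `ClaimState` name and package IDs are made up, but the `{userId}|{packageId}` key mirrors the event stream described above):

```python
from dataclasses import dataclass

@dataclass
class ClaimState:
    """Read-model row for one (customer, package) pair."""
    claim_limit: int

# Each (userId, packageId) pair has its own row, keyed the same
# way as its event stream: "{userId}|{packageId}".
read_model: dict[str, ClaimState] = {}

def assign(user_id: str, package_id: str, claim_limit: int) -> None:
    read_model[f"{user_id}|{package_id}"] = ClaimState(claim_limit)

def purchase(user_id: str, package_id: str) -> None:
    # Decrements ONLY the purchased package's claim limit.
    read_model[f"{user_id}|{package_id}"].claim_limit -= 1

assign("u42", "welcome-offer", claim_limit=1)
assign("u42", "gc-5-dollar", claim_limit=10)
purchase("u42", "gc-5-dollar")

# The Welcome Offer's claim limit is untouched by the $5 purchase:
assert read_model["u42|welcome-offer"].claim_limit == 1
```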
The Welcome Offer stops appearing when:
- The user purchases the Welcome Offer itself (claim limit hits 0), or
- 7 days pass since registration (relative expiry), or
- An admin deactivates the package
No other user action removes it.
| Control | Where | Effect |
|---|---|---|
| Package price | Store backend | Changes what the player pays |
| Package contents (GC, SC) | Store backend | Changes what the player receives |
| Package expiry | Store backend | Controls when the offer disappears |
| Claim limit | Store backend | How many times the package can be purchased |
| Unassign on next purchase | Store backend | Whether any purchase removes the offer |
| Package active/inactive | Store backend | Master on/off for the package |
| FIRST_TIME_OFFER feature | Backend user features | Per-user toggle (removed for restricted markets) |
| Control | Where | Effect |
|---|---|---|
| Popup display text | Client i18n config | What the player sees in the popup (hardcoded today) |
| Feature flag | Client build | Compiled into the build per environment |
| Package ID for popup purchase | Client config | Which package the popup uses for purchase |
Registration
→ Registration app sets new-player-login cookie
→ Backend assigns ALL new-user packages (including Welcome Offer) to user
First login
→ Game client checks cookie → shows popup with hardcoded text ($20/SC40)
→ Cookie consumed
Coin store
→ Welcome Offer package appears as special offer (same package for all users)
Purchase
→ User buys via popup or store → single Welcome Offer package
Registration
→ Registration app sets new-player-login cookie
→ Backend assigns normal packages as before
→ Backend buckets user by hashing user ID (same user always gets same group)
→ Group A → assign Welcome Offer A ($20/SC40)
→ Group B → assign Welcome Offer B ($20/SC40, identical contents)
First login
→ Game client checks cookie → fetches assigned Welcome Offer package
→ Popup renders from package data (price, GC, SC)
→ Cookie consumed
Coin store
→ Only the assigned Welcome Offer package appears (A or B, both identical)
Analytics
→ Package ID distinguishes groups
→ Validate: both groups should show same conversion rate
Same as A/A, except:
→ Welcome Offer A = $20/SC40 (control)
→ Welcome Offer B = $10/SC20 (treatment)
The A/A and A/B flows are identical from an engineering perspective. Once the infrastructure is built for A/A, switching to A/B is just creating the real variant package and updating an environment variable. No code change needed.
The game client has a custom A/B testing system that runs six experiments today. It uses hash-based bucketing on user ID for consistent group assignment.
However, because the Welcome Offer appears in both the popup and the coin store, a pure client-side experiment isn't sufficient. The coin store renders whatever packages the backend returns, so both surfaces need to show the correct variant.
Assign the correct Welcome Offer variant at registration time, server-side. This way both the popup and the coin store automatically show the right offer without any client-side filtering.
Backend changes:
- Create a second store package for the variant offer ($10, 100k GC, SC20). Both packages must have identical configuration (claim limit, expiry, unassign-on-next-purchase) except for price and contents, to avoid other differences affecting the results.
- Modify the signup-bonus service (a small standalone microservice that assigns packages to new users at registration) to assign one of the two Welcome Offer packages per user based on a hash of their user ID (50/50 split). The hash must use an experiment-specific seed to avoid correlation with other A/B experiments. The user ID is already available. No database schema migration needed.
- Users in restricted markets should be excluded from bucketing entirely, not just excluded at the popup display stage. This avoids diluting the measured effect with users who can never convert.
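The seeded, deterministic bucketing described for the signup-bonus service can be sketched as follows (the seed string and function names are illustrative; the point is that hashing `seed + user_id` decorrelates this split from other experiments that hash the same user IDs):

```python
import hashlib

EXPERIMENT_SEED = "welcome-offer-2026"  # experiment-specific; placeholder value

def bucket(user_id: str, seed: str = EXPERIMENT_SEED) -> str:
    """Deterministic 50/50 split on (seed, user_id)."""
    digest = hashlib.sha256(f"{seed}:{user_id}".encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

# Same user always lands in the same group:
assert bucket("user-123") == bucket("user-123")
```

A different seed yields an independent partition of the same user population, which is why the seed must not be shared with the client's six existing experiments.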
Client changes:
- Make the popup render from package data (price, coin amounts) instead of hardcoded text
- Add logging for which variant was shown alongside existing events
- No client-side experiment class or coin store filtering needed
Analytics:
- The package ID in the existing log events identifies which variant the user saw
- Purchase conversion is already tracked in the store backend
- Variant assignment is consistent and reproducible from user ID
- We should add a log event at assignment time (in the backend) so there's a clean record of bucketing independent of whether the user ever sees the popup
Rollback: Either variant package can be deactivated in the store backend without a client deploy. The popup will fall back to whichever active Welcome Offer package remains. To stop the experiment entirely, revert the backend to assigning the original single package.
A client-only approach would require:
- Assigning both Welcome Offer packages to every user
- Filtering the wrong one out of the coin store at render time
- Risk of both offers showing if filtering fails
Backend bucketing avoids all of this. The client just renders whatever the backend assigned.
| Team | Work | Effort estimate |
|---|---|---|
| Store / Platform Eng. | Create variant package, update signup-bonus service bucketing | ~1-2 days |
| GP Engineering | Make popup data-driven, add variant logging | ~2-3 days |
| Analytics | No new tooling needed. Variant identified by package ID. | -- |
| QA | Verify both variants in QC, confirm correct bucketing | ~1 day |
Agreed parameters (meeting 2026-03-27): 10% relative MDE, alpha = 0.10 (two-sided), 80% power.
| Phase | Duration | Total samples | What happens |
|---|---|---|---|
| A/A test | ~2 weeks | ~3,400 | Validate infrastructure, confirm no bucketing bias |
| A/B test | ~6 weeks | ~10,000 | Run experiment, measure primary and secondary metrics |
| Measurement window | +8 days | -- | 24h for last user to convert + 7 days for second-purchase window |
| Total | ~9 weeks | ~13,400 | End-to-end from launch to final results |
Registration volume is approximately 1,700 per week (average since 2026-01-01). The sample size estimate of ~10,000 is contingent on the 13% baseline conversion rate; if the confirmed rate differs by more than a few percentage points, the sample size and timeline must be recalculated.
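Since the sample size hinges on the confirmed baseline, it is easy to recompute under different assumptions with the standard two-proportion normal-approximation formula (a generic sketch; this is the textbook approximation, not necessarily the tool that produced the agreed figure, so treat its output as a cross-check):

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(baseline: float, rel_mde: float,
              alpha: float = 0.10, power: float = 0.80) -> int:
    """Per-arm sample size, two-sided two-proportion test (normal approx.)."""
    p1, p2 = baseline, baseline * (1 + rel_mde)
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * var / (p2 - p1) ** 2)

# Recompute per arm across plausible baselines (10% relative MDE):
for p in (0.10, 0.13, 0.16):
    print(p, n_per_arm(p, 0.10))
```

A lower baseline inflates the required sample sharply, which is why confirming the source of the 13% figure (see open questions) matters before committing to the timeline.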
Sequential testing was considered and rejected: it can extend duration when the true effect is small, and is susceptible to seasonal variation and marketing campaign timing.
Both groups receive the same offer ($20/SC40) to verify that the infrastructure works before introducing the real variant.
This validates: bucketing logic, package assignment, data-driven popup rendering (confirming it doesn't itself change behavior), and analytics segmentation.
How it works: Create two identical packages (same price, same contents, same expiry, same claim limit). The backend assigns one per group using the same bucketing logic. The package IDs distinguish the groups in analytics. Run until a sufficient sample is reached to confirm the conversion difference is within expected variance of zero.
Note: ~3,400 samples over 2 weeks (~1,700 per group at 13% baseline = ~221 conversions per group) is enough to detect a large bucketing imbalance but may not surface subtle instrumentation bias. The A/A catches gross errors, not small ones.
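One concrete pass/fail check (an assumption, since the A/A success criterion is still an open question below) is a pooled two-proportion z-test on the A/A results, declaring a pass if the null is not rejected at the experiment's alpha:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """Two-sided p-value for H0: equal conversion rates (pooled z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# With the illustrative A/A figures (~221 conversions per ~1,700-user group):
p = two_proportion_p_value(221, 1700, 221, 1700)
assert p > 0.10  # identical rates: null clearly not rejected, A/A passes
```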
Primary metric: First purchase conversion within 24 hours of registration. Did the user purchase the Welcome Offer package within 24 hours of completing registration?
Secondary metrics:
- Second purchase conversion within 7 days. Of users who made a first purchase, what proportion made a second purchase within 7 days? This 7-day measurement window means final results are available 7 days after the last user enters the test. Note: because this metric is conditional on first purchase, a higher conversion rate in the treatment group could change the composition of the denominator (marginal converters who may be lower-quality), which could make the second-purchase rate look worse even if the treatment is broadly beneficial. Consider also tracking an unconditional version: proportion of all randomized users who make 2+ purchases within 7 days.
- Revenue per user (ARPU). The treatment halves the price. Higher conversion does not necessarily mean higher revenue. ARPU should be tracked to ensure the experiment is not optimizing conversion at the expense of revenue.
Secondary metrics are exploratory - no formal hypothesis test. The primary metric (24-hour conversion) is the sole basis for the go/no-go decision.
Exploratory: Conversion broken down by surface (popup vs. coin store) to understand whether the treatment effect differs between the one-shot popup exposure and the persistent coin store listing.
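The primary metric and the unconditional variant of the second-purchase metric can be computed from registration and purchase timestamps roughly like this (field and function names are illustrative, not the actual analytics schema):

```python
from datetime import datetime, timedelta

DAY = timedelta(hours=24)

def primary_converted(registered_at: datetime,
                      welcome_offer_purchases: list[datetime]) -> bool:
    """Primary: Welcome Offer purchased within 24h of registration."""
    return any(t - registered_at <= DAY for t in welcome_offer_purchases)

def second_purchase_7d(registered_at: datetime,
                       all_purchases: list[datetime]) -> bool:
    """Unconditional secondary: 2+ purchases within 7 days of registration."""
    within = [t for t in all_purchases if t - registered_at <= 7 * DAY]
    return len(within) >= 2

reg = datetime(2026, 4, 1, 12, 0)
assert primary_converted(reg, [reg + timedelta(hours=3)])
assert not primary_converted(reg, [reg + timedelta(hours=30)])
assert second_purchase_7d(reg, [reg + timedelta(hours=3),
                                reg + timedelta(days=5)])
```

Because the unconditional version is computed over all randomized users, it sidesteps the denominator-composition issue noted above for the conditional second-purchase rate.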
From meeting 2026-03-27:
- Phillip Correia: Extract all production packages and share package configuration (claim limit, expiry, unassign-on-next-purchase) with the group
- Phillip Correia: Discuss priority and implementation plan with Rami (engineering resourcing)
- Simon Dackombe: Schedule follow-up meeting for Friday 2026-04-03
- GP Engineering: Client changes (data-driven popup, variant logging) once prioritized
- Store / Platform Eng.: Backend changes (variant package, signup-bonus service update) once prioritized
- Current Welcome Offer package configuration: Resolved. Claim limit: 1, relative expiry: 7 days (604,800,000 ms), unassign-on-next-purchase: false. Both variant packages must match these settings exactly (except price and contents).
- Post-test plan: If the treatment wins, does everyone get $10/SC20? If it loses, is the $20/SC40 offer retained? What about an inconclusive result?
- 13% baseline source: Where does the 13% conversion rate come from? Does it cover popup conversions, coin store conversions, or both? The two surfaces likely have different rates. The entire sample size calculation depends on this number.
- Analysis approach: After investigating the registration flow, ITT vs. per-protocol is largely academic here. Registration flows directly into the lobby in a single uninterrupted redirect; package assignment and popup display happen in the same session. The only edge case is a network failure or browser crash between Auth0 completing registration and the game client loading, which should be rare. Users blocked by email verification, geo-restrictions, or market restrictions never reach the point where the package is assigned. Recommend ITT as the default, with no need for a separate per-protocol analysis.
- A/A success criteria: What statistical test will be used to declare the A/A "passed"? What threshold? A common approach is to run the same hypothesis test planned for the A/B and confirm the null is not rejected, but this should be specified upfront.
- Stopping rules: Since this is a fixed-sample design (not sequential), should the team commit to not analysing interim results until the full sample is collected? If interim monitoring is desired, the number and timing of looks should be pre-specified, with alpha adjustment to control the false positive rate.
- Pre-registration: Consider documenting the analysis plan (primary metric definition, test statistic, decision rule, sample size) before data collection begins.