Cookbook: Ready-to-Use Queries
This section provides proven, safe query patterns for common business questions. Each example includes the goal, a ready-to-run SQL template, and notes on how to interpret the results and avoid common pitfalls.
For more information on the underlying tables and data consistency rules, please see our Start Here guide.
Answering Common Business Questions
1. Review volume decline (last 60 vs. previous 60)
When to use it: Use this query to identify locations experiencing a significant drop in review activity. This is valuable for spotting potential reputation issues, engagement problems, or locations that may need additional support to maintain their online presence.
SQL template:
WITH cur AS (
SELECT location_id, SUM(review_versions_count) AS rv_60d
FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS
WHERE date_id BETWEEN DATEADD('day', -60, CURRENT_DATE()) AND DATEADD('day', -1, CURRENT_DATE())  -- 60 full days ending yesterday, matching the 60-day previous window below
GROUP BY 1
),
prev AS (
SELECT location_id, SUM(review_versions_count) AS rv_prev_60d
FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS
WHERE date_id BETWEEN DATEADD('day', -120, CURRENT_DATE()) AND DATEADD('day', -61, CURRENT_DATE())
GROUP BY 1
)
SELECT
l.brand, l.location_id, l.name,
p.rv_prev_60d, c.rv_60d,
(c.rv_60d - p.rv_prev_60d) AS delta_rv,
100.0 * (c.rv_60d - p.rv_prev_60d) / NULLIF(p.rv_prev_60d, 0) AS delta_rv_pct
FROM DATA.ANALYTICS.DIM_LOCATIONS l
JOIN cur c USING (location_id)
JOIN prev p USING (location_id)
WHERE l.active = TRUE AND p.rv_prev_60d > 0 AND c.rv_60d < p.rv_prev_60d
ORDER BY delta_rv_pct ASC NULLS LAST;
How to read it: Each row shows a location with declining review volume. The delta_rv column shows the absolute decrease in review count, while delta_rv_pct shows the percentage decline. Locations are sorted by percentage decline (most severe declines first). A -50% decline means the location received half as many reviews in the recent period compared to the previous period.
Guardrails:
- This query only shows active locations that had reviews in both periods and experienced a decline
- Consider seasonality—locations may naturally see different review patterns during holidays or off-seasons
- Very small locations (low baseline review counts) may show large percentage swings from normal variance; a thresholded variant is sketched after this list
- Review volume can be affected by external factors like local events, temporary closures, or platform algorithm changes
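Where small baselines make the percentage column noisy, the same comparison can be restricted to material declines. The sketch below reuses the template above and adds two illustrative thresholds (a 20-review baseline and a 30% drop); both numbers are assumptions to tune, not recommendations.

-- Sketch: flag only material declines (example thresholds, adjust to your portfolio)
WITH cur AS (
    SELECT location_id, SUM(review_versions_count) AS rv_60d
    FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS
    WHERE date_id BETWEEN DATEADD('day', -60, CURRENT_DATE()) AND DATEADD('day', -1, CURRENT_DATE())
    GROUP BY 1
),
prev AS (
    SELECT location_id, SUM(review_versions_count) AS rv_prev_60d
    FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS
    WHERE date_id BETWEEN DATEADD('day', -120, CURRENT_DATE()) AND DATEADD('day', -61, CURRENT_DATE())
    GROUP BY 1
)
SELECT
    l.brand, l.location_id, l.name,
    p.rv_prev_60d, c.rv_60d,
    100.0 * (c.rv_60d - p.rv_prev_60d) / p.rv_prev_60d AS delta_rv_pct
FROM DATA.ANALYTICS.DIM_LOCATIONS l
JOIN cur c USING (location_id)
JOIN prev p USING (location_id)
WHERE l.active = TRUE
  AND p.rv_prev_60d >= 20   -- minimum baseline to reduce small-sample noise
  AND 100.0 * (c.rv_60d - p.rv_prev_60d) / p.rv_prev_60d <= -30   -- only declines of 30% or more
ORDER BY delta_rv_pct ASC;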
2. Unresolved negative interactions build-up (last 14 days)
When to use it: Use this query to identify locations that are accumulating unresolved negative customer interactions over the past two weeks. This helps prioritize locations that may need immediate attention to prevent customer satisfaction issues from escalating or damaging reputation.
SQL template:
SELECT
i.location_id,
l.name,
SUM(i.negative_interactions_count) AS negative_14d,
SUM(i.resolved_negative_interactions_count) AS resolved_negative_14d,
SUM(i.negative_interactions_count - i.resolved_negative_interactions_count) AS unresolved_negative_14d
FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS i
JOIN DATA.ANALYTICS.DIM_LOCATIONS l USING (location_id)
WHERE i.date_id BETWEEN DATEADD('day', -14, CURRENT_DATE()) AND CURRENT_DATE()
GROUP BY i.location_id, l.name
HAVING SUM(i.negative_interactions_count - i.resolved_negative_interactions_count) > 0
ORDER BY unresolved_negative_14d DESC;
How to read it: Each row represents a location with unresolved negative interactions in the last 14 days. The unresolved_negative_14d column shows how many negative interactions remain unaddressed. Locations are sorted by unresolved count (highest first), making it easy to prioritize which locations need immediate follow-up.
Guardrails:
- This is a 14-day rolling window—some "unresolved" items may be recent and within normal response time expectations
- Consider location size and volume when interpreting results; high-traffic locations may naturally have more unresolved items (a rate-based variant is sketched after this list)
- Query only shows locations with at least one unresolved negative interaction; locations with zero unresolved items are excluded
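Because raw counts naturally favor high-traffic locations, a useful companion view reports the unresolved share of negative interactions alongside the count. This is a sketch of the same 14-day aggregation with one extra ratio column; no new tables or columns are assumed.

-- Sketch: same 14-day window, plus the unresolved share of negative interactions
SELECT
    i.location_id,
    l.name,
    SUM(i.negative_interactions_count) AS negative_14d,
    SUM(i.negative_interactions_count - i.resolved_negative_interactions_count) AS unresolved_negative_14d,
    ROUND(100.0 * SUM(i.negative_interactions_count - i.resolved_negative_interactions_count)
          / NULLIF(SUM(i.negative_interactions_count), 0), 1) AS unresolved_rate_pct
FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS i
JOIN DATA.ANALYTICS.DIM_LOCATIONS l USING (location_id)
WHERE i.date_id BETWEEN DATEADD('day', -14, CURRENT_DATE()) AND CURRENT_DATE()
GROUP BY i.location_id, l.name
HAVING SUM(i.negative_interactions_count - i.resolved_negative_interactions_count) > 0
-- Sorting by rate first surfaces locations where most negatives go unanswered, regardless of volume
ORDER BY unresolved_rate_pct DESC, unresolved_negative_14d DESC;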
3. Top earners in the last 30 days
When to use it: Use this query to identify which employees have received the most rewards and customer mentions in the last 30 days. This is valuable for recognizing top performers, analyzing the effectiveness of recognition programs, and understanding which team members are driving positive customer experiences.
SQL template:
SELECT
em.employee_id,
de.first_name,
de.last_name,
SUM(em.reward_amount) AS rewards_30d,
COUNT(*) AS mentions_30d
FROM DATA.ANALYTICS.FACT_EMPLOYEE_MENTIONS em
LEFT JOIN DATA.ANALYTICS.DIM_EMPLOYEES de
ON de.employee_id = em.employee_id
WHERE em.created_at_date >= DATEADD('day', -30, CURRENT_DATE())
GROUP BY em.employee_id, de.first_name, de.last_name
ORDER BY rewards_30d DESC;
How to read it: Each row represents an employee who received mentions in the last 30 days. The rewards_30d column shows total reward value earned, and mentions_30d shows the number of times they were mentioned. Results are sorted by total rewards (highest first). Rows with NULL names correspond to employee IDs that appear in mentions but not in the employee dimension (possibly former employees or data sync issues).
Guardrails:
- Reward amounts depend on business rules and may vary by mention type or location
- Be mindful of privacy and recognition policies when sharing employee performance data
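To scope the leaderboard to a single brand (for example, when recognition programs are run per brand), the mentions fact can be joined to DIM_LOCATIONS on location_id, as recipe 5 already does. A sketch, with 'Acme' standing in as a placeholder brand value:

-- Sketch: 30-day leaderboard restricted to one brand ('Acme' is a placeholder)
SELECT
    em.employee_id,
    de.first_name,
    de.last_name,
    SUM(em.reward_amount) AS rewards_30d,
    COUNT(*) AS mentions_30d
FROM DATA.ANALYTICS.FACT_EMPLOYEE_MENTIONS em
JOIN DATA.ANALYTICS.DIM_LOCATIONS l
    ON l.location_id = em.location_id
LEFT JOIN DATA.ANALYTICS.DIM_EMPLOYEES de
    ON de.employee_id = em.employee_id
WHERE em.created_at_date >= DATEADD('day', -30, CURRENT_DATE())
  AND l.brand = 'Acme'   -- replace with the brand you are reporting on
GROUP BY em.employee_id, de.first_name, de.last_name
ORDER BY rewards_30d DESC NULLS LAST;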
4. Adoption summary by location segments (last 30 days)
When to use it: Use this query to compare engagement and feature adoption metrics across different groups of locations. This is particularly useful for evaluating pilot programs, comparing regional performance, or assessing the impact of different onboarding strategies. You can define custom segments based on any grouping criteria (region, brand, pilot cohort, etc.).
SQL template:
WITH segments AS (
-- Define your comparison groups here (one row per location_id per segment)
SELECT 'Segment A' AS segment, id AS location_id
FROM (VALUES (101),(102),(103)) AS t(id)
UNION ALL
SELECT 'Segment B' AS segment, id AS location_id
FROM (VALUES (201),(202)) AS t(id)
UNION ALL
SELECT 'Segment C' AS segment, id AS location_id
FROM (VALUES (301),(302),(303)) AS t(id)
),
loc_agg AS (
SELECT
location_id,
MAX(IFF(interactions_count > 0
OR outbound_messages_count > 0
OR inbound_messages_count > 0, 1, 0)) AS active_flag_30d,
MAX(IFF(chat_interactions_count > 0, 1, 0)) AS chat_flag_30d,
AVG(avg_first_response_minutes) AS avg_afr_30d
FROM DATA.ANALYTICS.FACT_LOCATION_DAILY_STATS
WHERE date_id >= DATEADD('day', -30, CURRENT_DATE())
GROUP BY location_id
)
SELECT
s.segment,
COUNT(*) AS total_locations,
SUM(active_flag_30d) AS active_locations_30d,
ROUND(100.0 * SUM(active_flag_30d) / NULLIF(COUNT(*), 0), 1) AS active_rate_pct,
SUM(chat_flag_30d) AS chat_adopted_locations_30d,
ROUND(100.0 * SUM(chat_flag_30d) / NULLIF(COUNT(*), 0), 1) AS chat_adoption_rate_pct,
AVG(avg_afr_30d) AS avg_first_response_minutes_30d
FROM loc_agg a
JOIN segments s
ON s.location_id = a.location_id
GROUP BY s.segment
ORDER BY total_locations DESC;
How to read it: Each row represents one segment with aggregated metrics. The active_rate_pct shows what percentage of locations in that segment had any platform activity in the last 30 days. The chat_adoption_rate_pct shows what percentage used the chat feature. The avg_first_response_minutes_30d shows average response time for locations in that segment. Compare these metrics across segments to identify high and low performers.
Guardrails:
- You must manually define your segments in the CTE by replacing the example location IDs with your actual comparison groups; the sketch after this list shows how to derive segments from DIM_LOCATIONS instead
- A location is considered "active" if it had any interactions or messages in the 30-day window; adjust this logic if your definition differs
- Chat adoption only measures if chat was used at least once; it doesn't measure frequency or quality of chat engagement
- Average first response time only includes locations that had responses; locations with no activity don't affect this metric
- Segment sizes can vary significantly; consider both percentage metrics and absolute counts when interpreting results
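When the comparison groups correspond to an existing attribute in DIM_LOCATIONS (brand, region, etc.), the hard-coded VALUES lists can be replaced by a query. A minimal sketch that derives one segment per brand; paste the SELECT into the segments CTE above, or swap brand for whichever attribute defines your cohorts:

-- Sketch: derive the segment-to-location mapping from DIM_LOCATIONS
-- instead of hand-written VALUES lists (here: one segment per brand)
SELECT
    brand AS segment,
    location_id
FROM DATA.ANALYTICS.DIM_LOCATIONS
WHERE active = TRUE;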
5. Geographic analysis: Performance by state/region (last 90 days)
When to use it: Use this query to analyze performance metrics for US locations by region. This is valuable for identifying regional trends, comparing regional performance, or planning regional initiatives.
SQL template:
-- Aggregate each fact table per location first so joining them does not multiply rows (fan-out)
WITH reviews AS (
    SELECT location_id,
           COUNT(review_id) AS total_reviews,
           AVG(rating) AS avg_rating
    FROM DATA.ANALYTICS.FACT_REVIEWS
    WHERE last_version_date >= DATEADD('day', -90, CURRENT_DATE())
    GROUP BY location_id
),
interactions AS (
    SELECT location_id,
           AVG(first_response_minutes) AS avg_first_response_minutes
    FROM DATA.ANALYTICS.FACT_INTERACTIONS
    WHERE created_at_date >= DATEADD('day', -90, CURRENT_DATE())
    GROUP BY location_id
),
mentions AS (
    SELECT em.location_id,
           COUNT(DISTINCT de.employee_id) AS employees_mentioned
    FROM DATA.ANALYTICS.FACT_EMPLOYEE_MENTIONS em
    JOIN DATA.ANALYTICS.DIM_EMPLOYEES de
        ON de.employee_id = em.employee_id
    WHERE em.created_at_date >= DATEADD('day', -90, CURRENT_DATE())
    GROUP BY em.location_id
)
SELECT
    l.us_region_9,
    COUNT(DISTINCT l.location_id) AS total_locations,
    COUNT(DISTINCT CASE WHEN l.active THEN l.location_id END) AS active_locations,
    AVG(r.avg_rating) AS avg_rating,
    SUM(r.total_reviews) AS total_reviews,
    AVG(i.avg_first_response_minutes) AS avg_first_response_minutes,
    SUM(m.employees_mentioned) AS total_employees
FROM DATA.ANALYTICS.DIM_LOCATIONS l
LEFT JOIN reviews r ON r.location_id = l.location_id
LEFT JOIN interactions i ON i.location_id = l.location_id
LEFT JOIN mentions m ON m.location_id = l.location_id
WHERE l.country = 'US'
  AND l.internal = FALSE
GROUP BY l.us_region_9
HAVING COUNT(DISTINCT l.location_id) >= 3
ORDER BY total_locations DESC, l.us_region_9;
How to read it: Each row represents a US region. The total_locations shows all locations in that region, while active_locations shows only active ones. Performance metrics (ratings, response times, employee counts) are calculated over the last 90 days. Regions are sorted by location count (largest first).
Guardrails:
- Query automatically excludes internal test/demo locations via the internal = FALSE filter
- Only regions with 3+ locations are shown, to avoid statistical noise from sparsely covered regions
- avg_rating and avg_first_response_minutes are unweighted averages of per-location averages; total_employees counts an employee once per location where they were mentioned
- 90-day window for metrics may not be representative for seasonal businesses
- NULL values in averages indicate no data for that metric in the time period
- Geographic coordinates (lat, lon) and timezone fields are also available for mapping and time-zone-aware analysis; see the sketch after this list
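As the last guardrail notes, DIM_LOCATIONS also exposes lat, lon, and timezone. A minimal sketch for feeding a map visualization, pairing those fields with a 90-day review count per location; the coordinate and timezone column names follow the guardrail above, so verify them against your copy of DIM_LOCATIONS before relying on this:

-- Sketch: per-location coordinates and 90-day review volume for mapping
SELECT
    l.location_id,
    l.name,
    l.us_region_9,
    l.lat,
    l.lon,
    l.timezone,
    COUNT(r.review_id) AS reviews_90d
FROM DATA.ANALYTICS.DIM_LOCATIONS l
LEFT JOIN DATA.ANALYTICS.FACT_REVIEWS r
    ON r.location_id = l.location_id
   AND r.last_version_date >= DATEADD('day', -90, CURRENT_DATE())
WHERE l.country = 'US'
  AND l.active = TRUE
  AND l.internal = FALSE
GROUP BY 1,2,3,4,5,6
ORDER BY reviews_90d DESC;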
6. Transaction revenue trends by brand (last 90 days)
When to use it: Use this query to analyze transaction revenue patterns across brands over the past 90 days. This helps identify revenue trends, seasonal patterns, and compare performance across different brands. Useful for financial reporting, forecasting, and identifying growth opportunities.
SQL template:
SELECT
l.brand,
DATE_TRUNC('week', t.transaction_date) AS week,
COUNT(DISTINCT t.transaction_id) AS transactions_count,
COUNT(DISTINCT t.customer_id) AS unique_customers,
SUM(t.transaction_item_count) AS total_items_sold,
SUM(t.transaction_subtotal) AS revenue,
AVG(t.transaction_subtotal) AS avg_transaction_value
FROM DATA.ANALYTICS.FACT_TRANSACTIONS t
JOIN DATA.ANALYTICS.DIM_LOCATIONS l USING (location_id, account_id)
WHERE t.transaction_date >= DATEADD('day', -90, CURRENT_DATE())
GROUP BY 1,2
ORDER BY 1,2;
How to read it: Each row represents one brand for one week. The transactions_count shows total transactions, unique_customers shows how many distinct customers made purchases, and revenue shows total transaction revenue. The avg_transaction_value helps identify if revenue changes are driven by transaction count or ticket size. Compare weeks within a brand to spot trends, and compare across brands to identify performance differences.
Guardrails:
- Transaction data depends on POS system integration and may have delays or gaps depending on sync schedules
- Revenue is measured by transaction_subtotal, which may not include taxes, tips, or other adjustments depending on POS configuration
- Customer IDs from different POS systems may not be deduplicated; unique_customers counts unique IDs within each POS system
- Week-over-week comparisons may be affected by holidays, local events, or temporary closures
- Consider store count changes when comparing brand-level metrics over time
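To make the week-to-week comparison explicit rather than visual, the weekly aggregate can be wrapped in a LAG window function that computes each brand's revenue change versus the prior week. A sketch built on the same 90-day aggregation:

-- Sketch: weekly revenue per brand with week-over-week change
WITH weekly AS (
    SELECT
        l.brand,
        DATE_TRUNC('week', t.transaction_date) AS week,
        SUM(t.transaction_subtotal) AS revenue
    FROM DATA.ANALYTICS.FACT_TRANSACTIONS t
    JOIN DATA.ANALYTICS.DIM_LOCATIONS l USING (location_id, account_id)
    WHERE t.transaction_date >= DATEADD('day', -90, CURRENT_DATE())
    GROUP BY 1,2
)
SELECT
    brand,
    week,
    revenue,
    revenue - LAG(revenue) OVER (PARTITION BY brand ORDER BY week) AS revenue_change,
    ROUND(100.0 * (revenue - LAG(revenue) OVER (PARTITION BY brand ORDER BY week))
          / NULLIF(LAG(revenue) OVER (PARTITION BY brand ORDER BY week), 0), 1) AS revenue_change_pct
FROM weekly
ORDER BY brand, week;

The first week per brand has no prior week, so revenue_change is NULL there, and the partial weeks at the edges of the 90-day window will look artificially low.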
7. Customer purchase frequency analysis (last 90 days)
When to use it: Use this query to segment customers by purchase frequency and understand retention patterns. This helps identify your most loyal customers, one-time buyers, and opportunities to increase repeat purchase rates.
SQL template:
WITH customer_txns AS (
SELECT
t.location_id,
t.customer_id,
COUNT(DISTINCT t.transaction_id) AS transaction_count,
SUM(t.transaction_subtotal) AS total_spent,
MIN(t.transaction_date) AS first_transaction_date,
MAX(t.transaction_date) AS last_transaction_date,
DATEDIFF('day', MIN(t.transaction_date), MAX(t.transaction_date)) AS customer_lifespan_days
FROM DATA.ANALYTICS.FACT_TRANSACTIONS t
WHERE t.transaction_date >= DATEADD('day', -90, CURRENT_DATE())
GROUP BY 1,2
),
frequency_segments AS (
SELECT
location_id,
customer_id,
transaction_count,
total_spent,
customer_lifespan_days,
CASE
WHEN transaction_count = 1 THEN '1 - One-time buyer'
WHEN transaction_count BETWEEN 2 AND 3 THEN '2 - Occasional (2-3 visits)'
WHEN transaction_count BETWEEN 4 AND 9 THEN '3 - Regular (4-9 visits)'
WHEN transaction_count >= 10 THEN '4 - Frequent (10+ visits)'
END AS frequency_segment
FROM customer_txns
)
SELECT
l.brand,
l.name AS location_name,
fs.frequency_segment,
COUNT(DISTINCT fs.customer_id) AS customers,
AVG(fs.transaction_count) AS avg_visits,
AVG(fs.total_spent) AS avg_total_spent,
AVG(fs.customer_lifespan_days) AS avg_lifespan_days
FROM frequency_segments fs
JOIN DATA.ANALYTICS.DIM_LOCATIONS l USING (location_id)
GROUP BY 1,2,3
ORDER BY 1,2,3;
How to read it: Each row represents a customer frequency segment within a location. The segments range from one-time buyers to frequent customers (10+ visits). The customers column shows how many customers fall into each segment, avg_visits shows average transaction count, and avg_total_spent shows average revenue per customer in that segment. Use this to identify what percentage of your customer base is loyal vs. transient.
Guardrails:
- This analysis is based on a 90-day window; adjust the timeframe based on your business cycle (longer for infrequent purchase categories)
- Customer IDs may not deduplicate across different locations
- New customers who just made their first purchase will appear as "one-time buyers" but may return outside this analysis window
- High one-time buyer percentages aren't necessarily bad for locations in tourist areas or other transient-traffic locations
- customer_lifespan_days will be 0 for one-time buyers; use avg_visits as the primary metric for that segment
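To answer the loyal-vs-transient question as a percentage directly, the segment counts can be converted into each location's share of customers with a window function. A sketch that reuses the same CTEs, trimmed to the columns the share calculation needs:

-- Sketch: share of each location's customers falling into each frequency segment
WITH customer_txns AS (
    SELECT
        t.location_id,
        t.customer_id,
        COUNT(DISTINCT t.transaction_id) AS transaction_count
    FROM DATA.ANALYTICS.FACT_TRANSACTIONS t
    WHERE t.transaction_date >= DATEADD('day', -90, CURRENT_DATE())
    GROUP BY 1,2
),
frequency_segments AS (
    SELECT
        location_id,
        customer_id,
        CASE
            WHEN transaction_count = 1 THEN '1 - One-time buyer'
            WHEN transaction_count BETWEEN 2 AND 3 THEN '2 - Occasional (2-3 visits)'
            WHEN transaction_count BETWEEN 4 AND 9 THEN '3 - Regular (4-9 visits)'
            ELSE '4 - Frequent (10+ visits)'
        END AS frequency_segment
    FROM customer_txns
)
SELECT
    l.brand,
    l.name AS location_name,
    fs.frequency_segment,
    COUNT(DISTINCT fs.customer_id) AS customers,
    ROUND(100.0 * COUNT(DISTINCT fs.customer_id)
          / SUM(COUNT(DISTINCT fs.customer_id)) OVER (PARTITION BY l.location_id), 1) AS pct_of_location_customers
FROM frequency_segments fs
JOIN DATA.ANALYTICS.DIM_LOCATIONS l
    ON l.location_id = fs.location_id
GROUP BY l.brand, l.name, l.location_id, fs.frequency_segment
ORDER BY l.brand, location_name, fs.frequency_segment;

Shares sum to 100% within each location, so segments can be compared across locations of very different sizes.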