Core responsibilities (ongoing)
• Build and maintain transformation pipelines (ELT/ETL) from raw sources to curated marts.
• Design dimensional models: facts, dimensions, conformed dimensions, SCD patterns.
• Define and govern metric logic in a semantic layer (dbt Semantic Layer / LookML / SSAS tabular equivalent).
• Partner with BI Devs to ensure dashboards map to certified metrics (no "shadow KPIs").
• Own analytics data documentation (catalog descriptions, lineage notes, usage guidance).
• Conduct model performance tuning (partitioning, clustering, incremental strategies).
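The SCD patterns mentioned above can be made concrete with a minimal sketch. The snippet below is a hypothetical illustration of a Type 2 slowly changing dimension update in Python (the `DimRow` structure, column names, and `apply_scd2` helper are invented for this example, not part of any specific framework): changed attributes expire the current version and open a new one, preserving history.

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    customer_id: int
    city: str                   # the tracked (type 2) attribute
    valid_from: date
    valid_to: Optional[date]    # None = still current
    is_current: bool = True

def apply_scd2(current: list[DimRow],
               incoming: dict[int, str],
               as_of: date) -> list[DimRow]:
    """Apply an SCD Type 2 update: expire changed rows, append new versions."""
    out: list[DimRow] = []
    seen: set[int] = set()
    for row in current:
        key = row.customer_id
        if row.is_current and key in incoming and incoming[key] != row.city:
            # Attribute changed: close out the old version...
            out.append(replace(row, valid_to=as_of, is_current=False))
            # ...and open a new current version effective as_of.
            out.append(DimRow(key, incoming[key], as_of, None, True))
        else:
            out.append(row)
        seen.add(key)
    # Keys never seen before get an initial version.
    for key, city in incoming.items():
        if key not in seen:
            out.append(DimRow(key, city, as_of, None, True))
    return out
```

In a warehouse this same logic is typically expressed as a `MERGE` or a dbt snapshot rather than row-by-row Python; the point here is only the versioning semantics (expire, then insert).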
Key interfaces
• Data Engineering: upstream contracts, schema changes, CDC semantics.
• BI: dashboard requirements, KPI definitions, visualization constraints.
• Data PM: roadmap prioritization, adoption goals.
• Governance: definitions, stewardship, certification process.
KPIs (leading + lagging)
• Leading: % models with tests, docs completeness, build time, review cycle time.
• Lagging: metric discrepancy incidents, dashboard trust score, adoption of certified models.
A-player competencies (Topgrading-style)
• Systems thinking: understands upstream/downstream ripple effects.
• Semantic rigor: defines grain, metric logic, edge cases explicitly.
• Pragmatic standards: enforces consistency without blocking delivery.
• Stakeholder translation: converts business questions into data contracts and models.
• AY behaviors: Trust at Scale, Clarity Over Complexity, Ownership.
Minimum qualifications
• Proven analytics modeling in a warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks).
• Strong SQL and dimensional modeling patterns (Kimball; Data Vault exposure acceptable).
• Experience with dbt or equivalent transformation framework.
• Familiarity with BI tools' semantic behaviors (Power BI DAX, Tableau LODs, Looker).
Work sample (recommended)
• Given 4 raw tables + messy requirements, produce:
  • 1 fact table, 3 dimension tables, 5 KPIs,
  • tests + documentation,
  • an explanation of the grain + edge cases.
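"Explain the grain" has a mechanical counterpart worth probing in the work sample: does the candidate verify that the declared grain columns are actually unique? A minimal sketch of that check, the same idea behind dbt's generic `unique`/`not_null` tests (the `check_grain` helper and column names are hypothetical):

```python
from collections import Counter

def check_grain(rows: list[dict], grain: tuple[str, ...]) -> list[tuple]:
    """Return grain-key combinations that appear more than once.

    An empty result means the table really is at its declared grain.
    """
    counts = Counter(tuple(row[col] for col in grain) for row in rows)
    return [key for key, n in counts.items() if n > 1]

# Example: a fact claimed to be at (order_id, line_number) grain.
fact_rows = [
    {"order_id": 1, "line_number": 1, "amount": 10},
    {"order_id": 1, "line_number": 2, "amount": 5},
    {"order_id": 1, "line_number": 2, "amount": 5},  # duplicate grain key
]
```

A strong candidate will run (or at least articulate) exactly this kind of check before discussing edge cases, since every metric definition downstream depends on the grain holding.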