Senior Data Engineer - Snowflake
Job Description:
• The Senior Data Engineer (Sourcing & Pipeline Management) will play an integral role within the growing Varsity Brands Data Center of Excellence team.
• Architect + Implement: Design, build, and launch efficient and reliable data pipelines to move data from source platforms, including front-end applications, back-end systems, and third-party analytics and data services, to our enterprise data hub.
• In addition, design and build pipelines to supply downstream enterprise applications with prepared reference data from our enterprise data hub.
• Orchestrate + Monitor: Manage data pipelines as an interdependent network, with proactive visibility into pipeline errors as well as costs over time.
• Partner + Educate: Partner with stakeholders to understand business requirements, work with cross-functional data and product teams, and build efficient and scalable data solutions.
• Use your data and analytics experience to identify gaps and propose improvements in existing systems and processes, and make your source data pipelines easily accessible to data stakeholders.
• Work with data modelers and analysts to identify and prioritize data sourcing gaps.
• Assess the best-fit tool for any given data source.
• Establish pipeline cadences and timing based on analytics needs and use cases while remaining cost conscious.
• Provide downstream data stakeholders with visibility into pipeline scheduling and status.
• Responsively troubleshoot errors or alerts in existing pipelines.
• Track and summarize current-period pipeline costs and trends for business and IT stakeholders.
Requirements:
• Familiarity with modern data stack tools and services used to replicate data from source systems to cloud data warehouses or lakes, particularly Snowflake
• Experience using data replication tools and services such as HVR, Fivetran, Airbyte, Meltano, and Matillion a MUST
• Proficiency writing custom code to source data from APIs when needed
• Ability to work collaboratively with product or application owners to tease out what relevant raw data is available to source
• Ability to identify source system data capture opportunities to unlock analytics capabilities
• Strong knowledge of data architecture, data modeling, schema design, and software development principles
• 3+ years of experience in the data engineering/warehousing space, including custom ELT/ETL design, implementation, and maintenance
• 3+ years of experience writing SQL in an analytics or data pipeline context
• 2+ years of experience in at least one language (Python, Scala, Java, etc.) in a data engineering or analytics context
• 1+ year of experience using an orchestration tool or service to coordinate ELT and downstream analytics pipelines
• Experience using REST APIs to acquire and move data from source to target systems
• Experience working with cloud data analytics platforms and tools, particularly Snowflake, dbt, Tableau, and Power BI a MUST
• Experience standing up data pipelines from SAP ERP a plus
• Experience standing up data pipelines from Google Analytics 4 data a plus
Benefits:
• Comprehensive Health Care Benefits
• HSA Employer Contribution / FSA Opportunities
• Wellbeing Program
• 401(k) plan with company matching
• Company-paid Life, AD&D, and Short-Term Disability
• Generous My Time Off & Paid Holidays
• Varsity Brands Ownership Program
• Employee Resource Groups
• St. Jude Partnership & Volunteer Opportunities
• Employee Perks, including discounts on personal apparel and equipment!