Urgent Requirement: Data Engineer, Snowflake & DataHub - Remote
Job Location: Remote

Job Description:
We are seeking an experienced Data Engineer with expertise in Snowflake and DataHub to join our dynamic team. The ideal candidate will have a strong background in cloud data platforms, data integration, and managing data ecosystems using Snowflake as the core data warehouse solution. You will be responsible for building, managing, and optimizing data pipelines, ensuring data quality and availability, and integrating with a broader data hub architecture.

Key Responsibilities:
• Design, build, and maintain scalable data pipelines using Snowflake.
• Manage and optimize Snowflake environments, ensuring high performance and cost-efficiency.
• Integrate data from multiple sources into Snowflake and ensure its accessibility across the enterprise.
• Work with business intelligence, data science, and analytics teams to ensure smooth data availability and enable self-service analytics.
• Design and implement data models and architectures within Snowflake that support various analytics use cases.
• Support DataHub initiatives by integrating Snowflake as the central repository for all data.
• Perform ETL/ELT tasks for data transformation and loading into Snowflake, ensuring clean, accurate, and timely data.
• Collaborate with cross-functional teams to identify data requirements and deliver data solutions that meet business needs.
• Troubleshoot and resolve issues related to data pipelines, Snowflake queries, and integration tools.
• Ensure compliance with data governance, data quality, and security standards.
• Develop and automate data workflows and pipelines for improved productivity.
• Monitor data flow and provide regular status reports to stakeholders.

Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
• 8+ years of experience as a Data Engineer, with at least 3 years working with Snowflake.
• Strong knowledge of Snowflake architecture, data modeling, and query optimization.
• Experience building and managing data pipelines in cloud environments (AWS, Azure, GCP).
• Proficiency in SQL and data transformation techniques (ETL/ELT).
• Familiarity with DataHub frameworks and integrating Snowflake within data ecosystems.
• Experience with version control, CI/CD pipelines, and automation tools.
• Strong knowledge of data warehousing concepts, data governance, and best practices.
• Experience with Python, Spark, or other scripting languages for data manipulation is a plus.

Preferred Qualifications:
• Experience with other cloud data platforms such as AWS Redshift, Google BigQuery, or Azure Synapse.
• Certifications in Snowflake or cloud platforms (AWS, Azure, GCP).
• Experience with data integration tools such as Apache Kafka, Apache NiFi, or Fivetran.
• Knowledge of data visualization tools such as Tableau, Power BI, or Looker.
• Experience with Agile methodologies.

Key Competencies:
• Strong problem-solving and analytical skills.
• Excellent communication and collaboration abilities.
• Ability to work in a fast-paced and evolving environment.
• Attention to detail and commitment to data quality.
• Ability to manage and prioritize multiple tasks and deadlines.

Apply to this job