Remote Cloud Data Warehouse Architect

Remote, USA Full-time
This is a fully remote job; the offer is available from Pennsylvania (USA).

SUMMARY

The Cloud Data Warehouse Architect will design and deliver the next-generation enterprise analytics platform. This position is highly technical and will focus on building a cloud-native, SAP-integrated, AI-ready architecture that supports analytics, reporting, and advanced machine learning at scale. The architect will modernize the current BI and data warehouse environment, anchored today in IBM Netezza, Cognos, and Tableau, into a cloud-based architecture. The role requires deep technical expertise in data modeling, cloud-native design, and hybrid architectures that bridge legacy on-prem systems with cloud-first capabilities.

The Data Science & Insights group is at the center of the analytics transformation. Our mission is to:
• Consolidate legacy BI systems (Netezza, Cognos) into a modern cloud architecture.
• Support the SAP S/4HANA migration with tight integration into the future state.
• Deliver governed, high-performance datasets for self-service analytics in Tableau, Power BI, and SAC.
• Enable AI/ML use cases through Databricks and Azure ML.
• Extend analytics capabilities to our partners and vendors via embedded reporting.

This is an opportunity to be the hands-on architect shaping the future-state data strategy, working in a fast-paced, hybrid cloud environment that balances innovation with enterprise stability.

ESSENTIAL DUTIES AND RESPONSIBILITIES

Architectural Design & Modernization
• Lead the design of a cloud data warehouse and data lakehouse architecture capable of ingesting large-scale transactional and operational data.
• Define integration strategies for core systems.
• Develop a reference architecture that leverages Azure Data Lake Storage (ADLS) and Databricks Delta Lake as core components.
• Implement semantic modeling to unify reporting across Tableau, Power BI, and SAP Analytics Cloud (SAC).

Data Engineering & Performance
• Oversee ingestion pipelines for batch (Netezza extracts, flat files, nightly jobs) and near real-time (APIs, streaming) data sources.
• Optimize query performance through partitioning, clustering, caching, and Delta Lake / warehouse design.
• Establish reusable ETL/ELT patterns across Databricks notebooks, SQL-based orchestration, and integration with ActiveBatch scheduling (a minimal sketch of such a pattern follows the responsibilities list).

Governance, Security & Compliance
• Define and enforce data governance standards (naming conventions, metadata, lineage, data quality).
• Partner with InfoSec on identity management (Azure AD), encryption, and RBAC/ABAC models.
• Implement governance tooling such as Azure Purview, SAP metadata catalogs, Databricks Unity Catalog, and Glasswing.

Collaboration & Enablement
• Partner with data engineers and visualization teams to deliver governed, high-performance datasets consumable in Tableau, Power BI, SAC, and SAP Fiori.
• Serve as the technical SME for architects, engineers, and analysts, ensuring alignment to best practices in cloud-native data warehouse design.
• Drive knowledge transfer from legacy platforms (Netezza, Cognos) into the new ecosystem.
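To make the batch-ingestion and ETL/ELT responsibilities above more concrete, the following is a minimal PySpark sketch of a reusable pattern for landing a nightly flat-file extract into a partitioned Delta Lake table. The storage path, table name, and partition column are illustrative assumptions, not details specified in the posting.

from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("netezza-extract-to-delta")   # hypothetical job name
    .getOrCreate()
)

def ingest_batch(raw_path: str, target_table: str, partition_col: str = "load_date") -> None:
    # Read a nightly flat-file extract (CSV assumed here), stamp it with a load date,
    # and append it into a partitioned Delta table for downstream modeling.
    df = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv(raw_path)
        .withColumn(partition_col, F.current_date())
    )
    (
        df.write
        .format("delta")
        .mode("append")
        .partitionBy(partition_col)   # enables partition pruning for common date filters
        .saveAsTable(target_table)
    )

# Hypothetical invocation; the ADLS path and table name are assumptions:
# ingest_batch("abfss://raw@<storage-account>.dfs.core.windows.net/netezza/orders/", "bronze.orders")

In practice a pattern like this would be parameterized per source and scheduled through the posting's orchestration tooling (for example, ActiveBatch triggering a Databricks job).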
EDUCATION and/or EXPERIENCE

Education
• Bachelor's degree in Computer Science, Engineering, or a related field.

Experience
• 7+ years in data engineering, data warehouse architecture, or cloud data architecture.
• Expertise in Azure (ADLS, Synapse, Purview, Databricks, networking, security).
• Strong proficiency in Databricks (Delta Lake, PySpark, SQL) and/or Snowflake (warehouse design, scaling, security).
• Proven experience in data modeling (3NF, star schema, semantic layers); a brief star-schema sketch appears at the end of this listing.
• Deep SQL expertise across both cloud and traditional RDBMS (Netezza, SQL Server, Progress OpenEdge).
• Understanding of SAP S/4HANA integration and familiarity with SAP Datasphere.

Preferred
• Prior experience migrating from on-prem Netezza or other MPP systems to cloud-native platforms.
• Familiarity with Cognos to Tableau/Power BI migrations and dashboard optimization.
• Hands-on experience with SAP Analytics Cloud (SAC) and embedded analytics.
• Knowledge of machine learning workflows and integration with Databricks MLflow or Azure ML.
• Strong knowledge of data governance frameworks and tooling (Purview, Unity Catalog, SAC governance).

This offer from "EDI Staffing, an EDI Specialists Company" has been enriched by Jobgether.com and received a 72% flex score.
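As a concrete illustration of the star-schema and semantic-layer modeling called for above, here is a minimal PySpark sketch that shapes raw tables into a conformed dimension and a fact table persisted as Delta tables. All table and column names are hypothetical assumptions for illustration, not details from the posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Assumed raw inputs; in practice these would be governed raw/bronze tables.
orders = spark.table("bronze.orders")
customers = spark.table("bronze.customers")

# Dimension: one row per customer, keyed by a surrogate hash for conformed joins.
dim_customer = (
    customers
    .select("customer_id", "customer_name", "region")
    .dropDuplicates(["customer_id"])
    .withColumn("customer_sk", F.xxhash64("customer_id"))
)

# Fact: order grain, carrying only the surrogate key and additive measures.
fact_orders = (
    orders
    .join(dim_customer.select("customer_id", "customer_sk"), on="customer_id", how="left")
    .select("order_id", "customer_sk", "order_date", "order_amount")
)

# Persisting as Delta tables lets Tableau, Power BI, and SAC query one governed model.
dim_customer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_customer")
fact_orders.write.format("delta").mode("overwrite").saveAsTable("gold.fact_orders")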