**Experienced Full Stack Data Engineer – Web & Cloud Application Development at blithequark**
Are you a skilled data engineer looking to join a dynamic team that's revolutionizing the way we connect around the world? Do you have a passion for leveraging cutting-edge technologies to drive business growth and customer satisfaction? Look no further than blithequark, a global leader in communications services.

**About blithequark**

blithequark is one of the world's leading companies in the communications services industry, shaping the way we connect around the globe. We're a human community that spans the world, working behind the scenes to anticipate, lead, and listen. In times of crisis and celebration, we come together, lifting up our communities and striving to make a difference that moves the industry forward. If you're fueled by purpose and powered by persistence, discover a career with us. Here, you'll find the rigor it takes to make a difference and the achievement that comes from living the NetworkLife.

**Job Summary**

As a Senior Data Engineer in the Artificial Intelligence and Data Organization (AI&D) at blithequark, you'll play a critical role in enhancing the company's performance, customer experience, and profitability. You'll work on a variety of projects, including artificial intelligence, data engineering, operations automation, safety remediation, and enterprise omnichannel development. Your expertise will help us build data pipelines and transform data into actionable intelligence, driving business growth and customer satisfaction.
**Key Responsibilities**

* Examine advertising, customer experience, and virtual operations environments to construct data pipelines and transform data into actionable intelligence
* Convert raw data into usable data pipelines, and build data tools and products for test automation and easy data accessibility
* Help manage all advertising-related work on the Big Data/Cloud platform, including designing new programs and training other developers
* Gather requirements, identify gaps, and build roadmaps and architectures that support the analytics-driven business in achieving its goals
* Work closely with Data Analysts to ensure data quality and availability for analytical modeling
* Identify gaps and implement solutions for data security, data quality, and process automation
* Support maintenance, bug fixes, and performance evaluation across the data pipeline
* Design, build, document, test, and implement new data pipelines and analytics
* Collaborate on cross-functional teams to provide new data, develop schema requirements, and maintain metadata
* Build the semantic layer (curated data) and measurement tables for reports and analytics
* Coordinate with the offshore team on design and implementation
* Identify ways to enhance data reliability, performance, and quality
* Use big data tools to solve business problems
* Use data to identify tasks that can be automated
* Design and implement fact tables, reports, and dashboards using Tableau, Looker, and other tools
* Analyze current Teradata/SQL queries for performance improvements

**What We're Looking For**

You're curious about new technologies and the game-changing opportunities they create. You stay up to date with modern developments and apply your technical knowledge to solve business problems. You thrive in fast-paced, innovative environments, working as an outstanding teammate to drive the best results and business outcomes.
**Requirements**

* Bachelor's degree or 3 or more years of work experience
* 3 or more years of relevant work experience
* Even better if you have one or more of the following:
  + 3+ years of database experience with Teradata SQL, Teradata Utilities, SQL Server, SSIS, and OLAP
  + 3+ years of experience designing, building, and deploying production-grade data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, Streaming, HBase, Kafka, Oozie, NiFi, etc.) and programming in Python/Scala
  + 1+ year(s) of experience in cloud data engineering, ideally Google Cloud with BigQuery
  + Dashboard development experience in Tableau, Qlik, and/or Looker
  + Master's degree in Computer Science, Information Systems, or a related technical discipline
  + Teradata/Big Data Analytics certification
  + Knowledge of the telecom industry
  + Experience working with a distributed team
  + Strong presentation, interpersonal, verbal, and written communication skills

**Why blithequark?**

blithequark is dedicated to maintaining a Total Rewards package that's competitive, valued by our employees, and differentiates us as an Employer of Choice. We're a 'pay for performance' company, and your contribution is rewarded through market-competitive salaries, performance-based incentives, and an Employee Stock Program. Through this broad-based discretionary equity award program, we create opportunities for all of us to share in the success of blithequark and the value we help create.
**Benefits**

* Market-competitive salaries
* Performance-based incentives
* Employee Stock Program
* Comprehensive benefits package
* Flexible working arrangements
* Opportunities for career growth and development
* Access to cutting-edge technologies and tools
* Collaborative and dynamic work environment
* Recognition and rewards for outstanding performance

**How to Apply**

If blithequark and this position sound like a fit for you, we encourage you to apply even if you don't meet every "even better" qualification listed above. This position is eligible to be considered for the Department of Defense SkillBridge Program.