Data Engineer
eTeam
Job Description
Role Name: Data Engineer
Location: Raleigh (4 days a week on site)

JOB DESCRIPTION: Data Engineer

What is the opportunity?

The Engineering team is looking for a Data Engineer to design, build, and optimize scalable data solutions while supporting enterprise migration initiatives from legacy or on-premises systems to cloud-based platforms. We are seeking a highly skilled Data Engineer with strong hands-on experience in modern data engineering practices, deep expertise in Snowflake architecture and implementation, and proven experience in data platform migrations. The ideal candidate will lead the engineering work to align with the approved solution architecture and technology roadmap, comply with enterprise technical standards, and serve as a technical resource who solves complex business issues and translates data integration and database systems designs for new initiatives.

What will you do?

- Design, develop, and maintain scalable, reliable data engineering pipelines to support analytics and business intelligence needs.
- Architect, design, and implement solutions that meet stakeholders' needs.
- Architect, implement, and optimize Snowflake data warehouse solutions, including data modeling, performance tuning, and cost optimization.
- Lead and support data migration initiatives, including migrations from on-premises or legacy data platforms to Snowflake or other cloud-based solutions.
- Develop and manage ELT/ETL processes using modern data integration tools and frameworks.
- Participate actively in requirements gathering, data modeling, and design sessions.
- Prepare high-level and detailed technical specifications for projects in accordance with security and architecture documentation objectives.
- Develop detailed plans and accurate estimates for the completion of the build, system testing, and implementation phases of a project.
- Collaborate with data architects, analytics teams, and business stakeholders to translate requirements into technical solutions.
- Run and optimize SQL queries on RDBMSs such as MS SQL Server and MySQL/MariaDB.
- Ensure data quality, security, governance, and compliance throughout the data lifecycle.
- Develop code; document and execute unit tests, systems integration and acceptance tests, and testing tools for functions of high complexity.
- Troubleshoot and resolve performance, data integrity, and pipeline reliability issues.
- Document architecture, data flows, and operational procedures.

What do you need to succeed?

Must-have:
- Strong hands-on experience in data engineering, including data pipeline development and large-scale data processing.
- Deep expertise in Snowflake architecture, including virtual warehouses, micro-partitioning, clustering and performance optimization, and security and access control.
- Proven experience with data migration projects, including assessment, planning, execution, and validation.
- Minimum 5 years of experience in software engineering or analytics, creating enterprise data architectures, distributed and microservice software architectures, and design patterns.
- Strong SQL skills and experience with data modeling (dimensional and/or data vault).
- Core SQL database concepts, including writing DDL and DML scripts, normalization, and running and optimizing SQL queries on RDBMSs.
- Experience working with cloud platforms (AWS, Azure, or GCP).
- 2 years of application development experience with Hadoop and NoSQL databases such as MongoDB, Cassandra, or HBase.
- Familiarity with orchestration and data integration tools (e.g., Airflow, dbt, Informatica, Fivetran, or similar).
- Prior experience with Liquibase and Git code repositories such as GitHub.
- Bachelor's degree in Information Technology or Computer Science.
- Strong problem-solving skills and the ability to work independently in a fast-paced environment.

Nice-to-have:
- Working knowledge of MapReduce or Spark.
- Knowledge of Java/Python and Bash shell scripting.
- Experience using Stonebranch.
- Experience with Tableau.
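For candidates gauging the SQL and ELT/ETL expectations above, here is a minimal, hypothetical sketch of the kind of task involved: DDL to create tables, DML to load rows, and an in-database transform. It uses Python's built-in sqlite3 purely as a stand-in for the RDBMSs named in the posting (MS SQL Server, Snowflake); the table and column names are illustrative only.

```python
import sqlite3

# In-memory stand-in for a real RDBMS (MS SQL Server, Snowflake, etc.).
conn = sqlite3.connect(":memory:")

# DDL: create a raw landing table and a reporting target table.
conn.executescript("""
    CREATE TABLE raw_orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount   REAL NOT NULL
    );
    CREATE TABLE customer_totals (
        customer     TEXT PRIMARY KEY,
        total_amount REAL NOT NULL
    );
""")

# DML: load sample source rows (the extract/load steps of ELT).
conn.executemany(
    "INSERT INTO raw_orders (order_id, customer, amount) VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (2, "acme", 50.0), (3, "globex", 75.0)],
)

# Transform: aggregate inside the database rather than in application code,
# the push-down pattern ELT tools such as dbt are built around.
conn.execute("""
    INSERT INTO customer_totals (customer, total_amount)
    SELECT customer, SUM(amount) FROM raw_orders GROUP BY customer
""")

totals = dict(conn.execute("SELECT customer, total_amount FROM customer_totals"))
print(totals)
```

In production the same pattern would run against a warehouse connection with migrations managed by a tool such as Liquibase and scheduling handled by an orchestrator such as Airflow, per the requirements above.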