ETL Developer
Angel and Genie
Job Description
Job Title: Talend ETL Developer (5-6 Years Experience)
Immediate joiner required
Interview process: 3 levels (Level 3 in person, mandatory)
Experience: 5 to 6 years
Budget: 15 LPA
Work mode: Hybrid, Monday to Thursday on-site (mandatory)
Location: Hyderabad

Job Summary
We are looking for a skilled Talend ETL Developer with 5-6 years of experience in designing, developing, and maintaining data integration solutions. The ideal candidate should have strong expertise in Talend, data warehousing concepts, and ETL processes, with the ability to work in a fast-paced environment and collaborate with cross-functional teams.

Key Responsibilities
• Design, develop, and deploy ETL workflows using Talend Open Studio / Talend Data Integration
• Build and optimize data pipelines for large-scale data processing
• Extract, transform, and load data from multiple sources (databases, APIs, flat files, cloud sources)
• Ensure data quality, integrity, and consistency across systems
• Perform data mapping, transformation, and validation
• Troubleshoot ETL job failures and performance issues
• Collaborate with Data Analysts, Data Engineers, and business teams to understand requirements
• Implement error handling, logging, and monitoring mechanisms
• Optimize SQL queries and ETL jobs for performance
• Maintain proper documentation of ETL processes and workflows
• Support production deployments and provide L2/L3 support when required

Required Skills & Qualifications
• 5-6 years of experience in ETL development
• Strong hands-on experience with Talend (DI / Big Data / Cloud)
• Proficiency in SQL (advanced queries, joins, performance tuning)
• Experience with RDBMS platforms (Oracle, SQL Server, MySQL, PostgreSQL)
• Good understanding of data warehousing concepts (star schema, snowflake schema)
• Experience handling large datasets and performance optimization
• Knowledge of Unix/Linux scripting
• Familiarity with version control systems (Git, SVN)
• Strong debugging and problem-solving skills
• Good communication and teamwork abilities
• Knowledge of CI/CD pipelines
• Exposure to data governance and data quality tools