Data Engineer
WaferWire Cloud Technologies
Job Description
Job Title: Software Engineer – Data Engineering & AI
Location: Bangalore, India (Work from Client Location)
Experience: 5–10 Years
Worksite: Onsite (100%)

About WCT:
WaferWire Technology Solutions (WCT) specializes in delivering comprehensive Cloud, Data, and AI solutions through Microsoft's technology stack. Our services include Strategic Consulting, Data/AI Estate Modernization, and Cloud Adoption Strategy. We excel in Solution Design encompassing Application, Data, and AI Modernization, as well as Infrastructure Planning and Migrations. Our Operational Readiness services ensure seamless DevOps, MLOps, AIOps, and SecOps implementation. We focus on the implementation and deployment of modern applications, continuous performance optimization, and future-ready innovations in AI, ML, and security enhancements. Delivering from Redmond, WA, USA; Guadalajara, Mexico; and Hyderabad, India, our scalable solutions cater precisely to diverse business requirements across multiple time zones (US time zone alignment).

About the Role:
WCT is hiring a Software Engineer (Data Engineering & AI) with 5–8 years of experience to support the client's data center (DC) Systems & Engineering team. This role focuses on building and optimizing data pipelines, telemetry platforms, and AI-driven insights for large-scale data center infrastructure. You will work on high-volume operational data, enabling observability, reliability, and predictive intelligence across distributed systems.
Responsibilities:
- Design, build, and optimize scalable data pipelines for infrastructure and telemetry data
- Develop robust ETL/ELT workflows for large-scale, high-frequency datasets
- Build and maintain dashboards for API/system health and performance (e.g., Grafana)
- Work with engineering teams to enable data-driven observability and monitoring systems
- Implement data models supporting real-time and batch analytics
- Contribute to AI/ML-driven use cases such as anomaly detection and predictive insights
- Ensure data quality, reliability, and performance optimization across pipelines
- Collaborate with backend, SRE, and infrastructure teams to integrate data solutions
- Participate in debugging, performance tuning, and system optimization
- Contribute to documentation and best practices

Required Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience)
- 5–8 years of experience in data engineering or backend development
- Strong programming skills in Python and/or Java
- Hands-on experience with data pipelines, ETL/ELT frameworks, and distributed data processing
- Strong proficiency in SQL and data modeling
- Experience working with large-scale distributed systems
- Experience with Google Cloud Platform (GCP) services such as:
  - BigQuery (data warehousing & analytics)
  - Pub/Sub (real-time messaging & streaming)
  - Dataflow (streaming & batch pipelines)
  - Cloud Storage (data lake storage)
  - Cloud Composer (Airflow-based orchestration)
- Exposure to streaming frameworks and real-time data processing
- Working knowledge of Machine Learning concepts and AI-driven applications
- Familiarity with AI tools such as Google Gemini or similar platforms
- Experience with observability tools (e.g., Grafana, Prometheus)
- Understanding of security and compliance practices in data environments
- Experience working with SRE or infrastructure engineering teams

Equal Employment Opportunity Declaration:
WCT is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances.