| Job Title: | Python Developer – Data Ingestion |
| Employment Type: | Permanent |
| Position Location: | Hyderabad |
| Reports to: | Software Development Manager (IT Solutions Delivery) |
| Qualifications: | BE/B.Tech/MCA Degree in Computer Science, Engineering, or similar relevant field |
| Total Experience: | 3 to 7 Years |
| Working Model: | Work from Office |
Organization Brief:
CapTech Consulting (www.captechconsulting.com), a U.S.-based technology and management consulting firm within the Markel Group (NYSE: MKL), and RD Solutions (RDSolutions.io), an innovative retail-tech products company in the same ecosystem, together drive global digital transformation and next-generation retail solutions. Through these combined capabilities, we help leading organizations design, build, and modernize digital products, data platforms, and customer experiences that deliver measurable business impact.
Our Global Capability Center (GCC) in Noida and the new center currently being established in Hyderabad extend this mission by delivering world-class engineering, analytics, and innovation capabilities from India.
About The Role:
We are seeking a Python Developer to build and maintain scalable data ingestion pipelines in a cloud environment.
The role focuses on ingesting data from multiple sources, ensuring data quality, and supporting downstream analytics and data platforms.
Primary Responsibilities:
- Develop and maintain Python-based data ingestion pipelines
- Build and manage ETL/ELT workflows on Azure
- Ingest data from APIs, databases, and flat files
- Perform data validation, transformation, and error handling
- Optimize ingestion processes for performance and scalability
- Monitor pipelines and troubleshoot ingestion issues
- Collaborate with data engineering and analytics teams
- Document technical designs and data workflows
Knowledge & Skills Requirements:
- Strong proficiency in Python
- Experience building data ingestion / ETL pipelines
- Hands-on experience with Microsoft Azure, including Azure Functions (mandatory)
- Strong knowledge of SQL
- Hands-on experience with PostgreSQL (e.g., schema design, upserts, batch loads)
- Experience working with REST APIs and structured/unstructured data
- Understanding of data modeling and schema design
- Familiarity with Linux and Git
- Strong analytical and problem-solving skills
Good to Have (Preferred Skills):
- Experience with Azure services such as Data Factory, Blob Storage, Data Lake, Functions
- Knowledge of data processing frameworks (Pandas, PySpark)
- Experience with workflow orchestration tools (Airflow, Prefect)
- Exposure to Docker and CI/CD pipelines
Qualification:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or equivalent experience
Why Join Us:
- Be part of a Fortune 500 global enterprise known for innovation, integrity, and long-term success.
- Join a Global Capability Center that combines the energy of a startup with the stability of a global brand.
- Collaborate with U.S.-based consulting teams on cutting-edge, enterprise-scale projects.
- Work in a hybrid environment that promotes learning, craftsmanship, and career growth.
