HexaCorp
HexaCorp - Senior Data Engineer - ETL Tools
Job Location
Chennai, India
Job Description
Key Responsibilities:

- Data Integration Development: Design, develop, and maintain robust data integration pipelines between our internal systems and external vendors using ETL tools and APIs.
- Integrate disparate data sources and ensure seamless data flow across internal and external platforms.
- Develop solutions to improve data integration efficiency and enhance data accessibility.
- Collaboration with Product Teams: Work closely with internal product teams to understand business requirements and deliver data solutions that are scalable and reliable.
- Ensure data consistency and integration alignment across different platforms by collaborating closely with cross-functional teams.
- Data Workflow Optimization: Monitor, optimize, and maintain data workflows to ensure high performance, reliability, and scalability of integration processes.
- Troubleshoot and resolve data integration issues promptly to minimize disruptions and ensure continuous operations.
- Data Quality and Validation: Implement data quality checks and validation processes to ensure the accuracy and completeness of integrated data.
- Continuously improve data integration practices so that only clean and validated data enters the system.
- Documentation and Knowledge Sharing: Document all data integration processes, workflows, and best practices to support knowledge sharing and effective onboarding of new team members.
- Create clear documentation that details data flows, error-handling procedures, and other integration-related processes.
- Stay Current with Industry Trends: Keep up to date with the latest trends, technologies, and best practices in data integration and engineering.
- Recommend and implement improvements to existing integration systems and workflows, leveraging new tools and technologies to enhance performance and scalability.
- Security, Governance, and Compliance: Ensure data governance standards and compliance requirements are met throughout the data integration process.
- Implement security measures to protect sensitive data and ensure compliance with data protection regulations.

Qualifications and Skills:

Experience:
- 6 years of experience as a Data Engineer with a strong focus on data integration.
- Proven experience with ETL processes, system integration, and working with APIs for data exchange.
- In-depth experience with SQL and NoSQL databases (e.g., PostgreSQL, SQL Server, MongoDB, Oracle), including performance tuning and optimization.
- Experience with cloud platforms, particularly AWS, for building scalable and reliable data integration solutions.

Technical Expertise:
- Proficiency with ETL tools such as Apache NiFi, Talend, Informatica, Fivetran, or custom ETL pipelines.
- Strong programming skills in languages such as Python, JavaScript, and PowerShell (Python experience is highly preferred).
- Experience with API development and integration, ensuring seamless data communication between platforms.
- Familiarity with data warehousing concepts and architectures, such as Star Schema, Snowflake Schema, and ETL pipelines for large-scale data storage.
- Experience in data modeling and schema design to ensure that data is structured and stored effectively for analytics and reporting.

Data Governance & Compliance:
- Understanding of data governance principles, data protection standards, and regulatory compliance (e.g., GDPR, HIPAA).
- Ability to apply data security best practices to ensure the confidentiality, integrity, and availability of data.

Problem-Solving and Troubleshooting:
- Strong problem-solving skills and the ability to troubleshoot complex data integration issues.
- Ability to quickly identify issues, resolve problems, and ensure the smooth functioning of data integration processes.

Collaboration and Communication:
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Experience with Agile methodologies and working in an Agile environment, ensuring flexibility and quick delivery of integration solutions.

Version Control and Tools:
- Familiarity with version control systems such as Git to manage code and configuration.
- Experience with automation tools and scripting for data pipeline automation.

Certifications:
- Relevant certifications in Data Engineering, Cloud Platforms (e.g., AWS Certified Data Analytics - Specialty), or Integration Tools are a plus.
- Certifications in ETL tools or cloud platforms like AWS or Azure are a plus.

Additional Skills:
- Experience with containerization technologies (e.g., Docker, Kubernetes) for deploying and managing data pipelines.
- Familiarity with Big Data technologies such as Hadoop, Spark, or Kafka is a plus.
- Knowledge of ERP systems and business process integration.
- Familiarity with DevOps practices and integrating CI/CD pipelines for automated testing and deployment of data pipelines.

Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. (ref:hirist.tech)
Location: Chennai, IN
Posted Date: 4/20/2025
Contact Information
Contact: Human Resources, HexaCorp