Resource Quick

Python/PySpark Developer - Data Engineering


Job Location

Bangalore, India

Job Description

Key Responsibilities:

- Develop robust and scalable Python and PySpark applications to support data engineering and analytics initiatives.
- Design and deploy cloud-native solutions on AWS, leveraging services such as EC2, Lambda, S3, RDS, Redshift, and Glue.
- Manage and transform large datasets, ensuring efficient data ingestion and migration through well-designed pipelines.
- Use SQL Server and stored procedures for effective data management and transformation.
- Develop, deploy, and manage scalable ETL and orchestration solutions.

Required Skills and Experience:

- Strong proficiency in Python and PySpark.
- Hands-on experience with AWS services (EC2, Lambda, S3, RDS, Redshift, Glue).
- Solid understanding of data engineering principles and practices.
- Experience with SQL Server and stored procedures.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.

Preferred Skills:

- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of data warehousing and data lake concepts.
- Experience with CI/CD pipelines and DevOps practices.

(ref: hirist.tech)
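For candidates gauging the day-to-day work, the responsibilities above amount to building PySpark pipelines that read raw data from S3, clean and transform it, and write curated output back out. The sketch below is only an illustration of that pattern; the bucket paths, column names, and the run_etl function are assumptions for the example, not details from this posting.

# Hypothetical PySpark ETL job: reads raw CSV events from S3, cleans them,
# and writes partitioned Parquet back to S3. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_etl(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("sample-etl").getOrCreate()

    # Ingest raw CSV data with a header row.
    raw = spark.read.option("header", True).csv(input_path)

    # Basic cleansing: drop duplicates and rows missing the key column,
    # then derive a partition column from the event timestamp.
    cleaned = (
        raw.dropDuplicates()
           .na.drop(subset=["event_id"])
           .withColumn("event_date", F.to_date("event_timestamp"))
    )

    # Write as Parquet, partitioned by date, overwriting previous runs.
    cleaned.write.mode("overwrite").partitionBy("event_date").parquet(output_path)

    spark.stop()

if __name__ == "__main__":
    run_etl("s3://example-raw-bucket/events/", "s3://example-curated-bucket/events/")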

Location: Bangalore, IN

Posted Date: 11/23/2024

Contact Information

Contact Human Resources
Resource Quick

Posted

November 23, 2024
UID: 4937080882

AboutJobs.com does not guarantee the validity or accuracy of the job information posted in this database. It is the job seeker's responsibility to independently review all posting companies, contracts and job offers.