AsalJobs

Senior Databricks Engineer - Spark/Python


Job Location

India

Job Description

Mandatory Skills : Python, SQL, Spark, Azure Databricks, Azure Data Factory, NoSQL

Job Description :

The ideal candidate brings a unique blend of technical expertise in SQL and Python, coupled with industry-specific experience in financial services, making them well suited to tackle complex data challenges within a life insurance setting.

Experience :
- 6-10 years of hands-on experience in SQL and Python development.
- Proven track record of building complex data validation and transformation projects using both SQL and Python.

Industry Experience :
- Demonstrated experience working in a financial services company, with a focus on the life insurance sector.

Technical Skills :
- Strong practical knowledge of SQL and Python, with an emphasis on creating reusable code.
- Strong skills in data manipulation tools and libraries (e.g., Pandas, NumPy) for cleaning, transforming, and analyzing data.
- Experience with both relational databases (e.g., SQL Server) and NoSQL databases (e.g., Cosmos DB, MongoDB).
- Experience with data integration tools and techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and orchestration (e.g., Azure Data Factory).
- Proficiency in writing base code that dynamically generates multiple code instances based on a configuration file (JSON format/SQL database format).
- Experience working in an Azure cloud environment, particularly with data lakes.
- Solid knowledge of, and experience with, well-architected frameworks for cloud data platforms.
- Familiarity with writing SQL or Python code for accessing and processing data in a data lake, including knowledge of the Delta and Parquet formats.
- Demonstrated experience and skills in data modelling.
- Knowledge of Databricks, Spark, or similar frameworks for processing large datasets is considered an added advantage.
- Knowledge of Apache Spark, including Spark SQL and the DataFrame/Dataset API.
- Proficiency in setting up and managing Databricks clusters, notebooks, and jobs within the Azure ecosystem.
- Understanding of Azure security best practices and compliance standards relevant to data engineering workflows in Azure Databricks.
- Designing, implementing, and optimizing end-to-end data pipelines using Azure Databricks, ensuring reliability, scalability, and performance.
- Utilizing Azure Databricks for data transformation tasks, including cleansing, aggregation, and enrichment of datasets.
- Implementing automation for data workflows and monitoring pipelines to ensure data integrity and availability.
- Familiarity with version control systems such as Git for managing code and notebooks.

Key Attributes :
- Detail-oriented, with a focus on data quality and integrity.
- Strong problem-solving skills and the ability to optimize data workflows.
- An effective communicator and team player, able to collaborate with actuaries and analysts. (ref:hirist.tech)
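The "base code that dynamically generates multiple code instances from a configuration file" requirement above can be sketched roughly as follows. This is a minimal illustration only: the JSON schema, table names, and check types are invented for the example and are not specified in the posting.

```python
import json

# Hypothetical validation config (JSON format); each entry describes one
# check to generate. The schema here is an assumption for illustration.
CONFIG_JSON = """
[
  {"table": "policies", "column": "premium",    "check": "not_null"},
  {"table": "policies", "column": "issue_date", "check": "not_null"},
  {"table": "claims",   "column": "amount",     "check": "positive"}
]
"""

# One reusable SQL template per check type; each config row instantiates one.
TEMPLATES = {
    "not_null": "SELECT COUNT(*) AS bad_rows FROM {table} WHERE {column} IS NULL",
    "positive": "SELECT COUNT(*) AS bad_rows FROM {table} WHERE {column} <= 0",
}

def generate_checks(config_json: str) -> list[str]:
    """Expand the base templates into one validation query per config entry."""
    return [TEMPLATES[row["check"]].format(**row) for row in json.loads(config_json)]

for sql in generate_checks(CONFIG_JSON):
    print(sql)
```

The same pattern extends naturally to a config stored in a SQL table instead of JSON: the loop stays the same and only the config-loading step changes.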

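As a small taste of the cleansing and aggregation duties the posting describes, here is a sketch using Pandas (which the posting names). The column names and records are invented for illustration; in practice the same steps would typically run at scale on Spark DataFrames in Azure Databricks.

```python
import pandas as pd

# Toy policy records standing in for a raw extract; columns and values
# are assumptions for this example only.
raw = pd.DataFrame({
    "policy_id": ["P1", "P2", "P2", "P3"],
    "region":    ["north", "south", "south", None],
    "premium":   [1200.0, 800.0, 800.0, 950.0],
})

# Cleansing: drop exact duplicate rows, fill missing regions with a sentinel.
clean = raw.drop_duplicates().fillna({"region": "unknown"})

# Aggregation: total premium per region, ready for downstream enrichment.
totals = clean.groupby("region", as_index=False)["premium"].sum()
print(totals)
```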

Posted Date: 11/27/2024

Contact Information

Contact Human Resources
AsalJobs

UID: 4899465703
