AWS Glue Application Developer

Job Title: AWS Glue Application Developer
Location: Chennai, Hyderabad
Experience: 5 - 7 Years
Primary Skills: AWS
Secondary Skills: SQL

Job Description:

Must Skills: SQL, PySpark, Python, AWS, Glue, DMS, data integration, and DataOps.

Roles and Responsibilities

Job Reference ID: BT/F21/IND

 

Job Summary:

Design, build, and configure applications to meet business process and application requirements.

 

Key Responsibilities:

- 7 years of work experience with ETL, data modelling, and data architecture.

- Proficient in ETL optimization, designing, coding, and tuning big data processes using PySpark.

- Extensive experience building data platforms on AWS using core AWS services such as Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, and RDS, and designing/developing data engineering solutions.

- Orchestrate workflows using Airflow.

 

Technical Experience:

- Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines.

- Experience building data pipelines and applications to stream and process large datasets at low latency.

 

- Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.

- Create data pipeline architectures by designing and implementing data ingestion solutions.

- Integrate data sets using AWS services such as Glue, Lambda, and Airflow.

- Design and optimize data models on the AWS Cloud using AWS data stores such as Redshift, RDS, S3, and Athena.

- Author ETL processes using Python and PySpark.

- Build Redshift Spectrum direct transformations and data models using data in S3.

- Monitor ETL processes using CloudWatch Events.

- Work in collaboration with other teams; good communication skills are a must.

- Must have experience using AWS service APIs, the AWS CLI, and SDKs.

 

Professional Attributes:

- Experience operating very large data warehouses or data lakes.

- Expert-level skills in writing and optimizing SQL.

- Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.

- Must have 7+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena, and Glue in an AWS environment.

- Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters highly desired.

 

Qualification:

- Degree in Computer Science, Computer Engineering, or equivalent.

 

Salary: Commensurate with experience and demonstrated competence

Contact: hr@bigtappanalytics.com
