Work Experience: 5 to 7 Years
Work location : Chennai & Hyderabad
Must Have Skills: Python, PySpark, Databricks, Data Lakes, AWS Athena, Glue, and S3.
Job Reference ID: BT/F5A/IND
*Databricks Certification is mandatory.
Job Responsibilities:
➢ Overall 5-7 years of IT experience, including SQL experience with AWS.
➢ At least 5 years of experience implementing data pipelines or data-intensive assets using AWS Athena.
➢ At least 5 years of experience using distributed data processing engines such as Apache Spark (PySpark) and Hive.
➢ At least 5 years of experience using SQL Server or an equivalent relational database.
➢ Experience building cloud-native solutions in AWS, especially with S3, Glue, Lambda, Step Functions, EMR, and EC2, or in Azure. Good knowledge of cloud databases, especially Snowflake.
➢ Experience in the data modelling plan, analysis, design, and build effort. Good experience with ETL frameworks.
➢ Very good data analysis skills in SQL; able to develop complex SQL statements and tune them for performance when required.
➢ Experience creating modular data transformations using an orchestration engine such as Airflow or an equivalent such as NiFi.
➢ Understanding of data warehouse and traditional database concepts. Experience working in Agile methodology. Good interpersonal and communication skills.
Job Requirements:
➢ A Bachelor's degree in Information Technology, Software Development Management, Software Engineering, Computer Science, or related field.
➢ Proven experience in project management and software development.
➢ Good working knowledge of project estimation techniques.
➢ Excellent technical knowledge.
➢ Good leadership, decision-making, and organization skills.
➢ Strong attention to detail and multi-tasking skills.
Salary: Commensurate with experience and demonstrated competence.
Contact: hr@bigtappanalytics.com