Solution Architect (Big Data)

Job Title: Solution Architect (Big Data)

Location: Chennai & Hyderabad

Experience: 10-12 Years

Must-Have Skills: Informatica EDC/DEI, Axon, Snowflake, MSTR, Azure, and analytical tools, languages, and libraries, including open-source software platforms.

Additional Skills (Nice to have): Exposure to Temenos data structures.

Job Reference ID: BT/F36/DU


Job Description:

The Solution Architect (Big Data) will be accountable for delivering proof-of-concept projects and topical workshops and for leading implementation projects. This role focuses on all aspects of data and information (both structured and unstructured) and supports the development of an enterprise data lake architecture on Snowflake. It spans the full information management life cycle, from acquisition, cleansing, data modelling, transformation, and storage to presentation, distribution, security, privacy, and archiving. The role involves:


➢ Lead and architect the migration of the enterprise on-premises data environment to Snowflake, ensuring performance and reliability.

➢ Provide thought leadership on how clients can achieve their data strategies and objectives.

➢ Assess and understand existing EDW workloads and reporting/analytics to architect a value- and outcome-based target-state architecture and a roadmap for execution.

➢ Lead cross-functional, cross-organisational initiatives that drive innovative solutions on Snowflake.

➢ Help Project Managers identify key data and information risks, mitigation plans, effort estimation, and planning.

➢ Act as a conduit between the client and our delivery team to ensure program delivery is aligned with the target-state architecture and guiding principles.

➢ Keep abreast of the Data & AI revolution in the market to provide the technical and business acumen that drives reference architectures and thought leadership for clients.

➢ Deliver presentations to internal & external audiences.

➢ Drive opportunity pursuits for Cloud Data Migration and Data Modernization initiatives.

➢ Mentor and lead other Data Architects/Solution Architects and infuse best practices across the broader Data & AI community.

➢ Develop standards, domain principles, and best practices for creating and maintaining architecture artifacts (including inventories and models), and articulate the value of these artifacts.

➢ Host peer reviews to assess status and compliance with architecture standards.

➢ Be accountable for the development of conceptual, logical, and physical data models and the use of data lakes on target Azure cloud platforms.

➢ Be accountable for and govern the expansion of the existing data architecture and the optimization of data query performance through the best available solutions, working both independently and cross-functionally.


Hands-on experience in the following:

➢ Demonstrated experience in consulting and the ability to effectively advise clients on data- and analytics-specific solutions is highly desirable, as is 5-6 or more years of hands-on experience designing and developing applications using Snowflake.

➢ In-depth understanding of Snowflake architecture, including SnowSQL, Snowpipe, Snowpark, performance tuning, compute, and storage.

➢ Hands-on experience in Teradata/Cloudera and/or other data warehouse platforms. Must have experience migrating from a legacy data warehouse to Snowflake.

➢ Broad experience across cloud architecture, DevOps, networking, machine learning, and security is beneficial.

➢ Stakeholder management and communication skills, including prioritising, problem-solving, and interpersonal relationship building.

➢ MS Azure: Security Centre, Azure Active Directory (Core, Developer, B2C, Services), Key Vault, and an understanding of securing PaaS solutions such as SQL Data Warehouse, ADF, SQL DB, Azure App Service, etc.

➢ Experience in, or awareness of, microservice-based architecture is desirable.

➢ Experience in data domain modelling and data design, with tools such as ArchiMate or Erwin.

➢ Cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems.

➢ NoSQL platforms (e.g. key-value stores, graph databases) and data modelling techniques for NoSQL data.

➢ Cloud data platforms (e.g. AWS, Azure, GCP).

➢ High-scale or distributed cloud-native data platforms.

➢ Experience in solution architecting, designing, and executing data lakes in CSPs (preferably Azure/GCP, AWS).


Experience / Qualification:

➢ SA (Associate) level certification in MS Azure.

➢ Overall 10+ years of experience in the IT industry.

➢ Graduate (any degree) with 60% or above.

➢ More than 5 years of demonstrated ability with normalized and dimensional modelling techniques, star and snowflake schemas, modelling slowly changing dimensions, dimensional hierarchies, and data classification, ideally at enterprise scale as well as at the organizational level.

➢ 5+ years of experience with high-scale/distributed RDBMS.

➢ Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, MDM, Data Archival and Data Migration strategies using appropriate tools.

➢ Ability to define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.

➢ Hands-on data modelling across canonical, semantic, logical, and physical data models, including design, schema synchronization, and performance tuning.


Benefits:

➢ Competitive salary and benefits package.

➢ Opportunity to work on a variety of projects and industries.

➢ Opportunity to work with a team of experienced and talented professionals.

➢ Access to cutting-edge tools and technologies.

➢ Opportunity to grow and develop your skills through ongoing learning and development.



Salary: Commensurate with experience and demonstrated competence.

Contact: hr@bigtappanalytics.com




