Role Overview:
We are looking for a go-getter who can define:
Data strategies that meet business requirements
Technical requirements for the organization's data needs across diverse teams
Data architecture, models, and go-forward data solutions for various teams across the organization
The person will be part of the Data Solutions team for a major insurance client. He/she will work with different stakeholders as an SME for data engineering; a typical workday will involve working with stakeholders in an individual contributor capacity.
Would engage in technical problem-solving across multiple technologies, often developing new solutions and recommending technologies that can be leveraged to create data solutions
Would develop, construct, test, and maintain data architectures for data platforms, databases, and analytical/reporting systems
Would partner with other technology platform teams to leverage innovative new technology for delivering solutions that best fit internal data needs for various analytical solutions
Would own and manage code independently and guide other team members on its development and maintenance
Key Responsibilities & Skillsets:
Common Skillsets:
5+ years of experience in Data Engineering: SQL, Python, DWH (Redshift or Snowflake), and associated data engineering jobs
Experience with Azure, Oracle, and AWS ETL pipeline services: Lambda, S3, EMR/Glue, Redshift (or Snowflake), Step Functions (preferred)
Experience building and supporting cloud-based ETL (Extract, Transform, Load) data pipelines
Good to have: working experience with Python RESTful API frameworks (Flask/FastAPI) and messaging queue services (Kafka)
Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
Experience working in an agile environment and across the development life cycle, with diverse stakeholders (e.g., IT, Business, Project Managers)
Excellent communication & presentation skills
Data Management Skillsets:
Ability to develop and enhance data models and identify ETL optimization opportunities; exposure to ETL tools will help in this work
Should have a strong grasp of advanced SQL functionality (joins, nested queries, stored procedures, PL/SQL)
Strong ability to translate functional specifications/requirements into technical requirements
Candidate Profile:
Bachelor’s/Master’s degree in economics, mathematics, actuarial sciences, computer science/engineering, operations research, or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply
Strong, in-depth understanding of data engineering fundamentals
Exposure to designing ETL data pipelines and performing data analysis
Exposure to end-to-end data lifecycle management
Superior analytical and problem-solving skills
Outstanding written and verbal communication skills
Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges