Req #: 875
Position: Data Specialist/Engineer
Location: Remote
Duration: 1 Year
Mandatory Skills:
Python (strong), DBT, Snowflake, SQL
Primary Skills:
Pandas, NumPy, Airflow, AWS
JD:
• 6+ years of overall IT experience.
• Strong Python experience is mandatory.
• Write effective, scalable code in Python.
• Hands-on experience with Python's Pandas library in depth, as well as NumPy.
• Advanced SQL skills are mandatory.
• Ability to write complex SQL queries against large volumes of data.
• Hands-on DBT development experience.
• Design, develop, and maintain DBT models, transformations, and SQL code to build efficient data pipelines for analytics and reporting.
• Mandatory hands-on coding experience in Unix shell scripting.
• Develop back-end components to improve responsiveness and overall performance.
• Hands-on experience designing and building data pipelines for data analytics implementations such as data lakes and data warehouses.
• Salesforce CDP knowledge and Snowflake implementation experience will be a plus.
• Integrate user-facing elements into applications.
• Write SQL queries against Snowflake; develop scripts (Unix shell, Python, etc.) to extract, load, and transform data.
• Good to have: working experience with Airflow.
• Good to have: exposure to the AWS ecosystem.
• Test and debug programs; improve the functionality of existing systems.
• Implement security and data protection solutions.
• Coordinate with internal teams to understand user requirements and provide technical solutions.
• Support QA, UAT and performance testing phases of the development cycle.
• Understand and incorporate the required security framework into the developed data model and ETL processes.