Job Description
- Req#: SR-21257
It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
Roles and Responsibilities
• Design and build the pipelines which will ingest the data into BigQuery for consumption.
• Monitor and troubleshoot data pipelines and infrastructure to ensure high availability and performance.
Mandatory Technical Skills
• Knowledge of Data Warehousing, including ETL pipeline design, development and maintenance
• Minimum 1 year of technology experience in Data Engineering projects
• Minimum 1 year of experience in GCP
• Basic Python programming skills are a must
• Minimum 1 year of experience in SQL/PL SQL scripting
• Minimum 1 year of experience in Data Warehouse / ETL
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
About the company
Fractal Analytics helps global Fortune 100 companies power every human decision in the enterprise by bringing analytics and AI to the decision.