
Data Engineer

DATA ANALYTICS
Pune
About Us
SG Analytics (SGA), a Global Insights and Analytics company, focuses on ESG, Data Analytics, and Investment & Market research services. The company has a presence in New York, San Francisco, Austin, Seattle, Toronto, London, Zurich, Pune, Bengaluru, and Hyderabad, and has been growing consistently for the last few years.

SGA is a Great Place To Work (GPTW) certified company. With a thriving work environment shaped by a growth mindset, abundant learning and collaboration opportunities, and a meritocracy-driven culture, SG Analytics was also named a regional best employer in 2016, 2018, and 2020.
Job Description
We’re looking for a candidate who knows and loves working with modern data integration frameworks, big data, and cloud technologies, and who is proficient with data programming languages such as Python and SQL. The SGA data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth for our clients globally.

Job Duties
As a Data Engineer, you will:
  • Play a key role in our Data Operations team, developing data solutions that drive SGA's growth.
  • Design and develop data pipelines – streaming and batch – to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub (a minimal Airflow sketch follows this list).
  • Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
  • Develop on the SGA data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
  • Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.
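
For illustration only, the sketch below shows the shape of a daily batch pipeline orchestrated with Apache Airflow (2.4+), one of the frameworks named above. Everything in it – the DAG id, the task bodies, the "Global Data Hub" target – is a hypothetical placeholder, not an actual SGA pipeline.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_pos_data(**context):
        # Placeholder: pull point-of-sale extracts for the run date.
        print(f"Extracting POS data for {context['ds']}")

    def load_to_hub(**context):
        # Placeholder: load the transformed batch into the data hub.
        print("Loading batch into the Global Data Hub")

    with DAG(
        dag_id="pos_daily_batch",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_pos", python_callable=extract_pos_data)
        load = PythonOperator(task_id="load_to_hub", python_callable=load_to_hub)
        extract >> load                  # run extract before load

In a real pipeline the task bodies would invoke PySpark jobs or SaaS connectors rather than print statements; the >> operator is how Airflow declares task ordering.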

Skills and Qualifications:
  • Broad, hands-on background across the data engineering landscape
  • Proficiency with Airflow
  • AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
  • Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus
  • High level of proficiency with SQL (Snowflake a big plus)
  • Proficiency with Python for transforming data and automating tasks
  • Experience with Kafka, Pulsar, or other streaming technologies
  • Experience orchestrating complex task flows across a variety of technologies
  • Bachelor’s degree from an accredited institution or relevant experience
  • Experience working with Postgres and Elasticsearch
  • Proficiency in building REST APIs
  • Proficiency with pandas, DuckDB, and other Python libraries for data transformation (see the sketch after this list)
  • Experience with visualization tools like Streamlit or Kibana
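
As a concrete illustration of the pandas/DuckDB bullet above, the snippet below shows a common transformation pattern: DuckDB can run SQL directly against an in-memory pandas DataFrame. The data and column names are invented for the example.

    import duckdb
    import pandas as pd

    # Invented sample of daily transactions; columns are illustrative only.
    orders = pd.DataFrame({
        "store_id": [101, 101, 202],
        "order_total": [12.50, 8.75, 23.10],
    })

    # DuckDB's replacement scan lets the SQL reference the local DataFrame by name.
    daily_summary = duckdb.sql(
        "SELECT store_id, COUNT(*) AS orders, SUM(order_total) AS revenue "
        "FROM orders GROUP BY store_id ORDER BY store_id"
    ).df()

    print(daily_summary)

This pattern keeps the transformation in SQL while staying inside a Python workflow, which suits the SQL-plus-Python profile described above.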
Job Requirements
Experience:
- 3 to 5 years

Qualifications:
- Graduate/Postgraduate in Computer Science
