
Title:  Data Engineer

Data Engineer M2C

Pune

This is Worldline

 

We are the innovators at the heart of the payments technology industry, shaping how the world pays and gets paid. The solutions our people build today power the growth of millions of businesses tomorrow. From your local coffee shop to unicorns and international banks. From San Francisco to Auckland. We are in every corner of the world, in every part of commerce.  And just as we help our customers accelerate their business, we are committed to helping our people accelerate their careers. Together, we shape the evolution.

 

The Opportunity

At Worldline, our technology addresses the persistent challenges of the payment world. We design and operate leading digital payment and transactional solutions that enable sustainable economic growth and reinforce trust and security in our societies. If you are a highly skilled engineer with a creative mind and a passion for delivering quality code, then get ready to join our company! We are looking for a Data Engineer to produce scalable software solutions in the Payments Domain. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment.

Day-to-Day Responsibilities

 

We are seeking a highly skilled and knowledgeable Data Engineer to join our Data Management team on a transformative Move to Cloud (M2C) project. The ideal candidate will have a strong background in creating robust data ingestion pipelines and a deep understanding of ETL processes, particularly within the Google Cloud Platform ecosystem and using tools such as dbt (from dbt Labs) over BigQuery.

 

 

  • Develop and maintain scalable and reliable data pipelines using PySpark and SQL to support the migration from Oracle's on-premises data warehouse (structured data source) and unstructured data sources to BigQuery; a minimal PySpark sketch follows this list.
  • Design and implement robust data ingestion and integration processes that ensure data quality and consistency.
  • Use dbt to create and manage ETL processes that transform and load data efficiently in BigQuery; a minimal Cloud Composer orchestration sketch for dbt also follows this list.
  • Ensure the resilience and efficiency of data transformation jobs that can handle large volumes of data within a cloud-based architecture.
  • Work closely with our Data Engineers to gather requirements for the data pipelines currently under development. Provide expertise in GCP services such as Dataproc, Dataflow, Cloud Functions, Workflows, Cloud Composer, and BigQuery, advocating for best practices in cloud-based data management.
  • Collaborate with data architects and other stakeholders to optimize data models and warehouse design for the cloud environment.
  • Develop and implement monitoring, quality, and validation processes to ensure the integrity of data pipelines and data.
  • Document all data engineering processes and create clear specifications for future reference and compliance.
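
For illustration only, here is a minimal PySpark sketch of the kind of Oracle-to-BigQuery ingestion described above. The connection string, table names, credentials, and staging bucket are hypothetical placeholders, and it assumes a Dataproc environment where the Spark BigQuery connector and an Oracle JDBC driver are available.

    from pyspark.sql import SparkSession

    # Minimal sketch: read one table from the on-premises Oracle warehouse over JDBC
    # and land it in BigQuery. All connection details below are placeholders.
    spark = SparkSession.builder.appName("oracle-to-bigquery-m2c").getOrCreate()

    transactions = (
        spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # placeholder DSN
        .option("dbtable", "PAYMENTS.TRANSACTIONS")                     # placeholder source table
        .option("user", "etl_user")                                     # placeholder credentials
        .option("password", "change-me")
        .option("fetchsize", "10000")
        .load()
    )

    (
        transactions.write.format("bigquery")
        .option("table", "analytics.transactions")           # placeholder BigQuery table
        .option("temporaryGcsBucket", "m2c-staging-bucket")   # placeholder staging bucket
        .mode("overwrite")
        .save()
    )

In practice a job like this would be parameterized per table and scheduled, rather than hard-coded.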

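As a similarly rough illustration of orchestrating dbt transformations over BigQuery from Cloud Composer, a DAG along these lines could be used; the dag_id, schedule, and project directory are assumptions rather than an existing setup.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Sketch of a Cloud Composer (Airflow) DAG that runs and then tests dbt models
    # over BigQuery. Paths and schedule are placeholders for illustration only.
    with DAG(
        dag_id="dbt_bigquery_transformations",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /home/airflow/gcs/dags/dbt",
        )
        dbt_run >> dbt_test  # run transformations, then validate them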
 
Who Are We Looking For

 

We look for big thinkers. People who can drive positive change, step up and show what’s next – people with passion, can-do attitude and a hunger to learn and grow. In practice this means:

 

  • Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
  • Minimum of 5 years of experience as a Data Engineer with a focus on Google Cloud-based solutions.
  • Strong knowledge of versioning and CI/CD.
  • Proficient in GCP services, with a strong emphasis on data-related products such as Dataproc, Dataflow, Cloud Functions, Workflows, Cloud Composer, and BigQuery. Extensive experience with ETL tools, particularly dbt, and a clear understanding of ETL best practices.
  • Experience in building and optimizing data pipelines, architectures, and data sets from structured/unstructured data sources.
  • Strong analytical skills with the ability to understand complex requirements and translate them into technical solutions. Excellent problem-solving abilities and a commitment to quality.
  • Strong communication skills, with the ability to work collaboratively in a team environment.
  • Relevant Google Cloud Platform certifications or other data engineering credentials are desirable. Proficiency in SQL and Python, with knowledge of Spark.
  • Fluent in English, with strong written and verbal communication skills.

 

Shape the evolution

 

We are on an exciting journey towards the next frontiers of payments technology, and we look for big thinkers, people with passion, a can-do attitude and a hunger to learn and grow. Here you’ll work with ambitious colleagues from around the world, take on unique challenges as a team, and make a real impact on society. With an empowering culture, strong technology and extensive training opportunities, we help you accelerate your career - wherever you decide to go. Join our global team of 18,000 innovators and shape a tomorrow that is yours to own.

 

Learn more about life at Worldline at careers.worldline.com

 

We are proud to be an Equal Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as an individual with a disability, or any applicable legally protected characteristics.

 

Date:  May 30, 2024
Brand:  Worldline
Category:  Technology - Application development
Contract Type:  Permanent
Location: 

Pune, Maharashtra, IN


Job Segment: Cloud, Database, Computer Science, Data Management, Engineer, Technology, Data, Engineering
