
Fiserv

Data Engineer

Posted 5 Hours Ago
In-Office
2 Locations
Senior level
Design and build ETL/ELT pipelines, manage data migration to Snowflake, and create machine learning data pipelines. Collaborate in agile teams to optimize data solutions.

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title

Data Engineer

At Fiserv, we’re driving data-led decisions that power better financial experiences. As a Data Engineer on our EMEA data analytics teams, you will design, build and operate large-scale data migration, integration, replication and streaming solutions that deliver high-quality data into Snowflake with speed and reliability. You will work within agile, multidisciplinary delivery teams to architect, deploy and support production data intelligence solutions on Snowflake Cloud Data Warehouse.

What you’ll do:

  • Design, develop and maintain ETL/ELT pipelines for batch and real‑time workloads using Java, Scala or Python Spark jobs for transformation and aggregation.
  • Build and optimize SQL queries and data models in Snowflake; implement Snowpipe ingestion and SnowSQL-based transformations.
  • Implement Snowflake external tables, staging strategies, tasks, clustering and performance tuning; apply Time Travel and zero-copy cloning where appropriate.
  • Develop low-latency streaming integrations from Kafka and other event sources into the data warehouse.
  • Create production-grade data integrations and data pipelines of medium to high complexity (real‑time and batch).
  • Design and deliver machine learning data pipelines and feature engineering workflows.
  • Deploy data pipelines, infrastructure and artifacts as code; incorporate CI/CD and test-driven practices.
  • Troubleshoot and provide production support for data load, transformation and translation issues.
  • Write unit tests for transformations and aggregations and participate in peer code reviews.
  • Translate BI and reporting requirements into database and reporting designs.
  • Estimate work, communicate technical status to stakeholders and establish coding and data governance standards.
  • Champion non‑functional requirements including availability, scalability, operability and maintainability.
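As a toy illustration of the batch transformation-and-aggregation work described above, the sketch below groups raw transaction records and computes per-key totals in plain Python. This is a hypothetical example only: the record layout (`merchant_id`, `amount`) is invented, and production pipelines of this kind would typically run as Spark jobs in Java, Scala or Python rather than single-process code.

```python
from collections import defaultdict

def aggregate_by_merchant(transactions):
    """Group raw transaction records by merchant and compute
    per-merchant amount totals and record counts (a stand-in for
    an aggregation stage in a batch ETL pipeline)."""
    totals = defaultdict(lambda: {"amount": 0.0, "count": 0})
    for tx in transactions:
        agg = totals[tx["merchant_id"]]
        agg["amount"] += tx["amount"]
        agg["count"] += 1
    return dict(totals)

# Example input: records as they might land from a batch extract.
raw = [
    {"merchant_id": "M1", "amount": 10.0},
    {"merchant_id": "M2", "amount": 5.5},
    {"merchant_id": "M1", "amount": 2.0},
]

result = aggregate_by_merchant(raw)
print(result)
# → {'M1': {'amount': 12.0, 'count': 2}, 'M2': {'amount': 5.5, 'count': 1}}
```

The same group-and-aggregate shape maps directly onto a Spark `groupBy(...).agg(...)` or a Snowflake `GROUP BY` once the data lands in the warehouse.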

Experience you’ll need to have:

  • Minimum BSc/BTech/B.E. in Computer Science, Engineering or related discipline.
  • 5+ years’ experience in an enterprise big data environment.
  • Strong hands‑on experience with Snowflake (Snowpipe, external tables, tasks, Time Travel, zero‑copy cloning, performance tuning).
  • Deep knowledge of Spark and Kafka for batch and streaming processing.
  • Extensive SQL expertise and experience writing complex queries for analytics workloads.
  • Hands‑on experience with cloud platforms and services (AWS preferred).
  • Experience with data platforms such as Hadoop, Hive, Redshift or Snowflake.
  • Familiarity with DevOps practices and platforms (for example GitLab), containerization (Docker, Kubernetes) and orchestration tools (Airflow).
  • Solid Linux/OS knowledge and scripting experience.
  • Proven production deployment and operational experience for data technologies and platforms.
  • Strong communication, collaboration and stakeholder engagement skills; ability to work effectively in cross‑functional agile teams.

Certifications and professional qualifications (desirable)

  • SnowPro Core certification, AWS Big Data or equivalent data engineering certifications.

Preferred:

  • Hands‑on Snowflake migration experience.
  • Experience with Oracle RDBMS.
  • Exposure to ETL tools such as StreamSets or dbt.

Why join us? We are partners in possibility, combining data, technology and domain experience to create outcomes that matter for our clients. If you’re motivated by solving hard data problems, delivering production‑grade solutions and collaborating across teams, you’ll find opportunities to grow and make an impact here at Fiserv in EMEA.

Thank you for considering employment with Fiserv. Please:

  • Apply using your legal name
  • Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:

Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. 

Note to agencies:

Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts:

Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Top Skills

Airflow
AWS
Docker
GitLab
Hadoop
Hive
Java
Kubernetes
Python
Redshift
Scala
Snowflake
Spark
SQL

Fiserv Dublin, Dublin, IRL Office

10 Hanover Quay, Dublin, Ireland, D02 K510


