
Ardanis

Senior Data Engineer (Snowflake)

Sorry, this job was removed at 04:17 p.m. (GMT) on Wednesday, Jan 28, 2026
In-Office or Remote
4 Locations


We are seeking a Senior Data Engineer with strong experience in Snowflake-based data platforms to design, build, and operate scalable, cloud-native data solutions. This role requires deep hands-on expertise across data architecture, ETL/ELT pipelines, cloud infrastructure, and data modelling, with the ability to take full technical ownership of complex projects. You will work in an international environment, contributing to architecture decisions, engineering standards, and delivery excellence across multiple data initiatives.

Responsibilities:
  • Design, implement, and maintain end-to-end data platforms using Snowflake as the core analytical engine.
  • Lead the development of scalable, fault-tolerant ETL/ELT pipelines for large-volume and high-velocity data.
  • Build data solutions from initial design through production, with minimal supervision.
  • Implement data transformation and modelling layers using dbt (Core / Cloud) following analytics engineering best practices.
  • Develop distributed data processing jobs using Apache Spark (Python or Scala).
  • Design and optimize Snowflake schemas, warehouses, and performance strategies (clustering, caching, cost control).
  • Ensure high standards of code quality, testing, observability, and documentation.
  • Implement and maintain CI/CD pipelines for data workflows and transformations.
  • Collaborate closely with product, analytics, and engineering teams in an Agile environment.
  • Contribute to technical standards, architectural guidelines, and best practices across the data platform.
  • Troubleshoot performance, data quality, and reliability issues in production environments.
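
To picture the kind of idempotent, re-runnable load logic these responsibilities describe, here is a minimal sketch of an incremental upsert in Python, using the standard library's sqlite3 as a stand-in for Snowflake (in Snowflake the same pattern would typically be a MERGE statement; the table and column names here are hypothetical):

```python
import sqlite3

# In-memory SQLite stands in for a Snowflake warehouse in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)"
)

def upsert_orders(conn, rows):
    """Idempotent incremental load: new keys are inserted, existing keys updated.

    Re-running the same batch leaves the table unchanged, which is what makes
    the pipeline safe to retry after a failure.
    """
    conn.executemany(
        """
        INSERT INTO orders (order_id, amount, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(order_id) DO UPDATE SET
            amount = excluded.amount,
            updated_at = excluded.updated_at
        """,
        rows,
    )
    conn.commit()

# First batch, then a later batch that revises order 1 and adds order 3.
upsert_orders(conn, [(1, 10.0, "2026-01-01"), (2, 20.0, "2026-01-01")])
upsert_orders(conn, [(1, 15.0, "2026-01-02"), (3, 30.0, "2026-01-02")])

print(sorted(conn.execute("SELECT order_id, amount FROM orders").fetchall()))
# → [(1, 15.0), (2, 20.0), (3, 30.0)]
```

The same merge-on-key shape is what a dbt incremental model with a `unique_key` compiles down to on Snowflake.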

Requirements:
  • 4+ years of professional experience in Data Engineering or similar roles.
  • Strong, hands-on experience with Snowflake in production environments.
  • Solid experience with Python or Scala, particularly in data-intensive applications.
  • Proven experience developing and operating Apache Spark pipelines.
  • Strong experience with cloud platforms (Azure or AWS), including native data services.
  • Hands-on experience implementing CI/CD for data pipelines.
  • Experience with data testing strategies (unit, integration, data validation).
  • Strong knowledge of SQL, including performance optimization and complex transformations.
  • Experience using dbt for data transformation, modelling, and dependency management.
  • Experience designing and maintaining large-scale data pipelines.
  • Ability to work autonomously and take technical ownership of deliverables.
  • Hands-on experience with Infrastructure as Code (IaC), particularly using Terraform.
  • Knowledge of NoSQL data stores.
  • Experience with data governance, lineage, and metadata management.
  • Familiarity with cost optimization strategies in cloud data platforms.
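
As a hedged illustration of the data-testing requirement above, here is a minimal row-validation check in plain Python; the column names and rules are hypothetical, chosen to mirror common dbt-style tests (not-null, unique, accepted range):

```python
def validate_rows(rows):
    """Return (row_index, error) pairs for rows failing basic data-quality rules."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Not-null check on the primary key (analogous to a dbt `not_null` test).
        if row.get("order_id") is None:
            errors.append((i, "order_id is null"))
            continue
        # Uniqueness check (analogous to a dbt `unique` test).
        if row["order_id"] in seen_ids:
            errors.append((i, "duplicate order_id"))
        seen_ids.add(row["order_id"])
        # Range check (analogous to an accepted-values / expression test).
        if row.get("amount", 0) < 0:
            errors.append((i, "negative amount"))
    return errors

good = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": 5.0}]
bad = good + [{"order_id": 1, "amount": -3.0}]

print(validate_rows(good))  # → []
print(validate_rows(bad))   # → [(2, 'duplicate order_id'), (2, 'negative amount')]
```

In practice checks like these would run inside the pipeline's CI (e.g. as dbt tests or unit tests), failing the build before bad data reaches the warehouse.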
HQ

Ardanis Dublin, Dublin, IRL Office

50 Richmond Street S, The Lennox Building, Iconic Office, Dublin, County Dublin, Ireland, D02 FK02
