
Kargo

Senior Data Engineer

Posted 7 Days Ago
In-Office
Dublin
Senior level

Kargo creates breakthrough cross-screen ad experiences for the world’s leading brands and publishers. Every day, our 600+ employees bring the power of their creativity and diversity to radically raise the bar on what mobile, CTV, AI, social, and eCommerce can do to wow consumers and build businesses. Now 20 years strong, Kargo has offices in NYC, Chicago, Austin, LA, Dallas, Sydney, Auckland, London, and Waterford, Ireland. Humble brag: in 2024, Kargo was recognized as a Best Place to Work by Ad Age and Built In.

Who We Hire

Success takes all kinds. Diversity describes our workforce. Inclusion defines our culture. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, marital status, age, national origin, protected veteran status, disability or other legally protected status. Individuals with disabilities are provided reasonable accommodation to participate in the job application process, perform essential job functions, and receive other benefits and privileges of employment. 

Title: Senior Data Engineer

Job Type: Permanent, Remote

Job Location: Dublin, Ireland

The Opportunity

At Kargo, we are rapidly evolving our data infrastructure and capabilities to address challenges of data scale, new methodologies for onboarding and targeting, and rigorous privacy standards. We're looking for an experienced Senior Data Engineer to join our team, focusing on hands-on implementation, creative problem-solving, and exploring new technical approaches. You'll work collaboratively with our technical leads and peers, actively enhancing and scaling the data processes that drive powerful targeting systems.

The Daily To-Do

  • Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink (an illustrative sketch of such a pipeline follows this list).
  • Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges.
  • Support the definition and implementation of robust testing strategies, and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments.
  • Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness.
  • Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies.
  • Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding.
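
For context, here is a minimal, purely illustrative Airflow DAG sketching the kind of ETL/ELT pipeline this role owns. It is not Kargo's actual pipeline: the dag_id, task names, and the extract/load callables are hypothetical placeholders, and the schedule argument assumes Airflow 2.4 or later.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull the raw event partition for the run date (e.g. from S3 or Kafka).
    print(f"extracting events for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: write the transformed partition to an Iceberg table or Snowflake.
    print(f"loading partition for {context['ds']}")


with DAG(
    dag_id="example_events_etl",  # hypothetical name, not a real Kargo pipeline
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load

In practice the transform step would typically run as a Spark job writing Iceberg tables, with Snowflake or Aerospike downstream, per the stack listed above.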

Qualifications

  • Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.
  • Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance.
  • Experience with Airflow for building robust data workflows is strongly preferred.
  • Familiarity with analytical warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics.
  • Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.
  • Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.
  • Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.
  • Familiarity with identity, privacy, and targeting methodologies in AdTech is required.
  • Nice to have: Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.

Follow Our Lead

  • Big Picture:  kargo.com
  • The Latest:  Instagram (@kargomobile) and LinkedIn (Kargo)

Top Skills

Aerospike
Airflow
ArgoCD
AWS
Confluence
Docker
Flink
Iceberg
JIRA
Kafka
Kubernetes
Prometheus
Python
Snowflake
Spark
SQL
