Kargo

Senior Data Engineer

In-Office
Dublin
Senior level

Kargo unites the world's leading brands, retailers and premium publishers across screens using innovative technology and advanced creative ad formats. At Kargo, we're all about bringing together the best of the best with a spark of creativity to stand out from the crowd. The same is true for our employees. What makes Kargo and each Kargo team member exceptional makes our company special.

Kargo believes differences should be celebrated and is committed to diversity in the workplace. As an Equal Opportunity employer, we do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, marital status, age, national origin, protected veteran status, disability or other legally protected status. Individuals with disabilities are provided reasonable accommodation to participate in the job application process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Founded in 2003, Kargo is a global company headquartered in New York with offices around the world.

Title: Senior Data Engineer

Job Type: Permanent, Remote

Job Location: Dublin, Ireland

The Opportunity

At Kargo, we are rapidly evolving our data infrastructure and capabilities to keep pace with growing data scale, new onboarding and targeting methodologies, and increasingly rigorous privacy standards. We're looking for an experienced Senior Data Engineer to join our team, focusing on hands-on implementation, creative problem-solving, and exploring new technical approaches. You'll work collaboratively with our technical leads and peers, actively enhancing and scaling the data processes that drive powerful targeting systems.

The Daily To-Do

  • Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink.
  • Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges.
  • Support the definition and implementation of robust testing strategies, and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments.
  • Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness.
  • Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies.
  • Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding.

Qualifications:

  • Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.
  • Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance.
  • Experience with Airflow for building robust data workflows is strongly preferred.
  • Familiarity with analytical data warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics.
  • Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.
  • Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.
  • Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.
  • Familiarity with identity, privacy, and targeting methodologies in AdTech is required.
  • Nice to have: Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.

Follow Our Lead

  • Big Picture: kargo.com
  • The Latest: Instagram (@kargomobile) and LinkedIn (Kargo)

Top Skills

Aerospike
Airflow
ArgoCD
AWS
Confluence
Docker
Flink
Iceberg
JIRA
Kafka
Kubernetes
Prometheus
Python
Snowflake
Spark
SQL
