Kargo unites the world's leading brands, retailers and premium publishers across screens using innovative technology and advanced creative ad formats. At Kargo, we're all about bringing together the best of the best with a spark of creativity to stand out from the crowd. The same is true for our employees. What makes Kargo and each Kargo team member exceptional makes our company special. Kargo believes differences should be celebrated and is committed to diversity in the workplace. As an Equal Opportunity employer, we do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, marital status, age, national origin, protected veteran status, disability or other legally protected status. Individuals with disabilities are provided reasonable accommodation to participate in the job application process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Founded in 2003, Kargo is a global company headquartered in New York with offices around the world.
Title: Senior Data Engineer
Job Type: Permanent, Remote
Job Location: Dublin, Ireland
The Opportunity
At Kargo, we are rapidly evolving our data infrastructure and capabilities to address the challenges of data scale, new onboarding and targeting methodologies, and rigorous privacy standards. We're looking for an experienced Senior Data Engineer to join our team, focusing on hands-on implementation, creative problem-solving, and exploring new technical approaches. You'll work collaboratively with our technical leads and peers, actively enhancing and scaling the data processes that drive powerful targeting systems.
The Daily To-Do
- Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies like Kafka and Flink.
- Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges.
- Support the definition and implementation of robust testing strategies, and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments.
- Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness.
- Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies.
- Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding.
Qualifications
- Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight.
- Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of data structuring for efficiency and performance.
- Experience with Airflow for building robust data workflows is strongly preferred.
- Familiarity with analytical data warehouses such as Snowflake or ClickHouse, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics.
- Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation.
- Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions.
- Excellent communication skills to effectively engage in discussions with technical teams and stakeholders.
- Familiarity with identity, privacy, and targeting methodologies in AdTech is required.
- Nice to have: Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus.
Follow Our Lead
- Big Picture: kargo.com
- The Latest: Instagram (@kargomobile) and LinkedIn (Kargo)

