N-iX

Senior Data Engineer with Snowflake

Sorry, this job was removed at 08:17 a.m. (GMT) on Friday, May 23, 2025
Remote
28 Locations

Join an exciting journey to create a greenfield, cutting-edge Consumer Data Lake for a leading global organization based in Europe. This platform will unify, process, and leverage consumer data from various systems, unlocking advanced analytics, insights, and personalization opportunities. As a Senior Data Engineer, you will play a pivotal role in shaping and implementing the platform's architecture, focusing on hands-on technical execution and collaboration with cross-functional teams.

Your work will transform consumer data into actionable insights and personalization on a global scale. Using advanced tools to tackle complex challenges, you’ll innovate within a collaborative environment alongside skilled architects, engineers, and leaders.

Key Responsibilities:

  • Hands-On Development: Build, maintain, and optimize data pipelines for ingestion, transformation, and activation.
  • Create and implement scalable solutions to handle diverse data sources and high volumes of information.
  • Data Modeling & Warehousing: Design and maintain efficient data models and schemas for a cloud-based data platform.
  • Develop pipelines to ensure data accuracy, integrity, and accessibility for downstream analytics.
  • Collaboration: Partner with Solution Architects to translate high-level designs into detailed implementation plans.
  • Work closely with Technical Product Owners to align data solutions with business needs.
  • Collaborate with global teams to integrate data from diverse platforms, ensuring scalability, security, and accuracy.
  • Platform Development: Enable data readiness for advanced analytics, reporting, and segmentation.
  • Implement robust frameworks to monitor data quality, accuracy, and performance.
  • Security & Compliance: Implement robust security measures to protect sensitive consumer data at every stage of the pipeline.
  • Ensure compliance with data privacy regulations (e.g., GDPR, CCPA) and internal policies.
  • Monitor and address potential vulnerabilities, ensuring the platform adheres to security best practices.
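The data-quality monitoring mentioned above would typically be built with dedicated tooling (e.g., DBT tests), but the core idea can be sketched in plain Python. The record shape, field names, and checks below are hypothetical illustrations, not taken from the posting:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    """Aggregates the results of data-quality checks over one batch."""
    total: int = 0
    null_violations: int = 0
    duplicate_keys: int = 0
    errors: list = field(default_factory=list)

    @property
    def passed(self) -> bool:
        return self.null_violations == 0 and self.duplicate_keys == 0

def check_batch(records, key_field, required_fields):
    """Run basic completeness and uniqueness checks on a list of dicts."""
    report = QualityReport(total=len(records))
    seen_keys = set()
    for rec in records:
        # Completeness: every required field must be present and non-empty.
        for f in required_fields:
            if rec.get(f) in (None, ""):
                report.null_violations += 1
                report.errors.append(f"missing {f!r} in record {rec.get(key_field)!r}")
        # Uniqueness: the primary key must not repeat within the batch.
        key = rec.get(key_field)
        if key in seen_keys:
            report.duplicate_keys += 1
            report.errors.append(f"duplicate key {key!r}")
        seen_keys.add(key)
    return report
```

In a production pipeline these checks would run per ingestion batch, with the report driving alerting or quarantining of bad records before they reach downstream analytics.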

Requirements:

  • 4+ years of experience demonstrating technical expertise and critical thinking in data engineering.
  • Hands-on experience with DBT and strong Python programming skills.
  • Proficiency in Snowflake and expertise in data modeling are essential.
  • Demonstrated experience in building consumer data lakes and developing consumer analytics capabilities is required.
  • In-depth understanding of privacy and security engineering within Snowflake, including concepts like RBAC, dynamic/tag-based data masking, row-level security/access policies, and secure views.
  • Ability to design, implement, and promote advanced solution patterns and standards for solving complex challenges.
  • Familiarity with multiple cloud platforms (Azure preferred; GCP also valued).
  • Practical experience with Big Data batch and streaming tools.
  • Competence in SQL, NoSQL, relational database design (SAP HANA experience is a bonus), and efficient methods for data retrieval and preparation at scale.
  • Proven ability to collect and process raw data at scale, including scripting, web scraping, API integration, and SQL querying.
  • Experience working in global environments and collaborating with virtual teams.
  • A Bachelor’s or Master’s degree in Data Science, Computer Science, Economics, or a related discipline.
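The tag-based dynamic masking and role-based access concepts listed in the requirements are implemented natively in Snowflake via `CREATE MASKING POLICY`, tags, and row access policies. Purely as an illustration of the underlying idea, here is a plain-Python sketch; the tag set, role names, and column names are all hypothetical:

```python
# Columns tagged as PII and roles permitted to see raw values (assumed names).
PII_TAGS = {"email", "phone"}
UNMASKED_ROLES = {"DATA_PRIVACY_ADMIN"}

def apply_masking(row: dict, role: str) -> dict:
    """Return a copy of `row` with PII-tagged columns masked for
    roles outside the privileged set, mirroring how a tag-based
    masking policy resolves per-role at query time."""
    if role in UNMASKED_ROLES:
        return dict(row)
    return {
        col: ("***MASKED***" if col in PII_TAGS else val)
        for col, val in row.items()
    }
```

In Snowflake itself this logic lives in the database layer, so every consumer of the data sees consistently masked values regardless of which tool issues the query.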

We offer*:

  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

*not applicable for freelancers

