
IDT

Data Engineer

Posted 16 Days Ago
In-Office or Remote
12 Locations
Senior level
This is a full-time, work-from-home opportunity for a star Data Engineer from LATAM.

IDT (www.idt.net) is an American telecommunications company founded in 1990 and headquartered in New Jersey. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1,300 people across 20+ countries, and have revenues in excess of $1.5 billion.

IDT is looking for a skilled Data Engineer to join our BI team and take an active role in data analysis, ELT/ETL design, and support functions, delivering on strategic initiatives that meet organizational goals.

Responsibilities:

  • Design, implement, and validate ETL/ELT data pipelines for batch processing, streaming integrations, and data warehousing, while maintaining comprehensive documentation and testing to ensure reliability and accuracy.
  • Maintain end-to-end Snowflake data warehouse deployments and develop Denodo data virtualization solutions.
  • Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
  • Stay current on emerging data technologies and support pilot projects, ensuring the platform scales seamlessly with growing data volumes. 
  • Architect, implement and maintain scalable data pipelines that ingest, transform, and deliver data into real-time data warehouse platforms, ensuring data integrity and pipeline reliability.
  • Partner with data stakeholders to gather requirements for language-model initiatives and translate them into scalable solutions.
  • Create and maintain comprehensive documentation for all data processes, workflows and model deployment routines.
  • Stay informed about emerging methodologies in data engineering and open-source technologies, and be willing to learn them.
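As an illustration of the batch pipeline work described above, here is a minimal extract-transform-load sketch in Python. The source data, field names, and integrity checks are purely hypothetical, and the load step is a stand-in for a real warehouse write (e.g. a Snowflake load):

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Parse a flat-file (CSV) source into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop rows that fail basic integrity checks."""
    clean = []
    for row in records:
        try:
            clean.append({
                "call_id": row["call_id"],
                "duration_sec": int(row["duration_sec"]),
                "country": row["country"].strip().upper(),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine malformed rows
    return clean

def load(records: list[dict]) -> str:
    """Stand-in for the warehouse load step."""
    return json.dumps(records)

# Hypothetical flat-file input; the second row fails the integrity check.
raw = "call_id,duration_sec,country\n1,120,us\n2,oops,br\n3,45,mx\n"
loaded = load(transform(extract(raw)))
print(loaded)
```

In production the same extract/transform/load stages would be orchestrated by a scheduler, with the documentation and tests the role calls for around each stage.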

Requirements:

  • 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
  • Excellent English communication skills.
  • Effective oral and written communication skills with the BI team and user community.
  • Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Experience designing and implementing event-driven pipelines that leverage messaging and streaming events to trigger ETL workflows and enable scalable, decoupled data architectures.
  • Experience in data analysis, root cause analysis and proven problem solving and analytical thinking capabilities.
  • Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources.
  • Demonstrated expertise in SQL and PL/SQL programming, with advanced mastery of Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
  • Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
  • Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies.
  • Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
  • Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights. 
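The event-driven pattern in the requirements can be sketched with a tiny in-memory message bus standing in for a real broker such as Kafka or SQS. The topic name, event schema, and handler below are hypothetical; the point is that the ETL step is triggered by an event rather than coupled to its producer:

```python
import json
from collections import defaultdict

class Bus:
    """Tiny in-memory message bus standing in for a real broker."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

warehouse = []  # stand-in for the target warehouse table

def on_file_landed(event):
    """ETL step triggered whenever a 'file landed' event arrives."""
    rows = json.loads(event["body"])
    warehouse.extend({**row, "source": event["path"]} for row in rows)

bus = Bus()
bus.subscribe("file.landed", on_file_landed)

# The producer only publishes an event; it knows nothing about the ETL step.
bus.publish("file.landed", {
    "path": "s3://example-bucket/orders.json",
    "body": json.dumps([{"order_id": 1}, {"order_id": 2}]),
})
print(len(warehouse))
```

The same decoupling carries over to a managed broker: producers emit events, and ETL consumers subscribe independently, which is what makes the architecture scalable.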

Pluses:

  • Experience in developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
  • Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows. 
  • Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda and open-source tools.
  • Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
  • Experience in Telecom, eCommerce, International Mobile Top-up.
  • Preferred certifications: AWS Solutions Architect, AWS Data Engineer, Snowflake SnowPro Core.

Please attach CV in English.
The interview process will be conducted in English.

Only accepting applicants from LATAM.

Top Skills

Amazon RDS
Amazon Redshift
AWS Glue
Azure Data Factory
dbt
Denodo
Looker
Pentaho Data Integration
PL/SQL
Python
Snowflake
SQL
