The Data Architect will assemble large datasets, design and implement scalable data architectures and pipelines, and optimize data delivery processes. The role involves collaborating with stakeholders to resolve technical issues, advocating for Big Data technologies, and participating in Agile frameworks for project delivery.
Important Information
- Experience: 7–10+ years
- Job Mode: Full-time
- Work Mode: Hybrid
Job Summary
- Assemble large, complex datasets that meet functional and non-functional business requirements.
- Plan, create, and maintain data architectures aligned with business goals.
- Create and maintain optimal data pipeline architecture, focusing on automation and scalability.
- Identify, design, and implement internal process improvements, including automation of manual processes and optimization of data delivery.
- Propose infrastructure for optimal extraction, transformation, and loading (ETL) of data from diverse sources using SQL and Big Data technologies.
- Continuously audit data management systems to ensure performance, address breaches or gaps, and report findings to stakeholders.
- Recommend analytics tools to generate actionable insights into business performance metrics, including customer acquisition and operational efficiency.
- Collaborate with stakeholders (executives, product teams, and data teams) to resolve technical issues and support data infrastructure needs.
- Build and maintain strong relationships with senior stakeholders to help them leverage Big Data technologies for business solutions.
Responsibilities and Duties
- Assemble, optimize, and maintain large datasets tailored to business needs.
- Design and implement scalable, high-quality data architectures and pipelines.
- Automate workflows, optimize performance, and ensure scalability in infrastructure design.
- Conduct continuous performance audits of data systems and implement improvements as needed.
- Design tools to deliver actionable insights for business intelligence and analytics.
- Collaborate with cross-functional teams to address technical issues and enhance data operations.
- Support data migrations, including integration with platforms like MS Dynamics CRM or SharePoint.
- Actively participate in Agile delivery frameworks (Scrum, DSDM) to ensure quality results.
Qualifications and Skills
- Education: BS/MS in Computer Science, Engineering, Information Technology, or a related field, with programming experience.
- Proven experience (7–10+ years) in engineering, database modeling, design, and architecture for large-scale analytics projects.
- Expertise in SQL and relational database management, as well as Big Data technologies (Apache Spark, Databricks, Kafka, Hadoop).
- Deep knowledge of modern data architectures (e.g., Lambda architecture, streaming, Delta Lake).
- Experience with data pipeline tools (Azure Data Factory, Airflow) and Business Intelligence tools (SSAS, Power BI, Tableau).
- Familiarity with cloud services (Azure, AWS).
- Proficiency in programming languages such as Python, R, C#, or Java.
- Knowledge of Data Science, Machine Learning, and Artificial Intelligence trends.
- Strong understanding of industry best practices in data design, integration, and architecture.
- Experience working with Agile methodologies (Scrum, DSDM).
- Excellent English communication skills, both written and spoken.
Role-specific Requirements
- Extensive experience building and optimizing Big Data pipelines and architectures.
- Knowledge of Business Intelligence, analytics, and reporting technologies.
- Experience with data migrations and platforms such as MS Dynamics CRM and SharePoint.
- Strong knowledge of data trends, modern architectures, and scalable design.
- Customer-centric approach, with the ability to explain technical concepts clearly to non-technical stakeholders.
- Strong communication and collaboration skills in an international and virtual team setting.
- Proven ability to deliver quality results and foster strong client relationships.
Technologies
- Big Data: Apache Spark, Databricks, Snowflake, Kafka, Hadoop
- Data Pipeline Tools: Azure Data Factory, Airflow
- Business Intelligence Tools: SSAS, Power BI, Tableau
- Cloud Services: Azure, AWS
- Programming Languages: Python, R, C#, Java
Skillset Competencies
- Advanced SQL and Big Data pipeline optimization.
- Expertise in modern data architectures and ETL processes.
- Strong data migration and integration experience.
- Proficiency in analytics and reporting technologies.
- Excellent problem-solving, negotiation, and communication skills.
- Ability to work effectively in cross-functional, international teams.
- Strong client relationship management and quality delivery focus.
About Encora
Encora is a trusted partner for digital engineering and modernization, working with some of the world’s leading enterprises and digital-native companies. With over 9,000 experts in 47+ offices worldwide, Encora offers expertise in areas such as Product Engineering, Cloud Services, Data & Analytics, AI & LLM Engineering, and more. At Encora, hiring is based on skills and qualifications, embracing diversity and inclusion regardless of age, gender, nationality, or background.
Top Skills
C#
Java
Python
R
SQL