
Keyrock

Data Architect (Trading)

Posted 10 Days Ago
In-Office or Remote
29 Locations
Mid level

Data Architect

About Keyrock

Since our beginnings in 2017, we've grown to be a leading change-maker in the digital asset space, renowned for our partnerships and innovation. 

Today, we rock with over 180 team members around the world. Our diverse team hails from 42 nationalities, with backgrounds ranging from DeFi natives to PhDs. Predominantly remote, we have hubs in London, Brussels, Singapore and Paris, and host regular online and offline hangouts to keep the crew tight.

We trade on more than 80 exchanges and work with a wide array of asset issuers. As a well-established market maker, our distinctive expertise has led us to expand rapidly. Today, our services span market making, options trading, high-frequency trading, OTC, and DeFi trading desks.

But we’re more than a service provider. We’re an initiator. We're pioneers in adopting the Rust programming language for our algorithmic trading, and champions of its use in the industry. We support the growth of Web3 startups through our Accelerator Program. We upgrade ecosystems by injecting liquidity into promising DeFi, RWA, and NFT protocols. And we push the industry's progress with our research and governance initiatives.

At Keyrock, we're not just envisioning the future of digital assets. We're actively building it.

Position Overview

The Data Architect is responsible for designing, implementing, and maintaining the organization's data architecture and strategy, ensuring that data is collected, stored, and processed efficiently and securely to support business intelligence, data analytics, and machine learning operations (MLOps).

Key Responsibilities
  • Designing Data Architecture: Plan and implement a robust, scalable data architecture that integrates data from various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements.

  • Implementing Data Engineering Pipelines: Design and develop data pipelines for data extraction, transformation, and loading (ETL) processes, ensuring data quality and consistency.

  • Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives.

  • Supporting MLOps Practices: Collaborate with data scientists and machine learning engineers to design and implement data infrastructure and processes that support machine learning model development, deployment, and maintenance.

  • Ensuring Data Security and Compliance: Implement security measures, policies, and procedures to safeguard data privacy and comply with relevant regulations.

  • Data Governance and Management: Establish and enforce data governance policies and standards to ensure data quality, integrity, and accessibility.

  • Collaborating with Cross-Functional Teams: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.

  • Staying Abreast of Technological Advancements: Keep up to date with emerging technologies and trends in data architecture, data engineering, and MLOps to identify opportunities for improvement and innovation.

  • Optimizing Data Performance: Monitor and analyze data processing performance, identify bottlenecks, and implement optimizations to enhance efficiency and scalability.

  • Documentation and Knowledge Sharing: Create and maintain comprehensive documentation of data architecture, models, and processing workflows.

Technical Requirements
  • Extensive experience in data architecture design and implementation.

  • Strong knowledge of data engineering principles and practices.

  • Expertise in data warehousing, data modelling, and data integration.

  • Experience in MLOps and machine learning pipelines.

  • Proficiency in SQL and data manipulation languages.

  • Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and Clickhouse) and cloud-based infrastructure on AWS.
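To give a flavour of the SQL and ETL work the requirements above describe, here is a minimal, self-contained sketch using Python's built-in sqlite3 module. All table names, columns, and figures are hypothetical illustrations, not taken from the posting or Keyrock's actual stack.

```python
import sqlite3

# Hypothetical minimal ETL sketch: extract raw trade rows, transform
# (aggregate notional volume per symbol), and load into a reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty REAL, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("BTC", 0.5, 64000.0), ("BTC", 0.25, 64800.0), ("ETH", 2.0, 3200.0)],
)

# Transform + load: total notional (qty * price) per symbol
conn.execute(
    """CREATE TABLE volume_by_symbol AS
       SELECT symbol, SUM(qty * price) AS notional
       FROM trades
       GROUP BY symbol"""
)
rows = conn.execute(
    "SELECT symbol, notional FROM volume_by_symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('BTC', 48200.0), ('ETH', 6400.0)]
```

In a production setting the same pattern would run at far larger scale over columnar engines such as Spark or ClickHouse, with Arrow or Iceberg handling storage and interchange.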

Education & Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent experience.

  • Preferred certifications (optional):

    • AWS Cloud Data Engineer

    • AWS Machine Learning Ops Engineer

Leadership & Collaboration
  • Passion for building scalable, reliable, and secure systems in a fast-paced environment.

  • Ability to translate complex technical concepts into clear, actionable insights for technical teams.

  • Strong interpersonal skills with the ability to work effectively across cross-functional teams.

  • Excellent problem-solving and analytical skills.

Our recruitment philosophy

We value self-awareness and powerful communication skills in our recruitment process. We seek fiercely passionate people who understand themselves and their career goals. We're after those with the right skills who have made a conscious choice to join our field. The perfect fit? A crypto enthusiast who’s driven, collaborative, acts with ownership, and delivers solid, scalable outcomes.

Our offer

  • Competitive salary package

  • Autonomy in your time management thanks to flexible working hours and the opportunity to work remotely 

  • The freedom to create your own entrepreneurial experience by being part of a team of people in search of excellence 

As an employer we are committed to building a positive and collaborative work environment. We welcome employees of all backgrounds, and hire, reward and promote entirely based on merit and performance.

Due to the nature of our business and external requirements, we perform background checks on all potential employees; passing them is a prerequisite to joining Keyrock.

https://keyrock.com/careers/

Top Skills

Apache Arrow
Apache Iceberg
Spark
AWS
Clickhouse
Rust
SQL


