BeiGene

Senior Manager, Data Platform and Solution Engineering

In-Office or Remote
Hiring Remotely in Warsaw, Warszawa, Mazowieckie
Senior level

BeOne continues to grow at a rapid pace with challenging and exciting opportunities for experienced professionals. When considering candidates, we look for scientific and business professionals who are highly motivated, collaborative, and most importantly, share our passionate interest in fighting cancer.

General Description:

Join BeOne's Global Data Strategy and Solutions team to build and scale a cutting-edge, fully integrated Enterprise Data and Analytics Platform that accelerates our journey from data to insights and the deployment of AI applications. The Senior Manager, Data Platform and Solution Engineering must be an expert in Databricks technologies, designing scalable, high-performance data solutions that empower our organization to ingest and curate data and build data products at scale. The ideal candidate will possess strong technical knowledge and experience in cloud data architectures, big data processing, and real-time analytics, coupled with the ability to collaborate cross-functionally to drive data-driven decision-making across the organization.

Essential Functions of the job:

The individual in this position should expect significant day-to-day variability in tasks and challenges.

Primary duties include but are not limited to the following:

  • Design and implement robust data architectures using Databricks, ensuring integration with existing systems and scalability for future growth.
  • Establish data management frameworks, optimizing ETL/ELT processes and data models for performance and accuracy.
  • Evaluate and recommend modern architectural patterns, including Lakehouse, Delta Live Tables, Data Mesh, and real-time streaming.
  • Drive rapid Proof-of-Concepts (POCs) to validate new architectural approaches, tools, and design patterns before enterprise rollout.
  • Partner with data engineers, scientists, and business stakeholders to develop seamless data pipelines prioritizing data integrity and usability.
  • Implement and uphold data governance practices that enhance data accessibility while ensuring compliance with regulations.
  • Integrate external systems, APIs, and cloud-native services to support new data products and analytics use cases.
  • Prototype and test new connectors, ingestion frameworks, and integration patterns to accelerate innovation.
  • Monitor data pipelines and infrastructure performance, troubleshooting issues as they arise and ensuring high availability.
  • Optimize and enhance existing data systems for performance, reliability, and cost-efficiency.
  • Collaborate with data analysts and data scientists to understand data requirements and implement solutions that support data-driven insights and models.
  • Monitor and enhance system performance, employing tools and methodologies to optimize data processing and storage solutions.
  • Optimize compute costs, job orchestration, workflow efficiency, and data storage strategies.
  • Troubleshoot and resolve data-related issues to maintain optimal system functionality.
  • Experiment with new Databricks features (Unity Catalog updates, AI/ML runtimes, Photon, DBRX, Delta Sharing, serverless SQL/compute, etc.) through quick hands-on evaluations.
  • Develop and enforce data governance standards, including data quality, security, and compliance through automation.
  • Innovation & Rapid Prototyping
    • Conduct fast-turnaround POCs to explore new technical capabilities, libraries, and features across Databricks, Azure, Informatica, Reltio, and other ecosystem tools.
    • Build lightweight demo pipelines, dashboards, and micro-solutions to demonstrate feasibility, guide architectural choices, and influence roadmap decisions.
    • Stay current with emerging technologies, industry trends, and platform advancements; translate insights into actionable recommendations.
    • Collaborate with vendors and internal teams to evaluate beta features, pilot new capabilities, and provide technical feedback for adoption decisions.

Education Required:  Bachelor’s Degree in Information Technology or a related field, or equivalent experience

Qualifications:

  • Proven experience (7+ years) in data architecture or in a similar role, with extensive experience in Databricks and cloud-based data solutions.
  • 7+ years of experience in solution engineering, platform architecture, or a related role, working in a cross-functional environment.
  • Strong proficiency in Apache Spark, Unity Catalog, Python, SQL, and data processing frameworks.
  • Experience with APIs and with integrating diverse technology systems.
  • Familiarity with modern development frameworks, DevOps methodologies, and CI/CD processes.
  • Experience with data warehousing solutions, delta lakes, and ETL/ELT processes.
  • Familiarity with cloud environments (AWS, Azure) and their respective data services.
  • Solid understanding of data governance, security, and compliance best practices.
  • Excellent communication and interpersonal skills, with an ability to articulate complex technical concepts to diverse audiences.
  • Databricks certifications or hands-on experience with Delta Lake and its cloud architecture are strongly preferred.
  • Familiarity with machine learning, AI frameworks, and data visualization tools (e.g., Tableau, Power BI, Spotfire).
  • A proactive approach to learning and implementing new technologies and frameworks.
  • Experience working with Life Sciences data, including exposure to R&D, Clinical Operations, TechOps, or Manufacturing domains. Understanding of key systems (CTMS, EDC, eTMF, LIMS, MES, PV systems), data models (CDISC, SDTM, ADaM), and typical data challenges (quality, lineage, integration, governance) is highly desirable.

Supervisory Responsibilities:  No

Global Competencies

When we exhibit our values of Patients First, Driving Excellence, Bold Ingenuity, and Collaborative Spirit, through our twelve global competencies below, we help get more affordable medicines to more patients around the world.

  • Fosters Teamwork
  • Provides and Solicits Honest and Actionable Feedback
  • Self-Awareness
  • Acts Inclusively
  • Demonstrates Initiative
  • Entrepreneurial Mindset
  • Continuous Learning
  • Embraces Change
  • Results-Oriented
  • Analytical Thinking/Data Analysis
  • Financial Excellence
  • Communicates with Clarity

We are proud to be an equal opportunity employer. BeOne does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Top Skills

Spark
AWS
Azure
Databricks
Informatica
Python
Reltio
SQL

