Senior Data Engineer (m/f/d)
For our client, a leading company in the MedTech field in Rotkreuz, we are currently looking for a Senior Data Engineer.
Background:
As a Senior Data Engineer you will lead the development and optimization of a robust data infrastructure that drives our DOD operations and data products. In this strategic role, you’ll work closely with multidisciplinary teams - including data scientists, subject matter experts, and fellow data engineers - to design, transform, and enhance data assets that are foundational to our advanced modelling approaches. You’ll take a lead role in shaping and scaling data product standards across Global Operations, ensuring alignment, quality, and impact at the organisational level.
Beyond hands-on contributions, you'll provide technical guidance to junior and experienced data engineers alike, establishing best practices and fostering a culture of innovation.
The ideal candidate holds a Master's degree in Computer Science or Data Engineering and brings more than 5 years of experience in data engineering, with a track record of architecting and scaling large data systems. We are also looking for proven experience in building and managing data pipelines and products.
General Info:
- Start date: ASAP - 01.09.2025
- End date: 31.08.2026, extension possible
- Workload: 100%
- Workplace: Rotkreuz
Tasks and Responsibilities:
- Lead Data Pipelines & Storage: Design and build scalable data pipelines for real-time and batch processing. Drive architectural decisions and long-term planning for scalable, FAIR data products.
- High-Quality Data Products: Create high-quality data products adhering to FAIR principles. Address complex challenges, ensure compliance, and make strategic decisions to shape data roadmaps.
- Collaboration & Integration: Model data landscapes, acquire data extracts, and define secure exchange methods in collaboration with experts and cross-functional teams.
- Data Ingestion & Processing: Ingest and process data from diverse sources into Big Data platforms (e.g., Snowflake). Develop ERDs and reusable pipelines for advanced analytics.
- Technical Guidance & Governance: Contribute to our Data Mesh Engineering Collective to establish data governance standards, ensure regulatory compliance and data security. Mentor others and promote best practices.
- Information Security & Infrastructure Collaboration: Ensure adherence to information security standards. Collaborate with infrastructure teams for tailored tech stacks. Make independent decisions on data strategies.
- Innovation & Knowledge Sharing: Shape the data engineering roadmap and set standards for data quality and governance. Proactively share best practices.
- Technical Proficiency: Maintain proficiency in data engineering tech stacks, data quality, and observability tools (e.g., Ataccama, Monte Carlo).
- Adherence to Standards: Ensure compliance with relevant guidelines and data governance standards. Develop long-term enterprise tools.
Your Profile:
- Master’s degree in Computer Science, Data Engineering, or a related field.
- Over 5 years in data engineering with a track record in architecting and scaling large data systems.
- 5+ years of experience in leading and mentoring data engineers.
- Proven experience in building and managing data pipelines and products.
- Skilled in handling structured, semi-structured, and unstructured data.
- Proficiency in Python, Java, SQL, or Scala, and experience with big data technologies (e.g., Hadoop, Spark).
- Expertise in multiple cloud platforms (AWS, Azure, GCP) and data warehousing technologies (preferably Snowflake).
- Deep understanding of Information Security to ensure compliant handling and management of process data.
- Familiarity with data modeling and ETL tools.
- Knowledge of version control systems like Git and CI/CD pipelines.
- Proficiency in implementing robust testing practices and monitoring pipelines for performance, reliability, and data quality.
- Client-facing project experience.
- Proven ability to communicate complex solutions to varied technical audiences.
- Strong organizational and interpersonal skills for delivering results and optimizing resources.
- Ability to work independently and collaboratively within a team environment.
- Strong ability to influence and collaborate with stakeholders, build trust, and reliably deliver solutions.
Nice to have:
- 3+ years of experience in the pharmaceutical or healthcare industry.
- Experience with REST APIs and integrating data from various sources.
- Knowledge of regulatory requirements (e.g., GMP, FDA) and Quality systems.
- Experience with AI-driven data solutions and machine learning pipelines.
- Experience with ML platforms (e.g., Dataiku).
- Knowledge of software engineering best practices (code reviews, testing, maintainability).
Does this sound like an exciting position?
Then we look forward to receiving your complete application documents via the online form.
By applying via e-mail, the sender agrees that the submitted data will be used in accordance with our privacy policy.
Further open positions can be found here: coopers.ch
Job profile
- IT - Data Science / Digital
- Contracting
- Full Time
- ASAP - 01.09.2025 - 31.08.2026 - Extension possible
- Python, Java, SQL, Data Engineer, Data Modelling, CI/CD, ETL

Sounds interesting?
Simply click "Apply now" and I will get back to you.
Anna Maria Diomatari
Talent Acquisition Consultant
+41 41 632 43 35
anna.diomatari@coopers.ch