Senior Cloud Engineer for R&D Data & Analytics
Do you have experience with the newest and upcoming technologies, platforms and trends that can be exploited to create new business models or radically change current processes? Are you also looking for new challenges and are motivated by:
Working in tandem with fellow software engineers to identify and implement optimal cloud-based solutions
Ensuring application performance, uptime, and scale
Maintaining high standards of code quality based on thoughtful design
Defining and documenting best practices regarding application deployment and infrastructure maintenance
Sharing knowledge in order to increase cloud competencies
Managing cloud environments in accordance with company security guidelines
Participating in all aspects of the software development life cycle in accordance with the Agile methodology
Then you are the Senior Cloud Engineer we are looking for!
At LEO Pharma, cloud computing allows us to continuously modernize and consolidate IT infrastructure and automate workloads - thereby enabling next-generation innovation. To help us on this journey, we are seeking an experienced cloud engineer. The ideal candidate brings expertise in the design, deployment, and operation of infrastructure for software applications at scale. Hands-on experience with industry-standard technologies and platforms is essential to succeed in this role. The work will have a direct impact on the overall output of the team and will help our business become more secure and efficient.
We can offer you a job with plenty of responsibility and impact - and more of both, if you have the talent and ambition.
Challenges ahead!
LEO Pharma has embarked on a very ambitious growth journey to become the preferred dermatology care partner in the world. Global Research and Development (GRD) is at the forefront of this journey. GRD is a global team of approximately 1,000 permanent employees spread across the world. We drive the execution of all of LEO Pharma’s drug discovery, design, and development together with stakeholders and partners across LEO Pharma as well as external partners around the world.
At LEO Pharma we have recently formed a new unit in Global Research & Development to bring data-driven processes into our critical business paths, building data systems and applications to support decision making in our current industrial processes and drive new business opportunities. We serve a two-fold purpose: modernizing current drug development processes by focusing on more data-driven approaches, and seeking out entirely new business opportunities in dermatology using platform technologies that are data-driven at their core.
You will work in a cross-functional team of subject matter experts and technical experts. The team will be responsible for building and delivering use cases that support data-driven R&D. As part of the new unit, you and the team around you will be responsible for building smart and sustainable data products in close collaboration with the end users. Interaction with other departments is strong, using both shared projects and actual job rotations to foster cooperation and people development.
You will become part of a team with a pleasant and flexible work environment - a team that moves full speed ahead while staying open-minded and social.
Who are you?
You are an end-to-end problem solver who prefers to use the right tool for the job in close collaboration with your peers. You have experience supporting and handling cloud environments, dealing with multiple data sources, and supplying end users with systems and applications running in production. You are familiar with container applications and CI/CD pipelines. On a daily basis you will work in a focused and agile manner on a project-by-project basis. Finally, the person chosen for the position will bring a “can-do” attitude and strengthen our team spirit.
Preferred Qualifications and Experience
Extensive experience with at least one of the major cloud vendors (AWS, Azure, GCP)
Experience with monitoring and telemetry of software at scale
Experience operating distributed systems (e.g. Kafka, Kubernetes, Cassandra)
Experience building secure systems
Experience with Infrastructure-as-Code
Experience with deployment and rollout of container applications
Experience developing CI/CD pipelines
Proficiency in Linux administration
Proficiency in a general programming language (Python, Go, etc.)
Skilled in scripting and automation
The requirements listed above are considered necessary to be successful in this position. The following qualifications/experience would be a plus:
In-depth knowledge of Azure cloud
Experience with Terraform Cloud and module development
Proficiency with Go
Knowledge of HashiCorp’s suite of products
Knowledge of Policy-as-Code frameworks (e.g. Sentinel, OPA)
Interest in contributing to the data science, data engineering, or backend parts of our software products
Desired personal traits include curiosity, ownership, inclusiveness, a generalist mindset, and a can-do attitude.
Do you want to know more?
For further information, please contact Troels Ravn Bærentzen at +45 41851721.
Applications will be evaluated on a rolling basis, and candidates will be invited for interviews on an ongoing basis, so please apply as soon as possible - and no later than 19 March 2021.