- Topsoe A/S is undergoing a strategic shift towards the company vision of being recognized as a global leader in carbon emission reduction technologies by 2024.
- We are offering three attractive (Senior/Lead) Data Engineer positions in our Digitalization department, delivering advanced IoT-connected services to the renewable chemicals market.
- The positions are based at our headquarters in Copenhagen, Denmark.
Purpose of role
- Act as the subject matter expert within the data engineering discipline, with a primary focus on delivering leading digital services and products that meet customers' business needs.
- Responsible for evolving our data pipeline architecture, as well as improving the data flow and data collection for cross-functional product teams.
- Conceptualize and implement high-quality ETL data pipelines, data frameworks, and data warehouses on our Microsoft Azure IoT platform.
- Develop advanced data processing solutions and data-intensive applications. This could include machine learning pipelines, scheduled data import and export, data retrieval and mapping, and efficient data layers that enable advanced analytics for data scientists.
- Build analytics tools that utilize the data pipeline to provide actionable insights.
- Drive long-term initiatives to improve data quality and development/code management using DevOps principles, promote shared methodologies, and advance best practices within cross-functional product teams.
- Collaborate closely with platform teams, DevOps engineers on core systems, and the data science team.
- Build the foundation for our IoT-connected services, which help lower our customers' carbon footprint, by designing, developing, and optimizing our data warehouse and large-scale batch and real-time ETL data pipelines using state-of-the-art cloud technologies.
- Pre-process, clean, and structure data to facilitate data exploration and advanced analytics/AI activities.
- Work closely with subject matter experts to understand the requirements for the end product and deliver data models for optimal performance.
Expected mission-critical behaviours
- Set ambitious goals for your work, aligned with business priorities
- Trust others and seek to gain their trust
- Relentlessly drive to make things better
- Communicate expectations and give open and honest feedback
- Collaborate in the pursuit of company goals
Qualifications
- You hold a relevant academic degree such as an M.Sc. or Ph.D. in Applied Mathematics, Chemical/Process Engineering, Computer/Data Science, Data Engineering, or equivalent, and have solid experience with data modeling, data management, databases, etc.
- 5+ years of experience with ETL processing in a cloud data platform, and extensive hands-on experience with cloud service provider offerings, preferably Azure services such as Azure Data Lake, Azure Data Factory, and Logic Apps.
- Solid skills in imperative programming languages, preferably Python. Experience with Spark/Databricks, containerization with Docker, orchestration with Kubernetes, and database solutions (SQL/NoSQL) is also valuable.
- Experience with Power BI and DAX is a strong plus. Most importantly, however, you are motivated and able to learn anything you might be missing in order to develop as much as possible within this exciting area.
- You demonstrate a high level of self-leadership and are motivated by challenge and collaboration.
- You have a burning curiosity and interest in data, machine learning, fast data, big data, data governance, data models, data warehousing, data insights, etc.
- Strong communication skills and proficiency in both spoken and written English.
- You enjoy working in product teams with a fast-paced DevOps mindset that incorporates core Agile principles, and you are comfortable taking ownership of decisions in your field of expertise.
No terminology in this advert is intended to discriminate on the grounds of gender, race, age, national origin, or sexual orientation, and we confirm that we are happy to accept applications from persons of any age or experience level for this role.