
Chapter Lead Data Engineer


Date: 15-Jan-2022

Location: Amsterdam, NL

Company: HEINEKEN

The mission of Global Analytics is to lead HEINEKEN into becoming a data-driven company and the best-connected brewer. As a team, we act as an incubator for smart data products in all areas of the business, from sales to logistics and from marketing to purchasing. This approach has allowed us to grow rapidly and launch many value-creating use cases, from spare-parts management to media-spend allocation. Our focus for the upcoming year is to scale these and other use cases to as many countries as possible globally.

 

Currently we are looking for a Lead Data Engineer, who will also act as a people manager, to join our growing team of analytics experts. They will be responsible for the design, development, and implementation of ELT ingestion jobs and the data pipelines that support our smart data products, and will help shape the overall architecture. The Chapter Lead will typically work on multiple product teams at a time. Our team consists of Data Scientists, Data Engineers, Business Intelligence Specialists, and Analytics Translators.

 

Key responsibilities:

  • Work with Product Owners to help them leverage smart data solutions
  • Provide guidance on suitable options for designing and building data pipelines that connect analytical solutions in data lakes and data warehouses to, for example, microservices or APIs
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, using SQL and Azure and AWS 'big data' technologies
  • Build ETL jobs based on jointly defined requirements
  • Work closely with the backend developers as well as the IT data architecture group
  • Act as an evangelist for modern data engineering throughout the company
  • Be a strong advocate for a culture of process and data quality across development teams
  • Gather requirements and build roadmaps and architectures to help the Product Teams achieve their goals

 

Key requirements:

 

  • BS/MS degree in Computer Science, Engineering, Artificial Intelligence, Physics, or a related subject;
  • 6+ years of experience with modern data technologies such as Hadoop, Greenplum, Spark, CouchDB, Elasticsearch, etc.;
  • Strong cloud platform engineering experience in Azure or AWS;
  • Experience with data modelling, design patterns, and building highly scalable and secure analytical solutions;
  • Experience with SQL, scripting languages (Python, Perl, Ruby), UNIX shell scripting, and versioning systems (e.g., Git);
  • Strong hands-on, pragmatic problem-solving skills and experience working in organisations with an agile culture;
  • Good communication skills, a professional attitude, and service orientation; superb team player;
  • The attitude to thrive in a fun, fast-paced environment that operates like a start-up;
  • Fluency in English.
     

