Apply now »

Senior Data Analytics/Solutions Engineer

Digital & Technology Team (D&T) is an integral division of HEINEKEN Global Shared Services Center. We are committed to making Heineken the most connected brewery. That includes digitalizing and integrating our processes, ensuring best-in-class technology, and embedding a data-driven culture. By joining us, you will work in one of the most dynamic and innovative teams and have a direct impact on building the future of Heineken!



Would you like to meet the team, see our office, and much more? Visit our website: Heineken (heineken-dt.pl)


As a Senior Data Analytics/Solutions Engineer, you will be driving the development of automated data mapping capabilities within the Data Mapping Chapter. Your role sits at the intersection of analytics engineering, data engineering, and data management, enabling the delivery of scalable, high-quality data mapping artifacts. You will design, build, and operate backend and data solutions on Azure, working with modern platforms such as Databricks, Azure DevOps, and Unity Catalog. You will collaborate closely with data mapping specialists, data quality specialists, data engineers, data business analysts, and domain experts to build resilient data mapping capabilities aligned with our enterprise data strategy. In addition, this role focuses on applying advanced analytical and machine learning techniques to improve the automation and intelligence of data mapping processes, including algorithmic matching, entity resolution, semantic modelling, and knowledge graph-based approaches.

 

Your responsibilities would include:

 

  • designing and implementing automated data mapping solutions using metadata, semantics, and transformation logic
  • translating business definitions and source-to-target mappings into scalable, reusable data transformations
  • contributing to the evolution of metadata-driven mapping approaches, including lineage and semantic models
  • supporting automation of mapping use cases such as standardisation, harmonisation, and matching
  • designing and applying algorithmic and ML-driven matching solutions to improve mapping automation
  • implementing entity resolution techniques (e.g. similarity scoring, probabilistic matching) at scale using Python and PySpark
  • developing and maintaining semantic models, including ontologies or knowledge graph–based structures, to improve mapping quality and reusability
  • assessing and refining matching approaches using quality metrics and practical performance considerations
  • collaborating with data engineers to ensure mapping logic aligns with Databricks, Lakehouse, and Medallion architecture principles
  • applying strong knowledge of PySpark, SQL, Delta Lake, and Python to influence pipeline and transformation design
  • ensuring mapping logic is scalable, transparent, and aligned with data governance standards
  • contributing to shared CI/CD, testing, and deployment practices for data pipelines
  • promoting reusable patterns, documentation standards, and technical best practices within the chapter
  • working closely with data mapping specialists, analysts, and domain experts to deliver solutions that meet real business needs.
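One of the responsibilities above is implementing entity resolution techniques such as similarity scoring in Python and PySpark. As a minimal, hedged sketch of the core idea only (not the team's actual implementation), the example below scores candidate matches with token-set Jaccard similarity and keeps pairs above a threshold; the names, threshold, and helper functions are hypothetical, and at scale this logic would typically run as a PySpark job with blocking to avoid a full cross join.

```python
# Illustrative sketch of similarity-based entity matching.
# All records, names, and the threshold below are hypothetical examples.

def normalise(name: str) -> set:
    """Lowercase a name and split it into a set of tokens."""
    return set(name.lower().replace(",", " ").replace(".", " ").split())

def jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity between two names (0.0 to 1.0)."""
    ta, tb = normalise(a), normalise(b)
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

def match_candidates(sources, targets, threshold=0.5):
    """Return (source, target, score) pairs at or above the threshold,
    best score first -- a toy stand-in for matching at scale."""
    pairs = [(s, t, jaccard(s, t)) for s in sources for t in targets]
    return sorted(
        (p for p in pairs if p[2] >= threshold),
        key=lambda p: -p[2],
    )

if __name__ == "__main__":
    sources = ["Heineken Brouwerij", "Amstel Brouwerij"]
    targets = ["heineken brouwerij nv", "grolsche bierbrouwerij"]
    for s, t, score in match_candidates(sources, targets):
        print(f"{s!r} -> {t!r} (score={score:.2f})")
```

In production, the pairwise scoring step would be replaced by a blocked or approximate-nearest-neighbour comparison, and probabilistic matching frameworks would weight multiple attributes rather than a single name field.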

 

You are a good candidate if you have:

 

  • strong experience (senior level) in data engineering, analytics engineering, or data platform work
  • hands-on experience delivering data transformations at scale using Databricks, PySpark, and SQL
  • solid Python skills for data processing and automation
  • experience applying advanced analytical or machine learning methods to data transformation, matching, or semantic problems
  • strong problem-solving skills in designing algorithms for data quality, similarity, and entity alignment, rather than purely rule-based transformations
  • a good understanding of data management concepts, including data quality, semantics, modelling, and data contracts
  • experience working with metadata, lineage, and governance tooling
  • familiarity with Azure-based data platforms and enterprise data environments
  • the ability to collaborate effectively with platform and data engineers, focusing on data logic, algorithms, and analytical solutions rather than infrastructure or service ownership
  • confidence explaining technical solutions to both technical and non-technical stakeholders
  • excellent written and verbal English
  • strong collaboration skills, enabling effective work with data engineers, data mapping specialists, analysts, and domain experts
  • the capability to guide technical discussions, review designs and code, and promote pragmatic best practices within the chapter.

 

At HEINEKEN Kraków, we take integrity and ethical conduct seriously. If someone has concerns about a possible violation of the legal regulations indicated in the Polish Whistleblowing Act or our Code of Business Conduct, we encourage them to speak up. Cases can be reported to the global team or locally (in line with the local HGSS Whistleblowing procedure) by selecting the proper option in this tool or by reporting via the hotline.

 

We Offer:

#LI-HYBRID #LI-MO1 


