Big Data Modeller

Contract Type
This position is archived.
Job description

Our partner is an international shared-service company with its registered office in Munich. Its 7,500 employees across international locations offer innovative, top-quality products and services in the fields of IT Services and Operations to corporate customers around the globe.

We are looking for a Big Data Modeller with the following responsibilities:

  • Understand and translate business needs into data models for big data use cases and data lakes
  • Work with the rest of the development team to implement data strategies, build data flows/pipelines and develop data models
  • Create conceptual, logical and physical data models using best practices to ensure high data quality and reduce redundancy
  • Develop best practices for standard naming conventions and coding practices to ensure consistency of data models
  • Perform reverse engineering of physical data models from databases and SQL scripts
  • Evaluate data models and physical databases for variances and discrepancies
  • Validate business data objects for accuracy and completeness
  • Analyze data-related system integration challenges and propose appropriate solutions
  • Write technical documentation

Technical Skills:

The Global Data Platform is a diverse product consisting of several technologies. This is reflected in our cross-functional team of individuals who have a deep understanding of their technical specialty while also being able to work outside their core area. We encourage you to contribute your individual strengths and personality to the team, and we give you the room to develop new skills and gain new experiences. The following technical skills are required for this particular position:

Data Modelling:

  • 2+ years of direct modelling experience on commercial projects, ideally on cloud platforms, using a variety of methodologies and approaches
  • Must have hands-on experience in both SQL and NoSQL modelling (Data Vault 2.0, dimensional modelling, JSON, key-value stores) using industry-standard tooling

Data Engineering:

  • Experience with Data Engineering Patterns (Ingestion, Transformation, Storage, Consumption etc.)
  • Experience with complex data pipeline development
  • Experience with Data Governance and Quality concepts and tools
  • Experience with Data Virtualization concepts and tools (e.g. TIBCO, Denodo)

General IT Skills:

  • Experience with Linux
  • Experience with a public cloud platform (e.g. Amazon AWS or Microsoft Azure)
  • General understanding of Infrastructure, Orchestration and IT Security Principles (especially on an enterprise level)

Other Skills:

  • Bachelor's (BSc), Master's (MSc) or equivalent experience in a technical field (e.g. Computer Science, Engineering)
  • Willingness and ability to learn new technologies and tools
  • Team player open to working in an agile environment
  • Fluent English (written and spoken) is a must, other languages (e.g. German, French, Italian, etc.) are a plus
  • Ability to communicate in a result-oriented way with people from different departments and with different skill sets

We offer:

  • Competitive salary package according to the candidate’s knowledge and competences
  • Annual financial bonus – based on individual targets
  • Wide package of certified trainings
  • Possibility to develop existing and new skills
  • Support of work-life balance