Senior Managing Consultant, BI Solutions, DWH, Cloud Consultant
Updated on 27.09.2024
Profile
Employee of a service provider
Remote work
Available from: 01.10.2024
Availability: 100%
of which on-site: 25%
Skill profile of a permanently employed member of the service provider's staff
Dutch
Native language
English
Business fluent
German
Business fluent

Work locations

Germany, Switzerland, Austria
possible

Projects

9 months
2024-01 - 2024-09

Data Engineering, Data Modeling

Cloud Data Engineer / Data Modeler

Working for the Security Insights department, handling the processes for planning and forecasting of security measures in the following areas:

  • Security staff
  • Lane performance
  • Security assets
  • Incidents

As part of this, parts of the data have to be remodeled for performance and business needs.

Databricks, DBT, Synapse, Python, Visual Studio Code, Azure DevOps, Azure Synapse, Jira, SQL Server Studio, DBeaver, SQLFluff
Schiphol Group
3 years 2 months
2021-08 - 2024-09

Cloud Data Architecture

Cloud Data Architect / Lead Developer
Setting up an architecture using Snowflake, DBT and Airflow inside the existing Azure Cloud, replacing the current implementation, an External Application Design Engine.
In the new architecture, we incorporated the existing delivery of data coming from an Apache Kafka message service, Web APIs and SharePoint by setting up automatic loading into Snowflake using Snowpipe. In Snowflake we set up CDC using Snowflake Streams to keep the workload as lean as possible.
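A minimal sketch of how such a Snowpipe auto-ingest pipe and a change-tracking stream can be set up; the connection parameters and all object names (RAW_DB, KAFKA_STAGE, EVENTS_RAW, ...) below are illustrative assumptions, not the actual project objects.

    # Illustrative sketch: create a Snowpipe auto-ingest pipe and a stream for CDC.
    # All object names and credentials are assumptions.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # assumed placeholder
        user="loader_user",
        password="***",
        warehouse="LOAD_WH",
        database="RAW_DB",
        schema="LANDING",
    )
    cur = conn.cursor()

    # Pipe that auto-ingests files arriving on an external stage (e.g. fed by Kafka / APIs).
    cur.execute("""
        CREATE PIPE IF NOT EXISTS LANDING.EVENTS_PIPE
          AUTO_INGEST = TRUE
          AS COPY INTO LANDING.EVENTS_RAW
             FROM @LANDING.KAFKA_STAGE
             FILE_FORMAT = (TYPE = 'JSON')
    """)

    # Stream on the landing table so downstream models only process changed rows (CDC).
    cur.execute("""
        CREATE STREAM IF NOT EXISTS LANDING.EVENTS_RAW_STREAM
          ON TABLE LANDING.EVENTS_RAW
    """)

    cur.close()
    conn.close()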
DBT is used to define all the transformations, documentation, tests and automation by using macros. For the different brands within Amer Sports, such as Salomon, Peak, Wilson and others, we introduced Row Level Security so that each brand can only see its own data. In addition, we implemented a second level of Row Level Security on countries within a brand.
For each brand we set up dedicated environments and gave them guidelines on how to use Snowflake, DBT, Airflow and Azure. The environments have the same structure as the environments for Amer Sports.
In close cooperation with Wilson and Salomon we raised the quality of the data to a higher level by integrating historical data.
Added models for monitoring the costs of warehouses and cloud services in Snowflake. As a pilot to see how Snowpark and Streamlit can be used together, I wrote a Python script that models the costs and shows them in a dashboard.
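A hedged sketch of such a Snowpark/Streamlit cost pilot; it uses Snowflake's standard SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view, while the connection parameters and the exact aggregation are assumptions.

    # Sketch of a Snowpark + Streamlit dashboard for warehouse credit consumption.
    # Connection parameters are placeholders; the metering view is Snowflake's standard one.
    import streamlit as st
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    session = Session.builder.configs({
        "account": "my_account",   # assumed placeholder
        "user": "report_user",
        "password": "***",
        "warehouse": "REPORT_WH",
    }).create()

    # Credits used per warehouse over the retention window of the account-usage view.
    costs = (
        session.table("SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY")
        .group_by(col("WAREHOUSE_NAME"))
        .agg(sum_(col("CREDITS_USED")).alias("TOTAL_CREDITS"))
        .to_pandas()
    )

    st.title("Snowflake warehouse costs")
    st.bar_chart(costs.set_index("WAREHOUSE_NAME")["TOTAL_CREDITS"])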
Additional responsibilities:
  • Data modelling using facts and dimensions, phasing out data vault
  • Redesign and implementation of the roles in Snowflake
  • Setup of warehouse usage in Snowflake
  • Setup of DBT
  • Setup and configuration of Airflow jobs
  • Guiding the development team
  • Planning of issues and resources
  • Guiding the business teams of the brands

Snowflake, Snowpipe, Snowpark, Streamlit, Azure, ADF, Logic App, Databricks, Storage Accounts, DBT, Airflow, Python, Jupyter Notebook, Visual Studio Code, Atom, GitLab, Data Vault 2.0, Jira, DBeaver, Confluence
Amer Sports
6 years 10 months
2017-12 - 2024-09

Data Information Architecture & Cloud Consulting

Data Information Architect & Cloud Consultant
One of the projects is a BI Portal: companies send their data to us, and we deliver all the analytics that can be done on that data.
For this project we use a combination of different cloud providers such as AWS and GCP. For data storage we currently use Snowflake, but we are also looking at Apache Druid. For the continuous delivery of data, we are building a combination of Snowpipe and/or Apache products in combination with AWS Lambda.
For ETL/scheduling we are evaluating Airflow/Python, Apache Nifi and Talend Open Studio. For the analyses we use Apache Superset as it is very flexible.
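For the Airflow/Python option, a minimal DAG sketch of the kind used in such an evaluation; the DAG id, schedule and the load function are purely illustrative assumptions.

    # Minimal Airflow DAG sketch for the scheduling evaluation; all names are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_portal_data(**context):
        # Placeholder for the actual extract/load step (e.g. triggering Snowpipe or an API pull).
        print("loading data for", context["ds"])


    with DAG(
        dag_id="bi_portal_daily_load",   # assumed name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_portal_data",
            python_callable=load_portal_data,
        )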
As part of an investigation into using Matillion as an ETL tool, I set up my own Matillion instance to look at the integration of Git with Matillion across the different environments and at the use of all the different kinds of variables. One of the shortcomings is that grid variables have a limit of 5000 items.
As part of the investigation of the front end, I am using S3, Hugo and React to publish static and dynamic sites.
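For the static part (e.g. Hugo output pushed to S3), a small boto3 upload sketch; the bucket name and local path are assumptions.

    # Sketch: upload a generated static site (e.g. Hugo's public/ folder) to an S3 bucket.
    # Bucket name and paths are assumptions.
    import mimetypes
    from pathlib import Path

    import boto3

    s3 = boto3.client("s3")
    bucket = "bi-portal-static-site"          # assumed bucket
    site_dir = Path("public")                 # Hugo's default output directory

    for path in site_dir.rglob("*"):
        if path.is_file():
            key = str(path.relative_to(site_dir))
            content_type = mimetypes.guess_type(path.name)[0] or "binary/octet-stream"
            s3.upload_file(str(path), bucket, key, ExtraArgs={"ContentType": content_type})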

Amazon Web Services, EC2, S3, Lambda, SQL Workbench/J, Talend Open Studio, Tableau, Snowflake, Apache Druid, Apache Superset, Apache Airflow, Apache Nifi, Git, Hugo, React, Python
Gray Trails
1 year 6 months
2022-08 - 2024-01

Cloud Data Architecture

Cloud Data Architect / Lead Developer

Setting up an architecture using Snowflake and DBT Cloud inside the existing AWS Cloud for data quality monitoring and financial reporting on PowerCloud data.

In the architecture, we incorporated the data coming from PowerCloud using Snowflake stages and macros in DBT Cloud. In Snowflake we set up a data model and business rules to show customers of PowerCloud the level of errors and to report on their key figures. The project is an ongoing development to deliver several products for customers of PowerCloud.

Additional responsibilities:

  • Data modelling using facts and dimensions
  • Design and implementation of the roles in Snowflake
  • Setup of warehouse usage in Snowflake
  • Setup of DBT
  • Guiding the development team
  • Guiding the business team

I also gave a course on understanding and using Snowflake to a group of 8 people from WTS to get them ready to earn their first badge in the Snowflake certification track.
Snowflake, Amazon Web Services, S3, DBT Cloud, Python, Visual Studio Code, Atom, GitHub, Trello, DBeaver, Apache Superset, SAP Analytics Cloud, PowerCloud
WTS Digital GmbH
1 year 4 months
2021-06 - 2022-09

Lead Development

Lead Developer
  • Leading the team in how to use the current development environment.
  • Reviewing built models in DBT.
  • Reviewing pull requests in Azure DevOps.
  • Setting up data pipelines for energy vouchers and energy tax information to the government institution.
  • Helping to implement CDC in the current processes for Bila.
  • Helping to extend data pipelines for Gutver (Credit Tax) information.
  • Building a PoC for Gutver (Credit Tax) with Rapid Application Development.
  • Helping to build a PoC for Open Posts (FOMA), performing tests and testing performance.
  • Helping the team in how to use Data Vault, dbt, Snowflake and cloud architecture.

Snowflake, Azure, DBT, Airflow, Python, Visual Studio Code, Atom, GitHub, Data Vault 2.0, Jira, DBeaver, Confluence, PowerCloud
E.ON
6 months
2020-12 - 2021-05

Data Engineering

Data Engineer
  • Setting up data pipelines for energy vouchers and energy tax information to the government institution.
  • Setting up data pipelines for Gutver (Credit Tax) information.
  • Helping the team in how to use Data Vault, Snowflake and cloud architecture.

Snowflake, Azure, DBT, Airflow, Python, Visual Studio Code, Atom, GitHub, Data Vault 2.0, Jira, DBeaver, Confluence, PowerCloud
E.ON
5 months
2020-10 - 2021-02

Data Information Architecture

Data Information Architect
Carrying out a study on how the data warehouse Maps can be modernized. Old software in use, such as Oracle Warehouse Builder and Talend Open Studio, was no longer supported. In addition, the performance of getting data from SAP and processing it was really slow.
The first option was to rebuild all logic and use cloud technology. Due to the kind of data in combination with security concerns, the cloud had to be avoided, so the solution will be based on reinventing on premise. One part of the solution is to let the SAP system send data using asynchronous calls. The second part is to re-engineer as much of the currently used software as possible.
In the process we have 3 competitors, of which Trivadis has the best option, as they can convert all the Oracle Warehouse Builder definitions. The extraction and sending of data is being researched by the SAP teams.

Snowflake, Talend Pipeline Designer, Talend Open Studio, AWS, SAP BAPI, Oracle, Oracle Warehouse Builder, biGenius, Theobald
Merck
3 months
2019-10 - 2019-12

Cloud Data Engineering

Cloud Data Engineer
Building a job in Matillion in combination with Python to send data out to the BowTie server plugins using REST calls, job variables, scalar variables and grid variables. The BowTie application is a visual application that shows the statuses of components in the physical energy network. Helping to set up the test use cases and guiding the user tests.
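A hedged sketch of the Python side of such a REST push; the endpoint URL, authentication and payload shape are assumptions, not the actual BowTie plugin interface.

    # Sketch of pushing component statuses to a REST endpoint from a Matillion Python task.
    # URL, token and payload structure are assumptions, not the real BowTie plugin API.
    import requests

    BOWTIE_URL = "https://bowtie.example.internal/api/components/status"  # assumed
    API_TOKEN = "***"                                                     # assumed

    payload = [
        {"component_id": "TRAFO-0815", "status": "OK"},
        {"component_id": "CABLE-4711", "status": "DEGRADED"},
    ]

    response = requests.post(
        BOWTIE_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()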
Investigating a proof of concept for the Asset Event Register to determine several KPIs from data transferred from the old data warehouse to the new data warehouse in Snowflake, and the way forward. Defining business rules which can be adapted by the business for developing other KPIs.
Migrating the Carmen data warehouse and the Matillion jobs from Redshift to Snowflake using the export and import tools of Matillion. After the import, corrections were made to get all components working again (Redshift components have different properties than Snowflake components).
All Matillion jobs were split into shared jobs and normal jobs that use the shared jobs.

Amazon Web Services, S3, DBeaver, Snowflake, Matillion, Python, Postman, Git
Enexis
2 years 10 months
2016-09 - 2019-06

Data Information Architecture & Cloud Consulting

Data Information Architect & Cloud Consultant
Setting up BI and data warehousing in the cloud using Amazon Web Services and Amazon Redshift, and researching and implementing BI tooling (in the end Tableau was chosen) for a proof of concept, project ENYA.
The first dashboard was presented successfully to the CIO and the Head of Controlling on the first of March, showing revenue per customer using geographical and tree maps with interaction between the views, based on existing data from the data warehouse combined with some metadata from the business.

The second dashboard will be focused on customer and national data to point out opportunities for Deutsche Bahn Energie. The first version was presented to the business successfully in July. Due to the more than 100 parameters that drive this dashboard and the need for additional functionality, a second version has to be delivered in October.


Tasks:
  • Setting up a framework for using Talend Open Studio on the Data Warehouse
  • Setting up the architecture for the Data Warehouse with Redshift
  • Setting up a way to cut the costs of Redshift by creating an automated start and stop procedure using Lambda and Boto3 (Python), as sketched after this list
  • Guarding the rules for Data Vault modelling of the Data Warehouse
  • Setting up a BI portal to use for the proof of concept
  • Building a dashboard to monitor and adjust their position on the energy market
  • Setting up roles and security for use within Tableau Server
  • Guiding and teaching the internal personnel on AWS cloud computing, Data Warehouse modelling, Talend Open Studio, Git, Redshift and Tableau
  • Quality control for Business Analysts and Developers
  • Direct and indirect involvement with the business by supporting the Business Analysts and the business
  • Setting up the structure for using Git
  • Setting up documentation for installations
  • Setting up release management
  • Advising on the use of AWS cloud computing
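
A minimal sketch of such a Lambda/Boto3 start-stop mechanism; the cluster identifier is an assumption, and the original setup may have used snapshot/restore rather than the newer pause/resume calls shown here.

    # Sketch of a Lambda handler that stops/starts a Redshift cluster on a schedule to save costs.
    # Cluster name is an assumption; the actual project may have used snapshot/restore instead.
    import boto3

    CLUSTER_ID = "enya-dwh"  # assumed cluster identifier

    redshift = boto3.client("redshift")


    def lambda_handler(event, context):
        # Expect the triggering EventBridge/CloudWatch rule to pass {"action": "pause"|"resume"}.
        action = event.get("action", "pause")
        if action == "pause":
            redshift.pause_cluster(ClusterIdentifier=CLUSTER_ID)
        else:
            redshift.resume_cluster(ClusterIdentifier=CLUSTER_ID)
        return {"cluster": CLUSTER_ID, "action": action}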

Amazon Web Services, Redshift, EC2, S3, SQL Workbench/J, Talend Open Studio, Tableau, Tibco Spotfire, SpagoBI, Data Vault, Jira, Git, Tortoise, AWS CLI, JSON, SharePoint, Python
Deutsche Bahn Energie

Competencies

Products / Standards / Experience / Methods

AWS
Expert
Tableau
Advanced
Jaspersoft
Advanced
Matillion
Advanced
Oracle
Advanced
Snowflake
Expert
MicroStrategy
Advanced
ElasticSearch
Advanced
Apache
Advanced
MapInfo
Advanced
Looker
Advanced
QlikView
Advanced
Talend
Advanced
Informatica PowerCenter
Advanced
SAP
Advanced

Operating systems

Mac
Advanced
Linux
Advanced
Unix
Advanced
Windows
Advanced
Windows Server
Advanced
VMS
Advanced

Databases

Snowflake
Expert
Oracle 7.x-11.x
Advanced
Netezza
Advanced
Redshift
Advanced
MySQL
Expert
MS Access
Advanced
PostgreSQL
Advanced
SQL Server
Advanced
