Data Warehouse Specialist
Job Description
Generali Employee Benefits
Established in Trieste (Italy) in 1831, Assicurazioni Generali SpA is a business with a history. The Generali Group is one of Europe’s biggest multiline insurers by market capitalization and ranks in the top five insurers in the world by global premium income.
Generali Employee Benefits (GEB) is the employee benefits activity of Generali Group. The GEB Network, composed of more than 100 local insurance companies, is one of the leading partners in the international employee benefits management field, servicing more than 1,400 international corporate customers.
For more information, please visit our website.
Role Overview
The Data Warehouse Specialist plays a key role in the Data Governance, Automation and Process Optimization team, supporting the design, development, and maintenance of scalable data infrastructure. This role focuses on building robust data pipelines, optimizing data flows, and enabling secure, efficient access to enterprise data for analytics and reporting. The Specialist will collaborate with cross-functional teams to ensure data integrity, automation, and compliance across GEB’s cloud-based data ecosystem.
Key Responsibilities
- Design and implement data extraction pipelines using Google Cloud Platform components such as Dataproc, Data Fusion, and Composer/Airflow.
- Develop and maintain DAGs for ETL processes and automate environment recreation using Google Cloud SDK and API scripting.
- Integrate Cloud Functions (1st and 2nd gen) and Secret Manager into data workflows for secure and scalable operations.
- Build and optimize BigQuery datasets for enterprise reporting and analytics.
- Apply AEAD encryption and manage Dataplex environments where applicable.
- Write and maintain advanced SQL queries and Shell scripts for data manipulation and automation.
- Support release management activities, including Cloud Build deployment processes.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Ensure data quality, consistency, and security across all data warehouse environments.
- Document data models, processes, and architecture for internal use and compliance.
- Coordinate with testing teams to validate data components and ensure alignment with business expectations.
- Maintain up-to-date knowledge of emerging data technologies, warehousing standards, and best practices.
- Undertake additional responsibilities and tasks reasonably aligned with the scope of this role, as required by evolving business needs.
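The AEAD encryption point in the list above refers to BigQuery's column-level AEAD functions, which decrypt ciphertext against a keyset. As a minimal sketch of how such a query might be rendered from Python, assuming a key table joined on row id (every project, dataset, table, and column name here is a hypothetical placeholder, not part of GEB's actual schema):

```python
# Minimal sketch: renders a BigQuery AEAD decryption query.
# AEAD.DECRYPT_STRING takes (keyset, ciphertext, additional_data); here the
# row id doubles as the additional authenticated data.
# All identifiers below are hypothetical placeholders.

def render_aead_decrypt_query(project: str, dataset: str,
                              table: str, key_table: str) -> str:
    """Build a query that decrypts an AEAD-encrypted column."""
    return (
        "SELECT t.id, "
        "AEAD.DECRYPT_STRING(k.keyset, t.encrypted_payload, "
        "CAST(t.id AS STRING)) AS payload "
        f"FROM `{project}.{dataset}.{table}` AS t "
        f"JOIN `{project}.{dataset}.{key_table}` AS k USING (id)"
    )

query = render_aead_decrypt_query("my-project", "geb_dwh",
                                  "claims", "claims_keys")
print(query)
```

Keeping the keyset in a separate table (rather than inline in SQL) is one common pattern for per-row key management; the actual scheme would depend on the team's key-governance setup.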
Skills & Competences
- Solid hands-on experience with Google Cloud Platform (GCP) components, including Dataproc, Data Fusion, Composer, Cloud Functions, BigQuery, and Secret Manager.
- Strong proficiency in Python scripting and SQL is a must.
- Solid understanding of data modelling, ETL design, and data warehouse architecture.
- Experience with Shell scripting and automation of cloud environments.
- Familiarity with release management and CI/CD pipelines (Cloud Build experience is a plus).
- Experience with AEAD encryption and Dataplex is considered a strong asset.
- Strong communication skills with the ability to explain technical concepts to non-technical stakeholders.
- High level of autonomy and accountability.
- Self-initiative, proactive mindset, and team-oriented approach.
- Flexibility in working hours to accommodate global collaboration.
- Comfortable working in a fast-paced, international environment with evolving priorities.
Qualifications & Experience
- Minimum of 6 years of experience in data warehousing or business intelligence roles.
- Google Professional Cloud Data Engineer certification is highly desirable.
- Familiarity with data governance, compliance, and security best practices.
- Fluency in English is a must; French, Italian or other languages are considered an asset.
- Bachelor's or Master's degree in computer science, data engineering, information systems, or a related field.