DevOps Analytics Engineer | Contract: CDI (permanent)

Posted 11 months ago | IT | Rabat | 81 views

Company

Join Cnexia and choose to be part of a project that values innovation, promotes continuous skills development, and gives creative, ambitious people in the high-tech field the opportunity to achieve their professional goals.

Proud of our status as the exclusive Moroccan subsidiary of Canada's incumbent telecom operator, we have continuously grown our teams since 2021 to more than 1,100 employees, mainly based in Fez. We have just extended our activities to the northern part of the kingdom by opening a new site in Rabat, where we aim to more than double our headcount by 2024.

At Cnexia, we do more than support the customers of our world-class network and the users of our various services. We develop innovative solutions and continuously create original multiplatform media content. In doing so, we are transforming every day how Canadians communicate on the web and interact through mobile apps, giving them an enhanced experience.

If you are ready for this challenge, we invite you to join a community that values bold ideas and offers multiple career-development opportunities in a world-class, multicultural environment!

Address

Unité B1, Parc Fès Shore, Route de Sidi Harazem

Position

Joining Cnexia is choosing to be part of an ambitious project that values Innovation, promotes Continuous Learning and enables all tech champions to fulfill their creative dreams.

At Cnexia, we do more than support the clients of our world-class network and services. We develop innovative solutions and create original multiplatform media content. In fact, we’re revolutionizing how Canadians communicate on the web, interact with Mobile Apps or benefit from an AI-enhanced experience.

Proud of our status as a fully owned Moroccan subsidiary of the largest Canadian telecom company, we have been ceaselessly growing our team since 2021. With over 1,100 employees, mainly based in Fez, we have expanded into the northern region of the kingdom with our brand-new, state-of-the-art site in Technopolis Rabat.

If you are ready for this challenge, we invite you to join a community that values bold ideas and professional growth, all in an engaging, multicultural, world-class environment.

We are looking for a DevOps Analytics Engineer who will work in the area of managed services.

We are looking for a highly motivated individual who can thrive in a challenging and fast-paced environment to develop, through creativity and innovation, solutions critical to the future success of the Large Enterprise Market segment.

Job Description:

As a member of the Enterprise Data Platform team, reporting to the EDP AI/ML Senior Manager, the Cloud Analytics and DevOps Engineer will play a leading role in the development of new products, capabilities, and standardized practices using cloud and data technologies. Working closely with our business partners, this person will be part of a team that advocates the use of advanced data technologies to solve business problems, and will act as a thought partner in the data space.

Key Responsibilities:

1. Data Analysis and visualization:

  • Ability to own and lead your projects through all phases including identifying business requirements, technical design & implementation, and final delivery and refinement with business teams
  • Elicit, analyze and interpret business and data requirements to develop complete analytic solutions, including business process diagrams, data mapping, data models (entity-relationship diagrams, dimensional data models), ETL and business rules, data life-cycle management, governance, lineage, reporting, and dashboarding
  • Facilitate data discovery workshops, downstream impact analyses and proactively manage stakeholder expectations
  • Demonstrated analytical thought leadership
  • Advanced expertise in SQL (BigQuery, Trino, Impala or similar)
  • Comfortable with data visualization and data strategies. Experience with BI analytics tools such as Looker, MicroStrategy, Tableau, Kibana, or similar.
  • Communicate analysis results with effective storytelling.
  • Comfortable working in complex and constantly changing environments, with multidisciplinary teams.
  • Able to effectively challenge the status quo and set standards and direction for solutions
  • Well organized, able to multi-task and manage priorities.
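The SQL expertise described above typically centers on analytic (window) functions for reporting and dashboarding. As a minimal sketch, the query below ranks products by revenue within each region; Python's built-in sqlite3 stands in for a warehouse engine such as BigQuery or Trino, and the table and data are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse engine (BigQuery, Trino, ...).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "A", 120.0), ("North", "B", 80.0),
     ("South", "A", 200.0), ("South", "B", 50.0)],
)

# Rank products by revenue within each region -- a typical
# dashboard-style analytic query using a window function.
query = """
SELECT region, product, revenue,
       RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
FROM sales
ORDER BY region, rnk
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same `RANK() OVER (PARTITION BY ...)` pattern carries over to the warehouse engines named in the posting, since all of them support standard SQL window functions.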

2. Data Engineering:

  • Proven experience building and deploying data analytics workflows in a big data environment (Hadoop, Google Cloud Platform, or similar)
  • Design and develop ETL workflows based on business requirements, using multiple sources of data in various formats within the Hadoop platform or Google Cloud Platform
  • Clean, manipulate and analyze large, complex data sets spanning a wide variety of sources.
  • Develop scalable and robust analytic solutions that can support the growing volumes of our data environments.
  • Build data models, implement business rules, and engineer responsive and scalable data analytics pipelines.
  • Design and implement component execution orchestration in Cloud Composer, Cloud Data Fusion, Oozie, or Airflow
  • Promote code to different environments using GitLab CI/CD
  • Produce well-documented, quality code
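The orchestration tools listed above (Cloud Composer, Airflow, Oozie) all share one core idea: tasks form a dependency graph and run in topological order. A minimal, library-free sketch of that idea, with hypothetical extract/transform/load steps standing in for real operators:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL steps; in Airflow/Composer these would be task operators.
results = {}

def extract():
    results["raw"] = [3, 1, 2]

def transform():
    results["clean"] = sorted(results["raw"])

def load():
    results["loaded"] = list(results["clean"])

# Task dependency graph: load depends on transform, which depends on extract.
dag = {"load": {"transform"}, "transform": {"extract"}}
tasks = {"extract": extract, "transform": transform, "load": load}

# Resolve a valid execution order, then run each task in turn.
order = list(TopologicalSorter(dag).static_order())
for name in order:
    tasks[name]()

print(order, results["loaded"])
```

In a real scheduler the same graph would additionally get retries, scheduling intervals, and parallel execution of independent tasks, but the dependency-ordered execution shown here is the part the bullet points describe.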


Qualifications and Skills:

  • Master's degree in Data Science and Analytics, Mathematics, Statistics, Computer Science, or a related field
  • 5+ years of experience in an analytics engineering role, working in different data management disciplines including data integration, modelling, optimization, and quality
  • 5+ years of experience coding with Python, SQL, or Scala
  • 2+ years of experience with CI/CD deployment and code management platforms such as GitHub, and with building applications using Hadoop components: HDFS, Hive, Impala, Sqoop, Kafka, HBase, etc.
  • 2+ years of experience working in big data analytics environments/technologies (Hadoop, Hive, Spark), with a deep understanding of data mining and analytical techniques.
  • 1+ years of experience with cloud platforms such as Google Cloud Platform, AWS, Azure, or Databricks
  • 1+ years of experience working with Google Cloud services, including Dataflow, Cloud Composer, Airflow, Cloud Run, and Pub/Sub
  • 5+ years of experience building reports and visualizations in Tableau, MicroStrategy, Looker, Kibana, or equivalent.
  • 5+ years of experience with traditional data warehousing and ETL tools
  • Comfortable with version control tools such as Git.


Preferred Qualifications:

  • Deep understanding of techniques used in creating and serving schemas at the time of data consumption
  • Ability to apply AI, ML, and other big-data techniques to provide a competitive edge to the business.
  • Advanced cloud data technologies in the ML space (Vertex AI, BQML, etc.)
  • Experience in AWS/Azure data platforms
  • Past experience using Maven, Git, Jenkins, Se, Ansible or other CI tools is a plus
  • Experience in Exadata and other RDBMS is a plus.
  • Knowledge of predictive analytics techniques (e.g. predictive modeling, statistical programming, machine learning, data mining, data visualization).
  • Familiarity with different development methodologies (e.g. waterfall, agile, XP, scrum)
  • Strong interpersonal and communication skills, including written, verbal, and technical illustration.
  • Experience working with multiple clients and projects at a time.
