
Societe Generale is currently looking for a Big Data Engineer; the requirements are as follows:


Profile Required

  • 3+ years of relevant experience with Spark
  • Programming experience using Spark with Java/Scala (see the sketch after this list)
  • Working knowledge of CI/CD pipelines
  • Experience with Hadoop and Hive on Azure is mandatory
  • Participation in API development
  • Experience with Unix and SQL is a must
  • Experience with support activities
  • Ability to work closely in a team environment
  • Has worked on at least one project lasting at least 12 months
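
For context, the following is a minimal, illustrative sketch of the kind of Spark (Scala) work the role describes: reading a Hive table (as one might on an Azure-hosted cluster), aggregating it, and writing the result back. The table and column names are assumptions for illustration only, not taken from the posting.

```scala
// Minimal illustrative Spark job in Scala; table/column names are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, lit, sum}

object TradeSummaryJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TradeSummaryJob")
      .enableHiveSupport() // read/write Hive tables, e.g. on an Azure-hosted cluster
      .getOrCreate()

    // Read a (hypothetical) Hive staging table and aggregate it with Spark SQL functions.
    val daily = spark.table("staging.trades")
      .groupBy(col("trade_date"), col("desk"))
      .agg(
        sum("notional").as("total_notional"),
        count(lit(1)).as("trade_count")
      )

    // Persist the result as a Hive table for downstream consumers (e.g. a datamart).
    daily.write.mode("overwrite").saveAsTable("datamart.daily_trade_summary")
    spark.stop()
  }
}
```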


Mandatory Skills: Core Java, Spring, Scala, Big Data, SQL, Spark

Responsibilities

  • Set up the data lake infrastructure on SG Cloud
  • Set up the Azure environment for use cases/feeders hosted in Azure
  • Set up and maintain the CI/CD pipeline for BAN topics (Jenkins, Maven, Sonar, Nexus)
  • Containers/orchestration: as part of CI/CD, build Docker images and deploy them within Kubernetes
  • Implement and maintain the scheduler/orchestrator, Airflow (a sketch of a scheduler-driven job follows this list)
  • Set up the workspace in Datalake Lucid
  • Set up and maintain the datamart
  • Assist the team in deploying dashboards to SG Dashboard
  • Participate in API development
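
As a rough illustration of the Airflow responsibility above: batch jobs are commonly parameterized by a logical run date so an orchestrator can trigger them (for example via spark-submit) and reruns stay idempotent. A hypothetical sketch, with all table names and paths assumed rather than taken from the posting:

```scala
// Hypothetical scheduler-driven batch job: Airflow (or any orchestrator) would
// invoke it via spark-submit, passing the logical run date as an argument.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object DailyFeederJob {
  def main(args: Array[String]): Unit = {
    // The orchestrator passes the logical run date, e.g. "2024-01-31";
    // defaulting to today is only for ad-hoc manual runs.
    val runDate = args.headOption.getOrElse(java.time.LocalDate.now().toString)

    val spark = SparkSession.builder()
      .appName(s"DailyFeederJob-$runDate")
      .enableHiveSupport()
      .getOrCreate()

    // Process only this run date's slice and overwrite its target path,
    // so re-running the same date produces the same result.
    spark.table("staging.feed_events")
      .where(col("event_date") === runDate)
      .write
      .mode("overwrite")
      .parquet(s"/datalake/clean/feed_events/event_date=$runDate")

    spark.stop()
  }
}
```

An Airflow task would then typically run something like `spark-submit --class DailyFeederJob app.jar {{ ds }}` on a daily schedule, where `{{ ds }}` is Airflow's templated execution date.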

Why Work Here:

  • Opportunity for career acceleration at a reputable multinational financial institution with excellent prospects and benefits.
  • Grow in a workplace that fosters cooperation, respect, and a positive mindset, ensuring employees’ well-being.
  • Experience a wide range of learning opportunities, including various training modules such as virtual classes and social learning.


To know more about the position, please check the Job Description.



How to Land the Job Offer:

You begin by registering on the Skillenza platform and then taking the online challenge. Based on your score, you will be invited to attend a technical interview for the position. So come prepared!

Societe Generale Global Solution Center (SG GSC)

Its parent, Societe Generale, is one of the leading European financial services groups. Based on a diversified and integrated banking model, the Group combines financial strength and proven expertise in innovation with a strategy of sustainable growth, aiming to be the trusted partner for its clients.

Set up in 2000, Societe Generale Global Solution Centre, a 100%-owned subsidiary of Societe Generale, operates across its Bangalore and Chennai facilities in India. It provides application development and maintenance, infrastructure management, business process and knowledge process management, corporate and investment banking operations, security services operations, compliance and reporting, smart automation, KYC, and HR operations across Asia, the Americas, and EMEA, including France.




The hiring challenge is open to everyone who has the necessary skills and experience and can prove it by taking the online challenge.

Participation is free of cost; there are no hidden charges.

Position: Big Data Engineer

Work Location: Bangalore

Experience: 3 - 10 years



By registering, you agree to the Terms and Conditions & Privacy Policy of Skillenza.

Registrations will close in 15 days.
