
By registering, you accept all the Terms and Conditions of Skillenza.

Registrations will close in 9 days.

Societe Generale is currently looking for a Big Data Engineer; the full requirements are listed in the job description below.

Why Work Here:

  • Clear and transparent communication
  • Brilliant team
  • Supportive management
  • Good work/life balance

To know more about the position, please check the Job Description below.



How To Land a Job Offer:

You begin by registering on the Skillenza platform and solving the online challenge. Following that, based on your score, you will be invited to attend a technical interview for the position. So come prepared!

Societe Generale Global Solution Center (SG GSC)

Societe Generale is one of the leading European financial services groups. Based on a diversified and integrated banking model, the Group combines financial strength and proven expertise in innovation with a strategy of sustainable growth, aiming to be the trusted partner for its clients.

Set up in 2000, Societe Generale Global Solution Centre, a 100% owned subsidiary of Societe Generale, operates across its Bangalore and Chennai facilities in India. It provides application development and maintenance, infrastructure management, business and knowledge process management, corporate and investment banking operations, security services operations, compliance and reporting, smart automation, and KYC and HR operations across Asia, the Americas, EMEA and France.



If you have the relevant skills and experience, you can apply and solve the challenge.


Position: Big Data Engineer

Work Location: Bangalore

Experience: 3 - 10 years

Requirements:

  • A good understanding of data structures (List, Queue, Map, Tree, Heap, Graph) and algorithms.
  • Strong analytical and problem-solving skills.
  • Ability to analyse the code you write in terms of time complexity and space complexity.
  • A fair understanding of distributed systems.
  • Familiarity with the concepts of performance measurement and improvement.
  • Writing maintainable code using good coding and design practices.
  • 4-8 years of diversified experience developing applications in Java, with 2-4 years of experience in Big Data technologies such as Spark, Kafka/Spark Streaming, HBase, Hive, Oozie, Knox, Hadoop (Hortonworks) and the related ecosystem.
  • Experience working on at least one Big Data project.
  • Experience with Hadoop and Spark is highly desired.
  • Knowledge of building stream-processing systems using solutions such as Spark Streaming and Kafka is a plus.
  • Knowledge of the principles and components of Big Data processing and analytics is highly desired.
  • Hadoop and Spark experience is mandatory.
  • Azure (or any other cloud provider) experience or knowledge is good to have.
  • Familiarity with messaging formats including XML, JSON, Avro and Google Protocol Buffers.
  • Experience in Core Java is a must; good knowledge of SQL is also required.
  • Experience working with the Spring Boot framework for API development.
  • Experience creating and consuming RESTful web services.
  • Experience with Agile and test-driven development methodologies is required.
  • Good knowledge of UNIX commands and shell scripting.
  • Comfortable working with the Eclipse IDE or IntelliJ IDEA.
  • Experience with Git, Maven and Jira.
  • Solid application design, coding, testing, maintenance and debugging skills.
  • Experience with object-oriented software design and design patterns.


Mandatory Skills: Core Java, Spring, Scala, Big Data, SQL, Spark

Nice to Have:

  • Experience within the Financial Services industry is a plus.
  • Knowledge of functional programming features in Java.
  • Experience in working in global teams.
  • AngularJS


