Tag: Hadoop Developer jobs

TCS is Hiring Hadoop Developers – Apply Now!!

Experience: 3 – 8 years

Skills: Hadoop.

Description:

  • Candidate should have at least 2 years of experience in Hadoop or Teradata.
  • Should be a graduate and willing to work in any shift.

APPLY NOW

Check Here: Frequently Asked Hadoop Interview Questions

Bosch Group Hiring Hadoop Developer

Location – Bengaluru

Experience – 3 to 5 years

Job Description

  • At least 2 years of hands-on experience building Big Data applications.
  • Mandatory programming knowledge of Java.
  • Working knowledge of Cloudera/Hortonworks.
  • Develop batch data pipelines using Hive, Pig, Flume, and Sqoop (a minimal batch-pipeline sketch follows this list).
  • Develop real-time data pipelines using Spark, with working knowledge of the Kafka message queue.
  • Working knowledge of NoSQL databases such as HBase.
  • At least one end-to-end project implementation.
  • Good knowledge of Unix shell scripting.
  • Working knowledge of Bitbucket/SVN.

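As context for the pipeline bullets above, here is a minimal sketch of the kind of batch pipeline the posting describes, written in PySpark with Hive support. The database, table, and column names (raw.orders, curated.daily_customer_totals, ds, customer_id, amount) are hypothetical placeholders, not part of the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read and write Hive tables directly.
spark = (SparkSession.builder
         .appName("daily-orders-batch")
         .enableHiveSupport()
         .getOrCreate())

# Read one day's partition from a (hypothetical) raw Hive table.
orders = spark.table("raw.orders").where(F.col("ds") == "2019-01-01")

# The transformation step of the batch pipeline: aggregate per customer.
daily = (orders.groupBy("customer_id")
               .agg(F.sum("amount").alias("total_amount"),
                    F.count("*").alias("order_count")))

# Write the result back to a curated Hive table for downstream consumers.
daily.write.mode("overwrite").saveAsTable("curated.daily_customer_totals")
```

In practice, ingest into a raw table like this would be handled by a tool such as Sqoop or Flume, with the job scheduled by a workflow scheduler.
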
Apply Now

Check Here – Advanced Hadoop Topics for Interview Preparation

Infosys Hiring Hadoop Developer

Job Description:

  • Understand architecture requirements and ensure effective design, development, validation, and support activities.
  • Good understanding of the technology and domain.
  • Ability to work in a team towards a desired goal.
  • Ensure continual knowledge management.
  • Adherence to organizational guidelines and processes.

Technical and Professional Requirements:

  • Minimum 2 years of experience in developing software applications, including analysis, design, coding, testing, deployment, and support.
  • Experience building Big Data solutions using Hadoop and/or NoSQL technology.

APPLY NOW

Take a look: Top 100 Hadoop Interview Questions


Mphasis Hiring Big Data Developer

Job description

  • Lead the dialogue with the client and lead the development team.
  • Architect and design Hadoop solutions.
  • Actively participate in Scrum calls; work closely with the product owner and scrum master on sprint planning, estimates, and story points.
  • Break user stories into actionable technical stories and dependencies, and plan their execution across sprints.
  • Design batch and real-time load jobs from a broad variety of data sources into Hadoop, and design ETL jobs that read data from Hadoop and pass it to a variety of consumers/destinations (a minimal sketch follows this list).
  • Perform analysis of vast data stores and uncover insights.
  • Responsible for maintaining security and data privacy, and for creating scalable, high-performance web services for data tracking.
  • Propose best practices and standards.

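As a rough illustration of the load jobs described above, here is a minimal PySpark sketch that lands raw JSON files into a date-partitioned Parquet area of a data lake. The paths and column names (event_time, event_id, event_date) are made up for the example; a real job would also handle schema enforcement and bad records.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

# Read raw JSON from a (hypothetical) landing zone on HDFS.
raw = spark.read.json("hdfs:///landing/clickstream/2019-01-01/")

# Light cleanup: derive a partition column and drop duplicate events.
cleaned = (raw
           .withColumn("event_date", F.to_date("event_time"))
           .dropDuplicates(["event_id"]))

# Append into the lake as date-partitioned Parquet, a common lake layout.
(cleaned.write
        .mode("append")
        .partitionBy("event_date")
        .parquet("hdfs:///lake/clickstream/"))
```
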
Skills

  • 10+ years of total IT experience, including 3+ years of Big Data experience (Hadoop, Spark SQL (PySpark), HBase, Hive, Sqoop, and Python).
  • Must hold a valid B1 visa to travel to the USA for 4 weeks, then continue executing the project from India (Chennai).
  • Hands-on experience with Big Data tools and technologies is mandatory.
  • Proficient in Linux/Unix scripting.
  • Designed at least one Hadoop data lake end to end using the above Big Data technologies.
  • Experience in Kafka, Storm, and NoSQL databases (e.g. Cassandra) is desirable.
  • Exposure to cloud technology is desirable.
  • Oracle or other RDBMS experience is desirable.
  • Bachelor's degree in Engineering (Computer Science or Information Technology); a Master's degree in Finance, Computer Science, or Information Technology is a plus.
  • Experience in Agile methodology is a must.

Apply Now

Crack the Hadoop Developer Interview Easily:

Top 100 Hadoop Interview Questions & Answers

Wipro Hiring Big Data Developers

Organization – Wipro

Job Code: 549412

Experience: 3-5 Years

Job Location – Bangalore

Primary Skills – Big Data

Job Description:

  • Sound knowledge of Hadoop technology: Hive, HBase, and Spark using Scala.
  • Experience with ETL tools, Spark, Hadoop, Hive, and HBase on a Hadoop distribution.
  • Develop streaming/real-time complex event processing on the Hadoop framework (see the sketch after this list).
  • Interface with different databases (SQL, NoSQL). Manage data quality by reviewing data for errors introduced during data input, data transfer, or storage.
  • Proficient in Java programming and the Eclipse development IDE.
  • Sound knowledge of Unix scripting and the Tidal Enterprise Scheduler.
  • Knowledge of Cisco processes and methodologies is good to have.

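As a sketch of the streaming bullet above, here is a minimal complex-event-processing job using Spark Structured Streaming over Kafka. It is shown in PySpark for consistency with the other sketches on this page (the posting asks for Scala), and it assumes the spark-sql-kafka package is on the classpath; the broker address, topic name, and the "more than 100 events per minute" rule are all invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-cep").getOrCreate()

# Subscribe to a (hypothetical) Kafka topic of raw events.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load())

# Kafka delivers key/value as binary; keep the payload and event timestamp.
parsed = events.select(F.col("value").cast("string").alias("payload"),
                       F.col("timestamp"))

# A toy "complex event": more than 100 events in any 1-minute window.
alerts = (parsed
          .withWatermark("timestamp", "5 minutes")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
          .where(F.col("count") > 100))

# Print alerts to the console; a real job would sink to HBase or Kafka.
query = (alerts.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination()
```
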
Apply Now

Best Hadoop Certification Training in Chennai

Capgemini Hiring Big Data Developers

Company – Capgemini

Job Id – 111147

Experience – 4 to 6 years

Educational Qualification – Bachelor's Degree

Primary Skills – Big Data

Apply Mode – Online

Job Description:

  • Hands-on experience with Spark, Spark SQL, Datasets, and DataFrames.
  • Knowledge of various tuning and optimization techniques in Spark.
  • Hands-on experience with Hadoop ecosystem tools such as Hive and Apache Drill, handling different file formats such as Parquet, Avro, and ORC (see the sketch after this list).
  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
  • Development and maintenance of data analytics solutions using the CIB Data Hub platform.
  • Integrate any Hadoop/Big Data tools and frameworks required to provide the requested capabilities and data pipelines.
  • Build frameworks, assets, and reusable components that can be used across projects.
  • Implement data transformation processes.
  • Implement IT data governance (update the data dictionary and meta-model, enrich data lineage, and develop data quality rules provided by data management officers).

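To make the file-format and tuning bullets concrete, here is a minimal PySpark sketch that reads ORC and Avro inputs, applies a broadcast join (one common Spark optimization), and writes Parquet. The paths and the dim_id join key are illustrative, and the Avro reader assumes the external spark-avro module is available (format name "avro" from Spark 2.4 onward).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("format-demo").getOrCreate()

# Read a large fact table stored as ORC and a small dimension table as Avro.
facts = spark.read.orc("hdfs:///data/facts_orc/")
dims = spark.read.format("avro").load("hdfs:///data/dims_avro/")

# Broadcasting the small table avoids shuffling the large one.
joined = facts.join(F.broadcast(dims), on="dim_id", how="left")

# Coalesce to fewer output files before persisting as Parquet.
joined.coalesce(8).write.mode("overwrite").parquet("hdfs:///data/joined_parquet/")
```
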
Apply Now

Big Data Training with Placement Assistance in Chennai

Infosys Hiring Hadoop Development Engineers

Company – Infosys

Job ID – Inf_EXTERNAL_10028456_4

Job Location – Across India

Experience – 3 to 5 years

Skills – Hadoop

Job Description:

  • Minimum 2 years of experience in developing software applications, including analysis, design, coding, testing, deployment, and support.
  • Experience building Big Data solutions using Hadoop and/or NoSQL technology.
  • Extensive experience in developing complex MapReduce programs against structured and unstructured data (a minimal MapReduce sketch follows this list).
  • Experience in loading external data into Hadoop environments using tools such as MapReduce, Sqoop, and Flume.
  • Experience with at least one of the following technologies: HDFS, Oozie, MapReduce, Hive, Hadoop, Cassandra, MongoDB.
  • Strong skills in designing solutions using Flume, Avro, and Thrift.

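For the MapReduce bullet above, here is the classic word-count example written as Hadoop Streaming scripts in Python; Hadoop Streaming is one common way to run MapReduce jobs without writing Java. The file names mapper.py and reducer.py are arbitrary.

```python
#!/usr/bin/env python
# mapper.py: emit a (word, 1) pair for every word on stdin.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print("%s\t1" % word)
```

```python
#!/usr/bin/env python
# reducer.py: sum the counts per word; Hadoop sorts input by key first.
import sys

current, total = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").split("\t", 1)
    if word != current:
        if current is not None:
            print("%s\t%d" % (current, total))
        current, total = word, 0
    total += int(count)
if current is not None:
    print("%s\t%d" % (current, total))
```

A job like this is typically submitted with the hadoop-streaming JAR, along the lines of: hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input /in -output /out (paths illustrative).
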
Apply Link

Free: Top 20 Hadoop Interview Questions