Hadoop Administrator – Reston, VA

Job Title: Hadoop Administrator
Location: Reston, VA

Duration: 6 Months +

Skills: Hadoop, Map Reduce, Spark, Storm, MongoDB, Cloudera, Splunk, SQL, BI

Tasks:

The client is looking for a Big Data Administrator with proven experience setting up Big Data clusters in a highly available, load-balanced configuration across multiple environments (Production, User Acceptance Testing, and Development). The resource will work with Solution Architects, Infrastructure Architects, the Lead Big Data Architect/Engineer, the Big Data supplier, and Developers to set up the environments and support the development teams.

Requirements:
• Responsible for successful installation and configuration of the Big Data platform in cluster mode across Development, Testing, and Production environments.
• Work with the Big Data team, infrastructure architects, and the change management team to support configuration and code migrations of Big Data deliverables.
• Build robust Big Data platform systems with an eye on the long-term maintenance and support of the application.
• Leverage reusable code modules to solve problems across the team, including data preparation and transformation, and data export and synchronization.
• Act as the Big Data admin liaison with Infrastructure, Security, and Application Development.
• Keep current on the latest Big Data technologies and products, including hands-on evaluations and in-depth research.
• Work with the Big Data lead/architect to perform detailed planning and risk/issue escalation.

Qualifications:

• 5+ years of administrator experience with batch-processing tools in the Hadoop tech stack (e.g., MapReduce, YARN, Pig, Hive, HDFS, Oozie)
• 5+ years of administrator experience with tools in the stream-processing tech stack (e.g., Spark, Storm, Samza, Kafka, Avro)
• Administrator experience with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)
• Expert knowledge of AD/LDAP security integration with Big Data
• Hands-on experience with at least one major Hadoop distribution, such as Cloudera, Hortonworks, MapR, or IBM BigInsights
• Experience with system usage and optimization tools such as Splunk is a plus
• 4+ years of experience with SQL and at least two major RDBMSs
• 6+ years as a systems integrator with Linux systems and shell scripting
• 6+ years of data-related benchmarking, performance analysis, and tuning
• Bachelor’s degree in Computer Science, Information Systems, Information Technology, or a related field, and 8+ years of software development/DW & BI experience
• Health care experience is a plus
• Excellent verbal and written communication skills

Love to Have:

• Hands-on experience with Cloudera 4.5 or higher, Hortonworks 2.1 or higher, or MapR 4.01 or higher
• ETL solution experience, preferably on Hadoop
• Experience with industry-leading Business Intelligence tools

Hope to hear from you soon.
Best Regards,

Krish
krish@katrinasoft.com