Skill: cloudera, linux system administration, hive, control-m, security, git, testing tools, mysql, hadoop, big data, hbase, github, rdbms, infrastructure, hdfs, sqoop

Exp: 3-6 years

Your Profile
- Identify project issues, communicate them, and assist in their resolution.
- Deploy new Hadoop infrastructure; perform Hadoop cluster upgrades, cluster maintenance, troubleshooting, capacity planning, and resource optimization.
- Review, develop, and implement strategies that preserve the availability, stability, security, and scalability of large Hadoop clusters.
- Interact with developers, architects, and other operations teams to resolve job performance issues.
- Prepare architecture, design, and operational documentation.

Your Checklist
- 3-6 years of experience administering Big Data platforms and ecosystem tools.
- Experience with Big Data platform software from Hortonworks, Cloudera, or MapR.
- Good knowledge of Hadoop architecture and strong Hadoop troubleshooting skills.
- Well versed in maintaining the infrastructure of large, complex systems/clusters.
- Prior experience with Linux system administration is important.
- Good knowledge of Hive as a service, HBase, Kafka, and Spark.
- Knowledge of basic data pipeline tools such as Sqoop, file ingestion, and DistCp, and their optimal usage patterns with enterprise scheduling tools such as Control-M.
- Good stakeholder management skills: able to engage in formal and casual conversations and drive the right decisions.
- Knowledge of the various file formats and compression techniques used within HDFS, and the ability to recommend the right patterns based on application use cases.
- Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.).
- Working knowledge of open-source RDBMS: MySQL, Postgres, MariaDB.
Function: IT Software : Software Products & Services