Job Function : Application Programming, Maintenance
Specialization : Software Developer
Industry : IT-Software
Qualification : School & Graduation - Any Graduate
Experience : 10 - 15 years
Location : Bengaluru/ Bangalore
Key Skills : Software Development Architect, HDFS, Hive, Big Data, Pig, Flume
JOB DESCRIPTION :
Job Summary:
As a Software Architect in the ASup DPS team, you will be responsible for leading the architecture and development of data warehouse and data services solutions using non-traditional technologies built on or integrated with the Hadoop platform. This position is responsible for business logic, architecture, and implementation.
Essential Functions:
A major part of your responsibility will be to lead and facilitate all phases of the product development and maintenance life cycles ensuring high quality output in the following areas:
Requirement analysis and functional specification development
Architecture and design development, and associated peer reviews
Code development and code reviews
Testing (unit, system and integration)
Reliability, scalability and performance analysis and improvements
Patch releases and urgent fixes
As part of the AutoSupport team, you will be accountable for meeting your commitments to ensure our programs complete on time and meet or exceed our customers' expectations. This requires:
Leading architectural and technical evaluations as development plans are established. You will have direct influence on the decisions and outcomes related to project direction.
A willingness to work on additional tasks and responsibilities that contribute towards team, department, and company goals.
Working in a cross-functional environment with key partners from Engineering, Global Support, Service Marketing, Sales, and Information Technology.
Providing proactive and reactive operational support to both internal and external users of AutoSupport applications and services.
Desired Profile:
Requirements:
Firm knowledge of Core Java. Significant work experience with the Hadoop platform and its associated technologies such as Hive, Pig, Flume, and Sqoop. Industry experience architecting and implementing solutions, creating design and functional specifications, and participating in code reviews. Working knowledge of Oracle or any other industry-class RDBMS.
The successful candidate must have:
5-8 years of industry experience developing distributed applications involving high data volumes and stringent performance criteria
Experience in writing frameworks or specifications in Java
Solid experience in end-to-end implementation and setup of Hadoop-based applications on a cluster of a hundred or more nodes
Good knowledge of Hive. Knowledge of Flume, Sqoop, and Pig is an added advantage
Strong fundamentals in OS and networking
Strong fundamentals in serialization/deserialization and data structures
Hands-on programming skills. Experience in writing efficient algorithms.
Excellent analytical, problem-solving, and communication (written and verbal) skills
A strong desire and aptitude for learning new technologies
Ability to work in a fast-paced, self-directed, action-oriented environment
Other desired qualities:
Experience contributing directly to the Hadoop open-source community is preferred.
Experience leading system design and architecture development.
Strong development experience with enterprise-class servers and products, especially those with large data-warehouse components.
Education & Experience:
A minimum of 8 years of experience in application development is required. 10-12 years of experience is preferred.
A Bachelor of Science degree in Electrical Engineering or Computer Science, a Master's degree, or a PhD, or equivalent experience, is required.
Demonstrated ability to complete multiple, highly complex technical tasks.
COMPANY DESCRIPTION :
NetApp creates innovative storage and data management solutions that help our customers accelerate business breakthroughs and achieve outstanding cost efficiency. Our dedication to the principles of simplicity, innovation, and customer success has made us one of the fastest-growing storage and data management providers today.