Participate in Sprint planning and tasks assigned by application development managers.
Analyze business requirements and convert them into detailed program logic.
Design, develop, test and implement the application.
Investigate and resolve complex issues while supporting existing applications.
Review platform problem reports from business teams and perform initial filtering and analysis of the problems.
Resolve business issues in a timely manner.
Effectively respond to technical queries and requests from team members and customers.
Communicate regularly with business and IT leadership teams.
Collaborate with third-party integration vendors and in-house teams that build custom components for the company.
Adhere to application security procedures and change control guidelines.
Work within the internal IT team to collaborate on tasks and complete varying workloads.
Complete assigned tasks based on pre-planned project deadlines.
Troubleshoot issues and develop resolutions.
Review modules for quality assurance and check compliance with application architecture standards at the system level.
A green-field Big Data project is seeking a self-starting, experienced developer to join a program redefining how Credit Risk data is stored and presented to the user community.
The role provides the opportunity to work on the delivery of large-scale user stories, contributing from a Big Data perspective to the delivery of a new strategic global CRM dashboard with a future user community in the thousands. This new application uses the latest web technologies in tandem with Big Data to satisfy high expectations for responsiveness and volume of data.
This individual will deal with complex, interrelated applications and systems designed for today's Big Data platforms, working with rapidly evolving technologies and internal stakeholders to analyze and define unique and innovative solutions for achieving strategic business goals.
The core development team is presently split across India, the US and Poland, with other team functions and additional support teams located in London and Singapore.
Essential Skills and Qualifications:
Proven background in Big Data project deliveries.
Hadoop certification from an established distribution vendor, such as Cloudera, MapR or Hortonworks, is preferred.
Ability to create a productive environment that promotes initiative, innovation and continual improvement.
Able to document and clearly explain technical solutions to both technical and non-technical teams.
Must-have qualifications/experience:
Proven commercial experience with Big Data platforms and technologies, specifically hands-on experience with Hadoop and associated technologies such as Apache Spark, Impala, Hive and Sqoop.
Proven commercial experience of delivering systems written in Java and Scala.
Experience in developing RESTful services.
Either experience of working on green-field Big Data projects or the prerequisite skills to enable this.
Experience in designing and developing large-scale Java based enterprise applications that can handle massive data volumes.
Strong analytical skills to research and solve problems.
Excellent verbal and written communication.
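As an illustration of the RESTful-services skill listed above, a minimal HTTP endpoint can be sketched with the JDK's built-in com.sun.net.httpserver package. The class name, path and JSON payload below are hypothetical examples for illustration only, not part of any actual platform; a production service would typically use a full framework and query the underlying data platform rather than return a fixed body.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class RiskEndpoint {

    // Starts an HTTP server on the given port (0 = pick a free port)
    // exposing a single read-only JSON resource.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/risk/summary", exchange -> {
            // Fixed placeholder body; a real service would build this
            // from live Credit Risk data.
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start(0);
        System.out.println("Listening on port " + server.getAddress().getPort());
        server.stop(0);
    }
}
```

Using the JDK-bundled server keeps the sketch dependency-free; the same resource shape maps directly onto frameworks such as JAX-RS or Spring MVC.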
Desired Skills and Qualifications:
Flume or Kafka
Experience enforcing data-level security.
Experience of data warehousing and data modeling.
Exposure to ANSI-SQL in an Oracle environment.
Agile methodologies like Scrum.
WebSphere, WebLogic or other application server runtimes.