Software Engineer II - Cashpro Data Services Team

Bank of America

Software Engineering
Charlotte, NC, USA · Remote
Posted on Thursday, September 12, 2024

Job Description:

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day.

One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being.

Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization.

Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

This job is responsible for developing and delivering complex requirements to accomplish business goals. Key responsibilities of the job include ensuring that software is developed to meet functional, non-functional and compliance requirements, coding solutions, unit testing, and ensuring the solution can be integrated successfully into the overall application/system with clear, robust, and well-tested interfaces. Job expectations include an awareness of development and testing practices in the industry.

LOB Summary

The Cashpro Data Services team is responsible for providing critical data solutions for the overall Cashpro platform and its suite of applications. This role involves hands-on development with strong data sourcing and provisioning skills to build out the data lake and support analytics, and is responsible for the design and development of the new data lake for Cashpro analytics.

Responsibilities:

  • Codes solutions and unit tests them to deliver a requirement/story per the defined acceptance criteria and compliance requirements

  • Utilizes multiple architectural components (across data, application, business) in design and development of client requirements

  • Performs Continuous Integration and Continuous Delivery (CI/CD) activities

  • Contributes to story refinement and definition of requirements

  • Participates in estimating work necessary to realize a story/requirement through the delivery lifecycle

  • Contributes to existing test suites (integration, regression, performance), analyzes test reports, identifies any test issues/errors, and triages the underlying cause

  • Performs spike/proof of concept as necessary to mitigate risk or implement new ideas

  • Collaborates with stakeholders such as architects, developers, data analysts, and data modelers to understand dependencies, review designs, and align roadmaps

  • Responsible for building data mappings, sessions, and workflows to meet requirements (see the sketch following this list)

  • Responsible for development of job schedules that integrate with upstream and downstream systems (AutoSys jobs)

  • Responsible for coordination of offshore development to assist in build efforts, including meeting with the offshore team daily to discuss ongoing work

  • Contributes to determining the technical and operational feasibility of solutions

  • Serves as a fully seasoned/proficient technical resource; provides technical knowledge and capabilities as a team member and individual contributor, and will lead a small team

  • May collaborate with external programmers and offshore teams to coordinate delivery commitments, and works closely with support teams, offshore teams, platform teams, and business partners

  • Regularly communicates work progress to management, identifying issues early and resolving them quickly to avoid or minimize impact to projects

  • Gathers, refines, and validates complex business requirements, interdependencies, and potential risks or issues
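
To make the data sourcing and provisioning work above concrete, here is a minimal PySpark sketch of the kind of data-lake provisioning job this role builds. It is illustrative only: the paths, table names, and columns are hypothetical, and the team's actual tooling (Informatica mappings, AutoSys schedules) would look different.

```python
# A minimal sketch of a data-lake provisioning job in PySpark.
# All paths, dataset names, and columns below are hypothetical
# examples, not actual Cashpro systems or schemas.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-datalake-load").getOrCreate()

# Source: raw files landed by a hypothetical upstream feed.
raw = spark.read.parquet("/landing/payments/2024-09-12/")

# Mapping/transformation step (analogous to an ETL mapping):
# standardize types, derive a load date, and de-duplicate on the key.
curated = (
    raw
    .withColumn("amount_usd", F.col("amount").cast("decimal(18,2)"))
    .withColumn("load_dt", F.current_date())
    .dropDuplicates(["payment_id"])
)

# Provisioning step: write a partitioned dataset into the lake zone
# that downstream analytics jobs read from.
(
    curated.write
    .mode("overwrite")
    .partitionBy("load_dt")
    .parquet("/datalake/curated/payments/")
)
```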

Required Qualifications:

• 5+ years of hands-on development and implementation experience with any ETL tool, preferably Informatica.

• 3+ years of development experience in Oracle and SQL Server.

• Exposure to and experience with NoSQL databases (Cassandra, Elastic, MarkLogic, etc.).

• 3+ years in Data Warehouse / Data Mart / Data Mining / Business Intelligence

• Must be proficient with both Linux shell scripting and SQL (a brief SQL-from-Python sketch follows this list).

• Strong understanding of data visualization concepts used to identify trends that lead to actionable insights

• Experience designing tables and assessing relationships in database systems, including performance tuning, database design, and data modeling

• Demonstrated data skills with large data stores and comfort navigating and connecting disparate data sources.

• Ability to present technical concepts to senior-level business stakeholders.

• Strong analytical and problem-solving skills. Impeccable communication skills, both verbal and written.

• Self-motivated, with a willingness to learn and adapt to change.

• Excellent interpersonal skills, a positive attitude, and a team-player mindset.

• Experience working in a global technology development model.
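
As a small illustration of the SQL proficiency asked for above, the sketch below runs a parameterized aggregate query against an Oracle warehouse from Python using the python-oracledb driver. The connection details, schema, and table name (dw.payment_summary) are hypothetical placeholders, not real systems.

```python
# Hedged sketch: a parameterized SQL query from Python against an
# Oracle warehouse. All connection details and names are hypothetical.
import datetime

import oracledb  # the python-oracledb driver (successor to cx_Oracle)

conn = oracledb.connect(
    user="etl_user",             # hypothetical credentials
    password="***",
    dsn="dw-host:1521/ORCLPDB1", # hypothetical DSN
)

with conn.cursor() as cur:
    # Named bind variables (:start_dt) avoid SQL injection and let
    # Oracle reuse the parsed statement.
    cur.execute(
        """
        SELECT TRUNC(posting_date) AS day, SUM(amount_usd) AS total
        FROM dw.payment_summary          -- hypothetical table
        WHERE posting_date >= :start_dt
        GROUP BY TRUNC(posting_date)
        ORDER BY day
        """,
        start_dt=datetime.date(2024, 9, 1),
    )
    for day, total in cur:
        print(day, total)

conn.close()
```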

Desired Qualifications:

• Master’s degree in Computer Science or related field.

• Hands-on programming experience with Python.

• Experience designing and implementing complex, end-to-end business analytics solutions using software such as Alteryx, Trifacta, and MicroStrategy

• Development experience in big data and familiarity with the Hadoop stack (Scala, Spark, Python, PySpark, Hive, Impala)

• Experience with Flume, Kafka, or Spark is an added advantage (see the streaming sketch following this list)

• Experience with text mining, sentiment analysis, or natural language processing
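
For the Kafka/Spark experience noted above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands it in the data lake. The broker, topic, and paths are hypothetical assumptions for illustration.

```python
# Hedged sketch: consuming a Kafka topic with Spark Structured
# Streaming and landing it in the lake. Broker, topic, and paths
# are hypothetical, not actual Cashpro infrastructure.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-kafka-stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical
    .option("subscribe", "payment-events")              # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; decode the value for processing.
decoded = events.select(F.col("value").cast("string").alias("payload"))

# Append each micro-batch to the lake; the checkpoint directory lets
# the job resume from its last committed offsets after a restart.
query = (
    decoded.writeStream
    .format("parquet")
    .option("path", "/datalake/stream/payment_events/")
    .option("checkpointLocation", "/checkpoints/payment_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```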

Skills:

  • Application Development

  • Automation

  • Collaboration

  • DevOps Practices

  • Solution Design

  • Agile Practices

  • Architecture

  • Result Orientation

  • Solution Delivery Process

  • User Experience Design

  • Analytical Thinking

  • Data Management

  • Risk Management

  • Technical Strategy Development

  • Test Engineering

Minimum Education Requirement: Bachelor's degree or equivalent work experience

Shift:

1st shift (United States of America)

Hours Per Week:

40