DATA ENGINEER (Remote)

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable and more connected world. 

Job Overview

 

We are on an exciting journey to build and scale our advanced analytics practice, and TE is looking for a Data Engineer to join us. The ideal candidate has demonstrated experience designing and implementing ETL solutions on on-premises and cloud platforms to support enterprise data warehouse, data lake, and advanced analytics capabilities. Success in this role comes from combining a strong data engineering background with product and business acumen to deliver scalable data pipeline and BI solutions that enable data for self-service and advanced analytics at TE in a simple, standard manner.

 

You will help define ROI, analyze requirements, and design and implement data solutions on premises and in the cloud. The candidate will work closely with project managers, vendor partners, business unit representatives, project sponsors, and Segment CIO teams to deliver these solutions, and is expected to communicate project status, issues, and change control to all levels of management.

Job Responsibilities

 

  • Design and develop ETL solutions using data warehouse design best practices for the Next Generation Analytics platform.
  • Analyze data requirements, complex source data, and data models, and determine the best methods for extracting, transforming, and loading data into staging, the data warehouse, and other system integration projects.
  • Analyze business requirements and outline solutions.
  • Apply deep working knowledge of on-premises and cloud ESB architecture to address the client’s requirements for scalability, reliability, security, and performance.
  • Provide technical assistance in identifying, evaluating, and developing systems and procedures.
  • Document all ETL and data warehouse processes and flows.
  • Develop and deploy ETL job workflows with reliable error/exception handling and rollback.
  • Manage foundational data administration tasks such as scheduling jobs, troubleshooting job errors, identifying issues with job windows, and assisting with database backups and performance tuning.
  • Design, develop, test, and adapt ETL code and jobs to accommodate changes in source data and new business requirements.
  • Create or update technical documentation for transition to support teams.
  • Develop automated data audit and validation processes.
  • Provide senior technical leadership for the design, architecture, integration, and support of the entire data sourcing platform, with a focus on high availability, performance, scalability, and maintainability.
  • Manage automation of file processing as well as all ETL processes within a job workflow.
  • Contribute to and adhere to the development of standards and sound procedural practices.
  • Proactively communicate innovative ideas, solutions, and capabilities beyond the specific task request.
  • Effectively communicate status and workloads, and offer assistance to other areas.
  • Work collaboratively with a team and independently, and continuously strive for high-performing business solutions.
  • Perform and coordinate unit and system integration testing.
  • Participate in design review sessions and ensure all solutions are aligned to pre-defined architectural specifications.
  • Ensure data quality throughout entire ETL process, including audits and feedback loops to sources of truth.

Job Requirements

 

  • 6+ years of data engineering experience in ETL design, development, optimization, and testing using PL/SQL, SAP Data Services (BODS), Talend, or similar tools.
  • 5+ years of experience with PL/SQL, complex SQL tuning, stored procedures, and data warehousing best practices.
  • 3+ years of experience in relational and cloud database design, optimization, and performance, preferably with AWS (S3 and Redshift), SAP HANA, SAP BW, Oracle, and Hadoop.
  • 3+ years of experience developing batch, real-time, and streaming data flows to personalize experiences for our customers.
  • 2+ years of experience designing service-oriented architecture (SOA), RESTful API, and enterprise application integration (EAI) solutions using the MuleSoft platform.
  • 2+ years of experience with CI/CD tools such as Jenkins and Git, along with Java and shell scripting.
  • Strong problem-solving capabilities. Results oriented. Relies on fact-based logic for decision-making.
  • Ability to work with multiple projects and work streams at one time. Must be able to deliver results based upon project deadlines.
  • Willing to flex daily work schedule to allow for time-zone differences for global team communications
  • Strong interpersonal and communication skills

Competencies

Values: Integrity, Accountability, Inclusion, Innovation, Teamwork
Location: 

MAKATI CITY, 00, PH, 1226

City:  MAKATI CITY
State:  00
Country/Region:  PH
Travel:  None
Requisition ID:  125563
Alternative Locations: 
Function:  Information Technology