Data Engineer
Job Description
Support the development of the organizational Data Strategy and support data management in line with strategic business objectives and FAB culture and values. Supports the Data Engineering Lead in the development and oversight of the big data platform and its standards, practices, policies, and processes. Supports the development of the Data Engineering Office as a center of excellence committed to ingraining a data-driven decision-making culture across the organization, teaming with external partners, and offering end-to-end data services across business and support groups. Aligns architecture with business requirements. Promotes good data engineering practices and the management of data as a strategic asset.

Core responsibilities
- Create and manage PowerCenter repositories and Integration Services.
- Create and configure PowerExchange CDC environments (Listener and Logger).
- Create application connections for PowerExchange CDC in real-time and continuous mode.
- Create registration groups and maps, and import them into Informatica CDC repositories.
- Monitor memory usage and allocation, and manage code deployment.
- Serve as a subject matter expert on Informatica.
- Perform data acquisition and develop data set processes.
- Manage resources and security.
- Troubleshoot application errors and ensure they do not recur.

Requirements
- Minimum 3 years of direct experience with Apache Hadoop, Informatica, and CDC design patterns.
- Experience contributing to financial projects would be beneficial.
- Ability to create and maintain a data pipeline architecture that accounts for security, scalability, maintainability, and performance.
- Full working knowledge of SQL and the ability to identify and translate use cases into Hadoop/Informatica ETL is essential.
- Basic Hadoop administration skills: Cloudera Manager, Cloudera Navigator, and Hue.
- Proficiency in writing Spark processes in Java/Scala and ideally in Python as well.
- Strong Unix/Red Hat skills; Python scripting highly beneficial.
- Must be capable of designing and documenting solutions independently.
- Excellent track record of supporting and managing internal clients.

Knowledge, Skills, and Attributes

Knowledge and Skills
- Good knowledge of Big Data platforms, frameworks, policies, and procedures.
- Proficient understanding of distributed computing principles.
- Good knowledge of Big Data querying tools such as Hive and Impala.
- ETL tools: Informatica PowerCenter, Informatica PowerExchange.
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming.
- Ability to test and debug all ETL jobs and objects to evaluate performance.
- Excellent SQL knowledge.
- Experience with cloud big data technologies, such as Databricks and Azure HDInsight, is beneficial.

Attributes
- A reliable and trustworthy person, able to anticipate and deal with the varying needs and concerns of numerous stakeholders, adapting personal style accordingly.
- Adaptable and knowledgeable, able to learn and improve their skills with existing and new technologies.
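As a rough illustration of the "translate use cases into ETL" skill listed above (not part of the role description itself), here is a minimal Python sketch of an extract-transform-load step. It uses the standard-library sqlite3 module as a stand-in for a real warehouse target, and all table and column names are hypothetical examples.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL sketch: extract raw rows, transform (amounts to
    integer cents, filter failures), and load into a reporting table.
    All table/column names are hypothetical."""
    cur = conn.cursor()
    # Extract: a raw staging table with sample data.
    cur.execute("CREATE TABLE raw_txn (id INTEGER, amount REAL, status TEXT)")
    cur.executemany(
        "INSERT INTO raw_txn VALUES (?, ?, ?)",
        [(1, 10.50, "ok"), (2, 3.25, "ok"), (3, 99.99, "failed")],
    )
    # Transform + load: keep successful transactions, store cents as integers.
    cur.execute("CREATE TABLE txn_report (id INTEGER PRIMARY KEY, cents INTEGER)")
    cur.execute(
        """
        INSERT INTO txn_report (id, cents)
        SELECT id, CAST(ROUND(amount * 100) AS INTEGER)
        FROM raw_txn
        WHERE status = 'ok'
        """
    )
    conn.commit()
    # Return the number of rows loaded into the reporting table.
    return cur.execute("SELECT COUNT(*) FROM txn_report").fetchone()[0]

if __name__ == "__main__":
    print(run_etl(sqlite3.connect(":memory:")))
```

In a production pipeline the same SELECT-with-filter-and-cast pattern would typically be expressed as an Informatica mapping or a Spark SQL job rather than raw SQLite.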