
Position: Data Engineer - Group Digital Transformation

Location: Trinidad and Tobago

COMPANY: ANSA McAL LIMITED

JOB SUMMARY:

The incumbent is responsible for developing, expanding, and optimizing our data and related architecture, as well as optimizing data flow and collection for cross-functional teams throughout the ANSA McAL Group of Companies. The Data Engineer – Group Digital Transformation will work closely with business and data teams across all business units to ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, products, and data initiatives.

FUNCTIONAL ASSIGNMENT:

Projects include developing and testing architectures that enable data extraction and transformation for predictive or prescriptive modelling, consumption of data through analytics, and data storage.

Proven communication skills, problem-solving skills, and knowledge of integration best practices are critical to successful performance in this role.

This role requires excellent technical skills and the ability to approach problems in a creative manner. The incumbent must be able to interact with internal customers and not only perform technical work, but also understand needs, extract requirements, and design and propose solutions.

 

GENERAL DUTIES & RESPONSIBILITIES:

  1. Design data modelling services used to mine enterprise systems and applications for knowledge and information that enhance business processes.
  2. Harness vast amounts of data from multiple data sources that meet functional / non-functional business requirements and optimize business results.
  3. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  4. Participate in the evaluation and selection of all infrastructure components such as software, hardware, big data components, and networking capabilities.
  5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  6. Conduct research and make recommendations on data lake products, services, protocols, data standards, and data models for the data lake.
  7. Responsible for assisting in building, deploying, and maintaining data support tools, metadata inventories, and definitions for database file/table creation including requisite documentation.
  8. Build tools that utilize the data architecture to consume data in analytics from developed models, to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  9. Assess and cultivate long-term strategic goals for the data storage in conjunction with data users, business units, Executive, and other key stakeholders.
  10. Revise data architectures when required to be compatible with changing business needs and client standards so that all legal, compliance, and operational requirements, such as data sovereignty, regulatory security and privacy policies, and service level agreements (SLA), are accounted for.
  11. Coordinate and work with other technical staff to develop database architectures, coding standards, data management policies and procedures, and protocols for mining databases.

 

EDUCATION AND EXPERIENCE:

  • Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience.
  • At least 2 years of Python development experience.
  • At least 2 years of SQL experience.
  • At least 2 years of experience with workflow management engines.
  • At least 2 years of experience with Data Modelling.
  • At least 2 years of experience in ETL (extract, transform, and load) design, implementation, and maintenance.

 

BUSINESS/TECHNICAL SKILLS:

  • Strong understanding of data models, structures, theories, principles, and practices.
  • Strong familiarity with data preparation, processing, classification, and forecasting.
  • Advanced working knowledge of SQL, including experience with relational databases and query authoring, as well as familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience with business and technical requirements analysis, business process modelling/mapping and methodology development, and data mapping.
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.

 

CORE COMPETENCIES:

  • A strong passion for empirical research and for answering hard questions with data.
  • A flexible analytic approach that allows for results at varying levels of precision.
  • Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner.
  • Good written and oral communication skills.
  • Strong technical documentation skills.
  • Good interpersonal skills.
  • Highly self-motivated and directed.
  • Keen attention to detail.
  • Strong presentation skills.
  • Ability to conduct research into database issues, data management issues, practices, standards, and products as required.
  • Ability to effectively prioritize and execute tasks in a high-pressure environment.
  • Strong customer service orientation.
  • Experience working in a team-oriented, collaborative environment.

How to apply:
These are the requirements for applying for this job:

- Do you have a Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience?
- Do you have at least 2 years of Python development experience?
- Do you have at least 2 years of SQL experience?
- Do you have at least 2 years of experience with workflow management engines?
- Do you have at least 2 years of experience with Data Modelling?
- Do you have at least 2 years of experience in ETL (extract, transform, and load) design, implementation, and maintenance?