17 ETL Architect jobs in Egypt

Data Integration Engineer

EGP40000 - EGP60000 Y INZOX LLC

Posted today

Job Description


Location: Remote

Experience Level: 2–3 years

Are you passionate about system integration and looking to grow your career in the Oracle ecosystem?

We are looking for a Junior Data Integration Engineer to join our dynamic team and support building, maintaining, and monitoring data integrations using Oracle Integration Cloud (OIC) and Oracle E-Business Suite.

Job Responsibilities:

  • Assist in integrating Oracle E-Business Suite with external parties using Oracle Integration Cloud and service-oriented architecture.
  • Support senior team members and application supervisors in handling reports and forms.
  • Prepare and maintain integration documentation.
  • Monitor integrations and troubleshoot issues under supervision.
  • Develop simple customized reports/forms as directed by the manager.
  • Collaborate with the DBA to ensure optimal performance for report execution.
  • Participate in activities related to master data loading and validation.
  • Support users with technical integration and security-related issues.
  • Generate periodic integration/application performance reports for management.
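The monitoring and periodic-reporting duties above can be illustrated with a short, generic Python sketch. This is not OIC-specific code: the run-record fields (`integration`, `status`) are hypothetical stand-ins for whatever a real monitoring export would provide.

```python
# Illustrative sketch only: summarize integration run records into the kind of
# periodic status report described above. Field names are hypothetical.
from collections import Counter

def summarize_runs(runs: list[dict]) -> dict:
    """Count runs by status and list the integrations with failures."""
    by_status = Counter(r["status"] for r in runs)
    failing = sorted({r["integration"] for r in runs if r["status"] == "FAILED"})
    return {"by_status": dict(by_status), "failing_integrations": failing}

if __name__ == "__main__":
    runs = [
        {"integration": "EBS_TO_BANK", "status": "SUCCESS"},
        {"integration": "EBS_TO_BANK", "status": "FAILED"},
        {"integration": "HR_SYNC", "status": "SUCCESS"},
    ]
    print(summarize_runs(runs))
    # {'by_status': {'SUCCESS': 2, 'FAILED': 1}, 'failing_integrations': ['EBS_TO_BANK']}
```

In practice the input would come from the platform's monitoring logs; the point is only the shape of a periodic roll-up report.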

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or related field.
  • 2–3 years of experience in system integration or a related IT field.
  • Knowledge of Oracle E-Business Suite.
  • Exposure to Oracle Integration Cloud (OIC) or Oracle SOA Suite.
  • Basic understanding of data management and troubleshooting.
  • Familiarity with Oracle EBS reports/forms development.
  • Strong task planning and problem-solving skills.
  • Proficiency in MS Office (Word, Excel, PowerPoint).
  • Good communication skills in English.

Preferred Skills (Nice to Have):

  • Hands-on Oracle SOA or Middleware experience.
  • Knowledge of SQL & Oracle Database.
  • Exposure to cloud-based integration projects.

If you're eager to advance your skills in Oracle integration and work with cutting-edge cloud technologies, we'd love to hear from you.

This advertiser has chosen not to accept applicants from your region.

Data Integration Specialist

EGP40000 - EGP60000 Y SSC HR Solutions

Posted today

Job Description

Job Summary:

We are seeking a highly skilled and motivated Data Integration Specialist to design, develop, and support scalable ETL solutions across enterprise data platforms. The ideal candidate will have strong expertise in ETL tools, SQL, and shell scripting, with hands-on experience integrating large volumes of data from multiple source systems into enterprise data warehouses. Knowledge of Hadoop, MPP systems, and Unix/Linux environments is a strong advantage.

Key Responsibilities:
  • Capture and analyze ETL requirements from business and technical stakeholders, and translate them into robust data integration designs.
  • Design, develop, and maintain ETL workflows and data pipelines for loading data from diverse source systems into Data Warehouses and Data Marts.
  • Ensure data quality, accuracy, and timeliness, while adhering to defined SLAs and performance benchmarks.
  • Optimize existing data integration processes for performance, scalability, and resource efficiency.
  • Troubleshoot and resolve ETL and data load issues in coordination with other data and infrastructure teams.
  • Provide mentorship and guidance to junior developers and contribute to knowledge-sharing across the team.
  • Create and maintain comprehensive technical documentation for data integration processes, design decisions, and operational procedures.
Requirements
Technical Skills & Qualifications:
  • Strong working knowledge of enterprise relational databases such as Teradata, Oracle, or IBM DB2.
  • Proficiency in two or more ETL tools (e.g., Informatica, IBM DataStage, Oracle Data Integrator (ODI), SAS DI Studio).
  • Advanced SQL skills and shell scripting experience for automation and custom logic implementation.
  • Solid experience in delivering ETL solutions for large enterprise environments, handling complex transformations and high-volume data.
  • Exposure to Hadoop ecosystems and Massively Parallel Processing (MPP) platforms such as Teradata is a plus.
  • Hands-on experience with Unix/Linux OS, including scripting and command-line operations.
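To make the ETL terminology above concrete, here is a minimal, hedged Python sketch of an extract-transform-load step, using the standard library's `sqlite3` as a stand-in for an enterprise warehouse. The `staging_orders`/`dw_orders` tables and the quality rule are hypothetical.

```python
# A minimal, self-contained sketch of the extract-transform-load pattern this
# role centres on, using stdlib sqlite3 as a stand-in for an enterprise
# warehouse. All table and column names here are hypothetical.
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: a hypothetical staging table fed by a source system.
    cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount TEXT)")
    cur.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                    [(1, "10.50"), (2, " 7.25 "), (3, None)])
    # Transform + load: cast, trim, and reject rows that fail a quality rule.
    cur.execute("CREATE TABLE dw_orders (order_id INTEGER, amount REAL)")
    cur.execute("""
        INSERT INTO dw_orders
        SELECT order_id, CAST(TRIM(amount) AS REAL)
        FROM staging_orders
        WHERE amount IS NOT NULL
    """)
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

if __name__ == "__main__":
    loaded = run_etl(sqlite3.connect(":memory:"))
    print(loaded)  # prints 2: the rows that passed the quality rule
```

A production pipeline in Informatica or DataStage expresses the same extract/transform/load split through mappings and stages rather than hand-written SQL, but the logic is the same.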
Preferred Qualifications:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 4+ years of experience in data integration, ETL development, or data engineering roles.
  • Experience working in Agile/Scrum environments with CI/CD practices.
  • Strong problem-solving skills, attention to detail, and a collaborative approach to cross-functional projects.
  • Familiarity with data governance, metadata management, and data lineage tools is an advantage.

Why Join Us?

Be a key contributor to our enterprise data landscape by designing robust and scalable integration solutions. If you're passionate about solving complex data problems and enabling business intelligence through quality data pipelines, we'd love to connect with you.

Senior Data Integration Engineer

EGP90000 - EGP120000 Y IdealRatings

Posted today

Job Description

We are seeking a Senior Data Integration Engineer with extensive experience in SQL Server database management, dynamic scripting, and T-SQL development. The ideal candidate will be skilled in ETL design and implementation using SSIS, as well as creating reports and dashboards in SSRS. This role involves working closely with cross-functional teams to ensure the efficient integration, transformation, and delivery of data across various systems.

Key Responsibilities:

Database Management

  • Administer, monitor, and maintain SQL Server environments, ensuring performance, security, and high availability.
  • Optimize database performance through indexing, query tuning, and resource management.

Data Integration & ETL Development

  • Design, develop, and maintain ETL processes using SQL Server Integration Services (SSIS).
  • Build dynamic and reusable scripts for data transformation, validation, and automation.
  • Integrate data from multiple sources, ensuring quality, consistency, and accuracy.

T-SQL Development

  • Write and optimize complex T-SQL queries, stored procedures, and functions.
  • Develop dynamic SQL scripts for flexible and parameterized data processing.
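The combination of dynamic and parameterized SQL mentioned above can be sketched as follows. This is an illustrative Python/`sqlite3` translation rather than T-SQL (where `sp_executesql` plays the analogous role); the `orders` table and the column whitelist are hypothetical.

```python
# Hedged sketch of "dynamic, parameterized" querying: the column name is
# interpolated from a whitelist (the dynamic part), while the values are
# bound as parameters (the safe, parameterized part). Names are hypothetical.
import sqlite3

ALLOWED_SORT_COLUMNS = {"order_id", "amount"}  # whitelist for the dynamic part

def top_orders(conn, sort_by: str, min_amount: float, limit: int):
    if sort_by not in ALLOWED_SORT_COLUMNS:
        raise ValueError(f"unsupported sort column: {sort_by}")
    sql = (f"SELECT order_id, amount FROM orders "
           f"WHERE amount >= ? ORDER BY {sort_by} DESC LIMIT ?")
    return conn.execute(sql, (min_amount, limit)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 5.0), (2, 12.5), (3, 20.0)])
    print(top_orders(conn, "amount", 10.0, 5))  # [(3, 20.0), (2, 12.5)]
```

The whitelist-plus-parameters split matters because identifiers (table/column names) cannot be bound as parameters in any SQL dialect, so the dynamic part must be validated separately.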

Reporting & Analytics

  • Design and implement interactive reports and dashboards using SQL Server Reporting Services (SSRS).
  • Collaborate with stakeholders to understand reporting requirements and deliver actionable insights.

Collaboration & Documentation

  • Work closely with data analysts, application developers, and business teams to support integration projects.
  • Create and maintain technical documentation for database designs, ETL workflows, and reporting solutions.

Qualifications Required:

  • Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).
  • 5+ years of hands-on experience with SQL Server administration and development.
  • Strong expertise in T-SQL and dynamic SQL scripting.
  • Proven experience in SSIS package design, deployment, and troubleshooting.
  • Experience building and optimizing reports using SSRS.
  • Solid understanding of ETL best practices, data cleansing, and transformation logic.
  • Experience with query performance tuning and database optimization techniques.
  • Knowledge of data warehousing concepts and data modeling.

Preferred:

  • Experience with version control systems (e.g., Bitbucket, SVN).
  • Strong problem-solving skills and ability to work in a fast-paced environment.

Senior Data Engineer-Data Integration

EGP900000 - EGP1200000 Y IBM

Posted today

Job Description

Introduction
As a Data Engineer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Your Role And Responsibilities
A Data Engineer with expertise in Data Integration is responsible for designing and building solutions to transfer data from operational and external environments to the business intelligence environment. They utilize tools such as Informatica, Ab Initio software, and DataStage (formerly Ascential) - IBM's WebSphere Data Integration Suite. This role involves creating and implementing Extract, Transform, and Load (ETL) processes, ensuring the seamless flow of data throughout the business intelligence solution's lifecycle.

Data Architect

EGP90000 - EGP120000 Y Envision Employment Solutions

Posted today

Job Description

Envision Employment Solutions is currently looking for a Data Architect (DataBricks) for one of our partners, a global leader in consulting, digital transformation, technology and engineering services.

This position requires flexibility to work on US times and flexibility to travel abroad when needed.

Responsibilities:

  • Design, develop, and implement data architectures on the Databricks platform, including data lakes, data warehouses, and data pipelines.
  • Define and implement data governance and security policies within the Databricks environment.
  • Collaborate with data engineers to optimize data ingestion, transformation, and processing pipelines on Databricks.
  • Develop and maintain data models and schemas for various business domains.
  • Conduct data profiling and quality assessments to ensure data accuracy and completeness.
  • Troubleshoot and resolve data-related issues within the Databricks environment.
  • Stay abreast of the latest advancements in Databricks technologies and best practices.
  • Mentor and guide junior data engineers and analysts.
  • Collaborate with business stakeholders to understand their data needs and translate them into technical requirements.
  • Participate in the evaluation and selection of new data technologies and tools.
  • Contribute to the development and improvement of data architecture standards and best practices.
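A minimal illustration of the data-profiling step listed in the responsibilities above, in plain Python. A Databricks implementation would typically run this on Spark tables; the record shape here is a hypothetical example.

```python
# Hedged sketch of per-column data profiling: null counts and distinct counts,
# the basic inputs to a completeness/accuracy assessment. Layout is hypothetical.
from collections import defaultdict

def profile(rows: list[dict]) -> dict:
    nulls = defaultdict(int)
    distinct = defaultdict(set)
    cols: list[str] = []
    for row in rows:
        for col, val in row.items():
            if col not in cols:
                cols.append(col)          # keep first-seen column order
            if val is None:
                nulls[col] += 1
            else:
                distinct[col].add(val)
    return {c: {"nulls": nulls[c], "distinct": len(distinct[c])} for c in cols}

if __name__ == "__main__":
    sample = [
        {"id": 1, "country": "EG"},
        {"id": 2, "country": None},
        {"id": 3, "country": "EG"},
    ]
    print(profile(sample))
    # {'id': {'nulls': 0, 'distinct': 3}, 'country': {'nulls': 1, 'distinct': 1}}
```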

Requirements:

  • Bachelor's degree in Computer Science, Computer Engineering, or a related field.
  • 9+ years of experience as a Data Architect or a similar role.
  • Strong experience with Databricks, including Delta Lake, Spark SQL, and Databricks SQL.
  • Experience with cloud platforms such as AWS, Azure, or GCP.
  • Experience with data warehousing and data lake architectures.
  • Strong understanding of data modeling and data warehousing concepts.
  • Experience with data integration and ETL/ELT processes.
  • Proficiency in SQL and Python.
  • Experience with data governance and security best practices.
  • Excellent communication and interpersonal skills.
  • Strong analytical and problem-solving skills.
  • Ability to work independently and as part of a team.

Benefits:

  • Competitive Salary based on experience
  • Social and medical insurance
  • Learning, development and career progression

Data Architect

EGP90000 - EGP120000 Y Giza Systems

Posted today

Job Description

  • Participates in vendor assessment and selection.
  • Participates with the presales team in the proposed solution design during the bidding phase.
  • Prepares the team's Scope of Work (SoW) proposal write-up for bidding in software projects.
  • Prepares the team's professional services sizing, assumptions, and prerequisites for bidding in software projects.
  • Participates in customer demonstrations and presentations to discuss our software solutions and convince the customer of their value.
  • Attends requirement-gathering workshops and prepares business requirements documents.
  • Prepares the project's strategy documents (e.g., configuration management strategy, migration strategy, go-live strategy, etc.).
  • Participates in and reviews the testing strategy document.
  • Prepares high-level design documents, including end-to-end solution architecture and integration scenarios, with the help of the technical architects.
  • Reviews/audits development team and subcontractor technical documentation to ensure alignment with project scope and architecture guidelines.
  • Proven experience as an architect and engineering lead in the Data & Analytics stream.
  • In-depth understanding of data structure principles and data platforms.
  • Problem-solving attitude and solution mindset with implementation expertise.
  • Working experience on modern data platforms involving big data technologies, data management solutions, and data virtualization.
  • Well-versed in end-to-end data management philosophies and governance processes.
  • Pre-sales experience, including involvement in RFP/RFI/RFQ processes.
  • Creative problem-solver with strong communication skills.
  • Excellent understanding of traditional and distributed computing paradigms.
  • Excellent knowledge of data warehouse/data lake technology and business intelligence concepts.
  • Good knowledge of relational, NoSQL, and big data databases, with the ability to write complex SQL queries.

Technical Skills

  • Data integration – ETL tools such as Talend and Informatica; ingestion mechanisms such as Flume and Kafka.
  • Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies. Experience in Snowflake modelling would be an advantage.
  • Data visualization – tools such as Tableau, Power BI, and Kibana.
  • Master data management (MDM) – concepts and expertise in tools such as Informatica and Talend MDM.
  • Big data – Hadoop ecosystem, distributions such as Cloudera/Hortonworks, Pig, and Hive.
  • Data processing frameworks – Spark and Spark Streaming.
  • Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL (HBase/Cassandra, MongoDB) is required.
  • Knowledge of various data modelling techniques, with hands-on experience in data modelling tools such as ERwin, TOAD, PowerDesigner, etc.
  • Experience in cloud data ecosystems – AWS, Azure, or GCP.
  • Strong analytical and problem-solving capability.
  • Good understanding of the data ecosystem, including both current and future data trends.

Personal Skills

  • BSc in Computer Engineering or Computer Science.
  • Years of experience: min 10, max 20.

Data Architect

EGP120000 - EGP240000 Y Giza Systems EG

Posted today

Job Description

The main purpose of the solution architect position is to design software solutions, provide technical leadership to software delivery teams, and be accountable for the customer's technical acceptance of the delivered solution.

  • Participates in vendor assessment and selection.

  • Participates with the presales team in the proposed solution design during the bidding phase.

  • Prepares the team's Scope of Work (SoW) proposal write-up for bidding in software projects.

  • Prepares the team's professional services sizing, assumptions, and prerequisites for bidding in software projects.

  • Participates in customer demonstrations and presentations to discuss our software solutions and convince the customer of their value.

  • Attends requirement-gathering workshops and prepares business requirements documents.

  • Prepares the project's strategy documents (e.g., configuration management strategy, migration strategy, go-live strategy, etc.).

  • Participates in and reviews the testing strategy document.

  • Prepares high-level design documents, including end-to-end solution architecture and integration scenarios, with the help of the technical architects.

  • Reviews/audits development team and subcontractor technical documentation to ensure alignment with project scope and architecture guidelines.

  • Proven experience as an architect and engineering lead in the Data & Analytics stream.

  • In-depth understanding of data structure principles and data platforms.

  • Problem-solving attitude and solution mindset with implementation expertise.

  • Working experience on modern data platforms involving big data technologies, data management solutions, and data virtualization.

  • Well-versed in end-to-end data management philosophies and governance processes.

  • Pre-sales experience, including involvement in RFP/RFI/RFQ processes.

  • Creative problem-solver with strong communication skills.

  • Excellent understanding of traditional and distributed computing paradigms.

  • Excellent knowledge of data warehouse/data lake technology and business intelligence concepts.

  • Good knowledge of relational, NoSQL, and big data databases, with the ability to write complex SQL queries.

Technical Skills:

  • Data integration – ETL tools such as Talend and Informatica; ingestion mechanisms such as Flume and Kafka.

  • Data modelling – dimensional and transactional modelling using RDBMS, NoSQL, and big data technologies. Experience in Snowflake modelling would be an advantage.

  • Data visualization – tools such as Tableau, Power BI, and Kibana.

  • Master data management (MDM) – concepts and expertise in tools such as Informatica and Talend MDM.

  • Big data – Hadoop ecosystem, distributions such as Cloudera/Hortonworks, Pig, and Hive.

  • Data processing frameworks – Spark and Spark Streaming.

  • Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL (HBase/Cassandra, MongoDB) is required.

  • Knowledge of various data modelling techniques, with hands-on experience in data modelling tools such as ERwin, TOAD, PowerDesigner, etc.

  • Experience in cloud data ecosystems – AWS, Azure, or GCP.

  • Strong analytical and problem-solving capability.

  • Good understanding of the data ecosystem, including both current and future data trends.

Personal Skills
  • Preferred to be TOGAF certified.


Education

Bachelor's degree in Computer Science, Software Engineering, or a related field.

Job Details

Job Location

Cairo, Egypt

Company Industry

Integration

Company Type

Employer (Private Sector)

Job Role

Information Technology

Employment Status

Full time

Employment Type

Employee

Job Division

COO Office

SW Engineering

Preferred Candidate

Career Level

Mid Career

Years of Experience

Min: 10 Max: 20

Residence Location

Egypt

Nationality

Egypt

Degree

Bachelor's degree

Data Architect

EGP90000 - EGP120000 Y Hypercell

Posted today

Job Description

Company Description:

Hypercell is an Egyptian software development firm headquartered in Cairo, Egypt. We collaborate closely with our clients to deliver highly customized solutions while maintaining our unwavering dedication to quality. At Hypercell, we strive to provide innovative and effective software solutions that meet the unique needs of each client, fostering a culture of excellence and continuous improvement. Our commitment to quality and client satisfaction drives everything we do.

Qualifications:

-Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related discipline.

-8+ years of experience in data architecture, preferably in banking, fintech, or financial services.

-Design and implement enterprise-wide data architecture for core banking, payments, credit risk, anti-fraud, and regulatory reporting platforms.

-Develop and maintain enterprise data models aligned with financial product lifecycles (loans, deposits, derivatives, etc.).

-Establish and enforce data governance frameworks, metadata management, and data lineage documentation to meet internal audit and external regulatory standards.

-Lead the modernization of legacy data systems into cloud-native architectures (e.g., Snowflake, Azure Data Lake, AWS Redshift).

-Collaborate with Compliance, Risk, Treasury, and Finance teams to ensure accurate data aggregation and reporting.

-Define and guide standards for ETL/ELT pipelines, master data management (MDM), and data quality assurance.

-Support the implementation of data privacy, retention, and security policies in line with regulatory mandates (e.g., GDPR, CCPA, local data residency laws).

-Review and validate third-party vendor integrations, APIs, and open banking standards from a data architecture perspective.

-Provide technical leadership on strategic initiatives such as real-time fraud detection, credit decisioning engines, and customer 360° analytics.

-Experience with Oracle, SQL Server, PostgreSQL, and NoSQL databases.

-Strong knowledge of financial domain data structures (loans, accounts, transactions, KYC, AML, risk models).

-Expertise in data modeling, data warehousing (e.g., Oracle, Teradata, Snowflake), and cloud data platforms.

-Experience with data governance tools and practices (e.g., Collibra, Informatica, Apache Atlas).

-Proficiency in SQL, data pipeline orchestration, and integration technologies (e.g., Kafka, Talend, Airflow).

-Knowledge of regulatory requirements (Basel II/III, IFRS 9, BCBS 239, FATCA) and their data implications.

-Application Enhancement: Collaborate with business users to gather requirements for application enhancements and new features, and develop and implement solutions to meet business needs.

-Problem Management: Identify and address recurring issues, providing root cause analysis and permanent resolution.

-Documentation: Create and maintain comprehensive documentation for supported applications, including user manuals, technical guides, and knowledge base articles.

-Compliance: Adhere to policies and procedures related to IT security and compliance.

-User Management: User creation and knowledge of system integration with two-factor authentication.

-Provide business analysis and requirements-gathering support for end users.

Senior Data Architect

EGP120000 - EGP240000 Y Advansys

Posted today

Job Description

Job Description:

  • Contribute to overall modeling standards, guidelines, best practices, and approved modeling techniques and approaches.
  • Contribute to governing the creation of all physical and logical data models in the organization.
  • Participate in due diligence of new software purchases by reviewing all proposed data models contained in packaged or commercially available applications.
  • Serve on data integration, business intelligence and content management competency panels or teams.
  • Participate in all data integration and Enterprise Information Management (EIM) programs and projects — both enterprise and point-to-point efforts — by rationalizing data processing for reusable module development.
  • Work jointly with the data services administrator in developing the data objects and data models to support data services under a service-oriented architecture approach.
  • Govern the data administration team (modeling, metadata managers and end-user query optimization).
  • Liaise with the data modeling team, the data warehouse team, and the application owners to understand data interface requirements, and help the project team developers resolve issues related to data requirements. Ensure gatekeeper functions are effectively defined and managed to reduce data redundancy and increase accuracy.
  • Contribute to the selection of data management tools and the development of standards, usage guidelines, and procedures for those tools.
  • Document the existing data architecture in a prioritized manner in order to maintain an accurate 'as is' view of the current data environment and define the "to be" view.
  • Document the Data Flows / lineage visualizing the "as-is" data movement across key components of the Data landscape and define the "to-be" views.
  • Work & liaise with various technology and business stakeholders in resolving and escalating issues related with the data solutions and seek their concurrence and approval as per the established process during the SDLC until the solution is deployed.
  • Provide feedback and improvements over the standards, methodologies, processes and best practices to the enterprise architecture team to review and update enterprise architecture standards both at the group and country level.
  • Provide feedback on and improvements to the corporate metadata management facility.
  • Work with the Data Quality / Governance teams in understanding areas where data architecture enhancement will assist the overall quality of the Group's data.
  • Educate senior business representatives in best practices for data quality and to work with them to identify realistic goals and data initiatives.
Requirements

  • Bachelor's degree in Computer Science, Software Engineering, or a related field.
  • 5 years of experience in data architecture or data modeling.

Benefits

  • 50 percent of the salary in USD

  • Social and medical insurance

  • Transportation and transportation allowance

  • Internet package

Data Architect II

EGP120000 - EGP240000 Y Delivery Hero

Posted today

Job Description

Company Description

Since launching in Kuwait in 2004, talabat, the leading on-demand food and Q-commerce app for everyday deliveries, has been offering convenience and reliability to its customers. talabat's local roots run deep, offering a real understanding of the needs of the communities we serve in eight countries across the region.

We harness innovative technology and knowledge to simplify everyday life for our customers, optimize operations for our restaurants and local shops, and provide our riders with reliable earning opportunities daily.

Here at talabat, we are building a high-performance culture through an engaged workforce and growing talent density. We're all about keeping it real and making a difference. Our 6,000+ strong talabaty are on a mission to spread positive vibes. We are proud to be a multiple Great Place to Work award winner.

Job Description

As an Analytical Engineer, you will be a core member of the data architect team, responsible for transforming raw data into high-quality, actionable insights. Your mission is to build, own, and maintain the data products that power our analytics, reporting, and business decisions. You will work across the entire data lifecycle, from data ingestion to delivering final insights, ensuring data quality and governance are central to everything you build.

This role requires a unique blend of technical expertise in data modeling, data pipelines, and data quality, strong business acumen to understand complex problems, and exceptional communication skills to collaborate with diverse teams. You will act as the key link between our data science, engineering, product, and business teams, driving value by making data trustworthy and accessible.

What's On Your Plate?

As an Analytical Engineer and Data Architect, you will be responsible for:

1. Data Product Development & Ownership

  • Build and Maintain: Design, build, and maintain enterprise data warehouses and analytical data marts.
  • Translate Needs: Work closely with data scientists, product managers, and business teams to understand their problems and translate complex business requirements into robust data models and data products.
  • End-to-End Responsibility: Take full ownership of data products from concept to production, ensuring they are scalable, reliable, and well-documented.

2. Data Quality & Governance

  • Enhance Data Quality: Implement and enforce data quality and governance standards throughout the data pipeline, moving checks and enrichment upstream.
  • Root Cause Analysis: Proactively identify data quality and governance shortfalls, perform root cause analysis, and implement solutions with monitoring mechanisms.

3. Cross-Functional Collaboration

  • Collaborate: Partner with data engineers to ensure smooth data ingestion from source systems and with backend engineers to define and implement data contracts.
  • Consult: Serve as a subject matter expert for analytical and business teams, guiding them on best practices for data consumption and model design.

4. Data Pipeline & Modeling Expertise

  • Pipeline Management: Design and optimize data pipelines to support data transformation, data structures, metadata, and dependency management.
  • Modern Modeling: Apply a variety of data modeling techniques, including modern methods, to build efficient and flexible data solutions.

Qualifications

What Did We Order?

  • Experience: 3+ years of experience in data management, with a strong focus on analytical engineering.
  • Technical Skills: Advanced SQL knowledge and experience with relational and non-relational databases. Proficient in building and optimizing data pipelines and analytical data models.
  • Data Modeling: Solid experience with various data modeling techniques, including modern methodologies.
  • Center of Excellence: Build best practices and processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Problem-Solving: A strong problem-solver with a "figure it out" growth mindset and a "keep it simple" approach.
  • Collaboration: Proven ability to work effectively with cross-functional teams in a dynamic environment. Excellent collaborator and communicator who can translate business needs into technical solutions.
  • Ownership: A strong sense of ownership and accountability for data products and their impact.
  • Education: Bachelor's degree in engineering, computer science, technology, or a similar field.