
    Databricks Data Engineer (PySpark, Financial Datasets) – Sandton – R400 – R500 per hour

    Location: Sandton
    Type: Contract
    Reference: #NG60854
    Company: e-Merge IT Recruitment

    A cutting-edge technology company is transforming businesses across the Middle East, Asia Pacific, Africa, and the United States; its mission is to drive cloud innovation and build AI that benefits humanity. As a Databricks Developer, you’ll work with a team delivering advanced AI, data, and cloud solutions that empower the banking and financial sectors. You’ll leverage Databricks to design scalable data pipelines, optimize analytics, and unlock actionable insights that fuel digital transformation.

    Responsibilities:

    • ETL/ELT Development: Develop, test, and deploy robust and efficient data pipelines using PySpark/Scala and the Databricks platform (including Delta Lake and Databricks Workflows); a minimal example follows this list.

    • Data Transformation: Implement complex data transformation logic to clean, enrich, and aggregate financial data from various source systems (e.g., core banking, trading platforms).

    • Cloud Integration: Integrate Databricks with native cloud services (AWS, Azure, or GCP) for data ingestion (e.g., S3, ADLS) and workflow orchestration (e.g., Azure Data Factory, AWS Glue).

    • Quality and Testing: Write unit and integration tests for data pipelines and apply data quality checks to ensure accuracy in financial reporting and analysis.

    • Compliance Support: Apply basic security and access control policies, such as those governed by Unity Catalog, to adhere to the firm's compliance requirements.

    • Performance: Assist in monitoring and tuning Databricks cluster configurations and Spark job parameters to improve efficiency and reduce cost.
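    By way of illustration, a minimal PySpark pipeline of the kind described in the first responsibility might look like the sketch below. This is a sketch only, assuming a Databricks runtime where the Delta format is available; the storage path and the table and column names (transactions, curated.transactions, amount, trade_timestamp) are hypothetical.

    # A minimal ETL sketch: ingest raw financial data, clean it, and publish
    # it to a Delta Lake table. All paths and names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

    # Ingest raw records landed in cloud storage (e.g. S3 or ADLS).
    raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

    # Clean and enrich: drop malformed rows, derive a partition column.
    cleaned = (
        raw
        .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
        .withColumn("trade_date", F.to_date("trade_timestamp"))
    )

    # A simple data quality gate before publishing downstream.
    if cleaned.count() == 0:
        raise ValueError("No valid rows after cleaning; aborting load")

    # Write to a Delta Lake table for analytics and reporting.
    (
        cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("trade_date")
        .saveAsTable("curated.transactions")
    )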

    Qualifications and Experience:
    • Databricks Platform: Strong hands-on experience with Databricks notebooks, clusters, job scheduling, and understanding of the Delta Lake transaction log for reliability.
    • Programming: Proficient in Python (especially PySpark) and expert in SQL.
    • Big Data Concepts: Solid understanding of Apache Spark fundamentals (e.g., RDDs, DataFrames, lazy evaluation) and distributed computing.
    • Data Modeling: Practical experience with dimensional modeling (star/snowflake schemas) and implementing the Medallion Architecture (Bronze, Silver, Gold layers) on Delta Lake; a minimal example follows this list.
    • DevOps/DataOps: Familiarity with version control (Git) and experience with basic CI/CD processes for deploying Databricks code.
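    For illustration, a Medallion-style promotion from a Bronze (raw) table to a Silver (conformed) table on Delta Lake might look like the sketch below; the table and column names (bronze.card_payments, silver.card_payments, payment_id, amount) are hypothetical.

    # Medallion sketch: promote raw Bronze records to a deduplicated,
    # typed Silver table. Table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

    # Bronze: raw records as ingested, kept unmodified for auditability.
    bronze = spark.read.table("bronze.card_payments")

    # Silver: deduplicated, correctly typed, and filtered records.
    silver = (
        bronze
        .dropDuplicates(["payment_id"])
        .filter(F.col("payment_id").isNotNull())
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )

    # Overwrite keeps the Silver table consistent with the latest Bronze state.
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.card_payments")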

    The reference number for this position is NG60854. It is a contract role based in Sandton, offering a rate of R400 to R500 per hour, negotiable based on experience. Email Nokuthula at nokuthulag@e-merge.co.za or call her on 011 463 3633 to discuss this and other opportunities.

    Are you ready for a change of scenery? e-Merge IT Recruitment is a niche recruitment agency. We offer our candidates options so that we can successfully place the right people with the right companies, in the right roles. Check out the e-Merge IT website, www.e-merge.co.za, for more great positions.

    Posted on 31 Oct 15:32, Closing date 30 Dec

    Apply

    Nokuthula Gumbo
    nokuthulag@e-merge.co.za
    0114633633

    Or apply with your Biz CV

    Create your CV once, and thereafter you can apply to this ad and future job ads easily.
    e-Merge IT Recruitment
    e-Merge is a service-orientated boutique agency working in specific technology verticals. We only recruit within our specialised fields, assuring both client and candidate of expert attention, knowledge, and advice.