Qualifications
* Bachelor's or Master's degree in Computer Science, Mathematics, or a similar field (PhD preferred)
* 7+ years of experience working with large-scale data
* Strong knowledge of public cloud platforms (AWS)
* Experience with data acquisition (API calls/FTP downloads) and strong proficiency in ETL, transformation, and normalization processes
* Experience building components for enterprise data platforms (data warehouses, Operational Data Stores, API access layers, file extracts, user queries)
* Hands-on experience with SQL, Python, Spark, and Kafka
* Excellent communication skills, with proficiency in verbal and written English
About this Job
This role involves developing and maintaining real-time data processing pipelines for enterprise customer data. You will build scalable solutions for massive data sets that support 24x7 business operations and enable tools, dashboards, and data analysis for a range of stakeholders, including Fortune 100 clients. As a Data Architect, you will design, automate, and take the lead on big data challenges in an agile environment, ensuring data integrity and delivering insightful analytics.
About Scopeworker
Scopeworker is an enterprise SaaS platform that automates the Procure-Execute-Pay lifecycle for complex supplier services, creating a marketplace for enterprises and providing live business intelligence. It can function as a standalone platform or integrate with ERP systems such as Oracle, SAP, or Microsoft Dynamics. The platform is used by Fortune 100 companies; more information is available in our explainer video.