Job Description
Description:
We are actively seeking a Data Engineer responsible for designing, building, and maintaining infrastructure that supports data storage, processing, and retrieval. The role involves working with large data sets and developing data pipelines to transfer data from source systems to data warehouses, data lakes, and other storage and processing systems. The Data Engineer will collaborate with stakeholders to address data-related technical issues and support their data infrastructure needs during the development, maintenance, and sustainment of the KR data architecture and data-driven solutions.
Note: Due to federal security clearance requirements, applicants must be U.S. citizens or permanent residents able to obtain a Secret security clearance.
This is a contract-to-hire position. Applicants should be willing to work on a W2 basis, with the potential to convert to full-time employment after the contract. Benefits include Medical, Dental, and Vision coverage, a 401(k) with company matching, and life insurance.
Rate: $80 - $86/hr W2
Responsibilities:
1. Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL.
2. Develop within AWS cloud services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena.
3. Coordinate with data owners to ensure source systems are properly configured for ingestion.
4. Document standard operating procedures (SOPs) for streaming, batch configuration, and API management.
5. Record the details of data ingestion activities so the team has a shared understanding of them.
6. Develop and uphold best practices in data engineering and analytics following Agile DevSecOps methodologies.
Experience Requirements:
* Strong analytical skills, including statistical analysis, data visualization, and machine learning techniques.
* Proficiency in programming languages such as Python, Java, Scala, and R.
* Experience building modern data pipelines and ETL processes with tools such as Apache Kafka and Apache NiFi.
* Experience managing or testing API gateway tools and REST APIs.
* Knowledge of traditional relational databases such as Oracle and MySQL, and of modern data management approaches such as Data Lake, Data Fabric, and Data Mesh.
* Experience creating DevSecOps pipelines with CI/CD tools such as GitLab.
* Excellent technical documentation and communication skills.
* Strong interpersonal skills and team collaboration experience.
* Proven customer service skills in demanding environments.
* Ability to communicate effectively across all organizational levels.
* Analytical, organizational, and problem-solving skills.
* Experience with data observability tools such as Grafana, Splunk, Amazon CloudWatch, and Kibana.
* Knowledge of container technologies such as Docker, Kubernetes, and Amazon EKS.
Education Requirements:
* Bachelor’s Degree in Computer Science, Engineering, or related field, or at least 8 years of equivalent work experience.
* 8+ years of IT data/system administration experience.
* AWS Cloud certifications are advantageous.