Kafka SME (Remote)

Clarksburg, WV
  • Job Category: Information Technology
  • Travel: Yes, 25% of the Time
  • Clearance: SECRET
  • Shift: Day Job
  • Req ID: TUV02811

Apply Now

Tuva, an Akima Company, provides turnkey solutions that offer better systems, tools, and methods to further your goals, reduce costs, and streamline operations. From requirements analysis, design, and implementation to operations and training, Tuva's personnel are well-versed in best practices across enterprise IT, mission support services, and specialized technologies. Our personnel enjoy competitive benefits packages and challenging roles in work environments committed to innovation, diversity, and opportunity for career growth. As an Alaska Native Corporation (ANC), 100% of our company's profits go back to our more than 14,000 Iñupiat shareholders who have resided near and above the Arctic Circle for more than 10,000 years. Our business helps support their way of life and contributes to the survival of a culture that has thrived in a challenging environment.

Job Responsibilities:

  • Provide expert cloud-based Kafka support necessary for successfully executing the migration of NCIC to the cloud. The NCIC is a computerized database of documented criminal justice information consisting of 21 files. All of the existing NCIC and new N3G functionality will be migrated to, and developed on, an approved public cloud, such as Amazon Web Services (AWS) GovCloud, requiring real-time communication and data exchange between the mainframe and the public cloud for a transitional period.

Minimum Qualifications:

  • 5 years of experience with Apache Kafka or Confluent Kafka.

  • 5 years of experience with data integrations.

  • Bachelor's degree.

  • Physical deployment of infrastructure across multiple environments.

  • Optimization and tuning.

  • Relational database replication.

  • Encryption in transit and at rest.

  • Data replication validation.

  • High availability for Kafka.

  • Disaster recovery for Kafka.

  • Standards and pattern development.

  • User guide development and training overviews for supporting teams.

  • Provide troubleshooting and best practices methodology for development teams.

  • Design monitoring solutions and baseline statistics reporting to support the implementation.

  • Ansible or Python scripting.

  • Linux/Red Hat operating system commands.

  • Experience with Jira, Git, Bitbucket, and Confluence.

  • Experience working with a highly technical team and in a dynamic, fast-paced environment.

  • Active Secret clearance with the ability to obtain a Top Secret clearance.

Desired Qualifications:

Preferred, but not required:

  • Experience where the relational data replication source is mainframe-based (e.g., Db2).

  • Experience with Elastic Stack.

  • Experience with Agile software development (SAFe, Scrum, Kanban).

  • Experience with DevOps/DevSecOps and CI/CD implementation.

  • Experience with automation tools such as Ansible.

  • Experience with other database systems and data models.

  • Experience with software engineering.

  • Experience with Python, JSON, or Java.

The duties and responsibilities listed in this job description generally cover the nature and level of work being performed by individuals assigned to this position. This is not intended to be a complete list of all duties, responsibilities, and skills required. Subject to the terms of an applicable collective bargaining agreement, the company management reserves the right to modify, add, or remove duties and to assign other duties as may be necessary. We wish to thank all applicants for their interest and effort in applying for the position; however, only candidates selected for interviews will be contacted.

Apply Now