Cloud Software Engineer (TS/SCI with Poly)
Annapolis Junction, MD 
Job Description
The Cloud Software Engineer develops, maintains, and enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop ecosystem (including implementing Java applications), Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design, including hardware/software trade-offs, software reuse, use of Commercial Off-the-Shelf (COTS)/Government Off-the-Shelf (GOTS) products in place of new development, and requirements analysis and synthesis from the system level to individual software components.
Department
Federal
Employment Type
Permanent - Full Time
Location
Annapolis Junction
Workplace type
Onsite
What you'll be doing:
  • Provide Information Retrieval expertise, assisting the software development team in designing, developing, and testing Cloud Information Retrieval capabilities
  • Implement workflows that manage Cloud MapReduce analytics
  • Implement code that interacts with Cloud Distributed Coordination Frameworks
  • Oversee one or more software development tasks and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project
  • Make recommendations for improving documentation and software development process standards
What you'll need:
  • Software engineering experience in programs and contracts of similar scope, type, and complexity is required, two (2) years of which must be in programs utilizing Big-Data Cloud technologies and/or Distributed Computing.
  • Bachelor's degree in Computer Science or a related discipline from an accredited college or university is required. Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for a Bachelor's degree. A Master's degree in Computer Science or a related discipline from an accredited college or university may be substituted for two (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for one (1) year of Cloud experience.
  • The following Cloud related experiences are required:
    • Two (2) years of experience with Cloud and Distributed Computing Information Retrieval (IR).
    • One (1) year of experience with implementing code that interacts with an implementation of Cloud Big Table.
    • One (1) year of experience with implementing code that interacts with an implementation of Cloud Distributed File System.
    • One (1) year of experience with implementing complex MapReduce analytics.
    • One (1) year of experience with implementing code that interacts with Cloud Distributed Coordination Frameworks.
  • Experience with Computer Network Operations: Utility Computing, Network Management, Virtualization (VMware or VirtualBox), Cloud Computing.
  • Experience with Multi-Node Management and Installation: management and installation of Cloud and Distributed Computing on multiple nodes using Python, CFEngine, Bash, Ruby, or related technologies.
  • Experience with Information Assurance: securing Cloud-based and Distributed applications through industry-standard techniques such as firewalls, PKI certificates, and server authentication, with experience in corporate authentication service(s).
  • Experience with Information Technology:
    • Object-Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, RESTful web services.
    • Cloud and Distributed Computing Technologies: at least one or a combination of several of the following areas - YARN, J2EE, MapReduce, Zookeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation, and other applicable technologies.
    • Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB
    • Ingesting, Parsing and Analysis of Disparate Data-sources and formats: XML, JSON, CSV, Binary Formats, Sequence or Map Files, Avro and related technologies
    • Aspect Oriented Design and Development
    • Debugging and Profiling Cloud and Distributed Installations: Java Virtual Machine (JVM) memory management, Profiling Java Applications
    • UNIX/Linux, CentOS
  • Experience with SIGINT:
    • Experience with at least one SIGINT collection discipline area (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, or ELINT)
    • Geolocation, emitter identification, and signal applications.
    • Joint program collection platforms and dataflow architectures; signals characterization analysis
  • Experience with Other:
    • CentOS and Red Hat Linux
    • Configuration management tools such as Subversion, ClearQuest, or Razor
About Orbis Operations
Orbis Operations is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, disability, or protected veteran status.

Job Summary
Start Date
As soon as possible
Employment Term and Type
Regular, Full Time
Required Education
Bachelor's Degree
Required Experience
4 years