Senior Cloud Architect

Washington, D.C.

Full-Time

Position Description:

JPI is seeking a Senior Cloud Architect / Subject Matter Expert to support a big data initiative for a government client. This is a great opportunity to work on an enterprise-wide implementation of bleeding-edge technical solutions and be part of a high-energy team. The Senior Cloud Architect will leverage in-depth, hands-on experience and expertise across multiple big data, Cloud, and analytics solutions. The successful candidate will have demonstrated experience architecting and implementing enterprise solutions that leverage Cloud and Big Data technology in a Federal environment, and working closely with Government clients. In this role you will balance technical leadership with hands-on development and implementation to initiate, plan, and execute large-scale, highly technical, cross-functional data and analytics initiatives.

Applicants must possess a demonstrated history of working in the information technology and services industry with a wide variety of skill sets, including but not limited to:

  • Cloud Architecture
  • Big Data / Analytics Tools
  • Relational and non-relational/unstructured database solutions
  • IT Security
  • Software Development leveraging Agile methodologies

Responsibilities:

  • Lead a technical team to architect, design, prototype, implement, and optimize cloud-enabled big data solutions  
  • Architect, develop, implement, and test data processing pipelines, and data mining/data science algorithms on a variety of hosted settings
  • Assist customers with translating complex business analytics requirements into technical solutions and recommendations across diverse environments
  • Define and implement data ingestion and transformation methodologies, including between classified and unclassified sources
  • Participate in the design, implementation, and support of Big Data, Analytics, and Cloud solutions through all stages of the development lifecycle
  • Conduct regular peer code reviews to ensure code quality and compliance with industry best practices
  • Design, implement and optimize leading Big Data frameworks (Hadoop, Spark, SAP HANA) across hybrid hosting platforms (AWS, Azure, on-prem)
  • Review security requirements, analyze processes, and define security strategy to implement compliance and controls in line with organizational standards and industry best practices
  • Develop accreditation and security documentation, including Systems Security Plans (SSP) and Authorization to Operate (ATO) packages.
  • Provide thought leadership and innovation, with recommendations on emerging technologies and on optimizations and efficiencies across architecture, implementation, hosting, etc.
  • Lead the planning, development, and execution of data onboarding processing capabilities and services for diverse customers and data sets
  • Communicate results and educate others through the design and development of insightful visualizations, reports, and presentations

Requirements:

  • 15+ years of professional experience
  • At least 10 years of progressive experience in architecting, developing, and operating modular, efficient and scalable Cloud solutions
  • Experience architecting, implementing, and operating solutions across multiple Cloud Service Providers (AWS, Azure, Google), including a strong understanding of Cloud and distributed systems considerations (e.g., load balancing, scaling)
  • Fluency and demonstrated expertise across multiple programming languages, such as Python, Java, and C++, and the ability to pick up new languages and technologies quickly
  • Hands-on experience with data warehousing and business intelligence software including Cloudera and Pentaho
  • Extensive experience with Relational (Oracle, SQL Server) and non-relational/unstructured database solutions (HBase, Mongo)
  • Experience with automation and orchestration tools including Chef, Ansible, and Puppet
  • Extensive experience working within a Linux computing environment and use of command line tools including shell/Python scripting
  • Demonstrated success executing within an agile development team and familiarity with common software development tools (e.g., JIRA) and version control systems (e.g., Git)
  • Extensive experience with multiple large-scale, big data frameworks and tools including MapReduce, Hadoop, Spark, Hive, Impala, and Storm
  • Advanced knowledge of application, data and infrastructure architecture disciplines
  • Ability to manage complex engagements and interface with senior level management internally as well as with clients
  • Ability to lead client presentations and communicate complex technical concepts to non-technical audiences and identify and manage project interdependencies
  • Ability to interact with both business and technical stakeholders of clients to provide a sound technical solution
  • Bachelor of Science in Computer Science or a related field
  • Relevant technical certifications (e.g., AWS Certified Solutions Architect) a plus
  • Master’s degree a plus

JPI is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
