JPI is seeking a Senior Technical Project Manager to support a big data initiative for a government client. This is a great opportunity to work on an enterprise-wide implementation of bleeding-edge technical solutions and be part of a high-energy team. The Senior Technical Project Manager will leverage in-depth, hands-on experience and expertise across multiple big data, cloud, and analytics solutions. The successful candidate will have experience implementing enterprise solutions that leverage cloud and big data technology in a Federal environment and working closely with government clients. In this role, you will balance technical leadership with hands-on management of development and implementation as you initiate, plan, and execute large-scale, highly technical, cross-functional data and analytics initiatives.
Applicants must possess a demonstrated history of working in the information technology and services industry with a wide variety of skill sets, including but not limited to:
- Software Development leveraging Agile methodologies
- Big Data / Analytics Tools
- Relational and non-relational/unstructured database solutions
- Cloud Architecture
- IT Security
- Customer engagement and requirements analysis
Responsibilities include:
- Lead a technical team to architect, design, prototype, implement, and optimize cloud-enabled big data solutions
- Architect, develop, implement, and test data processing pipelines and data mining/data science algorithms in a variety of hosted settings
- Assist customers with translating complex business analytics requirements into technical solutions and recommendations across diverse environments
- Define and implement data ingestion and transformation methodologies, including between classified and unclassified sources
- Communicate results and educate others through design and development of insightful visualizations, reports, and presentations
- Participate in the design, implementation, and support of big data, analytics, and cloud solutions across all stages of the development lifecycle
- Conduct regular peer code reviews to ensure code quality and compliance with industry best practices
- Design, implement and optimize leading Big Data frameworks (Hadoop, Spark, SAP HANA) across hybrid hosting platforms (AWS, Azure, on-prem)
- Review security requirements, analyze processes, and define security strategy to implement compliance and controls in line with organizational standards and industry best practices
- Develop accreditation and security documentation, including Systems Security Plans (SSP) and Authorization to Operate (ATO) packages.
- Provide thought leadership and innovation, recommending emerging technologies and optimizations/efficiencies across architecture, implementation, hosting, and related areas
- Lead the planning, development, and execution of data onboarding processing capabilities and services for diverse customers and data sets
Required qualifications include:
- 15+ years of progressive experience architecting, developing, and operating modular, efficient, and scalable big data and analytics solutions
- Fluency and demonstrated expertise across multiple programming languages, such as Python, Java, and C++, and the ability to pick up new languages and technologies quickly
- At least 5 years of experience with distributed computing frameworks, specifically Hadoop 2.0+ (YARN) and associated tools including Avro, Flume, Oozie, Sqoop, Zookeeper, etc.
- Hands-on experience with Apache Hive and Apache Spark, including its components (Streaming, SQL, MLlib)
- Hands-on experience with data warehousing and business intelligence software including Cloudera and Pentaho
- Experience developing data visualizations leveraging Tableau
- Extensive experience with Relational (Oracle, SQL Server) and non-relational/unstructured database solutions (HBase, Mongo)
- Experience architecting, implementing, and operating solutions across multiple cloud service providers (AWS, Azure, Google), including a strong understanding of cloud and distributed systems considerations (e.g., load balancing and scaling)
- Experience with automation and orchestration tools including Chef, Ansible, and Puppet
- Extensive experience working within a Linux computing environment and use of command line tools including shell/Python scripting
- Demonstrated success executing within an agile development team and familiarity with common software development tools (e.g., JIRA) and version control systems (e.g., Git)
- Extensive experience with multiple large-scale, big data frameworks and tools including MapReduce, Hadoop, Spark, Hive, Impala, and Storm
- Advanced knowledge of application, data and infrastructure architecture disciplines
- Ability to manage complex engagements and interface with senior level management internally as well as with clients
- Ability to lead client presentations, communicate complex technical concepts to non-technical audiences, and identify and manage project interdependencies
- Ability to interact with both business and technical stakeholders of clients to provide a sound technical solution
JPI is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.