Job ID: 27390

Title: Specialist, Software Engineering

Location: Melbourne, FL, US, 32904

Job Description:

We are seeking a skilled and innovative Software Engineer/Data Analyst to join our dynamic team. This role focuses on modernizing and migrating an existing PyQt-based analytics application into a high-performance web application built with modern web technologies, such as WebGL, to support mission-critical operations, including satellite data analysis. The successful candidate will leverage deep expertise in Python, database systems, JavaScript, and modern software engineering practices to build scalable systems that process large datasets, including time-series data, to identify patterns and trends. The role also involves algorithm development, geospatial data processing, and applying a basic understanding of spacecraft orbital mechanics to enhance platform capabilities. Collaboration with cross-functional teams is essential to deliver cutting-edge solutions in a fast-paced, high-stakes environment.

Essential Functions:

  • Lead the migration of a PyQt-based analytics platform to a web-based application using modern web frameworks and technologies like WebGL for enhanced visualization and performance.
  • Design and develop Python-based backend systems with integrated database solutions (e.g., PostgreSQL) to process millions of records daily, including large time-series datasets related to satellite operations.
  • Develop and optimize algorithms to identify patterns, trends, and anomalies in complex datasets, incorporating basic orbital mechanics principles for spacecraft data analysis.
  • Implement geospatial data processing capabilities to support location-based analytics and visualization in a web environment.
  • Implement comprehensive logging and data preservation mechanisms to ensure work persistence and enable continuous analysis across operational shifts.
  • Modernize legacy systems by replacing fragmented processes with unified, scalable web-based solutions that enhance data quality and operational efficiency.
  • Collaborate with analysts and stakeholders to define requirements, ensuring platforms support detailed follow-up work and historical pattern recognition.
  • Apply data analysis techniques, including numerical and array-based programming, to optimize data processing and visualization workflows.
  • Maintain and enhance platform performance, ensuring reliability and scalability in 24/7 operational environments.
  • Document system architecture, workflows, and processes using modern tools for clear communication with technical and non-technical audiences.

Qualifications:

  • Bachelor’s degree in Mathematics, Computer Science, Data Science, or a related field and a minimum of 4 years of prior related experience; or a Graduate degree and a minimum of 2 years of prior related experience; or, in lieu of a degree, a minimum of 8 years of prior software experience.
  • Top Secret or TS/SCI Clearance.
  • Experience in software engineering or data analysis, with a focus on processing large datasets.
  • Proficiency in Python and experience with data analysis libraries such as NumPy, Pandas, SciPy, and Plotly.
  • Experience with database systems, including PostgreSQL, and with designing and optimizing database schemas for large-scale data processing.
  • Experience with GUI development frameworks like PyQt and basic proficiency in JavaScript for web development.

Preferred Additional Skills:

  • Experience with web development technologies, including WebGL, HTML5, CSS, and modern JavaScript frameworks and libraries (e.g., React, Vue.js, or Three.js) for building interactive web applications.
  • Familiarity with migrating desktop applications (e.g., PyQt-based) to web-based platforms.
  • Experience with algorithm development for pattern recognition and trend analysis in time-series data.
  • Proficiency in geospatial data processing using tools like PostGIS or GeoPandas, with exposure to web-based geospatial visualization libraries (e.g., Leaflet, Mapbox).
  • Experience with additional programming languages such as Mojo, Rust, or C++.
  • Familiarity with cloud providers such as AWS and with modernizing applications for migration to the cloud.
  • Contributions to open-source projects or development of libraries, such as numerical or array-based programming libraries.
  • Familiarity with performance optimization techniques, such as SIMD vectorization or use of libraries like Numba.
  • Experience with other database systems, such as Oracle SQL or SQLite.
  • Proficiency in creating technical documentation or reports using modern documentation tools.
