• Realize Configurable and Interoperable TT&C with Commercial Components

      Patel, Kirti; Space System/Loral (International Foundation for Telemetering, 1996-10)
      With explosive growth in the satellite communication market, there is an increasing need for satellite network service providers to support many satellites with common Telemetry, Tracking, and Commanding (TT&C) assets. Open bus technology and Commercial Off-The-Shelf (COTS) hardware and software components provide an opportunity to build common IF and baseband systems that will support many satellites with different frequencies and protocols. However, the high-frequency front-end components of the ground station, such as the antenna or HPA, cannot be common because of the differing gain and polarization requirements of the various communication bands and frequencies. The system architecture presented in this paper offers such a system, one that is interoperable and reconfigurable in near real time to support multiple frequencies and multiple communication protocols.
    • The Role of Standards in COTS Integration Projects

      Stottlemyer, Alan R.; Hassett, Kevin M. (International Foundation for Telemetering, 1996-10)
      We have long used standards to guide the development process of software systems. Standards such as POSIX, X-Windows, and SQL have become part of the language of software developers and have guided the coding of systems that are intended to be portable and interoperable. Standards also have a role to play in the integration of commercial off-the-shelf (COTS) products. At NASA's Goddard Space Flight Center, we have been participating in the Renaissance Team, a reengineering effort that has seen the focus shift from custom-built systems to the use of COTS to satisfy prime mission functions. As part of this effort, we developed a process that identified standards applicable to the evaluation and integration of products and assessed how those standards should be applied. Since the goal is to develop a set of standards that can be used to instantiate systems of differing sizes and capabilities, the standards selected have been broken into four areas: global integration standards, global development standards, mission development standards, and mission integration standards. Each area is less restrictive than the preceding one in the standards that are allowed. This paper describes the process that we used to select and categorize the standards to be applied to Renaissance systems.
    • Virtual Cables at the Nevada Test Site

      Khalsa, N. S.; Bechtel (International Foundation for Telemetering, 1996-10)
      Shrinking budgets and labor pools have impacted our ability to perform experiments at the Nevada Test Site (NTS) as we did previously. Specifically, we could no longer run heavy cables to remote data acquisition sites, so we replaced the cables with RF links that were transparent to the existing system, as well as being low-cost and easy to deploy. This paper details how we implemented the system using mostly commercial off-the-shelf components.

      Good, A. C.; Kim, H. W.; Polaha, J. H.; Reinders, R. D. (International Foundation for Telemetering, 1996-10)
      The constraints of the Midcourse Space Experiment (MSX) spacecraft that affect thermal and power management, finite onboard recording capability, and limited downlink opportunities establish significant bounds under which spacecraft operations and telemetering systems must operate. This paper reviews the MSX mission and data collection planning processes, commanding and execution procedures, data telemetering processes, and the overall impact of spacecraft constraints and downlink nodes on data collection and downlink activities.
    • Flight Test: In Search of Boring Data

      Hoaglund, Catharine M.; Gardner, Lee S.; Edwards Air Force Base (International Foundation for Telemetering, 1996-10)
      The challenge being faced today in the Department of Defense is to find ways to improve the systems acquisition process. One area needing improvement is the elimination of surprises from unexpected test data, which add cost and time to developing the system. This amounts to eliminating errors in all phases of a system's lifecycle. In a perfect world, the ideal systems acquisition process would result in a perfect system. Flawless testing of a perfect system would yield predicted test results 100% of the time. However, such close fidelity between predicted behavior and real behavior has never occurred. Until this ideal level of boredom in testing occurs, testing will remain a critical part of the acquisition process. Given the indispensability of testing, the goal of reducing the cost of flight tests is well worth pursuing. Reducing test cost equates to reducing open-air test hours, our most costly budget item. It also means planning, implementing, and controlling test cycles more efficiently. We are working on methods to set up test missions faster and to analyze, evaluate, and report on the test data more quickly, including unexpected results. This paper explores the moving focus concept, one method that shows promise in our pursuit of the goal of reducing test costs. The moving focus concept permits testers to change the data they collect and view during a test, interactively and in real time. This allows testers who are receiving unexpected test results to change measurement subsets and explore the problem or pursue other test scenarios.

      Griffin, Alan R.; McInerney, R. E.; McDonough, James K.; Babcock, Richard R. (International Foundation for Telemetering, 1996-10)
      The Midcourse Space Experiment (MSX) program is the premier space technology experiment of the Ballistic Missile Defense Organization (BMDO) that addresses BMDO system development requirements. The primary objective of the experiment is to collect and analyze data on target and background phenomenology using three multi-spectral (ultraviolet through infrared) imaging sensors. The program also has objectives for space-based space-object surveillance, assessing space contamination effects, and investigating atmospheric and space phenomenology. Effective scientific Data Management is one of the critical functions within the MSX program organization and is key to meeting the program objectives. The wide spectrum of objectives and requirements of the MSX program was a major driver in the design of a Data System with a heterogeneous, distributed processing center concept and a dual data flow path to meet sensor assessment and experiment analysis requirements. An important technology decision that evolved from this design was the exclusive use of workstation-class computers for data processing. A flexible, highly robust development and testing methodology was created to implement this unique system. Companion papers in this session provide detailed descriptions of the functions of key elements in the Data System operations.

      Padilla, Frank Jr; White Sands Missile Range (International Foundation for Telemetering, 1996-10)
      White Sands Missile Range (WSMR) is developing a new transportable telemetry system that consolidates various telemetry data collection functions currently performed by separate instrumentation. The new system will provide higher data-rate handling capability, reduced labor requirements, and more efficient operations support, resulting in lower mission support costs. Seven new systems are planned for procurement through Requirements Contracts. They will replace, on a one-for-one basis, current mobile systems that are over 25 years old. Regulation allows for a sixty-five percent overage on the contract, and WSMR plans to make this contract available for use by other Major Range Test Facility Bases (MRTFBs). Separate line items in the contracts make it possible to vary the design to meet a specific system configuration. This paper describes both the current and the replacement mobile telemetry systems.

      Hicks, William T.; Aydin Vector Division (International Foundation for Telemetering, 1996-10)
      The monitoring of multi-phase 400 Hz aircraft power includes monitoring the phase voltages, currents, real powers, and frequency. This paper describes the design of a multi-channel card that uses digital signal processing (DSP) to measure these parameters on a cycle-by-cycle basis. The card measures the average, peak, minimum-cycle, and maximum-cycle values of these parameters.
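      As a rough illustration of cycle-by-cycle measurement (a sketch only; the card's actual DSP algorithm is not described in the abstract), per-cycle RMS voltage, RMS current, real power, and frequency can be derived from the samples spanning one AC cycle:

```python
import math

def cycle_metrics(v, i, fs):
    """Per-cycle RMS voltage/current, real power, and frequency from the
    samples of one complete AC cycle. Illustrative only, not the card's
    actual firmware."""
    n = len(v)
    vrms = math.sqrt(sum(x * x for x in v) / n)
    irms = math.sqrt(sum(x * x for x in i) / n)
    p = sum(a * b for a, b in zip(v, i)) / n   # real (average) power
    freq = fs / n                              # n samples per cycle -> Hz
    return vrms, irms, p, freq

# One 400 Hz cycle sampled at 32 samples/cycle (fs = 12.8 kHz, assumed rates)
fs, n = 12800, 32
v = [115 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [2 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
vrms, irms, p, freq = cycle_metrics(v, i, fs)
# vrms ~ 115 V, irms ~ 2 A, p ~ 230 W, freq = 400.0 Hz
```

Tracking these per-cycle values over time then yields the average, peak, minimum-cycle, and maximum-cycle statistics the card reports.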
    • Ultraviolet and Visible Imaging and Spectrographic Imaging (UVISI) Data Processing Center (DPC)

      Eichert, James J.; Carbary, James F.; McKerracher, Priscilla L.; Suther, Lora L. (International Foundation for Telemetering, 1996-10)
      The nine sensors and one image processor of the Ultraviolet and Visible Imaging and Spectrographic Imaging (UVISI) instrument aboard the Midcourse Space Experiment (MSX) satellite can potentially generate up to three gigabytes of data per day. The UVISI Data Processing Center (DPC) must execute a multitude of complex processing functions in a 24-hour operational window, verify the UVISI data, and provide a compact, quantified record of the verification. The Center additionally must support higher-level data analysis functions. Data processing functions are divided into pipeline processing and data conversion processing. Pipeline processing, which consists of the main pipeline process, Pipeline, and several auxiliary processes, is responsible for generating Data Quality Indices (DQI) that summarize sensor performance and Data Measurement Indices (DMI) that summarize sensor measurements. Both sets of indices provide scientists and engineers with a compact, easily reviewed record of instrument performance. The conversion process, Convert, supports data analysis by converting raw telemetry into scientific/engineering units. On a pixel-by-pixel basis, Convert provides functions for dark-correction, flat-fielding, gain and gate adjustment, non-linearity correction, and count-to-photon conversion. Operating in conjunction with Convert, a pointing utility, Point, is used to determine the locations of selected objects in inertial space. The accomplishment of these myriad tasks relies on a state-of-the-art computer network using multiple workstations. Normal DPC operations are fully automated but remain flexible enough to allow prompt intervention by the UVISI Performance Assessment Team (PAT).
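      The shape of a per-pixel conversion chain like Convert's can be sketched as follows. The parameter names, values, and processing order here are illustrative assumptions, not the actual UVISI algorithm:

```python
def convert_pixel(raw, dark, flat, gain, k_photon):
    """Hypothetical per-pixel conversion in the spirit of Convert:
    dark-correct, flat-field, adjust for gain, and scale counts to
    photons. Names, order, and values are illustrative assumptions."""
    counts = (raw - dark) / flat   # dark-correction, then flat-fielding
    counts /= gain                 # gain/gate adjustment
    return counts * k_photon       # count-to-photon conversion

# (1200 - 200) / 0.8 = 1250; 1250 / 2.5 = 500; 500 * 3.0 = 1500.0
photons = convert_pixel(raw=1200.0, dark=200.0, flat=0.8, gain=2.5, k_photon=3.0)
```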
    • Flexible Intercom System Design for Telemetry Sites and Other Test Environments

      Bougan, Timothy B.; Science Applications International Corporation (International Foundation for Telemetering, 1996-10)
      Testing avionics and military equipment often requires extensive facilities and numerous operators working in concert. In many cases these facilities are mobile and can be set up at remote locations. In almost all situations the equipment is loud and makes communication between the operators difficult if not impossible. Furthermore, many sites must transmit, receive, relay, and record telemetry signals. To facilitate communication, most telemetry and test sites incorporate some form of intercom system. While intercom systems themselves are not a new concept and are available in many forms, finding one that meets the requirements of the test community (at a reasonable cost) can be a significant challenge. Specifically, the test director must often communicate with several manned stations, aircraft, and remote sites, and/or simultaneously record all or some of the audio traffic. Furthermore, it is often necessary to conference all or some of the channels (so that all those involved can fully follow the progress of the test). The needs can be so specialized that they often demand a very expensive "custom" solution. This paper describes the philosophy and design of a multi-channel intercom system specifically intended to support the needs of the telemetry and test community. It discusses in detail how to use state-of-the-art field programmable gate arrays, relatively inexpensive computers and digital signal processors, and other new technologies to design a fully digital, completely non-blocking intercom system. The system described is radically different from conventional designs but is much more cost effective (thanks to recent developments in programmable logic, microprocessor performance, and serial/digital technologies). This paper presents, as an example, the conception and design of an actual system purchased by the US government.
    • Midcourse Space Experiment Spacecraft and Ground Segment Telemetry Design and Implementation

      DeBoy, Christopher C.; Schwartz, Paul D.; Huebschman, Richard K.; The Johns Hopkins University (International Foundation for Telemetering, 1996-10)
      This paper reviews the performance requirements that provided the baseline for development of the onboard data system, RF transmission system, and ground segment receiving system of the Midcourse Space Experiment (MSX) spacecraft. The onboard Command and Data Handling (C&DH) System was designed to support the high data outputs of the three imaging sensor systems onboard the spacecraft and the requirement for large volumes of data storage. Because of the high data rates, it was necessary to construct a dedicated X-band ground receiver system at The Johns Hopkins University Applied Physics Laboratory (APL) and implement a tape recorder system for recording and downlinking sensor and spacecraft data. The system uses two onboard tape recorders to provide redundancy and backup capabilities. The storage capability of each tape recorder is 54 gigabits. The MSX C&DH System can record data at 25 Mbps or 5 Mbps. To meet the redundancy requirements of the high-priority experiments, the data can also be recorded in parallel on both tape recorders. To provide longer onboard recording, the data can also be recorded serially on the two recorders. The reproduce (playback) mode is at 25 Mbps. A unique requirement of the C&DH System is to multiplex and commutate the different output rates of the sensors and housekeeping signals into a common data stream for recording. The system also supports 1-Mbps real-time sensor data and 16-kbps real-time housekeeping data transmission to the dedicated ground site and through the U.S. Air Force Satellite Control Network ground stations. The primary ground receiving site for the telemetry is the MSX Tracking System (MTS) at APL. A dedicated 10-m X-band antenna is used to track the satellite during overhead passes and acquire the 25-Mbps telemetry downlinks, along with the 1-Mbps and 16-kbps real-time transmissions. 
This paper discusses some of the key technology trade-offs that were made in the design of the system to meet requirements for reliability, performance, and development schedule. It also presents some of the lessons learned during development and the impact these lessons will have on development of future systems.
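      A quick back-of-the-envelope check of the recording durations implied by the figures quoted above (54-gigabit capacity per recorder, 25 Mbps and 5 Mbps record rates):

```python
# Recording duration implied by the MSX figures quoted in the abstract:
# 54-gigabit capacity per tape recorder, recorded at 25 Mbps or 5 Mbps.
CAPACITY_BITS = 54e9

def record_minutes(rate_bps):
    """Minutes of recording before one recorder fills at the given rate."""
    return CAPACITY_BITS / rate_bps / 60

high_rate = record_minutes(25e6)  # 36 minutes at 25 Mbps
low_rate = record_minutes(5e6)    # 180 minutes (3 hours) at 5 Mbps
```

These durations show why serial recording on the two recorders is attractive when longer onboard coverage matters more than redundancy.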

      Guadiana, Juan M.; Rivera, Jesus; Jedlicka, Russel; White Sands Missile Range; New Mexico State University (International Foundation for Telemetering, 1996-10)
      The effects of multipath in telemetry applications are very well known, and the approaches to minimizing these effects are the subject of countless books, papers, and articles. Multipath once again rears its head as the U.S. Navy fields the MK-41 Vertical Launching System (VLS), a launching system in which each missile is housed in a canister that serves as both magazine and launch mechanism. The canister is designed to protect the missile from Electromagnetic Interference (EMI), Radio Frequency Interference (RFI), and the environment. As can be expected, a canister designed to prevent Radio Frequency (RF) energy from entering inherently prevents any RF from escaping, and this renders the canister environment ripe with multipath. Pre-launch telemetry checks, essential to the conduct of a missile flight test, become unreliable events that at times result in aborted missions. Today the “encanistered” missile system enjoys wide acceptance, in the U.S. as well as internationally. Since any missile radiating in a closed volume inherently suffers from these multipath degradations, it is important to disclose the results of Navy testing conducted on the canister as well as the mission observations of the multipath effects. The mission observations described are “signature” traits of degradations that should have been attributed to multipath. Clearly many missions and tests were affected, but most were simply ignored by an oblivious test team. A short summary of the canister multipath investigation follows, including unexpected findings, and finally a discussion is given of the Close Coupled Antenna and its effectiveness in mitigating the canister multipath.

      Wallace, Keith; McCleaf, Tim; Pham, Tri; Veda Incorporated; Wright-Patterson Air Force Base (International Foundation for Telemetering, 1996-10)
      A system was developed using capabilities from the Range Applications Joint Program Office (RAJPO) GPS tracking system and the ACMI Interface System (ACINTS) to provide tracking data and visual cues to experimenters. The Mobile Advanced Range Data System (ARDS) Control System (MACS) outputs are used to provide research data in support of advanced project studies. Enhanced from a previous system, the MACS expands system capabilities to allow researchers to locate where Digital Terrain Elevation Data (DTED) is available for incorporation into a reference database. The System Integration Group at Veda Incorporated has been supporting Wright Laboratories in the ground-based tracking and targeting arena since 1989 with the design, development, and integration of four generations of real-time, telemetry-based tracking aids. Commencing in Q3 1995, Veda began developing a mobile, transportable system based on the RAJPO GPS tracking system. The resulting system architecture takes advantage of the front end processor (FEP) used in the three previous generations of interface systems built for Wright Laboratories, thus maximizing hardware and software reuse. The FEP provides a computational interface between the GPS tracking system and the display (operator) system. The end product is a powerful, flexible, fully mobile testbed supporting RDT&E requirements for Wright Laboratories, as well as for other U.S. and foreign research organizations. The system is rapidly reconfigurable to accommodate ground-based tracking systems as well as GPS-based systems, and its capabilities can be extended to include support for mission planning tools, insertion of virtual participants such as DIS entities, and detailed post-mission analysis.
    • 8PSK Signaling Over Non-Linear Satellite Channels

      Caballero, Rubén; New Mexico State University (International Foundation for Telemetering, 1996-10)
      Space agencies are under pressure to utilize more bandwidth-efficient communication methods as the currently allocated frequency bands become more congested. Budget reductions are another problem that the space agencies must deal with. This budget constraint results in simpler spacecraft carrying less communication capability and in reduced staffing to capture data at the earth stations. It is therefore imperative that the most bandwidth-efficient communication methods be utilized. This paper gives the results of a computer simulation study of 8-Level Phase Shift Keying (8PSK) modulation with respect to bandwidth, power efficiency, spurious emissions, interference susceptibility, and the non-constant-envelope effect through a non-linear channel. The simulations were performed on a Signal Processing Worksystem (SPW), software installed on a SUN SPARC 10 Unix station and a Hewlett Packard Model 715/100 Unix station. This work was conducted at New Mexico State University (NMSU) in the Center for Space Telemetry and Telecommunications Systems in the Klipsch School of Electrical and Computer Engineering.
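      The bandwidth efficiency of 8PSK comes from carrying 3 bits in each symbol on a constant-amplitude constellation of 8 phases. A minimal symbol mapper can be sketched as below; the Gray-coded bit mapping is an illustrative assumption, not necessarily the mapping used in the NMSU simulation study:

```python
import cmath
import math

# Gray code order for the 8 phases, so adjacent phases differ in one bit
# (an assumed mapping, for illustration only).
GRAY = [0, 1, 3, 2, 6, 7, 5, 4]

def map_8psk(bits):
    """Map a bit sequence (length a multiple of 3) to unit-energy 8PSK
    symbols, 3 bits per symbol."""
    symbols = []
    for k in range(0, len(bits), 3):
        value = bits[k] * 4 + bits[k + 1] * 2 + bits[k + 2]
        phase = 2 * math.pi * GRAY.index(value) / 8
        symbols.append(cmath.exp(1j * phase))
    return symbols

syms = map_8psk([0, 0, 0, 0, 0, 1])  # two symbols from six bits
```

Because every symbol sits on the unit circle, an ideal 8PSK signal has a constant envelope; the non-constant-envelope effect studied in the paper arises once the signal is filtered and passed through a non-linear channel.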
    • Concurrent Telemetry Processing Techniques

      Clark, Jerry; Lockheed Martin Telemetry & Instrumentation (International Foundation for Telemetering, 1996-10)
      Improved processing techniques, particularly with respect to parallel computing, are the underlying focus in computer science, engineering, and industry today. Semiconductor technology is fast approaching device physical limitations. Further advances in computing performance in the near future will be realized by improved problem-solving approaches. An important issue in parallel processing is how to effectively utilize parallel computers. It is estimated that many modern supercomputers and parallel processors deliver only ten percent or less of their peak performance potential in a variety of applications. Yet, high performance is precisely why engineers build complex parallel machines. Cumulative performance losses occur due to mismatches between applications, software, and hardware. For instance, a communication system's network bandwidth may not correspond to the central processor speed or to module memory. Similarly, as Internet bandwidth is consumed by modern multimedia applications, network interconnection is becoming a major concern. Bottlenecks in a distributed environment are caused by network interconnections and can be minimized by intelligently assigning processing tasks to processing elements (PEs). Processing speeds are improved when architectures are customized for a given algorithm. Parallel processing techniques have been ineffective in most practical systems. The coupling of algorithms to architectures has generally been problematic and inefficient. Specific architectures have evolved to address the prospective processing improvements promised by parallel processing. Real performance gains will be realized when sequential algorithms are efficiently mapped to parallel architectures. 
This paper discusses one possible approach to improved algorithm/architecture symbiosis: transforming sequential algorithms to parallel representations using linear dependence vector mapping, then configuring the interconnection network of a systolic array accordingly.
    • Analysis of the Effects of Sampling Sampled Data

      Hicks, William T.; Drexel University (International Foundation for Telemetering, 1996-10)
      The traditional use of active RC-type filters as anti-aliasing filters in Pulse Code Modulation (PCM) systems is being replaced by the use of Digital Signal Processing (DSP) filters, especially when performance requirements are tight and when operation over a wide environmental temperature range is required. To keep systems more flexible, it is often desirable to let the DSP filters run asynchronously to the PCM sample clock. This results in the PCM output signal being a sampling of the output of the DSP, which is itself a sampling of the input signal. In the analysis of the PCM data, the signal will show a periodic repeat of a previous sample, or a missing sample, depending on the relative sampling rates of the DSP and the PCM. This paper analyzes what effects can be expected in the analysis of the PCM data when these anomalies are present. Results are presented that allow the telemetry engineer to make an effective value judgment based on the type of filtering technology to be employed and on the desired system performance.
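      The repeat/drop artifact described above can be reproduced with a small sketch (rates chosen for illustration only): at each PCM clock tick, the most recent DSP output sample is taken, so a slower PCM clock silently drops samples while a faster one repeats them.

```python
def resample(dsp_samples, f_dsp, f_pcm):
    """At each PCM tick, take the most recent DSP output sample
    (a simplified model of asynchronous DSP/PCM clocking)."""
    n_pcm = int(len(dsp_samples) * f_pcm / f_dsp)
    return [dsp_samples[int(k * f_dsp / f_pcm)] for k in range(n_pcm)]

dsp = list(range(10))  # stand-in for 10 consecutive DSP output samples
slow = resample(dsp, f_dsp=1000.0, f_pcm=900.0)   # PCM slower: a sample is dropped
fast = resample(dsp, f_dsp=1000.0, f_pcm=1100.0)  # PCM faster: a sample repeats
```

Over longer records the drop or repeat recurs periodically at the beat frequency of the two clocks, which is the anomaly the paper analyzes.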

      Schumacher, Gary A.; Terametrix Systems International, Inc. (International Foundation for Telemetering, 1996-10)
      PC-based instrumentation and telemetry processing systems are attractive because of their ease of use, familiarity, and affordability. The evolution of PC computing power has resulted in a telemetry processing system easily up to most tasks, even control of, and processing of data from, a system as complex as the Common Airborne Instrumentation System (CAIS) used on the new Lockheed-Martin F-22. A complete system including decommutators, bit synchronizers, IRIG time code readers, simulators, DACs, live video, and tape units for logging can be installed in a rackmount, desktop, or even portable enclosure. The PC/104 standard represents another step in the PC industry's evolution toward lower power consumption, smaller size, and greater capacity. The advent of this standard and the availability of processors and peripherals in this form factor have made possible the development of a new generation of portable, low-cost test equipment. This paper outlines the advantages and applications offered by a full-function, standalone, rugged, and portable instrumentation controller. Applications of this small (5.25"H x 8.0"W x 9.5"L) unit could include flight-line instrumentation check-out, onboard aircraft data monitoring, automotive testing, small craft testing, helicopter testing, and just about any other application where small size, affordability, and capability are required.
    • SPIRIT III Data Verification Processing

      Garlick, Dean; Wada, Glen; Krull, Pete (International Foundation for Telemetering, 1996-10)
      This paper will discuss the functions performed by the Spatial Infrared Imaging Telescope (SPIRIT) III Data Processing Center (DPC) at Utah State University (USU). The SPIRIT III sensor is the primary instrument on the Midcourse Space Experiment (MSX) satellite; and as builder of this sensor system, USU is responsible for developing and operating the associated DPC. The SPIRIT III sensor consists of a six-color long-wave infrared (LWIR) radiometer system, an LWIR spectrographic interferometer, contamination sensors, and housekeeping monitoring systems. The MSX spacecraft recorders can capture up to 8+ gigabytes of data a day from this sensor. The DPC is subsequently required to provide a 24-hour turnaround to verify and qualify these data by implementing a complex set of sensor and data verification and quality checks. This paper addresses the computing architecture, distributed processing software, and automated data verification processes implemented to meet these requirements.

      Knoebel, Robert; Berdugo, Albert; Aydin Vector Division (International Foundation for Telemetering, 1996-10)
      The Common Airborne Instrumentation System (CAIS) was developed under the auspices of the Department of Defense to promote standardization, commonality, and interoperability among flight test instrumentation. The central characteristic of CAIS is a common suite of equipment used across service boundaries and in many airframe and weapon systems. The CAIS system has many advanced capabilities which must be tested during ground support and system test. There is a need for a common set of low cost, highly capable ground support hardware and software tools to facilitate these tasks. The ground support system should combine commonly available PC-based telemetry tools with unique devices needed for CAIS applications (such as CAIS Bus Emulator, CAIS Hardware Simulator, etc.). An integrated software suite is imperative to support this equipment. A CAIS Ground Support Unit (GSU) has been developed to promote these CAIS goals. This paper presents the capabilities and features of a PC-based CAIS GSU, emphasizing those features that are unique to CAIS. Hardware tools developed to provide CAIS Bus Emulation and CAIS Hardware Simulation are also described.

      Mahon, John P. (International Foundation for Telemetering, 1996-10)
      This paper contains a description of a new technology tracking feed and a discussion of the features which make this feed unique and allow it to perform better than any other comparable feed. Also included in this report are measured primary antenna patterns, measured and estimated phase tracking performance and estimated aperture efficiency. The latter two items were calculated by integrating the measured primary patterns.