Nötzel, Klaus R.; Deutsche Telekom AG (International Foundation for Telemetering, 1996-10)
      Deutsche Telekom has been operating various communication satellites for several years. The Satellite Control Center (SCC) of Deutsche Telekom is located near Usingen, about 50 km northwest of Frankfurt/Main. The system has been in operation since the launch of the first flight model, DFS, in June 1989. The entire computer system was based on Digital Equipment Corporation (DEC) VAX type computers. The maintenance costs of these old Complex Instruction Set Computers (CISC) have increased significantly in recent years. Because of the high operational costs, Deutsche Telekom decided to replace the operational computer system. The present-day information technology world increasingly uses powerful Reduced Instruction Set Computers (RISC). These new designs allow operational costs to be reduced appreciably. The VAX type computers will be replaced by DEC Alpha AXP computers. This paper describes the transition process from CISC to RISC computers in an operational real-time environment.
    • Predicting Failures and Estimating Duration of Remaining Service Life from Satellite Telemetry

      Losik, Len; Wahl, Sheila; Owen, Lewis; Lockheed Martin Telemetry & Instrumentation; Lockheed Martin Advanced Technology Center (International Foundation for Telemetering, 1996-10)
      This paper addresses research completed for predicting hardware failures and estimating remaining service life for satellite components using a Failure Prediction Process (FPP). It is a joint paper, presenting initial research completed at the University of California, Berkeley, Center for Extreme Ultraviolet (EUV) Astrophysics using telemetry from the EUV EXPLORER (EUVE) satellite and statistical computation analysis completed by Lockheed Martin. This work was used in identifying suspect "failure precursors." Lockheed Martin completed an exploration into the application of statistical pattern recognition methods to identify FPP events observed visually by the human expert. Both visual and statistical methods were successful in detecting suspect failure precursors. An estimate for remaining service life for each unit was made from the time the suspect failure precursor was identified. It was compared with the actual time the equipment remained operable. The long-term objective of this research is to develop a resident software module which can provide information on FPP events automatically, economically, and with high reliability for long-term management of spacecraft, aircraft, and ground equipment. Based on the detection of a Failure Prediction Process event, an estimate of remaining service life for the unit can be calculated and used as a basis to manage the failure.
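      The abstract does not specify the statistical pattern recognition method used; purely as an illustrative sketch (not code from the paper, with hypothetical names and thresholds), a rolling z-score detector is one simple way to flag suspect failure-precursor events in a telemetry channel:

```python
import statistics

def find_precursors(telemetry, window=20, threshold=4.0):
    """Flag samples that deviate strongly from the recent baseline.

    A rolling z-score detector: a sample more than `threshold`
    standard deviations away from the trailing window's mean is
    flagged as a suspect failure precursor.
    """
    events = []
    for k in range(window, len(telemetry)):
        baseline = telemetry[k - window:k]
        mu = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        if sigma > 0 and abs(telemetry[k] - mu) / sigma > threshold:
            events.append(k)
    return events

# A stable channel with a small ripple, then a step anomaly at index 50.
channel = [10.0 + 0.01 * (k % 3) for k in range(50)] + [12.0] * 10
print(find_precursors(channel))  # the step at index 50 is flagged first
```

      Once such an event is flagged, the remaining-service-life estimate described in the abstract would start its clock from the flagged index.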

      Padilla, Frank Jr; White Sands Missile Range (International Foundation for Telemetering, 1996-10)
      White Sands Missile Range (WSMR) is developing a new transportable telemetry system that consolidates various telemetry data collection functions currently being performed by separate instrumentation. The new system will provide higher data rate handling capability, reduced labor requirements, and more efficient operations support, which will result in a reduction of mission support costs. Seven new systems are planned for procurement through Requirements Contracts. They will replace, on a one-for-one basis, current mobile systems which are over 25 years old. Regulation allows for a sixty-five percent overage on the contract, and WSMR plans to make this contract available for use by other Major Range Test Facility Bases (MRTFBs). Separate line items in the contracts make it possible to vary the design to meet a specific system configuration. This paper describes both the current and the replacement mobile telemetry systems.

      Hicks, William T.; Aydin Vector Division (International Foundation for Telemetering, 1996-10)
      The monitoring of multi-phase 400 Hz aircraft power includes monitoring the phase voltages, currents, real powers, and frequency. This paper describes the design of a multi-channel card that uses digital signal processing (DSP) to measure these parameters on a cycle-by-cycle basis. The card measures the average, peak, minimum cycle, and maximum cycle values of these parameters.
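      As an illustrative sketch only (not taken from the paper; sample rate and signal values are assumed), the cycle-by-cycle computation of RMS voltage, RMS current, real power, and frequency from one cycle of samples can be written as:

```python
import math

def cycle_parameters(v, i, fs):
    """Compute per-cycle parameters from one cycle of samples.

    v, i : equal-length sample lists covering exactly one cycle
    fs   : sample rate in Hz
    Returns (Vrms, Irms, real power, frequency).
    """
    n = len(v)
    vrms = math.sqrt(sum(x * x for x in v) / n)
    irms = math.sqrt(sum(x * x for x in i) / n)
    p = sum(a * b for a, b in zip(v, i)) / n    # average instantaneous power
    freq = fs / n                               # one cycle spans n samples
    return vrms, irms, p, freq

# One 400 Hz cycle sampled at 25.6 kHz (64 samples per cycle),
# with the current lagging the voltage by 30 degrees.
fs = 25600
n = 64
v = [115 * math.sqrt(2) * math.sin(2 * math.pi * k / n) for k in range(n)]
i = [5 * math.sqrt(2) * math.sin(2 * math.pi * k / n - math.pi / 6)
     for k in range(n)]
vrms, irms, p, freq = cycle_parameters(v, i, fs)
print(round(vrms, 1), round(irms, 2), round(p, 1), freq)  # → 115.0 5.0 498.0 400.0
```

      The real power comes out below the 575 VA apparent power because of the 30-degree phase lag, which is exactly the kind of per-cycle parameter the card reports.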
    • Mission Analysis and Reporting System (MARS) - EW Analysis and Reporting On A Personal Computer

      Burton, Ken; Eglin Air Force Base (International Foundation for Telemetering, 1996-10)
      In response to the need to analyze and report upon Electronic Warfare (EW) test data results in a comprehensive and uniform manner, the Mission Analysis and Reporting System (MARS) has been developed. MARS is a government owned PC based Windows application designed for rapid analysis and reporting upon EW test mission data. MARS currently performs Jammer Effectiveness ( Reduction In Lethality, Increase In Survivability, Reduction In Shot, and Reduction In Hit), Radar Warning Receiver (RWR) System performance (Threat ID, Response Time/Ageout, and Direction Finding (DF) Accuracy), and Tracking Error Statistics. Additionally, MARS produces several graphical outputs including polar plotting, dynamic strip charting, Cumulative Distribution Functions (CDF), and RWR Simulated Scope. Continual development and maintenance of MARS at the Air Force Development Test Center, Eglin Air Force Base, Florida, has provided a proven product used by numerous DT&E and OT&E test projects over the last four years.
    • Applications of a Telemetry Signal Simulator

      O’Cull, Douglas; Microdyne Corporation (International Foundation for Telemetering, 1996-10)
      This paper will discuss the use of a specialized telemetry signal simulator for pre-mission verification of a telemetry receiving system. This will include how to configure tests that will determine system performance under “real time” conditions such as multipath fading and Doppler shifting. The paper will analyze a telemetry receiving system and define tests for each part of the system. This will include tests for verification of the antenna system. Also included will be tests for verification of the receiver/combiner system. The paper will further discuss how adding PCM simulation capabilities to the signal simulator will allow testing of frame synchronizers and decommutation equipment.
    • FX+ Storage and Exchange Structure of Multiplexed Data for Off-line Operations

      Becue, Alain; DASSAULT AVIATION (International Foundation for Telemetering, 1996-10)
      With the technological evolution of flying equipment, computing power, and storage capacity, we need a new view of the methods for acquiring, storing, and archiving data.

      Harvey, Raymond J.; Baer, Glen E. (International Foundation for Telemetering, 1996-10)
      The Mission Operations Center (MOC) at APL is the first processing link in the MSX data system. Two key components of the MOC that play a role in the telemetry acquisition and processing functions are the Mission Control Center (MCC) and the Mission Processing Center (MPC). This paper will present a summary of the telemetry acquisition and data processing structure built to handle the high volume of MSX data and the unique hardware and software systems to perform these functions. The primary responsibility of the MCC is to maintain the health and safety of the MSX spacecraft. This is accomplished by communicating with the spacecraft through the APL stations and the AFSCN. The MCC receives the spacecraft housekeeping 16 Kb telemetry stream and commands the spacecraft via the 2K command link. Due to the complexity of the spacecraft, various analysis tools exist to evaluate the spacecraft health and to generate commands for controlling the spacecraft. The primary responsibility of the MPC is the initial processing of the 1 Mb and 25 Mb spacecraft science telemetry streams. The science data is recorded in a raw format, both analog and digital, and a digital 8 mm tape format, the Level 1A tape, which serves the MSX program as the transport media and format for science data dissemination. The MPC also collects downlink data from the MCC and planning products from the Operations Planning Center for inclusion on the Level 1A tape to enable the MSX data community to analyze the data. This data is sent electronically to the MPC via a LAN. One of the key products provided on the Level 1A tape from the MCC is a measure of the spacecraft clock against time standards. The MPC consists of a hardware front end for the capture and formatting of the science data and a computer system for the processing of the formatted science data to produce Level 1A tapes. The hardware front end includes wideband analog recorders, decryption devices, data selectors, bit synchronizers, and frame synchronizers.
One of the unique features of the 25 Mb telemetry stream is that it is transmitted to the ground in the reverse direction. The MPC must then reverse the data again, which is accomplished via analog recorders, in order to perform further processing. The computer system consists of three VAX 4000 computers with 107 Gb of disk space and twelve 8 mm tape drives. One VAX is tasked with reading the 25 Mb telemetry onto disk. The second VAX reads the 1 Mb telemetry onto disk and produces a digital 8 mm tape of the raw data. The third VAX is tasked with processing the data and writing the Level 1A tapes. The system architecture is such that while today's data is being downlinked, yesterday's data is being processed and written to Level 1A tapes. Custom software was developed to perform the processing and data management within the MPC.

      Stokes, Grant H.; Viggh, Herbert E.M.; Pollock, J. Kent (International Foundation for Telemetering, 1996-10)
      This paper discusses the telemetry processing and data verification performed by the SBV Processing, Operations and Control Center (SPOCC) located at MIT Lincoln Laboratory (MIT LL). The SPOCC is unique among the Midcourse Space Experiment (MSX) Data Processing Centers because it supports operational demonstrations of the SBV sensor for Space-Based Space Surveillance applications. The surveillance experiment objectives focus on tracking of resident space objects (RSOs), including acquisition of newly launched satellites. Since Space Surveillance operations have fundamentally short timelines, the SPOCC must be deeply involved in the mission planning for the series of observations and must receive and process the resulting data quickly. In order to achieve these objectives, the MSX Concept of Operations (CONOPS) has been developed to include the SPOCC in the operations planning process. The SPOCC is responsible for generating all MSX spacecraft command information required to execute space surveillance events using the MSX. This operating agreement and a highly automated planning system at the SPOCC allow the planning timeline objectives to be met. In addition, the Space Surveillance experiment scenarios call for active use of the 1 Mbps real-time link to transmit processed target tracks from the SBV to the SPOCC for processing, and for short-timeline response of the SPOCC to process the track of the new object and produce new commands for the MSX spacecraft, or other space surveillance sensors, to re-acquire the object. To accomplish this, surveillance data processed and stored onboard the SBV is transmitted to the APL Mission Processing Center via 1 Mbps contacts with the dedicated Applied Physics Laboratory (APL) station, or via one of the AFSCN RTS locations, which forwards the telemetry in real-time to APL.
The Mission Processing facility at APL automatically processes the MSX telemetry to extract the SBV allocation and forwards the data via file transfer over a dedicated fractional T1 link to the SPOCC. The data arriving at the SPOCC is automatically identified and processed to yield calibrated metric observations of RSOs. These results are then fed forward into the mission planning process for follow-up observations. In addition to the experiment support discussed above, the SPOCC monitors and stores SBV housekeeping data, monitors payload health and status, and supports diagnosis and correction. There are also software tools which support the assessment of the results of surveillance experiments and produce a number of products used by the SBV instrument team to assess the overall performance characteristics of the SBV instrument.
    • International Telemetering Conference Proceedings, Volume 32 (1996)

      International Foundation for Telemetering, 1996-10

      Schumacher, Gary A.; Terametrix Systems International, Inc. (International Foundation for Telemetering, 1996-10)
      PC based instrumentation and telemetry processing systems are attractive because of their ease of use, familiarity, and affordability. The evolution of PC computing power has resulted in a telemetry processing system easily up to most tasks, even for control of and processing of data from a very complex system such as the Common Airborne Instrumentation System (CAIS) used on the new Lockheed-Martin F-22. A complete system including decommutators, bit synchronizers, IRIG time code readers, simulators, DACs, live video, and tape units for logging can be installed in a rackmount, desktop, or even portable enclosure. The PC/104 standard represents another step forward in the PC industry evolution towards the goals of lower power consumption, smaller size, and greater capacity. The advent of this standard and the availability of processors and peripherals in this form factor has made possible the development of a new generation of portable low cost test equipment. This paper will outline the advantages and applications offered by a full-function, standalone, rugged, and portable instrumentation controller. Applications of this small (5.25"H x 8.0"W x 9.5"L) unit could include: flight line instrumentation check-out, onboard aircraft data monitoring, automotive testing, small craft testing, helicopter testing, and just about any other application where small-size, affordability, and capability are required.
    • 8PSK Signaling Over Non-Linear Satellite Channels

      Caballero, Rubén; New Mexico State University (International Foundation for Telemetering, 1996-10)
      Space agencies are under pressure to use more bandwidth-efficient communication methods because the currently allocated frequency bands are becoming more congested. Budget reductions are another problem that the space agencies must deal with. These budget constraints result in simpler spacecraft carrying fewer communication capabilities, and in reduced staff to capture data at the earth stations. It is therefore imperative that the most bandwidth-efficient communication methods be utilized. This paper gives the results of a computer simulation study of 8-Level Phase Shift Keying (8PSK) modulation with respect to bandwidth, power efficiency, spurious emissions, interference susceptibility, and the non-constant envelope effect through a non-linear channel. The simulations were performed on a Signal Processing Worksystem (SPW: software installed on a SUN SPARC 10 Unix station and a Hewlett Packard Model 715/100 Unix station). This work was conducted at New Mexico State University (NMSU) in the Center for Space Telemetry and Telecommunications Systems in the Klipsch School of Electrical and Computer Engineering.
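      As a hedged illustration of the modulation under study (not code from the paper; the bit-to-symbol mapping shown is one common Gray-coded convention), an 8PSK mapper makes visible why the ideal transmitted symbols have a constant envelope, which filtering followed by a non-linear channel then disturbs:

```python
import cmath
import math

# Gray-coded 8PSK: adjacent constellation points differ in one bit,
# so the most likely symbol errors cause only single bit errors.
GRAY = [0b000, 0b001, 0b011, 0b010, 0b110, 0b111, 0b101, 0b100]

def modulate(bits):
    """Map a bit string (length a multiple of 3) to unit-energy 8PSK symbols."""
    symbols = []
    for k in range(0, len(bits), 3):
        tribit = int(bits[k:k + 3], 2)
        phase_index = GRAY.index(tribit)       # position on the circle
        symbols.append(cmath.exp(1j * 2 * math.pi * phase_index / 8))
    return symbols

syms = modulate("000001011")
# Every ideal 8PSK symbol lies on the unit circle: constant envelope.
# Pulse-shaping filters re-introduce envelope variation, which is what
# makes the behavior through a non-linear amplifier worth simulating.
print([round(abs(s), 6) for s in syms])  # → [1.0, 1.0, 1.0]
```

      With 3 bits per symbol, 8PSK uses bandwidth more efficiently than BPSK or QPSK, at the cost of reduced noise margin between phases.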
    • Concurrent Telemetry Processing Techniques

      Clark, Jerry; Lockheed Martin Telemetry & Instrumentation (International Foundation for Telemetering, 1996-10)
      Improved processing techniques, particularly with respect to parallel computing, are the underlying focus in computer science, engineering, and industry today. Semiconductor technology is fast approaching device physical limitations. Further advances in computing performance in the near future will be realized by improved problem-solving approaches. An important issue in parallel processing is how to effectively utilize parallel computers. It is estimated that many modern supercomputers and parallel processors deliver only ten percent or less of their peak performance potential in a variety of applications. Yet, high performance is precisely why engineers build complex parallel machines. Cumulative performance losses occur due to mismatches between applications, software, and hardware. For instance, a communication system's network bandwidth may not correspond to the central processor speed or to module memory. Similarly, as Internet bandwidth is consumed by modern multimedia applications, network interconnection is becoming a major concern. Bottlenecks in a distributed environment are caused by network interconnections and can be minimized by intelligently assigning processing tasks to processing elements (PEs). Processing speeds are improved when architectures are customized for a given algorithm. Parallel processing techniques have been ineffective in most practical systems. The coupling of algorithms to architectures has generally been problematic and inefficient. Specific architectures have evolved to address the prospective processing improvements promised by parallel processing. Real performance gains will be realized when sequential algorithms are efficiently mapped to parallel architectures. 
Transforming sequential algorithms to parallel representations utilizing linear dependence vector mapping and subsequently configuring the interconnection network of a systolic array will be discussed in this paper as one possible approach for improved algorithm/architecture symbiosis.
    • Analysis of the Effects of Sampling Sampled Data

      Hicks, William T.; Drexel University (International Foundation for Telemetering, 1996-10)
      The traditional use of active RC-type filters as anti-aliasing filters in Pulse Code Modulation (PCM) systems is being replaced by the use of Digital Signal Processing (DSP) filters, especially when performance requirements are tight and when operation over a wide environmental temperature range is required. In order to keep systems more flexible, it is often desired to let the DSP filters run asynchronous to the PCM sample clock. This results in the PCM output signal being a sampling of the output of the DSP, which is itself a sampling of the input signal. In the analysis of the PCM data, the signal will have a periodic repeat of a previous sample, or a missing sample, depending on the relative sampling rates of the DSP and the PCM. This paper analyzes what effects can be expected in the analysis of the PCM data when these anomalies are present. Results are presented which allow the telemetry engineer to make an effective value judgment based on the type of filtering technology to be employed and on the desired system performance.
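      The repeated-sample/missing-sample effect described above can be sketched with a toy model (the rates and sample values here are assumed for illustration, not taken from the paper):

```python
# Resampling an already-sampled stream: the PCM encoder, running
# asynchronously, latches whichever DSP output sample is current.
# When the rates differ slightly, some DSP samples are read twice
# (PCM faster than DSP) or skipped entirely (PCM slower than DSP).

def resample(dsp_samples, rate_dsp, rate_pcm, n_out):
    out = []
    for k in range(n_out):
        t = k / rate_pcm                        # PCM sample instant
        idx = min(int(t * rate_dsp), len(dsp_samples) - 1)
        out.append(dsp_samples[idx])            # latch the latest DSP sample
    return out

dsp = list(range(20))                           # DSP outputs 0, 1, 2, ...
# PCM clock 10% faster than the DSP clock: periodic sample repeats.
pcm = resample(dsp, rate_dsp=1000, rate_pcm=1100, n_out=12)
print(pcm)  # → [0, 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

      The duplicated sample (here, the repeated 0) is exactly the periodic anomaly whose spectral effect on the recovered data the paper analyzes.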
    • Digital Video Telemetry System

      Thom, Gary A.; Snyder, Edwin; Delta Information Systems; Aydin Vector (International Foundation for Telemetering, 1996-10)
      The ability to acquire real-time video from flight test platforms is becoming an important requirement in many test programs. Video is often required to give the flight test engineers a view of critical events during a test such as instrumentation performance or weapons separation. Digital video systems are required because they allow encryption of the video information during transmission. This paper describes a Digital Video Telemetry System that uses improved video compression techniques which typically offer at least a 10:1 improvement in image quality over currently used techniques. This improvement is the result of inter-frame coding and motion compensation which other systems do not use. Better quality video at the same bit rate, or the same quality video at a lower bit rate is achieved. The Digital Video Telemetry System also provides for multiplexing the video information with other telemetered data prior to encryption.

      Orsino, Mary Ellen; Williams, Michael; Avtec Systems, Inc. (International Foundation for Telemetering, 1996-10)
      Satellite Control Systems require a front-end component which performs real-time telemetry acquisition and command output. This paper will describe a fully networked, PC-based telemetry and command front-end which supports multiple streams and is based on Commercial Off The Shelf (COTS) technology. The front-end system is a gateway that accepts multiple telemetry streams and outputs time-tagged frame or packet data over a network to workstations in a distributed satellite control and analysis system. The system also includes a command gateway that accepts input from a command processor and outputs serial commands to the uplink. The front-end can be controlled locally or remotely via the network using Simple Network Management Protocol. Key elements of the front-end system are the Avtec MONARCH-E™ PCI-based CCSDS/TDM Telemetry Processor/Simulator board, a network-based, distributed computing architecture, and the Windows NT operating system. The PC-based telemetry and command gateway is useful throughout the lifecycle of a satellite system. During development, integration, and test, the front-end system can be used as a test tool in a distributed test environment. During operations, the system is installed at remote ground stations, with network links back to operations center(s) for telemetry and command processing and analysis.

      Johnston, Jerry W.; LaPoint, Steve; TYBRIN Corporation; U.S. Army Kwajalein Atoll; Kwajalein Missile Range (International Foundation for Telemetering, 1996-10)
      This paper presents the interim results of an effort to corroborate analytic model predictions of the effects of rocket motor plume on telemetry signal RF propagation. When space is available, telemetry receiving stations are purposely positioned to be outside the region of a rocket motor's plume interaction with the RF path; therefore, little historical data has been available to corroborate model predictions for specific rocket motor types and altitudes. RF signal strength data was collected during the flight of a HERA target missile by White Sands Missile Range (WSMR) using a transportable telemetry receiving site specifically positioned to be within the rocket plume region of influence at intermediate altitudes. The collected data was analyzed and compared to an RF plume attenuation model developed for pre-mission predictions. This work was directed by the US Army Kwajalein Atoll (USAKA)/Kwajalein Missile Range (KMR) Safety Division.

      Wentai, Feng; Biao, Li; Xinan Electronic Engineering Institute (International Foundation for Telemetering, 1996-10)
      It is well known that a pulse telemetering system, whose equipment is simple, is superior to a continuous one in utilizing signal power. In designing a pulse telemetering receiver, however, the frequency-shift problem is often encountered: the shift, often much wider than the signal bandwidth, is very unfavorable for receiver working sensitivity. The problem can be solved either by strictly limiting transmitter frequency stability or by adopting an AFC system in the receiver to track the carrier; the AFC method improves receiver performance, but the equipment is complicated. The emphasis of this paper is on how much the receiver working sensitivity is affected, and how to judge that effect, when a video-frequency matched filter is adopted and the RF front end is wideband. The power density spectrum of white noise that has passed through a non-linear system (the linear detector) is analyzed theoretically, and the improved working sensitivity of the receiver with a video matched filter, and its difference from that of the optimal receiver, are derived. Measured working sensitivity data for two kinds of pulse receivers with different RF bandwidths are given, and the theoretical calculations conform well with these data, proving that adopting a video matched filter in a pulse receiver is an effective approach for compensating the drop in receiver working sensitivity caused by the increase in RF bandwidth.
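      As an illustrative sketch of the video matched filter idea (assuming a rectangular pulse; the code and parameters are not from the paper), the matched filter for a rectangular pulse reduces to a moving average over the pulse duration:

```python
import random

# For a rectangular pulse in white noise, the video matched filter is a
# moving average over the pulse duration: it maximizes output SNR at the
# decision instant, recovering sensitivity lost to a wide RF front end.

def matched_filter(samples, pulse_len):
    """Moving-average (matched) filter for a length-`pulse_len` rectangular pulse."""
    return [sum(samples[k:k + pulse_len]) / pulse_len
            for k in range(len(samples) - pulse_len + 1)]

pulse_len = 16
signal = [0.0] * 40 + [1.0] * pulse_len + [0.0] * 44   # pulse starts at index 40

# Noise-free: the filter output peaks exactly where the pulse aligns.
clean = matched_filter(signal, pulse_len)
print(clean.index(max(clean)))  # → 40

# With noise, averaging pulse_len samples cuts the noise power by a
# factor of pulse_len, i.e. a sqrt(16) = 4x amplitude SNR gain here.
random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(100)]
filtered = matched_filter(noise, pulse_len)
raw_power = sum(x * x for x in noise) / len(noise)
filtered_power = sum(x * x for x in filtered) / len(filtered)
print(filtered_power < raw_power)  # → True
```

      This SNR recovery at video frequency is what allows the RF bandwidth to stay wide, avoiding the strict transmitter-stability or AFC requirements the abstract mentions.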
    • SPIRIT III Data Verification Processing

      Garlick, Dean; Wada, Glen; Krull, Pete (International Foundation for Telemetering, 1996-10)
      This paper will discuss the functions performed by the Spatial Infrared Imaging Telescope (SPIRIT) III Data Processing Center (DPC) at Utah State University (USU). The SPIRIT III sensor is the primary instrument on the Midcourse Space Experiment (MSX) satellite; and as builder of this sensor system, USU is responsible for developing and operating the associated DPC. The SPIRIT III sensor consists of a six-color long-wave infrared (LWIR) radiometer system, an LWIR spectrographic interferometer, contamination sensors, and housekeeping monitoring systems. The MSX spacecraft recorders can capture up to 8+ gigabytes of data a day from this sensor. The DPC is subsequently required to provide a 24-hour turnaround to verify and qualify these data by implementing a complex set of sensor and data verification and quality checks. This paper addresses the computing architecture, distributed processing software, and automated data verification processes implemented to meet these requirements.

      Knoebel, Robert; Berdugo, Albert; Aydin Vector Division (International Foundation for Telemetering, 1996-10)
      The Common Airborne Instrumentation System (CAIS) was developed under the auspices of the Department of Defense to promote standardization, commonality, and interoperability among flight test instrumentation. The central characteristic of CAIS is a common suite of equipment used across service boundaries and in many airframe and weapon systems. The CAIS system has many advanced capabilities which must be tested during ground support and system test. There is a need for a common set of low cost, highly capable ground support hardware and software tools to facilitate these tasks. The ground support system should combine commonly available PC-based telemetry tools with unique devices needed for CAIS applications (such as CAIS Bus Emulator, CAIS Hardware Simulator, etc.). An integrated software suite is imperative to support this equipment. A CAIS Ground Support Unit (GSU) has been developed to promote these CAIS goals. This paper presents the capabilities and features of a PC-based CAIS GSU, emphasizing those features that are unique to CAIS. Hardware tools developed to provide CAIS Bus Emulation and CAIS Hardware Simulation are also described.