• REAL-TIME RECOGNITION OF TIME-SERIES PATTERNS

      Morrill, Jeffrey P.; Delatizky, Jonathan; Bolt Beranek and Newman Inc. (International Foundation for Telemetering, 1993-10)
      This paper describes a real-time implementation of the pattern recognition technology originally developed by BBN [Delatizky et al] for post-processing of time-sampled telemetry data. This makes it possible to monitor a data stream for a characteristic shape, such as an arrhythmic heartbeat or a step response whose overshoot is unacceptably large. Once programmed to recognize patterns of interest, the system generates a symbolic description of a time-series signal in intuitive, object-oriented terms. The basic technique is to decompose the signal into a hierarchy of simpler components using rules of grammar, analogous to the process of decomposing a sentence into phrases and words. This paper describes the basic technique used for pattern recognition of time-series signals and the problems that must be solved to apply it in real time. We present experimental results for an unoptimized prototype demonstrating that 4000 samples per second can be handled easily on conventional hardware.
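The grammar analogy can be made concrete with a minimal sketch (hypothetical names and thresholds, not BBN's actual implementation): classify each sample-to-sample step as a primitive, merge runs of primitives into segments, then match a sequence of segment types the way a parser matches words into a phrase.

```python
# Hypothetical sketch of grammar-style time-series decomposition.
# Primitives ('up', 'down', 'flat') play the role of words; a pattern
# of segment types plays the role of a phrase.

def primitives(samples, flat_tol=0.01):
    """Label each sample-to-sample step as 'up', 'down', or 'flat'."""
    labels = []
    for a, b in zip(samples, samples[1:]):
        delta = b - a
        if abs(delta) <= flat_tol:
            labels.append('flat')
        elif delta > 0:
            labels.append('up')
        else:
            labels.append('down')
    return labels

def segments(labels):
    """Merge runs of identical primitives into (type, length) segments."""
    segs = []
    for label in labels:
        if segs and segs[-1][0] == label:
            segs[-1] = (label, segs[-1][1] + 1)
        else:
            segs.append((label, 1))
    return segs

def matches(segs, pattern):
    """True if the segment types contain the pattern as a contiguous run."""
    types = [t for t, _ in segs]
    n = len(pattern)
    return any(types[i:i + n] == pattern for i in range(len(types) - n + 1))

# A step response with overshoot decomposes into up, down, flat:
signal = [0.0, 0.5, 1.3, 1.1, 1.0, 1.0, 1.0]
overshoot = ['up', 'down', 'flat']
print(matches(segments(primitives(signal)), overshoot))  # True
```

A real-time version would run the same decomposition incrementally as samples arrive, which is the problem the paper addresses.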
    • Use of Nonstandard FM Subcarriers for Telemetry Systems

      Rieger, James L.; Naval Air Warfare Center, Weapons Division (International Foundation for Telemetering, 1993-10)
      Subcarrier use in telemetry has decreased in recent years due to emphasis on all-digital systems, but some cases lend themselves more easily to a mixed-service system carrying subcarriers along with a baseband signal. The 'IRIG 106' Telemetry Standards have maintained and expanded several series of FM subcarriers, but some uses are better served with 'non-standard' subcarriers that might be standard in other types of service, making components relatively easy to obtain and inexpensive. This paper examines topics from the RCC study and describes some of the uses of subcarrier systems available to the telemetry designer.
    • Performance of Soft-Decision Block-Decoded Hybrid-ARQ Error Control

      Rice, Michael; Brigham Young University (International Foundation for Telemetering, 1993-10)
      Soft-decision correlation decoding with retransmission requests for block codes is proposed and the resulting performance is analyzed. The correlation decoding rule is modified to allow retransmission requests when the received word is rendered unreliable by the channel noise. The modification is realized by a reduction in the volume in Euclidean space of the decoding region corresponding to each codeword. The performance analysis reveals the typical throughput-reliability trade-off characteristic of error control systems which employ retransmissions. Performance comparisons with hard-decision decoding reveal performance improvements beyond those attainable with hard-decision decoding algorithms. The proposed soft-decision decoding rule permits the use of a simplified codeword searching algorithm which reduces the complexity of the correlation decoder to the point where practical implementation is feasible.
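The shrunken-decoding-region idea can be illustrated with a toy sketch (a tiny repetition code and a margin threshold chosen for illustration, not the paper's exact rule): correlate the received soft values against every codeword and accept the winner only if its correlation margin over the runner-up clears a threshold; otherwise the word falls in the gap between regions and a retransmission is requested.

```python
# Illustrative soft-decision correlation decoder with ARQ.
# CODEBOOK is a (3,1) repetition code, BPSK-mapped: bit 0 -> +1, bit 1 -> -1.

CODEBOOK = [
    [+1.0, +1.0, +1.0],   # codeword 0
    [-1.0, -1.0, -1.0],   # codeword 1
]

def decode(received, margin=1.0):
    """Return the decoded codeword index, or None to request retransmission."""
    corr = [sum(r * c for r, c in zip(received, cw)) for cw in CODEBOOK]
    ranked = sorted(range(len(corr)), key=lambda i: corr[i], reverse=True)
    best, second = ranked[0], ranked[1]
    if corr[best] - corr[second] < margin:
        return None  # unreliable: the word fell between the shrunken regions
    return best

print(decode([+0.9, +1.1, +0.8]))   # deep inside codeword 0's region -> 0
print(decode([+0.1, -0.2, +0.1]))   # near the boundary -> None (ARQ)
```

Raising `margin` shrinks each decoding region further, trading throughput (more retransmissions) for reliability, which is the trade-off the abstract describes.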
    • FM, PM and NPR Calculations

      Gallupe, Gary; APCOM, Inc. (International Foundation for Telemetering, 1993-10)
      System performance can be ascertained via a number of parameters, one of which is the Signal-to-Noise Ratio (SNR). SNR is the ratio of signal power to noise power; it is generally expressed in decibels and is usually a function of the system bandwidth. Another measure of performance is the Noise-Power Ratio (NPR). NPR is the ratio of the noise level measured within a specific channel when noise is applied to all channels, to the level measured within that channel when noise is applied to every channel except the specific one.
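Both ratios reduce to the same decibel formula; the sketch below expresses them directly (the power values are illustrative numbers, not measurements from the paper):

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-Noise Ratio in dB: 10*log10(signal power / noise power)."""
    return 10 * math.log10(signal_power / noise_power)

def npr_db(noise_all_loaded, noise_channel_quiet):
    """Noise-Power Ratio in dB: in-channel noise power with all channels
    loaded, over the residual power in the same channel when only that
    channel is left unloaded."""
    return 10 * math.log10(noise_all_loaded / noise_channel_quiet)

print(round(snr_db(2.0, 0.002), 6))   # a 1000:1 power ratio -> 30.0 dB
print(round(npr_db(1.0, 0.001), 6))   # likewise 30.0 dB
```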
    • SYSTEM DESIGN OF A HIGH DATA RATE WIDE BAND FM PCM INSTRUMENTATION AND TELEMETRY SYSTEM FOR INTERCEPTOR TEST FLIGHTS

      Goldsmith, T. A.; Kephart, S. R.; McDonnell Douglas Aerospace; Gulton Data Systems (International Foundation for Telemetering, 1993-10)
      Given the small size of hit-to-kill interceptor test vehicles currently under development, volumetric limitations mandate using the experimental vehicle's telemetry system during vehicle ground level acceptance and environmental testing to gather performance data, in addition to the primary function of successfully gathering and transmitting data during the test flight. In small, lightweight test interceptors, volume and mass become major telemetry system design considerations. In this paper we describe a system engineering approach to determine the key requirements and calculate some of the critical design parameters necessary for the successful design and development of a high data rate wide band FM Pulse Code Modulation (PCM) airborne telemetry system.
    • DATA SYSTEM FOR PROPULSION SYSTEM TESTING ON ILYUSHIN IL-96M

      Ritter, Thomas M.; Pratt & Whitney CEB (International Foundation for Telemetering, 1993-10)
      Pulse Code Modulation (PCM) data systems are used extensively in testing aircraft all over the world. These systems can be tailored to almost any set of measurement requirements using flexible, modular equipment available from several sources. This paper describes a system assembled from readily available components manufactured in the United States that is being used to certify a Russian aircraft flying in the Commonwealth of Independent States. The system features distributed data acquisition, programmable signal conditioning and PCM encoding modules, multi-channel temperature and pressure scanners, and real-time data displays on board the aircraft. The impact of U.S. export controls and our experience to date are also discussed.
    • AN EVOLUTIONARY APPROACH TO A COMMUNICATIONS INFRASTRUCTURE FOR INTEGRATED VOICE, VIDEO AND HIGH SPEED DATA FROM RANGE TO DESKTOP USING ATM

      Smith, Quentin D.; CSTI (International Foundation for Telemetering, 1993-10)
      As technology progresses we are faced with ever-increasing volumes and rates of raw and processed telemetry data, along with digitized high-resolution video and the less demanding areas of video conferencing, voice communications, and general LAN-based data communications. The distribution of all this data has traditionally been accomplished by solutions designed for each particular data type. With the advent of Asynchronous Transfer Mode, or ATM, a single technology now exists for providing an integrated solution to distributing these diverse data types. This allows an integrated set of switches, transmission equipment, and fiber optics to provide multi-session connection speeds of 622 megabits per second. ATM allows for the integration of many of the most widely used and emerging low-, medium-, and high-speed communications standards, including SONET, FDDI, Broadband ISDN, Cell Relay, DS-3, and Token Ring and Ethernet LANs. However, ATM is also very well suited to handling unique data formats and speeds, as is often the case with telemetry data. Additionally, ATM is the only data communications technology in recent times to be embraced by both the computer and telecommunications industries. Thus, ATM is a single solution for connectivity within a test center, across a test range, or between ranges. ATM can be implemented in an evolutionary manner as needs develop: the rate of capital investment can be gradual, and older technologies can be replaced slowly as they become the communications bottlenecks. However, success of this evolution requires some planning now. This paper provides an overview of ATM and its application to test ranges and telemetry distribution. A road map is laid out which can guide the evolutionary changeover from today's technologies to a full ATM communications infrastructure. Special applications, such as the support of high-performance multimedia workstations, are presented.
    • Analysis of Frequency Stabilization and Modulation of Airborne Telemetry Transmitter

      Xizhou, Zhang; Jun, Yao; Xinan Electronic Engineering Institute (International Foundation for Telemetering, 1993-10)
      This paper analyzes the frequency stability and modulation characteristics of airborne telemetry transmitters. According to the characteristics of telemetry information transmission, several methods for frequency stabilization and modulation are briefly compared. Emphasis is given to frequency-dividing phase-locked frequency modulation, on-off keying modulation, and FM/on-off keying double modulation. To raise frequency stability and modulation sensitivity and to extend the linear range of modulation, the trade-off between frequency stabilization and modulation must be coordinated properly. In addition, a method for making the conventional telemetry channel compatible with a super-fast signal telemetry channel is introduced. Satisfactory results have been obtained when these views and methods were applied in engineering practice.
    • Mission-Independent Telemetry Processing Software for PCs

      Miller, Richard J.; Micro SciTech Ltd. (International Foundation for Telemetering, 1993-10)
      Until the early 1980s, telemetry processing systems were commonly run on mainframe or minicomputers running proprietary operating systems and software with limited portability. The advent of the 'low-cost' workstation reduced the hardware cost, but the software remained relatively expensive and relatively mission-specific. The workstation itself, although comparatively cheap, was not, and is still not, an everyday piece of computing hardware. Telemetry processing software has been developed by Micro SciTech to meet both low-cost hardware requirements and mission independence. It runs on networked IBM PC compatible computers and can be re-configured and used for many different missions and experiments without the need for extensive software rewrites.
    • A Rugged, Low-Cost, Advanced Data-Acquisition System for Field Test Projects

      Simms, D. A.; Cousineau, K. L.; National Renewable Energy Laboratory (NREL); Zond Systems, Inc. (International Foundation for Telemetering, 1993-10)
      The National Renewable Energy Laboratory (NREL) has teamed up with Zond Systems, Inc., to provide a rugged, low-cost, advanced data-acquisition system (ADAS) for use in field test projects. The ADAS simplifies the process of making accurate measurements on mechanical equipment exposed to harsh environments. It provides synchronized, time-series measurement data from multiple, independent sources. The ADAS is currently being used to acquire data from large wind turbines in operational wind-plant environments. ADAS modules are mounted on rotating blades, turbine towers, nacelles, control modules, meteorological towers, and electrical stations. The ADAS has the potential to meet the testing and monitoring needs of many other technologies as well, including vehicles, heavy equipment, piping and power transmission networks, and building energy systems.
    • ADVANCED AIRBORNE TEST INSTRUMENTATION SYSTEM (AATIS) PROGRAM SYSTEM OVERVIEW

      Chang, Dah W.; Edwards Air Force Base (International Foundation for Telemetering, 1993-10)
      The Advanced Airborne Test Instrumentation System (AATIS), one of the major instrumentation systems in use today by the Department of Defense (DoD), was developed in the late 1980s to improve and modernize its predecessor, the Airborne Test Instrumentation System (ATIS). Use of AATIS by not only the Air Force but also the Navy and Army has improved instrumentation commonality and interoperability across multiple test programs. AATIS, developed by the same manufacturer as the DoD Common Airborne Instrumentation System (CAIS), has a common bus structure, enabling cross-utilization of many components, which will ease the transition from one system to another. The objective of this paper is to provide an overview of the Advanced ATIS system and its logistics support concept. For system description, an overview is presented of the airborne system and related ground support equipment. A brief description is given of the three levels of maintenance being used or planned for by the using activities. Finally, a projection is presented on the utilization of this system over the next three years.
    • Software Techniques for Recovering Noisy Telemetry

      Sweet, John E.; Holmes, Harlan H.; Rockwell International Corporation (International Foundation for Telemetering, 1993-10)
      Software techniques for enhancing data quality and usability are applied at two steps in the processing of PCM (Pulse Code Modulation) radio telemetry. The first is a software group synchronizer, used where traditional methods have failed. The second is a tool for producing a single best-quality data file from diverse receivers. Recovering even small segments of valid information from noisy signals can be of major concern, because in many applications poor signal power is induced by events of great interest such as failure, detonation, or exhaust gas dynamics; a missile, where data recovery is particularly critical at events such as staging and launch, is one example. The radio receiver and bit synchronizer perform nearly optimally in processing low signal-to-noise transmissions, yet the group synchronization process can still be improved with a software algorithm. Because detected signals are recorded at dispersed tracking stations with signal quality that varies over time, it is convenient to merge the available data from a single test into one file of best available data: after the best data are extracted from each tracking source, the reconstructed data from all sources are merged, and known content is used to detect bit errors so that a single file of best-quality data is available for analysis. Comparative performance data from use on ICBM telemetry are included.
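A software group synchronizer of the kind described can tolerate bit errors that would defeat an exact-match search. The sketch below (hypothetical sync word and error budget, not the paper's algorithm or an IRIG pattern) slides the known sync word across the recovered bit stream and accepts any position within a small Hamming-distance budget:

```python
# Error-tolerant frame sync search over a recovered bit stream.

SYNC = "11101101"        # illustrative sync word

def find_sync(bits, sync=SYNC, max_errors=1):
    """Return offsets where the sync word matches within max_errors bits."""
    hits = []
    for i in range(len(bits) - len(sync) + 1):
        errors = sum(a != b for a, b in zip(bits[i:i + len(sync)], sync))
        if errors <= max_errors:
            hits.append(i)
    return hits

# One clean occurrence at offset 3, one with a single bit error at 14:
stream = "010" + "11101101" + "001" + "11100101" + "10"
print(find_sync(stream))  # [3, 14]
```

Once frames are located in each station's recording, the same Hamming-distance idea (comparing known content against each copy) can drive the selection of the best source frame by frame.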
    • Bit Error Problems with DES

      Loebner, Christopher E. (International Foundation for Telemetering, 1993-10)
      The Data Encryption Standard (DES) was developed by IBM and adopted in 1977 by the National Bureau of Standards (NBS) as a standard way to encrypt unclassified data for security protection. When DES decrypts the encrypted data blocks, it assumes that there are no bit errors in the data blocks. It is the object of this project to determine the Hamming distance between the original data block and the data block after decryption if a single bit error occurs anywhere in the encrypted bit block of 64 bits. This project shows that if a single bit error occurs anywhere in the 64-bit encrypted data block, a mean Hamming distance of 32 with a standard deviation of 4 is produced between the original bit block and the decrypted bit block. Furthermore, this project highly recommends the use of a forward error correction scheme such as BCH (127, 64) or Reed-Solomon (127, 64) so that the probability of this bit error occurring is decreased.
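The measurement itself is easy to reproduce. DES is not in the Python standard library, so the sketch below substitutes SHA-256 as a stand-in avalanche transform purely to demonstrate the method: flip one input bit, compare outputs, and watch the mean Hamming distance settle at half the output width (here ~128 of 256 bits, just as DES drives a 64-bit block toward a mean of ~32):

```python
import hashlib
import random

def hamming(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def flip_bit(block: bytes, i: int) -> bytes:
    """Return a copy of block with bit i inverted."""
    out = bytearray(block)
    out[i // 8] ^= 1 << (i % 8)
    return bytes(out)

random.seed(1)
distances = []
for _ in range(200):
    block = bytes(random.getrandbits(8) for _ in range(8))  # 64-bit block
    bit = random.randrange(64)
    d = hamming(hashlib.sha256(block).digest(),
                hashlib.sha256(flip_bit(block, bit)).digest())
    distances.append(d)

mean = sum(distances) / len(distances)
print(round(mean))   # close to 128, i.e. half of the 256 output bits
```

Repeating the experiment with an actual DES implementation (e.g. from a third-party cryptography package) would reproduce the paper's 64-bit figures.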
    • Generic Telemetry Processing in the Control Center Environment at Johnson Space Center

      Uljon, Linda; Evans, Carol; NASA (International Foundation for Telemetering, 1993-10)
      This paper describes the effort to provide a common telemetry system for the Control Center Complex (CCC), which will process data from both the space shuttle and the space station vehicles. It is being developed for the manned spaceflight program at Johnson Space Center. The space shuttle uses a traditional Inter-Range Instrumentation Group (IRIG) telemetry format, while Space Station Freedom uses the more recently developed Consultative Committee for Space Data Systems (CCSDS) standards for packet-based telemetry. Although the two telemetry streams are very different in structure, a front-end system is being developed that will isolate the differences and provide a common data format to the downstream elements of the control center. Because of this, a CCC workstation could receive and process data from the space station, the space shuttle, or both, using an identical set of workstation program tools. The generic telemetry front-end processor, called the Consolidated Communications Facility (CCF), will not only provide a cost-effective method of processing space shuttle and space station data but also position the CCC to support the anticipated requirements of future programs. The development goals for the CCC are to reduce development and sustaining costs. In the CCF project, commercial off-the-shelf (COTS) equipment is stressed to allow modular maintenance. In addition, the project has emphasized the development of automated features in telemetry stream selection and processing that reduce the amount of operator attention needed. The system has been designed to include robotics in the recording operation and artificial intelligence for fault detection. This paper reviews the concept development for processing telemetry, outlines the architecture of the front-end CCF project, discusses the goals and major influences on the design, and provides a status report on the development. The ability of the current COTS marketplace to meet the goals is also discussed. In summation, this paper describes generic telemetry processing in the context of the CCC being built at Johnson Space Center.
    • UNIX-Compatible Real-Time Environment for NASA's Ground Telemetry Data Systems

      Horner, Ward; Kozlowski, Charles; Data Systems Technology Division; RMS Technologies, Inc.; NASA/Goddard Space Flight Center (International Foundation for Telemetering, 1993-10)
      NASA's ground telemetry data systems, developed by the Microelectronics Systems Branch at the Goddard Space Flight Center, use a generic but expandable architecture known as the "Functional Components Approach." This approach is based on the industry-standard VMEbus and makes use of multiple commercial and custom VLSI-based hardware cards to provide standard off-the-shelf telemetry processing functions (e.g., frame synchronization, packet processing, etc.) for many telemetry data handling applications. To maintain maximum flexibility and performance of these systems, a special real-time system environment has been developed, the Modular Environment for Data Systems (MEDS). Currently, MEDS comprises over 300,000 lines of tested and operational code based on a non-UNIX real-time commercial operating system. To provide increased functionality and adherence to industry standards, this software is being transformed to run under a UNIX-compatible real-time environment. This effort must accommodate existing systems and interfaces and provide exact duplicates of the system functions now used in the current real-time environment. Various techniques will be used to provide a relatively quick transition to this new real-time operating system environment. Additionally, all standard MEDS card-to-card and system-to-system interfaces will be preserved, providing for a smooth transition and allowing telemetry processing cards that have not yet been converted to reside side-by-side with cards that have been. This paper describes this conversion effort.
    • AN OBJECT-ORIENTED COMMAND AND TELEMETRY "BLACK BOX" SIMULATION USING ADA

      Policella, Joseph; White, Joey; Shillington, Keith; CAE-Link Corporation; Fastrak Training Inc. (International Foundation for Telemetering, 1993-10)
      To model the "black boxes" in a command and telemetry simulation, it is important to preserve the abstraction of a one-to-one match between the real-world interfaces and the simulated interfaces. Everywhere a physical interface exists on the box, there needs to be a simulated interface. Preserving this abstraction allows the model to evolve more naturally with real-world design changes. In most command and telemetry systems, many different types of commands and telemetry can be sent over a single interface. This creates a problem in preserving the interface abstraction if the Ada language is used for implementation. Because Ada is a "strongly typed" language, a different or overloaded operation needs to exist for each type of command or telemetry. However, by using a "discriminated variant record" to represent the commands and telemetry streams, a single operation can be used in the Ada specification. This not only preserves the abstraction but makes the software more maintainable by allowing the data list to change during the design of the "black box" without changing the Ada specification. As a result, "loose coupling" is achieved, a common set of commands and telemetry formats can be "inherited" to promote reuse, and overall system development and maintenance costs are reduced.
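Ada syntax aside, the discriminated-variant-record idea can be sketched in Python with a tagged union (the command types and `send` operation below are invented for illustration, not taken from the paper): each message carries its own discriminant, and a single operation serves every variant, so the one-interface-per-physical-interface abstraction survives as the message list grows.

```python
from dataclasses import dataclass
from typing import Union

# Hypothetical command variants; the dataclass type is the discriminant.

@dataclass
class AttitudeCommand:
    roll: float
    pitch: float
    yaw: float

@dataclass
class HeaterCommand:
    zone: int
    on: bool

Command = Union[AttitudeCommand, HeaterCommand]  # the "variant record"

def send(command: Command) -> str:
    """One operation for all variants, as one Ada operation can accept
    the whole discriminated record type."""
    if isinstance(command, AttitudeCommand):
        return f"ATT {command.roll} {command.pitch} {command.yaw}"
    if isinstance(command, HeaterCommand):
        return f"HTR {command.zone} {'ON' if command.on else 'OFF'}"
    raise TypeError("unknown command variant")

print(send(AttitudeCommand(0.0, 1.5, -0.5)))  # ATT 0.0 1.5 -0.5
print(send(HeaterCommand(zone=2, on=True)))   # HTR 2 ON
```

Adding a new command type only adds a dataclass and a branch; the `send` interface, like the Ada specification, is unchanged.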
    • SPACEBORNE VME BASED PCM ENCODER (VPE)

      Rodriguez, Harry; Edwards Air Force Base (International Foundation for Telemetering, 1993-10)
      The VME bus is used in a wide variety of airborne applications. The particular application of the VPE is in the MSTI satellite, where it provides spacecraft telemetry. The VME-based PCM encoder can provide telemetry from any stand-alone data acquisition system. This paper describes the VME-based PCM encoder. Since this design is ruggedized to meet the launch and environmental requirements for space, it can be used in any airborne VME system.
    • TELEMETRY SYSTEMS TRAINING PROGRAMS: THE KEY TO SUCCESSFUL IMPLEMENTATION AND OPERATION

      Jaunbral, Janis; Computer Sciences Corporation (International Foundation for Telemetering, 1993-10)
      In today's world, the importance of training for telemetry systems continues to grow as new technologies provide users with ever-increasing capabilities. Successful training programs ensure that telemetry systems quickly become operational, yielding the acquisition of critical test data. Over the years, training programs have been greatly impacted by changes in defense contracts, specifically funding. Today's aggressive telemetry market requires contractors to develop complex telemetry systems within the constraints of Firm Fixed Price (FFP) contracts and within very short schedules. As a result of these conditions, training programs have changed significantly over the last ten years. Projects that used to have dedicated training personnel (instructors, technical writers, etc.) now rely on the system developers to provide the training. In actuality, the quality of training has improved with this new approach. Now students benefit from having the most knowledgeable personnel teach them about the system, and, oftentimes, latent problems with the system are efficiently identified and corrected. This paper summarizes the evolution of training programs for telemetry systems developed by Computer Sciences Corporation. The benefits of a scenario where the system architects train the end users, with increased "hands-on" training, are explored.
    • Lowest Cost Alternative to Auto-Tracking Using GPS-TRAK, Augustin-Sullivan Distribution, & Single Axis Antenna Techniques

      Augustin, Eugene P.; Dunn, Daniel S.; Sullivan, Arthur; Technical Systems Associates, Inc. (International Foundation for Telemetering, 1993-10)
      The first telemetry tracking system was designed in 1959 for the space program; cost was of little concern. The tracking technique used was 3-channel monopulse, which is still today the optimum in performance for any type of tracking requirement. Telemetry tracking really got off the ground in the early 1970s with the move from P-band to S-band for telemetry. In the design of early tracking systems, performance was at the top of the list and cost at the bottom in establishing the design criteria. By the beginning of the 1980s, cost was approaching performance in importance. Today, with the demise of the cold war, a considerable reduction in global threats, and the state of the world economy, cost has reached the top of the list. The cost of a telemetry tracking system can be reduced by more than a factor of two by going to a single-axis tracking technique. The lowest-cost single-axis approach heretofore has been the use of a cosecant-squared (CSC²) distribution. To improve the efficiency of a single-axis system and increase the overhead coverage capability, a dual-beam antenna has been widely used as another type of single-axis approach; the dual-beam technique involves additional cost, since two tracking antennas are required. Except for satellite tracking, almost all telemetry tracking is performed at low elevation angles and, like it or not, multipath is there. The multipath fade varies from a few dB to over 20 dB depending upon the reflecting terrain, and most general-purpose systems should be designed for at least a 10 dB multipath fade. For all telemetry tracking applications, the multipath effect is completely negligible at elevation angles greater than 10 degrees. The Augustin-Sullivan distribution, in effect, fades away the multipath margin as the multipath effect decreases. Because of the multipath phenomenon, an antenna beam should not be shaped at the 1 dB point, as is the case with a CSC² distribution, but only needs to be shaped from somewhere between the 15 and 20 dB levels, based on the mission requirements. This involves a gain reduction from a pencil beam on the order of 1/2 dB or less, rather than the 3 dB reduction associated with the CSC² distribution. The Augustin-Sullivan distribution does not start shaping the beam until shaping is required, and shapes the beam for constant-altitude coverage from the horizon to zenith. For the first time, coverage is provided from the peak of the beam to directly overhead with a single antenna and a single-axis rotator. When GPS information is available from the tracked vehicle, the Augustin-Sullivan distribution, used with a single-axis rotator and the GPS-TRAK technique, yields the lowest-cost alternative to auto-tracking.
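The constant-altitude rationale behind CSC²-type shaping follows from standard antenna-engineering geometry (stated here for context, not taken from the paper):

```latex
% A vehicle at fixed altitude h and elevation angle \theta sits at
% slant range R = h / \sin\theta. Free-space received power varies as
% G(\theta)/R^2, so holding it constant from horizon toward zenith
% requires the gain to follow a cosecant-squared shape:
\[
  P_r \;\propto\; \frac{G(\theta)}{R^2}
      \;=\; \frac{G(\theta)\,\sin^2\theta}{h^2}
  \quad\Longrightarrow\quad
  G(\theta) \;\propto\; \csc^2\theta .
\]
```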
    • Instrumentation and Telemetry Systems for Free-Flight Drop Model Testing

      Hyde, Charles R.; Massie, Jeffrey J.; NASA Langley Research Center (International Foundation for Telemetering, 1993-10)
      This paper presents instrumentation and telemetry system techniques used in free-flight research drop model testing at the NASA Langley Research Center. The free-flight drop model test technique is used to conduct flight dynamics research of high-performance aircraft using dynamically scaled models. Free-flight drop model testing supplements research using computer analysis and wind tunnel testing. The drop models are scaled to approximately 20% of the size of the actual aircraft. This paper presents an introduction to the Free-Flight Drop Model Program, followed by a description of the current instrumentation and telemetry systems used at the NASA Langley Research Center, Plum Tree Test Site. The paper describes three telemetry downlinks used to acquire the data, video, and radar tracking information from the model. Also described are two telemetry uplinks, one used to fly the model employing a ground-based flight control computer and a second to activate commands for visual tracking and parachute recovery of the model. The paper concludes with a discussion of free-flight drop model instrumentation and telemetry system development currently in progress for future drop model projects at the NASA Langley Research Center.