MONO5

    Examples and Exercises in Data Translation

    Published: Jan 2009

      Format                      Pages  Price
      PDF Version (42M)           79     $25
      Complete Source PDF (50M)   79     $97


    Abstract

    Chapter 3.1 deals with the assimilation of event data from various sources into digital formats usable for investigative analysis. This includes mathematical data in the form of unparsed text sentences (S19 files) and the treatment of multibyte quantities in the parsed data. Also included is the process of mapping hexadecimal block data into hexadecimal linear lists. Linear lists allow the investigator to build parallel data lists that permit comparison of data from the same unit after different exposures (such as before and after a simulated crash pulse), so that variances in the data patterns can be immediately and visually discerned.
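The block-to-linear-list mapping and the parallel comparison described above can be sketched in a few lines. This is a minimal illustration, not the monograph's spreadsheet method; the example dump values are invented, and Python stands in for the book's spreadsheet templates.

```python
# Sketch (not from the monograph): flatten a hexadecimal block dump into a
# linear list, then pair two such lists from the same unit so that any
# byte that changed between exposures is flagged by position.

def block_to_linear(block_text):
    """Flatten rows of hex bytes into a single linear list of byte strings."""
    linear = []
    for line in block_text.strip().splitlines():
        linear.extend(line.split())
    return linear

def diff_parallel(before, after):
    """Pair two linear lists and report positions where the bytes differ."""
    return [(i, b, a) for i, (b, a) in enumerate(zip(before, after)) if b != a]

before = block_to_linear("00 1A 2B 3C\n4D 5E 6F 70")
after  = block_to_linear("00 1A 2B 3C\n4D 5E FF 70")
print(diff_parallel(before, after))  # byte 6 changed: [(6, '6F', 'FF')]
```

In a spreadsheet, the same comparison is the two linear lists placed in adjacent columns with a difference flag in a third.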

    Chapter 3.2 deals with the inductive derivation of appropriate scaling and translation factors to derive intermediate and end-point engineering units from saved (nonvolatile) memory in any particular crash event data recorder. To accomplish this derivation, the reader is shown how to induce a solution using an investigation of the components in any particular ECU (device under test, DUT) and a set of simulated results recorded with an independent data acquisition monitor. In this case, for the key components, commercial specifications are referenced to provide internal scaling limits and translation factors, and the DUT is subjected to test-crash trials to resolve successive DUT-output calculation hypotheses against a complementary set of known test-crash-pulse signatures (recorded with an independent, external accelerometer). Lastly, the investigator is shown how to match the set of DUT-output calculations to the successive and complementary set of independent-external-accelerometer results, at varying test-crash-pulse magnitudes, to achieve sufficient repeatability and an acceptable error rate for a finalized, stable calculation hypothesis and method. The investigator can then put forward that hypothesis and method as a rule for deriving data from that ECU model and design level. The work product of these exercises is a scaling, limit, offset, and transfer function (SLOT) for the DUT thus examined.
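Once a SLOT rule has stabilized for a given DUT, applying it is a simple linear transfer with saturation. The sketch below is illustrative only: the scale, offset, and limit values are invented, not taken from any real ECU, and the linear-with-clamp form is one common shape such a transfer function takes.

```python
# Hypothetical illustration of applying a finalized SLOT (scaling, limit,
# offset, and transfer function) rule to raw counts read from nonvolatile
# memory. All numeric values here are invented examples.

def apply_slot(raw_count, scale, offset, limit):
    """Linear transfer function with a saturation limit:
    clamp(scale * raw + offset) to the range [-limit, +limit]."""
    value = scale * raw_count + offset
    return max(-limit, min(limit, value))

# e.g. an 8-bit acceleration count at 0.5 g/count, -64 g offset, +/-50 g limit
print(apply_slot(200, scale=0.5, offset=-64.0, limit=50.0))  # 36.0 g
print(apply_slot(255, scale=0.5, offset=-64.0, limit=50.0))  # clamped to 50.0 g
```

The inductive work of the chapter is in finding scale, offset, and limit that make this calculation agree, repeatably, with the independent accelerometer record across crash-pulse magnitudes.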

    Chapter 3.3 deals with the deductive derivation of internal EEPROM scaling and translation factors when the end-point solution is known or given. This is contrasted with an inductive analysis (Chapter 3.2), which begins with no fixed solution. In that case, one or more problem solution hypotheses (methodologies) are repeatedly tested to see whether they produce a repeatable, logically and physically correct solution, from which we can gain confidence in that solution as it applies to new problems in that problem set. With inductive analysis, however, the solution is not guaranteed, and thus confidence in the solution is not absolute. By contrast, deductive analysis begins with collected known solutions, specifications, facts, or subhypotheses, from which one attempts to deduce a hypothesis (solution method) for a particular outcome set. This often includes yet-unanalyzed raw data. Then, once that deduced hypothesis (solution method) is tested and found to be repeatable, it can be used with confidence to solve new problems for other examples in that process class, where the process outcome is not given. This chapter presents examples of the deductive solution method.

    This chapter extends the prior introduction to data translation, data import, and mapping using spreadsheet tools. In Chapter 3.4 we elaborate on these techniques by creating spreadsheet templates for each of the families of ECU data types we will encounter in the ensemble of real-world investigations. This is similar to the use of form letters in various word processors (themselves often called templates). In this chapter we cover the import and multiple mapping of nonregular data sources, such as those obtained as the logger output of generic network logging tools. This is illustrated by following the steps from text output, to formatted hex table, to hex linear list. Also covered is the process of converting a binary file, using a print-to-Adobe® .pdf utility, into an importable text file with either space or comma delimiters.
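The text-output-to-formatted-hex-table step can be sketched for nonregular input as follows. The log line format and field layout here are assumptions for illustration (a generic identifier-plus-payload record), not the monograph's specific logger output; the point is only that variable-length records are normalized to fixed-width rows before further mapping.

```python
# Sketch of normalizing nonregular logger text output into a fixed-width
# hex table. Record format (hex identifier, colon, space-delimited data
# bytes) is an invented example; frames may carry differing byte counts,
# so short rows are padded with '--' placeholders.

def logger_to_table(lines, width=8):
    """Parse 'IDENT: B0 B1 ...' lines into [ident, b0..b7] rows, padded."""
    table = []
    for line in lines:
        ident, _, payload = line.partition(":")
        data = payload.split()
        row = data + ["--"] * (width - len(data))
        table.append([ident.strip()] + row[:width])
    return table

log = ["18F00931: 00 FF 12 34",
       "0CF00400: 10 20 30 40 50 60 70 80"]
for row in logger_to_table(log):
    print(" ".join(row))
```

From a table of this shape, the hex linear list is the earlier flattening step applied row by row.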

    This chapter extends the prior introduction to multibyte and number-base concepts, adding a detailed section on bitmap translation, including mixed-mode bitmaps. Bitmaps are the most universal way of documenting diagnostic trouble codes (DTCs) and binary functional performance metrics (metric met or not met). Included in this chapter are octal, inverted-octal, signed-binary, and conditional status representations in spreadsheet evaluation templates.
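The bitmap and signed-binary translations above amount to simple bit arithmetic. A minimal sketch, with invented placeholder bit names rather than any real ECU's DTC map:

```python
# Sketch of bitmap translation: each bit of a status byte is a binary
# metric (set = condition/DTC recorded). Bit names are invented
# placeholders, not a real DTC assignment.

DTC_BITS = ["DTC_0", "DTC_1", "DTC_2", "DTC_3",
            "DTC_4", "DTC_5", "DTC_6", "DTC_7"]

def decode_bitmap(status_byte):
    """Return the names of the set bits in a one-byte bitmap."""
    return [name for bit, name in enumerate(DTC_BITS) if status_byte >> bit & 1]

def signed_byte(raw):
    """Interpret a raw 0-255 value as signed-binary (two's complement)."""
    return raw - 256 if raw > 127 else raw

print(decode_bitmap(0x0A))  # bits 1 and 3 set: ['DTC_1', 'DTC_3']
print(signed_byte(0xF0))    # -16
```

In the monograph's spreadsheet templates the same tests appear as per-bit formulas; the octal and inverted-octal representations are regroupings of these same bits three at a time.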

    This chapter extends the previously discussed acceleration vector techniques to an analysis of acceleration vector dynamics over the course of a race car accident event. One aspect of this analysis is that the instrumented race car in question overturned during the crash event, so that we must confirm true ground azimuth direction, using only on-vehicle instrumentation, for a vehicle that is yawing, rolling, and overturning during the crash event. In the vehicle under investigation, the signed rectilinear acceleration is captured, with several other parameters, as converted sample words at a rate of 20 sample words/s. Ultimately, the acceleration polar magnitude and azimuth value will be used in a crash strength calculation.
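The rectilinear-to-polar conversion underlying the analysis can be sketched per sample pair. The axis convention below (azimuth measured from the longitudinal axis, normalized to 0-360°) is an assumption for illustration; the chapter's full treatment must additionally resolve the vehicle-to-ground frame while the car yaws and rolls.

```python
# Sketch: convert one signed rectilinear acceleration sample pair
# (longitudinal, lateral) into polar magnitude and azimuth. Axis and
# sign conventions are assumed for illustration.
import math

def to_polar(ax, ay):
    """Return (magnitude, azimuth_deg) for a longitudinal/lateral pair."""
    magnitude = math.hypot(ax, ay)
    azimuth = math.degrees(math.atan2(ay, ax)) % 360.0
    return magnitude, azimuth

mag, az = to_polar(-3.0, 4.0)
print(round(mag, 2), round(az, 1))  # 5.0 126.9
```

At 20 sample words/s, applying this to each sample pair yields the magnitude and azimuth traces used in the crash strength calculation.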

    Aside from the download and interpretation of EDR information, an electronic data investigator can be called upon to evaluate vehicle antilock braking system (ABS) function. This subchapter examines a method to analyze data from a heavy truck air brake ABS and its associated components. The method identifies one potential ABS anomaly [inconsistent wheel speed sensor (WSS) pulses], provides a spreadsheet method to identify (flag) this anomaly as a set of occurrences in voluminous data, and then provides a method of graphical representation to visualize that anomaly set versus wheel azimuth. Additionally, a minitutorial discussion covering the major components of heavy truck air brake ABS systems (ABS ECU, individual wheel speed sensors, and individual wheel modulator valves) introduces the chapter.
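The flagging step can be sketched as a per-row consistency test. This is in the spirit of the subchapter's spreadsheet method, not a reproduction of it: the 15% tolerance and the comparison of each wheel against the median of all wheels in the same sample row are assumptions chosen for illustration.

```python
# Sketch: flag inconsistent wheel speed sensor (WSS) readings in
# voluminous per-timestep data. Tolerance and median-based reference
# are invented illustration choices, not the monograph's values.
from statistics import median

def flag_wss_anomalies(samples, tolerance=0.15):
    """samples: list of per-timestep wheel-speed tuples (one per wheel).
    Return (row, wheel) pairs whose speed deviates from the row median
    by more than the given fraction."""
    flags = []
    for row, wheels in enumerate(samples):
        ref = median(wheels)
        for i, speed in enumerate(wheels):
            if ref > 0 and abs(speed - ref) / ref > tolerance:
                flags.append((row, i))
    return flags

data = [(50.1, 50.0, 49.9, 50.2),
        (50.0, 50.1, 12.0, 50.0)]   # wheel 2 drops out in row 1
print(flag_wss_anomalies(data))      # [(1, 2)]
```

The median is used rather than the mean so that a single dropped-out wheel does not shift the reference and flag the healthy wheels; the resulting flag set is what the chapter then plots against wheel azimuth.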

    Accident investigation involves the use of data from a variety of sources. While digital electronic media data sources are desirable, not all data are available in electronic media. Sometimes data are available only in hard-copy form, such as accident reconstruction reports, crash test data, OEM specifications, and component supplier specifications. This chapter shows a method of utilizing hard-copy crash test acceleration data taken from air bag must-deploy threshold tests, and how those data were tabulated and integrated to generate comparative acceleration signatures and comparative Delta V profiles. Those acceleration signatures were then used to determine whether the acceleration signature and Delta V profiles of a subject vehicle were above or below the performance of the crash test exemplars.
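The tabulate-and-integrate step reduces to numerical integration of the keyed-in acceleration signature. A minimal sketch, with an invented pulse and trapezoidal integration as one common choice for evenly sampled data (the chapter's exact tabulation procedure and sample rates are not reproduced here):

```python
# Sketch: integrate a hand-tabulated acceleration signature (in g's)
# into a cumulative Delta V profile. Pulse values and sample interval
# are invented illustration data.

G = 9.81  # m/s^2 per g

def delta_v_profile(accel_g, dt):
    """Cumulative Delta V (m/s) from an acceleration signature sampled
    every dt seconds, using the trapezoidal rule."""
    dv, profile = 0.0, [0.0]
    for a0, a1 in zip(accel_g, accel_g[1:]):
        dv += 0.5 * (a0 + a1) * G * dt
        profile.append(dv)
    return profile

pulse = [0.0, 5.0, 10.0, 5.0, 0.0]          # short invented pulse, 1 ms steps
print([round(v, 4) for v in delta_v_profile(pulse, dt=0.001)])
```

Overlaying such profiles from the must-deploy exemplars against the subject vehicle's profile gives the above-or-below comparison the chapter describes.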

    Keywords:

    Hexadecimal data mapping, Inductive analysis, EEPROM analysis, EDR analysis, crash event data, deductive analysis, binary files, spreadsheet data mapping, bitmap, signed-binary, octal, conditional status, acceleration vector, inverted vehicle, antilock braking, wheel speed sensor, polar data representation, hardcopy source data, acceleration overlay, acceleration signatures


    Paper ID: MONO10101M

    Committee/Subcommittee: D04.94

    DOI: 10.1520/MONO10101M


    CrossRef ASTM International is a member of CrossRef.

    ISBN10:
    ISBN13: 978-0-8031-7003-2