

    Appendix N: A Regression-Based Approach for Testing Significance of JAR Variable Penalties

    Published: Jan 2009

    Format: PDF (508K), 6 pages


    Traditional penalty analysis attempts to relate JAR variable responses to overall liking or some other “reference” variable. No variance estimates are calculated around the penalties or mean drops, and as a result, significance testing is not done. Consequently, this method gives no gauge of the reliability or importance of the results. Focus is often placed on large penalties that are also associated with a large proportion of respondents; 20 % of respondents on a given side of a “Just About Right” point (e.g., “Too Much” or “Too Little”) is frequently used as a minimum standard of importance. We propose a regression-based approach to better understand which penalties are important for a given product. This approach recodes JAR variable scores into indicator (dummy) variables in order to address the non-linear nature of the typical JAR variable scale (the middle category as “Just About Right” and the other categories as some degree of “Too Much” or “Too Little”). Regression coefficients resulting from the indicator variables are analogous to mean drops of the reference variable. Significance testing can be done on these coefficients parametrically, by using the standard error estimates from the regression model itself; semi-parametrically, by using standard error estimates from methods such as the jackknife and bootstrap; or non-parametrically, for example, by forming “confidence intervals” from the distribution of a large number of bootstrap samples. All of these methods are presented below. While some parts of this approach can be used to test multiple products and attributes simultaneously, we will consider only one JAR variable/product combination at a time. Along the way we will prove that the regression coefficients from ordinary least-squares (OLS) regression are identical to the traditional penalties, assuming the so-called JAR mean is used to determine the penalties.
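    The recoding and equivalence described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: the simulated data, scale cutoffs (1–2 as "Too Little", 3 as "Just About Right", 4–5 as "Too Much"), and variable names are assumptions for the example. It fits an OLS model on the two indicator variables and checks that the resulting coefficients match the traditional mean drops from the JAR mean, then forms a percentile bootstrap interval for one coefficient as a non-parametric significance check.

    ```python
    import numpy as np

    # Simulated 5-point JAR data and a 9-point liking response (assumed setup).
    rng = np.random.default_rng(42)
    n = 300
    jar = rng.integers(1, 6, size=n)
    liking = 7.0 - 1.5 * (jar <= 2) - 0.8 * (jar >= 4) + rng.normal(0.0, 1.0, n)

    # Recode JAR scores into indicator (dummy) variables; the middle
    # "Just About Right" category is the reference level, absorbed by the intercept.
    too_little = (jar <= 2).astype(float)
    too_much = (jar >= 4).astype(float)
    X = np.column_stack([np.ones(n), too_little, too_much])

    # OLS fit: intercept = mean liking among JAR respondents (the "JAR mean");
    # the slopes are the (signed) mean drops for each side of the scale.
    beta, *_ = np.linalg.lstsq(X, liking, rcond=None)

    # Traditional penalties computed directly from group means, for comparison.
    jar_mean = liking[jar == 3].mean()
    penalty_low = jar_mean - liking[too_little == 1].mean()
    penalty_high = jar_mean - liking[too_much == 1].mean()
    # beta[1] == -penalty_low and beta[2] == -penalty_high (sign convention aside)

    # Non-parametric significance: percentile bootstrap interval for beta[1].
    boot = np.empty(2000)
    for b in range(boot.size):
        idx = rng.integers(0, n, size=n)
        bb, *_ = np.linalg.lstsq(X[idx], liking[idx], rcond=None)
        boot[b] = bb[1]
    ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
    # The "Too Little" penalty is deemed significant if the interval excludes zero.
    ```

    Because the intercept-plus-two-dummies model is saturated over the three respondent groups, the OLS fit reproduces the group means exactly, which is why the coefficients coincide with the traditional mean drops rather than merely approximating them.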

    Author Information:

    Plaehn, Dave
    InsightsNow, Inc., Corvallis, OR

    Horne, John
    InsightsNow, Inc., Corvallis, OR

    Committee/Subcommittee: E18.03

    DOI: 10.1520/MNL11495M