In machine learning, a reliable analytical method is crucial for validating the linearity assumption of any linear regression model. Existing analytical methods for assessing linearity, such as Pearson's Correlation Coefficient (PCC), Spearman's Rank Correlation Coefficient (SRCC), and Kendall's Tau Correlation Coefficient (KTCC), share a limitation: they cannot distinguish a linear relationship from a monotonic but non-linear one. In this paper, we propose a Normalized Least Dependent Difference (NLDD) method to address this limitation of existing linearity methods when identifying monotonic relationships. By calculating the difference between each data point and its predicted value on the regression line, we can determine how much the predicted values deviate from the observed values. A consistent difference between each data point and its predicted value, indicated by y-axis differences whose standard deviation is small relative to their mean, suggests that the model accurately reflects the relationship between the dependent variable and the independent variable. Our findings show that NLDD is effective in identifying linearity in linear relationships and non-linearity in monotonic non-linear relationships.
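The idea described above can be sketched in a few lines. This is an illustrative reading of the abstract, not the paper's exact NLDD formula (whose normalization is defined in the full text): fit a least-squares regression line, take the absolute difference between each observed value and its prediction, and report the relative standard deviation (standard deviation divided by mean) of those differences. The function name and the example data below are assumptions for demonstration.

```python
import numpy as np

def nldd_like_score(x, y):
    """Illustrative sketch of the NLDD idea: relative standard deviation
    of the point-to-line differences. Consistent differences (a linear
    relationship) give a score near 0; systematically varying differences
    (a monotonic but non-linear relationship) give a larger score."""
    slope, intercept = np.polyfit(x, y, 1)       # least-squares regression line
    diffs = np.abs(y - (slope * x + intercept))  # difference of each point from the line
    return diffs.std() / diffs.mean()            # relative standard deviation

x = np.linspace(1, 10, 50)
wiggle = 0.5 * (-1.0) ** np.arange(50)           # consistent-size deviations around a line
linear_score = nldd_like_score(x, 2 * x + 1 + wiggle)  # linear relationship: low score
cubic_score = nldd_like_score(x, x ** 3)               # monotonic, non-linear: higher score
```

Under this reading, a correlation coefficient such as PCC would rate both data sets as strongly correlated, while the difference-consistency score separates the linear case from the monotonic non-linear one.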