# Multiple Regression Analysis Is an Essential Constituent (Methodology Chapter)

**Pages:** 5 (1613 words) · **Bibliography Sources:** ≈ 8 · **File:** .docx · **Level:** Doctorate · **Topic:** Anthropology

Multiple regression analysis is an essential constituent of marketing research procedures. Its advantages include applicability to several kinds of data, robustness when pre-defined assumptions fail, and ease of interpretation. Its two primary objectives are predicting research outcomes and analyzing the variables used in those predictions. When two or more predictor variables are related to each other, however, the results can be misleading; the problems typically concern interpreting coefficient sizes, the associated t-tests, and the standard errors. This issue is termed multicollinearity. The effect of collinearity cannot be interpreted in isolation: other factors are needed to judge how serious a given degree of collinearity is. Collinearity exists whenever a linear relationship can be constructed among the predictor variables, and real data usually lie somewhere between perfect and zero collinearity. (Mason, 1991, 268-280)
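A common diagnostic for this kind of collinearity is the variance inflation factor (VIF). The sketch below uses synthetic data and only NumPy; it is an illustrative implementation of the standard VIF formula, not a procedure taken from Mason.

```python
import numpy as np

# Two correlated predictors plus one independent predictor (synthetic data).
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def vif(X):
    """Variance inflation factor for each column: 1 / (1 - R^2), where R^2
    comes from regressing that column on all the other columns."""
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        Z = np.delete(X, j, axis=1)
        Z = np.column_stack([np.ones(len(Z)), Z])      # add an intercept
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

print([round(v, 1) for v in vif(X)])
```

On this data the two collinear columns receive very large VIFs while the independent column stays near 1, which is exactly the pattern that signals unstable coefficient estimates.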

High collinearity among the predictors inflates the variances of the regression coefficients. Collinearity can be dealt with in several ways, none of which is perfect: dropping one or more of the variables, constructing an orthogonal matrix from the available data, biased estimation techniques such as ridge regression, choosing a particular estimator among the available ones, or devising new measures of predictor-variable importance to replace the current ones. Controlling collinearity at the level of bivariate correlations lets users compare the relationship patterns in the covariance matrices with those found in actual marketing analyses. Collinearity does affect the estimation of errors, but it becomes a problem only when there is a high degree of correlation between the predictors. (Green, 1991, 499-510)
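Ridge regression, one of the remedies listed above, has a simple closed form. The data and the penalty value lam = 10 below are illustrative assumptions, chosen only to show how the penalty stabilizes coefficients that collinearity makes erratic.

```python
import numpy as np

# A nearly collinear predictor pair; the true model is y = x1 + x2 + noise.
rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2])

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^(-1) X'y.
    lam = 0 recovers ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

ols = ridge(X, y, 0.0)       # OLS: individual coefficients are unstable
shrunk = ridge(X, y, 10.0)   # ridge: coefficients shrunk toward zero
print(np.round(ols, 2), np.round(shrunk, 2))
```

Because x1 and x2 carry nearly the same information, OLS splits their joint effect almost arbitrarily, while the ridge penalty always produces a coefficient vector of smaller norm; only the sum of the two coefficients is well determined either way.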

Confirmatory Factor Analysis


Confirmatory factor analysis (CFA) is a method of hypothesis testing about the unidimensionality of measures. It differs from exploratory factor analysis (EFA) in that the user must state the CFA model before data analysis begins. This implies an a priori specification of the latent variables along with their associated empirical indicators: the researcher selects the latent variables on which each indicator is allowed to load and identifies which variables are permitted to correlate. CFA has several advantages and avoids some of the problems that arise in EFA; it has been considered the better method in the initial phases of theory development.

CFA uses a statistical t-test to assess the significance of each factor loading. It also lets users judge how well the current measurement model fits the data through an overall fit test, which is valuable because the indicators may load well on the latent variables even when the overall fit is poor. CFA further allows users to check the unidimensionality of a set of empirical indicators composed of related and unrelated latent variables, which constitutes a stronger test of unidimensionality. Finally, it provides a way to check whether a given set of latent variables are related to each other as the model predicts, by combining tests of the significance of each correlation with the model-fit test mentioned above. Given these strengths, CFA has been used less often than it deserves. (O'Leary-Kelly, 1998, 287-405)
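A full CFA fit requires a structural equation modeling package such as lavaan or semopy. The NumPy sketch below is a deliberate simplification: it simulates the one-factor measurement model that a unidimensionality check presupposes, then recovers approximate loadings from the correlation matrix. It illustrates the data-generating model, not the t-test-based CFA procedure described above.

```python
import numpy as np

# One latent variable (eta) with three observed indicators. The loadings
# below are the parameters a real CFA would estimate and t-test.
rng = np.random.default_rng(2)
n = 500
eta = rng.normal(size=n)                       # latent factor scores
loadings = np.array([0.8, 0.7, 0.6])
noise = rng.normal(size=(n, 3)) * np.sqrt(1.0 - loadings**2)
X = eta[:, None] * loadings + noise            # unit-variance indicators

# The leading eigenvector of the correlation matrix, scaled by the square
# root of its eigenvalue, approximates the loading pattern; this is the
# unidimensional structure a CFA would formally test.
R = np.corrcoef(X, rowvar=False)
vals, vecs = np.linalg.eigh(R)                 # eigenvalues in ascending order
est = np.abs(vecs[:, -1] * np.sqrt(vals[-1]))  # sign of an eigenvector is arbitrary
print(np.round(est, 2))
```

The recovered values track the simulated loadings only approximately (principal-component loadings are biased upward when uniquenesses differ), which is one reason a proper CFA fit with standard errors is preferable for real data.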

Convenience Sampling

Convenience sampling is the most widely used method for collecting ecological data. Also known as subjective sampling, it gathers data that do not necessarily represent the population of interest. Biologists also apply it in their work because of its ease of use. Information from a convenience sample supports assessments of the sample's characteristics rather than formal inference. (Marshall, 1996, 522-525)

The standard alternative selects the sample using probability-based rules, preferring random selection over subjective choice. Simple random, stratified random, cluster, and adaptive designs are among the kinds of probabilistic sampling; these methods yield better results than convenience sampling. Two major problems arise from routine use of convenience samples. First, there is no sound basis for inductive inference from the sample to the population of interest. Second, there is no reliable way to check the correctness of previously obtained population parameters. On the positive side, convenience samples are more cost-efficient and save time, because the subjects chosen are usually those available right away; this, however, comes at the cost of quality and reliability. Most qualitative studies apply convenience sampling to some extent, but a more rigorous approach is needed. A study conducted by Rosenstock in 2002, covering over 200 papers across 9 journals and a conference, found that most studies used convenience rather than probabilistic sampling; probability-based sampling appeared in less than 15% of the papers. Applying convenience sampling in some form or other yields weak results, and the practice needs to change. (Anderson, 2001, 1294-1297)
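The inference problem described above can be illustrated with a toy population in which availability correlates with the measured trait. The setup is hypothetical (it is not Rosenstock's or Anderson's data); it only shows why a "first available" sample can be badly biased while a simple random sample is not.

```python
import random

# Population where individuals listed first are easier to reach AND score
# higher on the trait, so availability and the trait are confounded.
random.seed(0)
population = [50 - 0.1 * i + random.gauss(0, 5) for i in range(1000)]
true_mean = sum(population) / len(population)

convenience = population[:50]                # the first 50 who show up
probability = random.sample(population, 50)  # simple random sample

conv_mean = sum(convenience) / len(convenience)
prob_mean = sum(probability) / len(probability)
print(round(true_mean, 1), round(conv_mean, 1), round(prob_mean, 1))
```

The convenience estimate lands far above the true mean because ease of access is confounded with the trait, whereas the random sample's error is just ordinary sampling noise that shrinks with sample size.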

Triangulation method

Triangulation is a blend of techniques applied to study the same phenomenon. It implements cross-validation by combining complementary methods in an effort to obtain convergent data. It allows researchers to experiment with several different options, increases confidence in the findings, and encourages exploration in various directions. It prompts researchers to devise innovative approaches alongside the data collection methods already in use, and it can reveal unexpected aspects of the events being analyzed. New perspectives can uncover elements that do not fit well within current models, requiring those models to be revised into new forms that in turn permit a better understanding of the problem at hand. (Jick, 1979, 602-611)

Multiple methods can cause theories to converge. As a thorough approach, triangulation helps ensure the completeness of theories: the methods incorporated are combined so as to capture the benefits of each, and the approach pushes researchers closer to the different sources of data. Triangulation has weaknesses as well. Results can be hard to reproduce; most forms of research call for some kind of replication, yet replication is currently less common than expected, and reproducing a mixed qualitative procedure is too tedious to become regular practice. Further problems arise when the focus of the research is not clearly defined. Triangulation may also bias a researcher toward one particular method; when one method appears more effective than the others, the reason must be justified adequately or the objective of triangulation is undermined. (Mathison, 1988, 12-17)

Measuring instrument (BAS: The Bidimensional Acculturation Scale for Hispanics)

Acculturation can be defined as a gradual process in which people who come into contact with two or more cultural environments change their original culture and absorb some characteristics of the new one. The Bidimensional Acculturation Scale for Hispanics (BAS) was developed to measure that process. Most previous acculturation scales shared some general problems: they treated acculturation as a unidimensional process or produced one-dimensional results, most researchers did not observe the different ways acculturation can change over time, and psychometric methods were not applied to manage the data and derive the acculturation scales.
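The bidimensional idea can be sketched as two independently scored cultural domains. The item names and the 1-4 response format below are hypothetical placeholders, not the published BAS items; the point is only that the two domains are scored separately rather than as opposite ends of one scale.

```python
# Hypothetical responses on a 1-4 frequency scale (not the actual BAS items).
responses = {
    "speaks_spanish": 4, "thinks_in_spanish": 3, "spanish_media": 4,  # Hispanic domain
    "speaks_english": 2, "thinks_in_english": 2, "english_media": 3,  # non-Hispanic domain
}

hispanic_items = ["speaks_spanish", "thinks_in_spanish", "spanish_media"]
non_hispanic_items = ["speaks_english", "thinks_in_english", "english_media"]

# Each domain is averaged on its own; neither score constrains the other.
hispanic_score = sum(responses[i] for i in hispanic_items) / len(hispanic_items)
non_hispanic_score = sum(responses[i] for i in non_hispanic_items) / len(non_hispanic_items)
print(hispanic_score, non_hispanic_score)
```

Two independent scores can represent a bicultural respondent (high on both domains), a state that a single unidimensional scale cannot express.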

BAS has…



