About this Forum

Welcome to the Magate Wildhorse Strategy and Evaluation Answers Forum!

Here you can join members from around the world to discuss issues related to strategy, execution, monitoring, and evaluation. Share your thoughts, ideas, and experiences from the field, and have your questions answered. Find innovative solutions and increase impact by learning from others how to apply change marketing and knowledge management to improve strategy execution.

Join us for discussions on all things related to strategy and execution. Key sub-themes include:

  • Program, project and organizational success
  • Poverty reduction and economic growth
  • Entrepreneur and enterprise development
  • Water, sanitation, health and environment
  • Sustainable development
  • Making markets work for the poor
  • Project and program management
  • Strategy and strategic change management
  • Monitoring, evaluation and assessments
  • Business incubation
  • Change marketing

Submit questions about:

  • Strategy development, planning and implementation
  • Work plan development and implementation
  • Program and organizational performance measurement and management
  • Designing and implementing monitoring and evaluation systems
  • Conducting monitoring, evaluation or organizational assessment
  • Cause or change marketing
  • Growing your business
  • Improving the outcomes or impact of your program, project, organization or department of government
  • An economic growth initiative for which you need strategy, evaluation or market development support

Be an active participant:

  • Participate in all forums, storythons and blogathons
  • Submit a request for help privately to Magate Wildhorse
  • Increase your visibility
  • Subscribe to our newsletter

Who is this forum for?

This forum is not just for consultants, researchers, and monitoring, evaluation, and strategy practitioners. Organizational leaders and program and project staff of micro, small, and medium enterprises, governments, community-based organizations, and international development organizations are all welcome to join.

Registration is free and as easy as 1-2-3. Join our community today!
Have a question about registration?  Click here for help.


Research Methodology or Method: How to Tell the Difference

A keen ear for research topics will often hear a research method described as a research methodology, or a research methodology described as a research method.

In this post I hope to help the mind believe what the eyes see and the ears hear, by presenting what I will call classic, or strict, definitions and examples for accurately distinguishing between the two terms.

Thanks to Google, I will not have to reinvent the wheel; instead I will refer to the article “Research dilemmas: Paradigms, methods and methodology” by Noella Mackenzie and Sally Knipe of Charles Sturt University. In the article, Mackenzie and Knipe cite the definition of research methods offered by McMillan and Schumacher in Research in Education, which reads as follows: “Research methods – how data are collected and analysed – and the types of generalizations and representations derived from the data” (McMillan & Schumacher, 2006, p. 12).

By explaining that method consists of “systematic modes, procedures or tools used for collection and analysis of data”, the authors make it easier for readers to understand what may be described as a research method. Among the variety of data collection tools that make up methods are survey questionnaires, interviews, focus groups, and photographs.

Methods
Source: Magate Wildhorse

Framework for Research Methodology

James R. Martin, Ph.D., CMA, Professor Emeritus, University of South Florida (Management and Accounting Web): http://maaw.info/ArticleSummaries/FrameworkForResearchMethodology.gif

With our memories refreshed on methods, let us now turn our attention to the term methodology, which covers much more than methods do. Mackenzie and Knipe describe research methodology as “the overall approach to research linked to the paradigm or theoretical framework”. In other words, methodology explains how the researcher will solve the problem being addressed by the research, and it includes, among other components, the methods, frameworks, and indicators of success that will be applied to the study.

Now that we have focused our eyes on what is covered by the term research method, namely tools, modes, and procedures for data collection and analysis, versus methodology, which covers theoretical frameworks, methods, and the other components described above, we hope it will be easier for the mind to distinguish the often-heard misuse of the terms from their more accurate, accepted, and formal meanings.

You might find it useful to visit the article and take a quick look at the two tables listed below, which the authors used to illustrate the differences in meaning between research method and research methodology.

  • Table 1: Paradigms: Language commonly associated with major research paradigms
  • Table 2: Paradigms, methods and tools

“Research dilemmas: Paradigms, methods and methodology” (Issues in Educational Research, Volume 16, 2006) is available online at http://www.iier.org.au/iier16/mackenzie.html.

Scott, M. E., Copyright © 2015 Magate Wildhorse. All rights reserved.

Thank you for visiting Magate Wildhorse and reading this post; we look forward to your comments and future visits.

 

Evaluation is Evaluation: True or False?

Evaluation is evaluation: if you know how to conduct an evaluation, you just need to apply that knowledge to other circumstances.

Although that “case closed” or “end of story” perspective may appear obvious, it is a very risky and false assertion. Baking a cake and designing an evaluation have two things in common: the approach taken can be simple or complex, and both will fail to achieve the intended purpose if the method is not appropriate to the desired result.

Cappuccino Cheesecake
Source: Wilton
http://www.wilton.com/recipe/Cappuccino-Cheesecake

Just as the method and approach for making a cheesecake differ from those for making a rich fruit cake, so the methods and approaches for conducting evaluations vary.

Evaluation design and methods vary according to the stage in the life cycle of the evaluand, the availability and condition of data, and the suitability of the method for providing solid evidence on the issue that is central to the evaluation.[1] Budget, ethics, objectivity, relevance to decision-making, and time are among several other factors that shape the environment in which an evaluation is conducted; together with the factors mentioned above, they serve as the basis for selecting the most appropriate evaluation method.

To save time deciding on an evaluation design, you should revisit the Terms of Reference (TOR) to determine the level of program results and what is involved at the output and outcome levels.[2] If you have passed the proposal-writing stage, you should not only revisit the TOR but also engage in dialogue with the commissioners of the evaluation to fully understand its real purpose and how the results are intended to be used. Once you have identified the levels of results, you can then identify and categorize the evaluation issues, which are usually grouped into three main categories: 1) continued need for and relevance of the intervention, 2) results, and 3) cost-effectiveness.

The evaluation issues discovered will influence your choice of evaluation strategy; the strategy will determine the quality of the evidence to be gathered and, combined with the nature of the decision-making environment, will help you decide on the most appropriate evaluation methods. One generally begins with a thorough understanding of the program theory. We do this by building a logic model, which includes, where possible, both a working theory of change and a theory of action, as in the sketch below. Next, we identify evaluation issues and questions to establish whether the program theory is defensible and valid, which results are or are not expected, and whether there is an ongoing connection between needs and results.
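
To make the logic model step concrete, here is a minimal sketch of one way such a model might be recorded so that evaluation questions can be tied to each level of results. It is not drawn from the cited sources; the class shape, field names, and example entries are illustrative assumptions only.

```python
# A minimal, illustrative sketch only: the class shape, field names, and
# example entries are our own assumptions, not from the cited sources.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One simple way to record a program logic model."""
    inputs: list = field(default_factory=list)      # resources invested
    activities: list = field(default_factory=list)  # what the program does
    outputs: list = field(default_factory=list)     # direct products delivered
    outcomes: list = field(default_factory=list)    # short/medium-term changes
    impact: list = field(default_factory=list)      # long-term change sought

model = LogicModel(
    inputs=["funding", "trainers"],
    activities=["business-skills workshops"],
    outputs=["120 entrepreneurs trained"],
    outcomes=["improved business practices"],
    impact=["increased household income"],
)

# Evaluation questions can then be tied to the level of results they test,
# e.g. whether the stated outputs plausibly lead to the stated outcomes.
for level in ("outputs", "outcomes", "impact"):
    print(level, "->", getattr(model, level))
```

Tying each evaluation question to a level recorded this way helps confirm that the program theory is being tested at every link in the chain, not only at the level of final impact.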

With respect to methods of inquiry, an evaluation issue involving a factual or descriptive question will require the design of a comparison and supporting measurements as the approach for answering the evaluation question. Conversely, a strategic or normative issue that requires a judgment will usually call for an evaluation comparison comprising the evidence required or anticipated, and lines of inquiry for delivering the best evidence to support decision-making on the identified questions. In all cases, a combination of evaluation approaches and methods is often required to ensure accuracy and to validate the strength of the evidence.

Rich Fruit Cake
Source: Grace Foods
http://www.gracefoods.com

It would appear, therefore, that evaluation is “not” evaluation after all. By now it should be evident that your choice of evaluation approach plays a major role in shaping the evaluation design and the data collection and analysis techniques. It is important to note that different kinds of evaluation methods lend themselves to different kinds of data analysis and statistical testing, factors that have implications for internal validity, reliability, and precision, for example. It is good to know the different evaluation methods that may be used and that are appropriate for providing useful information to effectively inform policy or program decision-making.
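
As an illustration of how design and statistical testing connect, the sketch below shows one common analysis for a randomized design: a difference-in-means test between treatment and control groups. It is our own illustration, not taken from the cited sources; the scores are invented, and Welch's t-test (via SciPy) is simply one reasonable choice of test.

```python
# A minimal, illustrative sketch only: the scores are invented, and the
# choice of Welch's t-test is one reasonable option among several.

from scipy import stats

treatment = [68, 74, 71, 79, 66, 75, 72, 70]  # randomly assigned to program
control   = [63, 69, 65, 70, 61, 66, 68, 64]  # randomly assigned to no program

# Welch's t-test compares group means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the observed mean difference is unlikely under
# the null hypothesis of no program effect; it is the random assignment
# that licenses reading that difference causally.
```

With a quasi-experimental or non-experimental design, the same arithmetic could be run, but the causal interpretation would be much weaker, which is exactly the internal-validity point made above.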

There are three broad categories of evaluation design, each with several relevant methodologies: experimental designs, quasi-experimental designs, and non-experimental designs. Randomized, or experimental, designs are considered the most rigorous and preferred methods and are often referred to as the gold standard, followed in rigour by quasi-experimental designs and then non-experimental designs.

The last category requires a lesser degree of rigour, though it is a popular choice for conducting constructivist-type evaluations, for example. Non-experimental designs may be converted into implicit quasi-experimental designs, enabling the evaluator to draw stronger inferences and make stronger propositions, owing to the increased strength of conclusions and validity, than would be possible using a traditional non-experimental approach.
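
The following sketch, again our own and with invented numbers, illustrates the logic of that strengthening: adding a comparison group turns a simple pre/post reading (non-experimental) into a difference-in-differences estimate (quasi-experimental), which nets out change that would likely have occurred anyway.

```python
# A minimal, illustrative sketch only: all numbers are invented.

# Mean outcome scores before and after the intervention
program_pre, program_post = 52.0, 61.0        # group that received the program
comparison_pre, comparison_post = 50.0, 54.0  # similar group that did not

# Naive pre/post estimate: attributes ALL observed change to the program.
pre_post_change = program_post - program_pre            # +9.0

# Difference-in-differences: nets out the change the comparison group
# experienced anyway (secular trends, maturation, and so on).
background_change = comparison_post - comparison_pre    # +4.0
did_estimate = pre_post_change - background_change      # +5.0

print(f"Pre/post change:           {pre_post_change:+.1f}")
print(f"Background change:         {background_change:+.1f}")
print(f"Difference-in-differences: {did_estimate:+.1f}")
```

The stronger inference rests on the assumption that, absent the program, both groups would have followed parallel trends; the comparison group supplies the counterfactual that a pure pre/post design lacks.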

Viewed as a tool for passing judgment on the worth or importance of a program, product, project, policy, or action, it is reasonable to say that evaluations are one and the same: evaluation is therefore evaluation. On the other hand, when it comes to designing and conducting an evaluation, the task must be approached with the knowledge that evaluations differ significantly. How and why they differ depends on their purpose; the life stage of the intervention; the issues central to the evaluation; environmental factors, including data quality and budget; stakeholders' understanding of and stake in the evaluation; and the scope of impact of the decisions that will be made on the basis of the findings.

Wedding Cake
Source: Selena Cakes
http://selenacakes.com

We have seen that a myriad of factors influence and shape the design and conduct of an evaluation; the result is that no two evaluations are the same. A cheesecake will never satisfy the taste buds of someone craving a rum-laced fruit cake, and vice versa; the best-made cheesecake will ruin the wedding for a bride and guests who are expecting a fruit cake. The same applies to evaluations and their stakeholders, except that serving the wrong dish at the evaluation table can have far more dire consequences for all stakeholders involved.

You are invited to follow this article for a table that summarizes the three main categories of evaluation design, their methodologies, sources for gathering evidence, and a select listing of tests used in data analysis.

We thank you for stopping by; thank you also for sharing your thoughts on the topic.

Scott, M. E., & Shepherd, R. P., Copyright © 2015 Magate Wildhorse. All rights reserved.

Robert P. Shepherd, PhD, CE, is Associate Professor and Supervisor, Diploma in Policy & Program Evaluation, at the School of Public Policy and Administration, Carleton University.


[1] Program Evaluation Methods: Measurement and Attribution of Program Results (Public Affairs Branch, Treasury Board of Canada Secretariat), [Online], available at: http://www.tbs-sct.gc.ca/cee/pubs/meth/pem-mep03-eng.asp, accessed 21 March 2015.
[2] Public Affairs Branch, Treasury Board of Canada Secretariat, op. cit.