Updates from August, 2016

  • davidm34 11:46 am on August 29, 2016 Permalink | Reply  

    MANCOVA statistics: Query 

    I am trying to conduct a repeated measures MANCOVA in SPSS, where I need to measure my dependent variable at 4 time points (years). The dependent variable at each year is also related to 2 independent variables. To be honest, it is very complicated for me: I am unable to understand the concept of MANCOVA, and running it in SPSS is the icing on the cake. Any help regarding this would be really appreciated.

     
    • brunoperryblog 12:48 pm on August 30, 2016 Permalink | Reply

      MANCOVA is not that complicated once you get the hang of it. As I am not a professional, I can’t explain it to you, but I found a similar kind of query at the following link. It may be useful.

      How to conduct Repeated Measures MANCOVA in SPSS

    • terrywellch 5:56 am on September 1, 2016 Permalink | Reply

      Hey, you can also run repeated measures from the General Linear Model menu in SPSS. Keep the variables constant when testing the effects in that model. You can probably find an SPSS guide for the same.
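
      For anyone who wants to sanity-check a similar design outside SPSS, here is a minimal sketch in Python (pandas + statsmodels). It fits a linear mixed model rather than a MANCOVA proper, with a random intercept per subject to handle the repeated yearly measurements; the file and column names (survey.csv, subject, y_2013 to y_2016, cov1, cov2) are made up for illustration.

        # Rough sketch, not SPSS: reshape 4 yearly measurements to long format
        # and fit a mixed model with the two covariates.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("survey.csv")            # hypothetical wide file: one row per subject

        long_df = df.melt(
            id_vars=["subject", "cov1", "cov2"],  # covariates repeat for each year
            value_vars=["y_2013", "y_2014", "y_2015", "y_2016"],
            var_name="year",
            value_name="y",
        )

        # Random intercept per subject accounts for the repeated measurements;
        # C(year) captures the within-subject time effect, cov1/cov2 act as covariates.
        model = smf.mixedlm("y ~ C(year) + cov1 + cov2", data=long_df, groups=long_df["subject"])
        print(model.fit().summary())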

  • robert591 4:58 am on January 29, 2015 Permalink | Reply  

    choosing a statistical tool for my research 

    Hello members,
    I am preparing my PhD proposal at the moment. I have already got my topic approved by my guide. I am studying the effect of factors such as HR practices, training and development, compensation satisfaction, stress, and organizational commitment on employee turnover intention. I am not sure which statistical tool I should use for the analysis and results. I will be using primary data obtained through a questionnaire survey. Any suggestions?

     
    • Jamie 7:33 am on January 31, 2015 Permalink | Reply

      Hey Robert,
      Choosing the right statistical tool is very important, since the conclusions of your research will depend largely on this choice.
      I conducted a similar study, the only difference being that I studied the impact on organizational performance. Turnover intention is a good angle to study.
      You can use Karl Pearson’s coefficient of correlation and accompany it with a graphical analysis. The impact can be properly studied by computing the degree of correlation.
      All the best, Robert.
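
      A minimal sketch of this suggestion in Python, assuming the questionnaire responses sit in a CSV with one column per factor; the file and column names below are made up for illustration.

        import pandas as pd
        from scipy import stats

        df = pd.read_csv("responses.csv")   # hypothetical survey export

        factors = ["hr_practices", "training", "compensation", "stress", "commitment"]
        for factor in factors:
            # Pearson correlation of each factor with turnover intention
            r, p = stats.pearsonr(df[factor], df["turnover_intention"])
            print(f"{factor}: r = {r:.2f}, p = {p:.3f}")

        # The accompanying graph analysis can be a simple scatter plot, e.g.
        # df.plot.scatter(x="stress", y="turnover_intention")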

  • Michael 9:49 am on September 5, 2014 Permalink | Reply  

    Tips on Data Analysis 

    Data analysis is considered the most crucial step in the dissertation process. If we pay a little attention to it, it surely pays us back with the desired results. Here are a few quick tips on how you can do it better –
    1. Form a clear, specific, and concise hypothesis BEFORE the analysis, as it gives a better understanding and makes the analysis much easier to do.
    2. Choosing the right analysis technique is really important. Try to consult your guide before opting for one to avoid duplication.
    3. Always re-check your assumptions before you start the analysis; it is really important that they are consistent with each other (see the small sketch after this list).
    4. Always perform the analysis on a rough draft first. Show it to your guide, and if he approves it, prepare the final copy.
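
    A small illustration of tip 3, assuming Python with SciPy: two of the most common assumption checks are a normality test within each group and a test for equal variances across groups. The sample data below is made up.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      group_a = rng.normal(50, 10, size=30)    # made-up scores for group A
      group_b = rng.normal(55, 10, size=30)    # made-up scores for group B

      print(stats.shapiro(group_a))            # Shapiro-Wilk: normality within a group
      print(stats.levene(group_a, group_b))    # Levene: homogeneity of variance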

     
  • Rosaline 9:42 am on August 31, 2014 Permalink | Reply  

    Data Analysis and its types 

    Data analysis is the process of inspecting and transforming data with the goal of discovering something useful and productive. It is done in a study, on the basis of the survey, to draw conclusions and make decisions. Below are some of its types (a tiny example contrasting the first and third types follows the list) –

    1. Descriptive – Requires the least effort; it is used to describe the main features of the collected data in a qualitative manner.
    2. Exploratory – Under this method the data is analyzed to find previously unknown facts and to define future studies and questions.
    3. Inferential – It is mainly used when the data set is too large to study in full, so conclusions about the population are drawn from a sample.
    4. Predictive – Under this method the analysis is done on the basis of historical facts and studies to anticipate future outcomes.
    5. Causal – It is used to see the effect of a change in one variable on another.
    6. Mechanistic – This type requires the most effort and helps to understand the exact changes in variables for individual objects.
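
    A tiny illustration of descriptive versus inferential analysis in Python, using made-up sample data (the numbers are illustrative, not from any real survey):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      sample = rng.normal(loc=100, scale=15, size=200)   # e.g. 200 survey scores

      # Descriptive: summarize the main features of the data actually collected.
      print("mean:", sample.mean(), "std:", sample.std(ddof=1))

      # Inferential: use the sample to draw a conclusion about the wider population,
      # here a one-sample t-test against a hypothesized population mean of 95.
      print(stats.ttest_1samp(sample, popmean=95))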

     
  • Dr S Loretti 3:56 am on May 31, 2014 Permalink | Reply  

    Summarizing Data using Histograms 

    A histogram is a data representation format that uses bars to describe the frequency of occurrence of the event being studied. It also conveys necessary and relevant basic information about the data, such as its spread, shape and skewness.

    Histograms serve as one of the best ways to represent data in summarized form. Being fairly easy to construct compared with other representative charts, histograms also make the results somewhat more engaging than a table-format display of the data. These charts are useful for assessing the current situation and for devising ways to improve it. Even after executing the improvement methods, histograms can be created to study the impact of the methods applied.

    When read alongside descriptive statistics such as the mean, standard deviation, skewness and kurtosis, a histogram can be interpreted to judge the normality of the data distribution.
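
    A minimal sketch of the idea above in Python (matplotlib + SciPy): draw a histogram and report the spread, skewness and kurtosis for made-up data; all names and numbers are illustrative.

      import numpy as np
      import matplotlib.pyplot as plt
      from scipy import stats

      rng = np.random.default_rng(2)
      data = rng.normal(loc=60, scale=8, size=500)   # made-up measurements

      plt.hist(data, bins=20, edgecolor="black")
      plt.xlabel("Measured value")
      plt.ylabel("Frequency")
      plt.title("Histogram of sample data")
      plt.show()

      print("std dev :", np.std(data, ddof=1))
      print("skewness:", stats.skew(data))
      print("kurtosis:", stats.kurtosis(data))   # excess kurtosis; about 0 for a normal curve
      print(stats.normaltest(data))              # quick check of normality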

     
    • Alex 11:25 am on June 2, 2014 Permalink | Reply

      Not just easy to prepare, they are also among the most appropriate ways to summarize quantitative data.

      • Joel 9:54 am on September 5, 2014 Permalink | Reply

        Right, Alex, they are one of the most relevant and easiest ways to encapsulate quantitative data.

    • John R. Rooks 6:29 am on August 6, 2014 Permalink | Reply

      Histograms are best used when your data can be described with no more than 2 variables.

    • Vincent A. Solis 5:04 am on September 29, 2014 Permalink | Reply

      It is really easy to present data through histograms.

  • Dr S Loretti 3:46 am on March 29, 2014 Permalink | Reply  

    The importance of data to economics 

    One discipline that requires careful analysis of large volumes of data is economics. That is why those who choose to make a career in this field are also required to be skilled in mathematics and statistics. Some people use Excel files to accumulate all this data so as to analyze it at a later stage. Then there are others who prefer to let modern software applications carry out this work for them. Many of the economic upheavals of the modern world have been detected or predicted in advance because some economists were smart enough to look at the data and understand what it meant.

     
    • Robert 11:26 am on June 13, 2014 Permalink | Reply

      Accumulating data into suitable categories is very cumbersome!

    • Gertrude J. Padilla 6:35 am on August 8, 2014 Permalink | Reply

      Software for data accumulation and collation can be used.

      • Michael 10:50 am on September 5, 2014 Permalink | Reply

        Yeah! It is less tiresome and much less tedious.

    • Rosa M. Garren 5:04 am on September 29, 2014 Permalink | Reply

      Economics is essentially a study of data, so data is of the utmost importance.

  • Dr S Loretti 7:00 am on February 25, 2014 Permalink | Reply  

    Tips for effective referencing 

    Referencing is meaningful only when the standard bibliographic format is used to cite every source to which the paper is tied. Follow these useful tips while preparing the reference section:

    1. Record the bibliographic information as and when the sources are used, to avoid a last-minute hassle.
    2. Ensure a consistent format is applied for citing all the sources.
    3. Cite only the sources that are directly relevant to the area of study.
    4. Never cite incomplete or abstract sources.
    5. List all the references alphabetically.
    6. Follow the appropriate citation format for information from different sources; separate formats are used for journal papers, books, newspaper articles and online content.
    7. Make sure that each in-text citation has a corresponding entry in the reference list.
    8. Favor the present tense and active voice.
    9. Avoid the use of dramatic adjectives.
    10. Devote sufficient time to this section and proofread carefully before the dissertation is submitted.
     
    • Steven 4:38 am on February 26, 2014 Permalink | Reply

      I have given a few references in the form of footnotes; am I still required to cite them in the References section?

    • Dennis Scott 4:32 am on February 27, 2014 Permalink | Reply

      Footnotes must be cited in the body of the text and listed on a separate page titled Footnotes.

    • Mary R. Thomas 5:05 am on September 29, 2014 Permalink | Reply

      Referencing is so important; I had never given it much thought. I will definitely keep these tips in mind while preparing the references for my thesis.

  • Dr S Loretti 7:06 am on November 28, 2013 Permalink | Reply
    Tags: Data analysis

    Hazards of incorrect data collection 

    Imagine if the data collected by a bank were incorrect! It could lead to inaccurate dividend statements; incorrect addresses could mean cheques being sent to the wrong recipients; people could make false claims on credit cards; government funds that need to be distributed would go to the wrong households. Imagine if simple data from the census or a government population study was wrong and not cleaned. It would mean wrong statistics of all kinds. If it were a welfare program study, it would mean the wrong people get more aid than the deserving folk, all because the data was ‘dirty’ and not cleaned.

    Imagine if data from a clinical trial of a drug got mixed up. The outcome could be a life-and-death situation, where a drug might be approved on the basis of false data, or a life-saving drug might be rejected, just because the data was not cleaned. The same can happen to thesis data: if the data collected in a student’s project is not cleaned, it can be misleading and change the entire inference and analysis of the study used in the project/thesis. In all these cases, a simple cleaning process gives access to reliable data, removes the glitches from it, and leads to more reliable results in the statistical analysis that follows.
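
    A minimal sketch of such a cleaning pass in Python with pandas; the file name, column names and thresholds below are made up for illustration.

      import pandas as pd

      df = pd.read_csv("collected_data.csv")                       # hypothetical raw export

      df = df.drop_duplicates()                                    # remove duplicate records
      df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # mistyped numbers become NaN
      df = df.dropna(subset=["id", "amount"])                      # drop rows missing key fields
      df = df[df["amount"].between(0, 1_000_000)]                  # drop implausible values

      df.to_csv("collected_data_clean.csv", index=False)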

     
    • Richard Brien 12:10 pm on November 30, 2013 Permalink | Reply

      Before reading this, I never thought that incorrect data could create such a menace on a larger scale. I actually don’t pay serious attention to maintaining records correctly.

      • Mathew Mcquire 12:12 pm on December 2, 2013 Permalink | Reply

        Exactly Brien! I was planning a visit to a nearby clinic today but after reading this I’m really getting nervous thinking what if they have messed up their drug information…lol.

    • Richard Brien 12:14 pm on December 5, 2013 Permalink | Reply

      Lol…from now, I’m gonna be very serious about all my important data.

    • Mathew Mcquire 12:14 pm on December 6, 2013 Permalink | Reply

      Me too man!

    • Diane D. Steely 6:35 am on July 6, 2014 Permalink | Reply

      Data collection and maintenance is a must in today’s times, when you see frauds and scams everywhere!

    • Teresa J. Mann 5:06 am on September 29, 2014 Permalink | Reply

      The biggest issue is that we have to do a lot of rework and hard work.
