IRI/Education Analytics 101-1: Merging Data Files

One of the most useful SAS techniques to know in IRI/Data Analytics is merging.  Often an IR Data Analyst or Associate at a higher education institution needs to merge data from different sources to complete a report.  For example, an administrator at College AAA wants to know which colleges its transferred-out students chose.  Knowing this information helps the decision makers at College AAA make strategic decisions.  This is particularly important for colleges that are competing for state money under a “Performance-Based Funding” system.  Readers will learn how to complete such tasks from the SAS merging concepts and code presented here.  This paper summarizes the different kinds of merging that can be done in SAS.
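As a minimal sketch of the merge concept (the College AAA transfer-out example above), the same operation can be illustrated in Python with pandas; the data and column names below are hypothetical, and the pandas left merge is used here as an analogue of a SAS BY-variable merge, not as the SAS code itself:

```python
import pandas as pd

# Hypothetical College AAA enrollment records.
college_aaa = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "last_term": ["2015FA", "2015FA", "2016SP", "2016SP"],
})

# Hypothetical transfer-out records showing where students
# subsequently enrolled (e.g., clearinghouse-style data).
transfer_out = pd.DataFrame({
    "student_id": [102, 104, 105],
    "transfer_college": ["College BBB", "College CCC", "College DDD"],
})

# Merge on the common key, keeping every College AAA student;
# students with no match get a missing transfer_college.
merged = college_aaa.merge(transfer_out, on="student_id", how="left")
print(merged)
```

Students 102 and 104 pick up their destination college, while 101 and 103 remain with a missing value, which is exactly the kind of output an administrator would need to study transfer-out choices.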

Note: This is the first article in the Data Analytics 101 series.  The Association will start publishing skills, techniques, references, and code to help the IRI/Data Analytics profession grow.

Data Analytics 101

The Association has discussed that Data Analytics (DA) is not the same as Data Visualization (DV); rather, DV is a part of DA.  Checking whether your office actually practices DA is simple.  The short list below will help answer the question.

  1. How many of your team members know how to write code in SAS, R, or Python?
  2. How often are strategic decisions made based on multivariate or categorical statistical analyses (complete, of course, with hypothesis tests) or on mathematical programming such as linear, nonlinear, or simulation models?

If your answers are “none” and “rarely,” then most likely what your office is currently doing falls under DV, not DA.  DV can be done through Tableau or Excel, the common tools for visualizing data.

A Street Light Alone Is Not Enough: The Industry Needs Effective Enforcement Actions

The Association predicted exactly these outcomes if rules and regulations were not implemented effectively at Title IV institutions.  There is no assurance that regulations which look good on paper will be implemented effectively.  AAEA has used the following analogy to explain the situation: in addition to the street light, enforcement is needed to ensure that traffic regulations are followed in full, because some motorists will try to pass even though the light is red.  After so many years of ineffective enforcement policy in the US higher education industry, the American public now experiences the pain.  Traffic violators get a ticket, but what are the consequences for school administrators who have violated the regulations?  If there are none, then one may expect to see the same stories repeated in the future.

The Association Reaches Another Milestone

Three and a half years ago, the Association was established with the main purpose of revitalizing US higher education.  On August 30, 2016, AAEA reached another milestone: more than 200K visitors.  This shows two main points: (1) the American public has confidence in AAEA’s published research results; (2) many are able to get strategic information out of AAEA’s research work, free of charge.

Changing the mindset for managing higher education institutions to cope with fundamental changes in the ultra-competitive environment in which they operate is not always easy.  However, we are glad to see that players in the industry have come to realize what the Association has tried to convey and hypothesized in the past.

The following are the real changes that we have seen after AAEA was established:

  1. The use of education or IRI analytics, which had never been heard of before, is becoming more popular. It has become a norm for colleges to adopt this new paradigm after it was introduced for the first time at the North Carolina Community College System Annual Meetings in Raleigh, NC on October 7, 2012 and at the 2012 South Central SAS Users Group Annual Conference in Houston, TX.
  2. Some schools have changed their Institutional Research & Assessment office name to something that includes the word “analytics,” for example, Office of Institutional Research and Analytics.
  3. States have started to apply Performance-Based Funding systems to award annual budgets among state colleges.
  4. More inappropriate practices have been revealed in recent years.
  5. The regulator has started tightening its grip on both the academic and financial metrics of Title IV institutions, as more regulations have been introduced in the past few years.
  6. The regulator is paying more attention to for-profit higher education institutions’ business models.
  7. US higher education is the most discussed issue in the society.
  8. In 2015, the regulator started to publish a list of financially troubled US colleges.
  9. The regulator seems to pay more attention to US higher education.
  10. The accrediting agencies have started improving their roles in ways that may not have been seen before.

Since its establishment, the Association has contributed many more positive impacts to the industry that could be added to the list above.  While this partial list shows that our mission is being accomplished, there are many more tasks that AAEA will continue to work on in the future.

List of Financially Troubled US Colleges

The news that the DOE banned new ITT students from access to federal loans may have surprised many parties.  More than three years ago, the Association initiated research on the issue based on published NCES data.  Beginning in 2015, the DOE started publishing the same information to the American public, though it may have applied different methods than AAEA.  The following is the list of other institutions under the regulator’s microscope.

ITT and Student Loans

On August 25, 2016 the US Department of Education announced that new ITT students cannot use federal loans or grants to enroll at the institution.  The DOE’s policy follows ACICS’ recommendation.  The accreditor’s dramatic action may well be connected to the fact that, back in June 2016, DOE staff recommended that ACICS itself be stripped of its role as an accrediting agency.  What can one learn from this story?


Reporting Categorical Data

Yesterday, the Association wrote a gentle reminder on how to interpret the 2016 NSSE report results.  If taking the mathematical average (the statistical mean) or calculating the standard deviation and standard error is not exactly appropriate, then what is the more appropriate way to report the results?  The straightforward answer: try frequencies instead.
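A minimal sketch of the point in Python, using hypothetical Likert-style survey responses: summarize categorical data with counts and percentages per category rather than a mean of the response codes.

```python
from collections import Counter

# Hypothetical ordinal (Likert-style) survey responses.
responses = ["Very often", "Often", "Often", "Sometimes",
             "Never", "Often", "Sometimes", "Very often"]

# Frequency and relative frequency per category, which is an
# appropriate summary for categorical data.
counts = Counter(responses)
n = len(responses)
freq_table = {cat: (cnt, round(100 * cnt / n, 1))
              for cat, cnt in counts.items()}
for cat, (cnt, pct) in freq_table.items():
    print(f"{cat}: {cnt} ({pct}%)")
```

A frequency table like this preserves the categorical nature of the responses, whereas averaging the underlying codes would treat the gap between “Never” and “Sometimes” as if it equaled the gap between “Often” and “Very often.”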

NSSE and Categorical Data

By now, US higher education institutions that participated in the 2016 NSSE survey may have received their results.  This note serves as a friendly reminder for them to take extra caution when reading the reports.  Needless to say, taking the mathematical average or statistical mean of categorical data is, from the standpoint of education/IRI or data analytics, not exactly appropriate.  The Association raised this issue a couple of years ago.

The Impacts of Dropping Parts of the ACT and SAT Assessment Tests

Today we learned that more US colleges have dropped, or plan to drop, certain parts of the assessment tests from their admissions requirements, such as the ACT/SAT subject tests or essay scores.  Needless to say, statistical analyses have shown that some of these dropped assessment components have significant effects on both student retention and graduation.  Perhaps IRI analytics applied to these institutions’ own past evidence has shown otherwise, and in-depth studies have been done at those colleges to support such admissions policy changes.

Dropping these components will surely help increase the institutions’ enrollment, which in turn will increase tuition revenue.  While this policy sounds great, it may create another problem for both retention and graduation rates, the two important metrics usually used in state performance-based funding.  Lowering admissions standards will increase tuition revenue only in the short run; in the long run it will potentially reduce money received from the state, especially in states where performance-based funding is applied to make award decisions.  The institutions may be able to keep retention and graduation rates unchanged, or even improve them, after the policy is implemented, but only by lowering their course passing requirements.  In other words, reducing admissions standards will force instructors to lower their course standards as well.  Making college admissions less rigorous is a clear signal of decreasing student enrollment across the US.  It may be the beginning of self-destruction for the US higher ed system.

Lucky Are Those With The IRI Analytics Expertise

Three years ago the Association published many articles in its blog related to IRI analytics and data scientists.  Today we learned that the shortages of such expertise are real in the US.  How will this new development affect the Institutional Research & Student Learning Outcomes and Assessment profession?  It is very obvious.  When the Association first initiated the use of the terms education analytics and IRI analytics, many traditionalists thought it was an absurd idea that would never get off the ground.

Currently the facts show that many colleges, universities, and other higher ed institutions in the US have renamed their IR departments to something with the word “analytics” attached, e.g., Office of Institutional Analytics instead of the traditional “Office of Institutional Research and Assessment”.  For example, the University of New Mexico renamed its IR office the “Office of Institutional Analytics”, while the University of North Texas relabeled its IR department the “Office of Data, Analytics, & Institutional Research”.  Moreover, Borough of Manhattan Community College has replaced its traditional name with “Institutional Effectiveness and Analytics”.  The list goes beyond these three institutions, and it will get longer as time passes.

One important note the Association would like to share is that changing the name is a good sign and the first step toward embracing the IRI Analytics paradigm.  However, the change of name needs to be followed by changes in culture and mentality and by improving the actors’ skill sets, with enough training to equip personnel with IRI expertise, for example, moving from “something point and click” tools to the ability to write SAS code.  SAS proficiency runs parallel with one’s understanding of statistics and of the concepts behind data types.  This is exactly the gap that Microsoft is partially trying to fill with its newly launched programs.

There are four main obstacles preventing the US higher education industry from having enough personnel to deal with this new reality, which AAEA successfully predicted more than three years ago.  First, administrators may not be used to using data in the decision-making process.  Second, the recent leaders who make the hiring decisions are not analytics experts by training.  Third, the majority of current IR professionals (analysts, associates, and the like) may have limited IRI analytics skills as well, for they are products of the past.  Lastly, US colleges have to compete with the manufacturing, financial, and health industries, among others, for such talent.  While associates or analysts are paid around $40K-$50K or less to carry out education analytics jobs, it is surely more difficult to get well-rounded professionals when other industries, on average, pay at least 30 to 50 percent more than educational institutions are currently willing to pay.  As a result, higher ed institutions will continue to rely on outside consultants.  However, consultants are not magicians who can fix long-run institutional challenges overnight.  In the near future, US colleges may start poaching IRI experts from other higher ed institutions.