The Association Reaches Another Milestone

Three and a half years ago, the Association was established with the main purpose of revitalizing US higher education.  On August 30, 2016, AAEA reached another milestone: more than 200,000 visitors.  This suggests two main points: (1) the American public has confidence in AAEA’s published research results; (2) many are able to extract strategic information from AAEA’s research work, free of charge.

Changing the mindset of those managing higher education institutions so they can cope with fundamental changes in the ultra-competitive environment in which they operate is not always easy.  However, we are glad to see that players in the industry have come to realize what the Association has tried to convey and has hypothesized in the past.

The following are real changes that we have seen since AAEA was established:

  1. The use of education or IRI analytics, terms rarely heard before, is becoming more popular. It has become the norm for colleges to adopt this new paradigm after it was first introduced at the North Carolina Community College System Annual Meetings in Raleigh, NC, on October 7, 2012, and at the 2012 South Central SAS Users Group Annual Conference in Houston, TX.
  2. Some schools have renamed their Institutional Research & Assessment offices to something that includes the word “analytics”, for example, the Office of Institutional Research and Analytics.
  3. States have started to apply performance-based funding systems to award annual budgets among state colleges.
  4. More inappropriate practices have been revealed in recent years.
  5. The regulator has started tightening its grip on both the academic and financial metrics of Title IV institutions, as more regulations have been introduced in the past few years.
  6. The regulator is paying more attention to the business models of for-profit higher education institutions.
  7. US higher education has become one of the most widely discussed issues in society.
  8. In 2015, the regulator started to publish a list of financially troubled US colleges.
  9. The regulator seems to be paying more attention to US higher education.
  10. Accrediting agencies have started improving how they perform their roles, something that may not have been seen before.

Since its establishment, the Association has contributed many more positive impacts to the industry that could be added to the above list.  While this partial list shows that our mission is being accomplished, there are many more tasks that AAEA will continue to work on in the future.

List of Financially Troubled US Colleges

The news that the DOE has barred ITT from enrolling new students with federal loans may have surprised many parties.  More than three years ago, the Association initiated research on the issue based on published NCES data.  Beginning in 2015, the DOE started publishing similar information to the American public, though it may have applied different methods than AAEA.  The following is the list of other institutions that are under the regulator’s microscope.

ITT and Student Loans

On August 25, 2016, the US Department of Education announced that new ITT students cannot use federal loans or grants to enroll at the institution.  The DOE’s policy follows ACICS’s recommendation.  There appears to be a connection between the Accreditor’s dramatic action and the fact that DOE staff recommended, back in June 2016, that ACICS itself be restricted from its role as an accrediting agency.  What can one learn from this story?


Reporting Categorical Data

Yesterday, the Association wrote a gentle reminder about how to interpret the 2016 NSSE report results.  If taking the mathematical average (the statistical mean) or calculating the standard deviation and standard error is not exactly appropriate, then what is a more appropriate way to report the results?  The straightforward answer: why not report frequencies instead?
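
As a minimal sketch of reporting by frequency, assume a hypothetical SAS data set named nsse2016 containing one Likert-type NSSE item stored in a variable called askquest (both names are illustrative, not the actual NSSE export).  PROC FREQ then reports each response category by count and percentage rather than by a mean:

    /* Report a categorical survey item by frequency rather than by mean.  */
    /* The data set and variable names here are hypothetical placeholders. */
    proc freq data=nsse2016;
        tables askquest / nocum;   /* counts and percentages, no cumulative columns */
    run;

A cross-tabulation against another categorical variable (for example, a hypothetical class_level, via tables askquest*class_level) works the same way and still avoids averaging category codes.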

NSSE and Categorical Data

By now, US higher education institutions that participated in the 2016 NSSE survey may have received their results.  This note serves as a friendly reminder to take extra caution when reading those reports.  Needless to say, from the standpoint of education/IRI or data analytics, taking the mathematical average, or statistical mean, of categorical data is not exactly appropriate.  The Association raised this issue a couple of years ago.

Data Analytics vs. Data Visualization: How Are They Different?

Recent growth in the application of data-driven decision making is phenomenal.  More and more companies are shifting their decision-making strategies from relying only on business concepts taught in MBA programs to data analytics.  As a result, demand for professionals with data mining and analytics expertise and experience parallels what happened years ago in the labor market for MBA graduates.  While the MBA market has cooled and slowed in recent years, demand for data analytics professionals is soaring.

Along with this development, we have noticed that almost everyone is trying to re-brand their organization.  In the process, some people think that data visualization and data analytics have the same meaning and content, or refer to the same thing.  In fact, they do not.  Data visualization is just one of the many things that data analytics professionals can do.  One thing is certain: implementing data analytics requires some form of statistical analysis, whether estimation, hypothesis testing, mathematical programming, or simulation.  Visualizing data does not always generate the ultimate answers needed to solve business problems; it needs to be supported by more rigorous study.  This is particularly true when visualization leads only to inconclusive strategic changes or recommendations.  Hypothesis tests and mathematical programming, on the other hand, generate clearer-cut results that can support strategic actions, so long as the basic assumptions of randomness, the Central Limit Theorem, and others are satisfied.  The ideal skill set for a successful data analytics professional consists of knowing the industry and the concepts taught in MBA programs, knowing how to write code to solve business problems, understanding different types of data, having the talent to translate business strategies into computer code, knowing the concepts of data security, knowing how to manage data (merging, slicing), knowing how to treat outliers and deal with missing observations, and having the ability to build econometric models.
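
To make the visualization-versus-analysis point concrete, here is a minimal sketch, assuming a hypothetical data set named enrollment with a numeric outcome yield_rate and a two-level grouping variable campaign (all names are made up for illustration).  A box plot may suggest a difference between the two groups, but a formal two-sample t-test is what supports a strategic decision, provided the usual assumptions hold:

    /* Hypothetical data set and variable names, for illustration only. */
    /* Step 1: visualize the two groups (descriptive, not conclusive).  */
    proc sgplot data=enrollment;
        vbox yield_rate / category=campaign;
    run;

    /* Step 2: a formal two-sample t-test to back the visual impression. */
    proc ttest data=enrollment;
        class campaign;      /* two-level grouping variable    */
        var yield_rate;      /* numeric outcome being compared */
    run;

The visualization step answers “what does it look like?”; the test step answers “is the difference real enough to act on?”.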

The Impacts of Dropping Parts of the ACT and SAT Assessment Tests

Today we learned that more US colleges have dropped, or plan to drop, certain components of the assessment tests used as admissions requirements, such as ACT/SAT subject tests or essay scores.  Needless to say, statistical analyses have shown that some of these dropped assessment components have significant effects on both student retention and graduation.  Perhaps IRI analytics has been applied to institutional evidence showing otherwise, and in-depth studies have been done at those colleges to support such admissions policy changes.  Dropping the components will surely help increase the institutions’ enrollment, which in turn will increase tuition revenue.  While this policy sounds great, it may create another problem for both retention and graduation rates, the two metrics commonly used in state performance-based funding.  Lowering admissions standards will increase tuition revenue only in the short run; in the long run it will potentially reduce money received from the state, especially in states where performance-based funding is used to make award decisions.  Institutions will only be able to keep their retention and graduation rates unchanged, or even improve them, after the policy is implemented by lowering their course passing requirements.  In other words, reducing admissions standards will force instructors to lower their course standards as well.  Making college admissions less rigorous sends a clear signal of decreasing student enrollment across the US, and it may be the beginning of self-destruction for the US higher education system.
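
For institutions that want to check this with their own records, a minimal sketch of one such analysis follows, assuming a hypothetical data set named admits with a 0/1 retention flag retained and admissions predictors essay_score, hs_gpa, and act_composite (all names are illustrative, not a prescribed model):

    /* Hypothetical data set and variable names, for illustration only.    */
    /* Logistic regression of first-year retention on admissions metrics,  */
    /* including the component (essay_score) being considered for removal. */
    proc logistic data=admits;
        model retained(event='1') = essay_score hs_gpa act_composite;
    run;

If the coefficient on the component being dropped is significant and sizable, removing it from the admissions file removes information the institution currently uses to predict retention.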

Lucky Are Those With The IRI Analytics Expertise

Three years ago, the Association published many articles in its blog related to IRI analytics and the data scientist role.  Today we learned that shortages of such expertise are real in the US.  How will this development affect the Institutional Research & Student Learning Outcomes and Assessment profession?  The answer is obvious.  When the Association first started using the terms education analytics and IRI analytics, many traditionalists thought it was an absurd idea that would never get off the ground.

Currently, the facts show that many colleges, universities, and other higher ed institutions in the US have renamed their IR departments to something with the word “analytics” attached, e.g., Office of Institutional Analytics instead of the traditional “Office of Institutional Research and Assessment”.  For example, the University of New Mexico renamed its IR office the “Office of Institutional Analytics”, while the University of North Texas relabeled its IR department the “Office of Data, Analytics, & Institutional Research”.  Moreover, Borough of Manhattan Community College has replaced its traditional name with “Institutional Effectiveness and Analytics”.  The list goes beyond these three institutions, and it will get longer as time passes.

One important note the Association would like to share is that changing the name is a good sign and the first step toward embracing the IRI analytics paradigm.  However, the change of name needs to be followed by changes in culture and mentality and by improving the actors’ skill sets, with enough training to equip personnel with IRI expertise: for example, moving from “something point and click” to the ability to write SAS code.  Working in SAS goes hand in hand with one’s understanding of statistics and of data types.  This is exactly the gap that Microsoft is partially trying to fill with its newly launched programs.
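
As a small, hypothetical sketch of what that shift looks like, the step below creates a toy data set with character (categorical) and numeric variables and then inspects how SAS stores each; every data set and variable name in it is made up for illustration:

    /* Hypothetical example: character (categorical) vs. numeric variables. */
    data work.survey;
        length student_id $ 5 response $ 12;         /* character variables     */
        input student_id $ response $ credit_hours;  /* credit_hours is numeric */
        datalines;
    S001 Very_often 15
    S002 Sometimes 12
    S003 Never 9
    ;
    run;

    /* Inspect how SAS stores each variable (type and length). */
    proc contents data=work.survey;
    run;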

There are four main obstacles to the US higher education industry having enough personnel to deal with this new reality, a reality that AAEA successfully predicted more than three years ago.  First, administrators may not be used to using data in the decision-making process.  Second, the recent leaders who make the hiring decisions are not experts in analytics by training.  Third, the majority of current IR professionals (analysts, associates, and the like) may have limited IRI analytics skills as well, for they are products of the past.  Lastly, US colleges have to compete with the manufacturing, financial, health, and other industries for such talent.  While associates or analysts are paid around $40K-$50K or less to carry out education analytics jobs, it is certainly harder to attract well-rounded professionals because other industries, on average, pay at least 30 to 50 percent more than educational institutions are currently willing to pay.  As a result, higher ed institutions will continue to rely on outside consultants.  However, consultants are not magicians who can fix long-run institutional challenges overnight.  In the near future, US colleges may start poaching IRI experts from other higher ed institutions.

Empty Promises?

There has been a lot of talk on the campaign trail about how the candidates will deal with the gigantic $1.3 trillion student loan debt.  This indicates that the country has finally realized there is something wrong with US higher education, though some say otherwise.  Waiting until 2016 to address such important issues shows that:

  1. For some people, student loans are not really important; the issue surfaces as campaign rhetoric, perhaps just to win student loan borrowers’ votes.
  2. The regulator often adopts a reactive policy instead of a proactive one, or even a “do nothing” policy.
  3. Most players in the past have pretended that the country’s higher education system is not broken.
  4. Some players have taken advantage of the fact that the system is badly broken.
  5. Past administrators have lacked the “will” to set things straight, and chose to quit rather than fight all the way through.
  6. The next administration needs to appoint a person-in-chief who has the backbone to say what is wrong and take real action to fix things, regardless of challenges from many sides.
  7. It should appoint a person-in-chief who has the integrity, courage, and ability to focus on Uncle Sam’s interests and not others’.
  8. The next administration needs to be honest with the people, not just with the groups who happen to be its supporters on the campaign trail.
  9. A “big name school” is not always a credible signal that the person-in-chief has the qualities mentioned in points 5, 6, and 7.
  10. Empty promises or lies will cause more damage than good.

The remaining questions are: Can Uncle Sam find a person-in-chief with such qualities?  Will the next administration be able to carry out what has been promised on the campaign trail?  Or will it be just another empty promise, as with any politician?

The Waste Function Paradigm: What Is It?

Recently, we learned that the DOE is considering dropping ACICS from the accreditation business.  The recommendation confirms what the Association hypothesized many years ago about the acute problems associated with the quality of work of one of the US accreditation agencies.  Though the news on ACICS comes a bit late, better late than never.  Applying the Waste Function Paradigm to 11 years of published data, the Association has calculated a proxy for wasted resources at the other US regional accrediting agencies (note that only the DOE has the exact numbers).  The calculation is based on nominal (dollar) values and covers institutions in all Carnegie classifications; that said, there is potential for double counting.  Perhaps it is time for the regulator to take proactive rather than reactive action to protect taxpayers’ money and students’ and their families’ precious resources from the predators.