At the APPA 2019 conference, the Facilities Informatics Working Group (FIWG) co-chairs presented on the current issues the working group is seeking to address. The most prominent are (a) inconsistent definitions of terms and (b) differing data formats among software vendors. The biweekly working group meetings have engaged the enterprise software vendors that APPA members use to manage their business processes and solve information challenges. Through these discussions, we have discovered additional challenges that we are now focusing on to help APPA members prepare for the future.
Consistent Terms and Definitions for Facilities Management
The first challenge we are addressing is the reality of inconsistent definitions in the broader facilities management space, especially across educational institutions. To tackle this challenge, we need consensus on how we define terms such as “current replacement value” (CRV), “gross institutional expenditure” (GIE), and so forth. For instance, colleges and universities define terms such as “annual facility operational expenses” (AFOE), “auxiliary(ies),” “building usage,” and many others according to their own parameters, which makes it difficult to compare their datasets with one another.
In addition, given the level of personnel churn and other factors, these definitions are not even consistent over time, making year-to-year or month-to-month comparisons befuddling. During the working group discussions to date, we have also learned that some institutions are required to use certain definitions based on state rules, which themselves vary from state to state. For instance, while some universities have always used categories such as housing, dining, recreation, athletics, and student services to describe “auxiliaries,” others use that same term yet exclude intercollegiate athletics, student academic support, or certain retail activities.
Then there are terms like “parking.” Is that an auxiliary expense, or is it part of the core infrastructure? Changing the definition of a metric or term is also often a painstaking process, in that it may require buy-in from both internal and external (even legislative) parties. A well-established standard set of definitions would enable “apples-to-apples” comparisons and clarify, for the educational enterprise as a whole, what we are talking about and why it is critical to the mission we jointly serve.
The biweekly meetings have shown us that a few of the vendors are using self-developed definition standards, and the working group is pursuing that information. The vendors understand the need for consistent definitions, as they also struggle when comparing data, and as a result often spend countless hours working with their client institutions during software implementation. Data integrity is flawed, and rightly questioned, when you try to compare your institution’s data with that of other universities under differing definitions. The working group is also examining the information it has gathered to determine what level of granularity (detail) is required for various kinds of data. Does the data need to be captured at per-second frequency, or is a higher-level, less granular dataset the better answer to the questions you are encountering? The goal of this strategy is to limit the number of issues caused by the lack of standard definitions. If someone really wants to understand the cost of owning a facility, does the CRV really answer that question? The process of establishing clearer terms that are common across our profession is aided by such efforts as the total cost of ownership (TCO) ANSI standard (to be publicly released in January 2021) and the APPA terms and definitions webpage, which can be found at https://www.appa.org/terms-and-definitions/.
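To make the granularity question above concrete, here is a minimal sketch, assuming Python with pandas; the meter name and readings are hypothetical, not drawn from any institution’s data:

```python
# A minimal sketch of the granularity trade-off, assuming pandas.
# The meter and its values are hypothetical, for illustration only.
import numpy as np
import pandas as pd

# One hour of per-second chilled-water flow readings (3,600 data points).
index = pd.date_range("2021-01-04 08:00", periods=3600, freq="s")
flow_gpm = pd.Series(120 + np.random.normal(0, 2, size=3600), index=index)

# For a benchmarking question ("what does this building use over a shift?"),
# a 15-minute average usually answers it with a fraction of the storage:
quarter_hourly = flow_gpm.resample("15min").mean()
print(quarter_hourly.round(1))
```

The per-second stream matters for fault detection on the equipment itself; for cross-institution benchmarking, the aggregated series is usually both cheaper to store and easier to define consistently.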
Standard Data Properties and Nomenclature
The second challenge is how to handle the raw datasets and the questions they have generated in the process of clarifying definitions. For instance, when using data analytics software to compare air-handler information, if one vendor stores a value as an integer and another stores it as a string, the software cannot compare the two data points and instead generates a calculation error. This kind of issue, however, is one that can be solved quickly. The solution would require software vendors to agree on the data properties of their datasets. In this specific example, using standard nomenclature and data properties would allow colleges, universities, and K-12 schools (which openly share this information with APPA) to compare with one another to improve processes and energy usage; more broadly, it would help them improve their performance over time. The logical conclusion is that a standard (perhaps initially an APPA standard and, eventually, an ANSI/ISO standard) would greatly move this effort forward. Such a standard would benefit not only educational institutions but also vendor software systems, which would be able to gather consistent data types. One software vendor that has been participating in the working group mentioned being able to “tag” a data point in its software and link it with other datasets. The working group is following up on this concept to see whether it is a pathway forward even if vendors do not agree on protocols for data properties.
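Returning to the integer-versus-string example above, the sketch below uses hypothetical field names and vendor payloads (no real vendor’s API is implied) to show the mismatch and how an agreed-upon data property resolves it:

```python
# A sketch of the data-property mismatch described above. Field names
# and payloads are invented for illustration, not any vendor's real API.

# Two vendors report the same air-handler supply airflow, typed differently.
vendor_a = {"ahu_id": "AHU-01", "supply_cfm": 12500}      # integer
vendor_b = {"ahu_id": "AHU-01", "supply_cfm": "12,500"}   # formatted string

# Naive analytics code fails on the mixed types:
# vendor_a["supply_cfm"] - vendor_b["supply_cfm"]  -> TypeError

# An agreed-upon data property ("supply_cfm is a non-negative integer")
# lets each party normalize to the standard before any comparison.
def to_standard_cfm(value) -> int:
    """Coerce a vendor-reported airflow value to the standard integer type."""
    if isinstance(value, str):
        value = value.replace(",", "").strip()
    return int(float(value))

delta = to_standard_cfm(vendor_a["supply_cfm"]) - to_standard_cfm(vendor_b["supply_cfm"])
print(delta)  # 0 -- the readings agree once both conform to the standard
```

The point is not the coercion helper itself but that, with a published standard (“supply_cfm is a non-negative integer in cubic feet per minute”), neither party has to guess what the other meant.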
One of the goals of the FIWG is for institutions to be able to look at a piece of equipment (e.g., an HVAC system) and see what other institutions are spending on its maintenance, energy usage, life cycle, and so on. It would be extremely valuable to have such knowledge before purchasing an expensive piece of equipment, and to be able to compare your processes with those of other institutions to see where you can make improvements. The data for these kinds of process improvements is already available in most cases, but the storage and data analytics functions may still be in progress or in design. Depending on the institution’s level of data maturity, access to the necessary data and the ability to use it to control devices, change systems, and provide value will vary. (For more on data maturity, get the FIWG technical report at the APPA bookstore [https://www1.appa.org/bookstore/] by searching for the “APPA Facilities Informatics Maturity Matrix Technical Report.”)
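As a hypothetical illustration of that goal, assuming institutions shared equipment records in one agreed-upon shape (all field names and figures below are invented):

```python
# A hypothetical sketch of the cross-institution comparison the FIWG
# envisions, assuming a shared, standardized equipment record shape.
from dataclasses import dataclass
from statistics import mean

@dataclass
class EquipmentRecord:
    institution: str
    equipment_type: str           # standardized nomenclature, e.g., "AHU"
    annual_maintenance_usd: float
    annual_energy_kwh: float
    expected_life_years: int

shared = [
    EquipmentRecord("Campus A", "AHU", 4200.0, 96000.0, 20),
    EquipmentRecord("Campus B", "AHU", 6100.0, 88000.0, 18),
    EquipmentRecord("Campus C", "AHU", 3900.0, 102000.0, 22),
]

# With consistent fields, peer benchmarking reduces to simple arithmetic:
peer_maintenance = mean(r.annual_maintenance_usd for r in shared)
print(f"Peer average AHU maintenance: ${peer_maintenance:,.0f}/year")
```

Once the record shape is standardized, the pre-purchase comparison a facilities officer wants becomes ordinary arithmetic over the shared list rather than a data-reconciliation project.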
Conclusion
The FIWG has “gotten on base” with its work to this point. To “load the bases,” however, we need the two challenges discussed here to be resolved. To that end, we are working hard to tackle them while looking to solve others (such as aiding the evolution of Facilities Performance Indicators 2.0 [FPI 2.0]). The working group remains interested in working with industry partners, APPA members, and anyone else who can contribute to improving our knowledge and opening doors for our institutions well into the future. Keep an eye out for more news and results from our efforts in the year ahead.
Markus Hogue is program coordinator at the University of Texas at Austin and can be reached at [email protected]. Erik C. Backus is director, construction engineering management at Clarkson University in Potsdam, NY. He can be reached at [email protected].