Recently I was asked to opine on trends in the sensor marketplace. It is a question I have considered many times in my career, and lately a daily one given the importance of data to both personal and professional life. The challenge with the question is that a sensor is no longer just a physical thing. Today we can combine data inputs from multiple sources, some physical and some virtual, add cognitive insight, and craft a recipe for a new measurement. But recipes only work reliably when the ingredients are fresh and the instructions are followed diligently.

The relationship between data and sensors is special. For any given data stream one can define, sometimes in the abstract, a sensor that generates that data. Netflix, for example, uses our TVs, PCs, and mobile devices to “sense” our viewing preferences for its content. I was once part of a team that developed a “smart stethoscope” that could measure the level of bovine respiratory disease (BRD) in cattle. We mapped the cognitive analytics of two veterinarians into a digital stethoscope using a learning agent. The base detector in the stethoscope was an acoustic sensor, but the finished device was used, millions of times, as a “BRD sensor” scoring an animal’s health from 1 to 5. Not surprisingly, the resulting databases were of great interest to herd managers, veterinarians, and pharmaceutical companies alike.

My digression into the definition of a sensor is relevant because, while every measurement comes from a sensing element, not all sensors are equal. Physical sensors tend to measure base, or intrinsic, variables of the subject in real time. Derived or virtual sensors, like our BRD sensor, have subtleties: they embody assumptions and introduce latency. Derived sensing is particularly tricky when the derivation is presented as a simple score, the way a base sensor’s output is. The score can automate decisions and operations, but it can also hide risks, not the least of which is accuracy over time. When we identify a measurement or data stream that is critical to the operation of our business, e.g., credit for lease/loan originations, we must consider how we come by that data and whether it is the best, most accurate measurement of the subject. The integrity of a data stream is fundamental to the reliability, durability, and accuracy of the actions taken based on that data.
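The distinction between a base sensor and a derived sensor can be made concrete with a small sketch. This is not the actual BRD device; the class names, window size, and scoring thresholds are all illustrative assumptions. The point is structural: the derived "sensor" wraps a base sensor, bakes assumptions into its thresholds, and introduces latency by averaging over a window of past readings.

```python
# Illustrative sketch only: all names and numbers are hypothetical.
from collections import deque

class BaseSensor:
    """Stand-in for a physical sensor returning raw readings in real time."""
    def __init__(self, readings):
        self._readings = iter(readings)

    def read(self):
        return next(self._readings)

class DerivedScoreSensor:
    """Maps a window of raw readings to a 1-5 score, BRD-sensor style."""
    def __init__(self, base, window=3, thresholds=(10, 20, 30, 40)):
        self.base = base
        self.window = deque(maxlen=window)  # latency: the score lags the raw data
        self.thresholds = thresholds        # assumptions baked into the derivation

    def read(self):
        self.window.append(self.base.read())
        avg = sum(self.window) / len(self.window)
        # score = 1 plus the number of thresholds the windowed average exceeds
        return 1 + sum(avg > t for t in self.thresholds)

sensor = DerivedScoreSensor(BaseSensor([12, 18, 33]))
scores = [sensor.read() for _ in range(3)]  # → [2, 2, 3]
```

Note how the final raw reading of 33 only nudges the score from 2 to 3 because the window dilutes it; a consumer of the score alone never sees that lag, which is exactly the hidden risk described above.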

Credit scores are calculated from a variety of parameters, but they are most strongly influenced by payment histories. Often a credit score is a proprietary blend of payment data and financial parameters that, like a recipe for a good beer, is trusted to deliver good outcomes in risk decisions. But a brewer’s recipe starts with base ingredients: hops, malt, yeast, water, and so on. As long as these ingredients have integrity, i.e., are fresh, and the brewer follows the process, good beer is reliably produced. The principal ingredient of a credit score is payment history, but payment history is a derived measure, not a base ingredient. The base ingredient of payment is the operational profit that creates the ability to pay. Payment data can be 60 to 90 days behind the operational data that generated the profit used to make the payment. Payment data is not fresh; it is recycled operational performance data.

Freshness is a critical characteristic of data because, with the passing of time, data from any operation becomes a less accurate measure of that operation. Sometimes, as has been the case in equipment finance, we use whatever data we can get because some data is better than no data. But every organization must strive for continuous improvement, and a better understanding of data integrity is one way to improve.

The good news is that today’s ecosystem of connected equipment and digital services, the Internet of Things, is saturated with sensors. Back in 2015, when some IoT forecasters were talking about billions of connections, others were forecasting trillions of sensors (base, physical sensors) that would create a new economy. These sensors enable new sensing elements, derived in real time, for almost any aspect of an operation. Consider a paint spraying system used in commercial office construction. Just a few physical sensors can provide location, linear distance traveled, runtime, paint consumption, and spray coverage. If we add some cognitive, operational insight, we can measure, in real time, the cost of paint, the cost of labor, and the square footage painted by location. A simple pricing model, price per square foot painted, combined with the cost per gallon of paint and the hourly wage, then reveals an estimate of revenue, profit, and cash generation using a typical overhead percentage. With a few operational sensors on the equipment, we can measure and forecast the ability to pay for that equipment.
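The paint-sprayer arithmetic above can be sketched in a few lines. Everything here is a hypothetical illustration: the prices, wage, overhead percentage, and sensor readings are assumed values, not figures from the article. The sketch only shows the shape of the derivation, from base sensor readings to an estimate of the ability to pay.

```python
# Hypothetical sketch: deriving financial metrics from base sensor readings
# on a paint spraying system. All prices, rates, and readings are assumptions.

def derived_metrics(gallons_used, runtime_hours, sq_ft_painted,
                    paint_cost_per_gal=30.0, wage_per_hour=25.0,
                    price_per_sq_ft=0.50, overhead_pct=0.15):
    """Turn raw sensor readings into estimated cost, revenue, and profit."""
    paint_cost = gallons_used * paint_cost_per_gal   # from consumption sensor
    labor_cost = runtime_hours * wage_per_hour       # from runtime sensor
    revenue = sq_ft_painted * price_per_sq_ft        # from coverage sensor
    overhead = revenue * overhead_pct                # typical overhead assumption
    profit = revenue - paint_cost - labor_cost - overhead
    return {"paint_cost": paint_cost, "labor_cost": labor_cost,
            "revenue": revenue, "profit": profit}

# One day's (assumed) readings from the equipment:
m = derived_metrics(gallons_used=12, runtime_hours=6, sq_ft_painted=3600)
# → paint_cost 360.0, labor_cost 150.0, revenue 1800.0, profit 1020.0
```

Because the inputs arrive in real time from the equipment itself, the derived profit estimate is fresh in a way that 60-to-90-day-old payment data can never be.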

Credit sensors tell us where a customer has been, financially. Operational sensors tell us where a customer is going, financially. Operational sensors monitor both the equipment and the assumptions used in the contract throughout the life of the contract. They measure, in real time, the fundamentals of operational performance and the basics of financial health. One could say that credit scores recycle operational data in the form of the payments that a business chose to make with its profits.

Credit is, and will continue to be, one of the most critical decision parameters in equipment finance. It’s time to take a fresh look at how we measure it.

Written by

Scott Nelson

President & Chief Digital Officer, Board Member

Scott Nelson is the president and chief digital officer of Tamarack Technology. He has more than 30 years of strategic technology development, deployment, and design thinking experience working with both entrepreneurs and Fortune 500 companies. Nelson is a sought-after speaker and contributor on topics related to IoT and digital health. His involvement in the local and national technology communities reflects an ongoing and outstanding commitment to technology development and innovation.

