The procurement process is a subtle art, yet it is sometimes applied uniformly to too many different types of deliverables.
There are basically three ways to approach the issue:
The Lowest Qualified Bidder method is appropriate for acquiring goods and services that are available in a very comparable form from a number of suppliers.
Basically, asphalt is asphalt: the minimum specifications required to participate are sufficient to properly define the deliverable, which is specified 100% by the client.
“Quality” is well defined, and is not part of the evaluation beyond the minimum qualification standards.
This approach also lends itself well to collusion between suppliers: the winner is known in advance, since it will simply be the lowest price. The phenomenon has been amply documented in the public arena.
A second method gives suppliers some latitude to participate in defining the deliverable, and to demonstrate how they differ from the competition.
An example would be a copier purchase, where the customer defines roughly 70% of the deliverable, and the suppliers fight to make the remaining 30% attractive.
The result is very similar to the Lowest Qualified Bidder method, in that price is typically 2 to 4 times more important than quality.
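The 2-to-4 imbalance described above can be made concrete with a small Python sketch of a typical weighted-scoring grid (the 70/30 weights, bid prices, and quality scores below are hypothetical, chosen only to show the mechanics):

```python
# Hypothetical weighted-scoring grid: price counts for 70 points,
# quality for 30 (price is ~2.3x more important than quality).
def score(bid_price, lowest_price, quality_pct, price_weight=70, quality_weight=30):
    # A common formula: the lowest bid earns full price points,
    # other bids are scaled down by lowest_price / bid_price.
    price_points = price_weight * (lowest_price / bid_price)
    quality_points = quality_weight * (quality_pct / 100)
    return price_points + quality_points

# (price, quality %) for two illustrative bids
bids = {"A (cheap, weak)": (100_000, 60), "B (costly, strong)": (130_000, 95)}
lowest = min(price for price, _ in bids.values())
for name, (price, quality) in bids.items():
    print(name, round(score(price, lowest, quality), 1))
```

With these weights, the cheaper but weaker bid A outscores the better system B, which is exactly the distortion the text describes.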
A third method is becoming increasingly popular. It is applicable for purchases:
- of highly sophisticated technology products; the case in point is integrated Human Capital Management (HCM) software. A modern HCM system has 30 to 40 modules (e.g. payroll, pension, recruitment, performance), 5,000 data elements, 2,000 tables, 3,000 screens, millions of lines of code and documentation, and thousands of business rules;
- where the client is unable to fully document its requirements in sufficient detail to allow for operational delivery (e.g. all the subtleties in the day-to-day application of collective bargaining rules beyond the text, elements that are discovered long after a contract is signed);
- where the client risks setting too narrow a framework for the deliverables by defining them only partially;
- where the continuity of services over a long period is a sine qua non;
- where there are published and known unit-cost metrics (such as PEPY, Price per Employee per Year) for comparable installations, and where it is easy for the client to estimate at a high level what a quality system should cost.
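Since PEPY metrics are published and known, a client can sanity-check a budget with simple arithmetic before going to market (the headcount, PEPY range, and contract length below are hypothetical, for illustration only):

```python
# Hypothetical back-of-envelope budget estimate using PEPY
# (Price per Employee per Year); all figures are illustrative.
employees = 5_000
pepy_low, pepy_high = 300, 450   # assumed market range, $ per employee per year
contract_years = 5

low = employees * pepy_low * contract_years
high = employees * pepy_high * contract_years
print(f"Smart budget range: ${low:,} to ${high:,}")
```

An estimate of this kind is what allows the client to publish a credible budget rather than simply hoping the lowest bid is a fair one.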
In this method, the customer estimates a price at and below which it will award all the points on the price criterion, and lets the battle rage on the quality criterion.
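This fixed-budget mechanic can be sketched in a few lines (the published budget, weights, and bid figures below are hypothetical):

```python
# Hypothetical fixed-budget evaluation: the client publishes a smart
# budget; any bid at or below it earns full price points, so the
# competition shifts entirely to quality.
PUBLISHED_BUDGET = 2_000_000          # assumed published figure
PRICE_WEIGHT, QUALITY_WEIGHT = 20, 80  # quality dominates by design

def score(bid_price, quality_pct):
    # Full price points at or below the published budget;
    # bids above it lose price points proportionally.
    if bid_price <= PUBLISHED_BUDGET:
        price_points = PRICE_WEIGHT
    else:
        price_points = PRICE_WEIGHT * (PUBLISHED_BUDGET / bid_price)
    return price_points + QUALITY_WEIGHT * (quality_pct / 100)

print(score(2_000_000, 92))  # modern, strong system at the full budget
print(score(1_600_000, 55))  # discounted but weak system
```

Because every bid at or below the published budget earns the same price points, the only way to win is on quality.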
And it will be up to the vendors to demonstrate in scripted demos how their systems will support the customer’s business.
Hoping to save costs and ending up with a second-rate system in a strategic area like HCM is not sound economics. This is how organizations end up with dozens of redundant systems to fill the gaps left by a non-integrated, bottom-of-the-range system.
In technology, there are a large number of vertical markets where the systematic application of the Lowest Qualified Bidder has led to perverse results:
- a reduction in competition by making it impossible for new entrants with new technologies to justify an R&D and go-to-market effort;
- a gradual elimination of smaller suppliers;
- the creation of oligopolies where a very small number of large players cut their R&D and innovation costs to remain the lowest bidder.
After a few decades of this regime, entire sectors are left with obsolete technologies, and at the mercy of oligopolies.
Procuring an HCM system is an opportunity to modernize the way you manage the most important resource in any organization: your people. It should not result in acquiring the lowest-cost system, which is typically the least efficient and the least up to date.
It is possible to judge what a smart budget would be for the organization, publish it, and then let competitors present the quality they offer; this method results in fair value for the expense and the best possible modern, scalable systems.
By far the most expensive acquisition is a weak system bought at a discount, which will then require compensating with small, non-integrated parallel systems, Excel files, manual processes, etc.
And there is more to this scattered approach today: people’s personal data ends up in multiple copies in unsecured systems, highly vulnerable to malicious access.
All for the sake of trying to save a little money on the initial acquisition, which represents “peanuts” in the grand scheme of things.
And you won’t know what you’re missing, because the best solutions won’t waste their time competing in a lowest-cost market controlled by an oligopoly. Once an oligopoly is well established and competition is gone, the lowest prices cease to be low prices: they are merely the lowest of a set of prices that are all exorbitant relative to the quality delivered.