As the 2019 trade show and conference season ramps up, security integrators face the sometimes daunting task of researching, evaluating and ultimately choosing the new products and partners they want to align themselves with. Midlothian, Va.-based Richmond Alarm, an electronic security provider ranked No. 71 on the SDM 100 that serves 25,000 customers, recently constructed a system to quantify and standardize this process. Here is their story:
At Richmond Alarm we know our main objective is to deliver the most advanced security solutions to every one of our customers. Even so, it can be all too easy to keep “assessing products” on the to-do list indefinitely. Business is booming, competition is growing, and managing sales, customer service, staff and the bottom line is enough to fill the day, the week and the year without taking on a labor-intensive project. And yet evaluating new products must happen on a regular schedule if the business is to keep up with the latest technology.
In early 2018, we resolved to streamline our product vendors by building a product selection framework from the ground up. I formed a product committee led by Tom Kenney, our senior project manager, and made up of representatives from sales, technical, central station and management. In the first couple of meetings we solicited feedback from each department and came up with two working lists: customer needs and company needs.
Customer needs included:
- ease of use – how intuitive the system is to use;
- remote access – whether a customer can easily access the system remotely;
- reliability – wear and tear, warranty period;
- flexibility – scalability of the solution; and
- integration – whether it integrates easily with other popular solutions.
Company needs were:
- pricing – relative to competitors;
- supply chain – whether it is purchased directly or via a distributor;
- sales training – how the sales team is supported;
- demo program – whether demo equipment is offered virtually or on-site;
- technical training – how the technical team is supported;
- technical support – hours, location and wait times relative to competitors;
- legacy support – takeover options for existing customers, backwards compatibility;
- warranty – reliability, process and length of period; and
- service accessibility – service team access and ways to manage truck rolls.
Our team designed a vendor scorecard incorporating all of these requirements so we could objectively quantify our findings about each product and supplier. We then embarked on a journey to ISC West, the industry’s largest trade show, to see the most expansive display of vendors in one setting. We split into teams to engage with as many vendors across as many product lines as possible, and where a vendor’s initial scores were high, we scheduled a follow-up meeting for the entire team.
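To make the tallying concrete, here is a minimal sketch in Python of how a scorecard like this could be totaled. The criteria mirror the two lists above; the 1-to-5 scale, the vendor names and the code itself are illustrative assumptions, not our actual spreadsheet.

```python
# Minimal scorecard sketch. The criteria mirror the customer- and
# company-needs lists above; the 1-5 scale and ratings are illustrative.

CRITERIA = [
    # customer needs
    "ease of use", "remote access", "reliability", "flexibility", "integration",
    # company needs
    "pricing", "supply chain", "sales training", "demo program",
    "technical training", "technical support", "legacy support",
    "warranty", "service accessibility",
]

def total_score(ratings: dict) -> int:
    """Sum a vendor's 1-5 ratings across all criteria; unscored items count as 0."""
    return sum(ratings.get(c, 0) for c in CRITERIA)

# Two hypothetical vendors rated on the show floor
vendors = {
    "Vendor A": {"ease of use": 5, "remote access": 4, "pricing": 3, "warranty": 4},
    "Vendor B": {"ease of use": 3, "remote access": 5, "pricing": 5, "warranty": 2},
}

# Rank vendors by total score, highest first -- the basis for a shortlist
shortlist = sorted(vendors, key=lambda v: total_score(vendors[v]), reverse=True)
print(shortlist)  # ['Vendor A', 'Vendor B']
```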
Once we returned to Richmond, we totaled our scores, made a shortlist of a dozen potential vendors, and invited each one to an on-site interview. We designed it as an interview rather than a one-way presentation, because growing their understanding of our company and its needs was, and is, vital to a solid, ongoing business relationship. The sales teams were all professional and prompt, and surprisingly grateful when we sent them blank copies of our scorecards so they could arrive fully prepared.
Having narrowed our list to suppliers we felt were promising in their ability to deliver, we let the products themselves determine our final decision. Could each product we were interested in completely integrate security/access/CCTV/fire systems? Was the customer interface easy to use, and could it be accessed remotely? How accessible was the system for an installer?
When all presentations were complete, we compiled our results, re-engaged the top two or three in each category and entered the next phase: discussing pricing, whether we would have exclusivity of the product in the region, which distributors best represented the solution, marketing resources and support, and a timeline for commitments.
In the end, we had several takeaways that might prove useful to others. The first was that setting aside bias from the beginning of the process was critical to success. Every product we had or aspired to have, and even every supplier relationship, was on the table. Opening ourselves up this way enabled members of the team to speak freely about any struggles they were having, and the results were occasionally surprising.
Diversity was another: having long-tenured, knowledgeable employees from each department was critical not only in designing the process but in scoring suppliers and products. Features and flaws of each product and supplier took on different weighting depending on whether the department representative thought it would positively or adversely affect the department’s future success. Equally, serving on the committee had a positive impact in-house: everyone involved became a stakeholder in the result, which bolstered their affinity for the company and their buy-in to the implementation.
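As a rough extension of the scorecard sketch above, department-specific weights could be layered onto the same raw ratings so that, say, the central station counts reliability more heavily than sales does. The weight values and department mappings below are invented for illustration, not our committee’s actual numbers.

```python
# Illustrative department weighting: each department boosts the criteria
# it cares most about; anything unlisted keeps a default weight of 1.0.

DEPT_WEIGHTS = {
    "sales":           {"pricing": 2.0, "sales training": 2.0, "demo program": 1.5},
    "technical":       {"technical training": 2.0, "technical support": 2.0},
    "central station": {"remote access": 1.5, "reliability": 2.0},
    "management":      {"pricing": 1.5, "warranty": 1.5},
}

def weighted_score(ratings: dict, dept: str) -> float:
    """Weight raw 1-5 ratings by one department's priorities."""
    weights = DEPT_WEIGHTS.get(dept, {})
    return sum(score * weights.get(criterion, 1.0)
               for criterion, score in ratings.items())

# The same vendor can rank differently department to department
ratings = {"pricing": 3, "technical support": 5, "remote access": 4}
print(weighted_score(ratings, "sales"))      # 3*2.0 + 5*1.0 + 4*1.0 = 15.0
print(weighted_score(ratings, "technical"))  # 3*1.0 + 5*2.0 + 4*1.0 = 17.0
```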
Trade shows were another: testing multiple products side by side at the show would have been difficult and time-consuming to duplicate in our offices, and the one-on-one meetings we had with suppliers were invaluable. As an aside, we’d suggest suppliers make sure they have a process for testing demos before each show, as several companies were having issues. We’d also suggest suppliers staff their booths adequately: those that could make time for one-on-one meetings despite the busy schedule were at an advantage.
But the biggest takeaway was protecting and improving the process we developed so it can be repeated on a regular schedule, ensuring we always stay on top of technology and can best meet our customers’ needs, now and in the future. — By Brian Vanderheyden, CEO, Richmond Alarm.