Machine Vision Camera Selector
Designing a robust product selection tool for FLIR’s most spec-sensitive market.
Summary
I led UX/UI design and project-managed the implementation of our Camera Selector, a powerful tool that lets customers quickly find the best camera for their needs.
Role
UX/UI design in the early phase; Project Management once approved for development.
Why
FLIR acquired Point Grey, a machine vision-focused company, in early 2018. In late 2018 we migrated their content onto flir.com with one major exception: their camera selector. Without a table-based overview of our models, many customers complained that they had trouble finding the best camera for their specific needs.
One of the reasons we didn’t initially migrate Point Grey’s camera selector onto flir.com is that it frankly wasn’t very usable. There was no fixed grid, so the table would stretch horizontally without limit if customers opted to show too many columns. It was also slow to respond to customer input.
As we learned more about the machine vision industry, however, it became clear that purchase decisions were primarily based on camera specs. To succeed in the machine vision market, it was critical to give customers the tools they needed to evaluate large numbers of cameras quickly and easily. But we wanted to build a camera selector that met our UX and brand standards. That meant it needed to fulfill a few requirements:
It needed to be as flexible as possible in case our OEM cores, R&D thermal, or any other vertical wanted to use it for their product categories.
It needed to fit within FLIR.com’s 1200-pixel grid, ensuring the full display of information was always within view.
It needed to naturally integrate into the customer journey on flir.com. This meant it needed to be part of our standard Product Listing pages.
Solution
I flew up to Vancouver, BC to collaborate with the Machine Vision team on this project. Initially, they wanted me to design essentially the same camera selector they had on their old site. As we discussed further, however, I was able to persuade them that the user experience tradeoffs involved with horizontal scrolling were not worth the additional data density.
We agreed to three compromises that would allow us to legibly display the most critical information within the table, while keeping the contents within our 1200px grid:
The dataset would primarily contain numeric data. This allowed us to keep the columns narrow, even in languages with longer words (such as German). It also played to one of a table’s most useful features: the ability to sort numeric data.
All 14 filters from the standard Product Listing page would be shown. This meant users could still isolate models based on attributes such as Readout Method or Sensor.
We would add the ability for customers to easily compare the full specs of up to three cameras.
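The third compromise above, comparing up to three cameras, can be sketched as a simple capped selection. This is a minimal illustration in TypeScript; the function and constant names are hypothetical, not taken from the actual implementation.

```typescript
// Hypothetical sketch: a compare list capped at three cameras.
const MAX_COMPARE = 3;

function toggleCompare(selected: string[], modelId: string): string[] {
  // Clicking an already-selected camera removes it from the comparison.
  if (selected.includes(modelId)) {
    return selected.filter((id) => id !== modelId);
  }
  // Once three cameras are selected, further selections are ignored.
  if (selected.length >= MAX_COMPARE) {
    return selected;
  }
  return [...selected, modelId];
}
```

Returning a new array rather than mutating the input keeps the selection state predictable when it is shared across UI components.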
I left Vancouver with a direction for the tool and a basic wireframe:
Back in Portland, the design went through a number of iterations as we worked through several questions:
We were anticipating extremely heavy use of the filters section. How could we make that section as useful as possible?
How could we naturally integrate this tool into the product selection journey for machine vision customers?
Were there additional interactions we wanted to support from this page, such as “Add to Cart” or “Add to Wishlist”?
To integrate the selector into the product selection journey, we decided to present it as just another way to view our product catalog:
If customers apply filters in the grid or list views and then switch to the Model Selector, their filter selections carry over (and vice versa).
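One common way to carry filter selections across views like this is to serialize them into URL query parameters, so every view reads and writes the same state. The sketch below assumes that pattern; the `Filters` shape and function names are illustrative, not the actual implementation.

```typescript
// Hypothetical sketch: filter state shared across views via URL query params.
type Filters = Record<string, string[]>;

// Serialize selected filter values, e.g. { Megapixels: ["1.3", "2.0"] },
// into a query string any view can read.
function filtersToQuery(filters: Filters): string {
  const params = new URLSearchParams();
  for (const [attr, values] of Object.entries(filters)) {
    if (values.length > 0) params.set(attr, values.join(","));
  }
  return params.toString();
}

// Parse the query string back into the same filter structure.
function queryToFilters(query: string): Filters {
  const filters: Filters = {};
  for (const [attr, joined] of new URLSearchParams(query)) {
    filters[attr] = joined.split(",");
  }
  return filters;
}
```

Keeping filter state in the URL has a side benefit: customers can bookmark or share a filtered view of the catalog.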
To account for the anticipated heavy use of the filter section, we made a couple of design tweaks to how filters are typically presented:
Filters could either be cleared per-attribute or all at once.
Multiple values selected would appear within the same attribute grouping to conserve space (see Megapixels and Pixel Size below).
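The two clearing behaviors above map to two small state operations. This is a hedged sketch with illustrative names; the actual implementation may differ.

```typescript
// Hypothetical sketch: per-attribute clearing vs. clearing all filters.
type Filters = Record<string, string[]>;

// Clear a single attribute (e.g. "Megapixels") while leaving the rest intact.
function clearAttribute(filters: Filters, attr: string): Filters {
  const { [attr]: _removed, ...rest } = filters;
  return rest;
}

// Clear every filter at once.
function clearAll(): Filters {
  return {};
}
```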
Final filter UI
Here is the InVision prototype we used to iterate on the UX / UI design.
And here is the final UI:
Once we reached alignment with the Machine Vision team, I worked closely with our external development partner to ensure the Camera Selector worked flawlessly on all browsers.
Results
The Camera Selector launched to extremely positive feedback from customers and the internal Sales team. It has also performed very well in usability tests (though we have some changes planned for phase 2). It gave customers what they’d been asking for: an interactive view into our hundreds of machine vision cameras so they could find the best camera for their project. It attracts ~8,000 sessions per month and has played an important role in the eComm revenue growth we’ve seen over the last couple of years.
A mobile version was not in scope for phase 1. It will be part of a major phase 2 effort focused on integrating the selector into additional product categories planned for later in 2021.