Dashboard Research and Design
Generative Research | A/B Testing | Iterations
At AGCO, I was hired as an Engineer, but along the way I fell in love with user interviews, usability testing, and iterative design. As a side project, I created a dashboard to evaluate vendor performance, and as it grew in popularity I decided to expand its reach. I conducted user interviews, usability tests, and A/B testing to establish the dashboard as a credible source of information on supplier KPIs.
My new KPI dashboard design helped improve AGCO's 2018 Global Parts Per Million (PPM) quality assurance metric by 150 points, and the dashboard is still in use today. Pretty awesome, in my opinion.
After a series of mergers and acquisitions of manufacturing facilities, AGCO Corporation struggled to centralize data on vendors shared by different facilities across North America, South America, Europe, and Asia Pacific. The company needed to optimize its supply chain around high-performing vendors, and eliminate poor-performing vendors, based on the PPM KPI.
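For readers unfamiliar with the metric: PPM here is the standard quality-assurance measure of defective parts per million parts received. A minimal sketch of the calculation (the numbers below are illustrative, not AGCO data):

```python
def ppm(defective_parts: int, total_parts: int) -> float:
    """Defective parts per million parts received."""
    return defective_parts / total_parts * 1_000_000

# Example: 12 defective parts out of 80,000 received
print(ppm(12, 80_000))  # → 150.0
```

Lower is better, so "improving by 150 points" means 150 fewer defective parts per million received.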
Each site collected its own vendor data, created its own reports in Excel, and emailed them to management at the site, regional, and global levels. These reports were submitted in different formats, and at times the same supplier even appeared under slightly different names, e.g. ECI Group vs. Electrical Components Incorporated.
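Reconciling those mismatched supplier names is a classic data-cleaning step. One simple approach, sketched below with a hypothetical alias table (the names and mapping are illustrative, not the actual solution used), is to map each site's spelling to a single canonical vendor name before aggregating:

```python
# Hypothetical alias table mapping site-level spellings to one canonical name.
ALIASES = {
    "eci group": "Electrical Components Incorporated",
    "electrical components inc": "Electrical Components Incorporated",
    "electrical components incorporated": "Electrical Components Incorporated",
}

def canonical_vendor(name: str) -> str:
    """Return the canonical vendor name, falling back to the cleaned input."""
    key = name.strip().lower().rstrip(".")
    return ALIASES.get(key, name.strip())

print(canonical_vendor("ECI Group"))  # → Electrical Components Incorporated
```

Without a step like this, the same supplier shows up as two different vendors, and every downstream KPI is split across both.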
I took the initiative to design a solution that eliminated Excel sheets, email blasts, and human error. I researched, designed, developed, launched, and iterated on a vendor dashboard that made data available globally and instantly. The dashboard was a new feature within APEX, an online tool AGCO was already using.
Field studies with Quality Engineers showed that users wanted to see how vendors at their own location performed in any given month, as well as year to date. Month-to-month data illustrated whether a vendor's KPI was improving, which was key information for the vendor escalation process. The study also indicated a need to filter or sort by commodity, not just location. Commodity KPIs were essential in determining how to resolve issues with poor-performing vendors: a site could determine, for example, whether it needed more resources and knowledge allocated to "electronics" vendors vs. "plastics" vendors.
Stakeholder interviews revealed the need for global KPI information, not solely aggregated site or commodity information. This highlighted a second type of user: Supply Chain Executives. Before the dashboard, Executives were also on the mass email list that carried the large Excel files. The monthly report would often get lost in their inboxes, and they relied on conference calls to get the most up-to-date KPI information.
The first round of testing was an A/B test of two designs. One mimicked the Excel document, with several lists showing the best and worst vendors per site. The other was a single page of graphs and lists that users could manipulate with filters. While the Quality Engineers liked both dashboards, the Executives preferred the graph overview: graphs summarized high-level information, while lists showed detailed data.
After eliminating one design, I performed contextual inquiries. Users gave feedback on the most recent design and were observed performing quality engineering tasks in their own environment. They wanted two more things:
1. The ability to go back to previous years, and
2. The ability to send the dashboard directly to the printer or save as a PDF without resizing.
APEX was not a responsive application, so neither were the dashboards. The design was iterated to be compatible with Google Chrome and Internet Explorer, the two browsers permitted company-wide. The data range was expanded to include all years, and a year filter was added to obtain year-by-year KPI data, with the current year remaining as the default view.
Both methods confirmed that users liked applying filters (month, location, commodity, etc.) and then printing, downloading, or sharing the filtered information. This feature improved the user experience the most, for three reasons: filters made reporting and sharing information easy, they reduced the amount of data crunching for Quality Engineers, and they ensured accuracy in reporting across multiple business locations.
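Under the hood, that filter-then-report workflow boils down to selecting the matching records and aggregating PPM over them. A minimal sketch, assuming a hypothetical record shape (the field names, sites, and numbers are illustrative, not APEX's actual schema):

```python
# Illustrative sample records; in practice these would come from the database.
records = [
    {"vendor": "A", "site": "Jackson", "commodity": "electronics",
     "month": "2018-03", "defects": 5, "received": 20_000},
    {"vendor": "A", "site": "Marktoberdorf", "commodity": "electronics",
     "month": "2018-03", "defects": 30, "received": 50_000},
    {"vendor": "B", "site": "Jackson", "commodity": "plastics",
     "month": "2018-03", "defects": 2, "received": 40_000},
]

def filtered_ppm(rows, site=None, commodity=None, month=None):
    """Aggregate PPM over the rows matching the selected filters.

    A filter left as None means "all values", mirroring an unset
    dashboard filter.
    """
    sel = [r for r in rows
           if (site is None or r["site"] == site)
           and (commodity is None or r["commodity"] == commodity)
           and (month is None or r["month"] == month)]
    defects = sum(r["defects"] for r in sel)
    received = sum(r["received"] for r in sel)
    return defects / received * 1_000_000 if received else 0.0

# All "electronics" receipts: (5 + 30) defects over 70,000 parts
print(filtered_ppm(records, commodity="electronics"))  # → 500.0
```

Because every site reads from the same records and the same aggregation, the printed or shared numbers agree everywhere, which is exactly the accuracy benefit described above.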
The biggest challenge was committing to designing a dashboard. It took months to build before release, it was the first of its kind, and there was no instruction manual for building a dashboard in APEX. It was a lot of trial and error in between meetings, business travel, being an engineer, and being a part-time MBA student. I was not hired as a UX Designer, but creating a better user experience for the people I worked with made my role easier in the long run. Reports became automated, and as a group we were able to be proactive, rather than reactive, when evaluating vendors.
Global PPM was reduced year over year as a result of cross-functional teams having easy access to vendor data and being able to inquire about specific data points.
Vendor contracts are now renewed only on the basis of good global performance, removing poor-performing vendors from the supply chain.
PPM was reduced by 150 points in one year.
The company optimized the supply base by a 4% vendor reduction.
Engineers were using APEX more often.