
Textura Performance Tracker

Project Overview

Performance Tracker was an application Textura developed to track subcontractor quality. It was a useful tool because it helped general contractors determine which subcontractors were worth bringing onto future projects. A general contractor would create an evaluation based on their criteria and select the participants, and the system would send an automated email to the evaluators. Evaluators rated the subcontractor on a scale of 1-5, with 5 being the highest rating. After all evaluations were completed, users could compare subcontractors against each other across multiple categories, and could also view the overall evaluations and ratings history of every subcontractor.

Performance Tracker was one of the first applications Textura designed to work with other applications, specifically GradeBeam. GradeBeam was an application that allowed general contractors to send out, collect, track, compare, and eventually award project bids. Performance Tracker results became an important filter in GradeBeam; the general contractor could eliminate subcontractors from consideration if their aggregate scores fell below a certain level.
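The threshold filter described above can be sketched roughly as follows. This is a hypothetical illustration only; the function name, field names, and cutoff value are assumptions, not GradeBeam's actual code or API.

```python
# Hypothetical sketch of the GradeBeam score filter described above.
# Names and data shapes are illustrative assumptions.

def eligible_bidders(subcontractors, min_score=3.0):
    """Keep only subcontractors whose aggregate score meets the cutoff."""
    return [s for s in subcontractors if s["aggregate_score"] >= min_score]

subs = [
    {"name": "Acme Concrete", "aggregate_score": 4.4},
    {"name": "Budget Drywall", "aggregate_score": 2.1},
]
print([s["name"] for s in eligible_bidders(subs)])  # ['Acme Concrete']
```

In practice the cutoff would be set per project by the general contractor rather than hard-coded.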

A year after the product was launched I led efforts to redesign Textura's style guide; the visuals below reflect the newer style guide.

Process

Performance Tracker was the first project I worked on when I joined Textura. Application design and development had already started when I began work on it, which created an interesting situation: I was learning Textura's culture while making recommendations on an in-progress application. Thankfully Textura was agile, so I was able to pivot the project in a better direction.

I was introduced to the team, which consisted of the product owner and the company's lead developer. They walked me through the existing processes and showed me the early prototypes on the staging server. Once I had a firm grasp of what they were trying to accomplish, we sat down for a few whiteboard sessions where we streamlined several aspects of the project. From there I moved to pencil sketches to improve the way information was presented on the page. Once the developer and product owner were on board I moved into visuals; due to time constraints we skipped the wireframe stage. My other challenge on this project was understanding Textura's style guide and learning to follow it. We presented the designs and prototypes to larger customers and made adjustments based on their feedback.

Outcome

Evaluation Setup

When users want to set up a new evaluation, they navigate to this page. We designed it as a drag and drop because user feedback showed they enjoyed that kind of functionality, and it made evaluations easier to set up and edit.

Evaluation Setup: Add Questions and Headings

Once the user drags a question into the target area, the system recognizes there is no heading yet. A heading is required for each section, and each heading can have multiple questions. In this instance the user must create a "Header," then select a question type from a dropdown, and finally choose from a series of preset questions in their template.

Evaluation Setup: Example Layout

After a few headings and questions were set up, the user would see a view similar to this. The headers were separated by a blue background. Everything on the page was editable: the user could click the pencil to rename a heading, click the red X to delete a heading, drag a question to the delete area to remove it, or drag and drop questions to change their order. The rating alerts were preset on the right, but a user could change the alerts on a per-question basis. Research showed users preferred to mark questions as required after they were set up. One additional feature: the right column locked into place, so no matter how many questions there were, the user never had to scroll the page to see their options.

Evaluation Setup: Target Area

This view shows what the target area looks like when a user editing the evaluation questions wants to add a new question or heading. When something is dragged, the blue outline and background show where the item will be dropped.

Evaluation Setup: Email

After an evaluation has been set up and the evaluators have been selected (not shown here), an email is automatically generated and sent to the evaluators, notifying them that they have evaluations to complete.

Evaluation

Once the user decides it is time to complete the evaluation, this is the page they use. Evaluations follow the same format as the setup, with headings and questions. The user can either click a rating number or enter it in the form field on the left. The left column also shows a full list of all the evaluations they are required to complete.

Overall Score and Comparison

Upon completion of an evaluation, the scores are tallied and compiled into a single overall score, which represents an average of all the scores from all the evaluations an organization has received. The reports also track who scored each option and how they scored it. A user can compare against trade codes or specific questions, and can see trends showing how a subcontractor has improved or declined over time.
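The overall-score rollup described above can be sketched as a simple average across every rating in every completed evaluation. This is a hypothetical sketch; the function name and data shape are illustrative assumptions, not Textura's actual implementation.

```python
# Hypothetical sketch of the overall-score rollup described above.
# Data shape is an illustrative assumption.

def overall_score(evaluations):
    """Average every 1-5 rating across all completed evaluations."""
    ratings = [r for ev in evaluations for r in ev["ratings"]]
    return round(sum(ratings) / len(ratings), 2) if ratings else None

evals = [
    {"evaluator": "GC Alpha", "ratings": [4, 5, 3]},
    {"evaluator": "GC Beta", "ratings": [5, 4]},
]
print(overall_score(evals))  # 4.2
```

A per-category or per-question breakdown would group the ratings by key before averaging, which is how the comparison views described above could be driven.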

Vendor Dashboard: Overview

Once a vendor or subcontractor has been added to the Textura system, all of their information is compiled into a single overall dashboard. The top of the dashboard shows their evaluation results, and the Overview section shows the majority of their at-a-glance information. The dashboard has sections that must be filled out to create a complete picture of the subcontractor. All information is editable (Summary).

Vendor Dashboard: Attachments

All vendors/subcontractors have documents attached to the projects they have worked on. This is the repository for those documents.

Vendor Dashboard: Ratings

This view tracks all the ratings, and who rated the vendor/subcontractor, by category across their entire history. Ratings can be expanded or collapsed, and if there are more than five ratings an internal scroll bar appears. Ratings can also be dragged into whatever order the user prefers.

Vendor Dashboard: Audit Log

Every update or edit is tracked and can be viewed either by date or by user. This is the Audit Log viewed by date.
