Streamlining Mining Data




tl;dr

Midway through 2023, we uncovered a critical gap in our product: while our tablet app successfully captured field data and synced it to the web, users had no way to view, analyze, or edit this data. This oversight jeopardized mining operations, where safety and productivity rely on dynamic planning and quality control. To tackle this, we conducted 15 discovery sessions with 12 research participants across Australia and North America.


Insights showed that users’ mental model treats blast actuals data as part of a live file that is tracked from design to field execution to analysis. Additionally, 80% of users preferred a list view for managing blast files, requested color-coded hole layouts, and needed discrepancies between design and actual data to be visually highlighted. These findings directly informed a design tailored to users’ critical needs for managing and analyzing field data.



 Background

If you’ve ever worked on a growing team with multiple competing priorities and lots of new features on the roadmap, you’ll understand the struggle to keep your product’s experience consistent across all of its parts. This is where our product team found itself midway through 2023: we had successfully built a web application that was continuously improving and gaining traction among users, AND we had just released the companion tablet app for use on the go. However, there was a gap in the experience we had forgotten to account for: how do we store the field data, and how do we display it to the web user?


Goal: Bridge the gap between data capture and analysis

In mining, it’s crucial for the safety and productivity of operations that users have data analysis tools at hand when planning and executing blasts. While field data captured through our mobile app was seamlessly flowing into our web database, there was no way to view, compare, analyze, or alter it. We assumed this would create problems for users who wanted to (a) dynamically change plans based on captured data and (b) run quality control at their operations.



 Workshopping Assumptions

The objective of this workshop was to get the product, design, and support teams on the same page before diving into user research. I invited 6 members from these teams to a 2-hour workshop.

Outcome 1: Identification of User Groups



Blasting Engineer: Needs to analyze and adjust blast design based on field data.




Drill & Blast Manager: Needs to monitor blasting operation progress and the productivity of the crew.




Tech Sales Person: Needs to monitor blast progress and performance to report to customers.

Outcome 2: Documentation of Unknowns

Once we were ready to dive in, I set a 15-minute timer for everyone to grab a sticky note and jot down their existing assumptions. I wanted to keep the workshop engaging and focused, so to make sure people felt comfortable expressing their assumptions, I proposed a format to guide their contributions:

“Because I saw [qualitative/quantitative insight], I believe [this assumption is true]. We will know we’re [right/wrong] when we see [qualitative/quantitative feedback/KPI change].”



Then we dot-voted on the most important questions and assumptions we wanted to test.


Outcome 3: Hypothesis




Users need a clear way to view and edit the field data that comes into the web application, for two major reasons: evidence-based planning and quality control of operations.




 Discovery



01. User flow highlighting experience gaps in our customer’s workflow

I created these maps during discovery sessions to understand our users’ current workflow and highlight experience gaps. Stars show a gap in capturing and analyzing field data at the user’s operation.

Site 1: Users are capturing field data with pen and paper, which can cause inaccuracies. Users also rely on a competitor’s app for analysis.




02. Prototyping as part of discovery

During this feature’s discovery phase I was reading Marty Cagan’s Inspired, which extensively describes the benefits of using prototypes rather than static lo-fi designs for discovery. This inspired me to create quick mockups and prototypes of the blast actuals concept. For areas where user needs were completely unknown, I resorted to showing empty interactive screens and letting users tell us what they wanted to see.


Major Pain Points and Needs:
    1. Blasters not being notified when something is wrong with holes
    2. No good way to see deviation comparisons
    3. Ability to see blast status by looking at hole states
    4. Ability to see a version history for each blast
    5. Unreliable data collected in the field, caused by poor tablet UX/UI that leads to mistyped or missing entries



...I had been going about this all wrong – users don’t think of blast actuals as a separate concept from the lifecycle of a blast, therefore they expect the most recent data to be present by default!




 Updates and Justification

Based on usability testing, below are the changes I made to my design and the respective justification for each update.

01. Treat Blast Actuals as the most up-to-date view of the blast file

The majority of users were initially confused by the idea of “switching” to blast actuals, since their mental model of a blast is a living document. I changed the header of the workspace from “Design” to “Blast” and removed the tabs that differentiated designs from actuals. Now blast actuals automatically sync to the workspace when available. I also added a “Status” section that lets the user see the progress of the blast in real time, broken down into completion rates.
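
As an illustration of how a completion-rate breakdown could be derived from hole data, here is a minimal sketch. The HoleState values, Hole shape, and completionRates function are hypothetical assumptions for this example, not our production data model.

```typescript
// Illustrative sketch: fraction of holes that have reached each stage.
// State names and the data shape are assumptions, not the product schema.
type HoleState = "designed" | "drilled" | "measured" | "loaded";

interface Hole {
  id: string;
  state: HoleState;
}

// Returns, for each stage, the share of holes that have at least reached it.
function completionRates(holes: Hole[]): Record<HoleState, number> {
  const order: HoleState[] = ["designed", "drilled", "measured", "loaded"];
  const rates = {} as Record<HoleState, number>;
  order.forEach((stage, i) => {
    const reached = holes.filter((h) => order.indexOf(h.state) >= i).length;
    rates[stage] = holes.length > 0 ? reached / holes.length : 0;
  });
  return rates;
}
```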




02. Add hole states for different phases of operation

Users care deeply about being able to understand the progress of a blast at a glance. As a blast pattern moves through the different stages of an operation (being drilled, being measured, being loaded, being marked for abnormalities), users want to be informed so they can adjust the next steps accordingly. I created different hole treatments using symbols and colors to help users quickly digest the progress of a blast.
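
To make the idea concrete, a state-to-treatment map could look like the sketch below. The state names, colors, and symbols are placeholders for illustration, not the shipped design tokens.

```typescript
// Sketch of a visual treatment map: each operational phase gets a color and a
// symbol so blast progress is legible at a glance. All values are placeholders.
type HoleState = "designed" | "drilled" | "measured" | "loaded" | "abnormal";

interface HoleTreatment {
  color: string;  // fill color of the hole marker
  symbol: string; // glyph overlaid on the marker
  label: string;  // legend text
}

const HOLE_TREATMENTS: Record<HoleState, HoleTreatment> = {
  designed: { color: "#B0B0B0", symbol: "○", label: "Designed" },
  drilled:  { color: "#4A90D9", symbol: "●", label: "Drilled" },
  measured: { color: "#7B61C4", symbol: "◐", label: "Measured" },
  loaded:   { color: "#3FA34D", symbol: "◆", label: "Loaded" },
  abnormal: { color: "#D9534F", symbol: "!", label: "Flagged for review" },
};
```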


03. Proceed with the list view
8/8 users said they preferred the list view over the card view, so the list view becomes the default, with the card view kept as an alternative.


04. Allow users to add deviation tolerances
Without tolerances, all holes may be counted as deviated, because some values, such as hole depth, are never exactly as designed.
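
A minimal sketch of how tolerance-based flagging could work, assuming a simple per-field tolerance on hole depth. The field names, the isDepthDeviated helper, and the 0.3 m default are illustrative assumptions, not the product’s actual schema or defaults.

```typescript
// Tolerance-based deviation check: a hole is flagged only when the gap between
// designed and actual values exceeds the user-set tolerance, so normal drilling
// variance does not mark every hole as deviated. Names here are illustrative.
interface HoleComparison {
  id: string;
  designedDepth: number; // metres
  actualDepth: number;   // metres
}

function isDepthDeviated(hole: HoleComparison, toleranceMetres = 0.3): boolean {
  return Math.abs(hole.actualDepth - hole.designedDepth) > toleranceMetres;
}

// With a 0.3 m tolerance, 12.2 m actual vs. 12.0 m design is within tolerance,
// while 12.6 m actual is flagged as deviated.
isDepthDeviated({ id: "A-14", designedDepth: 12.0, actualDepth: 12.2 }); // false
isDepthDeviated({ id: "A-15", designedDepth: 12.0, actualDepth: 12.6 }); // true
```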






 Impact and Road Ahead

So far our team has delivered only the MVP of this feature: the ability to view field data and compare it to the designed data. In the future, we’ll release the full suite of features that lets users not only view blast actuals but also perform analyses, set custom deviation tolerances, and change designs based on field data.











Thank you.