RADIOCENTRAL
Mission-Critical Device & Data Management Platform | Motorola
Role: Lead UX Researcher (sole)
Impact: Informed the redesign of a safety-critical fleet management platform
Scale: Enterprise platform supporting thousands of public safety radio devices
Timeline: 2-month intensive, multi-method evaluation
Note: Details limited due to NDA.
Background

RadioCentral is Motorola's internal tool for radio device provisioning and fleet management, covering programming, preventive maintenance, and real-time monitoring of the radios used by police officers and emergency responders. This complex platform was undergoing a major evolution from on-premises relational databases to a scalable, flexible cloud solution, including a new UI and user experience. I was brought in as the sole UX research owner to evaluate the early-stage, iterating interactive prototypes within an agile environment.
Bottom Line Up Front

This case study represents the first of many UX research projects I conducted on this solution. Because no discovery research had been conducted beforehand, this initial evaluation surfaced over 40 usability issues (several of which were critical blockers); identified and prioritized dashboard improvements, along with the necessary customization options; and increased findability by 80% and task completion by 70% through data-informed layout changes tailored to each target user group under Role-Based Access Control (RBAC).
Business Needs

- Streamline, integrate, and increase the usability of this complex radio device provisioning system, in which each state's public safety department has its own unique needs, problems, and ecosystem of interdependent tools
- Identify and mitigate error-prone points in this zero-error-tolerance environment
- Inform a multi-million-dollar platform redesign and transition to the cloud
- Identify and reduce time-consuming tasks
- Increase emergency response capabilities

Research Objectives

Evaluate the usability of the evolving RadioCentral InVision prototype, including its:
- Dashboard and modular workflows
- Feature accessibility
- Key task and workflow sequencing
- User confidence and information clarity
- Findability of task content and functionality
- Navigation pain points and interaction logic
My Role

I led the entire evaluative research effort, from strategy to reporting.
- Designed a 3-phase evaluation framework
- Created and administered task scenarios and success metrics
- Facilitated remote sessions with SMEs and users representing target user groups
- Analyzed behavioral and attitudinal data
- Translated usability findings into prioritized recommendations for designers and product stakeholders

Research Innovation

Developed and executed a comprehensive evaluation framework:
- Created a custom severity rating system for mission-critical interfaces
- Designed a hybrid methodology combining internal subject-matter-expert and external user validation
- Established a new standard for enterprise platform testing
- Built a repeatable evaluation framework
Research Methodology
Phase 1: Expert Analysis (Heuristic Evaluation)
- Tested against a specialized 11-point heuristic framework
- Evaluated against mission-critical dashboard requirements
- Created a tiered severity classification (see the sketch after this list):
  - System reliability impacts
  - Workflow efficiency barriers
  - Optimization opportunities
- Delivered actionable development priorities
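For illustration only, the minimal sketch below shows how a tiered severity rubric like this could be encoded to sort findings into development priorities. The three tier names come from this case study; the Finding structure, tier values, and example issues are hypothetical, not the actual rating system.

```python
from dataclasses import dataclass
from enum import IntEnum


# Hypothetical encoding of the three severity tiers named above;
# higher values indicate a more urgent development priority.
class Severity(IntEnum):
    OPTIMIZATION_OPPORTUNITY = 1      # polish; improves an otherwise working flow
    WORKFLOW_EFFICIENCY_BARRIER = 2   # slows or complicates mission-critical tasks
    SYSTEM_RELIABILITY_IMPACT = 3     # risks errors or failures in the field


@dataclass
class Finding:
    summary: str
    severity: Severity
    affected_workflow: str


# Example findings (invented for illustration), sorted into a priority list.
findings = [
    Finding("Search returns incomplete, unrelated results", Severity.SYSTEM_RELIABILITY_IMPACT, "Device lookup"),
    Finding("Key information buried several clicks deep", Severity.WORKFLOW_EFFICIENCY_BARRIER, "Provisioning"),
    Finding("Dashboard modules cannot be reordered", Severity.OPTIMIZATION_OPPORTUNITY, "Monitoring"),
]

for finding in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{finding.severity.name}] {finding.affected_workflow}: {finding.summary}")
```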
Phase 2: Workflow Evaluation
- Led cognitive walkthroughs with 3 domain experts representing our target user groups
- Evaluated 12 mission-critical tasks
- Mapped mental models to interface design
- Quantified task completion rates and error patterns
Phase 3: Usability Testing
- Conducted in-depth testing with 6 system experts
- Validated 13 high-priority workflows
- Measured (see the sketch after this list):
  - Task success rates
  - Completion time
  - Error frequency
  - Navigation efficiency
  - User confidence
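As a rough illustration of how these measures might be tabulated across sessions, the sketch below assumes hypothetical per-task session records; the field names, scales, and data are invented for this example and are not the actual study instrument.

```python
from statistics import mean

# Hypothetical per-participant, per-task session records.
sessions = [
    {"task": "Program radio fleet",  "success": True,  "seconds": 182, "errors": 1, "clicks": 14, "optimal_clicks": 9, "confidence": 4},
    {"task": "Program radio fleet",  "success": False, "seconds": 260, "errors": 3, "clicks": 22, "optimal_clicks": 9, "confidence": 2},
    {"task": "Locate device record", "success": True,  "seconds": 95,  "errors": 0, "clicks": 6,  "optimal_clicks": 5, "confidence": 5},
]

success_rate   = mean(1 if s["success"] else 0 for s in sessions)    # task success rate
avg_time       = mean(s["seconds"] for s in sessions)                # completion time
avg_errors     = mean(s["errors"] for s in sessions)                 # error frequency
# Navigation efficiency: optimal path length relative to the path actually taken.
nav_efficiency = mean(s["optimal_clicks"] / s["clicks"] for s in sessions)
avg_confidence = mean(s["confidence"] for s in sessions)             # e.g., 1-5 post-task rating

print(f"Task success rate:     {success_rate:.0%}")
print(f"Avg completion time:   {avg_time:.0f}s")
print(f"Avg errors per task:   {avg_errors:.1f}")
print(f"Navigation efficiency: {nav_efficiency:.0%}")
print(f"Avg confidence (1-5):  {avg_confidence:.1f}")
```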
Key Insights

- Identified and prioritized 40+ specific usability issues, such as:
  - Key information buried several clicks deep
  - Unreliable data
  - Confusing information architecture
  - Confusing taxonomy
  - Inability to organize information within data tables
  - Redundant steps within a workflow, repeating the same action multiple times in a single session
- Prevented 5 critical workflow blockers, such as:
  - Users relied on an unreliable search function for critical daily tasks (not all results would appear, and many were unrelated)
  - Search was relied upon because critical information was typically buried deep in a disorganized database
  - The database was disorganized because naming conventions had not been established by Motorola or by each public safety department; every user created their own
- Resolved 12 serious efficiency impediments, such as:
  - Redundant data entry into several external tools (spreadsheets and asset management tools), which increased errors, complexity, user task load, cognitive load, and time
Business Impact

- Improved findability rates by 80% and task completion rates by 70%
- Reduced cognitive load in complex workflows by identifying daily, weekly, monthly, and critical tasks, and surfacing those tasks within a customized, streamlined, and intuitive user experience
- Prevented potential system failures: Motorola established standardized naming conventions for the database, significantly increasing the reliability, consistency, clarity, and structure that users could customize and leverage for their organization's unique needs
- Reduced the steep learning curve for users by creating help content (previously nonexistent) and a wizard for setup and common workflows
- Created a reusable research evaluation framework
Strategic Influence

- Informed product roadmap prioritization
- Built stakeholder confidence through data
- Established new usability standards
- Created foundation for future platform evolution

Successfully:
- Led high-stakes interface validation
- Developed innovative testing methodology
- Influenced mission-critical product decisions
- Created enterprise evaluation standards
- Balanced user needs with system reliability