OEVis

Modernizing Network Visualization
For Military Intelligence

Client

TRADOC G2 (Army)

Roles

Research, UI/UX Design, Prototyping

TIMEFRAME

2 years

Platform

Web // Desktop Browser


OEVis:
Operational Environment Visualization
WHAT IS NETWORK ANALYSIS?

Network analysis is the study of how entities are connected.

Data is represented as nodes (people, organizations, entities) and edges (relationships between them).

By examining these connections, analysts can uncover patterns like who matters most, where vulnerabilities exist, and how information flows.

OEVis makes these patterns visual and interactive.
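To make "nodes and edges" concrete, here is a minimal TypeScript sketch (the data and names are hypothetical, not OEVis code) computing degree centrality, one of the simplest "who matters most" measures:

```typescript
// A toy network as an adjacency list: node id -> neighboring node ids.
type Graph = Record<string, string[]>;

// Hypothetical data for illustration only.
const network: Graph = {
  alice: ["bob", "carol", "dave"],
  bob:   ["alice", "carol"],
  carol: ["alice", "bob"],
  dave:  ["alice"],
};

// Degree centrality: each node's edge count divided by the maximum
// possible number of connections (n - 1). Values near 1 flag hubs.
function degreeCentrality(g: Graph): Record<string, number> {
  const n = Object.keys(g).length;
  const scores: Record<string, number> = {};
  for (const [node, neighbors] of Object.entries(g)) {
    scores[node] = neighbors.length / (n - 1);
  }
  return scores;
}

// alice connects to all three other nodes, so her score is 1.0;
// dave connects to only one, so his is 1/3.
console.log(degreeCentrality(network));
```

Real analysis tools layer many such measures (betweenness, eigenvector centrality, and so on) on top of this same node-and-edge representation.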

OVERVIEW

For over a decade, Army intelligence analysts relied on ORA, a powerful but outdated network analysis tool, to understand complex operational environments.

As ORA aged out of support, the Army faced losing a critical capability entirely.

I led the design of OEVis, a government-owned replacement built to preserve essential workflows while modernizing the analyst experience.

The challenge: create an intuitive interface for complex network analysis without direct access to end users, under a tight timeline, and with the constraint that analysts had spent years learning ORA's workflows.

IMPACT:

92% user satisfaction rate

30% reduction in onboarding time

22% improvement in task completion efficiency

CHALLENGES
LIMITED USER ACCESS:

I couldn't interview actual analysts - only the instructors who trained them.

This meant relying on proxy users to understand real analyst needs and pain points.

FAMILIARITY VS. INNOVATION:

Analysts had invested years learning ORA.

A drastically different interface risked friction and resistance, so continuity mattered even as we modernized.

RESEARCH & DISCOVERY

With limited access to end-user analysts, I conducted stakeholder interviews with the two instructor analysts who trained them, along with a competitive analysis of ORA and Gephi to understand existing patterns and pain points.

Key findings that shaped the design:

1) Data import was the top pain point:

“If the data isn't formatted exactly right, the whole import falls apart. Most of the time, that's where things go wrong.”

Import failures were common and hard to diagnose. Analysts spent hours preprocessing files to match ORA's rigid expectations, and errors could invalidate entire analyses. This became a top design priority.

2) Tools were buried and hard to find

“I know the tool can do what I need, but I have to remember where everything lives. Half the battle is just finding the feature.”

ORA's fragmented interface forced analysts to hunt through nested menus for frequently used features, slowing down routine tasks.

3) Competitive landscape revealed a gap

"ORA has everything we need analytically, but new analysts struggle with how complicated it is. Gephi is simpler, but it doesn't have the military data formats we require."

ORA offered analytical depth but overwhelmed new users. Gephi, a popular open-source network visualization tool, was more usable but lacked military-specific features and couldn't satisfy security requirements for government use.

OEVis needed to combine both strengths.

DESIGN ITERATION: WHAT DIDN'T WORK

Like most complex products, OEVis evolved through iteration.

Rather than showing every experiment, this section focuses on the panel layout as a representative example of how design decisions were tested and improved.

IDEA 1. All Panels on One Side

Initial idea:

Place all six panels on a single side of the workspace to simplify navigation.

Why it didn't work:

  • Six stacked panels created excessive vertical scrolling, pushing important functionality below the fold.

  • Switching between panels became slow, especially for new analysts.

  • The layout felt crowded and unstable as panels opened and closed.

What I learned:

Analysts needed to reference multiple panels simultaneously.

The single-side approach forced constant toggling that disrupted workflow and wasted screen real estate on wide monitors.

IDEA 2. Multiple Expanded Panels Per Side

Initial idea:

Split panels between two sides of the main window, allowing multiple panels on each side to remain expanded simultaneously.

Why it didn't work:

  • Multiple open panels created long scroll distances and visual chaos.

  • On smaller screens, content became cramped and overwhelming.

What I learned:

Allowing multiple panels to remain expanded simultaneously created too much visual complexity.

An accordion behavior where expanding one panel automatically collapses others on the same side would maintain a cleaner workspace and reduce cognitive load.

IDEA 3. Grouping Panels Purely by Function

Initial idea:

Allow one panel to expand per side, grouping Project Info, Attributes, and Legend (informational) together, while keeping Analytics, Layout, and Display (interactive) together.

Why it didn't work:

While this grouping made logical sense, it created an imbalanced layout.

The three interactive panels were all significantly taller when expanded, meaning one side would always require excessive scrolling while the other required none.

What I learned:

Grouping by conceptual categories (informational vs. interactive) made logical sense, but wouldn't be ideal in practice.

The left-right distribution needed to strike a compromise between panel classification and panel height.

IDEA 4. Height-Balanced, Workflow-Aligned Panel Layout

Final panel solution

After testing multiple layouts, I arrived at a panel structure that balanced discoverability, focus, and verification workflows.

Left side panels:

  • Project Info

  • Attributes

  • Display (with tabs for Nodes and Edges)

Right side panels:

  • Analytics

  • Layout

  • Legend

The solution:

Display, the tallest panel when fully expanded, was paired with two shorter informational panels (Project Info and Attributes) on the left.

The remaining panels (Analytics, Layout, Legend) were placed on the right, creating balanced scrolling across both sides.

Why this worked:

Choosing Legend as the panel to swap with Display solved two problems at once:

First, it created the best height balance since Display is the tallest panel when expanded.

Second, it supported a critical verification workflow.

Analysts frequently adjust visual properties in Display and then immediately verify those changes in Legend.

Placing them on opposite sides allows both to remain open simultaneously within the one-panel-per-side constraint, preventing the need to constantly toggle between them.
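The one-panel-per-side accordion behavior is simple to express as UI state logic. A hypothetical sketch (not the production implementation; panel names are the ones from this case study):

```typescript
// Workspace state: at most one expanded panel per side.
type Side = "left" | "right";
type PanelState = { left: string | null; right: string | null };

// Expanding a panel collapses whichever panel was open on the same
// side (accordion behavior); clicking the open panel collapses it.
// The other side is untouched, so Display (left) and Legend (right)
// can stay open at the same time.
function togglePanel(state: PanelState, side: Side, panel: string): PanelState {
  const next = { ...state };
  next[side] = next[side] === panel ? null : panel;
  return next;
}

let state: PanelState = { left: null, right: null };
state = togglePanel(state, "left", "Display");
state = togglePanel(state, "right", "Legend");
console.log(state); // both Display and Legend remain expanded
```

Keeping the rule this small is the point: the constraint lives in one reducer-like function rather than being re-implemented per panel.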

Display and Legend panels expanded

Video: Display Configuration

ADDITIONAL PANEL FLOWS

Analytics: Running an algorithm

Video: Analytics - Shortest Path algorithm

Layout: Graph Configuration

Video: Layout Configuration

OTHER KEY DESIGN DECISIONS

Multi-Step Import Wizard

Video: Importing Montreal Street Gangs 2007 network data

Data import was historically the most frustrating step in ORA. Analysts had to preprocess files to match rigid expectations, and errors were common.

I designed a guided, multi-step import flow that validated files, mapped attributes, and previewed the network before import.

This reduced errors, improved onboarding times, and gave analysts confidence in their data from the start.
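A guided import lives or dies on early, row-level validation. Here is a hypothetical sketch of one wizard step (the column names, error shape, and data are illustrative, not the actual OEVis format):

```typescript
// One validation step of a hypothetical import wizard: confirm the
// edge list's header and flag rows that reference undeclared nodes,
// collecting per-row errors instead of failing the whole import.
interface ImportError {
  row: number; // 0 = header, 1+ = data rows
  message: string;
}

function validateEdgeList(
  header: string[],
  rows: string[][],
  knownNodes: Set<string>,
): ImportError[] {
  const errors: ImportError[] = [];
  if (header[0] !== "source" || header[1] !== "target") {
    errors.push({ row: 0, message: "expected columns: source, target" });
    return errors; // without the right columns, row checks are meaningless
  }
  rows.forEach((cells, i) => {
    for (const id of cells.slice(0, 2)) {
      if (!knownNodes.has(id)) {
        errors.push({ row: i + 1, message: `unknown node "${id}"` });
      }
    }
  });
  return errors;
}

const errs = validateEdgeList(
  ["source", "target"],
  [["a", "b"], ["a", "zzz"]],
  new Set(["a", "b"]),
);
console.log(errs); // one error: row 2 references unknown node "zzz"
```

Surfacing errors like these next to a preview lets the analyst fix one bad row instead of diagnosing an opaque all-or-nothing failure.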

Two-Tab Workspace for Dual Workflows

Video: Editing a network inside the Data Table tab

Analysts frequently switched between raw data tables and visual graphs, but ORA treated these as separate, disjointed views.

I introduced a tab-based workspace that allowed one-click switching between the graph visualization and data table within the same window.

This reduced friction and kept users oriented in a single, stable workspace.

Streamlined Graph Toolbar

Video: Graph Toolbar interactions

ORA's toolbar was overloaded with rarely used controls, making common actions hard to find.

I distilled the toolbar down to essential graph interactions:

  • adding/removing nodes and edges

  • zoom controls

  • recenter

  • fullscreen toggle

  • grayscale/color toggle

This reduced visual noise and made direct graph manipulation faster and more intuitive.

Design System Alignment

The engineering team used AntV's Graphin for graph visualization, so I leveraged Ant Design for standard UI components while creating custom components for panels and the graph toolbar.

This ensured visual consistency while keeping implementation efficient for developers.

OUTCOMES
MVP LAUNCH

OEVis launched as a minimum viable product in TRADOC's Azure Cloud in Spring 2024.

By preserving familiar workflows from legacy tools while modernizing the interface, OEVis enabled analysts to transfer their expertise with minimal disruption, accelerating adoption and improving efficiency from day one.

IMPACT:

92%

Satisfaction Rate

User surveys demonstrated strong approval of the new interface, validating that the balance between familiarity and modernization resonated with analysts.

30%

Faster Onboarding

New analysts reached proficiency faster than they did with ORA, reducing the training burden on instructors and allowing analysts to contribute to real work sooner.

22%

Higher Task Efficiency

Core analysis tasks - from data import to network visualization - became faster and less error-prone, allowing analysts to focus on insights rather than fighting the interface.

LONG-TERM VALUE

The Army now owns a government-developed tool that can evolve with analysts' needs without vendor dependencies, preserving critical network analysis capabilities for future training and operations.

REFLECTION

Working with constraints drove better decisions

Limited access to end users pushed me to rely on proxy research and competitive analysis, forcing me to be more intentional about design choices rather than making assumptions.

Familiarity and innovation can coexist

Analysts had years of muscle memory with ORA. The most effective approach wasn't to reinvent everything - it was to preserve what worked while modernizing what didn't, enabling smoother adoption.

Iteration reveals what stakeholders can't articulate

The panel layout evolved through multiple attempts before landing on the final solution. Concepts that seemed logical (grouping by function) failed in practice because they didn't match actual workflows.