Data Taxonomy (several cross-cutting categories):
* Application Domains (less a category than a way to indicate the source of the data)
   * Climate data
   * SEM image data
   * CT scan for objects
   * Medical/CT/MRI scan
   * Simulation data
   * Molecular data
   * ….


* Data Type
   * Structured Data
      * Image data (uniform rectilinear grids)
      * Rectilinear grids
      * Structured grids (curvilinear)
      * AMR (Adaptive Mesh Refinement)?
   * Unstructured Data
      * Unstructured grids (arbitrary cell types)
      * Polygonal data (surfaces, meshes)
      * Point clouds
   * Specialized Types
      * Hyper-tree grids
      * Composite/multi-block datasets
      * Graph/network data


* Temporal Dimension
   * Static/Single-Timestep
      * Analysis of a single snapshot
      * Spatial patterns only
   * Time-Series
      * Multiple discrete steps
      * Temporal sequences


* Attribute Types (how the data is represented and what information it carries)
   * Scalar Fields (single value per point/cell)
      * Temperature, pressure, density
      * Scalars driving isosurfaces, color mapping
   * Vector Fields (3-component direction/magnitude)
      * Velocity, force, displacement
      * Visualized via streamlines, glyphs, and LIC
   * Tensor Fields (matrix at each point)
      * Stress, strain, diffusion tensors
      * Visualized via tensor glyphs, eigenvalue analysis
   * Multi-variate/Multi-field/Multi-modal
      * Multiple scalar/vector fields
      * High-dimensional visualization that analyzes multiple dimensions at once
* Data Ensemble
   * Multiple runs, parameter sweeps, etc.






Task Taxonomy:
Top level: 1) Atomic Operation, 2) Workflow, 3) Scientific Insights


1) Atomic Operation (individual well-defined operations):
* Extraction & Subsetting (Isolate regions of interest from larger datasets)
   * Spatial/temporal Extraction
      * Clipping (by plane, box, sphere, implicit function)
      * Region selection (by point/cell IDs, bounding box)
      * VOI extraction (volume of interest from structured data)
   * Value-Based Selection
      * Thresholding (scalar range filtering)
      * Isocontouring/isosurfacing (constant value extraction)
      * Connectivity filtering (connected region extraction)
   * Sampling
      * Subsampling/decimation (reduce resolution)
      * Probing (sampling at specific points)
      * Masking (regular/irregular point selection)
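The value-based selection operations above can be sketched in a few lines of plain Python. This is a minimal, illustrative thresholding pass over a toy scalar field; the names are invented for the sketch, and a real pipeline (e.g. a VTK-style threshold filter) would also preserve cell connectivity, which is omitted here.

```python
# Minimal sketch of scalar thresholding: keep only the points of a
# toy scalar field whose value falls inside [lo, hi].
# Illustrative only -- real threshold filters also carry over cell
# topology and all other attribute arrays.

def threshold(points, scalars, lo, hi):
    """Return the (point, scalar) pairs with lo <= scalar <= hi."""
    return [(p, s) for p, s in zip(points, scalars) if lo <= s <= hi]

points = [(0, 0), (1, 0), (2, 0), (3, 0)]
temperature = [250.0, 300.0, 320.0, 280.0]

hot = threshold(points, temperature, 290.0, 330.0)
# -> [((1, 0), 300.0), ((2, 0), 320.0)]
```

Isocontouring is the continuous analogue: instead of keeping samples inside a range, it interpolates the exact locations where the field crosses a single value.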
* Geometry & Topology Transformation (Change the structure or shape of data without necessarily changing attribute values)
   * Geometric Modification
      * Translation, rotation, scaling
      * Deformation, warping
      * Point/vertex manipulation
   * Topological Changes
      * Triangulation (convert polygons to triangles)
      * Tessellation/subdivision
      * Mesh refinement/coarsening
      * Cell type conversion
      * Boundary extraction (surface from volume)
   * Structural Operations
      * Merging datasets
      * Appending data
      * Splitting/partitioning
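The geometric-modification operations above reduce to applying a transform to every point. A minimal 2-D sketch (real toolkits use a 4x4 homogeneous matrix on 3-D points, but the idea is the same; all names here are illustrative):

```python
import math

# Sketch of a geometric modification: rotate 2-D points about the
# origin, then translate them. Attribute values are untouched --
# only the geometry changes.

def rotate_translate(points, angle_rad, dx, dy):
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(c * x - s * y + dx, s * x + c * y + dy) for x, y in points]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
moved = rotate_translate(square, math.pi / 2, 10, 0)
# A 90-degree rotation maps (1, 0) to (0, 1); translating by +10 in x
# then gives (10, 1).
```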
* Attribute Computation & Derivation (Calculate new data attributes from existing ones)
   * Field Derivatives
      * Gradient computation
      * Divergence, curl, vorticity
      * Curvature calculation
      * Normal generation
   * Scalar Operations
      * Arithmetic operations on fields
      * Vector magnitude computation
      * Component extraction
      * Field aggregation/statistics
   * Advanced Computations[a]
      * Tensor operations (eigenvalues, eigenvectors)
      * Interpolation between fields
      * Distance computations
   * Time-Dependent Processing
      * Temporal interpolation
      * Particle tracing through time
      * Flow integration [b](streamlines, pathlines, streaklines, timelines)
      * Temporal statistics/aggregation
      * Flow map computation
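Flow integration, the core of the time-dependent processing above, can be sketched with forward-Euler steps through an analytic steady field. This is a toy: production tracers use higher-order integrators (RK4/RK45) and interpolate velocities from the mesh, and the field and function names here are invented for illustration.

```python
# Sketch of flow integration: trace a streamline through a steady
# analytic vector field with forward-Euler steps.

def velocity(x, y):
    # Simple rigid rotation about the origin: v = (-y, x).
    return -y, x

def trace_streamline(seed, steps, h):
    x, y = seed
    path = [(x, y)]
    for _ in range(steps):
        u, v = velocity(x, y)
        x, y = x + h * u, y + h * v
        path.append((x, y))
    return path

path = trace_streamline((1.0, 0.0), steps=100, h=0.01)
# The exact trajectory is the unit circle; forward Euler drifts
# slightly outward, which is why real tracers prefer RK4/RK45.
```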
* Representation & Mapping (Transform data into visual representations)
   * Glyph-Based Representation
      * Oriented glyphs (arrows, cones)
      * Scaled glyphs (size-mapped symbols)
      * Tensor glyphs (ellipsoids, superquadrics)
      * Volume markers
   * Geometric Primitives
      * Isosurfaces
      * Contour lines/surfaces
      * Cut planes/slices
      * Ribbons, tubes, streamlines
   * Color & Texture Mapping
      * Scalar to color mapping
      * Texture coordinate generation
      * Opacity mapping
   * Volume Representations
      * Ray casting
      * Splatting
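Scalar-to-color mapping, listed above, is at heart a lookup table. A minimal sketch with a two-point linear ramp (real colormaps use many control points and perceptually uniform color spaces; the function name and defaults are illustrative):

```python
# Sketch of scalar-to-color mapping: linearly interpolate between two
# RGB endpoints over a scalar range, clamping out-of-range values.

def scalar_to_rgb(s, s_min, s_max, low=(0, 0, 255), high=(255, 0, 0)):
    """Map scalar s in [s_min, s_max] to an RGB triple, blue -> red."""
    t = (s - s_min) / (s_max - s_min)
    t = min(1.0, max(0.0, t))          # clamp out-of-range scalars
    return tuple(round(l + t * (h - l)) for l, h in zip(low, high))

scalar_to_rgb(0.0, 0.0, 1.0)   # blue: (0, 0, 255)
scalar_to_rgb(1.0, 0.0, 1.0)   # red:  (255, 0, 0)
```

Opacity mapping works the same way with a scalar-to-alpha table instead of scalar-to-RGB.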
* Smoothing & Enhancement (Improve data quality or visual appearance)
   * Smoothing Operations
      * Surface smoothing (Laplacian, Gaussian)
      * Data noise reduction
      * Interpolation for filling gaps
   * Enhancement
      * Edge enhancement/detection
      * Feature extraction
      * Sharpening
   * Filtering 
      * Outlier removal
      * Statistical filtering
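The smoothing operations above share one mechanism: move each sample toward the average of its neighbors. A 1-D Laplacian-smoothing sketch (surface smoothers apply the same update to mesh vertices over their topological neighbors; names and defaults are illustrative):

```python
# Sketch of Laplacian smoothing on a 1-D signal: each interior sample
# relaxes toward the mean of its two neighbors; endpoints are fixed.

def laplacian_smooth(values, factor=0.5, iterations=10):
    v = list(values)
    for _ in range(iterations):
        new = v[:]
        for i in range(1, len(v) - 1):
            avg = 0.5 * (v[i - 1] + v[i + 1])
            new[i] = v[i] + factor * (avg - v[i])
        v = new
    return v

noisy = [0.0, 1.0, 0.0, 1.0, 0.0]
smooth = laplacian_smooth(noisy)
# Interior oscillations are damped; the endpoints stay fixed.
```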
* View / Rendering Manipulation
   * View Changes
      * Rotation, zoom, move
   * Render Options (should this belong to transformation?)
      * Light position
      * Lighting mode


2) Workflow (sequence of operations with a clear visualization/analysis goal)
* Data Understanding & Exploration
   * Data Characterization (statistics, distributions, quality check)
   * Spatial Exploration (slicing, probing, overview generation)
   * Feature Discovery (identifying interesting structures/patterns)
* Analysis & Quantification
   * Statistical Analysis (descriptive statistics, distributions, correlations)
   * Region-Based Measurement (volumes, areas, integrated quantities)
   * Profile & Cross-Section Analysis (1D/2D extraction from 3D)
   * Derived Quantity Computation (gradients, vorticity, custom fields)
* Feature Extraction & Tracking
   * Structure Identification (vortices, shocks, boundaries, topology)
   * Feature Characterization (properties, classification, quantification)
   * Temporal Tracking (feature evolution, lifecycle, trajectories)
* Comparative & Temporal Analysis
   * Multi-Variable Comparison (correlation, coordinated views)
   * Temporal Evolution / Comparison (time-series comparison)
   * Simulation Comparison (parameter studies, model validation)
   * Difference Analysis (error fields, change detection)
* Flow & Transport Analysis[c]
   * Trajectory Computation (streamlines, pathlines, streaklines)
   * Lagrangian Analysis (particle tracking, residence time, FTLE)
   * Transport Quantification (flux, mixing, coherent structures)
* Verification & Validation
   * Data Quality Assessment (outliers, artifacts, boundary conditions)
   * Code Verification (convergence, analytical comparison, consistency)
   * Physical Validation (experimental comparison, uncertainty quantification)
* Data Processing & Optimization
   * Data Conditioning (cleaning, smoothing, noise reduction)
   * Data Reduction (decimation, sampling, compression)
   * Format Conversion & Restructuring (mesh generation, type conversion)
   * Parallel & Distributed Processing (HPC workflows, decomposition)[d]
* Uncertainty Quantification & Visualization[e]
   * Sensitivity analysis, uncertainty characterization (estimate uncertainty)
   * Aggregation and summarization (visualize quantiles and intervals)
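The aggregation step above can be sketched as a pointwise summary over ensemble members. This toy version reports a min/median/max band per grid point; real UQ pipelines use proper quantile estimators and richer summaries, and all names here are illustrative.

```python
# Sketch of ensemble aggregation: given several runs of the same
# scalar field, summarize each grid point by its pointwise minimum,
# median, and maximum.

def pointwise_quantiles(runs):
    """runs: list of equally sized scalar lists (one per ensemble member).
    Returns one (min, median, max) tuple per grid point."""
    summary = []
    for samples in zip(*runs):
        s = sorted(samples)
        mid = s[len(s) // 2]  # simple median for odd-sized ensembles
        summary.append((s[0], mid, s[-1]))
    return summary

ensemble = [
    [1.0, 2.0, 3.0],   # run A
    [1.5, 2.5, 2.0],   # run B
    [0.5, 3.0, 4.0],   # run C
]
bands = pointwise_quantiles(ensemble)
# -> [(0.5, 1.0, 1.5), (2.0, 2.5, 3.0), (2.0, 3.0, 4.0)]
```

Such bands are what interval/quantile visualizations (e.g. fan charts or contour box plots) render.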
* Communication & Dissemination
   * Static Visualization (publication figures, high-resolution images)
   * Animation Generation (temporal, spatial, parameter animations)
   * Interactive Applications (web/desktop viewers, dashboards)
   * Report Generation (automated analysis reports, summaries)


3) Scientific Insights (analysis or visualization that leads to domain-relevant insights)
* Application-specific questions and insights that can be derived from analysis or visualization
* The result could be, e.g., a binary decision, a multiple-choice answer, etc.
* Potentially involves different workflow steps








What not to include
* Excessively large datasets (multiple GB to TB scale)
* Interactions tied to a specific tool/interface
* Questions that do not have a unique, clear answer/ground truth
[a]Probably needs a better title
[b]Move flow integration into its own category of Vector Operations, analogous to Scalar Operations? Streamlines are not time-dependent. Also could include LIC, and maybe FTLE.
[c]There is overlap between this category and previous atomic operations on flow integration (see my earlier comment).
[d]Consider large scale HPC to be out of scope, in order to make the benchmark more broadly accessible?
[e]Group together with V&V?