| 0 (string, 12 classes) | values (float64, range 0–26.5k) |
|---|---|
megatron.core.transformer.attention.forward.qkv | 196.692444 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.11248 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.091328 |
megatron.core.transformer.attention.forward.core_attention | 937.971436 |
megatron.core.transformer.attention.forward.linear_proj | 0.756416 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,136.766235 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 263.833679 |
megatron.core.transformer.mlp.forward.linear_fc1 | 6.184864 |
megatron.core.transformer.mlp.forward.activation | 306.726074 |
megatron.core.transformer.mlp.forward.linear_fc2 | 22.752096 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 336.622131 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.235232 |
megatron.core.transformer.attention.forward.qkv | 1.297152 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.0032 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 37.656258 |
megatron.core.transformer.attention.forward.linear_proj | 0.671264 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 39.649792 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 265.641266 |
megatron.core.transformer.mlp.forward.linear_fc1 | 26.127584 |
megatron.core.transformer.mlp.forward.activation | 56.294655 |
megatron.core.transformer.mlp.forward.linear_fc2 | 3.260192 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 85.913918 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.230144 |
megatron.core.transformer.attention.forward.qkv | 216.014526 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.120032 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.08896 |
megatron.core.transformer.attention.forward.core_attention | 5,690.135254 |
megatron.core.transformer.attention.forward.linear_proj | 4.210944 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,912.009277 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 239.898239 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.483072 |
megatron.core.transformer.mlp.forward.activation | 249.525223 |
megatron.core.transformer.mlp.forward.linear_fc2 | 3.363072 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 259.349335 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.854272 |
megatron.core.transformer.attention.forward.qkv | 2.802176 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.28224 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.306112 |
megatron.core.transformer.attention.forward.core_attention | 23.875648 |
megatron.core.transformer.attention.forward.linear_proj | 0.33696 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 28.715296 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.120544 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.508096 |
megatron.core.transformer.mlp.forward.activation | 0.169216 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.383296 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 3.072128 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.119872 |
megatron.core.transformer.attention.forward.qkv | 197.398529 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.134624 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.098336 |
megatron.core.transformer.attention.forward.core_attention | 4,813.348633 |
megatron.core.transformer.attention.forward.linear_proj | 3.424672 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,015.865723 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 238.546204 |
megatron.core.transformer.mlp.forward.linear_fc1 | 11.02528 |
megatron.core.transformer.mlp.forward.activation | 224.08902 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.619264 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 237.643326 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.28496 |
megatron.core.transformer.attention.forward.qkv | 6.849344 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.105312 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.12144 |
megatron.core.transformer.attention.forward.core_attention | 1,622.59436 |
megatron.core.transformer.attention.forward.linear_proj | 0.176384 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,630.219116 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.064 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.759552 |
megatron.core.transformer.mlp.forward.activation | 0.08736 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.699136 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.557536 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.06544 |
megatron.core.transformer.attention.forward.qkv | 320.137909 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.124192 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.102368 |
megatron.core.transformer.attention.forward.core_attention | 9,041.067383 |
megatron.core.transformer.attention.forward.linear_proj | 3.818496 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 9,366.875 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 614.570313 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.22304 |
megatron.core.transformer.mlp.forward.activation | 311.892639 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.049664 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 314.93634 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.473888 |
megatron.core.transformer.attention.forward.qkv | 0.592064 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.084352 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.0952 |
megatron.core.transformer.attention.forward.core_attention | 2,040.485474 |
megatron.core.transformer.attention.forward.linear_proj | 0.093216 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 2,041.650391 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.037888 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.373568 |
megatron.core.transformer.mlp.forward.activation | 0.047936 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.34848 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.781696 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.038496 |
megatron.core.transformer.attention.forward.qkv | 204.408676 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.116512 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.093632 |
megatron.core.transformer.attention.forward.core_attention | 1,795.054321 |
*End of preview; the table is truncated mid-sample.*
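The per-call timings above repeat the same 12 module paths across samples, so they are natural to aggregate per module. A minimal sketch, assuming pandas is available and using a few values copied verbatim from the preview (the column names `0` and `values` come from the preview header; the timing unit is not stated by the dataset):

```python
import pandas as pd

# A handful of (module path, value) pairs taken from the preview above.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 196.692444),
    ("megatron.core.transformer.attention.forward.core_attention", 937.971436),
    ("megatron.core.transformer.attention.forward.qkv", 1.297152),
    ("megatron.core.transformer.attention.forward.core_attention", 37.656258),
]

# Column names mirror the dataset schema shown in the preview header.
df = pd.DataFrame(rows, columns=["0", "values"])

# Aggregate per module: how many samples, and their mean/max timing.
per_module = df.groupby("0")["values"].agg(["count", "mean", "max"])
print(per_module)
```

With the full dataset loaded the same way (e.g. via `datasets.load_dataset(...).to_pandas()`, dataset name omitted here since it is not shown in the preview), this groupby immediately surfaces that `core_attention` dominates the forward-pass time in most samples.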