Asset Handling

Asset Types

All TF2 Panda asset types have a source representation and a compiled binary representation. During the tfmodels build process, the source-form assets are compiled into their binary counterparts for fast loading in the game. Below is a description of how the various asset types are represented and built.

Textures

Textures are represented in source form by one or more image files and a corresponding texture description file, .ptex. For example, hands.rgb contains the actual image data, and hands.ptex will reference hands.rgb and define properties for the texture, such as the filtering mode, color space, etc. When tfmodels is built, hands.rgb and hands.ptex will be compiled into a single binary .txo (Texture Object) file, containing both the image data and texture properties.

Below is an example of a .ptex file.

{
  image hands.rgb
  format srgb
  wrap clamp
}

Materials

Materials are used to define the appearance of a mesh. Various material "types" with different kinds of parameters are available. Each material type has an associated shader that renders the material using the given parameters. They are analogous to .vmt files.

Materials are represented in source form by a .pmat file, which is a text file that specifies the material type and values of parameters. .pmat files can also specify fixed-function render state attributes, such as depth test mode, culling mode, etc. When tfmodels is built, .pmat files are compiled into binary .mto (Material Object) files.

Below is an example of a .pmat file.

{
  // Define the material type to use.
  "material"   "SourceMaterial"
  // Parameters for SourceMaterial.
  "parameters" 
  {
    "basetexture"     "door_grate001_metalgrate.ptex"
  }
  
  // Fixed-function render state attributes.
  "cull"   "double_sided"
  "alpha_test"   "greater_equal"
  "alpha_test_ref"   0.37
}

Models

In source form, models are stored as one or more .blend files and a single .pmdl file. For an animated character, there will be an additional .blend file for each animation.

During the build process, the .blend files are exported to .egg, which is Panda's intermediate representation of a model, analogous to .smd, .fbx, etc. EGG is different from SMD in that it can define all "pieces" of the model in a single file.

The .pmdl file is analogous to .qc. It fully defines a model by referencing a .egg file and defining properties for the model, such as material groups (skins), animation definitions, LODs, physics properties, and more.

The .pmdl file and all referenced .egg files are compiled into a single binary .bam (Binary Animation and Model) file.

Below is an example of the files used to represent a model during the build process.

flowerpot-zero.blend -> exports to flowerpot-zero.egg (base model)
flowerpot-dance.blend -> exports to flowerpot-dance.egg (animation)
flowerpot.pmdl -> references flowerpot-zero.egg and flowerpot-dance.egg, compiles to flowerpot.bam

flowerpot-zero.egg:

<Group> flowerpot.skeleton {
  <Dart> { 1 }
  <Group> flowerpot_physics {
    <Transform> {
      <Matrix4> {
        1 0 0 0
        0 1 0 0
        0 0 1 0
        0 0 0 1
      }
    }
    <VertexPool> flowerpot_physics {
      <Vertex> 0 {
        -32 15.999997 2.000001
        <UV> {
          0 0
          <Tangent> { 1 0 0 }
          <Binormal> { 0 0.396704 0.461616 }
        }
        <Normal> { -0.793433 0.461616 -0.396704 }
        // body:1
      }
      <Vertex> 1 {
        32 16.000002 2.000001
        <UV> {
          0 0
          <Tangent> { 1 0 0 }
          <Binormal> { 0 -0.240185 0.139761 }
        }
        <Normal> { -0.960613 0.139761 0.240185 }
        // body:1
      }
...

flowerpot-dance.egg:

<Table> {
  <Bundle> flowerpot.skeleton {
    <Table> "<skeleton>" {
      <Table> body {
        <Xfm$Anim_S$> xform {
          <Scalar> fps { 24 }
          <Char*> order { sprht }
          <S$Anim> h { <V> { -0.003495 } }
          <S$Anim> p { <V> { 89.992988 } }
          <S$Anim> r { <V> { 0.003495 } }
        }
        <Table> armR {
          <Xfm$Anim_S$> xform {
            <Scalar> fps { 24 }
            <Char*> order { sprht }
            <S$Anim> r {
              <V> {
                0 0.144328 3.54661 10.9766 21.7927 35.3457 50.994
                68.0935 86.0002 104.07 121.656 138.114 152.803 165.076
                174.289 -179.407 -175.641 -175.312 -179.091 173 162.187
                155.524 158.002 158.002 158.002 158.002 158.002 158.002
                158.002
              }
            }
            <S$Anim> x { <V> { 28.492493 } }
            <S$Anim> y { <V> { 57 } }
            <S$Anim> z { <V> { 15.783018 } }
          }
        }
...

flowerpot.pmdl:

{
  model "optchar/flowerpot-zero.egg"

  material_paths [
    "../materials/models/flowerpot"
  ]

  material_groups [
    {
      name red
      materials [ "flowerpot_red.pmat" ]
    }
    {
      name blue
      materials [ "flowerpot_blue.pmat" ]
    }
  ]

  sequences [
    {
      name dance
      anim "optchar/flowerpot-dance.egg"
      fps 24
      loop true
    }
  ]

  physics_model {
    mesh flowerpot_physics
    mass 100.0
  }
}

Asset Porting

In order to bring a texture, material, or model from Source TF2 into TF2 Panda, the asset must be decompiled from the Source Engine and converted into a proper representation for TF2 Panda.

Textures

Porting a texture is very simple. You usually don't need to port a standalone texture by hand, because the material porting script automatically converts any textures a material references. Manual porting is only needed when the porting script doesn't support the VMT material type, or when the texture isn't actually used by a VMT.

Use VTFEdit, vtf2tga, etc. to export the .vtf into raw image files. Place the raw image files in tfmodels using the same directory structure as the original game, but starting with the maps folder instead of materials. For example, materials/models/player/engineer_red.vtf should be exported to tfmodels/src/maps/models/player/engineer_red.tga.

Then, create a .ptex file for the texture in the same directory that references the .tga. The .ptex file should define the format of the texture and its filtering properties. Use srgb for 3-channel color images and srgb_alpha for 4-channel color images. Use rgb/rgba for non-color data such as normal maps, phong masks, etc. VTFEdit shows the properties used by the original texture, which can inform the properties specified in the .ptex.

Look at other .ptex files in tfmodels for reference.
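
Continuing the engineer_red example above, the .ptex placed next to the exported image would look roughly like this (a minimal sketch; the format and any wrap or filter settings should be taken from the original .vtf's properties in VTFEdit):

{
  image engineer_red.tga
  format srgb
}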

Materials

The most commonly used VMT materials (VertexLitGeneric, LightmappedGeneric, etc) can be automatically converted into .pmat form with the tfmodels/src/devscripts/port_vmt.py script. It will automatically convert the material parameters and textures, save the converted files in tfmodels, and register them in the build scripts.
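
The invocation is roughly of the form below (a sketch only; I'm assuming the script takes the path to the source .vmt, the same way port_mdl.py takes a .mdl path, and the path shown is just an example; check the script itself for its exact arguments):

python tfmodels/src/devscripts/port_vmt.py materials/models/props_gameplay/door_grate001_metalgrate.vmt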

Less common material types, such as skyboxes, have to be manually converted and added to tfmodels, including the textures. You can look at other ported skyboxes and materials for reference.

The VMT material might not have a corresponding .pmat material type. In this case, we need to implement a new material type for it and write a corresponding shader, which is probably not a job for you.

Models

Porting a model can either be extremely simple and automatic or cumbersome and tedious.

Static prop models fall on the simple and automatic side: the tfmodels/src/devscripts/port_mdl.py script handles porting static props, including any referenced materials and textures. Give it the path to the .mdl file and it will do the job.
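
For example, something along these lines (the .mdl path is hypothetical; point it at the model you want to port):

python tfmodels/src/devscripts/port_mdl.py models/props_farm/flowerpot001.mdl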

Animated characters, including weapon models and dynamic props, are more involved. There is no automatic process for converting animations, but there are some scripts to help you along the way. Luckily there are way more static props than animated characters, so you don't have to deal with this too often. I may write detailed instructions for this at a later time. The general process is:

1. Use something like Crowbar to decompile the .mdl, import the .qc into Blender, and save it as modelname-zero.blend.
2. Run blender -b -P tfmodels/src/devscripts/import_smd_anims.py, passing in the directory containing the .smd files for each animation of the model. It will create a .blend file for each animation that links to the base model .blend.
3. Port the materials used by the model by invoking tfmodels/src/devscripts/port_vmt.py on each material the model uses, or by running port_mdl.py with the -m option.
4. Edit the Sources.pp script in the model's directory, adding a tf_char_egg build rule to export the .blend files to .egg files and optimize them.
5. Create a .pmdl for the model, using the original .qc as a reference to set up the LODs, material groups, animation definitions, etc. (see the sketch below).
6. Finally, add an install_mdl rule to Sources.pp, specifying the .pmdl file for the model.
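
As a rough illustration of the .pmdl step, here is a minimal sketch for a hypothetical ported character (all file names and values are made up; take the real sequence names, frame rates, and looping flags from the original .qc):

{
  model "optchar/heavy-zero.egg"

  material_paths [
    "../materials/models/player/heavy"
  ]

  sequences [
    {
      name run
      anim "optchar/heavy-run.egg"
      fps 30
      loop true
    }
  ]
}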

Maps

This one is pretty well automated aside from some edge cases, such as not-yet-ported dynamic props or not-yet-implemented entity types. Get the .vmf for the level and place it in tfmodels/src/levels. Then run python tfmodels/src/devscripts/port_vmf.py -i tfmodels/src/levels/map.vmf -b -p -o. This script will port all brush materials, overlay materials, and static prop models and materials. That does most of the job. Then you should be able to run the mapbuilder on it (see the Wiki page on compiling maps).

Particle Systems

The particle system is still a work in progress, and there is currently no textual source representation for particle systems. What I've done so far is decompile the particle systems using dmxconvert, then read the resulting .dmx file to create a particle definition in tf/src/tfbase/TFEffects.py. The materials and textures have to be converted by hand.
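
The decompile step looks roughly like this (from memory, assuming the Source SDK's dmxconvert and its keyvalues2 text encoding; the .pcf path is just an example, so check dmxconvert's own help for the exact flags):

dmxconvert -i particles/flamethrower.pcf -o flamethrower_text.pcf -oe keyvalues2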

UI

Haven't gotten to this point yet in development. All the UI is very basic.