
I wrote Surface Shaders 2.0 so you don't have to deal with SRPs anymore

Discussion in 'General Graphics' started by jbooth, Jan 20, 2021.

  1. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So I wrote a surface shader system, which currently compiles shaders for Standard and URP, with HDRP support coming soon. I think it's much cleaner than the old surface shader system, which has a lot of kookiness around screenPos, viewDir, normals, etc. It's also modular, allowing you to do something like #includes, but where they can bring properties, cbuffer entries, and other stuff along with them.

    As an example, here's a basic shader with tessellation support:

    Code (CSharp):

    BEGIN_OPTIONS
       Tessellation "distance"
    END_OPTIONS

    BEGIN_PROPERTIES
       _Albedo ("Albedo", 2D) = "white" {}
       _Normal ("Normal", 2D) = "bump" {}
       _Height ("Height Map", 2D) = "black" {}
       _DisplacementAmount("Displacement Amount", Range(0,2)) = 0.5
       _DisplacementMipBias("Displacement Mip Bias", Range(0,6)) = 2
       _TessSubdiv("Tessellation Subdivisions", Range(2, 24)) = 8
       _TessMinDistance("Tessellation Min Distance", Float) = 0
       _TessMaxDistance("Tessellation Max Distance", Float) = 35
    END_PROPERTIES

    BEGIN_CBUFFER
       float _DisplacementAmount;
       float _DisplacementMipBias;
       float _TessSubdiv;
       float _TessMinDistance;
       float _TessMaxDistance;
    END_CBUFFER

    BEGIN_CODE

       sampler2D _Albedo;
       sampler2D _Normal;
       sampler2D _Height;

       // (optional) modify the vertex post tessellation
       void DisplaceVertex(inout VertexData v)
       {
          v.vertex.xyz += v.normal * tex2Dlod(_Height, float4(v.texcoord0.xy, 0, _DisplacementMipBias)).g * _DisplacementAmount;
       }

       // (optional) if you are using tessellation and displacement, you can return
       // the tessellation distance and subdivision here
       float3 GetTessDistanceFactors()
       {
          return float3(_TessMinDistance, _TessMaxDistance, _TessSubdiv);
       }

       void SurfaceFunction(inout LightingInputs o, ShaderData d)
       {
          half4 c = tex2D(_Albedo, d.texcoord0.xy);
          o.Albedo = c.rgb;
          o.Alpha = c.a;
          o.Normal = UnpackNormal(tex2D(_Normal, d.texcoord0.xy));
       }

    END_CODE
    This will compile to all three render pipelines, and acts just like any other shader in your system. Note that you don't write the v2f and other traditional structures - rather, the system uses a naming convention and constructs them for you based on whether you access that data. So, for instance, if you read d.TangentSpaceViewDir, then it will be provided for you.
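    For example, a crude offset-parallax tweak (a sketch only - the parallax math and 0.02 scale are illustrative, not part of the system; only the SurfaceFunction signature and ShaderData fields come from the examples here):

    ```hlsl
    void SurfaceFunction(inout LightingInputs o, ShaderData d)
    {
       // Because d.TangentSpaceViewDir is read here, the generator adds the
       // tangent-space view direction to the structs it emits; remove this
       // read and that data is never computed or interpolated.
       float2 uv = d.texcoord0.xy + d.TangentSpaceViewDir.xy * 0.02; // crude offset parallax
       o.Albedo = tex2D(_Albedo, uv).rgb;
       o.Alpha = 1;
    }
    ```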

    Here is an example shader documenting the current data and options that are available (this will grow):

    Code (CSharp):

    BEGIN_OPTIONS
       // ShaderName "Path/ShaderName"  // The default will just use the filename, but if you want to path/name your shader
       // Tessellation "Distance"       // automatic tessellation: distance, edge, phong
       // Alpha "Blend"                 // use alpha blending?
       // Fallback "Diffuse"            // fallback shader
       // CustomEditor "MyCustomEditor" // custom editor
       // RenderType "Opaque"           // render type
       // Queue "Geometry+100"          // forward rendering order
       // Workflow "Metallic"           // Specular or Metallic workflow, metallic is default
    END_OPTIONS

    // Put any properties you have between the begin/end property blocks
    BEGIN_PROPERTIES
        _Color ("Main Color", Color) = (0, 1, 0, 1)
    END_PROPERTIES

    // Any variables you want to have in the per-material CBuffer go here.
    BEGIN_CBUFFER
        half4 _Color;
    END_CBUFFER

    // If you are writing a subshader, any defines that should be set on the main
    // shader are defined here
    BEGIN_DEFINES

    END_DEFINES

    // All code goes here
    BEGIN_CODE

       // (optional) if you want to modify any vertex data before it's processed,
       // put it in the ModifyVertex function. The struct is:
       // struct VertexData
       // {
       //    float4 vertex      : POSITION;
       //    float3 normal      : NORMAL;
       //    float4 tangent     : TANGENT;
       //    float4 texcoord0   : TEXCOORD0;
       //    float4 texcoord1   : TEXCOORD1;
       //    float4 texcoord2   : TEXCOORD2;
       //    float4 texcoord3   : TEXCOORD3;
       //    float4 vertexColor : COLOR;
       // };

       // (optional) modify the vertex
       void ModifyVertex(inout VertexData v)
       {
       }

       // (optional) modify the vertex post tessellation
       void DisplaceVertex(inout VertexData v)
       {
       }

       // (optional) if you are using automatic tessellation and displacement, you can return
       // the tessellation distance and subdivision here
       float3 GetTessDistanceFactors()
       {
          float minDistance = 0;
          float maxDistance = 35;
          float subDiv = 12;
          return float3(minDistance, maxDistance, subDiv);
       }

       // (required) Write your surface function, filling out the inputs to the
       // lighting equation. LightingInputs contains:

       // struct LightingInputs
       // {
       //    half3 Albedo;
       //    half3 Normal;
       //    half  Smoothness;
       //    half  Metallic;   // only used in metallic workflow
       //    half3 Specular;   // only used in specular workflow
       //    half  Occlusion;
       //    half3 Emission;
       //    half  Alpha;
       // };

       // The ShaderData struct contains common data you might want, precomputed
       // for you. Note the system strips unused elements from the structures automatically,
       // so there is no cost to unused stuff.

       // struct ShaderData
       // {
       //    float3 LocalSpacePosition;
       //    float3 LocalSpaceNormal;
       //    float3 LocalSpaceTangent;
       //    float3 WorldSpacePosition;
       //    float3 WorldSpaceNormal;
       //    float3 WorldSpaceTangent;
       //    float3 WorldSpaceViewDir;
       //    float3 TangentSpaceViewDir;
       //    float4 texcoord0;
       //    float4 texcoord1;
       //    float4 texcoord2;
       //    float4 texcoord3;
       //    float2 screenUV;
       //    float4 screenPos;
       //    float3x3 TBNMatrix;
       // };

       void SurfaceFunction(inout LightingInputs o, ShaderData d)
       {
           o.Albedo = _Color.rgb;
           o.Alpha = _Color.a;
       }

    END_CODE
    Note that even though I have automatic ways to do things like tessellation, it's still possible to write your own versions of those functions if you want, or to add geometry shaders, compute buffer data, etc. There are significantly fewer restraints and assumptions than the old surface shader system had. The naming conventions are clear about things like "what space is this in?", and there's no funky "this will be in this space on Tuesdays, but on Wednesdays it will return NaN, and on Friday it will be whatever the value in o.pos is". And you don't have to do funky stuff to get at the TBN matrix - it's just there, where you can access it. Crazy, right?

    Shaders are also modular. One shader can include another, and it will bring in its properties, cbuffer entries, etc. Right now this is only handled via code, but it would be possible to provide it via a scriptable object, such that users could add "Snow" to an existing shader. Obviously there are limits to how far you can push this, but adding weather effects to existing shaders is a perfect example of something that has been an issue for a lot of games in the past.
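    As a purely hypothetical sketch (the thread doesn't show the actual include mechanism, so everything below is illustrative), a "Snow" module might carry its own property and cbuffer entries along with its code:

    ```hlsl
    // Hypothetical "Snow" module - block names follow the examples above, but
    // the property, cbuffer entry, and blend logic are made up for illustration.
    BEGIN_PROPERTIES
       _SnowAmount("Snow Amount", Range(0,1)) = 0
    END_PROPERTIES

    BEGIN_CBUFFER
       half _SnowAmount;
    END_CBUFFER

    BEGIN_CODE
       void SurfaceFunction(inout LightingInputs o, ShaderData d)
       {
          // blend white over upward-facing surfaces
          half snow = saturate(d.WorldSpaceNormal.y) * _SnowAmount;
          o.Albedo = lerp(o.Albedo, half3(1, 1, 1), snow);
       }
    END_CODE
    ```

    The point being that a host shader including this module would pick up _SnowAmount in its Properties and CBuffer blocks automatically, rather than the user pasting those in by hand.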

    So, the main benefits:
    - A simple, consistent way to write shaders
    - Write once, run on Standard, URP, or HDRP
    - Shaders automatically upgrade between SRP versions (*assuming I have done the support for them)
    - Can write features as separate shaders, then plug them together.

    ------

    So, all that said, it would be interesting to know some of the following, assuming this is something that interests you:

    - Which SRP do you use, if any?
    - What use cases do you write custom or surface shaders for?
    - What features did you find limiting in Surface Shaders that you would want to have control over?
    - Which unique shading features do you use in your SRP (ie: Bent normals, SSS, etc), and how would you like to handle fallbacks in other pipelines (ie: Approximation of SSS in URP, don't bother, etc).

    I also have a lot of different thoughts about how to sell this. I will most likely move MicroSplat over to using this system instead of its current render adapters and provide an upgrade there, as maintenance of systems like these is a huge potential cost. Last year, supporting SRPs was about half of my development time, so having only one system to abstract these issues makes a ton of sense.

    Anyway, thoughts welcome.
     
    Last edited: Jan 20, 2021
  2. Cynicat

    Cynicat

    Joined:
    Jun 12, 2013
    Posts:
    290
    Holy crap, nicely done!
     
    Walter_Hulsebos and RB_lashman like this.
  3. andrea_i

    andrea_i

    Joined:
    Nov 18, 2012
    Posts:
    32
    This is 100% needed, and personally I'd have loved to see a system like this used as the base foundation for shadergraph.
     
    AlejMC, LooperVFX, Cynicat and 5 others like this.
  4. Peter77

    Peter77

    QA Jesus

    Joined:
    Jun 12, 2013
    Posts:
    6,636
    Walter_Hulsebos and fherbst like this.
  5. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Design is based off that, yes - but actually compiles to multiple pipelines, supports transparency, spec/metallic workflows, shader stripping, tessellation, proper line number error reporting, etc..
     
    Last edited: Jan 20, 2021
  6. print_helloworld

    print_helloworld

    Joined:
    Nov 14, 2016
    Posts:
    231
    Now this is epic
     
  7. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Brilliant -- this is what Unity's own team should have done from the start, which would have saved the rest of us vast amounts of time.

    Have you considered any integration of this with Amplify Shader Editor, possibly as a custom template?
     
  8. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    This looks great, love it! And yeah, ticks all the boxes of what I think "surface shaders 2.0" should have been.

    The original surface shaders system was done for Unity 3.0 pretty much a decade ago, and one of the causes of its wonkiness was that it tried real hard to save on instructions & interpolators. DX9 shader model 2.0 with max. 64 instructions and 8 interpolators was still a big thing back then, and since all the previously hand-written Unity built-in shaders were going to get changed to surface shaders, the system spent a whole lot of effort in making sure there were no regressions in terms of generated code, instructions or interpolators, compared to hand-written shaders. This did lead to things like "oh, a shader does not actually sample the normal map in this variant? this means tangent space does not need to be passed, saving us a couple multiplies". That was (I think) a good call for the DX9 SM2.0 era, but now a decade later it's just mostly pointless confusion & complexity.

    My own initial prototypes for surface shaders looked really similar to your example above (see blog post), which is curious since there's 11 years of time between them :) The final result that ended up in Unity was different since once you want to have multiple sub-shaders, fallbacks, more features, etc. etc. it (to some extent) stops being this nice thing and becomes a bit of a mess. A whole lot of that is much less relevant today though, so fingers crossed your system does not have to become a mess.

    And yes, doing all the codegen & logic in C# with a custom importer makes much more sense.
     
  9. fherbst

    fherbst

    Joined:
    Jun 24, 2012
    Posts:
    802
    Awesome Jason, eager to try this out!

    Frustration and anger are very powerful forces to push the state of the art ;)

    Not sure how that fits with monetization, but I think keeping this working across versions, adding arcane features etc would probably be easier if the source was accessible somewhere - I'm sure many people (including me) would love to push this forward.
     
    Last edited: Jan 20, 2021
  10. AFrisby

    AFrisby

    Joined:
    Apr 14, 2010
    Posts:
    223
    SRP: Them all. :'(
    SRP Unique Features: SSS, Iridescence, Bent Normals.
    Fallbacks: Please emulate *if possible*, drop if not.
     
  11. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, it's pretty obvious looking at the output code how much work was done to save even one interpolator or computation, none of which matters much anymore. But it likely helped Unity gain dominance on early mobile, which is a large part of the company's success story - so it might be baggage now, but it had its use. URP is largely a cleaned-up version of the old pipeline, and once you strip all the bloat the shader graph outputs around packing and unpacking structures, it's quite workable - but I will never understand why it does all that struct conversion; simply standardizing the structs across the pipelines and stages would do a ton to make the code more readable. Right now, it's like 9k lines, 70% boilerplate, and you're like "Where is the actual frag function?" (*it's deep in includes).

    I will forever be bitter at whoever made the macros assume that the position is called pos in the standard shader. I'm forced to keep that convention internally (well, or unroll a ton of macros spread across multiple files to avoid it).

    The begin/end block style makes it super easy to parse - and while I prefer the surface shader syntax (looks like code), I'm trying to make this as easy on myself as possible so I can focus on the larger issues: what to do about features which only exist in HDRP and how to gracefully fall back, how to monetize this thing so I'm not trapped in porting hell for minimum wage, etc. But the need is there, and I've been hacking together adapters for MicroSplat long enough that doing a more formalized system might make sense even if no one else uses it, and 5 years of raising the issue with Unity hasn't gotten anywhere - so I'm going for it and we'll see what happens. I think, if anything, I'll say no to more stuff than surface shaders did, and be able to ride off the benefits of having pushed that system to its breaking point, as well as not needing to handle super low end stuff anymore.

    I mean the fact that I can do all of this via the game engine's scripting language is kind of what makes Unity great. A lot of new Unity has been moving away from this kind of scriptability, closing off APIs, etc - but to me that is what makes the engine cool.
     
  12. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    829
    Wow, that sounds incredible so far! I wonder what the catch is, if you don't mind me asking. I assume you're only planning to support one version of HDRP and URP, e.g. the version shipping with the current LTS?

    To answer your questions:

    I'm trying HDRP on a personal project (going to try HDRP again with release 10, primarily interested in archviz-like scenes coupled with VFX experiments, mostly because that's what built-in HDRP features and Shader Graph seem to handle well) and built-in deferred pipeline on our main project.

    We're mostly using surface shaders. We don't need to do anything custom with lighting/shadows (although it's nice to be able to extend the output structure and modify deferred to use it, like AFS does with translucency/wrap lighting output it adds). Lots of the shaders in the project for things like visual effects are something you can potentially build with Shader Graph, but there are couple of things I'm not sure it'd support. We rely on support for procedural instancing (we submit batches using Graphics.DrawMeshInstancedIndirect and our shaders read StructuredBuffers to get instance data), can't render our levels without that. We also needed to use custom shaders to use Star Citizen style mesh decals blending into individual deferred buffers. I vaguely recall an old version of that using some of the special "final" functions surface shaders has, but current one is fully custom so we probably hit some limitation with that (can't quite recall right now, I can look into it if it'd be useful).
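    For context, the procedural instancing setup described here (built-in pipeline; the buffer and struct names are illustrative, not from this project) generally looks something like:

    ```hlsl
    // Illustrative sketch of built-in pipeline procedural instancing, not code
    // from the system in this thread. The buffer is filled from C# and bound
    // with material.SetBuffer before Graphics.DrawMeshInstancedIndirect.
    #pragma instancing_options procedural:SetupInstance

    struct InstanceData
    {
       float4x4 objectToWorld;
    };

    StructuredBuffer<InstanceData> _InstanceData;

    void SetupInstance()
    {
    #ifdef UNITY_PROCEDURAL_INSTANCING_ENABLED
       // unity_InstanceID is supplied by Unity's instancing framework
       unity_ObjectToWorld = _InstanceData[unity_InstanceID].objectToWorld;
    #endif
    }
    ```

    Any surface shader replacement has to leave room for this kind of setup function and buffer declaration, which is the portability concern being raised.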

    A bit less boilerplate, like your examples show, would already be a huge improvement. My personal most hated quirk is properties like worldPos giving you different things under different circumstances, and maybe how the vertex shader in surface shaders uses a different space vs. custom shaders. It makes something technically trivial, like a surface billboard shader, more of a pain to implement, as you have to revert whatever was done to the value you hoped to get, then transform your output back into the space the surface system expected. Can't quite recall much else to complain about off the top of my head - I'm actually pretty happy with surface shaders relative to many other parts of Unity; the biggest pain is them not existing in any form on the new pipelines. :)

    Full support for all useful forms of instancing (and maybe making it possible to do SSS and deferred mesh decals that can blend separately on separate targets, e.g. just normals) are the most important things for me. Not much else comes to mind just yet. Wrt emulation, I don't think I'll need emulation of URP/HDRP features on built-in, if that's what you were referring to.

    No matter how you'll decide to release it, I'm very much looking forward to hearing more. Keep up the great work, I think Unity community is lucky to have you!
     
    Last edited: Jan 20, 2021
    AlejMC and SonicBloomEric like this.
  13. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So yeah, the money part. It's what stopped me from doing this a while back, because I didn't really want to be the janitor writing these templates and chasing down changes every time they are made, but also because how do you, exactly, get paid for that work? I think there are several users of this system:

    1. Someone who just wants an approachable way to write a shader, without being hampered by a graph. Maybe they are doing something with compute and need to feed it procedural data, and having to unroll a 10,000 line HDRP shader and modify it is not exactly fun.
    2. People publishing to the Asset Store, like myself, who have to support multiple pipelines.
    3. Larger studios who want one way to write things for multiple projects, which could be using any renderer.

    I'd like to support all of these cases, but they each present different challenges. For 1 and 3, you sell an asset on the store that does what it does, and they are happy. However, the market for this is likely not massive - and since most larger studios buy one license and illegally share it, you're not getting paid what you're supposed to be paid by those customers, even though it's easily worth a lot of money to those studios.

    For #1:
    - They need a reasonably priced solution that solves their personal problem: making shaders easy to develop. They are likely not super interested in multiple pipelines.

    For #2:

    Here there could be options:
    - Adapters sold separately (much like my URP/HDRP adapters for MicroSplat). Maybe the standard pipeline is free, and people buy the URP/HDRP adapters. I get money from selling more adapters; they save tons of development effort by not having to develop everything three times. This is obviously more complex, but at some scale, if enough developers sign on, then users wanting those products for HDRP/URP pick up the adapters and get support in a whole bunch of products. But this needs scale to work - if developers don't sign on, then the only adapters which sell are because of my products (MicroSplat, etc) anyway. And though I have not really had any backlash about selling SRP compatibility, a lot of UAS devs want to embrace race-to-the-bottom tactics, thinking that pricing low and killing themselves supporting all these different pipelines is going to get them somewhere (it's not).
    - Enterprise licenses to UAS developers. Real money per year, you get to ship a DLL that compiles this stuff with your product. To the user, everything just works, and you save development time. But this is a small market of mostly shader heavy assets.
    - Just sell the editor, and make UAS developers export different shaders for each platform/version/etc, like they have to do now. This kind of blows, because the user experience is still pink materials on install until they unpack the "HDRP shaders for 2019.4" package, and asset store authors end up shipping non-modifiable source and multiple copies of everything. On the plus side, it's much easier on my end - I add an export button and I'm done. And if asset store authors ship their source and users buy my system, they can edit them from the source. (This is basically the Amplify model.)

    For #3:

    - Sell on the store, and get paid a fraction of what you are supposed to be paid.
    - Sell only through enterprise licenses. Get paid properly, and provide top notch support

    ----

    Trying to balance all this is hard. And a lot of these lead to source being in a DLL, which I'm personally not a huge fan of. I'd much prefer to open source it and have the community chip in, but a) Unity's community is not historically good with that, so it would mostly fall onto me and b) I'm not trying to go hungry atoning for Unity's sins.
     
  14. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Most likely I'd just support LTS versions, and if this works on non-LTS versions, great. Now if this became a major revenue stream, I might expand that, but not for what MicroSplat's HDRP/URP adapters make now (which is actually decent money, just not enough to rewrite them every other month).

    That would be useful to know more about. I have not added a final-color-style system yet, but that would be very easy to add. The Code block is just kinda dropped into each pass, and I plan to have defines so you can easily write code for specific passes or SRPs if you need to do that kind of stuff. Things like structured buffers and such would just work.


    Right now the ModifyVertex function is called before any transforms are done to the vertex, so it's entirely local space. However, it would be trivial to break that into a function that is called on the vertex, and one that is called on the v2f structure after its transforms are done, so if you wanted to modify the vertex in clip space, or even get into funky low-level stuff like modifying the lightmap UVs in URP only, you could. But I haven't exposed that yet, because the v2f structure is different depending on pass and SRP, and that seems prone to error in a lot of cases.

    Well, for instance, there are cheap approximations for SSS that would be fine for URP/Built In, so I could add those and have some kind of fallback setting so you can emulate it or not. Bent Normals I'd likely just drop. It's kind of case by case, because HDRP has a lot more shading features, and some of them aren't easily emulated. Instancing should just work.
     
  15. FM-Productions

    FM-Productions

    Joined:
    May 1, 2017
    Posts:
    72
    I just wanted to say that I absolutely love an approach like this! Can't wait to try it out. Letting everything be modifiable and having fewer restraints is a big plus too.

    I have been working with the old Surface shaders for a good bit recently because I'm not yet willing to port my bigger project to one of the new render pipelines (will probably be HDRP if so) - but even then I would likely go with Amplify Shader instead of Shader Graph (from the tools that are available to me so far).

    What I wished for in the Standard Render Pipeline was a way to have master materials/material templates, after I saw how Unreal handles this (which is very designer friendly). A good comparison would be Prefabs to Prefab variants, where a material variant would inherit all properties and property references of the parent unless explicitly marked for overriding. That is most likely outside the scope of a tool meant to compile shaders for multiple render pipelines, but it's still a thing that bothered me with the Standard Pipeline workflow.
     
    Last edited: Jan 20, 2021
    AlejMC likes this.
  16. LeFx_Tom

    LeFx_Tom

    Joined:
    Jan 18, 2013
    Posts:
    88
    Just to chip in here with some piece of advice:
    Have you checked how the whole Shapes approach via Patreon "pay what you think it's worth" worked for Freya Holmér? It seemed like she made some quite ok returns by open-sourcing a "demo" and having people pay via Patreon.
    I don't know exact numbers, but it seemed to be a surprisingly viable approach for her. Maybe worth reaching out and getting some feedback from her about that?
     
    syscrusher likes this.
  17. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    139
    I bought that on 50% sale. I was surprised by the results of the "how much should I charge for Shapes?"-Poll on twitter.
    I think compared to ShaderForge, Shapes is really expensive (when not on sale), but Freya absolutely deserves the success nonetheless.
     
  18. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    829
    All of that sounds absolutely great!

    If I recall correctly, the specific surface shader feature we originally used was the finalgbuffer function, which let you blend the contribution to each render target separately by modifying the alpha of the output color. It didn't allow you to do anything fancy like proper normal blending, since you can't read what's already in the target, but it was better than nothing. After that we also wanted to blend smoothness nicely, and that started complicating things, because you had to make it a two-pass shader for an imperfect approximation (due to smoothness being packed into the alpha of one of the targets, and therefore needing to write a backing color into it). The current decal shader we use is just a fork of the Standard shader (not a surface shader) that first does a Blend Zero SrcColor pass to output something like

    outGBuffer0 = half4(1 - alphaAlbedo, 1 - alphaAlbedo, 1 - alphaAlbedo, 1 - alphaOcclusion);

    and then does a Blend One One pass to apply the actual contributions multiplied by per-output alpha. Hopefully that helps!
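    Sketched as ShaderLab pass structure (a rough illustration of the two-pass blend described above; the fragment programs are summarized as comments rather than written out):

    ```shaderlab
    // Illustrative two-pass deferred decal blend; only the Blend states are
    // taken from the description above, everything else is a placeholder.
    SubShader
    {
        // Pass 1: multiply down what's already in each gbuffer by (1 - alpha),
        // with a potentially different alpha per render target.
        Pass
        {
            Blend Zero SrcColor
            // fragment would output, e.g.:
            // outGBuffer0 = half4(1 - alphaAlbedo, 1 - alphaAlbedo, 1 - alphaAlbedo, 1 - alphaOcclusion);
        }
        // Pass 2: additively apply the decal's premultiplied contribution.
        Pass
        {
            Blend One One
            // fragment would output color * alpha for each gbuffer target
        }
    }
    ```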

    I think HDRP had some support for this decal shader style built in, last I tried it (when the FPS Sample was first released), but I haven't had a chance to try it in the latest HDRP yet, and I haven't heard of URP having any mesh decal support. It's a bit hard to imagine how to make a surface 2.0 shader like that work in all pipelines, especially with URP in the equation, since the whole idea is deferred-specific. I'm curious what your opinion is on handling deferred-specific cases like per-gbuffer outputs - is that something your system should even try to cover?
     
  19. bac9-flcl

    bac9-flcl

    Joined:
    Dec 5, 2012
    Posts:
    829
    I'm also curious if you could go into more detail about the outputs you're planning to support. Are you planning to focus only on a case of PBR surface materials (akin to Lit shader in new pipelines), which is what the original surface shader system primarily cared about? Most of the surface shaders you see only use SurfaceOutputStandard so that could be reasonable. Or are you planning to attempt replicating something akin to Shader Graph Master Nodes (Lit + Unlit + Hair + Decal + Fabric etc.) across all pipelines? A lot of shaders make interesting decisions not in the fiddly details of using final results like albedo value, but with how they arrive at a given value at a given point, so I can see a ton of value in getting multiple output types out of the box. But it's probably a much bigger support burden to try covering all of those output types when only basic lit surface is covered equally by all 3 pipelines.
     
  20. KYL3R

    KYL3R

    Joined:
    Nov 16, 2012
    Posts:
    139
    To answer your questions:

    - Which SRP do you use, if any?
    I'm using HDRP for a PC Game.

    - What use cases do you write custom or surface shaders for?

    A custom Grass Shader with ComputeBuffers to be used with DrawMeshInstancedIndirect. I used DrawMeshInstanced before with MaterialPropertyBlock-Arrays, but the indirect method is even better for me. I tried to redo it in ShaderGraph, which worked with MaterialPropertyBlocks on single meshes, but not per-mesh. The "access instanced props" is not supported in ShaderGraph, as far as I understood. I need that for grass interaction like bending, cutting etc.


    - What features did you find limiting in Surface Shaders that you would want to have control over?
    In classic Surface Shaders ("1.0") I always wanted to do more, like manipulating the vertices, which required me to write vertex/frag shaders from early on - I rarely touched surface shaders because they felt simple but limited. If I want a simple shader today, I'd use Shader Graph.

    - Which unique shading features do you use in your SRP (ie: Bent normals, SSS, etc), and how would you like to handle fallbacks in other pipelines (ie: Approximation of SSS in URP, don't bother, etc).
    Do not really bother for my current project.
     
  21. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    4,302
    Whether or not you remember this, we once had a discussion where I ranted about my frustrations with surface shaders. This solves all but two of them.
    1) Inconsistent conventions for struct fields. Some are PascalCase and some are camelCase.
    2) Perhaps a personal pet peeve of mine, but why does everyone call the position attribute "vertex" despite it really just being one attribute of a real vertex that the vertex shader processes?

    With those said, it is difficult for me to justify this over a graph workflow nowadays.
    1) Graph workflows have realtime intermediate outputs. I think if you really want this to take off, you need to provide a macro that generates a temporary output and a window that previews all the temporary outputs.
    2) Most of the time I need to write custom shaders, it is because I want to do something custom with the lighting.
    3) I hate the lack of proper IDE support for Unity shaders.

    I use HDRP and DOTS so you probably have no interest in supporting my graphical adventures, but maybe this offers food for thought.
     
  22. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    I think Rider is pretty good at shaders these days (i.e. started to get good sometime during 2020). Have you tried it?
     
  23. DreamingImLatios

    DreamingImLatios

    Joined:
    Jun 3, 2017
    Posts:
    4,302
    Rider isn't compatible with my custom code analysis tools.

    But my main point was that for a surface shader solution that relies on codegen to work, it needs IDE support, and the out-of-the-box visual studio IDE experience is not sufficient.
     
  24. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yes, she ended up shutting it down and going with the asset store, and someone leaked her source in a day.

    The primary focus will be on standard PBR shaders, though I will likely add unlit as an option, and cover some pipeline-specific features when possible. There are really two problems being solved here - one is portability across pipelines, the other is ease of writing shaders. It would be a lot easier to support all of HDRP's features if I weren't dealing with portability, for instance.
     
    bac9-flcl and FM-Productions like this.
  25. PhilSA

    PhilSA

    Joined:
    Jul 11, 2013
    Posts:
    1,926
    A use case that I always have a hard time with is: how can an asset store package work on the legacy RP, URP, and HDRP without any hassle? Imagine a sample game project that comes with a store package, and the shaders need to work everywhere.

    I feel like this is one of the most common reasons for wanting to support all the RPs at the same time, but I'm wondering about how this would work if the surface shaders solution ends up being a paid/licensed product. Could it be included in other asset store projects that are being sold?

    I think the ideal scenario would be if Unity paid you a good amount of money to acquire this tech, so it comes built-into the engine (if that's something you'd be ok with). A "pay me to keep working on this" Patreon might also work, but could be hard to pull off if you don't already have a large following
     
    Last edited: Jan 20, 2021
    syscrusher likes this.
  26. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, this is about half the use cases - the other is having a sane way to write simple shaders in HDRP/URP. Right now most people just ship multiple copies of the shader, and that would be an easy thing to support. But it doesn't give you automatic updating, or a non-pink world when you install. Another option is requiring paid adapters, as stated above, but I thought of a third this morning.

    Basically, I could sell a separate asset that lets you package the shaders for the asset store. It would essentially write them into a binary blob, one for every version you support - 2019.4 URP, 2020.4 URP, 2019.4 HDRP, etc. It would come with a small bit of code that can extract the correct shader for the current pipeline and compile it on import. This would make the shader seem "universal" to the user; however, it would still lack upgrading to new versions until the author exports a new one, and it would still require the user to have the code that reads these blobs installed, most likely from a separate download.

    The ideal scenario is that Unity would have solved this issue when they first started working on SRPs and built from there. But they didn't, and after about five years of screaming from the community they have shown little interest in solving it, loading more wood onto the fire instead. This is not a hard issue to solve internally - it's much harder to solve externally the way I'm doing it. For instance, I have to write code to capture shader errors and relay them to the user with line numbers for their files; that would be much easier with internal access, as the APIs there seem to have some issues (shaders showing compile errors even after they have compiled and I've used ShaderUtil to clear them, etc.). Also, they know what actual changes are being made to the shader source and why - whereas we're left to diff and guess, with no documentation or help.

    I'd love it if things like Patreon were viable, but they just really aren't beyond tips.
     
  27. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,444
    Wow! This is super readable!

    Could you show an example that passes vertex data to the surface function?

    The name DisplaceVertex implies that it only displaces vertices - is this an example, or a reserved keyword used by your processor? If it's a reserved keyword, the naming convention would benefit from something that shows which stage it runs at and what it does, like PostTessellationModifyVertex.
     
  28. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, naming isn't final - I'll likely do a pass to make things explicit, but also manageable in size. Right now there's a call for processing the vertex before tessellation, and one for after the vertex is tessellated. Both will get proper function names and arguments.

    To pass data from vertex to pixel, you'd just use the texcoords. Right now I support four float4s, but will likely add a few more. This means you cannot name the data and must pack it yourself, but that's a good tradeoff IMO. So for instance, if you had four values packed into the v.color.r channel and wanted to unpack them before sending them across the v2f bus so the interpolation doesn't mess them up, you might do something like this:

    Code (CSharp):
    1.  
    2. void ModifyVertex(inout VertexData v)
    3. {
    4.      v.texcoord3 = UnpackFloat4(v.color.r);
    5.      v.texcoord4 = UnpackFloat4(v.color.g);
    6.      v.texcoord5 = UnpackFloat4(v.color.b);
    7.      v.texcoord0.zw = ComputeSomeStuff();
    8. }
    This is a little suboptimal in that it bloats the vertex data structure, though I'm not sure how much that matters if the mesh doesn't have that data on it anyway. To fix this, I'd have to make the parser more complex to understand reads vs. writes, and generate a dummy structure that's used instead of the actual app structure, but for now this seems fine.
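    For illustration only - UnpackFloat4 above is just an example name, and any real packing scheme has to respect float precision (a 32-bit float's 24-bit mantissa can't hold four full 8-bit values exactly, and a vertex color channel is often only 8 bits to begin with). A sketch of what such a helper could look like, assuming four 6-bit values were packed into one channel:

```hlsl
// Hypothetical sketch, not part of the actual system: decode four
// 6-bit integers packed as
//   packed = v0 + v1*64 + v2*4096 + v3*262144   (24 bits total,
// which still fits exactly in a float's mantissa).
float4 UnpackFloat4(float packed)
{
    float4 v;
    v.x = fmod(packed, 64);  packed = floor(packed / 64);
    v.y = fmod(packed, 64);  packed = floor(packed / 64);
    v.z = fmod(packed, 64);
    v.w = floor(packed / 64);
    return v / 63.0; // normalize back to the 0..1 range
}
```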
     
    laurentlavigne likes this.
  29. ph_

    ph_

    Joined:
    Sep 5, 2013
    Posts:
    232
    I know it's probably a stretch, but a use case could also be custom SRPs.

    Like if I coded (or sold on the asset store) a custom SRP, I could implement some base class from your plugin to basically output the shader (from your internal format to "my" shaders).
     
  30. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    This wouldn't be hard to do, since it would just need a template for your format - but Unity is moving toward making custom SRPs impractical by hard-coding Shader Graph and the VFX Graph to work only with their own pipelines (or variants of them), so I doubt this is ever going to be a thing you see on the asset store.
     
    ph_ likes this.
  31. chingwa

    chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,792
    So please EXPLAIN to us why Unity is not providing this for its users. You come in and praise this as a solution while simultaneously dropping the ball completely and leaving your long-time users and customers in the lurch.
     
  32. ph_

    ph_

    Joined:
    Sep 5, 2013
    Posts:
    232
    Aras isn't necessarily "the one" who can decide this, especially since he's not working in graphics like he used to.

    Please stay respectful and constructive :)
    Besides, he didn't have to come and comment, but he did, and he shed some light on why things have been done this way in the past.
     
    JoNax97, syscrusher, LeFx_Tom and 6 others like this.
  33. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Company dynamics are usually not up to one person. I'm pretty sure Aras would have liked to see Unity provide this type of solution - he wrote the original surface shaders, after all - and maybe he can shed some light on why not, but at Unity's scale the answer is usually "politics".
     
  34. chingwa

    chingwa

    Joined:
    Dec 4, 2009
    Posts:
    3,792
    Yes that is all well and good, but I would really like some insight as to why this is not being provided now, when we need it more than ever with three completely incompatible rendering systems.
     
    atomicjoe and ratking like this.
  35. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Someone who actually makes graphics roadmap decisions & prioritization would have to do that. That person is not me :)
     
  36. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,526
    Could someone suggest they "grab" this developer/system, at least until the SRPs are "ready"?

    (The same way they did for missing networking > MLAPI, and missing visual scripting > Bolt... and maybe a few more?)
     
    Walter_Hulsebos likes this.
  37. BattleAngelAlita

    BattleAngelAlita

    Joined:
    Nov 20, 2016
    Posts:
    400
    How about SG -> Surf 2.0 conversion? Many HDRP/URP projects already have a lot of ShaderGraph shaders, and a converter could help them migrate to different renderers easily.
     
  38. Thomas-Mountainborn

    Thomas-Mountainborn

    Joined:
    Jun 11, 2015
    Posts:
    503
    Soo... what would it take to get you back into the graphics side of things? 0:]
     
    laurentlavigne likes this.
  39. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Thanks for all the background info. My first version of Unity was circa 5.3, so I didn't realize the surface shader model went back that far.
     
  40. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    I share your frustration with the pipeline incompatibilities, and the difficulty of migrating existing art assets to SRP (the automated upgrades help a lot, but they don't always work, and in particular sometimes miss the albedo map input, which to me should be the simplest one since it is literally the same functionality).

    That said, I think much can be attributed to Unity's octopus-like extension as a company, in a fast-moving industry. They're riding the wave of interactive content growing as an industry, plus hardware advances that have made realtime CGI for film production plausible, and they're trying to be everywhere at once while still keeping pace with (or catching up to) raw graphics advances from UE4 (and 5), CryEngine, etc.

    They've added people and acquired companies, and while all of this adds resources, it also challenges organizational leadership and inter-departmental coordination. Working teams can become silos, and there may be great management at the team level combined with visionary executive leadership -- but the sheer pace of growth strains the middle layers trying to communicate and coordinate despite middle managers trying their best to cope. New-hire onboarding takes time and resources from tenured mentors (even if the new hires are technically brilliant), and corporate acquisitions are disruptive of technical and administrative infrastructure, and of organizational dynamics.

    I am not intending to either excuse or criticize Unity, only to share observations I have made at other growing organizations in my role as an ITSM (IT Service Management) analyst for over 15 years. In most of the clients I've assisted, the challenges were at least as much a matter of organizational silos as of technology or resourcing limitations. You can have all the water pressure and water volume in the world, but it won't turn a mill wheel if it has no directionality.
     
  41. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    • - Which SRP do you use, if any?
    For my professional work, I am transitioning from built-in to exclusively HDRP. There may be some URP in the future for supporting a subset of our applications on high-end tablets, but there is no need for us to support phones or lower-tier tablets. So if we can make HDRP work on the top-tier tablets, by conditionally omitting certain features from those builds, that would be simpler than porting between HDRP and URP. We have no current requirement for custom SRP pipelines; HDRP plus a few custom shaders -- made with Amplify Shader Editor, purchased from the Asset Store, or (occasionally) hand-coded -- can do everything we need.

    I am using URP in some personal, non-workplace projects.
    • - What use cases do you write custom or surface shaders for?
    Our projects are in archviz, engineering, and simulation primarily, and we have recently created custom shaders for specialized object fading and backface rendering behavior (for example, we needed a two-sided shader where the backface was unlit and transparent, but the front face was lit and opaque but faded as the camera approached).

    We also use unlit shaders for a few UI widgets that are easier (or more performant) to implement that way than with C# code.
    • - What features did you find limiting in Surface Shaders that you would want to have control over?
    I was going to cite a couple of situations here, but then I realized they had the same root cause, so I will cite that instead. :) In a surface shader, it is very difficult to have a way for the fragment function to access the actual vertex data for the vertices over which its inputs were interpolated. The available semantics (as far as I have ever been able to determine) interpolate their values. I've always resolved this using a geometry function, which of course is missing from traditional surface shaders because of the older Shader Model lacking this feature. (Should a geometry function -- defaulting to a no-op -- be a standard feature in Surface 2.0 and beyond?)
    • - Which unique shading features do you use in your SRP (ie: Bent normals, SSS, etc), and how would you like to handle fallbacks in other pipelines (ie: Approximation of SSS in URP, don't bother, etc).
    The most significant HDRP dependencies for our projects are lighting-related, and these fall right in line with the published feature differences between HDRP and URP. If we were using Surface 2.0 to more easily support tablets, a subset of features would suffice, with a policy of "gracefully emulate at lower quality, if feasible, and gracefully omit if not." As long as the pixels are recognizable and the app usable, miraculous emulation of the full feature set would not be important here.
     
  42. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    "I've always resolved this using a geometry function, which of course is missing from traditional surface shaders because of the older Shader Model lacking this feature."

    By this do you mean a geometry shader? The problem with geometry shaders, besides them being bad in general, is that many platforms are not supporting them moving forward (like Metal). I won't explicitly support or block geometry shaders - if you want them, you can add the pragma in the defines block and write the appropriate functions.

    As for interpolation modifiers, I'm not exactly sure which platforms they are available on, as I've never used them, but I could easily add a set of options to tag any specific interpolator with them in the VertexToPixel function.
     
    syscrusher likes this.
  43. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So, URP 2019.4, HDRP 2019.4, and the standard pipeline now all compile shaders with alpha, tessellation, and specular/metallic workflows correctly. With that come some differences:

    - HDRP redefines unity_ObjectToWorld and unity_WorldToObject, so you cannot use them in HDRP. So what I'm going to do is implement the following, which should work across all pipelines:

    float4x4 GetObjectToWorldMatrix();
    float4x4 GetWorldToObjectMatrix();
    float3 TransformWorldToObject(float3);
    float3 TransformObjectToWorld(float3);

    Note that HDRP also has to deal with camera-relative rendering, so I will likely add more wrapper functions for things like getting the camera position, etc., and for concepts such as the "main light", which deferred/HDRP don't have, but which everyone always writes a component to provide anyway.
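    To make the idea concrete, the built-in pipeline side of those wrappers could be sketched like this (the BUILTIN_PIPELINE define is hypothetical, just standing in for however the compiler marks the target pipeline):

```hlsl
// Illustrative sketch only. On the built-in pipeline the wrappers map
// straight onto the legacy matrices; in HDRP the SRP core provides its
// own equivalents and additionally handles camera-relative rendering.
#if defined(BUILTIN_PIPELINE) // hypothetical define emitted by the compiler
float4x4 GetObjectToWorldMatrix() { return unity_ObjectToWorld; }
float4x4 GetWorldToObjectMatrix() { return unity_WorldToObject; }
float3 TransformObjectToWorld(float3 p) { return mul(unity_ObjectToWorld, float4(p, 1)).xyz; }
float3 TransformWorldToObject(float3 p) { return mul(unity_WorldToObject, float4(p, 1)).xyz; }
#endif
```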

    HDRP greatly differs from URP/Standard in how it deals with blend modes, culling, stencil, etc. Instead of putting these directly into the shader, it exposes them all as properties on the material, and when you change various options a custom editor adjusts them all accordingly. This is nice because you can use the same shader file for both transparent and opaque surfaces, but it means every shader needs a custom editor to be viable. I prefer that the shaders output by this system be pure and work without any custom editors or hidden material state, so I will be following the Standard/URP model here.
    • Right now there is only an alpha "Blend" option, which sets the blend modes automatically, but I will eventually add all the various hooks to the option block and the alpha modes will just be shortcuts to set all of these things if you don't want to customize.
    • HDRP uses the stencil buffer a lot - I'm unclear on exactly what it's doing, so I'm going to leave it alone for now. Eventually I'll decode it all for the alpha settings, remove the property-based system entirely, and let you override it with whatever you want.
    • No HDRP special stuff is exposed at this point (alternate shader types, bent normal, etc), but once the core is more solid it will be easy to expose a lot of that stuff, as the HDRP shader abstracts some of those aspects much better than Standard and URP do.
    I have some cleanup to do, but at this point I'd like to take one or two beta testers to give me some direct feedback on the format, etc. I would want these people to be people who are deeply familiar with SRP shaders and shaders in general, who have a real interest in putting this system into full production at a later date.

    I also think I have a pretty good idea of how I'll handle the business model so that this can be used for unity store assets, and be a better workflow than what people are using right now (packs of textures that you have to unzip, etc).
     
    Walter_Hulsebos and syscrusher like this.
  44. Sheynes

    Sheynes

    Joined:
    Mar 27, 2017
    Posts:
    66
    Damn, this hypes me a lot - I'd love to be able to convert my standard pipeline VR portal shader to HDRP with your tool! So far it's impossible to do in ShaderGraph because the Screen Position node is broken and returns a wrong value in VR.
     
  45. GuitarBro

    GuitarBro

    Joined:
    Oct 9, 2014
    Posts:
    180
    Dude, this is legendary. Wish Unity had just done this from the start.

    As for your questions:

    - Which SRP do you use, if any?
    • Only built-in deferred for now, but potentially HDRP once it's more solidified.
    - What use cases do you write custom or surface shaders for?
    • Usually for biplanar/triplanar texturing, or for instancing that's not possible with the standard shader.
    - What features did you find limiting in Surface Shaders that you would want to have control over?
    • Normal map tangents with custom UVs. I actually ran into this very issue recently, and apparently surface shaders just assume the tangents from the mesh UVs, which is pretty disappointing. A way to have the nice surface shader abstraction while being able to bypass this assumption would be great. Edit: essentially the problem described in this thread https://forum.unity.com/threads/uv-...e-shader-normals-problem.481083/#post-3129910.
    • Outputting to unused buffer channels in built-in deferred is clunky. Would be nice if that was a standard feature.
    - Which unique shading features do you use in your SRP (ie: Bent normals, SSS, etc), and how would you like to handle fallbacks in other pipelines (ie: Approximation of SSS in URP, don't bother, etc).
    • Don't currently use anything of note, but both of those would be excellent and personally I'd just say "don't bother" and leave any approximation up to the user to avoid any unexpected results.
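    As an aside on the custom-UV tangent issue above: in hand-written HLSL it can be worked around by deriving a cotangent frame in the fragment shader from screen-space derivatives (a sketch of the well-known derivative-based technique, not a surface shader feature):

```hlsl
// Build a tangent frame for an arbitrary UV set using screen-space
// derivatives, so the mesh tangents (tied to UV0) aren't needed.
// N = surface normal, p = world position, uv = the custom UV set.
float3x3 CotangentFrame(float3 N, float3 p, float2 uv)
{
    float3 dp1 = ddx(p);
    float3 dp2 = ddy(p);
    float2 duv1 = ddx(uv);
    float2 duv2 = ddy(uv);

    // solve the linear system for the tangent and bitangent
    float3 dp2perp = cross(dp2, N);
    float3 dp1perp = cross(N, dp1);
    float3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    float3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    // construct a scale-invariant frame
    float invmax = rsqrt(max(dot(T, T), dot(B, B)));
    return float3x3(T * invmax, B * invmax, N);
}
```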
     
    Last edited: Jan 22, 2021
  46. FlaxenFlash

    FlaxenFlash

    Joined:
    Oct 15, 2015
    Posts:
    31
    Really looking forward to this. I've been very hesitant to even consider moving to any of the new renderers because of the current state of shaders and have been putting off shader work generally.

    The main annoyance I've run into with surface shaders on a few occasions is doing some kind of manipulation or calculation of normals that is easier to do in world space rather than tangent space. It would be nice to be able to mark the output normal from the surface function as world space somehow so that it doesn't then get transformed afterwards. Sometimes I end up just compiling the surface shader output and deleting the transformation from all the passes, but that's obviously a pain in the ass when you want to make changes again. Maybe I should just be transforming it from world back to tangent again but it seems like a waste.
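    In the meantime, transforming the result from world space back to tangent space is at least cheap - with an orthonormal TBN basis the inverse is just the transpose, so a sketch of that workaround might be:

```hlsl
// Hypothetical workaround: convert a normal computed in world space
// back into tangent space so a surface shader can output it as usual.
// Assumes T, B, N form a (roughly) orthonormal world-space basis.
float3 WorldToTangentNormal(float3 nWS, float3 T, float3 B, float3 N)
{
    return normalize(float3(dot(nWS, T), dot(nWS, B), dot(nWS, N)));
}
```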
     
  47. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
  48. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,444
    If you manage to make it the glue between the various authoring methods, this is a killer product!
    Any plans for multi-pass for outlines?
    And a LightingFunction()? Sometimes I want a wrap Lambert, boosted GI, NPR-like toon, high-quality NPR, or a downgrade to a cheap alternative to the PBR math to hit that sweet 144 Hz. Your stacked shaders would then let you change any shader's look by stacking a lighting function onto surface shaders that don't have one.
    Overrides would also be useful where you want to replace the tessellation formula, or the way highlights look, in a shader that already has all that.
     
    tspk91 likes this.
  49. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    There's a pretty heavy tension between some of the goals of the project - making things work across all three pipelines naturally limits what can be done compared to going deep on a single pipeline, and adds significant overhead to new features. For instance, nothing is named or done the same across pipelines - the keywords for various shader features differ, and one pipeline uses properties and a custom inspector to set the alpha modes while another expects the user to hard-code them. So for even a simple feature like alpha, I have to unify the approach across the three pipelines, giving the user one interface but having the compiler do completely different things. A lot of this could have been avoided had the teams talked to each other, but they were clearly allowed to run in completely different directions.

    What this means is that features which interface with the externals of the render pipeline are much harder to implement than ones which exist entirely in my abstraction layer. And because I will have to rewrite these templates for every version bump, the more custom they get, the harder that rewrite becomes. I could, for instance, rewrite the whole stack to be consistent between the 3 pipelines, but then I'm rewriting a lot of the include files and such as well- so porting to new HDRP/URP versions becomes a huge nightmare.

    A lot of the functions you're talking about are very pipeline specific. You're not going to hack the lighting functions in a deferred renderer like you can in a forward renderer, or in HDRP at all. So if you want to make heavy changes to the lighting system, you'd either want to customize the templates with those features for your specific pipeline, or write the shader by hand (which, quite frankly, starting from the output of my system will still be cleaner than the shader graphs - the first thing I do when creating a template is delete about half the boilerplate in the shader graph output).
     
    laurentlavigne likes this.
  50. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    And the first bit of this is out:

    https://github.com/slipster216/ShaderPackager

    This is primarily designed for asset store publishers to give new users a non-#SRPLife experience, where they don't see pink shaders when they install an asset. You can create your shaders in any system you like and package them together with this. When the resulting file is imported, it looks at the Unity and SRP versions, selects the correct shader, and imports it into the project. To the user, it looks like a shader that just works on all three pipelines.

    The included example has a shader output from Better Shaders in Standard, URP2019.4, and HDRP2019.4, and will load into any of those versions correctly.
     
    rz_0lento, Aras, AFrisby and 7 others like this.