<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Roderick's Debug Diary]]></title><description><![CDATA[https://mastodon.social/@rvkennedy]]></description><link>https://roderickkennedy.com</link><generator>RSS for Node</generator><lastBuildDate>Mon, 13 Apr 2026 11:45:17 GMT</lastBuildDate><atom:link href="https://roderickkennedy.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Embedding v8 Javascript in an MSVC project]]></title><description><![CDATA[rem building v8
rem following https://medium.com/angular-in-depth/how-to-build-v8-on-windows-and-not-go-mad-6347c69aacd4

echo off
set Path=%cd%\depot_tools;%Path%
set DEPOT_TOOLS_WIN_TOOLCHAIN=0
set GYP_MSVS_VERSION=2022

rem curl -O https://storage...]]></description><link>https://roderickkennedy.com/embedding-v8-javascript-in-an-msvc-project</link><guid isPermaLink="true">https://roderickkennedy.com/embedding-v8-javascript-in-an-msvc-project</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 20 Dec 2024 12:15:33 GMT</pubDate><content:encoded><![CDATA[<pre><code class="lang-bash">rem building v8
rem following https://medium.com/angular-in-depth/how-to-build-v8-on-windows-and-not-go-mad-6347c69aacd4

echo off
set Path=%cd%\depot_tools;%Path%
set DEPOT_TOOLS_WIN_TOOLCHAIN=0
set GYP_MSVS_VERSION=2022

rem curl -O https://storage.googleapis.com/chrome-infra/depot_tools.zip
rem mkdir depot_tools
rem powershell -command "Expand-Archive -Force '%~dp0depot_tools.zip' '%~dp0depot_tools'"


rem fetch v8
set Path=%cd%\depot_tools\.cipd_bin\3.8\bin\Lib\venv\scripts\nt;%Path%

echo on

rem to activate the venv, we need a pyvenv.cfg in the scripts\nt directory, e.g.
rem home = C:/Code/Teleport/Javascript/depot_tools/.cipd_bin/3.8/bin/Lib/venv/scripts/nt
rem include-system-site-packages = false
rem version = 3.8

call %cd%\depot_tools\.cipd_bin\3.8\bin\Lib\venv\scripts\nt\activate.bat

rem activate.bat calls echo off, but we want to see these commands:
echo on

where python

cd v8

rem gclient sync
rem python tools/dev/v8gen.py list
rem python tools/dev/v8gen.py x64.release

rem To embed V8 into our application we need to build it as a static library. To do that, we need to modify the default build configuration and add these two flags to the args.gn file:

rem is_component_build = false
rem v8_static_library = true

rem now, to out.gn\x64.release/args.gn, add:
rem is_debug = false
rem target_cpu = "x64"
rem is_component_build = false
rem v8_static_library = true
rem # disable thin lib shenanigans
rem thin_lto_enable_cache = false

rem You MUST have Windows SDK 10.0.22621.2428; install it by hand if necessary.
rem see also https://chromium.googlesource.com/chromium/src/+/HEAD/docs/windows_build_instructions.md
rem for building on Windows with MSVC rather than clang.

call "C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Auxiliary\Build\vcvars64.bat" x86_amd64
set WINDOWSSDKDIR=C:\Program Files (x86)\Windows Kits\10
rem gn args out.gn/x64.release --list
rem build/config/win/BUILD.GN: cflags += [ "-fms-extensions","-fms-compatibility"]
gn gen --ide=vs --ninja-executable=%cd%\third_party\ninja\ninja.exe out.gn/x64.vs.release
rem cd out.gn/x64.release
rem ninja -C out.gn/x64.release
rem -t msvc

rem python tools/run-tests.py --gn
</code></pre>
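<p>Once the static libraries are built, linking them into the MSVC project and running a script is the next step. Here's a minimal sketch, following V8's standard hello-world sample - the initialization calls move around between V8 versions, so treat the details as assumptions to check against your checkout's headers:</p>
<pre><code class="lang-cpp">// Minimal V8 embedding sketch, after V8's own hello-world sample.
// Link against the static libraries built above (e.g. v8_monolith).
#include &lt;libplatform/libplatform.h&gt;
#include &lt;v8.h&gt;
#include &lt;iostream&gt;
#include &lt;memory&gt;

int main(int argc, char* argv[])
{
    // One-time, process-wide setup.
    v8::V8::InitializeICUDefaultLocation(argv[0]);
    v8::V8::InitializeExternalStartupData(argv[0]);
    std::unique_ptr&lt;v8::Platform&gt; platform = v8::platform::NewDefaultPlatform();
    v8::V8::InitializePlatform(platform.get());
    v8::V8::Initialize();

    // Each isolate is an independent JavaScript VM instance.
    v8::Isolate::CreateParams create_params;
    create_params.array_buffer_allocator =
        v8::ArrayBuffer::Allocator::NewDefaultAllocator();
    v8::Isolate* isolate = v8::Isolate::New(create_params);
    {
        v8::Isolate::Scope isolate_scope(isolate);
        v8::HandleScope handle_scope(isolate);
        v8::Local&lt;v8::Context&gt; context = v8::Context::New(isolate);
        v8::Context::Scope context_scope(context);

        // Compile and run a script, then print its result.
        v8::Local&lt;v8::String&gt; source =
            v8::String::NewFromUtf8Literal(isolate, "'Hello' + ', World!'");
        v8::Local&lt;v8::Script&gt; script =
            v8::Script::Compile(context, source).ToLocalChecked();
        v8::Local&lt;v8::Value&gt; result = script-&gt;Run(context).ToLocalChecked();
        v8::String::Utf8Value utf8(isolate, result);
        std::cout &lt;&lt; *utf8 &lt;&lt; std::endl;
    }
    // Tear down in reverse order of creation.
    isolate-&gt;Dispose();
    v8::V8::Dispose();
    v8::V8::DisposePlatform(); // ShutdownPlatform() on older V8 versions
    delete create_params.array_buffer_allocator;
}
</code></pre>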
]]></content:encoded></item><item><title><![CDATA[Don't step into STL function in Visual Studio 2022]]></title><description><![CDATA[Find your Visual Studio install directory. It is probably something like "C:\Program Files\Microsoft Visual Studio\2022\Community"
Navigate to the subfolder "Common7\Packages\Debugger\Visualizers".
As administrator, edit the file default.natstepfilte...]]></description><link>https://roderickkennedy.com/dont-step-into-stl-function-in-visual-studio-2022</link><guid isPermaLink="true">https://roderickkennedy.com/dont-step-into-stl-function-in-visual-studio-2022</guid><category><![CDATA[nostepinto]]></category><category><![CDATA[visual studio]]></category><category><![CDATA[C++]]></category><category><![CDATA[debugging]]></category><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Thu, 08 Feb 2024 10:48:54 GMT</pubDate><content:encoded><![CDATA[<p>Find your Visual Studio install directory. It is probably something like "C:\Program Files\Microsoft Visual Studio\2022\Community"</p>
<p>Navigate to the subfolder "Common7\Packages\Debugger\Visualizers".</p>
<p>As administrator, edit the file default.natstepfilter.</p>
<p>Insert this between the <mark>&lt;StepFilter&gt;&lt;/StepFilter&gt;</mark> XML tags:</p>
<pre><code class="lang-xml">&lt;Function&gt;&lt;Name&gt;std\:\:.*&lt;/Name&gt;&lt;Action&gt;NoStepInto&lt;/Action&gt;&lt;/Function&gt;
</code></pre>
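<p>To see the effect, take any C++ that routes a call through STL machinery - a hypothetical sketch (any code of your own will do):</p>
<pre><code class="lang-cpp">#include &lt;functional&gt;
#include &lt;vector&gt;

int main()
{
    std::vector&lt;int&gt; values = {1, 2, 3};
    std::function&lt;int(int)&gt; doubler = [](int x) { return x * 2; };

    // With the NoStepInto rule above in place, Step Into (F11) on the next
    // line lands directly in the lambda body, instead of stepping through
    // std::function's operator() and std::vector's operator[] first.
    return doubler(values[0]);
}
</code></pre>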
]]></content:encoded></item><item><title><![CDATA[Lambdas, or functions as function parameters, in Sfx]]></title><description><![CDATA[Following the work in my previous post, I happened across this Mastodon post about how good it would be to use lambdas, or function objects, as inputs in shaders.
It occurred to me that i...]]></description><link>https://roderickkennedy.com/lambdas-or-functions-as-function-parameters-in-sfx-1</link><guid isPermaLink="true">https://roderickkennedy.com/lambdas-or-functions-as-function-parameters-in-sfx-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Tue, 26 Sep 2023 09:41:58 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397829906/bbce9c20-7049-4332-a48e-c87441d51fd2.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Following the work in <a target="_blank" href="https://roderickkennedy.com/dbgdiary/the-sfx-shader-language-and-shader-variants">my previous post</a>, I happened across <a target="_blank" href="https://mastodon.gamedev.place/@meuns/111119173437770101">this</a> Mastodon post about how good it would be to use lambdas, or function objects, as inputs in shaders.</p>
<p><a target="_blank" href="https://mastodon.gamedev.place/@meuns/111119173437770101"><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397827279/e9644b67-13fa-4a9d-bbe8-e41be039e1ec.png" alt="Image of Sylvain Meunier's Mastodon post" /></a></p>
<p>It occurred to me that in our <a target="_blank" href="https://github.com/simul/Platform/tree/main/Applications/Sfx">Sfx</a> framework this might not be difficult to implement.</p>
<p>So I’ve implemented it. Because Sfx is essentially just a big text-processing system with semantic knowledge of shader syntax, it wasn’t that difficult. You declare a function input as follows.</p>
<pre><code>shader vec4 PS_Variants(posTexVertexOutput IN, variant function&lt;vec4(posTexVertexOutput)&gt; RenderFunction) : SV_TARGET
{
    return RenderFunction(IN);
}
</code></pre>
<p>For now, we actually discard the function object’s template parameters, but later on this will be used for error-checking. Then, in the pass declaration we would do, e.g.</p>
<pre><code>SetPixelShaderVariants(ps_variants({PSNeon, PSWhite}));
</code></pre>
<p>And this would generate two variants, calling the function PSNeon(posTexVertexOutput IN) or PSWhite(posTexVertexOutput IN).</p>
<p>In the variant-generation stage, where we assemble the source for compilation to your specified language, variant parameters that are functions are replaced in the function content with the specific function for that variant, e.g. for GLSL this generates:</p>
<pre><code>layout(location = 0) in Block { posTexVertexOutput BlockData0; } ioblock;
layout(location = 0) out vec4 returnObject_vec4;

void PS_Variants()
{
    posTexVertexOutput BlockData0 = ioblock.BlockData0;
#line 1366 "C:/Teleport/client/Shaders/test.sfx"
    { returnObject_vec4 = PSWhite(BlockData0); return; }
}
</code></pre>
<p>At the moment this only works for shader functions: we can’t yet pass lambdas down to the functions a shader calls. But I think it can easily be extended - the trick will be that when calling a function that itself has variants, we’ll need an intermediate syntax which eliminates the lambdas from the parameter list, but indicates to the shader constructor that a specialized instance of the called function must be created.</p>
]]></content:encoded></item><item><title><![CDATA[The Sfx shader language and shader variants]]></title><description><![CDATA[The Sfx shader language and shader variants
For some reason there seems to be a lot of interest in custom and alternative game engine development now.
Whatever the cause of this mysterious uptick, I thought I'd post...]]></description><link>https://roderickkennedy.com/the-sfx-shader-language-and-shader-variants</link><guid isPermaLink="true">https://roderickkennedy.com/the-sfx-shader-language-and-shader-variants</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Tue, 19 Sep 2023 17:57:03 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-the-sfx-shader-language-and-shader-variants">The Sfx shader language and shader variants</h1>
<p>For some reason there seems to be a lot of interest in custom and alternative game engine development now.</p>
<p>Whatever the cause of this mysterious uptick, I thought I'd post a bit about what we do at <a target="_blank" href="https://simul.co/">Simul</a>.</p>
<p>Handling shaders can be a vexed issue in native development. If you're writing a single shader, or a matched pair of vertex and pixel shaders, things start very simply, whether you're using GLSL or HLSL. You'll have a vertex shader file, like this:</p>
<pre><code>cbuffer CameraConstants
{
    float4x4 viewProj;
};
struct VS_IN
{
    float4 a_Positions : POSITION;
};
struct VS_OUT
{
    float4 o_Position : SV_Position;
};
VS_OUT main(VS_IN IN)
{
    VS_OUT OUT;
    OUT.o_Position = mul(viewProj,IN.a_Positions);
    return OUT;
}
</code></pre>
<p>And a pixel shader file:</p>
<pre><code>struct PS_IN
{
    float4 i_Position : SV_Position;
};
struct PS_OUT
{
    float4 o_Colour : SV_Target0;
};

PS_OUT main(PS_IN IN)
{
    PS_OUT OUT;
    OUT.o_Colour = float4(1.0,0,0,0);
    return OUT;
}
</code></pre>
<p>The above is HLSL. GLSL looks similar inside the functions, but the surrounding declarations are different, e.g.</p>
<pre><code>#version 450
layout(std140, binding = 0) uniform CameraConstants {
    mat4 modelViewProj;
};
layout(location = 0) in vec4 a_Positions;
void main() {
    gl_Position = modelViewProj * a_Positions;
}

#version 450
layout(location = 0) in flat uvec2 i_TexCoord;
layout(location = 0) out vec4 o_Colour;
void main() {
    o_Colour = vec4(1.0,0,0,0);
}
</code></pre>
<p>The difficulty comes as your number of shaders starts to increase, and you start needing multiple combinations - different materials might need shaders with different inputs. 3D models may have different vertex setups, so you need a vertex shader that exactly matches the vertex layout you're sending. And to combine a vertex and pixel shader, you must ensure that the vertex shader output matches the pixel shader's input.</p>
<p>It can get unwieldy.</p>
<p>At Simul, we developed <a target="_blank" href="https://github.com/simul/Platform/tree/main/Applications/Sfx">Sfx</a>, a sort of wrapper or meta-language to help manage shaders. The simple vertex-pixel combo above, in Sfx, looks like this:</p>
<pre><code>cbuffer CameraConstants
{
    float4x4 viewProj;
};
struct VS_IN
{
    float4 a_Positions : POSITION;
};
struct VS_TO_PS
{
    float4 o_Position : SV_Position;
};
struct PS_OUT
{
    float4 o_Colour : SV_Target0;
};

VS_TO_PS VS_Main(VS_IN IN)
{
    VS_TO_PS OUT;
    OUT.o_Position = mul(viewProj,IN.a_Positions);
    return OUT;
}

PS_OUT PS_Main(VS_TO_PS IN)
{
    PS_OUT OUT;
    OUT.o_Colour = float4(1.0,0,0,0);
    return OUT;
}

VertexShader vs_main = CompileShader(vs_5_0,VS_Main());
PixelShader ps_main = CompileShader(ps_5_0, PS_Main());

technique simpledraw
{
    pass p0
    {
        SetRasterizerState( wireframeRasterizer );
        SetDepthStencilState( DisableDepth, 0 );
        SetBlendState(AddBlend, vec4(0.0,0.0,0.0,0.0), 0xFFFFFFFF );
        SetVertexShader(vs_main);
        SetPixelShader(ps_main);
    }
}
</code></pre>
<p>Nice and simple: one file instead of two. And we've added some useful renderstate setup. But how is this used?</p>
<p>Sfx is a <em>transpiler</em>. It doesn't compile binary shaders itself; rather, it translates its shaders into a target language. We might invoke it with:</p>
<pre><code class="lang-bash">Sfx.exe simple.sfx -I"shader_includes" -O"C:/Teleport/build_pc_client/shaderbin/$PLATFORM_NAME" -P"Sfx/DirectX12.json" -P"Sfx/Vulkan.json"
</code></pre>
<p>This means: transpile the effect file "simple.sfx" into shaders, using the profiles "DirectX12.json" and "Vulkan.json". These setup files tell Sfx how to generate shader code. For DirectX 12, it will generate HLSL shaders for Microsoft's dxc compiler. For Vulkan, it will create GLSL shaders for glslangValidator.exe. In both cases, it will invoke the relevant compiler and bundle the output into an effect binary, simple.sfxb, and an effect setup file, simple.sfxo. The former contains the shaders; the latter contains instructions on how they can be used in passes.</p>
<p>Then at runtime, a small library (specific to your chosen graphics API) loads the binaries and applies the shaders and renderstate when needed.</p>
<p>I called "simple.sfx" an <em>effect</em> file. This unfashionable concept was all the rage when shaders were first in widespread use, but fell out of favour as more developers started to use commercial game engines, which handled that sort of thing internally. But I think it still has a lot of value. Although each shader language is essentially a special-purpose version of C, the effect file has a meta-language that surrounds it which helps us build and manage collections of shaders.</p>
<p>Originally Sfx was modelled on the effect file format of Direct3D, which was mothballed some years ago. But it's grown into its own thing now. For example, we can declare shader <em>variants</em> as follows:</p>
<pre><code>shader vec4 PS_Var(vertexOutput IN, variant bool textured) : SV_TARGET
{
    vec4 colour;
    if(textured)
    {
        colour=texture.Sample(wrapSamplerState, IN.texCoords.xy);
    }
    else
    {
        colour=globalColour;
    }
    return colour;
}
</code></pre>
<p>This shader has two variants, one with a texture lookup and one without. In the pass declaration, we invoke this with:</p>
<pre><code>PixelShader ps_var = CompileShader(ps_5_0, PS_Var());
technique test
{
    pass p0
    {
        SetVertexShader(vs_main);
        SetPixelShaderVariants(ps_var({false,true}));
    }
}
</code></pre>
<p>Sfx will now generate two versions of the pixel shader, one with textured=true, one with false. Although <em>if(textured)</em> looks like runtime code above, each version will be compiled with the value of <em>textured</em> held constant: it will be optimized down to static code. At runtime, though, we can choose which version we want. We could also specify two or more different shaders, e.g.</p>
<pre><code>SetPixelShaderVariants(ps_var1,ps_var2,ps_debug);
</code></pre>
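<p>If it helps to see why the branch costs nothing at runtime, here's a rough C++ analogy - an illustration only, not how Sfx is implemented: a variant parameter behaves like a compile-time template parameter, so each instantiation is a separate, statically-optimized function.</p>
<pre><code class="lang-cpp">#include &lt;iostream&gt;

// Analogy for Sfx variants: the variant bool becomes a template parameter,
// the branch is resolved at compile time, and each instantiation is its own
// optimized function - just like each generated shader variant.
template &lt;bool Textured&gt;
float shade()
{
    if constexpr (Textured)
        return 0.5f; // stands in for the texture-lookup path
    else
        return 1.0f; // stands in for the flat-colour path
}

int main()
{
    std::cout &lt;&lt; shade&lt;true&gt;() &lt;&lt; " " &lt;&lt; shade&lt;false&gt;() &lt;&lt; "\n";
}
</code></pre>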
<p>It would have been easy enough to just write two versions of the pass, one with textures and one without. The power of the variant system comes as shaders multiply. Combining, say, two vertex shaders with three pixel shader variants:</p>
<pre><code>SetVertexShader(vs_main,vs_alternative);
SetPixelShaderVariants(ps_var({false,true}),ps_test);
</code></pre>
<p>Now, assuming they are mutually compatible, we would have six possible passes. Consider, for example, creating specialized debug views of your 3D objects: for each view, you can just add a pixel shader variant to the pass.</p>
<p>So for larger projects, a system like Sfx can be invaluable in quickly writing efficient GPU code. If you do find yourself programming natively, without the comforts of a game engine, it may be just the ticket.</p>
<p>As of today, Sfx supports not only Vulkan and OpenGL flavours of GLSL and D3D11 and 12 flavours of HLSL, but also certain proprietary platforms that I'm not permitted to name. We codenamed them Spectrum and Commodore, for convenience.</p>
<p>All the source for Sfx is at <a target="_blank" href="https://github.com/simul/Platform/tree/main/Applications/Sfx">https://github.com/simul/Platform/tree/main/Applications/Sfx</a>.</p>
]]></content:encoded></item><item><title><![CDATA[Why the Metaverse Failed]]></title><description><![CDATA[Why the Metaverse Failed
The research firm Gartner has a way of visualizing the development and adoption of new technologies. After an initial work and investment gold rush, there follows a "trough of disillusionmen...]]></description><link>https://roderickkennedy.com/why-the-metaverse-failed</link><guid isPermaLink="true">https://roderickkennedy.com/why-the-metaverse-failed</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Mon, 11 Sep 2023 20:35:35 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401867807/b21e5d2e-410a-4605-b189-c0fdac49b841.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-why-the-metaverse-failed">Why the Metaverse Failed</h1>
<p>The research firm Gartner has a way of visualizing the development and adoption of new technologies. After an initial work and investment gold rush, there follows a "trough of disillusionment", where the bold claims of the early days haven't panned out, and investors, journalists and even the technologists themselves feel that things aren't going anywhere - at least going nowhere fast enough to justify the hype.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401859840/3df58822-3ac1-4602-a173-3b26d6c2e2d7.png" alt /></p>
<p>By Jeremy Kemp at English Wikipedia, CC BY-SA 3.0,</p>
<p><a target="_blank" href="https://commons.wikimedia.org/w/index.php?curid=10547051">https://commons.wikimedia.org/w/index.php?curid=10547051</a></p>
<p>In the Gartner Hype Cycle, this Slough of Despond is followed by the "Slope of Enlightenment", where the technology continues to develop and starts to achieve its best practices. Finally, there's a Plateau of Productivity, where it reaches the Nirvana of actual use in the real world and value to the economy.</p>
<p>Of course there are a number of problems with this model, if we can even call it that. The Y-axis is "expectations", which is a pretty nebulous thing to measure. In the full flush of hype, you could be halfway up the peak, or you could be at the top and ready to drop. You only know some time later where you were.</p>
<p>And the whole graph presupposes that this technology <em>will end up being useful</em>. If every Trough of Disillusionment was guaranteed to be followed by an upward slope, why would we even get disillusioned?</p>
<p>So you could equally be in the trough before enlightenment kicks in, or you could be on the tail end of an idea that just isn't going to work, and will never find its Plateau. Think, Apple Newton, or... jetpacks.</p>
<p>Now, we certainly seem to be in one or other of these two situations with the Spatial Internet, or as some prefer to call it, the Metaverse.</p>
<p>The idea of a Spatial Internet goes back a long way. In William Gibson's Neuromancer, he imagined hackers "jacking in" with a brain-computer interface directly to the network itself, experiencing a "shared hallucination", a direct view onto the data and communication structures of the "matrix". Conversely, Snow Crash by Neal Stephenson envisaged a <em>simulated</em> environment: what you see is not the raw data made visual, but a virtual world recreating aspects of the real world: you can own property, do business, socialize. This latter model took hold in the minds of many key thinkers in Silicon Valley and beyond.</p>
<p>In 1994, only three or four years after the Web burst onto the scene and made the Internet navigable to ordinary people, Dave Raggett, Mark Pesce and Brian Behlendorf published a proposal for "VRML: Extending WWW to support Platform Independent Virtual Reality", the first substantial technological effort to make the networked Metaverse a reality.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401861280/e118821f-e844-401d-9449-b4b0a9a4a355.png" alt /></p>
<p>There was a whole hype cycle around VRML. But by the time of the dotcom crash, the tech writer Clay Shirky noted:</p>
<blockquote>
<p>“The Quake 3-D engine is for something, while VRML is a technology in search of a use. The Quake engine’s original use is immaterial; once you have a world-modeling protocol… you’ve got the whole ball of wax — spaces, movement, objects, avatars, even a simple physics… once a tool is good for one thing, it’s easy to see how to make it good for another thing.”</p>
</blockquote>
<p>And so VRML reached its own Trough of Disillusionment, from which it has yet to recover. But contrary to Gartner, this was not the first such trough for VR. In 1984, VPL Research, founded by Jaron Lanier, launched products such as the DataGlove and the EyePhone (an early headset). There was much fanfare, and films like The Lawnmower Man built the hype around VR, but the hardware was expensive, and didn't find much use outside of military training.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401862577/48c7880c-5a15-4249-b146-5645c04cc881.png" alt /></p>
<p>So what we really see is multiple peaks and troughs. But will we ever see the plateau?</p>
<p>When I think of the Spatial Internet, I tend to imagine something like what Pesce and his cohorts proposed in VRML: a shared, common, <em>navigable</em> network. But what we have instead at present is not that. Many millions of venture dollars and person-years of effort have been put into downloadable, installable applications running on XR hardware. It's a mobile phone model: there's an app store, you buy an app, you run it. If you want to do something different, you run a different app.</p>
<p>Now, there are strong efforts at standardization: file formats like GLTF and USD offer a way for applications to share data, so it's possible for users to port some aspects of their personal data between apps.</p>
<p>What we do not have right now, though, is a navigable network. When Tim Berners-Lee created the World Wide Web, he presented two interlocking technologies. HTTP transfers whole files from a server to a client. HTML allows textual data to be formatted on arbitrary screens, and provides a set of common guiderails.</p>
<p>Guiderails are key. Everyone knows what a link looks like on a web page. Everyone knows that when you click a link, it takes you to a new server, a new URL, and that this is a one-way system - the receiving site doesn't need to give permission to the linking site, it treats each request afresh. Everyone knows that images can be embedded, and it's up to the browser to choose how to display them, according to a clear set of rules that HTML defines.</p>
<p>So you know when navigating the Web that each page you go to follows these same rules, and will be amenable to how you are used to viewing and interacting with sites.</p>
<p>There are no guiderails for the Spatial Internet. Because there's no protocol: we're still using HTTP for WebXR, when we really ought to have a realtime protocol that's built for spatial use-cases.</p>
<p>What should a link look like? Should it be one-way? I suggest: not always. Spatial applications are much more data-heavy than websites. You need to know whether, when you follow a link or portal to a different site, that site is: a) available, and b) has some way back to where you were before.</p>
<p>In 1997, researchers at the US Naval Postgraduate School proposed VRTP: a protocol for VRML that met the needs of spatial internet applications (<a target="_blank" href="https://faculty.nps.edu/brutzman/vrtp/">https://faculty.nps.edu/brutzman/vrtp/</a>). But because VRML was hitting its own trough soon afterwards, VRTP was not developed to completion.</p>
<p>If there is to be a Spatial Internet, we have to address this fundamental challenge. It's what we're doing with Teleport VR, and by working with the standards bodies I hope we'll get Teleport, or something like it, adopted. From here, it could go either way: the last line of that Gartner curve has not yet been drawn.</p>
<p><a target="_blank" href="https://roderickkennedy.com/virtual-reality/http-for-vr">https://roderickkennedy.com/virtual-reality/http-for-vr</a></p>
]]></content:encoded></item><item><title><![CDATA[Updating the PHP version on Wordpress]]></title><description><![CDATA[Wordpress sometimes needs you to update the php version. Outside of a managed hosting, if you’re working with raw Linux, you’re pretty much stuck. Ignoring the official, unhelpful page at https://wordpress.org/support/update-php, what actually has to...]]></description><link>https://roderickkennedy.com/updating-the-php-version-on-wordpress-1</link><guid isPermaLink="true">https://roderickkennedy.com/updating-the-php-version-on-wordpress-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 21 Jul 2023 08:11:22 GMT</pubDate><content:encoded><![CDATA[<p>Wordpress sometimes needs you to update the PHP version. Outside of managed hosting, if you’re working with raw Linux, you’re pretty much on your own. Ignoring the official, unhelpful page at <a target="_blank" href="https://wordpress.org/support/update-php">https://wordpress.org/support/update-php</a>, what actually has to be done is this:</p>
<p>Using the example of updating from PHP 7.2 to 7.4, and assuming you have installed 7.4 from command-line already, add the line:</p>
<pre><code>AddHandler application/x-httpd-php74 .php
</code></pre>
<p>to the .htaccess file in the web root. Now call:</p>
<pre><code class="lang-bash">sudo a2dismod php7.2
sudo a2enmod php7.4
sudo systemctl restart apache2
</code></pre>
<p>And done.</p>
]]></content:encoded></item><item><title><![CDATA[Introducing WebRTC to Teleport VR]]></title><description><![CDATA[Introducing WebRTC to Teleport VR
Getting Teleport running on Linux
It's long been a goal to get a Teleport server running on Linux. The reason to do this is that I want to offer effectively "VR hosting in-a-box", a...]]></description><link>https://roderickkennedy.com/introducing-webrtc-to-teleport-vr</link><guid isPermaLink="true">https://roderickkennedy.com/introducing-webrtc-to-teleport-vr</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 17 Mar 2023 23:06:33 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-introducing-webrtc-to-teleport-vr">Introducing WebRTC to Teleport VR</h1>
<h4 id="heading-getting-teleport-running-on-linux">Getting Teleport running on Linux</h4>
<p>It's long been a goal to get a Teleport server running on Linux. The reason to do this is that I want to offer effectively "VR hosting in-a-box", and the way to do that is containerization: we make it possible to create a Teleport server instance in a Docker container, then it can be uploaded to a cloud service like AWS with a minimum of fuss.</p>
<p>If we try to do this with Windows, we come up against licensing problems: if you're running a Windows server, you need a Windows licence. Best to use Linux.</p>
<p>But apart from the Android/Quest client, all Teleport work so far has been on Windows.  </p>
<p>So for the past few months I've tried to get this resolved: removing obvious Win32 API dependencies, getting the necessary libraries built. An early issue was that we were passing strings to and from Unity as _bstr_t, a Microsoft class that isn't supported in clang/Unix. So I switched over to plain old char* for Unity to communicate with the Teleport DLL.</p>
<p>Debugging was also a challenge. I've not yet found a way to debug C# code in Unity on Linux, but C++ works, after some deep dives into the gdb settings (I haven’t yet got lldb working, but gdb works fine for clang code).</p>
<p>So finally we can now connect from a Teleport client to the Linux server in Unity. But the data streams are not working. I could try to fix our UDP-based data stream code for Linux, which uses the 3rd party libraries SRT and EFP.</p>
<h4 id="heading-streaming-transport">Streaming Transport</h4>
<p>We need streaming transport because in Teleport we stream video, audio and geometry. In the hybrid rendering mode (shown above), far objects are streamed as video in the background. Near objects are streamed as geometry. In this video I've added a grey highlight to the video stream, otherwise it might be difficult to see the join.  </p>
<p>As we move through the scene, any geometry that we approach is streamed into the local device. In this way, Teleport enables:</p>
<ul>
<li><p>Higher visual quality than an installed application or WebXR</p>
</li>
<li><p>Larger scenes than an installed application or WebXR</p>
</li>
<li><p>Multi-platform support without multi-platform code</p>
</li>
<li><p>Instant deployment to any device.</p>
</li>
</ul>
<p>So under the Teleport protocol we need two network transport layers: one for messaging, one for data streaming. Plus HTTPS for static file downloads.</p>
<h4 id="heading-introducing-webrtc">Introducing WebRTC</h4>
<p>To build Teleport, we needed a data streaming system. We used UDP packets, a fast way of sending data - much faster than the TCP connections that browsers use to download web content. UDP is "unreliable": the packets aren't guaranteed to arrive the way TCP packets are. So we needed a couple of extra layers to ensure reliable streaming: we used SRT for ordering and EFP to manage "frames", or data chunks. Both good systems. And when we started building Teleport there wasn't a good alternative.</p>
<p>But WebRTC is mature now, and does exactly what UDP+SRT/EFP does. So once it's integrated, that standardizes the Teleport stack around something that's in widespread use.</p>
<p>So if I can, I'm going to switch the transport layer over to WebRTC, which is well-supported, supports arbitrary data streams, and is at least nominally available in C++.</p>
<p>WebRTC was originally intended as a peer-to-peer multimedia protocol, a way to enable (for example) video calls on web pages. But it’s pretty flexible: it supports two-way video tracks, audio tracks, and crucially for us: data channels - arbitrary text or binary data, flowing in either direction, either reliably or not. WebRTC offers us a number of advantages.</p>
<ul>
<li><p>Web-compatible: when we implement the web version of the Teleport client, we won’t be able to use UDP because browsers don’t support raw packet streaming. But they do support WebRTC.</p>
</li>
<li><p>Firewall-friendly: the implementations of WebRTC include support for STUN/TURN, which helps navigate restrictive firewalls, simplifying the process of finding the right port and underlying transport.</p>
</li>
<li><p>Well-supported: compared to the UDP-based systems we’ve been using, WebRTC is used a lot more, so it’s easier to find expertise and solutions.</p>
</li>
</ul>
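<p>For a flavour of what the data-channel setup looks like in C++, here's a minimal sketch using the open-source libdatachannel library. Teleport's actual integration isn't shown in this post, so take the library choice and names below as assumptions; the signalling exchange (relaying the SDP offer/answer and ICE candidates over the messaging layer) is left out.</p>
<pre><code class="lang-cpp">// Sketch: a reliable data channel with libdatachannel
// (github.com/paullouisageneau/libdatachannel).
#include &lt;rtc/rtc.hpp&gt;
#include &lt;iostream&gt;
#include &lt;variant&gt;

int main()
{
    rtc::Configuration config;
    // A STUN server helps traverse NATs and restrictive firewalls.
    config.iceServers.emplace_back("stun:stun.l.google.com:19302");

    rtc::PeerConnection pc(config);

    // These would be relayed to the remote peer via the messaging layer.
    pc.onLocalDescription([](rtc::Description sdp) {
        std::cout &lt;&lt; "offer:\n" &lt;&lt; std::string(sdp) &lt;&lt; "\n";
    });
    pc.onLocalCandidate([](rtc::Candidate candidate) {
        std::cout &lt;&lt; "candidate: " &lt;&lt; std::string(candidate) &lt;&lt; "\n";
    });

    // A reliable, ordered channel for geometry; unreliable/unordered modes
    // are also available for video-style streams.
    auto dc = pc.createDataChannel("geometry");
    dc-&gt;onOpen([&amp;dc]() { dc-&gt;send("hello from the client"); });
    dc-&gt;onMessage([](std::variant&lt;rtc::binary, rtc::string&gt; msg) {
        if (std::holds_alternative&lt;rtc::string&gt;(msg))
            std::cout &lt;&lt; "received: " &lt;&lt; std::get&lt;rtc::string&gt;(msg) &lt;&lt; "\n";
    });

    // ... perform the signalling exchange, then stream until done ...
}
</code></pre>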
<p>So here we have it: streaming the geometry via WebRTC. I've added a green shader at the front to highlight each object as it streams in.</p>
<p>There's no background because I'd only implemented geometry streaming at that point: video/audio streaming is handled differently in WebRTC. But it worked well: very reliable, and more robust to network changes and firewalls than the UDP approach.</p>
<p>And it should work fine in Linux, though that remains to be tested.</p>
<p>As it turned out, it was pretty simple to activate WebRTC video and audio support by just using data channels - rather than WebRTC's native video/audio tracks. This is most likely less efficient, and will need to be reviewed.</p>
<p>Here as well I show the attractive, minimal 2D user-interface menus I've implemented for desktop use: the 3D menus will now only appear if you're in VR. So the app in desktop now looks... a lot like a browser! Which is intentional.</p>
<p>The back half of the video shows more clearly the transition between the video-streamed background and the geometry-streamed foreground - again with a grey cast to highlight the join. This will enable VR apps of truly vast scope to be accessed instantly via Teleport, while still permitting the option of geometry-only streams, which will be comparable in price to running a web server - as against the dollar or more per user, per hour, that Unreal or Unity pixel-streaming costs!</p>
<p>Because there's no chance of a true Metaverse developing at those prices. It needs to be cheap enough that anyone can build a site, host content, create services - just as they can on the web. And that's what Teleport delivers.</p>
<h1 id="heading-introducing-webrtc-to-teleport-vr-1">Introducing WebRTC to Teleport VR</h1>
<p>17 Mar</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/virtual-reality?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<h4 id="heading-getting-teleport-running-on-linux-1">Getting Teleport running on Linux</h4>
<p>It's long been a goal to get a Teleport server running on Linux. The reason to do this is that I want to offer effectively "VR hosting in-a-box", and the way to do that is containerization: we make it possible to create a Teleport server instance in a Docker container, then it can be uploaded to a cloud service like AWS with a minimum of fuss.</p>
<p>If we try to do this with Windows, we come up against licensing problems: if you're running a Windows server, you need a Windows licence. Best to use Linux.</p>
<p>But apart from the Android/Quest client, all Teleport work so far has been on Windows.  </p>
<p>So for the past few months I've tried to get this resolved: removing obvious Win32 API dependencies, getting the necessary libraries built. An early issue was we were passing strings to and from Unity as _bstr_t, a Microsoft class that isn't supported in clang/Unix. So I switched over to plain old char* for Unity to communicate with the Teleport dll.</p>
<p>Debugging was also a challenge. I've not yet found a way to debug C# code in Unity on Linux, but C++ works, after some deep dives into the gdb settings (I haven’t yet got lldb working, but gdb works fine for clang code).</p>
<p>So finally we can now connect from a Teleport client to the Linux server in Unity. But the data streams are not working. I could try to fix our UDP-based data stream code for Linux, which uses the 3rd party libraries SRT and EFP.</p>
<h4 id="heading-streaming-transport-1">Streaming Transport</h4>
<p>We need streaming transport because in Teleport we stream video, audio and geometry. In the hybrid rendering mode (shown above), far objects are streamed as video in the background. Near objects are streamed as geometry. In this video I've added a grey highlight to the video stream, otherwise it might be difficult to see the join.  </p>
<p>As we move through the scene, any geometry that we approach is streamed into the local device. In this way, Teleport enables:</p>
<ul>
<li><p>Higher visual quality than an installed application or WebXR</p>
</li>
<li><p>Larger scenes than an installed application or WebXR</p>
</li>
<li><p>Multi-platform support without multi-platform code</p>
</li>
<li><p>Instant deployment to any device.  </p>
<p>So under the Teleport protocol we need two network transport layers: one for messaging, one for data streaming. Plus HTTPS for static file downloads.</p>
</li>
</ul>
<h4 id="heading-introducing-webrtc-1">Introducing WebRTC</h4>
<p>To build Teleport, needed a data streaming system. We used UDP packets, a fast way of sending data - much faster than the TCP/IP that browsers use to download web content. UDP is "unreliable": the packets aren't guaranteed to get there the way TCP/IP packets are. So we needed a couple of extra layers to ensure reliable streaming: we used SRT for ordering and EFP to manage "frames" or data chunks. Both good systems. And when we started building Teleport there wasn't a good alternative.  </p>
<p>But WebRTC is mature now, and does exactly what UDP+SRT/EFP does. So once it's integrated, that standardizes the Teleport stack around something that's in widespread use.</p>
<p>So if I can, I'm going to switch the transport layer over to WebRTC, which is well-supported, supports arbitrary data streams, and is at least nominally available in C++.</p>
<p>WebRTC was originally intended as a peer-to-peer multimedia protocol, a way to enable (for example) video calls on web pages. But it’s pretty flexible: it supports two-way video tracks, audio tracks, and crucially for us: data channels - arbitrary text or binary data, flowing in either direction, either reliably or not. WebRTC offers us a number of advantages.</p>
<ul>
<li><p>Web-compatible: when we implement the web version of the Teleport client, we won’t be able to use UDP because browsers don’t support raw packet streaming. But they do support WebRTC.</p>
</li>
<li><p>Firewall-friendly: the implementations of WebRTC include support for STUN/TURN, which helps navigate restrictive firewalls, simplifying the process of finding the right port and underlying transport.</p>
</li>
<li><p>Well-supported: compared to the UDP-based systems we’ve been using, WebRTC is used a lot more, so it’s easier to find expertise and solutions.</p>
</li>
</ul>
<p>So here we have it: streaming the geometry via WebRTC. I've added a green shader at the front to highlight each object as it streams in.</p>
<p>There's no background because I'd only implemented geometry streaming so far: video/audio streaming is handled differently in WebRTC. But it worked well: very reliable, and more robust to network changes+firewalls than the UDP approach.</p>
<p>And it should work fine in Linux, though that remains to be tested.</p>
<p>As it turned out, it was pretty simple to activate WebRTC video and audio support by just using data channels - rather than WebRTC's native video/audio tracks. This is most likely less efficient, and will need to be reviewed.</p>
<p>Here as well I show the attractive, minimal 2D user-interface menus I've implemented for desktop use: the 3D menus will now only appear if you're in VR. So the app in desktop now looks... a lot like a browser! Which is intentional.</p>
<p>The back half of the video shows more clearly the transition between the video-streamed background and the geometry-streamed foreground - again with a grey cast to highlight the join. This will enable VR apps of truly vast scope to be accessed instantly via Teleport, while still permitting the option of geometry-only streams, which will be comparable in price to running a web server - as against the dollar-plus per user per hour that Unreal or Unity pixel-streaming costs!</p>
<p>Because there's no chance of a true Metaverse developing at those prices. It needs to be cheap enough that anyone can build a site, host content, create services - just as they can on the web. And that's what Teleport delivers.</p>
]]></content:encoded></item><item><title><![CDATA[Teleport VR: A Protocol for the Metaverse - presentation slides]]></title><description><![CDATA[Teleport VR: A Protocol for the Metaverse - presentation slides
18 Mar
Written By Roderick Kennedy
These are my slides for my session talk at VRARA’s Metaverse 2.0, which was held over 9th-10th March 2022 online. In the talk, I go over some early his...]]></description><link>https://roderickkennedy.com/teleport-vr-a-protocol-for-the-metaverse-presentation-slides</link><guid isPermaLink="true">https://roderickkennedy.com/teleport-vr-a-protocol-for-the-metaverse-presentation-slides</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 18 Mar 2022 00:11:46 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401871904/20248a91-8560-421c-a6e5-306491686324.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-teleport-vr-a-protocol-for-the-metaverse-presentation-slides">Teleport VR: A Protocol for the Metaverse - presentation slides</h1>
<p>18 Mar</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/virtual-reality?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>These are my slides for my session talk at <a target="_blank" href="https://www.thevrara.com/events/2022/3/9/you-wanted-more-register-for-our-metaverse-20-virtual-event-metaverse-immerse-vrara">VRARA’s Metaverse 2.0</a>, which was held over 9th-10th March 2022 online. In the talk, I go over some early history of the Metaverse and VR concepts, then go on to make the case why something like <a target="_blank" href="https://teleportvr.io/">Teleport</a> is needed for an open Metaverse of the type we optimistically envision. I compare our approach to the app model status quo, and to Web-based technologies, detailing how in the first case we would be restricted to walled silos, and in the latter to a static data model. Teleport, while open and browsable like the Web, is a dynamic protocol better suited to a live and dynamic 3D Metaverse.</p>
<p>Download the slides here: <a target="_blank" href="https://s3.amazonaws.com/appforest_uf/f1662368617722x291645507128291100/VRARA-Metaverse2.0-10-3-2022.pdf">VRARA-Metaverse2.0-10-3-2022.pdf</a>.</p>
<iframe src="https://drive.google.com/viewerng/viewer?url=https%3A//www.teleportvr.io/files/VRARA-Metaverse2.0-10-3-2022.pdf&amp;embedded=true&amp;wmode=opaque" width="600" style="border:none" height="780"></iframe>

<p><a target="_blank" href="https://roderickkennedy.com/virtual-reality/tag/metaverse">metaverse</a><a target="_blank" href="https://roderickkennedy.com/virtual-reality/tag/teleportvr">teleportvr</a></p>
]]></content:encoded></item><item><title><![CDATA[OpenXR Interaction Profiles]]></title><description><![CDATA[I’ve tabulated the standard OpenXR interaction profiles from the Khronos OpenXR spec at https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#semantic-path-interaction-profiles.
A given Teleport server will have a control set it supports...]]></description><link>https://roderickkennedy.com/openxr-interaction-profiles-1</link><guid isPermaLink="true">https://roderickkennedy.com/openxr-interaction-profiles-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Sat, 18 Dec 2021 16:58:04 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397838057/62161d9c-8bf2-4af8-95e7-63c4e34a5174.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I’ve tabulated the standard OpenXR interaction profiles from the Khronos OpenXR spec at <a target="_blank" href="https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#semantic-path-interaction-profiles">https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#semantic-path-interaction-profiles</a>.</p>
<p>A given Teleport server will have a control set it supports. This must be sent to any connecting client, to say “these are the controls I need for interaction to work”. The client must then match these controls to its hardware.</p>
<p>Unfortunately, there seems to be nothing in OpenXR that allows us to query the XR device for what paths it provides - you just have to suggest bindings and that will either succeed or fail.</p>
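<p>For illustration, here's roughly what that suggestion call looks like - a sketch, not our actual binding code; <code>instance</code> and <code>selectAction</code> are assumed to have been created already:</p>
<pre><code class="lang-cpp">// Suggest one binding for the Khronos Simple Controller profile.
XrPath profilePath, selectPath;
xrStringToPath(instance, "/interaction_profiles/khr/simple_controller", &amp;profilePath);
xrStringToPath(instance, "/user/hand/right/input/select/click", &amp;selectPath);

XrActionSuggestedBinding binding{selectAction, selectPath};

XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDINGS};
suggested.interactionProfile = profilePath;
suggested.countSuggestedBindings = 1;
suggested.suggestedBindings = &amp;binding;

// All-or-nothing: XR_ERROR_PATH_UNSUPPORTED means the runtime rejected the profile
// or a path within it - but there's no way to enumerate what it *would* accept.
XrResult result = xrSuggestInteractionProfileBindings(instance, &amp;suggested);
</code></pre>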
<p>And we need to handle the case where an essential control is missing, and the case where the suggested binding for two required inputs is the same.</p>
<p>This is the table:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397835626/9f9f978b-fcd4-4806-a034-33961e9da919.png" alt="OpenXR Interaction Profiles" /></p>
<p><a target="_blank" href="https://docs.google.com/spreadsheets/d/1w4Me9_yG_TNho4Gmelrc9ILYtG1TELoxCXc622xUiVM/edit?usp=sharing">https://docs.google.com/spreadsheets/d/1w4Me9_yG_TNho4Gmelrc9ILYtG1TELoxCXc622xUiVM/edit?usp=sharing</a></p>
<p>Most hardware should support at least the Khronos Simple Controller, plus its own profile if present. There is also the option of extension paths which include “_ext” in them. But again, no way to query them…</p>
<h4 id="heading-user-paths">User paths</h4>
<p>Only the HTC Vive Pro implements the “head” path, and only for a system button and sound controls. OpenXR <em>does not</em> treat the headset as a controller the way it does for the hands; you have to use xrLocateSpace to get the head pose.</p>
<p>So unless we’re using a gamepad, in most cases we’ll have a user path “/user/hand/left” and “/user/hand/right”, to which we’ll append the input path. For example “/user/hand/left/input/grip/pose” to get the position+orientation (the “pose” in XR jargon) of the left hand. And all of the hand controllers provide two poses per hand: the “grip”, which is supposed to be the centre of the palm with the Z-axis pointing down from index finger to pinkie; and the “aim”, where the negative Z-axis points in the “aiming” direction of the controller. These two poses are locked together in all current controllers I know of: they move in sync, and the offsets from “grip” to “aim” simply represent the slightly different geometries of the various handsets.</p>
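<p>A quick sketch of that xrLocateSpace approach, assuming <code>session</code>, a <code>stageSpace</code>, and the frame's <code>predictedDisplayTime</code> already exist:</p>
<pre><code class="lang-cpp">// Create a VIEW reference space once, at startup.
XrReferenceSpaceCreateInfo createInfo{XR_TYPE_REFERENCE_SPACE_CREATE_INFO};
createInfo.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_VIEW;
createInfo.poseInReferenceSpace.orientation.w = 1.0f; // identity pose
XrSpace viewSpace;
xrCreateReferenceSpace(session, &amp;createInfo, &amp;viewSpace);

// Each frame: where is the headset relative to the stage?
XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
xrLocateSpace(viewSpace, stageSpace, predictedDisplayTime, &amp;location);
if (location.locationFlags &amp; XR_SPACE_LOCATION_POSITION_VALID_BIT)
{
    const XrPosef &amp;headPose = location.pose; // position + orientation of the head
}
</code></pre>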
]]></content:encoded></item><item><title><![CDATA[The HTTP of VR]]></title><description><![CDATA[The HTTP of VR
27 Nov
Written By Roderick Kennedy

There’s been a lot of interest lately in the Metaverse and its potential. To get to that potential, we’re missing a vital piece of the puzzle, without which I think the whole project will never reall...]]></description><link>https://roderickkennedy.com/the-http-of-vr</link><guid isPermaLink="true">https://roderickkennedy.com/the-http-of-vr</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Sat, 27 Nov 2021 17:41:52 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401877924/ff28c146-b249-40de-ab7b-f0a852a9d29e.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-the-http-of-vr">The HTTP of VR</h1>
<p>27 Nov</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/virtual-reality?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707401874831/54a0355d-b576-4b9f-908c-4981ed310aac.png" alt /></p>
<p>There’s been a lot of interest lately in the <a target="_blank" href="https://www.matthewball.vc/all/themetaverse">Metaverse</a> and its potential. To get to that potential, we’re missing a vital piece of the puzzle, without which I think the whole project will never really get off the ground.</p>
<p>The Metaverse needs a network protocol: it needs its own “HTTP”.</p>
<h4 id="heading-in-the-beginning">In the Beginning</h4>
<p>In 1989-90, Tim Berners-Lee created the Web. He did this by specifying the two technologies that we still use today: the document format HTML, and the Hypertext Transfer Protocol (HTTP) - the language that web browsers and servers use to communicate. It’s crucial at this distance in time to remember what these technologies were designed for. As a document format, HTML is intended to represent text and associated content: images, tables and so on, but principally, <em>human-readable</em> text. And it’s a linear, one-dimensional format - for the non-linear hopping around that characterizes the Web, we need the other side of the coin, HTTP. And HTTP was designed as a request-response system - the browser or client sends a URI, along with a command (usually “GET” as in, get me this webpage, but also “PUT”, “POST” and so on). It’s a document-retrieval language, intended originally to help scientific institutions freely share data.</p>
<ul>
<li><p>It’s discontinuous: each request returns a specific, finite response.</p>
</li>
<li><p>It’s stateless: the server is never required to retain information about a client between requests.</p>
</li>
<li><p>It’s unidirectional: links are one-way, and you don’t need permission from the target server to link to it.</p>
</li>
</ul>
<p>So much of the history of the Web over the last thirty years has been of how tech companies have tried to get around all of these properties to turn a system intended for the free sharing of information into one suited for commerce: how to feed things like ads to clients that didn’t request them; how to record “state” - i.e. track the user and observe their behaviour; how to restrict access so that you can charge for your content. But it’s a testament to the power of the <em>original</em> design that the Web became so important that companies felt they had to work around its restrictions. No-one has seriously proposed returning to the walled-garden online services of 90’s AOL, CompuServe and so on; instead, they’ve tried to recreate miniature walled-gardens within the wider web.</p>
<p><strong>But in VR it’s a different story: every app is a silo, barely connected with the wider Metaverse.</strong></p>
<h4 id="heading-vr-on-the-web">VR on the Web</h4>
<p>One of the ways people have tried to extend the Web is to enable content that isn’t a 2D view of a 1D document. Inspired in part by science-fiction visions of a network you could “experience” from the inside, they tried to make it capable of representing 3D spaces.</p>
<p>At the very first World Wide Web Conference, as far back as 1994 - VRML was proposed: a text file format for 3D graphics, sure, but its name, and the paper by David Raggett of HP that proposed it (<a target="_blank" href="https://www.w3.org/People/Raggett/vrml/vrml.html">Extending WWW to support Platform Independent Virtual Reality</a>) revealed its ambition: the Web was to incorporate VR environments, “allowing users to "walk" around and push through doors to follow hyperlinks to other parts of the Web.” The protocol that would deliver these environments? HTTP.</p>
<p>Alongside such document-format technologies, the capability of the Web to support VR rendering has developed. WebXR (formerly WebVR) is a set of Javascript libraries that communicate with a web browser to interact with any VR hardware that the device might have attached. Combined with the 3D rendering capabilities of WebGL (again, Javascript), WebXR gets close to Raggett’s original vision - you can indeed explore parts of the web in VR, and thanks to the capabilities of Javascript, you can even navigate and follow links. Libraries like three.js further extend the power of this approach.</p>
<p>But to get to the level of functionality WebXR provides, you’ve pretty much left the Web behind in terms of its design and intention. A WebXR app is more a Javascript app you’ve downloaded via a web page. By relying so much on the Web to deliver what we need, we’ve lost touch with many of its benefits. In a WebXR app, there are no standards for what a 3D URI looks like, for how the app responds to different hardware capabilities, for how we interact or the relationships between the user and the data model. The DOM, a hugely powerful element of the Web’s design, has little bearing on WebGL, for which only one element, the canvas, is relevant to the 3D environment. Behind the scenes, in Javascript, we are in the realms of code alone: by trying to use the Web for too much, we lose many of its advantages.</p>
<p>The modern inheritor of VRML is perhaps <a target="_blank" href="https://aframe.io/">A-Frame</a>, which does away with even the need for a separate file format, and extends HTML itself to support 3D objects. A-Frame is amazing, allowing the hosting of entire 3D scenes in a web page or part of one. Still: the scene is a <em>document</em>. A linear, finite chunk of data returned by a GET request.</p>
<p>There is tremendous power and potential in these technologies. And they will play a vital role in joining the Web with the spatial internet. But they cannot deliver the entire vision of the Metaverse. By far the greatest reason to look beyond HTML and HTTP for spatial computing is simply this: these technologies will continue to develop, and will always be driven by their primary purpose: to deliver webpages, websites and static or marginally dynamic content.</p>
<p><strong>Spatial computing will remain a secondary consideration.</strong></p>
<p>If the Metaverse is to achieve its full potential, it needs its own protocol.</p>
<h4 id="heading-the-http-of-vr-1">The HTTP of VR</h4>
<p>A virtual space is <em>not</em> a document, and its fundamental characteristics are quite different. Much as the Web has become dynamic over the years, an html page is a stationary format subject to step changes when acted on via the DOM - but while a virtual space can be stationary, it could equally be continuously evolving, more akin to an online game than something that can be retrieved and observed in steady-state. While there is no limit on the size of an html document, it is the nature of text that only a finite contiguous segment is viewed at any time in a browser. So it can be viewed without difficulty on minimal hardware. In 3D, it is possible for vastly distant elements of a scene to be visible from any point. This introduces great performance challenges, particularly when we want the viewing device to be light enough to wear.</p>
<p><strong>So let’s propose an outline of an alternative technology that would fit these requirements.</strong></p>
<p>Imagine an application-layer protocol for VR with the following characteristics:</p>
<ul>
<li><p>A real-time, dynamic, stateful two-way client-server protocol. As such, it will be, if not fully RTP, then close to it.</p>
</li>
<li><p>It will carry 3D geometry, 2D textures, materials, audio, video and volumetric data mainly from server to client.</p>
</li>
<li><p>It will carry control inputs, including spatial, binary and analogue inputs; as well as video, audio and other data from client to server.</p>
</li>
<li><p>Almost all of the application logic will be processed at the server.</p>
</li>
<li><p>Parts of the app logic that are latency-sensitive will occur at the client.</p>
</li>
<li><p>Less latency-sensitive parts of the rendering may occur at the server.</p>
</li>
<li><p>The final rendering and compositing will occur at the client.</p>
</li>
</ul>
<p>We split the workload: let the client handle the near stuff, and the server everything else. The whole system of client and server will form essentially two “loops”:</p>
<ul>
<li><p>The far loop between client input and server logic, which has network latency.</p>
</li>
<li><p>The near loop within the client, which has local latency.</p>
</li>
</ul>
<p>I think that this system could handle almost every current application of XR nicely, and enable a wide range of apps that would struggle in a standalone, installable format. I speak of “the HTTP of VR” rather than the “HTML” because as a real-time system, most of the data will be in binary form; we’re less interested in a human-readable data format, and more in the language of transmission.</p>
<h4 id="heading-an-open-protocol">An Open Protocol</h4>
<p>Much of the above could be considered a fair description of a few VR apps already: particularly the more generic “social network” applications that download a large amount of content on the fly to represent avatars, scenery and so on. But these apps are over-specified for what I have in mind - they are the definition of a “thick client”. And crucially, each client is proprietary - the apps don’t talk to each other.</p>
<p><strong>The key is that the protocol must be open.</strong></p>
<p>Imagine a world where every XR device has a lightweight client application, a “VR browser”. Where VR experiences of all types - games, social networks, education, simulation - are all accessible by simply launching the browser and connecting to the appropriate server. Where it doesn’t matter what headset you have, because all the best apps are online via the protocol: no need to download and install, just connect and go. Where as a developer you don’t need to worry about fitting your entire application in a few gigabytes, because it lives on your server/s, not taking up space on a heavy, head-mounted hard-drive. And neither do you have to deploy updates to hundreds or thousands of users - you mostly just update the server-side.</p>
<p>It seems to me that in this way, the Metaverse could really get going: with network effects, discoverability, openness. Low barriers to entry and a free marketplace of content. These are the factors that allowed the Web to grow and thrive.</p>
<p><strong>By moving beyond the Web with its own open, native protocol, the Metaverse can do the same.</strong></p>
<p>So at Simul, for the past few years we’ve been building this protocol: it’s called <a target="_blank" href="https://teleportvr.io/">Teleport VR</a>. Let’s see what we can make with it!</p>
]]></content:encoded></item><item><title><![CDATA[A virtual keyboard in Dear ImGui]]></title><description><![CDATA[A virtual keyboard in Dear ImGui
10 Nov
Written By Roderick Kennedy
There are a few kinks implementing a virtual keyboard in Dear ImGui - the main problem is getting key inputs from the buttons into the InputText item.

Clicking a button changes focu...]]></description><link>https://roderickkennedy.com/a-virtual-keyboard-in-dear-imgui</link><guid isPermaLink="true">https://roderickkennedy.com/a-virtual-keyboard-in-dear-imgui</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Wed, 10 Nov 2021 13:48:14 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707394745357/ffe554e2-9ef5-403f-80a3-7ed1c6a86b40.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-a-virtual-keyboard-in-dear-imgui">A virtual keyboard in Dear ImGui</h1>
<p>10 Nov</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>There are a few kinks implementing a virtual keyboard in <a target="_blank" href="https://github.com/ocornut/imgui">Dear ImGui</a> - the main problem is getting key inputs from the buttons into the InputText item.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707394742834/1d21c959-c09a-4b3e-a8dd-f695aef9e456.png" alt /></p>
<p>Clicking a button changes focus to the button - away from the InputText. You can switch it back on the next pass, but ImGui won’t recognize it as being ready for input until two frames later.<br />So I implemented a counter called “refocus”. It’s set to zero every time we click a keyboard button. It increments every frame. Only when it gets to 2 or greater do we pull the inputs out of our buffer.</p>
<p>Our variables are:</p>
<pre><code class="lang-cpp">std::vector&lt;int&gt; keys_pressed; // element type &lt;int&gt; restored; it holds key codes
int refocus=0;
</code></pre>
<p>Here’s the code:</p>
<pre><code class="lang-cpp">ImGui::Begin("Virtual Keyboard");
io.KeysDown[VK_BACK] = false;
if(refocus==0)
{
    ImGui::SetKeyboardFocusHere();
}
else if(refocus&gt;=2)
{
    while(keys_pressed.size())
    {
        int k=keys_pressed[0];
        if(k==VK_BACK)
        {
            io.KeysDown[k] = true;
        }
        else
        {
            io.AddInputCharacter(k);
        }
        keys_pressed.erase(keys_pressed.begin());
    }
}
static char buf[500];
if(ImGui::InputText("", buf, IM_ARRAYSIZE(buf)))
{
    current_url=buf;
}
refocus++;
</code></pre>
<p>We define a simple lambda function to add a line of keys to the keyboard:</p>
<pre><code class="lang-cpp">auto KeyboardLine = [&amp;io,this](const char *key)
{
    size_t num = strlen(key);
    for (size_t i = 0; i &lt; num; i++)
    {
        char key_label[] = "X";
        key_label[0] = *key;
        if (ImGui::Button(key_label,ImVec2(46,32)))
        {
            refocus=0;
            keys_pressed.push_back(*key);
        }
        key++;
        if (i&lt;num-1)
            ImGui::SameLine();
    }
};
</code></pre>
<p>This allows us to add the number keys simply with:</p>
<p>            KeyboardLine("1234567890-");
            ImGui::SameLine();</p>
<p>Now here’s our backspace arrow button:</p>
<pre><code class="lang-cpp">if (ImGui::Button(ICON_FK_LONG_ARROW_LEFT,ImVec2(92,32)))
{
    refocus=0;
    keys_pressed.push_back(VK_BACK); // must be VK_BACK to match the check above
}
</code></pre>
<p>Using a Text() call to insert a bit of padding in front of the next row of keys:</p>
<p>            ImGui::Text("  ");
            ImGui::SameLine();
            KeyboardLine("qwertyuiop");
            ImGui::Text("    ");
            ImGui::SameLine();
            KeyboardLine("asdfghjkl");
            ImGui::SameLine();
            if (ImGui::Button("Return",ImVec2(92,32)))
            {
                 refocus=0;
                 keys_pressed.push_back(ImGuiKey_Enter);
            }
            ImGui::Text("      ");
            ImGui::SameLine();
            KeyboardLine("zxcvbnm,./");
            ImGui::End();</p>
<p>And done.</p>
]]></content:encoded></item><item><title><![CDATA[Accessing MongoDB from Nextjs]]></title><description><![CDATA[Following the example of
www.mongodb.com/developer/how-to/nextjs-with-mongodb/ you soon run into difficulty - as of October 2021, the version of the nextjs Mongo integration initialized by
npx create-next-app --example with-mongodb mflix
as the tutor...]]></description><link>https://roderickkennedy.com/accessing-mongodb-from-nextjs-1</link><guid isPermaLink="true">https://roderickkennedy.com/accessing-mongodb-from-nextjs-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 22 Oct 2021 22:18:04 GMT</pubDate><content:encoded><![CDATA[<p>Following the example of</p>
<p><a target="_blank" href="https://www.mongodb.com/developer/how-to/nextjs-with-mongodb/">www.mongodb.com/developer/how-to/nextjs-with-mongodb/</a> you soon run into difficulty - as of October 2021, the version of the nextjs Mongo integration initialized by</p>
<pre><code class="lang-bash">npx create-next-app --example with-mongodb mflix
</code></pre>
<p>as the tutorial instructs, lacks the file util/mongodb.js and the connectToDatabase function the tutorial expects it to contain - because create-next-app here checks out <a target="_blank" href="https://github.com/vercel/next.js/tree/canary/examples/with-mongodb">github.com/vercel/next.js/tree/canary/examples/with-mongodb</a> at the “canary” branch. You have to either check out the branch “master” of that repo, or manually add the mongodb.js file from that branch into the project it created.</p>
<p>Then, the tutorial skips over a lot of crucial information about how to use the API. The answer to obtaining a single result from the API is to create a file e.g. [movie_id].js in pages/api/movies, and fill it with:</p>
<pre><code class="lang-js">import { connectToDatabase } from "../../../util/mongodb";

const { ObjectId } = require('mongodb');

export default async function handler(req, res)
{
    const { movie_id } = req.query;
    const { db } = await connectToDatabase();
    var o_id = new ObjectId(movie_id);
    const movie_info = await db
        .collection("movies")
        .find({'_id' : o_id})
        .toArray();
    res.json(movie_info);
}
</code></pre>
<p>The key here is getting the definition of ObjectId from the mongodb node module, initializing o_id as an ObjectId from the 24-char movie id, then using it in the find() function of the collection object.</p>
<p>Overall, I’m thinking Mongo/Nextjs is insufficiently documented for my needs - my search for a node data backend continues.</p>
]]></content:encoded></item><item><title><![CDATA[Changing the name of a variable in CMake]]></title><description><![CDATA[Here's a neat trick in CMake: you want to change the name of a variable, but worry that anyone you've distributed the code to already will lose the option they've selected.
Use the old variable as the default value for the new one:
option(OLD_VARIABL...]]></description><link>https://roderickkennedy.com/changing-the-name-of-a-variable-in-cmake-1</link><guid isPermaLink="true">https://roderickkennedy.com/changing-the-name-of-a-variable-in-cmake-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 27 Nov 2020 14:18:00 GMT</pubDate><content:encoded><![CDATA[<p>Here's a neat trick in CMake: you want to change the name of a variable, but worry that anyone you've distributed the code to already will lose the option they've selected.</p>
<p>Use the old variable as the default value for the new one:</p>
<pre><code class="lang-cmake">option(OLD_VARIABLE "Some variable" ON)
option(NEW_VARIABLE "Some variable" ${OLD_VARIABLE})
</code></pre>
<p>or...</p>
<pre><code class="lang-cmake">set( OLD_STRING_VARIABLE "Old default" CACHE STRING "Help text" )
set( NEW_STRING_VARIABLE "${OLD_STRING_VARIABLE}" CACHE STRING "Help text" )
</code></pre>
]]></content:encoded></item><item><title><![CDATA[Passing an array of structs from C++ to C#]]></title><description><![CDATA[To pass an array of structs from C++ to C#, you can pass a pointer to a C-style array. In C++ you may have a struct, e.g.
#pragma pack(push) #pragma pack(1) struct InputEvent { uint32_t eventId; float floatValue; uint32_t intValue; }; #pragma pack(po...]]></description><link>https://roderickkennedy.com/passing-an-array-of-structs-from-c-to-c-1</link><guid isPermaLink="true">https://roderickkennedy.com/passing-an-array-of-structs-from-c-to-c-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Thu, 12 Nov 2020 14:17:00 GMT</pubDate><content:encoded><![CDATA[<p>To pass an array of structs from C++ to C#, you can pass a pointer to a C-style array. In C++ you may have a struct, e.g.</p>
<pre><code class="lang-cpp">#pragma pack(push)
#pragma pack(1)
struct InputEvent
{
    uint32_t eventId;
    float floatValue;
    uint32_t intValue;
};
#pragma pack(pop)
</code></pre>
<p>The delegate in C++ is:</p>
<pre><code class="lang-cpp">typedef void(__stdcall* ProcessNewInputFn)(int numEvents, const InputEvent**);
</code></pre>
<p>Telling C++ what C# function to call:</p>
<p><code>extern "C" __declspec(dllexport) void SetInputProcessingDelegate(ProcessNewInputFn newInputProcessing) { processNewInput = newInputProcessing; }</code></p>
<p>Using this from C++:</p>
<pre><code class="lang-cpp">std::vector&lt;InputEvent&gt; inputEvents;
const InputEvent *v = inputEvents.data();
processNewInput((int)inputEvents.size(), &amp;v);
</code></pre>
<p>In C# the struct is defined as:</p>
<pre><code class="lang-cs">[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct InputEvent
{
    public UInt32 eventId;
    public float floatValue;
    public UInt32 intValue;
};
</code></pre>
<p>Note the packing! It must match what we had in C++. Now C# must declare the delegate type it will implement:</p>
<pre><code class="lang-cs">[UnmanagedFunctionPointer(CallingConvention.StdCall)]
delegate void OnNewInput(int numEvents, in IntPtr newEvents);
</code></pre>
<p>And declare in C# the C++ function that sets the delegate:</p>
<p><code>[DllImport("dllname")] static extern void SetInputProcessingDelegate(OnNewInput onNewInput );</code></p>
<p>This is called with</p>
<pre><code class="lang-cs">SetInputProcessingDelegate(ProcessingClass.StaticProcessInput);
</code></pre>
<p>Where we have a class like this:</p>
<pre><code class="lang-cs">class ProcessingClass
{
    public static void StaticProcessInput(int numEvents, in IntPtr inputEventsPtr)
    {
        int EventSize = Marshal.SizeOf(typeof(InputEvent));
        InputEvent[] inputEvents = new InputEvent[numEvents];
        IntPtr ptr = inputEventsPtr;
        for (int i = 0; i &lt; numEvents; i++)
        {
            inputEvents[i] = Marshal.PtrToStructure&lt;InputEvent&gt;(ptr);
            ptr += EventSize;
        }
    }
}
</code></pre>
<p>Here, we take the C++-style pointer-to-array and iterate through the array elements, copying each in turn into the C#-style array. One caution: keep a reference to the delegate alive on the C# side; if the garbage collector frees it, the C++ side will be left calling a dangling function pointer.</p>
]]></content:encoded></item><item><title><![CDATA[Resolving conflicts between Qt versions]]></title><description><![CDATA[Resolving conflicts between Qt versions
14 Nov
Written By Roderick Kennedy
The trueSKY plugin for Unreal uses Qt dll's for UI. Unfortunately so do some other plugins. Because Windows just uses whichever version of a dll was loaded first, this leads t...]]></description><link>https://roderickkennedy.com/resolving-conflicts-between-qt-versions</link><guid isPermaLink="true">https://roderickkennedy.com/resolving-conflicts-between-qt-versions</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Thu, 14 Nov 2019 14:16:00 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-resolving-conflicts-between-qt-versions">Resolving conflicts between Qt versions</h1>
<p>14 Nov</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>The trueSKY plugin for Unreal uses Qt dll's for UI. Unfortunately so do some other plugins. Because Windows just uses whichever version of a dll was loaded first, this leads to (for example) crashes in trueSKY UI because it tries to access the wrong parts of a dll loaded by Quixel Megascans.  </p>
<p>To solve this we recompile Qt using the switch -qtlibinfix to modify the output filenames. Thus instead of Qt5Core.dll we get Qt5Core_simul.dll etc.  </p>
<p>No more conflicts!  </p>
<p>UPDATE: To compile Qt, follow the instructions at <a target="_blank" href="https://wiki.qt.io/Building_Qt_5_from_Git">https://wiki.qt.io/Building_Qt_5_from_Git</a>. For example, for me on Windows, I must git-clone the repo, install perl (!) and call perl init-repository.  </p>
<p>Then, I create a build directory, BUILD_DIR, at the subdirectory "build/x64". From there, I call:</p>
<pre><code class="lang-bash">call ../../configure.bat -qtlibinfix %QT_INFIX% -prefix %BUILD_DIR%\qtbase -skip qtwebengine -developer-build -%reldeb% -force-debug-info -no-warnings-are-errors -L kernel32 -opengl desktop -opensource -make libs -make tools -nomake examples -nomake tests -platform win32-msvc -confirm-license -no-compile-examples -qt-zlib -plugin-manifests -no-angle -qt-freetype -qt-libjpeg -qt-libpng -D U_STATIC_IMPLEMENTATION %INC% %LIBDIRS% %IC%
</code></pre>
<p>QT_INFIX is _simul, while INC, LIBDIRS and IC are extra compile options.</p>
<p>Finally, we run nmake to build Qt.</p>
]]></content:encoded></item><item><title><![CDATA[Finding and removing files added to git by accident]]></title><description><![CDATA[Finding and removing files added to git by accident
2 Aug
Written By Roderick Kennedy
If for example, you've added lib files by mistake to a large git repo, and want to remove them, but don't know the exact paths, use this:  
git ls-files *.lib>lib.b...]]></description><link>https://roderickkennedy.com/finding-and-removing-files-added-to-git-by-accident</link><guid isPermaLink="true">https://roderickkennedy.com/finding-and-removing-files-added-to-git-by-accident</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Thu, 02 Aug 2018 13:14:00 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-finding-and-removing-files-added-to-git-by-accident">Finding and removing files added to git by accident</h1>
<p>2 Aug</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>If for example, you've added lib files by mistake to a large git repo, and want to remove them, but don't know the exact paths, use this:  </p>
<pre><code class="lang-bash">git ls-files *.lib &gt; lib.bat
</code></pre>
<p>Then in lib.bat you may have e.g.:<br /><code>Plugins/Media/Intermediate/Build/Win64/DebugMediaEditor/MediaEditor-Win64-Debug.lib</code></p>
<p>Add <code>git rm --cached</code> to the front of each line, then run the batch file and commit the result.</p>
]]></content:encoded></item><item><title><![CDATA[How to make a custom Wizard for Unreal Editor]]></title><description><![CDATA[How to make a custom Wizard for Unreal Editor
21 Jun
Written By Roderick Kennedy
I wanted to create a wizard in the trueSKY Unreal plugin that would make it easier for users to add trueSKY to UE scenes. I was following this video where Epic's Michael...]]></description><link>https://roderickkennedy.com/how-to-make-a-custom-wizard-for-unreal-editor</link><guid isPermaLink="true">https://roderickkennedy.com/how-to-make-a-custom-wizard-for-unreal-editor</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Thu, 21 Jun 2018 13:13:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707394758660/662df265-4967-4470-9630-7256449cd79d.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-how-to-make-a-custom-wizard-for-unreal-editor">How to make a custom Wizard for Unreal Editor</h1>
<p>21 Jun</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>I wanted to create a wizard in the <a target="_blank" href="https://simul.co/truesky">trueSKY</a> Unreal plugin that would make it easier for users to add trueSKY to UE scenes. I was following <a target="_blank" href="https://www.youtube.com/watch?v=zg_VstBxDi8&amp;t=1482s">this video</a> where Epic's Michael Noland describes various ways to modify the Editor. So I made a custom Property Editor window with settings to select a sky sequence, create a TrueSkyLight etc.  </p>
<p>But it didn't look very friendly. And implementing a wizard-style Apply button just put a button in amongst the other settings - not great. After some searching in the UE codebase, I discovered the SWizard class that Unreal Editor uses for its own wizards. Here's what you do:  </p>
<p>1. Create a class derived from SCompoundWidget containing a TSharedPtr&lt;SWizard&gt;. Mine looks like this:  </p>
<pre><code class="lang-cpp">DECLARE_DELEGATE_FourParams( FOnTrueSkySetup, bool, ADirectionalLight*, bool, UTrueSkySequenceAsset* );

#define S_DECLARE_CHECKBOX(name) \
    bool name; \
    ECheckBoxState Is##name##Checked() const { return name ? ECheckBoxState::Checked : ECheckBoxState::Unchecked; } \
    void On##name##Changed(ECheckBoxState InCheckedState) { name = (InCheckedState == ECheckBoxState::Checked); }

class STrueSkySetupTool : public SCompoundWidget
{
public:
    SLATE_BEGIN_ARGS( STrueSkySetupTool )
        :_CreateTrueSkyLight(false)
        ,_DirectionalLight(nullptr)
        ,_CreateDirectionalLight(false)
        ,_Sequence(nullptr)
    {}
    /** A TrueSkyLight actor performs real-time ambient lighting. */
    SLATE_ARGUMENT(bool, CreateTrueSkyLight)
    /** TrueSKY can drive a directional light to provide sunlight and moonlight. */
    SLATE_ARGUMENT(ADirectionalLight*, DirectionalLight)
    /** If there's no directional light in the scene, you can create one with this checkbox. */
    SLATE_ARGUMENT(bool, CreateDirectionalLight)
    /** The TrueSKY Sequence provides the weather state to render. */
    SLATE_ARGUMENT(UTrueSkySequenceAsset*, Sequence)
    /** Event called when code is successfully added to the project */
    SLATE_EVENT( FOnTrueSkySetup, OnTrueSkySetup )
    SLATE_END_ARGS()

    /** Constructs this widget with InArgs */
    void Construct( const FArguments&amp; InArgs );

    /** Handler for when cancel is clicked */
    void CancelClicked();

    /** Returns true if Finish is allowed */
    bool CanFinish() const;

    /** Handler for when finish is clicked */
    void FinishClicked();

    ...

    S_DECLARE_CHECKBOX(CreateTrueSkyLight)
    S_DECLARE_CHECKBOX(ShowAllSequences)

    void SetupSequenceAssetItems();

    void CloseContainingWindow();
private:
    /** The wizard widget */
    TSharedPtr&lt;SWizard&gt; MainWizard;
    FOnTrueSkySetup OnTrueSkySetup;
    ...
};
</code></pre>
<p>The SLATE_ARGUMENT macros allow initialization of named parameters in this style:  </p>
<pre><code class="lang-cpp">TSharedRef&lt;STrueSkySetupTool&gt; TrueSkySetupTool = SNew(STrueSkySetupTool).OnTrueSkySetup(OnTrueSkySetup1).CreateTrueSkyLight(true);
</code></pre>
<p>etc. This is super-useful.  </p>
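<p>The S_DECLARE_CHECKBOX macro above pairs up with SCheckBox in the same way. As a sketch (the label text and LOCTEXT key here are illustrative, not from the actual plugin), the generated Is...Checked and On...Changed members bind like this:</p>
<pre><code class="lang-cpp">// Sketch only: binding the members generated by S_DECLARE_CHECKBOX(ShowAllSequences).
// The surrounding slot layout is omitted.
SNew(SCheckBox)
.IsChecked(this, &amp;STrueSkySetupTool::IsShowAllSequencesChecked)
.OnCheckStateChanged(this, &amp;STrueSkySetupTool::OnShowAllSequencesChanged)
[
    SNew(STextBlock)
    .Text( LOCTEXT("ShowAllSequencesLabel", "Show all sequences") )
]
</code></pre>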
<p>2. Create a callback for the wizard to execute:  </p>
<pre><code class="lang-cpp">FOnTrueSkySetup OnTrueSkySetupDelegate;
</code></pre>
<p>3. Create a window for the widget. This function is called when the menu option to start the wizard is selected:  </p>
<pre><code class="lang-cpp">void FTrueSkyEditorPlugin::OnAddSequence()
{
    TrueSkySetupWindow = SNew(SWindow)
        .Title( NSLOCTEXT("InitializeTrueSky", "WindowTitle", "Initialize trueSKY") )
        .ClientSize( FVector2D(600, 550) )
        .SizingRule( ESizingRule::FixedSize )
        .SupportsMinimize(false).SupportsMaximize(false);
    OnTrueSkySetupDelegate.BindRaw(this, &amp;FTrueSkyEditorPlugin::OnTrueSkySetup);
    TSharedRef&lt;STrueSkySetupTool&gt; TrueSkySetupTool = SNew(STrueSkySetupTool).OnTrueSkySetup(OnTrueSkySetupDelegate);
    TrueSkySetupWindow-&gt;SetContent( TrueSkySetupTool );
</code></pre>
<p>If the main frame exists, parent the window to it; the main frame should always exist...  </p>
<pre><code class="lang-cpp">    TSharedPtr&lt;SWindow&gt; ParentWindow;
    if( FModuleManager::Get().IsModuleLoaded( "MainFrame" ) )
    {
        IMainFrameModule&amp; MainFrame = FModuleManager::GetModuleChecked&lt;IMainFrameModule&gt;( "MainFrame" );
        ParentWindow = MainFrame.GetParentWindow();
    }

    bool modal = false;    // this wizard is modeless; set true to block the Editor while it's open
    if (modal)
    {
        FSlateApplication::Get().AddModalWindow(TrueSkySetupWindow.ToSharedRef(), ParentWindow);
    }
    else if (ParentWindow.IsValid())
    {
        FSlateApplication::Get().AddWindowAsNativeChild(TrueSkySetupWindow.ToSharedRef(), ParentWindow.ToSharedRef());
    }
    else
    {
        FSlateApplication::Get().AddWindow(TrueSkySetupWindow.ToSharedRef());
    }
    TrueSkySetupWindow-&gt;ShowWindow();
}
</code></pre>
<p>4. Implement the setup tool:  </p>
<pre><code class="lang-cpp">BEGIN_SLATE_FUNCTION_BUILD_OPTIMIZATION
void STrueSkySetupTool::Construct( const FArguments&amp; InArgs )
{
    OnTrueSkySetup = InArgs._OnTrueSkySetup;
    CreateTrueSkyLight = InArgs._CreateTrueSkyLight;
    DirectionalLight = InArgs._DirectionalLight;
    Sequence = InArgs._Sequence;
    ...
</code></pre>
<p>The interface to build the actual UI is really interesting. By overloading the [] and + operators, Epic lets you specify the widget structure like so:  </p>
<pre><code class="lang-cpp">    ChildSlot
    [
        SNew(SBorder)
        .Padding(18)
        .BorderImage( FEditorStyle::GetBrush("Docking.Tab.ContentAreaBrush") )
        [
            SNew(SVerticalBox)
            +SVerticalBox::Slot()
            [
                SAssignNew( MainWizard, SWizard)
                .ShowPageList(false)
                .CanFinish(this, &amp;STrueSkySetupTool::CanFinish)
                .FinishButtonText( LOCTEXT("TrueSkyFinishButtonText", "Initialize") )
                .OnCanceled(this, &amp;STrueSkySetupTool::CancelClicked)
                .OnFinished(this, &amp;STrueSkySetupTool::FinishClicked)
                .InitialPageIndex(0)
                +SWizard::Page()
                [
                    SNew(SVerticalBox)
                    +SVerticalBox::Slot()
                    .AutoHeight()
                    [
                        SNew(STextBlock)
                        .TextStyle( FEditorStyle::Get(), "NewClassDialog.PageTitle" )
                        .Text( LOCTEXT( "WeatherStateTitle", "Choose a Sequence Asset" ) )
                    ]
                    +SVerticalBox::Slot()
                    .AutoHeight()
                    .Padding(0)
                    [
                        SNew(SHorizontalBox)
                        +SHorizontalBox::Slot()
                        .FillWidth(1.f)
                        .VAlign(VAlign_Center)
                        [
                            SNew(STextBlock)
                            .Text( LOCTEXT("TrueSkySetupToolDesc", "Choose which weather sequence to use initially.") )
                            .AutoWrapText(true)
                            .TextStyle(FEditorStyle::Get(), "NewClassDialog.ParentClassItemTitle")
                        ]
                    ]
                ]
                +SWizard::Page()
                [
                    ...
                ]
            ]
        ]
    ];
}
END_SLATE_FUNCTION_BUILD_OPTIMIZATION
</code></pre>
<p>So by adding new +SWizard::Page() elements, we add pages to the wizard, as in the sketch below.  </p>
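<p>For instance, a third page could be appended to the SAssignNew( MainWizard, SWizard) chain above like this; the title and contents here are purely illustrative:</p>
<pre><code class="lang-cpp">// Illustrative only: one more page in the same builder chain.
+SWizard::Page()
[
    SNew(SVerticalBox)
    +SVerticalBox::Slot()
    .AutoHeight()
    [
        SNew(STextBlock)
        .TextStyle( FEditorStyle::Get(), "NewClassDialog.PageTitle" )
        .Text( LOCTEXT( "LightingTitle", "Lighting Options" ) )
    ]
]
</code></pre>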
<p>5. Finally, implement the callback that the delegate calls when you click "Finish":  </p>
<pre><code class="lang-cpp">void FTrueSkyEditorPlugin::OnTrueSkySetup(bool CreateDirectionalLight, ADirectionalLight* DirectionalLight, bool CreateTrueSkyLight, UTrueSkySequenceAsset* Sequence)
{
...
}
</code></pre>
<p>The end result looks like this:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707394754040/040e4471-660e-45c0-847f-5b93286e79f8.png" alt="image (4).png" /></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707394755359/404b4d59-d3aa-4472-9d32-2fdaaea150e2.png" alt /></p>
<p>Full source for this is in our UE branch (register at <a target="_blank" href="https://simul.co/register">Simul</a> to access).</p>
]]></content:encoded></item><item><title><![CDATA[Signing installers with certificates]]></title><description><![CDATA[Signing installers with certificates
10 Apr
Written By Roderick Kennedy
Windows Defender has recently decided to falsely mark all of our installers as containing some virus or other.  
It'll be a long long while before they get around to questioning ...]]></description><link>https://roderickkennedy.com/signing-installers-with-certificates</link><guid isPermaLink="true">https://roderickkennedy.com/signing-installers-with-certificates</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Tue, 10 Apr 2018 13:09:00 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-signing-installers-with-certificates">Signing installers with certificates</h1>
<p>10 Apr</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>Windows Defender has recently decided to falsely mark all of our installers as containing some virus or other.  </p>
<p>It'll be a long, long while before they get around to questioning whether their algorithms are, in fact, "full of it", as they say, so let's see what happens if we sign our executables with a certificate that chains to a trusted root.  </p>
<p>First, get a certificate from Comodo. This takes weeks, while they check whether an arbitrary non-governmental organization, Dun and Bradstreet, regards your company as genuine. Just check with Companies House? Way too simple!  </p>
<p>So you need to get a DUNS number from D&amp;B, then buy a certificate from tucows/Comodo.  </p>
<p>After jumping through their hoops (which don't seem to be very secure to me, just cumbersome), you'll get a .crt file.  </p>
<p>Then <a target="_blank" href="https://support.citrix.com/article/CTX221295">https://support.citrix.com/article/CTX221295</a> will tell you how to convert your crt to a pfx.  </p>
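<p>With OpenSSL the conversion boils down to something like the line below; the file names are placeholders, and you'll also need the private key that went with your certificate request:</p>
<pre><code class="lang-bash">rem Placeholders: mycompany.* are your own files; comodo-chain.crt is the CA bundle Comodo supplies.
openssl pkcs12 -export -in mycompany.crt -inkey mycompany.key -certfile comodo-chain.crt -out mycompany.pfx
</code></pre>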
<p>Finally, use the pfx and signtool.exe (in the Windows SDK) to sign your executable.</p>
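<p>As a sketch, with placeholder file names and password, and a timestamp server so the signature outlives the certificate, the signing step looks like this:</p>
<pre><code class="lang-bash">rem Placeholders: installer.exe, mycompany.pfx and MyPfxPassword are your own.
signtool sign /f mycompany.pfx /p MyPfxPassword /fd SHA256 /tr http://timestamp.comodoca.com /td SHA256 installer.exe
</code></pre>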
]]></content:encoded></item><item><title><![CDATA[Building Unreal Engine projects from the solution using MSBuild]]></title><description><![CDATA[17 Nov
Written By Roderick Kennedy
If you want to use MSBuild to build UE4 projects, but need to build them from within the solution instead of specifying the vcxproj file (which doesn't always work correctly), you need to use the "Target" /t: comman...]]></description><link>https://roderickkennedy.com/building-unreal-engine-projects-from-the-solution-using-msbuild-1</link><guid isPermaLink="true">https://roderickkennedy.com/building-unreal-engine-projects-from-the-solution-using-msbuild-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Fri, 17 Nov 2017 14:08:00 GMT</pubDate><content:encoded><![CDATA[<h1 id="heading-building-unreal-engine-projects-from-the-solution-using-msbuild">Building Unreal Engine projects from the solution using MSBuild</h1>
<p>17 Nov</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>If you want to use MSBuild to build UE4 projects, but need to build them from within the solution instead of specifying the vcxproj file (which doesn't always work correctly), you need to use the "Target" /t: command line parameter, like so:</p>
<p><code>"path to MSBuild.exe" /t:Engine\UE4 /p:Configuration="Development Editor" /p:Platform=Win64 UE4.sln</code></p>
<p>Key things to note:</p>
<ul>
<li><p>The configuration and platform specifiers are solution-style: spaces instead of underscores, and Win64 instead of x64, etc.</p>
</li>
<li><p>The solution folder path must be specified in the target parameter; otherwise MSBuild will not recognize the project name (see the example below).</p>
</li>
</ul>
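<p>For example, a game project that sits under the solution's "Games" folder would be built like this, where MyProject stands in for your own project name:</p>
<p><code>"path to MSBuild.exe" /t:Games\MyProject /p:Configuration="Development Editor" /p:Platform=Win64 UE4.sln</code></p>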
]]></content:encoded></item><item><title><![CDATA[Advanced custom Qt Container Widgets and Qt Designer]]></title><description><![CDATA[3 Jun
Written By Roderick Kennedy
Qt has a nice UI editor called Designer, and you can create custom widgets that go in Designer's toolkit. But the only example I've ever found is this one in the Qt docs, which doesn't explain how to create container...]]></description><link>https://roderickkennedy.com/advanced-custom-qt-container-widgets-and-qt-designer-1</link><guid isPermaLink="true">https://roderickkennedy.com/advanced-custom-qt-container-widgets-and-qt-designer-1</guid><dc:creator><![CDATA[Roderick Kennedy]]></dc:creator><pubDate>Sat, 03 Jun 2017 13:07:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397865579/a060371c-5310-4a07-b4c4-86326b3104ff.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-advanced-custom-qt-container-widgets-and-qt-designer">Advanced custom Qt Container Widgets and Qt Designer</h1>
<p>3 Jun</p>
<p>Written By <a target="_blank" href="https://roderickkennedy.com/dbgdiary?author=5f08d2770b281846bf04ee3b">Roderick Kennedy</a></p>
<p>Qt has a nice UI editor called Designer, and you can create custom widgets that go in Designer's toolkit. But the only example I've ever found is <a target="_blank" href="http://doc.qt.io/qt-5/designer-creating-custom-widgets.html">this one</a> in the Qt docs, which doesn't explain how to create container widgets.</p>
<p>The problem is to create a widget that contains some decoration or controls, but also has a sub-window where people can put their own widgets.</p>
<p>For example, I wanted an "accordion" control that had a checkbox at the top to open and close it, then to be able to put any other control inside this.</p>
<p>The structure: a QAccordion with a QVBoxLayout will contain a QCheckBox and a QWidget called the contents widget. The contents widget will have its own QVBoxLayout, where the user's controls will go.</p>
<p>You will create two classes: one called (say) QAccordion, which implements the widget, and one called QAccordionInterface, which tells Designer that it's available.</p>
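<p>The interface class follows the standard QDesignerCustomWidgetInterface pattern from the Qt documentation linked above. Here is a minimal sketch, with the group name and description strings as assumptions rather than the shipped code:</p>
<pre><code class="lang-cpp">#include &lt;QtDesigner/QDesignerCustomWidgetInterface&gt;
#include "QAccordion.h"

class QAccordionInterface : public QObject, public QDesignerCustomWidgetInterface
{
    Q_OBJECT
    // Q_PLUGIN_METADATA(IID "org.qt-project.Qt.QDesignerCustomWidgetInterface") // needed if this class is the plugin itself
    Q_INTERFACES(QDesignerCustomWidgetInterface)
public:
    explicit QAccordionInterface(QObject *parent = nullptr) : QObject(parent) {}

    QString name() const override        { return "QAccordion"; }
    QString includeFile() const override { return "QAccordion.h"; }
    QString group() const override       { return "Simul Widgets"; }   // assumed group name
    QIcon icon() const override          { return QIcon(); }
    QString toolTip() const override     { return "An accordion container widget"; }
    QString whatsThis() const override   { return "A container that opens and closes via a checkbox."; }
    bool isContainer() const override    { return true; }               // tells Designer it accepts children
    QWidget *createWidget(QWidget *parent) override { return new QAccordion(parent); }
    QString domXml() const override;     // the important one - see below
};
</code></pre>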
<p>&lt;#include #include "GeneratedFiles/ui_QAccordion.h" #include "Export.h"</p>
<p>class SIMUL_QT_WIDGETS_EXPORT QAccordion : public QWidget { Q_OBJECT</p>
<p>Q_PROPERTY(QString title READ title WRITE setTitle DESIGNABLE true) Q_PROPERTY(bool open READ isOpen WRITE setOpen DESIGNABLE true) public: QAccordion(QWidget *parent = 0); ~QAccordion(); void setTitle(QString f); QString title() const; void setOpen(bool o); bool isOpen() const; public slots: void on_accordionCheckBox_toggled(); void setSearchText(const QString &amp;); signals: protected: void childEvent ( QChildEvent * event ) override; void paintEvent(QPaintEvent *) override; private: Ui::Accordion ui; bool setup_complete; QWidget *contentsWidget; void hookupContentsWidget(); };</p>
<p>The Ui::Accordion member shows that I created the basic class in Designer itself. This is optional; the QAccordion.ui file is just:</p>
<p>?xml version="1.0" encoding="UTF-8"? Accordion0067133600AccordionAccordiontrue</p>
<p>By putting the layout and checkbox in the ui file, they will be created in code, in Ui::Accordion.</p>
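<p>The constructor then just runs the generated setup; a plausible sketch, assuming the usual uic pattern rather than the exact shipped code:</p>
<pre><code class="lang-cpp">QAccordion::QAccordion(QWidget *parent)
    : QWidget(parent), setup_complete(false), contentsWidget(nullptr)
{
    ui.setupUi(this);   // creates the layout and the checkbox declared in QAccordion.ui
}
</code></pre>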
<p>But if we were to create the whole thing here, including the contents widget, then after building the class the contents widget would NOT be accessible in Designer, and neither would its layout be recognized. So instead we put these in a function called domXml in QAccordionInterface.</p>
<pre><code class="lang-cpp">QString QAccordionInterface::domXml() const
{
    return "&lt;ui language=\"c++\"&gt;\n"
           " &lt;widget class=\"QAccordion\" name=\"accordion\"&gt;\n"
           "  &lt;property name=\"geometry\"&gt;\n"
           "   &lt;rect&gt;\n"
           "    &lt;x&gt;0&lt;/x&gt;\n"
           "    &lt;y&gt;0&lt;/y&gt;\n"
           "    &lt;width&gt;100&lt;/width&gt;\n"
           "    &lt;height&gt;24&lt;/height&gt;\n"
           "   &lt;/rect&gt;\n"
           "  &lt;/property&gt;\n"
           "  &lt;property name=\"toolTip\"&gt;\n"
           "   &lt;string&gt;&lt;/string&gt;\n"
           "  &lt;/property&gt;\n"
           "  &lt;property name=\"whatsThis\"&gt;\n"
           "   &lt;string&gt;.&lt;/string&gt;\n"
           "  &lt;/property&gt;\n"
           "  &lt;widget class=\"QWidget\" name=\"contentsWidget\" native=\"true\"&gt;\n"
           "   &lt;layout class=\"QVBoxLayout\" name=\"accContentsVLayout\"&gt;\n"
           "    &lt;property name=\"spacing\"&gt;\n"
           "     &lt;number&gt;2&lt;/number&gt;\n"
           "    &lt;/property&gt;\n"
           "    &lt;property name=\"leftMargin\"&gt;\n"
           "     &lt;number&gt;2&lt;/number&gt;\n"
           "    &lt;/property&gt;\n"
           "    &lt;property name=\"topMargin\"&gt;\n"
           "     &lt;number&gt;2&lt;/number&gt;\n"
           "    &lt;/property&gt;\n"
           "    &lt;property name=\"rightMargin\"&gt;\n"
           "     &lt;number&gt;2&lt;/number&gt;\n"
           "    &lt;/property&gt;\n"
           "    &lt;property name=\"bottomMargin\"&gt;\n"
           "     &lt;number&gt;2&lt;/number&gt;\n"
           "    &lt;/property&gt;\n"
           "   &lt;/layout&gt;\n"
           "  &lt;/widget&gt;\n"
           " &lt;/widget&gt;\n"
           "&lt;/ui&gt;\n";
}
</code></pre>
<p>By specifying the contents widget and its layout here, Designer will know to dynamically create them when you add a QAccordion, so they'll appear in the editor. You can then drag any control into the contents widget, and it will be correctly positioned. Be careful that you drag it to the contents widget and not the QAccordion itself or a subcontrol. Designer doesn't properly obey its "isContainer" function, so it sees any custom control as a container, not just the ones you indicate.</p>
<p>So now, in Designer, we can add QAccordions. Without styling they just look like checkboxes with a space below where you can drag controls:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397861638/e622321c-fe37-4800-8d71-3dc4cc59475e.png" alt="designer.PNG" /></p>
<p>After applying some styling, the final result looks like this:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1707397862855/37014f18-d923-40bf-9696-44c9a865b528.png" alt="accordion.PNG" /></p>
<p>The accordion elements (Cloud Window, Precipitation, etc.) are inside a searchable property panel, implemented on the same principles.</p>
<p>And here are the files for the final class:</p>
<p><a target="_blank" href="https://simul.co/wp-content/uploads/blog-files/QAccordion.zip">QAccordion.zip</a></p>
]]></content:encoded></item></channel></rss>