A multi-platform framework for fast and easy demo development.
The framework abstracts away all the boilerplate and OS-specific code for allocating windows, creating the context, loading textures, compiling shaders, running the render loop, animation ticks, benchmarking graph overlays, etc., thereby allowing the demo/benchmark developer to focus on writing the actual 'demo' code.
Demos can therefore be developed on PC or Android, where the tool chain and debug facilities often allow for faster turnaround times, and then compiled and deployed without code changes to the other supported platforms.
The framework also allows for 'real' comparative benchmarks between the different OS and windowing systems, since the exact same demo/benchmark code runs on all of them.
- Console. A freestyle project that runs in a console-like environment.
- G2D (early access)
- OpenCL (early access)
- OpenCV (early access)
- OpenGL ES 2, 3, 3.1
- OpenVG
- OpenVX (early access)
- Vulkan (early access)
- Window. A freestyle project that runs in a window-based environment.
- Android NDK
- Linux with various windowing systems (Yocto).
- Ubuntu 22.04
- Windows 10+
- Written in a limited subset of C++17 and uses RAII to manage resources.
- Uses a limited subset of STL to make it easier to port.
- No copyleft restrictions from GPL / L-GPL licenses.
- Allows for direct access to the expected APIs (EGL, OpenGL ES 2, OpenGL ES 3, OpenVG, OpenCV, etc.)
- Package-based architecture that ensures your app only has dependencies on the libs it actually uses.
- Content pipeline:
- Automatically compile Vulkan shaders during build.
- Services
- Keyboard, mouse and GamePad.
- Persistent data manager
- Assets management (models, textures)
- Defines a standard way of handling:
- Init, shutdown & window resize.
- Program input arguments.
- Input events like keyboard, mouse and touch.
- Fixed time-step and variable time-step demo implementations.
- Logging functionality.
- Provides optional helper classes for commonly used tasks:
- Matrix, Vector3, GLShader, GLTexture, etc
- Easy access to optional libs like: GLM, GLI, RapidJSON and Assimp
See the setup guides for your platform:
For details about the build system see the FslBuildGen document.
While writing this we currently have forty-two OpenGL ES 2 samples, seventy-five OpenGL ES 3.x samples, sixty-six Vulkan samples, eight OpenVG samples, two G2D samples, three OpenCL samples, two OpenCV samples, three OpenVX samples and six other samples, for a total of 207 sample applications.
The demo framework currently runs on at least four platforms, so using a traditional approach we would have to maintain 207 * 4 = 828 build files for the samples alone. Maintaining 828, or even just 207, build files would be an extremely time-consuming and error-prone process. So ideally, we wanted to use a build tool that supported:
- Minimalistic build description files, that are used to ‘auto generate’ real build files.
- Proper package dependency support.
- A good developer experience.
- Co-existence of debug, release and other variants.
- Re-use of already built libs for other samples.
- Full source access to all used packages inside IDEs.
- Support for Windows, Ubuntu, Yocto and Android NDK.
- Ensure a similar source layout for all samples.
The common go-to solution for C++ projects these days would be CMake, but when we started this project it was not nearly as widely used, its build files were not as minimalistic as we would like, package dependencies were not handled as easily as we would have liked, it had no Android NDK support, and the sample layout would have had to be manually enforced.
As no existing system really fit, we made the controversial choice of creating our own minimalistic system. It uses a minimalistic build-meta-data file with no scripting capabilities (as we want everything to be built the same way). From the meta-data files plus the content of the 'include' and 'source' folders, we then generate the platform-dependent build files using various templates that can be changed in one location. This means that once the build-meta-data file has been created, it basically never has to change unless the source code gains new dependencies on other packages.
In the years we have been using this approach, we have not really touched any of the build-meta-data files (Fsl.gen) since they were initially created. Whenever we needed build file changes, we could instead update the template in one location and have it affect all samples right away.
Here is an example ‘Fsl.gen’ build-meta-data file for the GLES2.Blur sample app:
<?xml version="1.0" encoding="UTF-8"?>
<FslBuildGen xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="../../../FslBuildGen.xsd">
<Executable Name="GLES2.Blur" NoInclude="true">
<ImportTemplate Name="DemoAppGLES2"/>
<Dependency Name="EnvironmentMappingShared"/>
</Executable>
</FslBuildGen>
It basically specifies that this directory contains an executable package with no include directory, that it uses the 'DemoAppGLES2' template, and that it has a dependency on a package called 'EnvironmentMappingShared'.
Another example is the 'Fsl.gen' file for the FslGraphics package, which has had lots of files added over the years while its build file has remained untouched.
<?xml version="1.0" encoding="UTF-8"?>
<FslBuildGen xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="../../FslBuildGen.xsd">
<Library Name="FslGraphics">
<Dependency Name="FslBase"/>
</Library>
</FslBuildGen>
It specifies a 'library' (static library) with a dependency on 'FslBase'.
While CMake has improved a lot since we initially evaluated it, it would still require more manual work to keep the samples up to date than our current solution.
It's worth mentioning that it's entirely possible to generate 'CMakeLists.txt' files with this system; in fact, we do just that internally for the Android gradle + cmake build.
Operating System | Build system |
---|---|
Android | gradle + cmake (Android Studio can be used with the generated projects) |
Ubuntu | cmake (ninja) |
Windows | cmake (Visual Studio 2022 x64) |
Yocto | cmake (ninja) |
A cross-platform build-file generator whose main purpose is to keep all build files consistent, in sync and up to date. See the FslBuildGen document for details.
Extends the technology behind FslBuildGen with additional knowledge about how to execute the build system for a given platform. So basically, FslBuild works like this:
- Invoke the build-file generator that updates all build files if necessary.
- Filter the build requests based on the provided feature and extension list.
- Build all necessary build files in the correct order.
FslBuild comes with a few useful arguments:
Argument | Description |
---|---|
--ListRequirements | List the requirement tree and exit. |
--ListVariants | List all variants. |
--RequireFeatures | The list of features that are required for an executable package to be built. For example [OpenGLES2] to build all executables that use OpenGLES2. |
--UseFeatures | Allows you to limit what's built based on a provided feature list. For example [EGL,OpenGLES2]. This parameter defaults to all features. |
--UseExtensions | The list of available extensions to build for. For example [OpenGLES3.1:EXT_geometry_shader,OpenGLES3.1:EXT_tessellation_shader] to allow the OpenGLES3.1 extensions EXT_geometry_shader and EXT_tessellation_shader. You can also specify * for all extensions (default). |
--Variants | Define the variants you wish to build (if any). For Yocto, for example, you select the window system and build type using --Variants [config=Debug,WindowSystem=FB] |
--BuildTime | Time the build and print the result at the end. |
-t 'sdk' | Build all demo framework projects. |
-v | Set the verbosity level. |
-- | Arguments written after this are sent directly to the native build system. |
- Don’t modify the auto-generated files. The FslBuild scripts are responsible for creating all the build files for a platform and verifying dependencies. Since all build files are auto-generated, you can never modify them directly, as the next build will overwrite your changes. Instead, add your changes to the ‘Fsl.gen’ files, as they control the build file generation!
- The ‘Fsl.gen’ file is the real build file.
- All include and source files in the respective folders are automatically added to the build files.
This only runs the content builder part of the build process.
FslBuildContent.py
Build the current directories package content.
Generate a new project of the specified type. This is basically a project wizard that will prepare a new project directory with a basic template for the chosen project type.
FslBuildNew.py GLES2 FlyingPigsApp
Create the FlyingPigsApp sample app directory using the GLES2 template.
Package build environment checker. Based on what features the package uses, it will try to detect setup errors. It can also scan the source for common mistakes and check whether the proper License.json file is provided for screenshots.
FslBuildCheck.py
Check the current build environment to see if the package can be built.
FslBuildCheck.py --scan
Scan the current package to see if there are any common mistakes with, for example, include guards, tabs, etc.
A new work-in-progress tool that helps keep the README.md files similar and fills out various areas of the root README.md file.
FslBuildDoc.py
The following description of the demo application details uses a GLES2 demo named ‘S01_SimpleTriangle’ as an example. It lists the default methods that a demo should implement, the way it can provide customized parameters to the windowing system, and how asset management is made platform agnostic.
This is a list of the methods that every demo app is most likely to override.
// Init
S01_SimpleTriangle(const DemoAppConfig& config)
// Shutdown
~S01_SimpleTriangle()
// OPTIONAL: Custom resize logic (if the app requested it). The default logic is to
// restart the app.
void ConfigurationChanged(const DemoWindowMetrics& windowMetrics)
// OPTIONAL: Fixed time step update method that will be called the set number of times
// per second. The fixed time step update is often used for physics.
void FixedUpdate(const DemoTime& demoTime)
// OPTIONAL: Variable time step update method.
void Update(const DemoTime& demoTime)
// Put the rendering calls here
void Draw(const FrameInfo& frameInfo)
When the constructor is invoked, the Demo Host API will already be set up and ready for use; the demo framework will use EGL to configure things as requested by your EGL config and API version.
It is recommended that you do all your setup in the constructor.
This also means that you should never try to shut down EGL in the destructor, since the framework will do it at the appropriate time. The destructor should only worry about resources that your demo app actually allocated itself.
The ConfigurationChanged method will be called if the screen metrics changes.
FixedUpdate is a fixed time-step update method that will be called the set number of times per second. The fixed time-step update is often used for physics.
Update will be called once before every draw call, and you will normally update your animation using delta time. For example, if you need to move your object 10 units horizontally per second you would do something like:
_positionX += 10 * demoTime.DeltaTime;
Draw should be used to render graphics.
Depending on what your demo is doing, you might use one or the other, or both. It’s actually a very complex topic once you start to dig into it, but in general anything that needs precision and predictable/repeatable calculations, like physics, often benefits from using fixed time steps. It really depends on your algorithm, and it’s recommended to do a couple of searches on fixed vs variable time steps, since there are lots of arguments for both. It’s also worth noting that game engines like Unity3D support both methods.
The methods will be called in this order:
- Events (if any occurred)
- ConfigurationChanged
- FixedUpdate (0-N calls. The first frame will always have a FixedUpdate call)
- Update
- Draw

After the draw call, a swap will occur.
The framework supports loading files from the Content folder on all platforms.
Given a content folder like this:
Content/Texture1.bmp
Content/Stuff/Readme.txt
You can load the files via the IContentManager service, which can be accessed by calling:
std::shared_ptr<IContentManager> contentManager = GetContentManager();
You can then load files like this:
// *** Text file ***
// Read it directly into a new string
const std::string content = contentManager->ReadAllText("Stuff/Readme.txt");
// *** Binary file ***
// Read the content directly into a new vector
const std::vector<uint8_t> content = contentManager->ReadBytes("MyData.bin");
// Read the content into an existing vector
std::vector<uint8_t> content;
contentManager->ReadAllBytes(content, "MyData.bin");
// *** Bitmap file ***
// Read the content directly into a new bitmap
const Bitmap bitmap = contentManager->ReadBitmap("Texture1.bmp", PixelFormat::R8G8B8_UINT);
// Read the content into an existing bitmap object.
// Beware: the bitmap object will be resized and its format changed as needed, but some memory could be reused.
Bitmap bitmap;
contentManager->Read(bitmap, "Texture1.bmp", PixelFormat::R8G8B8_UINT);
// *** Texture file ***
// Read the content directly into a new texture
const Texture texture = contentManager->ReadTexture("Texture1.bmp", PixelFormat::R8G8B8_UINT);
// Read the content directly into an existing texture object.
// Beware: the texture object will be resized and its format changed as needed, but some memory could be reused.
Texture texture;
contentManager->Read(texture, "Texture1.bmp", PixelFormat::R8G8B8_UINT);
If you prefer to control the loading yourself, you can retrieve the path to the files like this:
IO::Path contentPath = contentManager->GetContentPath();
IO::Path myData = IO::Path::Combine(contentPath, "MyData.bin");
IO::Path readmePath = IO::Path::Combine(contentPath, "Stuff/Readme.txt");
IO::Path texture1Path = IO::Path::Combine(contentPath, "Texture1.bmp");
You can then open the files with any method you prefer. Both methods work for all supported platforms.
This is done in the S01_SimpleTriangle_Register.cpp file.
namespace Fsl
{
namespace
{
// Custom EGL config (these will by default overwrite the default settings; however, an exact EGL config can be used)
static const EGLint g_eglConfigAttribs[] =
{
EGL_SAMPLES, 0,
EGL_RED_SIZE, 8,
EGL_GREEN_SIZE, 8,
EGL_BLUE_SIZE, 8,
EGL_ALPHA_SIZE, 0, // buffers with the smallest alpha component size are preferred
EGL_DEPTH_SIZE, 24,
EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
EGL_NONE,
};
}
// Configure the demo environment to run this demo app in a OpenGLES2 host environment
void ConfigureDemoAppEnvironment(HostDemoAppSetup& rSetup)
{
DemoAppHostConfigEGL config(g_eglConfigAttribs);
DemoAppRegister::GLES2::Register<S01_SimpleTriangle>(rSetup, "GLES2.S01_SimpleTriangle", config);
}
}
Since the demo framework controls the main method, you need to register your application with the Demo Host specific registration call (in this case the OpenGL ES 2 host) so the framework can instantiate and run your demo class.
To register a demo for OpenGL ES 3.x you would use the GLES3 register method:
DemoAppRegister::GLES3::Register<S01_SimpleTriangle>(rSetup, "GLES3.S01_SimpleTriangle", config);
By default, the app is destroyed and recreated when a resolution change occurs. It is left up to the DemoApp to save and restore demo-specific state.
The demo app can request an exit, or it can be terminated via an external request. In both cases one of the following things occurs:
- If the app has been constructed and has received a FixedUpdate, it will finish its FixedUpdate, Update, Draw, swap sequence before it is shut down.
- If the app requests a shutdown during construction, the app will be destroyed before any other method is called on the object (and no swap will occur).
The app can request an exit to occur by calling:
GetDemoAppControl()->RequestExit(1);
All demos support various command line arguments. Use -h on a demo for a complete list.
Argument | Description |
---|---|
-h | Show the command line argument help. |
--Stats | Show a performance graph. |
--LogStats | Log various stats to the console. |
--Window | Run inside a window instead of fullscreen. Used like this: --Window [0,0,640,480]; the parameters specify (x,y,width,height). |
--ScreenshotFrequency | Create a screenshot at the given frame frequency. |
--ExitAfterFrame | Exit after the given number of frames has been rendered. |
--ContentMonitor | Monitor the Content directory for changes and restart the app on changes. WARNING: Might not work on all platforms and it might impact app performance (experimental). |
All apps support these keys by default but can override them if they choose to do so. Beware that some platforms might not support the given 'key' type, in which case the functionality is unsupported.
Key | Function |
---|---|
Escape | Exit the app. |
F4 | Take a screenshot (if supported by the test service). |
F5 | Restart the app. |
All samples support time stepping, which can be useful for debugging. It might not be available on platforms that don't support the given keys. Also beware that apps can override these keys if they choose to do so.
Key | Function |
---|---|
Pause | Pause the sample. |
PageDown | Move forward one timestep. |
Delete | Toggle between normal and slow 2x playback. |
End | Toggle between normal and slow 4x playback. |
Insert | Toggle between normal and fast 2x playback. |
Home | Toggle between normal and fast 4x playback. |
Shows how to create a work thread that returns a result and posts to a concurrent queue while working. The custom work thread is created using std::thread.
Shows how to create a work thread that returns a result and posts to a concurrent queue while working. The custom work thread is created using std::async.
Shows how to create a work thread that returns a result and posts to a concurrent queue while working. The custom work thread is created using a service framework Async service. This allows the work thread to access other framework services.
Shows how to create a freestyle console based application using the demoframework. This console app has access to the normal graphics libraries.
Shows how to create a freestyle minimal console based application using the demoframework. This minimal app does not pull in any of the graphics libraries.
An example of how to register custom app-specific services.
Shows how to use the Demo Framework's 'basic' 2D rendering capabilities that work across all backends. The Basic2D interface allows you to render ASCII strings using a system-provided font and draw colored points.
The functionality in Basic2D is used internally in the framework to render the profiling overlays like the frame rate counter and graph.
Shows how to use the G2D API blit functions to create a multiplane/multilayer landscape. The layers are rendered directly to the framebuffer without the need to use other APIs.
An example of how to create a bloom effect. The idea is not to create the most accurate bloom, but something that is fairly fast to render.
Instead of increasing the kernel size to get a good blur, we do a fairly fast approximation by downscaling the original image to multiple smaller render targets, blurring these using a relatively small kernel, and finally rescaling the result to the original size.
Showcases multiple ways to implement a gaussian blur.
- One pass blur
- Two pass blur: The 2D Gaussian filter kernel is separable. This allows us to produce the same output as a one pass algorithm by first applying an X-blur and then a Y-blur.
- Two pass linear blur: Uses the two pass technique and further reduces the bandwidth requirement by taking advantage of the GPU's linear texture filtering, which allows us to reduce the needed kernel length to roughly half while producing the same output as the full kernel.
- Two pass linear scaled blur: Uses the two pass linear technique and further reduces the bandwidth requirement by downscaling the 'source image' to 1/4 its size (1/2w x 1/2h) before applying the blur and then upscaling the blurred image to produce the final image. This works well for large kernel sizes and relatively high sigmas, but the downscaling produces visible artifacts with low sigmas.
This sample shows how to use OpenGL ES shaders to debayer an input video. Please check the Shader.frag file within the Content folder to see how the data is actually converted. The video data is obtained using GStreamer and, using the DirectVIVMap extension, mapped to a GPU buffer to be used as a texture by the debayering fragment shader.
This sample shows how to use GStreamer and OpenGL ES to display a YUV video on a texture by doing the YUV to RGB conversion in a shader, and also uses the DirectVIV extensions to avoid copying data from the video buffer to the GL texture.
Creates a simple parallax scrolling effect by blending eight 32 bit per pixel 1080p layers on top of each other. This is not the most optimal way to do it, as it uses eight passes, but it does provide a good example of the worst case bandwidth use for the operation.
The demo was created to compare GLES to the G2D eight blend blit functionality.
Can render both the Julia and Mandelbrot sets using a fragment shader. This demo was used to demonstrate GPU shader performance by using roughly 515 instructions to render each fragment while generating the Julia set.
It uses no textures, has no overdraw and has a minimal bandwidth requirement.
Use the command line arguments to select the scene and quality.
A simple example of dynamic line rendering using the LineBuilder helper class. The line builder has 'Add' methods for most FslBase.Math classes like BoundingBox, BoundingSphere, BoundingFrustrum, Ray, etc.
Demonstrates how to use the FslSceneImporter and Assimp to load a scene and render it using OpenGLES2.
The model is rendered using a simple per pixel directional light shader.
For a more complex example take a look at the ModelViewer example.
Expands the ModelLoaderBasics example with:
- An arcball camera
- Multiple different scenes (Knight, Dragon, Car, etc)
- More advanced shaders for directional per pixel specular light with support for gloss and normal maps.
Demonstrates how to use OpenCV from inside an OpenGL ES 2 project.
This is a very basic example that mainly shows how to set up the correct dependency in the Fsl.gen file; it then performs some very basic OpenCV operations. It could be used as a good starting point for a more complex example.
Shows how to render a single-colored triangle using OpenGL ES. This sample serves as a good introduction to the OpenGL ES 2 pipeline and the abstraction classes that the DemoFramework provides.
It's basically the typical 'Hello World' program for graphics.
Shows how to render a vertex-colored triangle using OpenGL ES. This demonstrates how to add more than vertex positions to the vertex attributes.
This is basically the same as the S01 example it just adds vertex colors to the shader.
Renders an animated vertex-colored triangle.
This shows how to modify the model matrix to rotate a triangle and how to utilize demoTime.DeltaTime to do frame rate independent animation.
This example shows how to:
- Build a perspective projection matrix
- Render two simple 3d models using frame rate independent animation.
Demonstrates how to use a pre-compiled shader using the offline compiler tool 'vCompiler' from Verisilicon.
This currently only works on the Yocto platform.
This example shows how to use the Texture class to use a texture in a cube.
It also shows you how to use the ContentManager service to load a 'png' file from the Content directory into a bitmap utility class, which is then used to create an OpenGL ES texture.
This sample shows how to use a cubemap texture to simulate a reflective material.
It also shows you how to use the ContentManager service to load a 'dds' file from the Content directory into a Texture utility class, which is then used to create an OpenGL ES cubemap texture.
This sample is a variation of the previous sample; again a cubemap texture is used, but this time a refractive material is simulated instead of a reflective one.
It also shows you how to use the ContentManager service to load a 'dds' file from the Content directory into a Texture utility class, which is then used to create an OpenGL ES cubemap texture.
This sample shows how to use the Verisilicon extensions to create a texture without having the need to copy the image data to GL.
Simple example of bitmap fonts vs SDF bitmap fonts. This example shows the worst case differences, as we only use one resolution for the bitmap font, meaning we often upscale the image, which gives the worst output. A proper bitmap font solution should have multiple font textures at various DPIs, select the one closest to the actual font rendering size, and preferably downscale the image instead of upscaling it.
It also showcases two simple SDF effects:
- Outline
- Shadow
Showcase the new stats services.
Executes a highly configurable stress test for the OpenGL ES API.
It will procedurally generate a mesh and fur texture that is then rendered to cover the entire screen.
This will often showcase the worst case power consumption of the GPU.
Load and render the supported compressed textures. It also outputs information about the compression support.
This example shows how to use the DirectVIV extension to use an existing buffer as a texture source without having to copy the data to GL.
Quick example that showcases how to mix rendering using the basic rendering API and the FslSimpleUI.
Development project for the IBasicRenderSystem.
Development project for custom shaders for the IBasicRenderSystem.
Development project for the Vulkan NativeTexture2D and DynamicNativeTexture2D implementation. Makes it easy to provoke certain NativeTexture2D/DynamicNativeTexture2D scenarios.
Shows how to use the Demo Framework's 'basic' 2D rendering capabilities that work across all backends. The Basic2D interface allows you to render ASCII strings using a system-provided font and draw colored points in batches.
The functionality in Basic2D is used internally in the framework to render the profiling overlays like the frame rate counter and graphs.
Shows how to use the Demo Framework's NativeBatch implementation to render various graphics elements. The native batch functionality works across various 3D backends and also allows you to use the API's native textures for rendering.
The native batch is very useful for quickly getting something on the screen, which can be useful for prototyping and debugging. It is, however, not an optimized way of rendering things.
Demonstrates how to receive various input events, logging information about them onscreen and to the log.
This can also be used to do some basic real time tests of the input system when porting the framework to a new platform.
Development project for on demand rendering that demonstrates how to implement it. It also has some basic 'janky timing' detection animations.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI data binding.
UI benchmark that can be used to benchmark various ways of rendering a UI. This allows you to see what works best on the given hardware.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI chart rendering.
Experimental declarative UI that uses the new data-binding capability to create a UI from an XML file.
This sample showcases a UI that is DPI aware vs one rendered using the standard pixel based method.
It also showcases various ways to render scaled strings and the errors that are easy to introduce.
This sample showcases some of the common scaling traps that can occur when trying to achieve pixel perfect rendering.
This sample tests the various internal UI rendering primitives.
A very basic example of how to utilize the DemoFramework's UI library. The sample displays four buttons and reacts to clicks.
The UI framework that makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a show case of how to render a UI fast but only intended to allow you to quickly get a UI ready that is good enough for a demo.
A more complex example of how to utilize the DemoFramework's UI library. It displays various UI controls and ways to utilize them.
The UI framework that makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a show case of how to render a UI fast but only intended to allow you to quickly get a UI ready that is good enough for a demo.
This sample showcases the difference between sub pixel accuracy vs pixel accuracy when scrolling.
Showcases all controls that are part of the Basic UI theme.
A simple example of multi texturing.
An example of how to create a bloom effect. The idea is not to create the most accurate bloom, but something that is fairly fast to render.
Instead of increasing the kernel size to get a good blur, we do a fairly fast approximation by downscaling the original image to multiple smaller render targets, blurring these using a relatively small kernel, and finally rescaling the result to the original size.
Showcases how to use the helios camera API. It captures an image from the camera and renders it using OpenGL ES 3.
Checks for the presence of known EGL color space extensions and outputs information about them to the console.
This sample introduces you to the use of Vertex Buffer Objects. Look for the OSTEP tags in the code; they list, in order, the steps needed to set up a VBO. The CHALLENGE tags list additional steps you can follow to create an additional VBO.
This sample is basically an exact copy of E1_1_VBOs, but it uses the DemoFramework utility classes to make the code simpler.
This sample introduces you to the use of Vertex Array Objects. They allow you to record your vertex state only once and then restore it by calling the glBindVertexArray function. As with the E* samples, look for the OSTEP tags to see how they are created and used.
This sample is basically an exact copy of E1_2_VAOs, but it uses the DemoFramework utility classes to make the code simpler.
Showcases how to use the helios camera API. It captures an image from the camera and renders it using the native batch.
This sample shows how to use GStreamer and OpenGL ES to display a YUV video on a texture by doing the YUV to RGB conversion in a shader, and also uses the DirectVIV extensions to avoid copying data from the video buffer to the GL texture.
This sample introduces you to the use of Vertex Buffer Objects. Look for the OSTEP tags in the code; they list, in order, the steps needed to set up a VBO. The CHALLENGE tags list additional steps you can follow to create an additional VBO.
To see a simpler version of this code that utilizes utility classes from the DemoFramework, take a look at the D1_1_VBOs example.
This sample introduces you to the use of Vertex Array Objects. They allow you to record your vertex state only once and then restore it by calling the glBindVertexArray function. As with the E* samples, look for the OSTEP tags to see how they are created and used.
To see a simpler version of this code that utilizes utility classes from the DemoFramework, take a look at the D1_2_VAOs example.
This sample teaches you how to use glCopyBufferSubData to copy data between two Vertex Buffer Objects. In this sample we copy a buffer for positions and a buffer for indices.
This sample introduces the concept of instancing, which is useful for rendering several meshes that share geometry. In this sample all cubes share the same vertex positions, while the colors and MVP matrices are independent.
This sample shows a new OpenGL ES 3.0 feature called primitive restart. It allows you to segment the geometry without needing to add degenerate triangles to the triangle index list.
This sample introduces multiple render targets (MRT). This feature allows you to define multiple outputs from your fragment shader. Check the fragment shader under the Content folder; you will notice how four outputs are defined.
This sample creates a particle system using GL_POINTS. It defines an initial point where the particles are tightly packed; during the explosion's lifetime each point is then animated from a starting position to an end position.
Converts an equirectangular map to a cubemap using OpenGL ES 3.
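The core of such a conversion is sampling the equirectangular image along the direction each cubemap texel faces. A minimal sketch of that direction-to-UV mapping (a hypothetical helper, not the sample's actual code; the real work happens in a GLSL shader using atan() and asin() the same way):

```cpp
#include <cassert>
#include <cmath>

struct Vec2
{
  float u;
  float v;
};

// Map a (not necessarily normalized) 3D direction to equirectangular
// UV coordinates in [0,1]: longitude maps to u, latitude maps to v.
inline Vec2 DirToEquirectUV(float x, float y, float z)
{
  const float pi = 3.14159265358979f;
  const float len = std::sqrt(x * x + y * y + z * z);
  x /= len;
  y /= len;
  z /= len;
  return Vec2{std::atan2(z, x) / (2.0f * pi) + 0.5f,  // longitude -> u
              std::asin(y) / pi + 0.5f};              // latitude  -> v
}
```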
Can render both the Julia and Mandelbrot sets. Was used to demonstrate GPU shader performance by using up to 515 instructions per fragment while generating the Julia set.
No texture and no overdraw, minimal bandwidth requirements.
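The per-fragment work is an escape-time loop. Here is a C++ sketch of the Mandelbrot iteration (the sample's actual shaders are GLSL and may differ in detail):

```cpp
#include <cassert>

// Escape-time iteration count for the Mandelbrot set: iterate
// z = z^2 + c until |z|^2 exceeds 4 or the iteration budget runs out.
inline int MandelbrotIterations(float cr, float ci, int maxIter)
{
  float zr = 0.0f;
  float zi = 0.0f;
  int i = 0;
  while (i < maxIter && (zr * zr + zi * zi) <= 4.0f)
  {
    const float t = zr * zr - zi * zi + cr;  // real part of z^2 + c
    zi = 2.0f * zr * zi + ci;                // imaginary part of z^2 + c
    zr = t;
    ++i;
  }
  return i;  // points that never escape hit maxIter (inside the set)
}
```

The Julia variant runs the same loop but starts z at the fragment position and keeps c fixed.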
Illustrates how to render fur over several primitives.
The fur is rendered using a layered approach, with a seamless texture as a base and a generated density bitmap.
A simple example of how to do gamma correction. It shows the difference that sRGB textures and gamma correction make to the output by comparing it to uncorrected rendering.
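The essence of the comparison is encoding linear light with the inverse display gamma before scan-out. A sketch using the common pure power-curve approximation (real sRGB adds a small linear toe segment, which this ignores):

```cpp
#include <cassert>
#include <cmath>

// Approximate sRGB with a pure gamma 2.2 power curve.
inline float LinearToGammaApprox(float linear)
{
  return std::pow(linear, 1.0f / 2.2f);  // encode for display
}

inline float GammaToLinearApprox(float encoded)
{
  return std::pow(encoded, 2.2f);  // decode back to linear light
}
```

Skipping the encode step is what makes the uncorrected side of the sample look too dark in the mid-tones.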
As normal framebuffer values are clamped between 0.0 and 1.0, any light value above 1.0 gets clamped, which makes it hard to differentiate really bright lights from normal ones. To take advantage of the light information that normally gets discarded, we use a tone mapping algorithm to try to preserve it. This demo applies the tone mapping right away in the lighting shader, so no temporary floating-point framebuffer is needed.
As normal framebuffer values are clamped between 0.0 and 1.0, any light value above 1.0 gets clamped, which makes it hard to differentiate really bright lights from normal ones. To take advantage of the light information that normally gets discarded, we use a tone mapping algorithm to try to preserve it. This demo applies the tone mapping as a postprocessing step on the fully lit scene, so a temporary floating-point framebuffer is needed.
This sample outputs to an LDR screen.
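A minimal example of the idea, using the simple Reinhard operator followed by a gamma encode (the demos may use other operators; this is just the shape of the computation):

```cpp
#include <cassert>
#include <cmath>

// Compress an HDR intensity into [0,1) with Reinhard tone mapping,
// then gamma encode the result for an LDR display.
inline float TonemapReinhard(float hdr, float gamma = 2.2f)
{
  const float ldr = hdr / (1.0f + hdr);  // maps [0, inf) into [0, 1)
  return std::pow(ldr, 1.0f / gamma);    // gamma encode
}
```

Because the operator is monotonic and never reaches 1.0, bright lights stay distinguishable instead of all clamping to white.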
Renders an HDR skybox and applies various tone mapping algorithms to it.
This sample outputs to an LDR screen.
Demonstrates how to enable HDRFramebuffer mode if available. It then renders a test scene using a pattern that makes it easy to detect whether the display actually enabled HDR mode.
This sample outputs to an HDR screen if supported.
A simple example of dynamic line rendering using the LineBuilder helper class. The line builder has 'Add' methods for most FslBase.Math classes like BoundingBox, BoundingSphere, BoundingFrustrum, Ray, etc.
Shows the use of instancing for rendering many copies of the same mesh.
Demonstrates how to use the FslSceneImporter and Assimp to load a scene and render it using OpenGLES2.
The model is rendered using a simple per pixel directional light shader.
For a more complex example take a look at the ModelViewer example.
Expands the ModelLoaderBasics example with:
- An arcball camera
- Multiple different scenes (Knight, Dragon, Car, etc)
- More advanced shaders for directional per pixel specular light with support for gloss and normal maps.
Demonstrates how to utilize multiple viewports. It reuses the fractal shaders from the FractalShader demo to render the Julia and Mandelbrot sets.
No texture and no overdraw, minimal bandwidth requirements.
Shows how to select (pick) 3d objects using the mouse via Axis Aligned Bounding Boxes (AABB).
Beware that AABBs often represent quite a rough fit and are therefore best used as a quick way to determine whether there might be a collision, followed by a more precise calculation to verify it.
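The quick rejection step can be done with the classic slab method; a sketch follows (the framework's own BoundingBox intersection code may be organized differently):

```cpp
#include <algorithm>
#include <cassert>

struct Ray3
{
  float ox, oy, oz;  // origin
  float dx, dy, dz;  // direction (need not be normalized)
};

// Slab-based ray vs axis-aligned bounding box test: intersect the ray's
// parameter interval with each axis slab and check the intervals overlap.
inline bool RayIntersectsAabb(const Ray3& r, float minX, float minY, float minZ,
                              float maxX, float maxY, float maxZ)
{
  const float o[3] = {r.ox, r.oy, r.oz};
  const float d[3] = {r.dx, r.dy, r.dz};
  const float bmin[3] = {minX, minY, minZ};
  const float bmax[3] = {maxX, maxY, maxZ};
  float tMin = 0.0f;
  float tMax = 1e30f;
  for (int i = 0; i < 3; ++i)
  {
    const float inv = 1.0f / d[i];  // IEEE infinity handles d[i] == 0
    float t0 = (bmin[i] - o[i]) * inv;
    float t1 = (bmax[i] - o[i]) * inv;
    if (t0 > t1)
      std::swap(t0, t1);
    tMin = std::max(tMin, t0);
    tMax = std::min(tMax, t1);
    if (tMin > tMax)
      return false;  // the slab intervals no longer overlap -> miss
  }
  return true;
}
```

For picking, the ray is built by unprojecting the mouse position through the inverse view-projection matrix before running this test against each object's AABB.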
Simple application that allows you to list your system's available OpenCL platforms.
Demonstrates how to use OpenCL from inside an OpenGL ES 3 project.
This is a very basic example that mainly shows how to set up the correct dependency in the Fsl.gen file and then performs some very basic OpenCL operations. It could be used as a good starting point for a more complex example.
This sample uses OpenCL to execute a Gaussian Blur on an image.
The output will then be stored into a bmp image and also displayed as an OpenGL ES 3.0 texture mapped to a cube.
Demonstrates how to use OpenCV from inside an OpenGL ES 3 project.
This is a very basic example that mainly shows how to set up the correct dependency in the Fsl.gen file and then performs some very basic OpenCV operations. It could be used as a good starting point for a more complex example.
Demonstrates how to take an OpenCV mat and convert it to a Bitmap, which is then converted to a Texture2D for use with the NativeBatch. The texture is then shown on screen and can be compared to the same texture loaded using the normal DemoFramework methods.
The cv::Mat -> Bitmap routines used here are a very basic proof of concept.
Demonstrates how to take an OpenCV mat and convert it to a Bitmap, which is then converted to a Texture2D for use with the UI framework. The texture is then shown on screen and can be compared to the same texture loaded using the normal DemoFramework methods.
The cv::Mat -> Bitmap routines used here are a very basic proof of concept.
Demonstrates how to process an image with OpenVX and then render it as a texture on the GPU.
Creates a configurable particle system where you can select the type of primitive each particle will use and the number of particles.
This sample covers how to use OpenGL ES 3.0 to render to a texture. It also shows how to use the DirectVIV extensions to:
- Map an existing buffer to be used as a texture and as an FBO. This scheme uses glTexDirectVIVMap, which creates a texture whose contents are backed by an already existing buffer. In this case we create the buffer using the g2d allocator.
- Create a texture and obtain a user-accessible pointer to modify it; this texture can also be used as an FBO. This scheme uses glTexDirectVIV, which creates a texture in memory but gives you access to its content buffer in GPU memory via a pointer, so you can write to it.
Shows how to render a single colored triangle using OpenGL ES. This sample serves as a good introduction to the OpenGL ES 3 pipeline and the abstraction classes that the DemoFramework provides.
It's basically the typical 'Hello World' program for graphics.
Shows how to render a vertex colored triangle using OpenGL ES. This demonstrates how to add more than just vertex positions to the vertex attributes.
This is basically the same as the S01 example it just adds vertex colors to the shader.
Renders an animated vertex colored triangle.
This shows how to modify the model matrix to rotate a triangle and how to utilize demoTime.DeltaTime to do frame rate independent animation.
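The pattern is to scale all animation by the elapsed time instead of assuming a fixed frame rate. A tiny sketch of the idea (the sample itself uses demoTime.DeltaTime; Spinner here is a hypothetical helper):

```cpp
#include <cassert>
#include <cmath>

// Frame-rate independent rotation: the angle advances by
// speed * deltaTime each frame, so 30 FPS and 60 FPS animate identically.
struct Spinner
{
  float AngleRadians = 0.0f;
  float SpeedRadiansPerSec = 1.0f;

  void Update(float deltaTimeSeconds)
  {
    AngleRadians = std::fmod(AngleRadians + SpeedRadiansPerSec * deltaTimeSeconds,
                             2.0f * 3.14159265f);
  }
};
```

The resulting angle feeds directly into the model matrix rotation each frame.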
This example shows how to:
- Build a perspective projection matrix
- Render two simple 3d models using frame rate independent animation.
Demonstrates how to use a pre-compiled shader built with the offline compiler tool 'vCompiler' from VeriSilicon.
This currently only works on the Yocto platform.
This example shows how to use the Texture class to apply a texture to a cube.
It also shows you how to use the ContentManager service to load a 'png' file from the Content directory into a bitmap utility class, which is then used to create an OpenGL ES texture.
This sample shows how to use a cubemap texture to simulate a reflective material.
It also shows you how to use the ContentManager service to load a 'dds' file from the Content directory into a Texture utility class, which is then used to create an OpenGL ES cubemap texture.
This sample is a variation of the previous one. Again a cubemap texture is used, but this time a refractive material is simulated instead of a reflective one.
It also shows you how to use the ContentManager service to load a 'dds' file from the Content directory into a Texture utility class, which is then used to create an OpenGL ES cubemap texture.
This sample shows how to use the VeriSilicon extensions to create a texture without needing to copy the image data to GL.
A simple example of how glScissor works.
This is showcased by rendering the insides of a rotating cube and using an animated scissor rectangle to clip.
Simple example of bitmap fonts vs SDF bitmap fonts. This example shows the worst case differences, as we only use one resolution for the bitmap font, meaning we often upscale the image, which gives the worst output. A proper bitmap font solution should have multiple font textures at various DPIs, select the one closest to the actual font rendering size, and preferably downscale the image instead of upscaling it.
It also showcases two simple SDF effects:
- Outline
- Shadow
Render a simple skybox using a cubemap.
A simple example of how a spatial hash grid works in 2D.
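The core trick is bucketing positions by integer cell coordinates so a neighborhood query only inspects the 3x3 block of cells around a point. A minimal sketch (illustrative only, not the sample's actual implementation):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Minimal 2D spatial hash grid: ids are stored per integer cell so a
// neighborhood query only visits the cells around the query point.
class SpatialHashGrid2D
{
  float m_cellSize;
  std::unordered_map<std::uint64_t, std::vector<int>> m_cells;

  static std::uint64_t Key(std::int32_t cx, std::int32_t cy)
  {
    // Pack both signed cell coordinates into one 64-bit key.
    return (static_cast<std::uint64_t>(static_cast<std::uint32_t>(cx)) << 32) |
           static_cast<std::uint32_t>(cy);
  }

  std::int32_t Cell(float v) const
  {
    return static_cast<std::int32_t>(std::floor(v / m_cellSize));
  }

public:
  explicit SpatialHashGrid2D(float cellSize)
    : m_cellSize(cellSize)
  {
  }

  void Add(int id, float x, float y)
  {
    m_cells[Key(Cell(x), Cell(y))].push_back(id);
  }

  // Gather all ids stored in the 3x3 block of cells around (x, y).
  std::vector<int> QueryNear(float x, float y) const
  {
    std::vector<int> result;
    const std::int32_t cx = Cell(x);
    const std::int32_t cy = Cell(y);
    for (std::int32_t dy = -1; dy <= 1; ++dy)
    {
      for (std::int32_t dx = -1; dx <= 1; ++dx)
      {
        auto it = m_cells.find(Key(cx + dx, cy + dy));
        if (it != m_cells.end())
          result.insert(result.end(), it->second.begin(), it->second.end());
      }
    }
    return result;
  }
};
```

With a cell size no smaller than the interaction radius, each query touches a constant number of cells instead of every object in the scene.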
Background test that showcases spline animations used for simulating a fluid background that can be stimulated by either the spheres or mouse/touch input.
It is then demonstrated how to render the grid using:
- The DemoFramework native batch (Basic or Catmull-Rom splines)
- Linestrips in a VertexBuffer (Catmull-Rom splines)
- Linestrips in a VertexBuffer but using the geometry shader to make quads (Catmull-Rom splines)
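All three rendering paths evaluate the same spline. The per-axis Catmull-Rom formula, interpolating between p1 and p2 using the neighbors p0 and p3, looks like this (a sketch; the demo's own spline helpers may differ):

```cpp
#include <cassert>
#include <cmath>

// Uniform Catmull-Rom interpolation for one axis: returns the point at
// parameter t in [0,1] between p1 and p2, with p0 and p3 as neighbors.
inline float CatmullRom(float p0, float p1, float p2, float p3, float t)
{
  const float t2 = t * t;
  const float t3 = t2 * t;
  return 0.5f * ((2.0f * p1) +
                 (-p0 + p2) * t +
                 (2.0f * p0 - 5.0f * p1 + 4.0f * p2 - p3) * t2 +
                 (-p0 + 3.0f * p1 - 3.0f * p2 + p3) * t3);
}
```

The curve passes exactly through p1 at t = 0 and p2 at t = 1, which is why it is convenient for animating through a grid of control points.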
Enables an SRGB framebuffer if the extension EGL_KHR_gl_colorspace is available. If unavailable, it does normal gamma correction in the shader.
Showcases the new stats services.
Executes a highly configurable stress test for the OpenGL ES API.
It will procedurally generate a mesh and fur texture that is then rendered to cover the entire screen.
This will often showcase the worst case power consumption of the GPU.
Simple tessellation sample that allows you to select the tessellation level to see how it modifies the level of detail on the selected geometry.
Shows how to load scenes via Assimp and then render them using
- a directional per pixel specular light with support for normal maps.
- tessellation using the geometry shader.
Load and render some ETC2 compressed textures. It also outputs information about the found compression extensions.
A very simple verlet integration example.
It's inspired by: Coding Math: Episode 36 - Verlet Integration Part I+IV
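The characteristic Verlet step stores no explicit velocity: it is implied by the difference between the current and previous position. A one-dimensional sketch of the integration step (the sample works in 2D, and constraint handling is omitted here):

```cpp
#include <cassert>

// Position-based Verlet integration for one axis: velocity is implicit
// in (x - oldX), so bouncing and constraints can be applied by simply
// moving positions.
struct VerletPoint
{
  float x;
  float oldX;

  void Step(float accel, float dt)
  {
    const float vx = x - oldX;  // implicit velocity * dt
    oldX = x;
    x += vx + accel * dt * dt;  // advance and apply acceleration
  }
};
```

Starting from rest with constant acceleration 1 and dt 1, the positions follow 0, 1, 3, 6, ... as expected for this scheme.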
Quick example that showcases how to mix rendering using the basic rendering API and the FslSimpleUI.
Development project for the IBasicRenderSystem.
Development project for custom shaders for the IBasicRenderSystem.
Development project for the Vulkan NativeTexture2D and DynamicNativeTexture2D implementation. Makes it easy to provoke certain NativeTexture2D/DynamicNativeTexture2D scenarios.
Shows how to use the Demo Framework's 'basic' 2D rendering capabilities that work across all backends. The Basic2D interface allows you to render ASCII strings using a system provided font and draw colored points.
The functionality in Basic2D is used internally in the framework to render the profiling overlays like the frame rate counter and graph.
Shows how to use the Demo Framework's NativeBatch implementation to render various graphics elements. The native batch functionality works across various 3D backends and also allows you to use the API's native textures for rendering.
The native batch is very useful for quickly getting something on the screen, which can be useful for prototyping and debugging. It is, however, not an optimized way of rendering things.
Application used to debug the gesture handling code.
Development project for on demand rendering and demonstrates how to implement it. It also has some basic 'janky timing' detection animations.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI data binding.
UI benchmark that can be used to benchmark various ways of rendering a UI. This allows you to see what works best on the given hardware.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI chart rendering.
Experimental declarative UI that uses the new data-binding capability to create the UI from an XML file.
This sample showcases a UI that is DPI aware vs one rendered using the standard pixel based method.
It also showcases various ways to render scaled strings and the errors that are easy to introduce.
Application used to debug the UI gesture handling code.
This sample showcases some of the common scaling traps that can occur when trying to achieve pixel perfect rendering.
This sample tests the various internal UI rendering primitives.
A very basic example of how to utilize the DemoFramework's UI library. The sample displays four buttons and reacts to clicks.
The UI framework makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a showcase of how to render a UI fast, but is only intended to let you quickly get a UI that is good enough for a demo.
A more complex example of how to utilize the DemoFramework's UI library. It displays various UI controls and ways to utilize them.
The UI framework makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a showcase of how to render a UI fast, but is only intended to let you quickly get a UI that is good enough for a demo.
This sample showcases the difference between sub pixel accuracy vs pixel accuracy when scrolling.
Showcases all controls that are part of the Basic UI theme.
OpenCL Kernel and code to execute a Colorseg --Options
"InputBmpFile" Input BMP file. "OutputHsvBmpFile" Output Hsv BMP file. "OutputRgbBmpFile" Output Rgb BMP file.
OpenCL Kernel and code to execute a Fast Fourier Transform --Options
"Length" FFT length.
OpenCL Kernel and code to execute a GaussianFilter --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file. "type" Select type: Gray, Rgb.
OpenCL Kernel and code to execute a Gray2Rgb --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
Simple OpenCL application that allows you to obtain your system's complete OpenCL information: details about CL kernel compilers, the number of buffers supported, available extensions and more.
OpenCL Kernel and code to execute a MedianFilter --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
OpenCL Kernel and code to execute a MorphoDilate --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
OpenCL Kernel and code to execute a Rgb2Gray --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
OpenCL Kernel and code to execute a Rgb2Hsv --Options
"InputBmpFile" Input BMP file.
OpenCL Kernel and code to execute a Rgb888toRgb565 --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
OpenCL Kernel and code to execute a Rgb888toUYVY --Options
"InputBmpFile" Input BMP file. "OutputRawFile" Output RAW file.
OpenCL Kernel and code to execute a SobelHFilter --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
OpenCL Kernel and code to execute a SobelVHFilter --Options
"InputBmpFile" Input BMP file. "OutputBmpFile" Output BMP file.
A software-based image signal processing (SoftISP) application optimized for the GPU. SoftISP --Options "Enable" Enable high quality noise reduction node
Simple and quick test for OpenCV 3.1. This test tries to read a couple of bitmaps, process them and show the result on the window.
Simple and quick test for OpenCV 3.1. This test tries to read a couple of bitmaps, process them and show the result on the window.
Shows how to render text using a bitmap font in OpenVG.
This sample shows how to use image data in OpenVG. You can think of it as an OpenGL texture on a quad. A vgImage is a rectangular shape populated with color information from an image. This sample also shows that you can transform those images just like the paths in the previous samples, and also apply special filters like Gaussian blur.
Shows how to draw lines using OpenVG. This sample introduces the concept of points and segments and how to integrate them into paths that describe the final shapes rendered on screen.
This sample builds on top of Example1 and shows how to add color to the path strokes and the path fill area by introducing the concept of "paints" in OpenVG.
This sample will introduce the transformation functions on OpenVG as well as the scissoring function. Each object will be animated using either vgTranslate, vgScale, vgShear or vgRotate. The scissoring rectangle will be set at the beginning of the Draw method.
Small benchmarking application for benchmarking various ways to render points in OpenVG.
Executes a configurable stress test for the OpenVG API.
This will often showcase the worst case power consumption.
Shows how to use the Demo Framework's 'basic' 2D rendering capabilities that work across all backends. The Basic2D interface allows you to render ASCII strings using a system provided font and draw colored points.
The functionality in Basic2D is used internally in the framework to render the profiling overlays like the frame rate counter and graph.
A software-based image signal processing (SoftISP) application optimized for the GPU. SoftISP --Options "Enable" Enable high quality noise reduction node
A stereo vision implementation based on a multi-resolution strategy running on the GPU. The GPU kernels are developed for the i.MX8 series using the extended vision instruction set (EVIS). Input images are taken by a fisheye camera, so they have some distortion.
An example of how to create a bloom effect. The idea is not to create the most accurate bloom, but something that is fairly fast to render.
Instead of increasing the kernel size to get a good blur, we make a fairly fast approximation by downscaling the original image to multiple smaller render targets, blurring these using a relatively small kernel, and finally rescaling the result to the original size.
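For the blur passes, a small normalized 1D Gaussian kernel is typically applied horizontally and then vertically on each downscaled target. A sketch of computing the weights (the demo's exact radius and sigma are assumptions here):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Build normalized 1D Gaussian weights for a separable blur pass.
// Normalizing the weights keeps the overall brightness unchanged.
inline std::vector<float> GaussianKernel1D(int radius, float sigma)
{
  std::vector<float> w(2 * radius + 1);
  float sum = 0.0f;
  for (int i = -radius; i <= radius; ++i)
  {
    w[i + radius] = std::exp(-(i * i) / (2.0f * sigma * sigma));
    sum += w[i + radius];
  }
  for (float& v : w)
    v /= sum;  // normalize so the weights sum to 1
  return w;
}
```

Because the Gaussian is separable, one horizontal and one vertical pass with these weights costs O(n) taps per pixel instead of O(n^2) for a full 2D kernel.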
Attraction based particle system. A shader storage buffer is used to store the particles, on which the compute shader does some physics calculations. The buffer is then used by the graphics pipeline for rendering with a gradient texture. Demonstrates the use of memory barriers for synchronizing vertex buffer access between a compute and graphics pipeline.
Based on an example called ComputeParticles by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Uses tessellation shaders to generate additional details and displace geometry based on a heightmap.
Based on an example called Displacement mapping by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Renders a terrain with dynamic tessellation based on screen space triangle size, resulting in closer parts of the terrain getting more details than distant parts. The terrain geometry is also generated by the tessellation shader using a 16 bit height map for displacement. To improve performance the example also does frustum culling in the tessellation shader.
Based on an example called Dynamic terrain tessellation by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Shows how to render to an offscreen texture. The texture is then used in the next pass, which applies a very simple 'water'-like effect.
Can render both the Julia and Mandelbrot sets. Was used to demonstrate GPU shader performance by using up to 515 instructions per fragment while generating the Julia set.
No texture and no overdraw, minimal bandwidth requirements.
Illustrates how to render fur over several primitives.
The fur is rendered using a layered approach, with a seamless texture as a base and a generated density bitmap.
A simple example of how to do gamma correction. It shows the difference that sRGB textures and gamma correction make to the output by comparing it to uncorrected rendering.
Vulkan interpretation of glxgears. Procedurally generates separate meshes for each gear, with every mesh having its own uniform buffer object for animation. Also demonstrates how to use different descriptor sets.
Based on an example called Vulkan Gears by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
A simple example that shows how to generate mipmaps at runtime.
Simple example that showcases the vkCmdWriteTimestamp functionality on supported devices.
As normal framebuffer values are clamped between 0.0 and 1.0, any light value above 1.0 gets clamped, which makes it hard to differentiate really bright lights from normal ones. To take advantage of the light information that normally gets discarded, we use a tone mapping algorithm to try to preserve it. This demo applies the tone mapping right away in the lighting shader, so no temporary floating-point framebuffer is needed.
As normal framebuffer values are clamped between 0.0 and 1.0, any light value above 1.0 gets clamped, which makes it hard to differentiate really bright lights from normal ones. To take advantage of the light information that normally gets discarded, we use a tone mapping algorithm to try to preserve it. This demo applies the tone mapping as a postprocessing step on the fully lit scene, so a temporary floating-point framebuffer is needed.
This sample outputs to an LDR screen.
Renders an HDR skybox and applies various tone mapping algorithms to it.
This sample outputs to an LDR screen.
Demonstrates how to enable HDRFramebuffer mode if available. It then renders a test scene using a pattern that makes it easy to detect whether the display actually enabled HDR mode.
This sample outputs to an HDR screen if supported.
A simple example of dynamic line rendering using the LineBuilder helper class. The line builder has 'Add' methods for most FslBase.Math classes like BoundingBox, BoundingSphere, BoundingFrustrum, Ray, etc.
Shows the use of instancing for rendering many copies of the same mesh using different attributes and textures. A secondary vertex buffer containing instanced data, stored in device local memory, is used to pass instance data to the shader via vertex attributes with a per-instance step rate. The instance data also contains a texture layer index for having different textures for the instanced meshes.
Based on an example called Mesh instancing by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Shows the use of instancing for rendering many copies of the same mesh.
Expands the ModelLoaderBasics example with:
- An arcball camera
- Multiple different scenes (Knight, Dragon, Car, etc)
- More advanced shaders for directional per pixel specular light with support for gloss and normal maps.
Demonstrates how to utilize multiple viewports. It reuses the fractal shaders from the FractalShader demo to render the Julia and Mandelbrot sets.
No texture and no overdraw, minimal bandwidth requirements.
Check that vkGetPhysicalDeviceSurfaceCapabilitiesKHR reports the expected values before and after swapchain creation for the given native window implementation.
Shows how to select (pick) 3d objects using the mouse via Axis Aligned Bounding Boxes (AABB).
Beware that AABBs often represent quite a rough fit and are therefore best used as a quick way to determine whether there might be a collision, followed by a more precise calculation to verify it.
Simple application that allows you to list your system's available Vulkan platforms.
Demonstrates how to use OpenCL from inside a Vulkan project.
This is a very basic example that mainly shows how to set up the correct dependency in the Fsl.gen file and then performs some very basic OpenCL operations. It could be used as a good starting point for a more complex example.
This sample uses OpenCL to execute a Gaussian Blur on an image.
The output will then be stored into a bmp image and also displayed as a Vulkan texture mapped to a cube.
Demonstrates how to use OpenCV from inside a Vulkan project.
This is a very basic example that mainly shows how to set up the correct dependency in the Fsl.gen file and then performs some very basic OpenCV operations. It could be used as a good starting point for a more complex example.
Demonstrates how to take an OpenCV mat and convert it to a Bitmap, which is then converted to a Texture2D for use with the NativeBatch. The texture is then shown on screen and can be compared to the same texture loaded using the normal DemoFramework methods.
The cv::Mat -> Bitmap routines used here are a very basic proof of concept.
Demonstrates how to take an OpenCV mat and convert it to a Bitmap, which is then converted to a Texture2D for use with the UI framework. The texture is then shown on screen and can be compared to the same texture loaded using the normal DemoFramework methods.
The cv::Mat -> Bitmap routines used here are a very basic proof of concept.
Demonstrates how to process an image with OpenVX and then render it as a texture on the GPU.
A simple example of how scissoring works in Vulkan.
This is showcased by rendering the insides of a rotating cube and using an animated scissor rectangle to clip.
Shows how to take a screenshot from code.
Simple example of bitmap fonts vs SDF bitmap fonts. This example shows the worst case differences, as we only use one resolution for the bitmap font, meaning we often upscale the image, which gives the worst output. A proper bitmap font solution should have multiple font textures at various DPIs, select the one closest to the actual font rendering size, and preferably downscale the image instead of upscaling it.
It also showcases two simple SDF effects:
- Outline
- Shadow
Simple example that showcases the device extension VK_KHR_shader_clock.
Render a simple skybox using a cubemap.
A simple example of how a spatial hash grid works in 2D.
Enables an SRGB framebuffer if it is available. If unavailable, it does normal gamma correction in the shader.
Showcases the new stats services.
Executes a highly configurable stress test for the Vulkan API.
It will procedurally generate a mesh and fur texture that is then rendered to cover the entire screen.
This will often showcase the worst case power consumption of the GPU.
Generating curved PN-Triangles on the GPU using tessellation shaders to add details to low-polygon meshes, based on this paper, with shaders from this tutorial.
Based on an example called PN-Triangles by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Load and render the supported compressed textures. It also outputs information about the compression support.
Shows how to upload a 2D texture into video memory for sampling in a shader. Loads a compressed texture into a host visible staging buffer and copies all mip levels to a device local optimal tiled image for best performance.
Also demonstrates the use of combined image samplers. Samplers are detached from the actual texture image and only contain information on how an image is sampled in the shader.
Based on an example called Texture mapping by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Texture arrays allow storing of multiple images in different layers without any interpolation between the layers. This example demonstrates the use of a 2D texture array with instanced rendering. Each instance samples from a different layer of the texture array.
Based on an example called Texture arrays by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Building on the basic texture loading example, a cubemap texture is loaded into a staging buffer and copied over to a device local optimal image using buffer-to-image copies for all of its faces and mipmaps.
The demo then uses two different pipelines (and shader sets) to display the cubemap as a skybox (background) and as a source for reflections.
Based on an example called Cube maps by Sascha Willems. Recreated as a DemoFramework freestyle window sample in 2016.
Most basic example. Renders a colored triangle using an indexed vertex buffer. Vertex and index data are uploaded to device local memory using so-called "staging buffers". Uses a single pipeline with basic shaders loaded from SPIR-V and a single uniform block for passing matrices, which is updated when the view changes.
This example is far more explicit than the other examples and is meant to be a starting point for learning Vulkan from the ground up. Much of the code is boilerplate that you'd usually encapsulate in helper functions and classes (which is what the other examples do).
Renders a red triangle.
Calculating and drawing of the Mandelbrot set using the core Vulkan API.
Based on a sample by Norbert Nopper from the VKTS Examples (VKTS_Sample08). Recreated as a DemoFramework freestyle console sample in 2016.
Command-line tool to dump Vulkan system information to the console.
This is an easy way to quickly query the hardware capabilities as reported by Vulkan.
Quick example that showcases how to mix rendering using the basic rendering API and the FslSimpleUI.
An example of how to register custom app specific services.
Development project for the IBasicRenderSystem.
Development project for custom shaders for the IBasicRenderSystem.
Development project for the basic quad batch implementation that is used to implement the native batch for Vulkan. The NativeBatch implementation is what allows the UI library to work with Vulkan.
Development project for the Vulkan NativeTexture2D and DynamicNativeTexture2D implementation. Makes it easy to provoke certain NativeTexture2D/DynamicNativeTexture2D scenarios.
Shows how to use the Demo Framework's 'basic' 2D rendering capabilities that work across all backends. The Basic2D interface allows you to render ASCII strings using a system provided font and draw colored points.
The functionality in Basic2D is used internally in the framework to render the profiling overlays like the frame rate counter and graph.
Shows how to use the Demo Framework's NativeBatch implementation to render various graphics elements. The native batch functionality works across various 3D backends and also allows you to use the API's native textures for rendering.
The native batch is very useful for quickly getting something on the screen, which can be useful for prototyping and debugging. It is, however, not an optimized way of rendering things.
Visualize the supported easing functions.
Application used to debug the gesture handling code.
Demonstrates how to receive various input events and logs information about them onscreen and to the log.
This can also be used to do some basic real time tests of the input system when porting the framework to a new platform.
Development project for on demand rendering and demonstrates how to implement it. It also has some basic 'janky timing' detection animations.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI data binding.
UI benchmark that can be used to benchmark various ways of rendering a UI. This allows you to see what works best on the given hardware.
This application has been designed for a 1920x1080dp screen and will provide a sub-optimal experience for resolutions lower than that.
Simple example of UI chart rendering.
Experimental declarative UI that uses the new data-binding capability to create the UI from an XML file.
This sample showcases a UI that is DPI aware vs one rendered using the standard pixel based method.
It also showcases various ways to render scaled strings and the errors that are easy to introduce.
Application used to debug the UI gesture handling code.
This sample showcases some of the common scaling traps that can occur when trying to achieve pixel perfect rendering.
This sample tests the various internal UI rendering primitives.
A very basic example of how to utilize the DemoFramework's UI library. The sample displays four buttons and reacts to clicks.
The UI framework makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a showcase of how to render a UI fast, but is only intended to let you quickly get a UI that is good enough for a demo.
A more complex example of how to utilize the DemoFramework's UI library. It displays various UI controls and ways to utilize them.
The UI framework makes it easy to get a basic UI up and running. The main UI code is API independent. It is not a showcase of how to render a UI fast, but is only intended to let you quickly get a UI that is good enough for a demo.
This sample showcases the difference between sub pixel accuracy vs pixel accuracy when scrolling.
Showcases all controls that are part of the Basic UI theme.
Demonstrates how to use the Freestyle window project type to create a window for use with Vulkan.
Then renders a simple triangle to it.
The triangle rendering code is based on a sample by Norbert Nopper from the VKTS Examples (VKTS_Sample02). Recreated as a DemoFramework freestyle window sample in 2016.
Just shows how to create a native window using the FslNativeWindow library.
This can be used to develop support for a new platform.
Demonstrates how to receive various input events and logs information about them to the log.
This can also be used to do some basic real time tests of the input system when porting the framework to a new platform.