|
|
|
|
|
====================================== |
|
|
|
|
|
|
|
Q: How can I tell whether to dispatch mouse/keyboard to imgui or to my application? |
|
|
|
A: You can read the 'io.WantCaptureMouse', 'io.WantCaptureKeyboard' and 'io.WantTextInput' flags from the ImGuiIO structure (e.g. if (ImGui::GetIO().WantCaptureMouse) { ... } ) |
|
|
|
- When 'io.WantCaptureMouse' is set, imgui wants to use your mouse state, and you may want to discard/hide the inputs from the rest of your application. |
|
|
|
- When 'io.WantCaptureKeyboard' is set, imgui wants to use your keyboard state, and you may want to discard/hide the inputs from the rest of your application. |
|
|
|
- When 'io.WantTextInput' is set, you may want to notify your OS to pop up an on-screen keyboard, if available (e.g. on a mobile phone, or console OS).
|
|
|
|
|
were targeted for Dear ImGui, e.g. with an array of bool, and filter out the corresponding key-ups.) |
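For example, a minimal dispatch sketch with a hypothetical event loop (PollNextEvent(), MyEvent and MyApp_HandleEvent() are placeholder names, not part of Dear ImGui):

ImGuiIO& io = ImGui::GetIO();
while (MyEvent* e = PollNextEvent())
{
    if (e->IsMouseEvent() && io.WantCaptureMouse)
        continue;                                       // imgui is using the mouse: hide the event from the application
    if (e->IsKeyboardEvent() && io.WantCaptureKeyboard)
        continue;                                       // imgui is using the keyboard: hide the event from the application
    MyApp_HandleEvent(e);                               // forward everything else to the application
}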
|
|
|
|
|
|
|
Q: How can I display an image? What is ImTextureID, how does it work?
|
|
|
A: Short explanation: |
|
|
|
- You may use functions such as ImGui::Image(), ImGui::ImageButton() or lower-level ImDrawList::AddImage() to emit draw calls that will use your own textures. |
|
|
|
- Actual textures are identified in a way that is up to the user/engine. |
|
|
|
- Loading image files from the disk and turning them into a texture is not within the scope of Dear ImGui (for a good reason). |
|
|
|
Please read documentation or tutorials on your graphics API to understand how to display textures on the screen before moving onward.
|
|
|
|
|
|
|
Long explanation: |
|
|
|
- Dear ImGui's job is to create "meshes", defined in a renderer-agnostic format made of draw commands and vertices. |
|
|
|
At the end of the frame those meshes (ImDrawList) will be displayed by your rendering function. They are made up of textured polygons and the code
to render them is generally fairly short (a few dozen lines); a minimal sketch of such a rendering function is given after this list.
In the examples/ folder we provide functions for popular graphics APIs (OpenGL, DirectX, etc.).
|
|
|
- Each rendering function decides on a data type to represent "textures". The concept of what is a "texture" is entirely tied to your underlying engine/graphics API. |
|
|
|
We carry the information to identify a "texture" in the ImTextureID type. |
|
|
|
ImTextureID is nothing more than a void*, aka 4/8 bytes worth of data: just enough to store 1 pointer or 1 integer of your choice.
Dear ImGui doesn't know or understand what you are storing in ImTextureID, it merely passes ImTextureID values around until they reach your rendering function.
|
|
|
- In the examples/ bindings, for each graphics API we decided on a type that is likely to be a good representation for specifying
an image from the end-user perspective. This is what the _examples_ rendering functions are using:
|
|
|
|
|
|
|
OpenGL: ImTextureID = GLuint (see ImGui_ImplGlfwGL3_RenderDrawData() function in imgui_impl_glfw_gl3.cpp) |
|
|
|
DirectX9: ImTextureID = LPDIRECT3DTEXTURE9 (see ImGui_ImplDX9_RenderDrawData() function in imgui_impl_dx9.cpp) |
|
|
|
DirectX11: ImTextureID = ID3D11ShaderResourceView* (see ImGui_ImplDX11_RenderDrawData() function in imgui_impl_dx11.cpp) |
|
|
|
DirectX12: ImTextureID = D3D12_GPU_DESCRIPTOR_HANDLE (see ImGui_ImplDX12_RenderDrawData() function in imgui_impl_dx12.cpp) |
|
|
|
|
|
|
|
For example, in the OpenGL example binding we store raw OpenGL texture identifier (GLuint) inside ImTextureID. |
|
|
|
Whereas in the DirectX11 example binding we store a pointer to ID3D11ShaderResourceView inside ImTextureID, which is a higher-level structure |
|
|
|
tying together both the texture and information about its format and how to read it. |
|
|
|
- If you have a custom engine built over e.g. OpenGL, instead of passing GLuint around you may decide to use a high-level data type to carry information about |
|
|
|
the texture as well as how to display it (shaders, etc.). The decision of what to use as ImTextureID is best made with knowledge of how your codebase
is designed. If your engine has high-level data types for "textures" and "materials" then you may want to use them.
|
|
|
If you are starting with OpenGL or DirectX or Vulkan and haven't built much of a rendering engine over them, keeping the default ImTextureID |
|
|
|
representation suggested by the example bindings is probably the best choice. |
|
|
|
(Advanced users may also decide to keep a low-level type in ImTextureID, and use ImDrawList callbacks to pass extra information to their renderer; see the sketch below.)
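As an illustration, here is a minimal sketch of such a rendering function consuming ImDrawData (MyEngineUploadBuffers(), MyEngineBindTexture(), MyEngineSetScissor() and MyEngineDrawIndexedTriangles() are hypothetical engine calls, not part of Dear ImGui):

void MyImGuiRenderFunction(ImDrawData* draw_data)
{
    for (int n = 0; n < draw_data->CmdListsCount; n++)
    {
        const ImDrawList* cmd_list = draw_data->CmdLists[n];
        MyEngineUploadBuffers(cmd_list->VtxBuffer.Data, cmd_list->VtxBuffer.Size, cmd_list->IdxBuffer.Data, cmd_list->IdxBuffer.Size);
        const ImDrawIdx* idx_offset = cmd_list->IdxBuffer.Data;
        for (int cmd_i = 0; cmd_i < cmd_list->CmdBuffer.Size; cmd_i++)
        {
            const ImDrawCmd* pcmd = &cmd_list->CmdBuffer[cmd_i];
            if (pcmd->UserCallback)
                pcmd->UserCallback(cmd_list, pcmd);               // callback registered with ImDrawList::AddCallback()
            else
            {
                MyEngineBindTexture((MyTexture*)pcmd->TextureId); // cast the void* back to whatever you stored in it
                MyEngineSetScissor(pcmd->ClipRect);               // apply pcmd->ClipRect as the scissor rectangle
                MyEngineDrawIndexedTriangles(pcmd->ElemCount, idx_offset);
            }
            idx_offset += pcmd->ElemCount;
        }
    }
}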
|
|
|
|
|
|
|
User code may do: |
|
|
|
|
|
|
|
// Cast our texture type to ImTextureID / void*
|
|
|
|
MyTexture* texture = g_CoffeeTableTexture; |
|
|
|
ImGui::Image((void*)texture, ImVec2(texture->Width, texture->Height)); |
|
|
|
|
|
|
|
The renderer function called after ImGui::Render() will receive the same value that the user code passed:
|
|
|
|
|
|
|
// Cast ImTextureID / void* stored in the draw command as our texture type
|
|
|
|
MyTexture* texture = (MyTexture*)pcmd->TextureId; |
|
|
|
MyEngineBindTexture2D(texture); |
|
|
|
|
|
|
|
Once you understand this design, you will see that loading image files and turning them into displayable textures is not within the scope of Dear ImGui.
|
|
|
This is by design and is actually a good thing, because it means your code has full control over your data types and how you display them. |
|
|
|
If you want to display an image file (e.g. PNG file) on the screen, please refer to documentation and tutorials for the graphics API you are using.
|
|
|
|
|
|
|
Here's a simplified OpenGL example using stb_image.h: |
|
|
|
|
|
|
|
// Use stb_image.h to load a PNG from disk and turn it into raw RGBA pixel data:
|
|
|
|
#define STB_IMAGE_IMPLEMENTATION |
|
|
|
#include <stb_image.h> |
|
|
|
[...] |
|
|
|
int my_image_width, my_image_height; |
|
|
|
unsigned char* my_image_data = stbi_load("my_image.png", &my_image_width, &my_image_height, NULL, 4); |
|
|
|
|
|
|
|
// Turn the RGBA pixel data into an OpenGL texture:
|
|
|
|
GLuint my_opengl_texture;
glGenTextures(1, &my_opengl_texture);
glBindTexture(GL_TEXTURE_2D, my_opengl_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // required: the default minification filter uses mipmaps, which we don't generate here
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, my_image_width, my_image_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, my_image_data);
|
|
|
|
|
|
|
// Now that we have an OpenGL texture, assuming our imgui rendering function (imgui_impl_xxx.cpp file) takes GLuint as ImTextureID, we can display it:
|
|
|
|
ImGui::Image((void*)(intptr_t)my_opengl_texture, ImVec2(my_image_width, my_image_height)); |
|
|
|
|
|
|
|
C/C++ tip: a void* is pointer-sized storage. You may safely store any pointer or integer into it by casting your value to ImTextureID / void*, and vice-versa.

Because both end-points (user code and rendering function) are under your control, you know exactly what is stored inside the ImTextureID / void*.
|
|
|
Examples: |
|
|
|
|
|
|
|
GLuint my_tex = XXX; |
|
|
|
void* my_void_ptr; |
|
|
|
my_void_ptr = (void*)(intptr_t)my_tex; // cast a GLuint into a void* (we don't take its address! we literally store the value inside the pointer)
|
|
|
|
my_tex = (GLuint)(intptr_t)my_void_ptr; // cast a void* into a GLuint
|
|
|
|
|
|
|
|
ID3D11ShaderResourceView* my_dx11_srv = XXX; |
|
|
|
void* my_void_ptr; |
|
|
|
my_void_ptr = (void*)my_dx11_srv; // cast a ID3D11ShaderResourceView* into an opaque void*
|
|
|
|
my_dx11_srv = (ID3D11ShaderResourceView*)my_void_ptr; // cast a void* into a ID3D11ShaderResourceView*
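(Note: D3D12_GPU_DESCRIPTOR_HANDLE from the DirectX12 binding is a struct wrapping a UINT64 rather than a pointer, so you would store its .ptr member instead; a sketch, assuming a 64-bit build:)

D3D12_GPU_DESCRIPTOR_HANDLE my_dx12_handle = XXX;
void* my_void_ptr;
my_void_ptr = (void*)my_dx12_handle.ptr;          // store the 64-bit descriptor value inside the pointer
my_dx12_handle.ptr = (UINT64)my_void_ptr;         // read the value back from the void*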
|
|
|
|
|
|
|
|
Finally, you may call ImGui::ShowMetricsWindow() to explore/visualize/understand how the ImDrawList contents are generated.
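For example:

static bool show_metrics = true;
if (show_metrics)
    ImGui::ShowMetricsWindow(&show_metrics);      // browse draw lists, vertices/indices, window state etc.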
|
|
|
|
|
|
|
Q: How can I have multiple widgets with the same label or without a label? |
|
|
|
Q: I have multiple widgets with the same label, and only the first one works. Why is that? |
|
|
|