[Edit] 13 Jan 2011: After Windows Update KB2454826, this hack stopped working. I have patched the sample to make it work again. Of course, you shouldn't consider this hack for any kind of production use. Use the standard DXGI shared sync keyed mutex instead. This hack is just for fun! [/Edit]
If you know Direct3D 11 and Direct2D - they were released almost at the same time - you already know that there is a huge drawback to using Direct2D: it only works with the Direct3D 10.1 API (although it does run on older hardware thanks to the API's new feature-level capability).
From a developer's point of view, it is really disappointing that such a good API doesn't rely on the latest Direct3D API, especially when you know that the Direct3D 11 API is really close to Direct3D 10.1. In the end, more work is required from a developer who wants to use Direct3D 11: for example, it no longer has any text API, meaning that in D3D11 you have to do it yourself. That isn't a huge task in itself if you go the easy route of a precalculated font texture generated by some GDI+ calls or whatever, but it is still annoying, especially when you just need to display some information/FPS on the screen and can't wait to build a nice font-texture-based system...
I'm not being completely fair about Direct2D interoperability with Direct3D 11: there is in fact a well-known solution proposed by a member of the DirectX team, which involves using a DXGI keyed mutex to synchronize a surface shared between D3D10.1 and D3D11. I was expecting this issue to be solved in some DirectX SDK release this year, but it seems there is no plan to release an update for Direct2D in the near future (see my question in the comments and the answer)... WP7 and XNA are probably getting much more attention here...
So last week, I took some time to look at the Direct2D API and found that it's in fact fairly easy to hack Direct2D and redirect all the D3D10.1 API calls to a real Direct3D 11 instance... and that is pretty cool news! Here is the story of this little hack.
How does Direct2D access your already instantiated D3D10.1 device?
In order to use Direct2D with a renderable D3D10 Texture2D, you need to query the IDXGISurface from your ID3D10Texture2D object, something like this:
IDXGISurface* surface;
// Create a Texture2D (or use the SwapChain backbuffer)
d3d10Device->CreateTexture2D(&texture2DDesc, 0, &texture2D);
// Query the DXGI surface associated with the D3D10.1 Texture2D
texture2D->QueryInterface(__uuidof(IDXGISurface), (void**)&surface);
// Create a D2D render target from the D3D10 Texture2D through the associated DXGI surface
d2dFactory->CreateDxgiSurfaceRenderTarget(
    surface,
    &props,
    &d2dRenderTarget
);
So, starting from this CreateDxgiSurfaceRenderTarget call, Direct2D is somehow able to get back your D3D10.1 instance and use it to submit draw calls, create textures, etc. In order to find out how Direct2D obtains an instance of ID3D10Device1, I first implemented a proxy IDXGISurface that embedded the real DXGI surface and delegated all calls to it... while tracking down how Direct2D gets back this ID3D10Device1:
- After the surface enters CreateDxgiSurfaceRenderTarget, Direct2D queries the IDXGIDevice through the GetDevice method on the IDXGISurface
- From the IDXGIDevice, Direct2D calls QueryInterface with the IID of the ID3D10Device interface (surprisingly, not ID3D10Device1)
Interoperability between D3D10.1 and D3D11 API
Migrating from the D3D10/D3D10.1 API to D3D11 is quite straightforward and even has a dedicated paper on MSDN. For the purpose of this quick hack, I didn't implement proxies for the whole D3D10 API... Instead, I focused on how D2D uses the D3D10 API and on which of the methods/structures actually used are not binary compatible between D3D10 and D3D11.
In the end, I developed 5 proxies:
- a Proxy for IDXGISurface interface, in order to hack the GetDevice method and return my own proxy for IDXGIDevice
- a Proxy for IDXGIDevice interface in order to hack the QueryInterface method and return my own proxy for ID3D10Device1
- a Proxy for the ID3D10Device1 interface
- a Proxy for the ID3D10Texture2D interface
- a Proxy for the ID3D10Buffer interface
For example, here is the proxy implementation of the VSGetShader method, a plain redirection to the D3D11 device context:
virtual void STDMETHODCALLTYPE VSGetShader(
    /* [annotation] */
    __out ID3D10VertexShader **ppVertexShader) {
    // Forward directly to the D3D11 context; class instances are not used here
    context->VSGetShader((ID3D11VertexShader**)ppVertexShader, 0, 0);
}
A real proxy would have to wrap the ID3D11VertexShader inside an ID3D10VertexShader proxy... but because Direct2D (and this is no surprise) only uses VSGetShader in order to later call VSSetShader (to restore saved states, or to set its own vertex/pixel shaders), it never calls any method on the ID3D10VertexShader instance... meaning that we can hand back an ID3D11VertexShader directly, without performing any - costly - conversion.
In fact, most of the ID3D10Device1 proxy methods are like the previous one: a simple redirection to the D3D11 device or device context... easy!
I was only forced to implement custom proxies for some incompatible structures... or for returned object instances that are effectively used by Direct2D (like ID3D10Buffer and ID3D10Texture2D).
For example, the ID3D10Device::CreateBuffer proxy method is implemented like this:
virtual HRESULT STDMETHODCALLTYPE CreateBuffer(
    /* [annotation] */
    __in const D3D10_BUFFER_DESC *pDesc,
    /* [annotation] */
    __in_opt const D3D10_SUBRESOURCE_DATA *pInitialData,
    /* [annotation] */
    __out_opt ID3D10Buffer **ppBuffer) {
    // D3D11_BUFFER_DESC starts with the same fields as D3D10_BUFFER_DESC
    D3D11_BUFFER_DESC desc11;
    *((D3D10_BUFFER_DESC*)&desc11) = *pDesc;
    // The StructureByteStride field is new in D3D11
    desc11.StructureByteStride = 0;
    // Return our ID3D10Buffer proxy instead of the real one
    ProxyID3D10Buffer* buffer = new ProxyID3D10Buffer();
    buffer->device = this;
    *ppBuffer = buffer;
    HRESULT result = device()->CreateBuffer(&desc11, (D3D11_SUBRESOURCE_DATA*)pInitialData, (ID3D11Buffer**)&buffer->backend);
    CHECK_RETURN(result);
    return result;
}
There were also a few problems with 2 incompatible structures: D3D10_VIEWPORT/D3D11_VIEWPORT (D3D11 uses floats instead of ints!) and D3D10_BLEND_DESC/D3D11_BLEND_DESC... but the proxy methods were easy to implement:
virtual void STDMETHODCALLTYPE RSSetViewports(
    /* [annotation] */
    __in_range(0, D3D10_VIEWPORT_AND_SCISSORRECT_OBJECT_COUNT_PER_PIPELINE) UINT NumViewports,
    /* [annotation] */
    __in_ecount_opt(NumViewports) const D3D10_VIEWPORT *pViewports) {
    // Perform the conversion between D3D10_VIEWPORT (ints) and D3D11_VIEWPORT (floats)
    D3D11_VIEWPORT viewports[16];
    for (UINT i = 0; i < NumViewports; i++) {
        viewports[i].TopLeftX = (FLOAT)pViewports[i].TopLeftX;
        viewports[i].TopLeftY = (FLOAT)pViewports[i].TopLeftY;
        viewports[i].Width = (FLOAT)pViewports[i].Width;
        viewports[i].Height = (FLOAT)pViewports[i].Height;
        viewports[i].MinDepth = pViewports[i].MinDepth;
        viewports[i].MaxDepth = pViewports[i].MaxDepth;
    }
    context->RSSetViewports(NumViewports, viewports);
}
Even though I haven't performed any performance measurements, the cost of these proxy methods should be almost unnoticeable... and probably much more lightweight than using mutex synchronization between the D3D10 and D3D11 devices!
Plug-in the proxies
In the end, I managed to put those proxies in a single .h/.cpp with an easy API to plug the proxy in. The call sequence before passing the DXGISurface to Direct2D then looks like this:
d3d11Device->CreateTexture2D(&offlineTextureDesc, 0, &texture2D);
// Create a proxy DXGISurface compatible with Direct2D from the D3D11 Texture2D
IDXGISurface* surface = Code4kCreateD3D10CompatibleSurface(d3d11Device, d3d11DeviceContext, texture2D);
d2dFactory->CreateDxgiSurfaceRenderTarget(
    surface,
    &props,
    &d2dRenderTarget
);
And that's all! You will find attached a project with the sources. Feel free to test it and let me know if you encounter any issues with it. Also, the code is far from being 100% safe/robust... it's a quick hack. For example, I have not checked carefully that my proxies behave well with AddRef/Release... but that should be fine.
So far, it seems to work well with the whole Direct2D API... I have even been able to use DirectWrite with Direct2D... on top of Direct3D 11, without any problem. There is only one issue: PIX won't be able to debug Direct2D over Direct3D 11... because it seems that Direct2D performs some additional method calls (D3D10CreateStateBlock) that are incompatible with the lightweight proxies I have developed... To be fully supported, it would be necessary to implement proxies for all the interfaces returned by ID3D10Device1... but this is such a laborious task that, by the time it was done, we could expect to have Direct2D fully working with Direct3D 11, provided by the DirectX team itself!
Also, from this little experience, I can safely say that it shouldn't take more than one day for someone on the Direct2D team to patch the existing Direct2D code to use Direct3D 11... as it is much easier to do this on the original code than to go the proxy road as I did! ;)
You can grab the VC++ 2010 project from here : D2D1ToD3D11.7z
This sample simply saves a "test.png" image using the Direct2D API over Direct3D 11.