Tuesday, October 16, 2007
Fixed
I was passing the vertex count as the count parameter to
glDrawElements(GLenum mode, GLsizei count, GLenum type, const GLvoid *indices)
instead of passing the number of indices. I had to write a small application with my vertex and index buffers hard-coded to reproduce the problem.
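For the record, here is a minimal sketch of the bug and the fix. The quad data and the numVertices/numIndices names are hypothetical stand-ins for my hard-coded buffers, using plain GL 1.1 client-side vertex arrays:

#include <GL/gl.h>

// 4 vertices of a quad, 6 indices forming 2 triangles
static const GLfloat vertices[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     1.0f,  1.0f, 0.0f,
    -1.0f,  1.0f, 0.0f
};
static const GLushort indices[] = { 0, 1, 2, 0, 2, 3 };

void DrawQuad()
{
    const GLsizei numVertices = 4;
    const GLsizei numIndices = 6;

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);

    // wrong: count is the number of indices to draw, not the number of
    // vertices, so this only drew part of each mesh
    // glDrawElements(GL_TRIANGLES, numVertices, GL_UNSIGNED_SHORT, indices);

    // fixed:
    glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_SHORT, indices);

    glDisableClientState(GL_VERTEX_ARRAY);
}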
Here is a wireframe screenshot of the tiger from the Nebula distribution with all its vertices.
Monday, October 15, 2007
It's a bird, it's a plane ... it's supposed to be a box
Here is a screenshot of my uhmm... box.
Friday, October 12, 2007
Re-arranged yet again
After re-arranging the display device and render device code into what I thought was a better structure, I’ve re-arranged it yet again. While looking at render targets, the default render target in particular, I realized that the default render target gets its pixel format from the display device’s pixel format. This happens when the render device is first opened, which, in my previous arrangement, was before the display device’s pixel format was known. All in all, I think all the OpenGL initialization code is better placed in the display device.
I’ve started working on primitives. OpenGL’s C interface makes managing vertex and index buffers very simple, so simple in fact that it feels like I’ve missed something. I like it!
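To give a flavour of what I mean, here is a minimal sketch of creating the two buffer types, assuming the GL 1.5 buffer-object functions (on Windows these entry points have to be fetched with wglGetProcAddress first; the function names here are placeholders of my own):

#include <GL/gl.h>
#include <GL/glext.h>   // for GLsizeiptr and the buffer-object tokens

GLuint CreateVertexBuffer(const void *vertexData, GLsizeiptr numBytes)
{
    GLuint buffer = 0;
    glGenBuffers(1, &buffer);                  // allocate a buffer name
    glBindBuffer(GL_ARRAY_BUFFER, buffer);     // make it the current vertex buffer
    glBufferData(GL_ARRAY_BUFFER, numBytes, vertexData, GL_STATIC_DRAW);
    return buffer;
}

GLuint CreateIndexBuffer(const void *indexData, GLsizeiptr numBytes)
{
    GLuint buffer = 0;
    glGenBuffers(1, &buffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, numBytes, indexData, GL_STATIC_DRAW);
    return buffer;
}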
For now I’m just writing the code without testing it, as I need to implement a few more classes before I can see what I’m actually doing.
Wednesday, October 10, 2007
Then there was a window!
After cleaning up the code a bit more yesterday, I think it’s working almost as well as the DirectX implementation. The issue I’m having now is that Windows returns two display adapters on my PC. I suppose this is accurate, since my graphics card has two monitor outputs. The problem is that when I ask Windows for the second adapter’s display mode, it gives me invalid negative numbers for the width, height and color depth, which causes an assert. I’ve put in a hack to fix it temporarily while I look for a solution.
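For anyone curious, the enumeration looks roughly like this; a minimal sketch assuming the standard Win32 EnumDisplayDevices/EnumDisplaySettings calls, where my temporary hack amounts to the skip-instead-of-assert guard (the printing is just for illustration):

#include <windows.h>
#include <cstdio>

void ListAdapters()
{
    DISPLAY_DEVICE dev = { sizeof(DISPLAY_DEVICE) };
    for (DWORD i = 0; EnumDisplayDevices(NULL, i, &dev, 0); ++i)
    {
        DEVMODE mode = {0};
        mode.dmSize = sizeof(DEVMODE);
        // cast to int: Windows hands back bogus (negative) values for an
        // adapter output with no attached monitor
        if (!EnumDisplaySettings(dev.DeviceName, ENUM_CURRENT_SETTINGS, &mode)
            || (int)mode.dmPelsWidth <= 0 || (int)mode.dmPelsHeight <= 0)
        {
            // skip the adapter instead of asserting on the bad mode
            printf("adapter %lu: no valid display mode, skipping\n", i);
            continue;
        }
        printf("adapter %lu: %lux%lu @ %lu bpp\n",
               i, mode.dmPelsWidth, mode.dmPelsHeight, mode.dmBitsPerPel);
    }
}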
Next stop, primitives.
Here is a screenshot.
Monday, October 1, 2007
OpenGL on Nebula3
I created two new classes, Win32OGLRenderDevice and Win32OGLDisplayDevice. The Win32OGLRenderDevice was pretty straightforward, but the display device was a bit tricky. It turns out that to query the hardware's capabilities you have to attach a render context, and to create a render context with the capabilities you require you have to query the hardware's capabilities - chicken or the egg, anyone?
I have it such that the display device first creates a render device and opens a window with default capabilities, then queries for the required capabilities (e.g. FSAA), then re-opens the window with those.
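In code, the two-pass idea looks roughly like this; a minimal sketch assuming plain WGL, where CreateDummyWindow() is a hypothetical helper and the multisample query uses the WGL_ARB_pixel_format/WGL_ARB_multisample extensions:

#include <windows.h>
#include <GL/gl.h>

// hypothetical helper: registers a class and creates a hidden temporary window
extern HWND CreateDummyWindow();

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)(
    HDC hdc, const int *piAttribIList, const FLOAT *pfAttribFList,
    UINT nMaxFormats, int *piFormats, UINT *nNumFormats);

int ChooseMultisampleFormat(int samples)
{
    // pass 1: dummy window with a basic pixel format, just to get a context up
    HWND dummyWnd = CreateDummyWindow();
    HDC dummyDC = GetDC(dummyWnd);

    PIXELFORMATDESCRIPTOR pfd = { sizeof(PIXELFORMATDESCRIPTOR), 1 };
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(dummyDC, ChoosePixelFormat(dummyDC, &pfd), &pfd);

    HGLRC dummyRC = wglCreateContext(dummyDC);
    wglMakeCurrent(dummyDC, dummyRC);

    // only now that a context is current can extension entry points be queried
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");

    int format = 0;
    UINT numFormats = 0;
    if (wglChoosePixelFormatARB)
    {
        const int attribs[] = {
            0x2001 /* WGL_DRAW_TO_WINDOW_ARB */, TRUE,
            0x2010 /* WGL_SUPPORT_OPENGL_ARB */, TRUE,
            0x2011 /* WGL_DOUBLE_BUFFER_ARB  */, TRUE,
            0x2014 /* WGL_COLOR_BITS_ARB     */, 32,
            0x2041 /* WGL_SAMPLE_BUFFERS_ARB */, TRUE,
            0x2042 /* WGL_SAMPLES_ARB        */, samples,
            0
        };
        wglChoosePixelFormatARB(dummyDC, attribs, NULL, 1, &format, &numFormats);
    }

    // pass 2: tear the dummy down; the caller re-opens the real window
    // with the multisampled pixel format found above
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(dummyRC);
    ReleaseDC(dummyWnd, dummyDC);
    DestroyWindow(dummyWnd);
    return (numFormats > 0) ? format : 0;
}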
After about 4 hours of work I had a DisplayDevice and a RenderDevice compiling, but I can't test them yet: when the base RenderDevice opens it creates a render target, which I have not implemented yet, and the currently implemented one is tied to the Direct3DRenderDevice.
I want to have OpenGL running on Windows first before I start porting Nebula3 to Linux and later Mac OS, as I don't have access to a Mac yet.