All right, thanks a lot! My assumption was correct, then.
Also, I've redone a lot of my code, adding things like a quadrupleton (i.e. four instances) pool allocation tracker in OAMState's private static scope to manage sprite allocation more appropriately, among other changes. I now have a working prototype (a simple raindrop animation), which I will put up on my MSOE student webpage in the near future, along with source code and project files.
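For the curious, the tracker idea is roughly this (SpritePool and its members are illustrative names, not my actual class, and the four-way split into banks of 32 OAM IDs is an assumption for the sketch):

Code:
// Illustrative sketch only -- in my real code, four of these live in
// OAMState's private static scope, one per bank of 32 of the 128 OAM IDs.
class SpritePool {
public:
    static const int POOL_SIZE = 32;

    SpritePool() : used() {}

    // Returns a free sprite ID within this pool, or -1 if the pool is exhausted.
    int acquire() {
        for (int i = 0; i < POOL_SIZE; ++i) {
            if (!used[i]) { used[i] = true; return i; }
        }
        return -1;
    }

    // Marks a previously acquired ID as free again.
    void release(int id) {
        if (id >= 0 && id < POOL_SIZE) used[id] = false;
    }

private:
    bool used[POOL_SIZE];
};

// The "quadrupleton": four statically scoped trackers.
static SpritePool spritePools[4];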
However, I've come across a bit of a bug, which is similar enough to the original question to be notable here. In my test program, when I disable the rain animation, all the associated sprites are freed, returned to the sprite pool I developed, and wiped via oamClearSprite() (EDIT: might this wipe some vital graphics-alignment information in the sprite entry?), and the Sprite pointers are deleted and removed from the list inside my Rain object. When the sprites are re-enabled, in order to render, each one fetches its SpriteEntry* and converts its fields into oamSet()-appropriate values, as in the sketch below. However, when converting SpriteEntry::gfxIndex back to a VRAM pointer, I have been using oamGetGfxPtr(oam, pEntry->gfxIndex) behind the scenes and passing the result as the graphics pointer; on the sprites for which this is done, the image appears... wrong. It's as if the wrong pointer is retrieved, or the wrong color depth or format is applied: instead of a straight, light-blue raindrop, there is a scattering of light-blue pixels, as if the image data has been laid out incorrectly. I'll put up a screenshot later if anyone is interested.
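Concretely, the re-enable path boils down to something like this (restoreSprite is a hypothetical stand-in for my actual method; how size and format are deduced is covered in the next paragraph):

Code:
#include <nds.h>

// Sketch of my re-enable path: rebuild the oamSet() arguments from a
// previously populated SpriteEntry.
void restoreSprite(OamState* pOam, int id, SpriteEntry* pEntry,
                   SpriteSize size, SpriteColorFormat format) {
    // This is the conversion I suspect: gfxIndex back to a VRAM pointer.
    u16* gfx = oamGetGfxPtr(pOam, pEntry->gfxIndex);

    oamSet(pOam, id,
           pEntry->x, pEntry->y,
           pEntry->priority,
           pEntry->palette,      // palette_alpha
           size, format,
           gfx,
           -1,                   // no affine matrix
           false,                // no size-double
           pEntry->isHidden,
           pEntry->hFlip, pEntry->vFlip,
           false);               // no mosaic
}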
The problem may lie in the way I deduce the SpriteSize value from the ObjSize and ObjShape fields (along with a lookup table for the pixel count), or in how I derive the SpriteColorFormat from the color and blend modes, but I don't believe either of those is the culprit (see the sketch below). My suspicion is that the pointer retrieved by oamGetGfxPtr() is not the same as the pointer originally passed to oamSet(). An improperly set sprite mapping could cause this, but nothing in my code other than the initialization step modifies the sprite mapping, so I believe it may be a bug in the way the graphics pointer is retrieved.
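For reference, the deduction is essentially this, with the table built from the hardware's OBJ shape/size matrix (deduceSize and deduceFormat are illustrative names, not my exact code):

Code:
#include <nds.h>

// Rows = ObjShape (square, wide, tall), columns = ObjSize (0..3),
// following the OBJ attribute encoding.
static const SpriteSize sizeTable[3][4] = {
    // OBJSHAPE_SQUARE
    { SpriteSize_8x8,  SpriteSize_16x16, SpriteSize_32x32, SpriteSize_64x64 },
    // OBJSHAPE_WIDE
    { SpriteSize_16x8, SpriteSize_32x8,  SpriteSize_32x16, SpriteSize_64x32 },
    // OBJSHAPE_TALL
    { SpriteSize_8x16, SpriteSize_8x32,  SpriteSize_16x32, SpriteSize_32x64 },
};

static SpriteSize deduceSize(const SpriteEntry* e) {
    return sizeTable[e->shape][e->size];
}

static SpriteColorFormat deduceFormat(const SpriteEntry* e) {
    if (e->blendMode == OBJMODE_BITMAP) return SpriteColorFormat_Bmp;
    if (e->colorMode == OBJCOLOR_256)   return SpriteColorFormat_256Color;
    return SpriteColorFormat_16Color;
}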
Right now, I am avoiding this issue by manually setting the graphics pointer each time I create a Sprite, but I have had to add a lot of code to deal with this dependency for the time being, and I am uncomfortable with doing so. Also, if my test assertion
Code:
// Round trip: pointer -> gfx offset -> pointer should be the identity.
gfxPtr == oamGetGfxPtr(pOam, oamGfxPtrToOffset(pOam, gfxPtr));
is proven false, then this could be a pretty serious bug in libnds. I will be doing some work today to test this, and I will get back with my findings the next time I post.
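For anyone who wants to try it, the check I have in mind is a minimal round trip like this (consoleDemoInit() is just there to give me somewhere to print):

Code:
#include <nds.h>
#include <stdio.h>

// Minimal round-trip check: allocate a gfx block, convert the pointer to
// an offset and back, and see whether we get the same address.
int main(void) {
    consoleDemoInit();
    videoSetMode(MODE_0_2D);
    vramSetBankA(VRAM_A_MAIN_SPRITE);
    oamInit(&oamMain, SpriteMapping_1D_128, false);

    u16* gfxPtr = oamAllocateGfx(&oamMain, SpriteSize_32x32,
                                 SpriteColorFormat_256Color);

    unsigned int offset = oamGfxPtrToOffset(&oamMain, gfxPtr);
    u16* roundTrip = oamGetGfxPtr(&oamMain, offset);

    iprintf("gfxPtr:    %p\n", (void*)gfxPtr);
    iprintf("roundTrip: %p\n", (void*)roundTrip);
    iprintf(gfxPtr == roundTrip ? "PASS\n" : "FAIL\n");

    while (1) swiWaitForVBlank();
}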
Thanks again, zeromus and vuurrobin, for your help.
EDIT:
My graphical settings are as follows (a code sketch of this setup follows the list):
SpriteMapping_1D_128, and extendedPalette = false, for initializing the OAM state.
SpriteSize_32x32, and SpriteColorFormat_256Color, for all graphics allocations.
A PNG source image in 8-bit paletted mode, sized 128w x 32h (a 4x1 grid of 32x32 tiles), using the sprite.grit file taken from the animate_simple example, which works properly in all situations except with the aforementioned bug.
32 * 32 = 1024 bytes for all graphics memory sizes and frame offsets (i.e. the size parameter for dmaCopy() calls between the source image and allocated VRAM, done only within the Rain constructor).
No affine transformations or any other special settings.
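In code form, that setup amounts to roughly the following (the function names are illustrative, and spriteTiles/spritePal/spritePalLen are the grit output names, as in animate_simple):

Code:
#include <nds.h>
#include "sprite.h"   // grit output (assumed names: spriteTiles, spritePal, spritePalLen)

// One-time setup matching the settings listed above.
void initGraphics(void) {
    videoSetMode(MODE_0_2D);
    vramSetBankA(VRAM_A_MAIN_SPRITE);
    oamInit(&oamMain, SpriteMapping_1D_128, false);  // no extended palettes
    dmaCopy(spritePal, SPRITE_PALETTE, spritePalLen);
}

// Allocates VRAM for one raindrop and copies in one frame of the 4x1 strip.
static u16* loadRaindropFrame(int frame) {
    u16* gfx = oamAllocateGfx(&oamMain, SpriteSize_32x32,
                              SpriteColorFormat_256Color);

    // Each 32x32 frame at 8 bpp is 32 * 32 = 1024 bytes.
    dmaCopy((const u8*)spriteTiles + frame * 32 * 32, gfx, 32 * 32);
    return gfx;
}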
I figured this might be important because of the presence of a bug involving 16x16 sprites on the bug tracker; to be clear, my issue concerns 32x32 sprites, not 16x16.