I decided to try to determine the baselines myself, at runtime. Using the font metrics, I can multiply the line stride by the font height to determine how many bytes are in each character's bitmap. Then I can jump to the end of the bitmap and read each row from the bottom up until I detect a non-zero pixel; that row is the baseline. Since the line stride tells me how many bytes are in each row, it seemed easy enough to do. After testing, this method does work for a custom font I uploaded in L4 format. However, it doesn't work for the built-in fonts, or at least it doesn't work for font 31. I haven't checked the others yet.
The problem is the font format. Despite the BT817/8 datasheet saying font 31 is an L4 format font, the metrics block says it is COMPRESSED_RGBA_ASTC_8x8_KHR. The datasheet also says the line stride should be 18, but the metrics say it is a whopping 80 bytes per line!
Now I don't know what to do. Perhaps I'm reading the wrong metrics block, but all the other values seem reasonable.
What is the meaning of line stride in an ASTC compressed font? Can I use it to determine the start of each row of the bitmap? Or is it only useful when multiplying against the font height to determine the size of each character?
Is there a way I can extract a single line from the bitmap so I can determine whether any pixels are set without decompressing the entire character? Or, if the entire character must be decompressed, can I use the BT817/8 to decompress it into RAM_G where I can then examine it row by row?
Or better yet, can someone provide the baseline metrics for all the ROM fonts? Then I could hard-code those values into my program and only calculate baselines at runtime for my custom fonts.
Thanks,
mike