The question: Is it possible to make a custom fbtft_device that can format the SPI video packet as needed?
I have 10m of RGB LED strip (APA102 based -- https://www.adafruit.com/images/product ... PA102C.pdf). So I built up a very low definition screen -- 23 x 13 pixels. (I ordered a few more strips to make it a nice even 32x18.)
I started with an Arduino and was able to get it to play videos from an SD card. This took a fair amount of pre-processing: decoding the video, downscaling, applying gamma correction, adding the global/alpha byte, swapping the B & R bytes, reversing every other line... all to generate a raw byte stream the Arduino could keep up with. Even after all that I only got ~15fps out of it, and playing a different video was a pain.
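For reference, here is roughly what that pre-processing step looks like, sketched in Python rather than the Arduino C I actually used. The 32x18 serpentine layout, gamma of 2.2, and full-brightness global byte are my assumptions for this sketch; adjust to taste:

```python
# Pack one downscaled RGB frame into the APA102 wire format.
# Assumptions: 32x18 matrix, serpentine wiring (odd rows reversed),
# 8-bit RGB input, gamma 2.2, global brightness at maximum.
WIDTH, HEIGHT = 32, 18
GAMMA = bytes(int((i / 255.0) ** 2.2 * 255 + 0.5) for i in range(256))

def to_apa102(frame):
    """frame: HEIGHT rows, each a list of WIDTH (r, g, b) tuples."""
    out = bytearray(4)                     # start frame: 4 zero bytes
    for y, row in enumerate(frame):
        if y % 2:                          # serpentine: reverse every other line
            row = row[::-1]
        for r, g, b in row:
            out.append(0xFF)               # global byte: 0b111 + 5-bit brightness
            out.append(GAMMA[b])           # APA102 wants B, G, R order,
            out.append(GAMMA[g])           # hence the R/B swap
            out.append(GAMMA[r])
    # end frame: at least n/2 extra clock edges, i.e. n/16 bytes
    out += b'\xff' * ((WIDTH * HEIGHT + 15) // 16)
    return bytes(out)
```

The output of this is exactly the raw byte stream I shovel out the SPI port, one frame after another.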
I switched to a RPi 2B (clean install of Raspbian via NOOBS) with the hope of being able to stream videos directly from a file or even YouTube. Getting the SPI hardware working and streaming the pre-processed files was straightforward enough, and it is MUCH faster than the bit-banged Arduino approach.
Now I'm having trouble getting video from the framebuffer. My first attempt was to make a copy of fb0, process it, and send it out over SPI. For the most part this worked, except that omxplayer and other GPU-accelerated programs don't appear to use fb0.
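In case it helps clarify what I tried, here is the gist of the fb0-copy approach in Python. The framebuffer resolution and 16bpp RGB565 format are assumptions here (the real values can be read from /sys/class/graphics/fb0/virtual_size and bits_per_pixel); the downscale is simple nearest-neighbor:

```python
import struct

# Assumed framebuffer geometry -- read the real values from sysfs on the Pi.
FB_W, FB_H = 640, 360
LED_W, LED_H = 32, 18

def downscale_rgb565(raw, fb_w=FB_W, fb_h=FB_H, led_w=LED_W, led_h=LED_H):
    """raw: fb_w*fb_h*2 bytes of little-endian RGB565 from /dev/fb0.
    Returns led_h rows of led_w (r, g, b) tuples."""
    pixels = struct.unpack('<%dH' % (fb_w * fb_h), raw)
    grid = []
    for y in range(led_h):
        sy = y * fb_h // led_h                     # nearest-neighbor row
        row = []
        for x in range(led_w):
            p = pixels[sy * fb_w + x * fb_w // led_w]
            r = (p >> 11) << 3                     # expand 5/6/5 bits
            g = ((p >> 5) & 0x3F) << 2             # back to 8 bits each
            b = (p & 0x1F) << 3
            row.append((r, g, b))
        grid.append(row)
    return grid

# On the Pi (won't work for omxplayer output, which bypasses fb0):
# with open('/dev/fb0', 'rb') as fb:
#     grid = downscale_rgb565(fb.read(FB_W * FB_H * 2))
```

This works fine for the console and X, but anything rendered by the GPU never shows up in the buffer I'm reading, which is what led me to look at fbtft.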
It looks like the fbtft driver may be the right answer. Is it possible to make a custom fbtft_device that formats the SPI packet as needed? All I'm seeing in the documentation is the init sequence and a few other minor tweaks.
I'm more of a hardware designer, and all my previous real-time video experience has been in FPGAs & Verilog, so I'm a little new to the Linux way of things. Any help pointing me in the right direction is greatly appreciated!