Thursday, February 17, 2011

DPHP 6 - Fighter Plane in a Helmet

It's amazing how quickly the one-a-day promise falls by the wayside. Excuses, excuses, etc etc, blah blah blah. The honest fact is it's a work week from hell, where the world is expected of me and as much of it as possible needs to be delivered. So I'm behind a few days. Here's getting back on the bus.



You may be noticing a trend here.  For the longest time, my YouTube videos have been the best-kept records of this project.  I often put far too technical an explanation into the description, but at least it's somewhere.

I've made mention of what you're seeing above before.  But once again, there's a nice narrative to explain what's going on.

Simply put, there is no way to see through the LED display matrix.  Even if the sub visor were made transparent, the backglow of the LEDs would be bright enough and close enough to the face to completely wash out any other light source, to say nothing of the potential eye damage from that proximity and brightness.  So how does one fix this?  Well, the obvious choice is with a camera and video goggles.  The camera can be mounted elsewhere in the helmet with an adequate view point, and the goggles go in front of the eyes, where they do the most good.

But clearly, that is too simple a solution, far too simple for the scope and scale of this project.  After some research, I came across an On Screen Display (OSD) prototyping board on SparkFun, based on the MAX7456.  This way I could overlay text onto a standard NTSC video signal, i.e. what's coming out of the camera and going into the video goggles.  Bingo.
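
To give a rough idea of what driving that chip looks like, here's a bare-bones sketch that pokes a single character into the MAX7456's display memory over SPI.  The register addresses are from the datasheet, but the chip select pin is a placeholder, and the chip uses its own font indices rather than ASCII, so a real version needs a lookup table on top of this.

// Bare-bones sketch: push one character into the MAX7456's display memory
// over SPI.  Register addresses are from the datasheet; the chip select pin
// is a placeholder, and the chip uses its own font indices rather than ASCII.
#include <SPI.h>

const int OSD_CS = 10;            // chip select pin (placeholder)

const byte VM0  = 0x00;           // video mode 0 register
const byte DMAH = 0x05;           // display memory address high
const byte DMAL = 0x06;           // display memory address low
const byte DMDI = 0x07;           // display memory data in

void osdWrite(byte reg, byte val) {
  digitalWrite(OSD_CS, LOW);
  SPI.transfer(reg);
  SPI.transfer(val);
  digitalWrite(OSD_CS, HIGH);
}

// Put one character (an index into the chip's font table, not ASCII)
// at a given row and column.  The display memory is 30 characters wide.
void osdPutChar(byte row, byte col, byte fontIndex) {
  unsigned int addr = row * 30 + col;
  osdWrite(DMAH, addr >> 8);
  osdWrite(DMAL, addr & 0xFF);
  osdWrite(DMDI, fontIndex);
}

void setup() {
  pinMode(OSD_CS, OUTPUT);
  digitalWrite(OSD_CS, HIGH);
  SPI.begin();                    // MAX7456 speaks SPI mode 0
  osdWrite(VM0, 0x08);            // enable the OSD overlay, NTSC timing
  osdPutChar(1, 1, 0x0B);         // 0x0B should be 'A' in the stock font
}

void loop() {
}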

And another one of my main engineering hurdles rears its head.  Animations, for the most part, require almost all of the processing time of the microcontroller.  Maybe not the ones you've seen in the first and second round animations, but some of my later ones certainly require every single cycle in order to have a reasonable refresh rate.  Where am I going to find the processing time/cycles to run all the background stuff to drive an OSD chip?  The answer is a second controller, known as the UI controller, in addition to the one used to run the animations, now known as the SHOW controller.
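
To make that split concrete, here's roughly what the SHOW side could look like.  I'm assuming a plain serial link between the two micros for the sake of the example; the point is just that the animation loop only glances at the link between frames instead of blocking on it.

// Sketch of the idea only: the SHOW controller spends its loop drawing
// frames and just peeks at the link between frames.  A plain UART
// connection carrying one-byte commands is assumed here.
byte currentAnimation = 0;

void drawNextFrame(byte anim) {
  // stand-in for the real LED matrix scanning/animation code
}

void setup() {
  Serial.begin(9600);                 // link from the UI controller
}

void loop() {
  drawNextFrame(currentAnimation);    // all the heavy lifting happens here

  // Non-blocking check between frames: if the UI micro sent a command,
  // switch animations; otherwise get straight back to drawing.
  if (Serial.available() > 0) {
    currentAnimation = Serial.read();
  }
}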


Ok, so now I have a whole second microcontroller to play with, and its job is only to interface with the user and tell the SHOW controller what to do.  Here's a breakdown of the features:

Power
-Report the voltage of the primary and secondary power batteries
-Report the amp draw from the primary battery
-Integrate voltage and amp draw to calculate used power, and remaining power in the battery
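
That last item is really just bookkeeping.  Roughly, it looks like this; the pack capacity and the two read helpers are placeholders, with the real reads coming from the parts listed further down:

// Rough bookkeeping sketch: sample voltage and current once a second,
// accumulate watt-hours, and subtract from an assumed pack capacity.
// The capacity figure and the two read helpers are placeholders.
const float PACK_CAPACITY_WH = 20.0;   // made-up battery rating
float usedWh = 0.0;
unsigned long lastSample = 0;

float readPackVolts() { return 7.4; }  // stand-in for the voltage divider read
float readPackAmps()  { return 0.5; }  // stand-in for the ACS712 read

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();
  if (now - lastSample >= 1000) {
    float hours = (now - lastSample) / 3600000.0;          // ms -> hours
    usedWh += readPackVolts() * readPackAmps() * hours;    // V * A * h = Wh
    lastSample = now;

    Serial.print("Remaining Wh: ");
    Serial.println(PACK_CAPACITY_WH - usedWh);
  }
}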

Climate Control
-Report the temperatures measured by a couple of sensors
-Turn on and off cooling devices based on temperature data
-Provide automation of the climate control functions
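
The automation part is nothing fancier than a thermostat with a bit of hysteresis so the fans don't chatter.  Something like this, with made-up pin numbers and thresholds:

// Thermostat-style control with hysteresis.  Pin number, thresholds,
// and the temperature helper are made up for the example.
const int FAN_PIN = 9;
const float TEMP_ON_C  = 30.0;   // cooling on above this
const float TEMP_OFF_C = 27.0;   // and back off below this

float readTempC() { return 25.0; }   // stand-in for the DS18B20 read

void setup() {
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  float t = readTempC();
  if (t > TEMP_ON_C)  digitalWrite(FAN_PIN, HIGH);   // fans on
  if (t < TEMP_OFF_C) digitalWrite(FAN_PIN, LOW);    // fans off
  delay(1000);
}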

Status Display (OSD)
-Report what is currently running on the display
-Report the current menu interface state
-Report climate control and power status information

Command Control
-Accept inputs from the user
-Track state information to/from the user and to/from the SHOW microcontroller
-Send appropriate commands to the SHOW microcontroller
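
On the UI side, the command flow boils down to: notice a button change, update the state, and only bother the SHOW controller when something actually changed.  A stripped-down version, again assuming a serial link between the micros and glossing over the I2C button read:

// UI-side sketch under the same assumed serial link: track which animation
// the user has picked and only send a command byte when it changes.  The
// button read is abstracted away here (the real buttons hang off I2C).
byte selectedAnimation = 0;
byte lastSentAnimation = 255;            // forces the first send

bool nextButtonPressed() { return false; }   // placeholder for the I2C read

void setup() {
  Serial.begin(9600);                    // link to the SHOW controller
}

void loop() {
  if (nextButtonPressed()) {
    selectedAnimation = (selectedAnimation + 1) % 8;   // cycle the animations
  }
  if (selectedAnimation != lastSentAnimation) {
    Serial.write(selectedAnimation);     // one-byte command to the SHOW micro
    lastSentAnimation = selectedAnimation;
  }
  delay(20);                             // crude debounce
}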

So how is all this accomplished?  Well, I have the following bits and bobs connected to the UI micro:

Component - Purpose - Protocol

>MAX7456 - on screen display - SPI
>ACS712 - current sensor - ADC
>DS18B20 - temperature sensor - 1-Wire
>Voltage divider - voltage sensor - ADC
>Glove buttons - user input - I2C
>P-channel MOSFET - power switching to cooling components - DIO
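
To close the loop on that protocols column, here's what reading those sensors could look like.  The ACS712 variant (5 A, 185 mV/A), the equal-resistor divider, and the OneWire/DallasTemperature libraries are my picks for the example, so treat the numbers as placeholders:

// A guess at the raw sensor reads.  The ACS712 variant, the divider
// resistor values, and the library choices are assumptions for illustration.
#include <OneWire.h>
#include <DallasTemperature.h>

const int CURRENT_PIN = A0;       // ACS712 output
const int VOLTAGE_PIN = A1;       // tap point of the voltage divider
const int ONEWIRE_PIN = 2;        // DS18B20 data line

OneWire oneWire(ONEWIRE_PIN);
DallasTemperature tempSensors(&oneWire);

void setup() {
  Serial.begin(9600);
  tempSensors.begin();
}

void loop() {
  // ACS712: centered at 2.5 V with zero current, 185 mV per amp (5 A part)
  float currentVolts = analogRead(CURRENT_PIN) * 5.0 / 1023.0;
  float amps = (currentVolts - 2.5) / 0.185;

  // Divider assumed to be two equal resistors, so scale the reading by 2
  float packVolts = analogRead(VOLTAGE_PIN) * 5.0 / 1023.0 * 2.0;

  // DS18B20 over 1-Wire via the Dallas Temperature library
  tempSensors.requestTemperatures();
  float tempC = tempSensors.getTempCByIndex(0);

  Serial.print(amps);      Serial.print(" A, ");
  Serial.print(packVolts); Serial.print(" V, ");
  Serial.print(tempC);     Serial.println(" C");

  delay(1000);
}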

We'll save how all this breaks down for another day.  There are a lot of systems and subsystems to explain, that's for sure.
