I am currently running the AnalogReadSerial example sketch, which reads an analog pin and prints the value over serial with a 250 ms delay. The ADC pin is connected to a voltage divider outputting 0.110-0.111 V. The value being read jumps between about 96 and 108, which seems like a very large variation to me; that appears to correspond to roughly a 12 mV fluctuation.
I'm trying to figure out why the analog pin value is fluctuating so much. I assume it has something to do with the input voltage jumping? Is there a better way to smooth that out? Is my wall adapter cheap junk? Not sure where I got it, but I saw a similar voltage jump with both a 5.4 V @ 2 A and a 9 V @ 500 mA adapter feeding the 3.3 V regulator. Maybe I should try the 9 V adapter on a 5 V regulator?
Any thoughts / ideas / suggestions greatly appreciated. Thank you.