# I need help to understand this voltmeter.

• March 12th, 2008, 09:40 PM
rainbowolf
I need help to understand this voltmeter.
I'm doing a project on the "design of a universal monitoring circuit for a solar panel".
I've been given a voltmeter (DPM 3AS-BL), a Lascar component.
I've searched for days but found nothing that helps me understand these circuits and their uses..
I need help >.<

I keep reading the datasheet but I can't figure anything out..
My lecturer wants me to understand how this component works.
I'm supposed to choose one of the six circuits below to measure the voltage that the solar panel provides.

http://www.martelmeters.com/pdf/DPM_3AS-BL.PDF

thanks!
• March 13th, 2008, 01:26 AM
Harold14370
This meter measures a voltage in the range of up to 200 millivolts. You will need to connect the voltage you are measuring to the pins labeled INHI and INLO. You will also need to connect a power supply to power the measuring circuit and the backlight (optional). This will be a DC voltage in the range of either 3-5 volts or 6-9 volts.

There are a lot of ways to do this. If there is some control circuit that is part of this system, you could pick off an appropriate voltage from its power supply. Maybe you could use the voltage from the battery you are charging. Or you could use an AC adapter of the kind used to charge a cell phone battery; that would probably put out a DC voltage in the range you need for your meter.

Once you know the range of DC voltage you will be using to power your meter, you can select a circuit for that voltage range. If you want a backlight on your meter to make it easier to read, select one of the circuits that have a connection to L+ and L-. You also have the option of wiring in a switch to turn the backlight on and off, if you so desire.
• March 14th, 2008, 01:18 AM
rainbowolf
Thanks a lot :-D
Well, I fixed up the circuit using one of the six in the datasheet. I used the one in the top left-hand corner, the one that measures voltage with a common ground.

I've connected my 5V power to pin 8 (V+) to power up the component. Is that correct?

Where's the input, and what's the range of allowable input voltage?

So I measure the voltage I want at INHI & INLO? Meaning I have to connect a voltage of less than 200 mV to those two pins, and the value will show on the meter, right?
How do I get it down to 200 mV? I'm told to use a voltage divider, so I'm working on that now.

I've connected it up, and even when I don't put any voltage into INHI & INLO, values are showing, and they exceed 200! Isn't it supposed to measure only up to 200? Or is the unit not mV?

And what's the 10 µF capacitor for, anyway?
• March 14th, 2008, 05:07 AM
Harold14370
It would help to know what the rest of your circuit looks like. You might be better off with the isolated input configuration. What did you use to power the meter?

If you want to use this meter to measure >200 millivolts (0.2 V), you can easily make a voltage divider. Let's say you want to measure up to 20 volts. That means you need to reduce it by a factor of 100. Get two resistors whose values add up to 1000 ohms and put them in series, say a 10 ohm and a 990 ohm. The voltage across the 10 ohm resistor will be 1/100 of the voltage across the series combination.
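To sanity-check that arithmetic, here's a quick Python sketch of the divider (the 20 V input and the 10/990 ohm values are just the example figures above):

```python
# Voltage across the bottom resistor of a simple two-resistor divider.
def divider_out(v_in, r_bottom, r_top):
    return v_in * r_bottom / (r_bottom + r_top)

# Example from above: 20 V in, 990 ohm on top, 10 ohm on the bottom.
v_meter = divider_out(20.0, r_bottom=10.0, r_top=990.0)
print(f"{v_meter * 1000:.0f} mV at the meter input")  # 200 mV
```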

The capacitor and resistor on the input make up a filter to smooth out the input voltage. Without the filter, the reading could jump around a lot and be difficult to read on your digital display.

You say you connected the power to pin 8. There are two power connections, positive and negative. Did you connect the negative of the power supply to pin 7?
• March 16th, 2008, 09:49 PM
rainbowolf
It's just a small circuit, to test on a breadboard and see if it works..
Yeah, I did connect the negative to pin 7..
I used a laboratory DC power supply to power it, giving it 5V..

http://img169.imageshack.us/img169/1...seresisgu4.jpg
Are the values correct? (For the resistors in my voltage divider.)
Oh, I forgot to draw this: the voltage going into the voltage divider is 5V.. or does it even matter if it's 5V or 10V? The 1k ohm resistor is supposed to be adjustable.. haha

I know it's very basic, & thanks a lot for your help, Harold..
This is a very nice way to learn for beginners like me.. :D
Hope I don't sound too unprofessional about all these circuits.. haha

I think I will extend this circuit into something else..
It depends on what my lecturer wants..
Well, I guess he wants me to make a solar toy car that can measure the voltage/current from the sunlight (from the solar panel).. LOL
• March 16th, 2008, 10:43 PM
Harold14370
The point where you have the "?" connects to the 1M resistor at the input of your measuring circuit.

The 220K resistor is where you connect the voltage your meter is measuring. I just noticed that the manual has a scaling section that tells you how to set up your voltage divider circuit. They use fixed resistors and then calibrate using a built-in potentiometer. Since you are using a variable resistor for Rb, you can calibrate by adjusting Rb: connect the voltage you want to represent full scale (measured on your DVM) and adjust Rb until you get a full-scale reading on the meter you are adjusting. Set Rb to the minimum at first and adjust it upward, so you don't over-range the meter. Be careful with that, because you could wreck the meter. It would probably be safer to use fixed resistors as shown in the scaling section of the product manual.

You asked if the resistor values were correct. Use the voltage divider equation Vout = Vin*R1/(R1+R2). Then if Vin is 5 volts, Vout is 5*(1K/221K) = 0.0226 V, or 22.6 millivolts. You want 200 mV, so no, it's not the right size.
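Plugging the numbers into that equation in Python (the 1K/220K values are from your divider; rearranging for the resistor that would give 200 mV full scale is my own extra step, not something from the manual):

```python
# Divider equation: Vout = Vin * R1 / (R1 + R2), R1 being the bottom leg.
def divider_out(v_in, r1, r2):
    return v_in * r1 / (r1 + r2)

# Current divider: R1 = 1k, R2 = 220k, Vin = 5 V -> far below 200 mV.
print(f"{divider_out(5.0, 1e3, 220e3) * 1000:.1f} mV")  # 22.6 mV

# Rearranging Vout = Vin*R1/(R1+R2) for R1, with R2 fixed at 220k
# and a 200 mV full-scale target:
v_in, v_out, r2 = 5.0, 0.200, 220e3
r1 = v_out * r2 / (v_in - v_out)
print(f"R1 = {r1:.0f} ohms for full scale")  # 9167 ohms
```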

(On edit.) I think it would be a really good idea to use the scaling provided in the manual. You have to scale to some power-of-ten multiple of 200 mV or your displayed units will not be right. You also have to add a decimal point in the appropriate spot. The manual tells you how to do that.
• March 17th, 2008, 09:38 PM
rainbowolf
Ooh yeah, there are some fixed sets I could use.. thanks ^^
By the way, when I'm not connecting anything to measure,
the values fluctuate (that's when INHI & INLO are not connected together).
When I join them together, the voltmeter shows 000.. so I guess that's correct, right? I just wanted to confirm that.. ^^

Thanks a lot for your help ^^

I'll use the fixed-value resistors and see if I can work something out..

Cheers
• March 18th, 2008, 04:50 AM
Harold14370
Yeah, that's not unusual to see some stray voltage when there is nothing connected. You probably see the same thing on your digital multimeter. These devices have a very high input impedance, not like the old fashioned analog meters, so it doesn't take much static electricity or whatever to affect them.
• March 26th, 2008, 11:07 PM
rainbowolf
I'm still having problems with the resistor values..
I need to measure the input voltage.
For example, 1V should show as 1.00 on the meter, which at the meter's input is actually only 1 mV..
So the resistors of my voltage divider need to divide the voltage I'm measuring by 1000.

I can't figure out the values for Ra and Rb. I need them to divide the voltage coming in from (+) by 1000 before it reaches INHI.
I'm quite confused now..

I should tell you that
I have to connect a solar panel to the voltmeter circuit..
If the solar panel gives out 1V, I must see the voltmeter display 1V..
but I can't get it right...
• March 26th, 2008, 11:53 PM
(In)Sanity
Hope this helps

http://www-k.ext.ti.com/srvs/cgi-bin...obj(32620),new

Looks like you want 91,000 ohms and 100 ohms. The 91,000 could be a variable resistor of, say, 100k.
• March 27th, 2008, 02:35 AM
Harold14370
According to the scaling section of your manual, for a 2-volt full-scale range you would use 910K and 100K. For a 2-volt (2000 mV) input this gives you 2000*100/1010 = 198 mV into the meter input terminals, and the meter will read 198. The adjustment pot on your meter will let you trim that to 200. You want it to read 2, so just light up the second decimal point by closing the appropriate solder link.
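Here's that same calculation as a short Python sketch (the 910K/100K values come from the manual's scaling section; everything else is just arithmetic):

```python
# Millivolts seen at INHI/INLO for the manual's 2 V range divider
# (Ra = 910k on top, Rb = 100k on the bottom).
def meter_mv(v_in_volts, ra=910e3, rb=100e3):
    return v_in_volts * 1000 * rb / (ra + rb)

mv = meter_mv(2.0)
print(f"{mv:.0f} mV at the meter")  # 198 mV; the trim pot brings this to 200
# Closing the appropriate DP solder link then shows "198" as "1.98" volts.
```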
• March 27th, 2008, 04:48 AM
rainbowolf
Cool! Thanks a lot, guys. :-D
• March 31st, 2008, 09:14 PM
rainbowolf
I can't seem to light up the decimal place..
I put in 2.5V to be read.. the voltmeter shows 025.. which is good..
but I want to make it show 02.5.
I tried some pins, but I ended up lighting the backlight instead..
How do I solder-link the decimal point? Which one?

Thanks a lot ^^
• April 1st, 2008, 01:16 AM
Harold14370
According to the manual, the solder links are labeled DP1, DP2 and DP3. Did you try those? I think a solder link will be two metal pads close together, which you connect by putting a spot of solder across them. Is that what it looks like? Test it by bridging the link with a screwdriver before you solder; then you will know you have the right decimal point.
• April 1st, 2008, 02:08 AM
rainbowolf
I connected DP1 and DP2.. but that only made the backlight light up.
My DP3's leg is chipped off lol
• April 1st, 2008, 05:16 AM
Harold14370
Why did you connect both dp1 and dp2? Or are you saying you tried them both separately?