Ron,
I mentioned that about 560 ohms would be a good place to start for a six volt circuit, then work down. I have a variable DC supply, so I can mimic what you'd get from a six volt DC wall transformer. I decided to just set it up and show the test.
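To show where that 560 ohm starting point comes from, here's a quick Ohm's-law sketch. The 2 V LED forward drop is my assumption (typical for a red LED, not a measured number from your part); the actual drop will shift the currents a bit, which is exactly why we test.

```python
# Rough Ohm's-law estimate of LED current vs. series resistance.
# Assumptions: 6 V supply, ~2 V LED forward drop (typical red LED).
V_SUPPLY = 6.0   # volts
V_LED = 2.0      # assumed forward voltage, volts

for r_ohms in (560, 500, 400, 300, 200):
    i_ma = (V_SUPPLY - V_LED) / r_ohms * 1000.0
    print(f"{r_ohms} ohms -> about {i_ma:.1f} mA")
```

Starting at 560 ohms predicts only about 7 mA, comfortably below the 25 mA target, so working downward from there can't overdrive the LED.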
I had several 100 ohm resistors, so I put 5 of them together, making a 500 ohm series string that can be tapped at any of the junctions.
(Five 100 ohm resistors)
[ATTACH=CONFIG]529593[/ATTACH]
The basic setup is shown below. I had trouble getting a good exposure of both the parts and the meter face. I opted to show the parts better and inset a closeup of the meter.
NOTE the meter setting, and BE SURE to set the meter for the right range before using it to test.
(LED with 500 ohms)
[ATTACH=CONFIG]529595[/ATTACH]
The meter you use will doubtless be different. Try to correlate these settings to what you have. You want DC milliamps. Since we are trying to adjust for 25 mA of current, I have a 30 mA scale that works well. Pick whatever seems best on your meter.
(Meter Setting)
[ATTACH=CONFIG]529596[/ATTACH]
I worked my way down the resistor chain with the following results:
500 ohms = 9 mA
400 ohms = 12 mA
300 ohms = 17 mA
200 ohms = 23 mA
Below is a picture of the last of these tests.
(LED with 200 ohms)
[ATTACH=CONFIG]529597[/ATTACH]
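Working backward from that last reading (my arithmetic, ignoring the small voltage drop across the meter itself): 200 ohms at 23 mA implies the LED is dropping roughly 1.4 V, and from that you can solve for the resistance that should land right at 25 mA.

```python
# Back out the LED drop from the last test point, then solve for
# the resistance that would give the 25 mA target.
V_SUPPLY = 6.0                       # volts
v_led = V_SUPPLY - 0.023 * 200       # implied LED drop from 200 ohms @ 23 mA
r_target = (V_SUPPLY - v_led) / 0.025
print(f"Implied LED drop: {v_led:.2f} V")          # about 1.4 V
print(f"Resistance for 25 mA: about {r_target:.0f} ohms")
```

That comes out around 184 ohms, which agrees with the trend in the table: a little below 200 ohms would hit 25 mA exactly.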
I'd pick the closest standard value to the last test. It doesn't hurt to run the LED slightly below its full rating. If my memory of standard values is correct, 200 ohms isn't one, so you'd have to go up or down a little. Maybe 220 ohms?
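On the standard-value question: if you take the last reading (200 ohms at 23 mA, implying roughly a 1.4 V LED drop), the resistance for exactly 25 mA works out to about (6 - 1.4)/0.025 = 184 ohms. A quick check against the E12 standard series (the usual 10% resistor values; `nearest_standard` is just my throwaway helper, not anything from a library):

```python
# E12 standard resistor values in the 100-1000 ohm decade.
E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]

def nearest_standard(r_ohms):
    """Return the E12 value closest to the requested resistance."""
    return min(E12, key=lambda std: abs(std - r_ohms))

print(nearest_standard(184))  # 180
```

180 ohms is actually the nearest, but it would run the LED a hair over 25 mA; 220 ohms keeps it comfortably under the rating, which is why it's the safer pick.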
I hope this helps.
John