Well... that's correct, if you're looking at it from one side.
Step around to the other and you'll see that it's what causes the typical "LED switch" problems.
Red LEDs may put out 12,000 mcd (millicandela, the measure of light output), whereas the blue you choose to use may put out 45,000 mcd. Even if you went with 12 V LEDs (which actually have a built-in resistor) so that NO external resistor is needed, the blue would be brighter, because at the CORRECT voltage for each LED, the blue one still puts out nearly four times the light.
This is what causes the "overly bright" look. So the way people get around it is to limit the voltage/current to the blue ones with either a bigger resistor or additional smaller resistors. You've now taken your simple, no-resistor-needed setup and added complexity.
It's another resistor to burn out. The life expectancy of a typical LED is not measured in the hundreds or thousands of hours like a bulb; it's measured in the tens or hundreds of thousands of hours.
A resistor? They don't last NEARLY as long when they're forced to dissipate the wattage usually thrown at them in your typical setup on here. This is where the whole "half of mine are blown" statements come from: the LEDs are likely fine (unless they were subjected to overcurrent, i.e., a resistor that was too small) and it's the RESISTOR that's likely blown.
It would be much simpler to find a blue LED with the light output you actually want (8,000 mcd? 12,000 mcd?) instead of using a resistor to "hide" an overly bright output. The resistor should be used to put the LED in its efficient operating range (i.e., at its required voltage, where it would put out the 45,000 mcd from above). When you use a resistor to pull the light output even LOWER than that, you may be dissipating more wattage than normal in your resistor, not to mention adding one more component to fail.
Keep it simple.. plan accordingly.. and APPROPRIATELY.
This is getting long, but oh well:
Most LEDs have their characteristics specified at a current of 20 mA. If you want really good reliability and you are not certain you don't have worse-than-average heat conductivity in your mounting, heat buildup wherever you mount them, voltage/current variations, etc., then design for 15 milliamps.
Now for how to make 15 milliamps flow through the LED:
First you need to know the LED voltage drop. It is safe enough to assume 1.7 volts for non-high-brightness red; 1.9 volts for high-brightness, high-efficiency and low-current red; 2 volts for orange and yellow; and 2.1 volts for green. Assume 3.4 volts for bright white, bright non-yellowish green, and most blue types. Assume 4.6 volts for 430 nm bright blue types such as Everbright and Radio Shack. Design for 12 milliamps for the 3.4 volt types and 10 milliamps for the 430 nm blue.
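If you like to keep these rules of thumb around in code form, here's a small Python snippet capturing the numbers above. They're just the rough assumptions from this post, so check the datasheet for your actual parts.

```python
# Rule-of-thumb forward voltages and conservative design currents,
# taken from the guidelines above (check your LED's datasheet for real numbers).
LED_DESIGN_VALUES = {
    # type:                        (forward volts, design milliamps)
    "standard red":                (1.7, 15),
    "high-brightness red":         (1.9, 15),
    "orange / yellow":             (2.0, 15),
    "green":                       (2.1, 15),
    "white / blue / pure green":   (3.4, 12),
    "430 nm deep blue":            (4.6, 10),
}
```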
You can design for higher current if you are adventurous or you know heat buildup won't be a problem. In that case, design for 25 mA for the types with voltage near 2 volts, 18 mA for the 3.4 volt types, and 15 mA for the 430 nm blue.
Meet or exceed the maximum rated current of the LED only under favorable conditions with little heat buildup. Some LED current ratings assume some really favorable test conditions, such as being surrounded by air no warmer than 25 degrees Celsius and some decent thermal conduction from where the leads are mounted. Running the LED at the specified laboratory conditions used for the maximum current rating will make it lose half its light output after rated life expectancy (20,000 to 100,000 hours) - optimistically! You can use somewhat higher currents if you heat-sink the leads and/or can tolerate much shorter life expectancy.
Next, know your supply voltage. It should be well above the LED voltage for reliable, stable LED operation. Use at least 3 volts for the lower voltage types, 4.5 volts for the 3.4 volt types, and 6 volts for the 430 nm blue.
The voltage in most cars is 14 volts while the alternator is successfully charging the battery. A well-charged 12 volt lead-acid battery is 12.6 volts with a light load discharging it. Many "wall wart" DC power supplies provide much higher voltage than specified if the load is light, so you need to measure them under a light load that draws maybe 10-20 milliamps.
The next step is to subtract the LED voltage from the supply voltage. This gives you the voltage that must be dropped by the dropping resistor. Example: 3.4 volt LED with a 6 volt supply voltage. Subtracting these gives 2.6 volts to be dropped by the dropping resistor.
The next step is to divide the dropped voltage by the LED current to get the value of the dropping resistor. If you divide volts by amps, you get the resistor value in ohms. If you divide volts by milliamps, you get the resistor value in kilo-ohms, or k.
Example: 6 volt supply, 3.4 volt LED, 12 milliamps. Divide 2.6 by .012. This gives 217 ohms. The nearest standard resistor value is 220 ohms.
If you want to operate the 3.4 volt LED from a 6 volt power supply at the LED's "typical" current of 20 mA, then 2.6 divided by .02 yields a resistor value of 130 ohms. The next higher popular standard value is 150 ohms.
If you want to run a typical 3.4 volt LED from a 6 volt supply at its maximum rated current of 30 mA, then divide 2.6 by .03. This indicates 87 ohms. The next higher popular standard resistor value is 100 ohms. Please beware that I consider the 30 mA rating for 3.4-3.5 volt LEDs to be optimistic.
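For anyone who would rather let a script do this arithmetic, here's a rough Python sketch of the same calculation. The E12 rounding table is my assumption for what "next higher popular standard value" means; it happens to reproduce the 220, 150, and 100 ohm answers above.

```python
import math

# E12 standard resistor mantissas, used to round up to a "popular" value.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_standard_up(ohms):
    """Round a computed resistance up to the next E12 standard value."""
    decade = 10 ** math.floor(math.log10(ohms))
    for m in E12:
        if m * decade >= ohms:
            return m * decade
    return 10 * decade  # e.g. 87 ohms rounds up into the next decade: 100 ohms

def dropping_resistor(supply_v, led_v, led_ma):
    """Return (dropped volts, exact ohms, next standard value up)."""
    dropped_v = supply_v - led_v                # voltage the resistor must drop
    exact_ohms = dropped_v / (led_ma / 1000.0)  # ohms = volts / amps
    return dropped_v, exact_ohms, next_standard_up(exact_ohms)

# The examples above: 6 V supply, 3.4 V LED, at 12 mA, 20 mA, and 30 mA
for ma in (12, 20, 30):
    print(ma, dropping_resistor(6.0, 3.4, ma))
# 12 mA -> ~217 ohms, use 220;  20 mA -> 130, use 150;  30 mA -> ~87, use 100
```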
One more thing to do is to check the resistor wattage. Multiply the dropped voltage by the LED current to get the wattage being dissipated in the resistor. Example: 2.6 volts times .03 amp (30 milliamps) is .078 watt. For good reliability, I recommend not exceeding 60 percent of the wattage rating of the resistor. A 1/4 watt resistor can easily handle .078 watt. In case you need a more powerful resistor, 1/2 watt resistors are widely available in the popular values.
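The wattage check can ride along as a quick sanity function; the 1/4 watt default below is just an assumed common small resistor size.

```python
def resistor_power_ok(dropped_v, led_a, rated_w=0.25):
    """Dissipated watts vs. 60 percent of the resistor's rating."""
    dissipated_w = dropped_v * led_a  # P = V * I
    return dissipated_w, dissipated_w <= 0.6 * rated_w

print(resistor_power_ok(2.6, 0.030))  # (0.078, True) -> a 1/4 watt part is comfortable
```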
You can put LEDs in series with only one resistor for the whole series string. Add up the voltages of all the LEDs in the series string. This should not exceed 80 percent of the supply voltage if you want good stability and predictable current consumption. The dropped voltage will then be the supply voltage minus the total voltage of the LEDs in the series string.
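And the series-string version, following the 80 percent rule above (the three-LED, 14 volt automotive example is just for illustration):

```python
def series_string_resistor(supply_v, led_voltages, led_ma):
    """One dropping resistor for a whole series string of LEDs."""
    total_led_v = sum(led_voltages)
    if total_led_v > 0.8 * supply_v:
        raise ValueError("string voltage exceeds 80% of the supply; use fewer LEDs per string")
    dropped_v = supply_v - total_led_v
    return dropped_v / (led_ma / 1000.0)

# e.g. three 3.4 V LEDs on a 14 V automotive supply, run at 12 mA
print(round(series_string_resistor(14.0, [3.4, 3.4, 3.4], 12)))  # about 317 ohms
```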
Do not put LEDs in parallel with each other. Although this usually works, it is not reliable. LEDs become more conductive as they warm up, which may lead to unstable current distribution through paralleled LEDs. LEDs in parallel need their own individual dropping resistors. Series strings can be paralleled if each string has its own dropping resistor.
If you want to get more complex, at least get complex for the right reason... pulsing the current beyond the LED's maximum rating in rapid succession gives higher light output without a dramatic decrease in life expectancy. That's for another discussion, though.
/hijack