Let me take a shot at some LED basics.
LEDs are diodes, and as such one has to think in currents and not voltages. At a given current, a given LED will put out a certain amount of light. At that current, the voltage drop across the LED will change as the temperature of the LED changes. If you put a voltage across an LED and slowly increase it, there will be a point where the current through the LED increases exponentially. The heat dissipated by the LED also increases with the current.
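If you want to see why I say think in currents, here's a little Python sketch of an idealized exponential diode curve. The 20 mA at 3.0 V anchor point and the 0.05 V "slope" are made-up illustrative numbers, not from any datasheet, and real LEDs have some series resistance that softens this a bit.

```python
import math

# Assume (purely for illustration) a white LED that passes 20 mA at 3.0 V
# and behaves like an ideal exponential diode.
I_REF = 0.020     # amps at the reference voltage (assumed)
V_REF = 3.0       # reference forward voltage, volts (assumed)
N_VT = 0.05       # effective exponential "slope" in volts (assumed)

def led_current(v):
    """Current through the idealized LED at forward voltage v."""
    return I_REF * math.exp((v - V_REF) / N_VT)

# A few tenths of a volt swings the current from a couple of mA to amps:
for v in (2.9, 3.0, 3.1, 3.2, 3.3):
    print(f"{v:.2f} V -> {led_current(v) * 1000:8.1f} mA")
```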
Small LEDs (currents of 50 mA or less) can often use a simple dropping resistor. This lets you put a range of voltages on it, such as the range often seen from a lead acid battery. The intensity of the LED will vary a bit, but from a practical point of view it works. The efficiency is low due to the energy lost in the dropping resistor, but the entire system draws so little power that the simplicity and cost savings make it worth it.
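Here's a quick back-of-the-envelope sketch (Python) of sizing that dropping resistor for a 12 volt lead acid system. The 3.3 V forward drop and 20 mA target are typical-ish assumed numbers, not from any one datasheet.

```python
# Size a dropping resistor for a small indicator LED on a lead acid battery.
V_SUPPLY_NOM = 12.6   # "full" battery, volts
V_LED = 3.3           # assumed white LED forward drop, volts
I_TARGET = 0.020      # 20 mA target current, amps

r = (V_SUPPLY_NOM - V_LED) / I_TARGET
print(f"Resistor: {r:.0f} ohms")   # ~465 ohms, so use a standard 470 ohm part

# Current drifts a bit as the battery voltage swings, but stays sane:
for v in (11.0, 12.6, 14.4):
    i = (v - V_LED) / 470.0
    print(f"{v:4.1f} V supply -> {i * 1000:4.1f} mA, "
          f"{(v - V_LED) * i:.2f} W lost in the resistor")
```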
Larger LEDs, such as the 1 watt ones, need about a third of an amp. While you can use a dropping resistor, the losses in the resistor start to be a problem. The voltage drop on an LED depends on the material, which is also what determines the color of the LED. White and blue LEDs tend to have about 3.3 volts of drop (with the aforementioned changes with temperature), green is about 2 volts, and red is about 1.4 volts.
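To show why the resistor losses get ugly at these currents, here's the same arithmetic for an assumed 3.3 V white LED at 350 mA running off a nominal 13.8 V charging bus (illustrative numbers only).

```python
# Why a plain dropping resistor gets painful at 1 watt LED currents.
V_SUPPLY = 13.8   # charging bus, volts (assumed)
V_LED = 3.3       # white LED forward drop, volts (assumed)
I_LED = 0.350     # LED current, amps (assumed)

p_resistor = (V_SUPPLY - V_LED) * I_LED   # heat wasted in the resistor
p_led = V_LED * I_LED                     # power actually lighting the LED

print(f"Resistor burns {p_resistor:.2f} W to deliver {p_led:.2f} W to the LED")
print(f"Efficiency of the dropper approach: {p_led / (p_led + p_resistor):.0%}")
```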
So with the larger LEDs, electronic circuits are used to create a constant current power supply. There are lots of chips to do this at reasonable cost, but this is not a DIY project. The chips typically take an input voltage, say 6 to 30 volts, and output a constant current to drive the LEDs. LEDs are often put in series. For example, you could put 2 or 3 white LEDs in series and run them off of 14 volts; I've used 2 LEDs in series for battery stuff so everything keeps working all the way down to 11 volts. The key concept here is constant current: you don't care about the changes in LED voltage with temperature, and the current through the LEDs does not change with the input voltage to the driving circuit.
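As a sanity check on the series count, here's a little sketch of how many 3.3 V LEDs fit on a battery that can sag to 11 volts. The couple of volts of headroom for the driver chip is an assumption; check the datasheet of whatever driver you actually use.

```python
import math

V_MIN_SUPPLY = 11.0       # lowest battery voltage you want to run at
V_LED = 3.3               # per-LED forward drop, volts (assumed)
V_DRIVER_HEADROOM = 2.0   # voltage the driver needs to regulate (assumed)

max_in_series = math.floor((V_MIN_SUPPLY - V_DRIVER_HEADROOM) / V_LED)
print(f"Max LEDs in series: {max_in_series}")   # 2, matching what I use
```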
Driving LEDs from AC (like house voltage) has the same problem as compact fluorescents: you have to build a power supply into the fixture, and the electronics don't like all the trapped heat. Driving LEDs from a DC source of less than 30 volts is much easier since there is just the little driver chip.
The lifetime of LEDs is somewhat subjective. You plot the light output vs. time, and at some threshold, often 70% of initial output, you declare the LED to be dead. This number is out in the 100,000 hour range for most LEDs. The problem, and this is why LED lighting is having a hard time taking off, is that you have to get rid of the heat the LED makes. Heat is the number one killer of all electronic devices. For every 10 degrees C (18 degrees F) rise in temperature, the LED life is cut in half (this is a rule of thumb, so don't flame me here). Getting rid of heat is expensive since it requires a metal heatsink, and the means of thermally connecting the LED to the heatsink are often expensive from a manufacturing perspective. So a poor manufacturer can cut corners here, run the LED hot, and get 10,000 hours before the LED burns out. Put a 1 year warranty on it and they are good to go. The same poor manufacturer can run the LED at a higher current for more light output, but that higher current means higher temperatures and less life.
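Putting numbers on that rule of thumb, here's what halving-per-10-degrees does to an LED assumed (purely for illustration) to be rated for 100,000 hours at a 25 C junction temperature.

```python
# The "life halves every 10 C hotter" rule of thumb applied to made-up
# rated-life numbers; real ratings depend on the part and how it's driven.
RATED_LIFE_HOURS = 100_000
RATED_TEMP_C = 25

def estimated_life(junction_temp_c):
    """Rule-of-thumb life estimate at a given junction temperature."""
    return RATED_LIFE_HOURS / 2 ** ((junction_temp_c - RATED_TEMP_C) / 10)

for t in (25, 45, 65, 85):
    print(f"{t} C junction -> ~{estimated_life(t):,.0f} hours")
```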
LED light systems powered by AC have the double whammy of getting rid of both the heat from the LED and the heat from the electronics, often in a form factor and space designed for an incandescent, such as a ceiling can. If you look at a lot of the commercial LED fixtures (not retrofits for the standard type A "light bulb") you'll see that the rear of them is often a big finned heat sink.
As for efficiency, the design of the LED and the fixture both play a role here. Another rule of thumb is that 1 watt of LED gives about the same amount of light as 5 watts of incandescent, so a 1 watt LED is about like a long life night light bulb. But to get that 1 watt into the LED, the total power consumption (LED _plus_ the circuitry) might be in the 1.1 to 1.4 watt range. LEDs in series are popular because you only pay the extra power penalty for the constant current supply once, even if you're driving a bunch of LEDs.
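Here's that series-string bookkeeping in a few lines. The flat 0.3 watt driver loss is an assumed round number; real drivers will vary with load and input voltage.

```python
# How series strings amortize the constant current driver's overhead.
P_PER_LED = 1.0     # watts delivered into each LED
DRIVER_LOSS = 0.3   # watts lost in the driver, assumed fixed

for n_leds in (1, 2, 3):
    total_in = n_leds * P_PER_LED + DRIVER_LOSS
    print(f"{n_leds} LED(s) in series: {total_in:.1f} W in, "
          f"{n_leds * P_PER_LED / total_in:.0%} of it reaching the LEDs")
```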
So LEDs are current devices, and either simple resistors or constant current electronics make them work with constant (or near constant) voltage sources. Heat is the dominant factor in LED life. Efficiency is pretty good, but there is a small price to pay for the electronics. Due to the long life of a correctly engineered LED, unscrupulous or inexperienced manufacturers can skimp on cost and still have a device with 5x the lifetime of an incandescent light. And in some cases, that might be an OK trade-off (like a flashlight).
Finally, LEDs and the electronic driver circuits are semiconductors, and as such are very sensitive to voltage spikes. In theory, someone making such a device knows this and will put in spike protection. In practice, for AC stuff, be sure you have a whole-house surge protector. For DC operated units, try to use a spike protector on the power source or figure out (and this can be impossible) whether the manufacturer has included the protection in their LED lights.
So there's some condensed background. The OP asked about LED lights. Done right, they are lower current, longer life, and a win. They can save money on installation due to the smaller wire sizes needed, and cost less to operate due to lower power and higher efficiency. The drawback is higher initial cost. If you can get something that's not a "light bulb retrofit" package, it has a better chance of being done right.
Another long post, hope it helps....
Pete