Solar FAQ

When you install a solar energy system on your property, you save money on your electricity bills and protect yourself against rising electricity rates in the future. How much you can save depends on the utility rates and solar policies in your area, but going solar can be a smart investment in most regions.
Solar power, like other renewable energy resources, has many environmental and health benefits. Going solar reduces greenhouse gas emissions, which contribute to climate change, and also results in fewer air pollutants like sulfur dioxide and particulate matter, which can cause health problems.
Net metering is the system that utilities use to credit solar energy system owners for the electricity produced by their solar panels. With net metering, you only pay for the electricity that you use beyond what your solar panels can generate. Net metering policies differ from state to state, so make sure to do your homework ahead of time.
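The net-metering arithmetic described above can be sketched in a few lines. This is a simplified illustration only: the rate and the kWh figures below are made-up values, and real tariffs and surplus-credit rules vary by state and utility.

```python
# Under net metering you are billed only for consumption beyond what
# your panels generate. All numbers below are illustrative.

def net_metered_bill(used_kwh: float, generated_kwh: float, rate_per_kwh: float) -> float:
    """Bill for the net energy drawn from the grid. Never below zero here;
    handling of surplus credits varies by state and is omitted."""
    net_kwh = max(used_kwh - generated_kwh, 0.0)
    return net_kwh * rate_per_kwh

# 900 kWh used, 650 kWh generated, $0.15/kWh -> billed for 250 kWh net.
print(net_metered_bill(900.0, 650.0, 0.15))  # -> 37.5
```

In a month where the panels generate more than the home uses, this sketch simply bills zero; whether the surplus rolls over as a credit depends on local policy.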
Solar panels absorb the sun's energy throughout the day and convert it into direct current (DC) electricity. Most homes and businesses run on alternating current (AC) electricity, so the DC electricity is then passed through an inverter to convert it to usable AC electricity. At that point, you either use the electricity in your house or send it back to the electric grid.
The amount of power your solar energy system can generate is dependent on sunlight. As a result, your solar panels will produce slightly less energy when the weather is cloudy, and no energy at night. However, because of high electricity costs and financial incentives, solar is a smart decision even if you live in a cloudy city.
Solar panel systems are made of durable tempered glass and require little to no maintenance for the 25 to 35 years that they will generate power. In most cases, you don’t even need to clean your solar panels regularly. If something does happen, most equipment manufacturers include warranties, although warranty terms depend on the company.

Generator FAQ

The primary difference between kW (kilowatt) and kVA (kilovolt-ampere) is the power factor. kW is the unit of real power, while kVA is the unit of apparent power (the combination of real and reactive power). Unless the power factor is defined and known, it is taken as an approximate value (typically 0.8), and the kVA value will always be higher than the kW value.
In relation to industrial and commercial generators, kW is most commonly used when referring to generators in the United States and a few other countries that use 60 Hz power, while most of the rest of the world uses kVA as the primary value when referencing generator sets.
To expand on this a bit more, the kW rating is essentially the power output a generator can supply based on the horsepower of its engine: kW is figured as the horsepower rating of the engine times 0.746. For example, a 500-horsepower engine has a kW rating of 373. The kilovolt-amperes (kVA) are the generator-end capacity. Generator sets are usually shown with both ratings. To determine the kW and kVA ratio, the formula below is used:
0.8 (pf) x 625 (kVA) = 500 kW
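The horsepower-to-kW conversion and the pf formula above can be sketched as follows. The 0.746 factor and the 0.8 power factor come from the text; the function names themselves are illustrative, not a real API.

```python
# Minimal sketch of the kW/kVA relationships described above.

HP_TO_KW = 0.746  # 1 mechanical horsepower ~ 0.746 kW (from the text)

def engine_kw(horsepower: float) -> float:
    """Approximate generator kW output from engine horsepower."""
    return horsepower * HP_TO_KW

def kva_from_kw(kw: float, power_factor: float = 0.8) -> float:
    """Apparent power (kVA) for a given real power (kW) and power factor."""
    return kw / power_factor

print(engine_kw(500))    # 500 hp engine -> 373.0 kW
print(kva_from_kw(500))  # 500 kW at pf 0.8 -> 625.0 kVA
```

This reproduces the worked example in the text: a 500 hp engine yields 373 kW, and a 500 kW set at a 0.8 power factor corresponds to 625 kVA.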
The power factor (pf) is typically defined as the ratio between kilowatts (kW) and kilovolt-amperes (kVA) drawn by an electrical load, as discussed in more detail in the question above. It is determined by the generator's connected load. The pf on the nameplate of a generator relates the kVA rating to the kW rating (see the formula above). Generators with higher power factors transfer energy to the connected load more efficiently, while generators with lower power factors are less efficient and result in increased power costs. The standard power factor for a three-phase generator is 0.8.
Standby power generators are most often used in emergency situations, such as during a power outage. A standby rating is ideal for applications that have another reliable, continuous power source, such as utility power. Its recommended usage is generally limited to the duration of a power outage, plus regular testing and maintenance.
Prime power ratings can be defined as having an “unlimited run time”, or essentially a generator that will be used as a primary power source and not just for standby or backup power. A prime power rated generator can supply power in a situation where there is no utility source, as is often the case in industrial applications like mining or oil & gas operations located in remote areas where the grid is not accessible.
Continuous power is similar to prime power but has a base load rating. It can supply power continuously to a constant load, but does not have the ability to handle overload conditions or work as well with variable loads. The main difference between a prime and continuous rating is that prime power gensets are set to have maximum power available at a variable load for an unlimited number of hours, and they generally include a 10% or so overload capability for short durations.

Transformer FAQ

A transformer is a static piece of electrical equipment designed to convert alternating current from one voltage to another. It can be designed to “step up” or “step down” voltage and works on the principle of magnetic induction.
A transformer has no moving parts and is a completely static device, which ensures, under normal conditions, a long and trouble-free life. It consists, in its simplest form, of two or more coils of insulated wire wound on a laminated steel core. When voltage is applied to one coil, called the primary, it magnetizes the iron core.
A voltage is then induced in the other coil, called the secondary or output coil. The change of voltage level (or potential difference ratio) between the primary and secondary depends on the turns ratio of the two coils.
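The turns-ratio relationship described above can be sketched for an ideal (lossless) transformer. The winding counts and the 240 V primary below are illustrative values, not figures from the text.

```python
# Ideal-transformer turns-ratio relationship: Vs / Vp = Ns / Np.
# All winding counts and voltages below are illustrative.

def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal (lossless) secondary voltage from the turns ratio."""
    return v_primary * n_secondary / n_primary

# A 2:1 step-down: 240 V primary, 1000 primary turns, 500 secondary turns.
print(secondary_voltage(240.0, 1000, 500))  # -> 120.0
```

Swapping the winding counts turns the same relationship into a step-up: fewer primary turns than secondary turns raises the output voltage.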
Generally, there are two kinds of losses in a transformer: iron losses and copper losses. Since iron losses depend on voltage and copper losses depend on current, the total losses depend on voltage and current, and no power factor is involved. This is why transformers are rated in kVA rather than kW, as a kW rating would have to include a power factor.
Yes, a transformer can be connected in reverse. However, the output voltage will be less than the rated voltage due to the compensation factor of the windings.
On the basis of their use:
1. Power transformer: Used in the transmission network; high rating.
2. Distribution transformer: Used in the distribution network; comparatively lower rating than power transformers.
1. IS 1180 (Part 1): 2014 - Outdoor type, insulated liquid immersed Distribution Transformers up to and including 2500 kVA, 33 kV (Part 1: Mineral Oil Immersed).
2. IS 2026 (Part 1): 2011 - Three-phase and single-phase power transformers (including autotransformers).

Earthing FAQ

Electrical grounding, or “grounding”, originally began as a safety measure used to help prevent people from accidentally coming into contact with electrical hazards. Think of your refrigerator. It’s a metal box standing on rubber feet with electricity running in and out of it. You use magnets to hang your child’s latest drawing on the metal exterior. The electricity running from the outlet through the power cord to the electrical components inside the refrigerator is electrically isolated from the metal exterior, or chassis, of the refrigerator.
If for some reason the electricity came in contact with the chassis, the rubber feet would prevent the electricity from going anywhere and it would “sit”, waiting for someone to walk up and touch the refrigerator. Once someone touched the refrigerator, the electricity would flow from the chassis of the refrigerator and through the unlucky person, possibly causing injury.
Grounding is used to protect that person. A wire is connected to the metal frame of the refrigerator, so that if the chassis inadvertently becomes charged for any reason, the unwanted electricity will travel down the wire and safely into the earth, tripping the circuit breaker and stopping the flow of electricity in the process. Obviously, that wire has to connect to something that is in turn connected to the earth or ground outside.
The process of electrically connecting to the earth itself is often called “earthing”, particularly in Europe, where the term “grounding” is used to describe the above-ground wiring. In America, the term “grounding” is used to describe both earthing and grounding.
While grounding may have originally been considered only as a safety measure, with today’s advances in electronics and technology, grounding has become an essential part of everyday electricity. Computers, televisions, microwave ovens, fluorescent lights, and many other electrical devices generate lots of “electrical noise” that can damage equipment and cause it to work less efficiently. Proper grounding can not only remove this unwanted “noise”, but can even make surge protection devices work better.
The most common performance criterion or specification used today is Resistance-to-Ground, commonly called “ground resistance”. In the electrical world, resistance is anything that opposes the flow of electricity. Do you remember all the hype about “superconductors” that has been in the news for the last decade or so? With superconductors, scientists are trying to develop a material with zero resistance to electricity, but a practical room-temperature superconductor has yet to be achieved.
It turns out that all known materials have an electrical resistance at some level, even copper. So, as you can imagine, dirt, rock, and sand have varying resistances, and based on the particular composition of the soil, the resistance to electricity that your particular part of the earth provides can be very different. In fact, the resistivity of the soil (measured in ohm-meters) can vary from location to location by thousands of ohm-meters, and that can make a big difference in how effective your grounding will be.
Resistance-to-Ground (or ground resistance) is a measurement of the actual resistance of the electrodes in the grounding system. The measurement is made in ohms, with a target level of 25 ohms or less being mandated by the National Electrical Code.
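To illustrate how soil resistivity translates into electrode resistance, the classic textbook approximation for a single vertical ground rod, R = ρ/(2πL) · (ln(4L/a) − 1), can be sketched as below. The formula and all the numbers (100 Ω·m soil, an 8 ft rod) are standard illustrative assumptions, not figures from the text.

```python
import math

def rod_resistance(rho_ohm_m: float, length_m: float, radius_m: float) -> float:
    """Approximate resistance-to-ground of a single vertical ground rod,
    using the textbook formula R = rho / (2*pi*L) * (ln(4L/a) - 1)."""
    return rho_ohm_m / (2 * math.pi * length_m) * (
        math.log(4 * length_m / radius_m) - 1
    )

# Illustrative values: 100 ohm-m soil, 2.4 m (~8 ft) rod, 8 mm radius.
r = rod_resistance(100.0, 2.4, 0.008)
print(round(r, 1))  # roughly 40 ohms in this soil
```

In this illustrative soil the single rod lands well above the 25-ohm target, which is why installers often drive longer rods, add parallel rods, or treat the soil to bring the measured ground resistance down.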