Network analyzer (AC power)

From 1929 to the late 1960s, large alternating current power systems were modelled and studied on AC network analyzers (also called alternating current network calculators or AC calculating boards) or transient network analyzers. These special-purpose analog computers were an outgrowth of the DC calculating boards used in the very earliest power system analysis. By the middle of the 1950s, fifty network analyzers were in operation. AC network analyzers were widely used for power flow studies, short circuit calculations, and system stability studies, but were ultimately replaced by numerical solutions running on digital computers. While an analyzer could provide real-time simulation of events with no concerns about the numerical stability of algorithms, it was costly, inflexible, and limited in the number of buses and lines that could be simulated. Eventually powerful digital computers replaced analog network analyzers for practical calculations, but analog physical models for studying electrical transients are still in use.





Calculating methods

As AC power systems became larger at the start of the 20th century, with more interconnected devices, the problem of calculating their expected behavior became more difficult. Manual methods were only practical for systems with a few sources and nodes; the complexity of practical problems made manual calculation techniques too laborious or inaccurate to be useful. Many mechanical aids to calculation were developed to solve problems relating to electric power networks.

DC calculating boards used resistors and DC sources to represent an AC network. A resistor was used to model the inductive reactance of a circuit, while the actual series resistance of the circuit was neglected. The principal disadvantage was the inability to model complex impedances. However, for short-circuit fault studies, the effect of the resistance component was usually small. DC boards produced results with errors of around 20%, sufficient for some purposes.
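
As a rough numerical illustration of why neglecting resistance was tolerable for fault studies, here is a minimal sketch; the line constants are hypothetical and not taken from any system described here:

```python
import math

# Illustrative line constants (not from any particular system):
# a short circuit fed through an impedance with X much larger than R.
r = 0.05   # series resistance, per unit
x = 0.40   # inductive reactance, per unit
v = 1.00   # source voltage, per unit

i_exact = v / math.hypot(r, x)   # fault current with the full impedance
i_dc_board = v / x               # DC board: reactance modelled by a resistor, R neglected

error = abs(i_exact - i_dc_board) / i_exact
print(f"exact {i_exact:.3f} pu, board {i_dc_board:.3f} pu, error {error:.1%}")
# With R much smaller than X the error here is under 1%, well inside the
# roughly 20% overall accuracy that a DC board delivered.
```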

Artificial lines were used to analyze transmission lines. These carefully constructed replicas of the distributed inductance, capacitance and resistance of a full-size line were used to investigate propagation of impulses in lines and to validate theoretical calculations of transmission line properties. An artificial line was made by winding layers of wire around a glass cylinder, with interleaved sheets of tin foil, to give the model proportionally the same distributed inductance and capacitance as the full-size line. Later, lumped-element approximations of transmission lines were found to give adequate precision for many calculations.
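
The adequacy of a lumped-element approximation can be sketched by cascading nominal-pi sections and comparing the result with the exact distributed-parameter line; the constants below are illustrative assumptions, not figures from the text:

```python
import cmath

# Hypothetical 60 Hz line constants, for illustration only.
z = 0.05 + 0.5j      # series impedance, ohm per km
y = 3.3e-6j          # shunt admittance, siemens per km
length_km = 300.0

def nominal_pi(z_total, y_total):
    """ABCD constants of one nominal-pi section."""
    a = 1 + z_total * y_total / 2
    b = z_total
    c = y_total * (1 + z_total * y_total / 4)
    return (a, b, c, a)

def cascade(m1, m2):
    """Multiply two ABCD two-port matrices."""
    a1, b1, c1, d1 = m1
    a2, b2, c2, d2 = m2
    return (a1*a2 + b1*c2, a1*b2 + b1*d2, c1*a2 + d1*c2, c1*b2 + d1*d2)

# Exact distributed-parameter result for comparison: A = cosh(gamma * l).
gamma_l = cmath.sqrt(z * y) * length_km
a_exact = cmath.cosh(gamma_l)

for n in (1, 3, 10):
    section = nominal_pi(z * length_km / n, y * length_km / n)
    total = section
    for _ in range(n - 1):
        total = cascade(total, section)
    a = total[0]
    print(f"{n:2d} pi sections: A = {a.real:.4f}{a.imag:+.4f}j "
          f"(exact {a_exact.real:.4f}{a_exact.imag:+.4f}j)")
```

A handful of lumped sections already reproduces the distributed-line constants closely, which is why the lumped models were judged adequate for many calculations.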

Laboratory investigations of the stability of multiple-machine systems were constrained by the use of direct-operated indicating instruments (voltmeters, ammeters, and wattmeters). To ensure that the instruments negligibly loaded the model system, the machine power level used was substantial. Some workers in the 1920s used three-phase model generators rated up to 600 kVA and 2300 volts to represent a power system. General Electric developed model systems using generators rated at 3.75 kVA. It was difficult to keep multiple generators in synchronism, and the size and cost of the units was a constraint. While transmission lines and loads could be accurately scaled down to laboratory representations, rotating machines could not be accurately miniaturized and keep the same dynamic characteristics as full-sized prototypes; the ratio of machine inertia to machine frictional loss did not scale.





Scale model

A network analyzer system was essentially a scale model of the electrical properties of a specific power system. Generators, transmission lines, and loads were represented by miniature electrical components with scale values in proportion to the modeled system. Model components were interconnected with flexible cords to represent the schematic diagram of the modeled system.

Instead of using miniature rotating machines, accurately calibrated phase-shifting transformers were built to simulate electrical machines. These were all energized by the same source (at local power frequency or from a motor-generator set) and so inherently maintained synchronism. The phase angle and terminal voltage of each simulated generator could be set using rotary scales on each phase-shifting transformer unit. Using the per-unit system allowed values to be conveniently interpreted without additional calculation.
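
A minimal sketch of the steady-state relation behind those settings, assuming a lossless tie between two sources; the per-unit numbers are illustrative, not values from any actual analyzer:

```python
import math

def power_flow(v_send, v_recv, delta_deg, x_line):
    """Sending-end real and reactive power across a lossless reactance,
    the quantities an operator would read from the analyzer's instruments."""
    d = math.radians(delta_deg)
    p = v_send * v_recv * math.sin(d) / x_line
    q = (v_send**2 - v_send * v_recv * math.cos(d)) / x_line
    return p, q

# Illustrative per-unit settings, as if dialed in on two phase-shifter units:
for angle in (10, 20, 30):
    p, q = power_flow(1.05, 1.00, angle, 0.5)
    print(f"delta = {angle:2d} deg: P = {p:.3f} pu, Q = {q:.3f} pu")
```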

To reduce the size of the model components, the network analyzer was often energized at a higher frequency than the 50 Hz or 60 Hz utility frequency. The operating frequency was chosen to be high enough to allow high-quality inductors and capacitors to be made, and to be compatible with the available indicating instruments, but not so high that stray capacitance would affect results. Many systems used either 440 Hz or 480 Hz, provided by a motor-generator set. Some systems used 10 kHz, with capacitors and inductors similar to those used in radio electronics.
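
A quick arithmetic check of the size advantage, assuming a fixed reactance target; the 100-ohm figure is purely illustrative:

```python
import math

x_needed = 100.0   # desired model reactance in ohms (illustrative)

for f in (60, 480, 10_000):
    inductance = x_needed / (2 * math.pi * f)   # L = X / (2*pi*f)
    print(f"{f:6d} Hz: {inductance*1e3:8.2f} mH")
# The same reactance needs an inductor 8 times smaller at 480 Hz than at 60 Hz,
# and roughly 167 times smaller at 10 kHz, hence the small "radio-style" parts.
```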

Model circuits were energized at relatively low voltages to allow for safe measurement with adequate precision. The model base quantities varied by manufacturer and date of design; as amplified indicating instruments became more common, lower base quantities became feasible. Model voltages and currents began at around 200 volts and 0.5 amperes in the MIT analyzer, which still allowed directly driven (but especially sensitive) instruments to be used to measure model parameters. Later machines used as little as 50 volts and 50 mA, in conjunction with amplified indicating instruments. By use of the per-unit system, model quantities could be readily transformed into the actual system quantities of voltage, current, power or impedance. A watt measured in the model might correspond to hundreds of kilowatts or megawatts in the modeled system. One hundred volts measured on the model might correspond to one per-unit, which could represent, say, 230,000 volts on a transmission line or 11,000 volts in a distribution system. Typically, results accurate to around 2% of measurement could be obtained. Model components were single-phase devices, but by using the method of symmetrical components, unbalanced three-phase systems could be studied as well.
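
A small sketch of the per-unit bookkeeping, using the 100-volt / 230,000-volt correspondence mentioned above; the 96.5 V reading is a hypothetical example:

```python
# Base quantities from the example above: 100 model volts = 1.0 per unit,
# and 1.0 per unit = 230 kV in the modelled transmission system.
MODEL_VOLT_BASE = 100.0        # volts on the analyzer per 1.0 pu
SYSTEM_VOLT_BASE = 230_000.0   # volts in the real system per 1.0 pu

def model_to_system_volts(model_volts):
    per_unit = model_volts / MODEL_VOLT_BASE
    return per_unit, per_unit * SYSTEM_VOLT_BASE

pu, volts = model_to_system_volts(96.5)   # e.g. a reading of 96.5 V on the board
print(f"{pu:.3f} pu -> {volts/1000:.1f} kV on the modelled line")
```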

A complete network analyzer was a system that filled a large room; one model was described as four bays of equipment, spanning a U-shaped arrangement 26 feet (8 metres) across. Companies such as General Electric and Westinghouse could provide consulting services based on their analyzers, while some large electrical utilities operated their own. The use of network analyzers allowed quick solutions to difficult calculation problems, and allowed problems to be analyzed that would otherwise be uneconomic to compute by manual calculation. Although expensive to build and operate, network analyzers often repaid their costs in reduced calculation time and expedited project schedules. For example, a stability study might indicate whether a transmission line should have larger or differently spaced conductors to preserve stability margin during system faults, potentially saving many miles of cable and thousands of insulators.

Network analyzers did not directly simulate the dynamic response of machines (torque angle and related quantities) to changes in load. Instead, the analyzer was used to solve dynamic problems in a stepwise fashion: first calculating a load flow, then adjusting the phase angle of each machine in response to its power flow, and then recalculating the load flow.
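
A minimal sketch of that stepwise procedure, carried out numerically here for a single machine swinging against an infinite bus; the per-unit data and the applied load change are assumptions for illustration, not values from any analyzer study:

```python
import math

# Single machine against an infinite bus; illustrative per-unit data.
H, F = 4.0, 60.0                 # inertia constant (s), system frequency (Hz)
E, V, X = 1.1, 1.0, 0.5          # internal EMF, bus voltage, total reactance (pu)
M = 2 * H / (2 * math.pi * F)    # inertia coefficient of the swing equation

p_mech = 0.8                             # new mechanical power after a load change
delta = math.asin(0.5 * X / (E * V))     # angle for the pre-disturbance loading of 0.5 pu
omega, dt = 0.0, 0.01                    # speed deviation, step length (s)

for step in range(200):
    # Step 1 ("load flow"): electrical power at the present machine angle.
    p_elec = E * V * math.sin(delta) / X
    # Step 2 ("adjust the machine"): advance the swing equation one interval.
    omega += dt * (p_mech - p_elec) / M
    delta += dt * omega
    if step % 40 == 0:
        print(f"t = {step*dt:4.2f} s  delta = {math.degrees(delta):6.2f} deg")
```

On the analyzer, step 1 was a measurement on the model and step 2 was a manual adjustment of the phase-shifter dial; the code simply alternates the same two steps.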

In use, the system to be modelled would be represented as a single line diagram and all the impedances of lines and machines would be scaled to model values on the analyzer. A plugging diagram would be prepared to show the interconnections to be made between the model elements. The circuit elements would be interconnected by patch cables. The model system would be energized, and measurements taken at the points of interest in the model; these could be scaled up to the values in the full-scale system.
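
A short sketch of the impedance-scaling step, with hypothetical base quantities; the 50 V / 50 mA model bases echo the later analyzers mentioned above, but the system bases and the 42-ohm line are made up for illustration:

```python
# Hypothetical base quantities, for illustration only.
SYS_KV_BASE, SYS_MVA_BASE = 230.0, 100.0        # full-scale system bases
MODEL_V_BASE, MODEL_A_BASE = 50.0, 0.050        # analyzer bases (50 V, 50 mA)

z_base_system = SYS_KV_BASE**2 / SYS_MVA_BASE   # ohms per 1.0 pu in the real system
z_base_model = MODEL_V_BASE / MODEL_A_BASE      # ohms per 1.0 pu on the board

def line_to_model_ohms(z_actual_ohms):
    per_unit = z_actual_ohms / z_base_system
    return per_unit, per_unit * z_base_model

pu, z_model = line_to_model_ohms(42.0)          # a 42-ohm line reactance, say
print(f"{pu:.4f} pu -> set {z_model:.1f} ohms on the model reactor")
```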




The MIT network analyzer

The network analyzer installed at the Massachusetts Institute of Technology (MIT) grew out of a 1924 thesis project by Hugh H. Spencer and Harold Locke Hazen, investigating a power system modelling concept proposed by Vannevar Bush. Instead of miniature rotating machines, each generator was represented by a transformer with adjustable voltage and phase, all fed from a common source. This avoided the poor accuracy of models based on miniature machines. The 1925 publication of this thesis attracted attention at General Electric, where Robert Doherty was interested in modelling problems of system stability. He asked Hazen to verify that the model could accurately reproduce the behavior of machines during load changes.

Design and construction were carried out jointly by General Electric and MIT. When first demonstrated in June 1929, the system had eight phase-shifting transformers to represent synchronous machines. Other elements included 100 variable line resistors, 100 variable reactors, 32 fixed capacitors, and 40 adjustable load units. The analyzer was described in a 1930 paper by H. L. Hazen, O. R. Schurig and M. F. Gardner. The base quantities for the analyzer were 200 volts and 0.5 amperes. Sensitive portable thermocouple-type instruments were used for measurement. The analyzer occupied four large panels, arranged in a U-shape, with tables in front of each section to hold measuring instruments. While primarily conceived as an educational tool, the analyzer saw considerable use by outside firms, who would pay to use the device. The American Gas and Electric Company, the Tennessee Valley Authority, and many other organizations studied problems on the MIT analyzer in its first decade of operation. In 1940 the system was moved and expanded to handle more complex systems.

By 1953 the MIT analyzer was beginning to fall behind the state of the art. Digital computers had been applied to power system problems as early as 1949, on the "Whirlwind" machine. Unlike most of the forty other analyzers in service by that point, the MIT instrument was energized at 60 Hz rather than 440 or 480 Hz, making its components large and expansion to new types of problems difficult. Many utility customers had bought their own network analyzers. The MIT system was dismantled and sold to the Puerto Rico Water Resources Authority in 1954.




Commercial manufacturers

By 1947, fourteen network analyzers had been built at a total cost of about two million US dollars. General Electric built two full-scale network analyzers for its own work and for services to its clients. Westinghouse built systems for their internal use and provided more than 20 analyzers to utility and university clients. After the Second World War analyzers were known to be in use in France, the UK, Australia, Japan, and the Soviet Union. Later models had improvements such as centralized control of switching, central measurement bays, and chart recorders to automatically provide permanent records of results.

General Electric's Model 307 was a miniaturized AC network analyzer with four generator units and a single electronically amplified metering unit. It was targeted at utility companies with problems too large for hand computation but not worth the expense of renting time on a full-size analyzer. Like the Iowa State College analyzer, it used a system frequency of 10 kHz instead of 60 Hz or 480 Hz, allowing much smaller radio-style capacitors and inductors to be used to model power system components. The 307 was cataloged from 1957 and had a list of about 20 utility, educational and government customers. In 1959 its list price was $8,590.

In 1953, the Metropolitan Edison Company and a group of six other electrical companies purchased a new Westinghouse AC network analyzer for installation at the Franklin Institute in Philadelphia. The system, described as the largest ever built, cost $400,000.

In Japan, network analyzers were installed starting in 1951. The Yokogawa Electric company introduced a model energized at 3980 Hz starting in 1956.




Other applications

Transient analyzer

A "transient network analyzer" was an analog model of a transmission system especially adapted to study high-frequency transient surges (such as those due to lightning or switching), instead of AC power frequency currents. Similarly to an AC network analyzer, they represented apparatus and lines with scaled inductances and resistances. A synchronously driven switch repeatedly applied a transient impulse to the model system, and the response at any point could be observed on an oscilloscope or recorded on an oscillograph. Some transient analyzers are still in use for research and education, sometimes combined with digital protective relays or recording instruments.

Anacom

The Westinghouse Anacom was an AC-energized electrical analog computer system used extensively for problems in mechanical design, structural elements, lubrication oil flow, and various transient problems including those due to lightning surges in electric power transmission systems. The excitation frequency of the computer could be varied. The Westinghouse Anacom constructed in 1948 was used up to the early 1990s for engineering calculations; its original cost was $500,000. The system was periodically updated and expanded; by the 1980s the Anacom could be run through many simulation cases unattended, under the control of a digital computer that automatically set up initial conditions and recorded the results. Westinghouse built a replica Anacom for Northwestern University, sold an Anacom to ABB, and twenty or thirty similar computers by other makers were used around the world.

Physics and chemistry

Since the many elements of an AC network analyzer together formed a powerful analog computer, problems in physics and chemistry were occasionally modeled on them in the late 1940s, before general-purpose digital computers became readily available, by researchers such as Gabriel Kron of General Electric. Another application was water flow in water distribution systems. The forces and displacements of a mechanical system could be readily modelled with the voltages and currents of a network analyzer, which allowed easy adjustment of properties such as the stiffness of a spring by, for example, changing the value of a capacitor.
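
A small sketch of that mechanical-electrical analogy, using the force-voltage (mass-inductance) mapping with made-up constants:

```python
import math

# Force-voltage analogy: mass <-> inductance, damping <-> resistance,
# spring compliance 1/k <-> capacitance. All numbers are hypothetical.
mass, damping, stiffness = 2.0, 0.5, 800.0      # kg, N*s/m, N/m

L = mass                 # henries in the electrical analog
R = damping              # ohms
C = 1.0 / stiffness      # farads; a stiffer spring means a smaller capacitor

f_mech = math.sqrt(stiffness / mass) / (2 * math.pi)
f_elec = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(f"mechanical natural frequency {f_mech:.2f} Hz, analog circuit {f_elec:.2f} Hz")
```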

Structures

The David Taylor Model Basin operated an AC network analyzer from the late 1950s until the mid-1960s. The system was used on problems in ship design. An electrical analog of the structural properties of a proposed ship, shaft, or other structure could be built, and tested for its vibrational modes. Unlike AC analyzers used for power systems work, the exciting frequency was made continuously variable so that mechanical resonance effects could be investigated.
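
As a sketch of that variable-frequency use, the code below sweeps the excitation frequency of a hypothetical series RLC analog and picks out the resonance, roughly what an engineer did by reading a meter while turning the oscillator dial:

```python
import math

# Illustrative analog component values (not from any actual model).
R, L, C = 2.0, 0.1, 4.0e-6

def current_magnitude(f, v=1.0):
    """Current drawn by the series RLC analog at excitation frequency f."""
    w = 2 * math.pi * f
    z = complex(R, w * L - 1.0 / (w * C))
    return v / abs(z)

freqs = range(100, 1000, 5)
peak_f = max(freqs, key=current_magnitude)
print(f"resonant response near {peak_f} Hz "
      f"(theory: {1/(2*math.pi*math.sqrt(L*C)):.0f} Hz)")
```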




Decline and obsolescence

Even during the Depression and the Second World War, many network analyzers were constructed because of their great value in solving calculations related to electric power transmission. By the mid-1950s, about thirty analyzers were available in the United States, representing an oversupply. Institutions such as MIT could no longer justify operating analyzers, since fees from paying clients barely covered operating expenses.

Once digital computers of adequate performance became available, the solution methods developed on analog network analyzers were migrated to the digital realm, where plugboards, switches and meter pointers were replaced with punch cards and printouts. The same general-purpose digital computer hardware that ran network studies could easily be dual-tasked with business functions such as payroll. Analog network analyzers faded from general use for load-flow and fault studies, although some persisted in transient studies for a while longer. Analog analyzers were dismantled and either sold off to other utilities, donated to engineering schools, or scrapped.

The fate of a few analyzers illustrates the trend. The analyzer purchased by American Electric Power was replaced by digital systems in 1961, and donated to Virginia Tech. The Westinghouse network analyzer purchased by the State Electricity Commission of Victoria, Australia in 1950 was taken out of utility service in 1967 and donated to the Engineering department at Monash University; but by 1985, even instructional use of the analyzer was no longer practical and the system was finally dismantled.

One factor contributing to the obsolescence of analog models was the increasing complexity of interconnected power systems. Even a large analyzer could represent only a few machines and perhaps a few score lines and buses. Digital computers routinely handled systems with thousands of buses and transmission lines.
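
For contrast, a minimal Gauss-Seidel load-flow iteration of the kind that ran on those digital machines, on a made-up three-bus network; all data below are hypothetical:

```python
import cmath
import math

# Bus admittance matrix for a hypothetical 3-bus system (per unit).
y = 1 / complex(0.02, 0.08)          # identical admittance for each of the three ties
Y = [[ 2*y, -y,  -y ],
     [-y,   2*y, -y ],
     [-y,  -y,   2*y]]

# Scheduled injections at the two load (PQ) buses; bus 0 is the slack bus.
S = {1: complex(-0.6, -0.3), 2: complex(-0.4, -0.2)}   # demand appears as negative injection

V = [complex(1.0, 0.0)] * 3           # flat start
for _ in range(50):                   # Gauss-Seidel sweeps
    for i in (1, 2):
        summation = sum(Y[i][j] * V[j] for j in range(3) if j != i)
        V[i] = (S[i].conjugate() / V[i].conjugate() - summation) / Y[i][i]

for i, v in enumerate(V):
    print(f"bus {i}: |V| = {abs(v):.4f} pu, angle = {math.degrees(cmath.phase(v)):6.2f} deg")
```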




See also

  • Network analyzer (electrical)
  • Power system protection
  • Differential analyser
  • Prospective short-circuit current





External links

  • Lee Allen Mayo, Simulation without replication (thesis), University of Notre Dame, 2011, pp. 52-101; discusses the use of network analyzers for theoretical calculations
