The notion of cable as a high-end component, so contentious in the 1980s, is today rather well accepted by most serious listeners. Controversy arises because sound perception is not always predictable on the basis of objective engineering measurements and analysis. Richard A. Greiner (1931–2015), at the time professor of electrical and computer engineering at the University of Wisconsin, writing in the August 1989 issue of Audio magazine, seemed to deliver a definitive engineering analysis of speaker cable and concluded that “normal cables are suitable, and essentially perfect, compared to other defects in the transmission system—not the least of which is the loudspeaker crossover network and level-pad arrangement.” Cable auditions in the context of high-end systems tell a different story. And that’s why subjective audio reviewing was created some 40 years ago by J. Gordon Holt (1930–2009). Harry Olson (1901–1982), for many years the dean of American acoustical engineering, stated it best: The ear is the final arbiter in all things musical.
It’s fair to say that the great awakening took place in the mid-1970s. Prior to that time no serious attention was paid to audio cable. For example, the original 1972 manual of the famed Dahlquist DQ-10 advised users that “for distances to 25 or 30 feet, use no smaller than #18 lamp cord (‘zip cord’). For greater lengths, use #16, or larger. In general, it is preferred to use the heavier wire, even for short distances. Smaller wires may have enough electrical resistance to reduce the damping provided by the amplifier and affect low frequency transient response.” And that was the extent of the collective wisdom on cable at the time. Zip cord was cheap, often included as a freebie at the point of sale, and assumed to be a perfect conductor as long as resistance effects were taken into account.
The status quo was shattered by the publication of several articles. In Japan it was Akihiko Kaneda at Akita University (1974) who argued that the sound quality of the speaker/amplifier interface could be affected by the wire or cable connecting them. He suggested that the culprit could be the skin effect, whereby current is progressively pushed toward the outer layer of a conductor as frequency increases, an effect made worse by the then-common practice of tin-plating copper wire. Soon thereafter, in 1975, the late great Japanese audio critic Saburo Egawa (1932–2015) practically launched his audio career with the publication of listening-test results showing sonic differences between speaker cables. At Japan’s Mogami Cable, Koichi Hirabayashi was determined to prove Egawa wrong. But after extensive listening tests he became convinced that, despite its apparently minimal theoretical effect over the audible bandwidth, skin effect plays a rather large role in perceived sonic differences. The end results of his research were the Mogami 2803 interconnect and 2804 speaker cable.
Jean Hiraga, who was probably familiar with Egawa’s work, having lived in Japan during the 1970s, published an article titled “Can we hear connecting wires?” in the October 1976 issue of the French magazine La Nouvelle Revue du Son. In August 1977, HiFi News & Record Review reprinted a translation of Hiraga’s article, which, while deemed controversial, managed to stir up quite a bit of excitement among English-speaking audiophiles.
Hiraga pointed out that while the skin effect appears theoretically negligible at frequencies below 200kHz, subjective listening tests suggest otherwise. Apparently, he started experimenting as early as 1972 with Litz-type speaker cable, which consists of a large number of individually insulated fine wires twisted or braided into a uniform pattern in order to maximize conductor surface area. He substituted Litz wire for the existing cable between an amplifier and an Onken 5000T tweeter and discovered that as the number of strands increased, so did the impression of detail and definition, accompanied by the perception of additional distortion. The obvious conclusion was that one should not shoot the messenger: the Litz cable was simply allowing more of the message to get through.
The Japanese were apparently first to commercialize a Litz-wire speaker cable, possibly based on the work of Kaneda and Egawa. Imported by Polk Audio circa 1977, it became recognized as the first high-end cable design and was commonly referred to as “Cobra cable” due to its distinctive appearance. It was constructed of two bundles of Litz wire (colored green and copper), one for each cable polarity, intimately woven together around a plastic core. Such a geometry drastically minimized cable inductance, since the magnetic fields induced around the negative and positive conductors were opposite in polarity and largely canceled each other out. The resultant inductance was only 0.026µH/ft.—an order of magnitude lower than that of 18-gauge zip cord. The downside was a massive increase in cable capacitance to 500pF/ft., or almost 20 times that of 18-gauge zip cord. That didn’t sit well with marginally stable solid-state amps, which simply blew up when smitten by the Cobra’s capacitive venom.
Bob Fulton (1925–1988) has been called a mad genius and a “screwball,” but it’s fair to say that during Fulton Musical Industries’ (FMI) relatively short lifespan, few designers have been more creative than he was. Gordon Holt was a big fan of the FMI 80 loudspeaker, but Fulton was also actively involved in the entire recording chain, including microphones, tape recorders, and record production via the Ark label. He was the first U.S.-based designer to focus on optimizing the amplifier-speaker interface. His research resulted in two models of cable, referred to as Gold and Brown, presumably on the basis of the color of the outer jacket. The Gold turned quite a few heads because of its price and performance. It was a massive cable which quickly gained a reputation for stupendous bass response and midrange clarity. It was said to be equivalent to 4-ga. wire, but its resistance (R) per foot was 0.001 ohms versus 0.00025 for 4-ga. copper wire. The Gold was a multi-strand twin-lead design with each polarity conductor adequately spaced apart to keep capacitance (C) quite reasonable at 28pF/ft. Inductance (L) was 0.19µH/ft., about the same as 18-ga. zip cord.
The year 1979 proved to be monumental for the evolution of high-end cable. It was the year that both Monster Cable and Kimber Kable came into being. In the late 1970s, Noel Lee’s unique résumé included engineering positions at government labs and a stint as a drummer. He was also an audiophile who wanted to improve the sound quality of his home system, but without substantial financial resources he decided to focus on cable. Working out of his family’s apartment and later from his in-laws’ garage, he experimented with different cable concepts to find a superior alternative to zip cord. He would compare various designs while listening to Tchaikovsky’s 1812 Overture. Lee dubbed his final design “Monster” because of its size relative to ordinary zip cord. It was a multi-strand twin-lead design, approximating 12-ga. wire with a resistance of 0.0034 ohms/ft., an inductance of 0.21µH/ft., and a capacitance of 24pF/ft.—pretty much Goldilocks specs for a speaker cable. Initially, retail price was about 60 cents per foot; not cheap, but far more affordable than the Fulton Gold. Lee would go store-to-store doing live demos. And after a favorable reception at the 1979 Summer CES in Chicago, the company was officially launched. Lee’s business genius lay in establishing an extensive retailer network and in promotion. He did more than anyone else to bring cable to the forefront as a specialty component. Over the years, Monster Inc. has grown rapidly, diversifying into other markets. Today, its unique product count is around 6000, including speakers, headphones, power strips, accessories, and automobile audio devices.
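Those R, L, and C figures invite a quick back-of-the-envelope comparison. The sketch below (Python, using only the per-foot values quoted in this article; real samples vary) computes the lossless characteristic impedance √(L/C) of each design. The Cobra’s strikingly low figure is another way of seeing why its huge capacitance made it such a punishing load for marginally stable amplifiers:

```python
import math

# Per-foot inductance (H) and capacitance (F), as quoted in the article.
# Treat these as illustrative figures, not measured data.
cables = {
    "Cobra (braided Litz)": (0.026e-6, 500e-12),
    "Fulton Gold":          (0.19e-6,  28e-12),
    "Monster":              (0.21e-6,  24e-12),
}

for name, (L, C) in cables.items():
    z0 = math.sqrt(L / C)  # lossless characteristic impedance, ohms
    print(f"{name:22s} Z0 ~ {z0:5.1f} ohms")
```

By this rough measure the Fulton Gold and Monster designs land in the 80 to 95 ohm range typical of widely spaced twin-lead, while the Cobra drops to about 7 ohms.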
Ray Kimber’s “Aha!” moment arrived in the mid-70s while he was working as a sound engineer in Los Angeles, when the first big discotheques were being installed in the States. The big problem encountered during installs was cables picking up EMI/RFI noise from electronics and, especially, lighting systems. In a discotheque, equipment placement is very tight, so noise from the lighting systems was significant and clearly audible, with ordinary cable behaving like an antenna. Kimber tried shielding first, encasing the cable in a steel conduit. This did help with the noise, but sound quality suffered. The fix that worked was to braid the negative and positive polarity conductors of the cable at an angle approaching 90 degrees. At such angles, EMI/RFI pickup is reduced through field cancellation. Not only was lighting-system noise eliminated, but sound quality improved across the spectrum. After braiding the first speaker cables by hand and listening to the results, Kimber decided to strike out on his own with Kimber Kable. Since the initial need was for braided cable, he started buying braiding machines to expedite production. In subsequent years Kimber continued to experiment with the number of conductors, insulation, and metals, the ultimate goal being to make a cable that is as neutral as possible.
Bill Low, who founded AudioQuest in 1980, said a few years back that “everything I’ve learned about hi-fi or cables is purely the result of being interested in getting high on music.” It was this passion that drove AudioQuest to innovate, becoming the first U.S. high-performance cable company to introduce advanced conductor technology in the form of 6N (99.9999% pure) copper and linear crystal copper. Over the years, AudioQuest diversified to embrace consumer electronics. HDMI cable currently represents a big chunk of its business. In addition to digital cables, the award-winning DragonFly USB DAC should be mentioned as well.
Bruce Brisson became a cable designer by chance. In the late 1970s, after he repaired a complex three-way speaker system with active crossovers, he cabled the system back up differently—easy to do since he was using three different cable types. The sound of the system changed. He moved the cables back to where they had been prior to the repair, and everything sounded correct again. He decided to pursue in earnest the question of why that should be. Circa 1981 he was already busy designing and patenting speaker cable for Monster Cable. The goal of his first design was to minimize the time delay between low and high frequencies, a theme that would echo throughout his career. This was achieved with a geometry in which the outer conductors were wound into a number of bundles around a central conductor. Brisson founded Music Interface Technologies (MIT) in 1984, and has since become a major force in the cable industry as he has continued to develop a series of innovative designs. In the late 1990s he introduced a cable terminated by a passive network of parallel RC or RLC elements. The network is connected between the negative and positive cable polarities in order to control impedance resonances over the audible bandwidth. This concept continued to evolve, morphing into the recently released ACC 268 Articulation Control Console, which maintains MIT’s traditional sonic virtues while allowing the user to fine-tune a system’s sound.
George Cardas argues that cables chose him, since the design issues sat squarely in the middle of his interests and skill set. He was engineering transmission lines at the phone company and, as he puts it, “obsessively interested in music.” By 1985 it seemed unlikely that cable geometry was amenable to any further insights. And then Cardas discovered a solution to the issue of wire-strand resonance in a Litz cable: a “Golden Ratio” progression of strand sizes, in which each strand is about 0.62 times the size of the next larger strand. Size here typically refers to the cross-sectional area of the individual conductive strands within the cable, although it can also refer to strand diameter. Cardas discovered the sonic benefit of such an arrangement through trial and error—by putting his ears into the engineering process. Image outlines snapped into focus when the right combination of strands was used, something that measurements just couldn’t capture.
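The progression itself is easy to sketch numerically. In the toy example below, every strand size is hypothetical (Cardas’ actual strand counts and gauges are not public); what it shows is simply the geometric scaling, with each strand’s cross-sectional area about 0.62 times that of the next larger one, so that no two strands share the same resonant signature:

```python
RATIO = 0.62  # approximate inverse Golden Ratio (1/phi ~ 0.618)

def strand_areas(largest_mm2: float, n: int) -> list[float]:
    """Return n strand cross-sectional areas, each ~0.62x the next larger."""
    return [largest_mm2 * RATIO**k for k in range(n)]

# Hypothetical largest strand of 0.50 mm^2, five strand sizes in all
areas = strand_areas(0.50, 5)
print([round(a, 3) for a in areas])  # [0.5, 0.31, 0.192, 0.119, 0.074]
```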
In 1987 Japan’s Nippon Mining company succeeded in implementing copper purification technology suitable for commercial-scale production of high-purity copper. Just as important is the conductor’s grain structure. Copper is not a homogeneous metal: on a microscopic scale, standard copper displays about 1500 grains per foot. Elongated-grain copper, referred to as linear-crystal, is drawn in a process that results in only about 70 grains per foot. Even better is the Ohno Continuous Casting (OCC) process, developed by Professor Atsumi Ohno (1926–2017) in 1986 at the Chiba Institute of Technology in Japan. This technology has been used to manufacture single-crystal copper rods from which wire can be drawn with grain structures several hundred feet long. Minimizing grain count translates to higher purity and reduced capacitive effects at grain boundaries.
An important historical footnote belongs to Ed Meitner, who initiated a cryogenic treatment program while at Museatex (now defunct). When copper wire is extruded, intense heat is generated along the surface, which causes stress at a molecular level. Cryogenic treatment of cables substantially reduces this residual surface stress, resulting in more coherent signal transmission. Although only skin deep, and not as elegant as the OCC process, cryogenic treatment can be quite effective in focusing image outlines.
Audio history was made in 1976 when Hiroyasu Kondo (1941–2006), founder of Audio Note Japan, unveiled the world’s first 4N pure-silver cable. Mr. Kondo, aka the “Audio Silversmith,” gets credit for being first to elevate silver technology atop the high-end totem pole. As a conductor of both heat and electricity silver has no equal, and it is second only to gold in malleability and ductility. One ounce of silver can be drawn into a fine wire about 30 miles long! Lab-grade silver is 4N, but 5N as well as 6N grades are available at a significant premium. If conduction electrons could talk, they would sing the praises of silver wire. The lack of granularity and oxygen contaminants makes electron drift along a conductor more efficacious from an audio standpoint, as there is less opportunity for time smearing and loss of low-level detail.
High-purity silver wire is clearly a phenomenon of the high-end audio scene, and no one takes silver more seriously than Siltech Cables, which of course stands for Silver Technology. Silver became the focus of attention by virtue of its superior conductivity, chemical stability, and ability to maintain its crystalline integrity when subjected to mechanical stress. Although the company was founded in 1985 in the little Dutch town of Elst, it received a boost of innovation when it was acquired by Edwin van der Kleij (now Kleij-Rijnveld) in 1992. Today, International Audio Holding oversees the Siltech and Crystal Cable brands—quite logical as Edwin is married to Gabi van der Kleij-Rijnveld, founder of Crystal Cable. Edwin, an electronics engineer, worked for Philips and Exxon before focusing on high-end audio. This was a natural destination since he was a music enthusiast from a young age, played bass guitar in a high school band, and built speakers and amplifiers along the way. He was rather curious to know how cables create audible differences in sound. Over time, Edwin was able to obtain key answers helped by new and better measurements and multi-physics simulations, which allow the combined effects of material and construction properties to be visualized prior to production. Noteworthy are Crystal Cable’s designs using a pure-silver monocrystal core with outer layers of silver-plated monocrystal copper and gold-plated monocrystal silver.
An alternative to Litz wire offered by several manufacturers, most notably Tara Labs, is best described as small-gauge solid-core wire. Tara Labs was founded by Matthew Bond in Sydney, Australia, in 1984. After its move to the U.S., it was first to market in 1988 with solid-core cable designs. The design was refined in 1990 when the shape of the conductor was changed from round to rectangular to further reduce skin effect.
Skin depth is defined, for a given frequency and conductor material, as the depth into the wire at which current density falls to 1/e (about 37 percent) of its surface value. To minimize the impact of the skin effect, the conductor radius should therefore be small relative to the skin depth at the highest frequency of interest. For copper at 20kHz, the skin depth is 0.47mm—or about the radius of a 19-gauge wire. The finer the gauge, the more uniform the impedance magnitude becomes, but at the cost of higher DC resistance, which is not usually an issue for interconnects since they operate into a high-impedance circuit.
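The 0.47mm figure follows from the standard skin-depth formula for a non-magnetic conductor, δ = √(ρ/(π·f·μ0)). A quick sanity check (Python, using the textbook resistivity of annealed copper) reproduces it:

```python
import math

MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
RHO_CU = 1.72e-8           # resistivity of annealed copper, ohm*m

def skin_depth_m(freq_hz: float, rho: float = RHO_CU) -> float:
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(rho / (math.pi * freq_hz * MU0))

print(f"copper skin depth at 20 kHz: {skin_depth_m(20e3) * 1000:.2f} mm")  # ~0.47 mm
```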
Some manufacturers, such as van den Hul, have taken this approach to an extreme. Van den Hul’s Carbon Nano Tube (CNT) interconnect uses 19 carbon conductors (to keep impedance reasonable) twisted together to form a single interconnect leg. Each carbon strand is a mere 15 microns in diameter, and the manufacturing process is an art in itself.
For those of us who have lived through the past 40 years, the advances in cable technology have been nothing short of amazing. And there is no reason to think that innovation will stand still. Today, cable is one of the most popular accessory categories, and my guess is that most audiophiles upgrade cables more often than any other component.