How did a weaving loom lead to one of the greatest technology innovations of the 21st century?
The loom’s punch cards later inspired English mathematician Charles Babbage to revolutionize the process of creating mathematical tables.
The Jacquard Loom was a significant breakthrough in the history of textile production, an essential manufacturing tool of the Industrial Revolution. Joseph Marie Jacquard, a silk weaver from Lyon, France, first demonstrated his improved drawloom at an industrial exposition in Paris in 1801. By 1803, a spark of genius inspired him to make another improvement to this loom—the “Jacquard attachment.”
This mechanism, mounted above the loom, uses a continuous chain of punch cards to control the lifting of individual warp threads. Each card encodes one row of the pattern, and each position on a card corresponds to a hook, which can be raised or stopped depending on whether that spot is punched out or solid. The cards are mounted on a rotating cylinder and pressed against pins, which detect the presence of holes. The hooks, in turn, raise or lower the harness cords that guide the threads to form a pattern in the fabric.
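The card-reading cycle described above maps naturally onto a few lines of code. The sketch below is purely illustrative—the card layout, hook count and function name are invented for the example, not drawn from any historical loom:

```python
# Minimal sketch of the Jacquard card-reading cycle. Each card encodes
# one row (pick) of the pattern; each position on the card corresponds
# to one hook. A punched hole (1) lets the pin pass, so the hook is
# raised and its warp thread lifted; a solid spot (0) blocks the pin
# and the hook stays down.

def weave(cards):
    """Return the raised/lowered state of every hook for each card."""
    rows = []
    for card in cards:  # the cylinder presents one card per pick
        rows.append(["raised" if hole else "lowered" for hole in card])
    return rows

# A hypothetical 3-card chain controlling 4 hooks:
pattern = weave([
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
])
```

Changing the chain of cards changes the pattern without touching the machine itself—the essence of what later generations would call programming.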
The Jacquard Loom automated the work of weavers. Changing the punch cards changed the pattern, giving the weaver endless ways to “program” this device and to create intricate tapestries, damasks, brocades and other fabrics. Traditional silk weavers could produce approximately one inch of complex fabric in a day. The skilled Jacquard Loom operator, however, could create approximately two feet of fabric in the same amount of time.
Jacquard’s invention did not go unrecognized. In 1806, the city of Lyon purchased his patent as public property, and the loom began to spread. It was introduced in England in 1820 and arrived in the United States in 1823. It was used primarily in the workshops of professional weavers or small factories.
The Jacquard Loom’s punch cards later inspired English mathematician and engineer Charles Babbage to rethink the process of creating mathematical tables. In Babbage’s time, these tables were essential in the fields of navigation, science and engineering. But they were calculated by hand, leaving much room for human error in both the calculation and transcription. The tables were sometimes printed with these errors, problematic for a supply ship trying to calculate its course or an engineer trying to build a piece of equipment.
Babbage wanted to find a way to automate computation using machines to improve accuracy in this process. In 1837, he designed a device that is recognized as one of the first mechanical computers—the Analytical Engine. Babbage was inspired by the Jacquard Loom’s ability to process complex data using punch cards and applied this same model to the Analytical Engine. In turn, Ada Lovelace, a mathematician and friend of Babbage, was inspired to publish an English translation of an article written by Luigi Menabrea describing the invention in detail. Her accompanying notes are lengthier than the original article and describe a complicated algorithm to be carried out by the engine. This is often considered to be “the first computer program,” and by extension, Lovelace became the first computer programmer. While Babbage’s engine was not built to completion during his lifetime, the prototypes and designs were inspirational enough to cement his legacy as one of the fathers of modern computing.
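The algorithm in Lovelace's notes computed Bernoulli numbers. The sketch below is not a transcription of her method but a modern illustration of the same computation, using the standard recurrence for Bernoulli numbers (the function name and the convention B₁ = −1/2 are choices made for this example):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli number B_n (convention B_1 = -1/2), computed from the
    recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for every m >= 1."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))             # solve the recurrence for B_m
    return B[n]

# The first few values: 1, -1/2, 1/6, 0, -1/30, 0, 1/42
```

Working these fractions out by hand, term by term, for table after table is exactly the error-prone labor Babbage hoped to mechanize.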
Jacquard’s Loom at The Henry Ford
A Jacquard Loom can be found inside the Greenfield Village Weaving Shop, where a continuous loop of 622 punched cards is used to produce pictorial textile designs. The loom was built in 1934 by Sidney Holloway, former textile director and curator at The Henry Ford. It was restored to operation in 2008.
How did a shirt pocket lead to a feat of engineering?
The origins of Hewlett-Packard’s HP-35 Scientific Calculator began with a challenge. In 1971, William Hewlett dared his engineers to prove their engineering prowess by miniaturizing the company’s 9100A Desktop Calculator—a forty-pound machine—into a device small enough to fit into a shirt pocket. The calculator’s target size of approximately 6x3 inches was supposedly arrived at by measuring one of Hewlett’s own shirt pockets.
The twelve or so experimental HP-35s that began as “company hacks” soon proved useful beyond the prototype stage. They were popular among the staff who built and tested them, and were presented for marketing studies. Despite a high manufacturing cost that drove the retail price to $395 (equivalent to $2,200 in 2015), and despite research warning of a limited market, Hewlett-Packard decided to proceed with production. The company’s 1972 goal of selling 10,000 calculators was quickly exceeded: they sold 100,000. The calculator’s rapid success made the slide rule obsolete practically overnight, as engineers, scientists and mathematicians abandoned their analog calculating devices in favor of embracing the digital future.
The HP-35 (named for its 35 keys) was the world’s first handheld scientific calculator. This advanced machine, with its full suite of features, was capable of processing more complex mathematical functions than any other calculator on the market at the time. It was also the company’s first product to use both integrated circuits and an LED display. The HP-35 inspired others, too—it caught the attention of a young Hewlett-Packard engineer named Steve Wozniak. By day, he worked on designing follow-up models of the calculator; in the evening, he developed his own electronic projects at home. All the while, he was percolating ideas that would lead to the beginnings of the Apple 1 computer.
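One detail worth noting: the HP-35 used Reverse Polish Notation (RPN), in which operands are keyed in first and the operator then consumes them from a stack. The sketch below illustrates that evaluation model in miniature—the token set and names are invented for the example, not drawn from HP’s firmware:

```python
import math

# Minimal sketch of stack-based RPN evaluation, the entry style the
# HP-35 used: operands are pushed first, then the operator pops them.
OPS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a / b,
}
FUNCS = {"sin": math.sin, "cos": math.cos, "log": math.log10, "sqrt": math.sqrt}

def eval_rpn(tokens):
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()   # operator consumes top two
            stack.append(OPS[tok](a, b))
        elif tok in FUNCS:
            stack.append(FUNCS[tok](stack.pop()))
        else:
            stack.append(float(tok))          # a number is simply pushed
    return stack[-1]

# (3 + 4) * 2, entered RPN-style:
result = eval_rpn("3 4 + 2 *".split())
```

Because the stack holds intermediate results, no parentheses or equals key are needed—one reason the design fit so much capability into a shirt-pocket device.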
How did a meeting in a garage provide the inspiration for a new kind of home computing?
On a rainy day in March 1975, some of the most radical minds in computing gathered in the garage of Gordon French in Menlo Park, California. At this—the inaugural meeting of the Homebrew Computer Club—technical genius and countercultural ethics fused with the obsession to push technology to its limits for social good. It made for an inspiring, if competitive, environment. Steve Wozniak, then an engineer working at Hewlett-Packard, had been given a flyer for that first Homebrew meeting by a co-worker. He attended and walked away with the inspiration to create an affordable and powerful computer for the everyday home user. This was the beginning of the Apple 1.
Wozniak wanted to provide the maximum amount of computing power using the fewest components. Thanks to the powerful new MOS Technology 6502 microprocessor, he found a way to condense his design onto a small rectangular circuit board holding a total of 60 chips. He also gave thought to a user-friendly interface: the Apple 1 was the first personal computer that allowed people to type on a keyboard and have their text show up on a television monitor.
In 1976, Wozniak’s engineering skills, coupled with his friend Steve Jobs’ bold marketing moves, led to an order for 200 assembled Apple 1 motherboards from ByteShop owner Paul Terrell. And the word assembled here is important—the Apple 1 was the first preassembled personal computer ever sold. Before the Apple 1, computer enthusiasts built their systems from kits, soldering components and pairing them with clunky interfaces like teletype machines. Wozniak later reminisced: “Nobody’d ever imagined it, a full computer that could run programs could be that small.”
Ironically, when it came time to find the money to produce the circuit boards for the first Apple 1 order, Wozniak raised his contribution by selling his HP-65 calculator, a follow-up model to the HP-35. When the Apple 1 circuit boards arrived, they were assembled and tested over the course of 30 days at the Jobs family home. These were the humble, almost cottage-industry beginnings of what would become one of the world’s most profitable companies. When Wozniak and Jobs took their first order, they had no way of predicting what the future would bring.
How were “text messages” sent in the 19th century?
Cyrus Field wanted to wire the world. A successful paper merchant turned telecommunications pioneer, Field established the Atlantic Telegraph Company in 1856 and set to work raising the funds and gathering the minds needed to bridge the oceanic divide between Europe and America.
In 1858, after several failed attempts, an underwater cable—capable of transmitting telegraph signals across the Atlantic Ocean—was laid from Valentia, Ireland, to Heart’s Content, Newfoundland. In August the first messages were sent, including an exchange between Queen Victoria and President Buchanan. It took 17 hours to transmit Queen Victoria’s 98 words. The triumph of the 1858 cable was short-lived; a month later it failed, a victim of excess voltage applied in an attempt to increase the speed of messages.
This cable machine, built by Glass, Elliot & Co., was used to prepare telecommunications cable at Enderby’s Wharf in Greenwich, England, for the second transatlantic cable. Machines like these wound the cable’s core of conductive copper, insulated with a layer of gutta-percha—and were then moved aboard a ship, where they applied a protective sheath of galvanized steel wire and a final layer of jute to guard against abrasion. One mile of finished cable weighed almost a ton, yet it was as flexible as rope, built to withstand the pull of the ship laying it and hazards on the ocean floor.
In 1865, 2,300 nautical miles of cable were carried aboard the leviathan iron steamship SS Great Eastern. The ship left in July but was forced to return to port when the cable snapped and the end was lost at sea. A second cable excursion began a year later and was successful. This was the first truly durable and reliable transatlantic telegraph cable, carrying the Morse code “text messages” of telegraph operators between continents—at a rate 80 times faster than the first cable. It remained in operation until the mid-1870s, by which time four additional cables had been laid.
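The speed figures quoted here can be sanity-checked with a little arithmetic: 98 words in 17 hours works out to roughly a tenth of a word per minute, and an 80-fold improvement puts the 1866 cable at around eight words per minute:

```python
# Back-of-the-envelope check of the cable speeds quoted above.
# The 1858 cable: Queen Victoria's 98 words took 17 hours.
first_rate = 98 / (17 * 60)      # words per minute, roughly 0.1

# At "80 times faster," the 1866 cable manages roughly:
second_rate = first_rate * 80    # roughly 8 words per minute
```

Eight words per minute sounds modest today, but it turned a multi-week ocean crossing for news into a matter of minutes.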
This machine was essential to the “wiring of the world,” reorganizing basic materials into the spine of the first permanent intercontinental telecommunications network. These submarine cables—like the modern-day fiber-optic cables that carry the signals of Internet traffic—connected cultures and communities.
How do you redefine a chair for the digital age?
Herman Miller’s Aeron Chair has been called the most comfortable chair in the world and the most privileged chair in the office. In a 1994 interview, co-designer Bill Stumpf shared his thoughts: “A chair is our third life after sleeping and standing… As an object, it’s alive and intimate, not static. It’s 10 times more interesting than a building and 10 times more difficult to design.” Three years earlier, when the designers Bill Stumpf and Don Chadwick first entered into their collaboration, their goal was to design a chair that supported a person in any position, at any task their office job served up.
The Aeron took an iconoclastic approach to chair design: combining Stumpf’s deep knowledge of ergonomics with Chadwick’s curiosity about materials, the pair built it from the ground up. The design team conducted anthropometric studies across the country, using a specially designed measuring device to examine the relationship between people’s sizes and their preferences in chair size. Their studies showed that the “one size fits all” chair model was flawed, and so the Aeron was made available in three sizes to conform to different body types.
Over the course of three years of research and design, as the Aeron Chair came to life, so did fourteen unique patents. The Pellicle mesh fabric, developed specifically for the Aeron, was breathable; unlike the foam cushioning of a typical office chair, it allowed heat and moisture from the body to pass through. Its interwoven Hytrel fibers conformed to the sitter, distributing weight and pressure across the body—and always returned to their neutral position when the chair was vacated.
The Aeron prototypes housed in the collections of The Henry Ford—from the earliest design explorations to pre-production examples—represent a physical timeline of Stumpf and Chadwick’s achievements. The prototype seen on this page is “wired” for testing—once connected to strain gauges to measure load and stress. Every individual component of the Aeron Chair was sent through a rigorous engineering process, from the aerated suspension and posture-fitting tilt mechanisms to the recycled aluminum alloy structures and the way the mesh seating was encapsulated in the frame.
The Aeron’s history is closely entwined with the growth of the computing industry. As desktop computers became ubiquitous in the 1980s, workers spent ever longer hours at monitors and keyboards. The Aeron Chair was a modern solution to the unhealthy alignment of bodies to screens. Its launch in 1994 coincided with the early stages of the “dot-com boom,” and the chair became a status symbol, populating the offices of the growing tech industry. Despite the bursting of that bubble in 2000, the chair continues to be regarded among creative and information workers as a back-saving necessity.
After centuries of use and improvement, how do you “reinvent” the telephone?
In 2004, a building on the campus of Apple’s corporate headquarters went on lockdown. Only a small handful of staff were allowed access to the “Purple Dorm,” passing through layer upon layer of security portals each time they entered and exited the building. Behind closed doors, a research and design team labored away, developing a new product known only as “Project Purple,” giving up their nights and weekends in order to meet the demanding deadlines of their employer, Steve Jobs. In 2007, during the Macworld keynote speech, Jobs unleashed the work of the “Project Purple” team into the world. The culmination of their labor was a small black rectangle, which Jobs held up in his hand while he announced: “Today, Apple is going to reinvent the phone.”
This first demonstration of the iPhone promised more modern features than any other mobile phone at the time, collapsing multiple technologies into a sleek user-friendly touch-screen device. It wasn’t just a phone—it was also an iPod and a fully enabled Internet device. It introduced new modes of text messaging using a more human, conversational format. It was a camera, an entertainment device, and included a GPS-like application that used cell phone towers to pinpoint location.
The small icons tapped to launch applications transferred the familiar graphical user interface of desktop computing to the mobile phone—fingertips taking over the point-and-click duties of the mouse. With the iPhone, people could engage with their phones through new forms of gestural interaction: a virtual touch keyboard in place of pressed keys, the “slide to unlock” function, and the “pinch to zoom” feature for viewing photographs.
The iPhone’s release date was a well-choreographed media and cultural event. Consumers lined up for hours, waiting to purchase their own iPhones at Apple stores across the country. From the first generation through 2015, over 700 million units were sold. And in the last decade, the refinement of the smartphone, the rise of the app industry and the popularity of social media platforms have all led to astounding levels of engagement with mobile technology. The boundaries between work and play, between “off” and “always on,” have blurred as we do more than we ever thought was possible with a telephone.