Ada's Algorithm: How Lord Byron's Daughter Ada Lovelace Launched the Digital Age
James Essinger. Melville House, 2014
Over 150 years after her death, a widely used programming language was named “Ada,” after Ada Lovelace, the only legitimate daughter of the nineteenth century’s version of a rock star, Lord Byron. Why? Because, after computer pioneers such as Alan Turing began to rediscover her, it slowly became apparent that she had been a key but overlooked figure in the invention of the computer. In Ada’s Algorithm, James Essinger makes the case that the computer age could have started two centuries ago if Lovelace’s contemporaries had recognized her research and fully grasped its implications. It’s a remarkable tale, starting with the outrageous behavior of her father, which made Ada instantly famous upon birth. Ada would go on to overcome numerous obstacles to obtain a level of education typically forbidden to women of her day. She would eventually join forces with Charles Babbage, generally credited with inventing the computer, although as Essinger makes clear, Babbage couldn’t have done it without Lovelace. Indeed, Lovelace wrote what is today considered the world’s first computer program—despite opposition that the principles of science were “beyond the strength of a woman’s physical power of application.” Based on ten years of research and filled with fascinating characters, observations of the period, and numerous illustrations, Essinger tells Ada’s story in unprecedented detail, to absorbing and inspiring effect.

Ada: A Life and a Legacy
Dorothy Stein. MIT Press, 1987
Augusta Ada Byron, Countess of Lovelace, was the daughter of Lord Byron and a close friend to many of the leading figures of the Victorian era; based on her report on Charles Babbage's Analytical Engine she is also generally known as the inventor of the science of computer programming. In this engrossing biography, Dorothy Stein strips away the many layers of myth to reveal a story far more dramatic and fascinating than previous accounts have indicated. Dorothy Stein is a psychologist with a special interest in thought and language and a background in physics and computer programming. She has taught courses in nineteenth-century women's history and in the biology and psychology of sex differences, and is particularly concerned with the use of myth in science.

Ada Lovelace: The Making of a Computer Scientist
Christopher Hollings, Ursula Martin, Adrian Rice. Bodleian Library, University of Oxford, 2018
Ada, Countess of Lovelace (1815–52), daughter of romantic poet Lord Byron and the highly educated Anne Isabella, is sometimes called the world’s first computer programmer, and she has become an icon for women in technology today. But how did a young woman in the nineteenth century, without access to formal schooling or university education, acquire the knowledge and expertise to become a pioneer of computer science? Although it was an unusual pursuit for women at the time, Ada Lovelace studied science and mathematics from a young age. This book uses previously unpublished archival material to explore her precocious childhood—from her curiosity about the science of rainbows to her design for a steam-powered flying horse—as well as her ambitious young adulthood. Active in Victorian London’s social and scientific elite alongside Mary Somerville, Michael Faraday, and Charles Dickens, Ada Lovelace became fascinated by the computing machines of Charles Babbage, whose ambitious, unbuilt invention known as the “Analytical Engine” inspired Lovelace to devise a table of mathematical formulae which many now refer to as the “first program.” Ada Lovelace died at just thirty-six, but her work strikes a chord to this day, offering clear explanations of the principles of computing, and exploring ideas about computer music and artificial intelligence that have been realized in modern digital computers. Featuring detailed illustrations of the “first program” alongside mathematical models, correspondence, and contemporary images, this book shows how Ada Lovelace, with astonishing prescience, first investigated the key mathematical questions behind the principles of modern computing.

Alan Turing: The Enigma
Andrew Hodges, Douglas R. Hofstadter. Simon & Schuster, 1983
It is only a slight exaggeration to say that the British mathematician Alan Turing (1912–1954) saved the Allies from the Nazis, invented the computer and artificial intelligence, and anticipated gay liberation by decades—all before his suicide at age forty-one. This New York Times–bestselling biography of the founder of computer science, with a new preface by the author that addresses Turing's royal pardon in 2013, is the definitive account of an extraordinary mind and life. Capturing both the inner and outer drama of Turing’s life, Andrew Hodges tells how Turing’s revolutionary idea of 1936—the concept of a universal machine—laid the foundation for the modern computer and how Turing brought the idea to practical realization in 1945 with his electronic design. The book also tells how this work was directly related to Turing’s leading role in breaking the German Enigma ciphers during World War II, a scientific triumph that was critical to Allied victory in the Atlantic. At the same time, this is the tragic account of a man who, despite his wartime service, was eventually arrested, stripped of his security clearance, and forced to undergo a humiliating treatment program—all for trying to live honestly in a society that defined homosexuality as a crime.

Atanasoff: Forgotten Father of the Computer
Clark R. Mollenhoff. Iowa State University Press, 1988
Recounts how John Atanasoff invented the first electronic digital computer, explains how his ideas were exploited, and describes the court battle that restored his proper recognition.

ENIAC in Action: Making and Remaking the Modern Computer
Thomas Haigh, Mark Priestley, Crispin Rope. MIT Press, 2016
The history of the first programmable electronic computer, from its conception, construction, and use to its afterlife as a part of computing folklore. Conceived in 1943, completed in 1945, and decommissioned in 1955, ENIAC (the Electronic Numerical Integrator and Computer) was the first general-purpose programmable electronic computer. But ENIAC was more than just a milestone on the road to the modern computer. During its decade of operational life, ENIAC calculated sines and cosines and tested for statistical outliers, plotted the trajectories of bombs and shells, and ran the first numerical weather simulations. ENIAC in Action tells the whole story for the first time, from ENIAC's design, construction, testing, and use to its afterlife as part of computing folklore. It highlights the complex relationship of ENIAC and its designers to the revolutionary approaches to computer architecture and coding first documented by John von Neumann in 1945. Within this broad sweep, the authors emphasize the crucial but previously neglected years of 1947 to 1948, when ENIAC was reconfigured to run what the authors claim was the first modern computer program to be executed: a simulation of atomic fission for Los Alamos researchers. The authors view ENIAC from diverse perspectives—as a machine of war, as the “first computer,” as a material artifact constantly remade by its users, and as a subject of (contradictory) historical narratives. They integrate the history of the machine and its applications, describing the mathematicians, scientists, and engineers who proposed and designed ENIAC as well as the men—and particularly the women—who built, programmed, and operated it.

Grace Hopper and the Invention of the Information Age
Kurt W. Beyer. MIT Press, 2009
A Hollywood biopic about the life of computer pioneer Grace Murray Hopper (1906–1992) would go like this: a young professor abandons the ivy-covered walls of academia to serve her country in the Navy after Pearl Harbor and finds herself on the front lines of the computer revolution. She works hard to succeed in the all-male computer industry, is almost brought down by personal problems but survives them, and ends her career as a celebrated elder stateswoman of computing, a heroine to thousands, hailed as the inventor of computer programming. Throughout Hopper's later years, the popular media told this simplified version of her life story. In Grace Hopper and the Invention of the Information Age, Kurt Beyer reveals a more authentic Hopper, a vibrant and complex woman whose career paralleled the meteoric trajectory of the postwar computer industry. Both rebellious and collaborative, Hopper was influential in male-dominated military and business organizations at a time when women were encouraged to devote themselves to housework and childbearing. Hopper's greatest technical achievement was to create the tools that would allow humans to communicate with computers in terms other than ones and zeroes. This advance influenced all future programming and software design and laid the foundation for the development of user-friendly personal computers.

Grace Hopper: Admiral of the Cyber Sea
Kathleen Broome Williams. US Naval Institute Press, 2004
When Grace Hopper retired as a rear admiral from the U.S. Navy in 1986, she was the first woman restricted line officer to reach flag rank and, at the age of seventy-nine, the oldest serving officer in the Navy. A mathematician by training who became a computer scientist, the eccentric and outspoken Hopper helped propel the Navy into the computer age. She also was a superb publicist for the Navy, appearing frequently on radio and television and quoted regularly in newspapers and magazines. Yet in spite of all the attention she received, until now “Amazing Grace,” as she was called, has never been the subject of a full biography. Kathleen Broome Williams looks at Hopper's entire naval career, from the time she joined the WAVES and was sent in 1943 to work on the Mark I computer at Harvard, where she became one of the country's first computer programmers. Thanks to this early Navy introduction to computing, the author explains, Hopper had a distinguished civilian career in commercial computing after the war, gaining fame for her part in the creation of COBOL. The admiral's Navy days were far from over, however, and Williams tells how Hopper—already past retirement age—was recalled to active duty at the Pentagon in 1967 to standardize computer-programming languages for Navy computers. Her temporary appointment lasted for nineteen years while she standardized COBOL for the entire Department of Defense. Based on extensive interviews with colleagues and family and on archival material never before examined, this biography not only illuminates Hopper's pioneering accomplishments in a field that came to be dominated by men, but provides a fascinating overview of computing from its beginnings in World War II to the late 1980s.

Howard Aiken: Portrait of a Computer Pioneer
I. Bernard Cohen. MIT Press, 1999
Howard Hathaway Aiken (1900-1973) was a major figure of the early digital era. He is best known for his first machine, the IBM Automatic Sequence Controlled Calculator or Harvard Mark I, conceived in 1937 and put into operation in 1944. But he also made significant contributions to the development of applications for the new machines and to the creation of a university curriculum for computer science. This biography of Aiken, by a major historian of science who was also a colleague of Aiken's at Harvard, offers a clear and often entertaining introduction to Aiken and his times. Aiken's Mark I was the most intensely used of the early large-scale, general-purpose automatic digital computers, and it had a significant impact on the machines that followed. Aiken also proselytized for the computer among scientists, scholars, and businesspeople and explored novel applications in data processing, automatic billing, and production control. But his most lasting contribution may have been the students who received degrees under him and then took prominent positions in academia and industry. I. Bernard Cohen argues convincingly for Aiken's significance as a shaper of the computer world in which we now live.

iWoz: From Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It
Steve Wozniak, Gina Smith. W. W. Norton, 2006
Before slim laptops that fit into briefcases, computers looked like strange vending machines, with cryptic switches and pages of encoded output. But in 1977 Steve Wozniak revolutionized the computer industry with his invention of the first personal computer. As the sole inventor of the Apple I and II computers, Wozniak has enjoyed wealth, fame, and the most coveted awards an engineer can receive, and he tells his story here for the first time.

John Von Neumann and the Origins of Modern Computing
William Aspray. MIT Press, 1990
John von Neumann (1903-1957) was unquestionably one of the most brilliant scientists of the twentieth century. He made major contributions to quantum mechanics and mathematical physics and in 1943 began a new and all-too-short career in computer science. William Aspray provides the first broad and detailed account of von Neumann's many different contributions to computing. These, Aspray reveals, extended far beyond his well-known work in the design and construction of computer systems to include important scientific applications, the revival of numerical analysis, and the creation of a theory of computing. Aspray points out that from the beginning von Neumann took a wider and more theoretical view than other computer pioneers. In the now famous EDVAC report of 1945, von Neumann clearly stated the idea of a stored program that resides in the computer's memory along with the data it was to operate on. This stored program computer was described in terms of idealized neurons, highlighting the analogy between the digital computer and the human brain. Aspray describes von Neumann's development during the next decade, and almost entirely alone, of a theory of complicated information processing systems, or automata, and the introduction of themes such as learning, reliability of systems with unreliable components, self-replication, and the importance of memory and storage capacity in biological nervous systems; many of these themes remain at the heart of current investigations in parallel or neurocomputing. Aspray allows the record to speak for itself. He unravels an intricate sequence of stories generated by von Neumann's work and brings into focus the interplay of personalities centered about von Neumann. He documents the complex interactions of science, the military, and business and shows how progress in applied mathematics was intertwined with that in computers.

Steve Jobs
Walter Isaacson. Simon & Schuster, 2011
Based on more than forty interviews with Jobs conducted over two years—as well as interviews with more than a hundred family members, friends, adversaries, competitors, and colleagues—Walter Isaacson has written a riveting story of the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing. At a time when America is seeking ways to sustain its innovative edge, and when societies around the world are trying to build digital-age economies, Jobs stands as the ultimate icon of inventiveness and applied imagination. He knew that the best way to create value in the twenty-first century was to connect creativity with technology. He built a company where leaps of the imagination were combined with remarkable feats of engineering. Although Jobs cooperated with this book, he asked for no control over what was written nor even the right to read it before it was published. He put nothing off-limits. He encouraged the people he knew to speak honestly. And Jobs speaks candidly, sometimes brutally so, about the people he worked with and competed against. His friends, foes, and colleagues provide an unvarnished view of the passions, perfectionism, obsessions, artistry, devilry, and compulsion for control that shaped his approach to business and the innovative products that resulted. Driven by demons, Jobs could drive those around him to fury and despair. But his personality and products were interrelated, just as Apple’s hardware and software tended to be, as if part of an integrated system. His tale is instructive and cautionary, filled with lessons about innovation, character, leadership, and values.

The Home Computer Wars: An Insider's Account of Commodore and Jack Tramiel
Michael S. Tomczyk. Compute Publications International, 1984
In one of the most intriguing moves in modern corporate history, Jack Tramiel, the most successful consumer computer manufacturer, recently left Commodore, the company he had founded, and bought Atari, one of his biggest victims in the billion-dollar battle for the personal computer dollar. A survivor of the Nazi Holocaust, Tramiel had taken a tiny typewriter parts company and built it into a major American corporation. In the process, he became a modern corporate legend. Some of his vice presidents thought he was a saint; some thought he had the world's hardest heart. But few deny the brilliance of this complex entrepreneur. For the past four years, Michael Tomczyk was Tramiel's assistant. Throughout Commodore's explosive rise to leadership in the computer field, Tomczyk was a close insider. Most importantly, Tomczyk is a keen observer, and his book takes the reader into a vivid, dramatic world where a powerful, brilliant businessman almost single-handedly fashions the American consumer computer industry. It was a titanic struggle, a two-front war. Conflict raged inside Commodore, as careers rose and fell. Outside, archrivals Texas Instruments and Atari fought a losing battle against an increasingly aggressive Commodore attack. This book takes you through some of the most exciting episodes in modern American business, concluding with the latest events at Jack Tramiel's new company, Atari.

The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley
Leslie Berlin. Oxford University Press, 2005
Leslie Berlin's biography of Robert Noyce, co-founder of Fairchild Semiconductor and Intel and co-inventor of the integrated circuit. Hardback, pp. xiii + 402.

The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer
Jane Smiley. Doubleday, 2010
One night in the late 1930s, in a bar on the Illinois–Iowa border, John Vincent Atanasoff, a professor of physics at Iowa State University, after a frustrating day performing tedious mathematical calculations in his lab, hit on the idea that the binary number system and electronic switches, combined with an array of capacitors on a moving drum to serve as memory, could yield a computing machine that would make his life and the lives of other similarly burdened scientists easier. Then he went back and built the machine. It worked. The whole world changed. Why don’t we know the name of John Atanasoff as well as we know those of Alan Turing and John von Neumann? Because he never patented the device, and because the developers of the far-better-known ENIAC almost certainly stole critical ideas from him. But in 1973 a court declared that the patent on that Sperry Rand device was invalid, opening the intellectual property gates to the computer revolution. Jane Smiley tells the quintessentially American story of the child of immigrants John Atanasoff with technical clarity and narrative drive, making the race to develop digital computing as gripping as a real-life techno-thriller.

The Ultimate Entrepreneur: The Story of Ken Olsen and Digital Equipment Corporation
Glenn Rifkin, George Harrar. Contemporary Books, 1988
The first full-length portrait of Olsen and his company describes the hectic pace of DEC's growth; the engineers' revolt that led to the formation of Data General; the loss of the personal computer market to IBM and Apple; and Wall Street's call for the ouster of Ken Olsen in 1983.

Turing's Vision: The Birth of Computer Science
Chris Bernhardt. MIT Press, 2016
In 1936, when he was just twenty-four years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing's Vision, Chris Bernhardt explains the theory, Turing's most important contribution, for the general reader. Bernhardt argues that the strength of Turing's theory is its simplicity, and that, explained in a straightforward manner, it is eminently understandable by the nonspecialist. As Marvin Minsky writes, “The sheer simplicity of the theory's foundation and extraordinary short path from this foundation to its logical and surprising conclusions give the theory a mathematical beauty that alone guarantees it a permanent place in computer theory.” Bernhardt begins with the foundation and systematically builds to the surprising conclusions. He also views Turing's theory in the context of mathematical history, other views of computation (including those of Alonzo Church), Turing's later work, and the birth of the modern computer.



books/biographies.txt · Last modified: 2020/06/10 16:35 by system