NASA’s New Shortcut to Fusion Power - IEEE Spectrum


Lattice confinement fusion eliminates massive magnets and powerful lasers


Physicists first suspected more than a century ago that the fusing of hydrogen into helium powers the sun. It took researchers many years to unravel the secrets by which lighter elements are smashed together into heavier ones inside stars, releasing energy in the process. And scientists and engineers have continued to study the sun’s fusion process in hopes of one day using nuclear fusion to generate heat or electricity. But the prospect of meeting our energy needs this way remains elusive.

The extraction of energy from nuclear fission, by contrast, happened relatively quickly. Fission in uranium was discovered in 1938, in Germany, and it was only four years until the first nuclear “pile” was constructed in Chicago, in 1942.

There are currently about 440 fission reactors operating worldwide, which together can generate about 400 gigawatts of power with zero carbon emissions. Yet these fission plants, for all their value, have considerable downsides. The enriched uranium fuel they use must be kept secure. Devastating accidents, like the one at Fukushima in Japan, can leave areas uninhabitable. Fission waste by-products need to be disposed of safely, and they remain radioactive for thousands of years. Consequently, governments, universities, and companies have long looked to fusion to remedy these ills.

Among those interested parties is NASA. The space agency has significant energy needs for deep-space travel, including probes and crewed missions to the moon and Mars. For more than 60 years, photovoltaic cells, fuel cells, or radioisotope thermoelectric generators (RTGs) have provided power to spacecraft. RTGs, which rely on the heat produced when nonfissile plutonium-238 decays, have demonstrated excellent longevity—both Voyager probes use such generators and remain operational nearly 45 years after their launch, for example. But these generators convert heat to electricity at roughly 7.5 percent efficiency. And modern spacecraft need more power than an RTG of reasonable size can provide.
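The arithmetic behind that limitation is straightforward. Here is a minimal sketch: the 7.5 percent conversion efficiency is from the article, but the thermal-power value below is an assumed example, not the specification of any particular generator.

```python
# Why RTG electric output is limited: only a small fraction of the
# decay heat becomes electricity. The 7.5% figure is quoted in the
# article; the 2,000 W thermal input is an illustrative assumption.
RTG_EFFICIENCY = 0.075

def electric_watts(thermal_watts: float) -> float:
    """Electric power delivered for a given decay-heat input."""
    return thermal_watts * RTG_EFFICIENCY

# A generator producing 2,000 W of decay heat yields only ~150 W electric.
print(f"~{electric_watts(2000):.0f} W electric")
```

Scaling up the electric output means scaling up the plutonium mass and radiator area roughly in proportion, which is why an RTG of reasonable size tops out well below what modern missions demand.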

One promising alternative is lattice confinement fusion (LCF), a type of fusion in which the nuclear fuel is bound in a metal lattice. The confinement encourages positively charged nuclei to fuse because the high electron density of the conductive metal reduces the likelihood that two nuclei will repel each other as they get closer together.

Deuterated erbium (chemical symbol ErD3) is placed into thumb-size vials, as shown in this set of samples from a 20 June 2018 experiment. Here, the vials are arrayed pre-experiment, with wipes on top of the metal to keep the metal in position during the experiment. The metal has begun to crack and break apart, indicating it is fully saturated.NASA

The vials are placed upside down to align the metal with the gamma ray beam. Gamma rays have turned the clear glass amber.NASA

We and other scientists and engineers at NASA Glenn Research Center, in Cleveland, are investigating whether this approach could one day provide enough power to operate small robotic probes on the surface of Mars, for example. LCF would eliminate the need for fissile materials such as enriched uranium, which can be costly to obtain and difficult to handle safely. LCF promises to be less expensive, smaller, and safer than other strategies for harnessing nuclear fusion. And as the technology matures, it could also find uses here on Earth, such as for small power plants for individual buildings, which would reduce fossil-fuel dependency and increase grid resiliency.

Physicists have long thought that fusion should be able to provide clean nuclear power. After all, the sun generates power this way. But the sun has a tremendous size advantage. At nearly 1.4 million kilometers in diameter, with a plasma core 150 times as dense as liquid water and heated to 15 million °C, the sun uses heat and gravity to force particles together and keep its fusion furnace stoked.

On Earth, we lack the ability to produce energy this way. A fusion reactor needs to reach a critical combination of fuel-particle density, confinement time, and plasma temperature (called the Lawson criterion, after physicist John Lawson) to achieve a net-positive energy output. And so far, nobody has done that.
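The criterion is often stated as a triple product of those three quantities. As a rough illustration, the threshold value and the example plasma parameters below are textbook approximations for deuterium-tritium fuel, not figures from the NASA work:

```python
# Rough check of the Lawson triple-product condition for D-T fusion.
# The ~3e21 keV*s/m^3 threshold and the example plasma parameters are
# textbook approximations, used here only to show how the test works.
LAWSON_TRIPLE_PRODUCT_DT = 3e21  # keV * s / m^3, approximate D-T threshold

def meets_lawson(density_m3: float, temperature_kev: float,
                 confinement_s: float) -> bool:
    """True if n * T * tau exceeds the approximate D-T ignition threshold."""
    return density_m3 * temperature_kev * confinement_s >= LAWSON_TRIPLE_PRODUCT_DT

# A tokamak-like plasma (1e20 m^-3 at 15 keV) held for 1 second falls short...
print(meets_lawson(1e20, 15.0, 1.0))   # False
# ...but the same plasma confined for 3 seconds crosses the threshold.
print(meets_lawson(1e20, 15.0, 3.0))   # True
```

The difficulty is that the three factors fight each other: raising density or temperature makes the plasma harder to confine, which shortens the confinement time.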

Lighting the Fusion Fire

In lattice confinement fusion (LCF), a beam of gamma rays is directed at a sample of erbium [shown here] or titanium saturated with deuterons. Occasionally, gamma rays of sufficient energy will break apart a deuteron in the metal lattice into its constituent proton and neutron.

The neutron collides with another deuteron in the lattice, imparting some of its own momentum to the deuteron. The electron-screened deuteron is now energetic enough to overcome the Coulomb barrier, which would typically repel it from another deuteron.

When the energetic deuteron fuses with another deuteron in the lattice, it can produce a helium-3 nucleus (helion) and give off useful energy. A leftover neutron could provide the push for another energetic deuteron elsewhere.

Alternatively, the fusing of the two deuterons could result in a hydrogen-3 nucleus (triton) and a leftover proton. This reaction also produces useful energy.

Another possible reaction in lattice confinement fusion would happen if an erbium atom instead rips apart the energetic deuteron and absorbs the proton. The extra proton changes the erbium atom to thulium and releases energy.

If the erbium atom absorbs the neutron, it becomes a new isotope of erbium. This is an Oppenheimer-Phillips (OP) stripping reaction. The proton from the broken-apart deuteron heats the lattice.

Fusion reactors commonly utilize two different hydrogen isotopes: deuterium (one proton and one neutron) and tritium (one proton and two neutrons). These are fused into helium nuclei (two protons and two neutrons)—also called alpha particles—with an unbound neutron left over.

Existing fusion reactors rely on the resulting alpha particles—and the energy released in the process of their creation—to further heat the plasma. The plasma will then drive more nuclear reactions with the end goal of providing a net power gain. But there are limits. Even in the hottest plasmas that reactors can create, alpha particles will mostly skip past additional deuterium nuclei without transferring much energy. For a fusion reactor to be successful, it needs to create as many direct hits between alpha particles and deuterium nuclei as possible.

In the 1950s, scientists created various magnetic-confinement fusion devices, the most well known of which were Andrei Sakharov’s tokamak and Lyman Spitzer’s stellarator. Setting aside differences in design particulars, each attempts the near-impossible: Heat a gas enough for it to become a plasma and magnetically squeeze it enough to ignite fusion—all without letting the plasma escape.

Inertial-confinement fusion devices followed in the 1970s. They used lasers and ion beams either to compress the surface of a target in a direct-drive implosion or to energize an interior target container in an indirect-drive implosion. Unlike magnetically confined reactions, which can last for seconds or even minutes (and perhaps one day, indefinitely), inertial-confinement fusion reactions last less than a microsecond before the target disassembles, thus ending the reaction.

Both types of devices can create fusion, but so far they are incapable of generating enough energy to offset what’s needed to initiate and maintain the nuclear reactions. In other words, more energy goes in than comes out. Hybrid approaches, collectively called magneto-inertial fusion, face the same issues.

Proton: Positively charged protons (along with neutrons) make up atomic nuclei. One component of lattice confinement fusion (LCF) may occur when a proton is absorbed by an erbium atom in a deuteron stripping reaction.

Neutron: Neutrally charged neutrons (along with protons) make up atomic nuclei. In fusion reactions, they impart energy to other particles such as deuterons. They also can be absorbed in Oppenheimer-Phillips reactions.

Erbium & Titanium: Erbium and titanium are the metals of choice for LCF. Relatively colossal compared with the other particles involved, they hold the deuterons and screen them from one another.

Deuterium: Deuterium is hydrogen with one proton and one neutron in its nucleus (hydrogen with just the proton is protium). Deuterium’s nucleus, called a deuteron, is crucial to LCF.

Deuteron: The nucleus of a deuterium atom. Deuterons are vital to LCF—the actual fusion instances occur when an energetic deuteron smashes into another in the lattice. They can also be broken apart in stripping reactions.

Hydrogen-3 (Tritium): One possible resulting particle from deuteron-deuteron fusion, alongside a leftover proton. Tritium has one proton and two neutrons in its nucleus, which is also called a triton.

Helium-3: One possible resulting particle from deuteron-deuteron fusion, alongside a leftover neutron. Helium-3 has two protons and one neutron in its nucleus, which is also called a helion.

Alpha particle: The core of a normal helium atom (two protons and two neutrons). Alpha particles are a commonplace result of typical fusion reactors, which often smash deuterium and tritium particles together. They can also emerge from LCF reactions.

Gamma ray: Extremely energetic photons that are used to kick off the fusion reactions in a metal lattice by breaking apart deuterons.

Current fusion reactors also require copious amounts of tritium as one part of their fuel mixture. The most reliable source of tritium is a fission reactor, which somewhat defeats the purpose of using fusion.

The fundamental problem of these techniques is that the atomic nuclei in the reactor need to be energetic enough—meaning hot enough—to overcome the Coulomb barrier, the natural tendency for the positively charged nuclei to repel one another. Because of the Coulomb barrier, fusing atomic nuclei have a very small fusion cross section, meaning the probability that two particles will fuse is low. You can increase the cross section by raising the plasma temperature to 100 million °C, but that requires increasingly heroic efforts to confine the plasma. As it stands, after billions of dollars of investment and decades of research, these approaches, which we’ll call “hot fusion,” still have a long way to go.

The barriers to hot fusion here on Earth are indeed tremendous. As you can imagine, they’d be even more overwhelming on a spacecraft, which can’t carry a tokamak or stellarator onboard. Fission reactors are being considered as an alternative—NASA successfully tested the Kilopower fission reactor at the Nevada National Security Site in 2018 using a uranium-235 core about the size of a paper towel roll. The Kilopower reactor could produce up to 10 kilowatts of electric power. The downside is that it required highly enriched uranium, which would have brought additional launch safety and security concerns. This fuel also costs a lot.

But fusion could still work, even if the conventional hot-fusion approaches are nonstarters. LCF technology could be compact enough, light enough, and simple enough to serve for spacecraft.

How does LCF work? Remember that we earlier mentioned deuterium, the isotope of hydrogen with one proton and one neutron in its nucleus. Deuterided metals—erbium and titanium, in our experiments—have been “saturated” with either deuterium or deuterium atoms stripped of their electrons (deuterons). This is possible because the metal naturally exists in a regularly spaced lattice structure, which creates equally regular slots between the metal atoms in which deuterons can nest.

In a tokamak or a stellarator, the hot plasma is limited to a density of 10¹⁴ deuterons per cubic centimeter. Inertial-confinement fusion devices can momentarily reach densities of 10²⁶ deuterons per cubic centimeter. It turns out that metals like erbium can indefinitely hold deuterons at a density of nearly 10²³ per cubic centimeter—far higher than the density that can be attained in a magnetic-confinement device, and only three orders of magnitude below that attained in an inertial-confinement device. Crucially, these metals can hold that many ions at room temperature.
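The orders-of-magnitude comparison above can be checked directly from the three densities the article quotes:

```python
import math

# Deuteron number densities quoted in the article (per cm^3).
n_magnetic = 1e14   # tokamak / stellarator plasma
n_lattice  = 1e23   # erbium deuteride, at room temperature
n_inertial = 1e26   # inertial confinement, for under a microsecond

# The lattice holds ~9 orders of magnitude more deuterons than a
# magnetic-confinement plasma...
print(round(math.log10(n_lattice / n_magnetic)))   # 9
# ...and only 3 orders of magnitude fewer than an inertial implosion.
print(round(math.log10(n_inertial / n_lattice)))   # 3
```

Because fusion rate scales with the square of fuel density, a billion-fold density advantage over magnetic confinement is what makes a room-temperature lattice interesting at all, despite its much lower particle energies.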

The deuteron-saturated metal forms a plasma with neutral charge. The metal lattice confines and electron-screens the deuterons, keeping each of them from “seeing” adjacent deuterons (which are all positively charged). This screening increases the chances of more direct hits, which further promotes the fusion reaction. Without the electron screening, two deuterons would be much more likely to repel each other.

Using a metal lattice that has screened a dense, cold plasma of deuterons, we can jump-start the fusion process using what is called a Dynamitron electron-beam accelerator. The electron beam hits a tantalum target and produces gamma rays, which then irradiate thumb-size vials containing titanium deuteride or erbium deuteride.

When a gamma ray of sufficient energy—about 2.2 megaelectron volts (MeV)—strikes one of the deuterons in the metal lattice, the deuteron breaks apart into its constituent proton and neutron. The released neutron may collide with another deuteron, accelerating it much as a pool cue accelerates a ball when striking it. This second, energetic deuteron then goes through one of two processes: screened fusion or a stripping reaction.
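The kinematics behind that pool-cue analogy can be sketched with nonrelativistic formulas. The 2.224 MeV deuteron binding energy is a standard value; the example gamma-ray energy and the head-on collision case below are idealizations, not measurements from the experiment:

```python
# Energy bookkeeping for the photodissociation step (idealized,
# nonrelativistic). The 2.5 MeV gamma energy is an assumed example.
E_GAMMA = 2.5        # MeV, example gamma-ray energy (assumption)
BINDING_D = 2.224    # MeV, deuteron binding energy (standard value)

# Energy left over after the deuteron splits is shared roughly equally
# between the proton and neutron (their masses are nearly equal).
excess = E_GAMMA - BINDING_D
e_neutron = excess / 2
print(f"neutron kinetic energy ~ {e_neutron:.3f} MeV")   # ~0.138 MeV

# Maximum fraction of the neutron's energy transferred to a deuteron
# in a head-on elastic collision: 4*m*M / (m + M)^2, with M ~ 2m.
m_n, m_d = 1.0, 2.0   # masses in neutron-mass units (approximate)
frac = 4 * m_n * m_d / (m_n + m_d) ** 2
print(f"max energy transfer fraction = {frac:.3f}")      # 0.889
```

In other words, a deuteron struck head-on picks up nearly nine-tenths of the neutron's kinetic energy, which is what makes a single photodissociation event enough to launch an energetic deuteron through the lattice.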

In screened fusion, which we have observed in our experiments, the energetic deuteron fuses with another deuteron in the lattice. The fusion reaction will result in either a helium-3 nucleus and a leftover neutron or a hydrogen-3 nucleus and a leftover proton. These fusion products may fuse with other deuterons, creating an alpha particle, or with another helium-3 or hydrogen-3 nucleus. Each of these nuclear reactions releases energy, helping to drive more instances of fusion.

In a stripping reaction, an atom like the titanium or erbium in our experiments strips the proton or neutron from the deuteron and captures that proton or neutron. Erbium, titanium, and other heavier atoms preferentially absorb the neutron because the proton is repulsed by the positively charged nucleus (called an Oppenheimer-Phillips reaction). It is theoretically possible, although we haven’t observed it, that the electron screening might allow the proton to be captured, transforming erbium into thulium or titanium into vanadium. Both kinds of stripping reactions would produce useful energy.

To be sure that we were actually producing fusion in our vials of erbium deuteride and titanium deuteride, we used neutron spectroscopy. This technique detects the neutrons that result from fusion reactions. When deuteron-deuteron fusion produces a helium-3 nucleus and a neutron, that neutron has an energy of 2.45 MeV. So when we detected 2.45 MeV neutrons, we knew fusion had occurred. That’s when we published our initial results in Physical Review C.
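The 2.45 MeV figure follows from momentum conservation: the energy released by the D(d,n)³He branch (about 3.27 MeV, a standard value) splits between the neutron and the helion in inverse proportion to their masses, assuming the fusing deuterons are nearly at rest:

```python
# Why a D-D fusion neutron carries 2.45 MeV (fusing deuterons assumed
# nearly at rest; masses rounded to whole atomic mass units).
Q_DD_HE3 = 3.27        # MeV, Q value of d + d -> He-3 + n (standard value)
m_n, m_he3 = 1.0, 3.0  # approximate masses of neutron and helion

# Momentum conservation gives the light neutron the heavy partner's
# share of the mass ratio: E_n = Q * m_He3 / (m_n + m_He3).
e_neutron = Q_DD_HE3 * m_he3 / (m_n + m_he3)
print(f"E_n = {e_neutron:.2f} MeV")   # 2.45
```

A detected neutron at exactly this energy is therefore a clean signature of deuteron-deuteron fusion, distinct from the lower-energy neutrons produced by the gamma-ray photodissociation step.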

Electron screening makes it seem as though the deuterons are fusing at a temperature of 11 million °C. In reality, the metal lattice remains much cooler than that, although it heats up somewhat from room temperature as the deuterons fuse.

Rich Martin [left], a research engineer, and coauthor Bruce Steinetz, principal investigator for the LCF project’s precursor experiment, examine samples after a run. NASA

Overall, in LCF, most of the heating occurs in regions just tens of micrometers across. This is far more efficient than in magnetic- or inertial-confinement fusion reactors, which heat up the entire fuel amount to very high temperatures. LCF isn’t cold fusion—it still requires energetic deuterons and can use neutrons to heat them. However, LCF also removes many of the technological and engineering barriers that have prevented other fusion schemes from being successful.

Although the neutron recoil technique we’ve been using is the most efficient means to transfer energy to cold deuterons, producing neutrons from a Dynamitron is energy intensive. There are other, lower energy methods of producing neutrons including using an isotopic neutron source, like americium-beryllium or californium-252, to initiate the reactions. We also need to make the reaction self-sustaining, which may be possible using neutron reflectors to bounce neutrons back into the lattice—carbon and beryllium are examples of common neutron reflectors. Another option is to couple a fusion neutron source with fission fuel to take advantage of the best of both worlds. Regardless, there’s more development of the process required to increase the efficiency of these lattice-confined nuclear reactions.

We’ve also triggered nuclear reactions by pumping deuterium gas through a thin wall of a palladium-silver alloy tubing, and by electrolytically loading palladium with deuterium. In the latter experiment, we’ve detected fast neutrons. The electrolytic setup is now using the same neutron-spectroscopy detection method we mentioned above to measure the energy of those neutrons. The energy measurements we get will inform us about the kinds of nuclear reaction that produce them.

We’re not alone in these endeavors. Researchers at Lawrence Berkeley National Laboratory, in California, with funding from Google Research, achieved favorable results with a similar electron-screened fusion setup. Researchers at the U.S. Naval Surface Warfare Center, Indian Head Division, in Maryland have likewise gotten promising initial results using an electrochemical approach to LCF. There are also upcoming conferences: the American Nuclear Society’s Nuclear and Emerging Technologies for Space conference in Cleveland in May and the International Conference on Cold Fusion 24, focused on solid-state energy, in Mountain View, Calif., in July.

Any practical application of LCF will require efficient, self-sustaining reactions. Our work represents just the first step toward realizing that goal. If the reaction rates can be significantly boosted, LCF may open an entirely new door for generating clean nuclear energy, both for space missions and for the many people who could use it here on Earth.

Bayar Baramsai is a systems engineer at NASA Glenn Research Center contributing to the lattice confinement fusion project.

Theresa Benyo is a physicist and the principal investigator for the lattice confinement fusion project at NASA Glenn Research Center.

Lawrence Forsley is the deputy principal investigator for NASA’s lattice confinement fusion project, based at NASA Glenn Research Center.

Bruce Steinetz is a senior technologist at NASA Glenn Research Center involved in the lattice confinement fusion project.

It is great that this research continues to refine the decades of previous work in related lattice areas. For those interested you can google: Condensed Matter Nuclear Science. And International Society for CMNS. There have been 23 international conferences since 1989, with the 24th scheduled for this July, ICCF24, in Mountain View, CA, same city as the Google headquarters.

This article is like a Turing Machine. It's proven. Bone-head simple. Anyone can understand it, particularly particle physicists, because it uses only familiar tools and the evidence is high-energy particles. However, the implications are that fusion can happen in metals 10 billion times more easily than in gas or plasma. That ought to set fusion scientists back on their heels. At the least, consider a metal hydride target. Also take another look at solid-state atomic and fusion energy, aka low-energy nuclear reactions. They get more plausible now that we have proven electron shielding works. This bolsters the observations that Earth must have fusion reactions inside to account for the flux of geothermal heat energy, the magnetic shield, plate tectonics, and the helium flux and isotope ratios. It has been odd that the heat from Earth's core seems to come from the solid inner core, and is convected by iron gyres to the mantle. But the solid core and liquid core are made of the same materials? Why would the solid be sending convection gyres out? Perhaps there is some heat-producing reaction that works in a solid but not a liquid. Here it is! Lattice-confined fusion or LENR might be in Earth's core, keeping us safe from the solar wind by powering our magnetic force field!

As I understand, Sakharov’s ideas were originally developed thinking of a space propulsion system, the idea of turning it into an energy reactor came later and overlooked that fact entirely.

Anyway, I see briefly mentioned that you are also doing deuterium diffusion through palladium experiments (I read the paper in the International Journal of Hydrogen Energy) and also electrolytic palladium loading (I read the paper where you counted an increase in neutrons with bubble detectors).

I can only say: isn't this 1989 all over again? I am just astonished that 32 years had to pass for people to begin taking LENR seriously again. Thanks to all involved in this important research; please keep an open mind. This is not just about smashing particles together: there's an entire electromagnetic aspect being completely overlooked.

A new book tells the story of how they broke a computer-science glass ceiling

Joanna Goodrich is the associate editor of The Institute, covering the work and accomplishments of IEEE members and IEEE and technology-related events. She has a master's degree in health communications from Rutgers University, in New Brunswick, N.J.

Jean Jennings (left) and Frances Bilas, two of the ENIAC programmers, are preparing the computer for Demonstration Day in February 1946.

If you looked at the pictures of those working on the first programmable, general-purpose all-electronic computer, you would assume that J. Presper Eckert and John W. Mauchly were the only ones who had a hand in its development. Invented in 1945, the Electronic Numerical Integrator and Computer (ENIAC) was built to improve the accuracy of U.S. artillery during World War II. The two men and their team built the hardware. But hidden behind the scenes were six women—Jean Bartik, Kathleen Antonelli, Marlyn Meltzer, Betty Holberton, Frances Spence, and Ruth Teitelbaum—who programmed the computer to calculate artillery trajectories in seconds.

The U.S. Army recruited the women in 1942 to work as so-called human computers—mathematicians who did calculations using a mechanical desktop calculator.

For decades, the six women were largely unknown. But thanks to Kathy Kleiman, cofounder of ICANN (the Internet Corporation for Assigned Names and Numbers), the world is getting to know the ENIAC programmers’ contributions to computer science. This year Kleiman’s book Proving Ground: The Untold Story of the Six Women Who Programmed the World’s First Modern Computer was published. It delves into the women’s lives and the pioneering work they did. The book follows an award-winning documentary, The Computers: The Remarkable Story of the ENIAC Programmers, which Kleiman helped produce. It premiered at the 2014 Seattle International Film Festival and won Best Documentary Short at the 2016 U.N. Association Film Festival.

Kleiman plans to give a presentation next year about the programmers as part of the IEEE Industry Hub Initiative’s Impact Speaker series. The initiative aims to introduce industry professionals and academics to IEEE and its offerings.

Planning for the event, which is scheduled to be held in Silicon Valley, is underway. Details are to be announced before the end of the year.

The Institute spoke with Kleiman, who teaches Internet technology and governance for lawyers at American University, in Washington, D.C., about her mission to publicize the programmers’ contributions. The interview has been condensed and edited for clarity.

Kathy Kleiman delves into the ENIAC programmers’ lives and the pioneering work they did in her book Proving Ground: The Untold Story of the Six Women Who Programmed the World’s First Modern Computer.Kathy Kleiman

The Institute: What inspired you to film the documentary?

Kathy Kleiman: The ENIAC was a secret project of the U.S. Army during World War II. It was the first general-purpose, programmable, all-electronic computer—the key to the development of our smartphones, laptops, and tablets today. The ENIAC was a highly experimental computer, with 18,000 vacuum tubes, and some of the leading technologists at the time didn’t think it would work, but it did.

Six months after the war ended, the Army decided to reveal the existence of ENIAC and heavily publicize it. To do so, in February 1946 the Army took a lot of beautiful, formal photos of the computer and the team of engineers that developed it. I found these pictures while researching women in computer science as an undergraduate at Harvard. At the time, I knew of only two women in computer science: Ada Lovelace and then U.S. Navy Capt. Grace Hopper. [Lovelace was the first computer programmer; Hopper co-developed COBOL, one of the earliest standardized computer languages.] But I was sure there were more women programmers throughout history, so I went looking for them and found the images taken of the ENIAC.

The pictures fascinated me because they had both men and women in them. Some of the photos had just women in front of the computer, but they weren’t named in any of the photos’ captions. I tracked them down after I found their identities, and four of six original ENIAC programmers responded. They were in their late 70s at the time, and over the course of many years they told me about their work during World War II and how they were recruited by the U.S. Army to be “human computers.”

Eckert and Mauchly promised the U.S. Army that the ENIAC could calculate artillery trajectories in seconds rather than the hours it took to do the calculations by hand. But after they built the 2.5-meter-tall by 24-meter-long computer, they couldn’t get it to work. Out of approximately 100 human computers working for the U.S. Army during World War II, six women were chosen to write a program for the computer to run differential calculus equations. It was hard because the program was complex, memory was very limited, and the direct programming interface that connected the programmers to the ENIAC was hard to use. But the women succeeded. The trajectory program was a great success. But Bartik, McNulty, Meltzer, Snyder, Spence, and Teitelbaum’s contributions to the technology were never recognized. Leading technologists and the public never knew of their work.

I was inspired by their story and wanted to share it. I raised funds, researched and recorded 20 hours of broadcast-quality oral histories with the ENIAC programmers—which eventually became the documentary. It allows others to see the women telling their story.

“If we open the doors to history, I think it would make it a lot easier to recruit the wonderful people we are trying to urge to enter engineering, computer science, and related fields.”

Why was the accomplishment of the six women important?

Kleiman: The ENIAC is considered by many to have launched the information age.

We generally think of women leaving the factory and farm jobs they held during World War II and giving them back to the men, but after ENIAC was completed, the six women continued to work for the U.S. Army. They helped world-class mathematicians program the ENIAC to complete “hundred-year problems” [problems that would take 100 years to solve by hand]. They also helped teach the next generation of ENIAC programmers, and some went on to create the foundations of modern programming.

What influenced you to continue telling the ENIAC programmers’ story in your book?

Kleiman: After my documentary premiered at the film festival, young women from tech companies who were in the audience came up to me to share why they were excited to learn the programmers’ story. They were excited to learn that women were an integral part of the history of early computing programming, and were inspired by their stories. Young men also came up to me and shared stories of their grandmothers and great-aunts who programmed computers in the 1960s and ’70s and inspired them to explore careers in computer science.

I met more women and men like the ones in Seattle all over the world, so it seemed like a good idea to tell the full story along with its historical context and background information about the lives of the ENIAC programmers, specifically what happened to them after the computer was completed.

What did you find most rewarding about sharing their story?

Kleiman: It was wonderful and rewarding to get to know the ENIAC programmers. They were incredible, wonderful, warm, brilliant, and exceptional people. Talking to the people who created the programming was inspiring and helped me to see that I could work at the cutting edge too. I entered Internet law as one of the first attorneys in the field because of them.

What I enjoy most is that the women’s experiences inspire young people today just as they inspired me when I was an undergraduate.

Clockwise from top left: Jean Bartik, Kathleen Antonelli, Betty Holberton, Ruth Teitelbaum, Marlyn Meltzer, Frances Spence.Clockwise from top left: The Bartik Family; Bill Mauchly, Priscilla Holberton, Teitelbaum Family, Meltzer Family, Spence Family

Is it important to highlight the contributions made throughout history by women in STEM?

Kleiman: [Actor] Geena Davis founded the Geena Davis Institute on Gender in Media, which works collaboratively with the entertainment industry to dramatically increase the presence of female characters in media. It’s based on the philosophy of “you can’t be what you can’t see.”

That philosophy is both right and wrong. I think you can be what you can’t see, and certainly every pioneer who has ever broken a racial, ethnic, religious, or gender barrier has done so. However, it’s certainly much easier to enter a field if there are role models who look like you. To that end, many computer scientists today are trying to diversify the field. Yet I know from my work in Internet policy and my recent travels across the country for my book tour that many students still feel locked out because of old stereotypes in computing and engineering. By sharing strong stories of pioneers in the fields who are women and people of color, I hope we can open the doors to computing and engineering. I hope the history and herstory we share make it much easier to recruit young people to join engineering, computer science, and related fields.

Are you planning on writing more books or producing another documentary?

Kleiman: I would like to continue the story of the ENIAC programmers and write about what happened to them after the war ended. I hope that my next book will delve into the 1950s and uncover more about the history of the Universal Automatic Computer, the first modern commercial computer series, and the diverse group of people who built and programmed it.

The ban spotlights semiconductors for supercomputers; China hasn’t yet responded to restrictions

It has now been over a month since the U.S. Commerce Department issued new rules that clamped down on the export of certain advanced chips—which have military or AI applications—to Chinese customers.

China has yet to respond—but Beijing has multiple options in its arsenal. It’s unlikely, experts say, that the U.S. actions will be the last fighting word in an industry that is becoming more geopolitically sensitive by the day.

This is not the first time that the U.S. government has constrained the flow of chips to its perceived adversaries. Previously, the United States has blocked chip sales to individual Chinese customers. In response to the Russian invasion of Ukraine earlier this year, the United States (along with several other countries, including South Korea and Taiwan) placed Russia under a chip embargo.

But none of these prior U.S. chip bans were as broad as the new rules, issued on 7 October. “This announcement is perhaps the most expansive export control in decades,” says Sujai Shivakumar, an analyst at the Center for Strategic and International Studies, in Washington.

The rules prohibit the sale, to Chinese customers, of advanced chips with both high performance (at least 300 trillion operations per second, or 300 teraops) and fast interconnect speed (generally, at least 600 gigabytes per second). Nvidia’s A100, for comparison, is capable of over 600 teraops and matches the 600 GB/s interconnect speed. Nvidia’s more impressive H100 can reach nearly 4,000 teraops and 900 GB/s. Both chips, intended for data centers and AI training, cannot be sold to Chinese customers under the new rules.
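As a rough illustration, the two-pronged test described above can be expressed as a simple predicate. The thresholds and the A100/H100 figures are the ones quoted in this article; the function name and structure are ours, not anything from the actual regulation.

```python
# Sketch of the two criteria described above; a chip is covered
# only if it meets BOTH thresholds.
PERF_THRESHOLD_TERAOPS = 300        # at least 300 trillion ops/s
INTERCONNECT_THRESHOLD_GBPS = 600   # at least 600 gigabytes/s

def is_export_controlled(teraops: float, interconnect_gbps: float) -> bool:
    """Return True if a chip meets both performance criteria."""
    return (teraops >= PERF_THRESHOLD_TERAOPS
            and interconnect_gbps >= INTERCONNECT_THRESHOLD_GBPS)

# Figures quoted in the article:
print(is_export_controlled(600, 600))    # Nvidia A100 -> True
print(is_export_controlled(4000, 900))   # Nvidia H100 -> True
# A chip with fast compute but slower interconnect would not qualify:
print(is_export_controlled(600, 400))    # -> False
```

Note that the conjunction matters: a chip that clears only one of the two thresholds falls outside the controls, which is the loophole a cut-down interconnect can exploit.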

Additionally, the rules restrict the sale of fabrication equipment if it will knowingly be used to make certain classes of advanced logic or memory chips. This includes logic chips produced at nodes of 16 nanometers or less (which the likes of Intel, Samsung, and TSMC have done since the early 2010s); NAND long-term memory integrated circuits with at least 128 layers (the state of the art today); or DRAM short-term memory integrated circuits produced at 18 nanometers or less (which Samsung began making in 2016).
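The three restricted process classes named above reduce to simple cutoffs. A minimal sketch, using only the numbers in this paragraph; the function names are illustrative, not regulatory terms.

```python
def logic_restricted(node_nm: float) -> bool:
    # Logic chips produced at nodes of 16 nm or less
    return node_nm <= 16

def nand_restricted(layers: int) -> bool:
    # NAND memory with at least 128 layers
    return layers >= 128

def dram_restricted(node_nm: float) -> bool:
    # DRAM produced at 18 nm or less
    return node_nm <= 18

# Examples cited later in the article:
print(logic_restricted(14))   # SMIC's 14-nm process -> True
print(nand_restricted(128))   # YMTC's 128-layer NAND -> True
```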

Chinese chipmakers have barely scratched the surface of those numbers. SMIC switched on 14-nm mass production this year, despite facing existing U.S. sanctions. YMTC started shipping 128-layer NAND chips last year.

The rules restrict not just U.S. companies, but citizens and permanent residents as well. U.S. employees at Chinese semiconductor firms have had to pack up. ASML, a Dutch maker of fabrication equipment, has told U.S. employees to stop servicing Chinese customers.

Speaking of Chinese customers, most—including offices, gamers, designers of smaller chips—probably won’t feel the controls. “Most chip trade and chip production in China is unimpacted,” says Christopher Miller, a historian who studies the semiconductor trade at Tufts University.

The controlled sorts of chips instead go into supercomputers and large data centers, and they’re desirable for training and running large machine-learning models. Most of all, the United States hopes to stop Beijing from using chips to enhance its military—and potentially preempt an invasion of Taiwan, where the vast majority of the world’s semiconductors and microprocessors are produced.

In order to seal off one potential bypass, the controls also apply to non-U.S. firms that rely on U.S.-made equipment or software. For instance, Taiwanese or South Korean chipmakers can’t sell Chinese customers advanced chips that are fabricated with U.S.-made technology.

It’s possible to apply to the U.S. government for an exemption from at least some of the restrictions. Taiwanese fab juggernaut TSMC and South Korean chipmaker SK Hynix, for instance, have already acquired temporary exemptions—for a year. “What happens after that is difficult to say,” says Patrick Schröder, a researcher at Chatham House in London. And the Commerce Department has already stated that such licenses will be the exception, not the rule (although Commerce Department undersecretary Alan Estevez suggested that around two-thirds of licenses get approved).

More export controls may be en route. Estevez indicated that the government is considering placing restrictions on technologies in other sensitive fields—specifically mentioning quantum information science and biotechnology, both of which have seen China-based researchers forge major progress in the past decade.

The Chinese government has so far retorted with harsh words and little action. “We don’t know whether their response will be an immediate reaction or whether they have a longer-term approach to dealing with this,” says Shivakumar. “It’s speculation at this point.”

Beijing could work with foreign companies whose revenue in the lucrative Chinese market is now under threat. “I’m really not aware of a particular company that thinks it’s coming out a winner in this,” says Shivakumar. This week, in the eastern city of Hefei, the Chinese government hosted a chipmakers’ conference whose attendees included U.S. firms AMD, Intel, and Qualcomm.

Nvidia has already responded by introducing a China-specific chip, the A800, which appears to be a modified A100 cut down to meet the requirements. Analysts say that Nvidia’s approach could be a model for other companies to keep up Chinese sales.

There may be other tools the Chinese government can exploit. While China may be dependent on foreign semiconductors, foreign electronics manufacturers are in turn dependent on China for rare-earth metals—and China supplies the supermajority of the world’s rare earths.

There is precedent for China curtailing its rare-earth supply for geopolitical leverage. In 2010, a Chinese fishing boat collided with two Japanese Coast Guard vessels, triggering an international incident when Japanese authorities arrested the boat’s captain. In response, the Chinese government cut off rare-earth exports to Japan for several months.

Certainly, much of the conversation has focused on the U.S. action and the Chinese reaction. But for third parties, the entire dispute delivers constant reminders of just how tense and volatile the chip supply can be. In the European Union, home to less than 10 percent of the world’s microchips market, the debate has bolstered interest in the prospective European Chips Act, a plan to heavily invest in fabrication in Europe. “For Europe in particular, it’s important not to get caught up in this U.S.-China trade issue,” Schröder says.

“The way in which the semiconductor industry has evolved over the past few decades has predicated on a relatively stable geopolitical order,” says Shivakumar. “Obviously, the ground realities have shifted.”

Learn about the latest generation high-performance data acquisition boards from Teledyne

In this webinar, we explain the design principles and operation of our fourth-generation digitizers with a focus on the application programming interface (API).

Register now for this free webinar!

Topics covered in this webinar:

Who should attend? Developers who want to learn more about Teledyne SP Devices’ latest-generation data acquisition boards (digitizers).

What will attendees learn? An overview of existing digitizer products and their specifications, as well as planned upcoming models. Details about the application programming interface (API) and how to operate these devices.


Presenter: Thomas Elter, Senior Field Applications Engineer