Why Winning a Trade War Can Lose the U.S. Battle over Innovation

For years, American companies have complained that foreign competitors, the Japanese in the past and more recently the Chinese, were stealing, borrowing, and copying intellectual property. The complaint is hardly new. Foreign theft of American trade secrets costs an estimated "between $225 billion and $600 billion annually," according to the United States Trade Representative. It is an age-old grievance, one that has gone on for centuries.

In the 1850s, half of Britain's exports were cotton goods, and by the early 20th century, half of the world's cotton clothes were British-made. Yet companies from the United States overtook their British counterparts in less than two decades. Those U.S. companies that came to dominate the textile industry were then surpassed by yet another wave of new competitors, this time from Asia: first Japan, then Hong Kong, next Taiwan and South Korea, and finally China, India, and Bangladesh.

As competition raged, the large, populous mill towns that once made up the manufacturing basins across New England and the Carolinas were reduced to ghost towns. Industrial buildings were either painfully repurposed or abandoned entirely. Unemployment, de-urbanization, and rising crime rates plagued these formerly vibrant cities.

It is this understandable fear of ghost towns and abandoned cities that our politics has come to depend upon. The sense of grievance is used to justify imposing tariffs on imports from China to Mexico to Europe, in the belief that "protection will lead to great strength and prosperity" and that the existing trade deal is "a disaster" and "one-sided." Yet some companies have continued to thrive over the course of centuries, whether their grievances were heard or ignored.

The complaints that industry knowledge is being copied and that other nations are catching up fast are the same ones made more than a century ago by the German drug maker Hoechst against its Swiss copycats CIBA, Geigy, and Sandoz. In the land of counterfeiters, or le pays de contre-facteurs, as the French called it, Switzerland had yet to establish patent laws as late as 1888. Local chemists were free, even encouraged, to imitate foreign inventions. When antipyrine, the first synthetic fever-reducing drug, was created in the laboratory, the world couldn't get enough of it, and the Swiss sold their German imitation as quickly as they could produce it. Organic chemistry was the hotbed of innovation. Only when Alexander Fleming discovered the antibiotic penicillin did everyone understand that the next blockbuster would no longer come from chemistry alone but from an entirely new discipline: microbiology.

Immediately following the Second World War, nascent pharmaceutical firms from Europe to the U.S. rushed to set up soil-screening programs around the world, hunting for exotic fungi and pay dirt in search of increasingly potent antibiotics. Field workers collected soil from cemeteries and sent balloons up in the air to gather windblown samples. They went down to the bottoms of mine shafts, hiked up to mountaintops, and ventured anywhere else in between. The study of microbes took center stage, displacing organic chemistry as the key discipline for scientific discovery. Along with new techniques of deep-tank fermentation and the purification of medicines, the world saw a precipitous drop in infectious disease. What was once a nasty, brutish, and fatal infection was transformed into a curable inconvenience.

Then came the biotechnology revolution, beginning in the 1970s. Scientists marveled at the inner workings of chromosomes within the nuclei of cells and eventually became capable of recombining DNA molecules, instructing bacteria to produce insulin for diabetics, and producing many other active ingredients that couldn't be abundantly harvested from nature alone. Then, with the complete sequencing of the human genome and advancements in computational applications, genetic engineering essentially went digital. Scientists are now uncovering molecular pathways and the biological underpinnings of rare cancers. We have again witnessed the transition into a new knowledge frontier, a discipline under the rubric of genomics and bioengineering.

Today, it takes extravagantly equipped laboratories, huge budgets, and large teams of investigators to stay at the forefront of the industry. Switzerland's Novartis alone spent $10 billion on research and development in 2013. From cancer therapy to HIV treatment, the Western pioneers still lead the global industry in the latest developments. In contrast to the automotive industry, where global competition has decimated Detroit and the surrounding American Rust Belt, the wealth in Basel, where Novartis is headquartered, seems forever bountiful. Its inhabitants continue to enjoy among the highest standards of living in Western Europe.

The capital expenditure of the pharmaceutical industry, however, can never properly explain why newcomers in Asia have failed to overtake the Western pioneering incumbents. In fact, capital expenditure, trade secrets, and patent protection were hardly a challenge for the United States when it overtook Great Britain as the leading exporter of textiles at the turn of the century. The same went for heavy machinery, wind turbines, solar panels, personal computers, mobile phones, and automobiles, where latecomers irrevocably displaced early pioneers.

What the history of pharmaceuticals has shown is that the leap toward a knowledge discipline—first chemistry, next microbiology, and then genomics—opened new paths for innovation. And only by infusing new knowledge and constantly changing the way a product or service is delivered could a pioneer create headroom for growth, thus preventing latecomers from catching up. For a nation to prosper economically, its industrial clusters must be encouraged to leap.

That's why, for all the talk of trade wars and import tariffs, the big headlines are distracting the public from what truly matters.

I remember attending a start-up pitch night right before Christmas last year, where entrepreneurs presented their business plans to an audience of investors from the Harvard alumni community in New York. A medical device start-up based in California had just launched a baby-monitoring device that uses imaging technologies to prevent accidental suffocation during sleep. The founding team were all graduate students from MIT who were born in India. The team members, who now live in San Francisco, are mostly responsible for design and research. The entire supply chain, from early prototyping to manufacturing to packaging to logistics, resides in Shenzhen, China. "They are fast and quick and very flexible. I can't find anyone to prototype my product idea this fast and this cheap in the U.S.," the thirty-something CEO told me. "I can't innovate without my counterpart in Asia." In other words, one direct casualty of a trade war would be the innovation clusters in the U.S., because American start-ups would no longer be able to collaborate with Asian suppliers on competitive terms to get their businesses off the ground.

Meanwhile, the high-profile punishment of China's ZTE for violating sanctions on Iran and North Korea has only strengthened China's determination to become semiconductor-independent. The U.S. Commerce Department's original decision to ban ZTE from buying microprocessors from Qualcomm and other American technologies, though later reversed and put on hold by Trump, created enough anxiety that Beijing realized it could no longer depend on foreign imports. In April, China announced a "Big Fund" of about $47.4 billion to spur the development of its semiconductor industry. Ensuring that its domestic companies can leap from being low-cost providers, and ridding itself of dependence on foreign suppliers for critical parts, has become a matter of survival. In fact, when America stops selling technologies to China, it is in effect imposing import substitution, an economic policy that forces indigenous companies to master their own industry's know-how. Japan, South Korea, Taiwan, and Israel have all at some point embraced import substitution to nurture their infant industries before domestic firms were able to leap onto the world stage.

Even if the White House manages to reduce the trade deficit through a hardline approach and clever bargaining, the key question is how long those effects can last. What industrial history tells us is that a vibrant economy depends on a set of companies that stay globally competitive. The way the global supply chain is currently configured precisely frees up time and resources for American companies to specialize in the highest value-added activities at the top end. The biggest risk of a trade battle, or of recrimination against copycats, is that it disrupts the global supply chain and makes it harder for American firms to innovate. Losing the battle over innovation would be a great tragedy.


Howard Yu is the LEGO professor of management and innovation at the IMD Business School in Switzerland and the author of LEAP: How Businesses Thrive in a World Where Everything Can Be Copied (PublicAffairs, June 2018).
