Tariff wars and a reshaping of AI’s global landscape

In the aftermath of the 2024 presidential election in the United States, the renewed imposition of substantial tariffs could fundamentally restructure the global technology supply chains that power artificial intelligence (AI) development. While established players recalibrate, countries such as India find themselves in a precarious, yet potentially advantageous, position — as the “third option” in the technological rivalry between the U.S. and China.

The tariffs have raised the costs of imported components that are critical to AI infrastructure. In 2024, electronics imports to the U.S. alone were nearly $486 billion, with data processing machine imports valued at around $200 billion, sourced largely from tariff-affected countries such as Mexico, Taiwan, China, and Vietnam. These tariffs risk making the U.S. the most expensive place in the world to build AI infrastructure, driving companies to relocate data centre construction abroad, ironically even to China.

The first wave of the Trump tariffs, between 2018 and 2020, raised prices for imported semiconductor components. The current tariff regime has expanded this to as high as 27% on critical AI hardware in 2025, particularly affecting specialised AI accelerators and advanced logic chips, the components that constitute the computational foundation of modern AI systems.

Economics behind the scenes

Economic theory suggests such tariff policies should stimulate domestic production through import substitution. Indeed, some reports project that the U.S. will more than triple its domestic semiconductor manufacturing capacity from 2022 to 2032, the largest projected growth rate globally. However, classical Ricardian trade theory reminds us that comparative advantage remains operative even under protectionist regimes. AI hardware production is so specialised that the necessary technical capabilities remain dispersed across countries, creating inevitable inefficiencies when global supply chains are artificially segmented.

This protectionist approach often comes at the cost of economic efficiency and innovation. The tariffs disrupt global supply chains, increase production costs, and create uncertainty that discourages investment. Empirical studies show that a one standard deviation increase in tariffs can reduce output growth by 0.4% over five years, and reversing the recent U.S. tariffs could have led to a 4% cumulative output gain. In the context of AI — where innovation cycles are rapid and dependent on access to cutting-edge technology and global collaboration — such disruptions can slow technological progress and reduce productivity.

Tariffs may shield domestic firms from competition, reducing their incentive to innovate, and limit access to advanced imported technologies that are necessary for AI advancement. This is consistent with what economists call a “deadweight loss”, where the diminished trade volume creates economic inefficiencies that benefit neither producers nor consumers.

Rapid expansion in AI chip demand will require massive increases in data centre power capacity, from about 11 GW in 2024 to potentially 68 GW by 2027 and 327 GW by 2030. Failure to meet these infrastructure needs could undermine the U.S.’s competitiveness in AI.
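The scale of that build-out is easier to grasp as an annual growth rate. A back-of-the-envelope sketch, using only the capacity figures cited above (the `cagr` helper is illustrative, not from any cited source):

```python
# Back-of-the-envelope check on the data centre power projections above:
# 11 GW in 2024, 68 GW by 2027, 327 GW by 2030.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# 2024 -> 2027: 11 GW to 68 GW over 3 years (~84% per year)
print(f"2024-27: {cagr(11, 68, 3):.0%} per year")
# 2027 -> 2030: 68 GW to 327 GW over 3 years (~69% per year)
print(f"2027-30: {cagr(68, 327, 3):.0%} per year")
```

In other words, the projections imply that U.S. data centre power capacity would need to grow by roughly 70-85% every year for six years, which underlines how costly tariff-driven delays to that build-out could be.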

Research demonstrates that access to expensive, advanced computational infrastructure serves as a primary determinant of innovation capacity in AI, leading to a stratification effect. Moreover, tariffs imposed by developed countries can reduce technology transfer rates, temporarily changing innovation incentives, which can, in turn, slow the overall pace of AI innovation. On the other hand, tariffs imposed by developing countries can speed up technology transfer but affect relative wages and innovation differently. This complex interplay can widen global inequalities in AI capabilities.

Where India stands

This could create unexpected opportunities for India, which has positioned itself as a strategic “third option” in the U.S.-China technological competition. India’s IT export growth has ranged from about 3.3% to 5.1% year-over-year in recent years, and AI and digital engineering segments are among the fastest-growing areas within the country’s tech sector. The Indian government has launched significant AI-related programmes and increased investment in semiconductor design, fabrication and related technologies, with several billion dollars in semiconductor fab proposals and multinational research and development centres such as AMD’s $400 million design campus in Bengaluru.

India’s comparative advantage lies in lower labour costs and specialised knowledge domains. India produces approximately 1.5 million engineering graduates annually, many of whom show considerable aptitude for AI development.

However, India depends heavily on imported hardware components and international collaborations for its AI ambitions. Tariffs and supply chain disruptions that raise the cost of AI infrastructure could slow India’s global ambitions in AI. At the same time, India might benefit indirectly if companies seek alternatives to China for manufacturing and data centre locations.

The economic reshaping catalysed by these tariff policies has accelerated what economists call “capital substitution effects”. As hardware costs rise, companies increasingly shift toward optimising existing resources through algorithmic efficiency, model compression techniques and hardware improvements rather than raw computational power. The tariff environment has effectively created these price signals, and partly as a result, the cost of using AI models is falling dramatically, by about 40 times a year. Therefore, while tariffs may increase upfront infrastructure costs, consumer-level AI applications might not see immediate price hikes.
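The arithmetic behind that last point is worth making explicit. A minimal sketch, assuming the 27% tariff figure from earlier in the article applies as a one-off markup while usage costs decline roughly 40x per year (the function and time horizons are hypothetical illustrations):

```python
# Illustrative arithmetic: a one-off 27% tariff markup on hardware versus
# a ~40x annual decline in the cost of using AI models. Both figures are
# taken from the article; the helper below is a hypothetical sketch.

TARIFF_MARKUP = 1.27       # 27% tariff passed through to costs
ANNUAL_COST_DECLINE = 40   # usage cost falls ~40x per year

def relative_cost(years):
    """Cost of AI usage after `years`, relative to today's pre-tariff cost."""
    return TARIFF_MARKUP / (ANNUAL_COST_DECLINE ** years)

# Even with the full markup, costs fall below today's level within months.
print(f"After 6 months: {relative_cost(0.5):.2f}x today's cost")
print(f"After 1 year:   {relative_cost(1.0):.3f}x today's cost")
```

Under these assumptions, a tariff-inflated cost base is overtaken by the underlying efficiency trend in well under a year, which is why consumer-facing prices may stay flat even as infrastructure costs rise.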

Tariff structures also interact with differing regulatory environments to create novel competitive dynamics. Lenient data protection regulations, broad digital access, and abundant data availability can partially offset hardware cost disadvantages through greater access to training data. These intertwined regulatory and economic factors defy simplistic analysis.

Decentralised AI development

Tariff changes have spurred the development of specialised AI hardware designed for particular applications rather than general-purpose computation. This “application-specific integrated circuit” (ASIC) approach represents an architectural shift: as data centre infrastructure is optimised for AI inference, over 50% of workload accelerators could be custom ASICs by 2028, up from 30% in 2023.

Ironically, policies intended to strengthen domestic technological capabilities could inadvertently accelerate the decentralisation of AI development. Historical analogies suggest that technologies facing market constraints often evolve toward more distributed implementations. The mainframe-to-personal computer transition of the 1980s offers an instructive parallel.

Arindam Goswami is a Research Analyst in the High-Tech Geopolitics Programme at The Takshashila Institution, Bengaluru
