Thoughts in Between

by Matt Clifford

TiB 166: Twitter and economic stagnation; China’s GPT-3; climate change makes us bad; and more...

Welcome new readers! Thoughts in Between is a newsletter (mainly) about how technology is changing politics, culture and society and how we might do it better.

It goes out every Tuesday to thousands of entrepreneurs, investors, policy makers, politicians and others. It’s free.

Forwarded this email? Subscribe here. Enjoy this email? Forward it to a friend.

Social media vs economic progress

Eli Dourado (whose work on NASA procurement we covered in TiB 157) has a new essay, "The New Productivity Revolution", on reasons for optimism on technological progress and economic growth. He argues we're on the cusp of a new wave of "Great Inventions" that will have a comparable impact on growth to electrification or plumbing. Dourado cites mRNA technology, protein design and geothermal energy capture as candidates (I mostly share his tech optimism, but it's worth engaging with the other side of the argument too, such as this excellent skeptical piece from February).

According to Dourado, the primary barrier to the realisation of economic progress is political, not technological. Echoing Marc Andreessen's argument in "It's Time to Build" (see TiB 111), he argues that it's NIMBY-ism, special interests and misplaced risk aversion (the FDA is singled out) holding back advances as diverse as better high density housing, COVID vaccines and supersonic flight.

Why has this happened? Dourado has an interesting hypothesis:

we have come to care less about absolute progress... because we now spend more of our mental and emotional energy on relative status

That is, the rise of mass - and especially social - media has sucked us into zero sum status competition and away from focus on "growing the pie". It's a plausible fit with the evidence. We’ve talked before about Ray Fair’s work on when America stopped investing in the future (TiB 71); about 1971 as the year it all started going wrong (TiB 114); and 1960 as the “year the singularity was cancelled” (TiB 65). Time to get off Twitter and start inventing.

PanGu: a Chinese GPT-3

For a long time there was a (perhaps comforting) trope that China was adept at copying technology, but not at innovating. It's not been possible to take that idea seriously for some time; new evidence to the contrary surfaces all the time, particularly in AI. A recent and important example is PanGu, a Chinese variation on OpenAI's GPT-3 language model (which made a big splash last year and which we've discussed several times before - see TiB 119, 124 and 128). Jeff Ding and Jack Clark have excellent commentary.

As Jeff notes, the team behind PanGu claims that it surpasses GPT-3 on a number of dimensions, particularly "few-shot" learning tasks, where the model is given only a handful of examples of a specific task. There are a few important and familiar threads here. First, it's another example of the continued efficacy of scale in machine learning (see TiB 152): PanGu is a ~200bn parameter model trained on over a terabyte of text (compared to GPT-3's 175bn parameters trained on ~0.5TB of text).

Second, models of this scale require vast, nation state- or Big Tech-level resources to train, which makes them hard to replicate or audit for startup and academic actors (see TiB 159). Jeff points out that PanGu was a collaboration between Huawei and PCL, a government owned and operated research lab. Third, semiconductor politics are never far from the surface. As Jack notes, the model was trained on Ascend processors, which - though currently manufactured by TSMC - Huawei is trying to disentangle from supply chains the US can control. PanGu will be worth keeping an eye on.
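For readers unfamiliar with how "few-shot" evaluation works in practice, here's a minimal sketch: the task is demonstrated to the model with a handful of solved examples inside the prompt itself, and the model is asked to complete the final, unsolved one. The sentiment-classification format and examples below are purely illustrative, not taken from the PanGu or GPT-3 papers.

```python
def build_few_shot_prompt(examples, query):
    """Concatenate (input, output) demonstrations, then the unsolved query."""
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}")
    # The final line is left incomplete: the model fills in the label.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

demos = [
    ("The film was a delight from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
prompt = build_few_shot_prompt(demos, "A masterpiece of quiet storytelling.")
print(prompt)
```

The point of the technique is that no gradient updates happen at all: the "training" is just a prompt, which is why performance at these tasks is such a direct measure of what scale alone buys you.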

Climate change makes us bad

Lots of people have hypothesised that climate change might lead to an increase in conflict, as macro shifts in the distribution and scarcity of resources lead to higher intensity competition between states and peoples (This is not exclusively a 21st century phenomenon, of course). But there are also micro reasons to think that a warming planet will see more conflict. A new paper suggests that higher temperatures increase the level of individual discriminatory behaviour against out-groups.

The authors conducted two large experiments in Germany in which members of the majority population could choose to help or ignore a stranger in need of assistance. The experiment varied the appearance of the stranger to resemble either a white native German or a Muslim immigrant. It also took advantage of a summer of highly variable weather to observe the effect of high temperatures on behaviour. The paper reports that the incidence of out-group discrimination increased significantly with the temperature.

It's just one paper, but it's an interesting example of the kind of under-explored impact that climate change might have. I came across this paper, incidentally, via Kevin Lewis, who is perhaps the best curator of new academic papers on the internet. His latest climate change roundup is full of gems: the pieces on the cost of emissions reduction in China and the harm of forest degradation (vs deforestation) are particularly worth checking out.

Quick links

  1. Don't ask, don't get. AI models won't make your pictures look good unless you demand it. Funny and surprising.
  2. A harsher mistress? A draft constitution of Mars.
  3. Please interpret this link imaginatively. People are more creative when you tell them they will be(!)
  4. Do you feel lucky, robot? What single word would you say to convince a judge that you're human? Fascinating study on "Minimal Turing Tests".
  5. Ban lawnmowers! Garden equipment emits more particulate matter than cars.

How you can help

Thanks for reading all the way to the end.

I'd love it if you'd forward this to a friend or share the link on Twitter, etc.

It's always nice to get email from readers; just hit reply.

Until next week,

Matt Clifford

PS: Lots of newsletters get stuck in Gmail’s Promotions tab. If you find it in there, please help train the algorithm by dragging it to Primary. It makes a big difference.