An outsider on the inside: how Ans Westra created New Zealand’s ‘national photo album’

They try but invariably fail – those writers who believe they are capable of encapsulating in prose or verse the essence of what it means to be a New Zealander. Even at the point of publication, their works seem anachronistic and clichéd.

Harnessing the New Zealand identity has proven to be as challenging as clutching at fog – it may be apparent everywhere, but it seems impossible to pin down.

But if you do want a representation of New Zealandness over the past six decades, the work of the photographer Ans Westra (1936–2023) is in many ways unsurpassable.

Westra produced what amounts to a national photo album, in which a vast span of the country’s everyday existence was documented with unrivalled skill and perception. At their best, her images crossed that threshold into a liminal space where culture, memory, history and experience all fused together.

Born in Leiden in the Netherlands in 1936, Westra arrived in New Zealand in December 1957. She went on to become unquestionably one of New Zealand’s greatest documentary photographers. Practically every image she produced was a masterpiece of composition, lighting, space, form, angle and subject choice.

She was also able to achieve a sort of artistic alchemy in these works – taking the base elements of film, camera, card and chemicals, and converting them into an entrancing amalgam of scenes that reflected the essence of the nation’s intricately marbled identity.

Ans Westra, West Coast, Towards Blackball, 1971.
National Library, ref AW-0265

Elevated ordinariness

In her works, the boundaries between documentary photography and art photography were completely porous, which enabled her to conjure up New Zealandness in an artistic as well as descriptive form.

Of course, subject selection was vital. But instead of chasing the sensational or the sublime, Westra chose to fossick around the everyday and the mundane – a realm where her skills were unequalled. After all, who else could infuse such luminosity into scenes that at first glance appeared plain and even dreary?

She seemed consumed by the urge to document every crevice of our day-to-day lives, but in ways that often elevated ordinariness into profound poignancy. Her approach shunned artifice and visual gimmickry in favour of penetrating cultural and social exploration, leaving her photographs to simmer rather than fizz.

The photograph titled “Māori on Willis Street” exemplifies this approach. The image was one of thousands she took documenting aspects of post-war Māori urbanisation.

Ans Westra, Māori on Willis Street, 1960.
National Library, ref AWM-0125-2-F

The picture is of a young person staring blankly out into a Wellington street on a rainy day. But instead of a photograph of the entire subject in profile, Westra positioned her shot in a way that made the back of the subject’s head and shoulders visible through a shop’s corner display window.

This had the effect of partly obscuring the outline of the subject, while simultaneously projecting reflections of the traffic and buildings onto the foregrounded window.

This virtuoso composition does not allow the viewer simply to glance before turning the page. Instead, the image demands contemplation. The material world is shown up here as possessing only surface value, along with a capacity to distract and elude.

The bright lights of the urban Promised Land have been exposed as just a superficial glare. The photograph’s foreground is dominated by this dark-coated youth who appears almost as a silhouetted figure. The background is an indifferent, even life-depleting cityscape.

Bleak shop facades line this commercial canyon, offering the subject no joy or optimism – just an awning to shelter from the drizzle. Above all, though, the viewer is drawn to the solemn stillness of the subject’s face, which conveys a sense of overpowering loneliness and mournful gloom.

Ans Westra, Scenes of Rural Life along the Whanganui River, Pipiriki.
National Library, ref AWM-0264-F

An exposition of self

In the early 1990s, during a period of acute self-reflection, Westra jotted down some revelatory notes about her approach to photography:

The image is the ultimate goal […] subject is only a means to an end […] certainly the photograph is not about the subject.

Photography could not be

solely controlled by the brain. Your personality, subconscious, flows through […] you have to allow it to come through […] for the outcome to be relevant.

Ultimately, she said, photography was “always an exposition of self”. This quasi-biographical exposition was necessarily very public, though. The National Library’s commitment to digitising Westra’s photographs is a fitting gesture, democratising the artistic corpus of this most democratic of photographers.

The result is a collection of prodigious proportions (it will eventually comprise over 300,000 images), immense span (over six decades) and ambitious scope.

Even for those New Zealanders born more recently, Westra’s images can serve as a form of prosthetic memory – one that may not be based on direct experience, but that nonetheless props up their collective perceptions of the country.

Days before her death, I asked Westra what specifically it was about her photographs that reflected the character of the country and its people so intimately. Slouched in her armchair, her mind seemed to see-saw between thoughts for a while before she responded in her brittle voice, “Maybe it’s because I’m an outsider”.

This was a crucial observation. She was able to see aspects of New Zealand most of its residents probably took for granted. However, looking at her vast body of work, it is equally true that Westra was the greatest insider we have ever had.

Ans Westra: a life in photography – Paul Moon (Massey University Press), available May 9.

The National Aboriginal Hockey Championship celebrates the strength of Indigenous sporting communities

The puck has dropped on another National Aboriginal Hockey Championship (NAHC) — one of Canada’s most distinctive youth athletic competitions and cultural celebrations. This year’s tournament runs from May 5 to 11.

The NAHC was founded by the Aboriginal Sport Circle in 2002 and has been held annually since then. It brings together top under-18 male and female Indigenous athletes from across Canada to showcase the elite athletic abilities of Canadian Indigenous youth.

Part of what makes this tournament so distinctive compared to mainstream sport is its inherent focus on fostering cultural unity and pride.

The tournament has distinct ceremonial practices that distinguish it from other elite hockey tournaments and reinforce the themes present in the NAHC’s founding documents. These themes are demonstrated through event programming, including a players-only welcome event, opening and closing ceremonies, traditional artwork integrated into trophy presentations and the playing of traditional music.

The Truth and Reconciliation Commission’s call to action 87 asks all levels of government to collaborate with Indigenous Peoples to “provide public education that tells the national story of Aboriginal athletes in history.” The NAHC provides a tangible answer to this call by providing resources that tell the story of the tournament and celebrate past accomplishments.

In this way, the NAHC presents an important opportunity for researchers seeking examples of Indigenous sport that run counter to deficit narratives of disadvantage. Rather than focusing on potential struggles that players have overcome to reach the tournament, the NAHC is a celebration of the strength of Indigenous sporting communities.

The NAHC and Indigenous hockey excellence

The exceptionally high skill level of the NAHC has attracted the attention of scouts from across North America, though that is only a part of the reason the championship has maintained its presence in the Canadian sport landscape.

During the 2023 championship, tournament organizers called athletes onto the ice in front of their families and local community members. With the best Indigenous youth hockey players in the nation standing side-by-side along the rink boards, tournament organizers, elders and Indigenous leaders offered prayers and advice to kickstart a week of exceptional competition and athletic ability.

The NAHC has hosted professional hockey alumni, including former NHL player Jordin Tootoo.
(AP Photo/Winslow Townson)

In addition, the NAHC has hosted numerous professional hockey alumni such as Jordin Tootoo, Ted Nolan, Bridgette Lacquette, Jocelyn Larocque, Michael Ferland, Brandon Montour and many others. In this way, the event’s legacy is connected with those of its most successful participants, further defining its place in both Indigenous and broader sport history.

Moving beyond deficit perspectives

The NAHC gives Indigenous youth the chance to celebrate their culture and compete at an elite level, while also tackling the deficit perspective commonly applied to Indigeneity.

In research, a deficit perspective treats individuals as the subject of numerous problems that need “solving.” This perspective “continues to reinforce to others what is not working, while failing to actively seek out and report on what is working well.”

In this specific context, the term deficit perspective refers to language and practices (most often from non-Indigenous people) that emphasize the problems, issues and failures — both historical and present — of Indigenous Peoples.

Because a deficit perspective focuses on highlighting what is missing or absent in one group versus the dominant other, it leads to the continuing subjugation of Indigenous events and evolving traditions, including the NAHC.

For numerous reasons, Indigenous Peoples are routinely the subject of deficit-based research. If performed exclusively and without engaging Indigenous communities, deficit-perspective research has the potential to harm relationships between Indigenous and settler communities.

Shifting sport policy

Throughout the late 1980s and 1990s, the focus of Canadian sport policy shifted from elite sport and national sporting excellence towards the inclusion and participation of marginalized communities.

As a result, Indigenous sport leaders and organizations participated in sport policy development alongside the federal government. These developments led to the creation of the 1992 Sport: The Way Ahead and 1998 Sport: Everybody’s Business policies, as well as the North American Indigenous Games, Aboriginal Sport Circle, and, eventually, the National Aboriginal Hockey Championship in 2002.

By facilitating an opportunity for Indigenous Peoples to participate in sports against the best in the nation, the tournament honours the resiliency of Indigenous Peoples while emphasizing notions of community and presenting youth with the opportunity to grow personally and professionally.

The NAHC’s commitment to this strengths-based approach is evident in the distribution of tournament awards. As part of the closing ceremonies, each team sits in the stands, and award recipients are called down to the ice, cheered on by both teammates and opponents.

Because it puts a strengths-based approach into practice by centering and celebrating Indigenous sporting excellence, the NAHC challenges the deficit-based perspective that so often informs research and reporting on Indigenous sport. For this reason, the NAHC maintains an important position in the Canadian hockey landscape as a successful and lasting example of Indigenous athletic excellence.

Lucas Rotondo, a research assistant and undergraduate sport management student at Brock University, co-authored this article. The authors also acknowledge the extensive contribution of Mel Whitesell, executive director of the Manitoba Aboriginal Sports and Recreation Council.

As climate change amplifies urban flooding, here’s how communities can become ‘sponge cities’

“When it rains, it pours” once was a metaphor for bad things happening in clusters. Now it’s becoming a statement of fact about rainfall in a changing climate.

Across the continental U.S., intense single-day precipitation events are growing more frequent, fueled by warming air that can hold increasing levels of moisture. Most recently, areas north of Houston received 12 to 20 inches (30 to 50 centimeters) of rain over several days in early May 2024, leading to swamped roads and evacuations.

Earlier in the year, San Diego received 2.72 inches (7 centimeters) of rain on Jan. 22 that damaged nearly 600 homes and displaced about 1,200 people. Two weeks later, an atmospheric river dumped 5 to 10 inches (12 to 25 centimeters) of rain on Los Angeles, causing widespread mudslides and leaving more than a million people without power.

Events like these have sparked interest in so-called sponge cities – a comprehensive approach to urban flood mitigation that uses innovative landscape and drainage designs to reduce and slow down runoff, while allowing certain parts of the city to flood safely during extreme weather. Sponge city techniques differ from other stormwater management approaches because they are scaled to much larger storms and need to be applied across nearly all urban surfaces.

I’m a water resources engineer who studies and designs strategies for sustainably managing urban stormwater. In response to recent flooding episodes, some U.S. cities are beginning to take steps toward incorporating sponge city concepts into their stormwater management plans, but most of these projects are still pilots. If this concept is to evolve into the new standard for urban design, city officials and developers will need to find ways to scale up and accelerate this work.

Copenhagen, Denmark, is taking steps to become spongier in response to severe floods.

The problem of stormwater

For more than a century after U.S. cities started installing centralized sewage systems in the mid-1800s, pipes carried stormwater – rain or melted snow that runs off streets and buildings – to nearby rivers or harbors. This approach reduced local flooding but polluted adjoining waters and exacerbated flood risks further downstream.

The 1972 Clean Water Act was designed to make the nation’s waters fishable and swimmable by 1983 but failed to meet that goal. One major reason was that the law initially focused on reducing only point sources – pollution discharges that came from an identifiable source, such as a pipe discharging human or industrial waste.

In the late 1980s, Congress amended the law to address nonpoint, or diffuse, water pollution sources, including stormwater. Engineers began designing systems to capture sediments in the “first flush” of runoff, since harmful pollutants such as heavy metals were believed to adhere to these particles.

To this day, green infrastructure and other stormwater management practices in the U.S. are typically designed to detain, retain or filter only the first 1 to 2 inches (2.5 to 5 centimeters) of runoff. Individually, they can’t capture all the runoff generated during larger storms, the kind of events that are becoming more frequent due to climate change. What’s more, stormwater management frequently is not required on smaller land parcels, which can collectively represent a large fraction of urban watersheds.

All of these factors limit green infrastructure’s ability to reduce flood risks.

Detroit has installed green features like this bioswale – a shallow, vegetated area that collects stormwater – to reduce flooding that has plagued neighborhoods for decades.
AP Photo/Corey Williams

Greening infrastructure, bit by bit

The term “sponge city” originated in China around 2010, but U.S. cities have employed similar ideas since the 1970s to improve water quality in rivers and streams.

In the early 2000s, the idea of designing communities to filter and soak up stormwater became known as green infrastructure. Regulators and utilities saw it as a potentially cost-effective strategy for complying with federal clean water regulations. In cities where existing storm sewage systems discharged directly to creeks, lakes and rivers, green infrastructure had the potential to filter out pollutants from stormwater before it flowed into those waterways.

In hundreds of cities, mainly in the Northeast and Midwest, stormwater and wastewater are carried in the same sewage pipes. Green infrastructure offered a strategy for diverting stormwater away from the sewage system to places where it could soak into the ground. That helped reduce the chances of sewage systems overflowing and sending untreated stormwater and wastewater into local waters.

Old sewage systems in many cities carry both sewage and stormwater. A combined sewage overflow is a relief point that prevents flooding in homes and treatment plants by discharging the combined flow to the environment during heavy rains.

Cities including Philadelphia, New York, Cincinnati, San Francisco, Cleveland, Washington, D.C., and Kansas City, Mo., have spent billions of dollars over the past 20 years to retrofit developed landscapes with rain gardens, green roofs, permeable pavements, constructed wetlands and other site-scale stormwater control measures. Most of these systems, however, were installed in areas that produced the most water pollution and were not sized to manage large storms.

In the best cases, green infrastructure has been installed on publicly owned land and required on new or redesigned large-scale developments. It has proved much more challenging to incorporate green infrastructure on smaller, privately owned land parcels, which collectively make up a significant percentage of urban watershed areas.

In some cities, some new development is still approved without any required stormwater treatment system or analysis of the dramatic ways in which its stormwater could cause flooding on downstream and adjacent properties. And in many cities, stormwater from small land parcels is allowed to pass without treatment into piped sewage systems. If many such parcels are located in the same neighborhood, this common practice can augment downstream flood risks.

Every surface matters

In my lab at Drexel University we are studying solutions to flooding in the Eastwick section of southwest Philadelphia. This neighborhood sits at the downstream end of a 77-square-mile suburban watershed. When it rains heavily upstream, Eastwick floods. In 2020, Tropical Storm Isaias flooded some homes with more than 4 feet (1.2 meters) of water.

Our computer models suggest that if conventional green infrastructure had been in place to treat runoff from 65% of the watershed’s impervious surfaces, Isaias would not have caused Eastwick to flood. But that’s five times more treatment than upstream communities are planning as part of their state-mandated stormwater pollutant reduction plans.

Some critics say this level of greening is not technically, logistically or socially feasible. But if the notion of sponge cities is to become a reality, cities will eventually have to figure out how to get there.

To get to 65%, these towns would need to treat runoff from nearly all rooftops, parking lots and roadway surfaces in some form of green infrastructure. If dedicated space for new rain gardens and wetlands on the ground is limited, parking lots could be retrofitted with permeable asphalt or concrete that allows water to pass through to the ground beneath. Rooftops could be converted into vegetated green roofs that detain and retain stormwater.

In this sponge city vision, streets would be recontoured to direct stormwater to parks and recreational fields built feet below the street surface and designed to flood safely during extreme weather. Existing natural areas would be leveraged for stormwater storage, enhancing their ecology.

Depending on where extreme rainfall occurs, these systems could function individually or together, mimicking the modularity and redundancy found in natural ecosystems.

Finding the money

In sponge cities, every surface needs to be connected to a space that can flood safely. Getting from traditional green infrastructure to sponge cities requires integrated policies, plans and incentives that apply these kinds of solutions wherever rain falls.

Parking lots can be designed to flood and release water slowly. So can basketball courts, parks, plazas and even streets, as prescribed in Copenhagen, Denmark’s Cloudburst management plan.

Such a transformation of the built environment can’t be fully bankrolled by stormwater utilities. These organizations face a dizzying array of regulatory requirements and can’t raise rates above their customers’ ability to pay.

One way to raise more money would be through collaborations between city agencies responsible for upgrades to roadways, parks, schoolyards and other public land that also attract federal dollars, such as New York City’s Cloudburst Resiliency projects. In some cases, funding from a third party could supplement the effort. One example is a collaboration between New York City and the Trust for Public Land to add green infrastructure features to a Bronx schoolyard to help reduce local flooding.

Cities could also offer incentives for retrofitting and scaling up existing stormwater management systems on private land. A trading system could be set up to sell the residual capacity to nearby property owners who lack onsite stormwater management opportunities.

This strategy isn’t cheap, but neither is inaction. Inland flooding caused US$177.9 billion in damage from 1980 to 2022, and billion-dollar disasters are becoming more frequent with climate change.

As extreme weather events become more prevalent, I expect that urban planning and design standards will evolve to include sponge city concepts. And this more robust approach to stormwater management will continue to figure prominently in all kinds of municipal and private design and development decisions.

How 2-Tone brought new ideas about race and culture to young people beyond the inner cities

This Town, Peaky Blinders creator Steven Knight’s latest drama for the BBC, brings to life a defining – if short-lived – era in the history of British youth culture and popular music. Set in the West Midlands against the backdrop of industrial decline and social unrest in the early 1980s, the drama unfolds to the syncopated sounds of 2-Tone.

A furious mix of punk and Jamaican ska, 2-Tone became a genuinely national phenomenon, bursting out of a bedsit in Coventry and into the charts and the popular consciousness.

We know a lot about the urban multiracial landscapes of its Midlands origins, out of which its twin ideals of racial unity and musical hybridity sprang. But we know much less about how it resonated with the experience of young people beyond the big towns and cities.

Razorpix / Alamy

Such considerations are timely. It is now 45 years since the founding of 2-Tone Records by Jerry Dammers, organist and songwriter for ska’s most famous band The Specials, and mastermind of the whole movement.

Of course, 1979 was also a decisive year for politics in the UK. But bands like The Specials did more than just soundtrack the civil strife of the early Thatcher years; they actually inspired political and cultural change.

To understand how they did so is important not only for historical reasons. A deeper sense of how anti-racist and multicultural ideas have shaped less culturally diverse regions may enrich contemporary debates over racism, particularly rural racism, which have become increasingly polarised.

My own ongoing oral history project with people from the Dorset region registers the powerful effect 2-Tone had in less racially mixed areas. Interviewees speak vividly of the energy, excitement and unruliness of attending gigs, as well as the sense of shared community, belonging and togetherness.

Nobody is special

As The Specials’ first single, Gangsters, hit the airwaves in the summer of 1979 and the first 2-Tone tour opened in the autumn (with support from labelmates The Selecter and Madness), a growing legion of youth clad in slim-fit mohair “tonic suits”, pork-pie hats and black-and-white checkerboard greeted the bands as they made their way across the country. By the time all three bands appeared together on Top of the Pops that November, 2-Tone had swept the nation.

The Selecter were a popular 2-Tone band headed up by Pauline Black.
Records / Alamy

The Specials, in particular, built an ethos on the idea that “nobody is special”, refusing the division between band and audience (symbolically represented in the audience joining the band on the stage for the final numbers).

The inaugural tour covered the length and breadth of the country, reaching musical outposts like Aberdeen, Ayr, Blackburn, Bournemouth, Plymouth and Swindon. A seaside tour followed in 1980, winding its way through several English coastal towns, from Blackpool to Worthing.

One interviewee described how 2-Tone bands made a big deal of moving out into the remote areas and bringing the music to the people. That made them more accessible, setting them apart from other bands of the period.

For one fan from Weymouth, travelling up to that first Bournemouth gig was a powerful unifying experience:

You just didn’t realise that you were part of a bigger thing…When you get in there and everyone’s got the same attitude, the same outlook, the same sense of purpose and sense of place – it was really quite an amazing feeling.

Playing venues in far-flung places was part of the 2-Tone mission. For Dammers and others, the anti-racist message was aimed directly and primarily at white youth. These 2-Tone bands sought to reach audiences with a visual and aural display of unity. The symbolism had a profound impact. As another interviewee recalled:

Groups were either all white or all black…2-Tone was the first thing where you actually saw white and black musicians on stage together…That was a massive difference.

But not everyone suddenly became a staunch anti-racist. Some simply went for the music, the dancing and the good times. For others, though, the unity of politics, style and music cut across divisions among fractious youth cults and worked against far-right influences. Embracing the spirit of 2-Tone gave rural and small-town youth a way of expressing anti-racist politics in a more local idiom.

Race and racism today

Despite the contribution of 2-Tone – and before it, Rock Against Racism – to anti-racist struggles, issues of racism have never gone away. The fight against far-right nationalism and police brutality continues, but increasingly the spotlight has shifted towards the more subtle and unseen ways in which racism is perpetuated. This ranges from everyday microaggressions to the lingering shadow of Britain’s imperial legacy, attracting a strong backlash in some quarters.

Recent evidence of rural racism, for example, has been met with swift dismissals. The former home secretary Suella Braverman was quick to deny others’ experiences of racism, stating that the claim that the countryside is racist is one of the most ridiculous examples of left-wing identity politics – just because there are more white people than non-white people somewhere does not make it racist.

Recalling the example of 2-Tone and The Specials may encourage a longing for a simpler time, when racists were easy to spot; things are more complicated today. Still, it can help us to understand how racial solidarities are forged, particularly in and through social and geographical differences. For my interviewees, 2-Tone’s ska revival was not a passing fad; it allowed them to reinterpret their own experience of class, race and locality.

If only for a moment, 2-Tone mania ruled Britain, in the words of the music critic Simon Reynolds. But as This Town shows, its rich and complex legacies can still be brought powerfully to life in the present.

Africa dramatically dried out 5,500 years ago – our new study may warn us of future climate tipping points

Around five and a half millennia ago, northern Africa went through a dramatic transformation. The Sahara desert expanded, and the grasslands, forests and lakes favoured by humans disappeared. Humans were forced to retreat to the mountains, the oases, and the Nile valley and delta.

As a relatively large and dispersed population was squeezed into smaller and more fertile areas, it needed to innovate new ways to produce food and organise society. Soon after, one of the world’s first great civilisations emerged – ancient Egypt.

This transition from the most recent “African humid period”, which lasted from 15,000 to 5,500 years ago, to the current dry conditions in northern Africa is the clearest example of a climate tipping point in recent geological history. Climate tipping points are thresholds that, once crossed, result in dramatic climate change to a new stable climate.

Our new study published in Nature Communications reveals that, as northern Africa dried out, its climate “flickered” between two stable climatic states before tipping permanently. This is the first time such flickering has been shown to have happened in Earth’s past. And it suggests that places with highly variable cycles of changing climate today may in some cases be headed for tipping points of their own.

Whether we will have any warnings of climate tipping points is one of the biggest concerns of climate scientists today. As we pass global warming of 1.5˚C, the most likely tipping points involve the collapse of ice sheets in Greenland or Antarctica, tropical coral reefs dying off, or abrupt thawing of Arctic permafrost.

Some say that there will be warning signs of these major climate shifts. However, these depend very much on the actual type of tipping point, and the interpretation of these signals is therefore difficult. One of the big questions is whether tipping points will be characterised by flickering or whether the climate will initially appear to become more stable before tipping over in one go.

620,000 years of environmental history

To investigate further, we gathered an international team of scientists and went to the basin of Chew Bahir in southern Ethiopia. There was an extensive lake here during the last African humid period, and deposits of sediment, several kilometres deep, underneath the lake bed record the history of climate-driven lake level fluctuations very precisely.

Today, the lake has largely disappeared and the deposits can be drilled relatively cheaply without the need for a drill rig on a floating platform or on a drillship. We drilled 280 metres below the dry lake bed – almost as deep as the Eiffel Tower is tall – and extracted hundreds of tubes of mud around 10 centimetres in diameter.

Drilling for ancient lake sediment in Chew Bahir.
Asfawossen Asrat

By putting these tubes together in order they form a so-called sediment core. That core contains vital chemical and biological information which records the past 620,000 years of eastern African climate and environmental history.

We now know that the end of the African humid period involved around 1,000 years in which the climate alternated regularly between intensely dry and wet conditions.

In total, we observed at least 14 dry phases, each of which lasted between 20 and 80 years and recurred at intervals of about 160 years. Later there were seven wet phases, of a similar duration and frequency. Finally, around 5,500 years ago a dry climate prevailed for good.

Climate flickering

These high-frequency, extreme wet-dry fluctuations represent a pronounced climate flickering. Such flickering can be simulated in climate model computer programs and also happened in earlier climate transitions at Chew Bahir.

We see the same types of flickering during a previous change from humid to dry climate around 379,000 years ago in the same sediment core. It looks like a perfect copy of the transition at the end of the African humid period.

This is important because this transition was natural, as it occurred long before humans had any influence on the environment. Knowing such a change can occur naturally counters the argument made by some academics that the introduction of livestock and new agricultural techniques may have accelerated the end of the last African humid period.

Conversely, humans in the region were undoubtedly affected by the climate’s tipping. The flickering would have had a dramatic impact, easily noticed within a single human lifetime, unlike a slow climate transition spanning tens of generations.

It could perhaps explain why the archaeological findings in the region are so different, even contradictory, at times of the transition. People retreated during the dry phases and then some came back during the wet phases. Ultimately, humans retreated to the places that were consistently wet like the Nile valley.

Confirmation of climate flickering as a precursor to a major climate tipping point is important because it may also provide insights into possible early warning signals for large climate changes in future.

It seems that highly variable climate conditions such as rapid wet–dry cycles may warn of a significant shift in the climate system. Identifying these precursors now may provide the warning we need that future warming will take us across one or more of the sixteen identified critical climate tipping points.

This is particularly important for regions such as eastern Africa, whose nearly 500 million people are already highly vulnerable to climate change-induced impacts such as drought.

Assisted dying: why Scotland should be wary of changing the law

Scotland took the first step towards legalising assisted dying on March 27 with the publication of the assisted dying for terminally ill adults (Scotland) bill. If the bill is passed, Scotland will become the first UK nation to offer terminally ill people assistance to end their lives.

The bill’s promoter, the Liberal Democrat MSP Liam McArthur, cites uncontroversial-sounding principles of compassion, autonomy and legal clarity as the bill’s objectives.

He argues that the law of Scotland is “unacceptably unclear” and must be changed to give people who are suffering at the ends of their lives some autonomy over the timing of their deaths.

Few would deny that compassion and autonomy are desirable or that the law should be clear. And polling has consistently shown that most adults in Scotland support assisted dying.

Yet there is plenty in this bill to cause concern. One important issue is the terminology it uses. Whereas the last Scottish bill of this kind was transparently titled the assisted suicide (Scotland) bill, the term “assisted suicide” has been replaced this time around with the vaguer “assisted dying”.

Understandably, many of those who campaign to change the law dislike the term “assisted suicide”. Suicide is tragic and something that governments – including the Scottish government – strive to prevent.

By contrast, the term “assisted dying” is designed to conjure images of things we all support: hospices, palliative care and respect for patients’ choices about their treatment at the end of life.

But the Scottish bill is not about these things, which are all lawful already. Rather, it proposes a fundamental shift in the relationship between doctors and their patients – one that would cross a moral, ethical, cultural and professional red line: the prohibition against killing.

The euphemism “assisted dying” disguises how seismic this would be by implying that what is being proposed is on a continuum with what the law already allows. The law allows patients to refuse treatment, even if they will die without it. It allows doctors to withdraw treatment that is no longer in an unconscious patient’s best interests, even if death will result.

If the family agrees with the medical team, there is now no need to get permission from a court before doing this. The “doctrine of double effect” allows doctors to make patients comfortable at the end of life by administering pain relief even at doses high enough to cause death. This is permitted provided the purpose is not to kill, but to relieve suffering.

All of these things are lawful – and ethical – because they do not cross the ultimate line and allow doctors to deliberately kill (or assist in killing) their patients.

The therapeutic relationship between doctors and patients is often described as a “relationship of trust and confidence”. Erasing the prohibition on killing risks eroding that trust and changing the nature of this crucial relationship irrevocably.

Liam McArthur, a Liberal Democrat MSP, tabled the new assisted dying bill.
Colin Fisher / Alamy Stock Photo

The term “assisted dying” also muddies the distinction between assisted suicide (where the person who dies performs the final act that causes death) and euthanasia (where someone else performs the final act). The Scottish bill ostensibly provides only for assisted suicide when it refers to the “coordinating registered medical practitioner … providing a terminally ill adult with an approved substance with which the adult may end their own life”.

However, although the bill envisages the adult taking the substance by him or herself, the medical practitioner is obliged to “remain with the adult … until the adult has died”. And the bill does not make clear what they are permitted to do should the person get into difficulty or distress.

There is evidence from jurisdictions that have already legalised assisted suicide “that some patients who ingest the prescribed lethal drugs experience distressing complications”. In Oregon, US, where assisted suicide is legal, annual complication rates as high as 14.8% have been reported.

The Scottish bill is silent about what the role of the professional is in such circumstances. Are they permitted to step in and provide more direct “assistance” to complete the dying process? If so, this could amount to euthanasia, not assisted suicide. And the term “assisted dying” obscures exactly what would be allowed.

This is the kind of point that should be explicit in a bill of this kind, particularly one that cites a lack of clarity in the current law as a justification for change.

Worryingly vague

Other aspects of the Scottish bill are also worryingly vague. For example, the eligibility criteria are poorly defined, leading some to wonder whether conditions like anorexia nervosa might qualify someone for assisted dying.

The status of the conscience clause in the bill is also unclear, meaning that doctors who would wish to take no part in ending their patients’ lives cannot rely on assurances that they would be able to opt out.

Doctors in the UK worry that it would be “deeply dangerous” to introduce assisted dying here, with some citing the “horrific state of the NHS” and the funding crisis in hospice care.

Others point out the danger of introducing assisted dying while disability discrimination is still so prevalent in the UK.

Stories emerge regularly from other countries of how assisted dying, once introduced, is soon used in ways not originally envisaged. And former supporters of the practice have changed their position on it as they observe its inexorable expansion – for example, to include children and people with mental illness.

Against this background of widespread concern, the longstanding public support for assisted dying in Scotland may be weakening.

The next step for the Scottish bill will be scrutiny by the health, social care and sport committee of the Scottish parliament, which will hear evidence from a range of stakeholders, who are likely to include health professionals’ representative bodies, faith groups and disability rights activists. As this process unfolds, the bill’s promoters will face serious and legitimate questions about its safety.

Most Gypsy and Traveller sites in Great Britain are located within 100 metres of major pollutants, research shows

Gypsy and Traveller communities are among the more socially excluded groups in the UK. There is a long history of government failures in meeting these groups’ housing needs.

The shortage of sites has resulted in a homelessness problem. Those who do secure pitches on council-managed sites often have to contend with living near potential hazards.

For our recent study, we mapped local authority-managed Gypsy and Traveller sites in Great Britain. Of those sites, 39% were within 50 metres of one or more major pollutants and 54% were within 100 metres.

The effect on residents is significant. As one of our interviewees, Sarah (all names have been changed), put it:

You can’t breathe here. A lot of people have asthma. Lots of babies in the community have poor health. A lot of them have skin rashes. Nobody ever lived past about 50 here. Whatever is coming out is killing people. Lots of people are dying of chest, COPD and cancer.

Urban sites are often located in industrial areas.
Katharine Quarmby

Worsening conditions

Between 2021 and 2022, we mapped 291 Gypsy and Traveller sites across Great Britain, noting their proximity to environmental hazards. These included motorways, A-roads, railway lines, industrial estates and sewage works.

To do so, we used the Caravan Count 2020, which lists all authorised local authority-managed sites in England and Wales, and a freedom of information request to the Scottish government, which gave us the names and addresses of all the authorised public sites in Scotland.
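
The mapping step itself is, in essence, a nearest-distance calculation: measure how far each site is from the closest hazard feature, then count the share of sites within each radius. As a minimal, purely illustrative sketch of that calculation – assuming hypothetical input files (“sites.gpkg”, “hazards.gpkg”) rather than the study’s actual data – something like the following Python (geopandas) script would do:

```python
import geopandas as gpd

# Hypothetical inputs: one layer of site locations, one of hazard features
# (major roads, railway lines, industrial estates, sewage works).
# Reproject to British National Grid (EPSG:27700) so distances are in metres.
sites = gpd.read_file("sites.gpkg").to_crs(epsg=27700)
hazards = gpd.read_file("hazards.gpkg").to_crs(epsg=27700)

# Distance from each site to its nearest hazard feature
nearest = gpd.sjoin_nearest(sites, hazards, distance_col="dist_m")
# A site can tie with several equally near hazards, so keep one row per site
dist = nearest.groupby(nearest.index)["dist_m"].min()

for radius in (50, 100):
    share = (dist <= radius).mean()
    print(f"{share:.0%} of sites lie within {radius} metres of a hazard")
```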

The study included in-depth case studies, site visits and interviews with 13 site residents (including repeat interviews with five site residents on two sites).

Local newspapers that reported on the highly contested historical and current planning processes were also analysed. Freedom of information requests were sent to local authorities to obtain planning meeting documents and 11 interviews were conducted with representatives of local and national organisations that work with Gypsy and Traveller communities.

When new Gypsy and Traveller sites are proposed by local authorities near existing residential areas, objections come from three main groups: residents, local politicians and local media outlets.

These objections often result in new sites being pushed further to the margins of towns and cities, in places that other communities would not be expected to live.

As a result, sites are often in isolated areas, quite literally on the wrong side of the tracks. They are nestled in among the infrastructure that services the needs of the local settled communities, from major roads to recycling centres.

Pollutants are a recurring problem.
Katharine Quarmby

One of the sites we visited has been in use since the 1970s, despite the fact that, even then, it was located near a waste transfer station. The intervening five decades have only seen conditions on the site worsen.

A chicken slaughterhouse nearby now burns carcasses regularly. The household waste recycling centre has expanded to allow for recycling and incineration of solid waste from commerce and industry.

Lorries and other vehicles now come in and out in large numbers, just metres away from some of the pitches. Residents experience constant noise and vibrations. Mary, who lives on the site, says the sound of the skips being deposited from 5am every morning is like a bomb going off: “It drops so hard it shakes the chalet.”

The air is always heavy with dust. Residents have to keep their windows closed – even in the summer – to keep out the flies. As Jane, who is the fourth generation of her family to live on the site, puts it:

We are living in an industrial area. It’s the air quality, the sand, the dust, the recycling tip is just behind us. The noise is a big problem. There is an incinerator near the slaughterhouse and that’s really bad. And the smell…

Environmental racism

According to the World Health Organization, housing is one of the major factors determining health. The physical conditions of a home – including mould, asbestos, cold, damp and noise – are obvious risk factors. So too, are wider environmental factors, from overcrowding and isolation from services to the relative lack of access to green spaces.

The people we spoke with, including site residents and organisational representatives, highlight the harmful health effects of living on many Gypsy and Traveller sites. This chimes with the government’s own reports, which have found these sites to be unsafe.

Traditional nomadic ways of life are under threat.
Katharine Quarmby

Research on health inequalities in the UK bears this out. People from Gypsy and Irish Traveller backgrounds report the poorest health and a life expectancy of between ten and 25 years less than the general population. They also have higher rates of long-term illness and conditions that limit everyday life and activities.

The Police, Crime, Sentencing and Courts Act 2022 has further constrained Gypsy and Traveller communities by criminalising roadside stopping and forcing people on to transit sites. These are designed for short stays and are often in even worse locations than permanent sites.

This poses a plain threat to traditional nomadic ways of life, from travelling in the summer months to fairs and attending religious gatherings.

Thousands of people rely on these local authority-managed sites, located dangerously near the kind of environmental pollutants that are associated with poor health and premature deaths. The term “environmental racism” is used to refer to how people from minority and low-income communities are disproportionately subjected to environmental harm.

Yvonne MacNamara is the chief executive of the non-profit advocacy organisation, Traveller Movement. She highlights that the inequalities these communities face are systemic. Local authorities, she says, treat Traveller communities “like second-class citizens”.

To one resident’s mind, attitudes within local government to Gypsy and Traveller social housing are clearly discriminatory. As she put it: “They wouldn’t expect anyone but a Traveller to live here.”

Novelist J.G. Ballard was experimenting with computer-generated poetry 50 years before ChatGPT was invented

The novelist and short story writer J.G. Ballard is known for conjuring warped and reimagined versions of the world he occupied. Dealing with strange exaggerations of reality and often detailing the breakdown of social norms, his unconventional works are hard to categorise.

Sitting on the edge of reality, these unsettling visions often provoked controversy. Eschewing a science-fiction of the distant future, Ballard described his own work as being set in “a kind of visionary present”.

Today, as we contemplate generative AI writing texts, composing music and creating art, Ballard’s visionary present yet again has something prescient and fresh to tell us.

In an interview from 2004, the author Vanora Bennett suggested to Ballard that he writes about “what is just about to happen in a given community”. Asked what “kind of real-life event” inspired the ideas in his fiction, Ballard responded:

I just have a feeling in my bones: there’s something odd going on, and I explore that by writing a novel, by trying to find the unconscious logic that runs below the surface and looking for the hidden wiring. It’s as if there are all these strange lights, and I’m looking for the wiring and the fuse box.

The topics in Ballard’s fiction frequently reveal just how highly attuned he was to the subtleties of the emerging technological and social shifts that were, as he puts it, just below the surface. The fuse box of society was often rewired in his ideas.

And with generative AI there is undoubtedly something odd going on, to which Ballard’s attention seems to have been drawn long before it even happened.

Author J. G. Ballard, who wrote famous novels like Crash and Empire of the Sun, in March 1965.
Trinity Mirror / Mirrorpix / Alamy Stock Photo

As well as the various editions of OpenAI’s now infamous ChatGPT, which produces custom-made texts in response to brief prompts, there is a range of other applications emerging that automatically create cultural forms. Google’s Verse by Verse is an “AI-powered Muse”, where you pick a poet along with a handful of criteria, such as number of syllables and poem type, and it helps the user to complete a poem by producing lines in response to the opening words entered into the system. Sora is said to allow you to create video from text instructions. Different versions of DALL·E can turn text suggestions into visual artistic images. In the field of music, applications like AIVA, Loudly and MuseNet can actively compose music on your behalf.

This is a snapshot of a rapidly expanding range of such systems. They have inevitably brought with them deep-rooted questions about human creativity and what we understand culture to be. Nick Cave’s well-known response to song lyrics written by AI in his writing style was one powerful and widely shared reaction to the perceived lack of “inner being” behind the words. It was, Cave thought, simply the mimicry of creative thought. Others are now wondering if AI spells an end for the human writer.


While these debates were unfolding, I found similar ones that had taken form over 50 years ago. Looking through the archive of an old arts magazine which Ballard used to edit, I discovered that he was writing about this futuristic concept way back in the 1960s, before going on to experiment with the earliest form of computer-generated poetry in the 1970s.

What I found did more than simply reveal echoes in the past: Ballard’s vision actually reveals something new to us about these recent developments in generative AI.

The surprise of computer generated poetry

Listening recently to the audiobook version of Ballard’s autobiography Miracles of Life, I was struck by one very short passage that seemed to speak directly to these contemporary debates about generative artificial intelligence and the perceived power of so-called large language models that create content in response to prompts. Ballard, who was born in 1930 and died in 2009, reflected on how, during the very early 1970s, when he was prose editor at Ambit (a literary quarterly magazine that published from 1959 until April 2023), he became interested in computers that could write:

I wanted more science in Ambit, since science was reshaping the world, and less poetry. After meeting Dr Christopher Evans, a psychologist who worked at the National Physical Laboratories, I asked him to contribute to Ambit. We published a remarkable series of computer generated poems which Martin said were as good as the real thing. I went further, they were the real thing.

Ballard said nothing else about these poems in the book, nor did he reflect on how they were received at the time. Searching through Ambit back issues from the 1970s, I managed to locate four items that appeared to be in the series to which Ballard referred. They were all seemingly produced by computers and published between 1972 and 1977.

The cover of an edition of Ambit from 1974.
David Beer

The first two are collections of what could be described as poetry. In both cases, each of the little poems gathered together has its own named author (more on this below), but the whole collection carries the author names Christopher Evans and Jackie Wilson (1972 and 1974). Ballard described Evans as a “hoodlum scientist” with “long black hair and craggy profile” who “raced around his laboratory in a pair of American sneakers, jeans and denim shirt open to reveal an iron cross on a gold chain”.

The 1972 collection is labelled with the overarching title “The Yellow Back Novels”, a play on an informal term used for popular fiction novels, and the 1974 collection is entitled “Machine Gun City”. Both include brief notes that give further glimpses into how these poems were computer generated and into Ballard’s thoughts on them.

The poems themselves are, it has to be said, a difficult read. I wouldn’t want to speak for him, but reading the pieces it becomes hard to believe that Ballard genuinely agreed with the assessment that they were “as good as the real thing” or, indeed, that they were the “real thing” – there may have been an element of provocation in such statements. Quality aside though, there is something intriguing in how today’s debates around the generation of content – pushing us toward questions of what creativity is and even what it means to be human – have a precursor in these 1970s computer-generated pieces.

Ballard’s plot

Ballard’s view of the poems in 1974 seems consistent with the more recent comment included in his autobiography. A short introductory note to the second collection of pieces opens with what is said to be the “text of a letter from prose editor J.G. Ballard advising rejection of a well-known writer’s copy”. Apparently Ballard wrote the following, which is quoted in brackets before the short pieces:

B’s stuff is really terrible – he’s an absolute dead end and doesn’t seem to realise it … Much more interesting is this computer generated material from Chris, which I strongly feel we should use a section of. What is interesting about these detective novels is that they were composed during the course of a lecture Chris gave at a big psychological conference in Kyoto, Japan, with the stories being generated by a terminal on the stage linked by satellite with the computer in Cleveland, Ohio. Now that’s something to give these English so-called experimental writers to think about.

Whether these little computer-generated texts are stories, novels or poems is unclear, and probably a secondary issue next to the automatic production of culture on display here. Ballard seems to have been taken with the new possibilities, and with the provocation they presented to other writers.

One of a collection of poems from 1972, believed to have been computer-generated.
David Beer

The image of the terminal on stage making poems while its creator is occupied speaking to the audience is a powerful one, conjured here by Ballard. He was clearly impressed with the innovation and what it suggested about creativity. Keeping his eye out for odd developments, he was intrigued by the new types of composition.

Yet, we perhaps shouldn’t take his note at face value. The playful framing and anarchic tone warn us against taking it too literally. And there is another reason for us to tread carefully. Ballard’s interest was likely piqued by these events because he had written a short story featuring machines that perform exactly this task of writing poetry some 11 years previously. The short story itself seems to present a more questioning take on what it would mean for a computer to write and create prose.

Life imitating art

Written in 1961, Ballard’s story “Studio 5, The Stars” features an editor of “an avant-garde poetry review” working on the next issue. Sounds familiar. The poets he edits regularly are all using automated “Verse-Transcribers”, which they all refer to with established familiarity as VTs. These VT machines automatically produce poems in response to set criteria. Poetry has been perfected by these machines and so the poets see little reason in writing independently of their VTs. On being passed one poem hot from a VT, the editor in the story doesn’t even feel the need to read it. He already knows that it will be suitable.

The poets have become used to working with their VT machines, but their reliance upon the machines for creative inspiration starts to become unsettled by events. At one point the editor is asked what he thinks is wrong with modern poetry. Despite seemingly being a strong enthusiast of the automation of creativity he wonders if the problems are “principally a matter of inspiration”. He admits he “used to write a fair amount … years ago, but the impulse faded as soon as I could afford a VT set”.


Ballard’s story predicts that once creating poetry becomes a technical matter, the need to engage in the practice of writing evaporates. In place of creativity, the editor suggests, is a “technical mastery” that is “simply a question of pushing a button, selecting metre, rhyme, assonance on a dial, there’s no need for sacrifice, no ideal to invent to make the sacrifice worthwhile”. Not too far then from the types of prompts on which today’s generative AI relies to trigger its outputs. Often, as we saw with the examples of applications previously mentioned, a set of criteria, a phrase or any type of written instruction are used to initially direct the outputs of generative AI.

A mysterious figure named Aurora, the story’s antagonist, proclaims dismissively that “they’re not poets but mere mechanics”. When all the VT sets in the local area are wrecked by Aurora to “preserve a dying art”, the absence of human creativity is exposed. Not a machine is left in one piece, even “Tony Sapphire’s 50-watt IBM had been hammered to pieces and Raymond Mayo’s four new Philco Versomatics had been smashed beyond hope of repair”.

The editor is left with the next issue of the magazine to fill and no automated copy to fill it. There is shock at Aurora’s suggestion to “Write some yourself!”. Tony, the editor’s associate, offers some consolation, reminding him archly that “Fifty years ago a few people wrote poetry, but no one read it. Now no one writes it either. The VT set merely simplifies the whole process”.

A copy of the first edition of Ballard’s High-Rise from 1975.
Wikimedia Commons, CC BY

In Ballard’s 1961 story it is only the sudden absence of functioning machines that drives the poets to start writing creatively again. The reliance on the VT is broken. The story closes with the ripping up of a paper order for three new VT sets. The story would appear to be a warning against the automation of creativity and the implications it might have, should it arrive. In the 1970s it arrived in rudimentary form, and Ballard seems, on the surface at least, to have had a quite different reaction to its presence.

How did the computer write the poems?

Each little piece in the 1972 and 1974 collections consists of a title, an author and six lines of text. Those six lines are highly formulaic. Some of that pattern can be discerned simply by glancing across the many opening lines. These include gambits such as “the thunder of the motors fractured the lake”, “the roar of the jets rocked the house”, the evocative “the fury of the turbos fractured the crowd” and “Dr Zozoloenda pondered as the plane lurched”.

Though the 1974 pieces seem a little more varied than the 1972 versions, they retain the same types of formulas. Part of the reason for the seeming consistency of form is to be found in the brief endnote by Evans and Wilson that closes the first collection. They start with the claim that:

These mini SF novels have been generated by a computer programmed to write them, for eternity if needs be, given the command RUN JWSF.

RUN is a classic computer command for starting a program. It’s not clear what JWSF stands for, but the vision is of a perpetual writing machine that never stops. They admit that the program itself is, as they put it, “immensely simple”. They then outline very briefly how it works, indicating that the “computer selects randomly from a pool of specially chosen key words or phrases”. So this is randomly generated text drawn from a curated pool of words.

There are also structures in which the randomly selected words are placed. They explain that “the first line of the story essentially consists of the computer completing the phrase: THE (BLANK) OF THE (BLANK) (BLANKED) THE (BLANK)”.

According to Evans and Wilson, within this opening-line structure, “the blanks being filled in by searches through pools of words, thus ending up with THE WINE OF THE MOTORS FRACTURED THE HOUSE or THE RUSH OF THE HELIOS SCORCHED THE DESERT”. The two examples they provide capture the feel of many of the opening lines in the short pieces. The outputs are repetitive and predictable while also remaining strange. One mystery left unanswered is how the pools of words were created.

The use of structure alongside randomness is presented as providing an almost endless source of new content that can be produced on demand. Evans and Wilson claim that their approach “produces 10,000 possible unique sentences” – a figure consistent with each of the four blanks being filled from a pool of ten words, giving 10 × 10 × 10 × 10 combinations. Following the opening sentence, they explain that “line two is a random selection of ten complete sentences. Line three reverts to the strategy of line one. The fourth line is again a random choice of ten complete sentences, and so on”.
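
A minimal Python sketch can make this process concrete. It follows only what Evans and Wilson describe; the word pools and stock sentences are invented stand-ins drawn from the quoted examples, since the original pools do not survive.

```python
import random

# Invented stand-ins for the original word pools, which were never
# published. Each pool feeds one blank in the opening-line template.
NOUNS_A = ["thunder", "roar", "fury", "rush", "wine"]
NOUNS_B = ["motors", "jets", "turbos", "helios"]
VERBS = ["fractured", "rocked", "scorched", "shattered"]
NOUNS_C = ["lake", "house", "crowd", "desert"]

# Placeholder "complete sentences" for the even-numbered lines.
STOCK_SENTENCES = [
    "Dr Zozoloenda pondered as the plane lurched.",
    "The instruments flickered and went dark.",
    "Nobody spoke as the city fell away beneath them.",
]

def template_line() -> str:
    """Fill the template: THE (BLANK) OF THE (BLANK) (BLANKED) THE (BLANK)."""
    return "The {} of the {} {} the {}.".format(
        random.choice(NOUNS_A),
        random.choice(NOUNS_B),
        random.choice(VERBS),
        random.choice(NOUNS_C),
    )

def mini_novel(lines: int = 6) -> str:
    """Alternate templated lines with randomly chosen complete sentences."""
    return "\n".join(
        template_line() if i % 2 == 0 else random.choice(STOCK_SENTENCES)
        for i in range(lines)
    )

print(mini_novel())
```

Run repeatedly, a generator like this produces an unending stream of pieces that are, like the Ambit originals, repetitive and predictable while also remaining strange.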

This alternating structure of the lines is at the core of all the pieces generated and published in 1972 and 1974. The second mystery is how the complete sentences that are randomly selected for the alternate lines were produced and selected. There don’t appear to be any other traces, so these details are likely to remain unknown.

The perpetual generation of material is first framed in terms of the number of possible sentences. Yet something more reflective is introduced too, which is the generation of ideas. Evans and Wilson ask themselves: “How many original and unique SF mini-novels can the computer generate before running out of ideas?”

Their seemingly speculative answer, given we do not know how many words are included in those pools, is simply that “typing at a rate of ten characters a second, this would take (rather roughly) 10,000,000,000,000,000,000 [ten quintillion, or ten to the nineteenth power] years which would probably see the Universe come and go a few times”. The generation of poems by this machine is, in other words, without any real limits. Of course, we can question whether it has any “ideas” in the first place.

As with the content, the author names attached to each six-line piece are also computer-generated. The names are again “chosen from a pool of suitable SF-type names, paired in the same random way”. What constitutes an SF-type name is not made clear, but the generated names include Z.Q. Johnson, Blade Sinatra, Frank Archer, Marsha Fantoni, Blade Van Vargon and even Tagon “X”.
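
The same random pairing is trivial to mimic. In this illustrative snippet the two pools are guesses assembled from the names quoted above; the actual pools used were never published.

```python
import random

# Hypothetical name pools assembled from the quoted examples; the
# real pools of "suitable SF-type names" were never documented.
first_names = ["Blade", "Marsha", "Frank", "Z.Q.", "Tagon"]
surnames = ["Sinatra", "Fantoni", "Archer", "Johnson", "Van Vargon"]

# "Paired in the same random way" as the lines of the pieces themselves.
author = f"{random.choice(first_names)} {random.choice(surnames)}"
print(author)  # e.g. "Marsha Van Vargon"
```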

Adding names humanises the writing in some way, even though the names themselves are mostly quite obviously made up. Giving these pieces named authors draws attention to questions of authorship and the intersection of human creators with technology.

Computer-generated poems or a hoax?

The subsequent articles published in Ambit in 1976 and 1977 don’t seem to follow the promise made in 1974 that these little pieces would appear in an “unending stream in Ambit”. Instead they changed direction somewhat, moving from computers creating text to computers interacting with humans. The 1976 piece titled “Hallo, your computer calling”, again credited to Chris Evans and Jackie Wilson, provided a strange if prophetic interaction introduced as “an experiment to see whether computers can help doctors to diagnose illnesses”.

A 1977 piece, “The Invisible Years”, is even more baffling. This time credited to Tim Bax, J.G. Ballard, Chris Evans and Ronald Sandford, the piece is presented in awkward angular boxes and introduced with the statement: “This year Ballard answers the question of Chris Evans and a computer. To drawings conceived by Mr. Ronald Sandford”. That bizarre intervention seems to have been the final instalment in this series of computer-generated contributions.

The story, The Invisible Years, from Ambit’s 1977 edition.
David Beer

We might start to question, given the strange framing and increasingly bizarre content, whether these are actually computer-generated texts at all or, given the type of publication and the people involved, whether this is some other form of expression, perhaps even a parody or satire. The chapter of Ballard’s Miracles of Life in which Evans is discussed suggests that their collaboration spread into fictional ideas too. It may even be a hoax of some sort, designed to provoke questions about what automation means for culture and ideas.

It is now impossible to verify what exactly was happening or what, if any, technology was being used. It seems likely that a computer program was involved in some way in the final product, but whether these are fully automated pieces of writing is a side issue when considering their importance. Whatever we are seeing in these strange automated poems, the case reveals something about the interest in the computerised generation of ideas and cognition that is playing out in more advanced form today, and it is indicative of how that logic has developed.

Read more:
AI will soon become impossible for humans to comprehend – the story of neural networks tells us why

The enthusiasm for the possibility of computers writing was evident even then. Yet the apparent enthusiasm attached to these Ambit poems might also have been a response to, or even an ironic and playful reaction against, the emergent computer systems and early AI of the 1960s and 70s. The questions around creativity and human value implicit in Ballard’s short story perhaps hint at this. But the questions, outcomes and implications of computer-generated writing were yet to solidify into the kind of debates we see rumbling on today.

A sensitivity to automated creativity

If we take at face value the description of the generative process in the notes that accompanied these poems, along with the mention in Ballard’s much later autobiographical account, then the key difference between the little pieces Ballard commissioned and today’s popular turn to AI is the move from randomness to probability. Generating poems that draw randomly on curated pools of text is quite different from producing texts based on calculations of probability over large data sets, as the sketch below illustrates. Yet the underlying sensibility and logic are the same: both are informed and motivated by a mutating will to automate more aspects of social and cultural life.
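
That distinction can be put in toy code. The sketch below contrasts uniform random choice from a curated pool with a choice weighted by frequencies counted from text. It is only an illustration: real language models learn probabilities over vast data sets, not bigram counts from a single sentence.

```python
import random
from collections import Counter

# The 1970s approach as described: uniform random choice from a
# hand-curated pool, so every option is equally likely.
pool = ["thunder", "roar", "fury", "rush"]
word_by_chance = random.choice(pool)

# A crude stand-in for the probabilistic approach: weight each
# candidate next word by how often it followed the previous word.
text = "the roar of the jets rocked the house and the house shook".split()
bigram_counts = Counter(zip(text, text[1:]))

def next_word(prev: str) -> str:
    """Pick a next word in proportion to observed bigram frequency."""
    candidates = {b: n for (a, b), n in bigram_counts.items() if a == prev}
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights)[0]

# After "the", "house" is twice as likely as "roar" or "jets".
print(word_by_chance, next_word("the"))
```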

Whether these 1970s poems are genuine or some sort of performance or playful satire, they still reveal something of the emerging attitudes to the possibilities of computational creativity in its very early forms.

Ballard’s enthusiastic response to the new possibilities suggested by the poems in the 1970s contrasts with the more dystopian vision in his 1961 short story. Ballard seems to embody what I have called the tensions of algorithmic thinking – by which I mean the unresolvable and competing forces that push simultaneously in different directions when we are confronted with advancing automation. On one side we have the problem of the removal of the human from human activities; on the other, the removal of knowledge from cultural creation. The short story and the poems in Ambit both capture the tensions that accompany today’s AI-generated text, art and music.

We are perhaps being shown from different perspectives, to use Ballard’s own phrasing, the wiring and fuse box of creativity. Ballard’s attention was drawn towards “something odd going on”. That oddness is becoming even more profound as the use and applications of generative AI continue to expand.


Latin America: several countries look to combat gang violence by fighting fire with fire

Gangs have an enduring presence in Latin America. They have existed as power brokers, illicit economic actors and spoilers in the developmental processes of several countries. And yet, despite their power and influence, the gangs have long been regarded merely as irritants – always present but never strong enough to rock the boat.

Fast forward to the present day and the configuration is entirely new. Criminal gangs have become a critical power to be reckoned with. From island nations like Jamaica and Trinidad and Tobago to big economic powerhouses such as Brazil and Mexico, the gang menace is spreading fast.

In some instances, gangs have come to challenge the very existence of governments in the region. Haiti’s criminal gangs unseated the government in early 2024 and held the country to ransom. And in Ecuador, once lauded as one of the safest countries in Latin America, the government is fighting for its survival against gangs that are fast encroaching on the power of the state.

Gangs have become such a serious problem in Latin America that they are damaging the region’s economic performance. Research by the International Monetary Fund suggests that bringing the crime level in Latin America down to the world average would increase the region’s annual economic growth by 0.5 percentage points – around a third of Latin America’s growth between 2017 and 2019.

Latin America’s criminal gangs have long been a neglected issue. But not anymore. Amid mounting concern about criminal violence and low trust in the police, the governments of some Latin American and Caribbean countries are enacting states of emergency, pushing through policies they would not normally be authorised to enact in the name of the safety and protection of their citizens.

Colloquially known as mano dura (Spanish for “firm hand” or “iron fist”), this approach involves suspending the fundamental rights of the citizenry by giving the military and law enforcement agencies the power to arrest, incarcerate and deport anyone found to be involved with criminal gangs. It also denies those arrested access to the legal means of establishing their right to a fair and open trial.

Sprawling authoritarianism

Mano dura measures were introduced to Latin America in March 2022 by El Salvador’s charismatic albeit controversial president, Nayib Bukele. Following a spike in gang violence that left 87 people dead in a single weekend, Bukele curtailed the right to be informed of the reason for arrest and access to a lawyer upon being detained.

By February 2024, more than 76,000 people – almost 2% of the Salvadoran population – had been detained under the provisions of mano dura. Critics have decried the crackdown as a gross human rights violation. Troops have rounded people up for having tattoos or simply for being in poor neighbourhoods, leading to the detention of thousands of innocent people in overcrowded Salvadoran jails.

Instead of taking measures to prevent abusive arrests, Bukele has publicly backed the security forces. There are also few independent judges in the country after Bukele’s party passed a reform in 2021 that gave the supreme court the power to remove judges and force them into retirement.

Nevertheless, many people in El Salvador have welcomed the crackdown with open arms.

Thanks to Bukele’s firm-handed approach to gangs and organised crime, El Salvador has gone from being the murder capital of the world to one of the safest countries in Latin America. In February, basking in soaring approval ratings, Bukele was re-elected as president in a landslide election.

With soaring approval ratings, gang-busting Bukele comfortably secured a second term as president of El Salvador.
Bienvenido Velasco/EPA

Mano dura politics are fast gaining traction across the region. In late April 2024, Ecuadorians voted in a national referendum in favour of continuing with a state of emergency. The move gives the country’s president, Daniel Noboa, the power to deploy soldiers on the streets to fight “drug-fuelled violence and extradite criminals abroad”.

It is rare for the citizens of democracies voluntarily to demand authoritarian measures in their structures of governance. The only recent example occurred in 2018, when mass protests swept across Latin America and led more South Americans to see autocratic governance as a necessity for maintaining law and order.

Read more:
Unrest in Latin America makes authoritarianism look more appealing to some

In much the same way, the current widespread support in Latin America for mano dura interventions is the product of two interrelated factors: a suffering population at breaking point, and a growing conviction that only extreme authoritarian measures can address the challenges posed by the gangs.

The capacity of many Latin American states to protect – let alone promote – their foundational values is being compromised by gang violence. Given this backdrop, it is no wonder that fighting fire with fire to curtail the power and influence of criminal gangs is gaining approval.

It is too early to predict whether other Latin American states groaning under the gang menace will fully replicate the Salvadoran and Ecuadorian models. However, countries with even very low homicide rates, such as Bolivia, Argentina and Chile, have all now adopted some mano dura policies.

The “Bukele model” is gaining approval and will probably become a mainstream policy option in this violent region.

Gaza campus protests: why understanding 1960s student demonstrations and police reaction is relevant today

For anybody interested in the history of the 1960s, the ongoing protests at US universities have a peculiar resonance.

In the past weeks, riot police have entered several college campuses at the behest of administrators to break up unauthorised encampments of students protesting the war in Gaza and calling on their universities to divest from companies supporting Israel.

The scenes of police arresting hundreds of students at Columbia University and UCLA are reminiscent of police and National Guard actions against students protesting the Vietnam war in the late 1960s.

It is tempting to draw easy parallels with the worst examples of overreach against those anti-war students in the 1960s. Just over 54 years ago, on May 4 1970, the Ohio National Guard fired into a crowd of anti-war protesters on the campus of Kent State University, killing four students and injuring nine. Eleven days later, city and state police fired on protesters at Jackson State College, Mississippi, killing two students.

The violent way those events were handled galvanised opposition to Richard Nixon’s illegal bombing of Cambodia. And while public support for the students themselves remained quite low, a massive student strike followed.

Photographs of the aftermath of the Kent State shooting captured public attention, and were at least part of the reason why some members of Congress sought to rein in Nixon’s powers by passing a law limiting the president’s scope to declare war without congressional approval.

But what are the similarities between then and now? What lessons have been learned, and ignored, from past experiences?

At UCLA, Columbia and Kent State, campus administrators claimed that the police were called in to keep students safe. Most American universities have their own campus police, so inviting external law enforcement onto campus is an extraordinary step. However, it is not clear that police actions are always proportionate or effective, if their purpose is to keep all students safe.

For example, there have been accusations that police on the UCLA campus used tear gas and fired rubber bullets at protesters and counter-protesters as they sought to clear the pro-Palestinian encampment on May 2, with one man reportedly shot in the chest at very close range. The Los Angeles Police Department (LAPD) said it did not fire rubber bullets or other less-lethal rounds during this incident.

However, this is not the first time that the LAPD have been accused of misusing rubber bullets. In 2023, a man was awarded US$375,000 (£298,000) after an LAPD officer shot him with a rubber bullet during a protest over George Floyd’s death.

Protesters outside Low Memorial Library, Columbia University in April 1968. It had been occupied by students since the previous day.
AP/Alamy

Despite the high-profile nature of the protests, the majority of today’s students have been opposed to the disruptive encampments. According to the recent Harvard Institute of Politics Survey of Young Americans, only 2% of young Americans cite the Israel-Palestine conflict as the issue that concerns them the most. It is possible that images of injured students in zip-tie handcuffs may change that, but it is unlikely.

Learning from the past

In the aftermath of Kent State and Jackson State in 1970, former Pennsylvania governor William Scranton chaired a President’s Commission on campus unrest. Much of the commission’s report was underwhelming, and its central conclusion – that there was a fundamental crisis of understanding between the older and younger generations – was trite and pedestrian. Significantly, though, the report debunked the idea that “outside agitators” were to blame for escalated violence. This trope has been much repeated by critics of today’s protests, including the New York mayor Eric Adams and the speaker of the House, Mike Johnson.

Perhaps more importantly, the 1970 commission claimed that the root of campus unrest lay in the university’s faltering moral authority – or what some protesting students at the University of Chicago today see as hypocrisy. Then, as now, university authorities have found it very difficult to balance competing interests: freedom of speech, different constituencies of students and student activists, donors, and the politicisation of higher education, including congressional interference even in private university spaces.

Some universities have handled protesters better than others. Brown University in Rhode Island – like Columbia, an Ivy League institution – negotiated an agreement that ended the unauthorised encampment there. Several other universities have taken a similar approach. Indeed, Columbia itself negotiated a settlement with the anti-apartheid students of 1985, so it is not unreasonable to expect an alternative path could have been followed to defuse the current situation.

Looking back to the protests of the 1960s, it is clear the campus shootings at Kent State and Jackson State were avoidable, and that police and state responses did not need to be so draconian. That is something law enforcement should keep in mind today.

Can we look to the Scranton report for lessons to inform the present day? Perhaps. The commission’s final conclusions may be even more useful now than they were in 1970: “The university must pull itself together… Any academic institution worthy of the name must protect the right of its students and faculty to express themselves freely – outrageously as well as responsibly.”