A Facebook metaverse? No thanks.

I recently wrote a column in the LLT journal (Language Learning & Technology) looking back at 25 years of the use of technology in language learning. One of the principal developments discussed in the column is the transformation of the World Wide Web, from the heady early days in the 1990s when the web seemed to promise universal access to information and knowledge to the situation we have today in which the online world has become a rich source of misinformation and divisiveness. Social media has been a major contributor to that transformation. Donald Trump demonstrated how Twitter can be a vehicle for spreading disinformation and for engaging in nasty personal vendettas. The pernicious social role played by Facebook and Instagram has become clear in recent media reports, based on leaked internal company documents.

Now Facebook has announced – not that it has seen the light and will adjust its algorithms so as not to encourage acrimony, animosity, and conflict – but that it will become the creator of the metaverse. Facebook is now “Meta”. In my LLT column, I wrote about the metaverse, i.e., the ubiquitous intertwining of physical and online worlds first depicted in Neal Stephenson’s novel Snow Crash (1992). Some commentators have recently pointed to the expansion of gaming platforms such as Fortnite and Roblox as moving in the direction of a burgeoning metaverse. The availability of those platforms on a great variety of devices and systems, from phones to gaming consoles, points to the ubiquity of access this vision requires. Also needed are other elements already available within Fortnite and Roblox: the ability to host both planned and spontaneous events, to offer a variety of gaming and communication options, to have an internal economic system (such as gaming currency), and, importantly, to enable users to carve out their own space within that environment, offering, for example, user-created games.

Apparently, Facebook’s interest in creating a metaverse stems in part from its commitment to the growth of virtual reality (through the Oculus system it acquired), as VR is a likely entry point to a metaverse. But it’s likely as well that the company hopes the announcement will distract from the many issues raised in the media about Facebook and its associated platforms. Additionally, Facebook’s move seems to be an attempt to attract younger users, who have flocked to Roblox and Fortnite. Ironically, it is that group, especially young girls, whom Facebook’s own studies have shown to suffer potentially harmful effects from social media such as Instagram.

Do we want to have a new virtual world built by a company which ignores its own findings about its negative impact, a company that seems to have a focus only on market share and profit? For me, Fortnite and Roblox seem to offer better alternatives, if indeed we are moving towards a metaverse.

Freedom, a statue and a transition

Two weeks ago, here in Richmond, Virginia, the huge statue of Robert E. Lee on Monument Avenue was removed. It was the last symbol of the Southern Confederacy remaining on a street that was once lined with tributes to the heroes of that cause. The statues, erected during the Jim Crow era, celebrated the “Lost Cause”, the idea that the Civil War was fought over states’ rights rather than slavery and that slavery was a benevolent institution. Those reminders of the fight to maintain human slavery started to come down last summer, in the wake of the movement sparked by the murder of George Floyd.

Today, a new statue was unveiled in Richmond, this time on Brown’s Island on the James River. It is entitled “Emancipation and Freedom” and features three African-American figures. A black man is depicted with whip marks on his back and chains falling off him. A black woman is shown with a determined look, holding the Emancipation Proclamation in one hand and a baby in the other. Below the figures is the word “Freedom”, along with images and life stories of ten representative African-American Virginians. That provides the kind of rich context missing from the Confederate statues.

It’s a wonderful way to celebrate the 150th anniversary of the Emancipation Proclamation, which declared freedom for those enslaved in the Confederate states. It is also a long-overdue recognition and representation of the real history of the American South, far different from the version represented for over a century on Monument Avenue.

The fall of Afghanistan and the marshmallow test

The marshmallow test

US troops left Afghanistan this week, marking the end of a 20-year war, but also of an experiment in nation building. That experiment failed. So what’s the connection to the marshmallow test? In essence, the US was attempting to build a state based on the kind of individuals who would pass that test.

The test was the famous experiment at Stanford University by Walter Mischel, a psychology professor specializing in social psychology (one of the constituent disciplines of intercultural communication studies), to test the ability to engage in self-control and delayed gratification: preschoolers were offered one marshmallow to eat now, or two if they could wait 10 minutes. In follow-up studies, Mischel found that those who had been able/willing to wait were more successful later in life, doing better in school, living healthier lives, and becoming more prosperous.

In his book, “The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous,” Joseph Henrich, a professor of human evolutionary biology at Harvard University, writes about how the West developed culturally (starting in Europe in the Middle Ages), through the rise of capitalism and the practice of reading, to be quite different (literally “weird”) in many ways from the rest of human society. The marshmallow test selects those who fit the WEIRD mold (i.e., Western, Educated, Industrialized, Rich, and Democratic), that is, those fitting into patterns of individualistic initiative, avoiding impulsive behavior, and embracing independence (outside of kinship systems). Henrich writes that experiments in social psychology, such as the marshmallow test, assume all human societies are the same, valuing traits such as long-term patience. He asserts that this is not the case: in many cultures, what is valued most highly is loyalty to the extended family or tribal unit and the acceptance of social hierarchies.

Back to Afghanistan. The US was attempting to create a state based on Western democratic principles in a culture that is not at all WEIRD. In fact, Afghanistan is a very different culture, with low literacy (38%) and strong tribal ties. According to a report in the Washington Post, “Built to fail”:

Washington foolishly tried to reinvent Afghanistan in its own image by imposing a centralized democracy and a free-market economy on an ancient, tribal society that was unsuited for either…Under American tutelage, Afghan officials were exposed to newfangled concepts and tools: PowerPoint presentations, mission statements, stakeholder meetings, even appointment calendars…Under the new constitution, the Afghan president wielded far greater authority than the other two branches of government — the parliament and judiciary — and also got to appoint all the provincial governors. In short, power was centralized in the hands of one man. The rigid, U.S.-designed system conflicted with Afghan tradition, typified by a mix of decentralized power and tribal customs.

In retrospect, of course, it is easy to see what should have been done. Yet, it does seem surprising that the US State Department and Department of Defense were not aware of the profound divergences between cultures that are individualistic and those more oriented towards collectivism. If only those government officials had taken a course in intercultural communication!

Wear a mask – or burn it?

Public mask burning in Florida

In the US this week, the CDC (Centers for Disease Control and Prevention) issued new guidelines for wearing masks, advising even fully vaccinated individuals to wear them indoors in public settings, especially in counties with high numbers of coronavirus infections (currently the majority of US counties). This has led some areas and cities to announce new guidance for schools and businesses. Some schools, universities, and companies have announced mandatory vaccinations, with exceptions for medical or religious reasons.

In other areas, public opinion is moving in the opposite direction, with large numbers of people refusing to be vaccinated or to wear a mask. In fact, in some cities, masks have been publicly burned, in an act of defiance, with demonstrators objecting to what they see as government overreach and a loss of personal freedom. Governor DeSantis of Florida has been vocal in his opposition, announcing that he has no intention of issuing a mask mandate, refusing to “muzzle” his own son and other Florida children.

Others object to the CDC’s changing positions on virus prevention measures. In fact, media and medical critics have not been kind in their comments on the CDC’s policy shifts. However, the changes in guidance are not capricious, but rather a result of changing conditions: the delta variant, much more transmissible, has dramatically changed the nature of the “war” against the virus, according to the CDC. Wouldn’t we want public health measures to be in tune with public health dangers? On the other hand, politicians like DeSantis or Governor Abbott of Texas have little interest in science and don’t accept expert guidance. For them, political positioning seems to be paramount.

The burning of masks, meant as a symbolic gesture of individual freedom of choice, is actually a symbol of willful ignorance and denial of science. It is also a symbol of a lack of care for one’s fellow human beings: not wearing a mask puts not only the individual in danger, but fellow citizens as well. Mask wearing has long been common in Asian countries, where it has been seen not only as a protective measure for the individual but as a sign of solidarity with others.

Latin/Greek at Princeton: optional for classics

Princeton University

Princeton University recently announced that it will no longer be necessary for students majoring in classics to learn Latin and Ancient Greek. The rationale given is that such language requirements disadvantage students from high schools that do not offer Latin. I assume few US high schools now teach Latin, not to mention Ancient Greek. The area where I live, central Virginia, once had the highest secondary-school enrollment in Latin in the US. At my university, VCU, we had a thriving Latin program and had difficulty finding enough Latin instructors to meet the demand. But those times are now past, a victim of the general decline in language learning in the US. Latin has the additional disadvantage of not being “useful”, i.e., not relating directly to job prospects.

The Princeton decision has generated controversy. While Latin and Greek will continue to be offered as electives, not requiring classics students to take them will inevitably lead to enrollment declines and to classics majors not learning the languages so crucial for understanding classical culture and literature. The linguist John McWhorter argues in an article in the Atlantic that the decision, made so as not to disadvantage incoming students from non-elite schools that do not offer Latin, is actually likely to have the opposite effect: “By ending a requirement that classics majors learn Greek or Latin, Princeton risks amplifying racism instead of curing it.” His argument is that the decision, instead of encouraging disadvantaged students, African-Americans and Latinos, deprives them of the opportunity to expand their knowledge and their identities by learning second languages related to the content they are studying:

The Princeton classics decision also deprives students—and, to the extent that the change is racially focused, Black students in particular—of the pleasant challenge of mastering Latin or Greek. With their rich systems of case marking on nouns and flexible word order, both are difficult for an English speaker…Crucially, you often must go through a phase of drudgery—learning the rules, memorizing vocabulary—before you pass into a phase of mastery and comprehension, like dealing with scales on the piano before playing sonatas. The Princeton decision is discouraging students from even beginning this process. Professors may think of the change as a response to racism, but the implicit intention—sparing Black students the effort of learning Latin or Greek—can be interpreted as racist itself.

Whether or not one agrees with McWhorter’s argument, I find one assertion he makes absolutely valid, namely that reading the classics (or any literary work not written in English) in translation is far different from being able to read the text in the original language, no matter how good the literary translation is.

Politicians: What’s worse, the “Big Lie” or plagiarism?

Former German Families Minister Giffey, accused of plagiarism, and forced to resign

In the US, there continue to be many citizens who cling to the “Big Lie”, the claim by former President Trump and other Republicans that Trump actually won the 2020 election. This follows the record number of exaggerations and outright falsehoods emanating from the Trump White House. Many have simply shrugged off the constant string of lies and misleading claims. This is clearly a dangerous trend in terms of maintaining a healthy democracy. In fact, today, Memorial Day in the US, President Biden said that “democracy is in real peril,” both in the US and elsewhere. Meanwhile, former National Security Advisor Michael Flynn suggested that the military putsch in Myanmar could be a model for what should occur in the US.

Added to that development has been the proliferation of claims of “fake news” whenever journalists published articles in any way critical of the Trump administration. That, along with the echo chamber created in social media, has resulted in widespread suspicion of mainstream media. That in turn has led to projects such as the News Literacy Project, which aims to build an informed citizenry through “programs and resources for educators and the public to teach, learn and share the abilities needed to be smart, active consumers of news and information and equal and engaged participants in a democracy”.

Turning to Germany: the federal Family Minister, Franziska Giffey, has had to resign her position. Why? She has been accused of plagiarism in her Ph.D. dissertation. In fact, she is the third federal minister in the recent past to have resigned for the same reason. Cheating and lying are seen as incompatible with holding public office in Germany. It’s remarkable, in comparison with the US, how many German politicians have earned high academic degrees; Angela Merkel herself has a Ph.D. in quantum chemistry. It is not likely that Frau Giffey intended to plagiarize; rather, she was probably sloppy in her note-taking and did not distinguish sufficiently between her own notes and citations from sources. That in itself is likely to be seen negatively in Germany, as an indicator of Unordnung, a failure to keep order and good organization in one’s research.

Of course, in the case of Frau Giffey, we are far, far from the Big Lie (and the many smaller lies), which did not prove sufficient for a US President to be considered unfit for office. At least he didn’t plagiarize. Or did he?

What we need now? Cultural responses

Sports and the pandemic

This weekend in Virginia, high school sports events can be held under almost “normal” conditions, with large numbers of spectators. Our Governor has loosened rules for in-person gatherings, as more residents of the Commonwealth have gotten vaccinated and the virus infection rate has fallen. That the Governor made a point of allowing larger gatherings right now – when league championships are being determined – is a sign of the important cultural role played by sports in schools and in US society generally. On Friday nights in the fall, in many US states, the place to be is the local high school (American) football game.

Other countries have other priorities, often culturally determined. India has experienced severe upticks in viral spread after large numbers of Indians congregated for religious festivals. One of the difficult issues in India, as elsewhere, has been whether to impose lockdowns country-wide or to allow individual regions to make rules based on local conditions. Federalism has been a tricky topic in countries like Germany and the USA, where traditionally individual states have control over public health measures.

The cultural issues are not just related to national cultures, but also to generational groupings. For young people, not being able to get together is not just a matter of losing entertainment venues; socialization carries greater importance for that age group. It has been interesting to follow this past year just which institutions and services are seen as important, depending on individual and group characteristics. That doesn’t reference just those deemed of “essential” importance (health care workers, grocery store clerks, etc.), but rather those services individuals are used to having available: gyms, hairdressers, church services, arts events. Many have experienced firsthand which of these are truly essential, finding ways to exercise at home, cut their own hair, etc.

The fascinating question is this: which changes will be temporary and which long-term? Lasting effects might include changes in greeting rituals and more remote-working opportunities. Uncertain from my perspective is whether masks will become a new normal in countries other than those, for example in Asia, where masks have been widely used for some years. Many folks assume that this pandemic is a once-in-a-generation event; I’m skeptical of that, as it seems inevitable that we will see more viruses appear. That may bring out more culturally determined differences.

Germany: What once were virtues are now vices?

Chancellor Merkel

With apologies to the Doobie Brothers for the title (it dates me badly, I know), there has been considerable press lately, in Germany and elsewhere, about the intriguing turn-around in the country’s fight against the COVID-19 pandemic. In the early days, Germany’s success in fighting the disease and persuading Germans to follow mitigation measures was widely hailed, especially when compared to the lackluster and ineffective efforts in the US. The explanation frequently given: Germans’ reputed respect for authority and obedience to rules. Chancellor Merkel was given credit for clarity and consistency of messaging, with her science background considered a major plus. The German government was seen as acting with the Gründlichkeit often considered characteristic of Germans, i.e., thoroughness/rigor and a need to understand and consider all aspects of an issue.

But now, as a headline in the Guardian recently put it, that positive trait seems to be a liability: “Heroes to zeros: how German perfectionism wrecked its Covid vaccine drive. The same thoroughness that made Angela Merkel’s government a pandemic role model is now holding it back.” Gründlichkeit brings with it the need to know the truth, to get to the bottom of things, before making decisions. That aligns with Merkel’s well-known penchant for caution and deliberation before acting. While the US went ahead and ordered millions of doses of COVID-19 vaccines without knowing at the time whether any of those vaccines would actually work, Germany held back, wanting to see what the clinical trials would reveal. In fact, it wasn’t just Germany’s decision, but the joint decision of the European Union, in which Germany is a leading voice. The German authorities set up central vaccination sites with great organizational efficiency, only to discover that they went largely unused, because no vaccines were available. Now that vaccines are becoming available, mixed messaging about the most widely available vaccine in Europe, AstraZeneca, has left many Germans wary: first the vaccine was only for those under 65, then it was pulled entirely (over fears of blood clots), then yesterday it was announced that it would be available again, but this time only for those over 65.

A substantial part of the problem in Germany is federalism, which, as in the US, has resulted in different Covid strategies in each German state, made more complicated by the fact that it is an election year and several state premiers are jockeying for position to replace Merkel. Meanwhile, Merkel issued this statement recently: “Wir wollen, dass die sprichwörtliche und im übrigen auch bewährte deutsche Gründlichkeit um mehr deutsche Flexibilität ergänzt wird” [We want the proverbial, and indeed proven, German Gründlichkeit to be supplemented with more German flexibility]. Part of that newly discovered “German flexibility” apparently will be something many have called for, namely allowing family doctors to administer the vaccine.

Big data and language learning

The big news in artificial intelligence (AI) this past year was the arrival of GPT-3, a substantially improved version of the “Generative Pre-trained Transformer” from OpenAI, an advanced AI system built on artificial neural networks, deep machine learning, and the massive collection of data on human language. The system has been described as a giant step towards the realization of AGI, “artificial general intelligence”, the ability of a system to use language in virtually any domain of human activity. I wrote about this development in the latest issue of Language Learning & Technology, a special journal issue on big data and language learning. I discuss the breakthrough the system represents:

Normally, an AI system will be able to deal effectively only within a narrowly defined domain, for which the system has been trained, so as to expect specific language patterns typically used in that context. Google Duplex, for example, does a remarkable job in conversing over the phone with human operators in making dinner reservations or reserving a ride on Uber. GPT-3, in contrast, has been shown to interact through language in a wide variety of genres and content areas: creative writing, journalism, essays, poetry, text-based gaming, and even writing software code. The Guardian newspaper ran an article written by the program, while the New York Times asked it to write about love. A blogger used GPT-3 to write multiple blog posts, subsequently receiving numerous subscribers and notice on tech websites. The fact that many readers were not able to tell that the GPT-3 generated texts were written by an AI system raises questions of trust and authenticity, mirroring the concerns raised about audio and video “deepfakes”, based on training an artificial neural network on many hours of real audio or video footage of the targeted individual.

The system represents a remarkable achievement in its ability to write in natural-sounding language (idiomaticity, flow, cohesion). That ability is based on the collection and analysis of huge volumes of language data gathered by crawling the internet, including all of Wikipedia. GPT-3 translates that data into a very large (175 billion!) set of connections or “parameters”, i.e., mathematical representations of patterns. These parameters provide a model of language based not on rules, but on actual language usage. That allows the system to predict word sequences, based on regularly occurring constructions of words and phrases, thereby enabling the machine production of natural-sounding utterances. One can imagine how powerful GPT-3 could be if integrated into a smart personal assistant such as Siri. We are already seeing interesting uses of chatbots and intelligent assistants in language learning. A company called LearnFromAnyone is building on top of GPT-3 a kind of automated tutor, which can take on the identity of famous scientists or writers.
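
As a concrete illustration of this kind of prediction-based generation, here is a minimal sketch using the openly available GPT-2 model (an earlier, smaller member of the same family) via Hugging Face’s transformers library, since GPT-3 itself is reachable only through OpenAI’s gated API; the prompt text is simply an invented example:

```python
# A minimal sketch of next-word prediction driving text generation,
# using GPT-2 as a freely downloadable stand-in for GPT-3.
from transformers import pipeline

# Load a pretrained language model; its parameters encode patterns
# of word and phrase sequences learned from large web corpora.
generator = pipeline("text-generation", model="gpt2")

# The model extends the prompt by repeatedly predicting a likely
# next word, given everything generated so far.
prompt = "Learning a second language is"  # hypothetical example prompt
outputs = generator(prompt, max_length=40,
                    num_return_sequences=2, do_sample=True)
for output in outputs:
    print(output["generated_text"])
```

Each run yields different continuations, since the model samples from its learned probability distribution over next words rather than consulting any grammar rulebook – exactly the usage-based prediction described above.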

While GPT-3 and other advanced AI systems represent a significant technical achievement, there are, as I discuss in the article, plenty of reasons to be cautious and thoughtful in their use, as is the case generally with big data in both social and educational contexts. While the language generated by GPT-3 mimics what a human might write in terms of language use, compositional structure, and idea development, the texts don’t always make sense in terms of lived human experience, i.e., demonstrating an understanding of social norms and cultural practices. Human beings have the advantage in communicative effectiveness of having lived in the real world and having developed the pragmatic abilities to generate language that is contingent on human interactions and appropriate to the context. We can also use crucial non-verbal cues unavailable to a machine: gesture, gaze, posture, intonation, etc.

I argue in the article that the human element is a crucial mediating factor in implementations of AI systems built on top of big data, particularly in education. Learning analytics (the collection of data about student academic performance) tends to treat students as data, not as human beings with complicated lives (especially these days). I discuss these and other ethical and practical issues with data collection and use in the context of D’Ignazio and Klein’s Data Feminism (2020). The book explores many examples of inequities in data science, as well as providing useful suggestions for overcoming disparities in data collection (favoring standard language use, for example) and for recognizing and compensating for algorithmic bias.

Auschwitz and politics

Samuel Pisar and the cities from which Auschwitz inmates were sent

Today is International Holocaust Remembrance Day, 75 years after the liberation of the Auschwitz concentration camp. Interestingly, it’s also the first day of work for the new US Secretary of State (= Foreign Minister), Antony Blinken, who was confirmed by the US Senate yesterday. At the event announcing his nomination to the position, Blinken spoke movingly about his stepfather, Samuel Pisar, a survivor of Auschwitz and two other concentration camps. Of the 900 students at his school in Poland, he was the sole survivor, and he was also the only member of his own family to survive the war. He escaped from the camp at the age of 16 and hid out in the Bavarian forest.

Blinken spoke at the event about how his stepfather, hiding in the forest, heard a rumbling and saw a tank coming towards him, bearing not the swastika insignia but a five-pointed star, the marking of the US Army. Seeing the youngster, the driver of the tank opened the hatch; it was an African-American soldier. Pisar spoke the only English words he knew (taught to him by his mother): “God bless America”. The soldier hoisted him into the tank.

Blinken spoke of that episode as an indicator of the idea of “America” as the last best hope for humanity. Pisar went on to become a celebrated diplomat and international lawyer, with the goal of achieving world peace through ongoing conversation among all countries, both friend and foe.

Blinken intends to follow his stepfather in terms of fostering multilateral relationships and international agreements, a direction opposed to that of the Trump administration. He also announced that his intention as Secretary of State was to run the office with “confidence and humility”. If he follows through with “humility”, that will be a stark contrast to the Trump Secretary of State, Mike Pompeo, who entered into the position announcing that his approach would be characterized by “swagger”, something he repeated recently as he was leaving office.

Finally, the current Foreign Minister of Germany, Heiko Maas, presents a very different perspective. When interviewed in 2018 as he took on that position, he stated, “Ich bin wegen Auschwitz in die Politik gegangen”, i.e., “I got into politics because of Auschwitz.” He acknowledges Germany’s guilt and the responsibility it bears “for all time”, as he has repeatedly stated. That again stands in contrast to the Trump administration, which, in its 1776 Commission report, insisted on downplaying the US history of slavery and systemic racism.

Our plague year

Scene from the 1918 pandemic

Few of us are likely to regret the passing of the plague year 2020 (see the comprehensive New Yorker piece on the catastrophic handling of the COVID-19 virus in the US). Just a few thoughts here on the last day of the year on what has changed in our lives.

1) Human lives
My grandmother died in the 1918 “Spanish flu” pandemic. That had a profound effect on my father’s life: growing up without a mother, he formed habits, tastes, and personality traits that surely would have been different had his mother not died. I’m sure it contributed to my grandfather being taciturn and withdrawn. My Dad had trouble expressing emotion, as did many men of his generation, and was rather coarse in his habits (eating, for example) and abrupt in his dealings with others. No one can tell if having had a mother would have made a difference, but it seems like it could have. In turn, I see many aspects of my father mirrored in my own personality, and likely passed on to my children, and now, quite possibly, to my grandchildren.

One death and many ripple effects. In the US we are approaching 350,000 dead from the virus, with many more worldwide. Unbelievable real-life repercussions on lives and families everywhere. Grieving now, and long, long-term effects.

2) Cultural practices
For friends and families losing loved ones: it is so sad not to be able to say good-bye in normal ways and to engage in rituals that are tremendously meaningful to many. It seems likely that end-of-life practices will slowly come back, but for other culture-related practices the changes may be longer-lasting.

Handshakes, hugs, standing close to others: They have disappeared this year. Will they come back or is physical distancing of one kind or another here to stay? Will it be different depending on the culture? In some parts of the world, the adjustment may not be a big deal, but if you’re used to cheek kissing as a normal greeting ritual, that’s a big change. Elbow bumps don’t seem like a satisfactory replacement.

Work cultures seem certain to have changed, with many more folks likely to continue working remotely. That in turn brings changes to everyday lives: more time at home, more options for cooking, gardening, and other homebody activities, maybe more pets?

Changes in education are equally profound. Distance learning is not likely to go away. But just as with employment opportunities, this trend will not affect us all equally. The divide between those with white-collar jobs, able to work from home, and those in the service sectors, exposed to the virus, has led in the US to disproportionately high numbers of Black and brown Americans infected and dead. Similarly, well-off parents can afford the better computing set-up and higher-speed internet crucial to effective online learning. Those are some of the many ways the virus has brought into sharper relief the stark inequalities in US society.

3) Trust in science (and in each other)
Another dividing line in the US (and elsewhere) has been the extent to which individuals have changed their public behavior to protect themselves and others from the spread of the virus. That has exposed the extent to which many people deny the findings of science and medicine (i.e., masks work) and place blind faith in the voices of leaders and media figures with no scientific expertise but plentiful political goals. Those skeptical of science, or even of the reality of the virus, have shown that they are not reachable through objective, factual information. Their beliefs in areas related to the virus have become part and parcel of their identities, of their personal life histories, and thus are nearly impossible to dislodge by logic or by any other means.

Sad to say, it seems unlikely that 2021 will bring meaningful change in that area.

A new anti-racist icon: Robert E. Lee

A photograph of Breonna Taylor, projected onto the statue of Robert E. Lee in July, 2020

As this tumultuous year comes to an end, the partisan polarization in the US remains and may even have intensified in the wake of this month’s election. Just today, a lawyer for President Trump, Joe diGenova, called for the former head of the Cybersecurity and Infrastructure Security Agency, Chris Krebs, to be “drawn and quartered” and then taken out and shot. His crime? He had declared the election to have been secure and legitimate.

On the other hand, the massive protests from this summer in the wake of the deaths of Breonna Taylor, George Floyd, and other African-Americans at the hands of police officers seem to have subsided. However, here in Richmond, VA, there remains a strong visual reminder of the inequity between Blacks and whites in the US and of the racist history going back to slavery and the Civil War. The NY Times has named that reminder, the massive Robert E. Lee statue on Monument Avenue, the most influential work of American protest art since World War II. The statue is one of the last monuments to Confederate figures left in the city, the former capital of the Confederate States of America, after almost all others were either toppled by protesters or ordered removed by the Richmond mayor. Since this summer, the statue has been transformed. No longer a towering reminder of the Confederate “Lost Cause” and of the Jim Crow era of enforced segregation, it is now covered with graffiti, much of it supporting “Black Lives Matter”, protesting police brutality, and evoking the names and faces of Black people unjustly killed.

Whether the statue will be removed is uncertain. A judge has ruled that the city may remove the statue, although it is on state-owned land. Opponents of removal have vowed to appeal to the Virginia Supreme Court. It seems to me a shame to remove the statue now, as it is currently configured. It has been argued convincingly that the problem with the Confederate statues is that they provide no historical context and imply, by their size and prominence, that the men so honored are heroic figures whose actions are to be celebrated. Nothing about the statues indicates that they were defenders of slavery. But now there is plenty of context around the Lee statue. It’s not just the messages and art on the statue itself; signs and objects have been placed all around it. Images featuring Breonna Taylor and others have been projected onto the statue. The intersection where the statue stands has been popularly renamed Marcus-David Peters Circle, in reference to a man who was shot and killed by police in Richmond in 2018 during a mental health crisis.

So the context of racism – and anti-racism – that the newly transformed statue represents is abundantly clear.

Who is Kamamboamamla?

Senator David Perdue

This week, Georgia Senator David Perdue, warming up the audience for a Trump rally in Macon, Georgia, pretended he didn’t know how to pronounce the first name of the Democratic vice presidential candidate: “Ka-MA-la, KA-ma-la, Kamala-mala-mala, I don’t know, whatever”.

Perdue then warned the crowd of a potential liberal takeover of government with “Bernie and Elizabeth and Kah-mah-la or Kah-ma-la or Kamamboamamla or however you say it.” It should be pointed out that Senator Perdue has served with Kamala Harris in the Senate for three years, in fact on the same committee. So he clearly knows how to say her name; by pretending to have trouble with the pronunciation, he wanted to draw attention to the fact that she does not have a familiar first name, from a white American perspective.

In fact, both of Harris’ parents were immigrants to the US, her mother coming from India. They gave her a name that in the original Sanskrit (कमला) means “lotus” or “pale red”. For Harris, her name is a reminder of her heritage. For Perdue, it points to her foreignness, implying through the mocking way he played on her name that there was something not quite right about her. In other words, his words were a clear racist dog whistle, a signal his audience understood quite well, as they laughed along with Perdue.

This is not the first racist action from the Senator. He recently ran an ad that enlarged the nose of his Democratic opponent in November, Jon Ossoff, who is Jewish. Perdue has also accused Ossoff and Senate Minority Leader Chuck Schumer of trying to “buy Georgia.” In embracing the caricature of Jews with large noses and leveling the scurrilous accusation that Ossoff and Schumer – both Jewish – are trying to buy influence and power, Perdue invoked two of the world’s oldest antisemitic tropes.

This is another troubling sign that open racism has unfortunately become mainstream in many segments of the US population.

Free academic speech or racial slur?

USC Professor Patton, removed from course

Earlier this month, the University of Southern California removed business professor Greg Patton from his classroom. His offense? In a lecture on linguistics, he used a Chinese word as an illustrative example of filler words (“um” or “like” in English). So far, so good, but that Chinese expression, 那个, or ne ga, sounds a lot like a racial slur in English (the N-word). The word is one I have found tremendously useful when I’m in China. It means “that one” and comes in handy when ordering in a restaurant: you can just point at a picture of a dish and say “ne ga”, i.e., I’ll have that one. Additionally, native speakers of Mandarin use it in conversation as a filler, as the USC professor was trying to illustrate, making the point that such words or sounds are common across languages. He made clear that the expression was Chinese (not English). Despite that, several African-American students took offense and complained. They wrote a letter to the dean of the School of Business, describing Patton as insensitive and suggesting he be removed from his post. They wrote,

There are over 10,000 characters in the Chinese written language and to use this phrase, a clear synonym with this derogatory N-Word term, is hurtful and unacceptable to our USC Marshall community. The negligence and disregard displayed by our professor was very clear in today’s class.

In fact, the letter sent by the students is incorrect: the Chinese term is not “a clear synonym with this derogatory N-Word term”; it is not a synonym at all, i.e., a word with an equivalent meaning. It is at most a homophone (a word that sounds alike), but that is not normally seen as significant or meaningful when dealing with two different languages.

As reported in Inside Higher Ed, the complaint and removal have been controversial, with a petition for Patton’s reinstatement stating:

For him to be censored simply because a Chinese word sounds like an English pejorative term is a mistake and is not appropriate, especially given the educational setting. It also dismisses the fact that Chinese is a real language and has its own pronunciations that have no relation to English.

The professor himself apologized to those students who were offended, but also told Inside Higher Ed, “Given the difference in sounds, accent, context and language, I did not connect this in the moment to any English words and certainly not any racial slur.”

In a report on the incident in The Atlantic, a fellow professor, Eugene Volokh of UCLA, suggested how the Business School dean should have replied:

This should go without saying, but of course many languages have words that sound vaguely like English epithets or vulgarities, and vice versa … Naturally, USC students are expected to understand this, and recognize that such accidents of pronunciation have nothing to do with any actually insulting or offensive meaning. To the extent that our first reaction to hearing such a word might be shock or upset, part of language education (or education of any sort) is to learn to set that aside. The world’s nearly one billion Mandarin speakers have no obligation to organize their speech to avoid random similarities with English words, and neither do our faculty (or students or anyone else) when they are speaking Mandarin.

On the other hand, as the article discusses, this kind of reply, as reasonable as it sounds, does not take into account the real feelings of the USC students who were upset by the incident.

Cancel culture and shifting power

D. Trump Jr. at the Republican Convention

One of the expressions current in the media is cancel culture, a term heard many times at the Republican Convention in the US last week. At that event it was used as a political weapon against the Democrats; according to BuzzFeed News:

A few weeks ago, most Americans either hadn’t heard of “cancel culture” or were quite unfamiliar with the term. And then President Donald Trump’s Republican National Convention began. Since Monday night, primetime convention speakers repeatedly have warned of a future where conservative patriots are silenced and vilified as a nation led by Joe Biden descends into lawlessness. Democrats and the media, they’ve argued, are canceling your beloved founding fathers and will cancel you next if you don’t adhere to their politically correct point of view.

In fact, President Trump’s administration has been active in suppressing speech from opponents, labeling as “fake news” not false reporting, but any news item not supporting the President’s views or actions.

The term cancel culture has been around for a while and has little to do with any conventional understanding of what a “culture” is. Instead it references a social practice, principally on social media, of ostracizing or shaming someone for their behavior, thereby “cancelling” their participation in human society and making them social outsiders. There have been famous cases in which social media attacks over perceived or real transgressions, such as offensive past tweets (the film director James Gunn) or calling the police on a Black bird watcher (Amy Cooper), have resulted not only in the targets being “cancelled” in the media, but in their actually losing their jobs.

The phenomenon has been interpreted as indicating a shift of power in society (at least in the US), giving more weight to social media than to official government authorities such as the courts or police. As reports of incidents or transgressions go viral online, immense pressure is placed on those connected to the “cancelled” (employers, landlords, associates) to disassociate themselves from those individuals. The NY Times has run a number of stories on cancel culture, including several by Jonah Engel Bromwich. In one recent piece he commented:

People tend to see cancellation as either wholly good — there are new consequences for saying or doing racist, bigoted or otherwise untenable things — or wholly bad, in that people can lose their reputations and in some cases their jobs, all because a mob has taken undue offense to a clumsy or out-of-context remark. Personally, I think it’s best viewed not as either positive or negative, but as something else: a new development in the way that power works — a development brought about by social media.

The views on whether this is a good development vary. Harper’s Magazine published an open letter signed by a number of influential public figures, “A Letter on Justice and Open Debate,” which decried the development. The letter received some negative feedback, with the signers accused of fearing that their own power and influence would be lost. However one might judge cancel culture, it seems undeniable that the power of social media it demonstrates is unlikely to go away any time soon.